PEP 638: Syntactic Macros

As someone who has never done meta-programming before, this looks very interesting. Considering how powerful it seems, will this make metaclasses redundant?

4 Likes

I don’t think so. This is intended for specific use cases, as explained in the PEP:

Where I see macros having real value is in specific domains, not in general purpose language features.

And unlike metaclasses, syntactic macros come with a slight performance downside:

For code that does use macros and has already been compiled to bytecode, there will be some slight overhead to check that the version of macros used to compile the code match the imported macro processors. For code that has not been compiled, or compiled with different versions of the macro processors, then there would be the usual overhead of bytecode compilation, plus any additional overhead of macro processing.

I’m glad to see that the Python team is continuously taking into consideration other domains, like data science, that rely heavily on Python; macros would alleviate some of these domain-specific challenges without changing the core language itself. That said, as someone currently studying data science, I often stumble upon non-Pythonic APIs in many third-party DS libraries, and the new macros might introduce more non-Pythonic practices and less readable code, as mentioned by many critics in this Hacker News thread.

If this proposal gets accepted, I’d like to ask the people behind it to please add an explicit and fairly strict guideline to the documentation on when and when not to use macros, much like the PEP 8 style guide or the Zen of Python. That way, code following the macro guide can still be Pythonic.

1 Like

Can someone help me understand what the actual syntax is for defining a macro in this PEP?

I saw this in the usage examples:

# (func, _ast.STMT_MACRO, VERSION, ())
stmt_macro!:
    multi_statement_body

but that just looks like an example of using a macro defined in that particular way, not actually how to define such a macro.

Farther down in the PEP I see:

def macro_processor(kind, version, *additional_names):
    def deco(func):
        return func, kind, version, additional_names
    return deco

Which clearly wraps a function into a “macro tuple”.
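For what it’s worth, the decorator itself is plain runnable Python today. Here’s a minimal sketch of what it produces (STMT_MACRO here is a made-up placeholder; the real constant would live in the PEP’s proposed ast module):

```python
# Placeholder for the constant the PEP would define in the ast module.
STMT_MACRO = 2

# Decorator copied from the PEP.
def macro_processor(kind, version, *additional_names):
    def deco(func):
        return func, kind, version, additional_names
    return deco

@macro_processor(STMT_MACRO, 1)
def my_macro(node):
    return node

# The "macro" name is now bound to a plain tuple, not a function:
func, kind, version, extra_names = my_macro
print(callable(func), kind == STMT_MACRO, version, extra_names)
```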

But how do you actually tell the compiler that this particular tuple is supposed to be a macro?

I would have expected some kind of new statement for this, e.g.

defmacro(func, macro_type, version, additional_name):
    ...

I’d also like to suggest that “sibling macros” are a bad idea, and are more likely to cause confusion than provide benefit. I understand that their intention is to enable creating decorator-like behavior, but I don’t see much value in that.

Edit: as per the PEP, it seems like defmacro! could itself be a macro that marks a “macro-tuple” as a macro. But it’s still not clear how to define a macro in the first place.

2 Likes

From my point of view, macros should not bring a lot of new syntax elements to the language. Instead, they should add a minimal number of new connection points between runtime features and parse-time features.

Just pass the input code snippet in as a plain string, and build all operations and features on top of strings.

Adding too many keywords, as the proposal does, will only make Python over-flashy and turn it into what C++ has become.

In my view, just func$( ... ) would be adequate, where func(s: str) -> str: ... is a normal Python callable that takes exactly one string argument and returns a string result; it is simply declared somewhere else beforehand.

This will keep Python concise.

We can take inspiration from Rust, but Python is more dynamic than Rust, and we can use that dynamic nature to maximize the potential of macros.
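To make the string-in/string-out idea concrete, here’s a toy sketch (entirely hypothetical; swap_xy and the expansion step are my own illustration, not anything from the PEP):

```python
# Toy string-based "macro": a plain callable from source text to source text.
def swap_xy(s: str) -> str:
    # Purely textual rewrite, no AST involved.
    return s.replace('x', 'y')

# Under this idea, swap_xy$( print(x) ) would expand roughly like:
source = "print(x)"
expanded = swap_xy(source)
print(expanded)  # print(y)
```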

I have two disagreements with this:

  1. I prefer ! as a “sigil” instead of $, e.g. def foo! and foo!().
  2. I strongly oppose the use of strings as macro inputs and outputs. That is not “syntactic”, and if you want that, you can use a tool like GPP. The inputs and outputs should either be AST objects, or some simplified representation thereof. This is what the PEP currently specifies here and that is how it should stay.

I also think I answered my own question from before. To define a macro, you write a function that accepts and emits one AST node object, and decorate it with the special @macros.macro_processor decorator. This I think is reasonable, and I also think it more or less aligns with Evan’s proposal:

import ast, macros

class RewriteXY(ast.NodeTransformer):
    def visit_Name(self, node: ast.Name) -> ast.Name:
        if node.id == 'x':
            return ast.Name(id='y', ctx=node.ctx)
        else:
            return node

@macros.macro_processor(macros.STMT_MACRO, 0)
def replace_x_y(node: ast.AST) -> ast.AST:
    return RewriteXY().visit(node)

y = 123
with replace_x_y!:
    print(x)  # Rewritten to `print(y)`
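Since the macro-invocation syntax itself doesn’t exist yet, the transformation half can already be exercised today with plain ast machinery. A sketch of what the processor above would effectively do (the <macro-demo> filename and the sample source string are just illustration):

```python
import ast

class RewriteXY(ast.NodeTransformer):
    def visit_Name(self, node: ast.Name) -> ast.Name:
        # Rewrite every load/store of `x` into `y`.
        if node.id == 'x':
            return ast.Name(id='y', ctx=node.ctx)
        return node

# Parse, rewrite x -> y, then compile and run the transformed tree.
tree = ast.parse("result = x")
tree = ast.fix_missing_locations(RewriteXY().visit(tree))
namespace = {'y': 123}
exec(compile(tree, '<macro-demo>', 'exec'), namespace)
print(namespace['result'])  # 123
```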

I have also come around to being OK with sibling macros, as long as the ! sigil is present.

However, I do think that if macros have special calling syntax, then it’s also reasonable to give them special defining syntax. That is, foo!() is directly analogous to def foo!() or def! foo(), either of which would be syntactic sugar for @macros.macro_processor(EXPR_MACRO, 0).

The integer versioning is also a little hard to envision as being anything but error-prone and cumbersome for users.

1 Like

Is PEP 638 still being worked on? It looks super interesting to me, and I’d love to know if the Python devs are still interested in the idea.

AFAIK PEP 638 is not being worked on, but if you’d like to start using macros right now, there are existing libraries which are (in my opinion) better-implemented.

There’s the venerable MacroPy3 library (abandoned many years ago, but it still works flawlessly!).
Then there’s the now-abandoned rewrite mcpy, inspired by MacroPy.
And there’s the modern (still maintained, I think) mcpyrate, originally forked from mcpy but since diverged and fleshed out with extra features.