Python Idea - LaTeX-style macros

My goal

My goal is to inspire people about what macros could do, and what macros could look like.
And to open up a conversation about what we’d like macros to look like in Python.

I’m not able to implement this myself, and I’m not expecting anyone to implement this for me.
But maybe we’ll eventually have better macros for having had this discussion.

Last time I posted in Ideas, and people thought I was rude because my idea wasn’t yet ready to be a PEP. I’m posting this in Help. If you think it deserves to be in Ideas, please move it. If not, please help me make this idea better :slight_smile:

Python Idea - LaTeX-style macros

I have seen a couple of ideas and problems discussed that could potentially be solved by macros.
So far, most attention has gone to Rust-style macros, as with PEP 638 – Syntactic Macros, proposed in 2020, which currently has status “draft”.

I think it could be good to also consider another source of inspiration, LaTeX. Specifically, I would propose copying the following:

  • Macros are identified by the character \.
  • Macros specify some pattern of text replacement.
  • Macros can be defined similarly to LaTeX’s \NewDocumentCommand, as specified in the xparse documentation.
  • When a Python source file is run or imported, all the macros are resolved to create an “expanded script file”, and that file is what the current Python interpreter parses (see the sketch below).
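
For illustration, here is a toy macro in the notation proposed below (the \twice macro and its expansion are invented for this example), together with the expanded script file the interpreter would actually parse:

\NewMacro{\twice}{expression($X)}{($X + $X)}
y = \twice f(3)

# expanded script file:
y = (f(3) + f(3))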

Motivation

The following are some problems that could potentially be “solved” by macros:
  • PEP 505 – null-coalescing via ?., ??, and ??= (and ?[])
  • nested creation of dictionary keys
  • Introduce a “bareword” list/dict literal
  • Pseudo-Uniform Function Call Syntax
  • Assure keyword
  • Method for interpolate “normal” string like an f-string
  • c-strings
  • Deferred Evaluation (somewhat hackishly)
  • PEP 671 – Syntax for late-bound function argument defaults (extremely hackishly)
  • f-strings as docstrings: a GitHub implementation, and a Stack Overflow question about the lack of this feature
There was also an adjacent discussion on discuss.python.org recently: DSL Operator – A different approach to DSLs

More generally, macros are an extremely powerful tool, and people are bound to find applications for them once they exist in Python.

I envisage them primarily as personal tools for customising Python to your preferences, which others can ‘disable’ by resolving them all, resulting in a clean, macro-free Python file.

Further details of implementation

arg specs (1)

To solve the problems listed above cleanly, macros need (at the very least) access to the preceding expression*, the following word*, expression*, line, or block, and to any text contained in user-specified boundary tokens.

Traditionally, a LaTeX macro is defined as

\NewDocumentCommand{\macro_name}{argument specs}{macro pattern}

The LaTeX tradition is to use [] for optional arguments, and \NewDocumentCommand is a mouthful, so we could modify this to

\NewMacro{\macro_name}[preceding arg specs]{following arg specs}{macro pattern}

arg specs (2)

In the phrase \macro a.b.c, a would be the first word, and a.b.c would be the first expression. One could argue that I should call the former a token, or that I’m using the word expression incorrectly. This is meant as the opening of a constructive conversation, so I’m open to ideas for better naming.

arg specs (3)

xparse uses extremely terse notation for the argument specs, which takes a little getting used to. For the sake of making this easier to read without having to memorise a list of symbols, let’s write the argument specs as, for example, word($func), block($bl), and tokens""($string), with the $-variable being the thing that the macro can act on. (In LaTeX these are referred to inside the macro as #1, #2, etc., which wouldn’t work well here for a variety of reasons.)

Explanation via example: PEP 505

If you want to be able to write a if a is not None else b succinctly, PEP 505 proposes a ?? b. With these macros you could define

\NewMacro{\??}[expression($X)]{expression($Y)}{($X if $X is not None else $Y)}

so that c = d + a \?? b / 2 is equivalent to c = d + (a if a is not None else b) / 2.

Equally for =?? you could define

\NewMacro{\=??}[expression($X)]{expression($Y)}{
if $X is None:
  $X = $Y
}

((Note that the indentation here could be ambiguous. One could foresee problems with the above definition and a use such as

def f(x=None):
  x \=?? 3

which, with a poor implementation, would resolve as

def f(x=None):
  if x is None:
  x = 3   # Error

but I believe that problem can be solved.))

If you want an easy way to write a.b.c if (a is not None and a.b is not None) else None, as I understand one of the possible implementations of the proposed ?. operator to be, that could be defined via

\NewMacro{\?.}[expression($X)]{word($Y)}{($X.$Y if $X is not None else None)}

so that a\?.b\?.c is expanded into

((a.b if a is not None else None).c if (a.b if a is not None else None) is not None else None)

which I believe is equivalent if slightly less efficient.

Explanation via example: nested creation of dictionary keys

What was requested was a way to do

tree = {}
tree.set(('customer', 'id', 'name'), "York")
tree['customer']['id']['name'] == "York"

Possible implementation with macros:

\NewMacro{\.set}[expression($dic)]{tokens()($args)}{
  do_set($dic, $args)
}

from typing import Any

def do_set(d: dict, keys: tuple, value: Any) -> None:
  *most, last = keys
  for key in most:
    d = d.setdefault(key, {})
  d[last] = value

then

tree\.set(('customer', 'id', 'name'), "York")

resolves into

do_set(tree, ('customer', 'id', 'name'), "York")

which does the requested operation.
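
And since do_set itself is plain Python, it can be sanity-checked without any macro machinery:

tree = {}
do_set(tree, ('customer', 'id', 'name'), "York")
assert tree['customer']['id']['name'] == "York"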

One could argue this is cheating, and the macro should actually resolve to

tree.setdefault("customer", {}).setdefault("id", {})["name"] = "York"

or

tree.setdefault("customer", {}).setdefault("id", {}).__setitem__("name", "York")

or even

tree.setdefault("customer", {}).setdefault("id", {})
tree["customer"]["id"]["name"] = "York"

I find this difficult, both with LaTeX-inspired macros and with Rust-inspired macros, because the pattern .setdefault("key", {}) gets interrupted at the end, and because working with sequences of indeterminate length is hard in both LaTeX and Rust. But any system of Python macros needs to be able to accomplish tasks like this in a fair manner, without growing too complicated in syntax.

A solution I can come up with involves the LaTeX E/e argspec, which tests whether a particular symbol is present. Let us write this argspec here as E{=}($is_end), which means $is_end is True if = is present, and False otherwise.
(This is a genuinely useful and powerful construct. In plain LaTeX it is quite common, for example, for \macro and \macro* to mean something subtly different.)
Now we can define:

\NewMacro{\set}{tokens[]($key) E{=}($is_end) expression($maybe_value)}{
  \if ($is_end) {.__setitem__($key, $maybe_value)}
  \else {.setdefault($key, {})\set}
  \fi
}

which should resolve tree\set["customer"]["id"]["name"] = "York" into

tree.setdefault("customer", {}).setdefault("id", {}).__setitem__("name", "York")

Explanation via example: bareword list/dict

The OP of this topic wanted an easier way to write for example

__all__ = ["cmp_op", "stack_effect", "hascompare", "opname", "opmap",
           "HAVE_ARGUMENT", "EXTENDED_ARG", "hasarg", "hasconst", "hasname",
           "hasjump", "hasjrel", "hasjabs", "hasfree", "haslocal", "hasexc"]

and suggested

__all__ = <cmp_op stack_effect hascompare opname opmap
           HAVE_ARGUMENT EXTENDED_ARG hasarg hasconst hasname
           hasjump hasjrel hasjabs hasfree haslocal hasexc>

We can already write

__all__ = '''cmp_op stack_effect hascompare opname opmap
           HAVE_ARGUMENT EXTENDED_ARG hasarg hasconst hasname
           hasjump hasjrel hasjabs hasfree haslocal hasexc'''.split()

but the problem is that then not all IDEs parse it correctly.

This is one of the motivations why I suggest that e.g. Pylance should view the expanded script file, rather than the raw script (with macros in it).

It would seem like a good solution to me to include an \Eval macro, which evaluates Python code but is not allowed to resolve imports (because macros with access to e.g. sys are too scary), so that you could simply write

__all__ = \Eval{'''cmp_op stack_effect hascompare opname opmap
           HAVE_ARGUMENT EXTENDED_ARG hasarg hasconst hasname
           hasjump hasjrel hasjabs hasfree haslocal hasexc'''.split()}

or if you want to hide the .split() method,

\NewMacro{\"""}{tokens{SELF}{"""}($text)}{
  \Eval{"""$text""".split()}
}

__all__ = \"""cmp_op stack_effect hascompare opname opmap
           HAVE_ARGUMENT EXTENDED_ARG hasarg hasconst hasname
           hasjump hasjrel hasjabs hasfree haslocal hasexc"""

The OP of that thread also wants to be able to write

_specializations = {
    "RESUME": [
        "RESUME_CHECK",
    ],
    "TO_BOOL": [
        "TO_BOOL_ALWAYS_TRUE",
        "TO_BOOL_BOOL",
        "TO_BOOL_INT",
        "TO_BOOL_LIST",
        "TO_BOOL_NONE",
        "TO_BOOL_STR",
    ],
    ...
}

as something like

_specializations = <
    RESUME:
        RESUME_CHECK
    TO_BOOL:
        TO_BOOL_ALWAYS_TRUE
        TO_BOOL_BOOL
        TO_BOOL_INT
        TO_BOOL_LIST
        TO_BOOL_NONE
        TO_BOOL_STR
>

I don’t see a quick & easy way to do that with macros. My experience tells me a macro could achieve it, but that it would be ugly, complicated code.
But I do think enabling people to embed YAML into their Python code would be a good thing, because it makes Python work for more people.
If other people need to work on it later and they don’t like the embedded YAML, it should be possible for them to resolve the macros and continue working in plain Python.
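
For reference, something close to this is already possible without macros by parsing an embedded literal, e.g. with PyYAML (a sketch; it assumes the third-party yaml package is installed, and, like the .split() trick above, it is opaque to IDEs):

import yaml  # third-party: PyYAML

_specializations = yaml.safe_load("""
RESUME:
  - RESUME_CHECK
TO_BOOL:
  - TO_BOOL_ALWAYS_TRUE
  - TO_BOOL_BOOL
  - TO_BOOL_INT
""")
# {'RESUME': ['RESUME_CHECK'], 'TO_BOOL': ['TO_BOOL_ALWAYS_TRUE', ...]}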

Explanation via example: Pseudo-Uniform Function Call Syntax

Could be implemented as

\NewMacro{\.}[expression($object)]{word($function) tokens()($args)}{
  $function($object, $args)
}

then for example

"abcde"\.len() == len("abcde")

Explanation via example: Assure keyword

Here this system of macros runs into a spot of trouble.
I thought I had a solution, but I don’t think there’s much you can do with macros that goes beyond the functionality you can achieve with

def assure(maybe_none):
  if maybe_none is None:
    raise ValueError
  return maybe_none

I mean you could define a macro so that you can call it as

a = \assure f(b)

instead of

a = assure(f(b))

but that doesn’t help the type checker.
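
For completeness: the plain-function version can help the type checker if you give it a generic signature (a sketch using typing; this addresses the static-typing side, though not the aesthetic one):

from typing import Optional, TypeVar

T = TypeVar("T")

def assure(maybe_none: Optional[T]) -> T:
  # Narrows Optional[T] to T, raising if the value is absent.
  if maybe_none is None:
    raise ValueError("unexpectedly None")
  return maybe_none

# a = assure(f(b))  # checkers infer the non-None type if f returns Optional[...]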

Explanation via example: Method for interpolate “normal” string like an f-string

The problem is you have a string like f"I want a {robot} brain" that you want in multiple places.
The solution is to just use a macro instead of an assignment, and it works perfectly.

\NewMacro{\my_fstring}{}{f"I want a {robot} brain"}

x = \my_fstring
...
y = \my_fstring

There’s potential for a little awkwardness because the macro has to be defined at the global level of your file, so you can’t locally create it inside a function. But I think it’s worth the trade-off.
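
For comparison, the closest no-macro spelling today is probably a module-level template plus .format (a sketch; the robot value is invented here, and the attraction of the macro is precisely that the call stays implicit):

TEMPLATE = "I want a {robot} brain"

robot = "positronic"
x = TEMPLATE.format(robot=robot)
...
y = TEMPLATE.format(robot=robot)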

Explanation via example: Deferred Evaluation (somewhat hackishly)

This is honestly the same as the above. You want an expression that evaluates to x+2*y wherever you use it? \NewMacro{\macro}{}{(x+2*y)}. It won’t have all the behaviour that people who ask for deferred evaluation want, but honestly I don’t get the impression that there is an agreed-upon definition of “deferred evaluation”. And with macros, fans of the concept can figure out what they want, modify the behaviour of the macro, and eventually maybe they’ll have something with well-defined behaviour that they’re happy with and can tell the rest of us about.
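
Today’s closest approximation is a zero-argument lambda (a sketch; the difference is that you must remember to call it, which is exactly what deferred-evaluation fans want to avoid):

x, y = 1, 2
expr = lambda: x + 2 * y  # evaluated at call time, not at definition time
x = 10
print(expr())  # 14, using the current values of x and y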

Explanation via example: PEP 671 – Syntax for late-bound function argument defaults (extremely hackishly)

\NewMacro{\late}[word($argname) tokens:=($type_hints)]{expression($late_bound) line($rest_of_line)}{
  $argname : $type_hints | None = None $rest_of_line
  \newline
  if $argname is None: $argname = $late_bound
}

so that

def f(a: str, b: list = \late [a]):
  ...

resolves into

def f(a: str, b: list | None = None):
  if b is None: b = [a]
  ...

and because macros resolve one at a time, from left to right,

def f(a: str, b: list = \late [a], c: list = \late []):
  ...

resolves into

def f(a: str, b: list | None = None, c: list | None = None):
  if c is None: c = []
  if b is None: b = [a]
  ...

so that does mostly work.

There is a ‘problem’ that

def f(a: str, b: list = \late [c], c: list = \late [a]):
  ...

works but

def f(a: str, b: list = \late [a], c: list = \late [b]):
  ...

resolves into

def f(a: str, b: list|None = None, c: list|None = None):
  if c is None: c = [b]
  if b is None: b = [a]
  ...

which does not.

I am also noticing here that there should be a good way to have optional arguments for macros, so that you could design a macro that works regardless of whether the late-bound variable has type hints. LaTeX has a pretty good system, but sadly I broke it in my attempt to remove the magic symbols such as o, O, m, R, etc.

Explanation via example: f-strings as docstrings

The most common situation where this is desirable is when you have some CONSTANT that you want to mention in the docstring.
With my proposed system you could do

\NewMacro{\MyConstant}{}{100}
MY_CONSTANT = \MyConstant

def f():
  \Eval{f"docstring that mentions {\MyConstant}."}

which admittedly is much more awkward than what I expected to end up with.
You could also write (to use a classic Latex pattern):

\NewMacro{\MyConstant}{E{*}($is_string)}{
  \IfBooleanTF{$is_string}{"100"}{100}
}
MY_CONSTANT = \MyConstant

def f():
  "docstring that mentions "\MyConstant*".")

which resolves into

MY_CONSTANT = 100

def f():
  "docstring that mentions ""100"".")

but maybe (especially if you’re allowed to invent new macro rules) there are better solutions to be found.

I envision macros only being used as tools of convenience

Macros are dangerous. There are good reasons why LaTeX wasn’t adopted as a multi-purpose programming language, even though it is Turing complete. When I look back at my old LaTeX files, they make sense to me, but they’re not production code, and a lot of them could never be production code.

Someone pointed out in another thread about macros that there is a danger that the use of macros transforms code from something that makes sense for everyone into something that only makes sense to the writer, because macros make a language so customizable. With Copilot on the rise, this is especially important.

On balance, I think this is the responsibility of individuals and organizations. You can already write Python that is incomprehensible to most readers, for example by creating inheritance brambles. (I’m tempted to share a code base I saw recently, but I don’t want to embarrass/anger the author.)

But it would probably be best practice to use no more than 1 macro per file, and/or to make sure that future programmers can convert the whole thing to the “expanded script file”, forget about the macros, and continue from there. In other words, (in most cases,) after resolving the macros you should still have good (if slightly repetitive) Python code.

The potential role of (these style of) macros in evolving Python syntax

Python develops conventions.
Like

import numpy as np

occurs in almost all code that imports numpy.

If a significant fraction of Python users converges on the same macro, I think that’s a sign that it should be considered for general syntax.

At the same time, I don’t expect that to happen quickly. And if (within a coding community) everyone knows what \set or \|> means, then that macro escapes the “this is a personalization that you should not expect other people to understand” niche. And then the prolonged use of that macro isn’t a big problem.
It would still be annoying that that macro gets resolved when you go through a resolve-all-macros process, so it would still be valuable to integrate such a macro into the general Python syntax.

Are you expecting macros to be reusable across files?

My understanding of how this would work in practice is that, since these macros are being expanded at compile time, they can’t be Python objects, which means that they can’t be passed around via module dicts, making them un-import-able and essentially locked to the file they’re in. I.e., if you like \?? then you’d have to copy/paste that whole \NewMacro{\=??}... block into the top of every script that uses it.

And in anticipation of someone suggesting it, no I don’t think a macro/preprocessor equivalent of import would work due to how heavily Python now relies on runtime changes to sys.path and sys.meta_path in order to resolve regular imports.

Yes, I would expect macros to be reusable across files. They’d have to be.

I hadn’t given any thought to how

from utils import \MyMacro1, \MyMacro2

would work, and you make a good point that it could be hard.

Could it be possible to pass macros around as raw strings? You could run the import machinery twice: the first time to import the macros, the second time to run the expanded script. It would be a little inefficient, but I think it could work.
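
For what it’s worth, source-level expansion can already be hooked into the import system today. A minimal sketch (expand_macros is hypothetical, and registering the loader via sys.path_hooks is omitted):

import importlib.machinery

class MacroLoader(importlib.machinery.SourceFileLoader):
  def source_to_code(self, data, path='<string>'):
    # expand_macros would be the (hypothetical) text-to-text expansion pass
    expanded = expand_macros(data.decode("utf-8"))
    return compile(expanded, path, "exec")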

I don’t know either.

The other complication with importing macros is that, since they can change source code, they also change bytecode.

Currently, when you compile a .py file to __pycache__, the .py file is effectively self-contained, in that any other modules it imports can be updated, replaced, missing, carry syntax errors, etc. without affecting the output bytecode of the target module. This means that the cache can be trivially invalidated based on either os.stat("original.py").st_mtime or a hash of the source.
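
Both invalidation modes look only at that one source file, e.g. with importlib’s real helper for PEP 552 hash-based .pyc files (the path here is hypothetical):

import importlib.util
import os

source = "original.py"
mtime = os.stat(source).st_mtime  # timestamp-based check
with open(source, "rb") as f:
  digest = importlib.util.source_hash(f.read())  # hash-based check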

But with this change, modifying one module can invalidate the bytecode of another that (indirectly) imports it. The cache invalidator would have to navigate all the way back up the import tree to where the macro is defined, to see if anything along that path has changed. That sounds pretty slow [1] and difficult to me. It would also break package distribution mechanisms that ship bytecode precompiled – all Arch/Alpine Linux packages, for example, would have to be rebuilt if any of their dependencies are upgraded.


  1. And there’s a lot of push to make startup times faster, which this change would counteract. ↩︎


You do want macros to be interpreted before the code is compiled, right? You’d have to have something like the C preprocessor. import likely wouldn’t work without major reworking of the front-end tokenizer, as import actually runs as the bytecode is interpreted.

Using something like

#include utils

would create two different kinds of namespace: one (the import case) where \MyMacro1 is part of the utils namespace, and one (the #include case) where it’s not. Beginners (at least) would perhaps have a bit of trouble telling the difference.

If you want macros, you could just use m4 I think. At the very least, you could mess around with it and see where it works and where it doesn’t.

Edit: I forgot to check the PEPs. PEP 638 is probably worth a read.

You could, I’ve done it. But then you can’t simply import your module - you have to have a build step where you turn it into the actually-runnable version first. Kinda not what I want from Python, so not something I’ve ever bothered with outside of tinkering.


So PEP 638 exists. What is wrong with it? Why do we need a new one?

I think a lot of the implementation strategy is covered in the PEP, so I think there are 2 things to focus on:

  1. Syntax
  2. AST based or Raw Text?

Regarding syntax, I think it would be good to take 1 simple example and think of what syntax would be best.

Thus, I propose to start with macro_rules! and compare different proposals. So far there is PEP 638 and there is this proposal.

PEP638

# DEFINE
def foo(tree):
    tree.body = ast.Call(ast.Name("print"), [tree.body])
    return tree

printer = (foo, _ast.EXPR_MACRO, VERSION, ())

# USE
from! macro_location import printer
x = printer!("Hello World")

Personally, I quite like this. It feels quite Pythonic.

While PEP638 manipulates ASTs, this proposal is more along the lines of C macros.
Wouldn’t this be a bit of a step backwards?
How would above example look done using Latex style macros?

If I understand correctly…:
a) C - Text Macros
  b) Rust - Syntactic Macros (from a user’s POV very similar to C, but implemented at a different level)
c) Python - ?

I think it could be sensible that Python, being a higher-level language, takes one step forward with syntactic macros, where the macro implementation is direct AST manipulation instead of writing out new syntax.

Slightly less convenient, but I think this offers more functionality by being able to use NodeTransformer and similar techniques.
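
For concreteness, here is roughly what “direct AST manipulation” looks like with today’s ast module: a minimal sketch of the transform in the printer example above, without any of the macro plumbing the PEP would add:

import ast

class WrapInPrint(ast.NodeTransformer):
  """Rewrite every bare expression statement `expr` into `print(expr)`."""
  def visit_Expr(self, node):
    node.value = ast.Call(ast.Name("print", ctx=ast.Load()), [node.value], keywords=[])
    return ast.fix_missing_locations(node)

tree = WrapInPrint().visit(ast.parse("1 + 2"))
exec(compile(tree, "<macro>", "exec"))  # prints 3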

Not trying to be a downer, but I think it’s LaTeX that should be made more like Python rather than Python more like LaTeX.

Anyway, I think as with a lot of such ideas, it’s best to carefully examine some example cases and compare what they look like in ordinary Python and what they look like with macros. (Normally, an idea starts with a motivation rather than showing syntax.)


@dg-pb and @NeilGirdhar : Trying to explain my motivation better:

PEP 638 proposes macros as essentially fancy functions that are able to do things that are currently impossible.

My motivation is that there are a lot of things that are already possible with regular Python functions that people aren’t happy with, because they don’t look good enough. (Including myself.)

For example, people aren’t happy with

x = x if x is not None else y

and they want to be able to write

x = x ?? y

It is already possible to define a function, such that you can write

x = function(x, y)

but that’s not good enough. And though I’m -0 on the ?? syntax right now, I understand why that last solution isn’t good enough.

This is a ‘problem’ that can’t be solved by PEP 638. I mean, it’s hardly an improvement to write

x = macro!(x, y)

My pitch is that it should be possible to implement macros in a manner that enables people to solve the aesthetic problems they have with Python.
And that it might be possible to do so without severe side effects such as causing a lot of work for IDE developers and making code unreadable.

I have worked with 1 code base that was created using macros, and the solution they had lines up with what @smontanaro said: the macros fundamentally end up in a different type of namespace.

As I understand the problem you’re describing, it is that if you use library foo, and foo uses macro_utils, then if foo=0.1.3 was compiled using macro_utils=1.2 but you install in your environment foo=0.1.3 and macro_utils=1.4, the Python code in foo could be different from “expected” and hence caching would break.

The solution that I’m familiar with is to bake the macros into the library. You could install foo=0.1.3 which uses macro_utils=1.2, and bar=0.2 which uses macro_utils=1.1 and macro_utils=1.4 for your own use, and there shouldn’t be a conflict, because the library package should only care about the expanded script file[1].
So macro libraries would end up fundamentally different from Python libraries.

Now to shoot a hole in my own boat: I recall we may have actually had a family of libraries where it was important to have the macros match up (because the macros were used to define & describe the API the libraries used to communicate with each other). I think we had something like foo-1.1, foo-1.2, … foo-1.19 all available for install, as well as bar-1.1, bar-1.2, … bar-1.19, where the number identifies the macro version.
So there are some pitfalls macro design would need to avoid, because that’s not a great situation to have.


  1. The version in which the macros are resolved ↩︎

PEP638 would not be able to do this.

And I think it is a good thing. Being able to implement macro operators is a bit too much flexibility.

This pretty much allows screwing with things until there is no resemblance to Python syntax left.

Yes, for this simplest case it does look a bit verbose. But one can implement a none-aware Pythonic DSL, e.g.:

x = maybe!(x or y)
x = maybe!(x.a[1] or y.b[1])

For anything more complex, the brevity of this is quite satisfactory to me.

Case 1:

This is useful (PEP 505). However, on its own this can hardly be a justification for such text-replacement macros.

Furthermore, \=?? is quite ugly, and if there are not many good reasons for macros to be able to do text replacement, such needs might be achievable by extending PEP 638. E.g.:

import! operators
coalesce_op = (func, _ast.OP_MACRO, VERSION, (), '??')
operators.register(coalesce_op)

This would not work as written; it is just to make the point that PEP 638 could potentially be extended to address some of the needs without needing to resort to C-style text-replacement macros.

Case 2:

This seems hardly worth it. The gain is minimal (if any), while the complexity introduced is quite substantial.


Personally, I think it would be good to collate examples of what macros should be able to address.

Then:

  1. eliminate those that have other sufficiently good solutions, and those that aren’t worth the effort because their benefits are minimal;
  2. see which ones can be addressed with PEP 638;
  3. try to think of possible ways to extend the PEP to address those that it cannot handle.

And only resort to more far out ideas once there is conviction that PEP638 is not a satisfactory route.


Also, I think it is worth noting that, in my opinion, the chance of acceptance for text-replacement macros is currently close to zero.

And for that to change, all other options have to be exhausted first. At the very least there has to be a conviction that the PEP 638 path is not enough.

Not saying that it is impossible to push a completely new text-replacement macro idea through, but it would have to be something truly magical to bypass the above.

LaTeX can’t either: this is not a macro but a custom operator, as are most of the examples in the OP.


It may interest you all to know that Python already has macros, through a lineage of third-party packages, one of which is MacroPy:

https://macropy3.readthedocs.io/en/latest/
These macro libraries are incredibly powerful already, without any direct support from the Python core dev team or CPython itself.

The only major limitation is that, because these operate as import-time AST transformers, they can only operate on Python-parseable code (meaning, for example, you can’t use them to implement new operators like ‘??’).


Yup.

PEP 638 is an attempt to bring the concepts that were explored in those packages into the standard library.

There are many benefits to it.

Throwing my 2c in as a teacher, a helper of many beginners, and also someone who regularly has to understand others’ code. From this perspective, this proposal creates code that my beginners would find completely incomprehensible.

It was written:

It is already possible to define a function, such that you can write
x = function(x, y)
but that’s not good enough.

It’s not clear to me why that isn’t good enough. It’s clear, obvious, straightforward, easy to teach, easy to understand. Sure it’s like 5 characters more typing, but I’m strongly of the opinion that less typing is not an advantage.


Just empirically, I do see people writing

x = x if x is not None else y

but I don’t see people writing

x = update_if_none(x, y)

and I understand that decision on an intuitive level.

In the abstract, I would argue that “it’s like 5 characters more typing” doesn’t do the situation justice, because the current situation with functions is like Polish notation. Sometimes it’s bothersome, in the same way that

assign(x, sub(add(pow(a, p), prod(b, pow(c, q))), prod(prod(2, prod(d, d)), my_func(a, b, p))))

quickly becomes intractable.

I don’t know how relevant it is to the ?? question (which I couldn’t represent well in any case because it’s not a feature I desperately want), but I expanded on this point because it is relevant to some of the other examples.

That’s fair. Personalised macros had the same effect in my experience in LaTeX, in that you shouldn’t expect people to be able to understand a document that contains more than 1 personalised macro. Even then it takes more effort to read code with new macros in it. I believe I did write this in the OP.

Not sure about the etiquette of this forum, or whether I should reiterate my “answers” to this “problem”, but:

  • It’s already possible to create incomprehensible code. I semi-frequently see code I can’t comprehend properly due to inheritance tangles, and I’m not a beginner anymore.
  • You already need to stick to conventions to keep your code comprehensible. There are some things that are almost always “forbidden” but in rare circumstances they make code better. I believe macros (in the context of python) naturally fall into this niche.
  • The point of being able to “resolve” the macros is precisely that people should be able to read your code without dealing with your macros.

One could argue that based on this view we shouldn’t have macros at all, and people do. Maybe they’re right. :man_shrugging:


Thanks!


I would define a macro as “something which modifies code before it runs”. Then again, even that definition probably isn’t exact. I don’t know whether it properly describes Excel macros, for example.
Yes, I’ve deviated pretty far from the inspiration, so it’s not LaTeX anymore. Do you think canonical LaTeX macros would be better (for introduction into Python)?


@dg-pb :
I could go a step further, and implement “solutions” for all of these issues using plain no-macro Python.

The bareword list can already be written as """some words""".split(), the bareword dict can be embedded as a string literal and then passed to a YAML parser function together with locals(), and instead of pseudo-UFCS you can call functions as function(object, args).

Late-bound function argument defaults can be solved via a decorator

@latebound_defaults(b='[a]')
def f(a: str, b:str): ...

where latebound_defaults would be some work to get right, but I think you can achieve it using eval.
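
A rough sketch of such a decorator (hypothetical: the name latebound_defaults and the expression-string convention are invented here, and eval carries the usual caveats):

import inspect
from functools import wraps

def latebound_defaults(**late):
  def decorate(func):
    sig = inspect.signature(func)
    @wraps(func)
    def wrapper(*args, **kwargs):
      bound = sig.bind_partial(*args, **kwargs)
      for name, expr in late.items():
        if name not in bound.arguments:
          # evaluate the default expression with the bound arguments in scope
          bound.arguments[name] = eval(expr, func.__globals__, dict(bound.arguments))
      return func(*bound.args, **bound.kwargs)
    return wrapper
  return decorate

@latebound_defaults(b='[a]')
def g(a: str, b: list):
  return b

# g("x") == ["x"]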

But I think these arguments are missing the point.

You appear to be a fan of PEP638. What makes you excited about PEP638? What would you be able to do that you currently can’t?

Would you really prefer x=maybe!(x or y) over x=maybe(x, y)? Maybe tastes differ, but to me overwriting the meaning of builtins feels even more dangerous than allowing people to introduce macro operators.
Going this route is almost begging for people to do things like

@!maybe
def my_func(*args):
  """In the body of this function `x or y` is interpreted as `y if x is None else x`."""

which means you get incredibly potent action at a distance.


Finally, does PEP 638 allow you to resolve the macros to create a plain, macro-less Python file?

The first one is a macro with 1 argument (allowing a DSL); the second is a function with 2 arguments. They do not do the same thing: maybe! is a Maybe-monad DSL macro.

Regardless, I am more interested in general direction rather than details that are not necessarily set in stone.

It is robust and well thought out; it seemingly derives from PyPI packages, hopefully combining what worked best, and it sensibly adapts the concept for introduction into the standard library, with a fairly minimal implementation (only 2-3 new AST node types) and everything falling into place nicely.

The fact that macros act on ASTs is arguably a step towards “pythonic”, while LaTeX macros feel like they go in the opposite direction.

It is not like I am a fan of it; maybe there is an even better concept. It simply looks good, and I cannot see how this proposal can realistically compete with it. PEP 638:

  • Syntax is more beautiful
  • ASTs do seem more Pythonic compared to text-replace approach (and more flexible too, while not having indentation issues)
  • Implementation is elegant and thought out
  • Builds on top of past experience

I honestly cannot see a single point where LaTeX-style macros would excel against the PEP.

Macros. :slight_smile: I think a proper case analysis is needed to go forward here, as per below ↓

Yes, that would be productive: taking it case by case, example by example, providing complete implementations of different macro ideas side-by-side, alongside a no-macro solution, then comparing all cases on convenience, complexity, performance, etc…

Thanks for your considered reply.

I do! I have a family of “safe” functions that enable me to use an object regardless of its state. For example:

>>> safe_int(None)
0

>>> safe_int(' 123 ')
123

>>> safe_int('')
0

>>> safe_int('abc')
0

>>> safe_int('123a')
123

Doesn’t solve all these issues, but does make clean, simple code. “Simple is better than complex”.
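
For readers curious what such a function might look like, here is one way to get the behaviour shown above (a sketch; the author’s actual implementation may differ):

import re

def safe_int(value, default=0):
  # Coerce value to an int, returning default whenever that is impossible.
  if value is None:
    return default
  match = re.match(r"\s*([+-]?\d+)", str(value))
  return int(match.group(1)) if match else default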