PEP 638 – Syntactic Macros (python.org)
107 points by rbanffy 8 days ago | 78 comments

So the gist is: allow code to execute once at compile time, instead of running again and again at runtime. And as a side effect, you can implement new syntax features without polluting the interpreter.

Actually a nice idea. Surely something most devs wish for from time to time. Though it's also dangerous and can lead to even more unreadable code. Linters will probably also have "fun" adapting to this. And debugging this...

Well, anyway, personally I don't like the syntax. Putting a sign at a word's end just doesn't feel pythonic. The idea itself I favor more than I dislike, as the advantages seem to outweigh the disadvantages. As this is a PEP, I assume this will come in any case?

> As this is a PEP, I assume this will come in any case?

A PEP is just a proposal. The index (https://www.python.org/dev/peps/pep-0000/) lists about 170 PEPs that are abandoned, withdrawn or rejected.

Not only more unreadable code, but probably even more correctness problems since you don’t have a type checker validating the expansions (writing correct macros is even harder than writing correct code).

> version is used to track versions of macros, so that generated bytecodes can be correctly cached. It must be an integer.

This worries me. If you're debugging a macro, you have to remember to increment the version every time you change it, or your changes mysteriously don't take effect. And you have to make sure your macros are clean and deterministic or you might bake things into the cache that don't belong there.

> The syntax tree of Python is available through the ast module.

That module can be annoying to work with because it doesn't make stability guarantees. I've even had to deal with code that broke in a maintenance release (3.8.4) because of this change: https://bugs.python.org/issue40870

I can't think of any better alternatives, besides doing nothing. But it looks like it'll be unusually easy to use wrong, not to be used lightly.

I’ve not read the proposal in full (skimmed it, so might’ve missed something admittedly), but if you were debugging a macro you wrote, why not just blow away the pyc files each iteration vs incrementing the version? I see the version increment more for when you distribute the macro.

Yeah, that's probably more practical. Or a wrapper that uses a timestamp as the version, so it's always higher than the last run.
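
That wrapper idea is tiny to sketch. The `version` field name comes from the PEP; the helper itself is hypothetical:

```python
import time

def fresh_version():
    # Hypothetical debugging helper: derive the macro version from the
    # current time, so every run looks "newer" and the cached bytecode
    # for the expansion is invalidated.
    return int(time.time())

print(fresh_version())  # e.g. 1609459200
```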

Still—one more thing you have to remember.

That’s why you provide well-featured transparent debugging facilities.

I was really excited when I read the title, but I quickly became disappointed. I've always thought macros were neat. Coming from Rust, macros like `vec!` make the language a lot cleaner and easier to read.

The issue with adding them to Python is that Python is already clean and easy to read (for the most part). They point out that Python's simplicity is one of its biggest strengths, and I think macros will just add clutter on top of Python. I also just don't think we need them in Python. All of the examples they gave were underwhelming and didn't make a convincing case that this is a worthwhile feature.

I guess we'll see how things play out. If it gets approved and implemented, I assume that they've found good uses for it.

I love this proposal. I think I would use it too much at first, and then pull it back to using it a reasonable amount :)

I've wished that Python had something similar to C#'s LINQ, where a user could express a query using familiar list comprehension syntax, but instead of the comprehension actually evaluating, my search library could somehow receive the AST and use it to build an optimized query. ORM libraries often play tricks with dunder methods to achieve a similar DSL feel. This could give me something very close to that.
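
For anyone unfamiliar with the dunder-method trick: instead of evaluating, comparisons return expression objects that the library can later compile into a query. A minimal, hypothetical sketch (not any real ORM's API):

```python
class Field:
    def __init__(self, name):
        self.name = name

    def __gt__(self, other):
        # Instead of returning a bool, return an AST-like description
        # of the comparison that a query engine could translate later.
        return ("gt", self.name, other)

class User:
    age = Field("age")

# Looks like a boolean test, but actually yields a query fragment.
expr = User.age > 21
print(expr)  # ('gt', 'age', 21)
```

With macros, the library could receive the real AST of a comprehension instead of relying on these overloads.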

I'm also drooling over the parser example, as someone who maintains a parsing library. I support building a parser from a grammar file currently because doing it in code is a bit clumsy, but the example makes it look pretty.

There's also countless times people have had a great idea for "with" blocks, but it turns out it's not really possible because it would require the with statement to analyze/capture the child statements inside. This proposal gives you exactly that.
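
To illustrate that limitation: a context manager only observes the enter and exit events; the statements inside its block are completely opaque to it. A minimal sketch:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed():
    # The context manager sees only enter and exit; it cannot
    # analyze or capture the statements executed in between.
    start = time.perf_counter()
    yield
    print(f"elapsed: {time.perf_counter() - start:.6f}s")

with timed():
    total = sum(range(1000))
# timed() never received `total = sum(range(1000))` as code,
# which is exactly what a macro-based `with` could change.
```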

The syntax isn't spectacular, but for the most part I can't think of anything much better. I do think the "sibling statement" syntax will be hard to understand visually. I would make it start with a @ since it's like a decorator for the next statement:

@log! print("Hello")

> I love this proposal. I think I would use it too much at first, and then pull it back to using it a reasonable amount :)

I can think of a bunch of things over the years where I did exactly that on first discovering it.

A small warning though: I write a lot of perl, and while I care a lot about readability for the next developer, plenty of people fail at the "pull it back" phase so it's definitely a trade-off.

Why would you add macros to the most succinct widely used programming language that is loved for its readability?

Macros are generally useful because they A) make something more succinct, B) improve performance by doing something at compile time instead of runtime, or C) enable metaprogramming. The downside is that they risk the creation of unreadable or overly magic code.

Python already is succinct and already has metaprogramming (although not to the extent of Lisp, as pointed out by the rationale). And we all know Python is slow, but the downsides of making it less readable through macros outweigh the upsides IMHO. Python is loved for being readable. Instead let's improve on its general performance.

If macros are used responsibly, they can make a program more readable.

I have to write lots of boilerplate code for certain things that in other languages would be much simpler; for example when I'm doing pattern matching in Python I have to use a sequence of if statements and complex expressions. I'd love a pattern matching macro because that would let me express my logic in a manner which more closely matches the semantics.
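
The kind of boilerplate being described looks something like this hypothetical expression evaluator, where chained isinstance checks and complex conditions stand in for pattern matching:

```python
def evaluate(node):
    # Dispatch on the "shape" of the node by hand; a pattern matching
    # macro could express each case declaratively instead.
    if isinstance(node, int):
        return node
    elif isinstance(node, tuple) and len(node) == 3 and node[0] == "add":
        return evaluate(node[1]) + evaluate(node[2])
    elif isinstance(node, tuple) and len(node) == 3 and node[0] == "mul":
        return evaluate(node[1]) * evaluate(node[2])
    else:
        raise ValueError(f"unrecognized node: {node!r}")

print(evaluate(("add", 2, ("mul", 3, 4))))  # 14
```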

Macros can be abused just like decorators, classes, or functions to make complex code that's difficult to understand or modify. While macros give you a bit more rope, used properly they can be very handy.

> If macros are used responsibly, they can make a program more readable.

(AST-based) macros can really be an asset in a statically typed language (like Rust or Crystal); they are crap in C since it's just text substitution [1]. But an interpreted language that already has a lot of metaprogramming capabilities doesn't really need macros. Python has problems, but you don't fix those with macros on top of them. They will never be used responsibly.

1: With a better macro system C could have been a much better and secure language IMHO.

Metaprogramming has a runtime cost that macros (even crappy C-like ones) don't. In the case of an interpreted language, this is an important feature.

Having said that, I haven't seen metaprogramming being overused or abused much in Python.

You shouldn't be using Python in the first place if you're worried about the runtime cost of metaprogramming.

If macros offer the same benefit at no additional cost, why not?

Macros have decidedly nonzero cost. Look at the discussion from the core devs surrounding this; just the implementation is a major change for the parsing and AST modules with unclear performance implications. Introducing this much complexity to the language is far from "no cost" as well. I predict that if this PEP goes through, it will be a watershed moment for Python and the language will become an unrecognizable mess (more so than already) within a few years. If GvR was still in charge, I would be sure that this would be dismissed. Now, I am not so sure, and that troubles me.

A nicer and more complete syntax for anonymous functions, or some reasonable equivalent (like JS arrow defns, or Ruby code blocks) might serve most of the same needs as a full-blown macro facility, without some of the complications. It's pretty common in Ruby for libraries to define DSLs with syntax similar to what macros were used for in, say, '80s LISP code.

(The nearest current Python equivalent is lambdas, but there are very awkward limits on the varieties of control structure they can contain.)


I've been studying Racket lately. Historically I've mainly worked in C and Python, although I've scratched the surface with lisps before. One of the things that strikes me about Racket is that every new special form I learn has its own evaluation rules. They may be very similar, they may follow certain patterns, but they don't attain total consistency. (Especially confusing to me has been the existence of expressions with arity other than one, which are OK in some forms but not in others.) Most of the special forms are macros underneath; you can go and read them, although they're still parenthesis soup to me at the moment. Still, there's a path through the fog -- you know that underneath it all, there is a small number of primitive forms, and all this syntactic abstraction is ultimately built on top of them.

Contrast this with Python. Python has a lot of syntax. You can't reason your way through Python by reference to a small number of primitive forms. Instead, just about every expression evaluates to an object, and if you know what kind of object it produces, you can go look up what that object does. What I really enjoy about learning Racket is seeing how syntactic abstraction lets the language get away with not creating new data structures when it doesn't have to. However, a marriage of the two approaches seems like too much. To understand what a macro does, now you're going to have to map the macro's special syntax back to Python's ordinary syntax, and then you're going to have to figure out how to use whatever kind of objects that syntax produces. This imposes a whole new layer of reasoning on the reader.

I'm not a fan of this PEP and I'd avoid using it in my code if it were adopted. I already have enough trouble explaining generator comprehensions to my colleagues; a half dozen new quasi-Python languages that compose with each other in surprising ways would be more than enough to make them retreat back to C#.

That doesn't even look like Python. If you really want to convert Python to a LISPy language just use hy. It is much more complete. I really hope this proposal doesn't get merged but after the walrus operator I am not so sure.

do you have any ideas for making it look more like python?

I don't think it is possible to make macros look like the host language without homoiconicity.

However the proposal has this in the rationale section:

> Python is both expressive and easy to learn; it is widely recognized as the easiest to learn widely-used programming language. However, it is not the most flexible. That title belongs to lisp.

If the main purpose of the proposal is to make Python "flexible" like Lisp why not go the extra mile and add lisp syntax as well (like hy)? That will solve the problem of macros and flexibility.

Non-homoiconic languages (Elixir, Julia) usually use quote/unquote syntax, where quote is a construct that transforms code into an AST and unquote splices an evaluated value into it. For example, in Elixir:

  a = 4
  quote do
    if unquote(a) > 3 do
      6
    end
  end
Which produces: {:if, [context: Elixir, import: Kernel], [{:>, [context: Elixir, import: Kernel], [4, 3]}, [do: 6]]}

In Elixir and Julia the AST is represented by a normal type within the language itself just like Lisp, so the macro can just receive that type, manipulate it with the base language tools or using the special syntax above and return it.
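
Python's AST is likewise made of ordinary objects in the `ast` module; what the PEP would add is dedicated syntax for handing those objects to a macro. For example:

```python
import ast

# Parse an expression into AST objects, the same structures a macro
# would receive and manipulate under the proposal.
tree = ast.parse("a > 3", mode="eval")
comparison = tree.body
print(type(comparison).__name__)  # Compare
```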

> Non homoiconic languages (Elixir, Julia) usually use quote/unquote syntax

So do homoiconic languages; quote/unquote/quasiquote and friends all originate in, or at least are used in, the Lisp family of homoiconic languages. (IIRC, Scheme uses them more than the CL branch of the tree, but both major branches use quote, and the ' abbreviated syntax for it.)

Yes, I ended up writing as if they were different when it's a direct translation, except that quote transforms the visible syntax into a different, hidden syntax, which is what makes it non-homoiconic.

> Python is now sufficiently powerful and complex, that many proposed additions are a net loss for the language due to the additional complexity.

This should be made into an automated response.

I honestly love macros, so I would probably use this feature much more frequently than decorators (if it's even approved; I would expect there have been a lot of macro requests for Python over its history). But this seems like something that should have been done early, instead of now competing for usage with all the metaprogramming mechanisms that already exist (unless those are implicitly deprecated and macros become the "one obvious way" of doing things). And will they tempt changes to the already mature interfaces of libraries like flask, django, numpy, pandas, and tensorflow 2, making yet another way of doing stuff (since they can't break all past code)?

And in general, even non-Lisp languages with macros, like Elixir, recommend never using them until you've exhausted all alternatives. Macros always break expectations (if they didn't, you wouldn't use them), compose poorly, are hard to debug without adequate tools, and require a lot more documentation. Great for removing boilerplate though, which is my main usage.

The approach to hygiene seems very lazy and short sighted as proposed. Given python's culture, I think that the macro system should be designed such that using macros is as easy as possible even if that adds complexity to writing them. Therefore, I wish that we could have real hygiene since it helps give better error messages.

I do think that there should be ways to break hygiene, but hygiene should be the default. I've also found that breaking hygiene is almost always unnecessary when you combine it with a feature like syntax-parameters in Racket.

Maybe it would even make sense to initially limit macros to something like syntax-case which enforces hygiene with later plans to add in more complex macros which can break hygiene.

Finally, for those who think that macros are impossible to debug, I challenge you to check out Racket's macro debugger which allows you to see each macro individually get expanded.

Please, no.

Add this and the next thing you know is every new library has a gratuitous badly designed DSL.

But this is Python, not Ruby.

Also, MacroPy already exists https://macropy3.readthedocs.io/en/latest/. Which is great! But let's not bless this as a standard part of Python.

A bunch of badly designed DSLs already exist, we just call them APIs or maybe fluent APIs instead of DSLs though they are pretty much identical.

Pandas, Tensorflow, numpy, etc. are all such complex APIs that they are indistinguishable from DSLs. That they happen to cooperate to a greater or lesser degree with one another and the rest of Python doesn't change that fact. Most Racket DSLs that aren't teaching languages allow for using almost all of Racket's features in addition to some new ones, and today it is generally more popular in the community to create mini-languages, which are basically just a set of macros and functions that work within Racket.

Macros actually make the DSL situation better because it heavily discourages the nonsense YAML/JSON DSLs that everyone despises. Instead of thinking of the complexity macros will introduce, think of how many YAML DSLs will be killed instead!

I think one of the best things about Python currently is that it prevents people with this mindset from running amok :)

> Pandas, Tensorflow, numpy, etc. are all such complex APIs that they are indistinguishable from DSLs.

The advantage of their current APIs, what distinguishes them from the DSLs we're talking about, is exactly that they're "just Python". You may need to consult the docs or the src to understand what a method does... but at least it's just a method, you know how to call it, how to pass args in etc. It has the same level of abstraction as the rest of your code, it is composable on that level.

IMHO Racket is an example of what to avoid. Sure it has great tools for literally building your own #lang. But even in 'vanilla' Racket code... I did a bit of Racket recently and it was constantly frustrating to find in many libraries things that look like, and could have been, just regular functions instead are syntax macros.

Suddenly you don't know how anything works. Where you think you are passing a string arg to a function... you find you can't substitute a func call returning a string, because it's not actually a "string arg to a function", it's a syntax macro that only recognises a string token in that position. Now you have to dig through the source code to understand what whimsical constructs the library author has inflicted upon you, maybe have to write your own macros to work around it at that level.

It's exactly this kind of gratuitous DSL nonsense I would not want to see in Python. It certainly turned me off Racket (and I wanted to like it - there's a lot to like!). Maybe it's part of the reason why Racket and LISPs generally are still niche languages, while Python is hugely popular.

> the nonsense YAML/JSON DSLs that everyone despises

Hey, at least it's not XML :)

There's also TOML and ProtoBufs...

I think these are all good choices, if all you need are data-structures.

If you need actual programming language features in your config that is when these become unsuitable. So rather than inventing your own half-baked DSL why not use one of the thoughtfully-designed "config languages" out there now like Dhall, Cue, Starlark. You won't have to document it or maintain it, and it probably has other useful features and just works better than what you would have made yourself.

Python doesn't suffer from config language crappiness. Every single library I see is configurable through its API first, JSON a distant and optional second place.

> Python is both expressive and easy to learn; it is widely recognized as the easiest to learn widely-used programming language.

Adding macros seems like a good way to destroy this reputation.

I am not convinced they are needed, and the examples provided leave me quite indifferent. But I haven't been enlightened by LISP yet so perhaps it's just me.

In particular,

> Many domains see repeated patterns that are difficult or impossible to express as a library. Macros can allow those patterns to be expressed in a more concise and less error prone way.

On the contrary, I believe such generic multi-domain libraries would be a hot mess to maintain. These abstractions would be way too high level, and undoubtedly different fields have significant, if small, different requirements.

> easy to learn

Used to be easy to learn but now it's just easy to get started.

> Used to be easy to learn but now it's just easy to get started.

What about it has got harder to learn?

To do something technical involves knowing some technical library, but I believe that was always the case.

As the language keeps growing, it gets harder. If my code gets illuminated by constellations of stars and double stars, or all my functions are decorated, most of my colleagues start giving me bad looks. In my humble opinion they are in the right. That's not what they signed up for when they became Python fans.

We also didn't sign up to perform dark magic in order to publish our applications to desktop.

> I believe such generic multi-domain libraries would be a hot mess to maintain.

I think you (or I) understood it backwards. They see macros as a way to avoid a multi-domain mess. Each domain could develop its own macros without having to worry about other domains, while in the base language everyone has to be considered.

In practice I have never observed a single domain in which Python's language constructs are terribly insufficient.

I don't think the bar is at "terribly insufficient" to be honest.

Python has one of the best ORMs, the best web dev framework, the best REST framework, the best numeric processing library, the best machine learning libraries, and just all around excellent libraries for any domain you could possibly imagine.

I have never thought, gee, I wish I had macros. I have thought a hundred times more often about multi-line lambdas, and even so the number of situations where they would make my life easier vs being a source of confusion is very, very small.

It's an essentially finished language at this point. The library warts are not worth it given the amount of language machinery you'd have to invent.

i actually think both Django and Tensorflow would benefit from macros. in both cases you spend a lot of time writing what's really a DSL embedded into Python (defining models using a magic metaclass thingy / defining computation graphs, basically writing ASTs by hand).


for another example, consider

  Foo = namedtuple('Foo', ['a', 'b'])
where the namedtuple function actually has to hack around the fact that it's not a macro: https://github.com/python/cpython/blob/644e94272a89196801825...

tldr: Foo needs to have

  __module__ = 'yourmodule'
  __qualname__ = 'yourmodule.Foo' 
or else stuff like Pickle won't work right. so namedtuple has to reach into the caller's stack frame to see what module it was called from! this wouldn't be an issue if it were a macro that just expanded to a class definition.

So, I agree that it would look slightly better if it were a macro.

Having said that, is it worth the additional language complexity and opening a Pandora's box of footguns? No. The idioms are not _ideal_, but I would not consider that to be dramatically painful, nor would I even say it slows down development much.

I think it's very difficult to improve idioms at this point without adding complexity in the language to the point where, IMO, it's no longer worth it. I think a strong point of Python vs other languages in the same class is that there's a well-defined limit, both as a community language and language capabilities, as to how much complexity and rope you're willing to give the user.

As with spice in the kitchen, a bit of this feature, wisely used may add savor to the project.

I'll definitely use this in conjunction with pytest, so that there are test cases and data documenting what is going on.

Less seriously: let the International Obfuscated Python Programming Parades begin!

I feel like Python is losing its way, its maintainers seem to feel the need to "compete" by implementing poor copies of other languages' syntax. The "walrus" assignment operators, the proposed pattern matching, now this...

I assumed this change had the opposite motivation since people who would ask for more language features could now do them in macros while keeping the core language small.

Complexity is complexity, and a lot of it is irreducible. Pushing complexity up into libraries does nothing to ameliorate it. In fact it makes it worse, as having to work with libraries with different idioms makes the language a hodgepodge of slightly incompatible conventions.

In Python the complex machinery is in the language and the stdlib. All additional libraries that work with the language generally use stdlib constructs for data interchange, or at least base their work on it. That makes for an extremely pleasant programming experience.

Learning a different macro per dependency is going to lead to catastrophe. I know it's an exaggeration, but I see this in macro-based languages all the time.

This is why I believe a BDFL is needed: to nip crap like this in the bud.

Type hints already made the strict-typing sharks crave blood and start trying to turn Python into Java.

So you think that just because other people disagree with you, they shouldn't have a voice? What if Guido really liked the macro proposal and wanted to fast-track it?

That's right, we're not playing democracy here; go play democracy with the php, ruby and javascript idiots. Do we need a second Ruby or Lisp? The answer is clear.

Hurts my eyes just a liiitle bit. Methinks the MUSIMP/MULISP approach would be less painful: first you find a way to transform the language into S-expressions, and then you just write regular Lisp macros. This would be hidden from the regular Python user, of course. As I recall, the macro definitions were already totally unreadable when transformed into Algol/Python, so no worry about syntax.

This is a sound and logical approach, so it won't happen in Python.

With the level of lisp-hate out there, this approach might make this feature less likely to happen. It's quite a sound idea tho.

You do not actually need to see Lisp-code at all. In Musimp you could write a macro in Algol, but you had to understand properties of the S-code. Thus you would write 1+2 as LIST('+,1,2) which was then printed out as [+,1,2], +(1,2) or 1+2 depending on the situation and properties of the symbol '+.

There are so many good ideas that are trapped in the past, tens, hundreds, or thousands of years old.

Wow! I had no idea that Microsoft made a Lisp?! MuLISP [0]

I think there are two parts to implementing macros, and I am not really qualified, but this is the internet so here goes.

1. Getting access to the quoted parse tree

2. Writing code to generate, modify, or otherwise transform that parse tree.

So folks already do (2) by spraying code into a buffer and exec'ing or eval'ing it.

They do some of (1) by reading the textual src, the bytecode or the parse tree. Mentioned elsewhere in the thread are MacroPy and HyLang.

All of that said, I think what's wanted is a language with better semantics for deferred evaluation, or a pre-eval step (maybe it doesn't even need full macros). Who knows.

One thing that I found really powerful in Zig [2] and Dlang is the set of facilities around CTFE (compile-time function evaluation) [1] and how they obviate many of the same problems.

Minor nit: I think the PEP could have done a better job of showing not just HOW it could be used, but which EXISTING programs could be made simpler, or could achieve things not possible with the current system.

I do think that some sort of lightweight code generation step could both reduce the lines of code, make things simpler for the outer layers of the stack and make things more performant. Obviously, they can go the other way, but limiting the upside because of possible downsides isn't something we should do with code. Python already has an infinite number of ways to have unmaintainable, inscrutable code, you should see my list comprehensions and my itertools code! :) What is another one if it captures more of the original essence of Python?

[0] http://www.edm2.com/index.php/MuLISP

> In addition to an interpreter and compiler the system includes an editor with a WordStar-like command set

Wait what? I am getting flash-forwards of a ColorForth like environment for Lisp using DOS graphics. This is like some retro-futuristic timeline where Microsoft went down the Lisp+Xenix path!

[1] https://en.wikipedia.org/wiki/Compile_time_function_executio...

[2] Which also mentions Terra, which I think should be required in any discussion around macros https://news.ycombinator.com/item?id=13761571


> I had no idea that Microsoft made a Lisp

They didn't. It was 'made' by the company Soft Warehouse and sold through Microsoft.

That syntax is ugly as sin, and I can't tell from the PEP exactly what the examples are doing, which is rare. I hope this doesn't pass, or if it does, only with major improvements.

Interesting idea, but it seems the motivation is lacking and the notation is a bit chaotic. The examples mainly show existing ways of doing things with extra ! characters, and don't actually seem to define most of the macros.

What is the killer app to justify this?

Could it e.g. give us better matrix notation for numpy and sympy?

The PEP appears to contradict itself about implementing the with statement using macros.

> It is possible to demonstrate potential language extensions using macros. For example, macros would have enabled the with statement and yield from expression to have been trialed.

But later, in the example showing an implementation of the with statement:

> The above would require handling open specially.

If the macro-based with statement implementation can only handle the built-in open function and not arbitrary context managers, it's much more limited and not really the same feature. Does this mean that macros as proposed here cannot actually be used to implement something like the with statement?


> Annotations, either decorators or PEP 3107 function annotations, have a runtime cost even if they serve only as markers for checkers or as documentation.

But this runtime cost, if the decorator returns the original function, is essentially the cost of one function call when the module is loaded. If the decorator doesn't put a wrapper around the original function, there is no runtime cost when the function is called, only when it's defined. You would need a huge number of decorated functions for this to have a detectable effect on module loading performance, I'd expect.
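
Concretely, a marker-style decorator that returns the function unchanged costs one call at definition time and nothing per call:

```python
def marker(func):
    # Record metadata on the function and return it unchanged: the
    # only cost is this one call when the module is loaded.
    func._marked = True
    return func

@marker
def add(a, b):
    return a + b

# No wrapper was inserted, so calling add() is exactly as fast as
# calling an undecorated function.
print(add(1, 2), add._marked)  # 3 True
```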

I'm puzzled by this part:

> Runtime compilers, such as numba have to reconstitute the Python source, or attempt to analyze the bytecode. It would be simpler and more reliable for them to get the AST directly:

  from! my.jit.library import jit

  def func():
      ...

If you write this instead with a decorator:

  from my.jit.library import jit

  @jit
  def func():
      ...

then the jit decorator can ask for the function's AST easily via the standard AST library.

It's not that straightforward under the hood. Python doesn't keep an object's source code around.

inspect.getsource(func) uses func.__code__.co_filename and func.__code__.co_firstlineno to figure out where the definition of func starts, runs that part of the file through a tokenizer, and uses heuristics to guess where the definition stops.

Sometimes it guesses wrong, particularly if lambdas are involved. Sometimes the code moved around in the meantime. Sometimes the code never even existed on disk, for example because it was entered into a REPL.

That's what they mean by "reconstitute the Python source".
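
The "never existed on disk" case is easy to demonstrate: `inspect.getsource` raises `OSError` for a function compiled from a string:

```python
import inspect

ns = {}
exec("def ghost():\n    return 1", ns)  # source only ever lived in a string

try:
    inspect.getsource(ns["ghost"])
    recovered = True
except OSError:
    # There is no file to tokenize, so the source cannot be reconstituted.
    recovered = False

print(recovered)  # False
```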

Thanks. I know I have used this decorator approach in the past to grab the ASTs for Python functions, but I guess my scenarios were all on the happy path. I see now where this might not be bulletproof.

The whole point of Python is to eschew 'magic' in favor of dumb explicitness, and this is about as big a conflict with that as you can get. Python lacks statement lambdas not because they've never been considered, but explicitly because "if it's that important it deserves a name (as an inner function)". Type annotations work so well in Python, as compared to Ruby or Clojure, exactly because of this philosophy. The PEP seems to make no mention of things like reproducibility, IO during macro expansion, static analysis like in mypy, or any of the other realities of things like this. If macro expansion were limited to something carefully statically interpretable, like constexpr, I'm cautiously receptive to it. Or, if it genuinely positions itself as 'for experimenting with language features', I'd be fine with it being enabled only in debug builds. But in lieu of those, the instant it's released generally, it's a Python 4 flag day.

Conceptually, I think this makes sense. People already bend Python into horrendous DSL-esque abominations using the flexibility already afforded. Macros might allow for these contortions to be more principled and vettable.

OTOH, I am not keen on this opening the door to even more stuff newbies have to learn, especially in the ML domain, where you just know people will use macros to do all manner of "clever tricks".

This proposal also seems too limited to truly enable the kind of performance speedups numerical Pythonistas really want.

The ! notation feels icky and "un-python" to me. Maybe that's just (lack of) mere exposure effect.

What I want instead: improve/stabilize tools and CPython hooks for decompiling, code generation, AST manipulation, and the like. When I can go source -> AST -> source with confidence, maybe let's revisit macros.
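Since 3.9 the stdlib does offer ast.unparse for the AST -> source leg, but it is lossy, which is part of why that confidence is hard to come by:

```python
import ast

# Round-trip source through the parser and back. Precedence parentheses
# are reconstructed, but comments and original formatting are discarded.
src = "x = (1 + 2) * 3  # this comment will not survive"
round_tripped = ast.unparse(ast.parse(src))
print(round_tripped)
```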

I think there is a lot of potential here from the perspective of improving type hinting. Decorators, which execute at runtime, often make type inference very challenging; this could go a long way toward making that better. Further, it could enable more performant static compilation, should one want to do that.
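To illustrate the decorator problem (a sketch; `logged` is a made-up example): a loosely typed or untyped decorator hides the wrapped signature, so static checkers see the result as taking and returning anything unless ParamSpec gymnastics are used. A compile-time macro expansion would instead leave ordinary code for the checker to analyze.

```python
import functools

def logged(func):
    # After decoration, a checker can no longer see add's (int, int) -> int
    # signature through this untyped wrapper.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print("calling", func.__name__)
        return func(*args, **kwargs)
    return wrapper

@logged
def add(a: int, b: int) -> int:
    return a + b

print(add(2, 3))  # calling add, then 5
```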

sadly, it looks like this still won't get us multi-line lambdas, because macro-expressions can only take other expressions as arguments :(

(though the PEP wasn't really clear on this. the grammar rule they give is:

  macro_expr = MACRO_NAME "(" testlist ")"
but i wasn't able to find a definition for `testlist`.)



but taking inspiration from the PEP's reimplementations of python keywords, we could (maybe? kinda?) add macros for expression-oriented versions of statements, and just use those instead! really hope this gets accepted, can't wait to write some truly appalling macros >:D



also hey, at least it looks like it can handle do-notation!

  x = do!(
    a := foo(),
    b := bar(),
    pure(a + b)
  )
i guess you could even use

  a <= foo()
for binding. i'm sure everyone would love that ;)

This PEP is the most vaguely defined proposal I've read so far. It's funny how statically typed languages want to drop macros and a second layer of semantics in favor of the primary language, like Zig. And yet somebody wants a second language, with all its attendant complexity, in mega-dynamic Python.

The PEP author didn't give a thought to why the awesome and flexible Lisp-like family of languages sits at the roadside of industry, and why Clojure, the most popular of them, ranks 24th on GitHub and is declining. Just think about why this pathetic graveyard has no potential to offer any AI/data-science library. https://en.wikipedia.org/wiki/List_of_Lisp-family_programmin...

> Upon encountering a macro during translation to bytecode, the code generator will look up the macro processor registered for the macro, and pass the AST, rooted at the macro to the processor function. The returned AST will then be substituted for the original tree.

This seems limiting. Just the AST at that point? No more context (potentially alterable) than that?

EDIT: I have a use case in mind, a Haskell-style (kinda) where clause. For such “trailing” syntax it would be necessary to introduce a placeholder higher up the tree
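For comparison, the runtime ast module already supports exactly this tree-in, tree-out shape; a minimal sketch of such a transform (not the PEP's compile-time hook, and `SwapAddToMult` is a toy name):

```python
import ast

class SwapAddToMult(ast.NodeTransformer):
    """Toy AST-to-AST transform: rewrite a + b into a * b."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # transform children first
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

tree = ast.parse("result = 3 + 4")
tree = ast.fix_missing_locations(SwapAddToMult().visit(tree))
ns = {}
exec(compile(tree, "<macro>", "exec"), ns)
print(ns["result"])  # 12
```

The limitation the parent comment points at is real, though: a transformer like this sees whatever subtree it is handed and nothing above it, so "trailing" syntax that needs to rewrite an ancestor has nowhere to hang.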

Reading the motivation about the "high cost of adding a new feature" to a programming language puts me in mind of the first line of the Scheme language standard: "Programming languages should be designed not by piling feature on top of feature, but by removing the weaknesses and restrictions that make additional features appear necessary."

Interestingly, to a large extent, it is macros that makes that approach possible.

As far as I understand, the code inside the macro has to be syntactically valid Python?

I think it would be cool to have sql baked directly into Python, something like:

    t = pd.DataFrame(mydata)
    t = select new_col=col1 + col2 from t
But I don't think this PEP would support that, since "t = select new_col=col1 + col2 from t" isn't valid Python syntax.
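Right — the macro's arguments still go through the ordinary parser first, and that clause is rejected before any macro processor could see it:

```python
import ast

sql_like = "t = select new_col=col1 + col2 from t"
try:
    ast.parse(sql_like)
    print("parsed")
except SyntaxError as exc:
    # The parser bails out long before macro expansion could run.
    print("not valid Python:", exc.msg)
```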

I've been working with Python for deep learning for the last ~4 years and I can't think of a time where I truly needed this.

I find it very worrisome that the PEP author talks about Python getting useless features to then go on and present another arguably useless feature. It gives a very "just one more" feel to the whole thing.

> For example, until recently flow control within the try-finally and with statements was managed by complicated bytecodes with context dependent semantics. The control flow within those statements is now implemented in the compiler, making the interpreter simpler and faster.

Anyone know where I can read more about this specific change? Seems very interesting for compiler / interpreter engineers.
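In the meantime, the dis module will show you the compiler's current output for such code. In recent CPython versions the compiler duplicates the finally block along each exiting path rather than driving it with context-dependent bytecodes like the old END_FINALLY (a rough illustration; exact opcodes vary by version):

```python
import dis

def f():
    try:
        return 1
    finally:
        print("cleanup")

# The disassembly makes the compiler-managed control flow visible:
# the finally body appears inlined on the return path.
opnames = [ins.opname for ins in dis.get_instructions(f)]
print(opnames)
```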

Before everyone gets outraged, reading through the whole PEP (proposal, like an RFC) is pretty quick. What is interesting is how it changes the semantic weight of comments: a good use of Wadler's Law.

the garbage feature creep flood gates have opened

I see the CLOS transition is near full completion
