
The Python philosophy is that macros do more harm than good, making it harder for someone to read/understand your code in the long term. These and other "constraints" make the code more accessible to others and even yourself.

Macros are just another abstraction tool. Used poorly, an abstraction tool makes code harder to read; used well, it makes code easier to read.

Here's a function that makes code harder to read:

def sumAList(aList): return 7

This doesn't mean that functions are bad.

Now there is an argument that macros make code harder to read, in that I've yet to see a really good macro system that isn't dependent on the code having very little syntax (e.g., S-expressions): the more the code differs from the AST, the harder it is to manipulate the code successfully.

Combined with the fact that more syntax can make code much easier to read, there is a conflict here.

However, I don't think that's the argument you are making.

The argument is that macros have non-local effects; they interact with the code in which they are applied. This means that the macro definition and the code surrounding the macro invocation can't necessarily be understood in isolation.

This is also essentially the argument against global variables.

A function definition is, in general, far away from the function call. If you think this problem is more severe for macros than for functions, then you should articulate why. You may well have a very valid point in your mind but I think it needs to be expressed somewhat more specifically.

For non-hygienic macros, it's essentially the variable capture problem.
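To make the capture problem concrete, here's a toy sketch in Python (my own illustration, using string splicing to stand in for a non-hygienic macro): the "macro" smuggles in a temporary name `tmp`, which silently collides with a caller's variable of the same name.

```python
# A toy non-hygienic "macro": it builds a swap statement as text and
# introduces a temporary name `tmp` of its own.
def swap_stmt(a, b):
    return f"tmp = {a}; {a} = {b}; {b} = tmp"

# Fine when the caller has no variable named `tmp`:
x, y = 1, 2
exec(swap_stmt("x", "y"))
print(x, y)  # → 2 1

# But if the caller's own variable happens to be named `tmp`, the
# macro's temporary captures it and the swap silently fails:
tmp, y = 1, 2
exec(swap_stmt("tmp", "y"))
print(tmp, y)  # → 2 2, not the intended 2 1
```

Hygienic macro systems rename such introduced identifiers automatically, which is exactly what this sketch lacks.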

For hygienic macros, I don't know of a good argument that they are inherently more difficult to understand separately from their invocation than a function.

(I'm not personally arguing against macros - or global variables for that matter - just trying to state the argument).

I think it should be obvious that you should use a programming construct only when the difficulty of understanding it is less than the difficulty of understanding the code without it (over the whole program). This applies to functions, classes, macros, frameworks, etc.

Full macros (like in CL, where they are just functions that don't evaluate their arguments) give the programmer the same power as a compiler writer or programming-language designer.

ps. To really get the benefit of Lisp macros, you would need to standardize a code walker. Without a code walker, macros can't reach their full potential.

What is a code walker?
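Roughly: a tool that traverses code-as-data and can rewrite it before compilation. Python's standard `ast` module happens to ship a minimal one; a rough sketch (my illustration, not anything standardized in CL) that walks a parsed expression and renames every variable `x` to `y`:

```python
import ast

# A NodeTransformer is a small code walker: it visits every node of the
# parsed tree and may rewrite it in place.
class RenameX(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "x":
            node.id = "y"
        return node

tree = ast.parse("x + x", mode="eval")
tree = ast.fix_missing_locations(RenameX().visit(tree))
print(eval(compile(tree, "<ast>", "eval"), {"y": 21}))  # → 42
```

A macro that needs to understand the code handed to it (not just paste it somewhere) needs exactly this kind of traversal, which is the point about standardization above.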

If you want local reasoning, you will have to reject Turing completeness; otherwise you could implement a language with global variables and eval a bunch of code in that language.
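A sketch of that objection in Python (my own toy, using the built-in `eval` as the embedded language): even if the host language banned globals, an embedded interpreter can reintroduce them through one shared environment.

```python
# One mutable environment shared by every "program" we eval — a global
# variable in the embedded language, regardless of what the host allows.
env = {"counter": 0}
for src in ["counter + 1", "counter * 10"]:
    env["counter"] = eval(src, {"__builtins__": {}}, env)
print(env["counter"])  # → 10
```

The two snippets of embedded code can no longer be understood in isolation, which is exactly the non-local-effects complaint.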

I think we shouldn't be limiting our tools. We should instead limit their use. Global variables can be nasty, but it's nice to have them when your code is best expressed with global variables. Same for macros.

Yes, but any given function is usually extremely easy to understand, because it's only, say, one level of abstraction, while a macro is a few levels higher than that. And as you go up in levels of abstraction, it gets harder and harder to really understand what's going on. Sure, some macros are intuitive and easy to follow, but those are usually easily replicated with other constructs, especially in a dynamic language like Python.

I also don't mean to imply that Lisp is lesser for using macros. I love Lisp and any implementation clearly requires macros. But Lisp is also, undeniably, harder to read for this and other reasons.

Hmmm. I think your argument is roughly that there is a "sweet spot" for abstraction when it comes to readability?

That is, a point at which less abstraction makes the code harder to read, and more abstraction also makes the code harder to read?

I will agree with this in specific cases (i.e. for any given solution, there is a point at which adding abstraction can't improve readability), but I'm not certain I agree in the general case (i.e. that using macros cannot improve readability).

I guess it also depends on what you mean by "really understand what's going on." I started out programming in C. Now in C, if you know your compiler well, you can predict fairly accurately what binary code will be generated when you compile with optimizations off. Moving to higher-level languages, you lose this ability and no longer "really understand what's going on."

For systems programming, I may still use C to get this advantage. For other problem domains, I sacrifice this knowledge because representing my problem more tersely in a higher level language makes the code more readable and easier to understand. Now I will never know exactly which instructions will be executed when I write in Python.

Similarly, with sufficiently fancy macros, I may not know what Lisp code is generated, but if the macros do what they say they do, they can make my code less verbose, more understandable, and easier to maintain. There are times when really understanding what is going on trumps terseness, and at those times I don't use macros.

Also, I love Python. It embeds well in C (which is where my original background is), and it has very good portability, and a good set of libraries.

I also implied, but didn't say straight out, that Python has a good reason for not having macros: part of its design is to look like pseudo-code. See also Norvig's comment on the OP. Macros that operate on text rather than trees (the C preprocessor, m4, etc.) are far more error-prone, and probably a Bad Idea. Therefore, if you want your language to look like something other than a tree, you have to forsake macros that operate on code as it is written. I have seen, for several languages (Python among them, I believe), Lisp-like macros that operate on the AST of the language. They have not caught on. I have several theories why this is so, but right now my preferred one is that it feels too much like you're hacking the compiler, and Compilers Are Scary.

right now my preferred one is that it feels too much like you're hacking the compiler

I think what you implied just before that is a stronger argument: it's way too distant from the base language, and this semantic distance is so costly that it's not worth the trouble.

"And as you go up in your levels of abstraction, it gets harder and harder to really understand what's going on."

There is a difference between abstraction and indirection. Just because you've added the latter doesn't mean you've gained any of the former.

"Sure, some macros are intuitive and easy to follow, but those are usually easily replicated with other things, especially in a dynamic language like Python."

How would you implement SETF in Python? Or how about compile-time link checking for a web application (http://carcaddar.blogspot.com/2008/11/compile-time-inter-app...)?
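The SETF point is easy to demonstrate: CL's SETF is a macro precisely because it must see the *place* form unevaluated and rewrite it into a store. A hypothetical Python function (a sketch of mine, not real library code) can never get at that form:

```python
# Attempted SETF as a plain function. By the time setf() runs, the
# "place" argument has already been evaluated down to a value, so
# there is nothing left to assign to.
def setf(place, value):
    place = value  # rebinds the parameter name only; caller unaffected

d = {}
setf(d.get("k"), 1)   # d.get("k") evaluates to None before setf runs
print("k" in d)       # → False: the dict was never touched
```

In CL, `(setf (gethash "k" d) 1)` works because the macro receives the `(gethash "k" d)` form itself and expands it into the corresponding store operation.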

At the International Lisp Conference last year, I put in an evening event called the "Great Macro Debate", in which these issues were discussed. (We encouraged humor, and flaming as long as it was witty, so it was a lot of fun.) What you say is true to an extent. Macros, like most things, can be abused. If you have a group of Lisp programmers, one thing you can do is have the junior ones request advice from the senior ones about what constitutes "tasteful and idiomatic" use of macros, and vet particular macros, since it is rather hard to crisply "pin down" just what those things mean.

I wish there were a language called Harmless, which common folk could use to express common thought without any fear. It would go like this:

do this do this and that and that too do this amen

No branching nor decision trees as not to confuse common folk. Now programming is a socially acceptable activity!
