
It is hard to have empathy for a problem you don't have.

I have only recently become open to the idea that people might legitimately experience pain using an unfamiliar syntax.

For me, it is a non-issue. From Lisp to C to APL to Forth to Prolog, syntax was never an issue for me. I greatly enjoy learning languages with different approaches to syntax. It has never caused me pain. Only joy. Then again programming languages are my special interest.

It is kind of easy to dismiss complaints about syntax as intellectual laziness. After all, learning Lisp-style syntax takes maybe a few hours tops; how can that be a problem? Syntax is just the easiest thing to criticize, but would these people really learn the language if it had curly braces, or is it just an excuse?

I don't know. I am the kind of person whose day gets ruined by an app minimally changing its UI, so maybe I shouldn't be too judgy about syntax-sensitive people.






It’s not about an unfamiliar syntax. S-exprs are objectively hard to read. The same shape for function calls, macros, blocks, and data means I can’t distinguish them by sight to detect the code structure. I have to do conscious paren matching.
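To illustrate the claim (with standard Scheme/Common Lisp forms, not code from this thread): four different kinds of construct, one visual shape.

  ;; A function call, a special form, a macro definition, and quoted
  ;; data all look alike until you recognize the head symbol.
  (sqrt 2)                ; function call
  (if (< x 0) (- x) x)    ; special form (control flow)
  (defmacro twice (e)     ; macro definition
    `(progn ,e ,e))
  '(1 2 3)                ; quoted data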

Structure recognition should be pushed as far down into the subconscious as possible. Rainbow parens help, but not nearly enough to stop other expression fragments from jumping into attention. Likewise Clojure's different bracket types for data structures, likewise the editor highlighting the paren that matches the one at the cursor. Better than nothing, but incomparably worse than just having visually distinct syntax in the first place.

C-style is fine. Python-style, ML-style, SQL-style, BASIC, shell: all fine for structure recognition. But lisp is just a soup. Or a fog.

The same problem applies to Elasticsearch queries, too.


Nobody in Lisp does conscious paren matching. We rely on indentation. If you randomly deleted a parenthesis from some )))))), I would not notice. I would rely on the machine to tell me something is wrong.
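For instance, in an ordinary definition like this (a minimal Common Lisp sketch), a practitioner reads the indentation and treats the trailing run of close parens as noise; a missing one is caught by the editor or compiler, not by eye.

  (defun classify (n)
    (cond ((< n 0) :negative)
          ((= n 0) :zero)
          (t :positive)))   ; nobody counts these by hand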

Most "Lisp is unreadable" forum posting activity is just trolling by nonpractitioners. You can usually tell because it doesn't hit on the real readability issues, only the imaginary ones.

To read a Lisp dialect, you have to know what numerous words mean.

Even a seasoned Lisp coder cannot read the following, beyond understanding its source-code structure as data. I would guess that defcrunk foo defines something named foo, which is a crunk, whatever that means. Beyond that, I have to be looking at the documentation of defcrunk (or asking AI).

  (defcrunk foo (splat zing)
    (:crom jang)
    (:flit (burch culd)))
I suspect that a good proportion of Lisp is like this for newcomers.

Sometimes people make new Lisp dialects because they don't like the words and want to make up their own. It might not be their main reason, but it figures in there. Someone else then has to learn those words, including people who already know an existing Lisp or two.

Also, my defcrunk thing above will parse in many a Lisp dialect, and I could write a macro to make it work in some way. Trolls about Lisp have really latched on to this one. Their leader, a front-end web developer, wrote a little article about an imaginary curse ...


Learning s-expressions and the evaluation rule takes about 13 seconds. It’s one of the very first things in SICP. Learning how to use the loop macro, well, I’m still working on that.
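For the curious, loop is a whole embedded language of its own. A small, standard Common Lisp example:

  ;; Collect the squares of the even numbers from 1 to 10.
  (loop for i from 1 to 10
        when (evenp i)
          collect (* i i))
  ;; => (4 16 36 64 100)

Note that almost none of it follows the ordinary s-expression evaluation rule: for, from, to, when, and collect are loop keywords, not function calls.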

> It is hard to have empathy for a problem you don't have.

I do not find this myself, and it is a standard part of the design of a lot of commercial software: intentionally thinking about disabilities, impairments, and difficulties, and working on accommodating people who struggle with these things.

Examples:

* GUIs that flash the screen to signify a bell sounding for deaf users.

* Keyboard-operated GUIs for users with motor or sensory impairments who can't use pointing devices. (Blind users can't see a mouse pointer so can't use one.)

* UIs with alternate colour schemes for people with colour-blindness.

It has long been a source of irritation for me that FOSS tools are so resistant to implementing this.

> I have only recently become open to the idea that people might legitimately experience pain using an unfamiliar syntax.

Good gracious. That is a surprise to me.

I love the _idea_ of Lisp and have written about it at length, but I find it, and APL, and many other languages, impenetrable.

For ordinary non-technical people, "algebra" is a synecdoche for "something that is really hard to understand".

Algebraic notation is just about the maximum level of mathematics that many non-specialists can handle. The idea of a letter or symbol standing for any number so that it is possible to reason about arithmetic without specifying the numbers being manipulated is brain-bending for the majority of people.

And yet, this is the sine qua non of programming languages. It's the first level you must master.

C is a very simple language. It has terse notations for common operations, such as incrementing a value.

It is not "low level". It vaguely represents the machine architecture of a PDP-11 from 50 years ago. It's nothing like any 21st century CPU. It is not "close to the metal". It is not "portable assembler".

But it's about as simple as a lot of people can handle, so thousands love it.

Go lower level -- to assembly language -- and you put off the majority of those who aren't genius-level.

Go higher-level, to matrix maths or to working directly in lists and ASTs, and you put off loads more.

Go sideways to working with stacks, like Forth, and you dissuade a load more.

Eliminate arithmetic precedence with RPN and you alienate thousands more. A few love their HP calculators, or Postscript, but most can't handle it.

And they might not know why they can't, but they are angry when they are told to just ignore an unscalable wall. Put an impassable barrier in someone's path, tell them it will fade away and stop being noticeable, and what would you expect but anger and resentment?

This is why I argue that BASIC has great merit that is missing from things like Python. Not syntactic whitespace: Python mixes text with code in output, it forces beginners to deal with abstract concepts like "editors" and "files", plus the ubiquitous OOPS -- all things that are meaningless to beginners.

BASIC replaced this with the simple brilliance of _line numbers._

And so the pros hate it and take pride in hating it -- because contempt culture is endemic in software.

https://blog.aurynn.com/2015/12/16-contempt-culture

C, also, is extremely dangerous, and this appeals to the machismo of stereotypically nerdy geek types, who lack the conventional signs of machismo.

Thus, take C, make it safe by removing all the dangerous bits, but keep that terse syntax, and the result is Java -- loved by thousands of workman coders who are untrained but like an easy tool. Much of world business is glued together in Java.

But it lacks the element of danger so the macho nerds detest it. Contempt culture again.


For years I maintained an ERP system written in BASIC. It was legitimately terrible. The lack of lexical scoping and decent flow control made it nearly impossible to reason about for even simple changes. There’s nothing brilliant about line numbers and all-global variables, it’s literally one of the laziest, sloppiest design decisions you could make in a programming language.

Granted, if your only experience of BASIC is fond memories of the TRS-80 at your grade school, yeah any contempt for it will seem simplistic and poorly motivated. Maybe try having to depend on it for literally anything in a professional setting and you’ll have a more informed perspective.


1. I helped maintain a payroll system written in GW-BASIC for a year or two. It was designed cleanly and well, with each module of code a separate file chain-loaded in from disk as a de facto overlay system. The variables that each module inherited were thoroughly documented in code comments at the top of each module, so it was clear what each would inherit in RAM and which variables you must not touch.

Just as one can write spaghetti code in any language, one can equally write good clean code in any language.

2. Remember what the _B_ in BASIC stands for. Yes, it was used professionally despite that, but in any and all fields there is a need for easy tools for beginners to learn with.

https://www.fortressofdoors.com/take-the-pedals-off-the-bike...

As discussed here:

https://news.ycombinator.com/item?id=42697467


The overlay system is an even worse paradigm than global variables within a program. And one I too am very familiar with.

What happened here was that this payroll system of yours was written by people who clearly saw the danger of BASIC's many footguns and mitigated it the best way they could: by supplementing their code with vast quantities of English prose. This is not an argument in favor of BASIC.

> Just as one can write spaghetti code in any language, one can equally write good clean code in any language.

This is only half the truth. It's like saying it's equally possible to drive well or poorly in a 1990 Yugo as it is in a 2025 Accord. The car actually does matter in a lot of cases. I despise Python's tooling, but if one were to do a thoughtful rewrite of your old payroll system in Python, all those code comments would be completely unnecessary (it would be impossible to commit the kind of error that the comments were needed to warn against), and the code itself would be far easier to read and maintain.


> the code itself would be far easier to read and maintain.

As it happens, my boss, who wrote the app, quit, started his own company, and rewrote it from memory. He did it in QuickBASIC 4: it became a compiled binary app, in structured code. That made it more readable, sure; the point being that it wasn't necessary to rewrite it in another, unrelated language to get that win.

But QB was a cut-down version of the MS BASIC Professional Development System. It was intended as a pro tool, not as a thing for learners.

In other words:

• Don't mix up simple BASICs intended for beginners with pro-level ones.

• Don't assume that a more "serious" language means more readable code, because it doesn't.

• The flipside: don't assume that more readable code needs a more serious language. It doesn't.

None of your criticisms address what I'm trying to say, which is that BASIC was in its time a good tool for beginners, and the evangelists of, say, Python have failed to understand _why_ it was good at what it did. Python is _not_ a globally better thing.

Tools that are good for pros may be bad for beginners, just as tools for beginners can be bad for pros. This is surely not a stretch or a controversial statement.



