
Is it really that much worse than C or JS? Most popular languages other than Python use () and {}; <> is just for generics, as in many other languages; " is used for strings; - is just a minus sign; and _ is used when you need to ignore something when pattern matching, just like in Python.

Yes, Rust is a very symbol-rich language, but it is also very information-dense, which makes it easier to read than many other languages once you learn it.




Information density isn't the end goal, though; otherwise we'd all be writing in something like APL or its derivatives (K/Q/J/etc.). And yeah, if you have ever missed a & in C++ you can easily see why a single-character symbol can be annoying (I really prefer C#'s approach where you not only use ref and out in function declarations, but also during function calls - though overall I prefer Pascal's more explicit and verbose syntax).
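
For what it's worth, here's a minimal C++ sketch of that call-site problem (function names are made up): both calls look identical, so only the declarations tell you which one copies and which one can mutate. In C#, the mutating call would have to be spelled with an explicit ref at the call site.

    #include <vector>

    void by_value(std::vector<int> v) {}   // receives a private copy
    void by_ref(std::vector<int>& v) {}    // aliases the caller's vector

    int main() {
        std::vector<int> v(1000000);
        by_value(v);  // silently copies a million elements
        by_ref(v);    // no copy, may mutate v - yet the call reads the same
    }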


I think the goal programming languages optimize for is somewhere between, on the one hand, "learnability" / "time to fluency" and, on the other, "power/expressivity when fluent." Multiplying these measures together gives you an area, measured in "productivity over time of each man-hour invested into the project by a population of developers at various stages of fluency with the language" (where, for an average developer on the project, any given man-hour is partially spent writing code and partially spent becoming more fluent in the language).

APL-likes (and Forth-likes, and Lisps when you use the macro features) are a bit too far out on the "power when fluent" axis, at the expense of learnability, so the resulting area is small. They might be good for one-man projects, but not for large ones.

Minimal languages like ASM or Java, where there are just a few primitives and everything else is design patterns, instead pair high "learnability" with low "power when fluent", and so also have a reduced optimization product.

Most languages aim somewhere in the middle, though often with a bias toward the side where their designers think the optimum truly lies; for example, Go is slightly on the "learnable" side, and Rust is slightly on the "power when fluent" side, but both seem to be generally more productive per average-programmer man-hour invested into FOSS projects than languages that more heavily favour just one axis.

(Side-note: I'd love to see some real numbers crunched on this. A comparison of rival FOSS projects implementing some shared standard in different languages would make for a pretty good "natural experiment" for looking at productivity. The first example that springs to mind for me personally is the pair of rival Ethereum clients Geth (= Go) and Parity (= Rust), but I'm sure you can think of your own examples.)


> where there are just a few primitives and everything else is design patterns

C++ is design-pattern-heavy and symbol-heavy at the same time.

The champion of design-pattern-heavy languages would be JavaScript 1.5, with its lack of built-in language constructs (assuming people are working on a super-huge JS project with hundreds of developers, not a simple web app).


This isn't about learnability or time to fluency; it is about readability. The & issue I mention, for example, isn't about how easy it is to learn about references, but about how readable the code is when, e.g., you are reading some patch in a code review and miss that someone forgot it when returning a huge collection, which can cause severe performance issues.
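
A minimal C++ sketch of that kind of diff (names are made up): the two getters differ by a single &, and the one without it copies the entire collection on every call.

    #include <vector>

    struct Index {
        std::vector<int> entries;  // imagine millions of entries

        // Cheap: hands out a reference to the existing data.
        const std::vector<int>& get() const { return entries; }

        // The same body minus one '&' in the return type:
        // now every call copies every element.
        std::vector<int> get_copy() const { return entries; }
    };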

These issues are mostly orthogonal to how easy and/or powerful a language is.


My definition of "fluency" is that it's the point where every well-written piece of code in the language is easily readable to you. ("Fluency" of a spoken language is the point where you can speak it and parse-while-hearing it without thinking, so I think this is a sensible definition.)

By this standard, most programmers never achieve 100% fluency in even their favorite programming language, unless their favorite language is one of the dead-simple ones; and there are some programming languages (esolangs, certainly) that nobody is 100% fluent in; and many (the APLs, Forths, etc.) that only a few human beings on earth are 100% fluent in. (That's not to say that there aren't many people who can read and write them—just, not fluently. If you need to look at a programming-language reference more often than you need a dictionary to parse out the meaning of a sentence in your native spoken language, you're not fluent.)

A language that's less "readable", in your terms, is just one that takes longer to become fluent in. Given that the programmers working on any given project are going to be at a mixture of stages in their careers, and thus will have invested different numbers of man-hours into language fluency, there's going to be a mixture of fluency levels on any project (where the less-fluent readers experience the language as "less readable"). Longer time-to-fluency means that the same number of invested man-hours gets you less fluency, so on projects in languages with longer time-to-fluency, but the same distribution of programmer career levels, you'll see lower average fluency (or rather a skew toward the bottom of the distribution) and thus more complaints of low readability. Mind you, this isn't really a fact about the language itself, but a fact about the project: if everyone on a project is a 20-year veteran of the language, any language can be "readable."

I would note that we can measure time-to-fluency; it's objective. We can teach people a programming language, test them along the way, and see how many man-hours of study it takes to get 100% on the tests. We can't measure a language's "readability"; it's subjective. But it likely correlates strongly with time-to-fluency, so time-to-fluency is the measure you should care about if you want to A-B test programming-language syntax features for their effect on "readability."



