
I long for a language which has a basic featureset, and then "freezes", and no longer adds any more language features.

You may continue working on the standard library, optimizing, etc. Just no new language features.

In my opinion, someone should be able to learn all of a language in a few days, including every corner case and oddity, and then understand any code.

If new language features get added over time, eventually you get to the case where there are obscure features everyone has to look up every time they use them.




Common Lisp seems to tick the boxes. The syntax is stable and it doesn't change. New syntax can be added through extensions (pattern matching, string interpolation, etc). The language is stable, meaning code written in pure CL still runs 20 years later. Then there are de-facto standard libraries (bordeaux-threads, lparallel,…) and other libraries. Implementations continue to be optimized (SBCL, CCL) and to develop core features (package-local-nicknames), and new implementations arise (Clasp, CL on LLVM, notably for bioinformatics). It was rough at the beginning, but it's been a joy ever since.

https://github.com/CodyReichert/awesome-cl


A "very compact, never-changing" language will end up not being very expressive, and thus prone to boilerplate; look at Go.

Lisps avoid this by building abstractions from the same material as the language itself. Basically no other language family has this property, though JavaScript and Kotlin, via different mechanisms, achieve something similar.


I like to think that lisp is its own fixed point.


The Turing Machine programming language specification has been frozen for a long time, and it's easy to learn in a few days.

So has John von Neumann's 29-state cellular automaton!

https://en.wikipedia.org/wiki/Von_Neumann_cellular_automaton

https://en.wikipedia.org/wiki/Von_Neumann_universal_construc...

(Actually there was a non-standard extension developed in 1995 to make signal crossing and other things easier, but other than that, it's a pretty stable programming language.)

>Renato Nobili and Umberto Pesavento published the first fully implemented self-reproducing cellular automaton in 1995, nearly fifty years after von Neumann's work. They used a 32-state cellular automaton instead of von Neumann's original 29-state specification, extending it to allow for easier signal-crossing, explicit memory function and a more compact design. They also published an implementation of a general constructor within the original 29-state CA but not one capable of complete replication - the configuration cannot duplicate its tape, nor can it trigger its offspring; the configuration can only construct.


Such languages exist. Ones that come to mind offhand are: Standard ML, FORTH, Pascal, Prolog.

All of which are ones that I once thought were quite enjoyable to work in, and still think are well worth taking some time to learn. But I submit that the fact that none of them have really stood the test of time is, at the very least, highly suggestive. Perhaps we don't yet know all there is to know about what kinds of programming language constructs provide the best tooling for writing clean, readable, maintainable code, and languages that want to try and remain relevant will have to change with the times. Even Fortran gets an update every 5-10 years.

I also submit that, when you've got a multi-statement idiom that happens just all the time, there is value in pushing it into the language. That can actually be a bulwark against TMTOWTDI, because you've taken an idiom that everyone wants to put their own special spin on, or that they can occasionally goof up on, and turned it into something that the compiler can help you with. Java's try-with-resources is a great example of this, as are C#'s auto-properties. Both took a big swath of common bugs and virtually eliminated them from the codebases of people who were willing to adopt a new feature.
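Python's `with` statement is the same story: it took the acquire/try/finally idiom everyone was hand-writing and made it a language construct. A minimal sketch (the `tracked` resource is hypothetical, just to show the ordering guarantee):

```python
from contextlib import contextmanager

events = []

@contextmanager
def tracked(name):
    # Acquire: before "with" existed, every caller had to write this
    # as an explicit try/finally, and occasionally got it wrong.
    events.append(("open", name))
    try:
        yield name
    finally:
        # Release runs even if the body raises.
        events.append(("close", name))

with tracked("db") as r:
    events.append(("use", r))

# "close" is recorded even though nobody wrote an explicit finally.
```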


Prolog has an ISO standard... I am not sure if it's still evolving, but specific Prolog implementations can and often do add their own non-standard extensions. For example, SWI-Prolog added dictionaries and a non-standard (but very useful) string type in version 7.

That said, it is nice that I can take a Prolog text from the 1980s or 1990s and find that almost all of the code still works, with minor or no modifications...


Elixir?

From the v1.9 release just a few weeks ago: https://elixir-lang.org/blog/2019/06/24/elixir-v1-9-0-releas...

> As mentioned earlier, releases was the last planned feature for Elixir. We don’t have any major user-facing feature in the works nor planned. I know for certain some will consider this fact the most exciting part of this announcement!

> Of course, it does not mean that v1.9 is the last Elixir version. We will continue shipping new releases every 6 months with enhancements, bug fixes and improvements.


That's just an announcement that they reached the end of the list of user-facing syntax changes on their roadmap.


Interesting!

I imagine churn will still happen, except it will be in the library/framework ecosystem around the language (think JavaScript fatigue).


Most Elixir projects have very few dependencies because the Elixir and Erlang stdlibs are very batteries included. You don't typically reach for a dependency unless you need most of its features. Often you will reimplement the parts you need in your own code, except where it's reasonably complicated (pooling, DB connections, ORMs, web frameworks) or tricky to get right (security, password hashing, paxos).


Brainfuck has been extremely stable. You can learn every operator in minutes.


someone should be able to learn all of a language in a few days, including every corner case and oddity, and then understand any code.

Why should this be true for every language? Certainly we should have languages like this. But not every language needs to be like this.


Well, maybe not for every language, but probably for a language where simplicity has been a major feature.


I started using Python seriously only three years ago, after 30 years of other languages, and I didn't find it very simple. Maybe the core of the language is simple, but the standard library and many other important modules can be very complicated. Among similar languages, Ruby and JavaScript are far simpler.


JavaScript used to be simple... But Promises, closures, prototype chains, and npm/modules/minification/webpack have added a massive amount of complexity to being able to just read and understand a bit of code.


Javascript isn't simple any longer. And I'm not sure Ruby is that simple, not once you dig into the advanced features.


Verrrrrrry few languages in common use are like this.


All you're doing then is moving the evolution of the language into the common libraries, community conventions, and tooling. Think of JavaScript before ES2015: it had stayed almost unchanged for more than a decade, and as a result, knowing JavaScript meant knowing JS and jQuery, prototype, underscore, various promise libraries, AMD/commonjs/require based module systems, followed by an explosion of "transpiled to vanilla JS" languages like coffeescript. The same happened with C decades earlier: while the core language in K&R C was small and understandable, you really weren't coding C unless you had a pile of libraries and approaches and compiler-specific macros and such.

Python, judged against JS, is almost sedate in its evolution.

It would be nice if a combination of language, libraries, and coding orthodoxy remained stable for more than a few years, but that's just not the technology landscape in which we work. Thanks, Internet.


It's apples and oranges.

Python was explicitly designed, and for the vast majority of its nearly 30-year history it had a dedicated BDFL functioning as a standards body.

JS, on the other hand, was hacked together in a week in the mid-90s and then the baseline implementation that could be relied on was emergent behavior at best, anarchy at worst for 15 years.


Agreed, but the anarchy of JS was the result of a dead standards process between the major vendors that amounted to a de facto freeze. The anarchy is a direct result of a stewardship body not evolving the language to meet evolving needs.


The only frozen languages are the ones nobody uses except for play or academic purposes.

As soon as people start using a language, they see ways of improving it.

It isn't unlike spoken languages. Go learn Esperanto if you want to learn something that doesn't change.


Esperanto does change, in that new items of vocabulary are introduced from time to time. For example, 'mojosa', the word for 'cool', is only about thirty years old.


This is why a lot of scientific code still uses Fortran: code written several decades ago still compiles and produces the same output.

How long has the code which was transitioned to python lasted?


> How long has the code which was transitioned to python lasted?

A long time. 2to3 was good for ~90% of my code, at least
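For context, most of what 2to3 rewrites is this mechanical (an illustrative hunk, not from any real project):

```python
# Python 2 original:
#   print "total:", total
#   for k, v in d.iteritems():
#       print k, v
# After 2to3:
d = {"a": 1, "b": 2}
total = sum(d.values())
print("total:", total)      # print is a function now
for k, v in d.items():      # iteritems() -> items()
    print(k, v)
```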


Good for 90% of your code is not equivalent to getting precisely the same results from unmodified code written in the 80s.


More likely to mean 90% of projects, not 90% of each file, which would mean that every one was broken.


We will review that statement in 30 years!


Likely, we will review that statement in 2038 at the latest.


I have compiled Fortran programs from the 70s on modern platforms without changing a line. The compiler, OS, and CPU architecture had all disappeared but the programs still worked correctly.


Fortran has added a whole lot of features over time though.


But you can still compile F66 with the Intel Fortran compiler 2020 (and with other compilers as well).


This isn't that good of a metric for code utility. Sure, very-long-lived code probably solved the problem well (though it can also just be a first-mover kind of thing), but a lot of code is written to solve specific problems in a way that's not worth generalizing.

I write a lot of Python for astrophysics. It has plenty of shortcomings, and much of what's written will not be useful 10 years from now due to changing APIs, architectures, etc., but that's partly by design: most of the problems I work on really are not suited to a hyper-optimized domain-specific language like FORTRAN. We're actively figuring out what works best in the space, and shortcomings of Python be damned, it's reasonably expressive while being adequately stable.

C/FORTRAN stability sounds fine and good until you want to solve a non-mathematical problem with your code or extend the old code in some non-trivial way.

Humans haven't changed mathematical notations in centuries (since they've mostly proven efficient for their problem space), but even those don't always work well in adjacent math topics. The bra-ket notation of quantum mechanics, <a|b>, was a nice shorthand for representing quantum states and their linear products; Feynman diagrams are laughably simple pictograms of horrid integrals. I would say that those changes in notation reflected exciting developments that turned out to persist; so it is with programming languages, where notations/syntaxes that capture the problem space well become persistent features of future languages.

Now, that doesn't mean you need to code in an "experimental" language, but if a new-ish problem hasn't been addressed well in more stable languages, you're probably better off going where the language/library devs are trying to address it. If you want your code to run in 40 years, use C/FORTRAN and write incremental improvements to fundamental algorithm implementations. If you want to solve problems right now that those langs are ill-suited to, though, then who cares how long the language specs (or your own code) last, as long as they're stable enough to minimize breaking changes/maintenance?

This applies to every non-ossified language: the hyper-long-term survival of the code is not the metric you should use (in most cases) when deciding how to write your code.

My point is just that short code lifetimes can be perfectly fine; they can even be markers of extreme innovation. This applies to fast-changing stuff like Julia and Rust (which I don't use for work because they're changing too quickly, and maintenance burdens are hence too high). But some of their innovative features will stand the test of time, and I'll either end up using them in future versions of older languages, or I'll end up using the exciting new languages when they've matured a bit.


By the way, three decades have gone by since FORTRAN became Fortran.


From what I've seen, Go is the closest we have for mainstream language resistant to change.


Recently the Go team decided not to add the try keyword to the language. I'm not a Go programmer and was a bit stumped by the decision until I saw a talk by Rob Pike about Go's fundamental principle of sticking to simplicity first. [1]

One of the takeaways is that most languages and their features converge to a point where each language contains all the features of the others. C++, Java, and C# are primary examples. At the same time, complexity increases.

Go is different because of the simplicity-first rule. It eases the burden on the programmer and on the maintainer. I think Python would definitely profit from such a mindset.

[1] https://www.youtube.com/watch?v=rFejpH_tAHM


In my opinion, someone should be able to learn all of a language in a few days, including every corner case and oddity, and then understand any code.

"Understanding" what each individual line means is very different from understanding the code. There are always higher level concepts you need to recognize, and it's often better for languages to support those concepts directly rather than requiring developers to constantly reimplement them. Consider a Java class where you have to check dozens of lines of accessors and equals and hashCode to verify that it's an immutable value object, compared to "data class" in Kotlin or @dataclass in Python.
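For illustration, the Python version of that collapses to a few lines:

```python
from dataclasses import dataclass

# frozen=True gives an immutable value object: __init__, __repr__,
# __eq__, and __hash__ are all generated, and assigning to a field
# after construction raises FrozenInstanceError. In the hand-written
# Java version you'd have to read equals()/hashCode() to verify this.
@dataclass(frozen=True)
class Point:
    x: int
    y: int

p = Point(1, 2)
```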


Sometimes a language introduces a concept that's new to you. Then you need way more time. For example, monads: I understood the concept rather quickly, but it took a few weeks to internalize it enough to benefit from it.


Try C, maybe? It is still updated, but only with really minor tweaks and optimisations.

Also, the Common Lisp spec hasn't changed since the 90s, and it's still useful as a "quick and dirty" language, with little basic knowledge required. But the "basic feature set" can build everything, so the "understand any code" requirement is not really met. Maybe Clojure is easier to understand (it also has a more limited base feature set, with no CLOS).


C compilers like GCC and Clang have dialect selection options that work; if you pick -std=c90, you can write C like it's 1990.


Remember the Gang of Four book? Such books happen when the language is incapable of expressing ideas concisely. Complexity gets pushed to libraries, which you have to understand anyway. I'd rather have syntax for the visitor pattern or whatever else is there.
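Python did eventually grow a feature along those lines: `functools.singledispatch` covers the type-based dispatch that the visitor pattern simulates. A sketch (the `describe` function is made up for illustration):

```python
from functools import singledispatch

# Instead of a Visitor interface with one accept()/visit() pair per
# node type, dispatch on the argument's type directly.
@singledispatch
def describe(node):
    return "unknown"

@describe.register
def _(node: int):
    return f"int:{node}"

@describe.register
def _(node: str):
    return f"str:{node}"
```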


Python 2.7 is not far from that language.


What's stopping people from forking the language at python 2.7? Let the pythonistas add whatever feature they feel like while people who need stability use "Fortran python" or whatever.


Probably most of the people who like writing language interpreters understood that Python 3 fixed a lot of mistakes, so it would be more fun to work on.

Though I'm surprised nobody really wrote a transitional fork (six gets you a lot of the way but "Python 2.8 which has _just_ the str/bytes change" would have been useful).

Ultimately Python 2 isn't a better language, it's just the language everyone's code was in...


In my fantasy-py language there is no "str"; base types would be explicit: unicode(), bytes(). "something" could have an implicit u"". Composite types could be explicit. If I want a set of ints, I can use mypy now with s1: t.Set[int] = set(), but that's just linting.
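Modern Python annotations actually get part of the way toward this (the names here are just illustrative):

```python
# Annotations make base and composite types explicit, but they are
# only checked by external tools like mypy; CPython itself treats
# them as documentation.
s1: set[int] = set()
s1.add(1)
s1.add("oops")        # mypy would flag this line; the runtime won't
data: bytes = b"raw"
text: str = "decoded"
```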


> What's stopping people from forking the language at python 2.7?

If you don't want to change/add something to the language, then why fork it? You can just continue using it as it is!


The implementation needs to be maintained.



I truly wish this would become a thing. It's really frustrating having to update my installed packages and my code for some stupid change the language designers thought is sooo worth it. Just stabilize the bloody thing so I can do some work. Updating code so it meshes with the "latest and greatest" is _not real work_.


Fixing the entirely broken string/bytes mess in Python 2 was worth it by itself. For bonus points, old-style classes went away, and the language got a significant speed boost. And now it’s not going to die a slow death, choking on the past poor decisions it’s burdened with.
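To make the str/bytes point concrete, here's the behavior the redesign bought (a small Python 3 sketch):

```python
# In Python 2, "a" + b"b" silently produced a byte string, and
# implicit ASCII coercion blew up only on non-ASCII data at runtime.
# In Python 3 the types are distinct and mixing them fails loudly.
try:
    "héllo" + b" world"      # TypeError, caught immediately
    mixed = True
except TypeError:
    mixed = False

# Explicit encode/decode is the only way across the boundary.
round_trip = "héllo".encode("utf-8").decode("utf-8")
```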

Trivializing that by suggesting it was some offhand, unneeded solution to a problem that some dreamy “language designer” thought up is at best completely and utterly ignorant.

Also maintenance, in all forms, is work. That does involve updating your systems from time to time.


> and the language got a significant speed boost.

I have not seen a clear win in real benchmarks. 3 was slower for the longest time, and nowadays it seems head to head depending on the project.


Check out https://speed.python.org/comparison/. It’s not significantly faster, but it’s getting more so.


I don't know how to show head-to-head more clearly than this graph:

https://speed.python.org/comparison/?exe=12%2BL%2B3.6%2C12%2...


I would say this makes it a bit clearer:

https://speed.python.org/comparison/?exe=12%2BL%2B3.6%2C12%2...


Maybe it's work if you get paid by lines of code and JIRA tickets, but programming is just a tool for me to get my real work done. So I would like to spend as little time programming as I possibly can.


Nobody here gets paid per Jira ticket or line of code.

Sure, if you don’t program and just write ad-hoc (unmaintainable?) scripts then the transition is annoying. But it’s also not required. Don’t re-write your scripts, you can always ensure that Python 2 is present.

But if you’re maintaining a project that uses the wider ecosystem, then you are at the mercy of that ecosystem. And, at the time of the decision to make Python 3, that ecosystem was saying “Python 2 has a lot of horrible legacy decisions that make it harder than it should be to write good code”.


Containers or environment management solve this problem quite easily. All of my major projects have a conda environment alongside them, and I expect I'll be shifting things over to Docker containers as my org starts shifting things to the cloud.


Isn't that what C is?


Certainly Common Lisp.


Lua is pretty close, and pretty close to Python in terms of style and strengths.

Edit: I actually forgot about the split between LuaJIT (which hasn’t changed since Lua 5.1), and the PUC Lua implementation, which has continued to evolve. I was thinking of the LuaJIT version.


I'm in operations and I've spent much of my career writing code for the Python that worked on the oldest LTS release in my fleet, and for a very long time that was Python 1.5...

I was really happy, in some ways, when Python 2 was announced as getting no new releases and Python 3 wasn't ready, because it allowed a kind of unification of everyone on Python 2.7.

Now we're back on the treadmill of chasing the latest and greatest. I was kind of annoyed when I found I couldn't run Black to format my code because it required a slightly newer Python than I had. But... f-strings and the walrus operator are kind of worth it.


That's what Go has been so far but it might see some changes soon after being "frozen" for ~10 years.


Why can't you do this with Python? No one said you had to use any of these new features...

Though to me that's like saying, "I want this river to stop flowing" or "I'd prefer if the seasons didn't change."


All human languages change over time. It is the nature of language.



Go? I moved a lot of my data science and machine learning process to Go. The only thing really left in Python land is EDA.


Absolutely agree. How many times have you heard "that was true until Python 3.4 but now is no longer an issue" or "that expression is illegal for all Pythons below 3.3", and so on. Not to mention the (ongoing) Python 2->3 debacle.


> Not to mention the (ongoing) Python 2->3 debacle.

When will this talking point die? It's not "ongoing". There's an overwhelming majority who have adopted Python 3 and a small population of laggards.


> There's an overwhelming majority who have adopted Python 3 and a small population of laggards.

That small population includes every BigCo with a large python codebase.


Who cares about syntax that doesn’t work in old, dead versions of Python 3? 3.5 and above is all that matters.


Lua is a great small language.



