
Programming language versioning bothers me - signa11
http://rachelbythebay.com/w/2012/07/25/versions/
======
wyldfire
> Now you're left to the lowest common denominator in the name of
> compatibility. Have you really accomplished anything? What's the point of
> having new features if you can't actually use them?

Programming languages making new revisions is better than the absence of
revisions.

> All of this would be different if you could just compile this stuff to a
> binary format

Hardly. C99/C11, C++98/03/11/14/17/2a -- these revisions change both the
syntactic features available at compile time _and_ the target libs. This
article is from 2012, but its author appears to have reached the same
conclusion again as of yesterday [1]. Unfortunately, instead of embracing the
change that takes place in both Python and C/C++, she seems to reject it all
because of the complexity it introduces.

Indeed, some folks embrace portability over all else. CPython itself only
started to move beyond requiring C89 within the last couple of years, and
they're selective about which C99 features they're willing to leverage. It's
an extraordinarily portable project, IMO. OTOH there are projects that
leverage more modern language features in C or C++. Those projects sacrifice
portability, often to move the complexity of some of their design elements
into the compiler and/or platform libraries. I believe the latter is the
better option, when it's available.

[1]
[http://rachelbythebay.com/w/2018/04/02/cpp/](http://rachelbythebay.com/w/2018/04/02/cpp/)

~~~
jcelerier
> It's an extraordinarily portable project, IMO

Is it, though? I write C++17 daily and have yet to find a relevant platform
where a pure C89 program such as CPython would work and mine wouldn't. I
mean, for hell's sake, you can target the Commodore 64 and MS-DOS with the
latest C++ revisions.

~~~
x1798DE
I imagine this depends on how you define "relevant". I think a lot of
companies still use AIX, and Python compiles just fine on it, but I don't
think there's a compiler that supports C++17 on it.

~~~
jcelerier
There is GCC 4.8 at least, which is more than enough to build GCC 7 and use
C++17 afterwards.

------
Apreche
The only reason this is a problem is that a language like Python is
interpreted. Programs are usually deployed as source, which means the
platform that executes the program needs to have Python installed, and that
environment comes into play.

With C you are usually delivering a compiled binary executable. If they did
add a feature to C, you could use the new compiler with the new feature on
your local development environment. It wouldn't matter at all to the place you
deployed it, as long as you make a compatible binary.

Tools do exist for building binaries from Python and other interpreted
languages, but they still need a lot of work. I'd argue this should even be
part of Python itself. I see no reason that I shouldn't be able to do
something like

    
    
      python --compile hello.py > hello
      ./hello
      "Hello World"
    

Or on Windows

    
    
      python --compile hello.py > hello.exe
    

Now the target environment doesn't even need Python to be installed at all!
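Python doesn't ship exactly that flag, but the stdlib does get partway there
with `zipapp` (third-party tools like PyInstaller go further and embed the
interpreter itself). A sketch, with the file and directory names made up for
illustration:

```python
# Sketch using the stdlib's zipapp: it bundles a script into one runnable
# file, though unlike a true compiled binary the target still needs a
# Python interpreter. Paths and names here are just for illustration.
import pathlib
import subprocess
import sys
import zipapp

src = pathlib.Path("hello_app")
src.mkdir(exist_ok=True)
(src / "__main__.py").write_text('print("Hello World")\n')

# Produce a single self-running archive with a shebang line.
zipapp.create_archive(src, "hello.pyz", interpreter="/usr/bin/env python3")

out = subprocess.run([sys.executable, "hello.pyz"],
                     capture_output=True, text=True)
print(out.stdout.strip())  # Hello World
```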

~~~
wyldfire
> With C you are usually delivering a compiled binary executable. If they did
> add a feature to C, you could use the new compiler with the new feature on
> your local development environment. It wouldn't matter at all to the place
> you deployed it, as long as you make a compatible binary.

You're mistaken. C has added features to its standard library before. It lets
you detect the presence of such a feature at compile time with a preprocessor
definition. But if your compiler supports the feature and your target does
not, in most cases you will get a fatal symbol-resolution error from the
loader.

This problem is remarkably similar to the one faced by Python developers. The
difference is that Python's philosophy includes a statement regarding
"batteries included", so its standard library has taken on more frequent
additions than C's.

In order to portably target multiple environments, you will occasionally need
to handle ImportError (disable a product feature or have a fallback
implementation) or (in C, e.g.) provide a fallback implementation designated
as "weak". Or forego portability, capitalize on the feature, and raise the
requirements for your target.
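On the Python side, that fallback pattern is the classic ImportError idiom --
here sketched with an optional third-party JSON module standing in for any
feature the target may lack:

```python
# The classic ImportError fallback: prefer an optional third-party
# module, degrade to the stdlib when the target doesn't have it.
try:
    import simplejson as json   # optional dependency, may be absent
except ImportError:
    import json                 # stdlib, always available

print(json.dumps({"ok": True}))  # {"ok": true}
```

Either branch yields a module with the same `dumps`/`loads` interface, so the
rest of the program never needs to know which one it got.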

~~~
AnimalMuppet
By "from the loader", do you mean the linker? Most of us regard that as part
of compiling.

Or are you using .dll or .so files?

For a static C program, regarding compiling and linking as part of the same
(build) process, I see only two possible outcomes. Either I get an error at
build time, or I get an executable that I can drop on the target that will
run.

The one exception I can see to this is if the program is going to make an OS
call that some versions of the OS don't support.

~~~
wyldfire
> By "from the loader", do you mean the linker? Most of us regard that as part
> of compiling.

No, I meant from the dynamic loader (like ld.so).

> Or are using .dll or .so files?

Yes, that's a very common idiom for libc and libstdc++/libc++. Yes, you're
right that statically linked binaries will not suffer this dependency and
usually end up more portable (often at a cost of some memory and
security/maintenance).

> The one exception I can see to this is if the program is going to make an OS
> call that some versions of the OS don't support.

Exactly. This is actually a little less rare than missing C library features,
from what I've seen. But here, just as with handling ImportError or providing
weak impls, you can handle ENOSYS.
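A sketch of that ENOSYS handling, in Python for brevity (`os.getrandom` and
`os.urandom` stand in here for any newer-syscall/portable-fallback pair):

```python
import errno
import os

def secure_bytes(n):
    """Prefer the getrandom() syscall; fall back when the wrapper is
    missing (older Python/OS) or the kernel answers with ENOSYS."""
    try:
        return os.getrandom(n)        # needs Linux >= 3.17, Python >= 3.6
    except AttributeError:
        pass                          # os.getrandom doesn't exist here
    except OSError as e:
        if e.errno != errno.ENOSYS:   # only swallow "not implemented"
            raise
    return os.urandom(n)              # portable fallback

print(len(secure_bytes(16)))
```

The key detail is re-raising any OSError that isn't ENOSYS: the fallback
should cover a missing syscall, not mask a genuine failure.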

------
zeveb
This is a little-appreciated benefit of using Common Lisp: the language is
both standardised and stable, so a library from 2000 is very nearly as likely
to run on an implementation today as it was then, or as another library is
now.

I do wish that there'd been a newer standardisation effort since, for a
couple of reasons, but it's really, _really_ nice that people haven't had to
waste effort updating working code to deal with language changes.

~~~
sifoo
Agreed, that's one of the more important advantages of Common Lisp in my book.
But the reason it works so well is that the standardized version was based on
plenty of experience and experimentation, and that it's powerful enough to be
extended from user code; same goes for C to some extent.

------
daotoad
Perl 5 takes a twofold approach to this mess, using compile-time checks to
manage new features. The _use VERSION_ syntax lets you specify a version of
perl, and the _feature_ pragma lets you enable or disable new features on a
case-by-case basis. In my experience, this works really well.

The _feature_ pragma (see
[https://perldoc.perl.org/feature.html](https://perldoc.perl.org/feature.html))
lets you manage new features on an individual basis. You just write _use
feature 'NAME_OF_FEATURE';_ -- for example, _use feature 'signatures';_ to
add subroutine/method signature support. This is a compile-time check.

If you don't want to hassle with enabling features on a piecemeal basis, you
can also specify a version of perl you wish to rely on with _use VERSION_ (see
[https://perldoc.perl.org/functions/use.html](https://perldoc.perl.org/functions/use.html)).
For example, if you want all of the standard features of perl 5.26.1, you say
_use v5.26.1_.
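Python has a loose analogue in per-module `__future__` imports, which also
opt a single file into new compile-time behaviour:

```python
# Rough Python analogue of Perl's feature pragma: a __future__ import
# opts this one module into new compile-time behaviour. Here, PEP 563:
# annotations are stored as strings instead of being evaluated.
from __future__ import annotations

def greet(name: SomeTypeNotDefinedYet) -> str:  # not evaluated at def time
    return f"hello, {name}"

print(greet("world"))  # hello, world
```

Without the `__future__` import, the undefined annotation would raise a
NameError when the function is defined; with it, the module loads fine.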

------
crdoconnor
> Can this happen in any language? Sure. Play with enough deep, dark stuff and
> I'm sure you can unleash run-time equivalents of nasal demons anywhere.

Also databases, caches, REST APIs, syscalls (e.g. libc), docker, dependent
libraries, browsers and much much more....

In spite of that, I don't see it as an intrinsically difficult problem to
solve. Just make a best effort to run the same versions of everything in dev
and test as you do in prod (with Python, pyenv and dependency pinning are
your friends here), and make liberal use of sanity checks: if the code is
only tested on versions x through y, put in a check that fails with a clear
message if it sees anything else.

The main problem isn't that any of this is hard; it's that nobody really does
it or recognizes how much time can be saved by doing it.
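That version sanity check can be tiny -- a minimal sketch, with the tested
range made up for illustration:

```python
import sys

# Fail fast on interpreters the code was never tested against, instead
# of failing mysteriously later. The tested range here is made up.
TESTED_LO, TESTED_HI = (3, 6), (3, 20)

def version_ok(version_info=sys.version_info):
    """True when (major, minor) falls inside the tested range."""
    return TESTED_LO <= tuple(version_info[:2]) <= TESTED_HI

assert version_ok((3, 8, 0))       # inside the tested range
assert not version_ok((2, 7, 18))  # Python 2 is rejected
```

In a real program the failing branch would call `sys.exit()` with a message
naming the tested range, which is far friendlier than a SyntaxError three
imports deep.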

------
cestith
This is a problem in some contexts but not others. Much software is run in
environments controlled by the same party as the code. If one needs a new
feature, one upgrades.

Dependency management for software you're distributing is a hassle, to be
sure. The version of the language is one of many considerations. System calls,
memory constraints, paths, and many other things can differ from one
environment to another.

------
phyrex
It blew my mind when I first realized that in Clojure projects the programming
language is a dependency of the project itself. Of course, there is still the
JVM, but I personally have never run into JVM version problems.

~~~
coldtea
> _Clojure projects the programming language is a dependency of the project
> itself._

That's no different than any other interpreted language...

~~~
lilactown
The difference is one of semantics: of course all interpreted languages
depend on their interpreter environment, but for some reason projects don't
always actually put it inside their dependency tree à la package.json,
requirements.txt, etc.

The reason this is so natural for Clojure is that it's a hosted language; the
standard library & language itself is decoupled from the runtime platform
(JVM).

With Clojure, it's actually funny to think that I have projects on my hard
drive that use 3-4 different versions of clojure (not even counting various
versions of clojurescript!), without any issues. Doing this with, say, Node.js
requires a lot of extra ceremony.

~~~
JamesLeonis
To add to this, Clojure benefits strongly from being a Lisp. The language
syntax itself might change once an ice age, and even then libraries can extend
it with macros. Old Clojure code will very likely run with the latest version.

Clojure(script) benefits as a compiled language as well. If Clojure 2.0
suddenly discovered Postfix notation was the GREATEST THING EVER, my 1.6, 1.8,
and 1.9 projects and JARs are entirely unaffected. They could even run side by
side, from the same interpreter version, without so much as a shrug of
concern.

> (not even counting various versions of clojurescript!)

That made me laugh! Clojurescript should use a date stamp as the incremental
version.

------
vira28
It was really challenging for me when I wanted to learn JavaScript. The
articles and blog posts I came across online were written against many
different versions, each with its own syntax.

~~~
nimos
Don't even get me started on ES2015 and ES6 being the same thing.

------
ggm
I don't know the author. Is it possible she is young enough not to have lived
through the mutability of C before we settled on a stable definition?

Her argument seems to say 'coming from C', which suggests she didn't
experience the differences in handling of ints, shorts, signed and unsigned,
alignment, #pragmas. C on SCO UNIX was not identical to C on an Apollo Domain
machine. There may have been one C standard (K&R) before ANSI C, but there
was not one implementation, and 'language features' are not, at root, that
different from coding to compiler behaviour.

Oh.. getopt() wars...

------
DubiousPusher
This is a frequent pain in the .Net world. I find the solution pretty simple,
though: set up your dev environment to target only the oldest platform
version you need to run on. If you ever get a new client with even older
platform versions, dial back the version in your environment and see what
breaks.

I guess though, I have the benefit of checked types. When using this same
strategy with an interpreted language, you're definitely leaning hard on your
test coverage.

~~~
tigershark
This _was_ a frequent pain before .Net 4.5. Now thanks to Roslyn you can use
the latest language features and target .Net 4.5. A lot of stuff works even on
.Net 4.0 if you fix that annoying problem with the class they moved around.

~~~
DubiousPusher
True. For reasons too irritating to explain, I'm often working with pre .Net
4.0 environments.

------
maltalex
Needs (2012) in the title

------
benmmurphy
I guess this is where Go really shines: it's a nice modern language that
statically compiles by default. You can kind of do the same with a scripting
language -- just statically compile the interpreter and distribute it with
your program (though this might be harder than it sounds, depending on the
licenses of the libraries you have to link with). I know a lot of Windows
Java programs ship a JVM internally as part of the install process.

------
nivertech
add (2012)

------
hderms
It's a reasonable point but perhaps a bit outdated? Containers more or less
allow interpreted languages to be "statically linked". In the past you had to
care more about whether system X had library Y installed but that seems like
an edge case in today's world.

~~~
mieseratte
Not everyone can, or wants to use containers. They solve certain issues but
bring others into the mix. This is hardly a "solved" problem. It is very much
not outdated.

~~~
NervousTechno
> This is hardly a "solved" problem

You query the language runtime for its version and use it as an assert. It's
standard everywhere except on the web, where polyfills are still a thing.

~~~
mieseratte
Sorry, I meant that "containers!" is not the solution.

