Nim empowers the programmer to do just about anything: use GC or switch to manual memory management, target any computing platform that supports C (i.e. everything), reinvent the language with macros, all while packaging it up very elegantly indeed.
Elm promises almost the exact opposite. I mean it's elegant, but it's also limited - purely functional, immutable, no nulls, no unchecked list access. And that's the kind of limitation that addresses the problem that Bruce has with other languages - dealing with all the complexity of how something's working. The promise of Elm is that if your code compiles, it works, leaving you to focus on getting the important stuff right - information flow and business logic - instead of tracking down another silly invalidity that never should have made it to runtime.
They're both young, but I wish Andreas Rumpf (Nim) and Evan Czaplicki (Elm) and their teams all the best in taking on the "big boys" with these plucky languages, and would encourage Bruce Eckel to consider skipping ahead of the curve and writing for these alternatives.
This is a very brilliant way to describe exactly what makes Nim excellent. I hope you don't mind if I quote you on this :)
Go is useless, but Rust and Nim have much in common (easy integration with native toolchains, stdlib-less bare-bones development for embedded systems); why did you decide that Rust is not for you?
* Decorators: I grant that they're hard on a debugger; I wish that there were a slightly more sane way for the control flow to play out. However, syntactically they're super clear.
* Coroutines: What's the problem here? I prefer Twisted's deferred model, but I have no problem with coroutines.
* Dynamic typing: I mean, if this is a religious issue, then I don't want to offend, but seriously: especially for newcomers, the type system in Python is simple. And clear.
* Exceptions as part of control flow: And why not? Exceptions are a part of ordinary human logic. Why do you regard them as unclear or complex?
* Boatload of basic types: I could do with a few fewer exposures of "low level concepts" in types (i.e. int, float, complex, Decimal), but it's not that bad.
* Deep class hierarchies: I'm not even sure what to say to this. It seems like an objection to OOP generally.
* eval: I think we all agree that eval introduces opacity and complexity, but do you prefer it be removed?
* Massive standard library: Compared to other languages of a similar age, Python does an awfully good job at providing One Obvious Way to use the standard library to do what (and only what) you want.
* schism between 2 and 3: Hear hear.
* Eggs vs wheels: Yeah. Fuck.
* pip vs setup vs easy_install: Yeah we get it. Packaging is a problem.
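On the decorator point above, here's a minimal sketch (the `logged` and `add` names are just illustrative) of why they're syntactically clear yet hard on a debugger: every call steps through the wrapper before reaching the function you actually wrote.

```python
import functools

def logged(fn):
    # The wrapper intercepts every call; a debugger lands here first.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        print(f"calling {fn.__name__}")
        return fn(*args, **kwargs)
    return wrapper

@logged
def add(a, b):
    return a + b

print(add(1, 2))  # prints "calling add", then 3
```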
Overall, I think Python is simple and clear. If I had to pick an objection, none of the ones you raised makes my list. I'll probably go with the syntax for the "type hints" system. And the GIL. And the varying definitions of "is" across implementations.
But all in all, it's pretty darn simple and clear.
For any given line it's hard to know if it will raise and if so what type of exceptions.
I do like in Go that if an error can occur there will be a return value for it. Makes you much more aware of when things might go wrong.
With Python it seems to be more like "find out at runtime" or "read the docs and source for every function you call".
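A minimal sketch of that point, using a hypothetical `parse_port` helper: nothing at the call site tells you which exceptions can escape, unlike Go's explicit error return.

```python
def parse_port(config):
    # Can raise KeyError (missing key) or ValueError (non-numeric value),
    # but neither appears anywhere in the signature.
    return int(config["port"])

try:
    parse_port({"port": "eighty"})
except (KeyError, ValueError) as exc:
    print(type(exc).__name__)  # ValueError
```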
Keeping track of that (oh wait, this is a numpy array, not a Python list...) and things like it expands the amount of the language a programmer must keep in her head at any one time, and that is the definition of complexity.
There's no known silver bullet for the complexity / power tradeoff, which is why we get so many cool languages!
In : import numpy as np
In : a = np.array([1,2])
In : b = np.array([3,4])
In : a+b
Out: array([4, 6])
In : a = [1,2]
In : b = [4,5]
In : a+b
Out: [1, 2, 4, 5]
>>> import numpy
>>> x = numpy.array([1,2])
>>> y = numpy.array([3,4])
>>> x + y
array([4, 6])
>>> x + [3,4]
array([4, 6])
>>> [1,2] + y
array([4, 6])
>>> [1,2] + [3,4]
[1, 2, 3, 4]
> a <- array(c(1,2))
> b <- array(c(3,4))
> a + b
[1] 4 6
> class(a + b)
[1] "array"
If you put the syntax on top of well-thought-out semantics, you get something like Nim ( http://nim-lang.org/ ).
I do have one good thing to say about it, though: it made systems type stuff fun. It has a featureset of "gradual affordance" that lets you implement something now and optimize later. That part is admirable, and most like Python. The next time I feel inclined I'll be looking at D, though - it's a more mature ecosystem, in several respects.
Thank you, it's good to be warned. My main reason for liking Python was that the community was so friendly; having used other languages where you say anything and get shot down, that's a deciding factor for future choices.
For instance, take comparison operators - by having < and == work across all the built-in types (including sequences), it's really easy to compare and sort stuff. Which is essential to writing terse and efficient algorithms.
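A quick sketch of that point: comparisons on built-in sequences are lexicographic, so sorting compound records needs no custom comparator at all.

```python
# Tuples compare element by element, so sorted() does the right thing.
records = [("smith", 2), ("adams", 7), ("smith", 1)]
print(sorted(records))  # [('adams', 7), ('smith', 1), ('smith', 2)]

# The same lexicographic rule applies to lists.
assert (1, 2) < (1, 3) and [1, 2] < [1, 2, 3]
```

(One caveat: in Python 3 this only holds within comparable types; ordering comparisons across unrelated types, like `1 < "a"`, raise TypeError.)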
Here's a helpful video from Feynman that kinda explains it: https://www.youtube.com/watch?v=YaUlqXRPMmY
The Python language (ditto Perl, Ruby...) uses the Babylonian approach, while languages with more fundamental building blocks (say, Scheme or Haskell) are more like the Greek approach.
So Python is not simpler as a language per se, but it is simpler to mold onto the problem. At least that's my explanation for why people (myself included) like to use it, despite understanding full well that there are more mathematically elegant languages out there.
Ruby on the other hand is simple. There are objects and there are messages, that's it.
I'd never heard of it before but it looks pretty cool.
His rant is great, for more than just scala.
The link that works for me on a desktop browser: https://www.youtube.com/watch?v=4jh94gowim0
Wow, this is a brutal talk.
In case others are curious as to content: For me the major takeaway was that the person who has written more core Scala code than anybody thinks that the collections and probably the whole language is irredeemably fucked. He had example after example where if you do a reasonable action in 2 different ways, you get 2 different results, and that it was impossible to know why without an incredibly deep understanding of the implementation details.
This makes me sad, as I really like a fair bit of what Scala does. But the video makes a persuasive case that Odersky, et al, bit off more than could be chewed, throwing in too many interesting ideas without really thinking through what happens when those ideas intersect and combine.
He hypes a language and makes money.
He bashes that language and makes money.
If we take the systems-design view that new languages arise as solutions to existing problems, they are initially greeted as making thing X easier. But since they work differently in places than previous languages, they expose new emergent dynamics of software development, some of which will be pathological.
Once the strengths of the language are understood, improving on the usage of that language is then based on understanding the systemic pathologies that language induces and how to work around them.
If you think you can do it, do it, release it. Bask in the glory.
Prediction: he can't. But good luck anyway.
Yes, the 'Atomic' in the title was confusing at first, having read none of his books with the 'Atomic' moniker. In fact, his C++ books are short and to the point.
I don't see anything far fetched about his goals.