So, the author is right that we're getting more Pythonic, but he's wrong to say we'll all write in Python some day. Some day, something like Python will be the fast low-level language, there will be new, slower languages that are easier to use than Python, and C will be a distant memory. The evolution of programming languages will never end, not in our lifetime certainly, and probably not as long as we walk the earth.
You probably mean a very limited kind of web app frontend code. That may be true. But I think you underestimate what is being done with software.
With a language that is 200 times slower than C or Java you can't do any data analysis, graphics or image processing, machine learning, algorithmic optimisation, financial software like trading, pricing and risk management, embedded systems, bioinformatics, simulation and a whole lot more.
You're basically excluding yourself from doing anything that mankind couldn't do before. Progress it's called. Most AI tasks require massive computational power. But even just things like pickling a Python object is too onerous for Python itself. I don't think that's good enough.
I'd be happy with just winning that battle.
The horror. The horror.
In the early days, the fact Unix was written in C was quite remarkable. At that time, OSs were written in hand-optimized assembly language.
I see no reason to stay with C-level languages (C was once called high-level) if we can make compilers that translate higher-level languages into machine code that's faster than hand-optimized assembly.
Smalltalk/80 and the various Lisp machines are sufficient proof that whole environments, from kernel to GUI, can be written in higher-than-C-level languages, provided the machine is fast enough. If we can extract speed from compilers and not from expensive silicon, I see no reason not to do it.
In a couple of days I will attend a lecture on porting the Squeak VM to FPGAs and running Smalltalk on silicon. That should be interesting.
There are Smalltalks that ditched some (in one case all) of their "primitive" methods written in C against the VM internals, because the JIT compiler turned out to be good enough. (Agents is the "all" case; VisualWorks also lets much of the Dictionary functionality JIT to machine code with no primitives.)
> The evolution of programming languages will never end, not in our lifetime certainly, and probably not as long as we walk the earth.
You forgot to mention a big factor: speed of cultural transmission. Considering what Lisp, Smalltalk, Self and other languages have been able to achieve for high-level languages, why are we still dealing with poky high-level languages like Ruby and Python? Part of it is cultural expectations. We still expect HLLs to be an order of magnitude slower, even though that expectation is decades out of date, technologically. (HLLs should only be 50-70% slower now.)
Ruby in particular is basically Smalltalk + Lisp for regular people - it's taken a good chunk of the concepts and packaged them in a way more people are happy to work with.
The first time I read about Scheme I was fascinated, until I played with it for a while, put it away, and promptly went back to lower-level languages (C, Pascal and M68k assembler at the time), because the syntax (or lack thereof) was just too alien for me.
Same with Smalltalk.
Yes, I discarded them over syntax. Speed never entered the equation - I dropped them before I got to evaluate performance.
Ruby is the first language I've worked with that could do justice to a lot of the concepts from Smalltalk and Lisp.
I very much doubt Lisp and Smalltalk have much hope of wider use than they see now - most of the important features are being added to other languages, and the remaining ones are not seen as useful enough by most developers to be worth the painful syntax.
Do I take issue over other people using other syntaxes? No. Does syntax put any kind of limitation on VM speed? No. Do I even imply that Smalltalk and Lisp are conceptually superior to Ruby? No.
What I am saying is that the current popular HLLs are behind the curve in terms of performance.
I find it funny that adult techies discard Smalltalk over syntax. There's only 5 rules, FFS. Lots of grade school kids have used it like a highly advanced multimedia Logo. That said, the traditional operator precedence of algebras is such a deep part of Engineering, Science, and Mathematics culture, that any language that doesn't add it as a convenience to those groups is probably and understandably doomed to obscurity. (In particular, Forth, Smalltalk, Lisp, Self, and others.)
If only David Simmons had better marketing and community organizing skills. Smallscript might have been where Ruby is today.
I think that's the main problem. There are lots of "Lisps" out there besides Common Lisp (which is what most people think of when they hear "Lisp"). E.g. Scheme, newLISP, and Clojure come to mind.
Pick one of those three Lisps and learn it, and you'll be wondering why anyone in their right mind would consider writing in any other language, unless it was a Lisp. As someone once said, there's a reason people become Lisp zealots; just take a look at it.
Plenty of software is not web-based or a cron job for Linux.
Java, C# are chosen not just because of speed.
You can get vendor support from Sun for their JVM stack. You can (try to) get similar support from Microsoft for .Net stack.
Then also there are personal preferences:
. Static typing can be a key part of how people build complex systems in a way they feel comfortable with. When my team codes that way, many bugs are picked up at compile time versus deploy time.
. The JVM has really nice remote debugging.
. The JVM is much closer to run-anywhere than Python.
. The JVM exits with hs_err files logged.
And I think it's a stretch that people who use unit tests are just trying to compensate for a lack of static types. Actually, it's more than a stretch, it's false for pretty much any competent programmer.
If it were true, why would Java programmers use JUnit?
Best of both worlds -- optional static types as in Strongtalk. I can imagine development methodologies that demand you statically type everything before you release to production. This way, you get fast duck typing development and the security of type safety for the maintainer.
I disagree that testing blindly is scarcely better than manual QA. If it's automated, the cost of leveraging your tests is small. The cost of adding more tests is not multiplied by iterations. With automation, increased frequency of tests can be used to localize the cause of bugs to particular changes in time.
A lot of the pain of strong typing goes away with type inference - and strong typing means you get much better tool support, e.g. you can see a problem right there highlighted in your editor, you don't need to wait for the unit tests to run on your nightly build!
Maintenance is tangential? Most of the work across all of the fields of programming is probably maintenance.
If you are doing exploratory programming, what does unit testing get you?
It gets you the ability to change your mind and do deep reworkings of very complicated systems with a higher degree of confidence. It's most valuable during maintenance, whereas duck typing shines during exploration.
How can you write the tests before the code if you don't have a spec to work from in the first place?
You end up doing better interface design at a finer granularity than you would have to otherwise. It slows you down, but requires greater discipline in terms of good architecture, so you end up saving time that way. Test first is really just design-up front, but with 3 or 4 orders of magnitude more iterations.
This is best for duck typing languages. Would I unit test in Eiffel, which supports Design By Contract? Doubt it.
> you don't need to wait for the unit tests to run on your nightly build!
In the original Extreme Programming, one ran unit tests before checking in any code!
I hear you, but come on now, static types as a replacement for unit tests? I hear they cure cancer too!
In all seriousness, optional type safety sounds pretty good to me as long as there aren't any major tradeoffs involved.
> I can imagine development methodologies that demand you statically type everything before you release to production. This way, you get fast duck typing development and the security of type safety for the maintainer.
I might be misunderstanding, but wouldn't this require you to rewrite your code before a production release? Wouldn't you then lose a great deal of the leverage of using a freedom language?
Is this a deliberate troll? Failure of reading comprehension? Where do I ever advocate static types as a replacement for Unit Tests? Why would a Smalltalker ever do that!?
> I might be misunderstanding, but wouldn't this require you to rewrite your code before a production release?
There's a big difference between rewriting and merely inserting a bunch of tags like <Integer> at the end of the development cycle, probably guided by a coding tool, possibly using Hindley-Milner type inference (as in Haskell) to automate part of the process.
Did you not know about these sorts of tools? Are you unfamiliar with Strongtalk type annotation syntax? Please give an example that would require something as extensive as a rewrite.
EDIT: To clarify, Strongtalk type annotations are completely optional. Take almost any code, remove the type annotations, and it will run exactly the same. They are also always just the Class Name. In Strongtalk as in Smalltalk, everything is an Object, so all types are simply Class Names. No complex types at all.
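For readers who know Python better than Strongtalk, the same principle can be sketched with Python's own optional annotations; this is an analogy of my own, not Strongtalk syntax:

```python
# Analogy (not Strongtalk): Python annotations are likewise optional
# and ignored at runtime, so annotated and unannotated code behave
# identically; a separate checker (e.g. mypy) can verify them statically.

def total_untyped(xs):
    return sum(xs)

def total_typed(xs: list) -> int:  # annotations added after the fact
    return sum(xs)

assert total_untyped([1, 2, 3]) == total_typed([1, 2, 3]) == 6
```

Removing or adding the annotations changes nothing at runtime, which is exactly the "annotate late, before production" workflow described above.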
Whoa, what are you talking about? Ironically, I think you're the one failing at reading comprehension. Read the thread. Clearly my "static types as a replacement for unit tests" comment was referring to the post I'd responded to initially. It had nothing to do whatsoever with what you said. I was essentially acknowledging your point, but saying that some people seem to advocate static typing for everything. Calm down bro.
> Did you not know about these sorts of tools? Are you unfamiliar with Strongtalk type annotation syntax?
Actually, no I didn't, and yes I'm unfamiliar with it ... that's why I asked you for more info.
> the rest of what you said
This is all interesting to me. I'm only lightly familiar with Smalltalk and Strongtalk. I was asking for clarification because I really wasn't sure I knew enough about it. Again, no trolling intended ... no need to be defensive.
Read my posts! You're putting someone else's words into my mouth! (Ones which are uninformed and frequently used themselves to troll!) So, there should be an implicit assumption that every thread is diametrically opposed around the issue? And everyone who doesn't follow that assumption and inserts informative, neutral concepts is "not paying attention"? I find that intellectually limiting, to put it kindly.
> Calm down bro...I was asking for clarification because I really wasn't sure I knew enough about it. Again, no trolling intended ... no need to be defensive.
No, you were putting words into my mouth due to inattentive reading. I'm not being defensive. I'm going on the offensive. Being inattentive and putting words into someone else's mouth and being called on it is not your cue to tell them to mellow. In my book, it's time to demonstrate some intellectual integrity and apologize.
I thank you for your most illuminating reaction!
Still, some of the errors you would catch with static typing you can catch with unit tests. It's not a replacement, but probably better than nothing.
Do you really believe that?
1. It is easier and quicker to develop a correct solution in a higher level language than a lower level one.
2. Expressiveness / abstraction come at the cost of time and space efficiency.
Thus, if time and space efficiency are not a concern, then the higher level language should be preferred.
Not all code written in C is performance-critical. But sometimes it's still the best choice if your code is directly talking to hardware.
> I believe the flaw in the argument is that the higher-level languages are actually more constrained, because they have made too many broad promises and are unable to tell when they can violate them without ill effect, even though it is obvious to the programmer. We might think that your example above would be a perfect case in which we could parallelize the loop... but what if sum() had side effects, such that it would produce the wrong result if not evaluated in order? What if sum() didn't have any directly detectable side effects, but called another function which called another function which did... sometimes, from code in a runtime eval? The compiler has no way of knowing this, so it has to assume the worst, and this is why it ends up being unable to make all the optimizations that one thinks it "should" be able to make. The human looks at it and says "duh, it's called 'sum'. If it has side effects, I'm going to smack somebody up the side of the head" and proceeds to optimize in the expected ways.
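To make the hazard concrete, here is a minimal sketch of my own (all names hypothetical): a summing function that looks pure at the call site but has a side effect buried one call deep, so any reordering by a parallelizing compiler would change an observable result.

```python
log = []

def record(x):
    log.append(x)  # hidden side effect, invisible at sum_'s call site
    return x

def sum_(xs):
    # Looks like a plain sum, but each element passes through record().
    return sum(record(x) for x in xs)

total = sum_([3, 1, 2])
assert total == 6
assert log == [3, 1, 2]  # depends on strict left-to-right evaluation
```

A human reads `sum_` and assumes it is order-independent; the compiler cannot, because the side effect is only discoverable by whole-program analysis.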
"In an ideal world, high-level languages like Python would replace all other programming languages."
That includes Smalltalk, Scheme, CLISP, and every marvelous language not yet invented.
But then he goes on and says something very closed-minded, like
"Someday we will all program in python."
As if Python got everything correct, and we have a perfect map of how humans think with it. The arrogance.
(map (partial reduce +) (partition 7 7 daily))
Edit: Possibly that should be partition-all from seq-utils to sum the remainder days if the length of daily isn't divisible by 7.
map(partial(reduce, +), partition(7, 7, daily))
Personally I'd be much much more happy to be using this code than the snippet in the article. In fact, I don't program Python regularly and it did take quite a while to figure out what was going on in the code.
weekly = [sum(daily[j:j+7]) for j in range(0, len(daily), 7)]
Now if you look back at the top snippet, it's just normal function calls, so you can be happy to know what's going on after reading the documentation for the names "partition" and "daily".
If you don't know what "map", "partial", "reduce" and "+" mean, sure you have to look up their documentation too, but they occur frequently enough that you can remember what they do after that.
The Clojure version is much more high level and less likely to introduce bugs.
Oh and it should be said that "sum" is equivalent to partial(reduce, +) so I can now say:
map(sum, partition(7, 7, daily))
See? The code is just melting away before your eyes. Also note how I have literally no idea what either of the snippets do but am already happily changing one of them.
def partition(n, step, coll):
    for i in range(0, len(coll), step):
        yield coll[i:i+n]
weekly = [sum(week) for week in partition(7, 7, daily)]
map(sum, partition(7, 7, daily))
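Putting the thread's pieces together as a self-contained, runnable sketch (with `operator.add` standing in for Clojure's `+`, since `+` is not a first-class function in Python; the sample `daily` data is made up):

```python
from functools import partial, reduce
from operator import add

def partition(n, step, coll):
    """Yield successive slices of up to n items, advancing by step."""
    for i in range(0, len(coll), step):
        yield coll[i:i + n]

daily = list(range(1, 15))  # two hypothetical weeks of daily figures

# The article's comprehension and the two variants from this thread
# all compute the same weekly sums.
weekly_comp = [sum(daily[j:j + 7]) for j in range(0, len(daily), 7)]
weekly_part = [sum(week) for week in partition(7, 7, daily)]
weekly_map = list(map(partial(reduce, add), partition(7, 7, daily)))

assert weekly_comp == weekly_part == weekly_map == [28, 77]
```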
> That's pretty unreadable for someone who doesn't know clojure.
The real reason for commenting was to show that it is readable and yes, providing the "readable" Python equivalent.
There are two points - firstly, Clojure is (or is becoming?) a lazy language and the list type is different, so no they are definitely not exactly the same.
This brings us to a broader point: even if you were to have laziness in Python, it still wouldn't matter, because I'm not the only coder in the world, and there are a lot of Pythonistas, like the author of the article, who do it the Pythonic way, which, as already hinted, I think is just braindead.
[sum(week) for week in partition(7, 7, daily)]
You are certainly correct in that. That's not what makes Lisps superior to all other languages, though; "it's the syntax, stupid."
Edit: Sorry, I know that comment appears trollish (and it is), what I meant to say is that unlike Python, Lisp(s) have virtually no syntax, and are more powerful because they treat code as data. It would take me a blog post to explain why that's important, but if you give the language a shot you will see why that is (and compare how long it takes you to learn it to how long it took you to learn Python!).
Regardless, every Lisp that I am able to comment on (newLISP, elisp, Scheme, CL) can do that easily, so I doubt that Clojure would have difficulty with this.
And that Python is pretty unreadable for someone who doesn't know Python.
Operator Precedence for Algebras.
Implementing an Algebra is a very important thing for a language to be good at. Being able to use algebras without having to switch mental gears is a good thing for scientists, mathematicians, and technical people of all stripes.
IMHO, the ability to easily implement your own algebra is a hallmark of a great language. Having that with the operator precedence is a bit of very justified syntactic sugar.
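As a sketch of the point in Python (a hypothetical toy class of my own, not from the thread): operator overloading inherits the language's algebraic precedence, so a user-defined algebra reads like textbook notation.

```python
class Vec:
    """A tiny 2-D vector algebra: + is elementwise, * scales by a number."""

    def __init__(self, x, y):
        self.x, self.y = x, y

    def __add__(self, other):
        return Vec(self.x + other.x, self.y + other.y)

    def __rmul__(self, k):  # enables scalar * Vec
        return Vec(k * self.x, k * self.y)

    def __eq__(self, other):
        return (self.x, self.y) == (other.x, other.y)

a, b = Vec(1, 2), Vec(3, 4)
# Ordinary precedence applies: * binds tighter than +, so no
# parentheses are needed, just as on paper.
assert a + 2 * b == Vec(7, 10)
```

In a language without built-in precedence (Smalltalk, Forth, the Lisps), the last line would need explicit grouping, which is exactly the mental gear-switch the comment above describes.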
While I am sure that some high school courses may start with Scheme, it's certainly not a common beginner's language. I'm sure if I searched hard enough, I could find a high school class learning assembly -- but that doesn't make assembly a good beginner's language, either.
Here, for example, is typical C code, right out of K&R:
for (count = 0; count < MAX; count++)
    printf("\narr[%d] = %d.", count, arr[count]);
(for ([count (in-range 0 MAX)])
(display (format "\narr[~a] = ~a." count (vector-ref arr count))))
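For comparison (my addition, assuming some made-up example data), the idiomatic Python rendering of the same loop:

```python
arr = [10, 20, 30]  # hypothetical example data; MAX is implicit in len(arr)

# enumerate pairs each index with its value, replacing the manual counter.
for count, value in enumerate(arr):
    print(f"\narr[{count}] = {value}.", end="")
```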
As of June 20th, there is now an optional padding argument in clojure.core/partition.
While I appreciate the general sentiment that high-level languages will take over the world, there is very little that's new in this article and I am not impressed by the showpiece Python code from the article. For loops are the future?
I seriously doubt it. The more the compiler knows, the more it can optimize things.
What is "daily" in this example? It could be anything. The only thing the compiler knows at compile time is that it should generate some lookups, searching for and calling functions like "__iter__", "__getitem__", etc... Put that line into a function which receives daily as a parameter and it could mean something different at every single invocation. Human beings looking at that code think they know what it does, while in fact they don't have a clue. The equivalent Clojure code someone wrote is only really equivalent under some very specific assumptions.
I know, I know, at runtime you could gather all kinds of interesting data to optimize things. But that's neither easy nor cheap, and sometimes iterating over an array is just iterating over an array.
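One way to see this concretely (my illustration, using the standard `dis` module): disassembling the article's one-liner shows only generic, late-bound operations; nothing in the bytecode records that `daily` is a plain array.

```python
import dis

# Compile the snippet in isolation; the compiler knows nothing about
# what `daily`, `sum`, `range`, or `len` will be at runtime.
code = compile("[sum(daily[j:j+7]) for j in range(0, len(daily), 7)]",
               "<snippet>", "eval")

# Every name is looked up dynamically, and iteration goes through the
# generic iterator protocol (GET_ITER), so type-specific optimization
# is off the table at compile time.
ops = {ins.opname for ins in dis.get_instructions(code)}
assert "GET_ITER" in ops
```

The exact instruction list varies by Python version, but the generic iterator setup is always there, which is the compiler "assuming the worst" in bytecode form.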
[edit:] if only... Amend to: some day, we will program as efficiently as possible, and software will do most of the annoying housekeeping.