Someday we will all program in Python (davidbau.com)
53 points by empone 3012 days ago | 77 comments



The only reason to write in a language like C or Java today (over Python) is speed. And there are a very limited number of applications that require that sort of speed. There's no question that Linux should be written in a low level language, and the same goes for a high-performance chess bot. But I'm incredulous when I see anyone write a website in even the relatively high-level Java.

So, the author is right that we're getting more Pythonic, but he's wrong to say we'll all write in Python some day. Some day, something like Python will be the fast low level language, there will be new slower languages that are easier to use than Python, and C will be a memory. The evolution of programming languages will never end, not in our lifetime certainly, and probably not as long as we walk the earth.

I must say though that I pray for more convergence. It annoys me to write code in JavaScript and Python, having to remember subtle differences between the two as small as the capitalization of true versus True, and tricky pitfalls like the scope of a variable declared in an if block.
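For instance, here's a made-up sketch of that if-block pitfall on the Python side (my example, nothing from either spec beyond the obvious): Python has no block scope at all, so a name assigned inside an if either leaks out of the block or was never bound in the first place.

  # Made-up sketch: Python has no block scope, so 'label' leaks out of the if
  # block -- or is simply unbound if the branch never ran.
  def describe(x):
    if x > 0:
      label = "positive"
    return label                      # fine for x > 0, blows up otherwise

  print(describe(3))                  # -> positive
  try:
    print(describe(-3))
  except NameError as e:
    print("oops: %s" % e)             # -> oops: local variable 'label' referenced before assignment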

It does seem an unnecessary burden for me as a web developer to have to know a handful of languages. You can hardly make a website today without knowing HTML, CSS, JavaScript, and Python/Ruby, and you'd better know your SQL too. Then, let's talk APIs.


The only reason to write in a language like C or Java today (over Python) is speed. And there are a very limited number of applications that require that sort of speed

You probably mean a very limited amount of web app frontend code. That may be true. But I think you underestimate what is being done with software.

With a language that is 200 times slower than C or Java you can't do any data analysis, graphics or image processing, machine learning, algorithmic optimisation, financial software like trading, pricing and risk management, embedded systems, bioinformatics, simulation and a whole lot more.

You're basically excluding yourself from doing anything that mankind couldn't do before. Progress, it's called. Most AI tasks require massive computational power. But even something as simple as pickling a Python object is too onerous for Python itself. I don't think that's good enough.
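To make the pickling example concrete, here's a rough sketch, assuming Python 2, where the pure-Python pickle module and its C reimplementation cPickle sit side by side (the actual timings will obviously depend on your machine and data):

  # Rough comparison sketch (Python 2): pure-Python pickler vs. the C one.
  import pickle, cPickle, timeit

  data = {"values": range(10000)}
  print(timeit.timeit(lambda: pickle.dumps(data), number=20))   # pure Python
  print(timeit.timeit(lambda: cPickle.dumps(data), number=20))  # C implementation, far quicker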


You can already do image processing in lowly JavaScript (for example http://hacks.mozilla.org/2009/06/content-aware-image-resizin...), and machine learning can be done in pure Python (for example http://montepython.sourceforge.net/).


They're all Turing complete after all ;-)


Can we at least all agree that no one should be writing anything in a shell scripting language any longer?

I'd be happy with just winning that battle.


I like scripting my shell using a shell scripting language; if I have a few dozen (or hundred) console operations that need to be done often, then a shell script is more concise than Ruby. If I need to add error handling, exceptions, networking, etc., then it's time to move up, but if all that's needed are external programs and some looping and conditionals, then I don't see any problem with scripting.


Why are you against shell scripting languages?


Because I have seen the horrors that they wreak in a production system where they are responsible for mission critical logic.

The horror. The horror.


Could you give examples? You could also write horrible Python or Ruby code.


I don't feel comfortable posting code from my employer here, sorry. But let me address your counter-argument on its own: saying you can write bad Ruby or Python is a Turing-tarpit argument. You can write bad code in any language, but I've seen a lot more bad shell script code than I've seen bad Python. Add to that the fact that shell scripting languages are very irregular, and I can't see why anyone would use a shell script for something they actually relied on that was more complicated than invoking a few commands in sequence.


"There's no question that Linux should be written in a low level language"

In the early days, the fact Unix was written in C was quite remarkable. At that time, OSs were written in hand-optimized assembly language.

I see no reason to stay with C-level languages (C was once called high-level) if we can make compilers that translate them into machine code that's faster than hand-optimized assembly.

Smalltalk-80 and the various Lisp machines are sufficient proof that whole environments, from kernel to GUI, can be written in higher-than-C-level languages, provided the machine is fast enough. If we can extract speed from compilers and not from expensive silicon, I see no reason not to do it.

In a couple of days I will attend a lecture on porting the Squeak VM to FPGAs and running Smalltalk on silicon. That should be interesting.


But those machines (Smalltalk, Lisp) had dedicated HW dealing with gc, special instructions, etc.


I remember being a fly on the wall at a discussion between several very smart VM guys at Camp Smalltalk maybe 7 or 8 years ago. (VisualWorks, Smalltalk/X, Smalltalk Agents, Squeak, Dolphin.) Apparently, there's a lot of stuff chip makers could put into general-purpose processors to generically support high-level languages. I think it's high time we had some of this, given the prevalence of Java, Perl, Python, Ruby, PHP (yes, it would benefit them too).


Cannot edit, but I left the link to the presentation out: http://fisl.vidanerd.com/palestra/30


Some day, something like Python will be the fast low level language, there will be new slower languages that are easier to use than Python, and C will be a memory.

There are Smalltalks that ditched their "primitive" methods written in C against the VM internals (in one case, all of them) because the JIT compiler turned out to be good enough. (Smalltalk Agents for the all-of-them case. VisualWorks also lets much of the Dictionary functionality JIT to machine code with no primitives.)

The evolution of programming languages will never end, not in our lifetime certainly, and probably not as long as we walk the earth.

You forgot to mention a big factor: speed of cultural transmission. Considering what Lisp, Smalltalk, Self and other languages have been able to achieve for high-level languages, why are we still dealing with poky high-level languages like Ruby and Python? Part of it is cultural expectations. We still expect HLLs to be an order of magnitude slower, even though that expectation is decades out of date, technologically. (HLLs should only be 50-70% slower now.)


We are dealing with "poky" high level languages like Ruby and Python because a lot of us find Lisp, Smalltalk and Self horrendous to work with.

Ruby in particular is basically Smalltalk + Lisp for regular people - it's taken a good chunk of the concepts and packaged them in a way more people are happy to work with.

First time I read about Scheme I was fascinated. Until I played with it a while, put it away, and promptly went back to lower-level languages (C, Pascal and M68k assembler at the time) because the syntax (or lack thereof) was just too alien for me.

Same with Smalltalk.

Yes, I discarded them over syntax. Speed never entered the equation - I dropped them before I got to evaluate performance.

Ruby is the first language I've worked with that could do justice to a lot of the concepts from Smalltalk and Lisp.

I very much doubt Lisp and Smalltalk have much hope of wider use than they see now - most of the important features are being added to other languages, and the remaining ones are not seen as useful enough by most developers to be worth the painful syntax.

The computational models and the work done on compiling them efficiently will contribute a lot, though. Most Ruby implementations in the works and several JavaScript implementations are all over PICs and other concepts taken from the Self project, for example.


You're glossing over my main point: the poor performance of recently developed high-level languages.

Do I take issue over other people using other syntaxes? No. Does syntax put any kind of limitation on VM speed? No. Do I even imply that Smalltalk and Lisp are conceptually superior to Ruby? No.

What I am saying is that the current popular HLLs are behind the curve in terms of performance.

I find it funny that adult techies discard Smalltalk over syntax. There are only five rules, FFS. Lots of grade school kids have used it like a highly advanced multimedia Logo. That said, the traditional operator precedence of algebras is such a deep part of engineering, science, and mathematics culture that any language that doesn't add it as a convenience to those groups is probably and understandably doomed to obscurity. (In particular, Forth, Smalltalk, Lisp, Self, and others.)

If only David Simmons had better marketing and community organizing skills. Smallscript might have been where Ruby is today.


> We are dealing with "poky" high level languages like Ruby and Python because a lot of us find Lisp, Smalltalk and Self horrendous to work with.

I think that's the main problem. There are lots of "Lisps" out there besides Common Lisp (which is what most people think of when they hear "Lisp"). E.g. Scheme, newLISP, and Clojure come to mind.

Pick one of those three Lisps and learn it, and you'll wonder why anyone in their right mind would consider writing in any other language, unless it was a Lisp. As someone once said, there's a reason people become Lisp zealots; just take a look at it.


C is portable assembly; I see no reason for it to disappear, and it would actually bother me to have to write low-level stuff in Python.


I agree. While some languages go extinct, their domains remain. As we progress in computing and how to express it, we expand the domains that we program in.


"there are a very limited number of applications that require that sort of speed"

Plenty of software is not web-based or a cron job for Linux.

Java and C# are chosen not just because of speed.

You can get vendor support from Sun for their JVM stack. You can (try to) get similar support from Microsoft for .Net stack.

Then there are also personal preferences:

. Static typing can be a key part of how people build complex systems in a way they feel comfortable with. When my team codes that way, many bugs are picked up at compile time versus deploy time.

. JVM has really nice remote debugging

. JVM is much closer to run-anywhere than Python

. JVM exits with hs_err files logged


Some people really want static types.


More people than you'd think, hence the current fashion for "unit tests".


Static typing hardly replaces unit tests.

And I think it's a stretch that people who use unit tests are just trying to compensate for a lack of static types. Actually, it's more than a stretch, it's false for pretty much any competent programmer.

If it were true, why would Java programmers use JUnit?


As a long time Smalltalker, let me say -- we HLL guys should get off of our high horse about static typing. Anything that you can do to move bugs from runtime to compile time is good for the maintainer. No, static types don't catch everything. But they are desirable if they catch anything at all.

Best of both worlds -- optional static types as in Strongtalk. I can imagine development methodologies that demand you statically type everything before you release to production. This way, you get fast duck typing development and the security of type safety for the maintainer.
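For a rough analogy outside the Smalltalk world (my sketch, not Strongtalk syntax): Python 3's function annotations are similarly optional and ignored at runtime, so a team could bolt them on near release purely for the maintainer or an external checker.

  # Rough analogy in Python 3 (not Strongtalk): annotations are optional and
  # have no effect at runtime, so they can be added late without changing behaviour.
  def weekly_total(days: list) -> int:
    return sum(days)

  def weekly_total_untyped(days):     # same code, annotations stripped
    return sum(days)

  print(weekly_total([1, 2, 3]) == weekly_total_untyped([1, 2, 3]))  # -> True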


Strong inferred typing is (generally) superior to static in the sense that it gives you the best of both worlds, flexibility and safety. Strong typing plus a clear separation between pure and impure code gets you a lot. Test harnesses are vital still, but testing blindly (which is what most unit testing does) is scarcely better than old-fashioned manual QA.


People forget that the whole point of abandoning static typing in environments like Smalltalk was the extreme speed of exploratory programming that it supported. If you can have static typing and still have low-overhead rapid exploratory programming, then why not? Duck typing is not a goal, it's a means!

I disagree that testing blindly is scarcely better than manual QA. If it's automated, the cost of leveraging your tests is small. The cost of adding more tests is not multiplied by iterations. With automation, increased frequency of tests can be used to localize the cause of bugs to particular changes in time.


But that's tangential. If you are doing exploratory programming, what does unit testing get you? How can you write the tests before the code if you don't have a spec to work from in the first place?

A lot of the pain of strong typing goes away with type inference - and strong typing means you get much better tool support, e.g. you can see a problem right there, highlighted in your editor; you don't need to wait for the unit tests to run on your nightly build!


But that's tangential.

Maintenance is tangential? Most of the work across all of the fields of programming is probably maintenance.

If you are doing exploratory programming, what does unit testing get you?

It gets you the ability to change your mind and do deep reworkings of very complicated systems with a higher degree of confidence. It's most valuable for maintenance while duck typing, however.

How can you write the tests before the code if you don't have a spec to work from in the first place?

You end up doing better interface design at a finer granularity than you would have to otherwise. It slows you down, but requires greater discipline in terms of good architecture, so you end up saving time that way. Test-first is really just design up front, but with 3 or 4 orders of magnitude more iterations.

This is best for duck typing languages. Would I unit test in Eiffel, which supports Design By Contract? Doubt it.

you don't need to wait for the unit tests to run on your nightly build!

In the original Extreme Programming, one ran unit tests before checking in any code!


No, static types don't catch everything. But they are desirable if they catch anything at all.

I hear you, but come on now, static types as a replacement for unit tests? I hear they cure cancer too!

In all seriousness, optional type safety sounds pretty good to me as long as there aren't any major tradeoffs involved.

I can imagine development methodologies that demand you statically type everything before you release to production. This way, you get fast duck typing development and the security of type safety for the maintainer.

I might be misunderstanding, but wouldn't this require you to rewrite your code before a production release? Wouldn't you then lose a great deal of the leverage of using a freedom language?


I hear you, but come on now, static types as a replacement for unit tests?

Is this a deliberate troll? Failure of reading comprehension? Where do I ever advocate static types as a replacement for Unit Tests? Why would a Smalltalker ever do that!?

I might be misunderstanding, but wouldn't this require you to rewrite your code before a production release?

There's a big difference between inserting a bunch of tags like <Integer> at the end of the development cycle, probably guided by a coding tool, possibly using Hindley-Milner type inference to automate part of the process, and rewriting. (See Haskell)

Did you not know about these sorts of tools? Are you unfamiliar with Strongtalk type annotation syntax? Please give an example that would require something as extensive as a rewrite.

EDIT: To clarify, Strongtalk type annotations are completely optional. Take almost any code, remove the type annotations, and it will run exactly the same. They are also always just the Class Name. In Strongtalk as in Smalltalk, everything is an Object, so all types are simply Class Names. No complex types at all.


Failure of reading comprehension?

Whoa, what are you talking about? Ironically, I think you're the one failing at reading comprehension. Read the thread. Clearly my "static types as a replacement for unit tests" comment was referring to the post I'd responded to initially. It had nothing to do whatsoever with what you said. I was essentially acknowledging your point, but saying that some people seem to advocate static typing for everything. Calm down bro.

Did you not know about these sorts of tools? Are you unfamiliar with Strongtalk type annotation syntax?

Actually, no I didn't, and yes I'm unfamiliar with it ... that's why I asked you for more info.

the rest of what you said

This is all interesting to me. I'm only lightly familiar with Smalltalk and Strongtalk. I was asking for clarification because I really wasn't sure I knew enough about it. Again, no trolling intended ... no need to be defensive.


Ironically, I think you're the one failing at reading comprehension. Read the thread.

Read my posts! You're putting someone else's words into my mouth! (Ones which are uninformed and frequently used themselves to troll!) So, there should be an implicit assumption that every thread is diametrically opposed around the issue? And everyone who doesn't follow that assumption and inserts informative, neutral concepts is "not paying attention"? I find that intellectually limiting, to put it kindly.

Calm down bro...I was asking for clarification because I really wasn't sure I knew enough about it. Again, no trolling intended ... no need to be defensive.

No, you were putting words into my mouth due to inattentive reading. I'm not being defensive. I'm going on the offensive. Being inattentive and putting words into someone else's mouth and being called on it is not your cue to tell them to mellow. In my book, it's time to demonstrate some intellectual integrity and apologize.

I thank you for your most illuminating reaction!


"Static typing hardly replaces unit tests."

Still, some of the errors you would catch with static typing you can catch with unit tests. It's not a replacement, but probably better than nothing.


> The only reason to write in a language like C or Java today (over Python) is speed.

Do you really believe that?


I think it's a reasonable statement, if the following premises are accepted:

1. It is easier and quicker to develop a correct solution in a higher level language than a lower level one.

2. Expressiveness / abstraction come at the cost of time and space efficiency.

Thus, if time and space efficiency are not a concern, then the higher level language should be preferred.


What are some other benefits of C or Java over something like Ruby and Python?


The level of abstraction matches most hardware.

Not all code written in C is performance-critical. But sometimes it's still the best choice if your code is directly talking to hardware.


In that case wouldn't you write some components in C and glue it all together with something like ruby?


Sure, that sounds reasonable. But you've still chosen C to talk to the hardware.


If you're using ruby or python you're ultimately using C anyway, right ;)


You seem to completely ignore the fact that there are environments where speed does not matter, but memory is constrained. In those environments even Java is prohibited and C/C++ or even assembly is the way to go, because in these languages you have direct control over memory allocation.


I thought the second comment was interesting:

>I believe the flaw in the argument is that the higher-level languages are actually more constrained, because they have made too many broad promises and are unable to tell when they can violate them without ill effect, even though it is obvious to the programmer. We might think that your example above would be a perfect case in which we could parallelize the loop... but what if sum() had side effects, such that it would produce the wrong result if not evaluated in order? What if sum() didn't have any directly detectable side effects, but called another function which called another function which did... sometimes, from code in a runtime eval? The compiler has no way of knowing this, so it has to assume the worst, and this is why it ends up being unable to make all the optimizations that one thinks it "should" be able to make. The human looks at it and says "duh, it's called 'sum'. If it has side effects, I'm going to smack somebody up the side of the head" and proceeds to optimize in the expected ways.<
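A contrived sketch of my own (not from the article) of what the quoted commenter is worried about: nothing stops sum() from carrying a hidden side effect, so evaluating the iterations out of order, or in parallel, would change the answer, and the compiler has to assume that it might.

  # Contrived: a "sum" whose result depends on how many calls came before it.
  # Reorder or parallelize the loop iterations and the output changes.
  calls = [0]

  def sum_week(days):
    calls[0] += 1                        # hidden side effect
    return calls[0] * 1000 + sum(days)   # value depends on call order

  daily = list(range(14))
  print([sum_week(daily[j:j+7]) for j in range(0, len(daily), 7)])  # -> [1021, 2070]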


Which is exactly why Fortran compilers still produce faster code than C++ compilers.


I think it is funny that the guy went and said something correct like this:

"In an ideal world, high-level languages like Python would replace all other programming languages."

That includes Smalltalk, Scheme, Common Lisp, and every marvelous language not yet invented.

But then he goes on and says something very closed-minded like

"Someday we will all program in python."

As if python got everything correct, and we have a perfect map of how humans think with it. The arrogance.


The last part didn't seem entirely literal to me.


Or how about in Clojure?

(map (partial reduce +) (partition 7 7 daily))

Edit: Possibly that should be partition-all from seq-utils to sum the remainder days if the length of daily isn't divisible by 7.


That's pretty unreadable for someone who doesn't know clojure.


Is this better?

map(partial(reduce, +), partition(7, 7, daily))

Personally I'd be much much more happy to be using this code than the snippet in the article. In fact, I don't program Python regularly and it did take quite a while to figure out what was going on in the code.

weekly = [sum(daily[j:j+7]) for j in range(0, len(daily), 7)]

Now if you look back at the top snippet, it's just normal function calls, so you can be happy knowing what's going on after reading the documentation for the names "partition" and "daily".

If you don't know what "map", "partial", "reduce" and "+" mean, sure you have to look up their documentation too, but they occur frequently enough that you can remember what they do after that.

The Clojure version is much more high level and less likely to introduce bugs.

Edit:

Oh and it should be said that "sum" is equivalent to partial(reduce, +) so I can now say:

map(sum, partition(7, 7, daily))

See? The code is just melting away before your eyes. Also note how I have literally no idea what either of the snippets do but am already happily changing one of them.
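(If you want to actually run that last claim in Python rather than pseudo-code: the bare + has to be spelled operator.add, but the equivalence holds.)

  # Runnable check of sum == partial(reduce, +), in real Python.
  from functools import partial, reduce
  from operator import add

  week = [1, 2, 3, 4, 5, 6, 7]
  print(partial(reduce, add)(week))   # -> 28
  print(sum(week))                    # -> 28, same thing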


Just because you already happen to have a certain function at your disposal doesn't mean your language is suddenly superior.

  def partition(n, step, coll):
    for i in range(0, len(coll), step):
      yield coll[i:i+n]
allows

  weekly = [sum(week) for week in partition(7, 7, daily)]
Using Python's map() function you can even do

  map(sum, partition(7, 7, daily))
making it exactly the same as your example. And no, I probably haven't written more Python than you in my life.


Please consider my comment. Firstly, it was in response to this comment:

> That's pretty unreadable for someone who doesn't know clojure.

The real reason for commenting was to show that it is readable and yes, providing the "readable" Python equivalent.

There are two points - firstly, Clojure is (or is becoming?) a lazy language and the list type is different, so no, they are definitely not exactly the same.

This brings us to a broader point in that even if you were to have laziness in Python it still wouldn't matter, because I'm not the only coder in the world and there are a lot of Pythonistas like the author of the article who do it the Pythonic way, which, as already hinted, I think is just braindead.


I will not argue the first point, but as for the second one: how does that one author determine what is 'the Pythonic way'? I think

  [sum(week) for week in partition(7, 7, daily)]
is 'the Pythonic way' and I find that quite a bit easier to comprehend than

  map(sum, partition(7, 7, daily))
The reason for that is simple, even though it runs counter to what people usually preach: it is less concise. In the former, it is immediately clear that you are partitioning a list of days into partitions of weeks and summing over those weeks, resulting in a list of weekly sums. The extra word 'week', repeated twice, makes all the difference: all ingredients for comprehension are readily provided. The latter case, on the contrary, requires you to do a few mental operations to expand the expression into something meaningful, mentally adding the concept of a 'week' to understand what the code is doing. You need to do that every time you read the code, which makes it less easy to understand.


> Just because you already happen to have a certain function at your disposal doesn't mean your language is suddenly superior.

You are certainly correct in that. That's not what makes Lisps superior to all other languages though: "it's the syntax, stupid."

Edit: Sorry, I know that comment appears trollish (and it is), what I meant to say is that unlike Python, Lisp(s) have virtually no syntax, and are more powerful because they treat code as data. It would take me a blog post to explain why that's important, but if you give the language a shot you will see why that is (and compare how long it takes you to learn it to how long it took you to learn Python!).


Isn't there more to power than that? For example, in Clojure you can't change the binding of a function in a namespace after it has been compiled. In Python or Ruby you do have the power to do that.
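For concreteness, here's a throwaway Python sketch of that (nothing special about math.sqrt, it's just a handy module-level name): the binding in an already-imported module stays mutable long after everything has been compiled to bytecode.

  # Throwaway sketch: rebind a function in an imported module's namespace at
  # runtime; existing call sites pick up the new binding.
  import math

  original = math.sqrt
  math.sqrt = lambda x: round(original(x))   # rebind math.sqrt
  print(math.sqrt(10))                        # -> 3, via the new binding
  math.sqrt = original                        # put it back
  print(math.sqrt(10))                        # -> 3.1622776601683795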


I'm not a Clojure expert, so I can't directly address that, but from the little that I know of Clojure, I think you can do that by binding a function to a var and then changing the binding of the var. (Can any Clojure experts elaborate?)

Regardless, every Lisp that I am able to comment on (newLISP, elisp, Scheme, CL) can do that easily, so I doubt that Clojure would have difficulty with this.


It does have difficulty with this, check the mailing list. Just because something uses s-expressions doesn't mean you can override functions in a namespace.

http://groups.google.com/group/clojure/browse_thread/thread/...;


Thanks for that interesting thread! I do know though that newLISP can do this easily with a simple 'set' call.


That's pretty unreadable for someone who doesn't know clojure.

And that Python example is pretty unreadable for someone who doesn't know Python.


That specific python example is very readable for someone who understands basic math (sets and functions). The clojure example, not so much.


That could be said for any language. It's just that the first language someone usually learns borrows more from Algol than from Lisp. This leads to the idea that Python/Ruby/PHP/Javascript are easier to learn than Clojure/Common Lisp/Scheme.


Python/Ruby/PHP/Javascript are easier to learn because most people already know a similar language. Just because it seems unfair doesn't mean it's not true.


Python and Ruby do one particular thing right that Smalltalk didn't:

Operator Precedence for Algebras.

Implementing an Algebra is a very important thing for a language to be good at. Being able to use algebras without having to switch mental gears is a good thing for scientists, mathematicians, and technical people of all stripes.

IMHO, the ability to easily implement your own algebra is a hallmark of a great language. Having that with operator precedence is a bit of very justified syntactic sugar.
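A toy example of my own of what that buys you in Python: overload the operators on your own type and the conventional precedence of * over + comes along for free, so the algebra reads the way scientists expect.

  # Toy algebra: 2-D vectors with + and scalar *. Python keeps the usual
  # precedence, so v + w * 3 parses as v + (w * 3) with no extra ceremony.
  class Vec(object):
    def __init__(self, x, y):
      self.x, self.y = x, y
    def __add__(self, other):
      return Vec(self.x + other.x, self.y + other.y)
    def __mul__(self, k):                     # scalar multiplication
      return Vec(self.x * k, self.y * k)
    def __repr__(self):
      return "Vec(%s, %s)" % (self.x, self.y)

  v, w = Vec(1, 2), Vec(3, 4)
  print(v + w * 3)                            # -> Vec(10, 14), i.e. v + (w * 3)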


Algol languages are taught first for good reason: Algol-derived languages resemble English; Lisp isn't natural or intuitive to anyone who doesn't already know Lisp.


Oh, come on, folks. It's a perfectly reasonable comment -- Lisp is powerful and expressive, but it is not an intuitive language for beginners.


Perhaps it seems that way to you because you are more familiar with Algol syntax than Lisp syntax, or because you have gone on to learn the more advanced Lisp features. But Scheme syntax and semantics, for example, are arguably at least as simple as that of an Algol language. Scheme has been used successfully as a first programming language in high school computer science courses: http://www.teach-scheme.org/


You're making a bad assumption about my skill and preferences, and trying to use it as an argument. That's a logical fallacy. My argument doesn't depend on my personal experience: Algol-derived languages resemble English, and are therefore easier for new programmers to learn.

While I am sure that some high school courses may start with Scheme, it's certainly not a common beginner's language. I'm sure if I searched hard enough, I could find a high school class learning assembly -- but that doesn't make assembly a good beginner's language, either.


Yes, ALGOL-derived languages contain English words, and chunks of code can be read as English if you imagine the right words being inserted, but the same is true of Scheme code, and to about the same extent. In fact, it's true of most programming languages. The languages that it is most true of -- languages like Ada, Cobol, Inform, etc. -- are not notable as good beginners' languages.

Here, for example, is typical C code, right out of K&R:

  for (count = 0; count < MAX; count++)
    printf("\narr[%d] = %d.", count, arr[count]);
To read this in English, you need to fabricate a lot of words, like this: "Repeat for several times, first initializing count to zero, as long as count is less than MAX and, after each step, taking count and adding one to it: call the printf function with the arguments...". How is that exercise any worse for the equivalent PLT Scheme code?

  (for ([count (in-range 0 MAX)])
    (display (format "\narr[~a] = ~a." count (vector-ref arr count))))
An English translation of C's "while" statement is easier; the "switch" statement is harder; a C function declaration is nearly impossible without sounding awkward. Yes, actual ALGOL would be easier to translate to English than C, but there are even fewer people starting with ALGOL than with your hypothetical assembler -- which suggests that the real reason people start with the languages they do is primarily because the languages are popular, not because of the degree of similarity to English.


> Possibly that should be partition-all from seq-utils

As of June 20th, there is now an optional padding argument in clojure.core/partition.


Good to know, thanks.


Some day we will stop writing COBOL.


2012... the year COBOL died!


weekly = [sum(daily[j:j+7]) for j in range(0, len(daily), 7)]

While I appreciate the general sentiment that high-level languages will take over the world, there is very little that's new in this article and I am not impressed by the showpiece Python code from the article. For loops are the future?


Well I guess the list comprehension and list slicing features in the above example are more interesting than the for loop.
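One detail of the slicing that's easy to miss: slices clamp at the end of the list, so the final, possibly partial, week needs no special-casing at all.

  # Slices past the end of a list are simply truncated, so the last partial
  # week is summed without any edge-case handling.
  daily = list(range(10))      # one full week plus three days
  print(daily[7:14])           # -> [7, 8, 9], not an IndexError
  print([sum(daily[j:j+7]) for j in range(0, len(daily), 7)])  # -> [21, 24]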


The less the programmer says, the more flexibility the compiler has to speed things up.

I seriously doubt it. The more the compiler knows, the more it can optimize things.

What is "daily" in this example? It could be anything. The only thing the compiler knows at compile time is that it should generate some lookups, searching for and calling functions like "__iter__" and "__getitem__". Put that line into a function which receives daily as a parameter and it could mean something different at every single invocation. Human beings looking at that code think they know what it does, while in fact they don't have a clue. The equivalent code in Clojure that someone wrote is only really equivalent under some very specific assumptions.

I know, I know, at runtime you could gather all kinds of interesting data to optimize things. But that's neither easy nor cheap, and sometimes iterating over an array is just iterating over an array.
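To make the "it could be anything" point concrete, a contrived example of my own: daily might be an object whose slicing does arbitrary work, which is exactly why the compiler can only emit the generic lookups.

  # Contrived: an object that answers slicing like a list but logs every access.
  # The same "simple" line of code now has side effects the compiler can't rule out.
  class LoggedDays(object):
    def __init__(self, values):
      self.values = values
    def __len__(self):
      return len(self.values)
    def __getitem__(self, index):             # also receives slice objects
      print("accessed %r" % (index,))         # side effect hidden behind daily[j:j+7]
      return self.values[index]

  daily = LoggedDays(list(range(14)))
  print([sum(daily[j:j+7]) for j in range(0, len(daily), 7)])
  # -> [21, 70], plus a couple of "accessed slice(...)" lines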


Also... The more the runtime knows, the more it can optimize away (JIT).


Unfortunately, it will take a long time for all the web devs to switch (from PHP) to Python. Laziness is stronger than any high-level language feature or development speed. I've long given up trying to convince my clients to switch to a Python-friendly host. Imagine: I still have some clients running their website on PHP4.


So true.

[edit:] if only... Amend to: some day, we will program as efficiently as possible, and software will do most of the annoying housekeeping.



