The Python Paradox (2004) (paulgraham.com)
148 points by ryan-duve on Jan 27, 2023 | 269 comments



It was right then, but it's kind of dated and misses an evolutionary lesson: beautiful code isn't enough; it also has to be safe, maintainable, productive, understandable, and resource-efficient.

Ruby is still great for throwing something together fast and maintaining it until it runs into scale problems. It added gradual typing with Sorbet and RBS.

Crystal is a neat typed, compiled Ruby-alike but it's buggy.

Rust hits more of these but it's ugly: uglier than Python, approaching C++-level eyebleed. Rust generics aren't as flexible as they could be, because type constraints don't have unions and specialization is painful. Tag structs are laborious. Rust also makes enormous binaries, compiles slowly, and the cargo index download is glacial; I'm surprised they don't have an "sccache" for it. Go hits more of these too, except it's not as flexible and not quite as safe as Rust. Go is easy to learn, but then its capability plateaus; on the other hand, it compiles and tests insanely fast.

Rust should rebase its type system to have union types, e.g., foo: i32 | f32, plus more granular and specialized type constraints (negation, union) on generic trait parameters.

Elixir (on Erlang) can be productive. It's gorgeous, fast, powerful, and fairly expressive.

Haskell and Idris are beautiful and powerful, but inaccessible to most software engineers.

Pony is beautiful, productive, extremely safe, and fast.

Zig, Nim, Kotlin and friends are all moving in the right general directions. Swift is alright too.


> Ruby is still great for throwing something together fast and maintaining it until it runs into scale problems. It added gradual typing with sorbet and RBS.

I think Ruby is much better than Python for building large applications.

Ruby is essentially Smalltalk with a neat Algol-like syntax and some Perl-isms. Its OO system is a joy to work with.

Python got so popular in data-oriented applications because it's a really simple procedural language.

More importantly, it had the right libraries (NumPy, SciPy and Matplotlib) when Torch (implemented in Lua) was no longer viable (due to LuaJIT limitations).


That's sort of outdated. Python now has type hints, protocol-oriented programming support, and is quite fast (for what it is). None of this was the case back in the day, but it's all there now.
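
For instance, protocols give you checked duck typing (a minimal sketch; the names here are made up, and typing.Protocol needs Python 3.8+):

    from typing import Protocol

    class SupportsArea(Protocol):
        def area(self) -> float: ...

    class Circle:
        def __init__(self, r: float) -> None:
            self.r = r
        def area(self) -> float:
            return 3.14159 * self.r ** 2

    def total_area(shapes: list[SupportsArea]) -> float:
        # Circle never declares the protocol; matching is structural,
        # so mypy/pyright accept it anyway.
        return sum(s.area() for s in shapes)

    print(total_area([Circle(1.0), Circle(2.0)]))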


Python might have nice features, but it also has terrible features. Like using dicts for everything. It's not enough to allow good practices, you have to prevent (or at least reduce) bad practices.


Can you elaborate? What exactly is wrong with dicts?


>Like using dicts for everything

Sorry what? This is absolutely not true.


I think they meant that the underlying representation for most stuff is a dict.

Global variables are stored in a dict (see `vars()`). Unless you use slots, an object is essentially a dict.

Doesn't really disturb me because they're not used like a dict in practice though
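
A quick illustration of both points (a sketch):

    class Point:
        def __init__(self, x, y):
            self.x = x
            self.y = y

    p = Point(1, 2)
    print(p.__dict__)    # {'x': 1, 'y': 2} -- instance attributes live in a dict
    print(type(vars()))  # <class 'dict'> -- module globals are a dict too

    class SlottedPoint:
        __slots__ = ("x", "y")  # fixed layout, no per-instance __dict__
        def __init__(self, x, y):
            self.x = x
            self.y = y

    # SlottedPoint(1, 2).__dict__ would raise AttributeError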


You can use dataclasses instead of dicts now.


Unless you use __slots__, aren't all class-based objects [specifically python2 new style `class MyClass(object): pass` or all python3 classes] dicts in Python?


I was talking about the dataclasses.dataclass decorator; as of 3.10/3.11 you can use slots with it.

You're right, they're dicts unless slots are used. I read the parent comment as saying you can't define types easily. But you can with dataclasses.
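
For example (a sketch; slots=True requires Python 3.10+):

    from dataclasses import dataclass

    @dataclass(slots=True)
    class User:
        name: str
        age: int

    u = User("Ada", 36)
    print(u)         # User(name='Ada', age=36)
    # u.email = "x"  # AttributeError: no per-instance __dict__ with slots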


Ruby's biggest downfall was the stupid "fancy" syntax stuff in the language, like not requiring parentheses or return statements, insert operators, BEGIN/END blocks, etc. All of those make a code base harder to read.

Also I remember the dependency management system was a pain in the ass to deal with, especially with modules with native libraries.

Python had some issues with 2, but when 3 came around and fixed all that, it became the language of choice specifically because of its consistency.


> Haskell and Idris are beautiful and powerful, but inaccessible to most software engineers.

I don't see any fundamental reason why that should be the case. It seems to me that while it's true that most engineers would have a hard time navigating these languages today, it's mostly due to socio-historical accident. It's not because the languages are particularly difficult or arcane; it's mainly because people don't already know them, and people don't like to learn to do things differently: their experience stands in the way.


> it's mostly due to socio-historical accident. It's not because the languages are particularly difficult or arcane, it's mainly because people don't already know it, and people don't like to learn to do things differently

I beg to differ. I studied for a compsci BSc at an Eastern European university. The professors decided to test that theory and introduced programming to everyone in our program through Haskell. Every year about 500 new students were taught Haskell. I don't know a single one of us who stuck with it. Probably there are a few odd ones, but if it were only a question of familiarity you would expect that more people from there would keep programming in it.


Our uni taught us Haskell, Prolog, C and Java. All very intensive. First it was C, then Haskell & Prolog, and then Java. When I came out of uni, I wanted to use Prolog & Haskell for everything, but jobs were only in Java and C. Don't think it's a matter of liking them or being more productive in them; you need to earn a living.


That's a response to a slightly different question. Your answer makes it clear that exposing students to a language isn't enough to make them take up a language. The original point was that Haskell isn't per se inaccessible.


It's inaccessible because it's difficult and arcane compared to languages like Python, especially at the beginner level, when you are still trying to figure out how a for-loop works, what variables are, and so on.


OK, but now we're back full circle, and you're disagreeing with both epgui https://news.ycombinator.com/item?id=34542953 and krisoft https://news.ycombinator.com/item?id=34544130


Yeah, of course, I am. I don't see what that has to do with anything. So my point is only valid if I support someone else?

A lot of us here have read "Learn You a Haskell for Great Good". We've used it.

It's not a socio-historical accident, and it's not just beginners who struggle with the language either.


> Yeah, of course, I am. I don't see what that has to do with anything. So my point is only valid if I support someone else?

Your point is perfectly valid. I'm just saying the thread has just become a sequence of assertions and not particularly enlightening.


If it was accessible, then exposing students to it would be enough to make them take it up, no?


No? It may not be suited to their applications (wrong tooling/libraries), or their collaborators may not use it, or they might simply not like it.


I had a similar experience at Oxford. However, the introductory course doesn't even cover the IO monad, and all of our code was run in the REPL. No wonder the students regard Haskell as impractical: they never even write a full program in it.


Still, if they'd found that Haskell was much easier or more productive to use than other programming languages for simple computational tasks, you'd think a few of them might have googled 'how to do IO in Haskell' and gone from there.


That seems to be getting things backwards. How could they find it easier or more productive if they haven't even done IO yet?


I would say the productivity gains of Haskell are actually most transparently obvious in the case of the sort of toy algorithmic code you're likely to be writing in a compsci class.


OK. And I would say the productivity gains of Haskell are most transparently obvious in the case of the sort of highly complex, stateful, I/O-interacting code you're likely to be writing for a successful tech company.


This doesn't match my personal experience (having worked on a Haskell web app professionally for a couple of years). GHC does have some cool features like STM, but typical stateful code tends not to be radically more concise or easier to write in Haskell than in other high-level languages (compared to, say, code for traversing a binary tree, or implementing a parser combinator library). Others may have different experiences.


I do see fundamental reasons. Haskell was designed from conception to be a platform for research and education. I’d say it was a wild success at those two things. Certain decisions that made it successful as a platform for research and education make it less suitable for use writing applications in industry.

The main decision I am thinking about is lazy evaluation. Lazy evaluation forces authors to write pure code, which drove Haskell’s community to build libraries that expressed effects in the type system, which made things like STM possible. If I wanted to design a Haskell-like for industry use, I would give it non-lazy evaluation (I’m not gonna say “strict” because I think the compiler should be free to omit the evaluation of any expression, but it sure would be convenient if we had more predictable performance characteristics).

I’m also not saying that you can’t use Haskell in industry. Obviously, people do that.


OCaml uses eager, not lazy, evaluation, but it is even less popular than Haskell. BTW, I like Haskell, but I feel like a novice, still learning, even though I wrote a Haskell book (you can read it online for free: https://leanpub.com/haskell-cookbook/read).


Having used Haskell in industry, I can say it's actually _surprisingly_ good at boring run-of-the-mill software. The strong focus on correctness and the ecosystem-wide consistency of its core design patterns means you spend a lot less time debugging things, and a lot more time writing the thing you actually want to write.

That said, the type system _does_ stand in the way of learning it - though it's very powerful once you get your head around it - and lazy evaluation means you do spend more time than you'd generally like hunting down space leaks if you're writing the sort of code that's capable of getting space leaks.


...which is as easy as using foldl instead of foldl' instead of foldr, which to a beginner is... mildly confusing.


Imo, one "fundamental" reason is that Haskell has a much smaller, nicer core than most languages, so it builds everything out of that core. For example, most languages have variable update statements in their core. Haskell defines at least two data types of "programs that have stateful variables" (ST and State). The result is that people want to see how everything is broken down into pure functions under the hood, and that means it takes more time to learn the language.


> It's not because the languages are particularly difficult or arcane, it's mainly because people don't already know it, and people don't like to learn to do things differently: their experience stands in the way.

I'm somewhat of a fan of Haskell (though its standard library is awful) and very much a fan of Idris (at the very least it's a better-Haskell, at best, dependent types!), but hard disagree.

These languages are difficult and arcane, compared to eg Python. A beginner to programming can get started with Python immediately, it reads like pseudocode.

Get started with Haskell or Idris, you quickly hit higher kinded types, monad transformers, free monads, final tagless. It's a different universe.


>Get started with Haskell or Idris, you quickly hit higher kinded types, monad transformers, free monads, final tagless. It's a different universe.

How is any new developer hitting any of these "quickly"? I'm working on a mature production codebase with plenty of advanced Haskell, and we barely have any tagless final code - much less free monads. Even HKDs and monad transformers are relatively recent additions to our toolbox that we reach for when appropriate - most of the code uses plain old datatypes and one or two concrete monad types.

My experience of learning Haskell is that many production codebases use advanced features because they are big boons to productivity, but few learning resources make reference to them. In fact, Haskell's biggest problem is probably the gap in knowledge between simple, low-productivity code written by newer developers, and the kind of code that actually goes into production with all the optional advanced features. A budding developer can easily write a whole application that never goes beyond `IO` without issue.


Alternatively, getting started with Java you quickly hit polymorphism, inheritance, getters-and-setters, and null object handling.

And yet everyone in my intro to CS class in uni managed to pick up Java within a week or two. Haskell is a bit more involved than Python, but not more involved than most languages.


Yes, Java is kind of the worst of both worlds. You have to learn new concepts and they aren't even very useful concepts.


I believe that to be mostly true for Lisp or ML-family (OCaml, ...) languages, but not for Haskell. Reasons would be:

- lazy evaluation can be tricky if you work with files; you have to profile and force strict evaluation at key points to avoid memory explosion

- refactoring something to work with the IO monad is hard

- typing some higher-order functions is hard

It may be subjective, but even for the syntax I felt a much stronger cognitive load working with Haskell.


> the cargo index download is glacial

Cargo has been working to resolve this, it's been testing a new feature that lazily fetches fine-grained index info over HTTP rather than the legacy approach, which is to clone the entire index stored as a git repo (which worked fine when crates.io was young and small, but hasn't scaled well). You can enable it on nightly Rust via the sparse-registry config flag, and it was recently accepted for the 1.68 release in March (which means you can try it out on the beta channel right now). In the future, it will become the default behavior.

https://doc.rust-lang.org/nightly/cargo/reference/registries...


You missed TypeScript, which has advanced generics with union support; the JS ecosystem is getting better each day, and it's ubiquitous.


TypeScript is fantastic: such a flexible type system.

It was introduced 18 years after JavaScript but feels like a perfectly natural evolution of it. TypeScript _should_ feel like a wonky bolt-on, but it's wonderful. A testament to the folks who built it.


To be honest, early TypeScript was quite bad: class-dependent (functional style was an afterthought), with "namespaces" (still used in the mainline TS compiler until recently, IIRC; they behaved like nested objects in JS, kind of useful, but totally missed the Node.js file-based module separation). It felt a lot like they wanted _Java_ or C#.

Sometime during the late 2.x series TypeScript really came into its stride, with the ability to express more "dynamic" types, a functional style, arrays behaving like tuples, etc., and the 3.x and 4.x series have really polished it.

TL;DR: it's quite great now, but don't dismiss sceptical people, because early versions of it kinda DID suck.


“Felt like they wanted C#” makes sense because Anders Hejlsberg was involved with TypeScript. Glad they stopped forcing C# into it and let it be its own thing. Not that C# is bad though!


> Tag structs are laborious... Rust should rebase its type system to have union types, i.e., foo: i32 | f32.

Rust has ADTs, so I'm not sure what you're saying here. Can you expand on how you think unions would make code nicer than disjoint unions? They definitely make generics much harder to think about for me.


Perhaps they mean anonymous (discriminated) unions of some kind with that syntax?


ADTs = discriminated unions + tuples. They likely mean undiscriminated unions, as in TypeScript.


But since Rust is compiled, in practice that would be like C unions, right? A pain to work with.

I mean, TypeScript has those because the underlying runtime is JavaScript, which tracks the types of stuff and handles some kind of polymorphism, so you can get away with undiscriminated unions: the runtime is doing the job of discriminating them.
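
Python is in the same boat, for what it's worth: every value carries its type at runtime, so an undiscriminated union can be discriminated after the fact (a minimal sketch, using the 3.10+ union syntax):

    def describe(x: int | str) -> str:  # undiscriminated union annotation
        # Every value carries its type at runtime, so we can discriminate here:
        if isinstance(x, int):
            return f"number: {x}"
        return f"string: {x}"

    print(describe(42), describe("hi"))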


> approaches C++-level eyebleed

Have you ever hit F12 on any C++ std function or data structure? Every time I do it genuinely feels like I'm back at day 1 freshman year of an undergrad CS degree.


The C++ standard library is feature-complete in algorithms and data structures for most purposes. It doesn't have what Boost and other libraries have, but that's okay. Ideally, a platform should ship without a standard library and use package management for all but built-in types.

The problem with the C++ template metaprogramming style is symbol length, the notation ordering (insane levels of nesting), and the diagnostics generated from those, which easily turn into absurdity.

Another issue is that static languages lack the universal diagnostics, introspection, and symbolic-evaluation capabilities of dynamic languages with REPLs and remote debuggers.

Dynamic languages, in turn, are often missing compile-time (deployment) or boot-time type checking, data-race/borrow/mutability/concurrency analysis, and monkeypatch protections.


Post a pic


A random file in the standard library:

https://gcc.gnu.org/onlinedocs/gcc-4.6.3/libstdc++/api/a0109...

Honestly I don't think it's that bad if you can look past all the underscores. I looked around for something worse but maybe I'm just accustomed to the eyebleed that C++ has to offer. To me, Rust is uglier, but that's because it's been around a quarter of the time that I've been using C++ and I don't know what all the symbols do without looking up a cheatsheet.


If you want ugly, look at the MSVC std.


Party pooper. Don't you know we are just here to bitch and moan?


Look, I'll admit, it looks terrible from the view that gdb gives... but when you see the whole file, it's just not so bad.


> Rust makes enormous binaries

But it DOES make binaries. Such a breath of fresh air after years of Java/Scala.


I never really understood this, especially in the age of small docker containers. You just yeet your jar into the container and deploy. OK, so Rust and Go don't need a JVM. Why does it matter? Disk space is dirt cheap. It's not like statically linked binaries are small by any measure anyway.


Because I run binaries.

Like docker, for instance. If I had to write "java -jar docker.jar -cp ..." and fuck around for a week trying to figure out where the env vars, JVM flags, and arguments to the actual program go, I would not use docker. Let alone piping input through grep, less and tail.


Java also makes binaries with GraalVM.


    > Rust makes enormous binaries
Rust binaries are optimized to be fast by default (fast != small); [0] explains pretty well why, and [1] collects many size-minimizing optimizations.

    > and compiles slow
You can have a very slow compiler with all the optimizations and safety checks, or a fast one that skips them. This doesn't mean there's no room for improvement, however.

One thing that can definitely be improved is how cargo manages dependencies. I am tired of each of my projects' /target directories using >2GB of disk space; at the very least there should be some kind of opt-in global cache.

I am also worried about dependencies going the same direction as the JS ecosystem: many of my projects have only 3-5 top-level dependencies, but I end up downloading hundreds. A few are compile-time only, but this makes auditing very difficult and makes compile times a lot longer.

[0]: https://lifthrasiir.github.io/rustlog/why-is-a-rust-executab... [1]: https://github.com/johnthagen/min-sized-rust


At the time I'm writing this, there's a separate thread on Rust's "ugly" syntax. The issue with complex trait constraints, specialization, etc. is that they impact forward compatibility: it is possible that a new version of some external dependency will add trait implementations in a way that ends up breaking your existing code. It's understandable that Rust devs would want to avoid this.


Type arguments, union types, and a sound type system: pick any two.

For example, imagine the following in an imaginary version of Rust with union types:

    #[derive(PartialEq)]
    struct Option<T> {
        value: T | (),
    }
    impl<T> Option<T> {
        fn some(value: T) -> Self {
            Self { value }
        }
        fn none() -> Self {
            Self { value: () }
        }
        fn is_none(&self) -> bool {
            match self.value {
                _: () => true,
                _: T => false,
            }
        }
    }
What does `Option::some(()).is_none()` return? Is `Option::some(()) == Option::none()`?


> Uglier than Python and approaches C++-level eyebleed.

It's quite verbose, as well, or was the last time I looked at it.

I don't demand a Perl level of terseness (that's going too far the other way), but neither is reinventing the verbosity of COBOL a good idea.


I totally agree with your take (at least for the languages I know, out of the ones you mention). For me that right balance of beauty, productivity, and so on came with Clojure. I now find it painful to use any other language, tbh.


> Go is easy to learn but then the capability of it plateaus.

Given that almost every cloud-native project is written in Go, I'd say this is entirely not true.


That was the past, helped by Kubernetes' and Docker's respective decisions to migrate from Java and Python to Go.

Plenty of them are still Java- and .NET-based, as those ecosystems reacted to Go's adoption hype, and Rust is the new kid on the block, especially for cloud-native WebAssembly projects and eBPF.

Like Helm creators nowadays doing mostly Rust, https://deislabs.io/posts/


cloud native web assembly...

...x86_64? ;)



Gotta respect the grift I guess! Like MLM crypto schemes maybe I just Don't Get It, or I'm not the mark...


WASM is architecture-independent (ARM servers are becoming a thing, for better or worse) and has a lot of investment into sandboxing it performantly (since it also has to run in browsers).


Funny how emergence works with tools. Give a language too few tools but viral circumstances, and the ecosystem diverges (Lisps, JavaScript). Give it too long an iteration time but killer guarantees, and you end up with committees. Python not falling into either of these traps should be understood as nothing short of magic in emergence.

I only recently discovered that Python's reference typechecker, mypy, has a small side project that compiles typed Python to C [1], written entirely in Python. Nowadays, with Python's rich specializer ecosystem (LLVM, CUDA, and generally vectorized math), the value of writing a small program in anything else diminishes quickly.

Imagine reading the C++ working group's release notes in the same mood that you would the Python release notes.

[1] https://github.com/mypyc/mypyc


> Imagine reading the C++wg release notes in the same mood that you would the python release notes.

Which mood are we talking about here? Mild dread? They keep adding new features to the language. The Python programming language is around 30. The release notes at this point should be "wow, we found a bug this year! Didn't expect to see one of those!".

The secret of Python is that it is actually a large community all learning to program together. They all discovered static typing together, they're all learning about functional techniques together (removing reduce() from the builtins was a really interesting move), yada yada. It is a happy, productive and cohesive community. Probably a study in how to build a community around a programming language.

But one of the reasons I like Clojure is they don't have release notes any more, the release notes are basically "hey we're working on some libraries". They built something extremely powerful and it is done, because there isn't any more power needed in the core. If you invest in learning a tool, the tool shouldn't sprout a new handle and expect to be held differently.


My mood when reading the PEPs is one of excitement.

While I'm not too keen on hardcore feature divergences, most of the new features have been really nice to have, and they are never imposed (e.g. the walrus operator, type hints, dataclasses, etc.). I don't see the need to be so outright hostile to changes, unless they are clearly breaking ones.


> "and which are never imposed"

I'm not sure about that part. Even if one personally ignores something new, someone else within the same project, or a dependency, will employ the new shiny and invent another way or two of doing the same thing.

That means if it's in the language, you'll have to account for it. Not saying Python is, but given enough new features a language may become a kitchen sink of ideas over time.

--

(At the risk of contradicting myself a bit here: I actually agree that the features you mentioned are all nice to have, and I like using them. IDK if "nice to have" is a high enough bar to clear, though.)


Don't lisps (or lisp-likes) push language feature creation into user space? And with Clojure piggybacking on the JVM, I would be shocked if it needed continuous updates.


Python does too, to a limited extent. A good example is type annotations. You don't _have_ to use them, but you can enforce them in your project with a typechecker or a preprocessor; and there's a subset of Python that can be compiled statically.

And the Clojure example is imho a bit of a red herring: while it allows a lot of flexibility (reader macros in Clojure are very powerful), it doesn't encourage ecosystem divergence, since Clojure can use the JVM/Java for batteries and most standard libraries. Cljs is more similar in that way, because it carries much of JavaScript's divergence with it.


Heavily generalizing of course, but:

Only insofar as the user actually _can_ make those changes, if deemed necessary or elegant, using macro facilities. Once something has proven itself good enough, there is nothing stopping language maintainers/creators from taking it into the standard and shipping it with the next release. So it is strictly more flexible in that way, and users do not have to wait on committees.


Industries change, needs change, new possibilities emerge. A language that doesn't change is probably either dead, dying, or headed for a tiny niche. Python changes a lot because lots of people use it and have needs and desires for it. Clojure is used very little, so doesn't have the same kind of demand for improvement or community to do it (I don't for a moment believe it couldn't be improved).


"A language that doesn't change is probably either dead, dying, or headed for a tiny niche"

Or complete. Eternal change is not a value in itself.


Right, but I think any language that is "complete" is either dead, dying, or headed for a tiny niche (or already in a tiny niche). Can you think of any counterexamples?


The vast majority of languages are dead, dying or headed for a tiny niche. How many languages can you name that aren't? There can only be at most 3 languages with more than 25% market share.

Even Python is likely to die, in time. Most things do. Something will have to change first; at the moment it is doing great.


You seem to be arguing against something I didn't say. All I'm saying is that change is a sign of life, and that non-change is a sign of the opposite.


Go ye unto your tool shed, examine your spanner for signs of life, and you will have the possibility of gaining enlightenment - for you are correct in your argument.


C.


C is changing (C99, C11, C17, C2x). It is also increasingly relegated to niches like embedded and systems, though there's still a lot of other use.


Web development as well, as it turns out (compile to wasm).


Nothing is ever truly complete. Especially mediums of communication and construction, such as a programming language.


Well, mypyc was not a side project originally: mypy was conceived as a complete programming language with optional static typing (https://www.slideshare.net/jukkaleh/mypy-pyconfi2012), but still (almost) 100% Python-compatible, and that variant was supposed to be transpilable into C.


I haven't followed development of the Python language in a while but this article was featured last week [1] and it does feel like committees.

I have followed development of C++ and... yeah, committees and subcommittees.

[1] https://chriswarrick.com/blog/2023/01/15/how-to-improve-pyth...


Well, in reality it's more complicated: there always were approaches through the committee and approaches outside it (conda being one initiative, and lately there is an explosion of tools making use of the pyproject.toml interface).

With packaging, there are things the committee is good at (making sure that a change to packaging does not break it for certain configurations and use cases), while individual initiatives outside the committee often improved packaging in some areas while making it worse in others. But the committee was then able to take inspiration from them, which gives you the best of both worlds.

IMHO the Python PEP process is a good example of Python going the middle way.


Frankly, yes, the PEP process looks great!


You mean the mood to find out what breaks in every minor Python release?

I have been using Python in and out for system administration since version 1.6.


I think the reason that Python made it this far is that from the beginning it had a very well-documented and capable C interface.

From this, people started making lots of libraries that accelerated code and provided bindings to other software.

At a certain point the ecosystem reached a critical mass, and the rest is history.


I might be wrong, but I believe that Mypyc is like Nuitka in that you're still using an interpreter that will end up being your bottleneck even after compiling your code. Nuitka gives a ~4x speedup for most code, the bottleneck being the Python VM.

You might want to reach for something else if you need better performance than that.


[2004] -- Python is no longer a "comparatively esoteric language" these days. I suppose now this might be an argument for hiring folks using Haskell or the like.


I'm not convinced this advantage is available to harness anymore. A small company can still outrun a larger company via less process, fewer internal entrenched interests, the ability to not have to worry as much about internationalization and other things you need to instantly address large markets and comply with various internal and external regulations, and you could continue this list for quite a while.

But $LARGE_CORP, or at least one of the $LARGE_CORPs in your space, is just as likely to be using Python itself these days, or some other relatively nice language. Even the bad ol' languages from 2004 have cleaned themselves up a lot since then. There are still isolated advantages to be had here and there by using the correct language where the wrong one has become entrenched, but I just don't think this particular delta is anywhere near as available, in general, as it used to be.


Another thing is that back in 2004, a larger % of code was hand written. So, choosing a new language didn’t come with as high a switching cost outside of learning a new syntax.

Now, if you deviate from PHP, Python, Node, Rails, etc., to one of the new “esoteric” languages, you have to learn a new language, plus reinvent the wheel in a lot of areas that you’d otherwise be getting for free.

So you have brilliant developers working in esoteric languages having to write 25-50% more code.

Even if you can swim upstream fast enough to compete with highly competent devs working in a more popular language, then you get into hiring / onboarding challenges, documenting and debugging custom features (features that would otherwise be a “solved problem” in other languages), acquisition due diligence, etc., that persistently weighs against you.


> Now, if you deviate... to one of the new “esoteric” languages, you have to... reinvent the wheel in a lot of areas that you’d otherwise be getting for free.

I think this is the key concern. The #1 priority is how quickly Engineering can respond to business demands. Maybe the #1 priority instead ought to be culture, and that can be a great approach for Mittelstand-style/small-forever/bootstrapped businesses. But for venture-backed growth-at-all-costs businesses? Porting functionality to an esoteric language does not drive business value. Complain that you have difficulty hiring because you can't find engineers who know the esoteric language, and you're likely to get pressure from your investors to make boring technology choices instead.


I’m confused by your comment; it seems like you are talking about the quality of the language, while the article was talking about using obscure languages as a signifier of programmer quality.


There is a bunch of good "esoteric" languages that can be a similar differentiator in certain areas: Elixir, Zig, Pony, Chapel, Nim, even plain old Elm.


Or Rust, which is very popular on HN.


Rust is "not as used", but it has a following who hype it to the point that I've avoided taking the time to relearn it, just due to the strange vibes I get from its community. These people once mentioned, on Twitter, that people who hate systemd are like reactionaries (as in, politically far right), as if a choice of init were a correlate of political ideology. When you view others' preferences for fucking software as so important to your worldview that they might as well be nazis [0], you can tell your priorities are not in order.

They hardly seem like iconoclasts in that respect; it feels like there is a lot of group mentality there, which is strange to me. People in a place with strange group dynamics might not be aligned with the public outside the group (who are oblivious to the groupthink, since they are just "normies"), so sure, the group is against the grain in some respect, but that separateness from the general population comes from adherence to groupthink, not really from individual eccentricity or originality.

Another aspect: Rust is being hyped by some big names in tech, so at this point it feels a lot more like Java in its early days: not really niche, just new and pre-mainstream.

[0] Yes, some anti-systemd people literally say Lennart Poettering is Hitler or some such nonsense; they are idiots. I just think the software he pushes is not the best.


Genuine question: maybe the pro-systemd people meant the word reactionary in its non-left/right definition? From Wikipedia: "In political science, a reactionary or a reactionist is a person who holds political views that favor a return to the status quo ante, the previous political state of society" [0]

I've heard people use 'reactionary' to mean "doesn't want things to change", or "wants things to go back to the way they were", which would actually be a pretty reasonable opinion for a pro-systemd person to have about people opposed to systemd.

(I'm not saying that systemd is or isn't good/bad, but it was definitely a change, so describing people who wanted to keep the old init as reactionary does seem to jibe with the definition of reactionaries as people who favor going back to the earlier status quo.)

[0] https://en.wikipedia.org/wiki/Reactionary


If the anti-systemd weenies wanted to be reactionary edge lords and return to the status quo, then they should have said something when it would have made a difference during the ill-fated transition from the simple BSD init system to the steaming pile of shit AT&T System V init system that they only think is "classic" because their first experience with Unix was Linux.

It's like saying Nickelback's "Woke Up This Morning" is Classic Rock because you never heard of the Rolling Stones "Start Me Up".

https://www.youtube.com/watch?v=SGyOaCXr8Lw


They didn't. They implied it as a political association. I've since lost the thread unfortunately...


I think that's part of it, but I think they primarily mean politically reactionary. I wrote more about this here: https://news.ycombinator.com/item?id=34543573


> These people once mentioned, on twitter, that people who hate systemd are like reactionaries

I don't think you are actually referring to the entire Rust community here. Did one person say this on Twitter, and maybe a handful retweet? Besides, what does Rust have to do with systemd?

(Not a Rust user, and don't really have an opinion on systemd.)


>These people once mentioned, on twitter, that people who hate systemd are like reactionaries (as in, politically far right), as if a choice of init is a correlative of political ideology. When you view others' preferences for fucking software as so important to your world view that they might as well be nazis[0], you can tell your priorities are not in order.

I hate that everything is politically-coded nowadays, but, honestly, as much distaste as I feel even writing this, I think they're not entirely wrong / there is something significant underlying that sentiment. Especially post-2014, nearly everything on the internet, including open source technology, is highly entangled with the culture wars. For better or worse, Rust really is left-coded and anti-systemd sentiment really is right-coded. A lot of this stems from the 4chan /g/ (Technology) board's staunch opposition to systemd and Rust + the broad cultural influence 4chan retains to this day. I've been a 4chan regular for over a decade and have seen all sides of this.

I know what I'm saying sounds, and is, utterly ridiculous, but this really actually is a "thing", for some reason. Of course most people who like/dislike systemd or Rust aren't politically motivated or even aware of these associations, but there's a surprisingly large chunk of both who are, even if they aren't really consciously thinking of it in this way. It's very easy and understandable to laugh at someone accusing systemd critics of being neo-Nazis - it's the ultimate Godwin smear/cope - but for a variety of reasons the inverse pretty much does hold: alt-right technologists indeed are near-universally actively opposed to systemd, and generally actively hostile towards Rust. The Rust opposition is in large part due to their belief that Rust and Mozilla are associated with trans people and that if you use Rust you are "pro-trans agenda", "cucked", "pro-Jewish", "pro-globohomo". The systemd opposition is less concrete and seems to be more a matter of happenstance + typical conservatism/"if it ain't broke, don't fix it".


Rust is to C what systemd is to /etc/rc. Both C and /etc/rc are defining characteristics of old-school Unix culture, so it kind of makes sense even just on account of that that folks who hate systemd also hate Rust.

What puzzles me is that OpenBSD people seem to be quite actively opposed to Rust too. They are the project that disables hyperthreading for security reasons, runs ld to relink the kernel after every boot to shuffle memory addresses, patches all sorts of software to support capability self-limiting with pledge, and so on. And the idea of using a fast memory-safe language is somehow nonsensical to them. It is hard for me to take this opposition as motivated by security.


If you want to understand how it might be motivated by security, just read the LKML threads around Rust - the Rust developers’ claims are at times pretty wild, and in my experience (after 20 years in infosec), they tend to have a relatively narrow view of “safe”.

I am expecting new vulnerabilities to pop up from developers' misunderstanding of what Rust actually guarantees, especially in the same memory space as the kernel.

On top of that, Rust implies a huge new bundle of complexity, a second compiler to have bugs in, and a new software supply chain to attack. The language is extremely complex compared to C. These are not easily dismissible problems.

While Rust is definitely a step up from C++ in embedded, I am not convinced bolting it onto existing kernels will fix more potential CVEs than it will cause.


But unlike Linux, OpenBSD is not only the kernel, it is also the userland.

The project could start moving critical software to Rust. They could even write their own crates for this purpose, or fork others’ crates to rule out supply chain attacks.

None of this would be unprecedented for the OpenBSD project. They have forked Apache, OpenSSL, they maintain their own SSH client and server. What would be new is that now all of this would not be happening in C but in another language.

Edit: I don’t even think that the above has to be done in Rust. It could be done in any other modern language. But you also mention the complexity of Rust. In what way do you see it as an infosec problem?

To me it appears that the complexity of Rust is good. The limitations that the language puts on your code give you pain before compilation, not afterwards. It makes you do work that avoids certain kinds of memory and logic bugs.


Do people even use C++ in embedded? Even subsets?


Yes.


You are correct, and that's an excellent analysis.

That's what I was getting at with my comment about how claiming we should go back to the steaming pile of shit System V init system with all its directories full of symlinks they think is "Classic Unix" is like claiming Nickelback's "Woke Up This Morning" is Classic Rock because you never heard of the Rolling Stones "Start Me Up".

https://news.ycombinator.com/item?id=34546815

There's definitely an underlying anti-BSD, anti-Berkeley, anti-hippie, anti-Mozilla, anti-woke, anti-gay, anti-trans, pro-Brendan-Eich, pro-GamerGate, alt-right sentiment that runs through all those irrationally ignorant anti-systemd weenies received belief system they parroted from 4Chan.


Would it make more sense to look at it from progressive/conservative lens instead of left/right?

One group thinks change is ultimately a good thing and will make changes for the sake of changes.

Another one thinks change is ultimately a bad thing, and will always oppose it.

Too much change is chaos, too little is stagnation. The truth is somewhere in the middle.


So, this is the problem: you can easily say something is sufficient but not necessary, or vice versa; the converse is often not true. It does seem silly, and you're right that a lot of conservative hackers are anti-systemd, but the converse really isn't true. I don't even think it's a matter of "here are some exceptions": most of the anti-systemd crowd are not reactionaries.

Also, "conservatism" meaning "if it ain't broke, don't fix it" is NOT equivalent to political conservatism at all. Conservatives advocate for changes that contradict tradition all the time. Just look at the zoning debates in the US: none of the "keep X neighborhood SFH" people who argue from "preservation" want to minimize parking to the point where they can't fit their F-150s anymore, which dwarf the trucks of the 50s. It has a veneer of "respecting tradition", but they don't really care about tradition in the strict sense. It's like most political ideologies: it centers on a set of values and ideas, with justifications that come later.

The left really is the same, honestly. While "progressivism" has a nice summary as being "for political change" in some respects, they too respect tradition and history when it fits their aims. It's more correct to say that both sides have a set of values and ideas that are central, without some overarching central tenet or explanation.


Yeah, I wrote "inverse" but I meant "converse". It's definitely one of those situations, here.

>Also, "conservatism" meaning "if it ain't broke, don't fix it" is NOT equivalent to political conservatism at all. Conservatives all the time advocate for changes that contradict tradition all the time.

True. Frankly, I don't know the exact reason why anti-systemd sentiment seems tied up with conservative/reactionary political tendencies (or, more correctly, why the latter seems tied up with the former).


The difference is that Java has always been driven from the enterprise.

And so there has been this long investment in frameworks, libraries etc that no one would ever otherwise write. Especially areas like governance, security, compliance, reliability etc.

I think Rust is just going to end up just being a better C++ not a true mainstream language.


See this article: https://www.theregister.com/2022/09/20/rust_microsoft_c/

* Microsoft Azure's CTO says C and C++ are deprecated

* Most-loved language in StackOverflow dev survey for 7 years running

* Now in the Linux kernel

Sounds like success to me.


When you say enterprise, who do you mean? Rust is absolutely being pushed by FAANG et al., for example. Just look at the bottom of the Rust Foundation page [0]. You do not see this support for things like Nim or Julia [1].

[0] https://foundation.rust-lang.org/

[1] I checked; it looks like Intel is funding Julia development to some extent. The rest is government research orgs like the NSF and DARPA. Okay, that's "large org" support on some scale, although it's clearly because Julia is meant for science, not industry.


FAANG isn't what people typically mean by enterprise; they mean programmers writing internal tools at Fortune 500 non-tech organisations. It is very different from FAANG, and it is usually boring languages like Java or C#. Think of all the COBOL code they had: those programs are still written, and now it's Java/C# instead of COBOL.


What's a "true mainstream language"?


Impersonal ad hominem is a great new way to construct an argument.

Or, maybe not that new. "These people do that" without even saying which people you are talking about has been used in politics since forever.


The complement to this paradox is the Linus Torvalds C++ rant http://harmful.cat-v.org/software/c++/linus

> C++ is a horrible language. It's made more horrible by the fact that a lot of substandard programmers use it, to the point where it's much much easier to generate total and utter crap with it. Quite frankly, even if the choice of C were to do nothing but keep the C++ programmers out, that in itself would be a huge reason to use C.

> In other words: the choice of C is the only sane choice. I know Miles Bader jokingly said "to piss you off", but it's actually true. I've come to the conclusion that any programmer that would prefer the project to be in C++ over C is likely a programmer that I really would prefer to piss off, so that he doesn't come and screw up any project I'm involved with.

Oh Linus...


I wonder how he feels now about getting the Rust programmers.


> You push blobs of source code around the way a sculptor does blobs of clay

Love this sentence. Python is a success precisely because it enabled a lot more people to be sculptors. Of course this means you see a lot of ugly, unfinished pieces. And indeed, bronze or iron sculptures may require a slightly different workflow and skillset.

It's not clear for how long Python will retain this advantage. People parlay lists of technical reasons, but most likely the new hot language will be one that makes it even easier and more fun to push around blobs of code.


> ...along with Ruby (and Icon, and Joy, and J, and Lisp, and Smalltalk) the fact that [Python and Perl are] created by, and used by, people who really care about programming.

The distinction Python has (wrt to the rest of this list) is that it is a language created by someone who really cares about systems programming, descended from a language (ABC) created by people who really cared about programming theory.

(that said, my vote for the mechanism of python's current popularity is: time and chance)


Yeah it is dated. Python got hyped and now people learn it because data science hires python people, and data science pays well, not because they care.


I used to interview people for data science roles. We let people use whatever language they wanted in the interview. I expected to see a split between Python and R, but the reality was that almost everyone used Python, which was a bit of a surprise.

It's even more surprising to me given that Pandas is such a poorly designed library. (Apparently the guy who developed it was learning Python while he was writing Pandas.) I saw candidates make mistakes due to quirks in Pandas often. And quirks can be extra deadly with data science code: https://news.ycombinator.com/item?id=33797339 So overall it's a weird situation. I think a big factor is that a lot of data scientists are newbie programmers who are just piling in to what's popular & established.


Even more surprising is matplotlib: the most infuriating plotting library I've encountered, and yet it's the de facto standard for most of data science now!


That comes from mimicking Matlab's plotting API, as a way to move people from Matlab to scientific Python. "It's a feature, not a bug", taken a bit to an extreme. But it was successful.


>Apparently the guy who developed it was learning Python while he was writing Pandas

This makes a lot of sense. I remember trying to learn Pandas before v0.24 and finding the syntax changing a lot between versions which just confused me even more. I later learnt some R and was blown away by how easy tidyverse, and even base R data.frames were to use.


I remember Python being hyped for being a good beginner language and a good web programming language(Django, Flask).

It does seem that Python has settled into a niche as one of the mainstream data science languages.


There's a saying "Python is the second-best language for everything".


Typed python is extremely ugly and frustrating. Does anyone have any good advice on doing it well? It seems like every library has their own system and doing any kind of casting just to make the linter be quiet is such a chore.


If you're doing casting, it's probably a code smell and a good indicator of something being off, or you're doing something too exotic. I'd say this applies to normal compiled languages too.

To do types well, I'd recommend constraining as much as possible. E.g. if a library gives you a union of 5 possible return types, just type it as one of them that you know you'll get and don't let that "complexity" propagate further into your code base. Likewise if you're wrapping a lib function that can take 3 levels worth of complex union or umpteen varargs, don't let that bleed through in your function signature "just in case".

And the most crucial advice: don't use Google's type checker, pytype. Or any other type checker, for that matter, other than PyCharm's built-in one and mypy. Don't faff about with VSCode either. PyCharm will make your Python type-hinting journey a pleasant experience (barring some generics and OOP corner-case scenarios).
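
To illustrate the "constrain as much as possible" advice above (a sketch; fetch is a stand-in for a hypothetical library function with a sprawling return type, written in the 3.10+ union syntax):

    # Stand-in for a hypothetical library function with a sprawling return type.
    def fetch(key: str) -> int | str | bytes | None:
        return 42

    def get_count(key: str) -> int:
        value = fetch(key)
        # We know our keys always map to ints. The assert narrows the type
        # for the checker and fails loudly if the assumption ever breaks,
        # so the 4-way union never leaks further into the codebase.
        assert isinstance(value, int)
        return value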


> Or any other type checker for that matter other than Pycharms built-in one and mypy. Don’t faff about with vscode either.

Eh, I’m not getting different editors for different languages (coding, documentation, and otherwise) used in the same project, and IntelliJ doesn’t have nice tooling for everything I need that VSCode does. Not sure how good Pycharm’s typechecking is, but pyright is, IME, usually better than mypy, and VSCode lets me belt-and-suspenders both of those if I choose (I did for a while, between being pure mypy and being pure pyright).


PyCharm will be a big upgrade over VSCode for any full sized project trying to use type hints (and really for any Python project)


I agree. It is not perfect, but if pycharm can't figure out the type of an object, humans can't be expected to either.


To me it seems like the ugliest parts of every language are the parts dealing with anything but the simplest types. When langs start introducing things like generics, things really get gruesome.

That's why I've always personally enjoyed duck typing.


> Thats why I always personally enjoyed duck typing.

Static or dynamic? You will ask a non-duck to quack. Do you want to find out before or after you start your application?


If you care if it can quack, you see if it can quack.
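
A minimal sketch of that in Python (the EAFP style: try the behavior rather than checking the type):

    def make_it_quack(thing):
        # Duck typing: don't ask what it is, just try the behavior.
        try:
            return thing.quack()
        except AttributeError:
            raise TypeError(f"{type(thing).__name__} can't quack") from None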


Use a powerful IDE and lower your expectations. Python is halfway through a decades long transition to being a statically typed language, but there is no consensus that that's where we're going, so a minority of libraries will be a pain in the typechecker for the foreseeable future.


> It seems like every library has their own system

Well, there are only like 3 or 4 checkers in existence.

Then there's pydantic which needs a mypy plugin because it uses the types in a very non-standard way. But I use typedload (which I wrote) so that's not an issue for me.

Other than pydantic, I've just encountered libraries without type definitions, or with them done properly.

Perhaps the fact that I tend to avoid adding dependencies helps me there.


I've seen typing go wrong more than once and I think it always follows the same dumb non-nuanced pattern: "typing is great, therefore we should do it everywhere!"

Anyone following this approach is an idiot, it's like prescribing a very niche diet to everybody.

If you want it to be useful, instead of doing it everywhere you should limit it as much as possible. Find a few isolated places where you want types and limit the analyzer to those areas only (btw, it is entirely possible that there are no such places in your code; that's a good thing). You can always extend the area covered by the static analyzer, but you can never shrink it (at least I've never seen it happen).

If you don't do this, typing becomes viral and you will soon be doing typing for the sake of typing and fighting your tools instead of extracting value from them.
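
One way to scope it like that with mypy (a sketch; mypy also supports per-module options in its config file, and as far as I know it honors inline "# mypy:" comments for flags like this):

    # mypy: disallow-untyped-defs
    # This file opts in to stricter checking; the rest of the codebase can
    # stay unannotated until annotating it is actually worth the effort.

    def price_with_tax(price: float, rate: float) -> float:
        return price * (1 + rate)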


>Does anyone have any good advice on doing it well?

Yes, don't try to shove all your code into OOP, and you will be fine.



Use mypy. More mature.


There is no point in statically typed Python, it's useless.

Just a lot of people from other programming languages coming in and trying to make the language more familiar to them.


I moved from python2 to python3 to typed python3.

It makes refactoring much easier, and I can easily understand what parameters a function wants without having to read the docstring or, worse, the body of the function.

Typecheckers do find a lot of errors that would otherwise be runtime errors.
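
A tiny example of the kind of error that gets caught before the code ever runs (a sketch; the exact mypy wording may differ):

    def mean(values: list[float]) -> float:
        return sum(values) / len(values)

    mean("12345")  # mypy: Argument 1 to "mean" has incompatible type "str";
                   # expected "list[float]" -- at runtime this is a TypeError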


> typecheckers do find a lot of errors that would otherwise be runtime errors.

This is a claim I see repeated over and over without much evidence to support it. In my experience type checkers find a lot of errors that would still be found at later stages and a lot of tiny errors that don't matter and can be dealt with via much cheaper tools.

Shorter feedback cycle is nice, and occasionally it does catch a real bug, but the benefits of static typing are vastly overblown.

There probably are teams where typecheckers do catch a lot of important runtime errors. Those are the teams with a horrible engineering culture. In those environments typecheckers are a bandaid, not a cure. If anything, they are hiding the real problem: you shouldn't give a perfume to someone who doesn't shower.


Typed Python code on average takes two times the time to develop and contains two times the number of bugs per delivered software feature.

It's bad, really really bad.

Refactoring can be done just as well via a good IDE and unit testing.

Using typechecking creates far more errors than it finds. It's a necessary evil in compiled languages. It's been measured that it creates more errors. That is not open for debate.


What is this based on? I have a lot of trouble believing untyped Python code has half the bugs of typed code, as someone who has written and read a lot of both.


Actual measurements.

I don't think you should be surprised. Untyped Python code is a lot shorter and more concise than typed Python code.

The easiest way by far to reduce the bug count is to reduce the lines of code.


That's a good point I hadn't considered. Where did you find these measurements?



Maybe it's my daftness, but where does this post say that typed Python has twice the number of bugs?

It only seems to say that typed code takes longer to write and is longer. It doesn't seem to say that typed code has more bugs, nor does it compare typed Python with untyped Python.


I remember reading this post back in 2004, while transitioning my main language to the esoteric hacker language Python, since I did not like Java or Perl at all. The argument in the article resonated with me and reinforced my belief that Python was the right choice. Looking back, it was a great decision: I'm still using Python as my main language, and it is still going very strong.


It's called "anecdotal evidence", the only thing that makes this post credible(-ish) is that PG wrote it.


It's not just PG: https://news.ycombinator.com/item?id=20212869

Although I do think between Python and Java, just pick your poison, they're not night and day. Unless you're writing 1990s Java consisting of factories of factories which is more to do with library style and over-conventionalism (back then when people were writing APIs after reading GOF). Even in 2004 I don't think there's much credibility in knowing both languages although I wasn't around.

What I find stupid is that a lot of recruiters and jobs look for certain languages/technologies and require X years of experience in them - sometimes with requirements predating the existence of the framework itself. Makes sense if you for some reason need a Kubernetes guy, but mostly not.


> Unless you're writing 1990s Java consisting of factories of factories which is more to do with library style and over-conventionalism

But I still see this happen more than 20 years later when brilliant ex-Java folks are doing Python.

Java has this weird cultural inertia which reminds me of the "Five Monkey Experiment".


Since I have been following PG on Twitter I understand that the reason he has so many good takes is because he has so many takes in general. And most people tend to skip over the incoherent, offensive or just plain dumb ones.

I wish more of those weren't flagged to death when people post them here.


PG's takes are pretty good by Twitter standards at least. Honestly his account is one of my favorites on Twitter. If you don't think his tweets are good, I'm genuinely curious whose tweets you actually like -- can you recommend any accounts?


A broken clock is right twice a day.


No. People aren't clocks. The best way to make something great is to make ten meh things and discard a hundred crap things. I guarantee you everyone whose takes you admire also has many, many bad takes (which they perhaps don't tweet, but there's a definite trade-off here, and PG's lower threshold for tweeting seems to work well enough for his goals).


I take it you're not aware/familiar of/with the phrase.

Here's a deeper explanation from a random Google search result: https://www.businesswritingblog.com/business_writing/2022/05...


This here. Regardless of talent, you will produce quite a lot of junk.

I am a programmer and a published book author (8 books so far and over 35000 sold copies). I wrote a pile of shit, too, but it helped me become a better coder and a better author.


To be fair, blog posts based on anecdotal evidence hit the frontpage on HN all the time. I think you're implicitly holding PG to a higher standard.

Then again, maybe he's asking for it, given that he calls his blog posts "essays". Lol.


Not everything needs to be scientific or supported by hard evidence. It’s just a short blog post, 100% an opinion, nothing else.


Well said. This read like pure pseudoscience to me, hehe.


Pseudoscience feels like the wrong label, because it's not trying to be science. To me, it's clearly just an opinion post based on his personal experience.


What an interesting theory. Talking to new hires, they learned Python because that's what they were taught in school. Maybe in the 1990s, when Python was competing with Perl among second-generation scripting languages, it was the case that enthusiasts picked it up (along with JavaBeans - it was the 90s :), but Python is institutional now in CS coursework.

EDIT: I just realized this was 2004. And now I agree with him.


Yaron Minsky of Jane Street said the same thing about OCaml


I feel like that's a bit different: OCaml is a functional language. Functional languages have a reputation of being difficult. Plus, they encourage a mathematical way of thinking about your code.

By selecting OCaml developers, you filter for developers who

- aren't afraid of trying stuff even if it has a reputation of being hard (or learnt it at school)

- are biased towards a mathematical way of thinking


Funny how Python is considered esoteric by the author, when Python was amongst the oldest languages in widespread use at the time (2004):

- C++ 1985

- Perl 1987

- Visual Basic 1991

- Python 1991

- JavaScript 1995

- Ruby 1995

- Java 1995

- PHP 1995

- C# 2000

Back in 2004, despite how small Python was, it was still in the top 10, just behind Delphi. Java dominated, followed by PHP.


I don't know how popular the language was, but I distinctly remember it was seen as unfit for "enterprise" work. This was a time when the Serious devs would do SOAP, CORBA and other boilerplate-heavy RPC. Python was always branded "bad performance because of the GIL", because multi-threading was gonna be the future anyway.

I don't know how popular Python was, but I remember it was a "toy" language, and when Reddit rewrote their whole website in it a few years later (2005? 2007?) it was seen as a bold risky move.


In 2000, plenty of enterprises were adopting Python to replace Perl, and doing CMS based on Zope.

During the late 90's there was a DDJ issue dedicated to Python.

The first Python version I used at work was 1.6.


2004.

Almost 20 years ago. Was Python even a teen?

Python does not attract smart young iconoclasts anymore. It's the scripting language everyone writes because everyone writes Python.

What languages do attract smart young iconoclasts? Nim? Elm? Whatever you're doing that is not yet old enough to drink by the time you comment?


The hype around Python didn't age well. Python is on its way to becoming the new Java, almost literally trying to match it feature for feature.

Interesting how it started to attract people with the same mindset that likes Java: great care for trivial correctness guarantees such as those ensured through type hints, or producing programs by ticking boxes in existing templates. And disregard for large-scale design issues - something that was initially thought to be helped by more concise syntax and greater freedom to explore / less fear of novelty.

However, the thought that dedication to a (marginal) language will help you hire better programmers isn't new, and has some supporting evidence. I remember similar argument coming from Linus Torvalds, when he didn't want C++ programmers working on Git, for example.

I also happen to think that a lot of new(-ish) languages today are trying to ride the hype wave, and by doing so are trying to become instantly popular rather than appealing to the tastes of a small but dedicated community. E.g. Rust is trying to be a "better C++", but in a way C++ programmers would most likely not be repulsed by. Both superficial decisions (such as the "familiar" curly-bracket syntax) and more fundamental ones are there to ensure that the large audience of C++ programmers has an easy migration path.

I think that, today, if you wanted to match the effect the author saw from hiring Python programmers in 2004, you'd have to go after people dedicated to undeservedly forgotten languages such as Ada, Scheme or Erlang. Rust is poised to become just a slightly different flavor of C++, with a large volume of low-skill programmers pumping out low-quality software from big corporate shops. For now, it may attract enthusiastic people, but that's not going to last.


It would be an interesting future in which Koka[1] becomes the new Python. Or Lean 4,[2] and we end up with formally proved ML code in 2043.

[1] https://koka-lang.github.io/koka/doc/index.html

[2] https://leanprover.github.io/documentation/


My vote is on Nim. Python-like syntax with C++ speed. Unknown enough, but still not too new or obscure.


The community gives me pause. Nim's BDFL seems to have a habit of driving away prominent users of the language. Programmers who had contributed to the compiler have made their own hostile fork, and two of the three people who have written book-length introductions to the language have either given up on Nim or been tempted to do so. (The third is the BDFL himself.)

If anyone cares, here's some comments from one of those authors: https://github.com/StefanSalewski/NimProgrammingBook/issues/... . And from the other: https://twitter.com/d0m96/status/1592827547582332929?t=IqKBz... .

These quarrels with Araq differ obviously, but both are pretty discouraging as far as Nim's prospects are concerned.


Interesting. I saw shades of this when I tried porting a tool over to Nim, found it was much slower than trivial code in Python and Groovy, then stumbled into this discussion:

https://github.com/nim-lang/Nim/issues/9026

Nothing as strong as the above, but it definitely rubbed me the wrong way. So much advertising about Nim being efficient/fast, yet the default way to read a file is incredibly slow and inefficient... and they don't care.


That issue is about readLine specifically, right? Not reading a file in general. The only time I ever get to use readLine is to solve Advent of Code problems.


That's a really petty argument against Nim and it makes me sad, really. 'Hostile' fork isn't true - it's just a fork. I wish you the best.


Most languages fail to gain traction for reasons that don't have anything to do with the design of the language itself--even "petty" ones, such as the tactlessness of the people in charge. I don't want to use a technology which few other people use and which has, in my estimation, an unpromising future.


Asserting that the English language can't evolve is a weird take from someone who is trying to evolve a programming language.


My vote is for Nim too, but I'm decidedly biased. It's allowed us to do things in embedded that would've taken 3, 4 times as long and with more developers to achieve in C++.


2014

Almost 10 years ago, I translated this article to Telugu.

https://avilpage.com/2014/12/python-paradox.html


I am old.


> So a language that makes source code ugly is maddening to an exacting programmer, as clay full of lumps would be to a sculptor.

I am completely repulsed by Dart, but people seem to actually not be bothered by the hideous syntax. Must be wearing Flutter colored glasses...


When organisations were asked "Which languages and platforms in your application portfolio have been the greatest source of risk or exposure to your organization?" in the SANS Institute survey "Rethinking the Sec in DevSecOps: Security as Code", security officers mentioned Python 29.4% of the time (first on the list; they could choose up to three languages; dated 2021).

My question is: will Python come to be seen as "risk or exposure 'free'" in the near future?

Edit: small typo


I'd be willing to wager that is because Python is simply highly represented in the workplaces of those interviewed. To think Python in and of itself is more risky or exposing than other languages, especially those like C/C++ where it's considered by some to be impossible to write safe programs, doesn't make much sense.


UPDATE: I realize this is a repost from 2004. Still worth exorcising dead links, since the content is still relevant.

What's up with the Turkish and Japanese translation links not working?

Are they just dead cos they rotted? I've been working on my own multilingual site upgrade lol so my mind is on this

@PG: to future-proof, consider using archive.org's Wayback Machine for essential links like translations; should hopefully be straightforward to roll into your blog software.


> What's up with the Turkish and Japanese translation links not working?

No idea about the Japanese translation, but the Turkish one was hosted on a Turkish Slashdot clone (fazlamesai.net) back then [1]. There is still a web site at that domain but I believe it is owned by someone else these days.

Key takeaway: Host translations yourself, do not let others host it for you.

[1] https://web.archive.org/web/20040910012806/https://fazlamesa...


NB did not bother trying the others


(2004)

It calls Python "esoteric"! :-D

I agree with the sentiment even if the example given hasn't aged well.


I too tend to agree with the sentiment. What isn't mentioned, though, is that it is quite possible to run out of people to hire. Which mostly happened to us: we had grown until everyone in the relatively small Python community who could and would work for us was working for us or had already. We had to move on to another new language (Go), although that did rub a number of the Pythonistas the wrong way.


Can you hire people who know other languages and train them in python? Why would you need to change languages?


I worded that badly, sorry. Only one project switched to Go (a rewrite), and a few new projects were Go from the start. There was still a lot of Python.

We hired from Open Source communities (mailing lists, conferences, our own website), and applicants always had at least basic language skills and more importantly a desire to work with it. Once you start hiring people from outside, who are not interested in you or your tech and just interested in a job, you lose the benefits being discussed in the article. One of the reasons we expanded to include Go was to increase the size of the recruitment pool.


Around that time it was hard to get people to learn Python because they were turned off by whitespace-delimited blocks. You could have their curly braces and semicolons when you pried them from their cold dead fingers.


Having written my first Python and Java way before this article was written, I'd say some of this stuff doesn't hold up all these years later. It took a while for startups to start using Python, but there was already big OSS software infrastructure built in it in the 1990s. And everyone noticed it was slow too: in the 1990s I remember more of the Python software being interactive and/or having desktop GUI components that were laggy compared to C/C++ ones, even worse than Java GUI elements were at the time.

An opposite observation: the engineers who want to differentiate themselves by using "less popular" tools often have some of the biggest egos and are the most blind to the issues in their designs.

The one place I worked where we built the application in Python was rife with:

- People with a big chip on their shoulder about Java and .NET being inferior

- Constant talk about how bad Java/.NET were and how their use of Python was so much better

- The design of the product hit every weak point of Python that Java and .NET are much stronger at

- We had bugs & data leaks that Python's issues made possible in ways that were almost impossible in Java/.NET

- Performance issues and jumping through hoops due to Python's issues around multi-threading

- Tons of tech debt and difficult to decipher code

Still there was a good result in the end, and the rapid iteration of Python was helpful in getting things done fast. The question is how bad was it to deal with the issues after the exit when the new owners had to keep scaling it.

At another place we had a bitter LISPy guy. Always trying to write everything in the most esoteric language he could, always leaving behind maintenance issues, never shutting up about how bad the toolchains (Java/C++) were - the very ones that actually brought in the revenue his job depended on.


I wonder if Graham writes from the perspective of a startup founder, i.e., he observed other founders who used Python in 2004 and got a positive impression.

I do not have this impression in 2023. No website written in Python is as polished as GitHub (has Ruby influenced the elegance?).

The scientific ecosystem is chaotic, new and half-finished tools are released daily, packaging is a mess. Of course some of the scientists are smart in their domain, but not always in programming.

Core development is pedestrian and run by unproductive, dominant bureaucrats who want to remain in power. From a CS point of view, Python is of zero interest and has few exciting areas.

Python has maximalist specifications with many bugs in the resulting case explosions. Python has too much churn and a horrible package manager.

I don't see smart people who are attracted to Python (like maybe in 2004). They sometimes use it, grudgingly.


Not statically typed until recently (difficult to navigate/refactor), the global interpreter lock, kwargs grossness, no multi-line lambdas, the whole version 3 upgrade fiasco... idk, it's ok.
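For reference, the lambda limitation mentioned above: a Python lambda is restricted to a single expression, so anything with statements needs a named function (a trivial sketch; the function names are made up):

    # fine: a lambda body must be a single expression
    double_plus_one = lambda x: (x + 1) * 2

    # anything with statements needs an ordinary def
    def double_plus_one_logged(x):
        result = (x + 1) * 2
        print("computed", result)
        return result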


PHP dominated the internet and then eventually became shunned with people embarrassed to admit they use it. I sometimes wonder if Python is on this trajectory but much slower or not. We are already at the point where there are very clearly better alternatives from nearly every technical perspective, so it's carried primarily by momentum and ecosystem. Can that last forever?


> We are already at the point where there are very clearly better alternatives

Can you name them? I'm not personally aware of any. Python is here to stay for a long, long time. It's become the de facto language for AI/ML, and there're good reasons for that. And programming the AI way, or with a significant component involving AI, is very likely the future of programming. Until a much better general-purpose tool comes around (i.e., a language that doesn't get in the way and comes with batteries included, especially for non-CS types who have to accomplish programming-like tasks), I don't see it going away anytime on the near horizon.


Python is fine as a scripting language for scientific computing. What more do you need?


Too many reasons to list. NumPy, pandas and matplotlib are all pretty badly designed and very annoying to work with. I shouldn't have to deal with different array types in NumPy, PyTorch, etc. Different array types have different interfaces; they shouldn't, and I don't care why. Terrible packaging system. No real threading. Slow loops. Etc.


> People don't learn Python because it will get them a job; they learn it because they genuinely like to program and aren't satisfied with the languages they already know.

At least in my area, these days there are many programmers who know Python because that's the only language they have been taught. And there are many job opportunities specifically looking for Python.


You have to put this essay in the context of year 2004. Python was not used as much as today, it was an emerging language.


Yes. I wanted to add how much have changed today, but wanted to keep it short. So just ended up putting "these days".


In 2023, it's the Rust paradox


I love learning new languages for the sake of learning, not because it's cool. That's why learning Rust says more about a person than being on a hype train does. If I were hiring, this would be one of the best indicators.


It's ugly as hell though. The person might like learning but I wouldn't think they liked aesthetics or elegance.


This post would sound so dumb if it didn’t come from the almighty pg.


Icon, that's a name I haven't heard in a long time...


In my experience, programmers obsessed with unconventional programming languages are exactly the ones that prioritize (over-)engineering over simply shipping things. If you think writing your codebase in a funky functional language will bear fruit in the long run, you're in for a surprise. The road is full of pitfalls. Here are a few I outlined in my essay[1]:

1. Hiring is harder: "When it comes to hiring people, it’s true that programmers versed with functional programming would be, on average, better than those who aren’t. However, the hiring pool itself would shrink massively. Learning a language is an investment, and not many people are using their spare time to learn a new one. As I have seen it happen, demanding functional experience soon becomes an unreasonable expectation."

2. Fewer resources: What's the point of choosing an esoteric language and having to build nearly everything from scratch? You're basically throwing away years of experience that a vibrant community provides.

3. Hackers use languages they know: I am nearing 30, and having seen the rise and fall of many tech trends, I want to play my cards right. Why should I waste my energy on learning a new language, especially when it can fade into obscurity and the output is going to be the same anyway? MeteorJS[2] was supposed to revolutionize front-end programming; who is talking about it now?

Facebook was written in PHP. Mark used the one language he knew and a $300B company was built on top of it. Meanwhile, a company I worked at that prided itself on being written in Clojure slowly fell behind the competition and barely managed to get acquired (at a deep discount to what they had raised funding at). Trust me, your time is better spent shipping, and learning the internals of how things work.

[1]: https://shubhamjain.co/2018/12/01/why-paul-graham-wrong/

[2]: https://www.meteor.com/


I've had the opposite experience. Our competition used a "boring" Ruby on Rails architecture, so they were burning investment capital on cloud hosting every month, meaning they were forced to rush for milestones and cut corners so that they could grasp the next piece of funding before their runway ran out. Meanwhile our "esoteric" codebase meant we could run everything on a couple of servers we owned and spend our time building stuff there was an actual business need for. So we were the ones who got a big acquisition while they crashed and burned. I mean, if you're looking at Facebook then you should look at WhatsApp too.


I'm sure counterexamples exist (and I'd be interested in hearing what kind of software this happened with specifically), but in the 99% case the cloud hosting costs for most startups are going to be tiny compared to, say, salaries.


Properly hosting in the cloud requires some additional salaries.


I remember Ruby on Rails being a hipster technology for teens who code on MacBooks at Starbucks. Now it’s “boring”. How times have changed!


> Meanwhile our "esoteric" codebase meant we could run everything on a couple of servers we owned

A recent, not-boring technology that requires fewer resources than the old, boring one must be an exception.


Also I've found that teams that focus on easily understandable languages (Python I think is the best with its "executable pseudocode" syntax) are those that are placing value on teamwork rather than individual flexing.


FWIW I've had opposite experiences. :shrug: Each team is different, I guess :)


Teams that focus on hard to understand languages care about communication more?

I - er.. okay.


I have seen teams using languages traditionally considered "easy-to-understand" assuming that the language being readable is sufficient to replace communication and proper documentation.

I have also seen teams using languages traditionally considered "easy-to-understand" use various forms of meta-programming to customize the code and make it even more easy to understand... eventually reaching the point where nobody except the experts of this specific sublanguage could understand it.

I have seen teams using languages considered hard-to-understand being obsessed with documentation and communication – and others who don't care at all.

I've come to the conclusion that it's a team thing, much more than a language thing.


I've never seen a hard-to-understand language, just hard-to-understand code. Without exception. Do you have a specific one in mind?


> Facebook was written in PHP. Mark used one language he knew and a $300B company was built on top of it.

This example doesn't further your point at all, seeing as over time they literally had to invent an entirely new language (Hack) to make their PHP codebase scale. And on top of that FB went on to use a whole bunch of uncommon languages (Haskell, Erlang, D, probably a bunch of others I don't know about).


Well, as the counterfactual, you can't invent a buyer who will value your Clojure shop at the last funding round.


As a counter-counterfactual, Zuckerberg spent $22 billion (the original offer was less, growing due both to stock appreciation between offer and sale and to retention) on a company with only 35 engineers writing Erlang (WhatsApp).


> Why should I waste my energy on learning a new language, especially when it can fade into obscurity and the output is going to be the same anyway?

I used that reasoning to keep writing perl when python suddenly seemed everywhere and over-hyped.

To be honest, I probably should have just taken the time to learn python sooner. I haven't abandoned perl, and I doubt I ever will, but it would have saved me some time/trouble to have picked up python earlier since it wasn't the fad I assumed it was.

Today, I assume Rust is just a fad so I'm tempted to avoid the same mistake and pick that up too. Haven't bothered yet though.


Due to Rust's character and niche, it seems rather likely to stay. Wouldn't bet against it yet. There is so much potential for it to replace old and fragile things.


If you already know a general-purpose language well, there is no reason to learn another one, unless you are tired of the language you already know.


this piece was an article relevant to that certain point in time.

where at that time - java had slow velocity - because it was tied up in marketing speak - and enterprise jargon.

however reading between the lines - languages or ecosystems that are not shaped by hype / marketing do tend to encourage faster velocity and an uncanny advantage.

in this day and age though - maybe due to resume-driven development and the influx of vc funding - things are different. keep in mind that when YC started it was just a seed fund - which encouraged companies to reach ramen profitability as soon as possible, not survive on the next round of funding.

what you now experience in the present - is rubber gum / viscous velocity amongst startups due to marketing i.e things like k8s, cloud tools etc which reach the complexity of J2EE tools of back then; when only the appointed prophets could show you the right true path.

now every company big and small - talks about doing things at scale. yet if you take a step back - no one has a definition of what scale is.

at $job - the talk of scale is prevalent - yet we're reaching for tools to solve human problems that wouldn't need tech at all.

good luck finding a place that shuns complexity, isn't afraid of letting servers fail - to make sure they're robust and reliable enough. where the presentation layer - isn't going through a yearly ritual of sacrificing hours dealing with dependencies.


So what are some present day choices for a language that can serve this people differentiator role? And yet is practical? How about F#?


Rust seems to be a popular choice for filling this role at the moment.


Imba for the web. So superior to anything else that we use it for all of my companies.


if you're trying to build a product, code is a liability. not sure that an esoteric codebase is the best signal to candidates that the bottom line is solving problems for stakeholders, rather than beautiful code.


So today it's the Erlang or Rust or Haskell or F# paradox?


time travel from 2004 to 1960s: fortran vs lisp

was lisp considered esoteric back then?

are lisp programmers smart[er]?


Python is great for small scripts or explorations. You can't use it for anything serious that needs to scale due to the lack of concurrency and significant memory overhead. If you need something to scale you use something like C++, Java, Go, Rust, etc.


Yet it's good enough to scale Pinterest, Instagram, and YouTube.


>You can't use it for anything serious that needs to scale due to the lack of concurrency and significant memory overhead.

Please actually think about what you are saying instead of just parroting off stuff you read on blogs.

Python has concurrency. It's called multiprocessing. Of course, there is startup overhead, but it achieves the same functionality. Furthermore, workloads that are most commonly threaded (like fanning out network requests) are well suited to async, which has less overhead than threads.

Also, memory considerations are relevant only for embedded systems, for which you would never use Python. Memory is dirt cheap these days.
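A rough sketch of both patterns (the CPU-bound function and URLs are placeholders; asyncio.sleep stands in for a real network call):

    import asyncio
    import multiprocessing

    def crunch(n):
        # CPU-bound work goes to separate processes, sidestepping the GIL
        return sum(i * i for i in range(n))

    async def fetch(url):
        # I/O-bound fan-out suits async: thousands of these cost little
        await asyncio.sleep(0.1)
        return url

    async def fan_out(urls):
        return await asyncio.gather(*(fetch(u) for u in urls))

    if __name__ == "__main__":
        with multiprocessing.Pool() as pool:
            print(pool.map(crunch, [1_000_000] * 4))  # runs across cores
        print(asyncio.run(fan_out(["https://a.example", "https://b.example"])))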


> Python is great for small scripts or explorations.

I've used Python on a program with >30,000 LOC without any problems.


Typically, large code bases have problems that are more intrinsic to the organization than to what language they're in. Huge code bases in any language are usually fine if discipline is exercised.

And the concurrency and memory-overhead problems come from not knowing the right way to do things. Although I will concede that the naive approach to both of those is sub-optimal.


> Although I will concede that naive approach to both of those is sub-optimal.

Paradoxically, this can actually lead to a better understanding of how to write performant code. When I first learned some of the implementation details of Python, I couldn't believe it was performant enough to work for anything ... after optimizing enough of it, I understand better what actually matters.


Also, people seem to forget the rule about premature optimization. Worry about performance when performance starts to be a problem. You identify the area where performance is lacking and optimize that. Compute is cheap; manpower is expensive.


Do you really have no problems? I find the lack of proper type support and proper variable scoping to be a huge issue.


Python has types now (though it didn't when I wrote the 36kLOC program I had in mind).

What problems are there with python variable scoping -- you can have global and local variables, shouldn't that be enough?


> though it didn't when I wrote the 36kLOC program I had in mind).

Yeah, that was my assumption. Still, typing in Python feels very clunky compared to TypeScript. And even though it is much better after 3.8 and 3.9 updates, the adoption of typing in various useful libraries was relatively low as far as I remember.

> What problems are there with python variable scoping -- you can have global and local variables, shouldn't that be enough?

Maybe it is just me, but the scoping rules feel weird: blocks like "with" statements or for-loops don't create scopes. There is no distinction between declaration and assignment. Sometimes you have to use those weird "nonlocal" and "global" keywords.
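A quick illustration of two of those quirks (nothing here is an error; that's the point):

    for i in range(3):
        pass
    print(i)  # prints 2 -- the for-loop introduced no new scope

    def make_counter():
        count = 0
        def bump():
            nonlocal count  # without this, "count += 1" raises UnboundLocalError
            count += 1
            return count
        return bump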


The scoping is bad; they should have copied Perl's "every block is a new scope" and added an explicit "let" keyword to define variables instead of overloading assignment. "nonlocal" and "global" and stuff like "lambda x=x" are a big giveaway that the approach was wrong, but I guess it was too late to change it.
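For readers unfamiliar with the "lambda x=x" trick: closures capture variables late, and the default-argument hack forces early binding (a quick sketch):

    fs = [lambda: i for i in range(3)]
    print([f() for f in fs])        # [2, 2, 2] -- every closure sees the final i

    fs = [lambda i=i: i for i in range(3)]
    print([f() for f in fs])        # [0, 1, 2] -- defaults are evaluated up front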

The typing has been great in my experience. Most of my deps have types and if they don't I can autogenerate them. The powerful typing of TypeScript is mostly not needed if you're not interacting with JS code.


> Yeah, that was my assumption. Still, typing in Python feels very clunky compared to TypeScript.

I used comments in the code to say what the types were, e.g.:

    def processNotesForm(d):
        """ process the notes form
        @param d::{str:str} = the form data
        """
Of course in modern Python one would simply say:

    def processNotesForm(d: Dict[str, str]):  # with: from typing import Dict


Lack of typing support is a major advantage. If you don't have types you don't need interfaces, generics, etc. The resulting code is shorter and less bug prone.


You still need all of that. You just have to store that information in documentation and do analysis in your head instead of relying on a static analyzer.


No, you really don't. Static typing is very complex and error prone compared to dynamic typing.

There is a lot less stuff that needs remembering.


Languages like Python and TypeScript support 'any' as a type, so you can always opt out of strong typing if you want. Most of the time generics are preferable to things randomly dying at runtime though.
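A minimal sketch of that trade-off (assuming Python 3.9+ for the builtin list generic):

    from typing import Any, TypeVar

    T = TypeVar("T")

    def first(items: list[T]) -> T:       # generic: call sites stay checked
        return items[0]

    def log_event(payload: Any) -> None:  # Any: effectively opts out of checking
        print(payload)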


Things don't die very often due to typing issues. The unit tests always pick up typing problems.

Testing the code's behavior implies testing the code's typing. And if you are not testing the code's behavior then the code isn't really tested at all.


Does your code check every single variable is not null before using it? That every function argument is of the expected type? That every attribute exists before it is accessed? And do you have unit tests for all of this too?

If so - then you're writing a whole lot of manual checking that a strongly typed language would perform for you, at compile time. If not - well then you're doing less testing than a strongly typed language would.

I agree that these issues are rare, and there is evidence that strongly typed languages have a similar number of bugs to dynamic ones. But suggesting that because you have unit tests, you don't need strong types, is a bit naive to me.


I don't actually care about any of that stuff. You are confusing technically wrong with not working. If the code works in production it doesn't matter that the code is technically wrong.

Testing via typing is very weak testing. Almost worthless. Type checking doesn't find many bugs in general.

Strongly typed languages have 2.5 times the number of bugs as dynamically typed languages per software feature.

Why would you intentionally add bugs to your code?

It doesn't make any sense to me.

Thinking using static typing in a scripting language is a good idea is pretty naive. It's like creating a version of Haskell with mutability.


> If the code works in production it doesn't matter that the code is technically wrong.

This is actually very insightful, but I'm afraid you won't be able to convince most people.

The only valid metric of code correctness is empirical: how many times code ran successfully in production. Everything else (unit tests, static typing) is theoretical and often close to useless.

I remember seeing a talk where someone analyzed all Github repos to find which languages produced most reliable software. He found C++ to be the most reliable. But not because C++ itself is great as a language. The main reason was that lots of the dependencies that other languages are using were written in C++. C++ software happened to be the most battle tested.


Very insightful indeed. "I never lock the door to my house but it's never been robbed"


Once you have five years of real-world experience and find programs in production that work for the wrong reasons, you will understand. (And I'm not just talking about programs with dynamic typing, either.)

I'm aware of one case where a program traded 10 million dollars, made 5 million profit. There was a serious jaw dropping error in it. Did it matter? (and if so, to whom?)


The alternative is pretending your house is a treasury and buying a lock which costs more than any other item in your house. While your roommate develops kleptomania.


Python is a glue language. If you want to be hardcore you write something in rust (or whatever) with python bindings...


Counterpoint: wanting to be hardcore is probably not a good reason to do something.

If you want to learn rust/go/nim/haskell/whatever do that, and that is probably going to be worthwhile. Do it to learn more about programming, gain fresh perspective, pick up new skills, broaden your mind, any number of reasons. But don't do it to "be hardcore".


Or you can just pick a language like Java, Scala, Go, etc., and not need to manage two codebases, build tools, testing setups, and so on.


Or just write in Python like Quora, Pinterest, YouTube, Dropbox, etc


YouTube is almost entirely C++ and Dropbox rewrote their sync engine years ago in Rust because of how ridiculously slow their Python implementation was.

No one is saying you can't make a web site using Python. Just that it is inherently a slower language that is far poorer at concurrency than many other languages.


Python is not slow for concurrency: Python has built-in cooperative concurrency that's pretty efficient, and if you care about minimum task time vs running more tasks in parallel, you need to look at multiprocess execution. In the latter case, you are generally inefficient with memory use, but if that's not an issue, you'll be effectively using your CPUs.

However, Python is slow compared to most compiled languages for simple large loops over any data structure.

A good example is how ORM libraries will fetch a million rows from the database as a list of tuples in roughly the time the db emits them (say, a second with C-based driver code), yet take 100x or 1000x as long to turn those tuples into Python objects with pure-Python code.
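A sketch of the pattern being described, with sqlite3 standing in for the database driver (the table and class names are made up):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)",
                     [(i, f"user{i}") for i in range(100_000)])

    # fast path: the C driver hands back plain tuples
    rows = conn.execute("SELECT id, name FROM users").fetchall()

    class User:
        def __init__(self, id, name):
            self.id = id
            self.name = name

    # slow path: per-row object construction in pure Python
    users = [User(*row) for row in rows]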


Both of those became unicorns before rewriting. It is quite likely that they were able to grow so much precisely because they chose Python.

Google emails prior to acquisition happen to confirm that.


I guess YouTube's backend (which is written in Python) is not hardcore enough.



