Reading this got me thinking and I wonder if other people feel like me about this, so I'm going to share it. This is not serious, but not entirely unserious...

I try to be a good sport about it, but every time I write python I want to quit software engineering. It makes me angry how little it values my time. It does little for my soured disposition that folks then vehemently lecture me about the hours saved by future barely-trained developers who will ostensibly have to come and work with my code. Every moment working with python (and that infernal pep-8 linter insisting 80 characters is a sensible standard in 2019) increases my burnout by 100x.

I try to remind myself that we're trying to make the industry less exclusive and more welcoming to new developers and "old" isn't necessarily "good" (in fact, probably the opposite), but damn I just don't understand it.

It used to be that I could focus on other languages (Erlang, Nemerle, F#, Haskell, Ocaml, even C++) and sort of balm myself. But now, I can't even overcome the sinking feeling as I read the Julia statistics book that I'm going to be dragged back to Python kicking and screaming in the morning, so why even bother?

And frustratingly: it's one of the few languages with decent linear algebra libraries. And that means it's one of the few languages with good ML and statistics support. So it's very hard not to use it because when you want to cobble together something like a Bayesian model things like PyMC or Edward actually give you performance that's obnoxiously difficult to reproduce.

This is what the industry wants and evidently a lot of people are okay with it, but to me it's misery and I can't work out why people seem to like it so much.




I am about to hit a decade of Python experience. I work with it daily; all my major codebases are written in it.

I hate it.

It has an ok object system, which is possibly its only redeeming quality. I found Racket about 4 years ago, and any new project that I work on will be in Racket or CL.

I could go on at length about the happy-path mentality of Python and the apologists who are too ignorant of other people's use cases to acknowledge that maybe there might be some shortcomings.

The syntactic affordances are awful and you can footgun yourself in stupidly simple ways when refactoring code between scopes. Python isn't really one language; it's more like four, one for each scope and type of assignment or argument passing, and all the syntactic sugar just makes it worse.
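A minimal sketch of the kind of scope footgun I mean (my own toy example, not from any real codebase):

  def make_counter():
      count = 0
      def bump():
          count += 1    # UnboundLocalError at call time: the assignment
          return count  # makes `count` local to bump()
      return bump

This works fine as module-level code; move it into a closure during a refactor and it dies at runtime unless you remember to add `nonlocal count` inside bump().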

Not to mention that packaging is still braindead. I actually learned the Gentoo ebuild system because pip was so unspeakably awful the moment you stepped away from the happy path.

Blub blub blub, someone help I'm drowning in mediocrity.


I am about to hit two decades, with a lot of pauses. I now use it professionally, but it was still my favorite language as recently as a decade ago (then I fell in love with Common Lisp and, lately, Haskell).

I think you need to look at it historically. Against other languages circa 2000, Python was a huge win. Easy to write (compared to C), consistent (compared to PHP), minimalist (compared to Perl), multi-paradigm and playing well with C (compared to Java), with a great standard library (compared to Lisp).

Today, the landscape is very different. A lot of languages took hints from Python and were influenced by it (including Racket, I'm sure). Functional languages have taken off. Type inference is a standard feature.

History is always changing. There is some potential for a next Python, but we don't know what it will look like yet. I suspect it will be functional, but I don't think it will be in the Lisp family. It probably won't happen until after Rust and/or Julia see a lot more adoption. Anyway, just like C, Python will be with us for decades to come, for better or worse.


Two-decade Perl programmer here who's been using a bunch of Python recently. I don't hate it, but I've not been getting any particularly complicated use cases. I do miss Moose a lot, and I miss Bread::Board, and I miss DBIx::Class, which SQLAlchemy does not make up for. The weird scoping didn't take too long to get right, although string interpolation still takes me too much thinking.

What I am missing from Perl is library documentation and MetaCPAN. It seems that the go-to solution is that you're meant to create some kind of damn mini-site for any even vaguely complicated project you do using Sphinx, which seems bizarro land. Also the _requests_ documentation looks like it was written by Wes Anderson and wtf I hate it. Also I hate that all libraries have cute rather than descriptive names; yes there's some of that in Perl, but it feels like less. Bah humbug. Other than that it's fine.


This resonates with me. I miss working with Perl greatly (my current employer forbids me from writing anything in it), and having to deal with Python instead makes me miss it even more.


I have nightmares of dealing with Python packaging... I finally gave up and just started using PyInstaller to ship binary blobs to people.


That doesn’t even work in all situations. What if the system requires development packages? What if it’s a different OS or architecture? Packaging is a nightmare.


What if you are on Linux and your binary requires a slightly newer version of glibc for some reason etc.

Static, portable binaries on Linux are hard.


That's why Docker is a good idea for packaging things up. You basically get exactly the combination of binaries, libraries, etc. you intend to run. Probably not a bad idea to use it for development as well. Or at least something like pyenv or whatever it is called these days.


Docker saves some pain, but introduces others.

You have a container that will run anywhere, but it has a bigger attack surface and needs updates.

I also would disagree that it's easier than using virtual environments for development; it only beats them for packaging, and even then only in some narrow use cases.


They aren't too bad IMO. However, you need to set up a build environment/VM with your earliest-supported Linux+glibc. Build on old, run on new works well.

Linux backwards-compatibility is pretty good, in that a static binary should run just fine on newer systems. I've had far worse experiences with OpenBSD, where a build on an older version of the OS would never seem to run on a newer system.


> Static, portable binaries on Linux are hard.

That’s why it can’t be an afterthought. It must be baked into the language as part of the design (golang)


Why would your binary require glibc if it’s statically linked?


In order to evaluate the relevance of this comment we need to know what else you have been using for 10+ years that you like or at least think is ok.


Thanks for sharing; it certainly makes me feel better about my own troubles accepting Python. And "the happy path" might be a state you spend a /minority/ of your time in, depending on the project. First you spend 10% of your time knocking out the easy 80% of the features, but the tricky rest can be more awful than it should be.

And packages. I thought I was just stupid, but thankfully there was more to it.


Quite welcome. I think you are on to something. The thought that comes to mind is "Python is a great language if you want to do exactly what everyone else is doing."

Most of the love has nothing to do with the language itself, but rather with the fact that people can make use of some library that a big engineering effort produced in another language entirely. Thus they conflate Python with access to that functionality.

All of my code bases are pure Python (in part because of the packaging issues), so maybe that is the difference: people aren't really Python users, they are numpy, pandas, django, sqlalchemy, or requests users. It would be interesting to compare the view of library maintainers vs. just "users."


Agreed, this is it. When people think Python, they think of all the default use cases covered by the many libraries that let you "get things done" quickly with syntactic sugar. Alas, there's more to our work than the first few miles of getting things done. There's the scaling part too, and Python loses there: no good threading, no enforced typing, bizarre scoping, a half-baked module system, etc.


It seems to value my time quite highly, as I can achieve most things more quickly and easily in Python than any other language I know.

Can you be more specific about how it increases your burnout? Is it the language, or someone forcing you to use that linter and settings?


> Is it the language, or someone forcing you to use that linter and settings?

It's definitely both.

Preface in true internet style: these are just opinions and you may not share them. That's fine.

I really don't like Python as a language. I don't like its total lack of composability. I don't like its over-reliance on a very dated vision of OO. I don't like how its list comprehensions aren't generic. I don't like how it farms all its data structures out to C. I don't like how it uses cruddy one line lambdas and forces me to name things that are already named by the things I'm passing it to.

And also the linter just exacerbates these things, because the linter is just a crystallized slice of a kind of software engineering culture I really don't like.


I'm not sure why Python's list comprehensions not being generic is such a big downside; I personally think they are perfectly fine just the way they are. For context, I've been professionally writing Python code for 14 years now. I understand it's not the perfect language, but for quite average programmers like myself it does an excellent job of getting out of the way and letting you focus on the data, because that's what programming is (or should be): managing and processing data.

Like I said, it's not the perfect language; it does not have Erlang's pattern matching, nor Lisp's insistence on thinking about almost everything in a recursive manner, but you can at least mentally try to incorporate those two philosophies into Python once you know about them (and once you know about other similarly great programming concepts).


If by generic you mean able to use other types of collections, aren't generator expressions[1] the generic version of list comprehensions?

List comprehension:

  a = [x for x in stuff if x > 1]
Equivalent generic version:

  a = list(x for x in stuff if x > 1)
With your own collection type:

  a = MyCustomCollection(x for x in stuff if x > 1)
[1] https://www.python.org/dev/peps/pep-0289/


> I don't like how its list comprehensions aren't generic.

Python iteration is generic.

https://docs.python.org/3/reference/datamodel.html#object.__...

Please don't spread FUD.


I'm thinking of generic in the sense of monad comprehensions, sorry.
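To illustrate (my own sketch, with hypothetical lookup_a/lookup_b): Python's comprehension syntax always desugars to the iterator protocol, so it can't be retargeted to other notions of sequencing the way monad comprehensions allow.

  # This works, because lists iterate:
  pairs = [(x, y) for x in [1, 2] for y in [3, 4]]

  # But for failure-aware code (a Maybe/Option-like monad) you are
  # back to explicit plumbing; the comprehension syntax doesn't help:
  def bind(m, f):
      return None if m is None else f(m)

  result = bind(lookup_a(), lambda a:   # lookup_a/lookup_b are
           bind(lookup_b(), lambda b:   # hypothetical stand-ins
           a + b))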

> That's an implementation detail of CPython. I don't see how it's important for a language user either. Jython uses Java objects. PyPy uses RPython.

This strengthens my point though?

> Please don't spread FUD.

This is that culture thing I'm talking about.


I'm not really sure what your point is in criticizing "farming" to C.

> This is that culture thing I'm talking about.

You made a false claim. You referred to Python iteration as not generic, when what you really meant is that Python lacks first class monad support.

If you want a monadic programming model, you're not going to be a happy camper in the Python world. But I doubt that comes as a surprise.


> I'm not really sure what your point is in criticizing "farming" to C.

If Python is so great how come it can't even express a decent data structure that it needs? I'll happily level the same criticism at Ruby if you like.

> You made a false claim.

Firstly: this is not High School Debate Club. There isn't some kind of point system. Language and expectations like this are not only counterproductive (in that they essentially turn every conversation into a series of verbal ripostes in which the goal is to be most right rather than to learn the most), they're also tedious and unwelcome.

> You referred to Python iteration as not generic,

I did not. I said list comprehensions weren't generic, and then I tried to explain my complaint. It may be that someone has done a legendary feat of going through and creating a bunch of mixins for some of the gaps in the list comprehension coverage such that you can find some way to override how non-determinism is expressed. If so, please point me to it.


> If Python is so great how come it can't even express a decent data structure that it needs?

Why is this a requirement of a "great" language? By not requiring a language to be self-hosting, you are adding more degrees of freedom to your language's design, so I could even see an argument that writing it in C is an improvement. I don't necessarily agree with that, but I don't see why cpython written in C implies it is a bad language. Maybe you could elucidate your thinking?


you are adding more degrees of freedom to your language's design

Why do you think that? Languages/runtimes with high C integration and fairly exposed C bowels like Python and Ruby have, over time, turned out to be very hard to evolve compatibly.


> Why do you think that?

Because it's a fact? With CPython you can develop things in Python if that suits you, or in C if that suits you. You have more freedom to choose what fits your use case. I'm not saying this is necessarily good, but I don't think it's obviously bad. I'd like to hear from people who think it is.

> Languages/runtimes with high C integration and fairly exposed C bowels like Python and Ruby have, over time, turned out to be very hard to evolve compatibly.

That is true, but it's also arguably one of the reasons cpython became so popular in the first place. The ability to write C-extensions when appropriate has been very powerful. It's certainly caused issues, but I think if python didn't have exposed bowels it may never have become nearly as popular. What if numpy wasn't ever written? (This isn't to say that they couldn't have exposed a better C-api with fewer issues, but hindsight is 20/20...)


I guess I don't understand how 'the design gets stuck in amber' (which you seem to agree with) and 'gives you lots of design degrees of freedom' can be true at the same time.


It gives you flexibility in writing libraries while making it harder to design a new compatible runtime. That said, PyPy has achieved pretty good C extension support while making the language faster.


The claim was 'degrees of freedom in your language's design'. It's an odd one because the history of a bunch of similar languages has been exactly the opposite. Compare, say, JS to Ruby and Python. Even with the seemingly crippling burden of browser compatibility, Javascript has evolved out of its various design and implementation ruts a lot more gracefully than either Ruby or Python.


> Language and expectations like this are not only counterproductive ... they're also tedious and unwelcome.

Who gets to be the language police? I'm fine with "High School Debate" but "verbal ripostes in which the goal is to be most right rather than learn the most" is not an accurate description of that.


You can do this if you like. In the past, I'm guilty of it as well.

But I won't engage with someone who does this the same way, because they're not engaging me as a human. Unless, of course, they're already dehumanizing me (as occasionally happens on this website) and then I don't feel quite so bad about it.


If Python is so great how come it can't even express a decent data structure that it needs?

I'm sure you know this is a deliberate design choice/tradeoff. It's arguably turned out to be a bit of a millstone for languages that eventually want to grow up to be general-purpose-ish, but that wasn't as obvious at the time.


Yes, I do. I admit, that was uncharitable of me.

I remember having similar conversations with the creator of Io and seeing the Pike mailing list have similar concerns. Hindsight is 20/20.


> You made a false claim. You referred to Python iteration as not generic, when what you really meant is that Python lacks first class monad support.

Are you really that lacking in self-awareness that you responded to someone annoyed about programming culture by going full "debate with logic, facts and reason" mode?


Just a point about linters: they are bad in every language (because everyone has different opinions on what they consider beautiful code).

If you're burning out because the PRs you open keep getting rejected by the linter, you can probably just have the linter apply the modifications automatically in the pipeline.

We used to hate the linter checks at my current workplace because fixing the issues was really boring. Now the CI simply fixes them, and nobody cares anymore.


I'm not the person you're responding to but what they say resonates with me as a Python developer.

I believe Python is quite possibly the best language for a few things

* Exploratory programming - such as what data scientists do

* Writing programs that will never grow beyond 150LOC, or roughly what fits on a screen + one page down

When I have those two constraints met I am almost always choosing Python.

Here are some problems I face on codebases as they scale up:

* Python conventions lead to what I consider bad code. Devs will often choose things like 'patch' over dependency injection, and I have seen on multiple occasions projects derided for providing DI based interfaces - "oh, why would you write code that looks like Java? This is Python".

There's a lot of death by a thousand cuts. Keyword args in functions are abused a lot, because they're "easy", ability to patch at runtime means it's often "easy" to just write what looks like a simple interface, but it's then harder to test. Inheritance is "easy" and often leads to complex code where behaviors are very, very non-local.

Dynamic types mean that people are often a little too clever with their data. I've seen APIs that return effectively `Union[NoneType, Item, List[Item]]`, for absolutely no semantic reason and without type annotations, meaning that if you assumed a list or None came back, you were missing the single-Item case. The implementation actually always got a list back from the underlying query internally, but decided to special-case the single-item case... why? I see this sort of thing a fair bit, and other languages punish you for it (by forcing you to express the more complex type).
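For concreteness, the shape of the anti-pattern I mean (hypothetical names, my own sketch):

  from typing import List, Union

  class Item: ...

  def get_items(query: str) -> Union[None, Item, List[Item]]:
      ...  # stand-in body

  # Every caller now needs this dance to normalize the result:
  result = get_items("foo")
  if result is None:
      items: List[Item] = []
  elif isinstance(result, list):
      items = result
  else:
      items = [result]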

* I find myself leveraging threads more and more these days. I did this often with C++, and do it all the time in my Rust code. Python punishes you for threading. The GIL makes things feel atomic when they aren't, reduces you to concurrency without parallelism, all while paying the cost of an OS thread. The threading primitives are also quite weak, imo, and multiprocessing is a dead end.

And, really, Python is anti-optimization, which is a good and bad thing, but it's a bit extreme.
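On the GIL point, the standard demonstration (a sketch; CPython-specific):

  import threading, time

  def busy():
      n = 0
      for _ in range(10_000_000):  # pure-Python, CPU-bound work
          n += 1

  start = time.perf_counter()
  threads = [threading.Thread(target=busy) for _ in range(4)]
  for t in threads: t.start()
  for t in threads: t.join()
  print(time.perf_counter() - start)

On CPython the GIL lets only one thread execute bytecode at a time, so four threads take roughly as long as running busy() four times in a row: you pay for OS threads and get concurrency, not parallelism.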

* Implicit exceptions everywhere. I see 'raise Exception' defensively placed in every Python codebase because you never know when a piece of code will fail. I see a lot of reliance on 'retry decorators' that just catch everything because you never know if an error is transient or persistent.

The common "don't use exceptions for control flow" is broken right off the bat with StopIteration. I just think error handling in Python is god awful, really.

* Mypy is awesome, but feels like a bit of a hack. The type system is great, and in theory it would solve many problems, but coverage is quite awful in open source projects and the type errors I get are useless. I actually only use mypy to make my IDE behave, I don't run it myself, because my types are actually technically unsound and the errors are too painful to work with (and honestly the type system is quite complex and hard to work with, not to mention the bugs).

There are lots of other smaller issues, such as my distaste for pip, the py2/3 breakage, the lack of RAII, the insane reliance on C all over the place, syntax I think is bad in a lot of places (lambdas, whitespace in general), etc., but these are probably my main sticking points with using the language for large projects.

Again, Python is actually my favorite language for specific tasks, but I really think it's best optimized for small codebases.


The common "don't use exceptions for control flow" is broken right off the bat with StopIteration. I just think error handling in Python is god awful, really.

That's not an idiom Python has ever subscribed to, though; it's always subscribed to "better to ask forgiveness than permission". I think there are strengths to both viewpoints, but honestly I think "don't use exceptions for control flow" is more of a convention than a "truth".


> honestly I think "don't use exceptions for control flow" is more of a convention that a "truth"

In absolute terms, or when coding on paper, perhaps. But in the real world, if performance even remotely matters, it's as close to a universal rule as there is: languages either end up embracing it or turn into creaking hulks of slow code, given how exceptions work in practice at the level of CPU execution units.

C#/IL/.NET embraced exceptions heavily at around the same time (the start of the 2000s), but in time developers (in and out of Microsoft) learned the hard way that it doesn't scale. With .NET Core, exceptions for flow control are completely verboten, and APIs have been introduced to provide alternatives where they were missing. Exceptions should be so rare that if you chart all handled exceptions you shouldn't see any, and you would thoroughly investigate why one or more pop up when no system or dependency has exploded.


If you care that much about performance you probably shouldn't use Python in the first place. Python deliberately prioritizes ease of development over performance.


The Ocaml exception implementation is comparatively very performant, and so its exceptions are routinely used for control flow.


I'd be interested to learn how exceptions are typically used in OCaml for non-local control flow.

Exceptions have two core properties: (1) non-local jump (carrying a value) and (2) dynamic binding of place to jump to. Contrast this with e.g. break / continue in loops where (2) does not hold. If most use cases of performant OCaml exceptions were not making use of (2) that would be an interesting insight for programming language design.

Have you got data on this matter?


No data as such. But here's the first example in the "batteries included" list library:

   let modify_opt a f l =
     let rec aux p = function
       | [] ->
         (match f None with
          | None   -> raise Exit
          | Some v -> rev ((a,v)::p))
       | (a',b)::t when a' = a ->
         (match f (Some b) with
          | None    -> rev_append p t
          | Some b' -> rev_append ((a,b')::p) t)
       | p'::t ->
         aux (p'::p) t
     in
     try aux [] l with Exit -> l
Here, the Exit exception is being raised as an optimisation: if no modification happens, then we return the original list, saving an unnecessary reverse. The try is the only place that the Exit exception is caught, so the jump location is static, much like a break.


Thanks.

That's, as you say, a statically scoped fast exit. One does not need the full power of exceptions for this (exceptions' dynamic binding comes with a cost). If exceptions are widely used for this purpose, one might consider adding a nicely structured goto to the language. Something like linear delimited continuations?


Sorry, I misunderstood what you were after. Ocaml exceptions are used more generally, and often make use of 2.

For instance, the basic stream/iterator in Ocaml is Enum, which always uses an exception to signal when the stream has been exhausted, rather than providing a "has_next" predicate.

The catch site of this exception is dynamic.


That is interesting. One could argue that since fast exceptions subsume non-local jumps with a static target, it is enough to supply only the former, so as to simplify the language.


Interestingly, .NET has very performant exception handling as well. Not sure if something changed internally with Core, but using exceptions in place of returning error codes was incredibly common within the .NET ecosystem.


.NET has always had slow exception handling because it ties into Windows' Structured Exception Handling (SEH); that's rather slow, but provides e.g. detailed stack traces even for mixed-mode callstacks.

Having ported some decently large codebases from OCaml to F#, the heavy use of exceptions in OCaml (where exceptions are very lightweight by design) had to be changed to normal control flow with monads to achieve good performance in F# — specifically because of .NET’s slow exception handling.


> The common "don't use exceptions for control flow" is broken right off the bat with StopIteration. I just think error handling in Python is god awful, really.

Python explicitly does not agree with this...

The same goes for a lot of your other complaints. It sounds like you are trying to write some other language in Python. Similarly, if someone on a team using Java tried writing Python in Java, they would complain a lot and end up with ugly, hard-to-work-with Java.


Python's exception system isn't bad. It's way ahead of Java's or C++ exceptions. The gyrations people go through in Go or Rust to handle errors are far worse. The exception hierarchy lets you capture an exception further up the tree to subsume all the more detailed exceptions. (Although the new exception hierarchy in 3.x is worse than the old one. In 2.x, "EnvironmentError" was the parent of all exceptions which represented problems outside the program, including the HTTP errors.[1] That seems to have been lost in 3.x).
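For example (stdlib behavior, not specific to any framework):

  try:
      open("/no/such/file")
  except OSError as e:
      # OSError is the parent, so FileNotFoundError, PermissionError,
      # ConnectionError, etc. all land in this one handler
      print(type(e).__name__)  # FileNotFoundError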

I think StopIteration is being removed. That was a bad idea.


I haven't seen anything about `StopIteration` being removed?

In fact, the Python developers recently doubled-down on the design by introducing `StopAsyncIteration`.


PEP 479 might be what GP was referring to: https://www.python.org/dev/peps/pep-0479/

PEP 479 is specifically related to generators:

> If a StopIteration is about to bubble out of a generator frame, it is replaced with RuntimeError, which causes the next() call (which invoked the generator) to fail, passing that exception out. From then on it's just like any old exception.
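Concretely, under PEP 479 semantics (the default from Python 3.7):

  def gen():
      yield 1
      raise StopIteration  # no longer silently ends the generator
      yield 2

  list(gen())  # RuntimeError: generator raised StopIteration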


> Python explicitly does not agree with this...

OK, I'm willing to contend that that is the case. It doesn't have much to do with my overall issue with error handling.

> The same for a lot of your other complaints. It sounds like you are trying to write some other language in Python. Similarly if someone in a team using Java tried writing Python in Java they would complain a lot and end up with ugly hard to work with Java.

This is a very loose criticism of my post, I don't know how to respond to this. I've written Python for years, I think I gave it due credit for what it's good at.

I am not trying to write some other language with Python, I just think Python is not a very good language compared to others given a lot of fairly typical constraints.


Writing programs that will never grow beyond 150LOC, or roughly what fits on a screen + one page down

I've written Python for years, I think I gave it due credit for what it's good at.

I'm not sure how fair this assessment is, considering the multitude of Python projects of considerable size and scope.


Why? Of course it's entirely possible to build programs of a larger size. I'm just saying that after a certain point you start hitting walls. There are tons of ways to hedge against this - leveraging mypy early on, spending more money (on eng, infra, ops, etc), architecting around the issues, diligence, etc.

It would be very silly for me to say that you can't build large systems in Python, I've worked on plenty that are much larger myself.

Saying a language is possibly the best for two important use cases (exploration, small programs) is quite a statement, in my opinion. I don't think I believe that there's a language that excels so well in, say, building web services.


> Python explicitly does not agree with this...

What's the convincing argument for defending this? I've always heard it's just the way Python is. In my experience, using exceptions for anything other than exceptional situations and errors seems messy.


I suppose that comes back to use cases. I am one of the "getting things done" crowd, rather than a computer scientist. A lot of my work has been in things like EDI, or similar integration code.

Imagine you are working through a series of EDI files and trying to post them to a badly documented Rest(ish) API of some enterprise system. If the file is bad (for whatever reason) you need to log the exception and put the file in a bad directory.

Python's use of exceptions for control flow is perfect for this. If a file doesn't load for whatever undocumented reason, fall back to logging it and handle the fallout.

"Oh I see a pattern, this API doesn't like address lines over 40 characters, I will add a custom exception to make the logging clearer, and go and try and see if I can fix this upstream. If not I will have to write some validation"

It is in this dirty world of taking one system's dirty data and posting it to another system's dirty API that I find Python rules.

I have never worked on a large application where I owned the data end-to-end. Maybe there are better choices than Python for that?


My feelings exactly. I write a lot of kinda scientific code that is not really computational but takes one or more horribly messy datasets and combines them to do some analysis. I now routinely use exceptions to filter out bad/complex data-points to deal with later.


Data scientist here. It’s great because non programmers find it easy and we have so so many tools available.

That said, the amount of cpu cycles wasted even with best of breed pandas is insane, and when you want to do something “pythonic” on big data it all falls down. When you want to deploy that model you are also going to have problems.

That said, it’s still the best tool for the job, but it’s certainly not because of the creators of Python.


Can you explain the use case for your unsound type system? Is it unsound just because mypy types are inexpressive?


My opinion on static type systems is that, unless you're actually doing a more type driven development style and leveraging the type system, the single greatest benefit is IDEs will autocomplete for you, give you 'jump to' etc.

Python without type annotations can be painful in an IDE - it chokes a lot on those features.

Since I don't take a very type driven approach to Python (it would be too slow since I'd have to figure out 3rd party library types and shit like that) I just write annotations in places where I personally know the type. Mypy complains about this for various reasons - probably because my annotations are not technically correct, because I'm not a pro at generics in mypy and working out inheritance, as well as general mypy issues like poor support for class methods and that sort of thing.

But I ignore all of those errors because the IDE can still work things out.
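A sketch of what I mean (toy names): annotate the boundaries you know, and the editor does the rest, whatever mypy thinks of the codebase as a whole.

  from dataclasses import dataclass
  from typing import Optional

  @dataclass
  class User:
      name: str

  def find_user(user_id: int) -> Optional[User]:
      return User("alice") if user_id == 42 else None  # stand-in body

  user = find_user(42)
  if user is not None:
      print(user.name)  # the IDE can complete .name from the annotation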


What you’re describing is what happens when you have a developer who’s bad at their job.

I see the point that Python invites these anti-patterns.

But on the other hand, a software developer who returns `Union[List[Item], Item]` in Python is probably also going to mess up a Java program sooner or later.


I don't think you'd ever write that in Java because Java would punish you for it. You can always blame the programmer, but I'm always going to blame the language for making 'bad' easy.


> and multiprocessing is a dead end.

Why is multiprocessing a dead end?


Huge overhead for process communication, weird interface, weird semantics, etc.


Is that still true with ProcessPoolExecutor?

The overhead obviously is still there, but the interface is a drop-in replacement for ThreadPoolExecutor, which looks basically just like a multithreaded/async/future-based collect.
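For reference, the switch really is one name (a sketch; the process pool still pays pickle/IPC costs underneath):

  from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

  def square(x):
      return x * x

  if __name__ == "__main__":  # needed for process pools on spawn platforms
      with ThreadPoolExecutor() as ex:
          print(list(ex.map(square, range(8))))
      with ProcessPoolExecutor() as ex:
          print(list(ex.map(square, range(8))))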


The interface may have improved, I haven't paid attention to mp in a long while. If I'm reaching for mp I generally just accept that I'm using the wrong language.


Can you expand a bit on weird semantics and the overhead? Are you referring to some particular benchmark?


I have a hard time reasoning about multiprocessing code, for various reasons. It's bitten me in a lot of weird, distinct ways, not really worth listing here.

To communicate across mp you use pickle, which carries a pretty significant performance cost relative to something like threading. There's also the issue of copy-on-write memory and reference counting interacting poorly.


> To communicate across mp you use pickle

I suspect this is the root cause of the difference in our experiences. My uses of mp have usually been somewhat more "embarrassingly parallel", for instance having a list of data elements which need to be processed with the same algorithm. For this use case, the usage of mp is pretty simple, often only a `pool.map(f, xs)`.

I can imagine that pickle might have tricky edge cases and/or be slow.


Sure, but:

a) Reference-count updates after the fork dirty the shared pages and defeat copy-on-write, so any access to an object (even a read, which touches its refcount) can trigger a copy of memory.

b) Even to send `xs` over, you must serialize it via pickle, and then the callee must deserialize it.

That may be fine for your use case but it's strictly worse than just sharing a reference across a thread.


Here’s my long list on why Python is terrible:

1) No pattern matching. In 2019, this is just unacceptable. Pattern matching is so fundamental to good FP style that without it, it becomes almost impossible to write good and clean code.

2) Statement vs expression distinction. There’s no need for this and it just crippled the language. Why can’t I use a conditional inside an assignment? Why can’t I nest conditions inside other functions? It makes no sense and is stupid and inelegant.

3) Related to 2), why do I need to use a return statement? Just return the last expression like many other (better) languages

4) Bad and limited data structures. In Python all you get are arrays, hashmaps, and sets, and only sets have an immutable version. This is unacceptable. Python claims to be "batteries included", but if you look at the Racket standard library it has like 20x more stuff and it's all 100x better designed. In Scala's standard library you get Lists, Stacks, Queues, TreeMaps, TreeSets, LinearMaps, LinearSets, Vectors, ListBuffers, etc.

5) Embarrassing performance. Python is so slow it’s shameful. I wrote a compiler and some interpreters in college and I honestly think I could create a similar language 10x faster than Python. Sometimes you need to trade off performance and power, but that’s not even the case with Python: it’s an order of magnitude slower than other interpreted languages (like Racket).

6) Missing or inadequate FP constructs. Why are lambdas different from normal functions? Why are they so crippled? Why do they have a different conditional syntax? The only sort of FP stuff Python has is reduce/filter/map. What about flatMap, scanr, scanl, foldl, foldr? Why doesn't Python even have flatten? All of these are very useful and common operations, and Python just makes everyone write the same code over and over again (see the sketch after this list).

7) No monads. Monads can be used for exceptions, futures, lists, and more. Having to manually write some try catch thing is unseemly and worse than the monadic Haskell or Scala approach.

8) No threads and no real single process concurrency. Despite Python being used a lot, no one really seems to care about it. How can such a problem not be solved after over 20 years? It’s shameful and makes me wonder about the skill of Guido. There’s no reason why Python couldn’t have turned into something beautiful like Racket, but instead it has been held back by this grumpy old guy who is stuck in the past.

9) Others might not have a problem with this, but I detest Python’s anything can happen dynamic typing. It makes reasoning about code difficult and it makes editor introspections almost impossible. If I can’t know the exact type of every variable and the methods attached to it, it hampers my thinking a lot. I use Python for data science and if I could just have a language that was compiled and had static typing I would be 3x as productive.
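To put something concrete behind point 6 (my own sketch): the pieces mostly exist in itertools, but you re-spell them every time instead of having them as methods or built-ins.

  from itertools import accumulate, chain

  nested = [[1, 2], [3], [4, 5]]

  flat = list(chain.from_iterable(nested))        # flatten -> [1, 2, 3, 4, 5]
  fm = list(chain.from_iterable(                  # flatMap, spelled out by hand
      map(lambda x: [x, -x], flat)))
  scan = list(accumulate(flat))                   # scanl (running sums by default)
  # scanr has no stdlib spelling at all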

Let me conclude by saying there currently is one good reason to use Python: if the domain is ML/DL/quant/data science Python is still the undisputed king. The libraries for Python are world class: scipy, sklearn, pandas, cvxpy, pytorch, Kerala, etc.

But Julia is catching up very fast and the people I have talked to are getting ready to ditch Python in 2-3 years for Julia. I don’t think I’ve encountered anyone who didn’t prefer Julia to Python.


It seems you just want a functional language. If so, why are you using Python at all, given there are much better alternatives like the ones you mentioned, at least outside the data science space? I use Python only for data science, and generally Elixir, Racket, Haskell, etc. for other use cases.


I do, but the question is, why don’t other people? It makes no sense to use Python for anything except data science.


What exactly is the Kerala library for? I've never heard of it before.


I believe they must mean Keras


Sorry yeah autocorrect


> and that infernal pep-8 linter insisting 80 characters is a sensible standard in 2019

I don't know why 80 characters is a problem. I don't use the linter, but I enforce this rule religiously, with a couple of exceptions (long strings come to mind). It forces me to think hard about the structure of the code I'm writing. If I'm nesting so deeply that I run out of room, something has gone wrong. If I've got a ton of chained methods or really lengthy variable names, it forces me to rethink them.

This also has the advantage of letting you put 4 files next to each other with full visibility. Vertical space is infinite; horizontal space isn't. It's probably a good idea to use it accordingly.


It's also awesome if you have to do code reviews on a laptop or don't have a massive screen available.

That said, we usually just go with autoformatting via black, which is 120 by default. No more hassle manually formatting code to be pep8-compliant. Just have black run as a commit hook, which is super easy via pre-commit [0]. And you can run the pre-commit checks during CI to catch situations where somebody forgot to install the hooks or discovered `--no-verify`.
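For anyone setting this up, the hook config is tiny (a sketch; pin whatever Black release you actually use as `rev`):

  # .pre-commit-config.yaml
  repos:
    - repo: https://github.com/psf/black
      rev: 19.10b0
      hooks:
        - id: black

Then `pre-commit install` once per clone, and re-run the same check in CI.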

Can't really imagine developing Python without Black any more.

[0] https://pre-commit.com/


I haven't got round to trying Black, but according to the project's README[0], the default is 88. Personally I think 79 is fine, but I can cope with up to about 100. Above that and you risk some really crappy code in my opinion.

EDIT: Sounds like the Black author agrees. "You can also increase it, but remember that people with sight disabilities find it harder to work with line lengths exceeding 100 characters. It also adversely affects side-by-side diff review on typical screen resolutions. Long lines also make it harder to present code neatly in documentation or talk slides."

[0] https://github.com/psf/black


While initially resistant, I've come around on Black for our team, and a failed Black check will now fail the CI build for all our projects.

We're still using the community edition of SonarQube [0] for inspection, but Black finally did away with the constant bikeshedding over formatting minutiae; it seems like it's saving us tons of time.

[0] https://www.sonarqube.org/


All I know is that with 80-char width I can have 2 files side by side on a 15" MBP, along with the dir-tree on the left, in an editor like PyCharm or VSCode, and fully see both files without wrapping. It helps my productivity immensely.

Same deal when it comes to reviewing PRs in GitHub. Wrapping just interrupts flow for me.


Disclaimer, I am a data scientist

I feel the complete opposite. I really enjoy working with Python over any other language. R does linear models and time series better, and Matlab has its charm, but overall I prefer Python. It is so easy to read and quick to program in. I am so glad I am not in the Java/C++ world anymore, but I know people in different roles have to deal with different issues.


> I really enjoy working with python over any other language.

I assume you mean, "over any other language I have tried" ?

As someone with a mathematical background myself, I am always surprised at how many data scientists and quants are ignoring more mathematically principled languages like F#, OCaml and Haskell.


F#, OCaml and Haskell

Can I quickly prototype a new deep learning model and scale it to a 32 GPU cluster with very little effort in those languages?


If you put in as much time as you have learning Python, then the answer is probably yes.


Probably yes

What does it mean? Have you done it in any of those languages? Have you seen it done in any of those languages?


> What does it mean? Have you done it in any of those languages?

I did. I've been doing image processing recently and use OCaml for prototyping. I tried Python (I used it a lot, long ago) and failed; it felt too awkward. I've described my experience here [1].

If you have no experience whatsoever with the ML family [2] and do everything in Python, you'll most likely be much more productive with Python, of course.

But I find ML-like languages way more pleasant, and I'm far more productive with ML plus libraries like owl [3], which are more fundamental and don't have fancy stuff, than with Python plus a fancy lib like numpy/scipy.

Julia could also be a good choice, hitting a sweet spot between fancy libraries and a fancy language.

[1] https://news.ycombinator.com/item?id=20457505

[2] https://en.wikipedia.org/wiki/ML_(programming_language)

[3] https://ocaml.xyz/


Right now I'm experimenting with a pretty complicated model (60+ layers of multiple types), and I plan to train it on several hundred GB of data, using an 8-16 node cluster (4 GPUs per node). Does Owl have a well-tested and well-documented autograd library with distributed GPU support (e.g. Horovod)? With a good choice of optimizers, regularizers, normalizers, etc., so I can focus on my model and not on debugging the plumbing or implementing basic ops from scratch? And last, but not least, it must be as fast as TF/PyTorch.

If the answer is “no”, then it does not matter whether I’m an OCaml expert, because I’m still going be more productive with Python.

p.s. Julia is nice though, hopefully it will keep growing.


I feel what you're saying is that, regardless of how subpar a language is compared to the alternatives, as long as it has community-built libraries that solve your specific problems, you're more productive using it than anything else.

Which is of course a fair point. A language by itself is probably not even in the top 3 considerations when choosing new tech. Stuff like runtime, ecosystem and the amount of available developers would probably be more important in most cases.


> A language by itself is probably not even in the top 3 considerations when choosing new tech. Stuff like runtime, ecosystem and the amount of available developers would probably be more important in most cases.

Totally depends on the domain. In serious mission-critical software you won't use libraries, but you will use the language.


Yeah, I don't disagree. But even there you have similar considerations besides the language. Most still end up with C/C++ even though there are alternatives like Crystal and Nim; you just don't find developers who know them easily, nor do you have any ecosystem support.


> Like most still end up with C/C++ there even though there are others like Crystal, Nim,

Because C++ and C are significantly better than Nim and Crystal.

There are also Ada and SPARK in aerospace and other very critical stuff.

> just don't find developers who know them easily

We don't look for OCaml/Ada developers; we hire programmers, and they program in OCaml and Ada. It's not a big deal for a good programmer to learn a language, especially while programming side by side with seasoned programmers.


how subpar a language is compared to alternatives

In my 6 years with Python, the only dissatisfaction with the language I felt was from parallel programming. I switched to Python from C, and at the time, I missed C transparency and control over the machine, but that was compensated by the Python conciseness and convenience. Then I had to dig into C++ and I didn't like it at all. Then I played with CUDA and OpenMP, and Cilk+, but I wished all that would be natively available in a single, universal language. Then I started using Theano, then Tensorflow, and now I'm using Pytorch, and am more or less happy with it. It suits my needs well. If something else emerges with a clear advantage, I'll switch to it, but I'm not seeing it yet.


Although I'm not sure what you mean by "deep learning", you can take a look at Spark: https://spark.apache.org/mllib/

As a bonus, it IS Python (numpy) in the background mixed with Scala. So you can use each language where they make the most sense - Python for the maths number crunching and Scala for the business logic and the architecture.

I think Spark also has .net bindings (so you can also tick F# on that list...).


As much as I love the languages you mentioned: I think it's a major weakness of them that they don't have the linear algebra libraries integrated such that you can do this the same way Python does.

There are a lot of reasons why this is.


Not until the libraries get built. They didn't exist for Python either until fairly recently in the history of all these languages.


For those unaware: Haskell has a REPL (ghci), and you can make files more script-like with the (currently most popular) build tool stack[0] if you include:

  #!/usr/bin/env stack
  -- stack --resolver lts-6.25 script --package turtle
(as you see it includes easy dependency management :) )

0: https://docs.haskellstack.org/en/stable/README/


This doesn't answer their question, though. A REPL isn't magical.


I don't know about F# and OCaml, but Haskell's numerical libraries really pale compared to numpy.


It's language vs libraries. If you have a library that has a function

    get_the_shit_done_quick ()
then you don't care much about the language.

When you don't have such a function, you need an expressive language to write it (and the bulk of Python libs are not written in Python, though mostly for performance reasons).

So it's all about finding the sweet spot between fancy libraries, which do the shit for you, and a fancy language, which lets you express the things absent from the libraries.

This sweet spot differs from domain to domain and from user to user. Even in numerical work someone could have a requirement for a better language, although this domain is indeed well defined enough to have plenty of fancy libraries.


Language vs libraries isn't just about an expressive language to build in when you don't have a library. The likelihood of a library's availability also depends on the barrier to entry. An amazing language that isn't usable by biologists won't have many libraries that solve biologist's problems.

To your original point of being "surprised at how many data scientists and quants are ignoring more mathematically principled languages like F#, OCaml and Haskell," I'd much rather use one of those languages, but I'd have to build the foundations myself. Today, they aren't the right tool for the job. They don't have the libraries I need, which means I don't build further libraries for them, making other people less likely to build on them, so they aren't the right tool for the job tomorrow either. I'd say it's a network effects thing primarily.


Well, yeah compared to R and matlab, I am willing to believe Python excels, but the person you are replying to is probably not doing data science, so he has options besides the 3 just mentioned.


Re: linting, I'd highly recommend Black with whatever line length you want; it'll reliably reformat your code, and once you lose the urge to reformat while typing, it's fantastic. It's like deleting a bunch of useless mental code. And the code reviews are even better: include a linting step (ideally with isort) in CI and you can avoid 95% of formatting comments.


100% this. I just switched to using Black recently and not having to ever fix a lint issue again has been life-changing. Use Black with pre-commit (https://pre-commit.com) and never look back.


I've had some issues with Black formatting some code differently depending on the OS it's run on. So perhaps "almost reliably" is more correct.


Make sure to run the same version of black.


Over-sensitive individuals are hard to please and often unhappy. That's more of a personality flaw than a flaw in the current state of software engineering.

If there's one thing wrong with our profession, it's the lack of ethics and accreditation: we're essentially letting random people build critical infrastructure and utilities.

We don't have a tooling problem, in fact we have too many tools.

I see so many people (especially on HN) fixating on tools, dissecting programming languages, editors and libraries into their most minute parts and then inevitably finding faults in them, disavowing them and swearing that other tools X, Y and Z are purer and worthier.

If you want to stop hating software and improve your burn out, stop caring about irrelevant stuff.


Is that supposed to help someone? I fail to see how telling someone "just ignore the stuff that irks you, that you have to spend 40+ hours a week dealing with; you are overly sensitive and your concerns are irrelevant" helps anyone. Even if it were true, it was delivered in such a ham-fisted manner that I can't imagine anyone taking it to heart.


I sometimes have a tendency to focus only on the negative aspects of some things, while ignoring that all in all, those things are fine. I don't think I'm alone in that, certainly not in our line of work.

A call to "snap out of it" seems like it could help in such situations. Python is not a programming language that should make people burn out or get angry. Very few languages should be capable of that, so I think this issue goes deeper than just flawed tools.

I find that the only way not to go nuts in this profession is to ignore most of it, and most of it really is not relevant to building good software. There are just too many tools, and always searching for the perfect tool is a recipe for unhappiness.


The programming language is one of the central tools though, the very language in which you express programs.

If you don't care about that, caring about "building good software" feels hollow.


The key point is that there are many languages that are more than good enough for almost any kind of problem solving.

Building good software is firstly about process and skills.


>If there's one thing wrong with our profession is a lack of ethics and accreditation - we're essentially letting random people build critical infrastructure and utilities.

https://ncees.org/ncees-discontinuing-pe-software-engineerin...

There was a license for a professional software engineer; they're discontinuing it, apparently due to lack of demand.

If folks wanted to get a "software people taking the FE" study group together, I'd join up.


As someone with a P.E. license who spent hundreds of hours studying and had to sit for that excruciating 8-hour exam twice, I don't think even 5% of the software developers in the US could pass an equivalent exam. Granted, I think the sector I took it in has a harder test than some, but it is a weed-out test for someone who already did 4 years of engineering school.

Some takeaways from this:

1.) I learned a little bit from studying, but overall, even though it was hard, I would've learned a lot more by getting a Master's degree.

2.) The test isn't good at determining if you're good at your job, or honestly even minimally competent in your area. For example, even within a specialized field (power systems engineering) there are thousands of diverse jobs (power generation, distribution, electrical contracting, power transmission operations, power transmission planning, etc.), so the test only had a few questions on my particular area.

3.) There are a lot of amazingly smart people working in software, but the skill range seems to be bigger than in traditional engineering fields, where most engineers are fairly similar in skillset (there are some outliers), as they all generally have to pass the same courses if their university is accredited (talking about the USA here). In the software world, you have software engineers and computer science doctorates mixed with someone with very little training who is trying to wing it. That means the dataset has a far greater range of skillsets: one employee might have had a class on compilers while another just learned what a for-loop is. In engineering, we generally all show up to the first day of work with the same building blocks (thermo, statics, dynamics, circuits, differential equations, stats, calculus, basic programming, etc.). The only difference between me as a senior engineer and a new hire is 9 years of experience, knowledge of the business and tools, and the ability to get things done without oversight. It makes a big difference, but I wouldn't expect them to be lacking any of the tools or training that I have picked up (ok... maybe databases).

I'm struggling a bit to convey the overall message: software engineering seems genuinely different, and licensing would therefore need to be different if it were done. Perhaps you could have licensing for individual subjects? For example, you could pass a basic test on relational databases where you prove you can do basic operations such as selects, where clauses, joins, updates, exports, etc. Then you'd have another to prove you were minimally competent in Java. Would that be of any value to an employer? I don't know. I'm guessing someone already does this for Oracle and Java too.


So I am studying for the FE (I need a lot of math before taking it is realistic), mostly 'cause it gives me a broad feel for the things engineers all know. (I will take the "other disciplines" exam, mostly because I want this to be as broad as possible; being broad but shallow makes it a lot easier for me too, but the breadth is an advantage in other ways as well.)

I personally find tests way easier than school, and the schools with reputations worth something are... pretty difficult for people like me (who weren't on the college track in high school) to get into. (And there is something of an inverse correlation between the prestige of a school and how flexible it is about scheduling around work, especially for undergrad.)

From what I've seen of the test, it does provide some foundational ideas of what engineering is about. Like, it goes a lot into figuring out when things will break - something I haven't really seen a lot of in software engineering.

What I'm saying here is that I don't know that an optimal SWE PE would test you very much on the specifics of Java or SQL or what have you. From my experience with the FE, at least, they give you a book that has most of the formulae you need, and you are allowed to use a copy of that book during the test; you just need to be able to look things up and apply them. It seems like they would do the same with Java or SQL.

(To be clear, to apply the formulae you still need more math than I currently have. I imagine the same would be true of SQL or Java, only I'm pretty good with SQL, having spent 20 years propping up garbage written by engineers who weren't.)

From what I've seen of software engineers, most of the time the software guys throw something together, toss it over the fence, and move on. Clearly they didn't do any analysis to figure out how much load the weak point in the system can handle, or even what the weak point was. It's incumbent upon me (the sysadmin, usually the least educated person in the room) to feed load to it at a reasonable speed and to back off and figure out what happened when the thing falls over.

I mean, I think the real question people are bringing up here is "what if we treated software engineering, or at least some subset of it, more like real engineering?" Clearly, sometimes you can slap something together over a weekend and it's fine, but if you are building a car or an airplane or a tall building, you probably want to put in the time to make sure it's done right, and for that you need rules that everyone knows. The PE system, I think, establishes rules everyone knows, while software engineering mostly works on the concept of "it worked for me".


Wait... are you studying for the FE without getting an engineering degree? Props to you. One thing to keep in mind, though, is that there is a morning and an afternoon session that are both 4 hours, IIRC. The first session is always a general session which covers the absolute basics of engineering math, circuits, thermodynamics, statics, dynamics, chemistry, and physics. It really is very easy if you remember the classes. Some of the circuits problems can be done in your head, and the statics problems might have the resultant force being part of a 3-4-5 right triangle (again, it shouldn't take much thought). The purpose of this is to ensure you learned the absolute bare minimum in these classes. One reason the general questions have to be easy is that depending on your course schedule, it might have been two years since you took a course (ex: you might have taken only one thermo class as an electrical engineer, during your sophomore year).

The afternoon test is either also general or specialized to a discipline (ex: chemical engineering) and is much more difficult in comparison. I barely studied for the FE and felt I knocked it out of the park (especially the morning session). I spent months of all my free time studying for the PE and failed the first time... it is difficult.

Keep in mind that both of the tests have a lot of breadth, but little depth. Going through an actual engineering curriculum will teach you a whole lot more. MIT used to (might still) have a circuits edX class online for free which covers the first of the 3 circuits classes an EE will take... that should help a little with the scale.

Software is weird in that the hardware, languages, and frameworks are always changing, and the optimal work done on any project seems to be just enough to keep the customers from going to a new product, not necessarily the best or even a good product in many cases. There are cost constraints in engineering as well (good, fast, & cheap... pick 2), but it still feels pretty different from software engineering, where something breaks all the time in any non-mainstream software I've ever used.


Yeah, they'll let anyone with a couple hundred bucks sit for the FE. The chances of getting a PE or even an EIT, though, without a degree are... slim to none. But that's not really my goal? (I mean, it would be if it were just tests) Mostly I just want to know those 'absolute basics' of which you speak, and I like to have a test to study towards.

I'll check out that edx class, thanks, that sounds like my thing.


It's a vicious circle.

Our industry's working in a survival of the fittest mode, where fitness is almost exclusively measured through profit.


I dunno, man. I own a bunch of stock in BA right now. I bet that the idea of spending twice as much on software engineers and hiring licensed folk to write their software is looking pretty good to Boeing execs right about now, even from a plain profit-and-loss perspective.

(Of course, a lot of professional engineers were involved in building that plane... but it's pretty unlikely that there were any PE software engineers involved, just 'cause there aren't many. Would that have helped? Maybe, maybe not. To the extent that I've studied it (not... really all that much), it sounds like they made some really basic mistakes, like relying on a sensor that wasn't redundant, and those mistakes should have been caught by the other engineers. I don't know that it was entirely a software engineering problem.)

As software mistakes get more and more costly, it starts making more and more sense for execs to hire people who are properly licensed to supervise the project. (I mean, assuming such people exist, and for software engineering, right now, you could say such people don't exist.)


Oversensitive? Is the contractor who chooses a Dewalt or Ridgid over a Ryobi for daily work "caring about irrelevant stuff"? A drill is a drill right? Why is it different for us in software?


Maybe. (And I thought Festool was the new hotness? I thought that at least Ridgid and Ryobi had been devaluing their brands by selling cheap junk? But I'm solidly in the 'prosumer' and not the 'real contractor' end of the market, so not an expert or even a close observer, really.)

But I think the point OP was making is that contractors have a licensing and training program, and if you hire someone to put a roof on your house, they either have to go through that process or work under someone who went through that process. I mean, choosing the right tool is a small part of that, but someone in the chain of command is gonna have words with you if you bring your Xcelite screwdriver and try to use it on the roofing job.

That's not true almost anywhere in software, and that probably makes a big difference.

(I mean, not being educated myself, I personally have mixed feelings about licensure. But it's a good discussion to have, and I think that there should be something like the PE for software engineers (and there was, in Texas, but it will be discontinued this year).)


I don't know anything about those brands you mentioned, but a programming language is more complex than most (all?) mechanical tools; it's designed over years and continues to change over decades.

It's impossible to do a perfect job under such conditions, and it's anyway impossible to please everyone.


Hey I like my Ryobi drill. :(


I love my Ryobi tools too! Just, I've assumed given their price-point, they aren't built for day in and day out needs of a contractor.


> Over-sensitive individuals are hard to please and often unhappy. That's more of a personality flaw than a flaw with the current state of software engineering.

Ah yes. Just remove the part of myself that got me into software as a child, and proceed robotically.

Why didn't I think of that?


> If there's one thing wrong with our profession is a lack of ethics and accreditation - we're essentially letting random people build critical infrastructure and utilities.

Spot on. Leftpad (and perhaps the whole JS ecosystem) is a good example.


There are objective differences between languages. Examples include performance and compile-time checking. These differences are not irrelevant.

Alternatively, if you believe the differences between languages are irrelevant why do you not program everything in assembler?


Those differences don't matter much when it comes to building software that fulfills specific requirements.

I am a big fan of compile-time checking, but there's a lot of good software built without it and sadly there's also a lot of successful slow software. These are disadvantages, not an impassable barrier.


I know I’m responding to opinion but: developer productivity isn’t one of the pitfalls of Python.

Yes, I agree that if you're using Python for a large-scale project involving lots of developers it's not the best; but that's because it doesn't have a good type system.

You can’t work out why people like it so much because of this misconception. The languages that you gave as examples most definitely do _not_ value your productivity; they value correctness as enforced by a type system, and the refactoring needed for large projects.


I am more productive in a language with an expressive type system (e.g. Haskell) than one without. Thinking about types not only guides me towards a working solution quickly, but the checker catches all my minor mistakes. In Haskell, you can actually defer all type errors to runtime if you want to. But I have never felt this makes me more productive.


It's unclear to me how a person can be productive if the language doesn't value correctness.


There are plenty of real world situations where neither the requirements nor the prerequisites value correctness, but getting an 80% or even 99% correct set of results while flagging the unhandled cases is very valuable.

Python as a structured upgrade to Excel.
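A minimal sketch of that workflow, with made-up records: process what you can, flag what you can't, and let a human look at the leftovers:

```python
def parse_amount(raw):
    """Parse a currency-ish string like '1,234.5' into a float."""
    return float(raw.replace(",", ""))

records = [{"amount": "1,234.5"}, {"amount": "N/A"}, {"amount": "42"}]
results, unhandled = [], []
for record in records:
    try:
        results.append(parse_amount(record["amount"]))
    except (ValueError, KeyError):
        unhandled.append(record)  # flag for a human instead of crashing

print(results)    # the 80-99% that parsed cleanly
print(unhandled)  # the flagged remainder
```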


I'd say the most immediate value/payoff of correctness is ensuring your own code is consistent with what you think it does, rather than correct with respect to some sort of external specification.


Just spitting out a bunch of code, correct or incorrect, looks productive and good on paper.


The bigger the team, the more distributed the engineers, and the bigger the codebase, the more productivity is lost by using scripting languages. I find it infuriating, and it definitely does not feel productive.

For a lone hacker, it's the other way around. Compare, e.g., Golang's "programming at scale" rationale.


Definitely feel the same.

The way I think about it is that Python is a strong local optimum in a space that has a massively better optimal solution really close by. But it's nearly impossible to get most people's algorithms to find the real optimum because Python's suboptimal solution is "good enough". And the whole software industry (and in some ways, by extension, all of humanity ... to be over melodramatic) is suffering for it.


> And the whole software industry (and in some ways, by extension, all of humanity ... to be over melodramatic) is suffering for it.

I don't think the biggest services built with Python (think Instagram, Dropbox, etc.) have more consumer-facing issues than services written in other languages.

If you're talking only about developers, fine; however, I also think most Python developers like the language. It seems to me that Python has strong vocal critics who show up well in places like HN, but that is not representative of the majority.

So I really don't think Python is making the humanity suffer, for any good measure of suffering.


There is a great saying that Python is the second-best language for everything, and it might be true. Where Python excels is that you have good-enough libraries and support for almost everything. All other languages have pain points: they shine very brightly in some areas but have very bad or non-existent libraries and support in other areas.


```--max-line-length=120``` passed to flake8 will switch it to 120. Use another number for a different length.

Things like max line length should be something your org or your fellow contributors decide on, not something dictated to you no matter what the language.
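For what it's worth, the team-wide way to pin that down is a shared config file rather than a per-run flag, e.g. a `[flake8]` section in setup.cfg (or a .flake8 file):

```ini
[flake8]
max-line-length = 120
# or, if the team prefers, silence the length check entirely instead:
# extend-ignore = E501
```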


As I've said elsewhere, the python culture I encounter is extremely serious about maintaining it. It's just not worth fighting over.


At a given company it's often winnable. Nobody in my company fought for 80 chars. We ditched it and moved on.


> folks then vehemently lecture me about the hours saved by future barely-trained developers who will ostensibly have to come and work with my code.

It seems you are really complaining about having to write code which is maintainable by others than yourself?


Maybe that's true, maybe it's not.

A corporate point of view is that any programmer should be interchangeable with the minimum amount of fuss. I understand why someone building an enduring organization has a responsibility to think about the future.

But I confess that sometimes I feel very demoralized when an organization implicitly tells me that my years of study and my ongoing self-education and practice are all meaningless, because in principle someone fresh out of college should be able to take over my project with a two-week handoff and a few docs.


But how is that the fault of Python or any particular language?


Stop using flake8. The original PEP 8 document doesn't even recommend 80 characters anymore.

Pylint and mypy and black are way more useful.


Flake8 is customisable. I use it with a 120-char override.

Maybe there are good alternatives but Flake8 starts from PEP8 until you tell it otherwise.


black -l 120
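Or, if your black version reads it, pinned in pyproject.toml so nobody has to remember the flag:

```toml
[tool.black]
line-length = 120
```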


I sure do enjoy watching people hyper-fixate on a tiny thing. I'm not sure who you think you're helping by doing this, but it isn't me.


> Every moment working with python (and that infernal pep-8 linter insisting 80 characters is a sensible standard in 2019) increases my burnout by 100x.

You hyper-fixated on that tiny thing. It increases your burnout 100x, remember?


Python increases my burnout. The linter is a relatively small part of it.


It's the only problem you mentioned, and I inferred from your post that it was really bothering you. That was my thinking anyway.

I see now that you mentioned other issues in other comments.

Anyway I appreciate you sharing your feelings to an evidently unreceptive audience. It's nice to know there are better ecosystems out there waiting when I get a chance to look for my own next step.


I fully agree with you. It has a stubbornness about it, and not in a good way like the stubbornness you might find in Lisp, Scheme, and SML derivatives. I have had to write Python on only a few side projects, but it was miserable, as you say. Not only is the language all over the place and embarrassingly slow, the ecosystem is as well. I tried getting an application someone left behind running on Windows 10, and it was basically a null effort without a complete rewrite, an upgrade to 3.x, and upgrading/replacing the GUI libraries.

If I had to write Python as a day job, I would quit. I have said it before, but Python is the new C++, helping people everywhere write brittle, unmaintainable code.


I'm sympathetic to the view that Python is not a great language, but IMHO an 80-character line limit is quite reasonable. It's easier to read on smaller screens, easier to view side-by-side diffs, and it tends to force you to break up your code more.

That said, this shouldn't be a lint; it should just be enforced by a formatting tool as a format-on-save setting. That eliminates all the wasted arguments about formatting and the time wasted trying to manually line up code.
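As one concrete (and editor-specific) example, assuming VS Code with its Python extension and black installed, something like this in settings.json gives you format-on-save (settings.json tolerates comments):

```json
{
    // reformat the file every time it is saved
    "editor.formatOnSave": true,
    // have the Python extension delegate formatting to black
    "python.formatting.provider": "black"
}
```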


I'd also add that while perhaps a bit on the pessimistic side, I tend to view the 80 char rule / limit not as an ancient hardware limitation of monitors, but as a limitation of our eyes and visual processing circuitry. There is a reason why newspapers and well laid-out websites don't have 300 char width lines. Those are physically harder to read, whether we want to admit it or not, as our eyes lose track of the flow to the next line.

I'm all for decreasing unnecessary cognitive load, there should be quite enough of that without us adding more by accident.


Why not take this reasoning one step further and have Hacker News impose an 80-character line-wrap limit?

If you've ever had to deal with this in an email client, you can quickly see that 80 is undershooting it in the modern era.


Funny. I preferentially read HN on my phone, which displays what looks like around an 80-char limit. It makes it really comfortable to read.


Do you do code reviews on your phone too?


Maybe I do. What's it to you?


I don't understand–I quite enjoy reading a well-formatted plaintext email that sticks inside an 80-character line width. Any mail client worth its salt should be able to display that.


> as a limitation of our eyes and visual processing circuitry.

If so, then why not put the limit on line length excluding leading whitespace? Because it makes no sense that I should lose available characters to indentation.

> There is a reason why newspapers and well laid-out websites don't have 300 char width lines.

Yes, and the reason is that print and transportation are expensive, so newspapers found a way to cram as much text as possible into as few pages as possible. You don't see them replicating this style online, and neither do you see it in magazines that are priced well above their production & distribution costs.

The reason "well laid-out websites" don't have 300-char-wide lines is that web development is one large cargo-culting fest of design. 300 may be a bit much due to actual physical length, but your comment has 200+ on my machine and reads just fine.

I don't buy these "80 chars / short lines and ridiculous font sizes are optimal for the eyes" arguments. They fly in the face of my daily experience.


I enjoy printing code out, going outside, and reading it.


Do you think this is at least partly because it's usually your "official" work that frustrates you? Or would you prefer something like Java?

I can imagine other people's Python code frustrating me, but if it is only my own codebase, then Python is a rewarding language for me.


It's hypothetical, so I can't say for sure, but I did work a lot with the JVM, and I used Clojure (and I'd probably use Kotlin now as well), and I didn't feel as burnt out as I do now.


I have an acquaintance who swears by Clojure (for data modelling of sorts) and insists that I should use it too (due to my category-theoretic background).


Yeah, I'm a bit sick of Python. We have some internal utilities written in it and while developers could manage all the dependencies and everything, we kept having problems with getting everything working on the machines of our electronics techs.

Gradually pretty much everything has been rewritten in C++ (with Qt GUI framework). Way easier to be able to have a setup program (using free InnoSetup) that just extracts the eight or so DLLs required in the same folder as the EXE and that's it.

We just use Python for a bit of prototyping and data crunching here and there now.


> Way easier to be able to have a setup program (using free InnoSetup) that just extracts the eight or so DLLs required in the same folder as the EXE and that's it.

You can do the same with Python.


Yeah, we tried all sorts of ways to do it. Couldn’t get it to work satisfactorily with the particular libraries we were using.


> it's 2019

use an auto-formatter - black is what most seem to be using

and stop using flake8/pylint directly; try prospector with a sensible --strictness


I write python professionally and honestly, who cares about the 80 character limit? Just ignore it! :)


I do. This is 2019, and I like being able to see at least three files side-by-side without squinting my eyes, thank you very much.

Yes, I know I'm in the minority these days. :/


I have 3-4 files side by side; everything else is max 80 chars. I really don't understand why people need more, because long lines are REALLY HARD to read.


You mean like 64 chars, after you account for typical indentation? I hit that limit way too often when using descriptive names.

I usually view two files side by side per screen (so 4 in total). I sometimes up this to 3 per screen, but the trick here is to use a smaller font. Right now, if I split my editor into 3 panes, each has 98 columns available.


what if the overly restrictive character limit causes a decrease in naming clarity?


I guess it can happen, but in my experience, I had more problems with overly long names hurting readability: something like ExampleDataUpdaterFactoryConfigReader. (No, I don't think an IDE makes them acceptable, because you still have to read them in the code, IDE or not.)

Of course, an 80-character limit doesn't guarantee good naming, but it acts as friction against adopting ultra-long names, occasionally forcing devs to find a better alternative. YMMV.


What naming clarity? Isn't every Python variable called "spam" or "eggs" anyway? ;-)


Then may I respectfully submit that you or your employer should buy an appropriate monitor, or you could use two.

I'm not asking for 300 character lines. A 25% increase would be a rational and sufficient nod to the fact that monitors are bigger.


I work at the same place you do. I code often enough from my MacBook, using likely the same tools that you use, and they're truly painful with >80-character lines. I don't understand how Go and Java devs survive.

I also don't think an 80 character line limit is a bad thing: small well defined functions that do only one thing are good. Long lines often encourage people to want to nest code deeply (which is terrible!) or to write complex one-liners, and that + list comprehensions is a dangerous pairing.
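To illustrate with a made-up example: the same logic as a dense one-liner versus split across a small named helper, which the 80-column limit nudges you towards:

```python
rows = [{"name": "a", "vals": [1, -2, 3]}, {"name": "b", "vals": [4, 5]}]

# Dense one-liner: legal, but a reviewer has to unpack it in their head.
totals = {r["name"]: sum(v for v in r["vals"] if v > 0) for r in rows if r["vals"]}

# The same thing with a named helper fits comfortably in 80 columns:
def positive_total(row):
    """Sum only the positive values of one row."""
    return sum(v for v in row["vals"] if v > 0)

totals = {r["name"]: positive_total(r) for r in rows if r["vals"]}
```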


Java dev here. I would try and defend a 120 character limit but I have a 34" 5k widescreen monitor so I probably don't have a leg to stand on.

I'm writing a mix of Java and Python at the moment: Python for lambdas behind an API gateway, and Java on some containers for stuff that's more complex but evolves more slowly.

It's neither Python nor Java where I'm really spending my time, though. It's CloudFormation, Ansible, Jenkins, and stitching the whole system together for CD that's killing me. I feel like programming languages are the easy bit these days.


> I feel like programming languages are the easy bit these days.

Agreed. The mainstream garbage-collected languages are all basically the same in the grand scheme of things. The work that takes most of my time (and growing) lately is packaging, testing, deployment, etc.


The only tool that gets remotely irritable about it is the one we use for diff reviewing, afaict?

> I also don't think an 80 character line limit is a bad thing: small well defined functions that do only one thing are good. Long lines often encourage people to want to nest code deeply (which is terrible!) or to write complex one-liners, and that + list comprehensions is a dangerous pairing.

Python already starts you out at a nest of 2-4 characters, so we don't even get the full 80.

But honestly I don't think a 100 character line is going to doom us all to hyper-nested expressions.


I've never worked at a place where the prevailing Python culture wasn't to set the linter to kill and just let it go wild.


Well that sounds like a personal problem and it has nothing to do with the language. As a counter-anecdote, I've been working on Python projects for over a decade and have yet to experience the draconian linter settings you described.


> Well that sounds like a personal problem

Have I given you the impression that any of the opinions or experiences I have expressed here are not personal?

I've never not encountered them, so I guess you're just lucky.


Any language can benefit from the beauty of enforcing general typography guidelines that go back 100+ years.


Thanks for posting. I've wondered if it's just me or does anyone else want to leave software engineering because of Python's dominance. There's no arguing against it either because as this thread shows, "I like it for my use case, look at these libraries that let you get X and Y done SOOOO easy, so it must be great for everything."


I've contemplated leaving SE behind because of JavaScript, not because of Python. Do you tolerate JS?


I hated Javascript back when all it had were callbacks, but once native promises stabilized, I loved the simplicity + ubiquity of the promise abstraction, and then later async/await. Now it's actually my favorite dynamically-typed language.

I've noticed a lot of Javascript hate also comes from people just disliking client development (which isn't easy on any platform).


I thoroughly enjoy UI/client development, but the web app tooling (layer upon layer that is supposed to "fix" HTML/CSS/JS every few years) is frustrating to me. Thus, I avoid it like the plague. :)

To be fair, the JS ecosystem I think is a bit more tolerable than Python. JS isn't being promoted far and wide as the good-enough tool for everything like Python is. A lot of JS is focused on the "view", and I can see a dynamic language fitting there (though React/Redux is making me rethink that).


I don't know why you hate it so much TBH.

It is a language better than MATLAB/R/SPSS, which came before it.

I honestly don't think it is that bad. And there are many people who don't care from a programming-language perspective; they just need to finish the functionality.


> It is a language better than MATLAB/R/SPSS, which came before it.

I don't think so. I think R is a lot more expressive and not really any harder to read. It might have a steeper learning curve, but it's not so bad that I think that actually matters.


I'm currently learning R. It's a terrible, dreadful programming language, but an excellent DSL for statistical analysis. The main mechanisms behind R's expressiveness are its use of the Environment structure and a terrifyingly casual use of call-stack inspection just to make symbolic APIs. The latter isn't even made a workable part of the language without Tidyverse things like rlang. In fact, without the Tidyverse efforts the language is bad even as a DSL, unless you only deal with a CSV of data with like 100 data points.


I think R is a much worse programming language: 1-indexing, very unintuitive string manipulation, the dataframe being the magic facade that hides complexity, etc.

I would choose Python any day if the other option is R.


1-indexing makes sense within the realm in which R shines. It’s only CS people who seem to completely lose their shit when they encounter it. I’m glad Julia also has 1-indexing.


If you are a computer scientist doing a little data science, then R is awful for the reasons you detailed. If your work consists entirely of data science, then R is a fantastic language for the same reasons you hate it.


R is a fantastic statistical toolkit and a terrible programming language. Building software in it? Massive pain in the ass. Processing tabular datasets, on the other hand, is an absolute breeze that leaves pandas in the dust. The Tidyverse makes it better, though.


Yeah, this is generally my take. On the one hand, I cringe when I see all the discussion about how much better Python is than R for data science. As a data scientist who spends 90% of my time cleaning, manipulating, and modeling data, I would take R any day over Python. But at the same time, I realize that for many people data science is more like 90% software engineering, with 10% being the actual data manipulation and modeling, and for them R is a complete non-starter.


I look at developer surveys sometimes when I'm trying to decide what to learn next. According to the 2018 stack overflow survey, 68% of python users would like to use it again in the future [1].

The surveys never tell me why though. What do people like or dislike about python? I know it has a lot of libraries people love (scikit-learn, tensorflow come to mind)

[1] https://insights.stackoverflow.com/survey/2018/#most-loved-d...


Those surveys often target mostly newer devs who know no better, because they have not had the time to validate their opinions against the real world.

It's not their fault though; you can only do so much in limited time. That's why expertise requires years.


I would rather Golang ate the world than Python, because the practicality of Python becomes questionable when performance is key.

In my previous startup in India, I trained unskilled personnel to become decent Python developers to work on our product; everything was fine until the product grew exponentially and we had to optimise every nook and cranny of it to better serve the customers and save on server bills.

So we had to optimise our application with Cython to meet the demands. When training someone to code, if we use Python as the language, we should follow up with the disclaimer: "You will learn to code faster than with most other languages, but you cannot use this knowledge to build something at scale (at least when budget is a concern and you are not Instagram)."
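To give a flavor of what that Cython step looks like (a hypothetical hot loop, not their actual code), Cython's "pure Python" mode keeps the file valid Python while the annotations let cythonize() compile it down to typed C:

```python
import cython  # the annotations are inert until compiled with cythonize()

def window_sum(values, width: cython.int) -> list:
    """Rolling sum over a sequence of numbers: a typical hot loop."""
    out = []
    total: cython.double = 0.0
    for i in range(len(values)):
        total += values[i]
        if i >= width:
            total -= values[i - width]  # slide the window forward
        if i >= width - 1:
            out.append(total)
    return out
```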

In comparison, Golang excels in both form and function. It is just as easy to teach/learn as Python and doesn't need the aforementioned disclaimer. Web services are a breeze to write, and the language is built for scale.

I understand that there are criticisms of Go's language design; some are valid, and most exist just because it was made by Google, but none question the scalability of Go applications.


Many of my complaints about Python are valid for Go, except that Go makes even more perilous decisions than Python for error handling (and the community kinda gleefully embraces it).

But at least Go is a lot faster and has real concurrency AND parallelism, so it's definitely better than Python.


Go makes even worse decisions, I'd say. Python has gevent, one of the best and most friendly greenlet libs around, so the concurrency/parallel issues seem fairly moot. It's true that the batteries-included versions of things, while mostly easy to use, falter at scale.
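For reference, a minimal gevent sketch (the URLs are stand-ins): after monkey-patching, ordinary blocking I/O yields to the event loop, so concurrent fetches read like sequential code:

```python
from gevent import monkey
monkey.patch_all()  # patch sockets so stdlib I/O cooperates with greenlets

import gevent
import urllib.request

def fetch(url):
    return urllib.request.urlopen(url).read()

urls = ["http://example.com", "http://example.org"]
jobs = [gevent.spawn(fetch, u) for u in urls]  # greenlets run concurrently
gevent.joinall(jobs, timeout=10)
print([len(j.value or b"") for j in jobs])
```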


> so the concurrency/parallel issues seem fairly moot.

Python has the GIL, so true parallelism can only be achieved by spawning roughly one process per processor core and using inter-process communication.

Go doesn't have this problem.
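Concretely, the standard workaround looks something like this: multiprocessing gives each worker its own interpreter (and its own GIL), so CPU-bound work actually runs in parallel:

```python
from multiprocessing import Pool

def cpu_bound(n):
    """A CPU-bound task; threads could not run this in parallel under the GIL."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool() as pool:  # defaults to one worker process per CPU core
        print(pool.map(cpu_bound, [10**6] * 8))
```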


I'm really glad to hear someone else voice these frustrations. I've really tried to embrace Python, and everyone tells me how great it is. I feel like a failure as a developer, but I dislike it so much that I quit a job that switched to Python as the primary language, and took a year off from development. I'm pretty much a polyglot as far as languages go, but Python riles me.


Python is the simplest mainstream language yet still reasonably powerful. Some folks don’t like simple and I’ve found it’s better in the long term to find those that do.


It sounds like you're bothered by its syntax or lack of a sophisticated type system, etc. But it's just a tool, not an amusement park.

> why people seem to like it so much?

> cobble together something like a Bayesian model things like PyMC or Edward actually give you performance that's obnoxiously difficult to reproduce

And this, across many domains


Finally, someone typed what I was thinking.


>that infernal pep-8 linter insisting 80 characters is a sensible standard in 2019

So change the limit or disable that check. If someone is keeping you from doing that they're the one who's insisting on 80 characters, not the language. Who uses any linter without making some changes to its default settings?


I was smart and lucky enough to switch from Python to Clojure and from Javascript to Clojurescript. I am not even sure anymore what I liked about Python back then. I know for sure: those who really like Python probably haven't given a heartfelt try to Clojure, Haskell, or Racket.


> (and that infernal pep-8 linter insisting 80 characters is a sensible standard in 2019) increases my burnout by 100x

I've yet to encounter a Python linter where you can't pick and choose which rules to ignore. This is the first one to go. Annoying PEP for sure, but https://pypi.org/project/black/ almost completely eliminates your issue.


There is a lot of hype and "they are using it too" mentality as well, I think. I am working on control software which, if it crashes or even takes a tiny bit too long to issue a command, could cause the company a big financial loss. I was forced to do it in Python "because everyone else is using python".


> Every moment working with python (and that infernal pep-8 linter insisting 80 characters is a sensible standard in 2019)

It's a very sensible standard. There's a reason most books are taller than they are wide. The fact that we have bigger screens now doesn't change it.


This is about as sensible to me as saying it is a good idea because most trains are longer than they are wide. Who cares about those things? We use computers that nearly all have 4:3 or 16:~10 displays (or wider), and many of us use more than one of them.


Display size has nothing to do with it. Long lines are hard to read:

https://en.wikipedia.org/wiki/Line_length


I obviously don't agree, but it is worth noting that comparing typographic conventions for English with typographic conventions for code is not a very good idea, to me. Especially when we're discussing a language with semantically active whitespace.

A 100 character line might only see 15-70 characters of active use.


While this is a fair point, pylint bumps the limit to something like 100 characters, and you can split deeply nested logic into separate functions with ease.
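If it helps anyone, pylint's default is indeed 100, and the knob lives in .pylintrc:

```ini
# pylint's line-length check, under the [FORMAT] section of .pylintrc
[FORMAT]
max-line-length=100
```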


HN has ~200 char lines and I find it easy to read long bikeshed arguments.


Are you implying short and wide trains would be sensible? I like using a wide screen for programming, but prefer code formatted to 80 chars. That allows me to have two vertical windows of code open side by side. It also makes it easier to use things like a graphical diff to merge code.


Well they'd be interesting, I guess. Maybe they are already short and wide and they're like crabs that scuttle to their side?


It's funny, because a few people ended up using Python as a prototyping tool to find better approaches and then rewrite in C++ or something else. Somehow, in that regard, it saved them tons of time.


You can configure Pep8/Flake8 to ignore a subset of rules. You might wanna look into that. Also, running it automatically might or might not be your preference.


This is the top comment? Not very useful.


Sorry. I'm as surprised as you are. I figured I'd take a small hit to the karma, and I feel better for getting this off my chest.

This comment is more highly rated than some links I've submitted that made it to the front page.


I believe it. Probably evidence that Python is more heavily used than ever, even in places it shouldn't be.


Well in my world, the other language is Java, not Haskell. Alone in this?


For what it's worth, Java tortures me a lot less since J8. I don't dread it the way I do python because I can use Clojure or Kotlin.


So the workplaces you've been at don't even let you change the line length limit, but they let you mix Clojure or Kotlin into a Java codebase?


Ah, no when I ran my own company I decreed it was a Clojure shop.


What about .NET Core?


Fortran has great linear algebra libraries.


What I find worse about Python than other languages is the lack of tooling and a relatively small collection of libs, many of which are only halfway done. In my last assignment, we decided Python is at most a hobby language. Python is great for machine learning because most research was conducted using this language, and as such, tooling is available there. I would use it at most as an API exposing ML, but that's pretty much it.


I don't know what parallel reality you live in, but there are few languages with as many packages for getting productive things done as Python.

In web dev and data science, I have yet to see a language with as many libraries for useful stuff.

Now, if what you're looking for is high performance and precise memory management, sure, the language will never give you that.

What language has as many easily available libraries? Java, .net and npm come to mind as the only comparable ecosystems.


Well, that's what I said: there are plenty of packages for data science. For building APIs and web stuff there usually aren't. Python is nowhere near NodeJS, for instance. But whatever suits you; if you only know one language, it will seem like the best language out there.


This comment is so strangely off the mark I almost wonder if you're basing this off of some secondhand hearsay. Until recently, Python's largest arena of usage was web development and API scaffolding by far. A simple search would have revealed literally thousands of web development focused libraries, most centered around the Django ecosystem [1].

As another commenter mentioned: Django, DRF, Flask

But unmentioned... The old titans of Pyramid, Pylons, CherryPy, Bottle, Tornado, wheezy, web2py, and more.

A _gigantic_ portion of the community is centered around web development, and the fact that almost all web packages have centralized around Django, DRF, and Flask is a function of dozens upon dozens of popular frameworks merging, in a "survival of the fittest" fashion, into the best possible union of ideas. You seem to perceive "many packages" that all perform a similar function to be a good thing, but speaking from the perspective of someone who writes 70% Javascript and 30% Python, I'll tell you that almost every other sane group of developers considers the highly duplicative JS ecosystem to be a massive weakness.

The NPM ecosystem has an _ocean_ of shallow, highly similar, and one-off frameworks because Javascript developers tend to "divide and conquer" rather than bring good ideas together. Python developers deprecate their packages in favor of better ideas, or just open a pull request to alter the project rather than creating a 99th competing framework in the first place.

[1] https://pypi.org/search/?q=&o=&c=Environment+%3A%3A+Web+Envi...


No?

You have Django and DRF and Flask. DRF is, by far, the best-designed framework for building APIs in any programming language I have ever seen, and I've been at this game for a long time.

JavaScript has 10 times more packages than PyPI, and for the most part they are absolute garbage with no intent of ever being maintained.

Python has substantially fewer packages where the community rallies around them and keeps them as best-in-class. SQLAlchemy, DRF, factoryboy, requests, etc., are all incredible one-stop-shops for the vast majority of use cases.

You don't need 15 libraries for doing HTTP requests. You just need one good one that does the job so you never have to think about the problem ever again. Python excels at this class of libraries and by comparison npm has appalling choices.
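Case in point, the requests library: the common HTTP cases take a couple of lines (example.com is a stand-in URL):

```python
import requests

resp = requests.get("https://example.com/api/items", timeout=5)
resp.raise_for_status()  # turn HTTP error statuses into exceptions
items = resp.json()
```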


[flagged]


I'll take a framework that has worked effectively for a decade without significant changes, where I can use all my accumulated experience to get things done, over the random musings and experimentation of half-baked JS libraries, where you're stuck for hours solving relatively trivial issues that I have never had to think about in DRF, because DRF just works and is backed by some of the most experienced people in API architecture in the industry.

When you think about and iterate judiciously in a problem domain over a decade instead of trying something different every day to see what sticks, you'll see you can get remarkably far.


>Oh nice, you have two frameworks. Well, whatever suits your fixed mindset.

Let me see, you've got Django, DRF, Flask, Flask-Restful, Flask-Restplus, Flask-API, sanic, pyramid, Cornice, hug, falcon, eve, bottle, tornado, aiohttp, vibora, weppy, starlette...

Those are just some that I've picked out of my head.


Most of those enumerations are Flask, and the others are basically useless.


And what exactly are Flask and Django missing to merit more packages doing the same thing?

Routing requests and managing HTTP fundamentals is a solved problem. There is literally zero value in adding another framework when the real complexity is in business logic.


That's the issue. These route requests and not much more. I recommend looking around at Java, Node, PHP, and C# web frameworks for more details on what a web framework should do. SQLAlchemy and other hobby libs can extend Flask and Django, but those are similarly limited.


Then you've not really delved into the issue at all.

Django does routing, forms, ORM, templates, APIs, GIS, you name it. Flask is more minimalist and expects third-party packages for this.

Django also has a very extensive ecosystem of libraries for managing the common web use cases.


> In my last assignment, we decided Python is at most a hobby language.

I don't know if you are just trolling, but this is the silliest, most detached from reality thing I've read online today. Python was key in building numerous massive public companies like Instagram and Dropbox. It's one of the most popular and widely used programming languages on the planet for everything from APIs to desktop clients to data pipelines. It had a lot of early popularity in the Silicon Valley start-up scene in the early 2000s, even pre-dating the Ruby on Rails web dev trend.

The guy who founded the company that runs this very website wrote about the draw of Python 15 years ago [1] at which point it was already widely used in certain niches. This is before any deep learning libraries existed. I remember first playing with Python around 1999 or so.

> I would use it at most as an API exposing ML but that's pretty much it.

I don't know how old you are or how long you've been in the industry, but the ML thing is a "second act" for Python. Deep learning grew up in a time and place where Python was a good fit which put Python in the right place to benefit. But Python had already lived a long and full life before any of that happened.

It's fine if you don't like Python or don't think it is a good fit for a project, but claiming it is a "hobby language" with a "lack of tooling and a relatively small collection of libs" is a good way to get laughed out of the room. It has one of the largest and most diverse library ecosystems. And as far as tooling goes, Python is one of the most popular languages for implementing tooling. Check out Ansible.

> There are plenty of packages for data science. For building APIs and web stuff there usually aren't. Python is nowhere near NodeJS, for instance.

This has got to be a troll, right? Just some of the most popular web frameworks: Django, TurboGears, web2py, Pylons, Zope, Bottle, CherryPy, Flask, Hug, Pyramid, Sanic... Lots of huge websites were originally built with Python like Instagram, Reddit and YouTube. Of course they mature into complex distributed architectures using all kinds of languages, but it's all about the right tool for the job.

[1] http://www.paulgraham.com/pypar.html


It appears, though, that the right tool for the task at hand, as a codebase grows, is NOT Python. It's a good starting point for someone fresh off a university campus and easily impressionable. An "object oriented" language without even the basics of scope visibility is not object oriented. Half-done libs are not libs that one might consider production grade. Debugging tools where you need to dive into execution branches and heaven knows what other archaic debugging techniques are not debugging tools. Missing documentation or poorly written README files are not documentation. This whole Python movement gained a bit of traction because it became popular in universities. The realities of real-life programming will sway many religious programmers towards more suitable languages.

So yeah, fantasising about how Python "is eating the world" is a nice dream, but the only thing Python is eating is the dust left behind by far more developed programming languages, surrounded by far more modern ecosystems.



