
In most applications you will not be able to get to the level of performance where any of this is going to matter.

For example, if you program in Python, there are so many more conditionals in the Python runtime itself that you will not see any difference from a couple of your own conditionals being predicted better or worse. You will also be doing a lot of other work (spending time in framework code, calling libraries, performing I/O) that will make any gains from this pretty insignificant.
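To make that concrete, here is a minimal sketch of the classic sorted-vs-shuffled branch prediction experiment, run in pure Python. The function, data sizes, and threshold are all made up for illustration; the point is that in CPython the interpreter's own dispatch overhead tends to swamp any misprediction cost, so the gap that would be dramatic in C is usually small here (exact numbers vary by interpreter version and CPU).

```python
import random
import timeit

def count_big(data, threshold=128):
    """Count elements above a threshold.

    The `if` below is perfectly predictable on sorted input and
    essentially random on shuffled input.
    """
    n = 0
    for x in data:
        if x > threshold:  # the branch in question
            n += 1
    return n

data = [random.randrange(256) for _ in range(100_000)]
shuffled = list(data)
sorted_data = sorted(data)

# In C, sorted vs shuffled can differ severalfold; in CPython the
# interpreter executes many branches per bytecode, so the gap shrinks.
t_sorted = timeit.timeit(lambda: count_big(sorted_data), number=10)
t_shuffled = timeit.timeit(lambda: count_big(shuffled), number=10)
print(f"sorted: {t_sorted:.3f}s  shuffled: {t_shuffled:.3f}s")
```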

Where this comes into the picture is when you are really bent on improving the performance of your code on a grand scale (say, when you develop algorithmic trading systems or an operating system), when you have a very dense, busy inner loop (for example, video encoding), or when you develop compilers.



I think that's more of an indictment of how bad the concept of "production python" is.


I disagree with that. Production doesn't mean that the code runs fast, but that it is stable and reliable.

There are use cases where you need a performant system that uses the CPU really efficiently; in those cases Python is probably not the right tool.


Lots of code doesn't run so often that it needs to be optimal. If you use proper data structures and algorithms then you'll avoid the real problems. The constant factor slowdown you get from language choice won't matter for the vast majority of lines of code.
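As a sketch of "proper data structures" mattering far more than language-level branch costs: membership testing against a list is O(n), against a set it is O(1) on average. The sizes below are arbitrary, but the asymptotic gap shows up at any realistic scale.

```python
import timeit

needles = list(range(9_000, 10_000))
haystack_list = list(range(10_000))   # `in` scans linearly: O(n)
haystack_set = set(haystack_list)     # `in` hashes: O(1) on average

t_list = timeit.timeit(
    lambda: [x in haystack_list for x in needles], number=20)
t_set = timeit.timeit(
    lambda: [x in haystack_set for x in needles], number=20)

# The set version is typically orders of magnitude faster,
# dwarfing any constant-factor slowdown from the language itself.
print(f"list: {t_list:.3f}s  set: {t_set:.3f}s")
```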


This is given as an excuse to write sloppy code, and then we get the bloated mess that is modern software and the modern web.

Plus, Python has many problems besides performance.


Sloppy code would be using the wrong data structures and algorithms. Sloppy code would be optimizing nothing at all.

If you write things with good big O performance, for almost all of your lines of code you're done. It will perform fine.

Bloated messes don't come from writing a bunch of business logic in Python instead of C, not while processors are this many orders of magnitude faster than they used to be. They come from layers and layers of abstractions, or from doing things completely the wrong way.


But if you're processing tens of thousands of records in an O(n) loop, it will still be tens of thousands of times slower than processing a single record in that loop. I.e., having good big O still doesn't mean the code will be fast.


Are you saying this is a scenario where O(1) is possible? Then "good big O" is O(1). Code that is O(n) is a mistake.

If you're saying the fastest method is O(n), then sure, no method can guarantee fast processing of the entire list. But that fact sure isn't Python's fault!


My takeaway was the opposite. You have a fairly generous budget of ifs and code size if you use a low-level language. But if you use a higher-level language you could run into the limits.


Bit of a tangent, but in the userland app space I'm increasingly interested in sidestepping the whole "conditionals" question with finite state machines, e.g. with a lib like XState.
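XState itself is a JavaScript/TypeScript library, but the underlying idea is language-agnostic: replace scattered if/elif chains with a single transition table. Here's a minimal Python sketch; the states and events are made up for illustration.

```python
# All valid (state, event) pairs live in one table instead of
# being spread across nested conditionals.
TRANSITIONS = {
    ("idle", "FETCH"): "loading",
    ("loading", "RESOLVE"): "success",
    ("loading", "REJECT"): "failure",
    ("failure", "RETRY"): "loading",
}

def step(state, event):
    # Events that are invalid in the current state are simply ignored,
    # which is one of the main selling points of the approach.
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["FETCH", "REJECT", "RETRY", "RESOLVE"]:
    state = step(state, event)
print(state)  # -> success
```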



