I can't help but find type hints in Python to be... goofy? I have a colleague with a substantial C++ background who is now working in Python, and the code is just littered with TypeAlias, Generic, cast, long Unions, etc. This can't be the way..
Typing is a relatively easy way for the human author and the machine to notice if they disagree about what's going on before problems arise. It is unfortunate that Python doesn't do a good job with types, I was reading earlier today about the mess they made of booleans - their bool type is actually just the integers again.
Prior to 2.3, Python didn't have booleans, just "truthiness". 2.3 added bool as a subclass of int (given the patterns of existing code it was a pragmatic choice). True and False were introduced, but they could be reassigned, which caused all manner of fun. 3.x made them keywords, which put a stop to that, but the int aspect remained.
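The subclass relationship is easy to confirm from any Python 3 repl:

```python
# bool really is a subclass of int, and True/False compare equal to 1/0.
print(issubclass(bool, int))   # True
print(isinstance(True, int))   # True
print(True == 1, False == 0)   # True True
```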
A related screw-up is implicitly casting everything to bool. A lot of languages made that mistake.
Overall I'd say they didn't do an awful job though. The main problems with Python are the absolutely abysmal tooling (which thankfully uv fixes), the abysmal performance (which sometimes isn't an issue, but it usually becomes an issue eventually), and the community's attitude to type checking.
Actually type checking code you've written yourself with Pyright in strict mode is quite a pleasant experience. But woe betide you if you want to import any third party libraries. There's at least a 50% chance they have no type annotations at all, and often it's deliberate. Typescript used to have a similar problem but the Javascript community realised a lot quicker than the Python community that type hints are a no-brainer.
Because Python decided (for the usual New Jersey reason: simplicity of implementation) that bool should just be an integer type, the Liskov criterion comes into play: if we can X an integer, and we've agreed a bool is an integer, then we can X a bool. That's not what booleans are, but hey, it's sorta close and this was easier to implement.
So, can we add two bools together? Adding booleans together is nonsense, but we've said these are a kind of integer, so sure, I guess True + True = 2? And this cascades into nonsense like ~True being a valid operation in Python whose result, -2, is truthy...
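A few repl lines show how far the "bools are ints" decision cascades:

```python
# Arithmetic on booleans is just int arithmetic in disguise.
print(True + True)                     # 2
print(True * 10)                       # 10
# The one widely defended idiom: summing bools to count matches.
print(sum(x > 2 for x in [1, 3, 5]))   # 2
```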
Out of curiosity, I tried running `~True` in a Python 3.14.2 repl and got this output (the -2 is part of the output):
>>> ~True
<python-input-1>:1: DeprecationWarning: Bitwise inversion '~' on bool is deprecated and will be removed in Python 3.16. This returns the bitwise inversion of the underlying int object and is usually not what you expect from negating a bool. Use the 'not' operator for boolean negation or ~int(x) if you really want the bitwise inversion of the underlying int.
-2
Yes, the article I was reading was about proposals to er, undeprecate this feature. Reasoning that well, sure it's obviously a footgun - but it works for integers and we've said bools are integers so...
> So, can we add two bools together? Adding booleans together is nonsense, but we've said these are a kind of integer, so sure, I guess True + True = 2? And this cascades into nonsense like ~True being a valid operation in Python whose result, -2, is truthy...
The bitwise negation is indeed janky and inaccurate, but True + True = 2 is absolutely a valid thing to say in boolean algebra. Addition means "or", and multiplication means "and".
I always remember learning that 2 was a legit enough way to represent the result of 1 + 1, but the internet seems to agree with you mostly. Though I contend that 1 + 1 = 2 is unambiguous, so is fine.
Huh, I learned something new; I was not aware of "two-element Boolean algebra" nor just how deep this particular rabbit hole goes.
It's fine that 1 + 1 = 2. That's just integer arithmetic. The problem is that the booleans are not "just integers" and so Python's choice to implement them as "just integers" while convenient for them has consequences that are... undesirable.
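The divergence is easy to see side by side: Boolean-algebra "addition" (disjunction) stays inside {True, False}, while Python's `+` on bools is plain integer addition:

```python
print(True or True)   # True -- disjunction: 1 "+" 1 = 1 in the two-element algebra
print(True + True)    # 2    -- int addition: leaves the booleans entirely
```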
The alternative should be using a real statically typed language instead of glorified comments that don't do anything without outside tools.
I understand that very large code bases have been built in Python, and this is a compromise to avoid making them rewrite Ks upon Ks of LoC, but as it stands, Python type annotations are akin to putting a Phillips-head screwdriver on a ball-peen hammer: the screwdriver is not a real screwdriver, and the ergonomics of the hammer have been compromised.
Well yes I agree using Rust or whatever would be better, but if your options are Python or Python with type hints, then the latter gets you closest to proper static typing. They're really not that bad with Pyright in strict mode. Mypy is rubbish.
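As a sketch of what that looks like in practice (the function names here are made up for illustration): strict mode wants every parameter and return value annotated, and `cast` is the usual escape hatch at untyped third-party boundaries:

```python
from typing import cast

# Pyright in strict mode flags any unannotated parameter or return type.
def mean(values: list[float]) -> float:
    if not values:
        raise ValueError("empty input")
    return sum(values) / len(values)

# Stand-in for a call into an unannotated third-party library: cast() does
# nothing at runtime, it only tells the checker what you believe the type is.
def load_config() -> dict[str, str]:
    raw: object = {"env": "prod"}  # pretend this came from an untyped API
    return cast(dict[str, str], raw)

print(mean([1.0, 2.0, 3.0]))   # 2.0
print(load_config()["env"])    # prod
```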
This, honestly. Seeing all those billionaires on inauguration day lined up to kiss the ring was utterly pathetic. Like what is the fucking point of having billions of dollars if you're just going to be someone else's bitch. And for what? A couple more billion dollars. Oof
I hate it when people obfuscate the fact that organizations are not individual entities. Someone, or a group of people, decided to do this, not some faceless org. Some had to greenlight this.
Semi-tangent, but I am curious: for those with more experience in Python, do you just pass around generic pandas DataFrames, or do you parse each row into an object and write logic that manipulates those instead?
Speaking personally, I try not to write code that passes around dataframes at all. I only really want to interact with them when I have to in order to read/write parquet.
Pass them around as immutable values, and try to enforce a schema (e.g., Arrow) to keep them typed and predictable. This is generally easy if you ensure initial data loads get validated; then basic testing of subsequent operations goes far.
If Python had dependent types, that's how I'd think about them, and keeping them typed would be even easier, e.g., catching nulls sneaking in unexpectedly and breaking numeric columns.
When using something like Dask, which forces stronger adherence to typing, this can get more painful.
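On the null point: in pandas (assuming the classic NumPy-backed dtypes, not the newer nullable extension dtypes), a single missing value silently demotes an integer column to float:

```python
import pandas as pd

clean = pd.Series([1, 2, 3])
dirty = pd.Series([1, 2, None])   # one null sneaks in
print(clean.dtype)  # int64
print(dirty.dtype)  # float64 -- the whole column is now floating point
```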
The circumstances where you would use one or the other are vastly different. A dataframe is an optimized data structure for dealing with columnar data: filtering, sorting, aggregating, etc. So if that is what you are dealing with, use a dataframe.
The goal is more about cleaning and massaging data at the perimeter (coming in, and going out) versus what specific tool (a collection of objects vs a dataframe) is used.
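A minimal sketch of that "validate at the perimeter" approach, with hypothetical names, using plain dataclasses rather than any particular validation library:

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # immutable once constructed
class Trade:
    symbol: str
    price: float

def parse_row(row: dict) -> Trade:
    # Fail loudly at load time instead of deep inside downstream logic.
    if row.get("price") is None:
        raise ValueError(f"null price for {row.get('symbol')!r}")
    return Trade(symbol=str(row["symbol"]), price=float(row["price"]))

rows = [{"symbol": "AAPL", "price": "187.5"}, {"symbol": "MSFT", "price": 402}]
trades = [parse_row(r) for r in rows]
print(trades[0].price)  # 187.5
```

Past this boundary the rest of the code manipulates typed Trade objects; the dataframe only reappears at I/O.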
I find it hilarious that people applaud Norway, whose economy is heavily driven by exporting petroleum gas and crude oil, for leading the world in clean energy adoption.
What would you like Norway to do? It's been more successful than other countries (generally speaking) insulating its economy from Oil. Would it be better to _not_ also try to drive adoption of clean energy?
What? Their economy is highly dependent on fossil fuel exports. Just because the oil is being burned somewhere else does not absolve them from having dug it up in the first place. Everyone is living on the same planet.
Wasn't my point. Domestic economy has suffered less from Dutch Disease than a lot of other countries in their position.
My point is that reducing this to "Norwegians still buy cars and cars are bad" is reductive, and if people are going to buy cars, reaching high levels of electric car use is a good outcome.
Now there should also be focus on reducing car use period. But that doesn't mean it was a mistake to electrify transportation in the meantime.
My point is that whatever is working for them is not even remotely applicable to 90% of the other countries. They are better than Saudi Arabia and other rich countries, but that's about it.
Climate change is a global problem.
Fossil fuels burned in Norway or anywhere else contribute the same amount of CO2.
It's kinda like shipping your plastic trash to another country, having them dump it into the ocean, and going "look how clean we are, 0 plastic trash!"
Is the oil company evil for selling me oil, or am I evil for buying it? I know which message is more palatable for the average consumer. I also know that without buyers there would be no sellers.
People are blaming vibe coding but the real culprit was hiring leetcoders in the first place. I genuinely believe the stark decrease in quality of most products across the industry has been driven by that.
A developer said to me once, outside of work hours: "Why are you serious about [fundamentals]? [Framework] is where all the money is at." Work cultures that embrace those with his philosophy will have trouble. And here we are.
They exist but are rare and don't hire often. I know a guy (a self-taught programmer) who got his first major job at a company doing native UI (not even using OS frameworks, straight GPU stuff).
The company does highly complex simulation software used by movie studios for explosions and other effects.
He got hired by word of mouth recommendation from someone at the company that had met him.
It takes as much luck as it takes skill to get these sorts of jobs, sadly.