I installed CachyOS on a spare SSD with the idea that if it became a headache I'd go back to Windows. All my games work, even obscure open-source ones. Sound, monitor, KVM switch, volume rockers, Bluetooth game controllers, headsets: everything works with zero issues. CUDA works, I can run ML models, and it's all noticeably faster than on Windows. So far there's been no reason to switch back. Next, I'll wipe my NVMe drive and be done with Windows for good.
There is no carbon capture technology on earth that can be rolled out over the next few years at a scale that competes with planting trees, and especially not one that has just been invented at a single university. An ash tree grows about 90 cm per year, and that growth is all captured carbon. Scale that to millions and billions of trees.
Boiling the kettle means I can make pasta and a sauce in about 12 minutes; bringing the water to a boil on the stove just adds tons of time. Given that the kettle is right next to the stove, it's simply self-sabotaging not to use it: one button press.
Yeah huge difference if you have 240V mains. A 120V electric kettle is not much of a timesaver (though perhaps still worth something if it frees up a place on the cooktop that you need for something else).
Actually, this has already happened in a very literal way. Back in 2022, Google DeepMind used an AI called AlphaTensor to "play" a game where the goal was to find a faster way to multiply matrices, the fundamental math that powers all AI.
To understand how big this is, you have to look at the numbers:
The Naive Method: This is what most people learn in school. To multiply two 4x4 matrices, you need 64 multiplications.
The Human Record (1969): For over 50 years, the "gold standard" was Strassen’s algorithm, which used a clever trick to get it down to 49 multiplications.
The AI Discovery (2022): AlphaTensor beat the human record by finding a way to do it in just 47 steps.
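To make the counting concrete: the jump from 64 to 49 comes from applying Strassen's 2x2 trick recursively, since a 4x4 product can be treated as a 2x2 product of 2x2 blocks, giving 7 × 7 = 49 scalar multiplications. A minimal Python sketch of the 2x2 case (function name and tuple layout are mine, just for illustration):

```python
def strassen_2x2(A, B):
    # Multiply two 2x2 matrices using Strassen's 7 products (1969)
    # instead of the naive 8. Applied recursively to 2x2 blocks of a
    # 4x4 matrix, this yields 7 * 7 = 49 multiplications vs. 64.
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    # The seven Strassen products:
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # Recombine into the result (additions are cheap; only the
    # multiplication count matters asymptotically):
    return ((m1 + m4 - m5 + m7, m3 + m5),
            (m2 + m4, m1 - m2 + m3 + m6))


# Sanity check against the textbook definition:
A, B = ((1, 2), (3, 4)), ((5, 6), (7, 8))
print(strassen_2x2(A, B))  # ((19, 22), (43, 50))
```

AlphaTensor's 47-step and AlphaEvolve's 48-step schemes have the same shape, just with different (and far less obvious) product combinations.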
The real "intelligence explosion" feedback loop happened even more recently with AlphaEvolve (2025). While the 2022 discovery only worked for specific "finite field" math (mostly used in cryptography), AlphaEvolve used Gemini to find a shortcut (48 steps) that works for the standard complex numbers AI actually uses for training.
Because matrix multiplication accounts for the vast majority of the work an AI does, Google used these AI-discovered shortcuts to optimize the kernels in Gemini itself.
It’s a literal cycle: the AI found a way to rewrite its own fundamental math to be more efficient, which then makes the next generation of AI faster and cheaper to build.
This is obviously cool, and I don't want to take away from that, but using a shortcut to make training a bit faster is qualitatively different from producing an AI that is actually more intelligent. A more intelligent AI can recursively produce an even more intelligent one, and so on; hence the explosion. If training is merely a bit faster but the result is the same, there's no explosion. It may be that finding efficiencies in our existing equations is low-hanging fruit, while developing fundamentally better equations proves impossible.
That's monetary tightening; fiscal policy has been getting looser in all developed nations for a while, except in places like Greece that were forced to tighten.
OK, I'll translate it: developed nations' central banks have been reducing the supply of money, while their governments have been spending a higher proportion of it.
Not for the ICE, the only train worth getting for me. Also, I don't live in Germany but in the Netherlands, and trains here are expensive. Even though I live in the Randstad, near the big cities, I don't have any good public transport here. It's just not worth it for me.
Haha, that's funny: the person who introduced me to programming set me the Bedlam cube as a challenge. I used dancing links and later adapted it to sudoku!