Apple is not perfectionistic. Apple is performative. The entire company is performing software development instead of actually doing it.
Apple's development process is a marketing pitch-driven hallucination - project management by buzzword and individual career status progression.
It's almost entirely inward-looking. The connection to Rest of World is increasingly mythical and remote.
Some good work gets done in spite of this. But senior management doesn't understand quality - either in the internal sense of shipping a bug-free, robust product, or in the design sense, where products meet real user needs in a satisfying, creative, and delightful way.
Nice graphic design though. Apple is still the leader there. Processor dev has also been exceptional.
IMO it's time for most of the C-suite to step down and let much younger talent take over and shake things up.
Have things gone crazy since I last used OS X for real, in 2008 or whatever? Windows has become such a shitshow since Windows 7. I kinda assumed Apple didn't follow suit.
I recently switched from Excel. I'm not a FOSS ideologue, but I found that Calc does more of the things I need with less of the distraction, more customisability, and less irritation with bugs and breaking updates.
It's true I'm not a spreadsheet power user - but not many people are.
The trouble is that Excel really does host a disproportionate share of all the tiny applications that keep the world running, so it's very, very hard to move away.
Sorry, could you elaborate on "Meanwhile people in the West are being dropped into poverty."?
As for " lowering the risk of armed conflict - how's that working out for everyone?" - pretty well, thank you. Deaths in armed conflicts have massively decreased since the 1980s with most of them originating from local wars in Africa prior to 2022. (see https://ourworldindata.org/war-and-peace) - the Ukraine conflict is a return of exactly the kind of imperialist expansion that we've seen in the leadup to World War 1 which happens when isolated countries with territorial ambitions seek to expand.
I'm curious - in 2021, the top 1% earned 26% of all income in the United States. What share of the country's total federal income tax would you consider it fair for them to pay?
Canada is at the top of my mind. You need to get a visa and go through the "full immigration processing" even if you are only transiting through and not leaving the airport.
There's a startup doing something close to this. I can't remember the name and I'm not going to look it up, but the pitch is that you feed it a copyrighted stock image and it uses AI to create a usable-but-clearly-different near-equivalent - a situation where the absence of copyright is a feature, not a bug.
Technically it's a derivative work. Practically, you'd never be able to tell, and proof of derivation is impossible.
The law as it currently stands is completely unable to deal with these issues.
It's not even clear what the issues are, because copyright is primarily about protecting income rights from significant original invention. The mechanical act of making a copy is somewhat incidental.
When invention is mechanised (or, if you want to be less charitable, replaced by algorithmic grey goo), the definition of "significant original invention" either needs to be tightened up or replaced.
Reality can be interpreted as non-local. There has been no conclusive proof it isn't.
c isn't a limit on the kind of non-locality that is required, because you can have a mechanism that appears to operate instantaneously - like wavefunction collapse in a huge region of space - but still doesn't allow useful FTL comms.
Bell's Theorem has no problem with this. Some of the Bohmian takes on non-locality have been experimentally disproven, but not all of them.
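To make the no-signalling point concrete, here's a minimal numpy sketch (the standard textbook singlet state and spin projectors, not anything from this thread): whichever basis Alice measures in, Bob's local statistics stay exactly 50/50, so the instantaneous correlation carries no usable message.

    import numpy as np

    # Singlet state |psi> = (|01> - |10>)/sqrt(2), basis order |00>,|01>,|10>,|11>
    psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

    def projector(theta, outcome):
        # Projector for spin measured along an axis at angle theta (x-z plane)
        up = np.array([np.cos(theta / 2), np.sin(theta / 2)])
        down = np.array([-np.sin(theta / 2), np.cos(theta / 2)])
        v = up if outcome == 0 else down
        return np.outer(v, v)

    def joint_prob(a, b, theta_a, theta_b):
        # Born rule: P(a,b) = <psi| P_a (x) P_b |psi>
        return psi @ np.kron(projector(theta_a, a), projector(theta_b, b)) @ psi

    theta_b = 0.3                        # Bob's fixed measurement angle
    for theta_a in (0.0, 0.7, 2.1):      # Alice's "signalling" attempts
        p_bob_0 = sum(joint_prob(a, 0, theta_a, theta_b) for a in (0, 1))
        print(f"Alice at {theta_a}: P(Bob=0) = {p_bob_0:.3f}")   # always 0.500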
The Copenhagen POV is that particles do not necessarily exist between observations. Only probabilities exist between observations.
So there has to be some accounting mechanism somewhere which manages the probabilities and makes sure that particle-events are encouraged to happen in certain places/times and discouraged in others, according to what we call the wavefunction.
This mechanism is effectively metaphysical at the moment. It has real consequences and was originally derived by analogy from classical field theory, with a few twists. But it is clearly not the same kind of "object" as either a classical field or particle.
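In its minimal operational form that accounting mechanism is just the Born rule. A toy two-path numpy sketch (the geometry and numbers are purely illustrative) shows the cross term doing the encouraging and discouraging:

    import numpy as np

    x = np.linspace(-5, 5, 11)     # detector positions (arbitrary units)
    k, d, L = 2.0, 1.0, 10.0       # wavenumber, slit separation, screen distance

    # Complex amplitude arriving at x from each of two slits (toy plane waves)
    psi1 = np.exp(1j * k * np.hypot(x - d, L))
    psi2 = np.exp(1j * k * np.hypot(x + d, L))

    p_quantum = np.abs(psi1 + psi2) ** 2                  # add amplitudes, square
    p_classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2   # square, then add

    # The difference is the interference term: positive where particle-events
    # are "encouraged" to happen, negative where they are "discouraged".
    print(np.round(p_quantum - p_classical, 2))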
There may be no conclusive proof, but it's a philosophically tough pill to swallow.
Non-locality means things synchronise instantly across the universe, can go back in time in some reference frames, and yet reality _just so happens_ to censor these secret unobservable wavefunction components, trading quantum for classical probability so that it is impossible for us to observe the difference between a collapsed and an uncollapsed state. Is this really tenable?
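That indistinguishability claim is easy to check directly. A numpy sketch (again the textbook singlet; a standard construction, not something from the parent comment): Bob's reduced density matrix before any measurement is the exact same matrix as the classical mixture he'd assign if Alice had already collapsed her side, so no local experiment separates the two stories.

    import numpy as np

    psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)   # singlet state
    rho = np.outer(psi, psi.conj())                      # uncollapsed joint state

    # Bob's local state: partial trace over Alice's qubit
    rho_4d = rho.reshape(2, 2, 2, 2)                     # indices (a, b, a', b')
    rho_bob_uncollapsed = np.einsum('abac->bc', rho_4d)

    # "Collapsed" story: Alice measured in the z basis, got |0> or |1> with
    # probability 1/2 each, leaving Bob with |1> or |0> - a classical mixture.
    ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    rho_bob_collapsed = 0.5 * np.outer(ket1, ket1) + 0.5 * np.outer(ket0, ket0)

    print(np.allclose(rho_bob_uncollapsed, rho_bob_collapsed))   # True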
Strip back the metaphysical baggage and consider the basic purpose of science. We want a theoretical machine that takes a description of what is happening now and gives back a description of what will happen in the future. The "state" of a system is just that description. A good _scientific_ theory's description of state is minimal: it has no redundancy, and it has no extraneous unobservables.
An iPad doesn't have the thermals of a laptop. They are not the same device.
People who want root on everything and a command line are a tiny minority of the population.
Most users don't even know what root is. They don't want to know. It doesn't interest them. They don't find it a limitation because command line computing is something they're actively indifferent to.
Having a powerful engine (for burst computing) changes nothing. They still don't care. They want something that runs CapCut or whatever, and that's the extent of their interest in technology.
Thank you. Looks like I overcomplicated things. I won't even use a vintage CPU, maybe an STM32, so that would be even easier. I guess the only complicated part is if I want to invent "cartridges", since that involves some hardware.
And yeah, I agree those earlier discrete-logic games are tough to copy. I think those games were built in hardware, with little software.
I think what he meant is that the earlier arcades were done directly in hardware with very little software, so you have to copy the hardware or use emulation.