Optionality has costs. If you live your life like it's going to go astray, then you miss out on a lot of the upside if it doesn't go astray (such as by being a stay-at-home mom, if that's what you actually want to do). The statistic that 50% of marriages end in divorce is often bandied about, but it also means that 50% don't. Which means that going all-in on your marriage is a completely reasonable thing to do.
What you say is true. But consider: the “cost” of going back part-time is not very big. It’s not very stressful, and it _greatly_ reduces long-term risk.
Your take is a bit like saying in the year 2000, “I believe Apple is an amazing company, I’ll go ALL IN with my life savings.” If you’re right, then you think you’re a genius. But what if you were wrong? What if Apple had turned out like IBM? Then you’d look back and think, “How could I have been so stupid? So naive?”
It's a really bad analogy. And the "cost" of working part time for someone who doesn't want or need to work is literally every single hour they spend working. If they're working 20 hours per week, that's 20 hours per week spent doing something they don't want or need to do. It's a huge cost.
> My wife is fairly unusual in that she runs her own full-time business. Many moms don’t like her, presumably because they gave up their careers to do this and are jealous that she does both.
FWIW, my experience is that the dynamic at play in these situations is that women who run their own businesses or otherwise have high-powered careers tend to have a constellation of personality traits that is significantly shifted vs. those of stay-at-home moms, plus their daily lives are very different, so they don't really fit in. I say that without value judgment, just as an observation.
> It's telling that ARM, Apple, and Qualcomm have all shipped designs that are physically smaller, faster, and consume way less power vs AMD and Intel.
These companies target different workloads. ARM, Apple, and Qualcomm are all making processors primarily designed for low-power applications like cell phones and laptops, whereas Intel and AMD are designing processors for servers and desktops.
> x86 is quickly becoming dead last, which shouldn't be possible if ISA doesn't matter at all given AMD and Intel's budgets (AMD, for example, spends more on R&D than ARM's entire gross revenue).
My napkin math is that Apple’s transistor volumes are roughly comparable to those of the entire PC market combined, and they’re doing most of that on TSMC’s latest node. So at this point, I think it’s actually the ARM ecosystem that has the larger R&D budget.
This hasn't been true for at least half a decade.
The latest generation of phone chips runs from 4.2GHz all the way up to 4.6GHz, with even a single core using 12-16 watts of power and multi-core loads hitting over 20W.
Those cores are designed for desktops and happen to work in phones, but the smaller, energy-efficient M-cores and E-cores still dominate in phones because phones can't keep up with the P-cores' power demands.
ARM's Neoverse cores are mostly just their normal P-cores with more validation and certification. Nuvia (the designers of Qualcomm's cores) was founded because Apple's M-series designers wanted to make a server-specific chip and Apple wasn't interested. Apple themselves have made mind-blowingly huge chips for their Max/Ultra designs.
"x86 cores are worse because they are server-grade" just isn't a valid rebuttal. A phone is much more constrained than a watercooled server in a datacenter. ARM chips are faster and consume less power and use less die area.
> So at this point, I think it’s actually the ARM ecosystem that has the larger R&D budget.
Apple doesn't design ARM's chips, and we know ARM's peak revenue and R&D spending. ARM pumps out several times more core designs per year, along with everything else you would need to make a chip (and they have announced they are actually making their own server chips). ARM does this with an R&D budget that is a small fraction of what AMD spends to do the same thing.
What is AMD's excuse? Either everybody at AMD and Intel sucks, or all the extra work to make x86 fast (and validating all the weirdness around it) is a ball and chain slowing them down.
Minor thing, but I prefer ‘cultish’ to ‘cultic’ for your usage. In academia, ‘cultic’ means anything to do with worship and lacks the association with cults as discussed in this thread, whereas ‘cultish’ is how I usually see people adjectivize ‘cult’ in the way you are doing.
Uh, Flash died because Apple refused to support it on mobile Safari. Perhaps Flash would have died anyway, but that is the proximate cause. And Apple's competitors were falling over themselves to market Flash support as a competitive advantage vs. iPhone.
On the other hand, information flow seems like the hardest problem for any large org to solve. How do you actually know what the ground truth is when you’re the CEO of a company with tens of thousands of employees?
On the one hand, your politics seem diametrically opposed to RA Fisher’s, so it seems unlikely your username is a tribute, but on the other hand, how many statisticians are named RA Fisher?
I chose it long ago, and it was a tribute to the man. Then I learned about his support for eugenics, and how he shilled for big tobacco by repeatedly insisting that a causal relation between smoking and cancer was unidentifiable. He certainly did have some advanced statistical ideas. These days I keep it more as a tribute to the field of statistics. Fisher is worth learning, with context.
Politics-wise, I benefit from learning from Fisher (but obviously the reverse isn’t true). So it’s reasonable to say my views are more informed.