




It's crazy how well Joel Spolsky's writing has aged.

Some of his predictions turned out to be wrong (like when he predicted that making graphics cards would be a very thin-margin business[0]), but most are still true today. Probably truer than when they were written.

[0]: https://www.joelonsoftware.com/2002/06/12/strategy-letter-v In his defense, at the time 'video chips' were totally different things from today's mighty 5080.


> today's mighty 5080

Consumer GPUs really are not where the money is. It’s only a tiny fraction of Nvidia’s revenue, and much lower margin. Datacenter GPUs are where the volume and margins are.


You’re not wrong, but Nvidia’s consumer cards also still have pretty massive profit margins all things considered. It’s just they have obscene profit margins in the data center cards that make the consumer cards look small by comparison.

Yeah, but that's mostly because they dominate the benchmarks. Part of that is better hardware engineering, but part is the coupling with optimized software that they pushed to devs very well.

It was very well done and strategic on their part, but realistically, without that they are not that much better than AMD, and the margins would be much smaller if there were 1-to-1 compatibility the way there is with x86.

Also, since improvements have been plateauing and entry-level hardware is becoming enough for more and more people, the margins are going to diminish even more; high-end GPUs are going to become even more niche over time IMO.

So I'd say he is not wrong in spirit; the timeline is just longer than expected.


I would have expected that consumer GPUs still have higher volume, but that datacenter GPUs have much, much higher margins and therefore significantly higher revenue and profit. Is that not the case?

Datacenter GPUs definitely amount to a bigger volume in 2025[0]. But even when that wasn't the case, Nvidia had been running at very high margins for years.

[0]: https://nvidianews.nvidia.com/news/nvidia-announces-financia...

Out of $46B, about $41B is from datacenter.


If Nvidia made SoC GPUs for mobile devices, then they might have higher volume, depending on market share. But gaming and workstation PCs that benefit from a high-performance discrete GPU are a pretty niche market these days, whether laptop or desktop.

It powers the Nintendo Switch.

Nvidia had a very high gross margin for a hardware company (~50%) way before the AI hype, even before the A100.

Joel is my favorite essay writer. His writing established so many norms that then got cargo-culted to hell. It's nice to share the original articles with a whole new generation.

People know they should do coding tests in interviews, but not that they're designed to find people who are smart and get things done.

People know they should write clean code, but not that it only needs to be clean enough to spot dangers.

Some companies are paying super high prices for interns, and then don't try to integrate them into their recruiting funnel.


Another note in his defense: that was before integrated graphics, really (well, integrated graphics existed at the time, but they were integrated into the motherboard, haha). If we include modern iGPUs, they are pretty low-margin devices I guess (or at least it is hard to separate out their value from the rest of the CPU package). Of course iGPUs aren't add-on cards like he describes, but I don't think that really weakens his overall point: the most popular GPUs in the world are so cheap nowadays they can't even justify the premium of having their own PCB or taking up a physical PCI slot.

I'm not an expert in the space, but I'm guessing that graphics cards have only been a non-thin-margin business for like 5 years. If that's true, Joel's prediction about graphics cards may even still turn out true on net; it could still go back and stabilize at low margins.

That said, I think your overall point isn't affected by my idea. Even a great mind can be wrong occasionally. He doesn't have to be right every time.


He also pioneered 'velocity tracking' for SWE tasks. Seems like a major disservice in hindsight. But then again, I don't think I can blame him for Agile and Jira.

But I think he's the only one to have done it right. I've never seen velocity tracking correct for the measured inaccuracy of each developer's estimates. I've tried so many times to implement his EBS (Evidence-Based Scheduling) approach, but no one wants to do it.
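
For anyone who hasn't read the EBS material: the core idea is to keep each developer's history of velocity (estimate divided by actual), then Monte Carlo the remaining schedule against that history instead of trusting the raw estimates. Here's a minimal sketch in Python with illustrative names only (this is not FogBugz's actual implementation):

    import random

    def simulate_remaining_work(tasks, velocity_history, rounds=10_000):
        # tasks: list of (developer, estimated_hours) still to be done.
        # velocity_history: developer -> list of past velocities, where
        #   velocity = estimated_hours / actual_hours for a finished task.
        totals = []
        for _ in range(rounds):
            total = 0.0
            for dev, estimate in tasks:
                v = random.choice(velocity_history[dev])  # sample a past velocity
                total += estimate / v                     # simulated actual hours
            totals.append(total)
        totals.sort()
        # Median and a pessimistic 95th-percentile figure for remaining hours.
        return totals[len(totals) // 2], totals[int(len(totals) * 0.95)]

Dividing by a sampled velocity means a developer who habitually underestimates gets their estimates automatically inflated, which is exactly the "correct for measured inaccuracy" part that rarely survives in the velocity tracking you see in the wild.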

I worked for a company which adopted FogBugz. The multiplier it calculated to apply to developer time estimates quickly diverged towards positive infinity. It's probably fair to share some of the blame for that between us and it. Nonetheless, we managed to hit our quarterly release deadline well in advance of the predicted five to six years :-P.

Sure. If I remember the details correctly, it was some sort of individual approximation of 'velocity story whatever points' too, to make it less arbitrary and less obviously stupid to KPI/PIP/stack-rank people with.

And in all likelihood they would have stayed a thin-margin business.

CNNs and LLMs only became feasible and viable much later. People in the 1500s would also have bet on the automobile and the Wright brothers, right after drinking coffee from their microwave.


Really? I think he seems entirely wrong when it comes to bloatware. For instance, whenever I open Adobe Lightroom on my high-powered desktop PC, it turns into a space heater and is crammed with junk features that I feel the need to turn off.

There are a few that have not aged well, like his stance on rewrites. Apparently, if Joel's guru status had been properly respected, Netscape would be a powerhouse today, running on its Navigator codebase, because "The idea that new code is better than old is patently absurd. Old code has been used. It has been tested. Lots of bugs have been found, and they’ve been fixed. There’s nothing wrong with it."

https://www.joelonsoftware.com/2000/04/06/things-you-should-...

There is of course some truth wrapped up in those thick layers of punditry, but even today that article gets trotted out as some kind of Revealed Truth of Software Engineering to be swallowed and digested without qualification.

(Edit: I agree that from-scratch total rewrites are rarely a good idea, and should require blindingly obvious justification at the very least. I still cannot take all the points about Old Code at face value, since rewrites often come about after Lots of Bugs have been found and continue to be found)


> Netscape would be a powerhouse today

In our real timeline, Netscape did rewrite. During the rewrite their market share halved, and they died a few years later. So yeah, the lesson here is that if you're okay with letting your company just die and rebuilding a new one, it's perfectly fine to rewrite the whole codebase.


In all fairness, a lot of that was about IE being bundled into the OS and other rather underhanded strategies employed by MS.

A rewrite needs to be seen in the context of the alternative: just fix the old code in place. You stay shippable and making money, but you spend some effort cleaning up the things you wish you had done differently, enabling tomorrow's features, while also having spare manpower to build new features.

Rewrites often do work out in the long term. However, you end up spending 10 years on the rewrite, and those are 10 years where the old thing isn't getting as many features (everyone knows it is dead long term, so they only invest in the most important features, allowing your competition time to catch up). Worse still, in 20 years (10 years after the rewrite replaces the old thing) the rewrite is showing its age because of things you wish you had done differently.

Which is why I advocate fixing the old code in place over time. No matter what, your best ideas will turn out to be wrong in some way. Architecture decisions that seemed good turn out to have downsides. Languages go obsolete. APIs everyone uses that you realize should have been done differently (if they are only used in one place, that is an easy change). Core data structures that no longer meet your needs, but that you can't change without touching the million places that update them. You will need a long-term effort to fix those mistakes, whatever they are. Sure, it will take longer to get the old thing to where it needs to be than a rewrite would, but you are not saving anything, as the new thing will have different mistakes that you will regret in 10 years.

Also, the above is in the context of a large program. My program to draw names for the family Christmas drawing is easy enough for anyone to write from scratch in an hour, so who cares. When you are on a project that costs over a billion dollars to write, you have different constraints (some of you are on small projects that are better written from scratch each time).


I think you're claiming an overly strong interpretation of Joel's stance on rewrites, something like "new code is never better than old code." While that's kind of what your quoted excerpt says, that's not what it means in-context.

In-context, his point is pretty obviously (I think) more like "given a piece of code, it's never better to rewrite." His point is not that newer software can't come along and be better than old software. Rather, all other things being equal, rewriting is an inferior development choice.


Rewrites can and sometimes do result in better code. They just don't usually help a business's bottom line. And rewriting stuff didn't help Netscape anyway (though it did help Firefox, after Netscape went out of business).

It’s very similar to the second one (Simplicity).

To be fair, the author may not have been aware of these previous posts. I find that “kids these days” don’t really read the classics.

For those not aware, Joel Spolsky, Steve McConnell, Don Norman, and a whole bunch of others, really thought hard about our vocation, and wrote down their thoughts.

Might be worth a gander.





