Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum? I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year, but compared to revenue it seems silly to be worrying about a delta of a few thousand dollars per year.
I buy the best phones and desktops money can buy, and upgrade them often, because why take even the tiniest risk that my old or outdated hardware slows down my revenue generation, which is orders of magnitude greater than their cost to replace?
Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.
Spend at least 1% of your gross revenue on your tools used to make that revenue.
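For concreteness, here's that rule as back-of-envelope arithmetic (a minimal sketch; the revenue figures are made up, and note the $5k floor only equals 1% at $500k gross):

    # Hypothetical revenue figures, just to illustrate the 1%-of-gross rule.
    def tool_budget(gross_revenue, rate=0.01):
        """Annual tool budget under the 1%-of-revenue rule."""
        return gross_revenue * rate

    for revenue in (60_000, 150_000, 500_000):
        annual = tool_budget(revenue)
        print(f"${revenue:>7,}/yr gross -> ${annual:>6,.0f}/yr (${annual / 12:,.0f}/mo)")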
What is the actual return on that investment, though? This is self-indulgence justified as "investment". I built a pretty beefy PC in 2020 and have made a couple of upgrades since (Ryzen 5950X, 64GB RAM, Radeon 6900 XT, a few TB of NVMe) for like $2k all-in. Less than $40/month over that time. It was a game-changing upgrade from an aging laptop for my purposes of running multiple VMs and a complex dev environment, but I really don’t know what I would have gotten out of replacing it every year since. It’s still blazing fast.
Even recreating it entirely with newer parts every single year would have cost less than $250/mo. Honestly it would probably be negative ROI just dealing with the logistics of replacing it that many times.
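The amortization math, using this comment's own figures (the 60-month span is an approximation):

    # Amortized monthly cost of the build described above.
    build_cost = 2_000   # the 2020 build plus upgrades, all-in
    months = 60          # roughly 2020 to now

    print(f"Keep one build:     ${build_cost / months:.0f}/mo")                   # ~$33/mo
    print(f"Rebuild every year: ${build_cost * (months // 12) / months:.0f}/mo")  # ~$167/mo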
> This is self-indulgence justified as "investment".
Exactly that. There's zero way that level of spending is paying for itself in increased productivity, considering they'll still be 99% as productive spending something like a tenth of that.
It's their luxury spending. Fine. Just don't pretend it's something else, or tell others they ought to be doing the same, right?
My main workstation is similar, basically a top-end AM4 build. I recently bumped from a 6600 XT to a 9070 XT to get more frames in Arc Raiders, but looking at what the cost would be to go to the current-gen platform (AM5 mobo + CPU + DDR5 RAM) I find myself having very little appetite for that upgrade.
Yes? I think that's crazy. I just maxed out my new Thinkpad with 96 GB of RAM and a 4 TB SSD and even at today's prices, it still came in at just about $2k and should run smoothly for many years.
Prices are high but they're not that high, unless you're buying the really big GPUs.
Where can you buy a new Thinkpad with 96GB and 4TB SSD for $2K? Prices are looking quite a bit higher than that for the P Series, at least on Lenovo.com in the U.S. And I don't see anything other than the P Series that lets you get 96GB of RAM.
You have to configure it with the lowest-spec SSD and then replace that with an aftermarket 4 TB SSD for around $215. The P14s I bought last week, with that SSD and the 8 GB Nvidia GPU, came to a total of $2,150 USD after taxes, including the SSD. Their sale price today is not quite as good as it was last week, but it's still in that ballpark: with the 255H CPU, iGPU, and a decent screen, the Intel P14s goes for $2,086 USD, which becomes $1,976 after the $110 taken off at checkout. Throw in the aftermarket SSD and it'll be around $2,190. And if you log in as a business customer you'll get another couple percent off as well.
The AMD P14s, with 96 GB, an upgraded CPU, the nice screen, and Linux, still goes for under $1,600 at checkout, which becomes about $1,815 once you add the aftermarket SSD upgrade.
It's still certainly a lot to spend on a laptop if you don't need it, but it's a far cry from $5k/year.
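For anyone checking the arithmetic, the stacked prices above work out like this (figures taken straight from the comment; Lenovo's pricing obviously moves around):

    ssd_upgrade = 215                 # aftermarket 4 TB SSD

    intel_list, checkout_discount = 2086, 110
    print(f"Intel P14s: ${intel_list - checkout_discount + ssd_upgrade}")   # $2,191

    amd_checkout = 1600               # "under $1600 at checkout"
    print(f"AMD P14s:   ${amd_checkout + ssd_upgrade}")                     # $1,815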
> maybe $250/month (...) which you can then use to go and earn 100x that.
$25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, yet they're still affected by increases in PC part prices.
I agree with the general principle of having savings for emergencies. For a software engineer, that should probably include enough to buy a good-enough computer in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.
> Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum?
Yes. This is how we get websites and apps that don't run on a normal person's computer, because the devs never noticed their performance issues on their monster machines.
Modern computing would be a lot better if devs had to use old phones, basic computers, and poor internet connections more often.
Yes, that's an absolutely deranged opinion. Most tech jobs can be done on a $500 laptop. You realise some people don't even make your computer budget in net income every year, right?
That's a bizarrely extreme position. For almost everyone, a ~$2,000-3,000 PC from several years ago is indistinguishable from one they could buy now from a productivity standpoint. Nobody is talking about $25 ten-year-old smartphones. Of course, claiming that a $500 laptop is sufficient is also a severe exaggeration; a used desktop, perhaps...
Overspending on your tools is a misallocation of resources. An annual $22k spend on computing is around a 10-20x overspend even for a wealthy individual. I'm in the $200-300k/year, self-employed, buys-my-own-shit camp, and I can't imagine spending 1% of my income on computing needs, let alone close to 10%. There is no way to make that make sense.
Yes, you don't want to underspend on your tools to the point where you suffer. But I think you are missing the flip side: I can do my work comfortably with 32GB of RAM, and my 1%-a-year budget could get me more. But why not pocket the difference?
The goal is the right tool for the job, not the best tool you can afford.
I agree with the general sentiment - that you shouldn't pinch pennies on tools that you use every day. But at the same time, someone who makes their money writing with a pen shouldn't need to spend thousands on pens. Once you have adequate professional-grade tools, you don't need to throw more money at the problem.
If you are consistently maxing out your computer's performance in a way that limits your ability to earn money at a rate greater than the cost of upgrades, and you can't offload that work to the cloud, then I guess it might make sense.
If, like every developer I have ever met, the constraint is your own time, motivation, and skills, then spending $22k per year is a pretty interesting waste of resources.
Does it make sense to buy good tools for your job? Yes. Does it make sense to buy the most expensive version of a tool when you already own last year's most expensive version? Rarely.
Most people who use computers for the main part of their jobs literally can't spend that much if they don't want to be homeless.
Most of the rest arguably shouldn't. If you have $10k/yr in effective pay after taxes, healthcare, rent, food, transportation to your job, etc, then a $5k/yr purchase is insane, especially if you haven't built up an emergency fund yet.
Of the rest (people who can relatively easily afford it), most still probably shouldn't. Unless the net present value of your post-tax future incremental gains (raises, promotions, etc.) derived from that expenditure exceeds $5k/yr, you're better off financially doing almost anything else with that cash. That's doubly true when you consider that truly amazing computers cost $2k total nowadays, without substantial improvements year-to-year. Contrasting buying one of those every 2yrs vs your proposal, you'd need the extra $4k/yr net expenditure to somehow pay off, making use of the incremental CPU/RAM/etc. to achieve that value. If it doesn't pay off, then it's just a toy you're buying for personal enjoyment, not something you should nebulously tie to revenue-generation potential with an arbitrary 1% rule. Still maybe buy it, but be honest about the reason.
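A minimal sketch of that break-even test, assuming a 5% discount rate and a 5-year horizon (both placeholders, as are the gain figures):

    # Toy NPV check: does the extra hardware spend pay for itself?
    def npv(cashflows, rate=0.05):
        """Net present value of annual cashflows, year 0 first."""
        return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

    extra_spend = 4_000   # the $5k/yr plan vs a ~$2k machine every 2 years
    horizon = 5           # years

    for gain in (500, 2_000, 4_000, 6_000):
        value = npv([gain - extra_spend] * horizon)
        print(f"${gain:,}/yr post-tax incremental gain -> NPV ${value:,.0f}")

Anything under a sustained $4k/yr gain comes out negative.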
So, we're left with people who can afford such a thing and whose earning potential actually does increase enough with that hardware compared to a cheaper option for it to be worth it. I'm imagining that's an extremely small set. I certainly use computers heavily for work and could drop $5k/yr without batting an eye, but I literally have no idea what I could do with that extra hardware to make it pay off. If I could spend $5k/yr on internet worth a damn I'd do that in a heartbeat (moving soon I hope, which should fix that), but the rest of my setup handily does everything I want it to.
Don't get me wrong, I've bought hardware for work before (e.g., nobody seems to want to procure Linux machines for devs even when they're working on driver code and whatnot), and it's paid off, but at the scale of $5k/yr I don't think many people do something where that would have positive ROI.
It is crazy for anyone making any amount. A $15k desktop is overkill for anything but the most demanding ML or 3D work loads, and the majority of the cost will be in GPUs or dedicated specialty hardware and software.
A developer using even the clunkiest IDE (Visual Studio - I'm still a fan and daily user, it's just the "least efficient") can get away without a dedicated graphics card and with only 32GB of RAM.
It's when you find ways to spend the minimum amount of resources in order to get the maximum return on that spend.
With computer hardware, often buying one year old hardware and/or the second best costs a tiny fraction of the cost of the bleeding edge, while providing very nearly 100% of the performance you'll utilize.
That and your employer should pay for your hardware in many cases.
For starters, hardware doesn't innovate quickly enough to buy a new generation every year. There was a 2-year gap between Ryzen 7000 and Ryzen 9000, for example, and a 3-year gap between Ryzen 5000 and Ryzen 7000. On top of that, most of the parts can be reused, so you're at best dropping in a new CPU and some new RAM sticks.
Second, the performance improvement just isn't there. Sure, there's a 10% performance increase in benchmarks, but that does not translate to a 10% productivity improvement for software development. Even a 1% increase is unlikely, as very few tasks are compute-bound for any significant amount of time.
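That's basically Amdahl's law applied to the workday; a rough sketch, assuming (hypothetically) that 5% of your day is compute-bound:

    # A hardware speedup only helps the fraction of work that is compute-bound.
    def overall_time_saved(compute_fraction, speedup):
        new_time = (1 - compute_fraction) + compute_fraction / speedup
        return 1 - new_time

    # 10% faster CPU, 5% of the day spent waiting on builds/tests:
    print(f"{overall_time_saved(0.05, 1.10):.2%} of the day saved")   # ~0.45%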
You can only get to $15k by doing something stupid like buying a Threadripper, or putting an RTX 4090 into it. There are genuine use-cases for that kind of hardware - but it isn't in software development. It's like buying a Ferrari to do groceries: at a certain point you've got to admit that you're just doing it to show off your wealth.
You do you, but in all honesty you'd probably get a better result spending that money on a butler to bring your coffee to your desk instead of wasting time by walking to the coffee machine.
I try to come at it with a pragmatic approach. If I feel pain, I upgrade and don't skimp out.
========
COMPUTER
========
I feel no pain yet.
Browsing the web is fast enough that I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.
My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.
Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance-related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.
I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.
I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.
========
PHONE
========
I had a Pixel 4a until Google busted the battery. It ran all of the apps (no games) I cared about, and Google Maps was fast. The camera was great.
I recently upgraded to a Pixel 9a because the repair center that broke my 4a in a number of ways gave me $350, and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day-to-day it makes no difference from the 4a, literally none. It even has the same storage, of which I have around 50% left with around 4,500 photos saved locally.
========
ASIDE
========
I have a pretty decked-out M4 MBP laptop issued by my employer for work. I use it every day, and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU-bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster; that's the impact of a $2,500+ upgrade for general web usage.
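Putting a number on that 250ms (the loads-per-day figure is made up):

    # Time actually saved by a 250 ms faster Slack load.
    seconds_saved = 0.250 * 20 * 250   # 250 ms x 20 loads/day x 250 workdays
    print(f"{seconds_saved / 3600:.1f} hours/yr")   # ~0.3 hours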
I'm really sensitive to skips, hitches, and performance-related things. For real, as long as you have a decent machine with an SSD, using a computer feels really good, even for development workloads where you're not constantly compiling something.
One concern I'd have: if the short-term supply of RAM is fixed anyway, then even if all daily computer users increased their budgets to match the new pricing, demand would exceed supply again and the price would simply rise in response, until it gets unreasonable enough that demand falls back to supply.
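As a toy model of that mechanism (a linear demand curve, entirely made up, with supply pinned):

    supply = 100                        # units available, fixed short-term

    def demand(price, budget_mult):
        # Hypothetical linear demand curve; bigger budgets scale it up.
        return max(0, budget_mult * 200 - 2 * price)

    for mult in (1.0, 1.5):             # before/after everyone raises budgets
        clearing = next(p for p in range(1000) if demand(p, mult) <= supply)
        print(f"budget x{mult}: clearing price ~${clearing}")
    # Quantity sold stays pinned at supply; the only thing that moves is price.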
I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.
I mean, as a frontline underpaid rural IT employee with no way to move outward from where I currently live, show me where I’m gonna put $5k a year into this budget out of my barren $55k/year salary. (And, mind you - this apparently is “more” than the local average by only around $10-15k.)
I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.
8GB or 16GB of RAM is absolutely a usable machine for many software development and IT tasks, especially if you set up compressed swap to stretch it further. Of course, you need to run something other than Windows or macOS. It's only very niche use cases, such as media production or running local LLMs, that absolutely require more RAM.
No modern IDE either. Nor a modern Linux desktop environment (they are not that much more memory-efficient than macOS or Windows). Yes, you can work with not much more than a text editor. But why?
It's the "how much can the banana cost, $10?" of HN.
The point they're trying to make is a valid one - a company should be willing to spend "some money" if it saves the time of an employee they're paying.
The problem is usually that the "IT budget" is a separate portion/group of the company from the "salary" budget, and the "solution" can be to force a certain dollar amount to be spent each year (with one year of carry-forward, perhaps) so that employees always have access to good equipment.
(Some companies are so bad at this that a senior engineer of 10+ years will have a ten year old PoS computer, and a new intern will get a brand new M5 MacBook.)