
I remember getting in an argument a few years ago during a budgeting meeting at a job, where the prospect of upgrading our two-year-old Macbook Pros came up. This company was a startup that wasn't doing particularly well with money, and I said that you don't need a super-fast laptop to program...especially since this job was Node.js based and none of the work we were doing was processing-heavy.

This started somewhat of an argument with my peers who claimed they needed a top-of-the-line Macbook to do everything because you can't program on anything slower. Management ended up caving and buying the new laptops.

I stand by my point on this though; as a proof of concept, I "lived" on an ODroid XU4 for a month a while ago, doing all my programming and everything on there. I was happy to get my big laptop back when I was done with this experiment, but I never felt like weaker hardware impaired my programming ability.




This only makes sense if you pretend developer time is free and low morale has no effect on productivity.

10 minutes of lost productivity a day due to inadequate hardware means you are paying at least $4k per year per dev for that shit pile of a laptop. Usually it costs more than 10 minutes a day too.
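Rough back-of-the-envelope behind that number (the exact inputs here are my own assumptions, roughly a $100/hour fully loaded cost and 250 working days):

    // sketch of the estimate; every input here is an assumption, not a measurement
    const minutesLostPerDay = 10;
    const workingDaysPerYear = 250;
    const fullyLoadedHourlyCost = 100; // salary + benefits + overhead, USD/hour
    const hoursLostPerYear = (minutesLostPerDay / 60) * workingDaysPerYear; // ~41.7 hours
    const annualCost = hoursLostPerYear * fullyLoadedHourlyCost;            // ~$4,167
    console.log(`$${annualCost.toFixed(0)} per dev per year`);

Bump the hourly cost or the minutes lost and it only gets worse.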

The hit to morale when you tell a professional they can't pick their tools is even worse. Now they are spending at least 1 hour a day looking for a new job, because devs are good enough at math to know that if you can't afford a new laptop, you've got six months of runway, tops.

If you believe employee happiness has any correlation to productivity you always buy them whatever hardware makes them happy. It is a small price to pay.


> 10 minutes of lost productivity a day

and yet not every developer not using a tiling window manager is fired on the spot for wasting company time.

"productivity" (or what people think of as productive activities) is overrated. Shiny new hardware makes employees, especially technically oriented ones, feel important and appreciated. That's about the gist of it. Nothing wrong with that, but let's not blow up egos with claims of "productivity".


Productivity is one of those things that I think should be motivated with reward for increasing it instead of punishment for decreasing it. I love the nice feeling I get when I use my own custom window manager to shave another 3 seconds off a thing I commonly do. It's an amazing feeling; it makes me feel like Tony Stark, building my own personalized JARVIS, a program that automatically does exactly what I would have done manually. That's a big part of why I built my window manager and why I want to share it with people, because I want them to feel that same excitement and joy of directly improving their own quality of life, even in a tiny but very real way. I would open source it and give it away for free if I could do that and still keep the lights on.


My reaction to customizations that shave off seconds is: "so what, it'll be blown away the next time the tech stack changes." I do automate, but there's a subtle difference in goals.

If I automate my personal toolset, I follow the same procedure I use around automation anywhere else: don't start off doing it to save time, do it to increase reliability. I will write small scripts, sometimes one-liner scripts, sometimes largish hundreds-of-lines scripts. But the outcome I am aiming for is that I have a procedure that is documented and puts all the configuration in a place where I can see it, so that when the situation changes, it is fixable. A productivity boost is a frequent byproduct of successfully automating, but it's usually a side component to "reliable and documented". The boost is perceived as reduced friction and increased conceptual integrity: fewer things to check or to accidentally get out of sync, and thus less stress involved.

Focusing on UI being both shiny and fast is likewise often missing the point - which is the case when discussing new hardware. There are order-of-magnitude thresholds for human attention that are important to keep in mind, but hitting a lower attention threshold usually doesn't solve a productivity problem. It creates a case of "wrong solution faster", drawing the user into a fast but unplanned and reactive feedback loop.

See for example the case of writers who like the Alphasmart, a device that makes doing editing so tedious that you don't do it, you just write the draft and edit later.


I work with .NET and that used to mean you had to be on a Windows® computer. At a place I used to work, I had an HP EliteBook Windows 7 laptop with an i7 processor, 8GB RAM, and a spinning hard disk. That by itself is not the problem. The problem is that there was "asset management" software installed (I assume by default) that was overly active, which, combined with an antivirus with "real-time protection", meant a Subversion checkout could take a long time. This definitely degrades employee morale, I think.


I would fire explorer/finder etc. programmers on the spot


What does that mean? You would fire anyone who opens up Finder ever? As opposed to what? Doing everything from the command line? This sounds ridiculous without clarification.


Alternatively you could watch them and learn some new tricks.


>10 minutes of lost productivity a day due to inadequate hardware

What on earth are you doing that a 2 year old mac is inadequate for?

Yeah, there is a point at which the hardware is a problem. I'm working on a 5-year-old, mid-range PC and I don't think an upgrade would really change any of my day-to-day work. Maybe IntelliJ would be a little faster and compile times would be a fraction faster, but I doubt I'd notice it.

I have a 10 year old PC at home and the only pull to upgrade is gaming, but I'd rather not sink time into that (I get enough time sitting behind a screen at work) so I hold back on spending money on it too.

Maybe the developers' happiness does drop if you give them older hardware, but I don't think that's based on realistic changes in the performance of that hardware.


> Maybe IntelliJ would be a little faster and compile times would be a fraction faster, but I doubt I'd notice it.

Just because you don't notice doesn't mean that it's not there. The argument is that it's insane to pay developers north of $100k per year + benefits and then not invest $4k every 3 years to buy them fast hardware.


No, the argument is that it's insane to pay developers north of $100k per year + benefits and then not invest $4k every 3 years to buy them marginally faster hardware.

And I don't think that argument is particularly convincing. Typing this from my 3.5 year old work MacBook Pro.


> marginally faster hardware

Not sure where your experience originates. It obviously also depends on what exactly you do with your computer. A Mid 2015 MacBook Pro to a Mid 2018 MacBook Pro got a 50% improvement for compute workloads according to Geekbench [1].

I work on largish (win/mac/linux/ios/android) C++ code bases and compile and test run times are definitely an issue. Switching to newer computers every few years definitely boosted productivity for me personally (and for my co-workers as well). We mostly saw that compile times halved for each generation jump (3 years) as a combination of IO, memory and CPU performance increases.

Not sure what marginally faster hardware means exactly for you, but for us it's definitely been significant, not marginal.

YMMV, but if you do the math (saving 10 minutes/day at $200/h over 200 days comes to more than $6,000 per year), it becomes pretty hard to argue economically against investing in faster tooling of some sort.

Typing this from a 2.5yr old Macbook Pro.

[1] https://browser.geekbench.com/mac-benchmarks


If you're dealing with something intensive then upgrading makes a lot of sense. If you've got large compilation times in your pipeline or if you're doing machine learning and need to throw loads of hardware at a problem I totally get that. I'm sure there are plenty of other situations that justify this too.

But if you're like me where most of that happens off on the build machines there is very little impact in upgrading your hardware.

A 50% improvement on compute workloads probably wouldn't be noticeable on the setup I run. Outside of compiling I don't think I push a single core much above 30%.

I guess it really comes down to what you're doing.


> Mid 2015 Macbook Pro to Mid 2018 Macbook Pro got a 50% improvement for compute workloads according to Geekbench

If this is enough to make a huge difference then you should be running the workload on a low-end or better server instead of a MacBook. You'll get much more performance for a fraction of the cost and won't have to pay for the things you don't need, like a new battery, screen, etc. that come attached to the CPU you need.

> I work on largish (win/mac/linux/ios/android) C++ code bases and compile and test run times are definitely an issue. Switching to newer computers every few years definitely boosted productivity for me personally (and for my co-workers as well). We mostly saw that compile times halved for each generation jump (3 years) as a combination of IO, memory and CPU performance increases.

Have you exhausted all other avenues there? Do you have distributed builds? Is everything componentized? Do you compile to RAM disks?

For that matter, why a macbook? Why not a high end gaming laptop with more CPU, RAM, GPU resources?


YMMV indeed. Most devs aren't working with large C++ codebases with slow compile times and making $400k working only 40 weeks a year.


This conversation reminds me of that conversation from a day or two ago, when a writer argued that, if you're trying to hire senior engineers, you need to woo them, rather than expecting them to woo you.

In high-end job markets, hardware is so cheap compared to salary, etc. that there is no reason not to get the top stuff. If you don't, that may impinge on productivity, and in addition it will send a very negative signal.


Honestly this doesn't just go for high-end jobs. If you're a coffee shop and refuse to shell out for some nice stools for employees to take a rest on then you're just not using math.

All labour in the US costs far more than a decent chair or a nice ergonomic keyboard and mouse, a hands free headset... All of these things are peanuts compared to the costs of employing someone.


I keep telling my employer this, and similar arguments, but no matter what, they still won't buy me one of these:

https://www.mwelab.com/en/


Ah, I’m sure that’s because it would be hard to fit in the office, not because of the price.

It’s actually cheap enough that you could buy it yourself if you wanted to. It certainly went into my bookmarks, just in case I ever actually need a literal battlestation.


Why does that page require javascript to be enabled before you can see anything? ;(


Buy it yourself. "Don't ask for permission, ask for forgiveness."


I know we're not meant to say people didn't read the article. But did you click the link to the chair?

I think it was a joke.


Yeah I clicked the link. I was also joking, saying he should buy it with his own funds, naturally the business wouldn't reimburse him :P


Yup. I really appreciate working a place where my employer understands this. Any tools, equipment, technicians we need to bring in...anything...we just say and we get it. The difference between this and my last job where we used equipment until it broke down...and then had to keep using it...where even a pair of earplugs seemed to be a source of stress...it's like night and day.

I honestly don't think I could work for a company that cheaps out on equipment and supplies anymore. The difference it makes every day to the quality of not only the work I do, but the working conditions for me and the people I work with is worth it.


I seriously think there are power games being played by employers who give their labor terrible working conditions when the marginal improvement would have such a tremendous morale boost. Perhaps out of worldly impotence, such an employer feels satisfied in seeing the toil of someone subservient.


Basically anything is cheaper than salary. It’s why I don’t understand companies that do things like take away the free coffee to ‘save money’.

If I can provide free coffee to the whole office on my own salary alone, maybe that’s not a great way to save money.



And yet, people insist on using heavy, slow development tools and SDKs, which make much more of an impact on productivity and iteration times than a 2 year difference in hardware.


> Usually it costs more than 10 minutes a day too.

Anything that slows you down or irritates you eats up more minutes than just the raw time you're waiting too.

Say something takes 1 second to compile vs 6 seconds on slower hardware and you're having to recompile ten times to fix a bug. The raw waiting time might only be 10 seconds vs 60 seconds, but that extra pause of having to wait every time to see if you've fixed the bug might annoy you enough to drain significantly more productivity out of you.

You're best keeping your developers happy so they enjoy plowing through the work instead of hating it.


Once it takes more than 10 seconds, many people are likely to alt-tab to "fun" - and often take quite a while to tab back


You won't get a 6x performance improvement from replacing a two year old laptop. Laptop performance improvements have been tailing off for a while.


The OP mentioned a two year old computer. Do things compile 6 times slower than on today's computer? I'd guess at most 1/6th slower, ie 5/6ths of the time...


It's a hypothetical example to illustrate the point that "anything that slows you down or irritates you eats up more minutes than just the raw time you're waiting".

For the sake of a few thousand dollars for a laptop that can be used for several years, even a 1% increase in efficiency is very likely worth it.


If you think developers churn out code at maximum productivity throughout the day and a 10m improvement in compile times per day will reap you benefits in a year, you're sorely mistaken. Unless that improvement lowers the compile time below a certain threshold where the dev will say "fuck it, I'll read some HN while it builds" you're probably not gaining anything.


Ah but then that dev comes across an insightful article, that changes how they think about programming, and becomes a better programmer.

What we should really be doing is spending all our time 'researching' on the internet. Increasing our value to our companies. That's what I tell my boss anyway.


The original argument I read was that slowdowns or waits past a certain time affect mental flow. The flow is basically you being in the zone coding optimally. Interruptions like phones can jolt a person out of flow. So can long wait times.

If person is in flow, you want them cranking out as much output as they can with no slowdowns. That was an advantage of Smalltalk and LISP. Any language with REPL or IDE supporting something like it will let one do this.


I think there is more to it than pure compile time. It can affect responsiveness enough that it becomes disturbing.

From there it can mean using fewer plugins, disabling part of the static code checks, doing less full debugging etc.

I remember how activating the debugging flags wouldn't work on some projects as the machine couldn't handle it memory-wise.

All of these have a compounding effect that is non-trivial.


If you are using Node as the original poster mentioned, what is a slightly faster computer doing for you? You’re not spending time waiting on a compiler.


Exactly. They can't break out the "my code's compiling" argument. Lost morale over a 2-year-old laptop?? Wow, they should work in education. Try a 10-year-old laptop and no hope of ever buying a new "high end" machine. Try a shiny new $500 Dell. I think some new programmers are a bit out of touch on hardware...


> Wow, they should work in education. Try a 10-year-old laptop and no hope of ever buying a new "high end" machine.

That it is worse in another field is not a very good argument?


Yes, because developers who are whining about having to use a two-year-old laptop because their company is trying to save money seem more like a bunch of entitled brats.


In the end, what matters is that you keep your employees happy. Whether or not they are entitled brats (they are...) is irrelevant if you need those employees.


If you're bringing in 3x your cost (as you should as an employee), the cost of a laptop is virtually nil for the company. Their nickel-and-diming you is worse for morale and retention.


Anyone who will quit a job for something so minute as having a two year old laptop is someone that is probably so immature that they couldn’t handle the normal rigors that come from being a professional.


You may think so. When you see certain coworkers get new machines every 2 years but you don't, for example, it builds a power structure where there wasn't one before. If you feel less valued as a result, I don't think it means you can't handle the "normal rigors" of being a professional. It means, if you can find somewhere else where you feel more valued, then more power to you.


Again if you can’t handle the “power structure” of not getting a laptop every two years, you will never be able to handle navigating life in corporate America.


If you're bringing in 3x your cost, you're getting ripped off and need to start raking those numbers back in.


Unlikely with a smart employer. Your cost to them is 2x your tc. They need a profit, that's where the last x comes from.


What would you say is a good ratio then?


>seems more like a bunch of entitled brats.

I'd say anyone that drops the money on a 4-year CS education of any rigor, survives it, survives and succeeds at the fairly rigorous interview cycles required to get a job in SWE these days... is absolutely entitled to request top-of-the-line hardware to perform their duties.

I'd say your argument makes it "seem" like it's coming from envious, curmudgeonly luddites.

If individuals of other vocations feel that these developers are "entitled brats", perhaps they should switch careers? I can't imagine a teacher went into education seriously expecting to get provisioned the latest rMBP as a perk?

Are we assuming all jobs are equally dependent on high-end hardware to provide the best RoI for time spent?


Because you think CS is the hardest, most rigorous major?


Where did I say or even imply that?

I simply said it was hard and rigorous. Not the most.


Why do you assume all employed SWEs have 4 year CS degrees? Or any degree at all?


That's fair. Then they've managed to educate themselves, not exactly a non-trivial feat either.


Teaching yourself to program is not rocket science either. I was writing 65C02 and x86 assembly in 6th grade in the 80s. I got a C.S. degree because that's what I was supposed to do.


It's not "another field", it's literally almost every field other than computing, as outside tech people either don't have money for top hardware, or don't have interest in buying it.

In so far as it makes developer more empathetic towards users, it's a good point to make.


If you were paid at your job the same as in tech, and you knew the direct value you add, then you'd understand that the cost of the tools you prefer is a literal drop in the bucket and not worth serious discussion.


Wow seeing that I’ve been “in tech” developing professionally for over 20 years and a hobbyist programmer in assembly since 12, I think I’ve earned my geek cred...


It's actually you that is out of touch. I could reiterate the arguments already well expressed but perhaps I should just suggest you read the entire rest of the thread.

A teacher presumably spends most hours actually interacting with students and uses the computer for preparation and paperwork, something your 10-year-old Dell is probably well suited for during the minority of the time they spend on it.

Your dev spends most of their time on their machine and even if they don't have to build software in a compiled language may still be running a heavy development environment, several browsers to test the result, virtual machines, etc etc etc.

To drive the point home let's consider the cost of a 5% decrease in productivity due to using an inferior machine.

If a teacher is contracted to work 185 days, or 37 work weeks in a year, and earns $60k, the teacher earns about $32/hour assuming 50-hour weeks.

If the teacher spends 10 hours per week on the computer, the cost is no more than $32 * 37 * 10 * 0.05 = $592.

If your software developer is earning $100k over 50 weeks and spends 50 hours per week, almost all of it using the computer, then the cost is $40 per hour x 40 hours on the computer x 50 weeks x 0.05 = $4000.

This doesn't account for the actual costs incurred by having to hire more developers because management is too incompetent to retain them by buying them nice tools.


Classrooms often use interactive whiteboards driven by PCs that can be quite slow to log in to the teacher's active directory profile as this involves pulling data over a network of variable speed. There can also be issues with software upgrades deciding to start at random times. The PC will need to be logged off and logged on 8 times a day...

Teachers don't talk to themselves in classrooms so time loss can affect a small percentage of up to 180 student-hours per day or 900 student-hours per teaching week per teacher. A typical '8 form entry' secondary school in UK will have around 100 teachers plus admin / head of subject / 'leadership'. School year is around 38 weeks.

I sometimes think something like ChromeOS, but one that can run IW software and just stays booted, would be better. An appliance.


You need to factor in the time it would take for teachers to move all of their resources over to another format. Some teachers have decades worth of work that they teach with.


The argument you are constructing is that we should invest more money in classroom computers, not less in developers.


Yup! Or at least 'appliance' style end points


Editor will be snappier, Docker will start faster, git merges will be quick, etc

it all adds up.


Vim is pretty snappy regardless of hardware!


I do not use Vim and never have; if I started at your company, would you prefer to have me waste three or four days learning it or just pay an extra six hundred dollars on hardware?

Just for your knowledge, your answer probably differs from the one I'd get from whoever does the accounting at your company, and their answer is the right one.


> waste three or four days learning it

If you're a proficient IDE user, learning and setting up Vim to a comparable level to a top-notch IDE (Visual Studio, IntelliJ) would take more than three or four days. Three or four weeks (or even months) sounds more realistic, in order to get efficient at:

* code navigation
* code refactoring
* debugging
* tool integration
* workspace/project management

Don't let anyone tell you otherwise, I'm a Vim user and the people that say it will take just days or a few weeks have just forgotten how long it took them to ramp up. Or they're fooling themselves that their Vim does everything a capable IDE does on strong hardware.


I agree, but I wanted to clearly underestimate because three or four days is already enough cost-wise and it's hard to argue. I think two weeks is on the optimistic side, but I think real proficiency would be a slow process and would likely end up not paying off for about a quarter, though you'd be productive in other ways during that time.


At a previous company, I never told the new college grads that there were options other than vi, so unless they were enterprising enough to figure out alternatives by themselves (this was before widespread availability of Linux and the web), they were forced to learn it.

And then I'd tell them about alternatives after they were proficient with vi. Not one ever switched away from vi.

Did that cost us in initial productivity? Probably. But it's such a minor thing when it comes to NCGs.


Good thing I'm not in charge of these things at my company; I'd probably look for people who used hardware properly rather than just relying on Apple to make their POS GUI IDEs run... Most of my colleagues run IDEs, but not the ones who actually get stuff done.


> three or four days learning it

Now this is being ambitious :)


> I do not use Vim and never have

I do wonder, what editor do you use when you're sshed into a remote machine?


I generally use nano for quick and dirty things, but prefer to push/pull changes. The environment I'm working in doesn't necessitate editing local files on a remote very often.


For me, my ~4 year old MacBook Air (Yeah, not really a top end dev machine) has started to struggle with more than 15 or 20 browser tabs open. I regularly spend _way_ more time researching than compiling, so it's starting to annoy me enough to think about pushing for an upgrade.

(I put the blame half on the browser vendors, and half on modern cloud web apps. My tabs usually include Gmail, Jira, Confluence, Trello, and Slack. Even doing nothing, between them they'll sometimes have the fans spinning...)


Bundling + dev servers place a heavy load. I'm actually looking into a way I can avoid using bundling for production but still get some kind of "hot javascript file reload" in the browser.
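The rough shape I have in mind, as a sketch only (the ./public folder, the port, and the file names are placeholder assumptions, and it does a full page reload rather than true hot module replacement): serve the files as native ES modules straight from disk, and push a reload event over server-sent events whenever something changes.

    // dev-server.ts - static files straight from disk plus an SSE endpoint for live reload
    import * as http from 'http';
    import * as fs from 'fs';
    import * as path from 'path';

    const root = path.join(__dirname, 'public'); // assumption: a flat ./public directory
    const mime: Record<string, string> = {
      '.html': 'text/html', '.js': 'text/javascript', '.css': 'text/css',
    };
    const clients = new Set<http.ServerResponse>();

    http.createServer((req, res) => {
      if (req.url === '/reload-events') {
        // Keep the connection open so we can push reload events to this browser tab.
        res.writeHead(200, { 'Content-Type': 'text/event-stream', 'Cache-Control': 'no-cache' });
        clients.add(res);
        req.on('close', () => clients.delete(res));
        return;
      }
      const file = path.join(root, req.url === '/' ? 'index.html' : req.url || '');
      fs.readFile(file, (err, data) => {
        if (err) { res.writeHead(404); res.end(); return; }
        res.writeHead(200, { 'Content-Type': mime[path.extname(file)] || 'application/octet-stream' });
        res.end(data);
      });
    }).listen(8080);

    // Any change under ./public tells every connected tab to reload itself.
    fs.watch(root, () => {
      for (const res of clients) res.write('data: reload\n\n');
    });

The page just needs new EventSource('/reload-events').onmessage = () => location.reload(); somewhere in a module. It's a full reload, not HMR, but there's no bundler in the loop at all.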


Getting fast feedback from your tests.


As a counterpoint, tests are your application, and thus should run at the exact specs of your average consumer. And you don't get to compensate for the test suite itself either, unless you're 100% sure the average customer is only going to be running your app.

This is also a good argument for running tests on a separate dedicated machine.


I like running tests on a separate dedicated machine and enjoy a build environment that verifies each commit with a full test suite... but being able to run unit tests for a file on each save of a file is something that can save you some time pretty trivially.
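To make that save-triggered loop concrete, here's a rough sketch; the ./src layout, the *.test.ts naming, and the use of Jest are purely illustrative assumptions, and most runners ship a watch mode that does this for you anyway:

    // watch-tests.ts - rerun the matching test file whenever a source file is saved
    import * as fs from 'fs';
    import * as path from 'path';
    import { execFile } from 'child_process';

    const srcDir = path.join(__dirname, 'src'); // assumption: a flat ./src directory

    fs.watch(srcDir, (_event, filename) => {
      if (typeof filename !== 'string' || !filename.endsWith('.ts')) return;
      // foo.ts and foo.test.ts both map back to foo.test.ts
      const testFile = filename.replace(/(\.test)?\.ts$/, '.test.ts');
      // Run only that one test file; a real version would debounce duplicate events.
      execFile('npx', ['jest', path.join('src', testFile)], (_err, stdout, stderr) => {
        process.stdout.write(stdout);
        process.stderr.write(stderr);
      });
    });

jest --watch (or your runner's equivalent) gets you the same thing with less ceremony; the point is just how cheap that per-save feedback is.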

I also disagree strongly about needing to run tests on the exact specs of your average consumer; most of us aren't writing software for a small set of hardware configurations, so determining those average specs is likely not possible. But if you're working on an embedded platform or with specialized hardware, I do agree that you absolutely need to run tests on the actual hardware regularly; I'd still argue that those tests should be run on a dedicated machine and should be in addition to tests that verify the code is obeying its contract.


> should run at the exact specs of your average consumer.

Kinda expensive, having a set of multiple dedicated servers (or VMs) running on God knows how many dozen cores and hundreds of GB of RAM just to run the tests that my laptop runs fine by itself, all in the name of running the "exact specs" that my stuff is going to be run on.

You're describing system testing, which takes place extremely late in the product cycle.


This argument is particularly badly formed. Tests are your application, but you are testing for correctness, not speed. In theory your tests could exercise, as quickly as possible, as many unique operations as a customer could perform in hours of normal use. You would never want this to take hours.


That pesky Windows 10 won't run on an MSP430, unfortunately...


Do you have incremental testing so that only what's changed is tested? If not that would be a better investment than new hardware, IME most shops don't have incremental testing.


For real. Compilers are lightning-fast compared to running even a subset of most test suites.


What if you're using TypeScript, creating a few packages, and bundling them?


Then you probably only need a 486 or so? Programs can manipulate text quite effectively using ancient CPUs.


Clearly you have never used TypeScript, created a few packages, and bundled them.


Sounds like better tools are in order. Golang produces results instantaneously for example.


Almost no present software will run acceptably on a 486. Since machines orders of magnitude faster are available at Walmart for $300, the question of what software would is mostly academic.


In 1999 we were 'upgrading' to Evergreen 586s in K12 for a lucky few boys and girls. Good times and a lot of nothing going on. PIO4 and PATA with PCI buses clocked at 33 MHz.

Can you imagine really trying to run 20+ year old tech in developer space? 8-10 MB/s throughput and 70 bogomips?


I think it's easy for a lot of people to criticize your mention of morale and many have, but the cost ratio of tools to labour is pretty extreme. Getting a new laptop every week is stupid but if your hardware is on the fritz your company should be willing to replace it pronto, having downtime without a machine while yours is in the shop is a reckless waste of company resources.

My favorite example here is from an old CS job my wife had in connection with HP; this shop was such a grind house that they refused to buy a new chair that would be compatible with her back issues. Refusing to buy an employee a $120 chair at the cost of perhaps 10-20% of their productivity just doesn't make sense mathematically - ditto for any company that has people working on computers that refuses to shell out for good keyboards and mice. These pieces of hardware are so cheap that having a lengthy discussion about why you're not going to get them is probably costing your company more money than just snapping them up.


HP somehow became the epitome of crappy enterprise bean counting. They lost all the talent that had any options available, and the results are readily visible.


I do not bother to ask for new mice or keyboards any more. I just bring those myself.


I'm just never quite sure the gain in productivity and happy developers outweighs the cost of shipping a software product that requires hardware that costs 4-digit US$ figures to run smoothly (which seems to be the case for most everything produced by startups nowadays).

I for my part would prefer if, for instance, the Slack developers were confined to machines on which their current product runs as badly as it does on mine, even if they feel so miserable and underappreciated as a consequence that their uplifting have-a-nice-day loading messages get replaced by passive-aggressive snark or something.


I agree with the first point: wasted time will quickly stack up with old/cheap tools. I don't buy your second argument about morale. The vast majority of professionals don't get to pick their tools; they are handed work and tools based on what is available and cost-effective. If these professionals' flexibility does not include programming on a 2-year-old laptop vs a brand new one, wow, that's weak grit. If your employees' morale is destroyed by not working on the latest gadget, why is that? What else about the company is so insufficient that it has degraded their morale that far?


As a professional I know what is cost effective for me better than anyone else. For example I know that I'm always bumping up against my 16 gig memory limit, but I have plenty of CPU. I know that I get less eye fatigue on a 4k monitor - so I picked a smaller 4k monitor that was cheaper than the monster lower resolution wrap around thing my colleague prefers.

I know that I'll be significantly more cost effective with 32 gigs of ram because I don't need to spend time killing processes and rebooting VMs after half a day of work.

I know what keyboard and mouse is still comfortable for me after working 8 hours, etc.

I know I'll be more productive on a macbook not because I'm an apple fan boy. I hate apple because they've done more to kill open source than any other company - even MS. I'm a linux fan boy. But I need to use various tools that don't run on linux. I could "tough it out" on a cheaper windows machine, but it wouldn't be cost effective. I would be less productive.

A professional knows what is cost-effective and spends their hardware budget wisely to optimize their productivity. They don't rip off the company for the latest shiny gadget. It is silly to trust a dev to essentially run a multi-million dollar company, but not to pick out a computer.


The hit to morale comes from management dismissing a serious cost-benefit analysis with phrases like "weak grit" and "latest gadget".


The literal difference between good and poor management. Being objective vs. subjective.


Do you make your employees only listen to one genre of music while working? Do you only allow specific food for lunch? Why do you just care about this specific point in how they work?


> 10 minutes of lost productivity a day due to inadequate hardware means you are paying at least $4k per year per dev for that shit pile of a laptop.

This only holds true if your developers are 100% efficient programming every second the machine is running. But let's face it. The first hour of the day is most certainly not the most productive (10 minute boot? fine, I'll make coffee meanwhile). You could easily schedule a meeting, like a stand-up, while the machines fire up, if those 10 minutes really would be necessary.


Mathematically flawed: there is no reason to suspect you can subtract the time spent waiting for the computer from time that was already wasted, whereas in reality inefficiency from poor hardware is distributed throughout the day, including productive periods.

You would actually multiply a percentage of inefficiency by hours worked.

Also honestly human beings doing intellectual work can't just do something else for 10 minutes and lose zero productivity because intellectual work is fundamentally dissimilar from assembling widgets OR managerial work.

Consider reading http://www.paulgraham.com/makersschedule.html because you fundamentally don't understand how people work and can't be a good manager without at least attempting to understand.


> Also honestly human beings doing intellectual work can't just do something else for 10 minutes and lose zero productivity because intellectual work is fundamentally dissimilar from assembling widgets OR managerial work.

Yup, I'm talking about the beginning of the work day. Nothing productive would be interrupted.

> Consider reading http://www.paulgraham.com/makersschedule.html because you fundamentally don't understand how people work and can't be a good manager without at least attempting to understand.

I have no idea how you got to your conclusions about me by reading my comment, but thanks for judging. Regarding this piece by PG, did you read it? Because it actually supports my claim: schedule a meeting at the beginning of the day, while the machines boot, in order to save the precious time between the start of the day and the beginning of actual work.


To reiterate, the prior poster claimed that 10 minutes of lost productivity could cost more than the dev's desired computer.

You said that this calculation is erroneous because the developer could easily make coffee or have a meeting while his machine boots up and thereby recover that lost time.

This is a very puzzling suggestion. Slow machines aren't merely slow to start; they are slow to complete user operations while the user is sitting at the machine awaiting the result. The time cost is the sum of 1000 small delays throughout the day. You can't productively fill the extra 30 seconds you spend waiting, 20 times a day, with a meeting, for example.

In fact acceptable hardware/software wakes from sleep in a second or cold boots in 30-90 seconds. Boot up isn't really the problem.

You said

>Because it actually supports my claim, to schedule a meeting in the beginning of the day

What the actual article says

>Several times a week I set aside a chunk of time to meet founders we've funded. These chunks of time are at the end of my working day, and I wrote a signup program that ensures all the appointments within a given set of office hours are clustered at the end. Because they come at the end of my day these meetings are never an interruption.

PG actually suggested meeting at the end of the productive day to avoid breaking up productive working time.

I suggest you read it instead of skimming it.


> PG actually suggested meeting at the end of the productive day to avoid breaking up productive working time.

I suggest you understand it instead of mindlessly quoting it. It's clear PG wants no interruptions for productive work time but if a meeting is scheduled before productive work time begins nothing gets interrupted.


It's less clear how you can deal with a slow computer by making coffee or holding a meeting. Did you think slow computers take 10 minutes to boot up but run real fast after?


Read it, almost everything written applies mostly to the IT industry. It does NOT operate anywhere near like that in any warehouse, full-bore semiconductor manufacturing, or even fast food job I've ever had.

What may apply to one person or industry absolutely does not apply to them all.

And to boot - I'm the GM of a very, very large solar company.


It is more like: this test takes just long enough that I can jump over to Hacker News for a few minutes...


Plus the opportunity cost when developers talk and the good guys decide between a job with servers or VMs in the cloud and a MacBook Pro to use them, or a shitty, out-of-date, slower-than-Christmas Windows desktop rendered on a VDI appliance connected to a Citrix server in the next state. I mean... hypothetically, of course.


i hate these kinds of taylorist arguments. dev's limiting factor is always energy, not time. and i cannot imagine any good devs i know truly caring that they aren't on the latest and greatest. i guess i just wouldn't work somewhere people care about that kind of shit, so maybe i'm biased.


> dev's limiting factor is always energy, not time

If this is true, I would expect that investing in hardware that most effectively gets out of a dev's way to have an even higher return on investment than is suggested by time and productivity arguments. The emotional toll of dealing with the dev equivalent of paper cuts should not be under-appreciated.


As long as it doesn’t get in their way, I think the previous statement is indeed true.

Devs don’t care whether their laptop is 2 years old or not, they care that their compilation takes 5 minutes when it could be 2.


I used to do in-house development at a shop that had a great policy on that front: Developer machines were always a generation behind the workstations of people who would be using the software they write.

And a strong culture of belief that, if a developer couldn't get something working well enough in that kind of environment, then that should be taken as a reflection on basically anything but the developer workstation. Inefficient code, excessive memory usage, just plain trying to do too much, bloat, whatever.

But I realize that's a hard thing to sell. A lot of developers don't really appreciate being reminded who works for who.


This is a really poorly thought out policy masquerading as deep understanding. It's, if I can coin a phrase, "yoda talk". Basically, incoherent ideas aren't improved by being couched as hard-earned wisdom.

The tools being used to create a piece of software are often fundamentally different than those used on the other end.

This means that machines should be provisioned in accordance with the needs of the actual tools running on them.

Developers may, in addition to running different tools, need to iterate quickly in order to test functionality. It may be acceptable that an app that is started once at the beginning of the day takes 3 seconds to start, but if you deliberately handicap developers' machines and it takes 7 seconds each time the developer is trying to fix a bug in the space of a few minutes, instead of 1 second on a faster machine, you may have damaged your own productivity for no reason.

This has nothing to do with the idea of who works for whom. Incidentally, this statement sounds entirely passive-aggressive. Ultimately I'm pretty sure all of you work for whoever your ultimate customer is, and are invested in helping each other do effective work; it sounds like the management team was entirely unclear on how to do this. Is that shop even in business?


Not only is the shop in business, it's one of the most profitable places I've had the pleasure of working at. The policy was actually handed down by the CTO, who started in the company as an entry-level developer. Some money was incidentally saved, but, at least to hear him talk about it, it was more about getting people's incentive structures in line: If you want to discourage developers from doing something, the most straightforward way is to make it painful for them to do it. If you want to make creating performance problems painful, you accomplish that much more effectively by making it apparent on their workstations, where their nose will be rubbed in it constantly until they do something about it. Slow performance on an external test environment is much easier to ignore, because people typically don't bother testing there until they think they're basically done with their work, at which point their incentive structures are nudging them toward ignoring any potential problems.

Contrary to some of the criticism I'm seeing here, it wasn't actually hated by the development team, either. The hypothetical "can't get any work done because compiling takes a bazillionty days" scenarios that people are pulling out of thin air here simply didn't happen. At a shop where developers were expected to think carefully about performance and know how to achieve it, they tended to do exactly that.

Someone who was busy making excuses about how they weren't very productive because they only got a 3.4 GHz CPU while the analysts got a 3.6 GHz CPU probably wouldn't have lasted long there.


> At a shop where developers were expected to think carefully about performance and know how to achieve it, they tended to do exactly that.

As long as you have time to actually achieve this. Remove that, and even on the worst hardware you'll see shitty implementations.


"Yoda talk" is a very nice phrase, I hope it catches on.

Dogfooding is sometimes a good idea, and of course testing on a range of setups is important. I suspect there is a problem with people testing on older software but not trying any older hardware (especially for web apps), which using old machines could have partially avoided.

But the idea that development should inherently be able to happen on older hardware than the product will run on is arbitrary and ridiculous. At best, that creates pointless pressure to rely on hardware-friendly tools, which could mean anything from not leaving open lots of relevant Chrome tabs to pushing developers to use vim instead of a Jetbrains IDE. (Nothing wrong with vim, obviously, but "we intentionally hobbled our hardware" is a weird reason to choose an environment.)

At worst, it fundamentally impedes development work. For an extreme case: Xcode isn't really optional for iOS development, and merely opening it seriously taxes brand new MacBooks - applying this theory to iOS developers might leave them basically unable to work. Even outside that special case, there are still plenty of computer-intensive development tasks that are way outside the user experience. Just from personal experience: emulating a mobile phone, running a local test server for code that will eventually land on AWS, running a fuzzer or even a static analysis tool.

Even if we grant the merit of that trite "remember who you work for" line, sticking to old hardware doesn't seem to follow at all. We wouldn't go around telling graphic designers that if they work for a customer with an old iMac G3, they're not allowed to have a computer that can run Photoshop. Heck, are there any professions where we assume that building a thing should be done exclusively via the same tools the customer will employ once it's finished?


My company’s mobile app is made in react native.

As a senior dev., I still use an iPhone SE, but the mobile dev team all have the latest iPhones.

The app looks horrible on the SE, and some touch areas are blocked by overlapping text or images.

It is basically unusable on a supported device.


I worked in a place where dev workstations were far beyond the end users', and when devs tested (mostly local, not multiple systems across networks), there was no basis in the customers' reality.

Endless performance problems were masked in development, and then very noisily evident when it reached customers. Performance testing was largely reactive and far removed from the creators of some terrible code choices, so they'd tend to shrug ("Works fine here") until you analyzed the hell out of it.

Now, the code was a big C++ environment, and compilation speed was a problem, but maybe a means of testing in a throttled state would have prevented a lot of grief much, much earlier.


This reminds me of Facebook's "2G Tuesdays" (https://www.theverge.com/2015/10/28/9625062/facebook-2g-tues...), where they emulate their internet speed to that of emerging countries. If you are frustrated while waiting for a page to load, then your users are going to be as well.

This is a really nice way of working and you can see the results when browsing https://mbasic.facebook.com: even at 2G speeds, even with photos, pages load fast. No unnecessary background process trying to do something, all buttons are there to be clicked on already. A really smooth experience.
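If you want a rough version of this locally, the DevTools protocol makes it scriptable; here's a sketch with Puppeteer (the latency and throughput numbers are just my guess at "2G-ish", not Facebook's actual settings):

    // slow-day.ts - open a page under emulated 2G-ish network conditions
    import puppeteer from 'puppeteer';

    (async () => {
      const browser = await puppeteer.launch({ headless: false });
      const page = await browser.newPage();
      const cdp = await page.target().createCDPSession();
      await cdp.send('Network.emulateNetworkConditions', {
        offline: false,
        latency: 300,                         // added round-trip latency in ms
        downloadThroughput: (250 * 1024) / 8, // bytes per second, roughly 250 kbit/s
        uploadThroughput: (50 * 1024) / 8,
      });
      await page.goto('https://mbasic.facebook.com');
    })();

Chrome DevTools has the same throttling built into the Network tab, but scripting it makes it easier to keep yourself (or a whole team) honest for a full day.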


Developer tools often have much higher requirements than the resulting product. Not everyone is using vim.


Not a valid excuse. The end users have their own productivity apps that eat up their system resources, too.


In the minds of developers, the end users shouldn't be running any other apps than the app that the developer is working on.


Correct; it seems much better to create a test bed environment emulating this, rather than artificially constraining a person's own development environment due to some poorly thought out policy which is actually likely a way to save money.


That's a complete non sequitur. The grandparent comment is not talking about how resource intensive background apps affect the performance of the product, it's talking about how the development tools themselves may not run smoothly and efficiently on shitty hardware if they are resource intensive.


We had complex IDEs with drag'n'drop tools, syntax highlighting and IntelliSense that were happy to run on hardware not much better than a 486.

Developer tools are only resource hungry today because their developers aren't dogfooding.


They are resource intensive because they prioritize feature completeness over being sleek and lightweight. That's the correct thing to prioritize, because extra RAM is dirt cheap compared to the benefit of higher developer productivity.


That's the point: they already were feature complete back when they were less bloated. VS2017 offers very little over VS6.0 or VB6; I think their min requirements were 128MB of RAM, and it was a lot more responsive too. Similar for Eclipse.

Developer productivity hasn't improved since, even gone backwards in some ways.


That assumes you're not using some monstrosity IDE that requires bleeding edge hardware to run COUGH Oracle Forms COUGH.

We had to buy cutting-edge PCs, spend >£2k on extra RAM and £4k on 20-inch CRT monitors to develop - the application would run on a 133MHz machine.

If you had fewer than 2 blue screens per hour that was a good hour - this was the early 90's, btw.


I think you're mis-remembering your time period. 20" monitors weren't widely available until the mid-late 1990s; nor were P133 machines (P133 release was in June 1995; so for it to be considered mid-low end, late 90s would be more likely).

Where I worked in the early 1990s (1992-ish), us developers fought over who would have the color terminals (Wyse 370/380) and who would have the monochrome ones (Wyse 50/60). I was low-man on the pole so I always got stuck with the low-spec terminal (though I still have an affinity today to amber VT220 text characters).

Until I "wowed" the owner of the company (small shop) showing how to do windows, dialogs, etc in a terminal using a virtual buffer and some other "tricks". Then I was given one of the color terminals, to jazz our interface (which was mostly numbered menus and basic flat screens).

At one point, I played around with the 16-color Tektronix graphics mode you could pop into with the right escape sequence; I think I made a very slow Mandelbrot set generator (never showed that to him; our app was already pushing our dev hardware, which was an IBM RS/6000 desktop workstation running AIX that we all shared via 9600 bps serial connections)...


It was '93 and the gear we had was bleeding edge for the time - unlike the average developer, who was on 100MHz or 133 at best.

The system was SIROS, the system that helped manage the UK's SMDS network, so we had budget for it.

The 20-inch monitors were awesome for playing Doom late at night.


The 90s and 2000s were a pretty different story. These days a 5-year-old i7 with 16GB of RAM is pretty damn fast. Maybe it's a different story if you're running Windows with a big IDE.


This may be workable for a certain subset of projects, but programmers often have much more on their system than the end user. End users don't need a bloated IDE, an SQL server, an HTTP server, etc all running at the same time. Trying to run all of these programs on an old computer is of zero benefit to the process. Better to give programmers a new machine with remote desktop access to a slower computer/virtual machine that they can use to test out their software.


You could easily argue the opposite as well. Developers don't need an IDE, a SQL server, an HTTP server, etc running on their device at all. The choice is to use a bloated IDE that most people only use a small fraction of the features for. The servers could all run on a dev server and compile/test cycles can be done on similar servers.

Mind you I don't necessarily agree with all of this. Well except the IDE part, Vim and Emacs are tools that more people need to learn.


> The servers could all run on a dev server and compile/test cycles can be done on similar servers.

In every case where I've had a dev DB running on a shared test server, that DB has been woefully underspecced for the purpose and often in a datacenter with 300ms latency from the office over the company VPN.

While production instances are in the same datacenter as the production DB with 5ms latency.


You're using emacs as an example of non-bloated? Eight Megs And Constantly Swapping?


Ya, you're right. I should have used something more lightweight like Atom.


I understand it's a whole 0.1% of physical memory for the program you're spending most of your time in. Better reduce that to 0.06% quick.


Unless you are literally building the dev tools you are using, that doesn't make any sense. That shop lost tons of money on wasted dev cycles. You spend much more time building the app than running the app.

They should have bought everyone a second low spec machine to test on, and let them use proper dev machines for building the software.

I guess if it is a shop where the management feels they need to remind developers suffering through that for 8+ hours a day "who works for who", that was probably the least terrible part of working there.


"But I realize that's a hard thing to sell. A lot of developers don't really appreciate being reminded who works for who."

"Yep, we put those developers in their place."

"Hey, why are they leaving???"

"Oh, you mean we need developers more than they need us?"


The idea you're trying to advance here is just plain silly.

You need your doctor more than your doctor needs you. That doesn't change the fact that your doctor is doing work for you, and not the other way around. Same for lawyers, plumbers, electricians, architects, and anyone else working in any number of other skilled professions.


Do you also engage in silly power plays to put your doctor in his/her place and remind them that they are working for you? Maybe you can insist that they use a 10 year old stethoscope or you else you'll take your business elsewhere.


Developers can switch jobs fairly easily. It's a seller's market. Companies that don't understand this are going to wonder why they have a hard time retaining talent.


That’s great, until you have to run Xcode and interface builder.


I see...so your shop with a great policy seems not to have learned about these things called test environments eh?


It's a staple of 'developers' to have fancy Macbooks while essentially just needing a shitton of RAM these days.

Programmers don't need bleeding edge, just a lot of RAM.


Some of them don't – but some of them do, and that improved performance can offer a noticeable productivity improvement.

I recently upgraded from a 2015 Macbook pro to a new i9 one, and right now I'm working on a computer vision pipeline for some data – an embarrassingly parallel task. It takes about 15 minutes to run a process which would have previously taken about an hour. This is a direct improvement to my development experience (trust me!)

But there are a bunch of different reasons. Modern stacks can be annoyingly under-optimised; a large Webpack app with live reloading can be irritatingly slow even on recent machines. Fast SSDs are useful for people working with large datasets. Better GPUs can mean better UI performance and more screen space.

In short, remember that just because you don't need the hardware, doesn't mean that others don't! :)


But why do you need that on a laptop? Just run it on a server.

If you work for an employer with deep pockets, I guess sure, why not? Otherwise a workstation you can remote connect to (if you work from home or travel) is probably good enough.


Nothing about your post adds up.

2015 MB i7 to i9 MB doesn't increase anything times 4.

How long have you been working on that task and waited an hour? The wasted labour cost might have bought you a dedicated compiling rack 2 weeks in.

How long could you have rented cloud resources to bring that task down to close to instant for the cost of an i9 MB?

I am just curious tbh. I have various tasks like yours hobby-wise. But the last thing I'd encourage my laptop to do is compile/render/analyse a problem that takes more than 60 seconds. Beware, if you fully utilise a laptop like you describe, shouldn't it become useless for everything else while doing that?

So many questions...


> Nothing about your post adds up.

That's great, but I'm literally sitting at my desk doing it just now, so I can assure you it's not just made up!

> 2015 MB i7 to i9 MB doesn't increase anything times 4.

i5 2015 13" MBP to i9 2018 15" MBP. 3x the cores, higher IPC, higher frequency, faster memory, faster SSD. It adds up, and for this class of process a 4x improvement is totally reasonable.

> How long have you been working on that task and waited an hour? The wasted labour cost might have bought you a dedicated compiling rack 2 weeks in.

I don't know how the hell you work, but I don't just kick off a process and let it run while I sit still at my desk waiting for it to complete :) It just involves working to a slightly different rhythm, and the ability to iterate a bit faster makes that nicer for me, at minimal cost.

Anyway… wasn't the point that "developers don't need faster machines?" I think buying a "dedicated compiling rack" would count!

> How long could you have rented cloud resources to bring that task down to close to instant for the cost of an i9 MB?

No idea, but in the long term, more than it costs to buy a new development machine. Plus this way I don't have to fanny around with copying assets back and forward, remoting in to visualise things, setting up a server or whatever. And the new laptop makes everything a little bit faster and more enjoyable to use. The price of the machine is pretty marginal for a tool I use in excess of 40 hours a week.

> Beware, if you fully utilise a laptop like you describe, shouldn't it become useless for everything else while doing that?

Nah, it's fine generally. Everything's just a bit slower until it's done.


Faster is usually better but it's worth pointing out: If a task took a whole day, you wouldn't burn the day twiddling your thumbs - you'd context switch and do something else. You'd save long compilation steps to late day so that you could come back tomorrow, not wasting time. For many people, 15 minutes is not worth switching but 1 hour wait time is.


Recently got access to a new HW platform with 256 cores CPU, 512M of L3, 512G/1TB of RAM. The speed of that thing compiling the linux kernel and run certain AI testing are amazing. One can try out different experiments so much faster.

I used to work at a different company with server farms for compiling. Even with the servers, compiling the 26GB software package took 4 hours, and at high noon when everyone was using them, I have seen compile jobs last 8+ hours. A few times, by the time the compile was done, I had completely context-switched out and forgotten what I needed to debug.


I did two dissertations on real-time outdoor computer vision (detecting landing targets for quadcopters and shadow classification), with a really high-end (at the time) laptop with 16GB memory, 8 cores @ like 3.5GHz turbo, a 512GB SSD and a GTX970 GPU which was okay at running tensorflow et al. Not only was it a very capable workstation which I could leave crunching data on overnight, both dissertations involved field robotics (quite literally, doing stuff in fields with drones :)), and being able to use the same machine wherever I went was a godsend.

No matter where I was, be it at home, in a computer lab, presenting my work to peers or my supervisor, or outdoors in a field, I had the same tools, same large dataset of raw h.264 video streams, same hardware, same everything, without needing to rely on streaming data to and from some server on the internet, or worry about keeping my work and software in sync across multiple machines. I could tweak my algorithm parameters in-field and I could continuously compile my huge 100-page latex sources for my dissertations from.. the beach :)


I think that's definitely one of the good use-cases of a portable machine. Before buying a new one, I did do the maths on using a more powerful desktop instead and just keeping my older laptop for travel – but like you say, it means constantly thinking about what data and capabilities you have available at any time. And coincidentally I was also working last week on real-time computer vision for robots, on a remote customer site, so it was nice to have my machine with me :)


I myself will take a small computer and a fast connection any day. I don't think money is the biggest issue here, just what is practical.


Yah. Or simply imagine the benefits of a cheap, upgradeable workstation instead of trying to do a long build on a computer squeezed into a half-inch-high package: more cores, more RAM, big old GPU, wired network, the list goes on.


Sure, a desktop machine would be less expensive than a laptop of equivalent performance. But I work mostly from home, work in the office maybe one day a week, and often visit customer sites. The ability to just pick up and go with a relatively powerful machine is pretty attractive.

I think it's always important here to realise that people have different use cases and priorities for their equipment, and that there are no right or wrong answers. Some people are happy to fork out for a portable; others don't require that, and would rather have a fixed workstation with more power. Some developers are totally fine with 10-year-old kit, and some can benefit from newer stuff. I'm sure everybody evaluates their circumstances and comes to suitable conclusions for themselves!


The newer macbooks also have significantly better I/O performance, which together with the faster CPUs and more cores might add up to a 4x difference.

It's a pity they are otherwise crappy for my work (I need a reliable keyboard, I do not want a "touch bar", I do want USB-A ports).


I've installed small amounts of RAM into machines and had them speed up by more than an order of magnitude. The speedup was because they stopped swapping. Modern computers have other similar bottlenecks that can show massive improvements for what seem like small changes.
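If you want to check whether a box is in that state, a quick sketch with standard Linux tools:

    # watch the si/so columns; anything consistently non-zero means pages are
    # moving to and from swap, and RAM is the bottleneck
    vmstat 1 5

    # or just compare used swap against the total
    free -h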


That's a bit of a knee point though.

If your machine doesn't have quite enough RAM, it will swap and likely be unusable.

Once you have just enough it will fly, and adding more RAM will make little difference.

I can't imagine the OP's aged stack is so low on RAM that he puts up with swapping.


As I said, there are lots of other knees. For example, a slightly increased L1, L2 or L3 cache size can have a similar effect.


uhh, in 2015 an i7 had 4 cores, with a boost of around 3.4GHz

in 2018 a mobile i9 had 6 cores/12 threads, with a boost of around 4.8GHz (so let's be reasonable, thermals might let us keep 3.6).

For a super optimized parallel load you could totally see more than 2x speed up.

But yeah... why do that on your laptop?


If it takes 15 minutes locally, then you should do it locally. My rule of thumb: if it takes more than an hour locally, it should be done elsewhere.


> uhh, in 2015 an i7 had 4 cores, with a boost of around 3.4GHz

Not on a 13" MBP, it didn't. Source: there's one on my desk.


And on a non-portable computer you could probably do it even faster. Maybe you need a laptop, but there are tons of people using laptops who don't need to.


I think that's pretty much what I said, right?

Not everybody needs a laptop, or a super powerful development machine. But if you want to process some stuff nice and quickly, while not being tied to any physical location, it's a totally reasonable solution.


The wonderful thing about a laptop is that if I want to go sit by a window for an hour while I'm working, I can just stand up and move. A desktop chains me to my desk, deep within the heart of a dimly lit cubicle farm.


The funniest thing about this is probably that RAM prices have gone up in the last 2-3 years. I built my current computer in 2016, and if I had to buy the same RAM it has today, I'd have to pay more. So for the same money, "newer" is in fact worse. Prices have come down from their peak since, but they're still above the historic low.


More important than price is trust.

The Core 2 Duo/Quad on LGA775 is the last revision of the Intel Management Engine (ME) that can be completely removed. The me_cleaner script recently added the ability to clean this platform.

A Q9650 quad-core CPU is the best performance that can be reached on this board. I have two and I have run me_cleaner on both. I do my finances on the one that runs OpenBSD.
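For anyone who wants to try it, the rough shape of the procedure is below. Treat it as a sketch: the flashrom programmer, the exact me_cleaner flags, and whether you need an external SPI programmer all depend on the board, so check the project docs first.

    # dump the flash image, neuter the ME, and write the cleaned image back
    flashrom -p internal -r dump.bin
    python me_cleaner.py -S -O cleaned.bin dump.bin   # -S also asks the ME to disable itself
    flashrom -p internal -w cleaned.bin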


I bow to your paranoia. Although I have often wondered if retro computers will become more valuable in the future as the final bastion of electronic privacy.


If your tinfoil hat isn't too thick, you can take a modern computer and set it up behind a hardware firewall, or connect over serial...


And then monitor the tons of weird ass encrypted traffic coming in and out of it, hoping that those requests to AWS are some program getting updates and not the contents of your L2.

Firewalls are great, and you should have one, but you have to constantly watch and understand everything coming in and out of it. Works ok as a full time job, terrible when you have other things to do.


If this is too much trouble, you can connect over serial, like 25+ years ago. Transfer files with zmodem from an internet-connected host. A Raspberry Pi with a serial port would be perfect for this.
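A sketch of what that can look like with picocom and lrzsz; the device name and baud rate are assumptions for a typical USB serial adapter:

    # on the internet-connected Pi: open the serial line; C-a C-s then a filename
    # sends the file over the wire with sz
    picocom -b 115200 --send-cmd "sz -vv" --receive-cmd "rz -vv" /dev/ttyUSB0

    # on the air-gapped machine, logged in on its serial console: receive with rz
    rz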


I have a rack full of machines like this. Dual sockets with 8GB ram each. Any affordable upgrade would be a downgrade. I'm looking at at least $200 per U to upgrade from hardware that is essentially worthless. I literally got the CPUs for $2 each. There's something really funny going on with prices after core 2.


How much is your power bill?


Kindof a lot, but I tell myself that servers are an efficient winter heat source.


Thank you for this information, this is going to be my next project.


Don't forget the tinfoil in your bunker.


Can your board take an X9770?


These are the boards that I patched:

https://github.com/corna/me_cleaner/issues/233


Even that depends on the dev environment.

If you're using a lot of virtual machines, sure. If you're working on something that's inherently RAM-hungry like video processing, sure. If you're working against a local database server, maybe. If you're using something like IntelliJ or ReSharper (but not necessarily Visual Studio), then, <sigh>, yeah, I guess.

If you're doing Web development, OTOH, it's probably better if you not have much RAM. If front-end, because a significant percentage of your users will be using low-RAM devices like smartphones and Chromebooks, and you shouldn't need any more RAM than they do. If back-end, because the production servers will (hopefully) be handling a lot more load than what you're doing in testing, so if you get too comfortable with treating RAM as a plentiful resource in development, that's going to be a recipe for scalability problems.


Our Scala teams disagree.

I do mostly Go, and yeah, there it doesn't matter as much, but you should see some of the build times on these services. There isn't even a big scary monolith left in our architecture (besides a frontend in node).

Oh, and let me not forget minikube.


SSDs really help a lot too. SSDs and a reasonable chunk of RAM (whether reasonable is 8 or 16 gigs depends on what you're doing; I've never needed more than 16). Anything else is not so important for normal development. A fast SSD definitely helps; without one, IMHO, waiting on disk I/O is typically the bottleneck.

Hell, I did a lot of (hobby, not work) development on an eee PC netbook a few years ago. Besides the discomfort that a small screen and sub-normal keyboard brings, it was a reasonable experience. I didn't have to run slack then though.

Obviously it depends on what you're doing though. Processing data? Running a ton of tools all at once? Differing workflows and requirements have different hardware needs.


I think the existence of SSDs is why modern OSs are borderline unusable on HDDs. It is kind of ridiculous how much passive IO there is, and how ridiculously long it takes to launch or open things.


You're probably right. Applications definitely are a lot more lax on resource consumption.

I've sat down and enumerated the things I do daily on my computer versus what I did back in ~2000. Besides streaming (YouTube, Netflix and Spotify, basically), it's pretty uncommon that I do anything that I couldn't or didn't do back then. The performance of these things now seems about the same, anecdotally. Maybe it was a little slower then, but not so much slower compared to the hardware performance difference. It makes me sad.

Also, as I said in other comments, I once developed hobby projects on a netbook. I of course ran a minimal system and didn't run resource hogs like slack. I'm currently using a cheap laptop running a minimal system (but unfortunately I do need to use slack, chrome, docker for work). They run fine.

I think developers or companies have decided that developer time is more important than performance and resource use of their products, punting the cost of that onto their customers. If my work didn't require it, I would definitely switch to more resource conscious tools and I'd be a bit happier. Oh well.


I would agree, and I hate that attitude in developers. Technology is a force multiplier, when you write something slow you're multiplying that slowness and all the time it wastes across the lives of your entire user base over the lifetime of the product.


I think browsers is what regressed the most.

I used to be able to open 3 Twitch streams in 720/1080p back in 2010 on a dual core Athlon X2 5000.

These days, if I open 2 HD streams in Firefox on a 4.2 Ghz quad-core 4670k, the system essentially freezes - the browser becomes unusable and both streams lag. Thank god for streamlink that lets you bypass the browser interface and view the streams in VLC or MPV.
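(If you haven't tried it, the invocation is pleasantly simple; the channel name here is obviously made up:)

    # hand the stream straight to mpv instead of the in-browser player
    streamlink --player mpv https://www.twitch.tv/somechannel best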

Not to mention Electron-based apps using 600-800MB for a chat program with some light jpeg usage.

Developers' blatant disregard for users' system resources is insane. It seems that regardless of hardware improvements, the modern developer only targets "acceptable" levels of performance, leaving power users frustrated.


Ubuntu seems fine on my older laptops, but Windows 10 is a different animal. I had an otherwise fast laptop that was almost unusable because Windows was always thrashing the slow spinning HDD. It could take 10 minutes or more after waking from sleep before the HDD light turned off (and even then, "off" means a constant flicker). Apparently it's related to telemetry data Windows was constantly collecting for some reason. I eventually replaced the HDD with an SSD and the difference is night and day.


I'll admit that Windows is a bigger offender in this regard, but Linux Desktop has become quite bloated too.


I had a really old laptop with a 2.4Ghz C2D and 2GB of RAM plus a 4600RPM (I think) 60GB HDD that I had to switch to Cinnamon, but after that it worked alright. Could even keep a few tabs open in Firefox. It really only had trouble when it ran out of memory and started swapping.

It doesn't seem as easy to switch Windows to a lighter desktop environment.


Actually it kind of is, you just go to an older version of Windows. A surprisingly large amount of software will still work, or at the very least have alternatives that do. XP is really really snappy on modern hardware.

Granted, using XP today is not recommended for other reasons.


Can you even activate a copy of XP these days?


It's a trade off. Your OS had to play all kinds of games to manage those 150 or so IOPS you had. When you have 10,000 to a million IOPS you can drastically simplify the operating system.


RAM? maybe for Visual Studio/Xcode/Android dev. But for any nodejs/ruby/php development, a 10 year old machine with 1GB RAM is perfect.


There are two things that cannot be true at the same time.

"Ram is cheap"

and:

"The maximum ram you can get is 8-16G"

Developers who are making electron applications (not that electron is the primary cause, it's just a correlation I see often) constantly consider that "ram is cheap" for their users and thus do not pay attention to memory consumption like they should.

This means that you're right in a way; vim is good enough for most people and a decent terminal emulator is going to cost you much less than 100mb of ram.

However; slack is consuming 2GiB on my machine, Skype for business is using 500MiB, Outlook @ 600MiB. I use safari and not chrome (safari is @800MiB with too many tabs open) but if I were using chrome I could be using many multiples of gigabytes.

If everyone thinks their program is worth 1GiB of RAM or more, then your machine becomes very limited in what can be backgrounded. You might as well run Android/iOS if you're running with 1GB of RAM with today's application/web ecosystem.

The thing is; I am quite conservative with memory usage and I'm still quite sure I wouldn't be able to work on anything with 8GiB or less.

( I mean, I just checked and I'm using 15G of memory: https://i.imgur.com/xd6eB91.png )


Electron is kind of the cause. You can program lean in Electron (see VSCode), but realistically the people that program JavaScript/TypeScript are usually not your CS pros. It's a language that everyone can get into easily, attracting very subpar developers as a result.

So it's not Electron, but it's what Electron enables.


Electron has its place, which is allowing web developers to make 'native' apps without learning new technology. When you are downloading any free software built on Electron, realize that it likely started as a labor of love that the person decided to open source or make available for free.

That said, large companies using Electron is what I don't get. You can't expect me to believe Slack doesn't have the resources to create an actual app for their product.


It's not that they don't have the resources, but if you want to create something that works the same on Mac, Windows and Linux it might even be a good decision for a bigger company.

Also they might have started in electron and now it's hard to move away from it.


Sure, there's a very low barrier to entry, but node has a big hurdle in the middle.

You can relate if you ever had to setup a JS taskrunner/webpack from scratch. If you don't do it regularly, expect to spend 2-3 hours going through documentation and lots of outdated StackOverflow posts.

JS is easy as long as you stay within the constraints of whatever scaffolding you're using, if you need anything outside of that or need to upgrade your stack you need to know a lot of little things. "Best practices" move fast in JS, and projects die fast too.


What I'm trying to say is that the low barrier to entry causes a lot of bad software engineering. Of course tooling issues can be hard to solve, and you can write beautiful programs in every language. It's just far more likely that if you pick a JS developer at random, they won't know much about memory allocation, which algorithm to use, profiling, or clean code in general.


Even VS Code is slower and uses much more memory than native apps.


True, on my machine it uses 150MB of my RAM. It's a lot compared to native apps, but it's really at the point where it's not that bad in the grand scheme of things.

Our company's customized Eclipse, on the other hand, hogs 8GB of RAM while doing nothing, needs 2 minutes to start, and is sluggish in general. It's of course written in Java, which is much closer to native than Electron, but they still wasted resources everywhere they could.

You can't run a compile on an 8GB RAM machine; it will start swapping to disk.

Sublime doesn't have all the features, Notepad++ doesn't either, and I just don't like vim-style editing.

That leaves vscode.


And here I am, using nmh and exmh under CWM. With vimb + a hosts file, or plain Links+. IRSSI + Bitlbee for the rest. XMP to play some inspiring tunes. I only use 1.20G of RAM including cached I/O. Minus the cache, that's 185MB, according to vmstat.


You’re lucky. I could run so lean if I didn’t need to read emails using outlook and use Skype for business + slack for communicating with colleagues.

I've tried replacing Slack with the wee-slack plugin for weechat but, while it works, it's far from a complete solution and I still need to spin up a Slack client sometimes.


Can't you use Thunderbird + Lightning? The RAM usage may be less.


The actual act of writing code, maybe, but:

1) 4GB—let alone 1GB—is already cramped with just Slack and any one of your usual bloated shitware issue trackers/wanky PM toyboxes (Jira, Asana) open in a tab or two, plus the usual few tabs of DDG/Google, Stack Overflow, some docs, etc. That's before any actual code-writing tools enter the picture. The basic suite of tools to just be working at all in almost any role is just barely not-painful to use on 4GB. Worst case you're in an agency and have all your tools, plus several duplicates for the same purpose for a client or three, and so on, all open. Yes, it's because all these tools are terrible and eat like 20x the RAM they have any right to (and even that's generous), but I still have to use them.

2) Better hope you don't need any design tools at all. (Sketch, say) if you're trying to get by on 4GB or less for everything, unless you like only having one thing open at a time or dealing with UI sluggishness I guess.

3) Docker(-compose) or minikube or whatever? Service dependencies, local mock services? Running test suites? Without a strong CPU and 16GB you'll see slowdown.

4) A fan of any of the fancier webmail clients, like recent GMails or Inbox or Outlook or whatever? I'm not and just keep the Basic HTML version of Gmail open because its full-page loads are faster than the "speedy" AJAX garbage on those, but if you are into that sort of thing take a look at their memory use some time.

FWIW I think almost all the tools surrounding and supporting development these days are god-awful resource hogs that somehow still manage not to be very good and think 1GB absolutely should be enough memory to get by doing node/ruby/php dev, but I still have to work with that junk, and 8GB's the bare minimum to do that without hitting swap constantly, IME, and even with that you've gotta be careful. 16GB's much more comfortable, especially if you sometimes have to do things other that just Web dev.


I have 8GB here, under Debian. I can easily run docker (with a rails server), firefox (discord, slack, facebook, youtube, online radio + outlook webmail, all at the same time), with many Emacs windows.

I think I could manage with less RAM (I frequently code on my Chromebook with 2GB RAM). My theory is that RAM usage expands to fill the amount available: https://en.wikipedia.org/wiki/Parkinson%27s_law


IIRC at my last employer an Asana tab + Slack ate ~1.5GB all on their own, and Asana was so slow to load that one hated to close it.

Jira's not as bad as Asana, but depending on the set-up it can be pretty close. Then there's Invision, etc., which are much lighter than those but still pretty damn heavy if you're trying to get by on 4GB or less. And/or maybe you've got Outlook and Teams and all that. And that's just the communication & collab tools, not even any of the stuff to produce actual work output. Temporarily having to use a 4GB machine with that kind of workflow is why I'm now permanently on Basic HTML for Gmail—it loads fast enough that I can close it, and uses so little memory there's no reason to. I couldn't spare the 300+MB for Inbox or whatever with all that other junk open, and besides, Basic HTML's much faster.


No, it's that developers write their code on 16GB of RAM Macbooks running the minimal amount of software required.

If you're not seeing performance issues on developer machines, it's because you'll hardly ever see a developer running on 2GB of RAM.


> But for any nodejs/ruby/php development, a 10 year old machine with 1GB RAM is perfect.

I don't think I'd enjoy the experience of developing on that machine!


A lot of shops have large, swollen Linux VMs to run your database, web server and interpreter in, for consistency with the production environment.

If I wasn't running on a Mac at work I could comfortably use containers without the overhead of another VM.


The new Docker that uses the Mac's built-in hypervisor works pretty well.


I didn't know about that, I'll give it a look over.


You could always install Linux. I personally would hate running a VM regularly for my job. It just adds complexity without much benefit.


The benefit is the VM can precisely match production, and every project you work on can have a different VM with a different set of software.


It gets worse than that: I'm still on spinning rust, although I do have 21 gigs of RAM so I can keep my VMs running.

Looking forward to putting an inexpensive linux workstation together when this Mac eventually bites the dust.


For a small app on a box running Linux, _maybe_. Any other case? That hasn't been true for years.

The large Rails app I work on can easily eat up 500MB of RAM. The webpack dev server I'm running to compile the Angular frontend for said Rails app is taking another 500MB. Visual Studio Code (easily the best TypeScript editor) is using over a gig of RAM all by itself.


lol, I'm sorry but no a 1GB machine will not cut it even for a hobby, never mind in a real business environment.

A good IDE like PhpStorm is going to use 1-2GB.

A browser will consume 1GB+. Want multiple browsers for testing? Add another few gigs.

Most devs now use a VM or container with PHP, MySQL, etc running inside it. Add at least 1GB, maybe 2GB.

You probably need Slack or HipChat running. 0.5GB to 1GB.

A mail client. 256MB+.

Your company probably has Office 365, so you'll need Outlook. 1GB+.


Eh, 1G is pretty limited. I'd want at least 4G. Yes, 1G is workable (I could do most webdev on a Raspberry Pi), but a desktop environment and a browser running one tab uses half of that, and each tab would increase that usage.

The 4G isn't for building or running code, it's for tons of browser tabs open with documentation. I routinely have >10 tabs open, and that's quite likely to cause swapping on 1G RAM.

Sure, I could slim everything down by running a tiling window manager instead of a desktop environment or configuring my browser to unload older tabs (or close and reopen with bookmarks), but that takes extra time, and if time is money, that money is better spent on a few extra gigs of RAM.


As a counterpoint I currently have over 130 tabs open and my machine isn't breaking a sweat. The secret is leaving Javascript disabled by default and selectively enabling it on a per-tab basis only when needed. This strategy works great if the main thing you're looking at is documentation.


I found out last year that 1GB RAM is not enough for TypeScript development (whose compiler runs on nodejs). I was building a fairly small TypeScript project on a Linux instance with 1GB and no GUI and no other processes hogging RAM. I was consistently having builds fail with errors about half of the time. Finally figured out that it was running out of RAM and if I turned off source map generation the errors went away. I didn't really need source maps in this case, as I was not developing on this machine, but it certainly would be a problem if I was trying to do development.

Sure, there is a lot of development that can still be done with 1GB RAM, but you're going to run into limitations and it is far from "perfect"
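Concretely, turning off source maps just meant flipping the usual compilerOptions.sourceMap switch; a sketch, with the paths assumed and the heap cap in the second command only there to approximate the 1GB box on bigger hardware:

    # tsconfig.json: "compilerOptions": { "sourceMap": false, ... }
    ./node_modules/.bin/tsc -p .

    # or reproduce the low-memory constraint by capping node's heap
    node --max-old-space-size=768 ./node_modules/typescript/bin/tsc -p .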


And people who only close browser tabs every six months


As long as you don't have to run a bloated electron app or anything similar at least...

I have a relatively minimalist Linux environment by today's standards (I use Emacs, Firefox with about 10 tabs, the Signal client, a few terminals and a lightweight WM) and I already use 1.5GB of RAM at the moment. Firefox alone uses 408M of RES, the Signal app almost as much (!) while my Emacs with dozens of buffers uses "only" 130MB. How times change. The rest is used by all sorts of system processes and minor daemons.

So basically I'd be fine with 1GB if it wasn't for the bloat that's modern web-based development. These days I'd say that 4GB is a more reasonable baseline.


Also not strictly true - this is enough for some version of PHP development.

The last PHP app I was developing ran the same way the whole company's infrastructure was developed and run: one puppetized Vagrant VM for the code, one for MySQL, and sometimes a few more for other services.

Sure, if you just want to abstract the stuff away from your host machine you could reconfigure it all to run in one VM - but that again diverges from production. And in the grand scheme of things we were working on several different VM types a lot more than on just this PHP app...


> 10 year old machine with 1GB RAM is perfect

No, a machine with 1GB RAM is not perfect, even if you are a frontend developer coding in Node. And if you are not, you might need a browser for Stack Overflow, and you might need Docker or Vagrant for virtualization.

And, you may also want an Electron-based editor, such as Atom or VS Code.

Even 4GB is hardly ideal. 8GB is probably where it starts getting comfortable.


I propose we give every front end dev at _most_ 4GB RAM for a while and see if web bloat stops inflating.


There are far better ways to address this particular problem. Introduce web performance budgets in frontend projects; test frontend projects on specific target devices on which you need them to perform well; perhaps create dedicated web performance teams that will work on the tooling and testing for better performance.

A development machine does not need to have the constraints of a testing machine; that would be counterproductive.

In any case, "web bloat" is relative to the target users and target devices. It's one thing to target cheap Android phones in India; it's quite another to target laptops of a SF-based startup. In the second case, web bloat is negligible.


> In any case, "web bloat" is relative to the target users and target devices.

What about, relative to how many resources should be required for whatever tasks the software needs to perform?

Just because I have a lot of ram doesn't mean I want Slack to use it all. I'd rather give that space to the OS for file caching and such.


Yes please! Very few companies do this type of dogfooding.


Nope. The project can still have crazy dependencies that need compiling (Thrift etc.).

Every morning, every dev in our shop, no matter the project, is going to run "mvn clean compile -U".

Sure you can split the large projects up. Just get some time from your PM.. next quarter.... :)


So you’re not doing anything with Docker or ML, or a compiled language, and you have no unit tests?


I haven't done web development lately. Why do unit tests mean higher requirements on the hardware?


At a guess - the various popular test runners are slow and bloaty.


Depending on who you talk to, there are behavioral breakpoints at one, three, and seven minutes.

If you have a few thousand tests at a few tens of milliseconds apiece (really, really easy to do even in a midsized project), you're getting above that one-minute range. Shaving 40% off with a faster computer stops people from task switching while testing.

Task switching pretty much doubles your test cycle because you never switch back at the precise moment the tests are done (we are developers. We always underestimate everything by a factor of 2).


Can Chrome or Firefox run on less than 1 GB of RAM nowadays?


FF Quantum runs quite well on a 1GB system. You can go even lower than that, but you'll be hitting swap.


Firefox can.


While I generally agree with that - more RAM is better - I was doing Android OS builds a few years ago for a wearable project, and using make's -j argument reduced build time significantly.
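(For reference, that just tells make how many compile jobs to run in parallel; a sketch, with nproc standing in for however many cores the box has:)

    # spawn one build job per available core
    make -j"$(nproc)"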


Don't forget the SSD.


The SSD is actually needed. My employer encrypts the drives, and it's too slow without the SSD.


What encryption are you using? LUKS on HDD is imperceptible for me.


Sounds like you never worked on a monstrosity that requires 9 different docker processes running. Java, Scala, Postgres, damn.

Even on a top of the line macbook pro that shit chugged.

And the laptop would get HOT.


I think that's just the Docker for Mac client (and Hypervisor.framework). The disk I/O performance was bad the last time I used it, which was for Node.js/Postgres about two years ago. I was able to really speed up the integration test suite by disabling fsync for Postgres.

If you were running native Linux it should be a lot faster.
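If anyone wants to try the same trick, here's a rough sketch with the official postgres image (the image tag, names and port are assumptions, and obviously never run a real database like this):

    # throwaway test database with durability turned off; losing data is fine here
    docker run --rm -d --name pg-test -e POSTGRES_PASSWORD=test -p 5432:5432 \
      postgres:11 -c fsync=off -c synchronous_commit=off -c full_page_writes=off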


That's what servers are for.


100


Furthermore, macOS is seriously impaired compared to Linux.


Even Windows has a better UI. On a Mac you can't even alt-tab between two terminal windows. Ridiculous.


What's ridiculous is condemning something based on your own ignorance. The keyboard shortcut is Cmd-Backtick ("Cycle Through Windows") and it works in all macOS applications.

There's simply a more sharply defined distinction between applications and windows. Cmd-Tab is for application switching.


You happen to be talking to someone with a degree in UX design, and from my high horse I try not to fall back on personal taste when writing UI critique :)

If the user wants to move to a terminal window they have open in the background, which key combination do they press? Cmd+Tab or Cmd+Backtick? They have to stop and actively think about which window is currently highlighted. Is it the browser? Or another terminal? This completely kills the fluency of a keyboard shortcut.


> Is it the browser? Or another terminal?

This is generally a non-issue, but even if you are getting tripped up consistently, the OS literally tells you at all times which application currently has focus.


People have already given you the answer, but macOS also supports tabs in nearly all the stock apps, so CMD+{1,2,...} is an even nicer way to work with multiple terminal windows, if you're not a fan of tmux. I don't think any stock Windows applications allow tabs.


Of course you can; it's just another hotkey.


lol try pressing command instead of alt :Z


cmd ~


I recently upgraded from a 2013 MBP to a 2018 MBP. The time to do a simple development recompile of our big monolithic web application dropped from ~1 minute to ~10 seconds. This is a process that is done about 20 times every day.

Over the course of the year, this will save ~72 hours in development time. Of course, it's not a strict comparison because I would always do other things while waiting for it to compile, but it's still a massive boon to productivity.
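(For the curious, that's roughly 50 seconds saved per build, times 20 builds a day, times ~260 working days:)

    # seconds saved per year, converted to hours
    echo $(( 50 * 20 * 260 / 3600 ))   # => 72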

I understand that we want to live in this perfect world where all software is designed to run on 2008 era laptops. That's not the world we live in. It's Fantasy. I implore everyone to keep fighting for it, but Reality is what matters to businesses, and the reality of most businesses is that software is insanely complex, poorly designed, and it still generates revenue; more often than not enough revenue to afford the best machines to support running it.


> Over the course of the year, this will save ~72 hours in development time. Of course, it's not a strict comparison because I would always do other things while waiting for it to compile

That, in my experience, actually makes it worse. As soon as you start doing other things you multitask, forget where you were at, lose the zone, and lose much more developer time than that 1 minute.


Yes! And I think the limit is about 6 or 7 seconds for me.

SQUIRREL!


I would think that the performance difference between a 2013 and 2018 MBP would be significantly larger than the performance difference between a 2013 and 2015 MBP.

Having said that, MBP to MBP comparisons are not always apples to apples (sorry about the pun). You need to compare MBPs with CPUs from the same family as well.


And nobody suggested buying desktop workstations?

A laptop is generally roughly half the speed of a decent desktop. And from experience, even if mobility is sometimes useful, it's hardly the norm. Personally my laptop is used maybe 5% of the time at most away from my work desk.

Cheap laptop for emails, meetings and occasional ssh into prod + powerful workstation would be a better option for me, and I'm guessing I'm not an exception. And this option is cheaper than a high-end Macbook pro.


Some things are helpful -- large clear display, good keyboard

That's why I prefer older equipment, especially laptops which have proper keys and a matte screen.


I started using mechanical keyboards made for gaming, and I've never been happier with a piece of hardware in my life. It's like typing on butter instead of jamming your fingers into concrete. From what I remember, really old keyboards were all mechanical; sometimes older really is higher quality.


The IBM keyboards used for the PC, AT, and PS/2 used a "buckling spring" mechanism that is not mechanical in the usual sense, as the later models actuate a membrane.

I am typing this on a Model M that is dated "22JJUL88" - I do all of my important typing on these keyboards.

https://en.wikipedia.org/wiki/Model_M_keyboard


I’m fairly certain that Model F and M keyboards are considered mechanical.

It’s not a precise category. But I’d even go so far as to say they are the type specimens for mechanical keyboards.

Yes, most products in the category use Cherry-type switches, but Topres are definitely considered mechanicals and they combine a rubber dome and a spring.


Literally all keyboards are mechanical, what’s the alternative? Organic?


The term is used figuratively, not literally.

Buckling spring keyboards count; rubber-dome and scissor switches don't. There's no overarching principle here; if there is one, it's how they feel under your fingers.


Optical or touch.


The only thing that prevents me from switching to a mechanical keyboard is that I care about my coworkers' ability to concentrate.


Cherry MX Browns are reasonably quiet. My local Micro Center has a tester keyboard with a variety of switches that you can try. If you look outside the limited options there, you can find huge selections of mechanical switches that are only slightly louder than membranes. You’re looking for linear or tactile switches and you can find a pretty complete list at https://deskthority.net/wiki/Main_Page in the “Keyboard Switches” section. You can even mod louder switches with O-rings to dampen the noise.


I agree, MX Browns are quiet enough to go unnoticed.

I guess it's a positive side effect of open spaces: there is so much noise around from phone calls and co-workers discussing that it doesn't make any difference.

More seriously, I've got a Ducky keyboard with MX Browns (a weird one, with MX Blues (the noisier ones) for the arrows and Page Up/Down), and I've never gotten any remarks about it.

The only story I've heard of colleagues complaining about keystroke noise was about a friend who types really heavily (to the point of cracking the keycaps), and even in that case, rubber O-rings did the trick and dampened the noise enough.


Then get some medium stiffness linear switches. Near silent unless you bottom out, and the added stiffness helps with that.


> sometimes older really is higher quality

You can apply this successfully to espresso machines, coffee grinders, furniture and many tools (gardening, woodworking etc). The difficult part is finding them as often you need them immediately.


For me it's stereo receivers/turntables and cast iron pans. Vintage Le Creusets look incredible and they feel so good to handle.


> proper keys

Yes, please give me a full keyboard, and don't put things in weird places. My current machine has "Fn" where "CTRL" should be and it's driving me crazy. Also, "PgUp" and "PgDown" are directly above the left and right arrow keys, and I inevitably hit them when I'm jamming the keyboard with my meathooks. I don't need a number pad, but if you're going to give me nonstandard buttons, put 'em somewhere I can't hit 'em while I'm trying to do actual work.


In many laptops with Fn and Ctrl swapped there is often a setting in the BIOS to switch their positions. I know Thinkpads have this ability, and I feel pretty certain other brands can do this as well.


Not this Asus I have. I'm ready to cut some traces and solder jumper wires to rearrange this POS.


You can remap any keypress that reaches Windows by editing the registry, as described here: https://www.experts-exchange.com/articles/2155/Keyboard-Rema... (2011, but works with Windows 10)

Unfortunately many "Fn" keys are handled purely in hardware and Windows can't see them. But it might be worth a look, if you haven't already tried it.

AutoHotKey is also useful, especially if you want more complex hotkeys.


And xmodmap is the de-facto solution for X11-based Unix machines. Rebind your keys in an ~/.xmodmaprc that is run at startx by your ~/.xinitrc.
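A concrete sketch, using the classic Caps Lock to Ctrl rebind as the example (the Fn key itself usually can't be remapped this way, since it never reaches the OS):

    # write the rebinding, then have ~/.xinitrc run `xmodmap ~/.xmodmaprc` at startx
    printf '%s\n' \
      'remove Lock = Caps_Lock' \
      'keysym Caps_Lock = Control_L' \
      'add Control = Control_L' > ~/.xmodmaprc
    xmodmap ~/.xmodmaprc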


One of my old laptops had the function keys swapped (so you had to hold down Fn to hit the F key) and had mapped Sleep to F5, which I’m used to using to both start debugging and reload a web page.

The number of times I put my laptop to sleep by accident was infuriating, and it was a work laptop so I couldn't get into the BIOS to swap the keys back!


> My current machine has "Fn" where "CTRL" should be

You mean directly to the left of the A?


I.. uh... no? Why, why would you think that?



Well, my laptop isn't a Teletype.


I miss laptop mice that had actual buttons and didn't get in the way when typing. Laptop ergonomics are terrible these days.


My laptop at home is a Dell Latitude E6450 (I think that's the one); it's the last one where they had a real keyboard and not that stupid island shit, and it also has a trackpoint and a normal trackpad with actual buttons. After that generation, they finally jumped on the stupid mushy island key bandwagon, so I don't know how to upgrade from this machine. I had one of the newer Latitudes at my last job and it was nearly unusable because of that shitty keyboard. Honestly, WTF is wrong with everyone these days?

Luckily, this laptop can play x265 1080p video just fine and has 16GB of memory, but 4k is a no-go.


Best laptop keyboard I ever used was on a Compaq SLT/386 I owned. Keyboard could be detached from the laptop (ok, today it would be considered a "luggable", but back then, it was a nice machine), and it had a coiled cord that plugged into the computer underneath where the keyboard sat.

You could unplug it from the computer (it was a mini-din PS/2 style plug). The keyboard itself wasn't mechanical (not a buckling spring or similar system), but it did have full-travel keys and a nice feel for typing on.

I got mine used and had to build a custom battery pack from old cell phone batteries, which made me have to remove 2 meg of RAM (to fit the larger custom battery pack), leaving me with 6 meg instead of 8. I had Caldera OpenDOS installed on it:

http://www.deltasoft.com/opendos.htm

http://esca.atomki.hu/paradise/dos/opendos-en.html

https://en.wikipedia.org/wiki/DR-DOS

...with Monkey Linux installed on top of that (Monkey is a distro that used DOS for the underlying file system - you could even share data easily):

http://projectdevolve.tripod.com/ (downloads don't work)

http://www.ipt.ntnu.no/~knutb/linux486/download/monkey/monke...

I'm honestly not sure where or if you can still get a copy of that distro - I should look into it; maybe I should host my copy somewhere...


I can't speak to laptop keyboards that far back in time, but for a very long time, Thinkpads and Dell Latitudes had the best keyboards for laptops (obviously they weren't going to compare to a Model M or other mechanical keyboard).

But even these have gone away, to be replaced by the shitty island keys.


I use a Lenovo T480 for work and disabled the trackpad so I just use the nipple + hardware buttons when I'm not using a mouse. When I need a new personal laptop I'll probably buy the same model.

https://www.lenovo.com/us/en/laptops/thinkpad/thinkpad-t-ser...


Give me a nipple any day.

I can use a MacBook trackpad - it's OK for consuming - but the PC manufacturers saw Macs and copied them. Poorly.


I remember getting to my first job, where they had Pentium 4-based machines (the last stepping that supported x64) with exactly 4GB of RAM. They were dog slow, but it also meant we were able to see and address performance problems early because they were more pronounced. I remember fixing one bug that took a process from an hour down to less than a second.


I kind of wish I were in that boat right now. I made a highly praised window manager[0] for macOS that some people[1] have reported performance issues with. But I can't reproduce the issue, possibly because I have 16GB of RAM and don't use it all. Maybe I don't have enough Electron apps running and should install Slack and others (I run Slack in Safari when working with clients); not joking, that literally seems to be the main environment difference.

[0] https://sephware.com/autumn/

[1] http://brettterpstra.com


Have you tried debugging it in a throttled VM or with valgrind?


I feel the same way, honestly. I do wonder if the programming environment has a huge impact on this, though. Did your colleagues use an IDE of some sort? If you're a vim or emacs user, the editor doesn't require much. If you're compiling a ton of C++, then maybe a faster processor would help.


I think one of them used Webstorm, I was using NeoVim, and the rest were using SublimeText.

I feel like if we were doing something that required C or C++ (e.g. video processing or data-training), then they might have had a point about wanting to upgrade, but we were doing a lot of fairly typical Node.js REST stuff, something that could fairly easily run on a Raspberry Pi.


From my experience, IDEs benefit the most from more RAM.


I don't always need a powerful machine but I sure want a powerful machine. I consider it part of my compensation. Working with a slower machine or a smaller screen is the same as working with an uncomfortable chair or lower pay.


A two year old machine, assuming it was top of the line at the time of purchase, should be plenty fine for almost anything today though.


In some organizations, replacing top of the line notebooks frequently is less about productivity than employee morale. Sometimes that cost is worth it, sometimes it is not.


10 year old laptops are still portable supercomputers. There is zero reason for them to be slow. The problem is bloated code that expects to use excessive amounts of system resources.


I feel that slower cheaper laptops with good battery life coupled with a compiling desktop/cluster you can ssh into is the best trade-off.


Plus, when that cheap laptop breaks or gets stolen, I can easily replace it. A $100 x200 in my bag and a $2000 dev server at home/work is much less of a risk than a $3000 MacBook Pro in my bag. It's nice to have all the power you need in your hands but it's also nice not putting all of your eggs in one basket.


I've prob been on the other side of that argument. I usually advocate for allowing developers to pick their laptops rather than coming up with a one size fits all solution.

This started when my company began buying Skylake HPs for all new hires because they were "cheaper". They were only cheaper because they were being compared to current-gen MBPs. As a result we were stuck with TN panels and 8GB of RAM. I would rather have a $400 Chromebook with an IPS display at that point, easily 1/2 to 1/3 of the price.


One of the great beauties of working with scripting languages, they work just fine on lighter laptops. The MacBook Air is a great machine for the node/python/php programmer on the go.


Running an old T440 Thinkpad to do my work programming. Mostly doing data analysis. Never ran into an issue with any kind of speed or capacity, and I love the keyboard.

I have a gaming laptop as well that I don't bother to carry around. It's just not necessary unless you're doing GPU programming or model training. Even then, I'd rather just work with a cloud instance.

Devs, you're supposed to know how to make a computer meet your needs! Don't outsource it to someone else. Even 5+ year old computers run pretty damn quick if you use a lightweight linux distro.


Most devs don't want to deal with hardware restrictions. It's a lot easier to just get a good general all-rounder instead of coding on a dinosaur. Many lightweight distros barely have any meaningful functionality and usually require interacting with the console to get even basic things usable. And quite frankly, even though you can do more with a console, it's a lot easier to remember how to do things with a GUI than without one.

Older hardware generally means older, unsupported, insecure drivers as well.

Also, I'm confused as to why you claim that you shouldn't outsource to someone else, yet you're fine with working in the cloud...


No idea how you'd run into hardware restrictions; the T440 is from 2013, and I have yet to run into a driver issue dual-booting Debian and Windows.

I don't understand the bit about the console; I haven't met a dev who doesn't find terminals worlds quicker than hunting and pecking in a GUI. Maybe an argument to keep general users on Windows, not really an argument against devs running a linux distro.

I was suggesting you should understand what's running on your machine and why, and if you do, that $2k Mac isn't doing anything for you that a machine costing less than a quarter as much won't. Whether or not you have a top of the line machine, there are still reasons to reach for an AWS instance with a powerful GPU attached.


> No idea how you'd run into hardware restrictions;

Hardware can become unsupported when you update the OS. It's happened to me with wireless cards when running FreeBSD on an old EeePC. Even when using Xfce, having a wpa_supplicant UI was much simpler than remembering and writing a bunch of scripts to set up all the crap involved in getting it working. Not everyone uses Debian.
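(For reference, this is the sort of incantation the GUI saves you from remembering; the SSID, passphrase and interface name are made up, and it's the Linux-flavoured version:)

    # generate a config block for the network, then bring the interface up by hand
    wpa_passphrase myssid mypassphrase >> /etc/wpa_supplicant.conf
    wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant.conf
    dhclient wlan0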

> I don't understand the bit about the console; I haven't met a dev who doesn't find terminals worlds quicker than hunting and pecking in a GUI

Doing a few clicks in a GUI can often correspond to very complex command executions in the CLI, sometimes across multiple processes. It can be confusing what's going on, especially with redirected I/O, and if you have to do something different, it often requires editing multiple arguments depending on what you want to do.

This is good if you want to script a common task that's repeatable and changes infrequently, but frequent changes in a GUI are much faster and you don't have to worry about copy/paste errors or spelling errors.

And if consoles were so much faster, why does everything evolve into a GUI at some point?

> I was suggesting you should understand what's running on your machine and why

There are hundreds of processes running on the machine at any given time. I would guess that most people don't know, or aren't even aware of, what each process does and when it runs in any given state of the machine.

The point is, with a $2k mac (which I would never get by the way), there's easy room for expansion.


> I haven't met a dev who doesn't find terminals worlds quicker than hunting and pecking in a GUI.

There's a billion Windows users out there, do you think there's 0 developers amongst them? :)


> And quite frankly, even though you can do more with a console, its a lot easier to remember how to do things with a GUI then without one.

I don't know about this. Whenever I do something unfamiliar/complicated on the command line I copy every command I used to a text file for later reference. Repeating these actions in the future is as easy as copy and paste. If I figure out how to do something in a GUI and I don't have to do it regularly I will almost certainly find myself flailing and clicking around randomly when I have to do it again in 10 months.


I'd say it depends on what you're programming, but most programmers indeed don't need that much.

If you force your devs to use inefficient tools it might be necessary though. Like our customized Eclipse that needs 8GB RAM. It starts page swapping if you have only 8GB.

Other than that I'd say you need good keyboards and good screens. The processing power usually doesn't make you better.


At a previous gig in the financial world, we had 2013 vintage MB Airs with maxed specs cranking out Scala for a card processing platform.

This was right as the term microservice was becoming better known. We were building a very de-coupled, stateless design, passing messages among services.

The MB Air was fine. We did a lot in the cloud as far as testing. Write code -> push -> get test results

These days so much of the work is “fill out yml files”. I see little value in having this 2017 MacBook Pro 15 with maxed specs. Having anything more than Firefox and my editor on the laptop seems useless.

Others workflows may vary. But for devops, security, and a great many common roles, anything above a mid-spec MB Pro 13 feels like overkill


If you're running Docker and minikube locally on a Mac (both DevOps tools), then it can be important.


In my experience hardware is the cheapest part of hiring staff, and I've found that providing the best equipment the employee wants is a cheap way of aiding retention. Penny pinching is pretty stupid in the medium to long term.


I remember that third party developers for BeOS had machines (BeBoxes, PowerPC 603) with 16MB (not GB) of RAM, but internally Be engineers had to use 8MB systems. It did seem to have positive consequences at the time...


Causing the arms race between bloated environments and ultra-performant hardware just to carry text editing.

Many times I've observed that people running programs on older hardware notice the stupid design non-decisions. Fast ~= blind.


I have an iMac and a 2015 MacBook Pro; I wanted a Unix-like OS. My youngest sons use the iMac and MacBook essentially for games. I have a much older HP 600 notebook with OpenBSD. This perfectly fits my needs: Emacs, TeX, a browser and some R. No need for Bluetooth either. I guess I paid the premium with the Macs for the build quality and the screens. I never cared about any of their apps and brewed what I needed (OpenBSD provides everything I need, except so far for pandoc). YMMV.


The only argument I really buy when it comes to needing bleeding-edge hardware is if you're doing something like game development, where it's reasonable that you might be compiling, testing, and running a big IDE all at once.

This whole "productivity increase for saving 10 minutes" is a little silly, people take breaks, get coffee use the bathroom, etc. Most folks I know doing processing heavy stuff do it all on a remote server.

With a few exceptions, I think it really comes down to people want nice toys.


If you're going to be developing in Visual Studio Code while running Spotify, Slack, and a few other Electron/Chromium apps, then I can understand why someone would want a top-of-the-line computer.


On the one hand I agree. I personally don't need the latest spec to be productive. On the other hand, computers are so cheap relative to a developer's salary that you could almost buy a new one every month.


Why not upgrade to more powerful desktop PCs? Unless you're developing native Mac software, you don't "need" to use Macs to develop on.


As well, you don't need the price overhead of a laptop, especially as a startup. At my job, we're nowhere near a startup, but only well-established, well-trusted employees get a laptop.


I 100% agree until we start talking about compile times. If you're a Chromium dev, you can't use an ODroid.


I really dislike generalizations.

If you can use an old computer for development — great, I'm happy for you. But please don't assume everyone can, or wants to. For my type of work, no CPU has enough single-core performance (I need quick bursts). And since programming is what I do several hours every day, it is quite an important part of my life, so I'm not willing to torture myself on old hardware just for the sake of it.


I see no such generalization or assumption in the grandparent comment.



