This sparked something of an argument with my peers, who claimed they needed top-of-the-line MacBooks because you can't program on anything slower. Management ended up caving and buying the new laptops.
I stand by my point on this, though; as a proof of concept, I "lived" on an ODroid XU4 for a month a while ago, doing all my programming and everything on there. I was happy to get my big laptop back when the experiment was done, but I never felt like weaker hardware impaired my programming ability.
Ten minutes of lost productivity a day due to inadequate hardware works out to roughly 40 hours a year, so at a loaded cost of around $100/hour you are paying at least $4k per year per dev for that shit pile of a laptop. Usually it costs more than 10 minutes a day, too.
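Back-of-the-envelope, with the assumptions made explicit (the 240 workdays and the $100/hour loaded cost are mine; plug in your own numbers):

    # Rough cost of small daily delays. All inputs are assumptions; adjust.
    minutes_lost_per_day = 10
    workdays_per_year = 240
    loaded_cost_per_hour = 100   # salary + benefits + overhead, in dollars

    hours_lost_per_year = minutes_lost_per_day / 60 * workdays_per_year
    cost = hours_lost_per_year * loaded_cost_per_hour
    print(f"{hours_lost_per_year:.0f} h/year lost, ~${cost:,.0f}/year per dev")
    # -> 40 h/year lost, ~$4,000/year per dev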
The hit to morale when you tell a professional they can't pick their own tools is even worse. Now they're spending at least an hour a day looking for a new job, because devs are good enough at math to know that if you can't afford a new laptop, you've got six months of runway, tops.
If you believe employee happiness has any correlation with productivity, you always buy them whatever hardware makes them happy. It is a small price to pay.
And yet not every developer who isn't using a tiling window manager is fired on the spot for wasting company time.
"productivity" (or what people think of as productive activities) is overrated. Shiny new hardware makes employees, especially technically oriented ones, feel important and appreciated. That's about the gist of it. Nothing wrong with that, but let's not blow up egos with claims of "productivity".
If I automate my personal toolset, I follow the same procedure I use around automation anywhere else: don't start off doing it to save time, do it to increase reliability. I will write small scripts, sometimes one-liner scripts, sometimes largish hundreds-of-lines scripts. But the outcome I am aiming for is that I have a procedure that is documented and puts all the configuration in a place where I can see it, so that when the situation changes, it is fixable. A productivity boost is a frequent byproduct of successfully automating, but it's usually a side component to "reliable and documented". The boost is perceived as reduced friction and increased conceptual integrity: fewer things to check or to accidentally get out of sync, and thus less stress involved.
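As a sketch of the kind of thing I mean (the paths and file names here are made up for illustration), note that the point is idempotence and visibility, not speed:

    #!/usr/bin/env python3
    # Illustrative setup script: every link it manages is spelled out in one
    # place, and re-running it is safe. The payoff is "documented and
    # reliable"; any time saved is a side effect.
    from pathlib import Path

    DOTFILES = Path.home() / "dotfiles"   # hypothetical config repo
    LINKS = {".vimrc": "vimrc", ".gitconfig": "gitconfig"}

    for name, source in LINKS.items():
        link = Path.home() / name
        target = DOTFILES / source
        if link.is_symlink() and link.resolve() == target.resolve():
            print(f"ok      {link}")      # already correct; do nothing
            continue
        if link.exists() or link.is_symlink():
            backup = link.with_name(link.name + ".bak")
            link.rename(backup)           # keep whatever was there before
            print(f"backup  {backup}")
        link.symlink_to(target)
        print(f"linked  {link} -> {target}")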
Focusing on the UI being both shiny and fast likewise often misses the point, and that's what happens when the discussion turns to new hardware. There are order-of-magnitude thresholds for human attention that are important to keep in mind, but hitting a lower attention threshold usually doesn't solve a productivity problem. It creates a case of "wrong solution, faster", drawing the user into a fast but unplanned and reactive feedback loop.
See for example the case of writers who like the Alphasmart, a device that makes editing so tedious that you don't do it; you just write the draft and edit later.
What on earth are you doing that a 2 year old mac is inadequate for?
Yeah, there is a point at which hardware becomes a problem, but I'm working on a 5-year-old, mid-range PC and I don't think an upgrade would really change any of my day-to-day work. Maybe IntelliJ would be a little faster and compile times would be a fraction quicker, but I doubt I'd notice it.
I have a 10-year-old PC at home, and the only pull to upgrade is gaming, but I'd rather not sink time into that (I get enough time behind a screen at work), so I hold back on spending money on it too.
Maybe developers' happiness does drop if you give them older hardware, but I don't think that's based on realistic changes in that hardware's performance.
Just because you don't notice it doesn't mean that it's not there. The argument is that it's insane to pay developers north of $100k per year plus benefits and then not invest $4k every 3 years to buy them fast hardware.
And I don't think that argument is particularly convincing. Typing this from my 3.5 year old work MacBook Pro.
Not sure where your experience originates; it obviously also depends on what exactly you do with your computer. Going from a mid-2015 MacBook Pro to a mid-2018 MacBook Pro got a 50% improvement for compute workloads according to Geekbench.
I work on largish (win/mac/linux/ios/android) C++ code bases and compile and test run times are definitely an issue. Switching to newer computers every few years definitely boosted productivity for me personally (and for my co-workers as well). We mostly saw that compile times halved for each generation jump (3 years) as a combination of IO, memory and CPU performance increases.
Not sure what marginally faster hardware means exactly for you, but for us it's definitely been significant, not marginal.
YMMV, but if you do the math (saving 10 minutes/day at $200/h over 200 days is more than $6,000 per year), it becomes pretty hard to economically argue against investing in faster tooling of some sort.
Typing this from a 2.5yr old Macbook Pro.
But if you're like me, where most of that happens off on the build machines, there is very little impact in upgrading your hardware.
A 50% improvement on compute workloads probably wouldn't be noticeable on the setup I run. Outside of compiling I don't think I push a single core much above 30%.
I guess it really comes down to what you're doing.
If this is enough to make a huge difference, then you should be running the workload on a low-end or better server instead of a MacBook. You'll get much more performance for a fraction of the cost, and you won't have to pay for the things you don't need, like a new battery, screen, etc., that come attached to the CPU you need.
> I work on largish (win/mac/linux/ios/android) C++ code bases and compile and test run times are definitely an issue. Switching to newer computers every few years definitely boosted productivity for me personally (and for my co-workers as well). We mostly saw that compile times halved for each generation jump (3 years) as a combination of IO, memory and CPU performance increases.
Have you exhausted all other avenues there? Do you have distributed builds? Is everything componentized? Do you compile to RAM disks?
For that matter, why a MacBook? Why not a high-end gaming laptop with more CPU, RAM, and GPU resources?
In high-end job markets, hardware is so cheap compared to salary, etc. that there is no reason not to get the top stuff. If you don't, it may impinge on productivity, and in addition it will send a very negative signal.
All labour in the US costs far more than a decent chair, a nice ergonomic keyboard and mouse, or a hands-free headset... All of these things are peanuts compared to the cost of employing someone.
It’s actually cheap enough that you could buy it yourself if you wanted to. It certainly went into my bookmarks, just in case I ever actually need a literal battlestation.
I think it was a joke.
I honestly don't think I could work for a company that cheaps out on equipment and supplies anymore. The difference it makes every day, not only to the quality of the work I do but to the working conditions for me and the people I work with, is worth it.
If I can provide free coffee to the whole office on my own salary alone, maybe that’s not a great way to save money.
Anything that slows you down or irritates you eats up more minutes than just the raw time you spend waiting, too.
Say something takes 1 second to compile vs 6 seconds on slower hardware, and you're having to recompile ten times to fix a bug. The raw waiting time might only be 10 seconds vs 60 seconds, but that extra pause, waiting every time to see whether you've fixed the bug, might annoy you enough to drain significantly more productivity out of you.
You're best keeping your developers happy so they enjoy plowing through the work instead of hating it.
For the sake of a few thousand dollars for a laptop that can be used for several years, even a 1% increase in efficiency is very likely worth it.
What we should really be doing is spending all our time 'researching' on the internet. Increasing our value to our companies. That's what I tell my boss anyway.
If a person is in flow, you want them cranking out as much output as they can with no slowdowns. That was an advantage of Smalltalk and Lisp. Any language with a REPL, or an IDE supporting something like one, will let you do this.
From there it can mean using fewer plugins, disabling part of the static code checks, doing less full debugging, etc.
I remember how activating the debugging flags wouldn't work on some projects because the machine couldn't handle it memory-wise.
All of these have a compounding effect that is non-trivial.
That it is worse in another field is not a very good argument, is it?
I'd say anyone who drops the money on a 4-year CS education of any rigor, survives it, and then survives and succeeds at the fairly rigorous interview cycles required to get a job in SWE these days is absolutely entitled to request top-of-the-line hardware to perform their duties.
I'd say your argument makes it "seem" like it's coming from envious, curmudgeonly Luddites.
If individuals in other vocations feel that these developers are "entitled brats", perhaps they should switch careers? I can't imagine a teacher went into education seriously expecting to be provisioned the latest rMBP as a perk.
Are we assuming all jobs are equally dependent on high-end hardware to provide the best RoI for time spent?
I simply said it was hard and rigorous. Not the most.
Insofar as it makes developers more empathetic towards users, it's a good point to make.
A teacher presumably spends most hours actually interacting with students and uses the computer for preparation and paperwork, something your 10-year-old Dell is probably well suited for, during the minority of their time spent on it.
Your dev spends most of their time on their machine, and even if they don't have to build software in a compiled language, they may still be running a heavy development environment, several browsers to test the result, virtual machines, etc.
To drive the point home, let's consider the cost of a 5% decrease in productivity due to using an inferior machine.
If a teacher is contracted to work 185 days (37 work weeks) a year and earns $60k, then at 50 hours a week the teacher earns about $32/hour.
If the teacher spends 10 hours per week on the computer, the cost is no more than $32 * 37 * 10 * 0.05 = $592.
If your software developer earns $100k over 50 weeks and works 50 hours per week, almost all of it on the computer, then the cost is $40/hour * 40 computer-hours * 50 weeks * 0.05 = $4,000.
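The same arithmetic as a tiny script, so the assumptions (salary, weeks, computer hours, the 5% loss) are explicit and easy to swap out:

    # Cost of a 5% productivity loss from an inferior machine.
    def cost_of_slowdown(hourly_rate, computer_hours_per_week, weeks, loss=0.05):
        return hourly_rate * computer_hours_per_week * weeks * loss

    teacher = cost_of_slowdown(60_000 / (37 * 50), 10, 37)      # ~$600
    developer = cost_of_slowdown(100_000 / (50 * 50), 40, 50)   # $4,000
    print(f"teacher: ${teacher:,.0f}  developer: ${developer:,.0f}")
    # (the $592 above comes from rounding the teacher's rate down to $32/h)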
This doesn't account for the additional costs incurred by having to hire more developers because management is too incompetent to retain them by buying them nice tools.
Teachers don't talk to themselves in classrooms, so a teacher's lost time can affect some percentage of up to 180 student-hours per day, or 900 student-hours per teaching week, per teacher. A typical '8 form entry' secondary school in the UK will have around 100 teachers plus admin / heads of subject / 'leadership'. The school year is around 38 weeks.
I sometimes think something like ChromeOS, but one that can run IW software and just stays booted, would be better. An appliance.
it all adds up.
Just for your information: your answer probably differs from the one I'd get from whoever does the accounting at your company, and their answer is the right one.
If you're a proficient IDE user, learning Vim and setting it up to a level comparable to a top-notch IDE (Visual Studio, IntelliJ) would take more than three or four days. Three or four weeks (or even months) sounds more realistic, in order to get efficient:
* code navigation
* code refactoring
* tool integration
* workspace/project management
Don't let anyone tell you otherwise. I'm a Vim user, and the people who say it will take just days or a few weeks have simply forgotten how long it took them to ramp up. Or they're fooling themselves that their Vim does everything a capable IDE on strong hardware does.
And then I'd tell them about alternatives after they were proficient with vi. Not one ever switched away from vi.
Did that cost us in initial productivity? Probably. But it's such a minor thing when it comes to NCGs.
Now this is being ambitious :)
(I put the blame half on the browser vendors, and half on modern cloud web apps. My tabs usually include Gmail, Jira, Confluence, Trello, and Slack. Even doing nothing, between them they'll sometimes have the fans spinning...)
This is also a good argument for running tests on a separate, dedicated machine.
I also disagree strongly about needing to run tests on the exact specs of your average consumer. Most of us aren't writing software for a small set of hardware configurations, so determining those average specs is likely not even possible. If you're working on an embedded platform or with specialized hardware, I do agree that you absolutely need to run tests on the actual hardware regularly, but I'd still argue those tests should run on a dedicated machine and should be in addition to tests that verify the code is obeying its contract.
Kinda expensive, though, having a set of multiple dedicated servers (or VMs) running on God knows how many dozen cores and hundreds of GB of RAM just to run the tests that my laptop runs fine by itself, all in the name of matching the "exact specs" that my stuff is going to run on.
You're describing system testing, which takes place extremely late in the product cycle.
Can you imagine really trying to run 20+ year old tech in developer space? 8-10 MB/s throughput and 70 bogomips?
My favorite example here is from an old CS job my wife had in connection with HP. This shop was such a grind house that they refused to buy a new chair that would be compatible with her back issues. Refusing to buy an employee a $120 chair at the cost of perhaps 10-20% of their productivity just doesn't make sense mathematically; ditto for any company that has people working on computers and refuses to shell out for good keyboards and mice. These pieces of hardware are so cheap that having a lengthy discussion about why you're not going to get them probably costs your company more money than just snapping them up.
I for my part would prefer if, for instance, the Slack developers were confined to machines on which their current product runs as badly as it does on mine, even if they feel so miserable and underappreciated as a consequence that their uplifting have-a-nice-day loading messages get replaced by passive-aggressive snark or something.
I know that I'll be significantly more cost-effective with 32 gigs of RAM, because I won't need to spend time killing processes and rebooting VMs after half a day of work.
I know which keyboard and mouse are still comfortable for me after working 8 hours, etc.
I know I'll be more productive on a MacBook, and not because I'm an Apple fanboy. I hate Apple; they've done more to kill open source than any other company, even MS. I'm a Linux fanboy. But I need to use various tools that don't run on Linux. I could "tough it out" on a cheaper Windows machine, but it wouldn't be cost-effective. I would be less productive.
A professional knows what is cost-effective and spends their hardware budget wisely to optimize their productivity. They don't rip off the company for the latest shiny gadget. It is silly to trust a dev to essentially run a multi-million-dollar company, but not to pick out a computer.
This only holds true if your developers are 100% efficient, programming every second the machine is running. But let's face it: the first hour of the day is most certainly not the most productive (10-minute boot? Fine, I'll make coffee meanwhile). You could easily schedule a meeting, like a stand-up, while the machines fire up, if those 10 minutes really were needed.
You would actually multiply a percentage of inefficiency by hours worked.
Also, honestly, human beings doing intellectual work can't just do something else for 10 minutes and lose zero productivity, because intellectual work is fundamentally dissimilar from assembling widgets or from managerial work.
Consider reading http://www.paulgraham.com/makersschedule.html because you fundamentally don't understand how people work and can't be a good manager without at least attempting to understand.
Yup, I'm talking about the beginning of the work day. Nothing productive would be interrupted.
> Consider reading http://www.paulgraham.com/makersschedule.html because you fundamentally don't understand how people work and can't be a good manager without at least attempting to understand.
I have no idea how you reached your conclusions about me from reading my comment, but thanks for judging. Regarding this piece by PG: did you read it? Because it actually supports my claim, to schedule a meeting in the beginning of the day, while machines would boot, in order to save the precious time between arrival and the start of real work.
You said that this calculation is erroneous because the developer could easily make coffee or have a meeting while his machine boots up and thereby recover that lost time.
This is a very puzzling suggestion. Slow machines aren't merely slow to start; they are slow to complete user operations while the user is sitting at the machine awaiting the result. The time cost is the sum of a thousand small delays throughout the day. You can't productively fill the extra 30 seconds you spend waiting, 20 times a day, with a meeting, for example.
In fact, acceptable hardware/software wakes from sleep in a second or cold-boots in 30-90 seconds. Boot-up isn't really the problem.
>Because it actually supports my claim, to schedule a meeting in the beginning of the day
What the actual article says:
>Several times a week I set aside a chunk of time to meet founders we've funded. These chunks of time are at the end of my working day, and I wrote a signup program that ensures all the appointments within a given set of office hours are clustered at the end. Because they come at the end of my day these meetings are never an interruption.
PG actually suggested meeting at the end of the productive day to avoid breaking up productive working time.
I suggest you read it instead of skim it.
I suggest you understand it instead of mindlessly quoting it. It's clear PG wants no interruptions during productive work time, but if a meeting is scheduled before productive work time begins, nothing gets interrupted.
What may apply to one person or industry absolutely does not apply to them all.
And to boot - I'm the GM of a very, very large solar company.
If this is true, I would expect investing in hardware that most effectively gets out of a dev's way to have an even higher return on investment than the time-and-productivity arguments suggest. The emotional toll of dealing with the dev equivalent of paper cuts should not be underestimated.
Devs don’t care whether their laptop is 2 years old or not, they care that their compilation takes 5 minutes when it could be 2.
And a strong culture of belief that, if a developer couldn't get something working well enough in that kind of environment, then that should be taken as a reflection on basically anything but the developer workstation. Inefficient code, excessive memory usage, just plain trying to do too much, bloat, whatever.
But I realize that's a hard thing to sell. A lot of developers don't really appreciate being reminded who works for who.
The tools used to create a piece of software are often fundamentally different from those used on the other end.
This means that machines should be provisioned in accordance with the needs of the actual tools running on them.
Developers, in addition to running different tools, may need to iterate quickly in order to test functionality. It may be acceptable that an app started once at the beginning of the day takes 3 seconds to launch, but if you deliberately handicap developers' machines so that it takes 7 seconds instead of 1 on every iteration while the developer is trying to fix a bug in the space of a few minutes, you may have damaged your own productivity for no reason.
This has nothing to do with the idea of who works for whom. Incidentally, that statement sounds entirely passive-aggressive. Ultimately, I'm pretty sure all of you worked for whoever your ultimate customer is and were invested in helping each other do effective work; it sounds like the management team was entirely unclear on how to do this. Is that shop even in business?
Contrary to some of the criticism I'm seeing here, it wasn't actually hated by the development team, either. The hypothetical "can't get any work done because compiling takes a bazillionty days" scenarios that people are pulling out of thin air here simply didn't happen. At a shop where developers were expected to think carefully about performance and know how to achieve it, they tended to do exactly that.
Someone who was busy making excuses about how they weren't very productive because they only got a 3.4 GHz CPU while the analysts got a 3.6 GHz CPU probably wouldn't have lasted long there.
As long as you have the time to actually achieve this. Take that time away, and even on the worst hardware you'll see shitty implementations.
Dogfooding is sometimes a good idea, and of course testing on a range of setups is important. I suspect there is a problem with people testing on older software but not trying any older hardware (especially for web apps), which using old machines could have partially avoided.
But the idea that development should inherently be able to happen on older hardware than the product will run on is arbitrary and ridiculous. At best, that creates pointless pressure to rely on hardware-friendly tools, which could mean anything from not leaving open lots of relevant Chrome tabs to pushing developers to use vim instead of a Jetbrains IDE. (Nothing wrong with vim, obviously, but "we intentionally hobbled our hardware" is a weird reason to choose an environment.)
At worst, it fundamentally impedes development work. For an extreme case: Xcode isn't really optional for iOS development, and merely opening it seriously taxes brand-new MacBooks; applying this theory to iOS developers might leave them basically unable to work. Even outside that special case, there are still plenty of computer-intensive development tasks that are way outside the user experience. Just from personal experience: emulating a mobile phone, running a local test server for code that will eventually land on AWS, running a fuzzer or even a static analysis tool.
Even if we grant the merit of that trite "remember who you work for" line, sticking to old hardware doesn't seem to follow at all. We wouldn't go around telling graphic designers that if they work for a customer with an old iMac G3, they're not allowed to have a computer that can run Photoshop. Heck, are there any professions where we assume that building a thing should be done exclusively via the same tools the customer will employ once it's finished?
As a senior dev, I still use an iPhone SE, but the mobile dev team all have the latest iPhones.
The app looks horrible on the SE, and some touch areas are blocked by overlapping text or images.
It is basically unusable on a supported device.
Endless performance problems were masked in development, and then very noisily evident when it reached customers. Performance testing was largely reactive and far removed from the creators of some terrible code choices, so they'd tend to shrug ("Works fine here") until you analyzed the hell out of it.
Now, the code was a big C++ environment, and compilation speed was a problem, but maybe a means of testing in a throttled state would have prevented a lot of grief much, much earlier.
This is a really nice way of working and you can see the results when browsing https://mbasic.facebook.com: even at 2G speeds, even with photos, pages load fast. No unnecessary background process trying to do something, all buttons are there to be clicked on already. A really smooth experience.
Developer tools are only resource hungry today because their developers aren't dogfooding.
Developer productivity hasn't improved since, and has even gone backwards in some ways.
We had to buy cutting-edge PCs, spend >£2k on extra RAM and £4k on 20-inch CRT monitors to develop; the application itself would run on a 133MHz machine.
If you had fewer than 2 blue screens per hour, that was a good hour. This was the early 90's, btw.
Where I worked in the early 1990s (1992-ish), us developers fought over who would have the color terminals (Wyse 370/380) and who would have the monochrome ones (Wyse 50/60). I was low-man on the pole so I always got stuck with the low-spec terminal (though I still have an affinity today to amber VT220 text characters).
Until I "wowed" the owner of the company (small shop) showing how to do windows, dialogs, etc in a terminal using a virtual buffer and some other "tricks". Then I was given one of the color terminals, to jazz our interface (which was mostly numbered menus and basic flat screens).
At one point, I played around with the 16 color Textronix graphics mode you could pop into with the right escape sequence; I think I made a very slow Mandelbrot set generator (never showed that to him; our app was already pushing our dev hardware, which was an IBM RS/6000 desktop workstation running AIX we all shared via 9600 bps serial connections)...
The system was SIROS the system that helped manage the UK's SMDS network so we had budget for it.
The 20 inch monitors where awesome for playing doom late at night
Mind you, I don't necessarily agree with all of this. Well, except the IDE part: Vim and Emacs are tools that more people need to learn.
In every case where I've had a dev DB running on a shared test server, that DB has been woefully underspecced for the purpose, and often in a datacenter with 300ms latency from the office over the company VPN.
Production instances, meanwhile, are in the same datacenter as the production DB, with 5ms latency.
They should have bought everyone a second low spec machine to test on, and let them use proper dev machines for building the software.
I guess if it's a shop where the management feels they need to remind developers suffering through that for 8+ hours a day "who works for who", that was probably the least terrible part of working there.
"Yep, we put those developers in their place."
"Hey, why are they leaving???"
"Oh, you mean we need developers more than they need us?"
You need your doctor more than your doctor needs you. That doesn't change the fact that your doctor is doing work for you, and not the other way around. Same for lawyers, plumbers, electricians, architects, and anyone else working in any number of other skilled professions.
Programmers don't need bleeding edge, just a lot of RAM.
I recently upgraded from a 2015 MacBook Pro to a new i9 one, and right now I'm working on a computer vision pipeline for some data, an embarrassingly parallel task. It takes about 15 minutes to run a process that would previously have taken about an hour. This is a direct improvement to my development experience (trust me!)
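"Embarrassingly parallel" just means each chunk of data can be processed independently, so wall-clock time drops almost linearly with core count; that's why 3x the cores plus single-core gains can plausibly turn an hour into 15 minutes. A minimal sketch of that shape (the per-frame work here is a hypothetical stand-in):

    # Minimal sketch of an embarrassingly parallel pipeline: the frames are
    # independent, so throughput scales with the number of cores.
    from multiprocessing import Pool

    def process_frame(frame_id):
        # stand-in for the real per-frame computer-vision work
        return sum(i * i for i in range(100_000))

    if __name__ == "__main__":
        with Pool() as pool:              # one worker per core by default
            results = pool.map(process_frame, range(1_000))
        print(f"processed {len(results)} frames")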
But there are a bunch of different reasons. Modern stacks can be annoyingly under-optimised; a large Webpack app with live reloading can be irritatingly slow even on recent machines. Fast SSDs are useful for people working with large datasets. Better GPUs can mean better UI performance and more screen space.
In short, remember that just because you don't need the hardware, doesn't mean that others don't! :)
If you work for an employer with deep pockets, I guess, sure, why not? Otherwise a workstation you can remote-connect to (if you work from home or travel) is probably good enough.
A 2015 MBP i7 to an i9 MBP doesn't make anything 4x faster.
How long had you been working on that task, waiting an hour per run? The wasted labour cost might have bought you a dedicated compiling rack two weeks in.
For the cost of an i9 MBP, how long could you have rented cloud resources to bring that task down to close to instant?
I am just curious, tbh. I have various tasks like yours, hobby-wise. But the last thing I'd encourage my laptop to do is compile/render/analyse a problem that takes more than 60 seconds. Beware: if you fully utilise a laptop like you describe, shouldn't it become useless for everything else while doing that?
So many questions...
That's great, but I'm literally sitting at my desk doing it just now, so I can assure you it's not just made up!
i5 2015 13" MBP to i9 2018 15" MBP. 3x the cores, higher IPC, higher frequency, faster memory, faster SSD. It adds up, and for this class of process a 4x improvement is totally reasonable.
I don't know how the hell you work, but I don't just kick off a process and let it run while I sit still at my desk waiting for it to complete :) It just involves working to a slightly different rhythm, and the ability to iterate a bit faster makes that nicer for me, at minimal cost.
Anyway… wasn't the point that "developers don't need faster machines"? I think buying a "dedicated compiling rack" would count!
No idea, but in the long term, more than it costs to buy a new development machine. Plus this way I don't have to fanny around with copying assets back and forth, remoting in to visualise things, setting up a server, or whatever. And the new laptop makes everything a little bit faster and more enjoyable to use. The price of the machine is pretty marginal for a tool I use in excess of 40 hours a week.
> Beware: if you fully utilise a laptop like you describe, shouldn't it become useless for everything else while doing that?
Nah, it's fine generally. Everything's just a bit slower until it's done.
I used to work at a different company that had server farms for compiling. Even with the servers, compiling the 26GB software package took 4 hours when it was high noon and everyone was using them. I have seen compile jobs last 8+ hours. There were a few times when, by the time the compile was done, I had completely context-switched away and forgotten what I needed to debug.
No matter where I was, be it at home, in a computer lab, presenting my work to peers or my supervisor, or outdoors in a field, I had the same tools, same large dataset of raw h.264 video streams, same hardware, same everything, without needing to rely on streaming data to and from some server on the internet, or worry about keeping my work and software in sync across multiple machines. I could tweak my algorithm parameters in the field, and I could continuously compile my huge 100-page LaTeX sources for my dissertations from... the beach :)
I think it's always important here to realise that people have different use cases and priorities for their equipment, and that there's no right or wrong answers. Some people are happy to fork out for a portable; others don't require that, and would rather have a fixed workstation with more power. Some developers are totally fine with 10-year-old kit, and some can benefit from newer stuff. I'm sure everybody evaluates their circumstances and comes to suitable conclusions for themselves!
It's a pity they are otherwise crappy for my work (I need a reliable keyboard, I do not want a "touch bar", I do want USB-A ports).
If your machine doesn't have quite enough RAM, it will swap and likely be unusable.
Once you have just enough it will fly, and adding more RAM will make little difference.
I can't imagine the OP's aged stack is so low on RAM that he puts up with swapping.
In 2018 a mobile i9 had 6 cores/12 threads, with boost around 4.8GHz (so let's be reasonable: thermals might let us keep 3.6).
For a super-optimized parallel load you could totally see more than a 2x speed-up.
But yeah... why do that on your laptop?
Not on a 13" MBP, it didn't. Source: there's one on my desk.
Not everybody needs a laptop, or a super powerful development machine. But if you want to process some stuff nice and quickly, while not being tied to any physical location, it's a totally reasonable solution.
The Core 2 Duo/Quad on LGA775 is the last platform whose Intel Management Engine (ME) can be completely removed. The me_cleaner script recently added the ability to clean this platform.
A Q9650 quad-core CPU is the best performance that can be reached on this board. I have two and I have run me_cleaner on both. I do my finances on the one that runs OpenBSD.
Firewalls are great, and you should have one, but you have to constantly watch and understand everything coming in and out of it. Works ok as a full time job, terrible when you have other things to do.
If you're using a lot of virtual machines, sure. If you're working on something that's inherently RAM-hungry like video processing, sure. If you're working against a local database server, maybe. If you're using something like IntelliJ or ReSharper (but not necessarily Visual Studio), then, <sigh>, yeah, I guess.
If you're doing Web development, OTOH, it's probably better if you not have much RAM. If front-end, because a significant percentage of your users will be using low-RAM devices like smartphones and Chromebooks, and you shouldn't need any more RAM than they do. If back-end, because the production servers will (hopefully) be handling a lot more load than what you're doing in testing, so if you get too comfortable with treating RAM as a plentiful resource in development, that's going to be a recipe for scalability problems.
I do mostly Go, and yeah, there it doesn't matter as much, but you should see some of the build times on these services. There isn't even a big scary monolith left in our architecture (besides a frontend in node).
Oh, and let me not forget minikube.
Hell, I did a lot of (hobby, not work) development on an Eee PC netbook a few years ago. Besides the discomfort that a small screen and sub-normal keyboard bring, it was a reasonable experience. I didn't have to run Slack then, though.
Obviously it depends on what you're doing though. Processing data? Running a ton of tools all at once? Differing workflows and requirements have different hardware needs.
I've sat down and enumerated the things I do daily on my computer versus what I did back in ~2000. Besides streaming (YouTube, Netflix, and Spotify, basically), it's pretty uncommon that I do anything I couldn't or didn't do back then. The performance of these things now seems about the same, anecdotally. Maybe it was a little slower then, but not so much slower compared to the hardware performance difference. It makes me sad.
Also, as I said in other comments, I once developed hobby projects on a netbook. I of course ran a minimal system and didn't run resource hogs like Slack. I'm currently using a cheap laptop running a minimal system (though unfortunately I do need to use Slack, Chrome, and Docker for work). They run fine.
I think developers or companies have decided that developer time is more important than the performance and resource use of their products, punting that cost onto their customers. If my work didn't require it, I would definitely switch to more resource-conscious tools, and I'd be a bit happier. Oh well.
I used to be able to open 3 Twitch streams in 720/1080p back in 2010 on a dual core Athlon X2 5000.
These days, if I open 2 HD streams in Firefox on a 4.2GHz quad-core 4670k, the system essentially freezes: the browser becomes unusable and both streams lag. Thank god for streamlink, which lets you bypass the browser interface and view the streams in VLC or MPV.
Not to mention Electron-based apps using 600-800MB for a chat program with some light jpeg usage.
Developers' blatant disregard for users' system resources is insane. It seems that regardless of hardware improvements, the modern developer only targets "acceptable" levels of performance, leaving power users frustrated.
It doesn't seem as easy to switch Windows to a lighter desktop environment.
Granted, using XP today is not recommended for other reasons.
"Ram is cheap"
"The maximum ram you can get is 8-16G"
Developers who are making Electron applications (not that Electron is the primary cause; it's just a correlation I see often) constantly assume that "RAM is cheap" for their users, and thus don't pay attention to memory consumption like they should.
This means you're right, in a way: vim is good enough for most people, and a decent terminal emulator is going to cost you much less than 100MB of RAM.
However, Slack is consuming 2GiB on my machine, Skype for Business is using 500MiB, and Outlook is at 600MiB. I use Safari and not Chrome (Safari is at 800MiB with too many tabs open), but if I were using Chrome I could be using many multiples of gigabytes.
If everyone thinks their program is worth 1GiB of RAM or more, then your machine becomes very limited in what can be backgrounded. You might as well run Android/iOS if you're running with 1GB of RAM in today's application/web ecosystem.
The thing is, I am quite conservative with memory usage and I'm still quite sure I wouldn't be able to work on anything with 8GiB or less.
(I mean, I just checked, and I'm using 15G of memory: https://i.imgur.com/xd6eB91.png)
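For anyone who wants the same tally without a screenshot, here's a rough sketch using the third-party psutil package (pip install psutil) that prints the top resident-memory consumers:

    # Rough sketch: top 10 processes by resident memory, like eyeballing
    # Activity Monitor / top. Requires the third-party psutil package.
    import psutil

    procs = []
    for p in psutil.process_iter(["name", "memory_info"]):
        mem = p.info["memory_info"]
        if mem is not None:               # some processes deny access
            procs.append((mem.rss, p.info["name"] or "?"))

    for rss, name in sorted(procs, key=lambda t: t[0], reverse=True)[:10]:
        print(f"{rss / 2**20:8.0f} MiB  {name}")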
So it's not Electron, but it's what Electron enables.
That said, large companies using Electron is what I don't get. You can't expect me to believe Slack doesn't have the resources to create an actual app for their product.
Also, they might have started with Electron, and now it's hard to move away from it.
You can relate if you've ever had to set up a JS task runner/webpack from scratch. If you don't do it regularly, expect to spend 2-3 hours going through documentation and lots of outdated Stack Overflow posts.
JS is easy as long as you stay within the constraints of whatever scaffolding you're using; if you need anything outside of that, or need to upgrade your stack, you need to know a lot of little things. "Best practices" move fast in JS, and projects die fast too.
Our company's customized Eclipse, on the other hand, hogs 8GB of RAM while doing nothing, needs 2 minutes to start, and is sluggish in general. It's of course written in Java, which is much closer to native than Electron, but they still wasted resources everywhere they could.
You can't run a compile on an 8GB machine; it will start swapping to disk.
Sublime doesn't have all the features, Notepad++ neither, and I just don't like vim-style editing.
That leaves VS Code.
I've tried replacing Slack with the wee-slack plugin for WeeChat, but while it works, it's far from a complete solution, and I still need to spin up a Slack client sometimes.
1) 4GB, let alone 1GB, is already cramped with just Slack and any one of your usual bloated shitware issue trackers/wanky PM toyboxes (Jira, Asana) open in a tab or two, plus the usual few tabs of DDG/Google, Stack Overflow, some docs, etc. That's before any actual code-writing tools enter the picture. The basic suite of tools needed to just be working at all in almost any role is barely not-painful to use on 4GB. Worst case, you're in an agency and have all your tools, plus several duplicates for the same purpose for a client or three, and so on, all open. Yes, it's because all these tools are terrible and eat like 20x the RAM they have any right to (and even that's generous), but I still have to use them.
2) Better hope you don't need any design tools at all (Sketch, say) if you're trying to get by on 4GB or less for everything, unless you like only having one thing open at a time or dealing with UI sluggishness, I guess.
3) Docker(-compose) or minikube or whatever? Service dependencies, local mock services? Running test suites? Without a strong CPU and 16GB you'll see slowdown.
4) A fan of any of the fancier webmail clients, like recent Gmail or Inbox or Outlook or whatever? I'm not, and I just keep the Basic HTML version of Gmail open because its full-page loads are faster than the "speedy" AJAX garbage on those; but if you are into that sort of thing, take a look at their memory use some time.
FWIW I think almost all the tools surrounding and supporting development these days are god-awful resource hogs that somehow still manage not to be very good, and I think 1GB absolutely should be enough memory to get by doing node/ruby/php dev. But I still have to work with that junk, and 8GB is the bare minimum to do that without hitting swap constantly, IME, and even with that you've got to be careful. 16GB is much more comfortable, especially if you sometimes have to do things other than just web dev.
I think I could manage with less RAM (I frequently code on my Chromebook with 2GB of RAM). My theory is that RAM usage expands to fill the amount available: https://en.wikipedia.org/wiki/Parkinson%27s_law
Jira's not as bad as Asana, but depending on the setup it can be pretty close. Then there's InVision etc., which are much lighter than those but still pretty damn heavy if you're trying to get by on 4GB or less. And/or maybe you've got Outlook and Teams and all that. And that's just the communication and collab tools, not even any of the stuff that produces actual work output. Temporarily having to use a 4GB machine with that kind of workflow is why I'm now permanently on Basic HTML for Gmail: it loads fast enough that I can close it, and it uses so little memory that there's no reason to. I couldn't spare the 300+MB for Inbox or whatever with all that other junk open, and besides, Basic HTML is much faster.
Even if you're not seeing performance issues on your own developer machine, you'll hardly see a developer running on 2GB of RAM.
I don't think I'd enjoy the experience of developing on that machine!
If I wasn't running on a Mac at work I could comfortably use containers without the overhead of another VM.
Looking forward to putting an inexpensive linux workstation together when this Mac eventually bites the dust.
The large Rails app I work on can easily eat up 500MB of RAM. The webpack dev server I'm running to compile the Angular frontend for said Rails app is taking another 500MB. Visual Studio Code (easily the best TypeScript editor) is using over a gig of RAM all by itself.
A good IDE like PhpStorm is going to use 1-2GB.
A browser will consume 1GB+. Want multiple browsers for testing? Add another few gigs.
Most devs now use a VM or container with PHP, MySQL, etc. running inside it. Add at least 1GB, maybe 2GB.
You probably need Slack or HipChat running. 0.5GB to 1GB.
A mail client. 256MB+.
Your company probably has Office 365, so you'll need Outlook. 1GB+. Add it all up and you're pushing 6-8GB before the OS and file cache get anything.
The 4GB isn't for building or running code; it's for the tons of browser tabs open with documentation. I routinely have >10 tabs open, and that's quite likely to cause swapping on 1GB of RAM.
Sure, I could slim everything down by running a tiling window manager instead of a desktop environment or configuring my browser to unload older tabs (or close and reopen with bookmarks), but that takes extra time, and if time is money, that money is better spent on a few extra gigs of RAM.
Sure, there is a lot of development that can still be done with 1GB of RAM, but you're going to run into limitations, and it is far from "perfect".
I have a relatively minimalist Linux environment by today's standards (I use Emacs, Firefox with about 10 tabs, the Signal client, a few terminals and a lightweight WM) and I already use 1.5GB of RAM at the moment. Firefox alone uses 408M of RES, the Signal app almost as much (!) while my Emacs with dozens of buffers uses "only" 130MB. How times change. The rest is used by all sorts of system processes and minor daemons.
So basically I'd be fine with 1GB if it wasn't for the bloat that's modern web-based development. These days I'd say that 4GB is a more reasonable baseline.
The last PHP app I was developing ran the same way the whole company's infrastructure was developed and run: one Puppetized Vagrant VM for the code and one for MySQL, and sometimes a few more for other services.
Sure, if you just want to abstract the stuff away from your host machine, you could reconfigure things to run in one VM, but that's again diverging from production. And in the grand scheme of things, we were working on several different VM types a lot more than on this one PHP app...
No, a machine with 1GB of RAM is not perfect if you are a frontend developer coding in Node. And even if you are not, you might need a browser for Stack Overflow, and you might need Docker or Vagrant for virtualization.
And, you may also want an Electron-based editor, such as Atom or VS Code.
Even 4GB is hardly ideal. 8GB is probably where it starts getting comfortable.
A development machine does not need to have the constraints of a testing machine; that's counterproductive.
In any case, "web bloat" is relative to the target users and target devices. It's one thing to target cheap Android phones in India; it's quite another to target laptops of a SF-based startup. In the second case, web bloat is negligible.
What about relative to how many resources should be required for whatever tasks the software needs to perform?
Just because I have a lot of ram doesn't mean I want Slack to use it all. I'd rather give that space to the OS for file caching and such.
Every morning every Dev in our shop no matter the project is going to run "mvn clean compile -U"
Sure you can split the large projects up. Just get some time from your PM.. next quarter.... :)
If you have a few thousand tests at a couple of milliseconds apiece (really, really easy to do even in a mid-sized project), you're getting above that 1-minute range. Shaving 40% off with a faster computer stops people from task switching while testing.
Task switching pretty much doubles your test cycle, because you never switch back at the precise moment the tests are done (we are developers; we always underestimate everything by a factor of 2).
Even on a top of the line macbook pro that shit chugged.
And the laptop would get HOT.
If you were running native Linux it should be a lot faster.
There's simply a more sharply defined distinction between applications and windows. Cmd-Tab is for application switching.
If the user wants to move to a terminal window they have open in the background, which key combination do they press? Alt+Tab or Alt+backtick? They have to stop and actively think about which window is currently highlighted. Is it the browser? Or another terminal? This completely kills the action fluency of a keyboard shortcut.
This is generally a non-issue, but even if you are getting tripped up consistently, the OS literally tells you at all times which application currently has focus.
Over the course of the year, this will save ~72 hours of development time. Of course, it's not a strict comparison, because I would always do other things while waiting for it to compile, but it's still a massive boon to productivity.
I understand that we want to live in this perfect world where all software is designed to run on 2008 era laptops. That's not the world we live in. It's Fantasy. I implore everyone to keep fighting for it, but Reality is what matters to businesses, and the reality of most businesses is that software is insanely complex, poorly designed, and it still generates revenue; more often than not enough revenue to afford the best machines to support running it.
That, in my experience, actually makes it worse. As soon as you start doing other things, you multitask, forget where you were, lose the zone, and lose much more developer time than that 1 minute.
Having said that, MBP to MBP comparisons are not always apples to apples (sorry about the pun). You need to compare MBPs with CPUs from the same family as well.
That's why I prefer older equipment, especially laptops which have proper keys and a matte screen.
I am typing this on a Model M that is dated "22JJUL88" - I do all of my important typing on these keyboards.
It’s not a precise category. But I’d even go so far as to say they are the type specimens for mechanical keyboards.
Yes, most products in the category use Cherry-type switches, but Topres are definitely considered mechanicals and they combine a rubber dome and a spring.
Buckling-spring keyboards count; rubber domes and scissors don't. There's no overarching principle here, and if there is, it's how they feel under your fingers.
I guess it's a positive side effect of open-plan offices: there is so much noise around from phone calls and co-workers discussing things that it doesn't make any difference.
More seriously, I have a Ducky keyboard with MX Browns (and it's also a weird one, with MX Blues (the noisier ones) for the arrows and Page Up/Down), and I never got any remarks about it.
The only story I've heard of colleagues complaining about keystroke noise was about a friend with a really heavy typing style (to the point of cracking the keycaps), and even in that case, rubber O-rings did the trick to dampen the noise enough.
You can apply this successfully to espresso machines, coffee grinders, furniture, and many tools (gardening, woodworking, etc.). The difficult part is finding them, as you often need them immediately.
Yes, please give me a full keyboard, and don't put things in weird places. My current machine has "Fn" where "Ctrl" should be and it's driving me crazy. Also, "PgUp" and "PgDown" are directly above the left and right arrow keys, and I inevitably hit them when I'm jamming on the keyboard with my meathooks. I don't need a number pad, but if you're going to give me nonstandard buttons, put them somewhere I can't hit them while I'm trying to do actual work.
Unfortunately many "Fn" keys are handled purely in hardware and Windows can't see them. But it might be worth a look, if you haven't already tried it.
AutoHotKey is also useful, especially if you want more complex hotkeys.
The number of times I put my laptop to sleep by accident was infuriating, and it was a work laptop, so I couldn't get into the BIOS to swap the keys back!
You mean directly to the left of the A?
Luckily, this laptop can play x265 1080p video just fine and has 16GB of memory, but 4k is a no-go.
You could unplug it from the computer (it was a mini-din PS/2 style plug). The keyboard itself wasn't mechanical (not a buckling spring or similar system), but it did have full-travel keys and a nice feel for typing on.
I got mine used and had to build a custom battery pack from old cell phone batteries, which made me have to remove 2 meg of RAM (to fit the larger custom battery pack), leaving me with 6 meg instead of 8. I had Caldera OpenDOS installed on it:
...with Monkey Linux installed on top of that (Monkey is a distro that used DOS for the underlying file system - you could even share data easily):
http://projectdevolve.tripod.com/ (downloads don't work)
I'm honestly not sure where or if you can still get a copy of that distro - I should look into it; maybe I should host my copy somewhere...
But even these have gone away, to be replaced by the shitty island keys.
I can use a MacBook trackpad; it's OK for consuming. But the PC manufacturers saw Macs and copied them. Poorly.
A laptop is generally roughly half the speed of a decent desktop.
And from experience, even if mobility is sometimes useful, it's hardly the norm. Personally, my laptop is used outside my work desk maybe 5% of the time at most.
A cheap laptop for emails, meetings, and the occasional ssh into prod, plus a powerful workstation, would be a better option for me, and I'm guessing I'm not an exception. And that option is cheaper than a high-end MacBook Pro.
I feel like if we were doing something that required C or C++ (e.g. video processing or data-training), then they might have had a point about wanting to upgrade, but we were doing a lot of fairly typical Node.js REST stuff, something that could fairly easily run on a Raspberry Pi.
This started when my company started buying Skylake HPs for all new hires because they were "cheaper".
They were only cheaper because they were comparing them to current gen MBPs.
As a result we were stuck with TN panels and 8GB of RAM.
I would rather have a $400 Chromebook with an IPS display at that point, at easily 1/2 to 1/3 of the price.
I have a gaming laptop as well that I don't bother to carry around. It's just not necessary unless you're doing GPU programming or model training. Even then, I'd rather just work with a cloud instance.
Devs, you're supposed to know how to make a computer meet your needs! Don't outsource it to someone else. Even 5+ year old computers run pretty damn quick if you use a lightweight Linux distro.
Older hardware generally means older, unsupported, insecure drivers as well.
Also, I'm confused as to why you claim that you shouldn't outsource to someone else, yet you're fine with working in the cloud...
I don't understand the bit about the console; I haven't met a dev who doesn't find terminals worlds quicker than hunting and pecking in a GUI. Maybe an argument to keep general users on Windows, not really an argument against devs running a linux distro.
I was suggesting that you should understand what's running on your machine and why, and that if you do, that $2k Mac isn't doing anything for you that a machine worth less than a quarter of that won't. Whether or not you have a top-of-the-line machine, there are still reasons to reach for an AWS instance with a powerful GPU attached.
Hardware can become unsupported when you update the OS. It's happened to me with wireless cards when running FreeBSD on an old EeePC. Even when using Xfce, having a wpa_supplicant UI was much simpler than remembering and writing a bunch of scripts to set up all the crap involved in getting it working. Not everyone uses Debian.
> I don't understand the bit about the console; I haven't met a dev who doesn't find terminals worlds quicker than hunting and pecking in a GUI
A few clicks in a GUI can often correspond to very complex command executions in the CLI, sometimes across multiple processes. It can be confusing what's going on, especially with redirected I/O, and if you have to do something different, it often requires editing multiple arguments depending on what you want to do.
The CLI is good if you want to script a common task that's repeatable and changes infrequently, but frequent changes are much faster in a GUI, and you don't have to worry about copy/paste errors or spelling errors.
And if consoles were so much faster, why does everything evolve into a GUI at some point?
> I was suggesting you should understand what's running on your machine and why
There are hundreds of processes running on a machine at any given time. I would guess that most people don't know, and aren't even aware of, what each process does and when it runs in any given state of the machine.
The point is, with a $2k mac (which I would never get by the way), there's easy room for expansion.
There's a billion Windows users out there, do you think there's 0 developers amongst them? :)
I don't know about this. Whenever I do something unfamiliar/complicated on the command line I copy every command I used to a text file for later reference. Repeating these actions in the future is as easy as copy and paste. If I figure out how to do something in a GUI and I don't have to do it regularly I will almost certainly find myself flailing and clicking around randomly when I have to do it again in 10 months.
If you force your devs to use inefficient tools, it might be necessary, though. Like our customized Eclipse that needs 8GB of RAM: it starts page swapping if you have only 8GB.
Other than that I'd say you need good keyboards and good screens. The processing power usually doesn't make you better.
This was right as the term microservice was becoming better known. We were building a very de-coupled, stateless design, passing messages among services.
The MB Air was fine. We did a lot in the cloud as far as testing. Write code -> push -> get test results
These days so much of the work is “fill out yml files”. I see little value in having this 2017 MacBook Pro 15 with maxed specs. Having anything more than Firefox and my editor on the laptop seems useless.
Others' workflows may vary. But for devops, security, and a great many common roles, anything above a mid-spec MB Pro 13 feels like overkill.
This whole "productivity increase from saving 10 minutes" thing is a little silly; people take breaks, get coffee, use the bathroom, etc. Most folks I know doing processing-heavy stuff do it all on a remote server.
With a few exceptions, I think it really comes down to people want nice toys.
Many times I've observed that people running programs on older hardware noticed stupid design non-decisions. Fast ~= blind.
If you can use an old computer for development — great, I'm happy for you. But please don't assume everyone can, or wants to. For my type of work, no CPU has enough single-core performance (I need quick bursts). And since programming is what I do several hours every day, it is quite an important part of my life, so I'm not willing to torture myself on old hardware just for the sake of it.