Company supplies horrible laptop locked down in ways that prevent any real work from being accomplished, but allows unfettered VM use because they don't understand or care about security outside of the environment they designed. Some developers use an underground system of sneaker net and whisper doc hodgepodge to get real work done, but that largely isn't a problem since productivity is measured on a political basis rather than getting anything accomplished.
Thinking you'll improve the situation for you and your group, you obtain permission from management to procure and install a workstation on your desk and replace its OS with Linux. IT drops the machine, you install the OS and enjoy much better productivity for a few months until a security flag is raised in a distant location and you're hauled in for an interview with HR and your boss, who now claims never to have given such approval.
Back to the VM no one cares about. You get a writeup and final warning, and because the economy is going to hell anyway, you stick it out, and things unexpectedly change for the better. Your boss quits, and everyone forgets about the writeup.
Years later, someone decides it's time for a Linux server push. You get tapped for that effort and can now set some policies. It's too late to help much though, since this is 2008 and this company is named Lehman Brothers.
Corporate IT security is always a juggling act - and it's not easy.
I think developers assume those choices are made for political reasons "hey, I did something", but in reality we are often countering known mechanisms of infection propagation.
Remember what happened to Sony? So we disabled SMBv1 and PowerShell - devs complain. Then we see someone in accounting installed a fake version of Adobe something - so we prevent software installs in that department. Then a VP forces us to give him a "dev-mode" OS without restriction and subsequently gets a virus that brings down his department. ...so we have to roll those restrictions out to everyone.
...and, you're right, we don't pay much attention to devs setting up VMs and tunneling around firewalls, because the vast majority of risks we combat don't use those methods. But once they do, yes, we'll lock them down too. (and VMs are becoming more common in malware space, fyi).
There are two important reasons, and one medium reason...
1. These rules are usually distributed via GPO, and the AD system it uses may or may not have a useful and/or well maintained notion of who is a "developer". At best it's done at the departmental level, which isn't that great - since a lot of IT/dev people work in all departments.
2. Whenever you make an exception to a rule that restricts behavior, it's always the worst actors that game that system to get the exception. It's exactly that one VP who thinks he's tech "enough" to handle the risk that'll figure out how to get the exception for himself - he's also the one to run a torrent client to download a free copy of "PDF Writer" or a malicious keylogger.
3. GPOs can be complex to apply - errors happen. The simpler you have your rules, the less likely there is an error that leaves core/critical systems unprotected.
I worked for a large multinational company that everyone here has heard of, whose development machine policy was pretty much: Order (through corp) whatever machine you need to get your work done. You have root. Install what you need to work. We hired you because you are a smart, responsible grown-up and we trust you not to fuck up our IT systems. Don't fuck up our IT systems. And amazingly, it works!
I once joined a company of a couple thousand employees (as a software engineer). The interviewer assured me that employees can pick whichever laptop or PC they want. Ok, great, I thought, then asked for a Thinkpad.
A funny thing happened after I started to work - the IT department was unable to procure me that laptop. After ~3 weeks they said that the company they order hardware from can't provide Thinkpads for the foreseeable future. So I asked them to just order from Amazon or similar. Nope, can't be done. My manager just shrugged and told me to ask for a standard Dell laptop like everyone else in the company. Yeah, it arrived on my desk the next day. Later I found out that the company works like that in many different areas. Lots of freedom and choices at first glance, but once you actually need something, there is always only one way to have it done. Usually the most annoying and time-consuming way.
So, the interviewer wasn't wrong in telling me that I can choose what I want, but the company makes sure everyone ends up with the same setup. Somewhat similar to Henry Ford's "you can pick any color as long as it's black".
I'd just like to comment as someone who's done sysadmin work - the nightmare security scenario for us definitely includes employees bringing their own unsecured hardware to the office and connecting it to the corporate network.
So many security issues with that - that it was never a reasonable request on your part.
Moreover, having also helped with support and procurement, I can say that BY FAR the most efficient thing to do is get the exact same laptop model for everyone. Otherwise, the IT support team is constantly fighting driver and support issues on different brands, models, and bloatware-in-drivers - a new battle for each configuration.
One-size-fits-all laptops make 100% sense for a company.
man all those maverick companies that deign to allow developers to choose their laptops must be screwed, then... (xP)
As to non-engineering / technical staff, though, I am in general for allowing people who care strongly about their setup (for good reasons, mind you) to choose it, while having the vast majority of people (especially the ones ambivalent about setup who simply want the smoothest possible experience) on a constrained, non-fancy-whiz-bang-corporate-it-3000-or-whatever setup. I also freely acknowledge that there are extra costs to this mindset, and people are free to assign their own values/prices/costs to these things as they see fit (all other things being equal).
Most enterprise-scale companies are like that, only using a single source for procurement of laptops, desktops and monitors, and making it impossible to buy IT yourself and expense it.
I take the opposite approach and switch every few months, so I know that each and every nightmarish organisation (which is most of them) is only a temporary client.
I've been lucky enough to work for two 'fairly progressive' organisations in a row (by UK standards in terms of work/life balance, working from home, flexi time, general attitudes around 'how to do good work' (outcomes vs outputs, etc) and all that jazz.)
I figure after 2 in a row, the next place has got to be a step backward a good 5-10 years. I'm putting it off as long as possible.
Though saying that the first place I worked was firmly stuck in the 1970s so maybe I got it all out the way in those first 3 years.
Not being able to select your own hardware for a job is perhaps one of my worst experiences to date.
At one company, I was handed a laptop with an older version of Windows, preinstalled bloatware, 3rd-party encryption software, anti-virus, VPNs, etc.
Even worse, the laptop was more than 16 inches and weighed over 3 kg because the company crammed in as much expensive hardware as possible -- so that the computer would fit everyone's needs.
I understand the need for company IT policies, but global companies have such complicated policies that they basically hindered all my work.
You won't believe the sigh of relief when I started at a small start-up company, and I was able to choose all of my hardware. The CTO ensured my SSH keys would grant me access to all the servers I would need.
Pretty sure screen sizes are still universally measured in inches for some reason - maybe because it is what people are used to, or maybe for the American market as it is one of the biggest?
The moment after I signed the contract to join a startup, the CTO said "OK, then... Here's your laptop, install Windows and we'll talk later..".
My heart sank and felt I took a bad turn in life. I really wanted to work there but I had stopped using Windows for health reasons a long time ago. I wondered if it were rude to resign the same day because there was no way I would use Windows. I thought it was more honest, but at the same time I thought I may be too radical. Maybe I could be spared and still work there. My brain went into overdrive. The CTO then said "Uh, I mean Linux.", he noticed my face and continued "Don't panic, there are no Windows machines here". The whole thing lasted a few seconds.. I heaved a sigh of relief.
We try to be brutally honest with applicants, not only with the hardware but on the specifics of the job. It disappoints many who imagine they'll be doing "data science/AI" with screens that blink and computers that beep.
I was young and immature and started doing that after I tried everything. I was also stuck with "faulty" RAM, which I couldn't replace, that I thought was causing this. The situation occurred often enough, and right when I really didn't want it to happen - like when saving work - that it became irritating.
It is not that Windows caused health issues, but my reaction to Windows was not healthy. Bear in mind that before switching, I had only used DOS but mainly Windows for almost two decades. Let's say we grew apart over the years.
One place I worked - the main electricity supply and management company for a certain state and certain country - I didn't have admin rights to my own machine, and had to put in a (paper) request for any software that I needed to install to do my job (as the image started as the same image used by someone who would only need Word, email etc). I didn't know exactly what tools I needed, so it took - seriously! - a few weeks before I could actually do any work.
My job was supporting, enhancing and debugging internal tools used by engineers. Upon arrival I was handed a USB key with ZIP files of the source. There was no source control. I initiated a request and the political process necessary to 'allow' me to use source control and setup an SVN (this was quite a while ago) server. I quit in disgust a year later, and the first meeting to discuss whether I should be allowed to use SVN or not was scheduled for the day after my departure. My last work action there was to leave a note on my desk advising my replacement where the most recent ZIP files of source were stored.
None of the above was explained to me before I was employed, and I've since learned not to assume, and explicitly ask "Do you use source code control?" and other basic questions at interview / contract negotiation stages.
> I've since learned not to assume, and explicitly ask "Do you use source code control?"
So did I, but when they told me "Yes", it often turned out to mean "Yes, but only in a few departments; not the one you will be working at", and in one case "Yes, but we only use it as a backup system for the completed project, not during development".
Asking the right questions is important, but it's only half of success; the other half is getting truthful and non-misleading answers.
I've been to companies that outright lie about their development environments. I only enjoy using GNU/Linux so one of my first questions is whether I can use Linux and if I would be forced to use Windows for certain things. One company told me yes for the first one, but when I got there I found Linux was completely unsupported by the draconian IT department. Many companies simply don't know the answer to the second but it usually turns out they use Exchange or email some important document in a doc file that only Word will display properly. These details are often forgotten, especially by some people who don't seem to mind following stupid corporate protocols.
Mature IT departments tailor solutions to different classes of users with different needs. If they don’t do that for engineering, that tells you everything you need to know about engineering’s stature in the company.
To an extent, and as allowed by whatever particular constraints exist for that business. Letting you run whatever IDE you want, usually OK. Letting you install whatever operating system you want, well... there are lots of reasons you may choose to not support that choice, that have nothing at all to do with maturity. If your job was improving developer experience, there's only so many times you can come up with solutions that work great for everybody except that one guy on Arch, before you give up.
This is more related to how organizations work: larger organizations tend to need to streamline more in order to scale up the number of employees, and to do more to maintain acceptable security simply because there are that many more people on board.
If the policy is "install whatever you want, and if you get hacked, you're fired", that just won't stand in court.
So the policy is that the IT department is responsible for installations, and the IT department gets the blame. Infrastructure is sort of "outsourced" within the company.
If you were accountable for those younger first-timers running I2P and Tor within the security perimeter, what would you do?
Lower down the thread I mentioned vendor due diligence, specifically because I've done so many vendor security reviews. But there's more to it than that. You might also need to be threat modelling it, legal will need to review the ToS and privacy policy. You probably need to figure out the impact on other services too. If you're in a regulated organisation, there could be any number of other things you have to do, and ongoing compliance costs. If you work in a bank, and somebody wants to install Gentoo, you'd have to figure out how to run anti-virus on it, how to centralise patches for it, how to install endpoint DLP, make sure it has the correct web proxy configuration... the costs can easily stack up.
Yes, you need to do all those things, and it is expensive. The organization's choice not to pay those costs to provide an environment suitable for engineering work (not every single one someone could ask for, but one) reflects its views towards engineering.
It may be correct for them. But for you, as a candidate, it's a good indicator that you'd be happier in the kind of company where engineering has the power to get that done.
Between jamfcloud, osquery, munki, etc. there are plenty of companies and tools out there catering to IT departments that take this seriously.
This has no impact at all on an organisation's ability to provide an environment suitable for engineering. If they have an engineering practice, then you can be sure they've invested resources into making sure they do have a suitable engineering environment. The issue at hand relates entirely to personal preferences. The problem is that an individual cannot necessarily use whatever tools they prefer, not that they don't have suitable tools available.
But if your 10,000 employees each need a hammer, getting them all the same one, from the same vendor, with the same support contract might just make sense.
Edit to your edit: whatever particular sets of tools the business needs. Whether it’s laptops, thin clients, operating systems, IDEs, ticketing systems...
> Even then, what if some app does not work on Windows 10?
If your business had standardized on Windows 10, then you’d hope checking whether things worked on Windows 10 would be part of their procurement process.
> Or what if you need a 32 bit app running on macOS?
You choose something else. Like any business running MacOS would have to do.
I’d probably just pick a better analogy. If the job can be done on the operating system provided, then it can be done equally well by anybody using that operating system. No need to grow bigger hands.
If you’re working from the premise that your brain is only suited to work with one operating system, then you’re really only harming yourself, by shutting down any opportunities you may otherwise have open to you.
Small organisations have the luxury of letting people choose their tools more freely. As they grow, they tend to have to restrict this more. Not just because they might have to support the tools you choose to use, but because they absolutely will have to support how your choices work with all the other tools they have in the organisation. At scale, this starts to get out of hand pretty quickly, and the only way you can provide a good working experience is by adding constraints to the tools used.
On top of that, some organisations have regulations and compliance requirements to meet that make it even harder. If your basic procurement pipeline includes $10,000 of vendor due diligence, then you don't want to just give everybody free rein to use anything they feel like. If those choices introduce additional ongoing compliance costs, then you want to control that even more so.
You could ignore all of that, and focus only on how it affects you. But there’s good reasons that organisations do that sort of thing.
> One company told me yes for the first one, but when I got there I found Linux was completely unsupported by the draconian IT department
Happened to me as well (but I use a MacBook). I won't call it draconian. The IT department usually has to deal with a large number of hardware/software support requests on a daily basis. It's quite understandable that they won't have all the answers for a platform they've never used/supported before.
I didn't say it was draconian for not supporting Linux, just that it was draconian. It was widely recognised in the part of the business I was in that the IT department was holding the company back when it came to technology.
And, in any case, I wasn't asking any questions about Linux, I just wanted to connect to the network. My current company lets me use whatever weird distro I want (even Arch, btw), but they won't officially support it. That's fine. I can, however, ask them to support standards that I need (rather than specific support for some distro).
I find it hypocritical that developers preach "languages don't matter, a good engineer is a good engineer", while also being very particular about work environments. Personally, I want the environment that's most similar to my coworkers' to create the least amount of friction during onboarding and documentation-based learning.
Don't get me wrong, there are bad dev experiences to be had. However, from my experience it's definitely not Mac vs Linux vs PC, but rather intricacies that are unlikely to be explainable during the recruitment process. And if they could be, why would they bother? It's not like applicants are being super truthful on their resumes.
One problem with building upon a framework of falsehoods is that to be a consistently effective liar, you have to have a fantastic memory to keep your falsehoods referentially consistent.
When I was on the proverbial other side of the table helping to evaluate candidates, I became flabbergasted with the frequent false skills claims of candidates, especially those sent by job shops.
To me, it would be exhausting to be phony. I'd rather save that mental bandwidth for stuff I can feel good about doing.
People can be very good at constructing “facts” that are not correct but desirable to them, and not completely devoid of anchoring.
Like that time they were at a book signing event, asked the author for a two-shot, got a smile for a last joke thrown before leaving for the next in line, and now proudly declare themselves “good friends” with that author.
It's seldom straight lying and more often omission of critical facts or misrepresentation of their roles on projects. Which sadly is the common advice given to people writing resumes, and it becomes a prisoner's dilemma.
I sat through whole interviews where "tech leads" craftily avoided acknowledging that they code at most half an hour a day.
Or a dev listing super hard stuff on their resume but never mentioning, until thoroughly asked, that they pair-programmed all of it.
Personally I find pair programming hard stuff way better, because you're constantly talking about the challenge and find issues earlier than when you just implement your initial solution.
I never thought that was somehow bad, but I guess people have different worldviews.
My broad guess is that 50% of resumes have lies throughout. Typically you can smell it during an interview, but on occasion the interviewers that are available aren't themselves technical and people slide through.
Maybe those talking about how languages don't matter are the bad devs, whose ~0 experience in one language is equivalent to ~0 experience in any other language they wrote "hello world" with.
Once I had an interview with a guy who claimed to be fluent in several languages, so as a warm-up question I asked: "tell me any difference between Java and PHP", choosing two languages he claimed to have most experience with. After five minutes of silence, I gave him a fizz-buzz-like test, just to be sure I am not mistaking something else for utter lack of programming knowledge; and he failed at that, too. But until the technical part of the interview, he made a really good impression; good fashion sense, great verbal skills, interesting CV. I think this guy would easily agree with any manager that the importance of a specific programming language is overrated.
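(For reference, the fizz-buzz-like warm-up was of roughly this shape; a minimal Python version, since the exact wording of the exercise isn't the point:)

    # Classic fizz-buzz: multiples of 3 print "Fizz", multiples of 5 print "Buzz",
    # multiples of both print "FizzBuzz"; everything else prints the number itself.
    for n in range(1, 101):
        if n % 15 == 0:
            print("FizzBuzz")
        elif n % 3 == 0:
            print("Fizz")
        elif n % 5 == 0:
            print("Buzz")
        else:
            print(n)

If someone who lists years of Java and PHP can't produce something like that, no amount of interview polish covers for it.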
Ironically, a similar attitude might also come from the opposite side of the expertise spectrum, where the person would roughly mean "PHP is Turing-complete, Java is Turing-complete, Haskell is Turing-complete, Lisp is Turing-complete; what you can do in one of them, you can do in any of them". Like, sure, given enough time and an infinite Turing-machine tape, of course you can. But if someone has dozen years of experience using language X -- not just the language itself, but the entire ecosystem, like knowing the best practices, good libraries, good build tools, et cetera -- how much time would it take them to acquire equivalent knowledge of language Y and its ecosystem? Especially considering that the typical employer will spend $0 for training and requires full productivity from Day 1.
> It's not like applicants are being super truthful on their resumes.
Relevant: https://www.joelonsoftware.com/2005/01/27/news-58/ Horrible developers are overrepresented at job interviews, because the decent ones usually get the job and the horrible ones have to try again, and again, and again.
If you invite only those with good resumes (makes sense, why would you invite those with bad ones?), you mostly get the intersection of people who suck at programming and people who have great resumes. That means: liars.
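To make the selection effect concrete, here's a toy simulation (entirely made-up numbers, just to illustrate the mechanism: good candidates leave the applicant pool quickly, weak ones keep reapplying, so interviews oversample the weak ones):

    import random

    random.seed(0)
    # Hypothetical pool: 100 good candidates (90% chance of an offer per interview)
    # and 100 weak candidates (10% chance of an offer per interview).
    pool = [("good", 0.9)] * 100 + [("weak", 0.1)] * 100

    interviews = {"good": 0, "weak": 0}
    for _ in range(12):  # a year of monthly hiring rounds
        still_looking = []
        for kind, p_offer in pool:
            interviews[kind] += 1              # everyone still unhired interviews again
            if random.random() >= p_offer:     # no offer -> back into the pool
                still_looking.append((kind, p_offer))
        pool = still_looking

    print(interviews)  # weak candidates account for the large majority of interviews

Even though the population starts out split 50/50, most of the interviews you actually sit in are with the weak half - which is the point of the Joel article.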
At my last job, we weren't allowed to interview anyone until we had either been trained in interviewing, or were doing the interview with someone who was trained.
When I took the training class, I assumed it was all about not asking questions we could be sued for, but that probably took up about 5 minutes of the class. Instead, the instructor created exactly the scenario you described. Someone who sounded really, really great and who you'd like to work with, until you thought about what you heard and realized he hadn't told you anything at all.
Good con artists are really good. Even if the average interviewee isn't trying to pull a con, digging into their real experience isn't something we're all good at. Training helps.
I used to think tech adopted/co-opted the term "production" because that is "where the real stuff happens", but now I think it refers more to its theatrical aspects.
The problem with this whole thing is that nobody really embraces the zen of devops (really the zen of everything) -- there should be one way to do things.
Prod is something that runs in the cloud, staging is something sorta like that, but the data is garbage and nobody maintains it, and dev is whatever someone could cobble together in a bash script to get something running using homebrew dependencies -- if you're lucky.
Or everyone hopes to docker-ify everything, but that's its own pile of garbage.
It's actually k8s-ifying everything (docker-ifying was 2015-2018).
> nobody really embraces the zen of devops
I know exactly what you mean, as I am struggling to get people around me to understand the value of tooling and developer delight, but to no avail.
Management and other non-tech teams (sales, product, ops, etc.) that have learned/worked mostly in regimented setups have a long, long way to go before they even start to understand what DevOps was really supposed to be - all about faster, actionable feedback loops.
But faster, actionable feedback loops also bring out the org's/dept's systemic rot/muck in policies/deficiencies/politics and make it visible, right in your face. That's very threatening to many who put up a make-believe fake show that "our stuff's all cool, that other team/dept has all the problems". Hence all the pushback against the new, or throwing a tool at it (docker/k8s/cloud) and claiming it will act as deep magic to fix the fuckery happening all over.
I learnt this over my experience at big, small, and medium-sized firms over the last 15+ years.
The only way to achieve DevOps (yeah, not "adopt"; one can't just "adopt" that) is to get the CTO/COO/CFO to really, really understand the value. Else, it's a lost cause!
Do you think that companies like G are going to be able to hire literally anyone if they start advertising all the mess that they are working around?
Why would you write this article from an engineer's standpoint, with arguments that only benefit other engineers, but for some reason direct it at companies? Why not just call it "I'm upset with the dev experience" and be honest about your intentions?
If you don't present valid arguments that benefit the company, don't title it like advice. I clicked the article expecting more hopeful arguments than "hurr durr I got baited into doing things I don't like", but I see none.
To minimize human suffering you have to donate out all your earnings and be non-profit. I don't have anything against that, but those are irrelevant to this discussion.
I think there's someone who would accept working at each of those places just fine. Even the worst ones. Her point is probably more that if a company admitted their experience up front, people who thought that way was fine would accept jobs and those that hated the idea of that experience wouldn't. Sure, companies could lie, but then people would just quit as soon as possible if they really hated the experience. I'd work at any of those companies myself if the situation was right otherwise.
> I think there's someone who would accept working at each of those places just fine. Even the worst ones.
Yes, people desperate enough to subject themselves to poorer working conditions. What's even a single benefit for companies in cutting themselves off from potentially good employees?
Honesty? People are going to know what is up within the first month, at minimum, so don't waste their time?
This goes along with a broad category of attempted deception that fails because the person you're attempting to deceive has, typically as a part of their job, an understanding of the actual state of the thing you're trying to lie about. Especially egregious when it creates a safety risk, for example, but bad enough when it simply wastes time. Good example: lying to accountants about the contents of financial statements.
How many IT people do you know that have quit after a month because of poor conditions? And how many do you know that suck it up and just stick to complaining about their job?
Engineers don't want their time wasted, but companies have no benefit in trying to prevent it as far as I can see. What kind of brand image would they portray if they started saying how bad their tech stack is?
Well, every person I know who excels at their career can say "Wait, no, this was a bait and switch" and bow out gracefully in the first couple weeks and take one of their backup offers.
I've also worked for extremely corporate companies, and a lot of the saner people backed away slowly during the interview process, but some people took a week or two at the company to register "yes it's really that bad" and bail.
Well, senior engineers won't work at entry-level positions, but medium-skill ones might. Replace skill with working conditions and it's the same thing: people with a lower tolerance for poor working conditions will leave quickly, but some will stay, and that's still a win for the company. Especially compared to advertising how corporate and annoying their processes are.
but she says
> This way, if it sucks, people can see it as a warning and stay far away.
which implies she thinks companies should tell potential hires "you won't like working here and should run away". I doubt many companies will listen to that advice and think "sounds good".
That said, what you said would be a good thing to consider for companies that are hesitant to say what their dev experience is like.
Companies don't have to tell you whether you'd like it or not, because they really can't know. However, they should describe the environment they provide.
I thought I had worked at some stupid companies over the years, but as a general rule all of them knew, at least in a broad overview, how the employees felt about the place. Often they even knew what things people disliked, but they didn't want to change these things or somehow found themselves incapable of changing them.
So again, I don't think relying on companies to tell you the things they know their employees don't like about them when they are trying to get you on board will be seen as a winning strategy.
If companies with decent process start doing this, and a company refuses to share their process, then everyone who listened to this is better off. Now of course, the companies with shitty process did not (and maybe should not) take this advice. However, the situation improved for everyone for whom it should.
My company isn't that big. I work for one which is a vendor for a bigger one. I was a fresh grad when I joined. The clients complained about echo during skype calls. We didn't use headphones at the time and we'd have to constantly switch between mute and unmute. Some (senior) people didn't bother muting themselves unless someone superior to them was present on the call.
I talked to my manager, got approval for headphones, and emailed IT. IT refused. Since I persisted, it got raised to the IT head. The IT head called me in for a quick chat and told me we cannot cater to the whims of every employee. I got told off that I should know my place as a new grad. I offered to use my own headphones, but requested an adapter since the mic/audio were on separate ports. The adapter cost Rs.300 ($4) on Amazon. The headphones cost Rs.1000 ($14). Now my keyboard and touchpad suck too. Overall I have invested more from my own pocket to work with this brick. What I learned from talking to others was that they just recycle laptops and hand them back out. Say Emp N and Emp N+1 complain about laptops; they get the laptops that belonged to N-1 and N-2. Basically the last laptop they received goes to the next person.
And yes I am stuck with a 4GB i3 Windows 7 (no budget to update windows) with an Ubuntu VM. Good luck running docker on windows. Not to mention the daily rite of passage (BSOD) after 10 mins of booting up.
Company M: you're given a machine. In fact, multiple machines. It's your job to set them up properly and install the OSes that the company provides (sometimes they are betas). You've got full admin access to your machines, of course. You sync the source code repo. It takes half a day, but you get all the dev tools with it. You run a full build. It takes another several hours. Hooray! By the next day, you're ready to do development work, run tests, and commit code.
FB is company D or E. At FB you could have a Linux laptop though, and could install Linux on the Mac. Not officially supported, but lots of people installed Linux.
I'd love to hear if anyone has an example of a great dev experience/environment/workflow? It seems like they are all hacky or bad in one way or another.
Some general things I'd quantify as making a "good" environment:
- The time from saving a file and seeing the change should be low
- Running the environment is complete, just like production/staging
- Database schemas are reproducible and in a single location
- Intermittent issues due to differences between machines are nonexistent
- A debugger can be hooked up to the process or otherwise remotely debugged via network
- Very little configuration necessary
- Ability to use third-party APIs for integrations and infrastructure dependencies
- Any crons or async tasks are easy to run
- No other arbitrary limitations on access that get in the way (access is just audited if necessary)
- [If allowed] production data can be pulled in for testing
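Most of that list is mechanically checkable. As a rough sketch (with hypothetical conventions: a ./migrations directory, the app on localhost:8000, a debug adapter listening on 5678), running something like this on a fresh clone tells you quickly whether an environment is in the "good" category:

    import socket
    from pathlib import Path

    def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
        """True if something is listening locally, e.g. the app server or a debug adapter."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    checks = {
        # Schemas reproducible and in a single location: one migrations directory.
        "migrations in one place": Path("migrations").is_dir(),
        # Environment is complete: the app answers locally, like staging/prod would.
        "app reachable on :8000": port_open("localhost", 8000),
        # A debugger can be hooked up: something (e.g. debugpy) listens on :5678.
        "debugger attachable on :5678": port_open("localhost", 5678),
        # Very little configuration necessary: at most one local override file.
        "at most one local override file": len(list(Path(".").glob("*.local.*"))) <= 1,
    }

    for name, ok in checks.items():
        print(("OK  " if ok else "FAIL"), name)

The specific checks would obviously vary per project; the point is that "good dev environment" doesn't have to stay a vibe - you can script the definition and run it in CI.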
1. Build tools/dependencies from source on macOS and Linux
2. Use an OS agnostic init (supervisord)
3. Automate machine setup and deployment in something like Ansible or Saltstack
This allows you to bootstrap a bare macOS or Linux system with a smallish shell script and run all the necessary services in the same way they'd run on production. You have very little specialization between production and dev -- like in the cloud you just run Postgres, you don't use RDS on Amazon.
Coincidentally this makes supporting multi-cloud or bare metal very easy since you don't use anything that's really special in any cloud. Like I would run our main thing on ec2, but CI/dev workloads on Linode where I can get cheap VMs.
Most of the piecemeal automation - migrations, cron - was just tools written in Python; some were daemons, others just scripts that ran during a deploy.
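A minimal sketch of that bootstrap idea (hypothetical paths and config; supervisord is the OS-agnostic init mentioned above, everything else here is an assumption): detect the OS, make sure supervisor is installed, then start the same service set you'd run in production.

    import platform
    import subprocess
    import sys
    from pathlib import Path

    # Hypothetical layout: one shared supervisord config defines postgres, redis,
    # the app, and the cron-like daemons - the same set that runs in production.
    SUPERVISOR_CONF = Path(__file__).resolve().parent / "ops" / "supervisord.conf"

    def ensure_supervisor() -> None:
        """Install supervisor via pip if it isn't already on PATH (same on both OSes)."""
        try:
            subprocess.run(["supervisord", "--version"], check=True, capture_output=True)
        except (FileNotFoundError, subprocess.CalledProcessError):
            subprocess.run([sys.executable, "-m", "pip", "install", "supervisor"], check=True)

    if __name__ == "__main__":
        if platform.system() not in ("Darwin", "Linux"):
            sys.exit("this bootstrap only targets macOS and Linux")
        ensure_supervisor()
        # -n keeps supervisord in the foreground so you can watch every service start.
        subprocess.run(["supervisord", "-n", "-c", str(SUPERVISOR_CONF)], check=True)

The build-from-source and Ansible/Saltstack pieces would sit in front of this, but the shape is the same: one entry point, no per-OS branches beyond the guard at the top.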
Have to say that the company I work for has improved this leaps and bounds in the last 5 years but still has a ways to go.
Macs can be requested for devs but Windows is still the default for users.
lots of services being migrated into azure/office365 means no longer needing the VPN for everything
but sharing large files is still primarily done via mapped network drives, which requires the VPN.
The security team unfortunately MITMs SSL, which means breaking random things like raw files on GitHub (thanks, Cisco).
Most of us have the freedom to spin up local vms/docker for dev environments and easy access to AWS and Azure for spinning up test/prod servers.
No real standouts, but the team is great and the work is interesting
EDIT: I should also note, this is an old school engineering firm, not a tech firm. Change is slow but it does happen
The Mac fetish started when the Mac went BSD-based, and the hardware was higher quality than the common company hardware running Windows. So developers wanted Macs.
Windows has since seen the light and there are plenty of Unix possibilities on Windows now, but they have lost nearly a generation of developers who don't want to move back.
I personally don't care; I have two Macs, one Windows machine, one Ubuntu machine, and several tablets to work on. OSes are pretty commodified.
WSL2 is extremely recent, so in the context of the question, the fetishization of Macs is because using Linux on Windows used to be pretty laborious.
Set up a VM, get networking going on the VM, blah, blah, blah.
The Mac actually supported Linux commands; Windows didn't.
That, or you had a Linux laptop, which had massive overhead in that it broke a lot. Never did it myself, but the complaints tended to be that drivers broke constantly and it was hard to use a lot of commercial software, including games.
As far as I remember, it was common even 10 years ago to see Linux users on HN openly say they'd given up on trying to get their soundcard to work as it wasn't worth the hassle as it would just break in 6 months again.
Hehe, this feels like the typical oblivious hacker answer that begins "So do X complicated thing instead".
People paid for simplicity. Engage your empathetic brain to understand why people don't want to make their lives complicated, instead of trying to invent another jury-rigged workaround.
You can go dig through the HN archives around 2005 when everyone started switching to Macs. You will see people in the comments delighted with their purchase, and how much easier it made developers' lives to have a working, powerful laptop that actually supported Linux commands.
Um, having native X access to your dev machines (which are full tech copies of live) is simpler, rather than this more modern method of everyone developing locally on different hardware and a different OS.
Except for the fact that you won't need to restart to install updates every other weekend, worry about driver issues, which of the 457 always-on background services is hogging the CPU, why the video/audio output is suddenly stuttering, or why it won't come back from sleep sometimes; it runs *nix and has some of the nicest tooling available in both GUI and CLI form.
I recently switched from Mac to Windows (nice desktop gaming rig), and although I ran into a day's worth of issues getting WSL2 set up so I could use Docker, it's been smooth sailing with VSCode otherwise.
Dunno. I also upgraded to those new M.2 "hard drives". Right now the docker images are the dependencies (redis, sunspot, PG) and the test suite isn't particularly large, but also isn't particularly optimized (ex: there were a bunch of tests that looked like they were VCR'd, but weren't), so I suspect any speed differences from the file system would be swamped by all that.
I can appreciate wanting a fast NPM install, but how often are you doing that that it's any kind of significant?
Can you bring your own device in? This sounds like the sort of company where you have your personal laptop beside your work one and go to that machine to do things like check your email, buy lunch, or listen to music.
All while trying to get them to improve their policy, of course.
I've not had the need to do so; there are no issues using work machines for any of the above. The sec team just has some weird ideas about restricting work machines' networking.
But yes, if we want to, there are no rules against using our own machines.
We can log in to things like the company GitHub etc. using SSO/SSH from our own machines, just not the intranet services.
I've seen what happened to a company (X) after it was acquired by another (Y) that dealt primarily with banks.
Company X did well to preserve its own culture after the acquisition, but despite the fact that it didn't deal with sensitive information in the same capacity as Company Y, Company Y's IT security policies took over and made life miserable for devs.
When every task you do is wading through a slog of VPN and RDP molasses, it's no surprise that so many devs quit.
The only reason the project I worked on got anywhere was due to some insidious tricks to punch through firewalls and basically hack into our own dev environment.
It was fun while it lasted.