
The government department standardizes on Azure, but all the young devs push for Linux servers so all their shiny toys work great.

Windows is truly dead on the cloud and has been for a decade. The only things people use it for are self-hosted Exchange, file servers, and Active Directory domains - legacy stuff.




There is an entire parallel community of business IT, basically unknown and unintelligible to the Silicon Valley community, where the Microsoft server-side stack is as obvious a choice as Linux is in our world. You may even have some of it in your company. It's really amusing (and dysfunctional) when our projects require "Eng" and "Corp IT" to cooperate. We're both ostensibly computer professionals who select, write, and operate production services, yet to each side the other may as well be from Mars.


And at least where I come from, it seems the owners of .NET and C# shops are all driving Maseratis or Mercs, while the owners of Ruby/Node.js shops are on fixies or electric skateboards.

(And the Java people all work for banks, and ride the train. Even though they're really well paid, they don't own the company, and don't get allocated parking spots in the bank's CBD offices...)

Probably an awful statistical analysis, but it's a common enough thing here that I've heard other people talking about it...


Like 10 years ago, I thought it was great when we garnered a client which was one of those "Microsoft only" type shops, because I like C#/.NET. But the only justification for using it was really "IT made us do it". And I saw how the IT leadership in these places were older people, perhaps fighting older political battles. In any case it's been many years since I've stumbled across a pure Microsoft shop. I think Azure means they know they lost, but also have legacy business to support.


I work in a 99% MS shop, basically (well, apart from our SAP installations). Most of this was indeed set up like this due to homogenization (all our admins know everything, it all works together, no weird stuff on some servers, etc.), but it has actually created a weird situation where we have some smaller services running on old Unix servers and no real admin in the org... we actually have an older guy coming in once a week to check on these servers. If it wasn't so sad it would be hilarious. I mean, I used to do some Unix server stuff ten years ago, but here I am an SAP dude, so while I do have some auths on one of those servers, I can probably basically restart some daemons and that's it.

Oh yeah, and iPhones. We have those - not complaining, really, but those were sold internally as "it looks better if we have those premium phones and not old junk". Ah well...


>But the only justification for using it was really "IT made us do it". And I saw how the IT leadership in these places were older people, perhaps fighting older political battles.

I've seen this happen too in predominantly Linux shops. In the early 2000s Microsoft worked hard to cultivate relationships with future IT leaders and those relationships have a way of enduring. Even if your CIO can't get everyone onto Windows they might still find a place for Office 365 without much time spent weighing up the alternatives.


That’s really how it works. I can attest to that as a former IT guy in a large organization. It’s a personal relationship with a clueless CTO. But to be honest, they’ve just replaced the IBM guys doing the same.

Big difference: IBM representatives actually knew their stuff. Microsoft would rely on its enterprise sales guys to cultivate these relationships, pushing CTOs to make the most stupid decisions and using systems engineers to sugar-coat and hide the details.

I remember one being told off by a salesperson for saying “yes, it’s a bug” in one of these sales pitches.


IT made them do it for good reasons though. Managing a large number of users and workstations in an enterprise environment is one of the things where the Microsoft stack stands out.


Pull the boundaries in. Provide all of the enterprise services through the web. Provide hardened VMs for those services that can't be migrated. Suddenly your workstations are thin clients, and it doesn't really matter anymore if someone decides they want BYOD.

Now you can get after the actual interesting problems of intellectual property theft and corporate espionage. Things which the firewalls of old did nothing to prevent beyond providing a layer of obfuscation.


It's naive to think that just because your production-critical software is web based, you can just have everyone BYOD. That is asking for rampant malware on the company network, and targeted attacks that steal company secrets.


You're completely missing the point. The internal network I'm describing should be nothing more than server metal and virtual workstations. Your border is minuscule at that point. You no longer have to fret about network ports in conference rooms. You no longer have to worry about testing your OS and software patches on the 20 iterations of lifecycle replacement hardware that is floating around in your company. You don't even need to invest heavily in workstations, as thin clients will suffice for most situations. Business continuity becomes a piece of cake because you're no longer factoring in the physical workstation beyond ensuring your cold site has a box of laptops in the corner.

Your concern about rampant malware won't even matter because the only way people can access the web services is through the VMs. The thin clients and physical workstations won't even be able to access most of those services that are mission critical.

In the situation where a company provided physical workstation is necessary, that machine would be just as isolated from the internal network as anyone else. Developers can use one of those, or use VMs that are VLAN'd off. And if your developers are automating their builds and containerizing, then your developers are going through layers of services and automation before their code goes out to production, so even they won't need access to production environments from their relatively insecure dev laptops.


From a programmer's perspective it's definitely possible to do BYOD without any real danger to the domain.

They need access to a few web frontends and to be able to use their SCM server...

But in order to do that, they'll have to be able to test locally or have really hardened access to their servers. That's significantly harder than just forcing everyone onto company hardware without root/admin access.


Developers get special treatment, because the nature of their work often requires them to have local admin access. But the majority of people do not need that, and if I'm in charge of the company network and security, I would not allow anything but approved machines on the internal network.


> Developers get special treatment, because the nature of their work often requires them to have local admin access.

That is extremely rare at larger B2B enterprises.


Every organization I've heard of that did that reverted to giving developers local admin pretty quickly, because of the requests to the IT administrators every time a developer needed admin to install a required dependency, run something as admin to debug, change registry settings, test installations, etc.


Senior scientists in corporate R&D often require local admin access as well. The image analysis and spectrum simulation programs I needed to run and keep updated were not on the radar for our corporate IT folks. They were very good at what they did and were cooperative when we explained our needs and showed that we were competent to manage our own systems and were willing to reach out and ask questions before doing something unfamiliar. Even in R&D, most chemists could use a standard workstation. It was mainly analytical chemists/materials scientists working with specialized instruments and doing custom software development that needed local admin access.


“The company network” is no different from the coffee shop WiFi next door; it just has more bandwidth.

We still restrict web access to company managed machines but it’s a layer, not an essential boundary.


> Provide all of the enterprise services through the web.

That's like 10 years of projects right there.


I work in a mostly MS shop currently. Of course, all the orchestration scripts I've written for CI/CD etc. around our projects use Docker and Node. Then again, in the main project I'm on, everything I didn't make is .NET Framework and not Core... so I'm going to spend months migrating the crap to actually make progress with some other needs.

There are some things I do like about C# and .NET... the culture is rarely one of them. Though I still prefer it to Java for the most part.


Basically in the same boat. If you haven’t looked recently, the container support in Windows is getting much better - including prod support for Windows worker nodes in Kubernetes 1.14. But just using Docker with gMSAs was a big help to our CI/CD and simplifying/standardizing deployment of our apps which are a mix of legacy ASP.NET, WCF, and traditional Windows services.

If you can target at least .NET Framework 4.7.1, you can use configuration builders [0] to pull environment variables/JSON/cloud parameter stores/etc. into a legacy web.config or appSettings without code changes. You can also use the .NET Core config libraries directly in legacy .NET apps if the devs are willing.

[0] https://docs.microsoft.com/en-us/aspnet/config-builder
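A minimal sketch of that second option (the .NET Core config libraries from any .NET app), not taken from the linked doc: the Microsoft.Extensions.Configuration packages layer JSON files and environment variables, with later sources overriding earlier ones. The file name, the MYAPP_ prefix, and the keys here are made-up examples.

    // Sketch only: requires the Microsoft.Extensions.Configuration,
    // .Json and .EnvironmentVariables NuGet packages (they target
    // .NET Standard, so legacy .NET Framework apps can use them too).
    using Microsoft.Extensions.Configuration;

    class ConfigSketch
    {
        static void Main()
        {
            // "appsettings.json" and the "MYAPP_" prefix are illustrative.
            // Later sources win, so MYAPP_ConnectionStrings__Default set in
            // the environment overrides the value from the JSON file.
            IConfiguration config = new ConfigurationBuilder()
                .AddJsonFile("appsettings.json", optional: true)
                .AddEnvironmentVariables(prefix: "MYAPP_")
                .Build();

            string connectionString = config["ConnectionStrings:Default"];
            System.Console.WriteLine(connectionString);
        }
    }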


I had a coworker who came from Facebook and told me similar stories.

We all hear about the exciting engineering projects and stories coming out of Facebook. But the IT side of their org is held together with various off-the-shelf enterprise products and Windows stuff.


Everyone uses Active Directory because it's basically the only functional solution for its problem space.


Which problem space is that? For the group policy part, cfengine, Salt, and Puppet are far better (group policy is just modifying the Windows registry and manipulating files, although at first it seems a bit like magic). For authentication we have Kerberos, which Microsoft pulled in from the free world beginning with Windows 2000. Kerberos does not fly in the cloud world though.
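To make the "it's just registry and files" point concrete, a hedged sketch (not anything Microsoft ships): most machine policies end up as values under HKLM\SOFTWARE\Policies, so a config tool, or a few lines of C#, can write the same data. The key path and value name below are hypothetical.

    // Illustrative only: the key path and value name are made up.
    // Writing to HKLM requires running elevated; group policy's real value
    // is the targeting/refresh machinery around writes like this one.
    using Microsoft.Win32;

    class PolicySketch
    {
        static void Main()
        {
            Registry.SetValue(
                @"HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Contoso\ExampleApp",
                "FeatureEnabled",
                1,
                RegistryValueKind.DWord);
        }
    }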


And Dropbox is just sshfs, ftp-mount, etc. There is a huge advantage to having an all-in-one, it-just-works solution. Active Directory is such a solution and does it really well. Log in to a different machine and things are just set up. Push programs, settings, updates. Change security. It's great.


Have you ever actually done that - managed a large fleet of desktops and a productivity suite, all on Linux with no Active Directory? Complete with whatever the Kerberos equivalent of a backup DC is?


The problem with AD is that it is for Windows only. It does not fly for managing Mac and Linux desktops. So that gives us Salt, for example, which has clients for all three. The second problem with AD is that if you need to manage any complex settings, you'll need to write your own templates. The included group policies cover only basic OS management plus some basic Microsoft Office management; anything else is scripting plus manual work, with AD used only for distribution and for selecting the hosts/users where those settings apply. And yes, I've managed multi-thousand-workstation networks with AD. Do not recommend it.


AD isn't just for Windows, which would be weird since it is mostly a fancy key-value store (with associated functions and services, of course). SSSD, for example, can use AD. The problem is that Linux itself doesn't support the same functionality client side, which using a configuration manager doesn't really solve. And the question wasn't whether you have used AD, but whether you have managed Linux desktop deployments without it, since your claim is that that is better.


I'm in the process of building a solution for managing all three OSes. AD is not on the table because there's nothing to do with Kerberos in our network and AD would be a "Windows only" solution.


Why is AD a Windows-only solution? Large corporations and startups use it to run tens of thousands of Macs and Linux machines in addition to Windows. In fact, I can't think of a single large company that does not use it. It's basically the core for many.


Linux can totally run in an AD domain with auth managed by AD. Client-side SMB is also not bad. But you are excluding Kerberos for some unrelated reason, right?


I would imagine that Google has and is doing that, as is AWS.


You'd be surprised how big of a Microsoft shop Amazon is. It's pretty representative of a large company actually.


Hosting SQL Server has been another common use case for Windows Server for a long time, though it can run on Linux too now.


Last I looked, many of SQL Server's advanced features didn't work on Linux. https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-...


Right - no Analysis or Reporting Services on Linux means that if you're using it for BI, you're probably still running it on Windows.


The last time I installed SQL Server, it was on Linux.



