Hacker News new | past | comments | ask | show | jobs | submit login

>I kind of judge people who use MS tech by choice and while claiming to be technologists as being in the lower end of the skills spectrum.

In return, I judge people who use technology choice as an indication of skill as inexperienced. At one point, I was like that too. I was all about tweaking and customizing Linux; getting it to run just so on hardware that, in some cases, was actively hostile to it (I still haven't entirely forgiven Broadcom). Moreover, at that time I was time-rich and cash-poor, so it made sense for me to spend extra time in order to gain open-source versions of closed-source tools that cost hundreds or thousands of dollars.

But, at some point, I realized that I wasn't really getting anything out of it. Watching stuff fly by on the command line, and tweaking config files by hand makes you feel like a 1337 h4x0r the first time you do it. But after you've had to tweak config files to get your wifi working for the twelfth time, after you've had to apologize to your collaborators because you can't edit their documents, after you've wasted people's time because your laptop can't connect to a projector, you realize that the real determinant of success isn't whether you're using Windows or Linux. It isn't whether you've memorized man pages or obscure command line flags. It isn't whether you can rattle off the contents of your xorg.conf and smb.conf by heart. It's whether you get stuff done. All the rest of it is just means to that end. And I realized that Linux, on the client, was more of an impediment to getting stuff done than a help.

I still use Linux. I have a half-dozen or so Linux VMs running on this machine right now for various projects. The fact still remains that new programming languages and server technologies come to Linux first, OSX second and Windows much later. But in terms of desktop/laptop experience? Linux still requires far too much tweaking and configuration, and I find it much easier to get stuff done with Linux on the server and Windows on the client.




Super late reply, and I suspect I'm talking to myself at this point; real life got in the way again. I did see your comment when you posted it and did mean to respond at the time.

Well, better late than never.

First off; I totally agree with you.

I wasn't trying to say that the ability to configure xorg or constantly tweak your desktop setup was any indicator of technical skill. I've also met a lot of people who think it is, and they irritate me significantly too.

I'm not one of those. I don't judge skill on how well one can edit config files and I disregard the views of most of those who do as juvenile and in search of a measuring tape.

I had a similar journey to you w/r/t the need to get shit done vs having a fragile mess of customization to maintain and eventually it drove me to OSX (one 'yum update' before going away for lunch and coming back to an 80x25 console just as a change window opened was the straw for my particular camel).

I was comfortable there for a bit. It stayed the hell out of my way for most of the time (I just have fullscreen terms and a browser) which is what I needed from a machine that I'm working on. OSX stopped being the best tool for that job recently, but that's a post for another time.

So, back to my judgmental/biased statement. I'm not really sure of either what I'm trying to say here or how to say it, but I'll try again for the hell of it.

I have found through years of personal observation (YMMVx1000) that a Windows user in any tech position I'm involved with is an immediate smell of a lack of technical skill. Again, that sounds really bad. Once again, to be clear: I'm not saying there aren't smart .net or whatever folks around, and I'll be proven wrong at some point (happily!). However, I'm sticking to my view that, as far as I've seen, on any team I've had (be it java devs or sysadmins), the one or two folks using windows are always the weakest members of those teams.

While you can shoot me for being honest, I find the idea that a capable ops engineer or java developer would willingly choose MS as a driver, after they're at the point where they understand the issues around closed source and privacy, incomprehensible.


I wouldn't put it like that. I would take Windows use as a sign of conservatism rather than a lack of technical skill. They're not entirely divorced, but they're not the same.

It also depends on your field. Yeah, if you're doing web dev, seeing Windows on the desktop might be a smell. But if you're doing embedded programming? Or gamedev? Or even just native software development for desktops? All of those areas pretty much require Windows. The toolchains are, in many cases, Windows-only, and even if there are cross-platform toolchains, 95%+ of your users are going to be using Windows, so you may as well run it to experience the app as your users will see it.

In fact, this is why I lament the lack of Windows in web development. There are so many web sites out there where you can tell (usually from font choices) that the entire development team was using Macs, because the site looks fine on a Retina display, but absolutely atrocious on, say, something like a 1366x768 TN laptop panel.

Moreover, in the Seattle area, at least, I'm seeing a slow but steady movement back to Windows machines from OSX, because Apple is neglecting its offerings, and people are finding out that most server-side programming languages and frameworks actually work just fine on Windows these days. While setting up, e.g., Python and Ruby on Windows was really complicated and annoying at one point, these days it's as simple as downloading the .msi installers from their respective websites and running them. And that's leaving aside WSL, which (though currently unfinished) promises to bring a full Linux userspace to the Windows kernel.

I don't know what you mean by "choose MS as a driver," but I personally have willingly chosen to go back to Windows as my normal day-to-day computing experience. Linux GUIs are atrocious messes designed by people too busy cargo-culting what Microsoft and Apple put out in their last iteration to do any actual UI/UX research. And OSX, for whatever reason, never really sat well with me. Maybe it was the lack of a proper "maximize" function. Maybe it was the menus at the top of the screen rather than the top of the application window. Maybe it was the fact that closing the last application window didn't close the application. There were too many decisions that I, personally, found weird and unintuitive, even though OSX is supposed to be the more "intuitive" GUI environment. Again, this is all client-side. On the server, I still have no hesitation in choosing Linux. It works, it's stable, and it's very well supported.

As far as "closed source", well, honestly, I don't care that much. Like I alluded to above, I've seen too much crap open source software to have the illusion that open-source is some magic pixie dust that makes software better. It doesn't. The continuing state of Linux GUI (un)usability proves this. The fact that there are no open source IDEs that even approach the power, speed and usability of JetBrains products or Visual Studio proves this. There's a reason that "the year of the Linux desktop" only happened when Google took over and transformed Linux into something indistinguishable from a closed source OS.


Yep; like I said I have definite biases and my experiences are only from unixy shops and server/backend stuff - I hope you don't think I'm claiming otherwise. In my specific area of tech it's a definite smell.

When I said 'choose driver' I really meant 'daily driver', which means workstation/laptop OS to me (so apple, linux, windows, bsd, etc etc) -- almost everywhere I've worked in the last few years allows anyone to use whatever they want, so the choice is personal and not enforced.

It's all personal; I'm not sure what we're debating anymore. Use what works for you. I guess I have fewer requirements than you when it comes to interfaces. It's good to have alternatives, and I'm glad that MS is becoming more friendly/capable as a host for developers.

My workflow for the last few years is to ssh into some big linux/bsd servers and do almost all my work there via tmux; my workstation is just a terminal and browser (two workspaces, both fullscreen). OSX was okay for this, and the hardware was great (I had one of those fanless tiny macbooks after a few MBPs, and it lasted 10 hours on battery, weighed nothing, and was really pretty), but now I'm on an XPS, and I spent a single afternoon customising dwm, which has turned out to be a much better fit for me (even though I still don't do anything locally).
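For what it's worth, that kind of terminal-only remote workflow can be reduced to a one-liner. A minimal sketch, assuming a hypothetical hostname and session name (neither is from the thread):

```shell
# Hypothetical helper: attach to (or create) a named tmux session on a
# remote box, so work survives a dropped SSH connection.
remote_tmux() {
  host="$1"            # e.g. "devbox" -- placeholder hostname
  session="${2:-main}" # session name, defaults to "main"
  # ssh -t forces a tty; `tmux new-session -A` attaches to the session
  # if it already exists, and creates it otherwise.
  ssh -t "$host" "tmux new-session -A -s $session"
}
```

With something like this, the local machine really is just a terminal and a browser, and closing the laptop loses nothing on the server.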

Linux isn't a great desktop os for non-developers. It can work, sure, but I don't see it owning the consumer market and getting a tagged year :}

As for IDEs -- that's another personal matter and is largely dependent on where you work. You'll lose patience being the only member of a team using jetbrains when everyone else is on netbeans, or the other way around; but I duck in and out of teams without having their specific IDE setup, with just git and a term, no problem.


Honestly, I think we agree more than we disagree. Use what works, indeed. Moreover, even if you're not personally invested in Windows or Microsoft, you should be happy that Microsoft is investing in making Windows a more capable platform for developers, if only because that'll drive everyone else (cough Apple cough) to step up their game.



