
I have a reasonably noob-unfriendly Linux distro (Arch) on my bleeding-edge Kaby Lake XPS 13 laptop (a month or two old), and even the touchscreen worked without any messing around with config files and so on. It came with Ubuntu pre-loaded, and I assumed they'd tweaked the arse out of the configs for it, so I expected pain when I replaced the OS. Turns out I was wrong. A vanilla install of a minimal distro, and even the touchscreen and suspend worked.

Woah. I was impressed anyway..

As someone who's suffered OS pains you wouldn't believe (I mean.. I ran Solaris 10 on an HP ProBook as my main workstation for a year one time...) and for well over a decade -- things are MUCH better these days than they were.

Windows? Fair enough for you I guess. I'm not trying to convert you...

I'll come off badly here, but I'll be honest: I kind of judge people who use MS tech by choice while claiming to be technologists as being in the lower end of the skills spectrum.

I'm not an elitist, or at least I'm really not trying to be.. Please don't take me wrong. At the same time, I don't understand how anyone who works in tech, and is thus capable of really understanding these things, would voluntarily pick this stack as their daily driver.

If you're a photo editor or a journalist or writer or something, sure -- I get why that stack works.. Programmer? Sysadmin? I can't get my head around it.

I'd like to understand, though.. What do you do? What tools do you use? Are you just living inside some IDE all the time and not interacting with lots of different tools?

Thinking out loud as I type, maybe that's the difference here then? We *nix folks tend to use lots of small tools and if you're just sat in one big IDE perhaps the OS around it doesn't matter so much to you as it does to us??




>I kind of judge people who use MS tech by choice while claiming to be technologists as being in the lower end of the skills spectrum.

In return, I judge people who use technology choice as an indication of skill as inexperienced. At one point, I was like that too. I was all about tweaking and customizing Linux; getting it to run just so on hardware that, in some cases, was actively hostile to it (I still haven't entirely forgiven Broadcom). Moreover, at that time I was time-rich and cash-poor, so it made sense for me to spend extra time in order to gain open-source versions of closed-source tools that cost hundreds or thousands of dollars.

But, at some point, I realized that I wasn't really getting anything out of it. Watching stuff fly by on the command line, and tweaking config files by hand makes you feel like a 1337 h4x0r the first time you do it. But after you've had to tweak config files to get your wifi working for the twelfth time, after you've had to apologize to your collaborators because you can't edit their documents, after you've wasted people's time because your laptop can't connect to a projector, you realize that the real determinant of success isn't whether you're using Windows or Linux. It isn't whether you've memorized man pages or obscure command line flags. It isn't whether you can rattle off the contents of your xorg.conf and smb.conf by heart. It's whether you get stuff done. All the rest of it is just means to that end. And I realized that Linux, on the client, was more of an impediment to getting stuff done than a help.

I still use Linux. I have a half-dozen or so Linux VMs running on this machine right now for various projects. The fact still remains that new programming languages and server technologies come to Linux first, OSX second and Windows much later. But in terms of desktop/laptop experience? Linux still requires far too much tweaking and configuration, and I find it much easier to get stuff done with Linux on the server and Windows on the client.


Super late reply, and I suspect I'm talking to myself at this point; real life got in the way again. I did see your comment when you posted it and did mean to respond at the time.

Well, better late than never.

First off; I totally agree with you.

I wasn't trying to say that the ability to configure xorg or constantly tweak your desktop setup is any indicator of technical skill. I've also met a lot of people who think it is, and they irritate me significantly too.

I'm not one of those. I don't judge skill on how well one can edit config files and I disregard the views of most of those who do as juvenile and in search of a measuring tape.

I had a similar journey to you w/r/t the need to get shit done vs having a fragile mess of customization to maintain and eventually it drove me to OSX (one 'yum update' before going away for lunch and coming back to an 80x25 console just as a change window opened was the straw for my particular camel).

I was comfortable there for a bit. It stayed the hell out of my way for most of the time (I just have fullscreen terms and a browser) which is what I needed from a machine that I'm working on. OSX stopped being the best tool for that job recently, but that's a post for another time.

So, back to my judgmental/biased statement.. I'm not really sure what I'm trying to say here or how to say it, but I'll try again for the hell of it.

I have found through years of personal observation (YMMV x1000) that a Windows user in any tech position I'm involved with is an immediate smell of a lack of technical skill. Again, that sounds really bad. Once again, to be clear: I'm not saying there aren't smart .NET or whatever folks around, and I'll be proven wrong at some point (happily!). However, I'm sticking to my view that, as far as I've seen, on any team I've had (be it Java devs or sysadmins), the one or two folks using Windows are always the weakest members of those teams.

While you can shoot me for being honest, I find the idea that a capable ops engineer or java developer would willingly choose MS as a driver, after they're at the point where they understand the issues around closed source and privacy, incomprehensible.


I wouldn't put it like that. I would read Windows use as a sign of conservatism rather than a lack of technical skill. They're not entirely divorced, but it's not the same thing.

It also depends on your field. Yeah, if you're doing web dev, seeing Windows on the desktop might be a smell. But if you're doing embedded programming? Or gamedev? Or even just native software development for desktops? All of those areas pretty much require Windows. The toolchains are, in many cases, Windows-only, and even if there are cross-platform toolchains, 95%+ of your users are going to be using Windows, so you may as well run it to experience the app as your users will see it.

In fact, this is why I lament the lack of Windows in web development. There are so many web sites out there where you can tell (usually from font choices) that the entire development team was using Macs, because the site looks fine on a Retina display, but absolutely atrocious on, say, something like a 1366x768 TN laptop panel.

Moreover, in the Seattle area at least, I'm seeing a slow but steady movement back to Windows machines from OSX, because Apple is neglecting its offerings and people are finding out that most server-side programming languages and frameworks actually work just fine on Windows these days. While setting up, e.g., Python and Ruby on Windows was really complicated and annoying at one point, these days it's as simple as downloading the .msi installers from their respective websites and running them. And that's leaving aside WSL, which (though currently unfinished) promises to bring a full Linux userspace to the Windows kernel.

I don't know what you mean by "choose MS as a driver," but I personally have willingly chosen to go back to Windows as my normal day-to-day computing experience. Linux GUIs are atrocious messes designed by people too busy cargo-culting what Microsoft and Apple put out in their last iteration to do any actual UI/UX research. And OSX, for whatever reason, never really sat well with me. Maybe it was the lack of a proper "maximize" function. Maybe it was the menus at the top of the screen rather than the top of the application window. Maybe it was the fact that closing the last application window didn't close the application. There were too many decisions that I, personally, found weird and unintuitive, even though OSX is supposed to be the more "intuitive" GUI environment. Again, this is all client-side. On the server, I still have no hesitation in choosing Linux. It works, it's stable, and it's very well supported.

As far as "closed source", well, honestly, I don't care that much. Like I alluded to above, I've seen too much crap open source software to have the illusion that open-source is some magic pixie dust that makes software better. It doesn't. The continuing state of Linux GUI (un)usability proves this. The fact that there are no open source IDEs that even approach the power, speed and usability of JetBrains products or Visual Studio proves this. There's a reason that "the year of the Linux desktop" only happened when Google took over and transformed Linux into something indistinguishable from a closed source OS.


Yep; like I said I have definite biases and my experiences are only from unixy shops and server/backend stuff - I hope you don't think I'm claiming otherwise. In my specific area of tech it's a definite smell.

When I said 'choose driver' I really meant 'daily driver', which means workstation/laptop OS to me (so Apple, Linux, Windows, BSD, etc. etc.) -- almost everywhere I've worked in the last few years allows anyone to use whatever they want, so the choice is personal and not enforced.

It's all personal; I'm not sure what we're debating anymore. Use what works for you. I guess I have fewer requirements than you when it comes to interfaces. It's good to have alternatives, and I'm glad that MS is becoming a more friendly/capable host for developers..

My workflow for the last few years: I ssh into some big Linux/BSD servers and do almost all my work there via tmux; my workstation is just a terminal and browser (two workspaces, both fullscreen). OSX was okay for this and the hardware was great (I had one of those fanless tiny MacBooks after a few MBPs; it lasted 10 hours on battery, weighed nothing, and was really pretty), but now I'm on an XPS, and I spent a single afternoon customising dwm, which has turned out to be a much better fit for me (even though I still don't do anything locally).
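(If anyone's curious, the whole thing boils down to roughly this one-liner; 'dev-server' and the session name are placeholders:)

    # attach to a persistent tmux session on the remote box,
    # creating it first if it doesn't exist yet (tmux >= 1.8)
    ssh -t dev-server 'tmux new-session -A -s work'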

Linux isn't a great desktop OS for non-developers. It can work, sure, but I don't see it owning the consumer market and getting a tagged year :}

As for IDEs -- that's another personal matter and is largely dependent on where you work. You'll lose patience being the only member of a team using JetBrains when everyone else is on NetBeans, or the other way around; but I duck in and out of teams without having their specific IDE setup, with just git and a term, no problem.


Honestly, I think we agree more than we disagree. Use what works, indeed. Moreover, even if you're not personally invested in Windows or Microsoft, you should be happy that Microsoft is investing in making Windows a more capable platform for developers, if only because that'll drive everyone else (cough Apple cough) to step up their game.


> At the same time, I don't understand how anyone who works in tech, and is thus capable of really understanding these things, would voluntarily pick this stack as their daily driver.

Because it's way less work. Simple as that.

Windows, in my experience, just works. The things about it that don't work are the exceptions. Windows gets out of my way and lets me work with minimal setup and maintenance.

Linux, on the other hand, has been a pain in the butt every time I've tried to make it my primary desktop OS, and I have tried multiple times (granted, the last time was 6 years ago). Editing endless config files; building some obscure driver from source to make a device function correctly; and heaven help you if you want to do an OS version upgrade.


After hopping back and forth between Windows and Linux for 6-12 months at a time for many years, I've realized that I spend at least as much time dorking around with Windows and related software after a clean install as with desktop Linux distros. Still I keep hopping, because both worlds get too frustrating after a while.

Out of the box, Ubuntu & friends (which have improved a lot in 6 years) are far more functional than Windows, assuming the hardware is properly detected and configured. When something goes askew, it's a toss-up whether it will be easier to fix on Windows or Linux. Familiarity/experience play a huge role here.


Well, my point was kinda exactly that things are better now.....

Yours was the exact reason I used OSX for almost 6 years; it stopped being true, and I switched back to Linux at the end of last year.

Since then, in around 2 months, my laptop has failed to wake from sleep exactly once, vs daily crashes and annoying popups from my last Mac.

Also lots of grateful family/friends who received all my old Apple gear ;)


> things are better now

Ubuntu 16.10 can't handle an external monitor with a different DPI/resolution. And it keeps throwing up error boxes after a fresh install on a Linux machine that came with Ubuntu by default.

Better? Maybe. Good? Hardly.


As one of those people: I used Linux for quite a while in the past but switched to Windows when Windows 10 came out.

As someone else in this thread pointed out, I've only seen the Candy Crush ad once; that's all.

With Bash on Ubuntu on Windows, great Hyper-V based Docker, and the Windows system itself, I'm just much more flexible. I have the advantages of Linux via WSL (Windows Subsystem for Linux), and at the same time I don't have to spend time getting my computer to work.
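As a made-up example (project directory and search string are hypothetical), from a plain cmd.exe prompt I can run Linux tools straight against my Windows files:

    :: run from cmd.exe inside a project directory; WSL's bash starts in
    :: the matching /mnt/c/... path, so Linux grep sees the same files
    bash -c "grep -rn TODO ."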

Whenever I see someone presenting using Linux, they have to spend like 10 minutes configuring their projector. I'm just connected, with great scaling handling, and ready to go.

Additionally, I have a Surface Pro 4, so whenever I want I can just start sketching on it (great for a programmer).

Stop judging people by the technology they use, because maybe you're the one being fixated.

EDIT: Oh, and PowerShell is actually a nice terminal to use.


It's easy to forget how obscure a technology is to someone who doesn't have the experience and understanding of all the underlying concepts. I tend to do the same with programming: I think certain things are trivial, until I see a beginner struggling with concepts so obvious that I didn't think they were worth spending more time on than just mentioning them.

I am a beginner on Linux, and outside of a few basic settings, I very quickly find myself in front of a terminal window, trying to guess the state of the system or what syntax could work without breaking anything.


Yep, so I'm definitely being biased by personal experience here -- not claiming otherwise. I've been using and working with unixy types for >15 years or so and probably type 'grep' a hundred times a day (I'm a lead dev).

I'm genuinely just curious how your world looks, as it's so different to mine -- I must live a sheltered life, but even the junior devs I work with routinely push makefiles and bash scripts etc..

Are you a dev? Do you just live in one big contained environment, like Visual Studio or such, that has all these kinds of tools rolled in?


That's the thing: I'm not a dev, I am a banker. And there are only so many hours I am willing or able to spend on a weekend relearning the basics.

GUIs are infinitely valuable in that case. Windows lets you do quite sophisticated stuff all the way through a UI. You can always observe the state of the system, know what can be done from there, etc. I know a few command lines for some basic things I do all the time, but for things I do occasionally it's just not worth me investing time to do them with a CLI, like creating a VM.

But then I am hostile to this new Microsoft...


> GUIs are infinitely valuable in that case. Windows lets you do quite sophisticated stuff all the way through a UI. You can always observe the state of the system, know what can be done from there, etc.

People from Windows commonly have this impression because in Windows everything is either a GUI or the registry, and nobody wants to touch the registry without a long pole and a hazmat suit, but you occasionally have to anyway. So you quickly build the intuition that anything without a GUI is painful and wrong, which isn't the case on other operating systems.

Most configuration files on Unix-like systems have manual pages that list what options you can set and what they do. For example, the man page for logrotate.conf says you can add the line "mail address" to have it email the oldest log to that address before deleting it. It's no more difficult to read the man page and find the option you want than to look through seven pages of GUI settings to find it, and it's often easier because man pages are searchable. With the further advantage of actually describing what the option does, rather than just giving you a GUI text box with the word "mail" next to it and letting you guess whether you're supposed to type an email address or a filename to mail or something else.
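For instance, a hypothetical /etc/logrotate.d/myapp entry using exactly that option might look like this (paths and address invented):

    /var/log/myapp/*.log {
        weekly
        rotate 4
        compress
        # email the oldest log to this address before deleting it
        mail admin@example.com
    }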

> like creating a VM.

There is a GUI program called virt-manager to do this on Linux. Most desktop Linux distributions have GUI programs for a lot of things like that; people just don't use them as often because the non-GUI alternative on Unix-like systems is actually reasonable. Although you picked a good example, in the sense that creating a VM directly using qemu-kvm is harder than it probably ought to be.
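Though even the CLI route is manageable once you know the tool exists; here's a rough sketch using virt-install (which ships alongside virt-manager), with every name and path made up:

    # create and boot a new KVM guest from an installer ISO
    virt-install \
        --name demo-vm \
        --memory 2048 \
        --vcpus 2 \
        --disk size=20 \
        --cdrom ./ubuntu-16.04.iso \
        --os-variant ubuntu16.04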


Hmm. I've never worked with the 'frontend' of finance -- I know how your infra, payment processing, massive enterprisey stacks etc. work, but I've not really observed the day-to-day stuff the folks on the desks are doing.

It's a wide field, I guess, but at the end of the day, would I be safe to assume that you're mostly playing with data in interesting ways, then telling various services/apps (the aforementioned chunky Java apps, e.g.) to do things based on what you work out from those manipulations?

If that's the case, then I really think there is an advantage for you in mastering the 'unix ide', but only if you're doing really custom stuff all the time...

If you're working with a bunch of tools that already do everything you need, then of course there's no point in working out how to do the same with different tools. But if you're constantly hitting the limits of how you can play with your data, then perhaps it's worth looking into a language that's good at manipulating it; you'd be surprised how powerful this mysterious CLI can be..
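Just to give a flavour (file name and columns invented): totalling one CSV column grouped by another is a one-liner:

    # sum the 3rd column of trades.csv, grouped by the 1st column
    awk -F, '{ sum[$1] += $3 } END { for (k in sum) print k, sum[k] }' trades.csv | sort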


I'm gonna suggest taking a look at Fedora/GNOME. I feel like that's the closest Linux has to a cohesive GUI.


> perhaps the OS around it doesn't matter

Of course it doesn't matter. Windows, macOS, and Linux are all good enough. More than ever, you pick the operating system and hardware to support the applications you want to run.


Well, okay.

The tools I use will work on all these operating systems, but I wouldn't consider Windows or OSX 'good enough' for me.

I suspect most developers could get their stacks to work on everything from FreeBSD to Windows (well, unless you're a .NET dev) -- so that's not really the reason to pick one or the other?

I don't run Linux because 'vim' works better than it does on OpenSolaris... right?


I'm guessing we don't really disagree on this.

You are right about vim. It works well everywhere, so that's not likely to be the application that dictates what the rest of your system is. If you use vim and play a lot of Call of Duty, then you are probably going to use Windows. If you use vim and absolutely depend on Final Cut Pro, you're going to be on a Mac.



