Dell's gamble on Linux laptops has paid off (techradar.com)
302 points by em3rgent0rdr on Nov 16, 2017 | 310 comments

As a 20-year Linux user I really appreciate Dell's support in proving the market exists. It improved support across all manufacturers.

I recently acquired an HP Envy 13 (2017) laptop with Windows preinstalled. I really wanted to give Windows a try, but after installing Google Chrome and Docker (an hour after starting it up) it stopped working: Windows booted into a blank black screen.

In a hurry, I downloaded Ubuntu 17.10, clicked yes, and had a working OS in 15 minutes, with everything (suspend, Broadcom card, USB-C video, ...) working out of the box.

Super impressed.

If you're a developer there's no UX reason to go Windows anymore.

Agreed. One laptop worth investigating that I haven't seen anyone discussing in this thread is the Precision 7520. It is the same chassis as the XPS 15" laptop, but you can configure it with some more adult options.

It is 2017 and I finally own a laptop with 32GiB of ECC(!) RAM. That is pretty damn cool. Allow me to roll out my specs, just cause I love this laptop:

* Quad Core Xeon (1505M)

* Discrete GPU with acceptable gaming performance and great at cracking passwords in a pinch

* 32 GiB of ECC RAM

* 1920x1080 15.6" (144ppi) display with no mic or web cam

* An actually decent keyboard

* Touchpad that is actually nice

* So many ports. All the ports. The best ports (4 USB-3, 1 USB-C, HDMI, mini DP, Ethernet, SD Card slot, Smartcard Slot)

* For the cost of equivalent Applecare, someone will show up to my house same day and fix this beast if something breaks

* Expandable and replacement storage and memory

Tell me all those ports don't make you a little jealous if you are using a MacBook; I won't believe you ;). It does come at the cost of this laptop being a bit chunky, so it won't be for everyone. You can also get it with a HiDPI display, but I actually prefer things right in this 120-144ppi range more than Retina.

It is just a great computing package. End of story. I also still love my 2013 MBP. It is also a beautiful machine for its own reasons, so I am no Dell / Linux fanperson, but I have a bit of a crush on this laptop as I just got it about a month ago :)

Sounds like what I should have bought. How long have you had it? Any issues? I'm about to swear off Dell due to the XPS 15 (9550). This laptop has been a buggy mess since I got it: I had to replace the OEM wifi adapter with an Intel one to get any kind of stability, plus battery issues, issues with it waking up randomly in my bag and almost overheating, and the fan randomly running at high speed for no reason (no CPU load or heat reason). I think Dell has issued a BIOS update at least once per month for the past year to try and fix everything :(

I do like the fact that you can actually user repair it though (SSD, memory, wifi, battery) - otherwise it would have been gone a month after I got it. I wish Apple would do this on the "pro" macbooks... ugh.

Only one month. I had an XPS 13 and it was much buggier. I figure the large chassis makes it easier to design the hardware. The 13 worked but was a real battle for the first week.

This machine on 16.04 is a dream so far. I do see Dell has more production issues, especially on XPS machines. Intel WiFi is a must. I learned a lot about what Dell does best on the 13" XPS.

I have a similar Dell 7510 running Debian Sid and my experience is also mostly positive.

I had to disable the built-in graphics in the BIOS to get my 2 external monitors to work (some Nvidia Quadro mess), suspend doesn't work properly (a common Linux problem), and the headphone port does not work unless you toggle microphone mute on and off first (could be some PulseAudio weirdness, but I haven't gotten to the bottom of it so far).
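In case it helps anyone with the same jack problem, that mute-toggle dance can be scripted. This is only a sketch: "Capture" is the conventional ALSA name for the mic control, which is an assumption here, so check `amixer scontrols` on your own machine first.

```shell
# Sketch of the workaround: toggle mic mute twice so the headphone
# jack gets re-detected ("Capture" is an assumed control name;
# verify with `amixer scontrols` before relying on this)
amixer set Capture toggle
amixer set Capture toggle
```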

Even played some games on it, although I try to stay away from that sort of thing.

We're having similar problems at work regarding the headphones. You just have to remove them and plug them in (sometimes twice) before they work after a restart. This problem is present on both Lenovo and Dell laptops and in both cases with Windows 7.

It's a lot bigger and heavier than an XPS 15 though. That said, I am a big fan of Dell's business lines. The Latitude 7000 series are great machines.

I have a Latitude e7470 which I highly recommend, provided you don't need dedicated GPUs. It supports up to 32GB (non-ECC), has a wide array of ports, and is much closer to the XPS in terms of size and weight.

The Precision 5520 is pretty much the same as the XPS 15 though, in dimensions. And largely allows the same options (Xeon, 4K, different speed SSDs, etc.).

What laptop make and model is this please? Sounds great. Is the keyboard backlit?

> One laptop worth investigating that I haven't seen anyone discussing in this thread is the Precision 7520.

He has a Dell Precision 7520. If you customize the model on the Dell website, there's a backlit keyboard option.

^^^^ See above :)

Just make sure anyone seriously considering this takes a long hard look at the physical dimensions and weight. It is not a sleek or light machine :)

The Precision 5520 has very close to the same options, in a sleeker body.

Hastily gets nonexistent wallet out to go buy one

I like how Ubuntu and Red Hat are in the officially supported list of OSes.

Why didn't you go for 64GB ECC DDR4? :P

I couldn't see a smartcard slot on it, and then I learned that the smartcard reader is contactless.

And it supports SD cards up to 2TB in size even.

The GPU only supports 4GB RAM though.


So I just found the order page and have been having a bunch of fun speccing out the most expensive possible device.

After wading through the comedy of

- LINUX/RED HAT Operating Systems are not available with Dell Threat Protection and Endpoint Security Suite option. Please change one of these selections.

- The Operating System you've selected does not support the ENERGY STAR Energy Efficient Option.

- The LINUX Operating System is not compatible with the Processor.

The absolute winner of

- The Operating System you've selected does not support Office Productivity Software.

And the (easily fixed) legacy annoyance of

- The Ubuntu Operating System does not support the Wireless Card you've selected.

I finally reached:

Starting at Price: $8,383.58

Total Savings: $3,005.11

Dell Price: $5,378.47 <-- the actual cost


I added absolutely everything but kept the screen at 1080p (which adds capacitive touch!). So it has 64GB RAM, two 1TB SSDs, a 4GB Quadro M2200, a 3.9GHz Xeon E3, and an IPS display.

I like how vPro can be left unchecked :>

I'm confused why it's still saying Linux is incompatible with the processor though. Why? :(

Let's see if this URL breaks Arc: http://www.dell.com/en-us/work/shop/dell-laptops-and-noteboo...

I tried really hard to come up with a need for the extra RAM and failed. Even with a few beefy VMs. Plus ECC is only available up to 32GiB. I kept mine to about 3800 and got all I can use for a long while I think :)

That's fair.

I open far far too many Chrome tabs and only have 8GB RAM, so I guess I'm very heavily biased.

FWIW, I was able to select 64GB ECC DDR4 in the configurator.

Windows has gotten a lot better, but the rules regarding new computers haven't changed in the past 30 years: you still need to wipe the hard drive and install it yourself from scratch, and then spend a little time disabling the ads, Bing crap, telemetry, etc.

Once you do, it's rock solid, but it should be OOTB and it annoys me that it isn't. It leaves those not familiar with Windows with a bad first impression, and those tech savvy enough to install Linux just don't come back.

"It leaves those not familiar with Windows with a bad first impression"

But an accurate impression. Windows' main focus is Microsoft and partner product sales and ad delivery. No matter how much you remove after install, they're likely to override your very obvious wishes not to be bothered in a forced update.

It's for this reason that I have sworn off ever managing Windows again. The fact that they have turned many Win10 machines into billboards - and this is an OS you have to pay for! - really makes me sick. You cannot get away from advertising built deep into the core of the OS. And MS have deliberately taken away a lot of the control admins have over their own LAN. Even Enterprise editions of Win10 don't have control of the telemetry.

And yes, I've seen first-hand the updates steamrollering your preferences and setting them to nice, MS-friendly 'please target me with marketing' options. I gave up Windows in 2008, was Mac full-time until 2015 and now I run Ubuntu exclusively, at home and work. Canonical aren't perfect, but they're a bit more open about their commercial expectations than MS will ever be.

On multiple occasions I have seen Windows 10 revert the default browser preference to Edge because "an error was detected" with no explanation of the error, and Chrome still functioning normally. I have also seen IE on Windows 10 revert the default search engine to Bing. I'm sure it's all just in the best interest of the end user...

"Windows has gotten a lot better, but the rules regarding new computers haven't changed in the past 30 years: you still need to wipe the hard drive and install it yourself from scratch, and then spend a little time disabling the ads, Bing crap, telemetry, etc."

The funny thing is that Microsoft know that the normal OOTB experience is bad, and created the "Signature" effort to offer people PCs with clean Windows installations. Then they seem to have forgotten that, and started putting their own passive-aggressive "value-adds" into Windows 10.

Well, I remember 30 years ago you didn’t have to disable ads, Bing crap and telemetry. In fact 5 years ago all you needed to disable was shitty drivers and some trial software.

Five years ago it was horrible too; we all just forget the bad things and mostly remember the good.

From 2012: https://www.itworld.com/article/2718342/consumerization/here...

That's all OEM junk though. If you installed from official media on a PC you built yourself (or at least a wiped disk), you didn't deal with that stuff.

With Windows 10, that is unfortunately not true anymore because MS is the provider of a lot of the crapware.

That was what I was responding to. A person recommended that the proper way to install Windows on a new laptop was to wipe the drive and install a fresh Windows. One person said it wasn't needed 5 years ago :)

Maybe 5 years ago for PCs, but it's still year 0 for smartphone crapware bundling. At least on PCs you can remove it while keeping the hardware warranty intact.

> Once you do, it's rock solid, but it should be OOTB and it annoys me that it isn't.

I was very happy with an Asus Zenbook I bought that had the PURE suffix (+ sometimes a number) on the model name. Those laptops come with a clean Windows install + any needed drivers.

It only saves an hour or two for yourself, but I wish more (or really all) manufacturers supplied these models.

A co-worker of mine recently got a Surface Laptop.

Very nice piece of kit, but even that came loaded with ads, and would search Bing and the app store every time you searched the start menu.

It's little things like this that make Windows so much worse. It takes ten minutes in Group Policy to turn it off, but the majority of users will never go near group policy, and will just assume that is just the way Windows works.

> Windows has gotten a lot better, but the rules regarding new computers haven't changed in the past 30 years: you still need to wipe the hard drive and install it yourself from scratch, and then spend a little time disabling the ads, Bing crap, telemetry, etc.

You're just scratching the surface on how user-hostile Windows can be. The ads, bing crap, and telemetry are simply horrific considering it's a paid product.

Windows Update is still a mess. It takes me less than a couple of minutes to run pacman -Syu, while Windows updates can take HOURS.

Some random windows service can still suddenly ramp up disk usage or cpu to 100%. There are two control panels. File Explorer is still terrible. The list goes on.

> and those tech savvy enough to install Linux just don't come back.

It isn't a matter of tech savvy, and pretending that it is is part of the problem with the Linux Desktop community, which is the worst part about using a Linux Desktop if you ask me.

The system has a whole lot of issues, many of them somehow even worse than dealing with Windows 10's bullshit, or else I'd have switched myself by now. Even Linus calls out the Linux Desktop for being pretty crap. Developing for it is a nightmare too, especially in regards to distribution.

I'm trying not to go on a rant here. Point is, there are plenty of reasons to not want to use a Linux Desktop aside from not being "tech savvy" enough.

I feel like you are arguing the same side. The reason you have to be tech savvy is that the Linux Desktop isn't great (note: I currently operate only on linux at home and work). You have to know some tech just to get a decent amount of things working. This is getting better every year, but we are still dealing with issues that the average user has no idea how to diagnose or fix.

"Average User" is a strawman. We really need to stop using that term in software development, but Linux Desktop people especially need to stop using it because to them "Average User" is some drooling moron who uses the CD tray as a cup holder. To the extent that "Average User" ever existed, he has largely moved on to using a phone or tablet anyway.

Also, I disagree that it is getting better every year. Individual pieces may work better once they're properly configured, but on the whole the system is becoming more and more of a complicated kludge of intertwined, abstracted, obfuscated, and otherwise non-transparent mechanisms. I personally don't see much hope for it as long as that trend continues.

I liked version 7 but I believe Windows has taken a downturn since then.

I was about to defend it, but on reflection, windows 10 is a real mixed bag.


- It's fantastically fast, I dislike going back to win 7

- Some of the utilities are nicer


- Random scheduled tasks fire in the background at really annoying times. I did a completely clean install of Win 10, and some tasks fail and then constantly re-run because they've never been marked as completed (e.g. the .NET 32-bit optimization service, some sort of memory optimization service). On my desktop I don't notice, but on my laptop, if I leave it unattended, the fan often starts going crazy. I've tried fixing the broken services but never really got to the bottom of why. What's most annoying is that when you then try to disable or replace these services with dummy exes, Windows Update will "helpfully" "fix" them and re-enable them. There are also various scheduled tasks that automatically re-enable themselves on restart. It was also constantly trying to run one Windows update that kept failing, and it never told me, but it would go through the whole rigmarole of an incredibly expensive update trying to install this one driver.

- The new Start menu is bad; it's simply not as usable as Win 7's. You end up always searching, where Win 7 made it easier to quickly find things without typing anything

- Tiles are generally annoying

- New windows apps are annoyingly fancy and touch screen centric, for example there's no reason a calculator should be semi-transparent, it's actually really visually distracting

- Telem crap is incredibly user-hostile, invasive and almost impossible to get rid of

- Ctrl-alt-delete is partially broken and no longer works seamlessly. You have to do some crazy stuff like create a new desktop to get off a crashed program

- Explorer (File Explorer) just seems to have become a real mess since XP, and searching is agonisingly slow, especially if you're doing it on a folder that contains bower or npm folders; in 2017 Windows still struggles with long file paths

How did you find whether the update services were running or not? I was having trouble with a failing update, but I couldn't find any evidence of it running.

In "Windows Update Settings" there's an "Update History" link which lists all recent updates. The one that was failing was listed every day.

This had been going on for months too.

I think that's my biggest beef with Windows 10, you're not supposed to muck around with the scheduler running tasks when idle or the updates, but it's so brittle that when it fails it does really stupid things without telling you.

The UX improvements to window management, multiple monitors and virtual desktops since 7 are immense though.

> Windows has gotten a lot better [...]

Remember when we used to say the same thing about Linux? Funny how times change.

Regarding gaming - Linux went backwards now that new gadgets like VR headsets and high refresh synced monitors are required for serious gaming. Feels like 2002 all over again without any games.

TBH, the hardware manufacturers could write drivers for their stuff.

Or at least write open documentation so free software can do the work.

So that people can do their work for free.

Is writing proper docs so bad that not even getting these fine people working for you for free makes it worth the effort?

They might do, if Linux wasn't so hostile to the concept of proprietary drivers and had a stable driver ABI.

Not that I necessarily disagree with their position on the matter, but you can't escape the consequences.

Well, there would be other kinds of consequences. See Android, where there are proprietary drivers and most of them are buggy crap thrown over the wall.

There is also a reason why Apple won't ship hardware or drivers that they don't have source for.

I'm not sure "Linux" (by this I believe you mean Linux kernel and distros & open-source software) is hostile to the concept of proprietary drivers. There is only a small subset of users who are hostile to proprietary drivers. And I have no idea why you are suggesting Linux doesn't have a stable ABI.

Because for drivers, it doesn't and never did and never will as long as Linus has anything to say about it. This is in stark contrast to his position on userland ABI, which is so stable you can still run Motif.

Ok, that sounds like a fair criticism.

hangman still works great!

> Once you do, it's rock solid, but it should be OOTB and it annoys me that it isn't.

Really? Which version are you using? On a clean install of Win 10 Pro, I fell foul of several known issues. A failing update to Anniversary edition required reinstallation, and the Creator's update only made it through after initially crashing mid-update. The fast shutdown (which I wasn't aware of) caused me a problem with network connectivity on moving the machine which was hard to troubleshoot because it was unexpected. Then to top it all off once I got it running properly, the start-menu was still full of ads.

Maybe if I could get hold of the enterprise edition, with all the ads/telemetry disabled by default, it'd be a better experience, but my faith in the updates has truly gone and as a result we now just leave the Win10 install of the dual-boot alone.

I'm using Enterprise on an XPS 15, and I've turned off the advertising/crapware etc. with Group Policy, and it is solid.

I wouldn’t use any other version of Windows, Enterprise is the only usable version.

You shouldn’t have to do that, though, especially on Enterprise!

There's an option in Windows Defender settings that automatically wipes the hard drive and reinstalls from scratch, without the manufacturer's bloatware

Tried this on a Lenovo notebook and it reinstalled the Lenovo bloatware anyway. I think this restores the initial Windows image state, but the image that came with the notebook is customized by Lenovo.

Yes, the 'refresh Windows' feature actually reinstalls from the image the OEM provided, which contains all the bloatware. You need to create fresh install media with Microsoft's media creation tool, reinstall from DVD, and delete the OEM recovery partitions to finally be rid of them.

I believe it's even more sinister than that. I recall reading about a feature of Windows 10 where it will automatically install OEM bloatware from separate storage on the board, even if you install from a generic Windows disk/USB. I'm trying to find a source.

Edit: I cannot find a source, so take this with a grain of salt, however I do clearly remember reading it.

I believe the one you're referring to is the "Windows Platform Binary Table (WPBT)" feature, which has a slot in UEFI where an exe can be put, which is run by Windows during an install.

Couple of refs for anyone interested:



It actually runs on every boot, which is even worse. This is how overbroad nanny software like Computrace works: you can remove it, but Windows will just reinstall it at boot.

Kind of a glaring security issue.

So only Microsoft’s bloatware is left. With the ads and ‘telemetry’ spyware.

This is actually useful. Does it preserve user data?

IIRC you have three options: (1) wipe OS/programs, preserve user data, (2) wipe OS and data, and (3) securely wipe OS and data (for if you're selling the PC).

As another 20+ year (24 actually, 0.99pl15) Linux user let me counter: with the Windows Subsystem for Linux there's no reason to wrestle with Linux any more. While Linux certainly works on laptops, it doesn't work well. Wifi and battery life are abysmal, and for me seemingly every other update breaks either the printer half or the scanner half of my MFC device (and if not, then it's time for another Bluetooth breakage; you can search the Arch forums for how many times BT broke on me). Let's not even mention that I simply can't log into one of my clients' VPN networks, as F5 doesn't seem to have a Linux client capable of web login (previously it was a Firefox extension, but of course that capability needed to be obsoleted). So now I need to use my Android phone to VPN in and tether it... madness.

For many years (roughly since systemd) I hoped there was a kind of "modern Linux handbook", especially since people come up with solutions to problems seemingly out of nowhere, but it turns out there's nothing. Some people have knowledge of narrow areas, but the entirety of modern Linux is a frighteningly complex monster. And when you realize your fragments of knowledge make you one of the experts, that's even worse: when I ask about DisplayLink on Linux, I am redirected to the relevant Arch Wiki page, where the meat of the page was written by me years ago...

I set up Windows 10 to my taste https://github.com/chx/chx.github.io/wiki/How-I-set-up-my-Wi... and it's a better world now. Drivers and GUI programs are running in Windows, daemons and CLI tools are running in Linux, switching to a brief game session is trivial.

Do you have a Dell XPS 13 (any version)? Because I do (a 9343), and I don't have any of the problems you listed above. In fact, I spend far less time managing my laptop since I switched to Ubuntu. I started off with 14.04, then switched to 16.04, and I upgraded to a 4.9 LTS kernel in January. That's about it.

I have no issues with battery life, wifi, BT or anything else. The first ~6 months of 2014 were a rough ride, because the hardware was so new there was no proper support for it in the kernel, but the team @ Dell managed to upstream all the required stuff, so depending on their custom Ubuntu release is no longer necessary. Everything should just work out of the box.

Same for me: Dell XPS 13 user, Fedora user, it all "just works" including after a recent Fedora 27 upgrade, although in fairness I did have to replace the Broadcom wi-fi card with an Intel one, due to Intel having far superior Linux drivers: http://www.alphadevx.com/a/510-Replacing-a-Broadcom-wireless...

Once I'd done that, I had the perfect developer laptop (in my opinion, of course :-) )

> there's no reason to wrestle with Linux any more.

Unless one wasn't "wrestling" with Linux in the first place, because things tend to just work these days.

> While Linux certainly works on laptops it doesn't work well. Wifi and battery life are abysmal

That's blatantly untrue. Given the right laptop, Linux is on par or excels. Given the wrong laptop, an install from scratch of the Windows operating system is likely to be far worse.

> problems seemingly totally out of nowhere

So problems that arose without any changes to the system?

> there's nothing.

Well, there's full access to your own system.

> Drivers [..] are running in Windows

Never run old hardware or peripherals? Because if you do, Windows is a world of pain filled with drivers sourced from questionable download sites.

I'll assume good faith here instead of astroturfing, but you might want to tone down the unwarranted praise of X over Y when describing your satisfaction with your current setup, especially when you're apparently not really familiar with Y.

I have plenty of experience with linux on laptops and windows.

Yes, it is very nice to have 1, 2 or even 5 laptops which work with Linux.

Windows still runs on more devices, better.

I have never had to sit out a specific bug that made it into the release cycle of a Windows update, but I've had that happen 4 times in 3 years with Arch Linux.

And I also have a strange experience with Linux: one SSD works flawlessly under Windows, but under Linux it causes mini-lags across the whole system.

While I like my Linux machine and I use it daily @work and also daily on my @private laptop, it takes more effort and more knowledge than Windows.

> Yes it is very nice to have 1, 2 or even 5 laptops which work with Linux.

It's been a long time since I've seen a general-purpose laptop or desktop that didn't work flawlessly with both Ubuntu and Fedora. The only one I have is the MacBook Pro in front of me, but it comes with a decent OS anyway.

> Windows still runs on more devices, better.

Can it delete open files now?

> Can it delete open files now?

Does it matter? And how is this related to the rest of the discussion? :)

The open-delete-close pattern is common with Unix-like OSs. It makes temporary files invisible and inaccessible to any program that doesn't have them already open at deletion time.
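The pattern itself is only a few lines; a sketch in any POSIX shell:

```shell
# Open-delete-close: unlink the file while a descriptor stays open.
# The path vanishes immediately; the data stays readable (and the
# disk space stays allocated) until the last descriptor closes.
tmp=$(mktemp)
echo "scratch data" > "$tmp"
exec 3< "$tmp"   # keep a read descriptor open
rm "$tmp"        # unlink: no other program can find the file now
cat <&3          # prints "scratch data" via the still-open fd
exec 3<&-        # close: the kernel reclaims the space
```

Crash at any point after the `rm` and the kernel cleans up automatically, which is why it's such a common temp-file idiom.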

> under linux it creates minilags

Samsung 840 series? Well-known bug in their firmware. Update your firmware.

Nope; an early Intel SSD from 2009, when they were first released.

Your comment doesn't fully implement the Linux Desktop Evangelism playbook.

It's got variations of: "It works for me!" "You chose the wrong hardware!" "Well you can fix that because the source is available!" and "But Windows is worse!"

But it's missing: "Try distribution #132!" "You don't actually want to do that!" and my favorite: "Maybe you're just too stupid to use it!"

How about these:

Dual-booting on an XPS15. Running Ubuntu and Win10. I've had fewer problems with Ubuntu than Win10.

Had a HiDPI version of the same laptop at the last gig. Both Windows and Linux had some issues which were easier to overcome.

Windows definitely gets better hardware support.

However, having my local "dev" environment be native Linux is glorious. No weird Docker issues. Python is legit. Don't have to mess with Homebrew.

As another 16+ year (2001, not sure which kernel) Linux user let me counter: with Ubuntu there's no reason to wrestle with Linux any more.

Wifi, Bluetooth, HD resolutions, keyboard buttons, printers, everything just seems to work in Ubuntu, and this has been the case for quite a few years now.

Ah, WSL. That sounds great. Docker works, let's try to edit something without messing up the permissions...

  $ git clone ssh://blahblahblah.space/repo.git
  $ code repo
VS Code opens... C:\windows\System32. Because the contents of VolFS aren't available to Windows applications. So you have to use /mnt/c, which uses Windows permissions (so everything ends up 0777, for example). That was disappointing.

Microsoft explicitly warns people in big red letters never to open any WSL files from Windows. I'm not sure if that will ever be possible.

"While Linux certainly works on laptops it doesn't work well."

Part of the issue is that using Linux exposes you much more directly to the quality of the hardware (and the associated firmware and drivers). The hardware that "just works" on Windows might be horribly engineered, but Windows and the proprietary drivers might cover over the problems. On Linux, bad hardware is more likely to just work badly.

> you can search Arch

Isn't Arch specifically for users who don't mind things breaking and going deep into internals to fix it?

Correct, but that has caused their wiki to become one of the better places to find out how to set up and configure services, hardware, etc., even if you have to do a little tweaking to the instructions for your own distribution.

I think it's well-suited to anyone who despises bloat. To anyone who wants to run a resource-efficient and performant system with just the parts they need. And honestly, most users will never need to dig deep. Most users will only, occasionally, need to follow a few steps on the well-explained Wiki.

More of my thoughts on Arch:



I tried Arch out in 2016. Over a period of two weeks, 3 separate updates broke one thing or another.

In my 4 years using mint / ubuntu, an update broke my machine once, and that was when the power went out during an update.

Arch apparently works great for a lot of people, but I'd urge caution.

Did it break in an unrecoverable way, or was it just an inconvenience?

Arch is a rolling release that many choose to update daily, or on a fixed schedule, like once a week. That updates sometimes can break a thing or two, out of hundreds of updates, is to be expected. If that sounds scary to you, Arch is certainly not for you. That said, it has rarely happened to me, and any breakage is usually minor and easily fixed. It's not like you have to even figure out how; it's laid out for you on the Wiki, Forum or Arch's homepage. My experience with other distros has been far more troubling, except Slackware. Arch's strengths easily outweigh occasional inconveniences. Not having to deal with anything Ubuntu or Canonical is a great bonus.

I originally went Slackware when diving into Linux. Manual dependency resolution was just a step too far. Installing Calibre became a nightmare. I turned to Arch and have been very happy. I simply don’t have the time to properly maintain a Slackware box. I do really like it though.

I hear you regarding time... If I didn't regularly try out new software it could still be a viable choice, but now I enjoy the ease of Arch's ecosystem. I guess a lot of people who still use Slackware do a full install and have few needs beyond those packages.

My admittedly anecdotal counter to that... With "beginner-to-intermediate" Linux experience, I spun up Arch on a machine with an AMD processor/mobo and a dirt-cheap graphics card to act as a server in February 2017. I run a system update when I remember (every 1-4 weeks). I haven't had anything break. Admittedly, I containerize nearly everything (Plex, Calibre server, bittorrent clients, etc.) out of pure laziness and do very little outside that on the machine. It has been incredibly low-maintenance, and has been wonderful. I'd highly recommend it.

Incorrect; I've broken Arch fewer times than Ubuntu, and I use it daily while I use Ubuntu casually.

Arch's KISS attitude is really helpful for that. Whenever I have to deal with a Debian/Ubuntu system, I'm sort of wary of "apt install"ing any daemons because of the autoconfigure magic that's going on in dpkg.

With Arch, installing a package just unpacks the package files and runs a few innocuous post-install hooks, mostly to rebuild indexes (mandb, fontconfig, etc.) and compile DKMS-based kernel modules. But nothing potentially destructive like `systemctl start postfix`.

Also, it installs exactly what you want, instead of dozens of tangentially-related "suggested packages". If you `apt install rsyslog`, it pulls in an entire MTA because somewhere deep in rsyslog, there is an option that might send emails, so Debian in its unending wisdom determines that you MUST HAVE an MTA beyond this point. In Arch, `pacman -S rsyslog` installs just rsyslog and a few required shared libraries.

Also, it installs exactly what you want, instead of dozens of tangentially-related "suggested packages".

So does Debian, if that's what you tell it you want (either on the config file or as a CLI argument). Even without that, while it may suggest a full MTA by default, you can ask it for another solution and it'll suggest a simple mailer (like bsd-mailx).

I don't dispute that Arch's defaults guide you to a simpler system, but it's not really hard to start from Debian netinstall and add just what you need, package by package.

> you can ask it for another solution and it'll suggest a simple mailer

Exactly. The correct solution for me is "no mailer at all".

Right, so tell it not to install the recommended packages, like I said.

There is no "autoconfigure magic". Daemons install standard configuration files and start by default. Autostart can be disabled, that's all.

Suggested packages are off by default. You are talking about recommended packages which can be turned off via --no-install-recommends.
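For example, a sketch of both forms — the one-off flag and a persistent config snippet (the file name under apt.conf.d is an arbitrary choice):

```shell
# One-off: install a single package without its recommended extras
# (rsyslog is just the example from the thread)
apt-get install --no-install-recommends rsyslog

# Persistent: tell apt to never pull in recommends/suggests
cat <<'EOF' | sudo tee /etc/apt/apt.conf.d/99-no-recommends
APT::Install-Recommends "false";
APT::Install-Suggests "false";
EOF
```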

By staying on Windows you are however not helping the Linux cause. Having a functional FOSS OS is a lot more desirable for the future of computing.

Install tlp on Ubuntu/mint. My asus ux305 gets a solid 10 hours under light use.

"Wifi and battery life are abysmal"

With a Dell Precision 5520, I get much better battery life on Linux kernel 4.14 than on Windows 10. No Wifi problems either.

Ps. DisplayPort MST support is spotty at best.

"If you're a developer there's no UX reason to go Windows anymore."

'Developer' is a pretty encompassing term and I wouldn't presume to know what requirements people have.

I would claim that software that is available natively only for OS X or Windows is still a pretty strong UX reason.

Or that the deployed software runs only on windows.


These constraints kick into place if we take "UX" to include the total behaviour of the man-machine-environment system and not just installing Linux and the apt-gettable software.

I’ve a dual boot set up on one of my machines with Ubuntu 17.04. I’m consistently having Ethernet issues related to having two network cards, and the multi-monitor support is lacking. I can’t use my triple monitor setup, and if I forget to disconnect my third monitor I have to reboot with only one monitor attached to get it to work. Meanwhile, on the same machine, Windows handles it in its stride.

There are still some improvements to be made.

> windows handles it in its stride

Windows has serious bugs with multiple monitors as well. I switch the second monitor often (usually once or twice a day) and often it would have problems like blank screen on one (or sometimes two) monitors, whole system locking up, HDMI sound device not showing up so I have to play sound over laptop speakers instead of monitor/TV speakers, etc. It doesn't happen often, but it does happen.

I tried using Linux, Windows and Mac and they all have bugs regarding multiple monitor setup. If everything works smooth for you, you're just lucky that your usage doesn't fall into "corner cases".

In my personal anecdotal experience, Mac's multi-monitor support is the worst of the three (and I say that as a currently full-time and otherwise happy Mac user).

Ethernet? It has been the most stable part of Linux for 25 years. Do you have a link to the bug report?

It's funny that you say that, because I'm constantly having problems with e1000e driver on my Intel NUC. Ethernet card stops working and dmesg shows "hardware unit hung" message. Tried to upgrade driver to latest version, compiled from sources - didn't fix it.

I don’t, because I haven’t even investigated the issue myself. I get times where I can’t see the device, and have no connectivity. My solution: use windows.

I am running a 4 monitor setup with Linux Mint 18.2 (Cinnamon) just fine. I set it up two weeks ago and so far I am impressed how easy it was.

However, prior to this setup I had Win 7 on the machine with no issues either.

Reverse anecdote:

I can get my three monitors to work under Linux (trivially, actually), but I can't under Windows, and not for a lack of trying.

Under Windows I can decide which one gets turned off, but running all three is somehow denied. This, of course, with recent drivers straight from the manufacturer (AMD).


I do a lot of support on IRC Freenode #linux and #ubuntu and we get a lot of this kind of report. We see a 90% success rate with a simple-to-apply workaround.

This is so common I wrote an article and fix [0] for it containing 4 shell commands that we refer users to whenever we get these 'works OK in Windows but not in Linux' type of reports.

By far the biggest cause of these hardware issues relating to power (suspend/resume, power-states, platform device enablement, ports - including GPU outputs) not working correctly is not caused by Linux but by the PC firmware itself and is down to the manufacturers.

In short - manufacturers customise the ACPI (Advanced Configuration and Power Interface) DSDT (Differentiated System Description Table) via its _OSI() method to recognise only Microsoft Windows OSs and enable functionality based on the Windows version.

The version strings are of the form "Windows 2009" "Windows 2015" etc., where the later the year the more functionality is enabled. This is a total misuse of the purpose of _OSI but almost all manufacturers are doing this now.

As a result when Linux is the OS often only a minimal or incomplete set of functionality is enabled which causes all sorts of problems.

For a long time Linux has had code to report itself as various Windows versions to try to solve the worst of these but in many cases unless it claims to be the exact 'best' Windows version for each model/DSDT there will still be problems.

From silly things like some devices not working if the PC starts on battery (but fine starting on AC) - recently saw this was the cause of an external HDMI port not being enabled - through not handling power-state switching correctly and thus draining the battery rapidly, to suspend/resume/shutdown freezes and other symptoms.

The solution has frequently been to set the OSI string to match the most recent Windows version the PC's ACPI DSDT recognises, on the basis that the most recent is likely to enable the most functionality, via the kernel command-line option acpi_osi="...".

Checking the supported OSI version strings is quite easy from Linux:

    sudo strings /sys/firmware/acpi/tables/DSDT | grep -i windows | sort
You'll get a sorted list with the most recent version generally at the end. E.g:

    Microsoft Windows
    Microsoft Windows NT
    Microsoft WindowsME: Millennium Edition
    Windows 2001
    Windows 2006
    Windows 2009
    Windows 2012
    Windows 2015
Add the most recent version to the kernel command-line, either via the boot-menu at start-up to test it, or permanently via GRUB, in this form:

    acpi_osi=! "acpi_osi=Windows XXXX"
where XXXX is the latest year reported. The initial =! tells Linux not to offer any of its hard-coded version strings via OSI - this guarantees the one that gets used is the one specified.

On most GRUB-based systems this is done by editing /etc/default/grub and adding the options into the GRUB_CMDLINE_LINUX variable (there may be other kernel options there too) thus:

    GRUB_CMDLINE_LINUX="acpi_osi=! \"acpi_osi=Windows XXXX\""
(note: the two \" are required to ensure the double-quote gets inserted onto the kernel command line surrounding the space in the version string)

Then regenerate the GRUB configuration:

    sudo update-grub
Reboot and test.
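The "pick the most recent year" step can also be scripted; a small sketch, fed here with a sample list since the real input on your machine is the DSDT dump from the `strings` command above:

```shell
# Pick the most recent "Windows <year>" OSI string from a candidate
# list; on a real system, pipe in the output of the DSDT strings
# command instead of this sample printf.
printf '%s\n' "Windows 2001" "Windows 2006" "Windows 2009" "Windows 2012" "Windows 2015" |
  grep -E '^Windows [0-9]{4}$' | sort | tail -n 1
# → Windows 2015
```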

[0] http://iam.tj/prototype/enhancements/Windows-acpi_osi.html

I might look into this but it’s not worth spending the time on right now. Unfortunately, and I mean no disrespect, but _it’s not my problem_. I have existing hardware that works well; it doesn’t matter if it’s the motherboard manufacturer, a kernel bug, or a driver issue, it should work (which it does, on Windows), and until it does, users won’t use it.

As an aside, that’s awful, and kudos to you for spending so much time looking into the issue and for a clear description of it along with a fix. It reminds me of the useragent nonsense - https://webaim.org/blog/user-agent-string-history/

Sometimes linux works fine, sometimes it doesn't and there will always be room for improvements. Fortunately I have three laptops, one for Win10, one for Arch and another for OSX, all have their quirks (but I must admit, arch linux is my preferred one:)).

I would also recommend Fedora 27. I have seldom seen an OS as polished as that. Holds its own versus OSX.

Big note: Fedora is a kinda beta branch of RHEL with experimental features and could have breaking changes.

Fedora is not a "kinda beta branch" of RHEL.

Fedora is a fast-moving distribution, similar to non-LTS Ubuntu.

RHEL is a long-term supported distribution (way longer than LTS Ubuntu). It is for machines that you want to set up and forget about their existence.

The difference is in target audience and the intent, what their machines have to do.

A more proper designation would be that RHEL is a fork of Fedora that is supported long-term.

> Fedora is not kinda beta branch of RHEL.

I thought maybe Red Hat had changed their stance on Fedora, so I read Red Hat's own page, and well, it is 100% a beta testing ground for RHEL.

"The size and expertise of the Fedora community make Fedora an ideal incubator and proving ground for features that eventually get incorporated into Red Hat Enterprise Linux. To meet the quality and reliability requirements that make Red Hat Enterprise Linux the choice for mission-critical applications, Red Hat puts Red Hat Enterprise Linux through its own set of tests and quality assurance (QA) processes that are separate and distinct from those of Fedora."


I understand where you are coming from, but I think it is just a matter of framing. If you consider regular Ubuntu releases to be the "beta branch" of their LTS releases, then the comparison would be apt.

I don't really see it that way. Each Fedora release is reliable, but the software does evolve quickly. The RHEL releases are taken from a point in time on that stream, pushed through an even more extensive QE process, and maintained for an extended period of time.

But Ubuntu doesn't say that they use Ubuntu as "... an ideal incubator and proving ground for features."

I used to use Fedora Core and then up to Fedora 7. I got burned way too many times and view Fedora as only a slight step up from Arch Linux as a working distro. Both Fedora and Arch are more "hobbyist" distros and I will not use them in a professional workflow. I use OpenSUSE and I find their model to be the best of the distros; I haven't ever gotten burned by them. Outside of 2001, when I was installing SUSE and wiped my partitions because I didn't know what I was doing :)

You can read from that excerpt whatever you want, but it still does not make Fedora beta for RHEL.

Just like Windows 10 is not beta for Windows LTSB, or Linux kernel releases are not betas for kernels that will be LTS.

Based on your comment I assume you're an old timer. It's been 14 years since the bad blood days of Fedora Core 1. Fedora is way more polished than back in the day.

Yes, I am an old timer, but Fedora still isn't production ready like Ubuntu, OpenSUSE or the paid-for distros. I am certainly more of a pragmatic Linux user, and on Fedora the graphics card drivers are not proprietary blobs; you must use the open source drivers. I might be wrong, though - I haven't installed Fedora since 14.

I think RHEL is for solid long-term Enterprise-level support, so even though Fedora is more experimental than that, it's still pretty solid IME.

>> If you're a developer there's no UX reason to go Windows anymore.

If you weren't a Windows developer, was there ever?

We've been using Lenovo laptops for web development on Linux with no issues for the last 10 years, and I was using them personally for another 5+ years before that.

Are Mono and .Net Core on Linux as good as on Windows? If they are, as a developer, there isn't much reason to go with Windows

It depends on how much you like VS.

I have actually moved to VS Code and I love it so far. I still can't move to Linux since I need Visual Studio for some older projects, but I hope I'll be able to make the switch in the future. In the meantime VS Code + Windows Subsystem for Linux has been extremely usable for me; no more virtual machines.

You don't have to switch.

You can dual-boot. GParted or most distro's installers will resize your windows partition, just don't forget to disable fastboot in windows first.

You can run Linux in a VM. Granted, this means you can't really use a good tiling window manager, which is one of my main reasons for not using Windows.

You can run Windows in a VM. Take a look at Qubes to see just how integrated this can be.

I've run dual boot systems before. Not a fan of having to reboot just to switch between programs. That's why I used a virtual machine, and now I use Windows Subsystem for Linux.

Once we move to being a fully .NET Core shop, I'll make the move to Linux completely.

> If you're a developer there's no UX reason to go Windows anymore.

I am a developer serving Windows customers, that rely on native UI/UX for their workflows.

Obviously if you're developing primarily for Windows, you're going to need to run Windows. I don't think that's what the person you're replying to meant by that at all, but rather that the Windows UX doesn't buy you, as the user of the OS, anything that Linux doesn't.

I never get to see the same smoothness in graphics and fonts as macOS/Windows/ChromeOS on the same machine with Linux.

Every single time I decide that I'm moving to Linux, I give up after living with the browsers for a day or two. Rendering and fonts are just inferior to other systems in my experience.

> I never get to see same smoothness in graphics and fonts as MacOS/Windows/ChromeOS in the same machine with Linux.

Well, Chrome OS runs the same stuff as your average Linux distro, however maybe most distros don't turn on subpixel antialiasing in their FontConfig out of the box. (I don't know because I've been sharing configs between machines of the same distro for about five years).

Some distros do not enable subpixel rendering indeed, due to patents and them being based in the US. The users have to download a -freeworld package then, if they want it.
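For reference, on distros whose FreeType build does support it, turning subpixel rendering on is usually just a fontconfig preset; a sketch only — the paths are assumptions and vary by distro (check conf.avail on yours):

```shell
# Enable the shipped subpixel-RGB preset system-wide by linking it
# into fontconfig's active config directory (paths vary by distro):
sudo ln -s /usr/share/fontconfig/conf.avail/10-sub-pixel-rgb.conf \
           /etc/fonts/conf.d/10-sub-pixel-rgb.conf

# Check what fontconfig now resolves for the default sans font;
# the rgba property should appear in the verbose match output:
fc-match -v sans | grep rgba
```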

With HiDPI, that doesn't matter anyway.

Another issue is fonts themselves. If you have a document that uses Arial or Cambria or another proprietary font, and you don't have it installed, you are going to get something substituted, which may or may not be metrics-compatible. Do not expect pixel-identical results then.

> patents and them being based in the US

If I'm not mistaken, those patents are dead now. Even Fedora has the freeworld stuff now.

> Another issue are fonts themselves. If you have a document that uses Arial or Cambria or another proprietary font, and you don't have it installed, you are going to get something substituted, which may or may be not metrics-compatible. Do not expect pixel-identical results then.

Yeah, though the Chrome OS core fonts are good metric-compatible substitutes for Arial and friends (though lacking the stylistic variety of the core Windows fonts).

The bytecode interpreter patents are dead now.

Not the subpixel rendering ones ("cleartype"). In Fedora, FT_CONFIG_OPTION_SUBPIXEL_RENDERING is still disabled.

Google also released metric compatible substitutes for Cambria (Caladea) and Calibri (Carlito), but Linux distributions do not install them by default.

Edit: I'm wrong. In F27, the subpixel rendering is enabled.

ChromeOS uses Chrome's own rendering engine, based on Skia.



Skia can only use what the underlying graphics drivers offer.

Sure it does, proper GPU acceleration, hybrid graphics support and hibernation that actually works, and UI/UX tooling resembling Xerox PARC way of computing, only beaten by macOS.

Not everyone wants to replicate a PDP-11.

> proper GPU acceleration

What GPU is this?

> only beaten by macOS

Well, you said you wanted "proper GPU acceleration", macOS GPU drivers are terrible. They're unstable, they have limp shader compilers, they are CPU-intensive, and in the case of OpenGL, they are behind by nearly six years in terms of features. Today, your sleek MacBook Pro supports considerably fewer modern GPU features than your average entry-level smartphone. You're missing stuff like multi draw indirect, texture clears, 8-bit stencils, image copying, explicit uniform location...

> hibernation that actually works

Well, I have that set up (and in my case it really has to be optional, since I have a really disk-scarce device right now). It worked correctly. I believe systemd will even monitor my battery and hibernate before it gets too low, like you'd expect (though maybe it could be brought out of hibernation enough times in a row to eventually run out of power before systemd gets to it). Maybe there should be a smoother transition between suspend and suspend-to-disk like macs with macOS have, but that isn't strictly necessary.

> What GPU is this?

Any GPU capable of DirectX 12 that only gets OpenGL 4.3 drivers or similar on GNU/Linux.

Or my AMD Brazos one on a Linux netbook, DirectX 11 class, which only gets OpenGL 3.3 drivers.

Regarding OpenGL on macOS, I don't care, that is what rendering engines are for, and all relevant ones already support Metal 2.

> Any capable of DirectX 12, but only gets OpenGL 4.3 drivers or similar on GNU/Linux.

What GPU is this? I don't know of a DX12 device which lacks OpenGL 4.5 support in a stable release of Mesa (or the blob, in the case of NVIDIA, since they are uncooperative with regard to firmware).

> Regarding OpenGL on macOS, I don't care, that is what rendering engines are for, and all relevant ones already support Metal 2.

Sure, if you're running only games, and those that were made before 2012, and after 2017, then that works fine. For an example of something that does not currently use Metal (2 or no 2): any web browser (to my knowledge, including Apple's own, except as a backend for the WebGPU prototype), professional graphics application, or visualization utility. The shader compiler is still subpar, even in Metal.

> Or my AMD Brazos one on a Linux netbook, DirectX 11 class, which only gets OpenGL 3.3 drivers.

IIRC Brazos in that range would be... Radeon HD 6330M, and you're right, that should support GL 4.4. If you're running Mesa from 2016 or later, it should have 4.1 (and most extensions, except compute shaders, up to 4.4); if you run AMD's blob you can probably get 4.4 on it today. Work on drivers for pre-southern-islands devices is slow these days, but anything after SI has full certified OpenGL 4.5 on Mesa along with every extension in 4.6 except for SPIR-V loading (coming soon), and the performance is better in basically all cases than it is on Windows, or with the proprietary drivers on Linux (though for now you'll need the proprietary drivers for some features like FreeSync, which is unfortunate but oh well).

I was having rendering issues with a NVIDIA Quadro M1200 when I tried it out with a live CD, never bothered to check again.

You are not going to find the Nvidia drivers on the live CD.

I assume what you saw was the Nouveau driver, which is enough to have you up and running to be able to download the proprietary ones using GUI.

I find that to be irrelevant for development (except games, of course).

In fact, my graphics card (the cheapest to support Vista) is the only thing I haven't upgraded in the past 5+ years. And it was from eBay, used.

Providing good native UI/UX development is similar to games, especially when using WPF, UWP, Qt, ...

I would love to see what you develop.

My graphics card supports the native UI elements of Windows.

My users don't want a "Save as" dialog with life-like water refraction and fog, rendered at higher FPS than their monitor.

Software that adheres to nice designs like Material and Fluent.

So basically working vsync, alpha blending, render to texture and some basic shaders like blur, where even Intel GPUs are overkill?

Then you are in luck; even Intel GPUs under Linux do support OpenGL 4.5. 4.6 is in the works, the only missing part is the SPIR-V support and I don't suppose you need that for fluent or material design.

Honestly, tooling on Windows is in the range [complicated-unusable]. Higher order abstractions and OS-specific optimisations just seem to get in the way when I'm developing.

I for one do not feel Windows has a good story if you're developing anything you don't use VS for.

I don't feel being stuck with a PDP-11 user model is any better.

A world where GNOME and KDE developers get bashed upon, when trying to bring UI/UX into a modern world.

I swapped back and forth a few times and Linux worked for all of that stuff 5 years ago. I do .Net development so I tend to stick with windows. But, from a pure user standpoint Linux is more user friendly on compatible hardware.

> hybrid graphics support

This works with open source drivers, it's only nvidia that is gumming up the works here. With recent drivers it should even be automatically setup, although you still need to set DRI_PRIME=1 when running something you want on the discrete GPU.
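A sketch of what that looks like in practice, assuming a Mesa PRIME offloading setup (`glxinfo` typically comes from a mesa-utils package):

```shell
# List the GPU providers X knows about (integrated + discrete):
xrandr --listproviders

# Verify which GPU renders with and without PRIME offloading:
glxinfo | grep "OpenGL renderer"
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# Run any GPU-hungry program on the discrete GPU:
DRI_PRIME=1 some-game
```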


I develop for Windows, but I use Linux.

I find Windows to be a hindrance (I've used it from v3.1 to 7, so it's nothing to do with unfamiliarity).

Microsoft may be trying to become developer friendly - but their OS certainly is not, and increasingly so.

Some developers enjoy a graphical IDE worlds, keyboard shortcuts + mouse, not everyone feels likes developer == "UNIX user/CLI".

> Some developers enjoy a graphical IDE worlds, keyboard shortcuts + mouse

Many devs using Linux do too. Every single one out of many many dozens of Desktop Environment and/or Window Manager combinations lets you enjoy all these. Just a matter of familiarity. VS is certainly a neat IDE and always was since I first played with it ~17-18 years ago, but it hasn't been the only player in a long time now. Unless you need to use every single feature and tool from its up-to-120GB installation size (with everything selected) -- I've never met anyone who does.

Most Linux users are against the nice UI/UX provided by GNOME, KDE, Anjuta, KDevelop, GNOME Builder, ... as seen by the complaints and forks.

There is this culture of tiling window managers, vim, Emacs, everything done via CLI and UI/UX is a subject of interest only for dumb users.

Which surfaces as the biggest one among its users.

So anyone that cares about programming pleasant UI/UX workflows, hang with developers that share the same love for it, only has macOS and Windows dev cultures as harbor.

Many many Linux users are all for nice UIs, and pretty animations. Which is why GNOME, KDE, XFCE, MATE, Cinnamon, Budgie, Pantheon exist in the first place.

The sheer amount, variety and quality of themes available for a typical Linux distro puts your point to rest.

What puts my point to rest is that the only successful Linux based consumer devices, are a web browser based OS and a Java based one, with nothing Linux specific exposed to its users or app developers.

IDE vs CLI has nothing to do with it.

Windows isn't the only OS that has a graphical IDE - likewise Bash is available on Windows.

(all of my tools are available on both systems)

> IDE vs CLI has nothing to do with it.

Sure it has, because it springs from the developer culture on each OS, and the main reason why good UI/UX research is so hard on GNU/Linux systems.

It is always an uphill battle, as GNOME and KDE devs know about.

Also note that I pair macOS with Windows on my list.

Same could be said against Windows. Ever used Windows 8? "Everyone" hated it.

Windows 10 has gone for the hybrid UI; both "modern" and traditional. At the same time.

I'm not sure why you say it's hard for UI/UX research on Linux, when there are so many different desktop environments and toolkits - all(?) of which are theme-able. On Windows, you're stuck with one. Well, both.

Which is exactly the point of lack of good polished UI/UX on GNU/Linux.

Hence ChromeOS and Android in what concerns consumer devices, where Linux just happens to be the chosen kernel but nothing prevents Google changing it for something else.

I develop for both, but I use Windows.

Until there're Adobe Photoshop & Illustrator for Linux, no professional UI development is possible on that OS.

I use Linux, despite the absence of these essential tools. In my ideal world software vendors would start supporting Linux, and this is why I created SoftwareOnLinux.com - to create pressure groups and give vendors insights in the demand for their software on Linux. Would be cool if you'd show demand for Photoshop through this page: https://www.softwareonlinux.com/programs/8-adobe-photoshop

I cannot sign up, I'm getting this error on submitting the sign up form: ": The response parameter is missing."

Firefox 57, CentOS 7.3 desktop

> no professional UI development is possible

In your opinion.

I'm a professional. My skills relate to the domain/subject, not a specific software package/language/library. I'm able to transfer my skills and learn+use alternatives - and co-operate with those who aren't.


> Until there're Adobe Photoshop & Illustrator for Linux

There are plenty of great tools that really doesn't require these overly priced tools.

I’m not a designer, I’m a programmer; I mostly use these to export to something else and do some minor adjustments. But as input, I’m always getting .psd (raster) or .ai (vector) files, created by professionals.

I’ve been developing software for 17 years, and this has always been the case, regardless of the industry (desktop software, embedded, videogames) and regardless of the target platform (PC, mobile, embedded Windows or Linux, game consoles).

I know there’re other good tools, but I’m not in the position to force everyone else to stop using Adobe.

Yet you're fine with others forcing you.

I don't use anything Adobe, yet I don't remember having a problem opening such files (regardless of OS).

If so, I'm not afraid to respond with "Hey, can you export/save as in <some other format>?". It's not like I, a developer, am going to be making changes to the file and send it back - let alone pay for a product with which to open it.

I wouldn't give them a .c file.

The issue has nothing to do with file formats, but with workflow. Use different tools with the same file formats. I would personally use every CLI program I can get my hands on and just convert them on my machine.

Personally I just make scripts and view the output file to check for quality. I use ImageMagick mostly but Gimp and Inkscape also have command line interface.

Gimp - https://www.gimp.org/man/gimp.html

Inkscape - https://inkscape.org/sk/doc/inkscape-man.html

Use Inkscape for .ai, or GIMP or Krita for .psd.

To export just use a cli program and batch that process.

ImageMagick to convert psd to png: `convert 'file.psd[0]' file.png` (the `[0]` picks the flattened composite; quoting keeps the shell from globbing it)

ImageMagick to convert ai to png: `convert test.ai -channel rgba -alpha on test.png`
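A sketch of batching that with a shell loop (assumes ImageMagick is installed):

```shell
# Convert every .psd in the directory; the [0] suffix picks the
# flattened composite layer. Quoted so the shell doesn't glob it.
for f in *.psd; do
  convert "${f}[0]" "${f%.psd}.png"
done

# Same idea for .ai files, preserving transparency:
for f in *.ai; do
  convert "$f" -channel rgba -alpha on "${f%.ai}.png"
done
```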

Well, I'm glad that I refuse to let a single company determine my life.

Solus is a very solid distro as well, well worth trying out.

At my company, most of the developers use Ubuntu end-user machines, all from Dell. Not only does this avoid us having to pay for Windows licenses we never use, but we know the hardware is actually going to work fine with it. There are some quirks, of course - Ubuntu doesn't much like the 4k screens in our 13" XPSen so there's a lot of tweaking involved, but we're getting some very good hardware at a good price, and stuff just works. It's the right choice to target the high-end of the market, since the users are likely highly Linux-knowledgeable and so far we've never had to contact Dell for any software issues, which is going to improve their profits while our stuff just keeps working.

The only issue is the long lead times between actually ordering a batch of laptops and them being dispatched. It can be a fortnight or more. Currently I (as the sysadmin) keep a cupboard full of them ready for use, but I never look forward to ordering more.

> Ubuntu doesn't much like the 4k screens in our 13" XPSen so there's a lot of tweaking involved

> stuff just works

I can't reconcile these two statements.

This harkens back to the classic Unix statement, "Unix is user friendly, it is just picky about its friends."

I must agree that the support of high DPI displays is not optimal just yet. However, it is not that bad. At a corporate level one person can do the tweaking and create a great user experience for an entire organization.

Given the amount of tweaking and "management" goo organizations foist upon the average OSX installation blasting out a few lines of DPI tweak commands is almost nothing.
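For reference, the sort of DPI tweak commands meant here, on a GNOME/Unity-based desktop; a sketch only, since the key names are assumptions that vary between desktop environments and releases:

```shell
# Double the UI scale for HiDPI panels (integer factor on GNOME):
gsettings set org.gnome.desktop.interface scaling-factor 2

# Fine-tune text size independently (fractional factor):
gsettings set org.gnome.desktop.interface text-scaling-factor 1.25
```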

So it just works in the sense that you can do sane configuration activity and it behaves as you might expect.

Also, Unity out of the box is still pretty good on a scale like this: very bad, bad, tolerable, pretty good, good, great. It does "just work" for the most part :)

> "Unix is user friendly, it is just picky about its friends."

Specifically, all of its friends are masochists.

Yes, poorly phrased :)

There's no actual issues with the hi-DPI aspect, it does work, but it's obviously very difficult to read without tweaking. All the other applications we use just work out of the box.

As noted elsewhere, hi-DPI problems aren't exclusive to Ubuntu.

Stuff just works, as long as you pick the right stuff.

I agree, and I've found it very frustrating to work with. To the extent that the new machines we have ordered were not of the hi-DPI variety.

FWIW, Windows has problems with 4K screens, too. Most apps work fine, some display at the size of a postage stamp.

Not Windows' fault, though. It's up to developers to update their apps to handle DPI scaling.

True, and I'm pretty sure the reason's the same for Linux.

Doesn't affect my opinion on the usability of the platform, though. For my part, I got my work to do, so I ultimately ended up back on a Mac (yet again, ugh) because, for all Apple's recent problems, none of them bother me as much as things like, "I just flat-out can't use this app that I need without a microscope."

Regarding usability, been a little while since I've used macOS now, so sincere question: Is it still true that you need third-party software to do keyboard-based window management? Even just something as simple as Windows' Win+arrow keys.

Yes, still true. Lots of options though.

Divvy, Spectacle, ShiftIt, Moom, Amethyst, SizeUp, Cinch, Slate, Flexiglass, probably others.

If you want something very configurable and don't mind writing some Lua, Hammerspoon is a fun utility.

I'll throw Mjolnir in there too. As somebody who came from a pretty customized Awesome installation on Arch to Mac OSX, Mjolnir scratches that same itch. It allows you to completely script your hotkeys with lua, do grid based layouts, etc.


Spectacle works fine indeed; out-of-the-box window management for halves and quarters of the screen.

Need more fancy ? http://www.mactrast.com/2017/04/review-mosaic-simple-window-... --> cannot vouch as not installed yet

Like Spectacle[0]? I'm not on the latest macOS but I believe you still need an app like this for macOS.

[0]: https://www.spectacleapp.com/

I think it is. I haven't dug much into Sierra, but there's no obvious window management options. I remember using BetterTouchTool on Snow Leopard to Mavericks which did the job very well, but I think you're correct in that there's still no native support.

hell, you still need a 3rd party tool to do something as essential as middle click on the touchpad... but you install a few tools and tweak a few things and then it works wonderfully...

> Doesn't affect my opinion on the usability of the platform, though. For my part, I got my work to do, so I ultimately ended up back on a Mac (yet again, ugh) because, for all Apple's recent problems, none of them bother me as much as things like, "I just flat-out can't use this app that I need without a microscope."

Well, at that point, you're no longer using 4K (even if you use a 4K external monitor, OS X doesn't render at that resolution by default). Incidentally, if you do try to render OS X in 4K, it looks horrible, for the exact same reasons.

In my experience, the 4K experience is actually strongest on Linux, followed by Windows. OS X comes in last.

Your argument is insidious.

You chose to use a Mac to avoid HiDPI problems; I chose to use a normal-DPI laptop.

Windows is getting better at high DPI even for non-DPI aware applications.

It's weird though. Windows seems to do different things depending on the scaling factor and how you set the factor. 125% might look terrible but 150% is tack sharp. Setting 125% as a custom scale factor does something different than choosing 125% as a standard scale factor. Usually, you have to log out and log back in after changing the scale, but not always.

When you are working over remote desktop, it tries very hard to stop you from changing the scale factor. I'm not even sure if you can anymore.

I attribute that to a simple limitation of physics.

Scaling a pixel to one pixel and a bit of another pixel is not physically possible; there is a hard physical unit[1]. There are some zoom ratios that will scale cleanly to the screen and some that won't. I think it's mostly noticed on text; the eye is incredibly good at noticing minimal variations.

[1] Actually, font rendering on Windows is already done at a subpixel level. A pixel square is made of 3 rectangular colored subpixels.
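You can see why some ratios are friendlier than others with a bit of arithmetic (a rough illustration of my own; real renderers are smarter than this):

```python
# At 150%, every 2 logical pixels map to exactly 3 physical pixels, so
# feature edges often land on pixel boundaries. At 125%, 4 pixels map
# to 5, so most edges fall on a fractional boundary and have to be
# smeared across two physical pixels, which reads as blur.
def fraction_on_boundary(scale, n=8):
    """Fraction of the first n logical pixel edges that land exactly
    on a physical pixel boundary at the given scale factor."""
    edges = [i * scale for i in range(1, n + 1)]
    return sum(1 for e in edges if e == int(e)) / n

print(fraction_on_boundary(1.50))  # 0.5  -> every other edge is clean
print(fraction_on_boundary(1.25))  # 0.25 -> only every fourth edge
print(fraction_on_boundary(2.00))  # 1.0  -> integer scaling, always clean
```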

Yes, I've definitely seen blurry fonts on 125% in some apps. Creators Update and Fall Creators Update both have a lot of scaling fixes and those might help for some of the apps. Have you updated?

Yeah, I'm entirely up to date. I think we are in agreement - Windows 10 does a pretty good job, it just isn't clear to me exactly what it is doing and how the different settings affect the display.

Well, that's the same for Linux really, both GTK3 and Qt5 support 4K screens.

Absolutely it is. Such a simple fix too, at least in terms of avoiding things becoming too small. In terms of clarity and scaling up pixel-based GUI elements like icons you need to do a bit more work (new icons, double the res).

Swing apps (Tibco BW and DBVisualizer, I think) all suffered on my latest XPS. There is a workaround though (adding some XML Windows guff to do with scaling next to the offending binary and/or java/javaw.exe).
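The workaround, roughly, from memory (so double-check the registry key): set HKLM\Software\Microsoft\Windows\CurrentVersion\SideBySide\PreferExternalManifest (DWORD) to 1, then drop a javaw.exe.manifest next to javaw.exe declaring the app not DPI-aware, so Windows bitmap-scales it instead of rendering it tiny:

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <asmv3:application xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
    <asmv3:windowsSettings
        xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">
      <!-- false = let Windows bitmap-scale the app on HiDPI displays -->
      <dpiAware>false</dpiAware>
    </asmv3:windowsSettings>
  </asmv3:application>
</assembly>
```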

> Ubuntu doesn't much like the 4k screens

This is not really Ubuntu's fault, but mainly a problem with GTK2. Everything based on GTK3 scales quite well nowadays, the same is true for QT. Unfortunately, the Gimp and Inkscape are two notable applications which have not yet been ported to GTK3.

Other than that, it's only the console which is not usable out-of-the-box.

> the Gimp ... [has] not yet been ported to GTK3.

Which is ironic, considering GTK is the GIMP ToolKit.

It's funny how a primary criticism of consumer linux is "you'll have trouble running it on any laptop you pick at random." Isn't this an unfair criticism? If you apply the same logic to MacOS (only works on one laptop!) or Windows (only works on laptops specifically designed for it), Linux begins to look extremely compatible in comparison.

Linux has demonstrated itself to be compatible with nearly any hardware combination imaginable, to various degrees of usability. It runs on everything from tiny embedded microcontrollers to supercomputers. MacOS and Windows cannot say the same.

It is perfectly valid to criticize the frequency of driver incompatibility. However, this criticism should not be leveled at Linux. It should be directed at the hardware manufacturers who do not release Linux drivers for their products, or the software vendors like Microsoft and Apple, who forego open standards in favor of proprietary protocols.

The fact that hardware vendors can successfully produce and market a "Linux first" laptop shows that those vendors are responsible for any incompatibility in their other products. They've proved they can make compatible hardware, so why don't they do it by default?

> or Windows (only works on laptops specifically designed for it)

Windows also works well on laptops specifically designed for MacOS. It helps that all the commodity parts that Apple uses were designed for other Windows laptops. Plus, Apple has put some effort into Boot Camp to ensure that Windows can be loaded, and Microsoft has probably spent some effort making sure that Windows runs well on Apple hardware, but that's one example of a machine not designed for Windows which still runs it.

Unfortunately, that's basically the only example of a laptop which is not designed for Windows that I can think of. Perhaps the Novena laptop [1] is an example of a laptop not designed for Windows? Are there others?

[1]: https://spectrum.ieee.org/consumer-electronics/portable-devi...

Apart from their absolute garbage EFI (not UEFI) implementation, there are no problems with running Linux on anything that runs OS X.

As far as EFI goes, Apple does not allow you to boot from USB. Apple allows you to boot the OS X installer via FireWire, but nothing else. This means your only option to install Linux, Windows, *BSD, etc. is from a CD/DVD or from a new partition on the hdd/ssd.

Thankfully, you can shrink the root partition in OS X while running OS X, so there is a way to install Linux without optical media, but it is complicated.

ChromeBook would be a good example.

That it would, thanks for pointing it out!

I wanted to throw my money at Dell last year. I wanted the top of the line XPS laptop with Linux. But I needed it quick. At that point, at least, Dell were building them to order in China. Two weeks plus shipping for the UK market.

I'm not sure if this is the same for their other models but I bet there's a big pile of Windows machines sitting in an EU warehouse.

To add insult to injury, I asked if they could improve the price in a web chat and the guy botched it so hard he added an extra £93 onto the price. Outsourced, off-shored support at its very worst.

In the end I dropped a fraction of the money on a next-day-delivered Thinkpad 13, tore Windows off it and raised another flag for glorious penguin.

No, same with Win10 machines.

Bought an XPS13 maxed out from Dell. The website back then deceptively said 2-day shipping, but I didn't realize it meant 2 days to deliver it to the carrier and then 3 weeks from China to Germany. The website made it look like it was in Europe when I ordered. The BIOS gives smoke-detector-loudness (!) alarms at night when some hardware part fails and the laptop reboots.

Besides that, the hardware is crappy: it gets too hot and makes loud pitched buzzing noises under load. The keyboard has problems too. I can't run Linux because I stupidly ordered NVMe.

Would never again buy Dell, I'm back with Apple for my next laptop.

FWIW, my Lenovo T460s also has NVMe and it works just fine via /dev/nvme*. Eg. my boot partition is /dev/nvme0n1p1

Yeah I'm using nvme on a few machines. No issue.

How is NVMe preventing you from running Linux? Linux has had NVMe support since 2012 (3.3). I use NVMe on a Dell XPS 13 9350 Developer Edition as my daily driver with no issue.
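Incidentally, the /dev/nvme* names look cryptic at first but follow a simple controller/namespace/partition scheme; a little parser of my own to illustrate (just a sketch, not anything from the kernel):

```python
import re

# nvme<controller>n<namespace>p<partition>, e.g. nvme0n1p1 is
# controller 0, namespace 1, partition 1 (the partition part is optional;
# nvme0n1 names the whole namespace).
NVME_RE = re.compile(r"^nvme(\d+)n(\d+)(?:p(\d+))?$")

def parse_nvme(name):
    m = NVME_RE.match(name)
    if not m:
        raise ValueError(f"not an NVMe device name: {name}")
    ctrl, ns, part = m.groups()
    return {"controller": int(ctrl),
            "namespace": int(ns),
            "partition": int(part) if part else None}

print(parse_nvme("nvme0n1p1"))  # {'controller': 0, 'namespace': 1, 'partition': 1}
print(parse_nvme("nvme0n1"))    # whole namespace, partition is None
```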

Well Dell clearly doesn't have their shipping logistics down. I'm sure if you bought a Dell via Amazon Prime, it would arrive next day.

It's a pity they're not stocked more by third party retailers, that would have sorted your issue out. I had a similar issue with the warranty, because I was a bit uncertain about the coil whine issue, I preferred to buy from a retailer rather than directly from Dell (who don't have a clear no-questions-asked return window), but the Linux laptops aren't available from other major shops in the UK. There must be a lot of people who have bought the Windows laptops and installed Linux themselves. The oddity is those purchases are based on the existence of the Sputnik programme, but Dell presumably won't be able to account for them on that basis.

Sorry, I'm gonna have to flame Dell out here.

I own a Dell XPS 13 9365, a 2-in-1 laptop. Dell have decided, arbitrarily, not to support Linux on this machine, but you can still actually make it run by disabling Secure Boot and setting the SATA mode to AHCI. Even then, Ubuntu doesn't seem to detect when you've rotated the screen for tablet mode or stuff like that -- but that's probably Ubuntu's driver support. The machine also can't wake up from suspend mode, which I suspect is a driver thing.

Problem is, if you then apply a typical, recommended firmware upgrade from your Ubuntu Software Manager, it bricks the laptop. The problem being, "applying" the firmware upgrade in Ubuntu just primes it to run when you restart the computer. When you restart the computer, the BIOS detects a present firmware upgrade and applies it, without apparently double-checking that, you know, this isn't gonna brick the computer. I now have to wait for a technician to arrive today and replace my motherboard, then figure out how to change those BIOS settings again so that I can boot back into Linux.

Maybe Ubuntu could have stopped this by detecting compatibility issues between the firmware package and the hardware. Dell's BIOS definitely should have stopped this by detecting the compatibility issue between its own settings and the firmware package. Alternately, if the issue is that, for instance, the BIOS won't check these packages because I've disabled Secure Boot, the BIOS needs to bloody well differentiate between Secure Boot for operating systems (which we want off, because we own our laptops, thank you very much) and for BIOS/firmware packages (which we want on, because why brick a good machine?).

Overall, great machine, but bad Dell, no cookie.

I really wanted this same laptop, but I got the 9360 in the end because of the lack of Ubuntu support on the 9365.

> Dell have decided, arbitrarily, not to support Linux on this machine...Ubuntu doesn't seem to detect when you've rotated the screen for tablet mode or stuff like that

I suspect that this is exactly the reason why Dell decided not to ship with Ubuntu on this particular machine. I'm not sure that GNOME has support for tablet mode yet; Unity 7 (16.04 LTS) certainly doesn't.

> Maybe Ubuntu could have stopped this by detecting compatibility issues between the firmware package and the hardware.

Firmware updates come directly from https://fwupd.org/ I believe. Ubuntu (like other distributions) doesn't have anything to do with distribution and release management of the firmware blobs themselves; that's done entirely by Dell, I believe.

> Dell's BIOS definitely should have stopped this by detecting the compatibility issue between its own settings and the firmware package.

What exactly do you mean by "brick"? Dell explicitly don't support Ubuntu on this particular machine. If you had been running Windows, would you still be stuck? If not, I don't think you can reasonably blame Dell here. They don't QA with Ubuntu (or any Linux) on this hardware, and you knew this before you purchased.

>Ubuntu (like other distributions) doesn't have anything to do with distribution and release management of the the firmware blobs themselves; that's done entirely by Dell I believe.

Indeed, and yet here y'all are, saying that if I didn't want a brick, I shouldn't have run Linux. Ubuntu didn't do anything to the hardware, it just queued up a Dell-distributed, Dell-released firmware update to run upon BIOS POST.

Since this update was supplied by the manufacturer, I expect that at the very least, it can check for BIOS settings which render the update incompatible, and should most probably just, you know, install itself cleanly. It should never, ever brick the machine, because why in the hell is the manufacturer sending me something that bricks their own hardware?

>What exactly do you mean by "brick"?

By brick I mean brick. It fails POST and the several input combinations for resetting to a clean BIOS don't work either. Dell is having to send a technician to me to replace the motherboard, after which I can once again figure out the BIOS settings to run Ubuntu cleanly.

This doesn't mean that Ubuntu crashes after GRUB loads it. It means the BIOS no longer loads, period, let alone GRUB and Ubuntu.

>Dell explicitly don't support Ubuntu on this particular machine. If you had been running Windows, would you still be stuck? If not, I don't think you can reasonably blame Dell here. They don't QA with Ubuntu (or any Linux) on this hardware, and you knew this before you purchased.

Bull. They QA their own BIOS and firmware. The whole point of firmware-BIOS-OS separation is that the operating system never even speaks to the BIOS directly. BIOS runs the bootloader, bootloader loads the OS, OS proceeds according to a standard for how PCs are run. What I do to customize the machine is my responsibility, but proprietary updates from the manufacturer are theirs. If I've altered the BIOS settings to run a different operating system, the update needs to detect the altered settings and, if necessary, refuse to run. Then I can get on with my usage of the machine I bought.

It's not my job to customize their firmware, nor should their firmware updates have compatibility issues that cripple the BIOS, ever, period.

> Bull. They QA their own BIOS and firmware. The whole point of firmware-BIOS-OS separation is that the operating system never even speaks to the BIOS directly. BIOS runs the bootloader, bootloader loads the OS, OS proceeds according to a standard for how PCs are run.

The complexity here is that the firmware updates shipped for Linux may well take a different QA path to the firmware updates shipped for Windows, as the distribution channels are different. Further, manufacturers usually try to unify (to some level) firmware updates for different systems into fewer actual binary blobs to reduce release engineering workload.

What keeps a complex system such as this working smoothly is QA. Dell cannot reasonably be expected to spend effort on QAing combinations they clearly do not support.

While the fact that your system was bricked is likely a bug that should not have happened, nevertheless I think it's unreasonable to blame Dell for this as viciously as you are doing because they do not support or QA that combination and cannot be expected to do so. That they're fixing it is the most I think you can reasonably expect.

I understand that you've accidentally ended up having a poor experience here. But you can't reasonably expect that to reflect on the experience others might get following a path Dell actually supports and can reasonably be expected to actually QA.

>The complexity here is that the firmware updates shipped for Linux may well take a different QA path to the firmware updates shipped for Windows, as the distribution channels are different.

Assuming that they're all still Dell-made firmware blobs, I don't see why they'd be different at all. Anything that Ubuntu can "do to them" ought to ruin the signature. Only an authentic Dell firmware blob should actually make it through the BIOS' checks to installation.

It's like saying, "look, you opened our cryptographically signed file under our system, sure, but you downloaded it through an unsupported channel."

> The complexity here is that the firmware updates shipped for Linux may well take a different QA path to the firmware updates shipped for Windows, as the distribution channels are different.

No, the update content on different OSes is identical. The firmware is being updated by UEFI capsule; the only differing part is what puts the capsule content into its space. There's actually a third way to do an update: straight from UEFI.

I had a Dell XPS 13 i7 with Ubuntu pre-installed that I sent back within a day because of the lack of support for the QHD display and horrible electrical "coil whine".

> The machine also can't wake up from suspend mode, which I suspect is a driver thing.

I'm not the only one. I haven't seen anything on the internet about this model. This is driving me insane; it feels like I'm always wasting power.

Did you check for Linux support before you purchased it?

If not, you can't blame Dell for your choice.

(I'm the type of person to buy the car before choosing the tyres)

It's a computer. No firmware update supplied by the manufacturer should ever brick it, no matter what operating system I put on it, which I can put on because it's my computer, not Dell's. That's the whole point of having a separate BIOS and operating system.

There's a reason why companies say they support specific OSes. You'll have a point if they said they supported linux. But they don't.

Rubbish. Driver problems are mine. Firmware and BIOS problems are theirs. Legally, they didn't sell me a "computer experience" to be had in a walled garden. They sold me a computer. What they provide, such as signed firmware and BIOS updates, needs to either work correctly, or safely refuse to run on an unsupported machine. What it should not do is brick the machine by assuming it only runs inside a walled garden and breaking things -- because, again, they sold me a computer, not a walled garden.

Otherwise the precedent might as well be set that, for instance, firmware updates are not required to work in the case that I've installed LibreOffice instead of Microsoft Word. After all, Dell made no commitment to supporting LibreOffice, so why should I expect the firmware update to work?

Well, because it's an orthogonal concern, and because they sold me hardware, dammit, not software experiences.

meanwhile in Thinkpad land...

honestly I never understand why people go for Macbooks or XPS. X/T model user for 7 generations. I did look at the XPS, but there are so many disadvantages with those modern one-body-all-closed-no-mods laptops.

Before you downvote me, I still applaud Dell's decision to get Ubuntu laptops out there. But I don't really see what the big game changer is whether a laptop comes preinstalled with Ubuntu or if I do it myself.

> I don't really see what the big game changer is whether a laptop comes preinstalled with Ubuntu or if I do it myself.

It means a certain level of support. They're actively trying to make this thing run Linux well, instead of leaving it up to "the community". And you don't have to go reading forums or blog posts before buying just to figure out if it will have some stupid UEFI bug or whatever that makes installing Linux on it a real hassle.

Although I prefer Thinkpads, I very much understand why people want Macbook-like computers – they look quite nice! You shouldn't have to fit into some black-and-blue mold just to run Linux.

When I bought my XPS13, the trackpad, sound and keyboard weren't working reliably. The thing was unusable. I contacted support and they sent me a link explaining how to reflash my BIOS and install a newer version of Ubuntu.

It seems you get "a certain level of support", but I fail to see how this is better than the usual "buy Thinkpad, install Linux, use Google if you have problems" cycle.

Lenovo did the same in the early days at least; they were very aware of having to produce highly compatible laptops. Not so much for the sake of open source, but for business usage, which sometimes required non-Windows OSes. But sure, I applaud Dell. The world should be rid of Windows.

No longer, apparently. A colleague of mine has had some headaches with Ubuntu on his Legion Y520, including no wifi connectivity at work. My T430 has no problems though.

AFAIK, the Y-series line is not a ThinkPad line. What the parent said applied to the ThinkPad line, which is treated differently by Lenovo to their consumer lines.

As a practical matter, it would be nicer if they could just load Windows and ship it with a well-working Ubuntu virtual machine image where everything works (e.g., webcam, sound, peripherals, etc.).

> honestly I never understand why people go for Macbooks or XPS

Touchpad. No Windows laptop I know even comes close to both the size and functionality of the touchpad of a Macbook — although I have to admit I won't get a recent Macbook, because the keyboard is crap, the touchpad is too big in my opinion, and no USB-A = no Macbook for me.

> the touchpad is too big for my opinion

You won't notice it though, and you'll be glad of the size when you need it. I LOVE the bigger touchpad on the newer Macbooks (but it's about the only thing about it I love. The keyboard is absolute dirt, as you said; no USB-A is ridiculous given that all of my devices are USB-A; the touch bar is useless and virtual function keys on the touch bar are a very poor substitute for real function keys... but the touchpad size really isn't an issue, you can use as much or as little as you like and the palm rejection works perfectly)

Well, personally I oppose touchpads. I've tried them all (even the new Macbook ones) and none of them work well. I love my Thinkpad trackpoint; it works much better. The touchpad is the first thing I disable after OS installation.

I use thinkpad and macbook for work and imho thinkpad's red tracking nub trumps macbook touchpad. No need to do ballerina dance with your 3-4 fingers. 1 finger... 1 thumb... and track to infinity.

> But I don't really see what the big game changer is whether a laptop comes preinstalled with Ubuntu or if I do it myself.

The game changer (IMHO) is that it becomes a sustainable economic model for Linux users to buy modern (rather than out of date) Linux laptops with good driver support.

I'm aware that Thinkpads generally have excellent Linux support for the basics; I suspect that this is because of a combination of a reasonably supportive manufacturer (in a not hostile sense) and the number of Linux kernel developers who use Thinkpads.

However, what about buying a new Thinkpad that just came out? Now I'm not sure whether everything will work and what non-working bits will continue to not work for years.

With a pre-installed Linux, I have more confidence that everything will work from day one, and (if this continues) that this is a sustainable model. Because some of my money actually gets allocated against Linux support, rather than being misdirected to Windows support.

Support, and in this case not the "I can't install Linux on this laptop" kind, but more like "we will fix your hardware issues without questioning that you installed a non-supported OS".

Our company recently bought Dell laptops with Linux pre-installed. I had a hardware issue with sound (basically, neither Windows nor Linux recognized the sound board, showing that this was a hardware problem). I contacted Dell support, and even after saying that I was using an unsupported OS (Arch Linux, instead of the original Ubuntu) they happily RMA'd the whole thing, and nowadays the notebook is working without issues.

Having owned a Thinkpad (T420) I can tell you that the recent XPS line has really solid hardware. It has a Macbook level quality feel.

The T420 I owned on the other hand had a rather plasticy feel.

> honestly I never understand why people go for Macbooks or XPS

They have excellent performance at a size and weight that travel easily. What's not to like? (disclosure: I own an XPS)

Probably it means that everything works well since day 0. I bought a HP ZBook in 2014. I wiped out Windows and installed Ubuntu. It worked but there were some problems with the backlight control keys. There were workarounds and everything got fixed after a few months with some new version of the kernel. I expect zero problems if the pc came with Linux.

The Macbook's integration with iMessage and the rest of my iPhone is a huge plus for me. Super convenient to be able to message while working, plus battery life is great, and in emergencies I can just walk into an Apple store and get it fixed or swapped out quickly.

> I don't really see what the big game changer is whether a laptop comes preinstalled with Ubuntu or if I do it myself.

I personally cannot fathom why anyone would ever use a factory-installed operating system of any kind. It's only your computer if you installed it yourself.

There are plenty of places a hostile manufacturer could hide malware, having installed the OS yourself doesn't guarantee anything.

Done right, a manufacturer-installed OS can have everything set up right for the specific hardware of that machine, which is a much nicer experience than having to check the details yourself. (A lot of manufacturers don't do it right of course).

> But I don't really see what the big game changer is whether a laptop comes preinstalled with Ubuntu or if I do it myself.

A laptop sold with Ubuntu means that you don't have to pay for Windows (>100€). So it's cheaper. Furthermore, you have guarantees that linux support will be better than average.

Ironically enough, inside Dell there is an initiative to push everyone onto Windows, no Mac or Linux. With no exception for engineering.

How did you hear this?

I can actually confirm this, through a third party. I used to work for EMC and left in early 2017, but I still have a lot of friends working there. Going forward, employees were no longer going to be able to connect to the VPN with anything other than the official Dell Windows image. This mostly just meant that any work laptop must run Windows. Developer workstations could still run Linux, at least in the office I worked in, as the IT there was not EMC or Dell but our own internal team that originated prior to our EMC acquisition many years ago.

I could see other organizations that were not so disconnected from Dell EMC being forced further down the Windows path. Their solution is to run Linux in a VM, which is just unfortunate given the work and the product.

EMC acquisition — Isilon by any chance? :-)

Hello :-) 00

Directly from an employee.

Got the XPS 9560 from Costco, which they sell at the max configuration (32gb ram, 1tb SSD, 4k touch screen, nvidia 1050) for a few hundred dollars cheaper than Dell, and it's a beast. Got Ubuntu running on it without too much trouble, though it required a bit of tweaking. Only thing that doesn't have a driver is the fingerprint scanner. I'm extremely happy with it so far. Have had no stability issues at all. Only real issue is not the laptop so much as Linux with patchy software support for 4k screens. I run Xmonad as my desktop so I have to configure several things manually to work with HiDPI.

In arch that laptop works pretty well without tweaks.

Don't try to use the GPU and the CPU at the same time for longer than 5 minutes or you will be very disappointed (I own the same model).

What are you referring to? Does it overheat? Battery run down?

I don't know what jorgemf is referring to but I own the same laptop. It's bad at switching between the intel and nvidia GPUs, most notably in Chrome it even causes lockups. Pretty sad they haven't resolved it yet, out of the box Chrome is now locked down to Intel on Win10.

You need to use Bumblebee for this, and then launch apps with primusrun or optirun. Some programs, like Steam, don't need anything and work without issues.

The thermal zone of the VRMs gets hot pretty quickly and causes power throttling, which drops the CPU to 700MHz. It's even much worse in Windows due to some drivers.

Do you have some tips for the HiDPI? I have the same machine, and same problem.

First I had to go into the unity settings to set scaling on gnome apps to 2. Then it was a per-app thing after that.

Chromium scales on its own, but some sites are too small, like hacker news. But it does remember zoom settings and 125% works here.

For Steam I found a skin that makes it slightly larger but still too small, so I just run it in big picture mode.

I run xmonad + gnome fallback so I get easy notification icons and media button support. The panel needs to be set to 48 pixels in height, but haven't figured out how to get all the icons to be full size. I set the xmonad border width to 3px.
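Much of the per-toolkit scaling can also be set once via environment variables; here's a sketch of the kind of thing I mean in an ~/.xprofile (variable names from memory, so verify them against your toolkit versions):

```shell
# ~/.xprofile — HiDPI hints under xmonad (no full DE to set them for you)
export GDK_SCALE=2                    # GTK3: integer UI scaling
export GDK_DPI_SCALE=0.5              # GTK3: undo the resulting font double-scaling
export QT_AUTO_SCREEN_SCALE_FACTOR=1  # Qt >= 5.6: scale from screen DPI

# Tell X and Xft-based apps about the real DPI
xrandr --dpi 192
echo "Xft.dpi: 192" | xrdb -merge
```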

Gimp has a hidpi skin but it doesn't work very well.

This script will scale things up and it works in a pinch but isn't very elegant:


