I recently acquired an HP Envy 13 (2017) laptop with Windows preinstalled. I really wanted to give Windows a try, but after installing Google Chrome and Docker (an hour after first boot) it stopped working - Windows booted into a blank black screen.
In a hurry, I downloaded Ubuntu 17.10, clicked through the installer, and had a working OS in 15 minutes - with everything (suspend, the Broadcom card, USB-C video, ...) working out of the box.
If you're a developer there's no UX reason to go Windows anymore.
It is 2017 and I finally own a laptop with 32GiB of ECC(!) RAM. That is pretty damn cool. Allow me to roll out my specs, just because I love this laptop:
* Quad Core Xeon (1505M)
* Discrete GPU with acceptable gaming performance and great at cracking passwords in a pinch
* 32 GiB of ECC RAM
* 1920x1080 15.6" (144ppi) display with no mic or web cam
* An actually decent keyboard
* Touchpad that is actually nice
* So many ports. All the ports. The best ports (4 USB-3, 1 USB-C, HDMI, mini DP, Ethernet, SD Card slot, Smartcard Slot)
* For the cost of equivalent Applecare, someone will show up to my house same day and fix this beast if something breaks
* Expandable and replacement storage and memory
Tell me all those ports don't make you a little jealous if you are using a MacBook; I won't believe you ;). It does come at the cost of this laptop being a bit chunky, so it won't be for everyone. You can also get it with a HiDPI display, but I actually prefer things in this 120-144ppi range to Retina.
It is just a great computing package. End of story. I also still love my 2013 MBP. It is also a beautiful machine for its own reasons, so I am no Dell / Linux fanperson, but I have a bit of a crush on this laptop as I just got it about a month ago :)
I do like the fact that you can actually user repair it though (SSD, memory, wifi, battery) - otherwise it would have been gone a month after I got it. I wish Apple would do this on the "pro" macbooks... ugh.
This machine on 16.04 is a dream so far. I do see Dell has more production issues, especially on XPS machines. Intel WiFi is a must. I learned a lot about what Dell does best on the 13" XPS.
I had to disable the built-in graphics in the BIOS to get my 2 external monitors to work (some Nvidia Quadro mess), suspend doesn't work properly (a common Linux problem), and the headphone port does not work unless you toggle microphone mute on and off first (could be some PulseAudio weirdness, but I haven't gotten to the bottom of it so far).
Even played some games on it, although I try to stay away from that sort of thing.
I have a Latitude e7470 which I highly recommend, provided you don't need dedicated GPUs. It supports up to 32GB (non-ECC), has a wide array of ports, and is much closer to the XPS in terms of size and weight.
He has a Dell Precision 7520. If you customize the model on the Dell website, there's a backlit keyboard option.
Just make sure anyone seriously considering this takes a long, hard look at the physical dimensions and weight. It is not a sleek or light machine :)
I like how Ubuntu and Red Hat are in the officially supported list of OSes.
Why didn't you go for 64GB ECC DDR4? :P
I couldn't see a smartcard slot on it, and then I learned that the smartcard reader is contactless.
It even supports SD cards up to 2TB in size.
The GPU tops out at 4GB of RAM though.
So I just found the order page and have been having a bunch of fun speccing out the most expensive possible device.
After wading through the comedy of
- LINUX/RED HAT Operating Systems are not available with Dell Threat Protection and Endpoint Security Suite option. Please change one of these selections.
- The Operating System you've selected does not support the ENERGY STAR Energy Efficient Option.
- The LINUX Operating System is not compatible with the Processor.
The absolute winner of
- The Operating System you've selected does not support Office Productivity Software.
And the (easily fixed) legacy annoyance of
- The Ubuntu Operating System does not support the Wireless Card you've selected.
I finally reached:
Starting at Price: $8,383.58
Total Savings: $3,005.11
Dell Price: $5,378.47 <-- the actual cost
I added absolutely everything but kept the screen at 1080p (which adds capacitive touch!). So it has 64GB RAM, two 1TB SSDs, a 4GB Quadro M2200, a 3.9GHz Xeon E3, and an IPS display.
I like how vPro can be left unchecked :>
I'm confused why it's still saying Linux is incompatible with the processor though. Why? :(
Let's see if this URL breaks Arc: http://www.dell.com/en-us/work/shop/dell-laptops-and-noteboo...
I open far far too many Chrome tabs and only have 8GB RAM, so I guess I'm very heavily biased.
FWIW, I was able to select 64GB ECC DDR4 in the configurator.
Once you do, it's rock solid, but it should be OOTB and it annoys me that it isn't. It leaves those not familiar with Windows with a bad first impression, and those tech savvy enough to install Linux just don't come back.
But an accurate impression. Windows' main focus is selling Microsoft and partner products and delivering ads. No matter how much you remove after install, a forced update is likely to override your very obvious wish not to be bothered.
And yes, I've seen first-hand the updates steamrollering your preferences and setting them to nice, MS-friendly 'please target me with marketing' options. I gave up Windows in 2008, was Mac full-time until 2015 and now I run Ubuntu exclusively, at home and work. Canonical aren't perfect, but they're a bit more open about their commercial expectations than MS will ever be.
The funny thing is that Microsoft know that the normal OOTB experience is bad, and created the "Signature" effort to offer people PCs with clean Windows installations. Then they seem to have forgotten that, and started putting their own passive-aggressive "value-adds" into Windows 10.
From 2012: https://www.itworld.com/article/2718342/consumerization/here...
With Windows 10, that is unfortunately not true anymore because MS is the provider of a lot of the crapware.
I was very happy with an Asus Zenbook I bought that had the PURE suffix (+ sometimes a number) on the model name. Those laptops come with a clean Windows install plus any needed drivers.
It only saves you an hour or two, but I wish more (or really all) manufacturers supplied these models.
Very nice piece of kit, but even that came loaded with ads, and would search Bing and the app store every time you searched the start menu.
It's little things like this that make Windows so much worse. It takes ten minutes in Group Policy to turn it off, but the majority of users will never go near Group Policy and will just assume that's the way Windows works.
You're just scratching the surface on how user-hostile Windows can be. The ads, bing crap, and telemetry are simply horrific considering it's a paid product.
Windows Update is still a mess. It takes me less than a couple of minutes to pacman -Syu, while Windows updates can take HOURS.
Some random windows service can still suddenly ramp up disk usage or cpu to 100%. There are two control panels. File Explorer is still terrible. The list goes on.
It isn't a matter of tech savvy, and pretending that it is is part of the problem with the Linux Desktop community, which is the worst part about using a Linux Desktop if you ask me.
The system has a whole lot of issues, many of them somehow even worse than dealing with Windows 10's bullshit, or else I'd have switched myself by now. Even Linus calls out the Linux desktop for being pretty crap. Developing for it is a nightmare too, especially with regard to distribution.
I'm trying not to go on a rant here. Point is, there are plenty of reasons to not want to use a Linux Desktop aside from not being "tech savvy" enough.
Also, I disagree that it is getting better every year. Individual pieces may work better once they're properly configured, but on the whole the system is becoming more and more of a complicated kludge of intertwined, abstracted, obfuscated, and otherwise non-transparent mechanisms. I personally don't see much hope for it as long as that trend continues.
- It's fantastically fast; I dislike going back to Win 7
- Some of the utilities are nicer
- Random scheduled tasks fire in the background at really annoying times. I did a completely clean install of Win 10, and some tasks fail and then constantly re-run because they've never been marked as completed (e.g. the .NET 32-bit optimization service, some sort of memory optimization service). On my desktop I don't notice, but on my laptop, if I leave it unattended, the fan often starts going crazy. I've tried fixing the broken services but never really got to the bottom of why. What's most annoying is that when you try to disable or replace these services with dummy exes, Windows Update will "helpfully" "fix" them and re-enable them. There are also various scheduled tasks that automatically re-enable themselves on restart. It was also constantly trying to run one Windows update that kept failing; it never told me, but would go through the whole rigmarole of an incredibly expensive update pass trying to install this one driver.
- The new Start menu is bad; it's simply not as usable as Win 7's. You always end up searching, where Win 7 made it easy to quickly find things without typing anything
- Tiles are generally annoying
- New windows apps are annoyingly fancy and touch screen centric, for example there's no reason a calculator should be semi-transparent, it's actually really visually distracting
- Telem crap is incredibly user-hostile, invasive and almost impossible to get rid of
- Ctrl-alt-delete is partially broken and no longer works seamlessly. You have to do some crazy stuff like create a new desktop to get off a crashed program
- Explorer (File Explorer) just seems to have become a real mess since XP, and searching is agonisingly slow - especially on a folder that contains Bower or npm directories. In 2017, Windows still struggles with long file paths
This had been going on for months too.
I think that's my biggest beef with Windows 10, you're not supposed to muck around with the scheduler running tasks when idle or the updates, but it's so brittle that when it fails it does really stupid things without telling you.
Remember when we used to say the same thing about Linux? Funny how times change.
Is writing proper docs so bad that not even getting these fine people to work for you for free makes it worth the effort?
Not that I necessarily disagree with their position on the matter, but you can't escape the consequences.
There is also a reason why Apple won't ship hardware or drivers that they don't have the source for.
Really? Which version are you using? On a clean install of Win 10 Pro, I fell foul of several known issues. A failing Anniversary Update required reinstallation, and the Creators Update only made it through after initially crashing mid-update. The fast shutdown (which I wasn't aware of) caused me a problem with network connectivity on moving the machine, which was hard to troubleshoot because it was unexpected. Then, to top it all off, once I got it running properly the Start menu was still full of ads.
Maybe if I could get hold of the enterprise edition, with all the ads/telemetry disabled by default, it'd be a better experience, but my faith in the updates has truly gone and as a result we now just leave the Win10 install of the dual-boot alone.
I wouldn’t use any other version of Windows, Enterprise is the only usable version.
You shouldn’t have to do that, though, especially on Enterprise!
Edit: I cannot find a source, so take this with a grain of salt, however I do clearly remember reading it.
Couple of refs for anyone interested:
Kind of a glaring security issue.
For many years (roughly since systemd) I've hoped for a kind of "modern Linux handbook", especially when people come up with solutions to problems seemingly out of nowhere, but it turns out there's nothing. Some people have knowledge of narrow areas, but modern Linux as a whole is a frighteningly complex monster. And when you realize your fragments of knowledge make you one of the experts, that's even worse: when I ask about DisplayLink on Linux, I get redirected to the relevant Arch Wiki page, where the meat of the page was written by me years ago...
I set up Windows 10 to my taste https://github.com/chx/chx.github.io/wiki/How-I-set-up-my-Wi... and it's a better world now. Drivers and GUI programs are running in Windows, daemons and CLI tools are running in Linux, switching to a brief game session is trivial.
I have no issues with battery life, wifi, or BT or anything else. The first ~6 months of 2014 was a rough ride, because the hardware was so new there was no proper support for it in the kernel, but the team @ Dell managed to upstream all the required stuff, so depending on their custom Ubuntu release is no longer necessary. Everything should just work out of the box.
Once I'd done that, I had the perfect developer laptop (in my opinion, of course :-) )
Unless one wasn't "wrestling" with Linux in the first place, because things tend to just work these days.
> While Linux certainly works on laptops it doesn't work well. Wifi and battery life are absymal
That's blatantly untrue. Given the right laptop, Linux is on par or excels. Given the wrong laptop, an install from scratch of the Windows operating system is likely to be far worse.
> problems seemingly totally out of nowhere
So problems that arose without any changes to the system?
> there's nothing.
Well, there's full access to your own system.
> Drivers [..] are running in Windows
Never run old hardware or peripherals? Because if you do, Windows is a world of pain filled with drivers sourced from questionable download sites.
I'll assume good faith here instead of astroturfing, but you might want to tone down the unwarranted praise of X over Y when describing your satisfaction with your current setup, especially when you're apparently not really familiar with Y.
Yes, it is very nice to have one, two, or even five laptops which work with Linux.
Windows still runs on more devices, better.
I never had to sit out a specific bug that made it into the release cycle of a Windows update, but that happened to me four times in three years with Arch Linux.
I also have a strange experience with Linux: one SSD works flawlessly under Windows, but under Linux it causes micro-lags across the whole system.
While I like my Linux machine and use it daily at work and on my private laptop, it takes more effort and more knowledge than Windows.
It's been a long time since I've seen a general-purpose laptop or desktop that didn't work flawlessly with both Ubuntu and Fedora. The only one I have is the MacBook Pro in front of me, but it comes with a decent OS anyway.
> Windows still runs on more devices, better.
Can it delete open files now?
Does it matter? And how is this related to the rest of the discussion? :)
Samsung 840 series? Well-known bug in their firmware. Update your firmware.
It's got variations of:
"It works for me!"
"You chose the wrong hardware!"
"Well you can fix that because the source is available!"
and "But Windows is worse!"
But it's missing:
"Try distribution #132!"
"You don't actually want to do that!"
and my favorite: "Maybe you're just too stupid to use it!"
Dual-booting on an XPS15. Running Ubuntu and Win10. I've had fewer problems with Ubuntu than Win10.
Had a HiDPI version of the same laptop at the last gig. Both Windows and Linux had some issues which were easier to overcome.
Windows definitely gets better hardware support.
However, having my local "dev" environment be native Linux is glorious. No weird Docker issues. Python is legit. Don't have to mess with Homebrew.
Wifi, Bluetooth, HD resolutions, keyboard buttons, printers, everything just seems to work in Ubuntu, and this has been the case for quite a few years now.
$ git clone ssh://blahblahblah.space/repo.git
$ code repo
Part of the issue is that using Linux exposes you much more directly to the quality of the hardware (and the associated firmware and drivers). The hardware that "just works" on Windows might be horribly engineered, but Windows and the proprietary drivers might cover over the problems. On Linux, bad hardware is more likely to just work badly.
Isn't Arch specifically for users who don't mind things breaking and going deep into internals to fix it?
More of my thoughts on Arch:
In my 4 years using Mint / Ubuntu, an update broke my machine once, and that was when the power went out during an update.
Arch apparently works great for a lot of people, but I'd urge caution.
Arch is a rolling release that many choose to update daily, or on a fixed schedule, like once a week. That updates sometimes can break a thing or two, out of hundreds of updates, is to be expected. If that sounds scary to you, Arch is certainly not for you. That said, it has rarely happened to me, and any breakage is usually minor and easily fixed. It's not like you have to even figure out how; it's laid out for you on the Wiki, Forum or Arch's homepage. My experience with other distros has been far more troubling, except Slackware. Arch's strengths easily outweigh occasional inconveniences. Not having to deal with anything Ubuntu or Canonical is a great bonus.
With Arch, installing a package just unpacks the package files and runs a few innocuous post-install hooks, mostly to rebuild indexes (mandb, fontconfig, etc.) and compile DKMS-based kernel modules. But nothing potentially destructive like `systemctl start postfix`.
Also, it installs exactly what you want, instead of dozens of tangentially-related "suggested packages". If you `apt install rsyslog`, it pulls in an entire MTA because somewhere deep in rsyslog, there is an option that might send emails, so Debian in its unending wisdom determines that you MUST HAVE an MTA beyond this point. In Arch, `pacman -S rsyslog` installs just rsyslog and a few required shared libraries.
So does Debian, if that's what you tell it you want (either in the config file or as a CLI argument). Even without that, while it may suggest a full MTA by default, you can ask it for another solution and it'll suggest a simple mailer (like bsd-mailx).
I don't dispute that Arch's defaults guide you to a simpler system, but it's not really hard to start from Debian netinstall and add just what you need, package by package.
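To make that concrete, here is a sketch of both approaches: the one-off flag, and the persistent apt.conf.d setting. The demo writes the snippet to /tmp rather than the real /etc/apt/apt.conf.d/ so it is safe to run anywhere; the option names themselves are standard APT configuration:

```shell
# One-off: skip Recommends for a single package
#   sudo apt-get install --no-install-recommends rsyslog

# Persistent: an apt.conf.d snippet turning Recommends/Suggests off by
# default. Written to /tmp here for demonstration; the real location
# would be something like /etc/apt/apt.conf.d/99no-recommends.
conf=/tmp/99no-recommends.demo
printf 'APT::Install-Recommends "false";\nAPT::Install-Suggests "false";\n' > "$conf"
cat "$conf"
```

With that in place, `apt install rsyslog` pulls in only hard dependencies, much like `pacman -S rsyslog` does on Arch.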
Exactly. The correct solution for me is "no mailer at all".
With Dell Precision 5520, I get much better battery life on Linux kernel 4.14 than on Windows 10. No Wifi problems as well.
'Developer' is a pretty encompassing term and I wouldn't presume to know what requirements people have.
I would claim that software that is natively available only for OS X or Windows is still a pretty strong UX reason.
Or that the deployed software runs only on Windows.
These constraints kick into place if we take "UX" to include the total behaviour of the man-machine-environment system and not just installing Linux and the apt-gettable software.
There are still some improvements to be made
Windows has serious bugs with multiple monitors as well. I switch the second monitor often (usually once or twice a day), and it sometimes has problems like a blank screen on one (or both) monitors, the whole system locking up, or the HDMI sound device not showing up so I have to play sound over the laptop speakers instead of the monitor/TV speakers. It doesn't happen often, but it does happen.
I've tried Linux, Windows, and Mac, and they all have bugs in multiple-monitor setups. If everything works smoothly for you, you're just lucky that your usage doesn't hit the corner cases.
However, prior to this setup I had Win 7 on the machine with no issues either.
I can get my three monitors to work under Linux (trivially, actually), but I can't under Windows, and not for a lack of trying.
Under Windows I can decide which one gets turned off, but running all three is somehow denied. This, of course, with recent drivers straight from the manufacturer (AMD).
This is so common that I wrote an article and fix for it, containing 4 shell commands, that we refer users to whenever we get these 'works OK in Windows but not in Linux' reports.
By far the biggest source of these hardware issues relating to power (suspend/resume, power states, platform device enablement, ports - including GPU outputs) is not Linux but the PC firmware itself, and it is down to the manufacturers.
In short: manufacturers customise the ACPI (Advanced Configuration and Power Interface) DSDT (Differentiated System Description Table) via its _OSI() method to recognise only Microsoft Windows OSes and enable functionality based on the Windows version.
The version strings are of the form "Windows 2009" "Windows 2015" etc., where the later the year the more functionality is enabled. This is a total misuse of the purpose of _OSI but almost all manufacturers are doing this now.
As a result when Linux is the OS often only a minimal or incomplete set of functionality is enabled which causes all sorts of problems.
For a long time Linux has had code to report itself as various Windows versions to try to solve the worst of these but in many cases unless it claims to be the exact 'best' Windows version for each model/DSDT there will still be problems.
From silly things like some devices not working if the PC starts on battery (but fine starting on AC) - recently saw this was the cause of an external HDMI port not being enabled - through not handling power-state switching correctly and thus draining the battery rapidly, to suspend/resume/shutdown freezes and other symptoms.
The solution has frequently been to set the OSI string to match the most recent Windows version the PC's ACPI DSDT recognises, on the basis that the most recent is likely to enable the most functionality, via the kernel command-line option acpi_osi="...".
Checking the supported OSI version strings is quite easy from Linux:
sudo strings /sys/firmware/acpi/tables/DSDT | grep -i windows | sort
Microsoft Windows NT
Microsoft WindowsME: Millennium Edition
Then set the kernel options accordingly, replacing XXXX with the most recent year listed:
acpi_osi=! "acpi_osi=Windows XXXX"
On most GRUB-based systems this is done by editing /etc/default/grub and adding the options into the GRUB_CMDLINE_LINUX variable (there may be other kernel options there too) thus:
GRUB_CMDLINE_LINUX="acpi_osi=! \"acpi_osi=Windows XXXX\""
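Here is the whole edit as a runnable sketch, performed on a sample file under /tmp so it is safe to try anywhere. On a real system you would edit /etc/default/grub itself, and "Windows 2015" is just an example year taken from your own DSDT output:

```shell
# Sample /etc/default/grub line, written to /tmp for the demo
printf 'GRUB_CMDLINE_LINUX="quiet splash"\n' > /tmp/grub.demo

# Append the acpi_osi options inside the existing quoted value
sed -i 's/^GRUB_CMDLINE_LINUX="\(.*\)"/GRUB_CMDLINE_LINUX="\1 acpi_osi=! \\"acpi_osi=Windows 2015\\""/' /tmp/grub.demo
cat /tmp/grub.demo

# On the real file, finish by regenerating the boot config and rebooting:
#   sudo update-grub                              # Debian/Ubuntu
#   sudo grub2-mkconfig -o /boot/grub2/grub.cfg   # Fedora/RHEL
```

The regeneration step matters: editing /etc/default/grub alone does nothing until GRUB's actual config is rebuilt.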
As an aside, that’s awful, and kudos to you for spending so much time looking into the issue and for a clear description of it along with a fix. It reminds me of the useragent nonsense - https://webaim.org/blog/user-agent-string-history/
Fedora is a fast-moving distribution, similar to non-LTS Ubuntu.
RHEL is a long-term supported distribution (way longer than LTS Ubuntu). It is for machines that you want to set up and then forget about.
The difference is in the target audience and the intent - what their machines have to do.
A more proper designation would be that RHEL is a fork of Fedora that is supported long-term.
I thought maybe Red Hat had changed their stance on Fedora, so I read Red Hat's page, and well, it is 100% a beta testing ground for RHEL.
"The size and expertise of the Fedora community make Fedora an ideal incubator and proving ground for features that eventually get incorporated into Red Hat Enterprise Linux. To meet the quality and reliability requirements that make Red Hat Enterprise Linux the choice for mission-critical applications, Red Hat puts Red Hat Enterprise Linux through its own set of tests and quality assurance (QA) processes that are separate and distinct from those of Fedora."
I don't really see it that way. Each Fedora release is reliable, but the software does evolve quickly. The RHEL releases are taken from a point in time on that stream, pushed through an even more extensive QE process, and maintained for an extended period of time.
I used to use Fedora Core, and then up to Fedora 7. I got burned way too many times and view Fedora as only a slight step up from Arch Linux as a working distro. Both Fedora and Arch are more "hobbyist" distros, and I will not use them in a professional workflow. I use openSUSE, find their model to be the best of the distros, and haven't ever been burned by them - outside of 2001, when I was installing SUSE and wiped my partitions because I didn't know what I was doing :)
Just like Windows 10 is not beta for Windows LTSB, or Linux kernel releases are not betas for kernels that will be LTS.
If you weren't a Windows developer, was there ever?
We've been using Lenovo laptops for web development on Linux with no issues for the last 10 years, and I was using them personally for another 5+ years before that.
You can dual-boot. GParted or most distros' installers will resize your Windows partition; just don't forget to disable Fast Startup in Windows first.
You can run Linux in a VM. Granted, this means you can't really use a good tiling window manager, which is one of my main reasons for not using Windows.
You can run Windows in a VM. Take a look at Qubes to see just how integrated this can be.
Once we become a fully .NET Core shop, I'll make the move to Linux completely.
I am a developer serving Windows customers who rely on native UI/UX for their workflows.
Every single time I decide that I'm moving to Linux, I give up after living with the browsers for a day or two. Rendering and fonts are just inferior to other systems in my experience.
Well, Chrome OS runs the same stuff as your average Linux distro; however, maybe most distros don't turn on subpixel antialiasing in their fontconfig out of the box. (I don't know, because I've been sharing configs between machines of the same distro for about five years.)
With HiDPI, that doesn't matter anyway.
Another issue is fonts themselves. If you have a document that uses Arial or Cambria or another proprietary font, and you don't have it installed, you are going to get something substituted, which may or may not be metrics-compatible. Do not expect pixel-identical results then.
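As an illustration of the substitution mechanism, fontconfig lets you pin the substitute explicitly with an alias rule - here mapping Arial to Liberation Sans, its usual metric-compatible stand-in. The demo writes the file to /tmp; on a real system it would live under ~/.config/fontconfig/conf.d/ or /etc/fonts/conf.d/:

```shell
# Demo fontconfig alias: substitute Liberation Sans wherever a document
# asks for Arial. Written to /tmp here; install it under
# ~/.config/fontconfig/conf.d/ to take effect for your user.
cat > /tmp/30-arial-substitute.conf <<'EOF'
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <alias binding="same">
    <family>Arial</family>
    <accept>
      <family>Liberation Sans</family>
    </accept>
  </alias>
</fontconfig>
EOF
cat /tmp/30-arial-substitute.conf
```

After installing such a file (and assuming Liberation Sans is installed), `fc-match Arial` should report Liberation Sans.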
If I'm not mistaken, those patents are dead now. Even Fedora has the freeworld stuff now.
> Another issue is fonts themselves. If you have a document that uses Arial or Cambria or another proprietary font, and you don't have it installed, you are going to get something substituted, which may or may not be metrics-compatible. Do not expect pixel-identical results then.
Yeah, though the Chrome OS core fonts are good metric-compatible substitutes for Arial and friends (though lacking the stylistic variety of the core Windows fonts).
Not the subpixel rendering ones ("cleartype"). In Fedora, FT_CONFIG_OPTION_SUBPIXEL_RENDERING is still disabled.
Google also released metric compatible substitutes for Cambria (Caladea) and Calibri (Carlito), but Linux distributions do not install them by default.
Edit: I'm wrong. In F27, the subpixel rendering is enabled.
Skia can only use what the underlying graphics drivers offer.
Not everyone wants to replicate a PDP-11.
What GPU is this?
> only beaten by macOS
Well, you said you wanted "proper GPU acceleration", macOS GPU drivers are terrible. They're unstable, they have limp shader compilers, they are CPU-intensive, and in the case of OpenGL, they are behind by nearly six years in terms of features. Today, your sleek MacBook Pro supports considerably fewer modern GPU features than your average entry-level smartphone. You're missing stuff like multi draw indirect, texture clears, 8-bit stencils, image copying, explicit uniform location...
> hibernation that actually works
Well, I have that set up (and in my case it really has to be optional, since I have a really disk-scarce device right now). It worked correctly. I believe systemd will even monitor my battery and hibernate before it gets too low, like you'd expect (though maybe it could be brought out of hibernation enough times in a row to eventually run out of power before systemd gets to it). Maybe there should be a smoother transition between suspend and suspend-to-disk like macs with macOS have, but that isn't strictly necessary.
Any GPU capable of DirectX 12 that only gets OpenGL 4.3 drivers or similar on GNU/Linux.
Or my AMD Brazos one on a Linux netbook, DirectX 11 class, which only gets OpenGL 3.3 drivers.
Regarding OpenGL on macOS, I don't care, that is what rendering engines are for, and all relevant ones already support Metal 2.
What GPU is this? I don't know of a DX12 device which lacks OpenGL 4.5 support in a stable release of Mesa (or the blob, in the case of NVIDIA, since they are uncooperative with regard to firmware).
> Regarding OpenGL on macOS, I don't care, that is what rendering engines are for, and all relevant ones already support Metal 2.
Sure, if you're running only games, and those that were made before 2012, and after 2017, then that works fine. For an example of something that does not currently use Metal (2 or no 2): any web browser (to my knowledge, including Apple's own, except as a backend for the WebGPU prototype), professional graphics application, or visualization utility. The shader compiler is still subpar, even in Metal.
> Or my AMD Brazos one on a Linux netbook, DirectX 11 class, which only gets OpenGL 3.3 drivers.
IIRC Brazos in that range would be... Radeon HD 6330M, and you're right, that should support GL 4.4. If you're running Mesa from 2016 or later, it should have 4.1 (and most extensions, except compute shaders, up to 4.4); if you run AMD's blob you can probably get 4.4 on it today. Work on drivers for pre-Southern-Islands devices is slow these days, but anything after SI has full certified OpenGL 4.5 on Mesa along with every extension in 4.6 except for SPIR-V loading (coming soon), and the performance is better in basically all cases than it is on Windows, or with the proprietary drivers on Linux (though for now you'll need the proprietary drivers for some features like FreeSync, which is unfortunate but oh well).
I assume what you saw was the Nouveau driver, which is enough to have you up and running to be able to download the proprietary ones using GUI.
In fact, my graphics card (the cheapest to support Vista) is the only thing I haven't upgraded in the past 5+ years. And it was from eBay, used.
My graphics card supports the native UI elements of Windows.
My users don't want a "Save as" dialog with life-like water refraction and fog, rendered at higher FPS than their monitor.
Then you are in luck; even Intel GPUs under Linux support OpenGL 4.5. 4.6 is in the works; the only missing part is SPIR-V support, and I don't suppose you need that for Fluent or Material design.
I for one do not feel Windows has a good story if you're developing anything you don't use VS for.
A world where GNOME and KDE developers get bashed upon, when trying to bring UI/UX into a modern world.
This works with the open-source drivers; it's only Nvidia that is gumming up the works here. With recent drivers it should even be set up automatically, although you still need to set DRI_PRIME=1 when running something you want on the discrete GPU.
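For anyone trying this, the usual check is to compare the reported renderer with and without the variable (glxinfo comes from the mesa-utils package; the runnable line below only demonstrates the environment-variable mechanics, since glxinfo needs a display):

```shell
# On a hybrid-graphics machine these two should name different GPUs:
#   glxinfo | grep "OpenGL renderer"               # integrated GPU
#   DRI_PRIME=1 glxinfo | grep "OpenGL renderer"   # discrete GPU

# DRI_PRIME is just an ordinary per-process environment variable,
# so it can be set for a single program without affecting others:
DRI_PRIME=1 sh -c 'echo "child process sees DRI_PRIME=$DRI_PRIME"'
```

The same `DRI_PRIME=1 some-game` pattern works for Steam launch options and .desktop files.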
I find Windows to be a hindrance (I've used it from v3.1 to 7, so it's nothing to do with unfamiliarity).
Microsoft may be trying to become developer friendly - but their OS certainly is not, and increasingly so.
Many devs using Linux do too. Any of the many dozens of desktop environment and/or window manager combinations lets you enjoy all of these; it's just a matter of familiarity. VS is certainly a neat IDE and always was, since I first played with it ~17-18 years ago, but it hasn't been the only player in a long time now. Unless you need every single feature and tool from its up-to-120GB installation (with everything selected) - I've never met anyone who does...
There is this culture of tiling window managers, vim, Emacs, everything done via CLI and UI/UX is a subject of interest only for dumb users.
Which surfaces as the biggest one among its users.
So anyone who cares about programming pleasant UI/UX workflows, and wants to hang out with developers who share the same love for it, has only the macOS and Windows dev cultures as harbor.
The sheer amount, variety and quality of themes available for a typical Linux distro puts your point to rest.
Windows isn't the only OS that has a graphical IDE - likewise Bash is available on Windows.
(all of my tools are available on both systems)
Sure it has, because it springs from the developer culture on each OS, and that's the main reason why good UI/UX research is so hard on GNU/Linux systems.
It is always an uphill battle, as GNOME and KDE devs know about.
Also note that I pair macOS with Windows on my list.
Windows 10 has gone for the hybrid UI; both "modern" and traditional. At the same time.
I'm not sure why you say it's hard for UI/UX research on Linux, when there are so many different desktop environments and toolkits - all(?) of which are theme-able. On Windows, you're stuck with one. Well, both.
Hence ChromeOS and Android in what concerns consumer devices, where Linux just happens to be the chosen kernel but nothing prevents Google changing it for something else.
Until there're Adobe Photoshop & Illustrator for Linux, no professional UI development is possible on that OS.
Firefox 57, CentOS 7.3 desktop
In your opinion.
I'm a professional. My skills relate to the domain/subject, not a specific software package/language/library. I'm able to transfer my skills and learn+use alternatives - and co-operate with those who aren't.
There are plenty of great tools; the job really doesn't require these overpriced ones.
I've been developing software for 17 years, and this has always been the case, regardless of the industry (desktop software, embedded, video games) and regardless of the target platform (PC, mobile, embedded Windows or Linux, game consoles).
I know there are other good tools, but I'm not in the position to force everyone else to stop using Adobe.
I don't use anything Adobe, yet I don't remember having a problem opening such files (regardless of OS).
If so, I'm not afraid to respond with "Hey, can you export/save as in <some other format>?". It's not like I, a developer, am going to be making changes to the file and send it back - let alone pay for a product with which to open it.
I wouldn't give them a .c file.
Personally, I just write scripts and view the output file to check quality. I use ImageMagick mostly, but Gimp and Inkscape also have command-line interfaces.
Gimp - https://www.gimp.org/man/gimp.html
Inkscape - https://inkscape.org/sk/doc/inkscape-man.html
To export, just use a CLI program and batch the process.
ImageMagick to convert PSD to PNG: `convert file.psd file.png`
ImageMagick to convert AI to PNG: `convert test.ai -channel rgba -alpha on test.png`
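If you have a whole folder of them, a small loop covers the batch part (the `[0]` index is worth knowing: it selects the flattened composite, so a layered PSD doesn't split into one numbered PNG per layer):

```shell
# Convert every .psd in the current directory to .png.
# "[0]" asks ImageMagick for the flattened composite image
# instead of emitting each layer as a separate file.
for f in *.psd; do
  convert "${f}[0]" "${f%.psd}.png"
done
```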
The only issue is the long lead times between actually ordering a batch of laptops and them being dispatched. It can be a fortnight or more. Currently I (as the sysadmin) keep a cupboard full of them ready for use, but I never look forward to ordering more.
> stuff just works
I can't reconcile these two statements.
I must agree that the support of high DPI displays is not optimal just yet. However, it is not that bad. At a corporate level one person can do the tweaking and create a great user experience for an entire organization.
Given the amount of tweaking and "management" goo organizations foist upon the average OSX installation blasting out a few lines of DPI tweak commands is almost nothing.
So it just works in the sense that you can do sane configuration activity and it behaves as you might expect.
Also, Unity out of the box is still pretty good on a scale like this: very bad, bad, tolerable, pretty good, good, great. It does "just work" for the most part :)
Specifically, all of its friends are masochists.
There's no actual issues with the hi-DPI aspect, it does work, but it's obviously very difficult to read without tweaking. All the other applications we use just work out of the box.
As noted elsewhere, hi-DPI problems aren't exclusive to Ubuntu.
Doesn't affect my opinion on the usability of the platform, though. For my part, I got my work to do, so I ultimately ended up back on a Mac (yet again, ugh) because, for all Apple's recent problems, none of them bother me as much as things like, "I just flat-out can't use this app that I need without a microscope."
Divvy, Spectacle, ShiftIt, Moom, Amethyst, SizeUp, Cinch, Slate, Flexiglass, probably others.
If you want something very configurable and don't mind writing some Lua, Hammerspoon is a fun utility.
Need something fancier? http://www.mactrast.com/2017/04/review-mosaic-simple-window-... (can't vouch for it, as I haven't installed it yet)
Well, at that point, you're no longer using 4K (even if you use a 4K external monitor, OS X doesn't render at that resolution by default). Incidentally, if you do try to render OS X in 4K, it looks horrible, for the exact same reasons.
In my experience, the 4K experience is actually strongest on Linux, followed by Windows. OS X comes in last.
You chose to use a Mac to avoid HiDPI problems, I chose to use a normal DPI laptop.
It's weird though. Windows seems to do different things depending on the scaling factor and how you set the factor. 125% might look terrible but 150% is tack sharp. Setting 125% as a custom scale factor does something different than choosing 125% as a standard scale factor. Usually, you have to log out and log back in after changing the scale, but not always.
When you are working over remote desktop, it tries very hard to stop you from changing the scale factor. I'm not even sure if you can anymore.
Scaling one pixel to one-and-a-bit pixels is not physically possible; the pixel is a hard physical unit. There are some zoom ratios that smooth well on the screen and some that won't. I think it's mostly noticed in text; the eye is incredibly good at noticing minimal variations.
Actually, font rendering on Windows is already done at the subpixel level. A pixel square is made of 3 rectangular colored subpixels.
This is not really Ubuntu's fault, but mainly a problem with GTK2. Everything based on GTK3 scales quite well nowadays, and the same is true for Qt. Unfortunately, Gimp and Inkscape are two notable applications which have not yet been ported to GTK3.
Other than that, it's only the console which is not usable out-of-the-box.
Which is ironic, considering GTK is the GIMP ToolKit.
Linux has demonstrated itself to be compatible with nearly any hardware combination imaginable, to various degrees of usability. It runs on everything from tiny embedded microcontrollers to supercomputers. MacOS and Windows cannot say the same.
It is perfectly valid to criticize the frequency of driver incompatibility. However, this criticism should not be leveled at Linux. It should be directed at the hardware manufacturers who do not release Linux drivers for their products, or the software vendors like Microsoft and Apple, who forego open standards in favor of proprietary protocols.
The fact that hardware vendors can successfully produce and market a "Linux first" laptop shows that those vendors are responsible for any incompatibility in their other products. They've proved they can make compatible hardware, so why don't they do it by default?
Windows also works well on laptops specifically designed for MacOS. It helps that all the commodity parts that Apple uses were designed for other Windows laptops. Plus, Apple has put some effort into Boot Camp to ensure that Windows can be loaded, and Microsoft has probably spent some effort making sure that Windows runs well on Apple hardware, but that's one example of a machine not designed for Windows which still runs it.
Unfortunately, that's basically the only example of a laptop not designed for Windows that I can think of. Perhaps the Novena laptop is another? Are there others?
As far as EFI goes, Apple does not allow you to boot from USB. Apple allows you to boot the OS X installer via FireWire, but nothing else. This means your only options for installing Linux, Windows, *BSD, etc. are a CD/DVD or a new partition on the HDD/SSD.
Thankfully, you can shrink the root partition while running OS X, so there is a way to install Linux without optical media, but it is complicated.
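For anyone attempting it, the resize step itself is a single Terminal command on HFS+ volumes; the disk identifier and target size below are placeholder examples, so check `diskutil list` for your own layout first:

```shell
# Show the disk layout to find the right identifier (e.g. disk0s2).
diskutil list

# Shrink the OS X volume to 200 GB, leaving the freed space
# unallocated for a Linux install. disk0s2 and 200g are examples.
sudo diskutil resizeVolume disk0s2 200g
```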
I'm not sure if this is the same for their other models but I bet there's a big pile of Windows machines sitting in an EU warehouse.
To add insult to injury, I asked if they could improve the price in a web chat and the guy botched it so hard he added an extra £93 onto the price. Outsourced, off-shored support at its very worst.
In the end I dropped a fraction of the money on a next-day-delivered Thinkpad 13, tore Windows off it and raised another flag for glorious penguin.
Bought an XPS 13, maxed out, from Dell. Back then the website deceptively said 2 days shipping, but I didn't realize that meant 2 days to deliver it to the carrier and then 3 weeks from China to Germany. The website made it look like it was in Europe when I ordered. The BIOS gives smoke-detector-loudness (!) alarms at night when some hardware part fails and the laptop reboots.
Besides that, the hardware is crappy: it gets too hot, and it gives off high-pitched buzzing noises under load. The keyboard has problems too. I can't run Linux because I stupidly ordered NVMe.
Would never again buy Dell, I'm back with Apple for my next laptop.
I own a Dell XPS 13 9365, a 2-in-1 laptop. Dell have decided, arbitrarily, not to support Linux on this machine, but you can still actually make it run by disabling Secure Boot and setting the SATA mode to AHCI. Even then, Ubuntu doesn't seem to detect when you've rotated the screen for tablet mode or stuff like that -- but that's probably Ubuntu's driver support. The machine also can't wake up from suspend mode, which I suspect is a driver thing.
Problem is, if you then apply a typical, recommended firmware upgrade from your Ubuntu Software Manager, it bricks the laptop. The problem being, "applying" the firmware upgrade in Ubuntu just primes it to run when you restart the computer. When you restart the computer, the BIOS detects a present firmware upgrade and applies it, without apparently double-checking that, you know, this isn't gonna brick the computer. I now have to wait for a technician to arrive today and replace my motherboard, then figure out how to change those BIOS settings again so that I can boot back into Linux.
Maybe Ubuntu could have stopped this by detecting compatibility issues between the firmware package and the hardware. Dell's BIOS definitely should have stopped this by detecting the compatibility issue between its own settings and the firmware package. Alternately, if the issue is that, for instance, the BIOS won't check these packages because I've disabled Secure Boot, the BIOS needs to bloody well differentiate between Secure Boot for operating systems (which we want off, because we own our laptops, thank you very much) and for BIOS/firmware packages (which we want on, because why brick a good machine?).
Overall, great machine, but bad Dell, no cookie.
> Dell have decided, arbitrarily, not to support Linux on this machine...Ubuntu doesn't seem to detect when you've rotated the screen for tablet mode or stuff like that
I suspect that this is exactly the reason why Dell decided not to ship with Ubuntu on this particular machine. I'm not sure that GNOME has support for tablet mode yet; Unity 7 (16.04 LTS) certainly doesn't.
> Maybe Ubuntu could have stopped this by detecting compatibility issues between the firmware package and the hardware.
Firmware updates come directly from https://fwupd.org/ I believe. Ubuntu (like other distributions) doesn't have anything to do with the distribution and release management of the firmware blobs themselves; that's done entirely by Dell, I believe.
> Dell's BIOS definitely should have stopped this by detecting the compatibility issue between its own settings and the firmware package.
What exactly do you mean by "brick"? Dell explicitly don't support Ubuntu on this particular machine. If you had been running Windows, would you still be stuck? If not, I don't think you can reasonably blame Dell here. They don't QA with Ubuntu (or any Linux) on this hardware, and you knew this before you purchased.
Indeed, and yet here y'all are, saying that if I didn't want a brick, I shouldn't have run Linux. Ubuntu didn't do anything to the hardware, it just queued up a Dell-distributed, Dell-released firmware update to run upon BIOS POST.
Since this update was supplied by the manufacturer, I expect that at the very least, it can check for BIOS settings which render the update incompatible, and should most probably just, you know, install itself cleanly. It should never, ever brick the machine, because why in the hell is the manufacturer sending me something that bricks their own hardware?
>What exactly do you mean by "brick"?
By brick I mean brick. It fails POST and the several input combinations for resetting to a clean BIOS don't work either. Dell is having to send a technician to me to replace the motherboard, after which I can once again figure out the BIOS settings to run Ubuntu cleanly.
This doesn't mean that Ubuntu crashes after GRUB loads it. It means the BIOS no longer loads, period, let alone GRUB and Ubuntu.
>Dell explicitly don't support Ubuntu on this particular machine. If you had been running Windows, would you still be stuck? If not, I don't think you can reasonably blame Dell here. They don't QA with Ubuntu (or any Linux) on this hardware, and you knew this before you purchased.
Bull. They QA their own BIOS and firmware. The whole point of firmware-BIOS-OS separation is that the operating system never even speaks to the BIOS directly. BIOS runs the bootloader, bootloader loads the OS, OS proceeds according to a standard for how PCs are run. What I do to customize the machine is my responsibility, but proprietary updates from the manufacturer are theirs. If I've altered the BIOS settings to run a different operating system, the update needs to detect the altered settings and, if necessary, refuse to run. Then I can get on with my usage of the machine I bought.
It's not my job to customize their firmware, nor should their firmware updates ever have compatibility issues that cripple the BIOS. Period.
The complexity here is that the firmware updates shipped for Linux may well take a different QA path to the firmware updates shipped for Windows, as the distribution channels are different. Further, manufacturers usually try to unify (to some level) firmware updates for different systems into fewer actual binary blobs to reduce release engineering workload.
What keeps a complex system such as this working smoothly is QA. Dell cannot reasonably be expected to spend effort on QAing combinations they clearly do not support.
While the fact that your system was bricked is likely a bug that should not have happened, nevertheless I think it's unreasonable to blame Dell for this as viciously as you are doing because they do not support or QA that combination and cannot be expected to do so. That they're fixing it is the most I think you can reasonably expect.
I understand that you've accidentally ended up having a poor experience here. But you can't reasonably expect that to reflect on the experience others might get following a path Dell actually supports and can reasonably be expected to actually QA.
Assuming that they're all still Dell-made firmware blobs, I don't see why they'd be different at all. Anything that Ubuntu can "do to them" ought to ruin the signature. Only an authentic Dell firmware blob should actually make it through the BIOS' checks to installation.
It's like saying, "look, you opened our cryptographically signed file under our system, sure, but you downloaded it through an unsupported channel."
No, the update content in different OSes is identical. The firmware is being updated by UEFI capsule; the only differing part is what puts the capsule content into its space. There's actually a third way to do an update: straight from UEFI.
I'm not the only one, yet I haven't seen anything on the internet about this model. This is driving me insane; it feels like I'm always wasting power.
If not, you can't blame Dell for your choice.
(I'm the type of person to buy the car before choosing the tyres)
Otherwise the precedent might as well be set that, for instance, firmware updates are not required to work in the case that I've installed LibreOffice instead of Microsoft Word. After all, Dell made no commitment to supporting LibreOffice, so why should I expect the firmware update to work?
Well, because it's an orthogonal concern, and because they sold me hardware, dammit, not software experiences.
Honestly, I never understood why people go for Macbooks or the XPS. I've been an X/T model user for 7 generations.
I did look at the XPS but there are so many disadvantages with those modern one-body-all-closed-no-mods laptops.
Before you downvote me: I still applaud Dell's decision to get Ubuntu laptops out there. But I don't really see the big game changer in whether a laptop comes preinstalled with Ubuntu or I install it myself.
It means a certain level of support. They're actively trying to make this thing run Linux well, instead of leaving it up to "the community". And you don't have to go reading forums or blog posts before buying just to figure out if it will have some stupid UEFI bug or whatever that makes installing Linux on it a real hassle.
Although I prefer Thinkpads, I very much understand why people want Macbook-like computers – they look quite nice! You shouldn't have to fit into some black-and-blue mold just to run Linux.
It seems you get "a certain level of support", but I fail to see how this is better than the usual "buy a Thinkpad, install Linux, use Google if you have problems" cycle.
The touchpad. No Windows laptop I know even comes close to both the size and functionality of a Macbook's touchpad, although I have to admit I won't get a recent Macbook: the keyboard is crap, the touchpad is too big in my opinion, and no USB-A = no Macbook for me.
You won't notice the extra size, though, and you'll be glad of it when you do use it. I LOVE the bigger touchpad on the newer Macbooks (but it's about the only thing about them I love. The keyboard is absolute dirt, as you said; no USB-A is ridiculous given that all of my devices are USB-A; the touch bar is useless, and its virtual function keys are a very poor substitute for real ones... but the touchpad size really isn't an issue, you can use as much or as little of it as you like, and the palm rejection works perfectly.)
The game changer (IMHO) is that it becomes a sustainable economic model for Linux users to buy modern (rather than out of date) Linux laptops with good driver support.
I'm aware that Thinkpads generally have excellent Linux support for the basics; I suspect that this is because of a combination of a reasonably supportive manufacturer (in a not hostile sense) and the number of Linux kernel developers who use Thinkpads.
However, what about buying a new Thinkpad that just came out? Now I'm not sure whether everything will work and what non-working bits will continue to not work for years.
With a pre-installed Linux, I have more confidence that everything will work from day one, and (if this continues) that this is a sustainable model. Because some of my money actually gets allocated against Linux support, rather than being misdirected to Windows support.
Our company recently bought Dell laptops with Linux pre-installed, and I had a hardware issue with sound (neither Windows nor Linux recognized the sound board, showing this was a hardware problem). I got in contact with Dell support, and even after I said I was using an unsupported OS (Arch Linux, instead of the original Ubuntu) they happily RMA'd the whole thing, and nowadays the notebook is working without issues.
The T420 I owned on the other hand had a rather plasticy feel.
They have excellent performance in a relatively travel-friendly size and weight. What's not to like? (Disclosure: I own an XPS.)
I personally cannot fathom why anyone would ever use a factory-installed operating system of any kind. It's only your computer if you installed it yourself.
Done right, a manufacturer-installed OS can have everything set up right for the specific hardware of that machine, which is a much nicer experience than having to check the details yourself. (A lot of manufacturers don't do it right of course).
A laptop sold with Ubuntu means that you don't have to pay for Windows (>100€). So it's cheaper.
Furthermore, you have guarantees that linux support will be better than average.
I could see other organizations, ones not so disconnected from Dell EMC, being forced further toward Windows. Their solution is to run Linux in a VM, which is just unfortunate given the work and the product.
Don't try to use the GPU and the CPU at the same time for longer than 5 minutes or you will be very disappointed (I own the same model).
Chromium scales on its own, but some sites are too small, like Hacker News. It does remember zoom settings, though, and 125% works here.
For Steam I found a skin that makes it slightly larger but still too small, so I just run it in big picture mode.
I run xmonad + GNOME fallback so I get easy notification icons and media-button support. The panel needs to be set to 48 pixels in height, but I haven't figured out how to get all the icons to full size. I set the xmonad border width to 3px.
Gimp has a hidpi skin but it doesn't work very well.
This script will scale things up and it works in a pinch but isn't very elegant:
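(The original script isn't reproduced here; as a rough sketch of the idea, assuming an X11 session with GNOME/GTK settings available, and with all values as example assumptions rather than tuned numbers, it might look like this:)

```shell
#!/bin/sh
# Rough HiDPI bump sketch. All values are example assumptions;
# adjust to your panel and taste.

# Raise the X server's reported DPI for apps that honor it.
xrandr --dpi 144

# Double the GNOME/GTK interface scaling factor.
gsettings set org.gnome.desktop.interface scaling-factor 2

# Tell Xft-based apps (older toolkits, xterm, etc.) about the DPI.
echo "Xft.dpi: 144" | xrdb -merge
```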