It offers a lot of attractive features for what I imagine to be the typical HN demographic.
That being said, it's still got rough spots that OSX doesn't. It works great when it works, but when it doesn't....
Font rendering, display/compositor fragmentation etc...
Inb4 the anecdotal "well it works for me I just had to download the xf86 font library and compile with a legacy glibc version..." crew comes in with a thousand and one rebuttals. Problems like that are still a suboptimal user experience, no matter how you slice it.
I'd definitely consider a Linux daily driver for some of my work, but there are things that are just going to be less painful on Apple.
I've been using Ubuntu exclusively for well over a decade for everyday work, only occasionally dropping into a Windows VM, and it's really been perfectly fine, great even, as a developer. I bought an X1 Yoga the month after it was released and Ubuntu installed perfectly on it; the only thing that didn't work out of the box was the fingerprint reader.
Until a year ago, battery life wasn't as good as Windows/Mac, but it's very good with the latest versions.
Proprietary software that some people need to run, now that's another issue, but for most development tasks it's fantastically manageable and accessible, a real pushback against closed systems.
You mention the fingerprint reader not working out of the box - with my laptop, Fingerprint GUI was basically waiting on one of its dependencies to somehow figure out how to integrate my fingerprint scanner. Things like my active-stylus capable touchscreen weren't supported, and there were no applications to really utilize it even if it was.
I switched to Windows as my primary OS when I realized Windows PowerShell had basically become on par with Linux in almost every respect and that VS Code was as cushy a development environment as I could hope for. The only thing I've found that isn't supported out of the box is Redis, but I downloaded a ridiculously lightweight version of Ubuntu from the Microsoft Store (we're talking a <1 MB memory footprint) with one click and was then good to go.
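For anyone curious, the Redis part is roughly the following. This is a sketch assuming the stock apt-based Ubuntu image from the Store; package names may differ on other distributions:

```shell
# Inside the Ubuntu distribution installed from the Microsoft Store:
sudo apt-get update
sudo apt-get install -y redis-server

# WSL has no init system, so start Redis by hand.
redis-server --daemonize yes

# A healthy server answers PONG.
redis-cli ping
```

Note that the daemonized server only lives as long as the WSL session does.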
The other thing that really impressed me was all the easy to use tuning software. With ThrottleStop I was able to easily under-volt my processor to completely eliminate things like thermal throttling and improve performance all while greatly improving my battery life. Nvidia support is also way better so I can turn off my graphics card for anything but games - and then there's MSI Afterburner to under-volt my GPU when I am using it.
And yeah, not only do my fingerprint scanner and stylus work on Windows, but Windows has Windows Ink built in so I can easily take screenshots of whatever I'm doing with Snip & Sketch and annotate them with a pen in an instant, and I can use Sketchpad like an on the fly whiteboard when I need to do some math.
Plus, it has art programs like Krita that basically turn my laptop into an iPad Pro when I feel like getting artistic.
And with programs like Enpass, I can use my fingerprint with Windows Hello in place of my master password for stuff like logins and credit card information, which is a lot more secure for someone like me that does a lot of my work from coffee shops.
I still love Ubuntu, but all the offerings of Windows 10 have kind of made me a Windows fanboy and even make MacOS seem like a decisive downgrade.
Not trying to discourage you from using it like that. It's perfectly fine as long as you realize that the fingerprint is only secure against random people on the street or not very competent attackers... which is fine and is probably enough for most scenarios!
But now, on the topic itself: the Windows Subsystem for Linux is perfectly fine for a lot of things, but there are quite a few issues. All files accessible from Windows will have 777 permissions, for example, and a few applications have trouble with that. Another thing many people stumble over: daemons exit as soon as the last terminal closes.
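For the 777 issue specifically, newer WSL builds can persist real Unix permissions on Windows-mounted drives via DrvFs metadata. A minimal sketch, assuming a build recent enough to support /etc/wsl.conf:

```ini
# /etc/wsl.conf inside the distribution (recent WSL builds only).
# "metadata" lets chmod/chown stick on /mnt/c instead of everything
# showing up as 777; umask tightens the default for new files.
[automount]
options = "metadata,umask=022"
```

You have to restart the WSL session for it to take effect.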
And YMMV on the issues you mentioned; everything you listed is completely uninteresting to me, personally. A good window manager such as i3wm offsets pretty much every bit of shininess Windows 10 has for any development purposes.
I do use Windows for everything else though (and am sadly forced to use it at work as well).
Not sure how Throttlestop compares to the latest Linux options. I've got many containers going, three instances of vscode, a zillion tabs, performance is not an issue.
But, I realize not everyone cares about a free and transparent world, even when it's more or less as good. It must at least encourage companies like Microsoft to keep opening up and getting better.
Here is the point. It absolutely is fine as a developer. But my parents would never be able to get accustomed to Ubuntu or any other distro. It was hard enough to make them use email.
The truth is that the vast majority of people just want things to work. Like turning on a TV without any setup. Hell, people pay electronics stores 100€ to plug in a cable and run the "find channels" function.
OECD studies have shown that more people than one would think are incapable of using the search function in email.
However, your example of fonts is definitely not one of those areas, anymore. Font rendering on Linux is as advanced and capable as any other OS including in the areas of kerning and hinting. It should just work without any user intervention and look great.
I agree with you that compositing/window manager fragmentation is a problem. And this article is a perfect example of that. The author may think that they're happy using i3 with Firefox and st at the moment, but the desktop computing environment has gotten so complex and the expectations of users who interact with desktop applications so rich, that a small hobby project DE/WM cannot fully satisfy all of those use cases over the long term. The only two desktop environment projects that have enough resources to meet the needs of users behind them are KDE and GNOME. And we shouldn't be telling new Linux users to try anything but those two.
FWIW, I've worked at Google for 7 years where we use Linux on our workstations. Many engineers fiddle with various "hacker" window managers like i3, Sway, Awesome, or fvwm. They almost universally give up and switch to GNOME(/Cinnamon) or KDE: it's just too fiddly/not complete and they'd rather use their brain power for solving real problems. The author's example of messing around with dmenu because "unix philosophy" is an example of the kind of time sink that people eventually get tired of because they have better things to spend their time on.
I like my desktop lean and mean. I do not want distractions. When I am dealing with a remote system crashing under load, the last thing I want is my desktop or my shortcuts to behave in weird ways. Things must always work, in a consistent way. Funny thing is I can only get that in Linux... and in Windows 10.
Customization is a feature, just not everyone needs that feature.
So I disagree with your assessment, as some users will find Gnome or KDE too distracting.
That doesn't leave much of gnome.
Unless I'm misunderstanding, you can use Alt + left click to move any window around, and Alt + right click to resize. No need to reveal the title bar except perhaps to read the occasional title.
I never thought I could move away from LXDE but I think I'm going to soon.
At the moment I'm considering Sway mostly because of the wide use and community, but it's a "long term" project this month or the next :-)
Users have different needs and preferences, so I see the fragmentation as a positive thing because it gives people choice.
For me personally, xmonad, one of these hobby projects, has been perfectly sufficient for the last 10 years. I also didn't really touch my configuration much in the last ~8 years. More importantly though, I find it actually reduces my mental workload since I no longer have to handle window placement myself.
No matter the thing I'm doing, every useful or efficient placing of windows is always no more than one or two keystrokes away.
This isn't entirely true. AIUI, Apple enables LCD filtering and subpixel rendering by default, because it knows that you're using an LCD and what the subpixel order is. However, these are usually toggleable via the GUI, and even without them it usually still looks fine.
External displays are nowhere near the normal use case?
The highest supported resolution on a MacBook is still scaled down. Only a couple of expensive LG displays match the actual density as far as I know.
The MacBook Air only got a Retina display a few months ago. The low-end iMac is still 1080p.
> but there are still things that are just going to be less painful on Apple.
Of course OSX is more user friendly now, but the Linux desktop has improved by leaps and bounds. Four years ago you needed to be a developer or extremely savvy to run the average desktop distro; now I would say you just need to be tech savvy. I would argue that for ho-hum business cases (not extreme use case profiles like design and video production) Ubuntu won't cause any unnavigable issues.
I think in a few years you will see Linux continue to grow in popularity, especially among developers. Laptops have become commodity items. There just isn't that much that differentiates (for me at least) a MacBook from a good ThinkPad.
As soon as auto-configuring XFree was kinda figured out, out goes XFree and in comes X.Org. Xorg getting to the point where having 3D animations doesn't require kernel-module-config expertise? Out goes Xorg, in comes Wayland. Gnome 2 worked out the kinks? Time for Unity! KDE 4 finally getting snappy? Time to break it up! Init systems figured out? Systemd! ALSA getting adoption? PulseAudio! PulseAudio finally working? Let's rip it out! And so on and so forth, in an endless churn.
Now, this sort of churn also happens in commercial alternatives; but stuff gets shipped when it’s 99.9% working, left running for years (or decades, if from Microsoft), then maybe gets rewritten with something that must be better (no regressions) or it won’t even ship. In the Linux world, it’s all just thrown over the wall; maybe you’ll be lucky and it will work on your machine, and maybe it won’t. By the time it gets fixed, it will be time to replace it. And so the experience is a perennial struggle against half-finished, unpolished software.
I don't think there has been a single inflection point; for me there has been steady incremental improvement.
If you want to think about how far things have come I started using Linux in 2001 with Mandrake. Around 2.2 -> 2.4 kernel switch. So much has changed since the bad old days. I don't want to throw out a "back in my day we walked up hill in snow both ways" style rant but...
-All we had was EXT2 and we liked it...
-You had to manually configure modelines for your video card; changing display resolution was more or less a crapshoot
-Apps would exclusively lock the sound card, which typically meant the first thing you opened would be the only thing capable of playing sound. But you could pipe things to /dev/dsp and have the speaker emit random beeps, which was kind of cool
-The window manager used to crash a lot and you'd lose the title bars for all your windows; this happened fairly often. Ctrl+Alt+Backspace is still in my muscle memory years later.
-Printers were basically impossible to configure.
People complain about changes like ALSA, PulseAudio, etc., but I think a lot of rose-tinted glasses are being applied to how things were before. Sure, some things aren't perfect, but neither were their predecessors, and on the whole they fixed more things than they broke.
So I can agree that “Linux will get smoother”, because progress is more or less inevitable, but “linux will be as smooth as [Windows|MacOS]”, as upthread implied? Never going to happen.
There are a lot of use case profiles that are very ho hum for which Linux is a non-starter. I'm thinking specifically of basically any time you need to use Mac-only software. Likewise, I'm sure there's Linux-only software that would make OSX a non-starter.
IMHO, all of the major modern operating systems are good enough and have been for a while now. Pick the applications you want then find the OS that best supports those applications.
It would be nice if the Purism guys would open some retail stores. Once you can walk in some place and get help it gets to be a lot easier to recommend those machines to less tech-savvy relatives.
Yes Hollywood is pretty much sold on Maya and Houdini on Linux, but they use their own in-house distributions and have no issues dealing with binary blobs for performance.
Meanwhile my Asus netbook sold with Linux still can't do video decoding on hardware nor OpenGL 4 support, in spite of DirectX 11 class hardware, because AMD decided to reboot their driver development.
Font rendering the Apple way is a style choice at best.
What does "display/compositor fragmentation" mean to end-user?
Given how much of a hacker's usage of a computer is working with text (reading pages, writing code and documentation, taking notes) it's a style choice that actually has a significant impact.
(By the way, macOS Mojave deprecates subpixel antialiasing — a poor decision when there are many non-hidpi displays still being used)
That was the time of sub-100dpi screens though. With 300dpi (Retina) screens these days it does not matter much anymore.
My 1Password Firefox extension (Ubuntu 18.04 LTS) doesn't work on X unless I log in through the website, but it works just fine on Wayland.
Hey there! Beyer from 1Password here. It sounds like your issue might be related to my post here: https://goo.gl/cdhFbz
The good news is the underlying bug that was "breaking password fields in Firefox" was recently resolved. You can read about it here: https://goo.gl/uFv5rL
If you are still having an issue using 1Password X after updating gnome-shell, please reach out to us at email@example.com, and we'd be happy to help!
Thanks for using 1Password!
The practical implication for the end user is that companies/entities that write any GUI-enabled software for Linux are forced to make decisions about what (if any) OS they'll support.
And just like that, we've waded into "cracking open the window manager" just to figure out what's going on.
Edit: akiselev said 1Password, not LastPass. My mistake. Interesting that I experienced this same issue with LastPass.
After I login through the 1password website, the extension window opens just fine so it might be some input security service, but how would a Firefox extension even have access to a system service like that except through Firefox's built in APIs?
With KeePassXC you have a native application and an extension, which communicate with each other via a socket. The native application can use whatever native APIs it wants.
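For context on how an extension talks to a native program at all: the WebExtensions native messaging transport frames every message as a 4-byte little-endian length prefix followed by UTF-8 JSON. A small sketch of just that framing (the message contents here are made up, not KeePassXC's actual protocol):

```python
import json
import struct

def encode_message(obj):
    """Frame a dict for the WebExtensions native-messaging transport:
    4-byte little-endian length prefix, then UTF-8 JSON."""
    data = json.dumps(obj).encode("utf-8")
    return struct.pack("<I", len(data)) + data

def decode_message(frame):
    """Inverse: read the length prefix, then parse that many JSON bytes."""
    (length,) = struct.unpack("<I", frame[:4])
    return json.loads(frame[4:4 + length].decode("utf-8"))
```

In a real native host, these frames are read from stdin and written to stdout; the browser only lets an extension exchange them with a host the user has registered.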
However, back to your case. What's weirder: if it is a pure Firefox WebExtension, Firefox (still, by default) launches as an X11 application under Wayland, so the extension should have no way to know the difference.
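One way to narrow it down is to take the backend choice out of Firefox's hands. These are the standard GTK/Mozilla environment variables, though native Wayland support was still opt-in in contemporary builds:

```shell
# Force the X11 (XWayland) path explicitly:
GDK_BACKEND=x11 firefox

# Or opt in to native Wayland (experimental at the time):
MOZ_ENABLE_WAYLAND=1 GDK_BACKEND=wayland firefox

# Under a Wayland session, windows going through XWayland
# still show up as X clients:
xlsclients
```

If the extension behaves differently between those two invocations, the backend really is involved somehow.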
Someone really needs to make Linux distros just work when hooking a laptop up to a projector. The fact that there's so much trouble that's so public and faced with so much concentrated embarrassment is a serious ongoing PR impediment to Linux.
Linux distros got printing licked across the board, so it can happen.
How about a Linux version of Airdrop?
This was the story with printing in Linux as well, back in the day. It would work fine for some, and be a nightmare for others.
This also happened with printing. People would say it worked fine for them, then point out that scroungers on quirky old laptops were getting what they deserved. Really, the fault wasn't those quirky old laptops, but rather fragile and not so well standardized software.
(fwiw the sway 1.0 betas, which are more or less i3 ported to Wayland, handle this type of case beautifully, and unless my kernel does Strange Things and panics, I basically plug and play into whatever I want, much like one would expect from the GNOME/KDE experience)
The amount of time that joke's been around should be taken as a sign of a persistent condition.
> unless my kernel does Strange Things and panics, I basically plug and play into whatever I want
The kernel doing strange things and panicking doing plug and play was never a thing on OS X when I started using it. Seems to me it's not been a thing for Windows since before Windows 7.
It's long been speculated that one problem with Linux is cultural. Are Linux desktops trapped by cultural expectations?
The only OS I haven't had kernel panics on in recent memory is, indeed, Windows starting with 7, but I have a long list of other reasons I can't/don't/won't use it as my daily driver.
I'm still rocking my 2012 Macbook Pro. It's been solid.
I can only stand to do my development on a windows machine by using a Linux VM.
Really? Because I definitely got the impression it was a widespread joke in that circle of devs. Also, when I saw it happen to the last poor sap, it happened on Bionic Beaver. Is this more of the Thermocline of Truth?
From everything I've seen a projector "just working" seems to be the exception rather than the rule.
My experience with Macbooks has been excellent. Even Airplay over an Apple TV just does the right thing with presenter view in Keynote just working. The one time I had a rotten experience on a Macbook was when I had to use Zoom.
As for the font thing, maybe I'm not a fontphile or something, but they're usually fine for me.
And you know what? It's mostly true, but it's also true for me in reverse.
Mac OS has definitely improved in the last few years... that being said, it's still got rough spots that Fedora 29 doesn't. For me, this would be just as true of a statement.
Maybe not the part about "had to download[...]", which felt a bit biased.
Everyone has their own view of rough spots but those aren't mine.
The rough spots I'm experiencing with Linux are very specific. Namely lack of support for some proprietary VPN solutions like Junos Pulse for example.
Also lack of native clients for software like Webex Teams, forcing you to use their web apps which use up so much resources that I'm convinced they've caused my laptop to stall a couple of times.
And of course, perhaps related to the issue above, anything relating to graphics does need work.
The major positive thing I can say about using Linux daily in work and personal life is that it works so well that when it fails you get very annoyed. That's a good sign. It means that it's rare enough to annoy me. If it was too common I wouldn't be surprised when it fails.
I also switched back from Mac to Linux, 2 years ago.
But I use vanilla Gnome 3 on Fedora. Before Mac I used tiling window managers, but now I don't see the point: they're just so much configuration to handle, which Gnome does without a single line of config or shell code.
There's no rhyme or reason.
Sometimes, my wifi is broken. On a different distro, it's not.
Right now, under Ubuntu 18.10, suspend/resume is broken for me - it wasn't on 18.04.
I have the official Linux laptop, basically - a Dell XPS 9350, all 100% Intel hardware, no binary blobs.
The Killer Wi-Fi card has none of the connectivity problems I read about almost everywhere for this model.... but has all the throughput problems, running at about 15% of the throughput/bandwidth of my Chromebook (an Intel AC card) sitting right next to it as of the last time I tested.
I'd definitely not call Linux consistent, but it's better than it was when I started using it back in 2006-2007ish, and I wouldn't trade it for any other setup (and it's not for a lack of trying: my primary work machine for 2.5ish of the last 6 years was a Macbook Pro of some sort for one reason or another)
No, my experience is much more along the lines of, "I download and install Ubuntu or Linux Mint, and then it works without further issues."
My current Dell Inspiron 7000 didn't need a single driver installed manually (with Ubuntu). Pretty impressive.
Yeah, and that's true of MacOS and virtually every other piece of technology.
So take a user from Windows 10 or OSX whatever, and sit them in front of your favorite stable Linux distro running a terminal emulator and a browser of your choice.
Are the fonts going to be rendered in a way that is unobtrusive for those users, or will things look ugly and difficult to read?
Because I'm running a chroot of Debian Buster on my Chromebook and boy, that terminal sure does look blurry. (And is this a bug that will be fixed when Buster stabilizes, or am I expected to go read some font wiki on a different machine to "guess-and-check" it back to sanity?)
I don't want to switch to OSX. But I also understand why someone wouldn't trust the UX of a system that ships in "headache-mode" by default.
The XPS machines we have are ugly, they're flimsy, they have a grotesque carbon-fibre pattern on them, the keys leave imprints on the monitor after the lid has been closed, the fan drowns out the sound of music in my apartment, the camera is situated about 5mm above the keyboard with its rickety keys, it is without a doubt the ugliest, the worst computer I have owned, I eat tramadol just to handle the back-pain from carrying it around in my rucksack, but I'm much less angry and frustrated working on it than I am on a Mac because it doesn't shit its pants when more than a few containers are running.
It's also very pleasant to touch and hold in your hands, especially if you like the Thinkpad aesthetics.
The fingerprint reader doesn't work though, not that I know what I would use one for.
I use it to log in, plus to auth for anything that requires root. I’m doing some coding so occasionally do something that requires sudo in a terminal. A prompt appears and asks me to swipe my finger. It’s very neat.
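In case it's useful to anyone: on most distros this is fprintd plus a one-line PAM change. A sketch, assuming Debian/Ubuntu-style PAM files (the file name and the fallback line vary by distro):

```
# /etc/pam.d/sudo -- try the fingerprint reader first, fall back to password
auth    sufficient    pam_fprintd.so
@include common-auth
```

You enroll a finger once with fprintd-enroll, and from then on sudo prompts for a swipe before asking for a password.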
I got a personal 2016 XPS 13 (9350) because of Project Sputnik (https://bartongeorge.io/tag/project-sputnik/), which brought official desktop-Linux support to it. I also work at Google where we have a very solid desktop distribution, providing me a working example of almost every tweak I might want to make. Owning the hardware has been a great experience after two changes. First, I swapped out the Killer wifi card for an Intel card. Unfortunately this is table stakes for any machine with that card. Second, I thought the spring on the keyboard's keys was much stronger than another XPS 13 that I'd test-driven, so I rolled the dice on eBay and bought a replacement for something like $20. I swapped it in, and it indeed restored the light touch that I liked about the other one. I wouldn't expect many people to do this.
As for software, I've settled successfully on Ubuntu 18.04 LTS, running the same Cinnamon desktop environment we use at work. A key part of my success has been committing to Ansible for configuration management. When I make a settings change I want to keep, I figure out where it's persisted, and then upstream it into my Ansible repo. This has the three-part benefit of teaching me a little about the system (mostly dconf), creating a worklog of changes I've made to the machine, and giving me the psychological comfort that if things get really bad, I can reinstall the OS plus all my tweaks. (Just to be clear, these aren't
I've since expanded the Ansible setup to manage a tiny target-practice server I keep on a cloud service, and over the holiday break this year I successfully set up an old desktop in a closet by (1) installing base Ubuntu, (2) unzipping my Ansible repo to it, (3) running a bootstrap script in the repo that installs python, git, and ansible, and runs Ansible on itself, and then (4) drinking a cup of coffee. By the end I had a machine that was substantially identical to my XPS 13.
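I don't have the actual bootstrap script handy here, but the shape of it is roughly this. The file and playbook names are made up; only the apt and ansible invocations are the standard ones:

```shell
#!/bin/sh
# Hypothetical bootstrap.sh: step (3) from above, reconstructed.
set -eu

sudo apt-get update
sudo apt-get install -y python3 git ansible

# Run the repo's playbook against this machine itself
# ("localhost," with -c local skips SSH entirely).
ansible-playbook -i localhost, -c local site.yml --ask-become-pass
```

The nice property is that the same playbook works whether the target is this machine or a fresh cloud box.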
I'm sure I'm proving someone's point that Linux isn't ready for the desktop. But from my perspective, the Linux-hardware interface is excellent on a Dell XPS 13 because of Project Sputnik, and if you can factor out the likely dozens of hours I spent overengineering my personal Ansible configuration system, I now have a reliable, easily reproducible desktop setup that rivals the manageability of a Chromebook, which is my personal gold standard for desktop statelessness.
What Mac were you using previously? The 13-inch MacBook Pro is heavier than the 2.67-pound 13-inch XPS.
Our development environment has been with docker on Mac for a few years. It's definitely slow, but we found it acceptable. Do you happen to use a very large image with tons of IO work?
And, it installed Ubuntu (mate) without a hitch. Everything worked out of the box except the font scaling. I suspect that's a Mate specific problem though. Gnome may work better.
Same with my Late 2013 MacBook Pro.
> the keyboard with its rickety keys
I haven't used the XPS keyboards extensively but they seem to get good reviews and in my limited experience, I definitely prefer them to the new MacBook keyboards (which I've used quite a bit).
> I eat tramadol just to handle the back-pain from carrying it around in my rucksack
The XPS 13 is actually lighter than the MacBook Pro 13 and the new MacBook Air. The XPS 15 is half a pound heavier than the MacBook Pro 15, which should be barely noticeable when carried in a backpack. I am not quite following.
Some struggled with font rendering between a 1080p monitor and their Retina displays as well.
This is one area where I'd have to concede the Apple experience is objectively suboptimal.
However, the Python ecosystem isn't doing anyone any favors here.
The C++ ecosystem sucks a lot when you have to build 5 libraries using 5 different build systems. Linux package managers do that for you.
Or, God forbid, you want to use clang or gcc. Or intel's whatever.
Plus there are DLLs and COM libraries as well.
Apple isn't, either. They ship a version of Python by default that is woefully out of date and conflicts with any other Python that you install, and what's more, some of their tools (notably, LLDB) freak out if you have a newer Python on your $PATH.
I'm still not sure I know the best way to install Python packages. It seems someone has something bad to say about any given method (pkg mgr, pip, venv, etc).
These days I tend to recommend using https://pipenv.readthedocs.io/ or https://poetry.eustace.io/ which will automate creation of those virtual environments and provide a nice interface for managing versioned package installs (i.e. `pipenv install foobar` will record the exact version you installed, including SHA-256 hashes, for repeatable installs).
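A minimal pipenv session, to make the workflow concrete (the package name is just an example; Pipfile.lock is where the pinned versions and hashes end up):

```shell
pip install --user pipenv

cd myproject
pipenv install requests   # creates the virtualenv, writes Pipfile + Pipfile.lock
pipenv run python -c 'import requests; print(requests.__version__)'
pipenv graph              # show the resolved dependency tree
```

Committing Pipfile.lock is what makes installs repeatable across machines.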
If you're in the Red Hat world, note also that they're making some big (and I think welcome) changes to isolate the Python they use from the version developers use, which will also allow tracking newer versions faster:
If there’s no requirement from some OS feature to have a particular package preinstalled, I would rather not even have it in the base system if I’m just going to have to replace it with an updated (and updateable) version.
Linux and its multiple variants have been horrible. Also, you commented about Python 2-3; I agree, I had issues with OS X. But you are ignoring the fact that installing software on Linux is still a mess of resolving all the dependencies through the package manager. Whereas in OS X, it is just a matter of moving a file into Applications.
Nowadays, I run dev code on a Linux docker container. I do pretty much anything else on OS X.
As far as installing apps, I guess it depends on what you're trying to install. In my experience installing apps is easier on Linux than on any proprietary OS, so long as you're installing open source applications and working within the package manager and keeping your system up to date. If you want to run proprietary software on your open source OS, then yeah it's more difficult since Linux distributions aren't really designed for it.
My second mainline Linux install was a copy of RedHat on a laptop.
Mind you, this was RedHat 5.1 on an old 486 laptop with 8 meg of RAM, PCMCIA, etc. Sometime in 1995 or 96, I forget.
Several re-compiles later, I had that entire system working - all drivers for all the hardware, including the built-in modem (plus sound and PCMCIA ethernet).
I got lucky there.
I had an Asus laptop that would only pickup wifi if I hibernated it first. I have a Dell touchscreen model that took a full day to get the touchpad to work at all as Ubuntu kept defaulting to the screen. Too afraid to do a fresh install of the current (or any other) distro because I can't remember how I fixed the touchpad issue.
> But you are ignoring the fact that installing software on Linux is still a mess of resolving all the dependencies through the package manager. Whereas in OS X, it is just a matter of moving a file into Applications.
This is also highly dependent on specific experiences. I haven't had any issues with conflicting dependencies on Arch Linux in the past few years. And personally, I appreciate having a system package manager for all software instead of having to download applications and drag-and-drop them to install.
BTW, on either platform your best bet IMO is installing all your scripting languages within some kind of version manager. I like pyenv, rbenv, and nodenv since they work exactly the same way between the three languages.
Similarly, TeX installation is hard, it's forcefully shoehorned onto the system, and you can't do much better in the current situation.
My solution is to run a Linux VM for these situations. For programming and server-like purposes, a headless minimal installation is fine. For TeX authoring, an XFCE installation is very handy and not impacting battery life in a visible way.
Installing MacTeX has always been one of the most straightforward things I can imagine.
When they were working on these problems, I was in the middle of my Ph.D., and I needed a stable TeX installation, fast. I installed a Linux VM, all my woes went away, and that method just stuck.
I just checked the MacTeX page now, and it looks like they solved the problems I mentioned above, but currently I'm too lazy and need that TeX installation keep working, so I'll not retry it now.
OSX comes with some software, such as git and python2, preinstalled. However, Apple does not usually keep that software up to date and there is no way for you to change that, which means that community package managers need to be used to install more up-to-date versions alongside the system ones.
1. brew install pyenv
2. pyenv install 2.7.x
3. pyenv install 3.x
4. cd work; pyenv local 3.7.x (by default, all projects under "work" will use 3.x)
5. cd work/legacy; pyenv local 2.7.x (but this one will be on 2.x)
then for each project, I'll create a separate virtual environment.
For various reasons, I have about 6 separate python versions and a dozen or so mini projects all working flawlessly as separate virtual environments created under separate versions managed by pyenv, and I always have the latest pyenv thanks to brew.
I much prefer something that I can kick off to play an album or a playlist and have it all but vanish from sight, either into the taskbar or, preferably, into the system tray.
Any time I was posting there included screenshots of 'ranger', various permutations of 'htop', and a 'Matrix' style scrolling character feed.
You never cycle through apps, as it's not necessary.
Most of the apps are in fullscreen or tiled depending on your need. No window is hiding another one.
When you're used to it, having to move your hand to the mouse or trackpad and click on stuff feels painful. Then keyboard-driven apps (which terminal apps are usually really good at being) make total sense, because you still don't want to have to move your hands and click on stuff.
Basically, usability shifts from graphical apps made for the mouse/trackpad to keyboard-driven ones. It's quite a different way to manage windows, one that usually has a learning curve since it's keyboard driven, but it's really efficient.
After using i3 for about 3-4 years, I have to admit that I'm always a bit lost on floating WMs now and feel like an idiot for a few minutes until I'm somewhat comfortable again.
To me tiling WMs are just vastly superior.
Anecdotally, most people I know that have seriously tried to use a tiling WM usually end up sticking with them.
The great thing about i3 (and also other tiling WMs) is that they are very flexible and customizable to your preferred workflow.
Together with them being centered around the keyboard and easily scriptable, you can use them in a way that works best for your preferences and your environment.
For coding on my laptop, the active main workspace is split into three parts, with the editor taking the left half of the screen, and the right half split into browser and terminal (with the terminal being a TAB container with multiple terminals that I can quickly switch between).
When I need to focus on either the editor or the browser, I make them full screen (I have shortcuts to jump between full screen browser and full screen editor directly with one keystroke).
On other workspaces I have setups with music player, email, Jira/Bug tracker/etc that are always launched in the same configuration on boot and are also just a shortcut away.
So it works great on a laptop too, even if you spend a lot of time with one app maximized.
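A few lines from an i3 config in that spirit (a sketch; `$mod`, the workspace numbers, and the comments are illustrative, not the poster's actual config):

```
# Sketch of i3 config lines matching the workflow described above
bindsym $mod+f fullscreen toggle     # jump to a full-screen editor/browser
bindsym $mod+1 workspace number 1    # coding workspace
bindsym $mod+2 workspace number 2    # music / email / issue tracker
bindsym $mod+w layout tabbed         # make a container tabbed (e.g. terminals)
```

Since the whole config is a plain text file, setups like the three-way split described above are easy to script and reproduce on boot.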
Sure, it felt like I was moving faster when I used keyboard shortcuts to fly around but at the end of the day, switching between windows was never a real bottleneck to begin with.
FWIW I used i3 exclusively for 2-3 years.
I switched to i3 because I was just frustrated with what I had used in the past. I wanted a simple window manager without animations (disabling them on my phone was also a huge gain in terms of usability), one that just works and does not get in the way (at the time, I had a slow laptop on which Gnome was painfully slow). The cost was basically writing a configuration file matching what I wanted and learning a few keys to switch workspaces and windows, but that was it.
I agree with you that these are micro-optimizations time-wise, but the frustration can be real and it gets in the way when you use your computer all day long. The simpler, the better I'd say (although it doesn't have to be a tiling window manager).
I haven’t used a computer that ran Gnome slowly in a long time, so that has something to do with it.
Most of us Linux users find you a bit odd. That's just how it goes.
The shell is a very useful interface. It allows abstractions that just aren't tractable with GUIs. Many of us have found that those abstractions are quite useful for fun things, too.
It’s lightweight, responsive, and makes me feel like I have much more control over my music.
I have it always open in the corner of my screen with a tiling window manager.
Play, pause, etc. just work without the terminal. The biggest advantage for me, in comparison to most graphical players, is that there is no startup time. If I want to play a song I can do so immediately, even without leaving the terminal I am currently working in.
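Assuming the player in question is mpd (an assumption; the comment doesn't name it), controlling it from any terminal is a matter of one-liners with the mpc client:

```shell
mpc toggle    # play/pause the running mpd daemon
mpc next      # skip to the next track
mpc status    # show what's playing
```

Because mpd runs as a daemon, these work from any shell, keybinding, or script without a player window ever being in the way.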
I'm so used to playing all of my media with mpv that using any kind of GUI app would feel like sludging through molasses.
It's less clutter IMHO. But I don't use a tiling WM.
This is exactly why I've been switching back to Linux everywhere, including laptops. Apple has done a pretty good job recreating the modern Windows experience.
I run arch with gnome 3, chromium (+chrome), and vs code as my main dev machine.
No emacs or vim required. No text-only interfaces. No terminals unless you open one yourself (in which case it's running Powerline with git add-ons and is still a very nice visual experience).
Gnome 3 with dash to dock is basically a drop in replacement for the osx style desktop. It just works.
Better yet, I'm free to get a better-than-OSX experience on any hardware I want (and I REALLY don't want the new MacBook Pros - I'm stuck on one at work and it's... bad).
Between work, gaming, and personal projects, I'm running OSX, Win 10, and Arch. They all stay synced with Dropbox and Google Drive. I get Unix-style terminals/shells everywhere (cmder/ConEmu for Win 10), and life is generally lovely.
Of the three, I prefer Arch, then Win 10. OSX is last by a mile. I think Apple has a flaky OS, and their hardware is rapidly moving in a direction that doesn't suit professional needs.
There's no rule that you ever have to open either emacs or vim on Linux (though, personally I think you're missing out if you've only spent 8 minutes in emacs, but that's beside the point). In fact, you can just not install either emacs or vim, and thus make certain you don't ever open them by accident.
The one I ended up using had the Budgie desktop, which I found to be the most "Mac-like" of all the experiences. The distro I used was Ubuntu Budgie, because I wanted the Debian and Ubuntu software ecosystem, but Budgie is a part of the Solus Linux project (which I enjoyed when I tried it, but I didn't like the package system).
So give Solus and/or Ubuntu Budgie a try on a live USB thumbdrive sometime; you might enjoy it, if you are looking for a "more Mac-like" experience...
But, in any case, yes, there are plenty of Linux front-ends which can be 'Mac-like' if you like that sort of thing.
Same reason I keep Android around, now someone will say it's proprietary blah blah, but I can root my phone and install anything I want on it.
For as long as I've used Linux there has always been a Linux gaming community, for example; I doubt they spend centuries in vim or emacs. I also remember all the crazy desktop environment eye candy from the 2000s, like the desktop cube and the self-incinerating window when you'd close it. Linux just made you feel so damn cool at the fact you could have things that did all these insanely awesome things!
Linux has KDE Plasma.
I long for the Linux prairies of old, but I've grown fat and lazy on OSX. Linux is a youngster's game.
Apple got it wrong in 2016 but there are signs that we might have passed "iPhone peak"... maybe they will be forced to pay attention again to us old farts.
What this means is that I installed a base ubuntu system with awesomewm and just used it mostly stock. I already had awesomewm experience from 5+ years prior, but for the first month of moving back to Linux I was also perfectly happy chugging along with gnome.
What I am saying is give it a try. Only time I miss OSX now is when I need to sign a PDF.
45 years old here and still rocking Linux (does that qualify as "old" - probably - sigh). Started playing with it in 1995 (MonkeyLinux on DOS!); haven't stopped.
Stuff's gotten dirt-simple compared to what I remember having to do in the past. Certainly no kernel recompiles are needed any longer (but if you really want to, you can still do them).
Have I gotten "fat and lazy"? Yep - just on Linux. I can't see that changing in the future, either...
On a Mac, you can cut/copy/paste using predictable keystrokes in any application, anywhere. And there is a single clipboard. On Linux, you have to deal with a multitude of behaviors (middle mouse button, Ctrl-C, others, and does it paste where the cursor is or do you have to click the middle mouse button exactly where you want stuff pasted, etc).
And I'm not even talking about multiple media types: all I need is plaintext.
The second dream wish would be for consistent keybindings in all text input boxes, like in Mac OS. Such as make Ctrl-A always go to the beginning of line, Ctrl-E always to the end, and Ctrl-K always kill whatever is in there.
If I had that, I think I could use Linux without getting annoyed every couple of minutes. I would still miss apps like TextExpander, flawless drag&drop, and lots of other things, but I could at least use it without frustration.
Unfortunately, none of this is likely to happen. If you wonder why, it's enough to look at the responses that will surely land here:
Actually, in some apps the cursor disappears and in some the cursor goes to the end, but the behavior of what happens when text is selected is always the same.
I agree with GP: I find the two separate copy-paste methods very useful and wouldn't want to see them merged. Users who aren't aware of one of them aren't affected, and those who are get an extra buffer to use.
Usually a scroll wheel, now.
On a more substantive note (globally consistent actions from keystrokes): it would be great, but it's not going to happen, because there are established modes where the same keystrokes lead to well-known results. And those actions are different depending on the application. Maybe it would be good if this were not the case, but that ship sailed a long time ago. MacOS is in sync because its BDFL (Apple) forces it to be.
Example: Ctrl-Z. In a terminal, it suspends a process. In most GUI editors it performs undo. Shall we force a global standard? I would pay to watch a user get their editor suspended when they press Ctrl-Z to undo a typo. Did the program crash? Let me start it again... what does it mean, it's already running?? Where are my edits??? I'll just reboot the machine!
So agreed, global synchronizations that would force this to happen are unlikely. Fortunately. My 2c.
That seems to be a defining characteristic of Linux communities, anyway. Not sure how you can say "all" communities are like that.
> Example: Ctrl-Z. In a terminal, it suspends a process. In most GUI editors it performs undo. Shall we force a global standard?
Haiku uses Alt instead of Ctrl for shortcuts (similar to macOS and the Cmd key), so Alt+Z does undo and Ctrl+Z suspends a process, both in the Terminal. (If you want to use Windows-style keybindings, then it's the reverse, of course.)
Yes. Windows botched this by hijacking the Control key, and the desktop-Linux crowd are unable to do anything but imitate Windows. Pre-Linux GUIs didn't have this problem, and MacOS doesn't have this problem.
It certainly makes sense when running inside a terminal. But when it's an X11 application, it makes me angry every time I fumble C-x and it interrupts my workflow.
It would be nice to have a universal "pause" shortcut in GUI apps. Why would it instantly hide the app?
Would you pay to watch someone kill a process when they meant to copy text?
Shortcuts are just better on the OSX side of the fence (not perfect, but way better). IMHO, global actions (copy, paste, select all, cut, undo, redo, etc.) should be mapped to one modifier (say AltGr) and local actions to another (say Ctrl). The most important part of this scheme would be that "global modifier + key" has an unsurprising result that is consistent system-wide. Then "local modifier + something, or modifier1 + modifier2 + something" can do whatever the app wants. This would improve things immensely IMHO...
I don't know about others, but with Terminator I'm able to map Ctrl+C to copy - but if nothing is selected, then ^C works as normal. Is that what you're looking for?
From the terminal's point of view, Ctrl+C and Ctrl+Shift+C do the same thing. The only reason you can't usually use Ctrl+Shift+C is that by default that combination is intercepted by gnome-terminal to mean "copy".
(I wouldn't be surprised if other emulators with configurable keyboard shortcuts let you do a similar thing).
(If the terminal's Ctrl+Shift+C really bothers you, many terminal emulators allow you to rebind it to Ctrl+C)
One piece of advice: In your clipboard manager you can check a box to sync the middle click and ctrl+c buffers, if you want a single clipboard.
> The second dream wish would be for consistent keybindings in all text input boxes, like in Mac OS. Such as make Ctrl-A always go to the beginning of line, Ctrl-E always to the end, and Ctrl-K always kill whatever is in there.
For this, you should really just run Emacs. But this isn't a Linux issue really, but a more general 'problem' of competing conventions (Emacs, vi, CUA, etc.) for keyboard shortcuts.
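One partial fix on a GNOME/GTK desktop (an assumption; apps built on other toolkits won't follow it) is the Emacs key theme, which gives GTK text boxes exactly the Ctrl-A/Ctrl-E/Ctrl-K behavior wished for above:

```shell
# Make GTK text fields use Emacs-style bindings (GNOME/GTK apps only)
gsettings set org.gnome.desktop.interface gtk-key-theme "Emacs"
```

It won't touch Qt apps or Electron apps, which is the "competing conventions" problem in a nutshell.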
The controls are actually a little strange, because they aren't well documented or discoverable, and don't seem to be configurable, unlike most other shortcuts.
The NS prefix on those classes means "NeXTSTEP".
Microsoft Office probably implements its own text box classes so it's different.
The Linux communities are full of volunteers. It seems like their act is actually quite together! There is no 'buyer' to make demands with their dollar. Only there is a developer who one day decides they want a feature.
Be the change you want to see in the world. Open source developers are volunteers scratching their own itches, working on their own time. I wish someone would come build me a house for free too, and if they did I'd be hard pressed to complain about the number of windows.
The thing about Linux is, there's all these evangelists out there who insist that everyone should use Linux (or that they are superior to you for doing so) and, when you tell them why you don't, insisting your preferences are wrong, your workflow is wrong, it works for them so you're a liar, or you should fix it yourself.
If it weren't for those people, I think the attitude towards Linux would be more like it is towards Haiku. No one really complains about it, they just don't use it.
Personally, I love that all the software I have used in Linux does nothing to encroach on my super+* keybindings.
I also love having a secondary clipboard that feels more permanent (ctrl+xcv), while having a quick and dirty one (select middle-click)
Also, some context before I get flamed for being a newbie: I've used Linux since before 1.0 (that's 1993). My usual working keyboard setup pretty much requires a Super key and a Hyper is quite helpful, too.
Autocutsel tracks changes in the server's cutbuffer and CLIPBOARD selection. When the CLIPBOARD is changed, it updates the cutbuffer. When the cutbuffer is changed, it owns the CLIPBOARD selection. The cutbuffer and CLIPBOARD selection are always synchronized.
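In practice that usually means a couple of lines in ~/.xinitrc or your session's autostart (a sketch; adjust to your setup):

```shell
# Keep the CLIPBOARD selection and the cutbuffer in sync (as described above)
autocutsel -fork &
# Sync the PRIMARY (middle-click) selection as well
autocutsel -selection PRIMARY -fork &
```

With both instances running, everything effectively becomes one clipboard.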
With this, I can highlight text (which is an implicit "copy") or use CTRL-C and paste with middle-click or CTRL-V (or in emacs, CTRL-Y) and it all works.
The only weirdness is in Google Docs, which seems to hijack CTRL-C, CTRL-V, etc. for its own handling.
alias x='/usr/bin/setxkbmap -option ctrl:swap_lalt_lctl'
I installed Autokey and have keys assigned for Terminal. So for instance: pressing <alt>+c sends a <ctrl>+c, and pressing <ctrl>+c sends <shift>+<ctrl>+c. I have pretty much all of the alt keys assigned to send control keys, and all of Command keys (control keys) to do what I expect: Close ^w, Copy ^c, Find Next ^g, Find ^f, New ^n, Paste ^v, Select All ^a.
As for consistent keybindings, I feel you on that. However, your example is a bit flawed: the Home/End keys do what you want, and Ctrl+Left/Right does word skipping.
> It’s just another clipboard. One clipboard is interfaced via the keyboard. One clipboard is interfaced via the mouse.
Perhaps that’s common knowledge, but that understanding made me instantly fall in love with middle-click paste.
In most terminal emulators, Ctrl-C deviates from normal behavior in favor of sending SIGINT. However, some terminal emulators (e.g. the one in elementary OS) make the simple decision that if text is highlighted, copy it; otherwise, send SIGINT. That closes all loose ends for me.
Middle-click paste has just made me terrified of ever pressing the middle mouse button for fear of sending PII to somebody.
That's needlessly rude.
It's unreasonable to expect that criticisms be phrased politically to avoid stepping on oversensitive toes.
>It's unreasonable to expect that criticisms be phrased politically to avoid stepping on oversensitive toes.
If you mean politely then no, it's not unreasonable to phrase criticisms rudely. It's unreasonable to phrase them rudely and then expect the recipient to do anything about it.
edit: Also, I said _needlessly_ rude. Are you saying the poster needs to be rude?
If I say “The USA needs to get its act together and fix its broken immigration system”, I don’t see this as particularly rude. By criticizing the immigration system, I’m already saying that it’s fucked up. The fact that I use the phrase “get it together” is irrelevant to both tone and content.
And again, the subject is a large group. The idea that particular individuals should take offense at such a mild blanket statement feels kind of ridiculous.
People don't talk like this in real life.
You probably don’t feel bad saying something like “McDonald’s burgers suck”, but if you tell someone who invited you to a cook-out that their burger sucks, you’re an asshole. Context and audience matter.
And people very much do talk like this is real life. People who understand context and audience can do so just fine.
> On a Mac .. there is a single clipboard.
There are 3 clipboards, at least. Cmd-C/Cmd-V most everywhere, Ctrl-K/Ctrl-U/Ctrl-Y in most text fields, and select/middle-click in Terminal.app. It's like having 3 hands, I find it works well.
Interesting, because Linux is my daily driver, and when I borrow my spouse's MacBook one of the things I get tripped up on is some programs seem to use Ctrl+C/V and other programs use Command+C/V for copying/pasting
I've never encountered an app that uses Ctrl. A developer would really have to go out of their way to accomplish this in their app, since the copy/paste menus are provided by Cocoa; the keyboard shortcuts are essentially built into the OS.
Out of curiosity, could you name an example or two?
What programs are you using that use Ctrl for clipboard actions? Do they also allow their Cmd equivalents, or are they just using the wrong keys?
So if you use the Windows version of Office (because it has traditionally been better than the one offered for Mac), you'd have to remember to use Control instead of Command.
(runs under x11, does not support cmd-c/v)
There is also only one clipboard, and none of the "primary paste buffer / selection buffer / etc." nonsense, either.
So ... the dream is alive on Haiku, anyway. :)
I've had better luck over on Wayland, thankfully
End result: my Ctrl-C and Ctrl-V work as you would expect -- always!
Would you have some links? I am especially interested in ways to handle shortcuts like with AutoHotkey on Windows: a context-dependent remap (e.g. pressing a given physical key may send Ctrl-Tab in one application, but Ctrl-PageDown in another, if that other application doesn't support remapping shortcuts).
KWin for example allows setting various global shortcuts: https://docs.kde.org/trunk5/en/kde-workspace/kcontrol/keys/i...
For wl-copy / wl-paste, see: https://github.com/bugaevc/wl-clipboard
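Basic wl-clipboard usage looks like this (a sketch; it requires a running Wayland session, and the filename is just an example):

```shell
# Put text on the Wayland clipboard, then read it back
echo "hello" | wl-copy
wl-paste
# Any command output can be piped through the clipboard the same way
wl-copy < notes.txt
```

Since both tools speak stdin/stdout, they slot into scripts and keybindings the same way xclip/xsel do on X11.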
I'm thinking about starting with weston and moving to sway if it doesn't do the job