There's been a tremendous amount of work over the last ten years to make the Linux desktop environment habitable. At first it was usable for very narrow use cases, like living within an Office-compatible application, but over time that space has grown. What was once done out of spite can now be done for the sake of convenience.
Instead of being all negative about Apple not living up to our expectations I think we should appreciate how much Linux has exceeded them.
I think this is a misrepresentation of the history of desktop Linux. There was a period at the beginning of the century when Linux was arguably far better positioned to take significant market share. At the time, consumer Windows (98/ME) was dramatically bad, both in terms of stability and security. OS X had just come into existence, but it was slow and required expensive hardware.
At the time, Linux was far less fragmented. There were only a few distributions, and KDE and GNOME ruled the desktop. There was a very serious push by some companies to make Linux easy to install (graphical installers popped up in Red Hat, Corel Linux, Caldera, etc.). You could run Microsoft Office using CrossOver Office with virtually no glitches (I used Office like that for years). Corel released WordPerfect for Linux (which still had significance at the time). Loki was pumping out Linux ports of games like crazy. There was a genuine feeling that Linux was taking off on the desktop, and many non-tech family/friends installed Linux.
The problem at the time was that web apps basically didn't exist. So a lot of interested people eventually abandoned the idea of switching to Linux, because they had some Win32 application that they needed to run.
In 2017, the Linux distribution landscape is more fragmented than ever and the Linux desktop landscape is more fragmented than ever (heck, even GNOME has three popular forks). Moreover, problems are far harder to debug than they were around 2000, because there are multiple layers of stuff piled on each other (D-Bus, systemd, Debian alternatives, *.d, et al. have blessings and curses).
I think there is some movement from OS X to Linux (though it is hard to tell whether it's not just some vocal minority) for four reasons: 1. people rely less than ever on native applications, so it's much easier to move now; 2. OS X is also Unix, so for OS X users it is quite simple to move from OS X to Linux and vice versa; 3. there are serious worries among OS X users about the future of OS X and Apple's lack of focus; and 4. Macs are becoming so expensive that it is hard to justify getting relatively bad specs for 1.5 times the price.
The Linux you remember is very different from what I remember. To say that back then Linux was unusable on the desktop would be an understatement. Yes, you had Red Hat, Mandrake (later Mandriva) and SuSE trying hard to provide a usable desktop distribution, but nothing worked well. And people needed Win32 apps because there were no usable alternatives on Linux.
Nowadays Linux usually works well on most hardware you can throw at it, and it doesn't choke on the most basic of tasks. While it still has an app problem, the web is, sadly, making that less relevant. All that average users need these days is a good web browser; coupled with Linux's security and remote debugging capabilities, that makes it a very good fit for my father, for example.
Speaking of which, I don't have a high regard for companies and products popular among developers and that don't support Linux. I pay a premium for Dropbox because it works on my Linux box, even though my primary workstation is now OS X. And I've been transitioning away from 1Password. Voting with your wallet does work and if a company isn't supporting Linux then it means it doesn't want my money. OS X was in the same situation a while back, for certain use-cases it still is, yet it prevailed because people have stuck with it.
There was a reason why you could buy Linux in bookstores for years. People were actually using it. For instance, SuSE had quite a stronghold in Europe. I was still in high school and I knew quite a lot of people that were using SuSE Linux.
coupled with Linux's security
What security? It runs every application unsandboxed, and each X11 application can read all keystrokes and mouse events, make screengrabs, etc. Linux on the desktop is way behind macOS and Windows when it comes to security. It's just not a very interesting target, because of its small market share.
One can only hope that distributors will follow the lead of Fedora and move to Wayland soon.
At that time Windows security was a joke. You could write anywhere on the whole disk and you could "read all keystrokes, mouse events, make screengrabs" as well.
Windows has always had the best desktop system and it still does. Some loud minority always talks about how great the Mac UI is but I think it's hideous and most of the world agrees with me since nobody copied the Mac OS' major UI gimmicks (and if they did, it was seen as a stupid unpopular mistake).
Anyway - I run all three in my business and I'm well versed in each of them. Linux is great for servers and I need Macs for iOS stuff, but nothing beats the sheer convenience of Windows, the best desktop OS that ever was.
Maybe one day Windows will become open source and all the Linux folks can stop trying to catch up on the desktop. Until then, I'll gladly throw my money at Microsoft for providing such a beautiful system for me to use.
These days it's less of a problem. Message injection/sniffing between applications in different contexts doesn't work. Unless you're the administrator, in which case it's game over already.
(Note: I am not a Windows expert.)
Nowadays [...] coupled with Linux's security and remote debugging capabilities, it's a very good fit for my father for example.
In any event, when a user is compromised, it's "lol Windows". But when Yahoo is compromised, does anyone blame Linux's security? Or are these social engineering hacks?
Furthermore, StarOffice was the only real office app available (AbiWord, Gnumeric and KOffice were very young and struggled with MS Office files, as I recall).
Konqueror as a web browser didn't really take off or mature until KDE 3 as I recall? Firefox/Firebird was very young, there was no Google Chrome. I think I might have been running Netscape on Linux at this time??
Having said all this, I did enjoy using Linux at this period. It was a fresh new operating system to me. I remember wasting hours of my life using it.
All this before .Net, Vala, before Qt was open source. egcs and the gcc fiasco, XFree86 etc. etc.
He was a fervent KDE fanboy until he did an about-face and started disparaging KDE over the Qt licensing while pushing a hodgepodge of software under the Gnome moniker.
During that whole time he picked up idea after idea from the Windows world (hell, his first program of fame, Midnight Commander, was a straight clone of the DOS program Norton Commander) and rammed them into Gnome.
And now he works for Microsoft, go figure...
Thinking about it, I get the feeling that Poettering is a rerun of Icaza. Only this time OS X is the source of the "inspiration" rather than Windows.
Beyond that it is yet another round of NIH projects and divide-and-conquer political rhetoric.
I did enjoy using Gnumeric so thank Miguel.
Also MC is pretty useful when you SSH into a box and want some form of file manager.
There's always been some flux in the ease of Linux vs the ease of, say, Windows.
There was a time during XP SP1 and SP2 where installing Windows took fucking forever because it had none of the drivers pre-installed, whereas Linux had a grand majority of them bundled.
Windows still had better overall hardware support with peripherals, so there was still the odd thing missing - but if you had a compatible machine, you could do a clean linux install in under an hour and be loaded with apps, codecs, drivers, the works. With XP you would've been left scouring one web page after another for endless download links, often from another computer b/c your NIC driver didn't work out of the box.
I remember reading those same words in 2004, 2005, 2006, 2007, 2008...
Debian has had flavoured CDs for a long time now.
It's not the first time I've read somebody complain about a lack of choice while, unknowingly, having only tried or used Ubuntu (not Kubuntu or Xubuntu).
Maybe in recent years Ubuntu has muddied the perception of the Linux desktop a bit, but the Linux desktop is as healthy and as fragmented as always.
And there is a probability that I'm also able to do all that on your box. Not sure about mine.
As the article mentions, Thunderbird as an email client is _okay_. Rewind back to the late 90s, and applications like Winamp were pretty amazing, it just needed a better user interface. There are loads of music applications on Linux, but I end up using CLI mplayer and even Foobar2k under Wine, because of some of the other shortcomings.
Even core Linux desktop applications like Nautilus feel like they are regressing, when a file manager, in my mind, should be at the heart of an OS. Mounts and copying should be very intuitive.
Much functionality exists in CLI apps but is stripped out of, or lacking in, desktop applications. Having something like rsync in your file manager could be great, and better integration with something like Git. But that's after the core utility of the application is polished to the max.
It's the final 10%, and I appreciate how difficult that is, with moving targets, weird window tool-kits, and other abstractions and APIs, but it's that polish that makes it.
Having said that Windows always feels like a fragmented Frankenstein's monster with UI inconsistencies not only across core applications but also with third party ones.
While some of the desktop paradigms have worn well, I don't see much radical experimentation on the desktop, and Linux could provide exactly the platform to try things out.
So I disagree that there has been no progression on the desktop front. The progression has been extraordinary. It's true that in some areas things have stalled a little or even gone backwards lately, mostly due to the introduction of Gnome 3 in my opinion, but all in all there's no contest between what we have now and what we had 15 years ago.
In this period, Netscape 4.x still worked pretty well and was available on Linux. Opera was also available for Linux at that time. So, it's definitely not true that the Mozilla Suite (which was indeed horrible) was the only available browser.
You could not play MP3 files out of the box due to patent issues.
I remember mpg123, xmms, etc. being available in many distributions. It was primarily Red Hat who didn't distribute MP3 support out of the box. IIRC free decoders were exempt from the decoder licensing fees.
Most graphic cards only worked in VESA mode, so we had no 3D acceleration of any kind.
Around 2000 there were not that many 3D chipsets. 3dfx was still ruling the world and there was 3D support on Linux via Glide. Many 2D accelerators did have support (e.g. Tseng Labs ET series, Matrox chipsets, ATI chipsets, most S3 chipsets). I had a 3dfx Voodoo 3 and was playing accelerated OpenGL games on Linux. I remember tinkering with NetBSD and having to go back to software rendering, which was frustrating.
But the worst thing was the lack of fonts. Fonts were truly abysmal.
Erm, Microsoft's 'Core fonts for the Web' have been distributed since 1996. There has been a SourceForge project (msttcorefonts) distributing the fonts since Microsoft took them offline. Many distributions had an installer or package to get these Microsoft fonts. On top of that you had all the usual Windows fonts.
I think this shows the main problem with users' perception of Linux at the time. Most of the people who tried Linux expected things like that to work out of the box, and when they got the bad defaults instead they proclaimed Linux "unusable". They most likely never knew that MS Fonts package existed. I think I was using Linux for over a year before I discovered it.
I think I spent months fiddling with Linux before I got it to be a great-looking, productive, usable OS that was in some regards even better than Windows XP at the time. For example, CD burning on Windows required 3rd-party software which took over the CPU, and your machine was basically unusable while it was burning. K3b on Linux was so much better.
I used KDE in the beginning, but then I switched to IceWM, and spent days configuring it until I got it to run really fast, with USB drive icons showing up and stuff like that. And you got free stuff like CPU/hard drive/network monitors in the taskbar which Windows doesn't have to this day.
Yeah Linux attracts some really strange people - why, why, why do you need these monitors? Why do you need conky with your hostname? With your IP? With your kernel version?
Your workstation is not another random server you administrate, so hostname showing is absolutely not required (unless when you sit down in front of a computer you are high as hell and cannot tell it's your computer, or just too stupid to tell which computer it is)...
There are times I need to check CPU usage, but simple eye-candy widgets for 12-year-olds who have watched too many hacker movies won't help anyway...
It really pisses me off when I double click to run a program and there's no indication that it's starting up. Seeing CPU spike confirms that it's working. If I had a penny for every time I started a program twice on Windows...
Similar problem when I start some operation in some CPU/HDD intensive program. If the program itself has no progress bar or similar indication I have no clue if it started doing it, or maybe I misclicked.
Similar problem when a browser or another network program seems stuck on a download and I have no clue whether it's the network problem or the program itself is the culprit.
And there are some programs that eat RAM like crazy, and on Windows you have no clue when the system starts swapping to disk. With monitors you can clearly see that 1. you ran out of RAM and 2. disk I/O is why the system is slowing down.
I really don't care about hostname or kernel version though. But, I absolutely want to know what's happening in order to use my computer efficiently.
So, you are either using your system inefficiently or you have some beast of a machine with infinite RAM, CPU power and super fast disks.
So those monitors are required if you have 2GB of RAM, no SSD and use poorly designed software - seems about right for a Linux user who can't afford an OS.
A CPU monitor is needed to see if some app is running or not. It kills me not having that on Windows on a daily basis. For example, starting up Steam takes 5-10 seconds before you get any visual clue that your double-click on the Steam icon was acknowledged. On Linux and Mac I get feedback right away because I can see the CPU and I/O spike.
I think the article's comments about poor calendar and contact applications really highlight missing functionality. Outlook did well, I think, because of the calendar integration. Even the most architecturally stunning systems aren't worth much without a great application ecosystem. That's no slight on Linux.
I struggled with Flash on Linux for years.
I too am disappointed by the lack of desktop applications. The advent of the microcomputer has meant that we shouldn't need to connect to a server the other side of the world to do tasks - we can do 99% of them locally (unless collaboration is involved).
Whilst I agree that Windows has 20+ years of history to maintain (and it does this very well) and suffers from inconsistencies (particularly apparent in Windows 10 as they push out apps like System Settings which don't adhere to UI standards - I can't double-click the top-left icon to close a window, as I have since 1990, because it doesn't exist...), I would prefer Linux did not experiment with desktop paradigms - look at the Unity and GNOME 3 weirdness. It seems apparent that the normal "windows and taskbar" system has worked well for decades. You can take a Mac from the early 90s and find your way around it. Experimentation with this paradigm would be for experimentation's sake.
Applications end up doing their own file management for example.
There are alternative innovative ways to access menus and trigger actions. Context tasks etc. But people are stuck in the past it's like they can't imagine anything different.
Linux also has inconsistencies between windowing tool-kits.
Mobile gets a little more attention these days. But that seems to also be stuck in a rut.
I meant experimental forks. Ubuntu foisting Unity upon people was pretty damaging. Again, if only they'd concentrated on the last 10% instead of creating a mess.
With age some tasks have become harder for me. I used to be quite a whizz at dragging and dropping, and fine pointer precision. Now I'm a bit of a fat handed twat, and my eyes aren't that great. I really do need a 10ft display with simple controls.
A lot of this is stuff is usability and ergonomics 101.
Experimental forks would be a safer way to go. I suppose if they don't have massive development teams/effort, it would be difficult to do. It would satisfy everyone who wants to go and invent the future and do exciting new things (which will likely revert to how they should be when real use of them occurs). I imagine the "let's maintain the existing" team would shrink.
Do you use a mouse or a touchpad? As resolutions increase, we have to be more precise with the UI, or just scale everything 200% (like everyone does on 4K screens or the Microsoft Surface Pro 4), thereby defeating the point of having a high resolution in the first place...
There are dozens of us! DOZENS!
But how do you see the push Canonical did with Ubuntu? As I remember, the sentiment was that they were going to make Linux usable and easy for the average user, instead of going after advanced users. And IMHO they really succeeded in doing that, surely you're not saying that was all there before?
A user with absolutely no Linux or terminal experience can use an Ubuntu system for writing letters/emails/browsing and doesn't need to ever touch the terminal (provided all hardware is supported).
I can speak to this. Both my wife and my roommate use refurbished Ubuntu laptops I supplied. Neither knows what a terminal is or how to use it. It just works(tm).
When hardware works out of the box, everything is fine and dandy. As soon as it doesn't, as with many OSes, it's a time waster and you lose your confidence in the entire project.
I find it ironic that GNU/Linux is the one that helped make the GNU/Linux desktop less relevant by fostering web apps. Anything that is able to run a recent browser is good enough.
Also, the desktop fragmentation and FOSS culture make it almost impossible to sell desktop software to GNU/Linux users.
If you look at the engineering workstation segment, there are companies with hundreds or thousands of seats pooling FlexLM licenses of very expensive software, namely for microelectronics design.
Those seats are typically RedHat or CentOS. If you look up revenues for Cadence, Mentor and Synopsys, most of it is EDA software licensing (remainder is mostly IP and training.)
Not prime time news, but not pocket change either.
The enterprise is relatively easy to sell software to GNU/Linux users, because what gets sold is actually training, support and consulting.
All things that non-technical consumers don't care to pay a dime for, but they usually buy shrink-wrapped software, even if on digital form.
I believed back then, and still now, that money is to be made with experts & service. Seems to work very well for companies like RedHat and IBM.
True, but it also boosts the non-traditional Linux desktop. Something like the Chromebook would have been a flop 15 years ago.
They only see a window manager taking care of Chrome instances, with ChromeOS specific APIs.
Google could release a Chromebook without any access to Crouton, with the Linux kernel replaced by something else, and no US school buying Chromebooks would notice.
The same applies to Android, especially after Android 7 locked down linking to private shared objects.
The only real problem you missed is the hardware. Hardware support on Linux wasn't really good in those days. I had to buy a new printer and scanner, and getting the CRT monitor to work at an acceptable refresh rate took a lot of fiddling with the Xorg config.
There was a very brief period, between circa 2005 and 2011 or so, when the fragmentation was less obvious because there was some degree of integration and cohesion (we had things like QtCurve, for instance). Then everyone started having delusions of grandeur again, in an early-90s-Unix manner, and things have been pretty much degrading ever since.
Today's "Linux desktop ecosystem" (God I hate this particular sequence of words) seems less fragmented largely because application development outside the major desktop environments has been largely abandoned, except for very relevant niches (photo/video editing, web browsers). A development that's largely unsurprising between KDE's architecture astronautics and Gnome's see no feedback, speak no feedback, hear no feedback attitude (that GTK, sadly, adopted for a pretty long time).
The unpleasant consequence is, of course, the "app problem" you speak of. A long time ago, the default KDE installation in Slackware 10 (which shipped with KDE 3.5, I think) shipped with a huge suite of applications, including graphical diff tools, VCS frontends, several multimedia players and so on. Two major rewrites later, they haven't re-accumulated this wealth of applications (and some of the ones that they do ship or advertise today are practically abandoned or remnants from the KDE 3 days). Some of the developments have been outright catastrophic, like KMail, which was turned from a very useful mail client to something that borks in a gazillion unpredictable ways as soon as you try to configure more than one account.
Similar things are happening in Gnome land, where they've chased feature-parity with Gnome 2 for years as they've been scrambling to fix everything that wasn't wrong with it and the horde of bugs that ensued from these fixes. Which, in fact, is why they have three forks in the first place. There was a lot of negativity about the Gnome 1 -> Gnome 2 transition, too, but that never resulted in forking. Nowadays we have people trying to keep a KDE branch that hasn't been developed in almost ten years alive, and actively using it (TDE). That's because, for all its flashiness, Apple and design fetishism, the super-disruptive community of desktop developers has failed to develop anything that's convincingly better than what was available ten years ago. The discussion, for some reason, is centered on whether the UI metaphors are adequate, ignoring users' feedback that focuses on far more obvious things, like "this thing KEEPS FUCKING CRASHING", "I just did apt-get update && apt-get upgrade and now all my applications look weird" and "everything is huge on my screen and this would look great on a tablet but this is not a tablet".
The Linux desktop today is far less fragmented, but that's because a) most of the people who could fragment it by developing fragmenting applications have long given up and use Macs and b) a lot of the traditional functions of a computer's desktop and applications have been eaten up by the web. There's little fragmentation to have when virtually all you use now is a web browser, the terminal and maybe a mail client.
Frankly, if it were not for the Objective-C stuff it would make for a real nice option.
Which is quite a shame. Mail.app, for instance, was really good.
The biggest hurdles to GNUstep's adoption were a) its lack of documentation (you were generally expected to use the NeXTSTEP documentation, although some functions were not implemented at all and others were slightly buggy) and b) the fact that Gorm and ProjectCenter were really buggy for a really long time. This made GNUstep development not significantly more pleasant than GTK or Qt development -- not to mention difficult to get into for people who had never written NeXT or OpenStep software before.
I think Linux will be the better option after Snap/Wayland goes mainstream and JACK becomes easier (low-latency audio on Linux is still a nightmare compared to Windows and OS X).
But there are still a few rough edges. That being said, I use it for everything at home now, except for my Suunto GPS tracker, and Serato.
Perhaps I didn't play sound very often in multiple apps at the same time (who can listen to two sounds at once?)
Frankly the major reason for PA to exist at all is to handle transitory audio devices. I think Poettering started working on it because he bought himself a pair of USB headphones (basically a USB soundcard with some headphones hardwired to the analog pins).
Dmix could just as well have been extended to do audio routing. Or just ram sound down all channels unless the user decides to mute some of them.
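For anyone who never had to touch it: dmix was wired up through ALSA's config layer, typically in `~/.asoundrc`. A minimal setup of the era looked roughly like this (a sketch; the card address and `ipc_key` are placeholders you'd adjust for your hardware):

```
# ~/.asoundrc -- route the default PCM through a dmix software mixer
pcm.!default {
    type plug
    slave.pcm "dmixer"
}
pcm.dmixer {
    type dmix
    ipc_key 1024        # any unique key, shared by all mixing clients
    slave {
        pcm "hw:0,0"    # first device on the first card (placeholder)
        rate 48000
    }
}
```

Every application opening the default device then mixes through the same shared buffer, which is exactly the multi-stream playback problem the sound servers existed to solve.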
For all of you poor onlookers who have no idea why people don't like PulseAudio, sit down and let grumpy ol' notalaser tell you the story of how PulseAudio gets all this hate -- much of which is, in fact, entirely undeserved today.
Back in 1999-2000 or so, things were really bad when it came to audio on Linux. The biggest problem most users faced right after being able to finally make something come out of their speakers was mixing audio streams from more than one source. This was especially relevant because, at the time, audio effects were really in vogue on the desktop.
The problem was that OSS was flaky and had basically no support for software mixing -- which meant that, save for the few sound cards that supported this feature in hardware, you couldn't play more than one stream at once. ALSA, on the other hand, was in a very early state, had all sorts of trouble, not too many drivers and updates were going slow considering that this was back when virtually everyone installed Linux from CDs and the biggest hurdle to a rolling release model was dial-up.
The way most desktops solved this problem was with a sound server (yep, basically PulseAudio). KDE used something called aRts; Gnome, I think, used Enlightenment's ESD. Both of these would do the mixing in software and pour everything into /dev/dsp. The bad news? They were really slow, high-latency (and I mean high latency -- sometimes I'd get the sound alerting me to a dialog about ten seconds after closing the dialog) and a little funky. If aRts' queue stalled, for instance, the effect was comic: after unstalling, it would play back all the sounds that had accumulated in the queue.
By 2004-2005 or so, however, the whole concept ended up being mostly irrelevant, as ALSA began to really support hardware (and software mixing). Gnome eventually dropped ESD altogether. KDE kept up with aRts (but IIRC a lot of distributions started shipping KDE with aRts disabled) up to 2004, then embedded some of its ideas into Phonon. Those of us using something else finally rejoiced and never ran a sound server again.
2004 is also the year when PulseAudio was first released. By the time it got included by default in Fedora, in 2006, playing multiple streams using nothing but ALSA was very much a solved problem. I think it was so solved that it worked on Gentoo out of the box, without having to configure anything, on the more common sound card models. Not that Linux sound was perfect -- drivers were still flaky-ish sometimes, but obviously that was not something that a sound server, be it PA or something else, could solve.
So fast forward to 2007, when PulseAudio is actually unleashed upon the computers of everyone except Lennart and his friends, as it's adopted and enabled by default in Fedora 8. To put it mildly, nothing worked anymore. Very literally -- when we installed it at the crufty place where I held a part-time job, it broke sound on every single one of the 10-15 different configurations we had, from laptops to desktops. On really old desktops, the breakage was subtle (high latency, occasional crashes). On newer laptops it was entirely terrible; they wouldn't even hiss. PA had no useful documentation, basically no means to do any useful debugging, and its upstream team quickly "made a lot of friends" due to its leaders' difficult (if superficially gentle) personality.
This was extremely unfortunate because ALSA -- while not sucking as much as Pulse's PR machine claimed -- was still pretty bad, and many of its design decisions were firmly rooted in the landscape of late-1990s Linux. dmix wasn't too power-efficient and it had a bunch of other problems re. dynamically added devices. Linux's sound system needed the improvements that PulseAudio brought; unfortunately, PulseAudio was both very slow in delivering them (in a manner that would not crash once a day, that is) and very eager to be adopted, which resulted in it being widely deployed while it was pretty much in early beta.
This is how everyone came to hate PulseAudio (and Lennart Poettering). It created quite a rift in the community, too, one that went on to be made even deeper. It also hurt PA's development pace in the long run.
tl;dr PulseAudio was too bold, too soon, even though it was, in many ways, too late.
This, complete with a "you're holding it wrong"-like statement from Poettering as new servings of bile rolled in...
And then XP came, and it was better than 98/ME in everything.
But I will say that Linux has come miles and miles. My transition from Windows to full-time Linux by way of Ubuntu was incredibly seamless. The only thing I really miss is Excel.
Sadly Excel for Mac is the main thing keeping me from switching to Linux. OpenOffice/LO are ok for very basic documents but I routinely deal with files that they can't handle.
Google Sheets straight up fails if you try to upload files in that format. No preview available and no option to work around. So I would have to open the file in Excel anyway.
Relatively new versions of LibreOffice are capable of opening simple files, but once you get into features like named ranges LO fails.
So even if the UI is acceptable, failures on data import force me to keep a copy of Excel around. I keep a VM with Excel for Windows just in case files don't work in Excel for Mac, but I've used it maybe once in the last 6 months.
For example, if you have an index then a tiny kerning shift can bump an item to a new page and screw up your numbering.
It really depends on the types of documents you're dealing with. Some are hell, others are a no-brainer.
It's good to search for duplicates first or see this filterable list of the most popular dupes: https://bugs.documentfoundation.org/duplicates.cgi
You could try Office's web app thing if you are mostly a read-only Excel user.
You could try SoftMaker Office.
Quote: "No other spreadsheet is as compatible with Microsoft Excel as PlanMaker 2016. Both the old .xls files (Microsoft Excel 5.0 and higher) and the modern .xlsx files from Excel 2007 to 2016 are displayed true to the original and saved reliably. This guarantees trouble-free data exchange with Excel users."
Heck, you can even create PDF forms entirely with Free Software, using LibreOffice / OpenOffice.
And don't forget Firefox's PDF.js, which enables you to view and edit most PDF forms directly in your browser. However, it still has issues with some PDF documents, so I prefer Evince and Okular.
The still-common advice to install Adobe Acrobat Reader is outdated! And it has been outdated for years. In fact, some time ago the FSFE started a (quite successful) campaign to convince public entities to stop telling people to download Acrobat whenever they offer a PDF file. With public entities you can argue that they advertise Adobe without getting paid for it, that they ignore the fact that PDF is an open standard, and that they put the plethora of good PDF readers at a disadvantage. Instead, most of these public entities now point to a community-driven overview of Free Software PDF readers:
But as mentioned before: as a Linux user you don't need to worry about that; you almost certainly already have a good, free PDF viewer in the default installation.
Chromium possibly does too, but I haven't tested it.
Although sometimes you can use Chrome (or evince, or okular), occasionally Adobe Reader is the only one that can fill forms properly.
I've had some success with running current Adobe Reader in wine but trying to print makes the program crash, so there you have it.
Atril is what Evince used to be before it regressed in usability.
The only hacking I did was to install libdvd-pkg or something like that.
I've run into some recent disks that I can't play, but never one that actually damaged my blu-ray drive.
Source? I don't even see how this could be possible.
Never mind that the BR spec has all kinds of weirdness, including things like bundled Java applets(?!).
All in all, it's problems like these that keep us torrenting.
I haven't really used the drive other than what I mentioned. I just grabbed the nearest DVD (disc 2 of 3 of season 7 of "The Big Bang Theory"), popped it in, and it just worked.
That was just a quick "test" to see how difficult it was nowadays to rip/encode DVDs (I haven't done that for probably eight years or so and it was a major PITA back then). I was planning on doing the same to most of our "media collection" so I'm hoping it continues to go well.
Thanks for the heads up. I'll be looking more into this.
Several hundred AskUbuntu posts disagree with you. (And that's for Ubuntu...one of the friendly distros.)
Sure, things are much better, but I think most criticisms of linux not being "out of the box" ready are valid.
However, it is faster/more expandable than most Macs available today.
I keep hearing this, but I've never had an issue (Arch, Macbook Air 2013).
On the contrary, I have endless problems in macOS with WiFi where some networks won't work if I don't specify a DNS (I use Google's, but I assume that doesn't make a difference) - and others won't work if I do! (Meanwhile, other devices are fine doing the opposite.)
Never had a problem with macOS Wifi since 10.5.x.
But it's currently not the case that you can take an arbitrary Windows or Apple machine, install Linux and have a working Wifi. It's very much hit and miss.
The only trouble you may get is with cutting edge hardware because manufacturers are slow.
How do you expect a normal user to do this?
Intel chipsets are OEM only. Cards are readily available on Amazon, but are all gray-market apparently.
In short, I don't expect normal users to do this, but I'd expect them to be willing to pay triple the normal markup on $20-$50 components if they "just worked" in Linux.
To be clear, I'm talking about having an ODM run off copies of Intel/Atheros reference boards. The engineering effort is as low as it gets for hardware manufacturing.
There are problem devices out there, but more and more they're the exception, rather than the rule.
This is not true if you have Intel wifi hardware, which is a lot (a majority?) of laptops.
Is ndiswrapper still being developed?
It only supports Windows XP drivers, and it crashed the kernel when I tried to get my TL-WN8200ND (RTL8192CU) working with it on 64-bit Ubuntu 16.04.
The ethernet controller is an Intel I219-LM (rev 31). The driver source that worked was https://downloadcenter.intel.com/download/15817/Intel-Networ...
What I had hoped would happen during install was that the kernel would fall back to some simple driver, kinda like the simple VGA drivers, and then find the correct driver and install that.
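For anyone in the same situation, the usual out-of-tree build for Intel's e1000e driver (which covers the I219-LM) looks roughly like this. This is a hedged sketch: the tarball name and version are illustrative, and you need the kernel headers for your running kernel installed first.

```shell
# Unpack the driver source downloaded (on another machine) from
# Intel's download center; the version number here is illustrative.
tar xf e1000e-3.8.4.tar.gz
cd e1000e-3.8.4/src

# Build against the running kernel (requires matching kernel headers).
make

# Install the module and load it; unloading first will fail if the
# NIC is your only network link, so do this from the console.
sudo make install
sudo modprobe -r e1000e
sudo modprobe e1000e
```

The fallback-to-a-generic-driver idea doesn't really exist for NICs the way it does for VGA: there is no "simple Ethernet" equivalent of the VESA framebuffer, so an unrecognized controller just stays dead until a real driver is present.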
I cannot understand people that think Ubuntu is on par with OSX and Windows for workstation use.
WiFi drivers are much less likely to work out-of-the-box; usually you need a firmware file which may require futzing with the windows driver installation package.
It works 100% reliably 100% of the time.
I had issues with OS X where I'd put the MacBook into a bag and it would get hotter and hotter, because it didn't sleep and the air couldn't move, until it panicked and shut itself down to keep from burning.
Acer/Wistron laptop with MSFT DSDT would reset spontaneously instead of waking up from sleep.
I have a hard time seeing how you could possibly be being honest about this.
Isn't distributing libdvdcss illegal in the US and therefore most distributions don't include it? So, most distributions don't come with full DVD playing capabilities out of the box.
For Fedora, it's a matter of adding the RPMFusion-nonfree repository, and installing a couple of packages.
As I recall, installing libdvdcss, libdvdnav, libdvdread from the official repos solved that problem and allowed me to painlessly watch dvd's through players like VLC and MPlayer.
It's annoying, but better than when DVD support first appeared, and you'd end up compiling something like 5 libraries and a custom version of Xine.
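For reference, the Fedora recipe mentioned above boils down to a few commands. The repository URLs follow RPM Fusion's documented release-package pattern; package names may shift between Fedora releases.

```shell
# Enable the RPM Fusion free and nonfree repositories
# ($(rpm -E %fedora) expands to your Fedora release number).
sudo dnf install \
  https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
  https://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm

# Pull in CSS decryption plus a player that uses it.
sudo dnf install libdvdcss vlc
```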
Linux has a different learning curve for me than products like Windows Server, but I'm slowly getting used to it again.
These days I have very limited money to experiment with computer systems, so I turned an old desktop into an ESXi host.
Using Linux for my VMs has been very rewarding. There are a ton of resources, most of the stuff is free, and the communities usually love to help you reach your goal.
The main reason people don't switch is just lack of familiarity, driver issues and unsupported software - all driven by network effects.
There's very little about the OS itself which is preventing adoption. It's entirely about the ecosystem.
That pretty much requires a fuck up on the part of Windows or Mac to drive people to use Linux.
I feel like I paid $199 just so Microsoft can sell me to advertisers. Bill never would have stood for this shit.
Windows XP, Vista, 8. I mean even if these fuck ups haven't brought Linux market share above rounding error, nothing will.
One thing that could make desktop Linux viable is throwing transparency, openness, customizability, and all the similar "good" Linux stuff out the window: having a dictatorship with good UX people and forcing users onto one true way.
People do not care that they can tinker with their desktop. I really do not care that I cannot make my macOS or Windows look like the screenshots in the unixporn subreddit; I just want an experience that's consistent, the same for every application, polished, clean, lean, etc. Windows had this in Windows 7 (Windows 10, with its UWP programs for tablets, really disturbs the UX); macOS still has it.
Elementary and especially Solus distributions are doing that - "oooh you want X? well screw you buddy, that's poor UX".
Vista was a major screw up. I didn't mind Windows 8 as much. But yeah if those mess ups don't sway people, nothing will.
Personally, I don't care about USB-C replacing USB 3.0, or the introduction of the new Touch Bar. The problem is the pricing (that Touch Bar is costing me $700-800 on top of the equivalent 2013 MacBook Pro model). The removal of the physical escape key and the inexplicable removal of the 3.5 mm jack on iPhones does not help to relieve fears that Apple has completely lost control of basic sense as to what their professional users want and need. The MacBook Pro line has fallen into the same category as the MacBook Air. It's supposed to be a professional machine, but it's now being specced to the lowest common denominator, meant for any average consumer.
For some of us, Linux is not being transitioned to with enthusiasm. We do not consider it a drop-in replacement for OS X, and we already pre-emptively regret the inferior keyboard and trackpad we'll get on whatever the OEM PC laptop manufacturers offer. Honestly, what the hell is with PC manufacturers that still include the TrackPoint™ Style Point, Nub, Nipple Mouse, Clit Mouse?!
tldr; Apple is alienating its professional users. Windows is flat out unusable as a Linux-based developer platform for us. Linux is simply the only remaining alternative. Frankly, it's not "good enough" - it's simply the only remaining alternative when you can't justify spending $4,000+ on a laptop you actually do still want to buy.
I'm experiencing substantially more carpal tunnel issues because of the Mac's trackpad-only setup. The trackpoint means I never have to take my hands off the keyboard, which I love. Two months in I'm used to Apple's keyboard, but I don't think it's as good; my error rate is still much better on Thinkpads because the more sculpted keys make it easier for my fingers to know where they are.
And I should say that here I mean the older Thinkpad keyboard, not the newer one that looks much more Apple-ish.
Most people switching, however, are annoyed by the absence of visual polish everywhere (hardware and software) and the lack of robustness compared to Apple laptops, which you can literally walk on or drop in your bag without paying attention to what else is next to it. Except for mil-spec laptops and thick toughbooks, I wouldn't do this with any computer.
Apart from console apps, there is no unified visual language on Linux, and most desktop application developers do not expect to make as much money as with mac or Windows. This often results in apps that just "get the job done", without much extra polish, but sometimes a LOT of extra features, which can actually confuse users. The best advantage of most Linux software is to be open source, which is unfortunately something 99% of people couldn't care less about.
I'm a day-to-day Linux user, and I couldn't be happier to have switched back from OS X (2004-2016), but the switch is clearly not for everyone yet.
I weigh 138kg. Please hand over your apple laptop for a quick test.
Personally, I have no issues with Linux being used almost exclusively as a developer OS (on desktops and laptops). I don't really see Linux moving towards being used by non-technical users any time soon, and I'm perfectly okay with this.
I've been on Ubuntu for nearly a decade, so I barely notice many issues any more, but you're absolutely right. It's a long way away from wonderful.
Obviously, I'm just used to it; I wait 2-3 months before upgrading to the latest releases because having three monitors has _always_ been a headache for me when upgrading any sooner. That's my new normal.
That said, I rarely have any issues that make me lose time. I might have to spend 5 minutes googling around before buying things like web cameras to ensure compatibility, but otherwise, everything tends to be stable. My desktop stays running for months at a time without complaints.
The next step would be to go beyond stable and make the platform a joy to use. I want to be able to plug in some of these fun toys like multi-touch pads and high dpi screens and have them work just as well as they do on other platforms. The journey to that ideal is far longer on Linux. But it remains free, hackable, solid, and powerful. And the more people using the system, the faster progress will come.
I remain hopeful and patient.
There's scope somewhere in there for Canonical to improve Launchpad so the experience of fixing issues is more streamlined.
Currently I'm looking into what Guix and Nix can offer me wrt. custom packages.
There aren't enough of them?
Because it's fucking awesome.
Some people like them.
Actually, I was surprised how well the Windows Subsystem for Linux actually works. Yesterday I even compiled some Ubuntu packages (using C++, Qt, Boost, etc.) and submitted them to my PPA on Launchpad. From Windows!
The good news is that a lot of these things are getting fixed. E.g. it used to be that path expansion for executable files lookup didn't work too well (as in, you could launch native Windows executables only via direct path, because the subsystem would otherwise check for an ELF header, so you couldn't launch foo.exe in your $PATH just by name), but I understand that this is getting changed in upcoming releases.
I'm keeping an eye on this thing, it is promising. I'm so looking forward to putting all the Red Hat-branded breakage behind me and not have to wrestle with my computer all day.
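The interop behavior described above is easy to try for yourself (assumes a WSL build recent enough to have the fix; paths are the standard mount points):

```shell
# Inside a WSL shell: older builds would only launch Windows binaries
# by explicit path, because name lookup on $PATH rejected non-ELF files.
/mnt/c/Windows/System32/notepad.exe

# Newer builds also resolve .exe names found on $PATH directly:
notepad.exe
```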
But I agree that it would be great if they could solve this in the future.
The trackpad situation is a legitimate gripe, but this is hopefully improving with Microsoft's Precision Touchpad standard.
The big advantage of stepping outside the Apple garden into the wider world is choice. You have choices like matte screens, 17" screens, up to 64GB RAM, hotswap batteries, upgradable components, ARM chips, Xeon chips etc etc
Even the current MacBook Pros (high-spec Airs) have had equivalents available in the PC world for a while, e.g. the X1 Carbon, which is being upgraded to Kaby Lake soon, while Apple has only just released its Skylake machines.
Are there choices? No, I do not see any choice. You see i3, i3-gaps, Fluxbox, Openbox, Xfce, GNOME, Cinnamon, MATE, KDE, Pantheon, Budgie, bspwm, etc. and go "whoa, all these choices"; meanwhile, I couldn't care less if there were three times as many clones of tiling WMs. I just want the macOS desktop. Choices of stuff that I will never use are useless.
Different people have different preferences - I like the TrackPoint far better than a touchpad, because I can use it without moving my hands from the keyboard.
I'm a huge linux fan and have ubuntu on a desktop and laptop, but if you hate it there's really no reason you have to use it.
If I were starting up a 2D media company I might have some concern moving new hires from probably-familiar Adobe applications to the (excellent) Linux alternatives; but still probably not enough to dissuade me completely. For a new 3D studio the industry is so diverse anyway that it would be worth going with Blender to start with, or Maya if the heart desires it.
For software, I don't think there's a worthwhile platform which you cannot develop for from Linux (aside from Apple nonsense, but even all-Apple shops end up using OS X VMs for iOS builds, since the Xcode build is highly stateful(!)).
On that note, I honestly can't think of anything other than Adobe products and Microsoft Office that is preventing a mass migration to Linux (or rather Ubuntu, let's be honest) at this point. It has all other killer apps. Dropbox, Spotify, VSCode, Atom, Steam, Chrome, VLC, etc...
I know that Excel is the only thing keeping my dad on macOS these days. He is definitely not the Apple fanboy he once was.
What an amazing day that would be... when at least Adobe would release builds for Linux. Does anyone know if there's a serious obstacle for them other than market share?
And while Steam is supported, the Linux library is a shadow of the Windows library even if you exclude older, underplayed games.
True, but you can't expect things to change from one day to the next. There are many more games on Linux now than there were like 3 years ago. Like 2000 more. And it's still growing, and in 2016 we have got more AAA titles than ever mostly thanks to Feral Interactive's porting efforts. It's better and better for Linux gamers. And even AMD is slowly fixing its broken Linux drivers.
If NVidia, Intel, and AMD got serious about Linux graphics support for enthusiasts and professionals, I'm sure we'd see a huge improvement and make switching viable for many traditional hold-outs. Chicken and egg, unfortunately.
They don't have open source drivers for ANY platform, so that's the same on Windows and Mac. On Linux you have the Nouveau (open source, independent) drivers for Nvidia, though, and you can already play some games with it despite a major performance loss.
As for graphics switching you'll be happy to know this is a problem that the SOLUS distro is going to tackle in 2017 to have a better solution than the existing ones which are mostly broken.
I have deployed thousands of workstations with various ranges of graphics cards. Nvidia is the simplest; a trailing second is AMD (they do make rolling RPMs easy, though). They are just as buggy as the Windows drivers.
Intel make brilliant drivers now; it's just a shame the graphics cards are tiny compared to AMD/Nvidia.
Nvidia just don't do opensource drivers. For me, I frankly couldn't give a fig if they are open source or not, just so long as it works.
The "TearFree" xorg feature does the best job of vsync I have seen in a long time. On the downside, the variable refresh rate stuff hasn't quite landed in mainline yet, and audio over display port hasn't either. holding breath
I don't think that is necessarily true. WINE does a wonderful job nowadays of running older games, while support for DX11 is still very much a work in progress. Most DX9 games run just fine, and older versions of DX have little to no issues either. In some cases WINE does a better job running an older Windows game than Windows 10 does.
This is why I still have windows machines for work. If open source CAD software were competitive (or if the commercial companies offered Linux ports) I would be 100% linux.
At this stage in Steam use on Linux, I would say it exists, but it does not appear "supported" by Valve.
I've been running Steam on Linux for a while now and have never had any issues running it. Sure, there aren't a huge number of games for Linux, but that has more to do with the game development ecosystem at the moment. Developing on Windows or Mac is just easier: you have access to all the 3D packages from Autodesk and other companies.
env LD_PRELOAD='/usr/$LIB/libstdc++.so.6' steam
Make sure you download Steam from official website, not the version packaged with the distro.
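The LD_PRELOAD line above forces Steam to use the system libstdc++ instead of the outdated copy bundled in the Steam runtime, which can clash with Mesa drivers; the literal '$LIB' token is expanded by the dynamic linker itself (to lib32 or lib64 as appropriate), not by the shell. A hypothetical wrapper script makes the workaround permanent:

```shell
#!/bin/sh
# Hypothetical ~/.local/bin/steam-native wrapper: prefer the system
# libstdc++ over the one in the Steam runtime. '$LIB' must stay in
# single quotes so ld.so, not the shell, expands it per-architecture.
exec env LD_PRELOAD='/usr/$LIB/libstdc++.so.6' /usr/bin/steam "$@"
```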
From your link:
If you have a 64-bit system, you must install the 32-bit Multilib version of your graphics driver, lib32-alsa-plugins to enable sound, and lib32-curl, lib32-libgpg-error to enable update at first run.
Maybe someone can explain why I need a GPU driver for what is in essence a browser and install wizard? Not being facetious, I honestly want to know why drivers have anything to do with the Steam client working or not.
And is it not time for 64bit Steam client on Linux so that users of 64 bit systems can be free of all these hacks?
Somehow I managed. 'dpkg --add-architecture i386'? Also, the Steam client never asked for a driver on my machine. The proprietary Nvidia driver does the magic I need for everything to run.
PEBKAC yourself. I linked to a very long list of Google results for this very problem, suggesting quite strongly that the problem does not just lie with me but with the way in which Valve chooses to roll out the Steam client. Or are you suggesting that dpkg -i steam.deb and resolving some dependencies is beyond all of those Linux users? Or maybe the problem does actually lie with Valve.
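For what it's worth, the dependency dance being argued about here is, on Debian/Ubuntu, roughly the following (the 32-bit package names come from the wiki quoted earlier; steam.deb is the file downloaded from Valve's site):

```shell
# Tell dpkg/apt that 32-bit (i386) packages are allowed, so Steam's
# multilib dependencies can be resolved on a 64-bit system.
sudo dpkg --add-architecture i386
sudo apt-get update

# Install the downloaded package, then let apt pull in whatever
# i386 dependencies dpkg reported as missing.
sudo dpkg -i steam.deb
sudo apt-get -f install
```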
Is that a combination of heard and read? I like it!
I believe case preservation but insensitivity has been in Mac OS since the original Macintosh File System; certainly it was in the Hierarchical File System and was inherited by HFS+. When they switched to OS X, they still had to maintain filesystem compatibility for the Classic system, and maintaining it for the entire OS probably made porting other Mac applications simpler as well.
Also, I think it's pretty unfair to call Inkscape "terrible". It has, by far, the better shape editing tools (especially CSG) between it and Illustrator. As a bonus, Inkscape tends to output SVG files with maybe one or two unwanted transform()s, rather than the six or seven you get with a typical Sketch or Illustrator SVG.
For video editing, kdenlive is the state of the art on linux. Here's a guide written by a professional video editor who uses only free software http://slackermedia.info/
I hope you don't mean Adobe's Photoshop and Illustrator. I tried those a couple of weeks back. Compared to Gimp and Inkscape they are as buggy as hell. Simple .png export with custom dpi often doesn't export to custom dpi. Exported images often aren't pixel-precise if you repeat the export and layer boundaries somehow suffer from float rounding errors if you load/save the file multiple times. I find it amazing how supposedly professional tools can have such major problems. I guess it's just good marketing.
The only thing missing in Gimp is CMYK support, and that might be the sole reason people are still using Adobe products.
Well I might give it a try, anyway...
I run Arch + i3 as a WM, so a very bare-bone setup, and use just terminals and a browser. No other GUI app. I don't need anything else and I am not a power user, rather a non-tech guy who wants to get stuff done.
About ten years ago, I read all the evilwm source and fixed some quirk in an afternoon. That's usability!