I've used Mint in the past, and it was my go-to distro for family members who aren't so technical.
I'm not bothered by the licensing issues mentioned, and I'm ambivalent about the namespace issues, but I've been increasingly uneasy for some time now about Mint's security practices. Serving downloads over http and not providing GPG signed SHA hashes like every other distro is fairly irresponsible in this day and age.
This recent security issue, and the poor response to it, are basically the straw that broke the camel's back for me. I'm moving on to Ubuntu MATE, since frankly MATE was the primary reason I was using Mint anyway. Serving downloads of the most popular Linux distro from the same machine that is running WordPress is cringeworthy, and failing to take the compromised machine totally offline until they're 100% sure the compromise has been mitigated (through reformatting, including the boot sector) shows really poor judgment.
I'm a bit sad to be so critical, since I recognize that Clem has done a lot for the Linux world, and as a Mint user I've benefited personally from his work. But when you're distributing operating systems to so many users, you have to take security seriously. To do otherwise, even on a "hobby" project (although I'm fairly sure it's his full-time job now), is pretty irresponsible.
In many ways, I'd like to pitch in, but based on other interactions I've seen and read about, I'm not sure my input would be welcome, particularly wrt security issues.
Edit: I'm also playing around with FreeBSD for my development environment, since I can use MATE on there. To be honest, I don't really need a DE these days anyway, since I only use a terminal and a web browser. I should look into just using a window manager.
Edit 2: Apparently they do provide GPG signed hashes. I've been looking for them each time I've downloaded Mint distros, but never came upon them. So I stand corrected.
> I've used Mint in the past, and it was my go-to distro for family members who aren't so technical.
I would consider myself pretty "technical".
And yet, I am not willing to spend more effort than absolutely necessary for setting up my Linux OS.
Just b/c I am a programmer and even love to use zsh and Git from the CLI does not imply that I have any patience for fiddling with drivers and kernels.
In the "Linux Community" there only seems to exist the hacker (who loves to spend ages tinkering with config files and debugging hardware issues) and the technically incompetent user (who most likely uses Windows anyway).
I think that's why a lot of people wind up on OSX. You get the CLI environment you're used to with Linux, but rarely lose half a day because a driver upgrade broke your desktop.
Funny, because I've been using Arch Linux for almost 5 years now on my work computer without any problems. Same desktop (xmonad), same editor (emacs) and same terminal (xterm), with the most awesome configuration stored on GitHub. The only problems I've heard about here are coming from the OSX users, who face difficulties when upgrading the OS to a new version, which breaks homebrew or something...
Perhaps Servercobra was talking about the barrier for entry. I haven't yet set up an Arch distro, but I can say that getting started with OSX is beyond simple. Based on what I have heard, setting up Arch can be rather difficult.
I really like the idea of Arch but I am intimidated to jump in. What was your experience like when you first booted up Arch?
Installing Arch Linux seems to require both knowledge and time. I haven't installed it myself, but I've seen friends and people on the Internet describe it that way.
If you want to quickly try something similar, I'd suggest Manjaro Linux, which is based on Arch. It really feels like Arch but with non-minimal defaults for those who don't want to set up everything themselves.
Well, I've had experience with Linux for the last 20 years, since I was 12 years old, so it was not that strange to set up Arch for the first time; way easier than the early Red Hats back in the day. I also have a long history with OSX, from the early 10.0 builds through 10.6. I honestly think that having a nicely configured computer built with a simple distribution, like Arch, is much easier for day-to-day work than an OSX installation, which can break lots of things when there's a new version out.
So, so, so much this. I did a full-stop switch to Mint for ~2 months and, for the most part, quite enjoyed it. What finally killed it for me was when I was back in an "office" setting where I wanted to use a monitor and bluetooth peripherals. After a few days of struggling through various bluetooth drivers and CLI management tools, I gave up.
To make things worse, the general response I've heard from other folks that use Linux full time was, "oh yeah, why weren't you just using USB peripherals?" The general feeling I got was that I was a n00b for even trying to use a BT mouse and keyboard in the first place.
I genuinely wanted to make the switch work, and it's a shame that it just didn't (for me). Honestly, the last thing I need sprinkled into every few working days is 45 minutes trying to get some driver re-installed.
> In the "Linux Community" there only seems to exist the hacker (who loves to spend ages tinkering with config files and debugging hardware issues) and the technically incompetent user (who most likely uses Windows anyway).
wow, that's a pretty elitist comment, even for HN. essentially what you're saying is that people who don't like spending time trying to make their computer work are technically incompetent?
I'm curious- what do you think of Fedora? It's in the top 5 distros along with OpenSUSE, Ubuntu, Debian, and Mint, and yet I hardly ever hear people talk about it.
I had personally given up on Fedora years ago, but recently was told I should give it a second look and I've not had time to try it out.
Fedora isn't for non technical family members. Here's a bit of mild whinging that's only relevant if you want to give it to non-technical people.
There's a move to give stuff generic names, rather than the obscure names they had in the past. For example, Nautilus has been renamed to Gnome Files, or just Files. When a non-technical person needs to search for help, this new name makes it impossible for them to create a useful search term.
[files foo bar] is going to be different from [nautilus foo bar]. Frustratingly the old name works for searching, but it's not in any titlebars or about boxes or menu items, so the non-technical person has to just know that files is also sometimes called Nautilus.
Fedora 20 has an appstore. This has something like 4 different names - in the menu, in the title bar, in the about box, in the icon.
For what it is (a rapidly released testing distro) it's lovely - nice community (from what I could tell) and lots of activity.
That's not so much a Fedora thing but a GNOME move, as the RPM containing "Files" is still called Nautilus.
Also, I am a bit surprised, since it runs contrary to your argument: when a user looks for a file explorer, he's much more likely to find it by looking for "Files" rather than by the (rather strange, really) name "Nautilus".
I would see myself as a technical user and yet I have no idea what the apps on my Android device are called. One is called, generically, "Gallery" and another one, even worse, "E-Mail".
KDE historically worked around this problem by displaying a description as well as the name, e.g. something like "Nautilus - File explorer". It worked great.
Thing is, the move to generic names is basically an ego trip from Linux developers. Consistent products like Windows or OSX, which are monolithic and have bazillion of users with the exact same configuration, can get away with it; but the Linux world is a forest of different apps from random developers, haphazardly packaged by this or that distribution and continuously updated every few months. In this environment, thinking you can just refer to "Ubuntu files" or "Fedora files" is a pipe dream; often the solution to your problems will be on sources that are not specific to the distro you are actually using (see for example the Arch and Gentoo wikis).
Who is this user though? Apple has Finder, Microsoft: Explorer, Android/iOS: nothing, and unless they've installed the CLI/minimal edition of a linux distro it has a file manager and they launch it not by selecting the application from a menu but clicking on a folder icon.
Making the name basically irrelevant, unless they need to ask a question about it - in which case, enter googlability.
I don't think there's that much wrong with "Gnome Files". That should be enough to search for help, and also gives information about what the program does. Similarly, I would say the names should be "Firefox Browser", "Geary Email" etc (if we're aiming to be user friendly).
"Gnome Files" should be fine, if people know it's called "Gnome Files", and if people know to use "exact search".
In Fedora 20 the name "Gnome Files" was hard to discover (that may have changed in later Fedoras) and non technical people just don't know how to search.
I like the categorisation as well as a name (and version).
In your desktop environment, 'Open web browser' (as an action), can be associated with whichever browser you prefer. And perhaps a context menu on that to choose between many.
I prefer something like "Gnome file manager: 'Nautilus'". Compare the browser, which is now just called 'Web'. How ghastly.
To differentiate between many, like both of Windows' web browsers: Edge and Internet Explorer. You could say Windows web browser Edge. Or Edge; Windows' web browser. Windows itself is a confusing name, but that's another conversation.
So I just moved to Fedora 23 after 10 years on Ubuntu. It was something that I was very hesitant about, but Fedora 23 has totally smashed it out of the park.
I think the usability is miles ahead of Ubuntu at least - I realize that is a personal opinion, but I love that Fedora is a tightly integrated Gnome (and soon Wayland) distro.
Don't want to be that evangelist guy, BUT OpenSUSE really deserves a second look, and it is friendly to non-technical family members. They are super stable and provide one-click installs off the web like Mint, but it installs them as a regular repo, which makes updating them a breeze.
On the technical side, I love OpenSUSE for zypper, the Open Build Service (any software you need is probably on there, and it will even build for other distros), rolling releases in Tumbleweed, and the best default KDE environment for over a decade.
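For anyone who hasn't touched it, the zypper basics map closely to apt (a quick sketch; the package name is just an example):

  sudo zypper refresh       # refresh repository metadata, like apt-get update
  sudo zypper install vlc   # one-click installs end up as ordinary repos you update the same way
  sudo zypper dup           # distribution upgrade, the normal way to roll Tumbleweed forward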
OpenSUSE is great; YAST is the best "control panel" app I've seen in any distro, you can configure anything without ever opening a terminal or vim. Zypper has also been more robust for me than apt (I've seen apt-get hell once or twice).
That said, stay away from Tumbleweed (rolling release). In my experience it's been even less stable than Arch or Sid. In the past I've had kernel updates make the machine unbootable, though thankfully the distro does keep all previously installed kernels, easily selectable in GRUB. Just yesterday I banged my head against a year-old bug where pgadmin3 is broken because it hasn't been rebuilt against the distributed version of wx (come on).
Fedora has come a long way from where it started.
It is much better than it used to be and has become really fast, secure, and yes: user-friendly.
I use Fedora @work and Ubuntu @home.
The reason I use Ubuntu @home is that my roommates run Ubuntu too and we get the same versions of whatever software.
But @work I run Fedora. There aren't any particular reasons except that it just "feels" better to work on than Ubuntu.
I definitely recommend giving it a try.
Using something like Fedy or easyLife makes setting up Fedora fun and fast.
Ubuntu is just too dated for development work IMO - I need to rebuild everything from source or find a 3rd-party PPA for anything remotely recent - I'm fine with compiling deps for production, but for development the distro really shouldn't be getting in my way of trying new stuff out.
That said, I've had many performance issues on GNOME - I like the way it looks and the "feel", and I got used to the UI over the last year or more, but frankly it's constantly bogging my PC down, both desktop and laptop - when I switched to KDE/Plasma 5 I saw my WebGL Chrome app go from 40-50 FPS to a consistent 60 FPS - both tests done after a clean start, simply starting the app and letting it run for a while. The entire system feels more responsive - I can't trace the issue, but the results are measurable and noticeable.
I've had flicker and stability issues with KDE 5 when I last tried it a year ago, but now it appears quite stable. My only complaint is that the themes/design community is nowhere near Gnome's.
Any distro with a release schedule is going to have caveats about not having prebuilt packages for the latest XYZ. The only things I can think of that might give you a faster update schedule than Ubuntu (in terms of newer versions, not just point releases) would be Fedora, Arch, or Gentoo.
Even the standard releases are too stale; in LTS, dev tools and compilers are ancient.
I forgot to mention that I moved to Fedora a year or two back for this reason - I got tired of rebuilding/PPA hunting for every part of my OS when I want to use the latest version of tool/lib X.
And once dependencies are too old (very often), prepare to be rebuilding 5+ custom libs and figuring out the differences between the Debian package/path layout and what the library build uses ... so much wasted time.
It's easily the most secure Linux distro. It has SELinux enabled by default (and it actually works!) and compiles binaries using most of the available hardening features, unlike basically any other mainstream distro.
They have an excellent testing/QA process, especially given the speed at which they're developing - this results in very high quality.
If you call secure an OS which can crash processes without giving meaningful errors.
I have lost hours debugging mysterious crashes because of SELinux, and it is really not safe to have components unexpectedly crashing when they are part of your core infra.
Plus I guess that, like every security framework, it runs with privilege, has a lot of lines of code, and is hard to audit, and thus increases the attack surface.
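In hindsight, the standard way to confirm SELinux is the culprit is quick enough (assuming the audit and setroubleshoot tools are installed):

  sudo ausearch -m avc -ts recent            # list recent AVC denials from the audit log
  sudo sealert -a /var/log/audit/audit.log   # human-readable explanations (needs setroubleshoot-server)
  sudo setenforce 0                          # temporarily go permissive to confirm, then switch back

But none of that helps when you don't yet suspect SELinux at all.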
Wow. You literally think selinux is bad because it uses strcmp and enums? Maybe you should read a book about C sometime before you make a fool of yourself on the internets.
I use Fedora and I truly do love it. I find it to be far more stable than other distros I've tried, and it really does "just work."
I know I've been suggesting it to people who are considering Linux for the first time, with the caveat of disabling SELinux.
With that said, I'm not sure if it is good for non-technical people or people who aren't interested in learning about it. Getting certain things installed can be a bit of work, as they don't support things like Chrome or even Chromium by default. I also view it from the perspective of not knowing how to use any graphical installers (do they exist?). If I was aware of that stuff, I'd possibly consider it for non-technical users, but as it stands, I don't.
I'm not even sure I would recommend that a novice user disable SELinux, because the only times I configure SELinux are when setting up developer-specific things.
There are two huge disadvantages that keep me from recommending it to everyone (though I use it myself):
1) strict licensing restrictions - lots of basic functionality/software missing from the official repositories; there is rpmfusion of course, but that's not something my mom should know how to install (the setup is sketched at the end of this comment)
2) Very short support. You'll be stuck without updates in no time, and upgrading to a possibly broken release (that might radically change things) every few months is ridiculous.
So, if they were to offer an LTS version, I'd happily recommend it to everyone. As it is, it's great if you don't mind the caveats mentioned above (especially everything that stems from the 2nd point). CentOS is not a viable alternative, as it is an "enterprise" OS stuck with archaic software.
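(The rpmfusion setup, for reference - this is roughly the one-liner from rpmfusion's own instructions, so double-check the URL there before running it:

  sudo dnf install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm   # add the matching "nonfree" release package too if you need it

My mom is still not going to type that.)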
> 2) Very short support. You'll be stuck without updates in no time, and upgrading to a possibly broken release (that might radically change things) every few months is ridiculous.
New releases are out after around 6 months. If you find release upgrades risky you can always wait a whole year and stay one release behind (any bug would have been ironed out in that time).
I usually don't update in the first couple of weeks, and I haven't had any problems with a Fedora upgrade in the last two/three years.
Fedora's latest release rocks. Fedora used to have stability issues here or there because it's the cutting-edge version of RHEL, but I think it's unlikely you'd run into those in everyday usage with mainstream hardware. It shouldn't be less stable than Ubuntu.
I used to use Fedora with KDE, 5 or so years ago. I liked it fine. The only thing I lacked was the convenience of deb packages, since so many projects offer debs, and using alien to install debs with yum was painful (they've changed package managers now, if I'm not mistaken).
Probably, if I move away from the convenience of a Debian derivative these days, it will be to FreeBSD.
If you do end up switching to FreeBSD and are looking for some help, feel free to email me! It'll be a bit of the blind leading the blind; I finally got FreeBSD working [wifi, desktop] 6 months ago. But I'd been running Ubuntu for the past 5 years before this, so I'm familiar with some of the transition confusion. I've been using FreeBSD exclusively for work and servers, with the exception of work servers (which are still Ubuntu).
What projects are you looking at that primarily only package a deb? I frequent far-flung corners of the web and have never had trouble finding an RPM (though I have on occasion had trouble finding a deb).
I've had both, depending on what I was looking for at the moment. Server and/or enterprise stuff, perhaps some management system for something, never offered debs. Lots of user software seems to give me debs a lot more often than rpms. It also shifted over time I think, debs being more popular recently and rpms less so.
Have you given Arch a try yet? There are also some family-friendly projects like Antergos. The Arch wiki is my all-time favorite for great documentation, even when I'm not using Arch, go figure.
Personally, I wouldn't recommend Arch (even the "friendly" variants like Antergos) to the less technically inclined. You don't need to be extremely familiar with Linux to get started with Arch, but you do need to be comfortable at the command line and are expected to be able to read the (excellent) docs and debug things yourself.
It's a very educational experience, but not really something the "average grandmother" is likely to have the patience for, particularly when they're just trying to watch a flash video on Facebook. That's the real strength of Mint, and I'm not sure what best fills that niche if Mint is off the table.
As an Arch user, the lack of MAC (mandatory access control) in any well-supported form throughout Arch and its derivatives is a deal-breaking turn-off for me. I cannot imagine telling someone who is not technically literate to start using a Linux distro without being able to trust that arbitrary software cannot read all their documents and steal their passwords.
Mint is having its Manjaro (an Arch-based prepackaged desktop distro) crisis. They had the same problem and backlash. They have ironed out many security issues and practices since then. Mint will probably do so too; Clem did a lot of work, it's probably just new territory for him and his team.
I wouldn't mind so much that they were running WordPress on the same machine if it were isolated in a container or jail. There's certainly no excuse not to do so these days now that php-fpm exists.
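Just as a sketch of the kind of separation I mean (image names are the official Docker Hub ones; the container names, port and password are placeholders):

  # run the database and the blog in their own containers, bound only to localhost,
  # so a WordPress compromise stays contained away from whatever serves the ISOs
  docker run -d --name blog-db -e MYSQL_ROOT_PASSWORD=change-me mysql:5.7
  docker run -d --name blog --link blog-db:mysql \
      -e WORDPRESS_DB_HOST=mysql -e WORDPRESS_DB_USER=root -e WORDPRESS_DB_PASSWORD=change-me \
      -p 127.0.0.1:8080:80 wordpress

Reverse-proxy 127.0.0.1:8080 from the front-end web server, and ideally keep ISO downloads on a different host entirely.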
> Edit: I'm also playing around with FreeBSD for my development environment, since I can use MATE on there. To be honest, I don't really need a DE these days anyway, since I only use a terminal and a web browser. I should look into just using a window manager.
FreeBSD has good support for DEs with XFCE and Lumina, anyways. At least I've never had a problem.
I've been running Mate and Gnome3 for the past 6 months as well. Mate had some issues and, for both, the screen went black for 1-2 seconds when videos start playing.
I was able to get a compositor (Compiz) working on Mate and this fixed the tearing issue. It had some issues where it just kept crashing and rebooting that I never figured out. So I switched back to Gnome3. Although Gnome3 comes with a compositor (I think? And I think it might be Compiz?), I can't seem to get rid of the screen going black when a video starts.
But it really doesn't make a big difference and it's not a big enough annoyance to matter. I don't really use FreeBSD for media anyway. It's just the auto-play videos on Facebook that triggers the infrequent frustration.
Me neither. Media is definitely not my use case. I'm running FreeBSD in a VM on top of OS X as a host anyway, so graphics support is likely more stable for me for that reason.
I think Lumina is looking really promising, though. I recommend giving it a try if you have the time.
Thanks for the Ubuntu Mate heads up. Over the last 10+ years, and from Apple OS X to MS Windows, the Ubuntu Gnome 2 desktop was the most pleasant for me to use.
I put my parents on Xubuntu about 5 years ago on a whim, then moved them over to Mint about 2 years later. About a year ago, I saw that Xubuntu was using whisker-menu (a main UI menu with a search box, like Windows), so I put them back on it and went back myself.
>unfortunately, this is also where we run into some of the limitations of UAC. Remember, there is no effective isolation; there is no security boundary that isolates processes on the same desktop. The OS does include some protective measures to keep the obvious and unnecessary avenues of communication blocked, but it would be impossible and undesirable to block them all. Therefore, Microsoft does not consider breaches of that nonexistent security boundary to be security breaches.
In the default (and probably most common) configuration, UAC does not represent a security boundary between medium-IL (user) and high-IL (the equivalent of Linux's root). Only if you bother to run under a non-admin account are you protected from escalations.
Leo Davidson then provided demonstrations of privilege escalation between those two integrity levels without triggering UAC.
I haven't checked Windows 10, but at least until Windows 8 most machines running Windows had been running most software effectively under root.
GPG signatures are kinda useless (when hosted on the same server). If I am able to replace the download, I could probably also deface the pages where you give your signature and key fingerprint.
The point of GPG is to download the public key of the signer (in this case, the package maintainer). Then, you check it's bona fide, either by verifying that you have a trusted connection through your web of trust, or (more likely) by verifying the key id is mentioned in other trustworthy places (like legitimate Ansible and Bash scripts, etc.).
After that, you have a trusted signature, and it doesn't matter if the signature page is defaced.
If you're downloading the signature each time you're downloading a new version of a package or iso, and its SHAs, you're using GPG incorrectly.
I've had many signatures for package signers in my keyring for 5+ years. If and when they replace the key, they let the community know and we update our keyrings.
Yes, if you don't want to do that, fine, but for those of us who are careful users of GPG, it's a huge barrier against malware in packages and distros.
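Concretely, once the maintainer's key is in your keyring, checking a release is two commands. (File names below follow the sha256sum.txt convention many distros use; adjust to whatever is actually published.)

  # one-time: fetch the signing key and check its fingerprint against sources you already trust
  gpg --keyserver hkp://keyserver.ubuntu.com --recv-keys <key id obtained out of band>
  gpg --fingerprint <key id>

  # per release: verify the signed hash list, then the ISO against it
  gpg --verify sha256sum.txt.gpg sha256sum.txt
  sha256sum -c sha256sum.txt    # will complain about ISOs you didn't download; look for "OK" next to yours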
I am a first-time Linux Mint user. The only place I can find who the developers are, what keys are used/signed, and so on is their (compromised) site. Once we have established trust it is easy to maintain it.
If I had known they had GPG signed hashes, my work flow downloading the distro for the first time during the compromise would have been as follows:
1. Get key id of the key used to sign the hashes.
2. Google for key id. Hmm, no one's used the key id before, that's odd. Not in my web of trust. And it doesn't match the key id mentioned on other forums/scripts/sites. I'll definitely hold off until I can verify the key directly with one of the distro maintainers.
Even if you haven't previously established trust, GPG can be an extremely valuable ally.
True, but the site has to be compromised at the moment you are downloading the keys (not anytime later!), and then all your further installations have to be MITM-ed or you would take notice. Not so trivial for the attacker anymore.
Ideally, the keys should be distributed in safe fashion too, so this whole question should be moot.
The user interface/experience of PGP/GPG is not great, and that could do with fixing. However if you already understood how to use PGP (and granted that's unlikely for the users Mint targets), then you would not have been caught out by this.
Thank you, that was indeed a typo I wrote as I rushed to get out the door earlier. I obviously meant to write "if you're downloading the public key" (indeed, I referred to "download the public key of the signer" two paragraphs up, so I suspect that was clear to most people reading). The hashes for each new version are what are signed, obviously (with the private key of the maintainer).
My sentence about "trusted signature" could have been clearer, too. You have a public key that you trust, that you leverage to verify any new materials signed by the package/distro maintainer. So by extension, you can trust the signed hashes.
I've been using PGP/GPG on a regular basis since 1998, so I'm pretty used to the workflow by now and not at all confused about key pairs and signatures (although I will admit that I sometimes got confused about the point of subkeys until I read this a few years ago: https://alexcabal.com/creating-the-perfect-gpg-keypair/)
Yeah, I figured out that you merely made a simple mistake, since the rest of your comment makes all kinds of sense. :) Apologies in case I sounded like I wanted to put you down.
APT can automatically verify GPG signatures against a preinstalled keyring. Obviously, that doesn't help in the case of this particular incident (ISOs), but it does help in day-to-day updates.
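You can see both halves of that at work (a quick sketch):

  apt-key list          # the preinstalled keys apt accepts repository signatures from
  sudo apt-get update   # verifies each repo's Release/InRelease signature against that keyring;
                        # a mismatch shows up as a loud BADSIG / NO_PUBKEY warning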
> Add to that, that they do not care about copyright and license issues and just ship their ISOs with pre-installed Oracle Java and Adobe Flash packages and several multimedia codec packages which infringe patents and may therefore not be distributed freely at all in countries like the US.
Seriously, with the rotten-ness of the US patent/copyright/political system, it's better for mankind to just say "ok, US users can't get this, but everyone else can".
E.g. many European banks do this for "US persons" - they simply cannot get accounts because the legal risks are just too high.
Edit: It's not just banks. E.g. BMW Group (and likely other huge non-US corps with US subsidiaries) refuse to allow US persons to look at financial statements, again due to regulatory hassle.
I'm also surprised how often I hear this "patent" argument, while actually "the world except the US" doesn't recognize software patents. It's really a "US only" problem; the rest of the world doesn't care about this.
Most patent encumbered codecs have patents worldwide, including in most EU countries. Go to the MPEG-LA website and take a look at the patent lists if you're curious.
> Seriously, with the rotten-ness of the US patent/copyright/political system
I've gotten this vibe on Reddit, but people seem to be against copyright unless it's YouTube not enforcing it vigorously enough when small channel owners get their content stolen. Then suddenly everyone appears to be for copyright laws.
> "ok, US users can't get this, but everyone else can"
Setting aside for a minute that this isn't possible, it's like saying you won't release your app on iOS because of Apple's walled gardens. Sure, the walls suck, but the users inside are numerous and spend lots of money. Most product creators don't have the option to exclude US users.
Nobody is saying not to release for the US (in fact you can choose to use the "no codecs" version), but why should problems in the US or anywhere else limit the experience that everyone else can get?
> Add to that, that they do not care about copyright and license issues and just ship their ISOs with pre-installed Oracle Java and Adobe Flash packages and several multimedia codec packages which infringe patents and may therefore not be distributed freely at all in countries like the US.
Hmm, that was actually one of major selling points for Mint around me - it was the distro that "worked", with relevant software, codecs and drivers being preinstalled and not crippled due to strange laws on the other side of the ocean.
It's user friendly, yes, but also reckless. The distro won't "work" any more if it gets sued into oblivion by Oracle, Adobe, Nvidia, AMD and whoever else feels like kicking puppies.
Hmm, why would they be sued? Mint already offers a special version for people living in countries where they can be sued for installing a video decoder or a video driver - it's called the "No codecs" version and it's meant for users in the USA and Japan.
Suing in the rest of the world would be a rather wasted effort right now.
The flip side is that if you believe that laws like those around software patents should change, someone ignoring them and taking the risk of being sued is a more compelling demonstration than carefully observing the rules and then complaining about them on HN.
More pragmatically, it seems unlikely that Oracle or Adobe will sue a distro for helping them distribute the Java/Flash runtimes. They want their runtime to be on as many devices as possible.
> The flip side is that if you believe that laws like those around software patents should change, someone ignoring them and taking the risk of being sued is a more compelling demonstration than carefully observing the rules and then complaining about them on HN.
As an EU citizen I have zero influence on US politics. Complaining on HN to US citizens is literally all I can do.
> More pragmatically, it seems unlikely that Oracle or Adobe will sue a distro for helping them distribute the Java/Flash runtimes.
Then maybe those companies should change their license terms. Right now, Adobe does not even allow you to use a downloaded Flash installer on two computers, you must either download it separately on each PC or apply for a special license: http://www.adobe.com/products/players/flash-player-distribut...
If they want to shoot themselves in their feet, I'm not going to interfere.
> As an EU citizen I have zero influence on US politics. Complaining on HN to US citizens is literally all I can do.
What about making nice things that US citizens can't legally have? If you're legally in the clear in your own jurisdiction, you don't even need to take a risk yourself.
I'm not a lawyer and I don't know exactly what our situation is in the EU, so don't take my word that that's safe. Software patents may be technically banned, but I think people disguise them as 'business method' patents. And if you use US project hosting like Github, maybe someone can sue you there. I'm suggesting that, at the margins, we should take some risks.
IF, IF, IF... meanwhile we are getting by just fine while the others wait for Hypothetical Risks and Hypothetical all-in-one interfaces (this last one is just my rant against Canonical).
Most people install some packages as soon as they install an OS. A distro could be very nearly as user-friendly by creating a wizard that takes users through the steps of installing these proprietary things in a legal way.
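Ubuntu already gets fairly close to that with a single metapackage that prompts for the relevant EULAs along the way (roughly the wizard idea, minus the polish):

  sudo apt-get install ubuntu-restricted-extras   # codecs, MS fonts, Flash; stops to ask you to accept the licences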
I remember not long ago Mint included just about every browser media plugin ever made, including RealMedia, WindowsMedia etc., stuff that was obsolete 15 years ago. Maybe it still does.
All that stuff doesn't just mean bloat, but also significant security risks.
You can somewhat get away with it because desktop Linux isn't a major target (yet), but it's just very bad practice that shows a lack of care.
> I remember not long ago Mint included just about every browser media plugin ever made, including RealMedia, WindowsMedia etc., stuff that was obsolete 15 years ago. Maybe it still does.
I remember installing it when it was relatively new and people were gushing over it. A few weeks later a new version came out. I tried upgrading, only to find there was no upgrade path. Upgrading Mint meant reinstalling Mint.
I remember the days before apt-get when there was only dpkg. Before Debian I used Slackware, so I'm all too familiar with package management (or the lack thereof).
The idea that someone would release a new distribution, based on Debian of all things, that couldn't be upgraded was repellent to me. Re-install Mint to upgrade? No thanks, I'll install Ubuntu over it.
Cinnamon is nice, but I never understood why it needs its own distribution. I should be able to apt-get install cinnamon-desktop or whatever and have it work like any other package.
> Cinnamon is nice, but I never understood why it needs its own distribution. I should be able to apt-get install cinnamon-desktop or whatever and have it work like any other package.
On Debian you actually can 'apt-get install cinnamon' and have the full desktop.
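In full, something like (jessie or later; task-cinnamon-desktop is the tasksel metapackage, if I remember the name right):

  sudo apt-get update
  sudo apt-get install task-cinnamon-desktop   # or just 'cinnamon' for the bare session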
I'm sorry for the rant, it's a bit off topic and not called for, but I would like to say my experience is sort of the opposite:
I've been a Slackware-current user for most of a decade, and I love that it's so easy to upgrade the OS using slackpkg. I can (and do) often upgrade only a subset of packages, and never even need to reboot afterwards, not even after replacing the kernel (although, obviously...). A simple package system without dependency tracking has had many advantages for me. But installing stuff not in the standard system is more of a pain; there are third-party repos, but nowhere near as convenient as Debian/Ubuntu's. I compile a couple of SlackBuilds (packages) per week. I admit I've spent a huge amount of time learning how to do things like that manually and wouldn't recommend Slackware to those who don't want to learn sysadmin.
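(For reference, the routine I mean is just, as root:

  slackpkg update        # refresh the -current package lists
  slackpkg install-new   # pull in packages newly added to the distribution
  slackpkg upgrade-all   # upgrade everything that has a newer build

with no dependency solver anywhere in the loop.)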
On the other hand, my experience with upgrading between Ubuntu releases is terrible. A number of things have broken, permanently, every time, and I can't trivially upgrade from Ubuntu 14.10, because they purposefully break the package manager in old releases by breaking URLs. There are other pain points too, like it not allowing installing both 32- and 64-bit devel libraries. I need those! The package manager is too complex and clever for me. (Honestly, I don't want to spend time learning how to override it.)
I'm planning to format the drive and reinstall from scratch because it seems easier than fixing it.
> A simple package system without dependency tracking has had many advantages for me.
Are you equating "simple package system" with "per-package service isolation" here? Because otherwise I don't see how your example about upgrades not requiring a reboot ties in with the package system. Debian has probably the most extensive package system, yet it still allows you to do upgrades without reboot. The major exception to this is dbus, because it still can't do stateful restarts. But I don't think that's a problem that Slackware is able to avoid, unless it avoids dbus completely.
Also: since Wheezy, Debian allows parallel installation of many development and shared library packages from all (10!) arches, for example to facilitate cross-compilation. It's still a work-in-progress, but I believe over 80% of the archive (not including packages with binaries) should be co-installable now.
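The user-facing side is pleasantly small - a sketch for an amd64 host, with libc6 standing in for whatever 32-bit library you actually need:

  sudo dpkg --add-architecture i386   # register the extra architecture
  sudo apt-get update                 # fetch i386 package lists as well
  sudo apt-get install libc6:i386     # co-install the 32-bit library next to the 64-bit one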
Thanks for the information. So I assume Ubuntu now also allows parallel multiarch devel packages?
I've never tried Debian, and my experience with Ubuntu is very limited. I guess my comment on reboots was off-base, and maybe more of it. I meant that packages aren't densely interconnected with dependencies. (macports has it really bad; it seems every time I want to upgrade python I need to recompile gcc or something.) Anyway, the simplicity makes it easy to modify packages. E.g. last week I fiddled around with lua packages to allow parallel installation of multiple versions (lua 5.1, 5.2, 5.3 are incompatible languages which are often mistaken for the same language by distro maintainers). However, I realise creating .deb files is also easy, so I guess I could do just the same there. Also, Slackware's multiarch support is a very simple hack using a small script that works 100% of the time as far as I've seen. It's telling that Debian still only supports 80% of packages after who knows how much work has been put into it.
Ubuntu should be able to do it, as the multiarch spec was developed in collaboration [1]. I'm not sure how well-integrated it is with the rest of Ubuntu's archive and infrastructure though (for example, Ubuntu first used a modified syntax in sources.list vs Debian's dpkg --add-architecture approach [2]). My 80% figure is also just a wild guess, I've tried it very early and failed for some exotic packages, but right now I only use it for a few packages.
I understand what you mean about tightly integrated package dependencies, and for me Debian is the rare exception in that it mostly works. My main irk with it is that dpkg only allows one version of a package to be installed, so you end up with package names like gcc-4.9 to allow parallel installation. Or in your example, there's a lua5.1, lua5.2 and lua5.3 in the archive -- but that doesn't mean that every lua library is also available in all three versions.
But the package compatibility is still way better than ecosystems like Maven/Gradle or Stackage, where dependencies can be so tightly coupled that you end up working around old bugs in one package, or new bugs in another.
Thanks. Hmm, the MultiarchSpec page lists "Co-installable -dev packages" as an unresolved issue, but hasn't been updated since 2014. The MultiarchCross[1] page specifically about that issue hasn't been updated since 2013. I couldn't work out what the current situation is.
Slackpkg is relatively new, it was part of the slackware 9.1 release. A lot of people, myself included, remember experiences with older releases without a real package manager.
That was my experience as well, and after interacting with them a couple of times on issues with namespace collisions I realized that it was more like some person's distro rather than a new mainstream distro. The difference being: are you in the release management business (hard), or in cobbling together a set of packages that you like for yourself (easier)? I went back to Kubuntu and waited to see if they would grow into that. After all, everyone has to start somewhere.
The compromise on their site with WordPress was sad, but the best description I've heard for WordPress was that it was "a rootkit with a blogging package ride-along." It is so hard to secure a WordPress install and keep it that way.
"why it needs its own distribution"
Because... Linux? Seriously, in theory in Linux you should be able to mix and match any DE with any window manager. In practice it doesn't work like that at all.
Hmm; I'm puzzled by a contradiction between this and another recent article. From this article we learn that we shouldn't do this:
"Secondly, they are mixing their own binary packages with binary packages from Debian and Ubuntu without rebuilding the latter. This creates something that we in Debian call a "FrankenDebian" which results in system updates becoming unpredictable <https://wiki.debian.org/DontBreakDebian#Don.27t_make_a_Frank.... With the result, that the Mint developers simply decided to blacklist certain packages from upgrades by default thus putting their users at risk because important security updates may not be installed."
"Nobody else requires that you rebuild every package before you can redistribute it in a modified distribution - such a restriction is a violation of freedom 2 of the Free Software Definition, and as a result the binary distributions of Ubuntu are not free software."
I appreciate that the latter one is discussing a hard requirement as a result of Canonical's IP licensing. But the former seems to indicate that it would be bad practice to just copy all of Ubuntu's (or Debian's) binary packages and build a new derivative distribution on top of it. Is the latter piece arguing in part for a freedom that would be a really bad idea in practice?
> to just copy all of Ubuntu's (or Debian's) binary packages and build a new derivative distribution on top of it
This is okay. However, copying Ubuntu’s and Debian’s packages into the same repository and then mixing them willy-nilly is not.
You can either:
a) Base your derivative on binaries from Debian xor Ubuntu, then add source packages compiled with these binaries as you like.
or
b) Base your derivative on sources from Debian and/or Ubuntu, then compile everything together with your custom packages to make sure that all ABIs match.
Ah, ok. So the issue is mixing binary packages from different distributions or different releases of the same distribution (the Debian doc mentions mixing Debian stable and testing for example). If that's what Mint are doing then I'm surprised they aren't stuck in perpetual dependency hell.
I realize they have not taken security seriously in designing their delivery mechanisms. However I do not care. Linux Mint solves all my problems of configuration. It is configured so nicely and with Cinnamon it comes with so many useful bells and whistles, I'm not moving anywhere. It's great. Mint will be back after this blow and I am sure it will not happen again. I use it daily and I am super productive on it as my desktop. My family uses it also, on new and old computers, and it is perfect.
This. Because of Linux Mint, I don't even notice my Operating System anymore(which is a good thing).
Everybody else here is recommending 'safer' options, but seriously I don't want to pay the cost of fighting with my OS (which is what every Linux experience has been for me since 1996). Linux Mint has been the only OS which makes me forget that I am using Linux. It beats OS X in almost everything.
> I realize they have not taken security seriously in designing their delivery mechanisms. However I do not care.
I hope then that you don't use your computer for developing software or for work or anything remotely important, because otherwise using insecure software is egregiously unethical towards your users/customers/employers: you are consciously choosing to expose them to unnecessary risks.
> Secondly, they are mixing their own binary packages with binary packages from Debian and Ubuntu without rebuilding the latter. This creates something that we in Debian call a "FrankenDebian" which results in system updates becoming unpredictable
This is interesting because Debian itself encourages derivative projects to use their binary packages[1]:
> For those derivatives that re-use Debian binary packages, add some source packages and modify some source packages, where possible we encourage them to use standard Debian mirrors and add a second repository containing only the source and binary packages that have been added or modified.
Or maybe they don't encourage that behaviour but still give guidelines in case you want your derivative to work that way? I'm not 100% sure.
I believe the problem is the mixing of Debian and Ubuntu binaries rather than the fact they're not recompiling everything. Ubuntu doesn't just take Debian and add a few extra packages - it's a complete recompile with package versions likely don't match any Debian release.
Oh, Linux Mint mixes Debian and Ubuntu packages by default? I was under the impression that there is Linux Mint based on Ubuntu and Linux Mint Debian Edition based on Debian.
I had to quit using Mint because a lot of packages in their repo are really, really old. Just to give the example that made me install Manjaro on a friend's PC: the ownCloud client for file syncing. It's a few years old, contains a lot of security bugs and doesn't work with HTTPS. You need to add a repo with opensuse in the URL. The Mint developers were contacted by the ownCloud developers a few years ago to resolve the issue, and they ignored it.
Manjaro, really? The one distribution with a worse security track record than Mint? "Our SSL certificate expired, please change your computer clocks one year back" Manjaro?
Yea, I know, I expected this answer. I needed something quick to install, not deb... and with the latest versions of packages. I was thinking about Arch, but it takes too long to install, and I had ~20 minutes. Manjaro is OK for people who know what they're doing and who know how to set up the basic features of an OS.
Fucking bang on point. Mint has a nice out of the box experience but after using it for a while you begin to see all the shit they stuffed in the cracks. It is a shame as Cinnamon is a lovely DE, I wish there were an official Ubuntu build with Cinnamon as the default DE similar to Xubuntu with Xfce.
Installing Ubuntu GNOME and then installing Cinnamon would give you a nearly identical setup. Cinnamon is basically just a replacement for GNOME Shell. You could also install Nemo if you prefer that file manager.
There's something interesting to be said here. "They make {{ package }} unusable by hijacking its namespace" - well, who gave them that namespace? I understand the whole first-come-first-served thing and all, but if we played that way things could get messy real fast.
There was recently an article on HN about the "Web of Hashes" and this article got me thinking about it. Why not give each application a UUID and let that be its namespace? Give the user an option to still use (their example) xedit while having another xedit installed alongside?
I can see how this could also get messy. Just spitballing here.
That's not exactly what was written, but it is illuminating nonetheless (the original quote only says "name," which is a very different thing from "namespace"). Why do package managers have a single namespace to begin with? I should be able to install packages regardless of name conflicts, by using a namespace, e.g.
$> apt-get install mdm
Found org.debian.mdm and com.linuxmint.mdm.
$> apt-get install com.linuxmint.mdm
$> apt-get install org.debian.mdm
There's obviously the chance for a file system conflict, but the package manager should be able to keep track of that for you and abort if it would occur (or allow you to install it to a different prefix).
It's incredibly naive to believe that name conflicts would never occur, especially with the 3-letter acronyms/contractions that are so prevalent in Unix. I'd pin the blame this specific problem squarely on Debian, it shouldn't be happening in the first place.
Distributions manage the namespace of executables. Well, they're supposed to anyway... that's much of the complaint here about Linux Mint.
In Fedora (and most distros), programs with an executable name overlapping another program will be renamed to a unique name as part of the distro packaging. (Putting aside programs intentionally named the same thing because they provide the same function, which are managed differently via "alternatives".)
This puts some implicit pressure on creators of software to not reuse names that are already in use, which seems to work for the most part.
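The "same function" case looks like this (a sketch; the exact group names vary by distro and by what's installed):

  alternatives --display mta            # Fedora/RHEL: which MTA currently owns /usr/sbin/sendmail
  update-alternatives --display editor  # Debian/Ubuntu equivalent: which editor owns /usr/bin/editor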
Gentoo handles this by categorizing packages into a format like app-editors/vim. This way packages in different categories can share the same name:
dev-lang/crystal (The Crystal Programming Language)
games-mud/crystal (The crystal MUD client)
x11-themes/crystal (Crystal decoration theme for KDE4.x)
When installing packages, if a package has a unique name across all categories, a simple "emerge vim" will install it. Otherwise, the category can be specified with "emerge dev-lang/crystal".
I absolutely love the concept of NixOS, but I haven't tried it out. Is it well-supported enough to use day-to-day as a developer? Do you often have to build things from source?
It's worth playing with to see if it could fit into your work; for instance, the Haskell ecosystem is very well supported, while Ruby seems to have issues. I am now using it for work on some of my projects. Generally you won't have to compile from source; that said I did recompile glibc when that last bug hit. (Security updates for core packages are not a great story now.) It's not perfect, but it's very hackable.
> Is it well-supported enough to use day-to-day as a developer?
Speaking as someone who knows the developer: no.
> Do you often have to build things from source?
There's ~6500 packages, so it's likely you'll be installing some stuff from source. It's really hard to predict without knowing specifics though. http://hydra.nixos.org/eval/1237359
If you track 'unstable' you generally don't need to build from source as that branch (or channel as Nix calls it) is updated only after everything has been successfully built on Hydra. There are sets of packages excluded from the build but I think those are mostly interpreted languages (emacs packages, etc).
It's generally well-supported enough so long as you don't need the latest and greatest packages within about a month of their release. They're having issues with their continuous integration system (the box it's running on isn't powerful enough and the project doesn't have money to get a new one), and there's occasionally breaking errors in important packages meaning a new version of the package repository doesn't get rolled out for a while, even if you're not using those packages.
Note that building from source is exactly the same process as building from binary - i.e. "nix-env -i firefox" will try to install from "cache" (the output of the continuous integration system's build process) and if it can't find something, build from source. Most important things are in the cache, some things aren't (mostly obscure packages and things with no-redistribution licenses), but in general it works out well.
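Day to day it looks something like this (channel URL as given in the Nix manual, so double-check it; attribute paths via -iA are the predictable way to install):

  nix-channel --add https://nixos.org/channels/nixpkgs-unstable nixpkgs   # track unstable
  nix-channel --update
  nix-env -iA nixpkgs.firefox   # comes from the binary cache if Hydra has built it, otherwise builds locally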
I use it day-to-day as do some others in my company, and we also use it for all our cloud deployments. You have to relearn a number of things, like setting configs and so on, but on the other hand for nix pkgs it is nearly always a completely hands-off process regardless of whether you need to build from source or not (Nix handles that as part of the build, retrieving binaries from a remote cache is an optimization). I recommend giving it a shot!
Personally I switched from Ubuntu to Fedora a couple of days ago because I've had it with Canonical. It was the first time I switched distro almost since I started using Linux (though I've admined servers with other distros and other OSes in the meantime). I am satisfied with Fedora thus far.
Mint's HiDPI is simply the best experience on Linux these days, not to mention it just works. But let's attack it because it doesn't conform to some autistic standards of ours. Way to go friendly Linux community! Let's make all distros unusable, super complex, require all people to wear their own personal TPMs and certificates so that they can feel finally secure. Let's blame Mint for not having DNA real-time sequencers for confirming package authenticity! Let's force my grandma to compile all her packages - she must be upgraded as well or she won't make it during singularity, right? /s
You are blaming Mint for things that are wrong in some context with Debian/Ubuntu or Linux or even unsolved in computing as such. And the small team of developers simply can't respond to every single issue within minutes as you wish, they have their plate full.
Gnome 3 only allows whole integer steps in DPI scaling. This means that the options for me are a) too small at 1x or b) too big at 2x. I use a scale factor of 1.4x in KDE. I haven't tested on Cinnamon, so I don't know if it accepts non-integers for scaling.
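For reference, the Gnome knobs I mean (gsettings keys as they exist in current Gnome 3; the second one is the usual fractional workaround, but it only scales text):

  gsettings set org.gnome.desktop.interface scaling-factor 2          # whole integers only
  gsettings set org.gnome.desktop.interface text-scaling-factor 1.4   # fractional, but text only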
I would gladly use KDE - I like it, but this https://bugs.kde.org/show_bug.cgi?id=162211 is still not fixed after so many years (and no - the title is not updated - it doesn't affect only remote media)
And yet, it just works, looks nice, is fast on old hardware, mounts everything out of the box, controls your audio from the volume icon and gets out of your way in general. 17.3 is one of the best out-of-box experiences imo.
Poor security in what sense? That something can be spoofed by an attacker with unlimited resources? How many of your desktop users are actually at risk? How about Win 10 telemetry? Or whatever OS X is sending back to Apple? Happy about it?
They mean from a usability perspective. Mint is famous for its volume icon that includes album art for what's playing, various other info, and controls to let you pause/skip tracks. It's like a tiny music player in your task bar.
I've written a similar thing for the Mac, using NSStatusBar [1]. It's no big deal, or reason for a Linux distribution to be "famous". There are zillions of apps in the Mac store that do stuff like that.
Does Linux have a standard API for surfacing interactive icons and menus in the menu bar? An ICCCM extension, perhaps -- set some properties on the root window and cross your fingers? Or would you have to do it differently for every different Linux desktop environment or window manager that supported such a feature?
It depends on the desktop environment. Most desktop environments provide a way to extend the desktop with extensions which can be written in various languages. In the case of Cinnamon for Linux Mint, being based on Gnome, it uses JavaScript to extend the desktop with extensions, applets (extensions that display on the menu bar), and desklets (extensions that display on the desktop). In fact, a good portion of Cinnamon[1] is actually written in JavaScript.
Every once in a while, I get a different error in /etc/issue*. I really hate all of these. I use Mint only because I like its GUI. I really cannot stand the Ubuntu desktop for a moment.
If Mint is just really a GUI interface, do they really need to put out a whole new distribution? Why not focus on the GUI and let the distros do the distribution?
Ubuntu Gnome - works VERY well - if you enable the "window list" plugin for shell, it works as well as cinnamon.
You can also use some other distro like debian or fedora that also provides cinnamon.
> if you enable the "window list" plugin for shell
And add minimize/maximize buttons and applications ("start") button. I have no idea why Gnome has to experiment, they have lost so many users unnecessarily. At least it's configurable to something sane. They are far from catering to the same feature-averse audience of iPad so there's no use in trying to (poorly) emulate its minimalist design language.
That's why Linux Mint wins. It looks completely like MS Windows. Taskbar, icons, keyboard shortcuts... everything is similar to MS Windows. As a Windows user, I find it the easiest to use, and that's why they are winning.
I prefer the taskbar from 7+ and in my experience only KDE can mimic it, albeit clunkily. From the screenshots it seems Mint is closer to XP. I think GNOME can be beaten into looking more familiar, and I can tolerate it. At least it now respects the Windows key convention and runs its search functionality.
As a super happy user of Linux Mint - guys, please keep doing what you are doing! Thank you so much for giving us a proper desktop Linux! You have my (financial) support! Don't get pressured by some random loud Internet criticism and change for worse! Please don't do Win7->Win8 or iOS6->iOS7 regression in Mint as well because of a few unhappy voices trying to acquire power over you!
It has been stable and reliable for me too. But the concern is not about reliability so much as security: some patches are not delivered due to the way Mint (apparently) mixes binaries from Debian and Ubuntu. It may be quite reliable but still insecure.
So, functionally, is there any real difference between using Mint's ISO to install from scratch versus using your preferred distro of choice (Ubuntu/Fedora/FreeBSD, etc.) and installing the Cinnamon Mint desktop on top of it?
I've been playing with Mint for the past few weeks and experimenting with full-Mint-on-a-VM versus Ubuntu-with-Cinnamon-desktop, and I don't really notice much of a difference. After reading about all of Mint's problems this morning, I'm tempted to stick with Cinnamon exclusively as a DE unless someone offers a compelling reason to use the full distro.
> So, functionally, is there any real difference between using Mint's ISO to install from scratch versus using your preferred distro of choice (Ubuntu/Fedora/FreeBSD, etc.) and installing the Cinnamon Mint desktop on top of it?
No. Switch to another DE. Cinnamon is developed by the same gung-ho cowboys that develop Mint so that's a poor indication that it's managed any better than Mint itself.
I see a lot of people asking for alternatives. I spent 12 months trying almost every distro I could get my hands on and have some recommendations for those interested.
This was my shortlist at the end of all my adventurism and testing.
1. Linux Mint
2. Ubuntu MATE
3. Antergos Cinnamon
It's a pretty short list, but those are what I settled on as possible choices for my own use. If the goal is getting down to business and getting work done rather than fiddling with the system, I think those three would fit most people's needs. I was a longtime Xubuntu user prior to this adventurism, and IMO there are just better alternatives now; it would probably be #4 if I had one, but I'm just not a fan any longer. MATE manhandles XFCE.
I leave Mint at the top because, these security concerns aside, it remains the best distro for me. I love their LTS update policy, continually delivering updates to Mint for the entire support span of Ubuntu LTS. Their desktop environment is also, IMO, just better than the alternatives.
Ubuntu MATE is pretty good, and for the type of person who is drawn to Mint, like myself, it would be a really good alternative. It's missing a few features of Cinnamon, which for me is generally superior to MATE, but overall this is what I'll install if I decide to ditch Mint.
Antergos is just Arch with a nice installer. I didn't spend a long time testing it, but it would be my choice for a rolling distro. Many people I know want that, and Antergos offers Cinnamon as a main, supported environment. It might be the best of every world for some. I prefer the slower updates of LM and UM, and install newer packages through PPAs or by compiling them myself.
As an aside, I have completely given up on installing other desktop environments onto distros that didn't originally ship with them. I see people recommending that, and it may work out, but in my experience it's a mess if you want to switch back. I prefer to pick a distro that ships with the DE I want. I would not run, for example, 'sudo apt-get install cinnamon-desktop-environment', anywhere at any point. :)
Hopefully this helps someone out there looking to migrate off of Mint. I'm still using it (17.2 here) but may move to Ubuntu MATE or Antergos Cinnamon, depending on Clem's response.
Accepting poor security practices from the Linux Mint developers cannot be justified just because you are not paying anything.
Open source projects need to be held to the very highest security standards.
How these security breaches are handled today sets a precedent for how such breaches are handled in the future. So it's a good idea to learn and improve from them and set a positive precedent.
If Mint advertised, "Free to download, but also free of good security practices", then I could buy this argument.
The problem is that Mint invites users to trust it. It would be better if they didn't work on the distro at all if they're not going to take security seriously.
People on HN become furious about antivirus programs and routers and whatever else that are discovered to be laughably insecure. An OS is no less of a security product than A/V or a router. In fact, it's the first line of defense for an end user.
And unless you are using your computer only for media consumption, it's also unethical, because you are just passing that insecurity on to your customers/users.
People, I understand your criticism, but may I suggest donating to them too if you've used Mint? Once his bills are paid, maybe he'll spend more time worrying about Mint?
What do you need Ubuntu for? I loved the Cinnamon experience and was used to dpkg/apt-get, so, wanting a rolling release with up-to-date software, Debian was the easy choice. Choose Cinnamon in the installer and you're basically set.
There are far fewer Mint developers than other distros have, and the project is mainly user driven. So of course it is less likely to be professional; at the same time, it is more user driven.
Ubuntu refugees like me. Not only do I not like the new UI (and the Amazon shopping lens), but there are also hundreds of little things, like gnome2 gedit having a usable search/replace dialog box as opposed to the gnome3 thingy-in-the-corner.
Also, I like to be able to take a theme and modify components one by one - changing the colour of title bars or the font in menus - without having to wade through 2k lines of CSS and non-existent documentation, and then spend half an hour debugging why it changed things all over the place. The v3 theming interface might as well let theme creators use obfuscating DRM to prevent users from modifying their themes.
The Amazon lens turned me off of Ubuntu (and towards Mint) more than anything else. I lived with many bugs in Ubuntu, but the day they decided to corrupt it with pre-installed $$MEGACORP$$ stuff, even if it can be uninstalled (and it shares my search data from the UI, no less!!), was the day I swore off Ubuntu. Too bad.
I am, at my home workstation. I don't much like Ubuntu, am generally happy with Debian or Fedora, and have played with Arch, Gentoo, and others, but I had heard good things about Mint "just working", so I thought I'd try it out.
It installed really smoothly and was really quick and easy to set up for me and my not-so-techy wife.
It installed the Nvidia drivers when I asked it to, without having to touch the command line (there's a "drivers" tool or something like that in the system settings). I'd had some problems with them before on a pure Debian install, so I was quite impressed.
In general, it was a pretty good experience. I'm thinking I may well switch over to Fedora at home, though. With my home machine I want something that just works, so I don't need to go faffing about with config files any more. I have enough of that at work.
I am, although I use the Xfce version. Back in the day, before I learned to program, I used distributions like Arch, Slackware, Gentoo, etc. that generally required manual configuration and knowledge of the system and Linux in general. I realize this seems backwards, but these days I'd rather spend my time developing things than configuring my workstation. My interest in computers has moved from administration to creation, and using an operating system that takes care of the former allows me to better spend my time doing the latter.
That said, in light of Mint's issues I'm seriously considering moving back to Xubuntu or some other distro that treats Xfce as a first-class citizen. I'd be interested in suggestions.
I am. I've tried dozens of Linux distributions (sadly, this isn't an exaggeration) and Mint is likely the easiest and fastest one to set up and start being productive with.
When my daughter asked what Linux distro she should install on her laptop, I didn't hesitate in recommending Mint.
Maybe because the company behind Ubuntu has practically declared the desktop dead, and we want a desktop that is really a desktop and not some futuristic (already dead) all-in-one fantasy?
Yes, Xubuntu is a desktop, but I'm not waiting around for the axe to fall and another attempt to force Unity on everyone... I had to change once; better to change to a distro that is committed to the desktop.
I really don't see how DistroWatch indicates anything - why would people using a system go to that site in particular? The latest Wikimedia statistics from the middle of 2015 have:
Those stats are very unreliable. Most Linux web browsers (including Chrome) do not send the specific distro they're being used on. Ubuntu patches Firefox to include Ubuntu in its User-Agent, but most other major distros do not.
Me... I had been using Arch and FreeBSD for a time because I felt they were what serious programmers had to use. Then one day, after FreeBSD refused to boot up again, I figured I was bored of messing about in recovery mode yet again, so I just thought I would give Mint a go. 20 minutes later everything was up and running. Since then I haven't been able to find the time or the will to spend a whole evening setting up my machine to change back.
I installed XMonad. I can't remember the exact sequence of events, but I think that while messing with the config to set it as the default desktop, it crashed and could only boot into recovery. It would probably have been an easy fix, but I really just wanted a working machine quickly.
The poor saps who have it recommended to them or installed for them.
It's a desktop environment theme at best, and could be packaged as such. Maintaining a distro for general consumption (it's pitched to new users as "Windows-like") is very hard work and carries a lot of responsibility.
Given the grave lapses that are described here, I hope it's obvious to you that absolutely yes, you should be using something else. I'll guess that Ubuntu might be your cup of tea.
"Obvious"? Now you are turning criticism from the thread into FUD. I used mint for a long time and really liked it. By issue with it was twofold: old packages and non-rolling release. But if those are not immediate concerns, I'd still recommend it to anyone wanting to try Linux.
You don't have to believe everything on the Internet, including me, but I wouldn't be so quick. Ubuntu's interface is so annoying that I've seen many people turned away from Linux because of it, and that's just a shame.
The issues described are not going to impact the average user, and some of these issues actually make Mint more convenient for the average user.
Note that this suggested title is outdated, as Mint was compromised twice - meaning it is not so brief anymore. The comment itself is more about the security practices.
More on topic: having your WordPress instance compromised is rather expected, given the number of security bugs WordPress gets.
The linked page is actually a comment on an LWN article with that title. Changing the submission title to that wouldn't reflect the content being highlighted here.
I don't want to have a lengthy meta-discussion about titles, but as someone who is not in the Mint community (either one), the HTML page title of the linked comment thread is more informative than the editorialized-by-excerpt text from the comment body.