One particular omission that caught my eye was OpenWRT descendants; there were quite a number of them a few years ago. OpenWRT is huge in the wireless router market.
Many people see the large number of Linux distributions as a negative thing. I don't agree. The rich diversity of Linux distributions is what fuels constant development and makes it possible for Linux to run on pretty much any piece of hardware out there and to handle any computing task you can throw at it.
There are even separate Windows distributions out there: in addition to the ones sold to regular customers, there are specialized editions for, e.g., point-of-sale computers. And of course, there are some custom hacked pirate Windows distros.
If this is too broad a question to answer, I'd appreciate it if someone could provide a link explaining this sort of basic stuff.
- Versions of software included
- Whether any "non-free" software is included
- How the directory structure is used (for instance, do you use /opt or /usr/local)
- Is there a package manager / ports system, or do you compile everything on your own?
- In the event of multiple solutions to the same problem (common), the distro often makes a default choice
- The GUI often has custom features, such as configuration applets
- The support culture around each
Several distributions (such as Ubuntu and its derivatives) are moving towards replacing init and related scripts and tools completely with upstart or similar options (http://en.wikipedia.org/wiki/Init#Replacements_for_init).
Generally, the biggest difference between distros is that some, such as Ubuntu, come largely preconfigured, with all necessities taken care of out of the box after an easy install, whereas others, such as Gentoo, let users configure their operating system down to the smallest detail and only provide the tools to help them do so. The latter distros generally require more savvy users, who will need to familiarize themselves with the more mundane differences between distros.
I think the most defining characteristic of each distro is usually its package manager or equivalent system, along with the number and quality of packages available for the distro. Most major distros have their own, and each determines a lot about the distro. Debian, for example, is known for apt-get (which Ubuntu borrowed), which is held in high regard for being relatively easy to use. Gentoo, however, uses Portage, which is slow because it usually downloads packages as source and compiles them before installation, but which provides a ridiculous amount of flexibility in configuring a system, right down to specifying the compiler flags packages will be built with. Both of these have large repositories of packages for users to browse and select from. Slackware, by contrast, has no package management system to speak of, but does have packages which you can download and install separately.
Yes. In practice, however, remember that Linux software is usually managed through a package manager. This means each distro packages software using its own conventions and formats. Some conventions (such as the freedesktop.org guidelines) and formats (both SuSE and RedHat use RPM, for instance) are shared among many distros; others are not. Also, many distributions will patch software specifically for their needs and may even build it with different compile-time options enabled.
how large is the difference between base versions (i.e., debian vs. redhat)
There are differences:
- between how they handle software packages, like I stated earlier
- in QA and release policies (what RedHat considers stable and Debian considers stable might be different even for the same package)
- in security features (SELinux, grsec, AppArmor, ProPolice, etc.)
- in default packages installed (PCLinuxOS ships with KDE, Ubuntu with Unity, Debian with Gnome, etc)
- in init systems (Sys V, systemd, upstart, OpenRC)
- in available support options
- in release schedules and methods (rolling release [Gentoo, Arch, Foresight] vs scheduled releases [Fedora, Ubuntu] vs "when it's done" [Debian])
- in governance: commercial (Ubuntu, SuSE, and RedHat) vs volunteer/nonprofit (Debian, OpenSuSE, Fedora, and most others)
- in legal and "ethical" standards (nonfree software availability, availability of software known to be infringing on US patents, nonfree drivers or firmware, etc.)
There is an attempt to arrange things logically, so even if you're unfamiliar with the init system on a specific distro, if you've worked with any other Linux (or even BSD) init system before, you should be able to figure things out pretty quickly. Other things, like SELinux, aren't immediately intuitive if you're not familiar with them.
The biggest advantage of the Debian (and derived) distributions is that they have the largest set of prepackaged "things," which makes most things an "apt-get" away. This is very convenient, but not essential.
Back to your question, prepackaged "things" must be installed on the distribution that prepackaged it (you can "cross the streams" to some extent, but a cross-distribution install has a finite, possibly large, probability of not running).
If you are willing to build the "thing" yourself, portability is pretty close to 100%: it will Just Work[tm] with a "configure && make" build. For complex "things" with dependencies, though, finding and building all the dependencies can grow geometrically into a major effort.
Also, it's interesting how many different flavors of Linux are there, suited for a single goal or for specific groups of users. Interesting examples of Ubuntu versions: http://ubuntuce.com/, http://distrowatch.com/table.php?distribution=sabily and http://www.planetwatt.com/.
1) They nailed package management early on. Apt doesn't feel revolutionary today, but in the 90s it was WAY better than what Redhat was using as its package-management core.
2) Debian's staunch adherence to their free software guidelines drives forks in two ways. First, you can fork without any worry that you're violating a license on some component. Second, it encourages people to fork, because some people will want to include some of those missing components in out-of-the-box installs, and that simply won't happen with Debian.
3) Ubuntu. If you look at derivatives, about half of them forked off of Ubuntu, so Debian gets to count them as descendants because Ubuntu is based off Debian.
And APT (with policy) still blows RPM out of the water; RPM's lack of an equivalent policy is a big part of that. I'd like to play with Mandriva, which uses the APT tools but the RPM format, just for comparison; but the fact that I can pull apart Debian archives using ar, tar, and gzip, as opposed to relying on an inconsistent binary format, has proven invaluable more than once.
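To make the ar/tar/gzip point concrete, here is a sketch that builds a toy .deb-shaped archive and then takes it apart with nothing but those everyday tools. The package name and contents are made up for illustration:

```shell
set -e
work=$(mktemp -d) && cd "$work"

# A .deb is just an ar archive holding three members, in this order:
echo "2.0" > debian-binary
mkdir -p control data/usr/share/doc/toy
printf 'Package: toy\nVersion: 1.0\n' > control/control
echo "hello from the payload" > data/usr/share/doc/toy/README
tar -czf control.tar.gz -C control .
tar -czf data.tar.gz -C data .
ar rc toy.deb debian-binary control.tar.gz data.tar.gz

# Pulling it apart again needs no dpkg at all:
mkdir unpack && cd unpack
ar x ../toy.deb                 # extract the three members
tar -xzf data.tar.gz            # unpack the filesystem payload
cat usr/share/doc/toy/README    # prints "hello from the payload"
```

Real .deb files from an archive mirror come apart exactly the same way, which is handy on a broken system where dpkg itself won't run.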
Ubuntu is not only based on Debian, it still regularly merges in packages from Debian.
They should make some kind of distinction between a release that is based off of the parent but then goes its own way, and a release that IS the parent plus additions. The latter relationship is much closer.
And actually it goes the other way too: Debian tries to merge changes from Ubuntu back into it. (They have a tool that automatically compares the two and suggests merges.)
Still very much the case. Their ass-backwards networking setup scripts, as just one example.
Also, Debian was not dependent on a corporate master, so by basing off of Debian you could be confident that the rug could not be pulled out from under you by a competitor.
The 3rd reason is that a lot of recent distributions are based off of Ubuntu, so they necessarily become Debian-based distributions as well. Basing off of Ubuntu gives you the best of both worlds: you get the cadence and corporate backing, but with less of the risk -- if Ubuntu starts playing hardball, you can relatively easily transition to a Debian base.
- simplified the installation process
- narrowed the focus - switching away from the perception of Debian being a 'jack of all trades, master of none'. To give an example, the initial release of Ubuntu only supported a couple of thousand packages on 3 platforms, versus Debian's tens of thousands of packages on 8 platforms.
- increased the release cadence
However, it ended up shipping late, and RedHat derivatives got a foothold. If it weren't for Ubuntu, it would never have fulfilled its original intention at all.
Also, since I'm commenting here, that chart shows a whole lot of completely dead distributions continuing onto the right. It would be a lot more interesting to see how long these forks actually lasted.
Every raster-image under the sun gets auto-scaled - and then it fails on the one format that was made for scaling...
AtheOS wasn't a Linux distro. It was written from scratch as a hobby OS based on POSIX.
Arch Linux was based on Linux From Scratch and took some inspiration from CRUX to keep things simple (I was one of the first volunteers who helped with Arch).
1) Install debian, but don't choose to install the desktop environment
2) Install the programs you want, e.g. 'sudo apt-get install xorg openbox chromium'
3) When you want to browse the web, run 'startx'
I've switched to running my personal laptop and work desktop this way. Granted I'm not a typical user, but it's surprised me how much of the modern linux desktop I can strip away and still have a pleasant experience.
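For anyone trying the setup above, the usual missing glue is a ~/.xinitrc so that `startx` knows what to launch. A minimal sketch, assuming the openbox and chromium packages from step 2:

```shell
# ~/.xinitrc -- what startx runs once the X server is up
chromium &             # launch the browser in the background
exec openbox-session   # the window manager becomes the session process;
                       # when it exits, the X session ends with it
```

The `exec` matters: whichever program you exec is the one whose exit tears down the session.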
Recent (and rare) release a few weeks ago (most updates are just pushed through distro-upgrade commands) : http://crunchbanglinux.org/blog/2011/11/27/crunchbang-10-sta...
The priority over time seems to be continued experiments in the art of minimalism and simplicity. I like that (I love that).
By default a gdm/gui/login kicks in at bootup ... I always kill that ASAP so I can "startx" or somesuch manually.
It's worth your time to try it! (You'd hear more about it but it's 100% word-of-mouth)
Sure: if I didn't have a 9-5 job I'd build everything myself on top of Arch or Debian ... I don't have the time and #! fills in the gaps for me.
Will definitely give it a try!
FYI, the latest ISO now uses SLiM instead of GDM.
Thanks for sharing about rcconf. I'd never heard of that Debian tool before.
The other is the Linux Router Project, which was a router/firewall distro that would fit on a 3.5" floppy disk. (http://en.wikipedia.org/wiki/Linux_Router_Project) I actually used it and was disappointed that it didn't really go anywhere. It was nice to be able to back up your firewall by copying a floppy disk :)
I'm not being facetious here, I really am curious.
> What are the main features of Sabily?
> The main software are: Zekr and Mus-haf Othman (Quran study tools), Minbar and Firefox-praytimes (prayer times applications), Monajat (application that popups prayers every predetermined time), Hijra (islamic calendar) and Nanny (parental control tool). Arabic language is also well supported. And of course the graphic design is also customized (see screenshots).
Unlike Jesux http://pudge.net/jesux/ I think they're serious.
Slackware and Yggdrasil soon followed as a way to clean up and remove bugs from SLS.
It took me a while to think how to find it, but I eventually got there by looking up what year it was first released and scrolling down that column.