Hopefully it does not become a thing like cryptocurrencies, where everyone wants their own coin. There are already enough distributions of Linux. What does this bring that KDE neon, Kubuntu, or even plain old Ubuntu with the KDE desktop does not do already?
To be honest, I appreciate attempts like Void Linux that try something genuine and bring enough difference to warrant their existence and support, but this really feels like someone wanting their own named distribution of what already exists.
For someone who does not know Void Linux, what does it bring that is genuinely new? From the webpage's description, it looks like Arch with a different init system.
To me e.g. Fedora's Team Silverblue and NixOS seem genuinely different, since they completely rethink how installations and upgrades are done.
IIRC it is by some BSD-style people who took some of the design ideas from BSD. LibreSSL is a huge plus for me in terms of security, and the more I learn about systemd the happier I am to get away from it. Didn't know about Musl, but that's nice too [ http://www.etalabs.net/compare_libcs.html ]
> What does this bring that KDE neon, Kubuntu, or even plain old Ubuntu with the KDE desktop does not do already?
Not much from what I can see, other than their 'Nomad Desktop', which just seems to be some custom desktop applets for KDE. I use KDE Neon myself; this doesn't look all that appealing to me personally.
It looks like their servers are struggling with the current load.
If you are curious about Nitrux:
> Nitrux is a Linux distribution based on Ubuntu suitable for laptops and desktop computers. Nitrux provides all the benefits of the Ubuntu operating system combined with a focus on portable, redistributable application formats like AppImages. Nitrux uses the LTS branch of Ubuntu as a basis using only the core system and then slowly building up to ensure a clean user experience. Nitrux is suitable for newcomers to Linux as well as *nix experienced users. Nitrux uses KDE Plasma 5 and KDE Applications; we also use our in-house software suite Nomad Desktop adding to the user experience.
Well, actually, they themselves describe KDE Neon as "not quite a distro".
> Is it a distro?
> Not quite, it's a package archive with the latest KDE software on top of a stable base. While we have installable images, unlike full Linux distributions we're only interested in KDE software.
Anyway, I recently switched my laptop over to KDE Neon after my faithful FreeBSD install broke.
I have previously used KDE-based desktops on Fedora and on openSUSE. (And a lot of other DEs and distros over the years but the point was to speak about KDE.)
For my laptop I have been very satisfied with KDE Neon. Still debating whether or not to run it on my desktop.
I also have been very happy with KDE Neon on my Dell Latitude 7370. The only downside is that when I plug in a 4K and a 1080p monitor, the system doesn't handle the mixed-DPI setup.
Thanks. Scrolled through the whole page to see if it’s a new distribution with interesting new tech, or just a desktop spin of something.
Who knows, maybe they enabled some power management or browser hardware acceleration out of the box, which would make it immediately better than most attempts at a general-purpose Linux OS. I saw no mention of that, only desktop environment features.
Speaking of which: I’m not too well versed on the current state of things, but isn’t ChromeOS just a Linux browser with power management and hardware-accelerated browsing? As far as I’m aware, neither Ubuntu nor Fedora nor any of the traditional distributions do that; instead they make desktop environments all day. Maybe it’s time for some market analysis?
I like the idea of a distribution where AppImages are the first class way of adding applications, but I was pretty disappointed in Nitrux. It's all the little things, like apparently not having a good way to display AppImage icons, and how hovering over the dock icons doesn't tell you what the hell they are. Why can't I point a widget at a directory of AppImages and get a custom menu? etc.
Christ, they might have even fixed some of this since I last used it, but I can't tell because they don't offer an ISO you can dump on a USB to try out.
"...this means that the ONLY way to use Nitrux is deploying the ISO image using znx. We DO NOT recommend that you flash the ISO raw to a storage device AT ALL, please use znx." [0]
Once again, a fancy website and lots of ambitious words built around a shoddily implemented half-baked idea.
Nothing at all to do with the database issues; the site works fine for me and I grabbed the ISO. It just doesn't actually work unless you use znx, as stated on the site.
So it sounds like Ubuntu with different default GUI apps. I would almost consider recommending something like that for brand new users but it would have to be rather established. If you have the slightest experience installing packages and editing the occasional config file you can switch your experience to match this fairly easily.
Vanilla Ubuntu tends to include the mid-range GUI, the one that performs OK and doesn't look too bad. Plasma should likely be the default unless you know you want a lighter-weight GUI. Unity and GNOME just don't cut it for serious multiple-monitor usage. I always find myself on Plasma or xfce4. One example: in Ubuntu 18.04 it is impossible to move the top taskbar off your 'primary' monitor. That is so ridiculous.
This thing is getting panned hard right now on HN, but I think it looks cool. I don't really like AppImage too much, nor Snaps or Flatpak, though. It's a terrible workaround for open source OSes not being able to provide a stable ABI. The UX is a bit nicer, though.
Still, a distro with nice, modern UX is welcome. I also enjoyed Manjaro for a while.
> I don't really like AppImage too much, nor Snaps or Flatpak, though. It's a terrible workaround for open source OSes not being able to provide a stable ABI.
It's a lot more than that. The point is that applications should be self-contained and should not depend on the host operating system, and can be installed and run in isolation without affecting each other unless you want them to.
Hopefully it will one day mean that app developers can make just one package that can be installed on all Linux distributions.
Isolation is one thing. We can accomplish that with kernel primitives we have today. Yeah, a distribution mechanism like snaps would be needed.
But snaps are simply doing too much. I should be able to download and run software like I can on Windows and, to a lesser extent, Android. The deployment mechanism, the sandbox, and the ABI issues all need to be solved separately. Boxing an entire distro in with each app is not necessary for isolation, not good for security, and generally not a good solution to the ABI issue that prompted it.
If it weren't for ABI stability problems, there would be no issue with still having the isolation of snaps while being able to update components of the base system like libssl.
On a side note, I really don't like how these things clutter the shit out of mounts. I appreciate that they do it for isolation, but the side effect is that merely having an app installed adds n mounts? That seems unsustainable.
> It's a terrible workaround for open source OSes not being able to provide a stable ABI.
Actually, AppImage would work significantly better if there were some stable and consistent base system it could rely on other than just the kernel. They're more a workaround for the limitations of package managers, which require package maintainers, can't handle installing to alternate disks, aren't portable, are fragmented as hell, etc.
Kenneth Reitz is the author of Python library Requests. After Requests exploded in popularity, he developed a somewhat unique approach to marketing his libraries, to the point where anything he promotes nowadays becomes controversial (i.e. “is it good because it is good, or because Kenneth writes about it all-cute-y?”).
What a strange sentiment. I've never once heard anyone call Reitz controversial.
I think the widespread usage of his libraries speaks for itself as to whether they are "good", especially considering they have standard library equivalents.
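To make the comparison concrete, here's a rough sketch of the same JSON GET request done with the standard library's urllib and with Requests; the endpoint and parameters below are made-up placeholders, not anything from this thread:

```python
import json
import urllib.parse
import urllib.request

import requests  # third-party: pip install requests

# Hypothetical endpoint and query parameters, purely for illustration.
URL = "https://api.example.com/search"
params = {"q": "linux", "page": 2}

# Standard library: build the query string, open the URL, read and decode by hand.
with urllib.request.urlopen(URL + "?" + urllib.parse.urlencode(params), timeout=5) as resp:
    stdlib_data = json.loads(resp.read().decode("utf-8"))

# Requests: query parameters, timeout, and JSON decoding are part of one call chain.
requests_data = requests.get(URL, params=params, timeout=5).json()
```

Neither is hard, but the Requests version is what most people end up writing by hand around urllib anyway.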
I'm a big fan of Arrow; it's far better than the standard library's datetime.
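For anyone who hasn't tried it, a minimal sketch of why Arrow feels nicer than bare datetime; the timestamps and timezone are arbitrary examples:

```python
from datetime import datetime, timedelta, timezone

import arrow  # third-party: pip install arrow

# Standard library: timezone-aware "now" plus simple arithmetic is already a bit wordy.
utc_now = datetime.now(timezone.utc)
an_hour_ago = utc_now - timedelta(hours=1)
print(an_hour_ago.isoformat())

# Arrow: the same ideas chain naturally, with extras like humanize() and friendly formatting.
print(arrow.utcnow().shift(hours=-1).humanize())                   # e.g. "an hour ago"
print(arrow.utcnow().to("US/Pacific").format("YYYY-MM-DD HH:mm"))  # arbitrary example timezone
```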
Some of what Kenneth builds is very good, some less so. Sometimes he just pulls annoying stunts that are disrespectful of the community at large, like with pipenv. With all that “Human” marketing, he has burnt quite a bit of the goodwill accumulated with Requests, but he’s also built a critical mass of developers who will basically adopt anything he makes regardless of actual merits; which in turn generates backlash. The two phenomena tend to feed on each other.
I found most of his projects quite useful. Sure, it might involve a bit of overselling, but the API design is really good for the most part.
Also - I don't think you can really sustain an open source project that brings almost nothing, except a really good API (which some people really value) to the table without good marketing.
There's nothing wrong with things in the linux community being well designed. Nor is there anything wrong with cutesy marketing, valuing aesthetics, a focus on GUIs, or making more distros.
This project looks very similar to the style Elementary took with their respin.
Kenneth Reitz aside, why is any of this a bad thing?
Just what the world needs... Another Linux distribution.
Seriously though, you need a lot more to sell me on a new distribution than "Ubuntu, but with AppImages". That's actually a large part of the problem of FOSS in general: We focus on the technical underpinnings rather than on the problems that we intend to solve. If the creators of this distribution want to address specific issues of the Linux desktop experience, they should do a better job communicating that, not least to attract contributors.
I was just thinking about focussing on technical underpinnings like you said, while reading their copy. Do I care that Nomad Firewall uses Qt? That's almost the first thing they say about it, and then later about how it's written in C++ with a QML user interface. That's what I'd expect for a Qt application these days, and it's also almost entirely irrelevant.
I do tend to look askance at things written in JavaScript that aren't destined to run in a web browser, but ultimately what I care about is whether it works, and I suspect users who aren't developers (that would be most of them) really only do care if it works for what they need it to do.
We keep promoting stuff as "ooh look this is written in Rust!" or "it's in JavaScript!" but the important thing should be the effects of those things - i.e. you might choose to write something in Rust for performance and reliability reasons, and to write something in JavaScript for portability.
The technical underpinnings aren't unimportant though, because most abstractions are leaky. Something built on an approachable foundation is more approachable when something breaks, so it's not strange that the underpinnings are mentioned.
We seem to have a cultural inclination to fork first and ask questions later; I would love to see FOSS push more people to contribute rather than coming up with yet another new product. The amount of person-hours spent writing the same code, over and over again, is staggering.
Ubuntu itself has already tried to move to AppImage-like package management, where you put all of a program's dependencies into the package. These "Click" packages were used for Ubuntu Phone applications. There were also "Snap" packages for more server-oriented use cases.
The whole deal is about software developers having a hard time managing dependencies, because they've got to update the code all the time. It's especially bad for proprietary ship-once-and-forget software.
Ubuntu has not changed the package format. To be used, it has to be tested. To be tested, it has to be used.
One that isn't snapshot images (Docker images, bundled Java/Ruby runtimes and libraries)
One that takes testing and integration seriously
One that takes security seriously
One that uses cgroups like they were intended
One that allows multiple versions
There are a lot of things that could come together for a new style and generation of package manager, but to do it right you would have to manage a whole lot of things.
Honestly not sure why this is downvoted. Would really like to hear from the contrarians since none of this strikes me as controversial.
I am curious why "snapshot images" are bad though. What's the fundamental difference between this and shipping a static binary (they're just a different image format, right?).
Hardly... the whole point of dynamic linking (and shipping or not shipping dynamically linked binaries) is to reduce the amount of redundant code on a system. (It also helps keep update sizes down.) If you're shipping application images then you're shipping copies of all your library code; you might as well have made a static binary, which would certainly be easier to work with.
The problem with dynamic linking is that experience has taught us that it is more trouble than it is worth in a lot of cases. It's great where it makes sense, like in base system components (but only if said components are stable so they can be relied upon), but otherwise it's just a headache. There's a reason static linking, Docker, etc. have become so popular.
> you might as well have made a static binary; it certainly would be easier to work with
Not all programming languages can produce static binaries. That's the point of an image format to begin with.
Reducing redundant code is a nice idea, but there aren't any systems^1 that solve for dependency hell and disk efficiency, and I'd rather trade a few hundred MBs of disk than fight with dependency hell.
^1: This isn't entirely true; Nix (and friends) do solve for this, but it's a relatively niche system and it's got its own usability issues that are (in my opinion) well worth sacrificing some disk to avoid. If these systems cross that threshold, then I'm happy to jump on that bandwagon.
As a user of Linux on the desktop since 2003, I haven't seen any "dependency hell" save for packages not designed for the current distro/version or software not properly packaged. Usually proprietary crap NOT in your distro's repos.
APT (on Linux) and MSI (on Windows) both attempt to solve for those factors. Both are also bedrocks of the operating systems they are designed for. Everyone else on this thread seems to be skipping right over both these names. Why?
Can't speak to MSI, but apt doesn't do a very good job. For me, this manifests in an error like "one of your packages is broken, run again with the -f flag to fix it" which has never ever ever worked for me ever. I've had some luck with aptitude in the past, but it often fails to resolve because you're only allowed one version of a package on your system at a time. Static linkage on the other hand is pretty close to foolproof.
OK, perhaps I should have said I’m not aware of any Linux package managers that support installing multiple versions of the same package, except for Nix, which manages it by munging the dynamic library lookup paths at runtime so the binary loads the right versions of its dependencies. Without this, I don’t think it’s possible to (correctly) support multiple versions, and even then the package maintainer must take care that multiple versions of the same package don’t wind up in the closure (or the loader will load the first it encounters in the search path).
Security at the package manager level is closing the barn door after the horse has gone. As far as I can see you can't have a useful system and be able to install malware safely. The best choice continues to be curation.
Testing is logically discrete from package management; it's entirely possible to test well or poorly with any system.
For several of the other points, Nix (also featured on the front page right now) seems interesting.
Does this new package management system mean that each package is deployed with its own version of dependencies? Or are dependencies only installed once, in a central location, like they are with 'traditional' Linux package management systems?
I see it as evolution, they all explore the feature-scape and find interesting and new ways to interact with your computer. Or they find new development models or new ways to make money. Some distros are short lived but their ideas may live on as things that stick and increase in frequency in the next generation of distros. I doubt KDE would look as it does without Gnome and vice versa. Ubuntu was also at some point yet-another-distro. Etc.
They are doing a thing that they care about. Perhaps others might care about it, too. You clearly do not. Therefore, you're not the target market. Why complain about people doing a thing?
They refer to themselves and some of their software as NX multiple times. Nitrux OS is the full name of NXOS, just like Red Hat Enterprise Linux is for RHEL.
Uses rpm-ostree for atomic updates, with desktop applications run in Flatpak containers and development done in VMs or Docker/OCI images.