Smart move by Valve IMO. Canonical is offloading a lot of work onto them if they continue down this road, and Valve easily has enough sway to convince people to use a new distro.
I think it'd be a huge mistake for Canonical not to reverse direction on this. The original decision was defensible when it wasn't obvious how many problems dropping 32-bit support would create, but it's clearly a big issue for a lot of their customers.
Why would Canonical care? They don't get any money from desktop users. Canonical gets to scrap 32-bit support on their desktop and server offerings. Why would they want to keep spending that money just to put off the day they abandon it? Where's Steam going to go? How can they be certain that whatever distro they switch to won't eventually abandon 32-bit? (I mean, I guarantee they will.) Will they then go back to Ubuntu? Or a third distro, then a fourth? Why don't they just fix their problem now and get it over with?
They may not make money from desktop users, but all their historical actions indicate that the desktop is very important to them.
> Where's Steam going to go? How can they be certain that whatever distro they switch to won't eventually abandon 32-bit? (I mean, I guarantee they will.) Will they then go back to Ubuntu? Or a third distro, then a fourth? Why don't they just fix their problem now and get it over with?
Well, look at it from this perspective: this is hard to solve technically, and the project they rely on to make it possible (Wine) seems to be at a loss on how to fix it. If Valve doesn't make much money off selling Linux games, they're not going to invest resources to fix a problem Ubuntu has just created. The big loser here is Ubuntu users. And the "fix" involves recompiling 32-bit Windows binaries, which, practically speaking, the companies that only ever released Windows binaries in the first place are never going to do. This isn't about games, this is about software compatibility, and Ubuntu just announced they don't care that much about it.
Valve doesn't care as long as SOME distro supports what they're doing, and there's definitely going to be some distro that says, yes, supporting gaming is worth it to us. Also, if you care about Linux on the desktop, companies like Valve are who you WANT on your side. The last thing you want to do is alienate a company like that, and have other companies say: "well, look what happened when Valve tried to support linux"
If you're building a platform, which is what Ubuntu is, the last thing you want to say to potential users is: "you can't do what you want here". Games may not make Ubuntu money and they may not be on the radar of a lot of users, but being "the platform that's bad for gaming" and "the platform that can't run Wine apps" is a bad look.
Also consider who has the leverage. Valve is offering a massive catalog of games that can't be easily obtained any other way. Ubuntu is competing with other distros, Windows, and older versions of their own software. And they have virtually zero lock-in -- if I want to use a different Linux distro it's pretty easy. If I want to get the games that Steam offers some other way, on the other hand, in most cases I'm SOL.
So Ubuntu has essentially zero leverage here, and worse they're shooting themselves in the foot for no reason, since nobody forced this on them. As the article states, Debian still supports this, so in a sense they're going out of their way to make life more difficult for their users.
A majority of Windows games can be played more or less flawlessly at near native performance on Linux via Steam's Wine-based Proton compatibility layer. This will break with the removal of 32-bit libraries in Ubuntu.
So Valve just clones the libraries they need and redistributes them via the Steam repositories. They already have the CI infrastructure for Steam itself and the Steam Runtime, a giant blob of set-in-stone libraries.
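That vendored-libraries approach is roughly how the Steam Runtime already works: the launcher resolves shared libraries from a bundled tree before the system ones. A minimal sketch of the idea in Python, where `runtime_root` and the directory layout are illustrative, not Valve's actual structure:

```python
import os

def pinned_runtime_env(runtime_root):
    """Build an environment that resolves shared libraries from a vendored
    runtime tree before the system ones (LD_LIBRARY_PATH entries take
    precedence over default library paths). Directory names here are
    illustrative, not Steam's real layout."""
    env = dict(os.environ)
    lib_dirs = [
        os.path.join(runtime_root, "lib", "i386-linux-gnu"),    # 32-bit libs
        os.path.join(runtime_root, "lib", "x86_64-linux-gnu"),  # 64-bit libs
    ]
    old = env.get("LD_LIBRARY_PATH")
    env["LD_LIBRARY_PATH"] = os.pathsep.join(lib_dirs + ([old] if old else []))
    return env

# A launcher would then do something like:
#   subprocess.run([game_binary], env=pinned_runtime_env(runtime_root))
```

The catch, and the reason this doesn't fully solve the problem, is that the dynamic loader itself and glibc still have to exist in 32-bit form on the host, which is exactly what Canonical proposed to drop.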
Indeed. If you look at their website now they've gone from showing desktop Ubuntu front and centre to hardly acknowledging it except as an OS that a developer might want to run.
That's not true, as much as people want to believe that.
Ubuntu and Canonical jumped on AWS quickly and started generating supported images that people liked. Those images gave people fresh software they couldn't really get with Amazon Linux, and RHEL was still stuck in a 90s enterprise pricing model, so Ubuntu was a natural fit there.
Sadly, most Ubuntu users are on Mac clients, and Canonical isn't going to care about one laptop when they need to spend the money supporting the millions of instances that laptop is spawning on AWS, Azure, and GCP.
The few "Linux desktop" people who are left will still have the same desktop; as long as they get their vscode, docker, cloud tools, and a terminal/ssh, they won't care much.
1% of 90 million monthly users is almost a million users.
Supporting something with 900,000 users seems, at the very least, somewhat compelling. Especially since they were already doing it. It's not like they have to actively recruit these people, but what they're definitely doing is telling them to f-off.
Also, this isn't just about losing Steam: Wine apps with 32-bit installers are a major source of software.
I know personally I actually do want to use a Linux distro now that Unity 3D support is coming, and Ubuntu would have been my first choice, but I know I'm definitely going elsewhere now :-/
If anything loss leaders are more important for platforms, since learning a platform is a big time investment and people will reach for what they know when the time comes to choose something.
I can't claim to know if this was the right economic decision for Canonical because it's a very complicated question, but what I can say with certainty is they just gave 900,000+ people a reason not to use their platform, along with sending a strong signal to a lot of observers that might be thinking about investing time in their platform. Is the engineering time they save going to gain them something that makes them attractive to 900,000 new users? That's a lot of people to appeal to.
Canonical gave up on the desktop a long time ago, following in Red Hat's footsteps, because those loss leaders were dragging the company down into insolvency.
I'd say it's a stronger signal that they're a smart company that knows when to say enough is enough. 900,000 is objectively a large number of users, but if those engineering resources are better put to use serving the larger customer base, it's simply a good business decision.
I get pissed every time a service drops support for something I haven't updated in 5 years, but as a business owner, I also actively discontinue services that cause more headache than revenue. It's simply specialization.
Valve still needs an effective hedge against Microsoft and Apple walling off their desktop/laptop platforms the way virtually every other platform is presently walled off.
Convincing someone else, like Valve, to do some or all of the work would have avoided the negative PR that Canonical ate.
By the time the third OS comes into play there will probably be enough shims that Valve's work will be drastically reduced. Valve is playing the long game in everything; that is their operating style. Which seems strange when almost every other company is all about short-term profits.
So if they're playing the long game, it seems like they would have fixed their 32-bit issues a long time ago. From a software engineering perspective, relying on 32-bit libraries looks like a big pile of tech debt.
I've been experiencing a similar issue trying to get people to migrate off Python 2.7. And managers keep prioritizing features over the "long game".
Now if you say they're prioritizing the platforms with the larger customer bases, that makes more sense. And it's not the long game, it's just enhancing the profitable side.
The vast majority of those 32-bit Windows games are not written by Valve. Valve in fact provides native Linux binaries for its own games (though they might still be 32-bit).
Nobody is going to recompile thousands of 10 year old windows games just to please a bunch of gamers on Ubuntu.
They did just finish a project that lets most games be played on Linux. Odds are they know what that tech debt looks like, and it isn't something that can be paid down on any reasonable time frame.
I can't think of anything common it's an issue for outside of Wine and Steam, both of which have already been snap-ified before. For the corner cases of legacy 32-bit proprietary software that requires 32-bit system libs, 18.04 will be supported until 2028.
Other distros have already gone this route without the world burning down.
True, but Wine and Steam are very common use cases on the desktop. While both applications could probably go the snap route, I'm not sure they'd be up to taking on the entirety of 32-bit stack support across all platforms, which is effectively what they'd have to do (maintaining the 32-bit stack only on some platforms while using the provided versions on others would be more work for them).
Canonical has, yet again, made a mistake here if they care about the desktop at all. Sure, 32-bit x86 hardware is pretty much obsolete. All 32-bit software? Not so much. Seems like the reasonable compromise would be to identify the subset of 32-bit packages/libraries needed to support Wine/Steam and continue to maintain those rather than the entire 32-bit version of the distro.
> ...Valve easily has enough sway to convince people to use a new distro
If Valve supports some distro, then it'll be possible to put that in a container and run it on Ubuntu. That part could probably be automated, and users may not even notice the difference.
I don’t know enough about the underlying technology to say if that would work well, but given the level of hardware access games need that sounds really dicey. Ubuntu is a fine distro, but it doesn’t really offer anything terribly unique you can’t get elsewhere, whereas Steam definitely does.
This is actually very much possible, and will not cause much of a performance degradation (if any).
You can easily directly expose the underlying hardware to a container, and it is in fact very much possible to run games from a container.
And I say this as someone who ran games on NixOS inside an Ubuntu container as a workaround (because nix is pretty weird when it comes to dynamic linking).
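For the curious, the core of the trick is just handing the container the DRI render nodes and the X socket. A rough sketch of the kind of invocation involved, as a Python helper that assembles the command line (the image name and mount points are made-up placeholders, not a real published image):

```python
import os

def containerized_steam_cmd(image="some-steam-runtime:latest"):
    """Assemble a docker invocation that exposes the host GPU and display
    to a containerized Steam. The image name and in-container paths are
    hypothetical; a real setup also needs matching GPU userspace drivers
    inside the image."""
    home = os.path.expanduser("~")
    return [
        "docker", "run", "--rm", "-it",
        "--device", "/dev/dri",                     # GPU render/display nodes
        "-e", "DISPLAY",                            # forward the host display
        "-v", "/tmp/.X11-unix:/tmp/.X11-unix",      # X11 socket
        "-v", f"{home}/.steam:/home/steam/.steam",  # persist Steam data
        image, "steam",
    ]
```

Since the container shares the host kernel, there's no hardware emulation involved, which is why the performance cost is negligible.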
From Canonical's perspective, how important is gaming? If it's costing them a lot of time and money to support it, and bringing in approximately zero revenue, is it a mistake to drop it?
It's not all about revenue. Ubuntu has generally had the reputation as the most user friendly and easy to use flavour of Linux for years now. Not always justifiably, and they've made some questionable decisions lately, but it's still been the obvious recommendation for new Linux users, and the one people are most likely to have heard of. A pretty large percentage of technically minded desktop PC users (eg, those most likely to install Linux) are going to care somewhat about gaming.
If someone asks the question "what distro should I use for gaming on Linux", the answer is usually "any is fine, but Ubuntu's probably the safest option". With this decision, the answer becomes "literally anything but Ubuntu". Ubuntu became popular because of good sentiment and word of mouth. And the same geeks trying out Ubuntu on their laptops are the ones who end up making the decisions on what distro their company should use on their servers/VMs too.
I don't know how much general popularity really helps them, maybe it really is irrelevant, and if everyone stopped using it on desktop tomorrow they'd still retain their popularity in server/cloud. But I feel like the only reason they became popular on servers is because they're popular on desktop. There aren't many reasons to use Ubuntu over RHEL or some variant on servers, except that more people will go "oh yeah, I've used Ubuntu before, I know how to use apt".
Well, the difference between distros is pretty marginal as an end user. If two of them are roughly equivalent but one offers the massive Steam catalog, that seems like a major selling point. They might not directly make money off of it, but they could lose users over it.
One big issue from what I read are installers for Windows software running in WINE. The installers are often still 32bit, and without 32bit support they won't work anymore. So even if the actual Windows application is 64bit, you can't run it in WINE because you can't install it.
This doesn't really surprise me, one big specialized software I used a while ago couldn't be installed under Windows 7 64bit because the installer still was 16bit. I don't think anyone wants to touch a still working installer unless they have to, so they stay behind in technology.
To add to that, the reason these installers are 32 bit isn’t necessarily legacy or laziness. If you’re packaging your software to run on both 32 bit and 64 bit it makes sense to make your installer run on the lowest common denominator and install the appropriate version, as opposed to providing two installers and leaving it up to the user. So 32 bit is the obvious choice. On windows this isn’t a problem because Microsoft rarely drops support for things.
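Whether a given installer or executable is 32- or 64-bit is easy to check mechanically: the Machine field in the PE header records the target architecture. A small sketch (field offsets and magic values per the PE/COFF format):

```python
import struct

# PE Machine field values (from the PE/COFF specification)
MACHINES = {0x014C: "32-bit (i386)", 0x8664: "64-bit (x86-64)"}

def pe_bitness(path):
    """Return a human-readable architecture string for a Windows PE file,
    or None if the file is not a PE executable."""
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":                 # DOS header magic
            return None
        f.seek(0x3C)                           # e_lfanew: offset of PE header
        (pe_off,) = struct.unpack("<I", f.read(4))
        f.seek(pe_off)
        if f.read(4) != b"PE\x00\x00":         # PE signature
            return None
        (machine,) = struct.unpack("<H", f.read(2))
        return MACHINES.get(machine, hex(machine))
```

Point it at the `setup.exe` files in a download folder and you'll see how common the 32-bit answer still is.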
This is kind of the same reason 16-bit installers were a thing for NT software. In the mid 90s when NT ran on four different processor architectures all of the non-x86 platforms had emulation for 16-bit x86. So MS recommended that you bootstrap your install with a 16-bit process that checked it was running on a 32-bit platform (i.e. not 16-bit Windows) and then launched the appropriate 32-bit installer for the CPU.
FWIW Wine can also install 32bit software with 16bit installers since it does support protected mode 16bit code (which is the vast majority of 16bit Windows code anyway).
Also on modern 64bit Windows you can use https://github.com/otya128/winevdm/ to run some 16bit programs. It is hit and miss, but most 16bit installers work fine.
Worth noting that removing 16-bit support in Windows is an enormous security win, because the DOS stuff relied on things that are really bad for security. 32-bit support is a lot more sustainable.
Wikipedia suggests 16 bits of Address Space Layout Randomization can be brute-forced in minutes, citing this source:
Shacham, H.; Page, M.; Pfaff, B.; Goh, E.J.; Modadugu, N.; Boneh, D (2004). On the Effectiveness of Address-Space Randomization. 11th ACM conference on Computer and communications security. pp. 298–307.
Nobody expects ASLR from 16-bit legacy code with only 1 MB of address space. I think you are describing 16 bits of randomization on a 32- or 64-bit process, though.
What I mean is that the virtualization of 16 bit code (using vm86) is isolated from a security perspective, it won't violate process boundaries or kernel security features like ACLs. I believe this was less true in 9x which might be where the confusion comes from.
What sort of security critical software will run as a 16bit Windows program and be the target for an attack accessible from wherever those attacks enter the system?
Dunno about on Windows, but the Linux kernel's support for 16-bit code has led to some nasty, subtle privilege escalation exploits due to the really gnarly processor state changes required.
It's been a few years since I wrote one, but at least back then Windows installers were an amazing mess. We used WiX as an abstraction on top of the underlying libraries (which at the time were still Win32). I'm not sure if any progress has been made since, but it was a pretty painful experience, especially localization.
Every Proton incompatibility I've faced so far has been something like this. e.g. it can't boot because the DRM doesn't work, or the paths don't resolve properly. If I can get to the options screen it's plain sailing.
32-bit closed source software won't suddenly be recompiled to satisfy some open source purity standard. People will stop using Ubuntu for games, and "Linux has no games" will be a strong argument again. Gaming is a huge issue on Linux that was only recently solved with Steam+Proton, and the Ubuntu team just decided to screw with it.
Play stupid games, win stupid prizes.
Mir is actually still alive and being developed at Canonical.
Ubuntu phone (the software stack of Mir and Unity8, running on Ubuntu, on phones, as opposed to a phone backed by Canonical) is still alive, usable and arguably the most mature of the FOSS "GNU/Linux on a phone" projects.
And in a few years you can just switch to Flatpak or whatever it is that wins out; it's not like you're locked in for life just because the packages exist as snaps today.
Upstart was originally developed by Canonical. Several years after it shipped in Ubuntu, it was shipped in RHEL6. RHEL7 replaced it with systemd.
While it probably had some Red Hat contributions, calling it a RHEL thing doesn't make much sense.
Now that the WinRT scare has passed and Valve no longer fears Microsoft imposing an App Store-style hegemony over Windows, Valve has little external incentive to continue maintaining tools targeted at Linux users. The life raft is no longer strategically valuable, and 19.10 is a convenient cover for spinning down Ubuntu support. I wouldn't be surprised if this is followed up in a few months' time by an announcement that the internal Linux team is being dissolved altogether.
I wonder if this statement is mostly an attempt to get Ubuntu to reverse course. It seems unwise to pull support from the most popular distro if it could be supported with additional technical effort, such as bundling the libraries with Steam. Of course that's effort that Valve would likely prefer to spend elsewhere.
It almost certainly is but it's unclear whether it will work. Canonical is a surprisingly small company. Its revenues were $110M last year[0]; compare this to Valve, which had an estimated $4.3B in revenue in 2017 or Red Hat, which had $2.9B in revenue last year. Canonical has to keep a sharp eye on staying profitable and, the last anyone heard, their main focus was on their server distribution[1] because the bulk of their income was from server support contracts. Expending manpower on continuing to support 32 bit binaries probably isn't worth what it costs them regardless of what pressure Valve tries to apply.
I think lots of people are missing the clear economic imbalance here.
If you're gaming on Linux and you buy a game:
The creator of the game gets a cut, the publisher gets a cut, Valve takes its cut, Nvidia/AMD/Intel got a hardware purchase in there at some point, and the porter (Feral or whoever) gets a cut.
The Linux distro gets .... absolutely zero. On top of that, when things break, people blame the distro. I was at Canonical when we founded a community-run PPA to maintain fresh Nvidia drivers, and we couldn't even secure the funding for _3_ GPUs for volunteer developers; we had to apply for community funds. All that so a handful of volunteers can do _years_ of free engineering work that is clearly Nvidia's job. They wouldn't even spot us 3 GPUs for the effort!
It's sad; there are people on the Ubuntu desktop team who have been busting their asses for 15 years trying to make the desktop a thing. It's just not sustainable unless people start overwhelmingly buying a ton of Linux games.
> On top of that when things break, people blame the distro.
That's because it is often the distro's fault. That's one of the reasons it has been historically such a goddamned pain to port software to Linux: there are bunch of distros and no standards you can rely on. It's a non-platform.
Sure, and the only way to fix that is with more QA, which, as it turns out, no one wants to pay for (on the desktop).
EDIT: And even if you have better QA and the distro is rock solid, when that binary blob Nvidia gives you that your engineers can't fix breaks, you'll still get the blame.
> ...when that binary blob Nvidia gives you that your engineers can't fix breaks, you'll still get the blame.
The more things change, the more they stay the same. When MS was getting sued for underspecing system requirements during the Vista era, one of the interesting bits that got revealed to the public during the trial is a chart of Vista crashes/lockups by source:
The biggest single source? nVidia at just under 29%. Another 9.3% was attributable to ATI (now AMD). Only about 18% were directly attributable to MS itself. Video driver crappiness was reportedly a huge motivation for Microsoft's switch to the WDDM video driver model.
I think honestly Canonical is just "pulling a Red Hat". Red Hat walked away from the consumer market 20 years ago to focus on the server market (Fedora came much later). Server is where they make money. They don't get any money out of the video game market. The users don't give them any money. Neither does Valve. Nor the game developers. Nor the hardware developers. They need money to develop and maintain stuff and grow. They might reconsider if there is push from the server customers, though.
My employer uses Ubuntu for most developers, meaning all our software is designed to work on the latest LTS Ubuntu and we (I assume) pay a significant amount for support. But the only reason we use Ubuntu is because it's more familiar to the average developer than RHEL.
Right. Valve wants gaming on Linux as a hedge against Microsoft. They are swimming in money from selling cosmetic items in TF2 and Counterstrike and DOTA. So they can subsidize a bunch of Linux development that has no short term revenue value because they view a single dependency on Windows as a threat to their Steam vig revenue.
Canonical makes no money supporting Linux gaming. Their customers, their actual paying customers, want servers. So that's what Canonical is going to focus on.
I remember several years ago thinking that RedHat was the be-all end-all of user-friendly Linux distros. They were on a nice track back then and I could see them taking the Linux market by storm at one point. Then they started going off into this weird enterprise tangent, and that's about when Ubuntu came in and swept up vast amounts of the market, eventually including a lot of the enterprise folks.
Ubuntu, too, has made some rash decisions at times. They clung to their sluggish, developer-unfriendly Unity UI and randomly kicked ffmpeg out of 14.04 for political reasons, among other things. These kinds of rash decisions cost them every single time.
I understand you. I was a RedHat user during 98-2002.
At that time RedHat was the king of Linux desktop, and “the year of Linux desktop” was predicted multiple times.
But in retrospect, I think Red Hat made a good move. Where is Mandriva/Mandrake Linux today?
Even SuSE, which is doing fine, didn't have Red Hat's growth.
It was sad for users hoping for the year of Linux desktop that never materialized; but a good business decision for RedHat.
To some extent Mac OS X was the "year of the *nix desktop". Not Linux, but it could just as easily have been a Linux kernel instead of BSD if they had wanted.
Android is the Linux phone. 3 billion people are running a Linux kernel in their pocket right now.
It did happen, just that it didn't unfold in the way that RedHat users originally imagined.
Sure, I don't think we should stop at Android. The reason Android and MacOS won market share is because those companies built an ecosystem around it, including hardware, and very importantly the end user never needing to touch the command line. They built a lot of exactly what pure Linux lacks.
Ubuntu is pretty far from that. I still have to do quite a bit of command line fuss to setup and maintain my Ubuntu system. Fine for me as a dev, but not fine for mass adoption.
That "weird enterprise tangent" is a billion-dollar business, and no matter what the market share is for Ubuntu in the enterprise, the revenue share is very, very much in Red Hat's favor.
Are there numbers on this? I wouldn't be surprised if enterprise Ubuntu support is already the new thing, seeing as so much enterprise-friendly hardware is already very much Ubuntu first. NVIDIA's machine learning products, for example, are mostly designed for Ubuntu, and most server-side stuff these days tends to be released with .deb packages first.
I wouldn't be so sure of that. Ubuntu gets a cut every time somebody spins up an Ubuntu VPS on AWS. Yes, Red Hat has more revenue, but Red Hat is a lot more than Linux these days.
Red Hat is a lot more than Linux, but that revenue disparity is huge, and I bet they are licensing out RHEL to everybody who buys other stuff from them.
The distribution has been slightly modified to work on EC2. The Ubuntu license allows you to do that and still distribute it, but not under the Ubuntu trademark.
I'm fine with this, but it is a bit annoying to see a developer not deal with 64-bit builds. I wonder if it's only because Windows lets them be lazy while generating more money for them?
Either way I think Valve's Linux team should rethink this. They depend on 32-bit libs, but surely they can find alternatives for the 64-bit build, and who knows, it might even make the experience on Linux better.
My guess is that there are a lot of third-party games that only work on 32-bit-compatible systems, and Valve either doesn't know exactly which ones they are, or simply doesn't want to put in the effort to catalog them so users can be warned about which specific games have incompatibilities. The other option is letting those games fail with cryptic error messages, then fielding masses of complaints while looking extremely unprofessional. In that case it's not surprising that they would just decide to drop support, especially considering that Ubuntu is a relatively small chunk of the market to begin with.
It's pretty unfortunate. I totally get Canonical's desire to get rid of the seemingly unnecessary bloat and maintenance burden of supporting 32-bit applications, but this is the consequence, and it's a big blow to gaming on Ubuntu, especially seeing as Ubuntu has been Steam's officially supported distro since 2013.
Ubuntu makes its money from server and enterprise clients. If Valve needs to support 16/32 bit for its customers then it's up to them to do it. I expect Valve is a much bigger company than Ubuntu, maybe Valve can contract out this work to ... Ubuntu instead of just freeloading.
Valve has contributed a ton of work to Ubuntu/Linux. They’ve pretty much single handedly turned around Linux gaming to the point where I can safely expect most windows-exclusive games on steam will run smoothly on Ubuntu. Freeloading couldn’t be farther from the truth.
This is not fair or true. Wine developers did a lot of the work, and the big recent progress happened because Vulkan drivers allowed a better reimplementation of DirectX 11; DirectX 9 worked fine before Proton and Vulkan.
Valve made a big contribution recently, but it's not correct to give them all the credit.
> They’ve pretty much single handedly turned around Linux gaming to the point where I can safely expect most windows-exclusive games on steam will run smoothly on Ubuntu.
Steam being 64-bit is a minor issue really; the problem is that the vast, vast majority of games, on Steam and Windows/Proton and everywhere else, are 32-bit. 64-bit being the majority of new stuff is really mainly a Linux/Mac thing.
On Windows (and thus Wine and Proton), most stuff (I'm including everything here, not just popular applications) is either 32-bit only or a 32-bit/64-bit hybrid; 64-bit only is very rare. Even in games, where you are more likely to see 64-bit, it is usually the big AAA games that need it; most smaller games are 32-bit, even stuff released recently.
As an example, here are some recently released games from my GOG account: "Corpse Party: Sweet Sachiko's Hysteric Birthday Bash", "Tsioque", "Dex" and "Pillars of Eternity". I'd list more but I don't have much free disk space to check the exe files (and my current computer is weak so I don't buy new games much), but these are games released in the last 3-4 years or so (the first one was released just a couple of months ago) and they are only available as 32-bit binaries.
In terms of non-gaming software, almost everything out there on Windows is 32-bit. In general, if a program doesn't benefit from being 64-bit (that is, doesn't need to use a lot of memory), chances are it'll be a 32-bit executable.
And of course these are just recent things. On games alone, my GOG library is ~550 games, of which only a tiny few provide 64-bit binaries (and I expect my Steam library to be similar).
On a personal note, whenever I release something on Windows, unless I have a reason for it to be 64-bit, I release it as 32-bit: it will run on more systems (be it natively on older computers - note that netbooks/ultrabooks with 32-bit Windows were sold until recently - or via VMs) and use less RAM (I think I read recently that there was an attempt to make something like x32 on Windows that would solve this, but I can't find it anywhere, and anyway that would only work in future Windows versions, whereas a 32-bit program will work practically everywhere).
32-bit code, at least on Windows, will be here for pretty much as long as a CPU in the 8086 lineage exists. And for as long as you want to run Windows code on non-Windows OSes, you'll also need those OSes to support 32-bit code.
I think Ubuntu needs to rethink this. I am a software vendor and ship software on Linux and I had to switch to 64-bit because 32-bit was too much of a pain in the ass.
I couldn't afford to let go of 10% of the users who are on Linux, but Valve can. Which means I will use Ubuntu less for fun, and more for just work.
>I'm fine with this, but it is a bit annoying to see a developer not deal with 64-bit builds.
The developers have often just moved on. And when these games were made, 64bit didn't even exist or it wasn't worth considering. Deadline constraints and such mean corners are cut, and portability across architectures often isn't considered a priority.
If they are dropping Ubuntu 19.10 over lack of 32 bit support, it's probable that they're going to drop support for Catalina and up as well. I don't think they've made a statement yet, however.
Dropping support for Catalina would be akin to dropping support for macOS entirely, while dropping support for Ubuntu is not the same as dropping support for linux.
In fact Valve already have their own distro, and it's not based on Ubuntu.
Hmm, I've done some googling and it looks like Valve has a 64 bit version of Steam for macOS already. You may need to delete Steam.app and re-install to get it. So you will still be able to use Steam itself on Catalina+, but will need to wait for 32bit only games to have a 64 bit version made available (or for someone to come up with some kind of compatibility tool)
Quite a lot of games on Linux rely on Wine to run. They are 32-bit Windows games packaged up with a 32-bit version of Wine. The 64-bit version of Wine still has quite a few bugs and really shouldn't be used for commercially sold software. Valve is making the correct decision here based on Ubuntu dropping 32-bit support.
I’m not sure why Valve doesn’t just ship a 64-bit client with their own compatibility layer(s).
There are a lot of problems on various Linux distros hunting down and setting up the correct 32-bit libs. Windows is fine for obvious reasons, but Valve already seems to want to move to a more platform-independent approach.
Valve already ships the equivalent of a whole 32-bit Ubuntu 12.04 inside Steam for games to use.
I would really like it if Valve could move Steam to Flatpak or something similar instead. Their bundled libraries are a mess, and a lot of the time they don't work correctly.
Someone even made an unofficial flatpak for Steam, too. It's on Flathub and it works great for my needs. Makes perfect sense with the way Valve packages Steam already, since they try really hard to provide the same environment on every distro. The only problem I have with it is you can't easily add additional library folders due to the sandboxing. (Need Valve themselves to add portals to the application code so Steam can pop up a native file chooser which grants access to whatever file the user chooses, and there's no directory tree portal anyway … yet). But that's solvable. Much better, long term, than Valve stubbornly clinging to whatever distro continues to build for an obsolete architecture at their own expense.
I'm running Steam on a 64-bit Ubuntu 18.04. What changes with 19.10? I figured Canonical was only announcing the end of the i386 installer; presumably this is not about that?
This kinda sucks for me, as a Linux based developer who’s also a gamer. I’ve been gaming on Steam with the Proton compatibility layer, and most of my Windows-only games work flawlessly. If future Ubuntu upgrades break this, then I won’t upgrade until I find another solution.
Which of openSUSE and Fedora is the more community-driven? I'd rather see them break away from targeting a Linux distribution that is mainly funded by a company that gets its income from servers/cloud. They need to go either with a distro that has a desktop-oriented company behind it or with a distro that is community-driven.
Valve is currently maintaining SteamOS which is based on Debian, so why wouldn't they want to go with that? The Steam client is already in the Debian non-free repository.
Correct. Fedora was the distro I switched to. I found the standard WM (technically DE, but I'll allow it) to be slow and clunky, and its performance improved with Cinnamon. I'm more of an openbox guy though.
If this does happen and Valve pulls support for Steam on Ubuntu, I'm probably going back to Windows. This news is depressing since I've been a happy Ubuntu user for about a year now.
Many people, myself included, play games that require very low latency (Overwatch, CS:GO) at resolutions (1440p or 4K) and framerates (144 fps) that I don’t think streaming will be able to handle.
The first consumer 64 bit processors came out a decade and a half ago. Why is legacy 32 bit still a thing? It's not like anyone can say they haven't had ample warning time that 32 bit deprecation was coming.
Video games are more media than they are software. Sadly their format is software but their release is media. Old games aren't actively supported and won't be updated. You need to ensure some kind of compatibility with 32 bit software or drop a huge amount of support for games. And if you don't, the video game industry will probably drop you.
I think you will probably need some kind of compatibility wrapper you can wrap 32 bit binaries in that will make it compatible with 64 bit systems.
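A wrapper like that would first need to know which architecture a binary was actually built for. As a minimal sketch (not any real Steam or distro tooling), the ELF header's EI_CLASS byte at offset 4 is all it takes to tell a 32-bit executable from a 64-bit one:

```python
import os
import tempfile

def elf_class(path):
    """Return 32, 64, or None if the file is not an ELF binary.

    A hypothetical compatibility wrapper could use this to decide
    whether to launch the program inside a 32-bit runtime (chroot,
    container, bundled libraries, etc.) or natively.
    """
    with open(path, "rb") as f:
        header = f.read(5)
    if len(header) < 5 or header[:4] != b"\x7fELF":
        return None
    # Byte 4 is EI_CLASS: 1 = ELFCLASS32, 2 = ELFCLASS64.
    return {1: 32, 2: 64}.get(header[4])

# Demo on a fabricated 32-bit ELF header (only the first 5 bytes matter here):
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x7fELF\x01")
    fake = f.name
print(elf_class(fake))  # -> 32
os.unlink(fake)
```

The hard part, of course, isn't the detection; it's providing the 32-bit runtime to hand the binary off to once the distro stops shipping one.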
This. Games are done when they're done; much like finished novels, films, or music, once the developer considers it done, it won't get updated anymore. Also likewise to film or music, if it ends up becoming a classic, it might get remastered or remade a decade or two later, but those are the exceptions rather than the norm. The vast majority of games will just stay like when they were originally made, much like the majority of novels, music, and film.
Makes sense. I guess this isn't close to anything like what I use Linux for, nor is it a core requirement of a modern Linux desktop. I'd be surprised if I'm running any executables on Linux that were compiled more than, say, a year ago. The Ubuntu release cycle is every six months, with a lot of software updated more frequently than that.
And yeah, maybe the compatibility layer should be provided by Steam itself, rather than the OS. Could even just be a container or a thin VM. I don't see why it should be the OS's responsibility to continue supporting 32 bit executables indefinitely. Certainly that's not something I would want Ubuntu spending a lot of effort on, as it doesn't benefit your basic Linux desktop, server, or development use cases.
Steam for Linux is heavily dependent on Wine. Wine has been reluctant to move to 64-bit because, as I understand it, that would mean 32-bit executables (which I think are the majority of the games it supports) wouldn't be able to run with it[0].
At least that's how I understand it. It's more about the issues with backward compatibility than an inability to create a 64-bit version.
Wine has had 64-bit support for ages. A typical Wine installation is multi-arch, supporting both 64- and 32-bit Windows programs, which is very important because running older 32-bit-only programs is a huge use case for Wine. The post you linked seems to be about the difficulty of packaging the 32-bit support for a 64-bit-only Ubuntu.
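The 32/64 split is visible right in a Windows executable's headers. As an illustrative sketch (this is not how Wine itself is implemented, just the documented PE format): the Machine field of the COFF header, reached via the `e_lfanew` offset at 0x3c of the DOS header, says whether a .exe is 32-bit (0x014c, i386) or 64-bit (0x8664, x86-64):

```python
import struct

# PE Machine values (subset): 0x014c = IMAGE_FILE_MACHINE_I386,
# 0x8664 = IMAGE_FILE_MACHINE_AMD64.
MACHINES = {0x014C: 32, 0x8664: 64}

def pe_bitness(data: bytes):
    """Return 32, 64, or None for a Windows PE image given as bytes."""
    if data[:2] != b"MZ":
        return None
    # Offset 0x3c of the DOS header holds the offset of the PE signature.
    (pe_off,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_off:pe_off + 4] != b"PE\x00\x00":
        return None
    # The 2-byte COFF Machine field follows the 4-byte "PE\0\0" signature.
    (machine,) = struct.unpack_from("<H", data, pe_off + 4)
    return MACHINES.get(machine)

# Demo on a minimal fabricated header: DOS stub with e_lfanew = 0x40,
# then a PE signature and an i386 Machine field.
fake = bytearray(0x48)
fake[:2] = b"MZ"
struct.pack_into("<I", fake, 0x3C, 0x40)
fake[0x40:0x44] = b"PE\x00\x00"
struct.pack_into("<H", fake, 0x44, 0x014C)
print(pe_bitness(bytes(fake)))  # -> 32
```

Since most games on Steam still come out with an i386 Machine field, a multi-arch Wine (32-bit and 64-bit libraries side by side) is the only configuration that covers the whole catalogue.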
In media production, particularly film and audio, I am extremely dependent on, and more often than not enjoy, legacy plugins that may not even be manufactured any longer, or older versions I simply prefer.
This is certainly not limited to the scope of media production. Heck, I even still keep a PPC iMac G4 with Tiger so I can play older Mac OS 9 games like Tony Hawk's Pro Skater 2 and Riven.
The point is, many people prefer or are locked to older software that simply won't be updated for various reasons. When MacOS drops 32-bit support completely, I won't be upgrading. The warnings in Mojave are already bad enough.
My late 2013 MacBook Pro doesn't have issues with Final Cut Pro 7 in Sierra, which I vastly prefer for a lot of projects. It can handle 1080 just fine, I'm not interested in 4K for my purposes at the moment.
For me it's more older VST's - I also keep a copy of Logic Pro 9 around for that exact reason.
I certainly don't edit film on my G4! *little laugh*
There aren't many reasons to use 64-bit if your program doesn't use much memory. If you stick with 32-bit, your program will use less memory (64-bit pointers are mostly a waste of space) and run on more systems, i.e. those that for one reason or another cannot run 64-bit code. Note that this isn't just about the CPU not being 64-bit capable: firmware can also prevent 64-bit support, and some low-end machines with 2GB of RAM ship with 32-bit Windows preinstalled, which saves both disk space (no 64-bit libraries) and RAM.
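The pointer overhead is easy to put rough numbers on. A back-of-envelope sketch (the node and pointer counts are made up for illustration): a pointer-heavy structure pays exactly double for its pointers when rebuilt as 64-bit:

```python
# Illustrative figures: a linked structure with a million nodes,
# three pointers per node (e.g. parent, first child, next sibling).
nodes = 1_000_000
ptrs_per_node = 3

bytes_32 = nodes * ptrs_per_node * 4  # 4-byte pointers on 32-bit
bytes_64 = nodes * ptrs_per_node * 8  # 8-byte pointers on 64-bit

print(bytes_32 // 1_000_000, "MB of pointers on 32-bit")  # 12 MB
print(bytes_64 // 1_000_000, "MB of pointers on 64-bit")  # 24 MB
```

The extra 12 MB here buys nothing unless the program actually needs a larger address space, which is why small 32-bit programs can be a perfectly rational choice.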
End of support for 32 bit seems like a pretty damn good reason not to use it, I must say. It's better and simpler overall if everything is on the same page and you don't have to worry about compatibility issues; this gives more benefit than whatever marginal benefit exists for allowing small programs to run more memory efficiently.
My comment was about 32 bit code in general, not just about Ubuntu.
There is no end of 32-bit support, neither on Windows nor on Linux (the kernel itself can still run on a 486, IIRC), and AFAIK even macOS can still run 32-bit code.
Of course individual distributions like Ubuntu can drop 32-bit support (and personally I don't see it as much of an issue, since you can switch distributions and Valve can simply recommend one that doesn't arbitrarily break its users' programs), but that is just the distribution doing it. The kernel can still run 32-bit code, and with enough effort even 19.10 will be able to run 32-bit programs; the whole hubbub is about avoiding that effort.
Nobody cares whether 32-bit is outdated. People still care about running Windows games on Linux, and those are still mostly 32-bit, including things released yesterday, not 15 years ago.
Steam has a lot of 32-bit-only games, and it relies on Wine to run them. I doubt 64-bit versions exist for all of its popular games.
That is why Windows has been king for so long: the thought of dropping backward compatibility is unthinkable. Instead you do horrendous hacks to get things to work (such as having two different system folders, one for 32-bit executables and one for 64-bit executables).
Yeah, they're not going to drop 32-bit for the foreseeable future. They see the value in backwards compatibility, despite HN readers continuously criticizing them for the lengths they go to accomplish it, or straight up ignoring how much effort they've put into it. There's too much software that is only 32-bit, particularly in businesses, for it to be anything other than an ideological argument for "purity".
Considering new software is still being released 32-bit-only as recently as today, this "eventually" will only come if PCs stop using CPUs from the Intel 8086 lineage.
It's obviously more about the games themselves than the client, which are much more difficult to update in many cases and offering only 64-bit games would, AFAICT, vastly reduce the number of available titles in the library.
They already distinguish platforms; if they just recognized 32-bit and 64-bit as platform features, and that games might require support for either or both, this would be almost a non-issue and they would be more future-proof.
Surely this is Steam by proxy: how can they claim to support a platform if none of the 32-bit games work? I mean, they could also probably port Steam to NetBSD, but I doubt any games would work, so they wouldn't claim to support NetBSD.
Nobody would port old games that don't generate revenue.
For example, if Paradox/Cerberos won't lift a finger to fix a stupid, annoying bug in a game that's like 6 years old (Sword of the Stars), why the hell would they port it to 64-bit?
There are some reasons to keep software as 32bit, like these: https://blogs.msdn.microsoft.com/ricom/2009/06/10/visual-stu... (tl;dr - bigger pointers mean higher memory consumption, they don't want to break third-party plugins, and they're porting to managed anyway).
Also, either I'm misunderstanding something, or Ubuntu is throwing the baby out with the bathwater here. For example, Arch was always aggressive about not supporting anything but the typical desktop at the time of release, but the 32-bit packages are still in a separate repo if anyone needs them.
It wasn't really until Windows XP died that you could reasonably expect to drop 32-bit support, and Windows XP lasted forever. Mix that with 64-bit not providing much benefit unless you were addressing more than 2GB of RAM, and it took quite a while for 64-bit adoption to become the norm, even though the hardware support was present.