Game dev: Linux users were only 0.1% of sales but 20% of crashes and tickets (twitter.com)
414 points by belltaco 76 days ago | 400 comments



Linux is the perpetual scapegoat.

They selected a middleware, Coherent UI, that didn't work properly on anything but Windows. They also didn't make proper use of the Steam runtime, a mistake that continues to cause issues. Most games don't make these mistakes, so this isn't really representative of the larger state of Linux gaming.

It's worth pointing out that the devs did make a legitimately good attempt at making Linux/Mac support first class, by making a purely OpenGL engine for the game. But it seems they weren't aware of some other best practices.


Technically speaking I'm sure you're right. In practice it doesn't really matter though: if they did it that way, it's either because it was easier or because that's the way they were used to doing it. The fact that "technically" it's not Linux's fault doesn't really matter, unless you're more interested in the moral concept of guilt than in the practicalities of making a Linux port of a game.

Why should they change the way they develop games for 0.1% of their sales? So that they'd get kudos "clean code" points from people on HN? Most for-profit software cares very little about that, and for good reasons.

Gaming on Linux sucks because Linux is not popular for gaming. Linux is not popular for gaming because gaming on Linux sucks. That's the root of the issue.


> Technically speaking I'm sure you're right. In practice it doesn't really matter though

I think the details matter because they're relevant to how we as a developer / customer community move forward.

If the GP is correct, then the pain felt by these particular devs might not be a sign that targeting Linux is in general a bad idea. For example, we might help future projects be successful simply by spreading awareness of techniques known to make it easier to target Linux / SteamOS.

Regarding the 0.1% of sales issue, perhaps some of that number's smallness comes from a bootstrapping problem. I.e., there's a vicious cycle of: (bad drivers) --> (game is crashy) --> (poor sales) --> (gfx card vendors not motivated to improve drivers) --> (bad drivers) --> ...

I don't think there's much an individual game dev shop can do to break that cycle, but perhaps it's still useful to understand the problem.


Have you not seen Steam Play these last few months? Because of Proton the majority of the top 250 highest rated games now run fine on Linux.

I've been playing Skyrim, Witcher 3, Dark Souls 3, Castle Crashers, Overwatch, Heroes of the Storm, etc., all of which are Windows-only, but WINE is now so good that they run as if they were native.

Linux gaming doesn't suck anymore, and it's coming to eat Windows' lunch. Check out ProtonDB to see compatibility with your favourite games, and the Linux gaming subreddit for more news.


You know WINE has existed for 25 years, though. During all that time, Windows has kept a 95% market share ( https://store.steampowered.com/hwsurvey ).

You could also argue that, because of WINE, developers treat Windows as first class and then just check that the game runs on Linux with Wine.

Depends on the perspective.

PS: Yes, Steam detects Wine.


Sure, but my point is that it's only in the past few months, since Proton was released, that compatibility has been good enough to actually matter to game developers and players.

By making Linux compatible with Windows games, it gets rid of the objection "I'd move to Linux if it weren't for my games", which was the last remaining one for a LOT of people.

The fact that Steam tracks WINE is a very good thing, because developers can detect players who bought their games to play them on Linux.

This helps encourage native Linux gaming growth, because developers can see the share of Linux players rising as more people get rid of Windows, now that they no longer have that remaining obstacle.


I'm not sure that this does encourage native Linux game development. Why bother putting in the porting work for 0.1% of potential users when someone else might do the work for me via Wine/Proton.


I used to think the same way about native first and forcing devs to adopt Linux. The whole "No Tux, no bucks" thing (which has had pretty much zero impact since it's not enough bucks). But, over the years, and especially with Proton being so good with Steam, I've completely switched. If Linux gaming is to be a thing, then there needs to be an adoption first perspective.

So, yeah... maybe that means most devs will just say "my Linux support is 'it runs on Proton, probably, good luck!'" but the thing is... there are games on Linux now. Lots of them. Lots of good ones. I was playing The Witness last night by just pushing play on Steam. No winecfg or winetricks or separate drive_c Steam installation. No messing with drivers. I pressed play, the game loaded, I played it. I've repeated this loop in the last few months with most of the games in my library. Endless Legend is back in my rotation again. All of the dumb anime Japanese games where they don't even know Linux is a thing that exists suddenly work. It's glorious.

Wine/Proton may be the lazy way to develop for Linux and might not give people the coveted title of Linux exclusive gamer, but it's working really well if all you care about is playing games and not installing Windows.


Compatibility is a stepping stone to increasing the 0.8% (actual numbers) Linux population on Steam to higher numbers. If 2% or even 10% of users were Linux-based then developers would have second thoughts about choosing DirectX over Vulkan for example when it makes it more difficult for them to reach those customers.

Also the "no tux no bucks" philosophy many Linux users take in avoiding paying for non-native games.


People keep forgetting that game consoles have their own 3D APIs.

Contrary to FOSS folks, professional game studios aren't religious about APIs, as long as there is money to be made.


> Have you not seen Steam Play these last few months? Because of Proton the majority of the top 250 highest rated games now run fine on Linux.

Whilst this is cool I think it may also, sadly, be commercially irrelevant. Why bother worrying about Linux compatibility when only a tiny (i.e., somewhere between 0% and 1%) number of players/purchasers will run your game on Linux?


Perhaps because you will earn way, way more money by making your game cross-platform, given the three consoles and three PC platforms that exist, and once you have committed to that, Linux support might take only 1% of your effort.


Correct. If your game can't cope with Linux "fragmentation" (most of which is already abstracted away by Steam, so the remaining "fragmentation" is with hardware, which is the same problem you have with Windows), then you're in for a world of hurt if you try to port to a console with its far-from-ordinary hardware and programming APIs and such.


Porting to consoles is much easier than Linux, as each console is a very fixed platform. In my (limited) experience, middleware like Unity works better on consoles than it does on Linux.


Why should I get excited about decent compatibility when I can stick with perfect compatibility? Linux gaming doesn't suck compared to linux gaming 5 years ago. It's still crap compared to Windows.


You’re coming at this from the perspective of someone who wants to set up a PC primarily in order to play games, and probably wants to try out every game under the sun. Of course, you’ll install Windows.

Try instead coming at the question from the perspective of someone who already has a Linux workstation (e.g. for work) and wants to do as little as possible in order to play a few games—maybe the ones their friends are trying to get them to play as a member of a team. Windows isn’t worth it here: you wouldn’t use it for anything else (so every time you boot into it, you probably have to spend two hours installing updates), and booting into Windows would also prevent you from multitasking to the Linux apps you rely on. Compatibility shims, if decent, are far more interesting to this audience.


People who own a Linux workstation at home and just want to play a few games are vastly outnumbered by people who own a Windows desktop at home and just want to play a few games. Probably at least 100 to 1. And the former group, with the “almost works” compatibility, will be a much bigger maintenance burden per customer.

Heck, I’d bet money that the Linux casual gaming crowd you described is also heavily outnumbered by people who have a dedicated Windows gaming PC (e.g. me, Mac user otherwise).


You're neglecting the crowd that has Windows at home and wants to stop using it, too. If gamers can have the exact same UX on Linux, then that's one of the biggest obstacles to switching solved.


Like me. I will hug Windows 7 goodbye on my way to Neon OS-ville, with redoubled hope that my SimCity 3000 and RCT2 may now work without having to run in a VM.


"People who own a Linux workstation at home and just want to play a few games" make up a disproportionately large amount of the developer-base for pretty much any software, though, including games. It doesn't matter if none of your users care about a particular feature, if a fair number of your own devs do.


There are lots of developers pretty happy with macOS and Windows at home.


Yeah, but for a piece of software to acquire Linux support, you don’t need the majority of its developers to own a Linux workstation and want to use the software with it; you just need a non-negligible amount (i.e. enough developers with the spare man-hours to get the work done.)

Sometimes, in fact, it only takes one or two developers. I can’t think of a good Linux example here, but I know of a good few projects (Dolphin, for example) where the macOS target is supported entirely by the one or two developers on the team who use macOS.


Quite true, my comment was more against the typical HN remark that "developers" only use GNU/Linux, as if the software for the two biggest desktop environments would appear out of thin air.


In my case, it's because it removes the one thing left that tempts me back toward Windows. I'm significantly more productive on a proper Unix-like OS (like Linux), and it's wonderful to not have to dual-boot or maintain a second PC if I want to take a break and fire up some game or other. Between the native ports and the growing library of reasonably-Proton-compatible games, I feel that pull less and less.


> Have you not seen Steam Play these last few months? Because of Proton the majority of the top 250 highest rated games now run fine on Linux.

Put me in the skeptical camp; I've been hearing the same line about Wine support for nearly 2 decades and never found it to be remotely true.

I've had enough trouble getting the actual Linux-supported games to work. Some work with GNOME+Wayland, others only work in GNOME+Xorg, some silently fail, some just freeze, the open source AMD drivers still crash the whole machine, etc.


Proton was only announced in the past few months. You can view reports on ProtonDB from actual users, including what drivers/hardware they use.


Linux gaming might not suck anymore, but every combination of Linux-the-desktop-operating-system that I've tried certainly does. Both Windows 10 and macOS are so much nicer, more stable, and more consistent that it is not even funny. Just the other day, installing a pip package froze Firefox on Ubuntu for >30 seconds.

That is especially true if you are forced to use professionally maintained Linux without root access.


> Check out ProtonDB

You almost convinced me, so I checked out ProtonDB and found out you're overstating things.

Only 50% of games are rated "gold+", and most of those are native Linux versions. Most Windows-only games have issues.

https://www.protondb.com/


The moral concept of guilt is uninteresting here.

Developing for any platform requires a degree of competence and research. The fact that lots of engines are themselves cross-platform doesn't mean that Linux support is a box you can tick with no further effort; it means that it is a lot less effort.

I think the takeaway is that their life sucked because they were ill-prepared, and by all accounts not just ill-prepared to develop for Linux. We would do well to maintain comprehensive and evolving info on the current best practices for developing on this platform and, importantly, promote it so as many people as possible are educated.

Incidentally, I honestly don't think gaming on Linux sucks. There are an absolute ton of games available. I tend to buy Humble Bundles and Steam sales, and I presently have about 10 games I haven't even played yet. Maybe I would feel differently if games were the primary use of my computer. I don't need every possible game to be ported to have a good gaming experience on Linux. I just need there to be more games than I can possibly play.


> Technically speaking I'm sure you're right. In practice it doesn't really matter though: if they did it that way, it's either because it was easier or because that's the way they were used to doing it.

Yes, but then the question is, why did they even bother? If the GP is right that they picked middleware that is known to only work properly on Windows, that pretty much means they decided to fuck themselves right from the start. I agree that, generally, properly targeting Linux is harder for multiple reasons, but if you want to go multiplatform, make sure you build on a solid foundation.


But it's very relevant. Steam provides a runtime so developers have a stable foundation across several distros.

If you fail to use their runtime properly, you're doomed to suffer from Linux fragmentation problems.


> In practice it doesn't really matter though

It does matter. Unless you just want Linux to perfectly emulate Windows APIs, you will always have to do work to port to a different platform. Choosing the wrong tools for the job and then blaming the platform is just bad engineering.

I don't bitch about how hard it is to assemble my desk because it requires a socket wrench when my last piece of furniture only required a screwdriver and that's all I own.


Counterpoint: There's a reason IKEA packs Allen wrenches into everything it sells.


Shipping games as live DVDs is an elegant...ish... solution:)


Any multimedia on Linux sucks these days. I honestly think it's regressed from the days of running mplayer in a terminal. I can't watch a video in a browser without horrible screen tearing or general performance issues. It seems the poor quality of graphic drivers and X interaction are the underlying problem.


FWIW I've been doing all my PC gaming on Linux for the last few months. (Before that there was a bit of back and forth: sometimes CSGO on my machine ran best on Linux, other times on Windows. But now I haven't booted Windows for gaming since late summer or so. And for me, the convenience of KDE makes me stay for work as well.)


> Why should they change the way they develop games for 0.1% of their sales?

Your last paragraph provides the most important reason.

By changing the way they develop, they will increase their customer base.


It is relevant in this context, because other developers know they can avoid said issues if they do things a little bit differently.


> Why should they change the way they develop games for 0.1% of their sales?

Have you seen how tight the gaming market has become? You need a niche to get any attention. Coming out with a truly cross-platform game that has Linux as a first-class citizen would buy tremendous free publicity. It would be on the front page here alone, a site with an enormous audience, at minimum once a quarter. It would appear in countless tech-related subreddits, and the Twitter hacker-verse would take off. The “maker culture” would adopt it as its son or daughter.

Sounds like a tremendous opportunity.

We are not representative of the culture, but we are hackers and developers, and eventually one spark is all you need to start a fire. I have the perception that gamers are dying to get off of Windows, but I could be wrong; I certainly was when I was a gamer.


I've been in games and game middleware (including in many cases with Linux support) for fifteen years and I have never once seen this happen. Generally there is a tremendous amount of commentary like this from Linux folks and the port neither gets meaningful (or profitable) publicity nor sales that outstrip support. This comment is the stereotypical Linux game post.

When supporting a platform can mean debugging and submitting patches for a user's graphics drivers in exchange for a shockingly low conversion rate, it's an easy no for me.

PS: I built and maintain an enterprise AWS GPU app on Ubuntu, and the platform is great. But it was very non-trivial to get working.

PPS if you spend less than $150/yr combined on all software purchases (including mobile/console) please don't make a case for Linux gaming. TuxRacer is your apotheosis.


> This comment is the stereotypical Linux game post.

I have absolutely no affinity for Linux. My times spent twiddling with settings, packages, and getting drivers to work are long behind me. I have been exclusively Mac for almost a decade now, so I am in no way, shape, or form a "Linux gaming homer". I couldn't care less. But I am an opportunist, and I do see an opportunity now (and not 15 years ago).

> I've been in games and game middleware (including in many cases with Linux support) for fifteen years and I have never once seen this happen.

Yes, 15 years ago, Windows 10, with all its privacy intrusions basically making it a spyware OS, didn't exist. A Steve Jobs-less Apple, doubling down on its disdain for OpenGL and gaming in general, didn't exist (see relevant John Carmack posts). And forgive me, but I would be willing to bet we didn't hear about your game because I was specific when saying "first-class citizen", meaning the game worked just as well on Linux as it did on Windows. If you achieved that and still got no publicity, I would be shocked. All you would need is one popular Twitch streamer streaming Fortnite (which also didn't exist 15 years ago, and wasn't pervasive until recently) on Linux, or some similar big title, and you would have a spark.


I have seen this claim many times, but wonder about its validity.

It's easy to think that a truly cross-platform game that has Linux as a first-class citizen would make a lot of stir. But I used to play Heroes of Newerth years ago, when Linux support was way worse than today. I also read HN then.

I never saw any special treatment for it, even though I loved the game and it worked really, really well on Linux.

Searching on algolia just proves my point: https://hn.algolia.com/?query=heroes%20of%20newerth&sort=byP...

I simply do not buy the story that just because you release it on Linux, have good linux support etc it will spread like wildfire.

S2Games, which made HoN, made a superior MOBA imo, but none of my friends play it anymore. They play Dota 2, though. I think the reality of the situation is that no one cares whether a game runs on Linux or not, except extreme nerds who would never use Windows and go through the daily struggle that is desktop Linux.

I used to be one of them, nowadays I only use Windows.


Unreal Tournament 2004 was cross-platform Windows/Mac/Linux in 2004. But like this post indicates, the tech support headache isn’t worth it, so these days you can only get the Windows version on Steam despite the cross-platform binaries existing somewhere in Epic’s (back then still Epic MegaGames) vault.


Can you still run UT2004 on the current version of, say, Ubuntu?


>daily struggle that is desktop Linux

lol what year is this? 1999?

but I agree with the rest of your points


Every time someone mentions not using Linux Desktop because they had a lot of issues, someone like you comes out of the woodwork and pretends that Linux doesn't have issues anymore.

Maybe that's yet another reason people don't switch to Linux: the evangelists are annoying and untrustworthy.


> Every time someone mentions not using Linux Desktop because they had a lot of issues, someone like you comes out of the woodwork and pretends that Linux doesn't have issues anymore.

Did someone call me? Jokes aside, as a person who has been using Linux (95% of the time, for ~15 years or so), I can honestly say that Linux has its fair share of problems. However, for some time now, the problems I experience have been no more frequent than on the Macs that I have or the Windows PCs of my family members.

Is Linux perfect? No way. Did it improve over the years? Yes, tremendously. Also, I can say that advanced desktops like KDE can do very nice things for automation and productivity. I'm currently happy about the state of Linux, but that doesn't mean it's perfect or the very best.


Don't get me wrong, I agree that progress has been made. Sound, unless you need low latency, is pretty much a solved problem now, for one.

But there are a lot of reasons that Linux's particular brand of issues are actually still a deal breaker for people, and refusing to acknowledge that will never attract anyone to the platform.


For low latency, I've played with JACK a little while I was playing bass. It wasn't very bad, but I don't have recent information on the issue.

I for one do photo post-processing and development on Linux mainly, and have no problems while doing what I want to do.

> ...Linux's particular brand of issues...

Can you please elaborate? I'm interested. Since I've been using Linux heavily and for a very long time, I might be blind to those problems.


Just google around a bit, even just on HN, and you'll find dozens of examples. A lot of it comes down to poor drivers, especially for laptops, but much of it is systemic.

I'm reluctant to go into detail about my own personal blockers because every time I do I end up in a multi-page argument with some evangelists who insist that everything I want to do is completely wrong and I should just change my entire workflow to match theirs.


You're right. Laptop support tends to be problematic, and it boils down to selecting the right hardware "platform" at the beginning. The worst part is, the right platform is not always budget-friendly.

I personally found that professional-class hardware (Dell XPS, HP EliteBook, Lenovo ThinkPad) has the best Linux support out there. I have an EliteBook 850 G2 at the office, and except for the fingerprint reader (which I don't use), everything works without any problems. Battery life is also great (~7 hours). However, "works for me" is not a valid argument, especially with hardware.

If you want to discuss further, you can reach me via my profile page.


No, it is 2019 and my Linux Netbook still lacks hardware accelerated video decoding and OpenGL 4.0 support, although the card is a DirectX 11 compatible one.


Well yeah, it is of course much better today, I believe. But I wouldn't be surprised if you still have to spend hours trying to configure stuff if you have the wrong hardware.

Don't get me wrong, it's mostly on laptops that I have experienced issues. On a stationary computer you just get a performance drop, at least with most graphics cards.

I think it's great that it has improved so much and I hope it will continue to improve so that sometime in the future I can return to the promised land.


Tried installing Linux on an MBP last month. Ran away screaming after 30+ hours of dealing with driver issues. I do this every couple of years, hoping that finally THIS time I can get off Windows. Next attempt will be circa 2021.


Try this as an exercise instead: pick a random Dell. Attempt to install OSX on it. Then post about how huge a hassle it was, how the end result was a non-functioning brick, and how OSX still isn't ready.

If you google your computer model plus "linux" and the result is 17 pages of posts about how it didn't work, you may want to try a different model.

Generally, how well your machine is supported is a function of how hostile your OEM is toward openness, how different your machine is from existing hardware, how common it is, and how much time people have had to add support.

Current Macs aren't well supported. Supporting all hardware under the sun is a Sisyphean task and ultimately an unimportant one. For Linux to be useful, it doesn't have to support all possible machines, just a good range of hardware.


I've installed Ubuntu 18.10 on an XPS 13, which everyone tells me is well supported by Linux; Dell even sells it with Ubuntu. It won't come out of sleep. Googling suggests others have this problem.


XPS is a range of models and 13 is a size; that doesn't uniquely identify the model. Does it have the problem under the LTS version that Dell presumably ships?


I don't know, I just tried installing the latest Ubuntu. I could go and track the version Dell ships for my laptop, and make sure I never upgrade it, but surely that proves the point that Linux is a pain to run?


How did we get from "run the latest long-term support release, which ships every 2 years like clockwork" to "never update"?


You said I should only run the LTS version which Dell ships...


> Linux on MBP

Well there's your problem. The companies that make the custom hardware that Apple uses in their laptops refuse to release driver support for Linux for basically the same reasons as the writer of this Tweet. Whether the fault for this is on Apple or the manufacturers is up for debate, but driver compatibility with Apple's laptops and anything but macOS has always been a crapshoot and only became decent for Windows in the last few years.

Note: This is coming from an Apple fan who has been wanting to try out dual booting a Linux distro or one of the BSDs but has watched support tickets get answered with "testing MBP drivers on Linux isn't worth our time" from multiple OEMs.


Installing Linux on a Mac is your problem. Macs are notoriously a huge pain when it comes to Linux compatibility. Tbh, even Windows isn't that great on a Mac... I'd just stick to OSX on a Mac.

When it came time to replace my old MacBook Air, I got a Dell XPS 13, and Linux works great on that. All of the hardware works out of the box without having to do anything with drivers.


Why not use OS X on an MBP, or Linux on a generic x86 machine? I'm not clear on why this is the only route to get you off of Windows.


Macs have good hardware. Or you might also want to dual boot.

Sad to say, but some MacBook models work great, and others are fucking terrible. I have two models: my older model, where the only thing that has ever consistently worked is Bluetooth, and a slightly newer one, where nothing has ever broken.


Running Linux on new hardware is usually a bad idea; due to the nature of the process, you have to expect at least a year before drivers for new hardware have settled into distributions.

Then things should be pretty sweet for quite a while. Unless your hardware is really popular, things will bitrot away eventually, but expect a 5-10 year sweet spot where everything should just work out of the box.


THIS is the problem. Windows drivers start working on day 1 and continue working. The only breaks were when we went to 32-bit drivers in NT and when real-time hardware interrupts were disabled in Vista.

Linux needs a driver compatibility story this strong to even start.


On the flipside, while Windows has a greater quantity of drivers available for devices on day 1 of release, Linux tends to have a greater quantity of drivers available for devices at time of install. With Linux, there's no separate step of having to wait for Windows Update to pull the driver, since all the drivers are included alongside the kernel (the exceptions being printer drivers - which aren't developed alongside the kernel - and firmware for wireless NICs if you're going with a strictly-FOSS-only distro).

Meanwhile, I "fondly" remember having to have a USB stick on hand for Windows 7 installs because the default install didn't include wired (let alone wireless) NIC drivers for 90% of the laptops and desktops on which I installed it. Thankfully Windows 10 is better about this (at least on the wired front; wireless drivers are still hit or miss), but still.


> Meanwhile, I "fondly" remember having to have a USB stick on hand for Windows 7 installs because the default install didn't include wired (let alone wireless) NIC drivers for 90% of the laptops and desktops on which I installed it.

I worked in an IT support shop at the time Windows 7 was released, and I imaged and installed hundreds of copies of Windows 7 over the time I worked there. While you're right about wireless drivers being a crapshoot, I cannot remember a single instance of missing wired NIC drivers on install. I'm not doubting that some were missing (there is lots of hardware and lots of manufacturers out there), but it was definitely not as huge a problem. The biggest issue was usually SD card readers and trackpads, which required downloading drivers from the manufacturer.

I've done a few Linux desktop installs (same job) and the situation was definitely more painful. Issues with sleep/wake, webcams, network drivers (usually wireless), and multiple displays were basically guaranteed, and the help process was usually "you're using the wrong hardware", which isn't really helpful.


"I can not remember a single instance of missing wired NIC drivers on install"

Were you pre-installing NIC drivers with your images? That'd be a good reason for the high success rate.

It might also have to do with specific manufacturers/vendors. Most of my installations were on Dells; it's possible HP or Lenovo stuck with chipsets that Windows properly supported out-of-the-box. Linux worked fine in all cases.


dota 2 works on linux though. ;)


> Coming out with a truly cross platform game that has Linux as a first class citizen would buy tremendous free publicity

It does not. Take a look at the Steam games supporting Linux. There's thousands of them, it's not special.


> would buy tremendous free publicity.

That would not put food on the table. Most money comes from Windows, and that's what games should be optimized for.

See r/choosingbeggars for more entertaining stories on the "value" of "publicity".


Choosing beggars literally think that publicity is payment enough. Linux gamers PAY.

Really a big difference.


> Coming out with a truly cross platform game that has Linux as a first class citizen would buy tremendous free publicity.

This particular game (Planetary Annihilation) was designed with Linux support in mind from the very beginning; as the person who wrote the tweets notes, he and numerous others within the company were huge proponents of supporting Linux. As events proved, it didn't work out for them.


They later admitted that they had overstated this issue and that their problems were self-inflicted.


You're either underestimating revenue from Windows purchases or overvaluing that publicity bonus. There are already a lot of indie games with Linux support, and triple-A titles already sell millions of copies. If that added publicity made another 1000 people buy the game, it still wouldn't be a financially sound decision.

Heck, I'm mostly on Linux but boot into Windows for gaming... I'd probably continue to do so even if the games were released for Linux, just for that extra performance and significantly less chance to run into an obscure issue with my setup.


Win10 really messed things up with the SaaS model, the telemetry, and the integrated update rollups.

The thing is, Microsoft really painted themselves into a corner. You see, Windows was made to allow all sorts of skullduggery to occur; this was supposed to be a boon for MS, but they punched so many holes in Win32 that it was about as secure as a sieve. I know this first-hand, as I have spent a major amount of time decompiling and snooping through the binaries. The file structures in a Windows OS have big voids of zeros for padding, which leaves lots of room to insert malcode and fix up the file headers. Thread Local Storage is neat: a thread can move data packets through its kernel objects to other objects or static files on swap. And the alternate data streams that Windows executable files may have are just mind-blowing: a file (any Win32/Win64 file) that everyone expects to do one thing can be trojaned by the system, as a feature, using alternate data streams. IT GOES ON AND ON.

This, and Win10, is why I left Windows, gave Linux a serious try, and never looked back. Linux runs games and keeps getting better at it, with no cat-and-mouse game of you modding system files so things work, then MS updates de-modding them and preventing you from making things workable without major kludging around.


Literally nobody cares about Linux gaming. It's not even press-worthy because of SteamOS. Anyone who games on Linux does it for the adventure of the process, not because it's convenient.

If you ported your game to the Atari ST or the Amiga you'd get more press and probably more sales.


There's nothing adventurous about using your preferred operating system and enjoying being able to have a working game experience. I've spent significant time in all 3 major OS options over the last few decades, and came to the conclusion that Linux is the one I prefer. A few years ago, I resigned myself to either rebooting to Windows or running Windows in a VM when I wanted to do any modern gaming. The significant advancements over the last year in terms of video driver functionality (both on Nvidia and AMD sides), binary compatibility (WINE/Proton), and more developers releasing native games have been a huge boon and very welcomed. I use Linux full time because it's what I enjoy and feel comfortable with as a primary OS, and I absolutely care about the gaming landscape improving. I'm only one person, but many others won't speak up.


It's like Mac gaming. It sucks. Unless it's a for-Mac native title, which is exceedingly rare these days, it's going to be trash because the publisher invariably uses some crappy DirectX to OpenGL wrapper that cripples performance and crashes constantly.

Developer kits like Unreal and Unity have helped a lot here, but those are far from flawless. Even those struggle with Linux, because there are just way too many distributions and way too few standards.


I used to reboot into Windows to run games. Thanks to Proton I don't bother anymore. In fact I dread booting into Windows now because I know it's going to spend 30 minutes patching itself and then reboot on me.


Rhetoric like this is less than effective. Some of us do game on Linux because we don't want a Windows box. So "literally" is demonstrably false.

Now, statistically, I realize we don't exist. But in absolute numbers, we do.


You've got to recognize you're an outlier.

In practical terms nobody plays games on Linux unless they're using something like SteamOS or, technically, Android.

Given how ridiculously hard it is to get a simple application to run across all the various distributions of Linux that exist, expecting something as complicated as a game to run at all is asking way too much.

Windows is ridiculously hard to support, but at least it has sales volume to justify the work necessary to get a game launched. Linux doesn't.


>adventure of the process, not because it's convenient

There's no need for adventure for me. I prefer to code on Linux and not need to reboot into Windows. Playing games on Linux IS the convenient way for me.


Most people just get a game console or play games on their phone. A very tiny group of people do what you do on Linux.


Most games don't make these mistakes...it seems they weren't aware of some other best practices.

I see this time and time again. Game developer runs into one of the difficult subjects in computer science/programming. Dismisses the difficulty. Gets themselves into trouble. Blames the library/product/platform.

The last time I was at a game jam, I found myself explaining generational garbage collection to a game dev. He then proceeded to "solve" all of his problems by writing a 5 minute hack to ensure all of his objects would be collected as quickly as possible. Which didn't work, because the lifespan of his objects is dependent on gameplay, so he can't ensure all of his objects will die in Eden. I kept trying to explain that he should actually hold onto his objects (especially bullets in his bullet hell game) and recycle them as much as possible, so that he would reduce memory pressure and have those objects promoted to old space, so they are looked at less often.

I see this sort of thing time and time again.


> I see this time and time again. Game developer runs into one of the difficult subjects in computer science/programming. Dismisses the difficulty. Gets themselves into trouble. Blames the library/product/platform.

Members of this particular team worked on the highly acclaimed Total Annihilation (1997) and Supreme Commander (2007) so inexperience or lack of technical expertise seems unlikely to be the cause here.


Parts of PA are brilliant, absolutely brilliant. The team pulled off some incredible feats:

- In a 16-player game, each player can operate thousands of units across multiple planets orbiting a star, and afterwards, you can replay the match exactly within your own copy of the game.

- The planets themselves were mobile battlefields which could be destroyed, moved, and weaponized.

- The path-finding for those thousands of units crashing into an opposing army of thousands of units is smooth and there isn't total chaos on the field.

But the team also made some really questionable bets that ultimately doomed the title.

- Then, for some reason, they used an out-of-proc UI compositor to draw the 2D elements that would fork one process per layer IIRC and drop interaction events, and broke the tool that the players use to interact with the wonderful simulation.

- Then, for some worse reason, they launched it like this, and reaped the terrible early reviews that doomed the title. Even after patches resolved the issues, the long-term damage was done.


So they have no excuse for missing out on those other best practices.


What on earth has a generational GC and can't handle a little bullet spam?


Nearly any per-frame instancing tends to hit generational boundary cases due to ambiguity over lifetimes that the VM can't account for. They usually don't result in GC pauses that are large enough or frequent enough to make the game unplayable, but they do prevent the desired "solid 60hz". It's actually a huge thorn in the side of fast gameplay code. Recommendations always turn towards engineering a static allocation(value types if you can do them, object pools if not). However it takes enough effort to build the latter that quite a few games running on GCs ship without doing it in all cases, and some runtimes make it borderline impossible(string processing).


OP was at a game jam, though. They were building a prototype. If they were going for a smooth 60hz, they were doing it wrong.

If there were pauses causing significant gameplay issues, it's doubtful GC was the specific cause. There was probably something very wrong being done - my guess would be instantiation of a managed resource like a texture or 3d model.

I doubt generational GC came into it, and I feel like the OP is prematurely celebrating his own insight into what was causing the issue.

(Then again, maybe it was a VR game and pauseless 120hz was the goal.)


Saw "Coherent UI" in your comment and thought, "I bet it's Planetary Annihilation" and was not disappointed.

The dev team struggled to get Coherent UI working on Windows, to say nothing of Linux. They switched to Coherent UI late in the development cycle, and the beta/launch was constantly glitched out. The problems with trying to run an out-of-proc UI renderer within a game loop were numerous: the game would frequently lock up in full screen, with multiple instances of the Coherent UI render process left running if you managed to escape.

... in Windows, not in Linux.

For fans of TA and SupCom, early PA was terrible. Later, the team worked through a lot of the issues, but the damage to their finances (and reviews) was done, and PA never got the e-sports league it deserved.


Truth be told, I don't think PA ever had a chance to truly be big. So few people wanted the SupCom style of gameplay. Forged Alliance Forever only has a couple thousand players, and the best or second-best commentator/streamer of games (Gyle) gets only a few thousand views per video. I think the gameplay style just doesn't appeal to as many people.


He points out that a very large percentage of the crashes were graphics driver related, not dependency issues that would be solved by the Steam runtime.


I mention the Steam runtime because this affects the Linux version to this day: I am unable to launch the game (on a fully supported system and OS, mind you) without deleting some files from their install.

> He points out that a very large percentage of the crashes were graphics driver related

This was still largely related to the Coherent UI middleware and the strange way it was being used, as other posts have noted.


My gaming box is still Xenial while all my work machines are at least bionic. While btrfs and my personal system setup practices make it easy to test and also port setups from machine to machine, I'm not relishing the moment when I'll have to bite the bullet and upgrade.


I think you're in denial.

If developers need to get this, that and the other thing right specifically for Linux, but it's still only 0.1% of sales, then it's just not worth doing, even if there aren't any bugs.


100% this.

Source: am a business person, am a developer, am a Linux user, am a Linux gamer.

I love the idea of being able to game and work/dev on the same machine. But it's not the current reality, and it's not the reality of the near future. I've recently settled for macOS because, even though it doesn't offer the optimal solution for anything, it offers a really nice solution for everything.

I tried Linux gaming for years, and Steam really gave it a strong push. But the truth is, gaming on Linux is complicated and doesn't offer the same experience as on Windows or even macOS. I'd rather my favorite studios and developers dedicate their Linux resources to customers who pay instead.


I keep a dualboot system because of the current situation.

Linux for "real work"

Windows for gaming (no-frills setup) and music production (much easier and less clunky low latency setup, way more availability of free synths etc)

I absolutely adore Linux for everything it has done to my career but I cannot drop my Windows system yet.


I likewise run and prefer a Linux desktop and boot into Windows for the occasional gaming foray. Also, to run stuff like Fusion 360.

I have to say that even on Windows, driver stability leaves much to be desired (for Nvidia in my case).


The thing is, if you only boot into Windows occasionally, you'll have to wait for updates about every time you use it. That's pretty annoying and has actually stopped me from doing it at some point.


I don't think I realized it until you mentioned it, but I think I'm in the same boat with my dual-boot box. Doubly so because it's a laptop and half of the time I only have phone-tethered internet.


> I have to say that even on Windows, driver stability leaves much to be desired (for Nvidia in my case).

I agree. That is why I tend to leave my Windows box really bare. It has Steam with some 4-5 games on it and my Music Production Software. Every other endeavor is tackled in Linux.


Take a look at cloud9 for your real work :)


> I love the idea of being able to game and work/dev on the same machine.

For what it's worth, I used to have that, and I am happy I don't anymore. Having two separate computers for gaming and work has been very positive to both my work and gaming experiences. I recommend this to anyone who can afford it.. even if you work on Windows.


Settled on macOS for gaming? I built a Windows PC (for gaming only) because I was tired of every title I wanted to play being exclusive to Windows. Plus, graphics card support is much better on Windows than everywhere else.


I travel a lot, so unless I want to carry two laptops, I have to compromise.


I agree with you that it is definitely not commercially worth it. But I guess you missed his point. He is not saying "it would have been worth it for this game if well implemented", but that this game is actually a really bad example of how gaming on Linux is. It is like analyzing PC gaming by looking at Devil May Cry 3 or Resident Evil 4...


In general you are actually looking at 2% of your potential user base, not 0.1%.


This is a question of margins, so you're neither right nor wrong.

It normally isn't just 0.1% of sales when developers do get everything right, although whether or not the usual 1-5% of sales is worth it depends on the particular game, budget, and sales numbers. Linux support clearly wasn't worth it for Planetary Annihilation due to its low sales numbers, but a game with (1) high sales and (2) easy porting process would be leaving money on the table not to support Linux.


From the article:

> "We eventually laid out a guide with known good versions of Linux and graphics drivers, but it didn't matter. Part of the allure of Linux is the customizability, so few actually stuck to it, and generally wanted to run the game on older hardware we didn't support."

It seems that it's because Linux users are unwilling to upgrade their hardware. From a philosophical standpoint, I agree with them, we shouldn't need to buy a new computer every 3-5 years, it's very wasteful and I am very against planned obsolescence and perceived obsolescence. That said, I can see the benefits of writing code to work on the machine you have in front of you, and not doubling your QA work by testing it on older machines that no longer even receive OS upgrades.


I bought PA early and tried playing it on a low-end 2007 laptop, and it was almost playable. The same machine could only barely handle SupCom on the lowest settings anyway. Meaning, if your hardware could handle SupCom well, then it could probably handle the basics of PA (at least, that's my bet).

They should not be expected to support systems that could only kind of handle the previous game anyway.


But if the set of best practices that works for your target hardware platform is narrow-band, that set itself can severely constrain the set of game houses that can develop for that platform.


I work for a small company that produces a DAW and VST plugins. Supporting Linux is a huge amount of work compared to Windows and Mac.

The main issue is that 'Linux' is not a thing you can support. You have to pick the distros you want to support, and then once you've picked a distro, what versions you want to support.

And you need to use the C++ version that ships with that distro, so if you want to support old versions, the entire project is held back from using the latest C++.

And distros aren't backwards compatible, i.e., when libcurl4 is released, they remove libcurl3. So you can't have one binary for Ubuntu 18.04 and 16.04.

So it means that, for every product, you have a build for every distro/version combination you want to support.

So now the solution is AppImages, where you bundle up your app with all its dependencies, like a container. I haven't investigated this yet; not sure it will work for plugins.
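For what it's worth, the AppImage workflow is roughly: lay out an AppDir and run appimagetool over it. A minimal sketch (the names are hypothetical; whether this helps for plugins is exactly the open question above):

    # Hypothetical AppDir layout:
    #   MyDAW.AppDir/AppRun           <- entry point (script or binary)
    #   MyDAW.AppDir/mydaw.desktop    <- desktop file
    #   MyDAW.AppDir/mydaw.png        <- icon
    #   MyDAW.AppDir/usr/bin/mydaw    <- your binary
    #   MyDAW.AppDir/usr/lib/         <- bundled .so dependencies
    appimagetool MyDAW.AppDir MyDAW-x86_64.AppImage

Note that an AppImage wraps a whole application, which is why it's unclear it can do anything for a VST plugin, i.e. a shared object loaded into someone else's host process.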


> And distros aren't backwards compatible, i.e., when libcurl4 is released, they remove libcurl3. So you can't have one binary for Ubuntu 18.04 and 16.04.

If you don't want to depend on the libcurl provided by the OS, ship your own.

If you don't want to depend on the glibc provided by the OS, ship your own, with your own dynamic linker.

readelf -n <your binary> will tell you the oldest kernel that will run your stuff. Put that in the requirements document. Write a bash script that sets LD_LIBRARY_PATH correctly and make sure that's the main entry point to your application during deployment.

You're set.
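To make that concrete, here is a minimal sketch of such a launcher (the game name and directory layout are hypothetical; it assumes you ship every dependency, including glibc and its dynamic linker, in a lib/ directory next to the binary):

    #!/bin/sh
    # Hypothetical launcher for a bundled-everything deployment.
    # Assumed layout: ./bin/mygame (the real ELF), ./lib/ (bundled .so
    # files, including glibc and its dynamic linker).
    HERE="$(cd "$(dirname "$0")" && pwd)"

    # Prefer the bundled libraries over whatever the distro provides.
    export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

    # If you ship your own glibc, invoke its matching dynamic linker
    # explicitly instead of the one hardcoded into the binary at link time.
    exec "$HERE/lib/ld-linux-x86-64.so.2" \
        --library-path "$HERE/lib" "$HERE/bin/mygame" "$@"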


This is not as easy as it sounds, because funny things happen if you ship your own libc. I remember that the last time I tried this, there were problems with PAM and name services. But that was a number of years ago.

Can you ship with your own version of GNOME different from the currently running one?


> Can you ship with your own version of GNOME different from the currently running one?

I'm confused. GNOME is a desktop environment. It implements a task bar, a control panel, a way to set your desktop wallpaper etc. Why would you ship GNOME?

Maybe you meant shipping a proprietary application that uses a GTK version that is different than the OS-provided GNOME uses?

If so, of course you can. GTK is a bunch of shared objects that generate system calls, X messages, etc. It has no reason to interfere with the OS-provided GTK as long as it's got everything it needs in your bundle.


This doesn't work for things that are dynamically loaded into other applications (as VSTs are). In general, you need to be using the same version of the libraries as the rest of the application.


Your suggestion to ship glibc caused coffee to end up all over my keyboard :) This is the road to hell, but you'd only know that if you had the slightest clue about the ABI interactions involved in swapping out a core library like that, starting with the reality that parts of the (probably binary-only) graphics stack must be linked and loaded in-process, and they naturally depend on, at a minimum, glibc. Your suggestion is effectively to ship the game in the form of its own Linux distro, which is of course complete nonsense.

Snap and Flatpak will hopefully help with the dependency-aging problem, but they're brand new, and they both still suck one way or another.

To give you an idea of what swapping out glibc would involve, here is the list of shared libraries loaded by 'glxgears' on my machine, arguably the simplest possible OpenGL program:

    /lib/x86_64-linux-gnu/ld-2.28.so
    /lib/x86_64-linux-gnu/libbsd.so.0.9.1
    /lib/x86_64-linux-gnu/libc-2.28.so
    /lib/x86_64-linux-gnu/libdl-2.28.so
    /lib/x86_64-linux-gnu/libexpat.so.1.6.8
    /lib/x86_64-linux-gnu/libgcc_s.so.1
    /lib/x86_64-linux-gnu/libm-2.28.so
    /lib/x86_64-linux-gnu/libnsl-2.28.so
    /lib/x86_64-linux-gnu/libnss_compat-2.28.so
    /lib/x86_64-linux-gnu/libnss_files-2.28.so
    /lib/x86_64-linux-gnu/libnss_nis-2.28.so
    /lib/x86_64-linux-gnu/libpthread-2.28.so
    /lib/x86_64-linux-gnu/librt-2.28.so
    /lib/x86_64-linux-gnu/libz.so.1.2.11
    /usr/lib/x86_64-linux-gnu/dri/i965_dri.so
    /usr/lib/x86_64-linux-gnu/libdrm_intel.so.1.0.0
    /usr/lib/x86_64-linux-gnu/libdrm_nouveau.so.2.0.0
    /usr/lib/x86_64-linux-gnu/libdrm_radeon.so.1.0.1
    /usr/lib/x86_64-linux-gnu/libdrm.so.2.4.0
    /usr/lib/x86_64-linux-gnu/libglapi.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libGLdispatch.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libGL.so.1.7.0
    /usr/lib/x86_64-linux-gnu/libGLX_mesa.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libGLX.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libpciaccess.so.0.11.1
    /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.25
    /usr/lib/x86_64-linux-gnu/libX11.so.6.3.0
    /usr/lib/x86_64-linux-gnu/libX11-xcb.so.1.0.0
    /usr/lib/x86_64-linux-gnu/libXau.so.6.0.0
    /usr/lib/x86_64-linux-gnu/libxcb-dri2.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libxcb-dri3.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libxcb-glx.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libxcb-present.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libxcb.so.1.1.0
    /usr/lib/x86_64-linux-gnu/libxcb-sync.so.1.0.0
    /usr/lib/x86_64-linux-gnu/libXdamage.so.1.1.0  
    /usr/lib/x86_64-linux-gnu/libXdmcp.so.6.0.0 
    /usr/lib/x86_64-linux-gnu/libXext.so.6.4.0
    /usr/lib/x86_64-linux-gnu/libXfixes.so.3.1.0
    /usr/lib/x86_64-linux-gnu/libxshmfence.so.1.0.0
    /usr/lib/x86_64-linux-gnu/libXxf86vm.so.1.0.0


Ignoring your personal attacks, you and I are on the same page.

I do agree that shipping your own glibc is "the road to hell", and it could be qualified as "shipping in the form of its own Linux distro", as amusing as that sounds :) But that's the price of freedom you pay if you want to draw your platform boundary at the kernel: you need to ship the whole userland! Where's the surprise in that? It means you need to understand the interactions between different versions of /usr/lib64/opengl/nvidia/lib/libGLX_nvidia.so.0 (which you need to ship) and nvidia.ko (Nvidia's binary driver loaded by the kernel, which you do NOT ship).

That's what people mean when they say they "are targeting Linux", even when they don't really understand the kind of work that that statement entails :)

Linux is an OS kernel. I'd wager it's the most popular one by far, in terms of the number of platforms that use it.

Saying that "Linux is fragmented" is the wrong way to look at the problem. One should rather realize that Linux is just an OS kernel that is used by countless platforms. It's up to the developer to draw platform boundaries and focus engineering effort on the chosen ones.

If you need to support GNU/Linux with glibc-2.25, then you need to set up your testing bench to accommodate that. If you need to have native look and feeling in ubuntu, kubuntu and lubuntu, you already have 3 platforms with at least 2 LTS versions that you need to test with.

Each platform has its own quirks. Linux distros are the ones with the least amount of quirks, but in exchange we get many many platforms, because it's so easy to create them.

I guess that's life :)


Why would you want to take on that massive burden for such a tiny percentage of sales - most of which you'd have gotten anyway?


glibc is pretty good at ABI backwards compatibility. The right thing to do would be to statically link libstdc++, libgcc, and extras like libcurl, but dynamically link libc, libX11, and libGL. Luckily, libX11 and libGL are pure C, so you can get away with statically linking libstdc++, which means you can use whatever C++ version you like.
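As a sketch of what that looks like at link time (assuming GCC and GNU ld; the file names are placeholders), ld's -Bstatic/-Bdynamic toggles let you choose per library:

    # Static C++ runtime and libcurl; dynamic libc, libX11, libGL.
    # Note: a static libcurl drags in its own dependencies (zlib, a TLS
    # library, ...), which would have to be listed here as well.
    g++ -o game main.o render.o \
        -static-libstdc++ -static-libgcc \
        -Wl,-Bstatic -lcurl -Wl,-Bdynamic \
        -lX11 -lGL -lpthread -lm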


It's definitely technically possible to find a hacky combination that works today, but only by shouldering the cost of understanding all the possible deps and symbols, and the structure of their underlying objects, used by all possible graphics drivers and X11 libraries, now and into the future. You can hack around it today, but it's a fool's game and absolutely not something that can be relied on to keep working in any sound manner, without risking, e.g., some memory corruption due to a struct layout difference long in the future.

I wanted to check whether the official Nvidia driver uses C++, but it's not installed on this machine just now, and of course that I even have to check just highlights the problem with this approach.


> I wanted to check whether the official Nvidia driver uses C++,

I don't know about Nvidia, but I had a problem two years ago on a machine with a Radeon card: I was developing GUI software which used LLVM at some point. Insta-crash at runtime on this computer whenever I'd open a window. The reason? The Radeon driver linked and initialized LLVM, which didn't support being initialized twice...


There's no other option, though. If you aren't integrated into the distros' builds, then you have no way of getting your binary updated when a dependency makes an ABI-incompatible change -- much less shipping a single binary that works across all distros. The only reasonable option is to determine a subset of your dependencies which offer long-term stable ABIs, dynamically link those, statically link the rest, and hope for the best.


But there is another option, it was mentioned in the original complaint :)

> So it means for every product, have a build for every distro / version combination you want to support.

Of course it's a total pain in the ass, but it's still vastly preferable to unfixable bugs sometime in the distant future


I'm not sure how per-distro (or distro-version) builds fix the problem, unless you're committing to continuously producing new builds for future distro versions to account for ABI changes. If you can do that, you can also fix issues with static linking if/when they come up.


There's a better solution: don't ship on an OS that insists on being a pain in the ass to ship on.


This is added complexity that neither the Windows nor Macintosh runtimes demand of their developers.


Windows actually does require this, which is why applications tend to ship with (or otherwise require) some kind of C(++) runtime, like those provided by MSVC or MinGW. Some rely on these being installed separately, while others include them as part of the app's installation process. Same deal for .NET. I haven't done enough macOS development to confirm or deny, but I'd assume the situation is similar there, too.

The actual difference is that Visual C++ and XCode (presumably) automate most or all of this, since this is the standard way of compiling and distributing applications for those platforms. In contrast, Linux development tends to revolve around software that can be recompiled for each distribution, so the tooling is going to be optimized around a workflow of relying on system-wide libraries and binaries and other data managed by a package manager.


Great, and now I also have to track security updates and ship those for all my dependencies. libcurl is a great example, with a release cycle of 2(!) months and tons of vulnerabilities being fixed each time.


[flagged]


[flagged]


That is completely ridiculous. Pointing out that statically compiling dependencies or shipping them with a game can solve many so-called compatibility problems is directly related to games working on Linux.


It is also the solution they currently use on Windows... and it is what games on Steam that support Linux are doing.

I won't comment on shipping your own glibc, because that probably could cause issues.


As you can see, both comments were indeed flagged by one of those stackexchange types who are always marking useful questions as off-topic. Probably the same person who thinks using anything but rpm is bad.

The thread has since been detached for the crime of discussing voting, so only we can see it.

The first rule of HN voting is not to discuss voting.


It's a wonderful idea on paper, but all that's needed is for one of the many mandatory deps for running a GL program to link against the system libcurl (or any other replaced dep) and you're back in Crashville again - population: 1 cloth-eared developer.


Are there mandatory dependencies for running a GL program that can't work without using the system libcurl shared library? How does that make any sense?


If you distribute and link libcurl v69 dynamically, the dynamic linker symbol table for your process will contain references to v69 symbols. If you then load an e.g. GL driver that in turn links libcurl v42, the GL driver will end up with some combination of v69 and maybe v42 symbols, despite being compiled against v42 headers containing v42 struct sizes and offsets.

Now there is an opportunity for silent memory corruption in your program - one that, even once detected (which may not happen for years), cannot easily be fixed without completely rearchitecting the build.

When using a wide variety of system APIs, including at least NSS and OpenGL, there is little control over which dynamic libraries end up loaded; the exact configuration will vary across machines.

Substitute libcurl for libexpat or libX11 or libstdc++ and the principle continues to apply. I can't think of a good reason something like libcurl would be used by the Nvidia driver or some bizarro embedded board's GL driver, but that doesn't mean the Nvidia driver doesn't now, or won't in the future, link it, and if/when that day arrives, the problem lies entirely with the program that mixed random system deps with random self-compiled deps.
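If you ever need to see which library actually wins for a given symbol at runtime, glibc's dynamic linker can trace it (binary name hypothetical):

  # Trace symbol resolution to spot cross-version interposition
  LD_DEBUG=bindings ./mygame 2>&1 | grep curl_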


> If you distribute and link libcurl v69 dynamically,

This whole thread is about avoiding this.


> So now the solution is AppImages where you bundle up your app with all dependencies like a container.

But isn't this what you're doing with Windows? No version of Windows comes with libcurl (as far as I know) so you put the DLL in your application directory. It's no different.


On Windows, I use the Windows API for downloading files. I don't need to worry about curl.

On Linux, I could have statically linked curl, but didn't realize it was going to become an issue. When Ubuntu 18.04 was released, libcurl3 was removed, which meant our previously working software was either automatically removed by the package manager or started crashing.


Can an installer prompt the user to download missing libraries if needed? For example, Dota 2 on Windows downloads a bunch of C++ runtime libraries on first launch. SharpKeys and a bunch of other software do that (a certain version of the .NET runtime).

libcurl may or may not be a good example, but surely all software has dependencies that aren't shipped with the OS. In the Linux case the special pain is that the library shipped with the OS might be outdated, but bundling compiled lib/.o/.so files should work.


On Windows, I often get prompted to install C++ runtime dependencies when installing. One big thing Steam did was manage multiple installs of DirectX for each game. It's just a matter of the cost/benefit on Linux becoming good enough to justify overcoming these things, like it has on Windows.


Which is probably way at the back of developers' minds: the main audience of Linux is happy to compile from source (either themselves or via a downloaded package).


Exactly. Quite a few of the "problems" with Linux development ultimately stem from trying to shoehorn Windows-oriented expectations into Linux development/deployment. Very few applications in the Linux world target specific distros themselves; instead they publish their source code and leave the actual distro-specific finagling to distro/package maintainers (sometimes those maintainers are also developers, but they're regardless treated as separate concerns).

Of course, quite a few game developers are curiously averse to publishing any kind of source code, which tends to thus require compatibility efforts to happen on the developer side instead of the package maintainer side. This can still be worked around by targeting a specific distro (say, Ubuntu or SteamOS) and letting package maintainers for other distros apply whatever workarounds they deem necessary to get the app running outside of a "supported" environment (see also: Steam, Spotify, Slack, etc.).


I'll grab source code if I'm doing development or something obscure, but I feel like packages have been pretty solid and widely available since the early-to-mid 2000s. When working with source code I'll almost always install to a custom PREFIX - but if it's just an app, the source is the last resort.

All of the commercial packages I've used at work for the past 15 years target either one or a handful of distros. Even the open source stuff that's either too new or outside of the default repos I much prefer to install from a package; turbovnc, grafana, chrome, etc. The few games I've seen have worked the same way as other commercial user-oriented apps.


You don't have to statically link, you can use LD_PRELOAD to point to local versions of all libraries. It isn't optimal, from space/security perspectives, but it works. You can use the local libs first or last in the path.
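A minimal launcher-script sketch of that approach (paths and names hypothetical):

  #!/bin/sh
  HERE="$(dirname "$(readlink -f "$0")")"
  # Search the bundled lib directory first, then the system paths
  export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
  # Or force specific libraries ahead of everything else:
  # export LD_PRELOAD="$HERE/lib/libcurl.so.4"
  exec "$HERE/mygame.bin" "$@"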


You don't even have to do that. When you compile your binary you just need to set the rpath to your own directory of libs, and they will be searched automatically, falling back to the system if missing.


Last time I looked the rpath is a static absolute path, which is inconvenient if you want to install the libraries elsewhere.

The automatic fallback to system libraries can also lead to mysterious problems.


The linker recognizes a variable that expands to the location of the binary so you can use relative paths.

${ORIGIN}/lib will be the relative library path.


Fascinating.

Some useful stack overflow answers on this topic. First, the trick to get GCC to pass the ORIGIN option to the linker: -Wl,-z,origin.

Secondly, how to pass it as the rpath option:

"-Wl,-rpath,'$ORIGIN' -- note that you need quotes around it to avoid having the shell interpret it as a variable, and if you try to do this in a Makefile, you need $$ to avoid having make interpret the $ as well."

https://stackoverflow.com/questions/38058041/correct-usage-o...

The chrpath utility is also fascinating, as it allows you to rewrite the ELF headers to change the rpath. Possibly better for installs than passing lots of env variables.

Various warnings on this topic to not use actual relative paths unless you really want that behaviour.
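Putting those pieces together, a sketch (binary name hypothetical):

  # At link time: quote $ORIGIN so the shell doesn't expand it
  g++ -o mygame main.o -Wl,-rpath,'$ORIGIN/lib' -Wl,-z,origin
  # After the fact: chrpath can replace an existing rpath,
  # patchelf can set one from scratch
  chrpath -r '$ORIGIN/lib' mygame
  patchelf --set-rpath '$ORIGIN/lib' mygame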


I work for VCV, which also ships for Linux. I don't find it an issue at all, since the build system is a Makefile supporting Mingw64/MSYS2 on Windows, Mac, and Linux.

We statically link everything except glibc, which we dynamically link against version 2.23 (meaning that Ubuntu 16.04 is the oldest we support). We've had no problems with this approach, although the disadvantage is that we have to use an old version of GCC to compile the software, since I haven't figured out a way to make a new GCC version produce binaries that link against an old glibc.

No need for containers or anything, just make a mostly-static-except-for-glibc binary.


> the disadvantage is that we have to use an old version of GCC to compile the software, since I haven't figured out a way to make a new GCC version produce binaries which link to old glibc.

I personally compile with the latest GCC or LLVM on CentOS 7; this way I can use the very latest C++ standards with a venerable glibc.
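Red Hat's devtoolset collections are one well-known way to do this; a sketch (exact package versions vary):

  # Modern GCC on CentOS 7: the newer libstdc++ bits get linked
  # statically, so the output still only needs the old system glibc
  yum install centos-release-scl
  yum install devtoolset-7-gcc devtoolset-7-gcc-c++
  scl enable devtoolset-7 -- make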


That sounds more like something to blame on the Debian family than on the whole ecosystem. I might be running "bleeding edge" on Arch, but I can still depend on and use libcurl3 via the libcurl-compat package.

I maintain about two dozen software packages in the AUR, including some really old stuff like the Heretic 2 Linux release from 1999 and RBDoom 3 BFG which has a boatload of dependencies. Breakages are extremely rare for the average package even with the rolling release and generally any breaking change in a common library will see the legacy version hang around since stuff will still depend on it.


I don't think it's Debian, because Debian tends to have compatibility packages for old versions too. For libcurl there's libcurl3. It depends on the package, of course.


Only for so long - try installing qt3 on a recent Debian or Ubuntu.


I don't have a Debian machine to test it on, but a casual search finds qt1-3 in the AUR on Arch. Of course, building the whole GUI suite would be a bit annoying, but it's better than nothing if you have some really old software that depends on it.


I might compare qt3 to Silverlight or ActiveX.


Why not pick a "distribution" like org.freedesktop.Platform/18.08 (i.e. pick a flatpak runtime) and support that? It runs on all desktop distributions. Updating supported flatpak runtimes is then at your leisure, not on their release cadence.
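A sketch of what targeting that runtime looks like (the app ID and manifest are hypothetical; the manifest would pin "runtime-version": "18.08"):

  flatpak install flathub org.freedesktop.Platform//18.08 org.freedesktop.Sdk//18.08
  flatpak-builder --user --install build-dir com.example.Game.json
  flatpak run com.example.Game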


How long are flatpak runtimes supported? What is their update policy?


The runtimes are not locked down; anyone can publish theirs (under their domain name, of course), so it depends on who publishes it.

For the freedesktop one, they have the following policy[1]:

  - When a new stable release is done, the changes on that branch will only be:
    - security updates
    - stable releases (tested carefully); no ABI breaks / API build breaks.
    - we will try to keep updating that branch as much as we can

  - We release a new major release every year, only if:
    - There is a ABI break
    - There is a "API build" break (apps might not compile because new major releases of important packages (GCC))
    - Looking at the GCC and other major project cadences, this is likely going to happen annually.

  - We only maintain the current stable release and the previous one, this means:
    - Stable releases get 2 years of security updates
    - We maintain maximum 3 releases at any given time:
      - Development
      - Stable
      - Old stable
[1] https://gitlab.com/freedesktop-sdk/freedesktop-sdk/wikis/rel...


Proprietary software is a bit of an alien on Linux.

But there is always static linking, is it not? (And now flatpak+snap+...)


I concluded ages ago that no substantial applications should be integrated into an operating system. Dynamic linking if you're building something portable is just an awful idea because of how much variety you'll inevitably have to support (or choose not to support).

Some people have written about AppImages, this also applies to FlatPak and similar techs too. Isolating anything more complicated than a command line tool is the way forward and the tech exists. Not using isolation this way is a recipe for pain, whether it be on the desktop or the server (or the phone).

It depends how the plugins are loaded - it'd be great if they could use sockets so that AppImages were viable. No idea myself, though.


They are VST plugins, and they are dynamic libraries. You call dlopen and then call a specified function.


Confusingly, you can statically link a dynamic library (i.e. the dynamic library includes all its dependencies statically instead of recursively depending on more dynamic libraries). You can even link it in such a way that the dynamic library gets its own version of each dependency regardless of what the main executable is linked to (otherwise the main executable's links could override yours).

I did this once with a Ruby gem that had particular C++ dependencies that kept breaking when the build machine had different library versions than the production machines. If you aren't integrated directly with the development of a Linux distro, it's best not to use their packages as runtime dependencies, since they really only consider their own use before changing things up.
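A sketch of such a link line (names hypothetical): -Bsymbolic makes the library prefer its own definitions over the host executable's, and --exclude-libs keeps the statically linked deps from being re-exported.

  g++ -shared -fPIC -o libplugin.so plugin.o \
      -static-libstdc++ -static-libgcc \
      -Wl,-Bsymbolic -Wl,--exclude-libs,ALL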


I am developing an open-source DAW with Qt (https://ossia.io). To solve the problem you mentioned, I used AppImage with great success - of course you have to ship all your libs, but that's what you have to do on macOS and Windows anyway.


Do you ship any plugins? Do you do a separate AppImage for each plugin?


> Do you ship any plugins?

I used to but I'm switching to a different model

> Do you do a separate AppImage for each plugin?

no, they are just loaded as .so / .dll / .dylib in the ~/Documents/score/addons folder. They can't have further dependencies though.


The DAW isn't Bitwig, is it? (just because your username has the same number of syllables ;) If so, then PLEASE keep up the great work supporting Linux. I was only able to ditch my Mac a year ago because I switched from Ableton Live to Bitwig!


Renoise is also available on Linux; great product, works like a charm :D


Also, Renoise is frigging awesome.

It is a blessing to be able to work with a different paradigm than the piano roll, and with such modern tooling (Renoise supports VSTs, ReWire integration and whatnot), being essentially a supercharged tracker.


I've been meaning to pick it up, would you recommend any tutorials for figuring out how to use the tracker interface to sequence?


Not BitWig


Music software's dependence on Windows is such a mess. Tons of cool plugins are stuck as Windows-only VSTs! I was really hoping that Propellerheads would use the fact that it can ship on any platform to ship Reason and all associated REs on Linux or in-browser, but instead they gave in and added VST support.


A historical lack of low-latency audio is likely why you haven't seen many music software releases on Linux. For years, you had to have a custom kernel to get "decent" performance.


Can confirm. I fondly remember late nights hacking the Gentoo kernel so my college radio station could reliably live stream shows via icecast and darkice.


Music Production is one of the last things I keep Windows around for. That sounds like a cool job though.


The biggest problem with plugins, IMO, is that they often don't support the same platforms the DAW does, so they usually come without Linux installers (even though the code runs just fine on Linux...)


Why wouldn't you just create a statically linked binary?


AFAIK you can't call dlopen from a static binary, and if it's a DAW you have to support dlopen in order to open the VST plug-ins


Some binaries are meant to be linked statically


bitwig?


Why not use Electron? Take a look at VS Code: it has plugins and it seems to work everywhere.


Electron is basically Node+Chrome. There’s no way in hell it’s appropriate for writing VSTs.


In about ten years someone will post one to HN. Written in Javascript.


And it will use 1TB of ram to run.


Unfortunately DAW and VST plugins are not something you can make in Electron


Can you please explain why?


Audio software has strict buffer/latency demands which usually cannot be guaranteed on interpreted language platforms.

Doing audio synthesis with JS or any other interpreted language is totally possible and has been done in a more or less serious way in several implementations, web toys, etc. But if you need extremely low latency and hard guarantees, you cannot go that route, sadly.


You can always write a native extension in C++/Rust/C.


so then why use electron as well if the majority of your code is going to be native anyway?


exactly.

the UI is the minority of the code. the main engine is gonna need to be implemented in some language which has strict guarantees about performance. also, not having a garbage collector that fires in a seemingly random way helps a lot too :)


It's probably a decent way to package your program. And you can use all the glammy JS you want.


There are a few reasons:

- they run embedded in a DAW most of the time

- they are usually computationally intensive themselves

- they are meant to be instantiated as far as memory/CPU allow

As to the third point: music producers already require and use pretty powerful rigs: 32-128GB RAM is not uncommon, CPU as good as it gets. There's a great benefit when you can run 100 parallel instrument synthesizer instances vs 14 - it's the difference between a differentiated orchestra and a rock band.


Because the plugins embed into existing native applications and use existing native SDKs.


Friends don’t let friends use Electron


> You have to pick the distros you want to support, and then once you've picked a distro, what versions you want to support.

What if game developers release the game's source code and let community developers help with porting to different distros and platforms? The game's assets can remain paid. For example, Doom has been ported to pretty much all platforms, and it's up to maintainers to ensure compatibility. I guess at this point it becomes a partly open source/free software game.

I'm aware this may not align with current business practices.


> And you need to use the C++ version that ships with that distro, so if you want to support old versions then the entire project is held back from using latest C++.

> And distos aren't backwards compatible. ie when libcurl4 is released, they remove libcurl3. So you can't have one binary for Ubuntu 18.04 and 16.04.

Release the source and the community will help you with many of these issues.


I don't know why people say that. You don't support Windows XP and Windows 10 with the same binary either.

In contrast to Windows you can provide or pay someone to provide libcurl3 if you want to continue to use it. MS will just say you can go F yourself.

In contrast to Windows you can study the source code and write a wrapper that provides a central API point that you can use in your single code base.

In contrast to Windows, you can actually go there and provide patches. Even if the original authors won't merge them, you can still use them via a fork etc.

And last but not least, I bet there are actually still people supporting and providing libcurl3 binaries to this day; you just need to google their package server and add 2 lines to your installer script (one to add the public key for that package repo and one to add the package repo to your package manager).
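Roughly these two lines, in apt terms (the repo URL and key are hypothetical):

  curl -fsSL https://example.com/repo/key.gpg | sudo apt-key add -
  echo "deb https://example.com/repo stable main" | sudo tee /etc/apt/sources.list.d/example.list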

PS: If you provide software to sizable numbers of people, you need to support one to three of the three reasonable distro families: Debian, RHEL, SUSE. Even if you support just one, most people can deal with it thanks to VMs or Docker.


For a 32 bit version, a single binary compiled in VS2017 could support XP to 10. For 64 bit, a single binary supports Vista to 10.

In my experience, MS doesn't say go F yourself. They go to extreme lengths to keep old software working.


> You don't support Windows XP and Windows 10 either with the same binary.

I think you picked a terrible example there, because quite often people do; certainly Vista upwards is quite normal (that was the transition point for lots of APIs).

Windows is much clearer about how you're supposed to solve "DLL hell". You have the OS libraries, which provide a stable API; COM, where interfacing is done at runtime dynamically; and for everything else you put it in your application's directory.

Theoretically if COM components aren't interchangeable - the API has expanded - they should have a different CLSID and therefore not clash.


This is a common misconception. Microsoft is a little insane about backward compatibility. You can target versions of Windows with a single binary, from the latest all the way back to unsupported OSes like XP. There are lots of companies that take advantage of this. It's one of the reasons why MS has so much trouble moving app developers to the new hotness, even if it's safer, faster, or whatever. Because of that, the new hotness doesn't get enough traction to support continued development, and they sunset it early, which draws even more ire from people.


I use a Windows audio software binary last compiled in 1997, for Windows 95. It still works just fine on Windows 10 64 bit.

That's 20 years of backward compatibility.


While I love Linux, I think this is a totally fair "state of the union". At the end of the day, gaming or Kickstarters or similar... are a business. You need to figure out your ROI (return on investment). It doesn't make any logical sense to spend a huge amount of time on a platform that nets you tiny amounts of sales (and gains you a huge amount of support costs).

This is why gaming on the Mac is only now starting to be a real thing. Until a decade or so ago, there were still relatively small numbers of people using a Mac. It cost a lot to develop for a new platform and didn't make much money. Now it makes more sense.

Of course Linux fragmentation will probably hold it back for significantly longer.


Is it really fragmented? You support Ubuntu (in either its flatpak runtime or the Steam runtime) and you let everyone else figure it out on their own. Distros like Arch have been very good about providing the Steam native runtime and the Ubuntu flatpak runtime in their own package systems for just that reason.

The support burden for developers is proportional to the operating systems you officially say you support. If you support "Linux" you are supporting thousands of completely dissimilar execution environments. If you support Ubuntu, or Steam, or a Flatpak target you are only supporting that one operating system.

And that's fine. That's all Linux users really want anyway. If it breaks on your distro, it's your responsibility to fix it, so long as it works on its officially supported target OS.


> You support Ubuntu (in either its flatpak runtime or the Steam runtime) and you let everyone else figure it out on their own.

And isn't that already how it is? Pretty sure Steam only supports Ubuntu LTS; Steam on every other OS is an unsupported hack.


I think that is what zanny is saying. The fragmentation argument is FUD.


They don't support every LTS as far as I know. Also, they don't really make any effort to keep up with getting ready for the next LTS release.


> This is why gaming on the Mac is only now starting to be a real thing.

Is it? I mean, there's a good amount of support for gaming on the Mac, but I think it's always been so ... I wouldn't call it "a thing" though ... not in comparison to PC or Console ...


I remember when major games being launched on Mac was worth a mention, or even a full feature in an Apple Keynote. Nowadays games are being released on the Mac constantly and nobody bats an eye. It's a better time to be a Mac gamer than ever before.


It would be even better if they had decent GPU's available without spending $1000+ on an external setup.


Or if they went with Vulkan instead of rolling yet another half-baked graphics API while also deprecating OpenGL because reasons.


Agreed - have been a Mac user since 2004… and got the first Intel iMac specifically for the ability to boot Windows for gaming too. I think the only native Mac OS X games I had when I got that first 12" PowerBook were Q3, UT2004, Starcraft / WC3 / Diablo 2… and maybe Tux Racer ;)

Have long given up on dual booting. Granted, much of my gaming is also done on consoles; Mac gaming support is better than ever outside of the still generally anemic GPUs (and who knows what will happen when they move to ARM… a great many games will be orphaned on x86 forever -- e.g. will Starcraft HD get ported to ARM?)

Thank you, to all the indies (and open source engine porters) who have supported Mac over the past 14 years!


This was definitely not the case 10-15 years ago.


I had an iMac G4 about 10-15 years ago and had a slew of good games for it. Nowhere near the amount available on other platforms, but I was kept busy. These days when I look on Steam, for instance, yeah, there's perhaps a larger proportion, but most games are still Windows-only ... The Mac never was, nor will it ever be, a serious gaming platform.


The same claim can be made for Linux today. I have a large number of games I can play. I would still say it is a smaller market.


Not disagreeing ... it makes little sense to develop commercially for Linux and it never has. That doesn't mean the Mac isn't still a niche platform though, and banging on about the number of tickets raised isn't really an appropriate way to quantify the value (or not) of a particular segment.


They said the problem wasn't fragmentation of distros, but the versions of graphics drivers, and it's not like Windows machines are all on the same version of the same driver.


Windows generally has less of an issue with a heterogeneous graphics driver ecosystem, mostly for market reasons - Windows laptops and desktops are bought for playing games more often than Linux machines, so a Windows machine hitting the market with a graphics driver that's too far outside the norm tends to perform poorly in sales, and the market encourages driver homogeneity in basic functionality. I've seen some stinkers while trying to write my own game engine (THANKS FOR LYING ABOUT THE OPENGL CAPABILITIES BY IMPLEMENTING IT IN SOFTWARE AT SECONDS-PER-FRAME PERFORMANCE, INTEL, YOU ARE A PEACH), but not as many as on Linux.

(in short, "What machine should I buy to play games" is a question that's easy to answer for Windows; there's reams of magazines dedicated to the question. It's a harder question to answer for Linux, which makes it a harder question to answer for developers trying to write games against a Linux-based OS).


This is just an out-of-context tweet (well, the context is there - it's a reply to another tweet, which is out of context)... so perhaps it was just auto-crash reports, but I wonder if you get more tickets from Linux users, because Linux users are more likely to file tickets?

When something crashes on my partner's computer she yells at it and restarts it. When something crashes on my computer I spend an hour trying to figure out WHY. If I can't figure it out, I log a ticket thinking I'm being a good citizen!


They still represent a vastly disproportionate amount of work for how much extra money they get you.

They get you 0.1% extra sales, but they cost you 25% (= 20/80*100) extra support work. And that's never mind the extra time spent during development.

Your point would hold if the issues reported by Linux users also fixed issues on Windows. Some issues would occur on both systems, but I bet the vast majority are weird Linux- and setup-specific bugs.


> I bet the vast majority

This is the crux of your assertion and requires substantiation.

On the other hand, all these Linux users might be doing you "a favour" by taking the time to log tickets that less conscientious users on other systems wouldn't.

Of course that does depend on the classification of the tickets, but coming from that community I wouldn't expect them to be trivial issues ...


It's even more of a favor if the bugs aren't just linux-only. But if they're in gfx drivers like the tweet implies and auto-reported, the only thing you'd really care about is "newest drivers?" Same as the parade of crashes in Windows for the same reason, you want to just tell people to update.


Which is why Chromium pulled support for certain drivers on Linux.


They didn't pull the support at all.

They've added another driver to a long list of blocklisted drivers that cause issues with the hardware acceleration. Nothing about that list is Linux-specific, as it includes a bunch of macOS and Windows drivers as well: https://src.chromium.org/viewvc/chrome/trunk/src/gpu/config/...

Firefox does that too: https://wiki.mozilla.org/Blocklisting/Blocked_Graphics_Drive...

Between a crashy browser (thanks to a crappy driver) and a stable browser without hardware acceleration, browsers will always take the latter route. Don't like it? Run it with a flag and expect it to be a little less stable. That's it. You can blame neither the browser nor the platform for that, only the vendor of the driver (or, in this specific case, NVidia, since it's actively hostile to the vendor).


I'd expect that class of bug is easy enough to triage ... what's the problem?


> requires substantiation.

From the linked series of tweets:

"In the end they accounted for <0.1% of sales but >20% of auto reported crashes and support tickets (most gfx driver related)."

"So yes, fragmentation is still totally an issue."

"We eventually laid out a guide with known good versions of Linux and graphics drivers, but it didn't matter. Part of the allure of Linux is the customizability, so few actually stuck to it, and generally wanted to run the game on older hardware we didn't support."


Historically, that 0.1% income claim has been disproven a couple of times. Some indies reported a pretty even split between Windows, MacOS and Linux and even a willingness for Linux gamers to pay more. But that was before Steam for Linux, so I do not know how that has changed things. Just wanted to throw that out there.


Why do they have to do the work?

If someone releases a game and I play it, maybe I get a Linux-only bug where the audio sometimes becomes static-y until I reset (real world example). If I file a bug for that, they probably won't fix it and I probably will (and did) continue playing. I lose nothing, they lose the time to triage a bug.

If I can't launch the game and I file a ticket, it's probably because I either want to exchange my money for a working game - which on my own might not be enough incentive for them, and that's OK - or because I have to do it because of asinine refund policies.

The developer is framing bug reports as a burden, when really they're neutral or a benefit - the bug that exists because of the lack of platform-specific work is the burden. That's understandable, there are a lot of platforms and not a lot of paying players.

If bug reports are not a benefit, perhaps software will stop trying to send usage data back home? I doubt it.


Because having a buggy game on any platform is a reputational risk. If your game is buggy, people will complain and people on other platforms might reconsider buying even though the version they'll get might be bug free.


You'd think having another build target would help find bugs on existing platforms. Kind of like the BSDs.

Didn't id Software outsource the Linux ports to someone in the community and just provide them for free?


>They still represent a vastly disproportionate amount of work for how much extra money they get you. //

(With a sarcastic tone ...)

So you spend 1000 man-years developing your AAA title, which you release for MS Windows only.

Then someone makes a WINE wrapper on their w/e off and you start selling on Steam as being able to run (but not supported) on Linux.

Now you find so many more Linux users filing bug reports to help you fix your game ...

Bloody Linux users, eh, who'd have them.


That's not what happened in the article (tweet). They released a native Linux version.


Sorry, I thought the thread was about the broader subject of supporting Linux gamers rather than Planetary Annihilation alone. The title should probably be changed.


Similarly, if I hear from all my friends that a program crashes a lot on Linux, I'm much less likely to want to download it in the first place.


https://twitter.com/bgolus/status/1080544133238800384

Follow-up tweet by the same author:

> As a follow up to this, I've been told by those actually involved with Linux stuff that this wasn't true. I probably just stopped paying attention to Linux issues at a time when everything was broken.

Just so the discussion does not overly focus on the (apparently wrong?) numbers.


This is a follow-up to another tweet, though.


This is a pretty well-known effect in gamedev. The specific numbers change, but Linux sales are only a small percentage of total, while generating much more development and support work.

At least dev work can be drastically simplified with middleware like Unity, which is how small studios can even consider Linux. But support difficulties are real. Players have so many possible distros, configurations, and drivers - actually supporting all of them would require a level of Linux expertise that gamedevs simply don't have (being typically Windows or OSX based themselves). Limiting yourself to something like Ubuntu LTS helps a bit, but there are still plenty of gotchas.

So it becomes a simple matter of numbers: given relatively small amount of extra sales, is it worth the extra work, and spending the time to learn Linux development and administration in sufficient depth? Sadly usually it is not.


But for a game that is built on Unity, wouldn't these support issues be Unity's responsibility? Assuming the issues are triggered by their glue layer.

What's the arrangement there, does anyone know?


Unity does not do your customer service for you. Also, bug fixes might only go into a future release of Unity (months out) and updating to the new release to get the bug fixes might break your game. Unity games that spend a few years in development often ship on a very old version of Unity.


Gaming is the #1 reason I haven't fully switched to Linux on the desktop at home (at work I exclusively run Linux).

A lot of my Steam games work flawlessly with Linux but others will just silently fail to launch and it takes hours to properly diagnose and fix, which isn't what I'm after when I get some time to play games.


I've switched to 100% Linux at home due to Steam Play. All of my Windows games (except 2) work perfectly on it. I really hope Valve keeps improving it, it has the potential to be a game changer for Linux gaming.


Your comment is the perfect example of how there's an "invisible" demand for gaming on Linux.

Nobody can see it, because those who want it will mostly make do with Windows.


+1.

Personally, I spend 98% of my time in Linux, but I dual-boot just for the sake of the few games I like that need Windows to run. If not for that, I'd have dropped Windows years ago.


I haven't logged my hours, but substitute 98% above with whatever my number is (90+%). I don't even have Windows installed on my main, but older, desktop. I just have it on my newer laptop, which is where I do any gaming of significance (when I have time nowadays).


Invisible demand is indistinguishable from no demand, as far as B2C is concerned.


I agree that they can't distinguish it but there was no visible demand for an iPhone before the iPhone.


Only if we forget about the work done by Symbian, Psion, Microsoft, Compaq, and Dell on portable touch devices.


How did the demand work out for Symbian, Psion, and Microsoft phones?


Pretty well in Europe and Asia, until a certain company in Mountain View decided to screw Sun and offer a mobile OS to OEMs for free, in exchange for user data.


People forget that Windows Mobile 5 and 6 unseated Palm to become the sales leader in smartphones in the mid-2000s, shortly before the iPhone came along and blew them and everybody else out of the water.


Not true. Invisible demand is a potential market and a bunch of prospects in waiting.

However that niche is way too small to be worth it.


For all the prophets out there: no one is stopping you from building a business around that "invisible demand". Let's see how well you can predict the market.


This happens all the time. Some endeavours fail, others succeed. Some are more likely to succeed or fail than others, and some succeed or fail unexpectedly.

Nothing to do with prophecies though.

Not every demand can be fulfilled, no matter what.


I'm in the same boat. I'd love to be done with Windows, but since I spend a large amount of my free time playing games with friends, I need to be on Windows.


Despite the problems and shortcomings, I'm actually pretty impressed with how well Valve has managed to make Linux gaming work over the past decade. Of course, a lot of credit is also due to other contributors to the WINE project, but Valve used Steam to create a targetable set of dependencies for native games in addition to their WINE contributions. I'm frankly a bit amazed that they solved that part of the problem. And a bit disappointed in the people blaming them for not somehow solving the GPU problem by continuing to push Steam Machines no one was buying anyway.

For me, the gaming situation on Linux has become tolerable enough that, in comparison to Microsoft's Windows 10 bullshit, it is no longer a barrier. However there are several other barriers that are unlikely to be dealt with any time soon.

