A guy preserving the new history of PC games, one Linux port at a time (404media.co)
244 points by rcarmo 7 months ago | 133 comments



> People think of the PC as this completely invincible platform where once you get a PC version it lives forever

that's true -- thanks to the free win32 api implementation wine, x86 userspace emulators like qemu or box86, and piracy for drm-laden titles.

decades from now, windows games will be runnable on then-current platforms just like you can now run them on your android or arm macos device.


I mean, I've been running NES, SNES and GameBoy games on emulators (in Linux, natch) for decades now. Hell, I played through FF5 for the first time a few years back.

Edit to add: I also run DOS games like Ultima Underworld, Master of Orion 2, etc. in DOSBox; things are already pretty good, but I will point out two other examples to argue for releasing source: Jagged Alliance 2 and Neverwinter Nights. Both games were released for Linux, but only JA2 eventually got a source release, and it still runs, because having the source gives you that. Meanwhile, I still can't go back and replay NWN.


Someone has to maintain Wine and all the rest. So it requires effort and isn't something simply guaranteed.


But that's one thing to maintain. With every game you port to Linux, you have one more thing to maintain and update to whatever shit breaks it on the userspace side 3 years from now. That's O(n) while maintaining wine is O(1).

Yes, this is oversimplified, since Wine is more complex than most games, and there will also be regressions and such because you can't test every Windows game ever made each time you change something in Wine. But still.


This is a bad analogy, because Wine also has dependencies on various "Linux APIs". So technically if you maintain Wine you are also maintaining all the libraries Wine depends on, and so on.

It would probably be easier for a game to just target the libraries Wine depends on, or even better, some smaller and more "least common denominator" abstraction layer (e.g. think libao for audio) sitting on top of the same libraries. While you can see Wine as de facto providing such an abstraction layer (e.g. you can switch audio drivers, video drivers, etc. in Wine), they are forced to expose an API whose evolution they don't control (Win32), which is something that long-term is doomed to fail.


> So technically if you maintain Wine you are also maintaining all the libraries Wine depends on, and so on.

You are confirming that it is a good analogy. If some lib changes its ABI, in the wine case you need to fix it once: in wine. If you have 2000 Linux ports of games, you need to fix it 2000 times.

Your libao example is exactly the problem we currently have on Linux. Nothing is really stable, api and abi breakages happen almost everywhere. You can be careful and pick projects that promise to never break abi compat only to still get disappointed by at least one eventually.

And while win32 is not under your control regarding new features, it is by far the most stable userspace api and abi in existence.

See also: https://news.ycombinator.com/item?id=32471624 (article and comments)


So something like Wine could be developed for Linux with a focus on gaming.


The problem is maintaining long-term compatibility, which is what Wine is doing. Native Linux ABIs can break (soname changes, etc.), and unless someone keeps translating old ones into newer ones, it won't work.

I.e. something Wine-like for the Linux ABIs that games themselves use would be great.


A good example of that idea is translating the older SDL into the newer one (as sdl12-compat does, implementing the SDL 1.2 API on top of SDL 2).


It would actually be nice to have Linux's own compatibility layer similar to Wine, one that preserves old APIs the same way Wine does for Windows ones. But so far no one is really planning to make one.


Then maintaining Wine is more like O(lg n).


To keep Wine going, buy the commercial version of Wine: CrossOver, from codeweavers.com


of course. please donate (money or effort) to these important preservation projects!


in the future, finding a compatible emulator version will be a lot easier than acquiring the original intended hardware


It's already easier. My point is that someone has to keep the compatibility layers working on modern systems.


Sounds like we need an all-in-one VM solution with GPU access, where the whole OS is embedded with the game and can run on top of any OS, provided there is a player for the VM itself.


In theory, can't an LLM be used to translate code from the original platform to the emulation environment, in such a way that there is no copyright infringement?


If you define LLMs as magic portals for ignoring copyright law and also assume they're magically able to perform arbitrary code transforms defying compiler engineers, then sure.


You don't need arbitrary code transforms, just correctness-preserving code transforms.


LLMs are inherently bad at this whole "correctness" thing...


Well, not if you force them to use only correctness-preserving transforms. Just reject anything else.


So how would you prove that the transforms it conjures up are in fact correctness-preserving?

Also, if you only allow correctness-preserving transforms, why would you need an LLM in the first place? Just apply those correctness-preserving transforms directly; that's basically how recompilers work, and they have been doing that successfully since the late 90s.


1. You write and prove the correctness-preserving transforms yourself. They are low-level, so you only need a small number of them.

2. The aim is to apply the transforms in such a way that you get different code, which is still efficient, or maybe even more efficient. This is where you rely on the "intuition" of the LLMs. No guarantees there, that's why it was a question.
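A minimal sketch of this gatekeeper idea (everything here is hypothetical: the functions, the "LLM-proposed" rewrite, and the random-testing check, which is evidence rather than proof; a real pipeline would rely on a verified transform catalog or an SMT solver):

```python
import random

def original(x: int) -> int:
    return x * 8 + x * 2          # naive arithmetic, 10*x

def proposed(x: int) -> int:
    # imagined "LLM-suggested" strength-reduced rewrite of original()
    return (x << 3) + (x << 1)

def equivalent(f, g, trials: int = 10_000) -> bool:
    """Accept g only if it agrees with f on many random inputs;
    anything that disagrees is rejected outright."""
    for _ in range(trials):
        x = random.randint(-2**31, 2**31 - 1)
        if f(x) != g(x):
            return False          # reject: not correctness-preserving
    return True

# keep the rewrite only if it survives the check
accepted = proposed if equivalent(original, proposed) else original
```

The point of the sketch is the reject-anything-else structure, not the testing method: swap `equivalent` for a formal equivalence check and you get the "only correctness-preserving transforms" pipeline described above.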


...but that is exactly what optimizing compilers (and recompilers) have done since forever, and they are already quite good at it.

I fail to see how an LLM could magically improve that.


Because LLMs have shown that they can reason about code on a higher, more abstract level than a typical compiler.


That's just not true. They're not "reasoning" anything, they're assembling code that appears similar to something they were trained on, within whatever additional language training constraints they're given. I've never had an LLM spit out code that didn't need fixing, which defeats the purpose of preferring them over a re-compiler anyways.


There is a problem with preserving online-only games, which work on the "game-as-a-service" model. Without a third-party / pirate server, the games simply won't work. A separate issue is that a dead multiplayer game is a dead game, especially for MMOs.


The reddit linux gaming "community" used to be virulently anti-Wine. They considered it an atrocity.


These people have completely disappeared


You can be Pro-Native without being Anti-Wine.

In 2001 I wrote: "Those irrational people who reject Wine for purely political reasons, are doing much more damage to Linux than Wine will ever do. They're trying to argue that trivial invisible implementation details matter so much to users, that they would reject Linux if their favorite games weren't native ports, even if they ran under Wine. That's totally ridiculous."

I won't name names (because they're easy to guess), but some previously anti-Wine people are still around, and doing great work! And Wine didn't prevent that from happening, or destroy native Linux gaming.

Oh what the heck, ok, I will name one name of one anti-Wine person who's completely disappeared from the tech scene, because he was a dishonest exploitive lying scumbag lawyer who lined his own pockets instead of paying his hard working, talented, and fanatically dedicated employees (ahem make that contractors who he led to believe they were employees), who he deceived and screwed over financially and legally, getting them in undeserved trouble with the IRS, so he deserves to be mentioned: Scott Draeker.

https://en.wikipedia.org/wiki/Loki_Entertainment

https://www.eurogamer.net/a-loki

https://games.slashdot.org/story/01/10/28/1437229/lokis-drae...

https://www.linux.com/news/lokis-draeker-if-i-had-do-it-over...

>NewsForge: Or how about the TransGaming model of using WineX?

>Draeker: The arrival of TransGaming to me is the clearest indication that Loki failed to jump-start a Linux gaming industry as we’d hoped, because TransGaming has nothing to do with Linux games. Their message to game developers is: “Use DirectX and develop for Windows. We’ll help you sell your Windows products to Linux users.”

>TransGaming’s strategy is the same one Corel used in its Linux applications business. In the end I don’t think they’ll be any more successful than Corel was.

https://www.osnews.com/story/211/lokis-draeker-why-run-windo...

>Loki’s Draeker: Why Run Windows Games on Linux? Eugenia Loli 2001-10-24 Games 5 Comments

>“Competitor Scott Draeker isn’t impressed with TransGaming Technologies’ plan to use its version of Wine to get Windows games to work on Linux. […] Not so fast, says Draeker, whose Loki Entertainment has been the flagship company of that “traditional” approach. Draeker has doubts about games running on Wine working as well as games actually made to run on Linux. Although Loki filed for bankruptcy back in August, the company has continued to release games, including Kohan: Immortal Sovereigns in late August and Postal Plus ‘coming soon.’” Read the rest of the article on NewsForge:

https://web.archive.org/web/20011120221025/http://www.newsfo...

Loki - "we're not quite dead yet": Filed for bankruptcy protection not bankruptcy, but future still looks bleak:

https://www.eurogamer.net/article-31053

Loki: A promising plan gone terribly wrong:

https://archive.ph/20020418225227/http://www.linuxandmain.co...

Chris Chapman @retrohistories tweets quoting and summarizing that article:

https://twitter.com/retrohistories/status/142254664440337203...

>Loki was founded by Scott Draeker, an ex-lawyer with an ambitious plan: port already-successful PC games and sell them to Linux users.

>Yup. Sell games to a niche market (particularly in 1998) best known for their conviction that software should be free.

>Digital downloads weren’t really a thing back then (broadband adoption wasn’t there), so, like everyone else, they produced boxed copies. But these were hard to get hold of, especially outside the US. Loki’s global distribution network was small, further limiting sales.

>They burned through money like it was scrap paper, spending extravagantly and massively overestimating demand. They borrowed money from a friend of an employee and bought a projection TV. Late in the company’s life, when they couldn’t make payroll, they kept buying furniture.

>"Though its biggest seller not handled by MacMillan had moved only 5,000 copies, Draeker would order 12,000 to 15,000 copies of new games. These cost money to duplicate and package and, now, money to warehouse. The biggest miscalculation came with Quake III Arena, originally published by id Software. Draeker thought that a "limited collector's edition" shipped in a tin box specially made in China would be just the ticket -- so he ordered 50,000 units, making it the least limited of all of Loki's editions. About 7,000 units sold; most would be unloaded on a liquidator later. The miscalculation cost the company a quarter of a million dollars at a time when it was having difficulty meeting payroll."

>Chapter 11 wasn’t the end for Loki; they restructured and stayed operational for another five months, but not enough people were buying their games to get them out of the hole they were in.

>Then, suspicious transactions and negligent bookkeeping were uncovered.

>Coworkers who thought they were employees were told that they were contractors, and that they personally owed the social security taxes they believed Loki had paid on their behalf all along.

>Even more serious were the large sums of money transferred to Draeker and his wife.

>"So it was not until January that Loki's employees learned that they hadn't been employee at all. Instead of W-2 forms, they received 1099s, which is what are sent to outside contractors. The significance is that Loki employees who believed that withholding and social security taxes were being taken from their pay -- and who had been paid (when they were paid at all) an amount reflecting wages with those deductions having been make -- now learned that instead their pay had been reduced by the amount that would have been paid for those purposes, which now they had to pay themselves, because Loki hadn't."

>Seriously, you gotta read this account of some of the dubious behaviour that happened at Loki. It’s like a true crime story where the culprit got away.

>"Asked if Loki had recorded those erroneous transactions, Draeker replied, "We didn't have anyone keeping records at that time. It was -- it was in the bank statements, the record of that." Those bank statements had not been kept by the company. Additionally, the company was apparently unable to produce any financial records for the period from September 1999 to May 2001. The deposition took on a surreal air at times, with Draeker refusing to say whether or not he is a lawyer and in one spectacular moment testifying that as president of Loki he could say how much had been paid to Scott draeker and when, but as Scott Draeker he could not say whether he actually received the money. Yet when asked if, shortly before the bankruptcy filing, Loki had paid him $13,000, he replied, "Uh, as I said before, there are several occasions on which Loki did pay me. And I don't recall specific dates or amounts."

[...]

Also, I wrote this on slashdot in 2001 about my own experience with Scott Draeker:

https://games.slashdot.org/comments.pl?sid=23108&cid=2491410

>I worked full time at Maxis on The Sims for three years, and all that time I kept the idea of porting The Sims to other platforms in mind. So I wrote code as portably as I had time to, and thought a lot about what would need to be done. I evangelised to my co-workers and managers at Maxis about how I thought Loki would be the ideal company to port The Sims to Linux. Since there really isn't much demand for a Linux port, I proposed doing a Mac port in a way that would facilitate them both. Before The Sims was ever released, I wrote and sent a proposal around Maxis, outlining how to port The Sims to the Mac and Linux, using SDL and Open GL.

>I met Scott Draeker at the Game Developers conference on March 7 2000, about a month after The Sims shipped on Feb 4. I suggested that Loki port The Sims to Linux, because I was optimistic that it was going to be a popular game. He didn't seem to think so, and brushed me off, with a "go away kid, you're bothering me" attitude.

>But I gave Scott Draeker the benefit of the doubt, that he was just tired after a long day in the trade show booth, and not really as curt and indifferent to the idea as he seemed.

>Once The Sims shipped, I left my full time job at Maxis to work on some of my own projects, but I kept working on The Sims for Maxis as a contractor. I worked on content creation tools, developed Transmogrifier and other stuff. I still have legitimate access to The Sims source code, and I keep Will Wright up to date on what I'm doing.

>As a proof of concept, I started porting The Sims to Linux on my own time. I hoped to overcome the skepticism of some people at Maxis, as well as Scott Draeker at Loki, by demonstrating that it was indeed possible, and experimenting to find the best approach empirically.

>My goal was to find the best approach to getting The Sims to run on Linux. Not just to use one particular technology or another. The end result is what matters most, not the way it's implemented.

>Thanks to the encouragement of John Gilmore, I certainly did consider using Wine, but at the time it was nowhere near sufficient. (But since then, Transgaming has made astounding progress with Wine, and it's now obviously quite sufficient, to my delight.)

>So I used SDL to do a native port of The Sims to Linux, and got most of the game running quite well, except for drawing the people and roofs (which would require hacking a system memory back end to Mesa), and sound (which would require using OpenAL, with which I hoped Loki would have been able to help me).

>I was actually quite surprised at how quickly I was able to get a native port of The Sims running on Linux. My previous experience porting SimCity [catalog.com] to Unix took a lot more time. But the tools are much better and computers are way faster now. And of course I was more familiar with the code base.

>I offered the results of my work to Loki on reasonable terms. They didn't seem interested. I talked to some people at Maxis about it, and they said that Loki had been discussing it with Maxis, but they hadn't heard back from them in a long time.

>I finally got some brusque uninformative email from Scott Draeker, and we talked briefly on the phone, but he said that he was really busy, he had a lot of paperwork in progress that had to be finished, and he'd get back to me some time. So I stopped working on the port, and waited to hear back from him...

>I considered approaching other Linux game companies about porting The Sims to Linux, but decided to wait, because I still believed Loki was the best company to do it, and I did not want to undercut their ongoing negotiations with Maxis. Just the opposite -- I encouraged Maxis to quickly reach a fair deal with Loki, because I believed we could work together to get it to market fast. But Maxis wasn't the only company dragging their feet.

>Months later, I finally read on the net that Loki had decided not to port The Sims to Linux, because "Maxis wanted too much money". By that time, The Sims had been topping the charts for months, so of course Maxis was asking a lot for it.

>What I didn't know at the time, was that Loki was soon to declare Chapter 11. So it was actually a combination of Maxis wanting a lot for it, and Loki not having any money. But of course Draeker didn't mention that fact at the time.

>But fortunately, my time and effort porting The Sims to Linux was not wasted, because Maxis needed The Sims to run on Linux, as the multi-player game server for The Sims Online.

>So I used the original port as a guide, and more cleanly ported and optimized the newer Sims Online code to Linux again, making a headless build without all the graphics (removing SDL and DirectX). But the Linux build of the code is for Maxis's internal use on their servers, not as a commercial product for Linux.

>I made the same code base compile on both Windows and Linux, both with and without graphics. The SDL graphics code still works on Linux, but it's only used for diagnostic and debugging purposes, and not for production.

>It's nice to run the graphical build of the Linux server in order to see what the server's doing during development. But the production server can't require a connection to an X server, and doesn't read in any graphics, because many must run on the same machine in parallel.

>Even though Loki blew their chance to port The Sims to Linux, I still wanted to see it happen anyway. But because so much time had passed since the release of The Sims, I would rather put my efforts into finishing porting The Sims Online client to Linux, and work with some other company than Loki.

>But I discussed it with Will Wright, and he explained to me in his reasonable, thoughtful, well considered manner: a native port of The Sims Online client to Linux would not be practical as a commercial product, because of its nature as a dynamically updated online game.

>The way The Sims Online and many other online games work, is that the server and the clients all run the same deterministic simulation in lock step, funneling any user requested changes through a central "headless" server, so the actions can be scheduled to happen at the same time in all parallel universes.

>So the server simulation and protocol must be EXACTLY the same as the clients, or all hell will break loose. Any online game, no matter what the architecture, requires that the client and the servers be in sync. That's not so hard if the game is trivial like Othello or Quake, but The Sims network protocol is much more complex and quite sensitive to incompatibilities.

>So there is absolutely no way to support any more than one client executable, because the clients and servers must be updated together in real time by downloading patches, just like Ultima Online and other games.

>In order for there to be a Linux port (or a Mac port), it would necessarily have to be done in-house at Maxis, built off of the same code tree, developed in parallel.

>It is simply not possible for a third party developer like Loki to stay in sync with the ongoing development at Maxis of The Sims Online. That would require enormous overhead and resources on the part of Maxis, all for an extremely negative return on investment: it would extremely complicate and slow down the development process, require extra programmers, quality assurance people with Linux skills, etc.

>Cross platform development requires a LOT of overhead -- please believe me if you haven't tried it. The gross income from selling Linux clients would be infinitesimal, and would never outweigh the enormous cost of development. There is absolutely no way EA would ever allow Maxis to flush their stockholders' money down the toilet like that.

>That is the harsh, real, undeniable reason that Wine is the most practical and economical way to run games on Linux.

>I am quite pleased that Transgaming has developed Wine so far that it can now actually run The Sims! What's wrong with one Linux company coming up with a free and practical implementation of a great idea, that puts another Linux company out of business? Think of it as evolution in action, to quote somebody whose name doesn't deserve mentioning.

>The way Transgaming has improved Wine is so generally beneficial, that running The Sims Online on Linux the very day of its release on Windows is now practically in the bag! With Loki's pace and approach, there was never any hope of that.

>The thing that matters most is the fact that a game DOES run on Linux, not HOW it's implemented. Real People in the Real World don't care about religious issues like if it's running under Wine or if it's a native port. It takes over the whole screen anyway, so what does it matter? The end experience is the same.

>Thanks to the generality of Wine, now there exists a whole spectrum of solutions, from binary emulation, to recompilation, all the way to native porting. Wine could be an extremely useful tool in the process of doing a fully native port.

>Those irrational people who reject Wine for purely political reasons, are doing much more damage to Linux than Wine will ever do. They're trying to argue that trivial invisible implementation details matter so much to users, that they would reject Linux if their favorite games weren't native ports, even if they ran under Wine. That's totally ridiculous.

>The fact that a game runs on Linux at all, is MUCH more important than whether or not it's a native port.

>Another advantage to Transgaming's Wine approach, is that all the existing free external tools like Transmogrifier, SimShow, Facelift, Art Studio, Home Crafter, Menu Edit, File Cop, and the many third party tools, will all probably run under Wine. And if they don't, Transgaming considers it a bug in Wine, and wants to fix it. Most of those tools will never be ported natively to Linux, so the only way to use them is through Wine.

>I just can't believe that people would attack Transgaming for all that they've done and given back to the community. The alternative is for Linux to simply hold its breath and go without most games.

>The consequences of that alternative are dreadful, and much more harmful to Linux than the imaginary consequences of Wine. Now that Wine has been improved enough to run games like The Sims, it has so many other wonderful uses as well. Why would you ever consider sacrificing all that?

>It's not worth attacking Wine out of political correctness, in order to wait around forever for native ports that will never happen. Please don't cut off your nose to spite your face.

>-Don [donhopkins.com]

[...]


What did he do? The Wikipedia entry seems harmless.


I added more links and quotes to explain what he did, and my own experience with him. (; Hold my Wine. ;)


Never been part of that community; however, the best place to play Windows games is on Windows.

GNU/Linux, following in OS/2's footsteps on Windows games support, will meet the same fate in game studio adoption.

It is quite telling that despite all game related NDK APIs being available on GNU/Linux, hardly any studio bothers with making their Android games available on GNU/Linux.


Anecdotal, but: Windows 10 and 11 are actively hostile to users in a way that was unthinkable 10 to 15 years ago. The amount of telemetry, content scanning, hiding of offline accounts, and reuse of private content against you (Copilot, if you're wondering) is nothing short of alarming.

I'm moving my gaming and desktop experience to Linux because of this though I admit it's at a snail's pace.

More importantly: Office 365 is becoming more and more web-first: The only reason to use it within Windows today is because of the legacy add-ins, which Microsoft is slowly deprecating.

On its current trajectory, I don't see a future where Microsoft Windows remains relevant.


Meanwhile, game studios won't care about what you think, and will happily pretend only Windows exists for PC gaming.


Or, game studios will ignore your advice and just follow Steam's lead.

You know, given that Unity and Unreal Engine have been touting cross-platform for years, and even Activision is making a half-hearted attempt here.

And because the PC gaming experience has been degrading for the past 4-5 years. It's not just user hostile, it's become hostile to gamers.


Here is a hint: gamer culture doesn't care about FOSS, only about IP, exclusives, and great games regardless of the platform.


Strange. Your argument is exactly why Windows doesn't have the stranglehold on publishers or gamers that it once had.

Windows and the PC isn't the platform for great games like it used to be, because it actively gets in the way.

That doesn't mean Linux is automatically the answer, but there's an opportunity here because Microsoft has made multiple missteps in the past few years specifically with regards to games.


Better check the market numbers: after game consoles, which have always been a top target since the 1983 crash in the US, PC (Windows) dominates the charts.

You don't seem well prepared for the FOSS crusade.


You’re making and taking this quite personally.

Games thrive on consoles; isn’t that how Steamdeck is positioning itself?

And didn’t you just say consoles outweigh PCs already? And why are consoles consistently in the lead?


Moving goalposts; the point of the discussion was about "Linux" games.


The goalpost shifting and strawman are yours.

You made this a Linux only discussion, not me. I'm perfectly happy if Wine proves to be reliable enough for my games.

In fact my original post was an anecdote (mine). For some reason you chose an anecdote to make a series of personal attacks and unwarranted generalizations, but I'm happy to debate and bite because it's a fun topic for me, personal attacks notwithstanding.


> GNU/Linux following OS/2 steps on Windows games support, will find the same fate in game studios adoption.

Valve says otherwise.


Proton is the proof of Valve's failure to make game studios write native GNU/Linux games.


SteamOS was indeed a failure, but I would have said it would fail back in 2015. Anyone who's been into games or software since the 70s or 80s would have seen this coming.

Steamdeck, however, came out last year, and supporting Wine and Proton was a sensible choice by Valve. Here's what's different.

Steamdeck is a platform that happens to run Linux as a full OS, and Wine in 2023 basically makes Linux (or at least the Steamdeck) just another device driver.

Steamdeck is also a poor platform for PC gaming; but why is Microsoft getting involved and helping? Perhaps it's because PC gaming is a drop in the bucket compared to handheld and mobile.

If Valve remains committed with dedicated hardware and marketing (and this is a big if), then it becomes another power console and platform.

The difference here is maybe Linux will get some of the crumbs and benefits.


PC gaming === Windows, not Wine.

The only thing Microsoft is helping with is exposing Xbox Cloud streaming, which again isn't Linux.

https://support.microsoft.com/en-gb/topic/xbox-cloud-gaming-...


Wrong.

https://arstechnica.com/gadgets/2023/04/handheld-mode-for-wi...

By the way, Linux is your strawman; not mine.


> It is quite telling that despite all game related NDK APIs being available on GNU/Linux, hardly any studio bothers with making their Android games available on GNU/Linux.

What does this really say, other than that even if it were literally, technically zero-cost to port a game to Linux, studios would still not bother?

(This is ignoring the fact that I rather doubt we are at the "zero-cost" level and that Android has probably the worst backward compatibility of them all)


It says exactly that: even with very high GNU/Linux compatibility in porting Android games, no studio cares to do it, because it isn't worthwhile from a monetary point of view.

Actually, not even the studios that were invested in Stadia cared about GNU/Linux, even though Stadia was mostly Linux based.


Yeah I was going to say Linux is a strange choice of platform for this project. While the kernel per se is solid, the userland has zero backwards compatibility. I expect his ports will bitrot and none will run two to five years from now, while the win32 versions will be kept supported in perpetuity. Though who knows, maybe this is what finally makes the Linux community start paying attention to backwar.. pffhahaha no I can't finish this sentence with a straight face. I have been using Linux for over 20 years now and am absolutely certain it will never be a stable target. Just compile for win32. Or wasm. Maybe one day wasm will be the stable desktop ABI.


> While the kernel per se is solid, the userland has zero backwards compatibility

That is entirely wrong.

Just because your latest distribution does not provide an old/antique glibc by default does not mean you cannot run it.

It is currently pretty trivial to get a 20-year-old glibc (or whatever library) recompiled under Linux to run your old game and call it a day.

Software like Nix nowadays even makes that surprisingly easy with sandboxing.

That is currently the strength of OSS, and something you will never be able to do on Windows. As long as the version used is known and the source is not lost, software lives forever.
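As a rough illustration of the recompiled-glibc idea, here is a small Python sketch that builds the command line for launching an old binary through an old glibc's own dynamic loader, bypassing the system's newer one (the prefix and binary paths are assumptions, not real installations):

```python
def loader_command(glibc_prefix: str, binary: str, extra_libs: str = "") -> list[str]:
    """Build an argv that starts `binary` via the old glibc's dynamic
    loader, searching the old glibc's libraries first."""
    libdir = f"{glibc_prefix}/lib"
    path = f"{libdir}:{extra_libs}" if extra_libs else libdir
    return [
        f"{libdir}/ld-linux.so.2",   # the old dynamic loader itself
        "--library-path", path,      # where to resolve shared libraries
        binary,
    ]

# e.g. (assuming such a prefix exists):
# subprocess.run(loader_command("/opt/glibc-2.3.6", "./old-linux-game", "./libs"))
```

Invoking `ld-linux.so.2` directly like this is a standard trick; the hard part, as the replies below note, is everything around it.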


It is not that trivial, and there have been breaking kernel changes that prevent ancient glibc versions from running. Fortunately, glibc is open source, so you can actually patch those older versions to keep them running.

Just grep a kernel's Kconfig and see how many entries claim "this will break binary compatibility".

Even compilers are not that good at building old software.


> Even compilers are not that good at building old software.

Even old compilers can be boostrapped the same way.

Sandboxed prefix-install package managers like Spack and Nix also allow you to do that without too much pain currently. They also give you the possibility to generate fully featured Linux images if necessary, for QEMU for instance.

That said, it is true they do not do it by default; it requires configuration (probably due to a too-small userbase, I guess).

I wonder if the userbase dedicated to old games/old apps/old tools would be enough to sustain a project devoted entirely to that.


You point to Nix as if it did anything whatsoever to remove difficulty. You are replacing the entire libc. There's practically no benefit whatsoever to a prefix install, Nix, a container, or whatever you want. You are going to hit the same issues whether you do this with Nix or you do this with plain old chroots. At the end of the day you'll have to use a chroot or ld/ELF trickery anyway, since the game binary will expect the libraries in hardcoded paths.

You'll hit build issues, bootstrap issues (at which point you'll likely need an old Linux on a VM), kernel issues (which will force you to either patch the kernel, the libc, or both), and then a myriad of desktop integration issues, which is as usual the pain point and the main reason I always say static linking is absolutely useless for forward compatibility. For example, the game will try to set a rare resolution & refresh rate using Xrandr (good luck!), or will try to use OSS (for which user-space emulation with LD_PRELOAD is almost always useless, the in-kernel one prevents you from using pulseaudio, and CUSE doesn't support mmap which a shitton of games use), or even expect some specific behavior from the window manager, etc.

With more recent software, you'll hit issues with the software expecting some D-Bus services e.g. such as project utopia (yet another Linux desktop compatibility disaster that has been quickly forgotten by everyone involved, except for the software that decided to embrace it and now will never run again). In these cases building the libraries is only half the way...

That said, it's still true that a lot of times having the source available for all the underlying libraries is what makes these things possible _at all_.


This is exactly what happened when I tried to run some of my old Loki [1] games a few years ago on a modern Linux distro. Tried a few tricks to get them working, but every problem solved opened up 2 more.

1. https://lokigames.com/


Nix actually is more capable than you give credit for. Sure, it does not solve every conceivable problem you can come up with. However, talking out of experience, it does make running older software easier and much more viable.

Nix offers more than just prefix installs. It offers:

1. Reproducible builds of past packages and all of its dependencies down to the compiler that was used to build it.

2. Reproducible configuration of an entire distro provided by NixOS. It can also build VMs from NixOS configuration. So running an older kernel or DE is hardly an issue in the rare case this is required.

3. The ability to treat package definitions as code. You can take an existing package or its dependency and modify it however you like with a few lines of code. Tedious tasks like modifying rpath are abstracted away. It makes iterating on packaging work easier and less error prone.

4. Large binary cache provided by the NixOS Foundation. Binary builds of packages dating back years are currently available. This is helpful because it greatly speeds things up.

All of these things reduce the amount of work required to run old software. And new ones too.


You must be one of these who believe that if you have a copy of a game binary and all its dependent libraries down to the last byte it will work as-is on a newer desktop. The entire point of my previous post is that this is as far from the truth as it gets, and even if I keep such a backup set of libraries (which we do, as that is the easiest part) the game will still not work.

Nothing that you mentioned is absolutely of use and is just going to get in the way. For example, the kernels you are going to need to use likely don't even have NPTL. No ELF exec in the NixOS "binary cache" will even work without NPTL; NixOS did not exist during the LinuxThreads era.

Reproducible builds are absolutely useless here; I have the exact binaries that I want to run already, and most of the solutions are going to require patching the underlying libraries while still running the same game binary, practically the complete opposite idea to "reproducible builds and reproducible dependencies". You are going to spend more time fighting the package manager than the actual problems. How much does NixOS help running games which require OSS? Old versions of OpenGL that your graphics card now supports only poorly?

Static linking and NixOS may have their uses somewhere, but definitely not here. They are not guarantees of long-term compatibility, and may actually hinder it (by preventing patching of the underlying libraries, and enforcing deviations from LSB that a lot of software can't cope with).


> You must be one of these who believe that if you have a copy of a game binary and all its dependent libraries down to the last byte it will work as-is on a newer desktop.

That is not at all what I wrote. I think I made myself clear in the first three sentences from my previous comment.

> Nothing that you mentioned is absolutely of use and is just going to get in the way.

I disagree. Making it easy to recreate old software environments is a huge time saver when it comes to running old software. "Absolutely of no use and is just going to get in the way" is a baffling statement. What's the easier alternative then? Patch the latest versions of software to make it compatible with the old ones?

And I repeat, I said time saver, not a magic wand that solves every exotic problem. And the LinuxThreads thing isn't an inherent problem with Nix either. Software made after Nix can benefit from its reproducibility.

> Reproducible builds are absolutely useless here

> You are going to spend more time fighting the package manager than the actual problems.

> by preventing patching of the underlying libraries

The exact opposite is true. Nix makes it easy to use previous package definitions as a starting point and modify it to fit your needs. Applying a patch to a dependency is only a few lines worth of Nix code. Compare that with other package managers, where you have to fork a whole world's amount of package definitions in order to apply changes to a dependency few levels deep. Also see point 3 from my previous comment.

> enforcing deviations from LSB that a lot of software can't cope with

There are countless ways to deal with this. If source code is available, then Nix supplies the adequate configure flags. This works most of the time. Grepping and patching is also an option. If source code isn't available, there's buildFHSEnv, steam-run, and nix-ld. Also, many applications support configurable paths.


> You are going to hit the same issues whether you do this with Nix or you do this with plain old chroots.

The main advantage of Nix here is not its binary cache and reproducibility. It is Nix's ability to bootstrap any compiler toolchain + coreutils + kernel from scratch with just a few lines of config.

Nix allows you to do what you would normally do with a chroot, but on steroids. It is not a magic bullet, far from it, but it makes it an order of magnitude easier to set up the exact toolchain you want than anything else.

Most path-related problems can be solved with a bunch of symlinks and/or patchelf.

> will try to set a rare resolution & refresh rate using Xrandr (good luck!), or will try to use OSS

On this you are perfectly right. That will not protect you against this kind of crap related to exotic dependencies.

On that aspect Windows again is not much better (probably worse). Good luck trying to run, in 2023, an old game that has a dependency on Novell IPX, an old version of Nvidia PhysX, or hardcoded assumptions about the GPU type.


> while the win32 versions will be kept supported in perpetuity

Yeah.. no. I am surprised people still believe the claim that Win32 has "good backwards compatibility", especially for games, when even TFA claims that a Windows 7 game is _not_ going to work on Windows 11 without "maintenance".

I'd not go as far as to say that all Windows 7 games won't work, but it is definitely a good chunk. And I believe it's getting worse, with Windows 10 era games likely unable to run on "Windows 12". I already have at least one piece of software which runs in early Windows 10 but not in late Windows 10. People are resorting to literally using the Wine libraries on Windows to run old games. At this point, targeting a Linux API doesn't seem that bad and would likely even be easier than having to maintain your win32 game AND Wine.


Barely any of the older Linux binary-only games I have (e.g. the early Humble Indie Bundles ca 2010) run without a lot of effort compared to old Windows games (in WINE). And even then there are several that I gave up on and some that only run without audio. Running the old Windows binaries is way easier. For several years whenever I buy a DRM-free game I make sure to download both the Linux version (if any; to install and play now) as well as the Windows version (to keep for the future, whenever the Linux version becomes annoyingly difficult to run).


Ok, how about iOS games?


Couldn't choose a worse platform. The user isn't even free to install what they want.



Strange article ignores the elephant in the room - the general success of Proton. Native Linux gaming is obviously ideal, but I suspect that if Valve couldn’t figure it out then it’s unlikely to become the status quo.


I think Lee is one of the biggest critics of Proton, and Valve embracing it as the default way to deploy games on Linux.

From Valve's point of view it makes complete sense: dedicate resources to a single point of development that improves how their whole catalogue runs on Linux, as opposed to paying porting companies/individuals - like Lee - for each game to be ported to a native Linux version.

It also makes complete sense for Lee (and Ryan C. Gordon - icculus) to be against a platform that makes their skills irrelevant[1] and actively cuts their contracting opportunities. Therefore a promotional piece for these skills might be silent on an otherwise relevant topic.

[1] and also theoretically provides worse performance and compatibility.


Just like with OS/2, it makes Linux irrelevant for game studios.

Not even studios targeting Android care, and it would be quite easy, given that ISO C11, ISO C++17, OpenGL ES, Vulkan, OpenSL and OpenMAX, all exist on GNU/Linux, and most middleware engines as well.


> Fifteen years ago, Lee said, the idea that independent games could be so successful was so new, people were not thinking 10 years ahead. Now that he’s been doing it for so long, he has a new way of thinking about it.

8 and 16 bit home computer games were what people nowadays call indie games, followed by Flash games, naturally we knew how successful they could be.


I've had several chats with Ethan Lee; he is perfectly aware of the technical challenges of keeping game binaries working in the very closed-source-hostile environment which is elf/linux.

BTW, I think Celeste is one of the most important games he does maintain on elf/linux.

One of the major choke points is ELF with its GNU extensions, severely abused by glibc devs (and indirectly the libstdc++ and libgcc gcc devs). It is hell to generate elf/linux binaries which work on "old" distros from "recent" distros.


Wouldn't using containers help with that? Stabilizing the whole environment by pinning it to a known-working distribution + version.


It might help, but at this point Linux containers can't really stabilize the whole environment, especially for games. Particular pain points include accelerated graphics (which theoretically can have a stable kernel interface, but in practice is so complex and performance-sensitive that it's not stable enough to be a "reference platform", so to speak) and modern game controllers (which present a whole mess of concerns typically "addressed" in Flatpak by granting the device=all permission and hoping for the best [1]).

I also know a guy who ran into issues with a kernel update breaking a custom allocator, although I don't know the fine details. That wasn't for a game, but games also use custom allocators for various reasons.

[1] https://github.com/flatpak/xdg-desktop-portal/issues/536


This is what valve is doing with collabora's "pressure-vessel", at the cost of a very expensive and intrusive kernel mechanism (mount namespaces) and beyond-sanity ELF handling complexity: they have to "scan" the host system and, depending on the GNU versions (modules/symbols) used, cherry-pick host libs down to the host glibc to import into the container (usually it is limited to the driver libs). Currently, "pressure-vessel" is not finished, as there are still bugs and missing features (basically speaking, if you don't have a massive and mainstream arch or ubuntu/debian...). I'll have to check on it in the near future; the devs have to fix the handling of host environment variables and some host libs which are not imported (for alsa dmix and dsnoop IPCs to work as intended in the container, since games need only care about the alsa-lib API).

Then to work around the toxic behavior of glibc/gcc devs, valve is building a "mitigation" of extreme cost and complexity with the help of collabora.

The guilty are the glibc and gcc devs, they deserve hate, and this is righteous hate.

The other mitigation is to drop the ball and go full wine+vkd3d as a layer to protect game binaries from the glibc and gcc devs' SABOTAGE. But this is a gigantic shabby kludge (troubleshooting is just horrible, and not breaking anything in the long run...) ON TOP of the already kludgy and bloated glibc/gcc... valve calls its wine+vkd3d version "proton" (probably running in a "pressure-vessel"; I don't use proton but a custom lean build of wine+vkd3d).

Valve seems not to do that fairly (hope I am wrong): proton is wine+vkd3d with nasty additional components. At least they should backport everything from proton into wine+vkd3d in plain and simple C... but some say there are too many real windows components (actual copies) and nothing really pertinent to backport. On the good side, dx12 is said to be a simpler vulkan, hence games, if coded with a conservative usage of the windows APIs, should work better and better. Some troubles remain on the video/audio codecs, but game devs should really use an LGPL ffmpeg/libdav1d statically linked into their binaries for that.

Keep in mind: all that should not exist in the first place.

The alternative is to be extremely competent on elf/linux and to manage to generate game binaries which will work using only the core video game components, and that even on "older" distros... namely what Ethan Lee is doing, because do not expect game devs to know how to do that, it is a technical abomination ("pinning" can become very hard very quickly, and it worsens if the game is using C++): game binaries naively generated on "recent" (for instance glibc 2.34) distros (what "normal" game devs do) won't run on "older" (for instance glibc 2.33) distros. I've already seen at least one game and one demo in this situation on Steam.


Sounds like maintaining a glibc fork would be simpler than fighting the official one with dirty tricks, but I assume they know what they are doing.

I am not sure glibc maintainers "deserve hate" - very few people actually do. But it is probable that their incentives and culture run against the changes that would be required here. I can imagine that "preserving Windows games compatibility" would stand _very_ low on their priority list.


glibc is tightly coupled with gcc.

Valve would need to fork glibc and most of gcc (at least the libs). And all distros which want to run games would have to provide video game core components using such runtime. Well, they chose wine+vkd3d (but a fishy "proton" version as I described in my previous post), and a "container".

The sabotaging behavior of glibc/gcc devs is not breaking windows games, it is breaking native elf/linux games, to a point you wonder if they are not rolling for msft and friends. Don't worry, they perfectly know what they are doing, hatred well deserved.

But there is a long-term solution to the core of the issue: ELF has to go, replaced with an excruciatingly simple executable/shared-library format without static loading, and that for modern hardware architectures. A bit like json/wayland, namely much harder to abuse. Or within ELF: deprecate static TLS, static loading and GNU versioning (let's keep the latest ELF relative relocation).

Another way I am seriously thinking about: full "IPC" for video game core components: wayland/drm IPC, pulseaudio2 IPC (maybe something simpler though), /etc/resolv.conf, direct linux syscalls (in other words full static linking)... vulkan3D/drm IPC (oooof! super can of worms and lower performance), xkb IPC (oooof! another can of worms). Game binaries would be static PIE executables.


I wrote this up about relicensing the Unix port of SimCity that I did for DUX Software as free open source software under GPL3 for the OLPC (which runs Linux):

"It's a good idea not to just keep a copy of the source code stashed away somewhere, but also any contracts relating to the rights to the code."

https://donhopkins.com/home/micropolis/dux-maxis-contract.tx...

https://donhopkins.com/home/micropolis/dux-don-hopkins-contr...

https://donhopkins.com/home/micropolis/olpc-ea-contract.pdf

https://news.ycombinator.com/item?id=13693675

DonHopkins on Feb 21, 2017 | parent | context | favorite | on: Development content accidentally shipped on a DOS ...

It's a good idea not to just keep a copy of the source code stashed away somewhere, but also any contracts relating to the rights to the code.

In the early 90's, DUX Software licensed the rights to port SimCity to Unix from Maxis. Then DUX made a contract with me to do the work. I kept a copy of my contract with DUX, the original floppies they gave me with the original PC and Mac source code, as well as versions of the source code I ported to Unix.

Years later I got a job working for Maxis on The Sims. Before we shipped it, EA bought Maxis, so a lot of people were let go, projects were canceled, physical and digital files were shuffled around, and institutional knowledge was lost.

After we shipped The Sims but just before I left EA, on a fluke, I asked a Maxis old-timer if he had any idea if the contract between Maxis and DUX for SimCity still existed, and where it might be.

As you would expect, it was in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying "Beware of the Leopard." ;(

So I waited late into the night for the leopard to fall asleep, made a photocopy of it, then returned the original to its hiding place. ;)

Several years later, John Gilmore suggested we persuade EA to relicense the SimCity source code under GPLv3, so it could be shipped with the OLPC.

Of course nobody in EA Management knew where the source code was or if it even still existed, but fortunately I still had my copy.

And of course nobody at EA Legal even knew if EA owned the rights to the changes I'd made (Maxis had gotten into some pretty terrible SimCity licensing contracts in the past).

But fortunately I'd kept a copy of the contract between myself and DUX, and the contract between DUX and Maxis, proving its provenance, which clearly stated that DUX's rights expired after 10 years, after which the rights to all the modifications I made went back to Maxis (and thus were inherited by EA).

Once all that was cleared up, the most important factor was that EA deputized someone on the inside to shepherd the project through the various stages of approval, relicensing, development and QA. Otherwise it would have died on the vine, since everybody in a big company, no matter how well intentioned, is always 500% busy doing their own stuff and can't be distracted by something that doesn't affect the bottom line.

It finally made it through both EA Legal and QA, and we released the SimCity source code and binary for the OLPC under GPLv3!

https://github.com/SimHacker/micropolis/tree/master/micropol...


Great post. It sometimes is a little bit awesome and terrifying to consider the edifice on which the entire superstructure of IP ownership rests. Think about all that intangible property out there that's subject to long-forgotten contracts, copyright assignments, license agreements and the like. Entire empires can be made or broken on the existence (or non-existence) of a piece of paper.


That's incredible, thank you!

I was just reading the source for Atari 2600 Pole Position yesterday and marveling that it survived until the present day.

I was only the developer on one commercial game, but I often wonder what happened to the source code for it after the studio was bought out.


to echo some other points in this thread -- i used to be very much in favour of linux ports / recreations, actively supporting them with my wallet.

but even without switching away from x86, the linux userland churn (64bit, and lately and even largely, wayland) is too much.

windows games still work.

(wayland driver for wine is actively being merged into the upstream at the moment)


There is a Wayland branch on the collabora gitlab where devs first started developing Wayland support for those who don't want to wait and can build from source.

It's going to be fairly different from the Wine mainline after all the reviewing. But for those who don't want to wait a few months for that review process, it works pretty flawlessly from what I saw.


Software maintenance shouldn't exist. Instead of fighting the source code to match the current trendy environment, we should compile our programs to easily interpretable formats that do not contain any environment-specific code and therefore contain only logic specific to the given software.

So instead of having a binary that contains some syscall to access a key state, provide it the function on startup (or the key state value directly, if programs become stateless functions).

As for the simple format, this is IMO the only way to guarantee accessibility, as the alternative is having people constantly working on porting software over, preferably with source access.


This is precisely the wrong solution. If anything, the CPU bytecode is one of the most stable things, so switching to some "interpretable format" is not going to help much. It is the total carelessness of literally everyone else in the entire stack which is the problem.

If developers considered the inability to run a 30 year old game/software in the most recent release of "libwhatever" as a critical release blocker bug, then this problem would not exist.

But obviously, this is not a popular solution. And currently we rather have the opposite situation, where open source projects dump compatibility as the default (e.g. Linux sunsetting a.out support), and only preserve compatibility in the comparatively rare case that there is enough developer mass.


ISAs aren't stable. Can you guarantee me that programs will be compiled for x86 in 20 years? What instructions will they output to access the environment? Linux, Windows, macOS? Will any of these even remain in 20 years?

No standard lasts forever; your bytecode will ultimately be deprecated, and this is not a "developers should stop being lazy" situation.

Why do projects dump compatibility? Because this is extra work. And no project can receive indefinite support. Simple formats do not depend on maintenance.


I am going to post here what seem to be the two least shabby recipes (Ethan Lee is aware of them) to generate elf/linux binaries which will run on "old" and "recent" distros. Then you will understand why normal game devs are not welcome on elf/linux, thanks to the very nice and video-game-caring glibc and gcc devs (irony):

First recipe:

You must not link in the glibc "startup" runtime objects, aka you must not have main(): your executable must be a pure ELF64 binary, only the "_start" ELF entry point and the ELF ABI ("_start" is basically a main() anyway). Careful: all glibc runtime services which require those "startup" runtime objects are gone (you need to be able to find out which ones... good luck browsing glibc code...). Because those glibc "startup" runtime objects often require the same version or above glibc runtime being around (it is very vicious, look at glibc 2.33->2.34 with the new version of the libc_startup symbol... delicious).

It means your build script on elf/linux must do fine-grained compilation/linking (don't use the gcc compiler driver, unless it is to discover the static libgcc and static libstdc++ filesystem locations... if you don't need to fork them).

All host system dependencies must go through libdl (dlopen/dlsym/dlclose), yes, even the libc: memcpy, errno, etc. Of course you will have to use the static libgcc and static libstdc++ (if C++ is used). Namely, if your game binaries are pulling code paths from libgcc and libstdc++ which require glibc services, you have to fork the libgcc and libstdc++ from the gcc you use in order to properly "libdl-ize" those code paths (the readelf command will tell you, while you have a look at the undefined dynamic symbols... and their required versions).

Then, you have the abomination of the ELF static flag: many glibc libraries have this flag, namely those libraries must be "statically loaded", namely have an ELF dynamic DT_NEEDED entry. You will find an ELF dynamic DT_NEEDED entry for libdl since you use dlopen/dlsym/dlclose, and probably libpthread/libm (yes...!!!)/etc. Use the binutils readelf command to know if a library has the ELF static flag and must have such an entry (but you must still libdl it and its symbols to work around GNU symbol versioning issues!!). Some TLS code expects some TLS storage to be allocated and inited at executable startup, namely "static".

Of course, all distributed binaries must follow those guidelines... even those from third parties.

You want to expect normal game devs to be able to do that? ABSURD

2nd recipe:

you get your hands on the most recent gcc, and the oldest glibc you want to support (~10 years, more?). Compile that glibc, then this gcc to use only that very glibc (good luck configuring that cleanly). You can use main() and glibc services directly here, but you must still libdl everything from the video game core libs (you only need the headers). Expecting that to go smoothly is just unreasonable; it will probably be a nightmare. All your distributed binaries must be built using this gcc and glibc.

Expecting normal game devs to do that? GROTESQUE

Both recipes must not generate binaries with the latest ELF relative relocations.

The video game core libs on elf/linux are:

wayland(static code in binaries)/libxkbcommon fallback-to-> x11/libxkbcommon-x11 (don't use libX11, only xcb libs).

vulkan fallback-to-> GL fallback-to->CPU, be very conservative with shaders, VERY.

libasound: alsa-lib. The software mixer is hidden behind the API, as it could be pulseaudio1/pulseaudio2 (pipewire/wireplumber)/dmix/dsnoop/jack/whatever; have an ffmpeg resampling library ready if your audio formats are weird, or if the user's microphone audio format is weird. The trick: make the "speaker configuration" user-configurable (5.1/7.1/headphones/stereo/etc), but with an automatic fallback to stereo if unable to open the device (all software mixers should support stereo mixing).

joypad: linux /dev/input/eventXX files; check write access if you want to configure the rumble. You can implement dynamic detection with linux inotify (usually overkill though).


Would anyone ever sign up for a fucking newsletter from the first god damn second they open a website? Why do they do this? What kind of a stupid purpose does it have other than annoying users?


Hint: Use abuse@ emails and they will stop doing this on their own :)


Some context for those wondering: those are reserved aliases for Google. If you send a mail to abuse@company.com and they use Google, that mail will be received by Google (and the company, if they have that set up). It's intended for reporting spam sent out by company.com email addresses.

Not entirely sure how it works, but I assume sending newsletters to that causes some automatic sanctions for the company that are a massive headache to resolve.


No, abuse@ is not some sort of Google invention but to the best of my knowledge was an informal convention that then got formalised in RFC 2142 [1] in 1997 along with many other common mailbox names. If Google sets this up automagically for their users it is great, but they are merely continuing this long tradition.

[1]: https://www.rfc-editor.org/rfc/rfc2142.txt


https://support.google.com/a/answer/178266?hl=en

This is if the domain (company.com) is managed by Google.


TIL. There’s gotta be a browser extension for this!


Note that if you close the modal and read the article and then reload the page, it doesn't appear anymore. Which is weird; weird to ask for a newsletter subscription when you never saw any content yet, weird that this modal doesn't exist anymore after this.


Worse yet while browsing with JS off by default: the popup shows up, but cannot be closed (and does not work for subscription, either), only obscuring the page.


Disable CSS too and it's fixed. :)

"Perfection is achieved... when there is nothing left to take away"


Yes, many non-technical people do. If it didn't work, the people who add that popup wouldn't keep doing so.


You're assuming that everything is run by purely logical data-driven statistics-oriented leaders with accurate models and methodologies.


That is what many marketers do, yes. I've worked in ads, I know how it works. If the numbers aren't there, they usually don't continue.


That is true, but IME only if it incurs costs.

In this case, however, a pop-up on your own site costs nothing. In fact, removing it incurs costs in development time, even if it's a simple line delete.

So, I am 100% sure no marketer would ever push for removing it, unless end users specifically tell them they are not going to use the site because of the pop-up. The likelihood that they realize that on their own is next to zero because of how difficult it is to catch this in retention analysis.


I'd say if their click through rate to the actual article (aka bouncing off the page instead of clicking the close button and continuing to read the article) is low, they'd likely remove it. At least that's how I've been able to convince certain marketers about the inefficacy of their tactics. But other things, like modals to subscribe to the newsletter that pop up after reading the content half-way, simply work too well to ever give up.


Now, that is a different take. You have been able to convince them. It's not like they realized this themselves, or that they deduced something from the numbers.

Again, only my experience at past employers, but in a lot of cases the developers don't get heard when making these suggestions.



At what point is the faulty generalization simply...a true generalization? Just because you don't want to believe it's true doesn't mean it's not true.


[flagged]


Yikes, you've been breaking the site guidelines so badly that we have to ban the account—especially because we already asked you to stop. It's not ok to be abusive like this on HN, no matter how wrong someone is or you feel they are.

If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future. They're here: https://news.ycombinator.com/newsguidelines.html.


What does "technical" have to do with it?


Technical people often complain about popups and other marketing practices, not realizing that the vast majority of people aren't technical and do not complain, much less care, about such practices. They'll click the close button on the popup and move on.


They absolutely care. It's just that they think it can't be helped. People love it when I install uBlock Origin for them. They tell me the web just feels nicer, less annoying to use.

"Users don't mind" is nothing but adtech propaganda.


They definitely care. They’re just not empowered to do anything about it. It’s ubiquitous, and technical solutions are not available to the non-technical.


I can tell you from experience, they really do not care. It's not even something that crosses their minds. You can see for yourself, ask 100 non-technical people you know about stuff like ads and popups. I bet over 50% of them simply do not care. It is a technical person's bias to think they do.


I assume that asking "do you care?" will give you "no" quite often. But asking "do you want to see ads?" will give you an equal number of "no"s, if not more. And asking "do you want to block ads?" will give you a lot of "yes"es. It's all in the question, as any savvy marketer knows.

That's probably the reason why even the people in the marketing department at my last tech lead gig run ad blockers on their Macs. They used to not do that, but as one of them recently told me, "the internet has turned to shit". I had the feeling the irony of the situation was not lost on him, but he was unable to verbalize it.

I also recall a company where the IT department just rolled out an ad blocker one day (in 2018 or so) to about 1,000 seats, and it was the talk of the week in the cafeteria because the internet had become sooo much better.

Sure, these are individual experiences with little statistical value. They confirm my bias that people, given a choice, would like to see fewer ads. Most of them don't know they have a choice beyond "seeing ads" and "not using the internet", so they submit. Show them more choices and they gladly choose fewer ads.


> ask 100 non-technical people you know about stuff like ads and popups

> ask

No. Don't ask. Show them. Just install uBlock Origin and see how they react. Show them a better world.


I have done that. Most people go, "oh, cool," then continue on. They really don't care one way or another, and it's more evident when I ask them later about what I just did and how they feel.


> I have done that. Most people go, "oh, cool," then continue on. They really don't care one way or another, and it's more evident when I ask them later about what I just did and how they feel.

They start caring when they use another computer that doesn't block popups.

You say you've installed uBlock Origin for people? Lovely - go back a month later and remove it, then see how much they care.

People don't care when they think they have no chance to get something, but they do care when something they have is taken away.

Many people don't care about the results of a lottery for a car, but take away those same people's $1 for a ticket, and suddenly they are very interested in the result.

Your experience matches reality: since you're only running the first half of the experiment, you don't see any results. The results only show up once you complete the experiment.


I've also seen what you describe, as some people have gotten new computers or started using new browsers (or new browser profiles, which reset extensions). People generally find ad blockers nice to use, but they don't really mind that much when they're gone. The average person tolerates a surprising amount of bullshit; you see it already with the amount of ads people used to watch on cable and now watch with YouTube ads, for example. What seems essential to you and me is just a little perk for most people, it seems.


That's not my experience at all. I install uBlock Origin on every browser that I use. It's literally the first thing I do. After a while I started hearing people's comments. The web just feels nicer, they said. They know. Even when they can't quite explain why, they know.


Like the sibling commented, have you removed the ad blocker and seen their reaction after some time? In my experience they don't really care after a while. That is to say, they found the ad blocker a nice perk but not a fundamental need.


> have you removed the ad blocker

I'd never do that to anyone.


Then, like the sibling says, you've only done half the experiment. Regardless, I would also never remove someone's adblocker, but I've seen that when it is removed, for whatever reason such as switching computers or browsers, people often don't care enough to install it again.


I'm not a fan of experimenting on people. That's what adtech does with their engagement A/B tests.

In the end it doesn't matter if they care or not. We care. We think it's the right thing. So we install it again for their benefit. Because we care about them.


I don't have access to everyone's computers that I know. I tell them to install and use an adblocker and sometimes they do, sometimes they don't, up to them. Ironic that you say you don't want to experiment on people but then seek to patronize their choices by doing things for "their benefit." People can do what they wish, once the information has been passed on to them.


I don't randomly grab people's phones and install things on them though. I'm talking about the computers I share with coworkers, friends and family. I use the computers too. Since I don't tolerate ads, they won't need to either.

So I don't see it as patronizing them at all. I see it as leadership. I'm putting my reputation on the line. People trust me with all this computer stuff, if I screw things up they won't trust me anymore. So I make it my responsibility to ensure it works and that they will like it, even though they "don't care". For example, on these shared computers I don't turn it up to the point it breaks pages. I know how to deal with those breakages, they don't. So I ensure they never have to.


If you're sharing the computers then that's a different story. We don't share our computers so we don't all have ad blockers on them. Mine does of course and I add them to my family's but if they switch browsers or reset / upgrade their PC, I'm not often there to redo the install. So I'll tell them to do it but they don't always do it. So I give up, if they know about ad blockers but don't care enough to reinstall them, that's on them at that point. Which leads me to my point that people don't generally care.


I haven’t had to ask. They verbalize their frustrations.

I think your questioning probably requires a certain level of care, thought, and ability to form a response that isn’t necessarily indicative of how much someone actually cares. Someone can say, “Oh, I don’t care” and still experience annoyance indicating that they do in fact care.

I also think 50% is way too low a bar for claiming that a class of people does or doesn’t do something like “caring”. If 60% of people “don’t care” but it’s a really big deal for the people who do, then the claim that “such people don’t care” isn’t the truth.



But technical people often block JS by default, which prevents the button from closing the popup. Fortunately, Reader Mode on Firefox solves the problem.


The "they" in my second sentence is about non-technical people, not technical people, I should have clarified.

Regardless, the number of people who use the internet and who actively block JS by default is a fraction of 1%.


Didn't paper mags come with subscription cards?


Those don't hover over the first article you try to read.


That modal is bad design, and so is the site name - I almost closed the page thinking it was a 404 Not Found error.


That modal window cannot be closed without JavaScript, rendering the entire page unreadable outside of Reader View.


I actually wrote about how bad the <dialog> tag is for the web, and I'm sort of glad to be proven right, if only by absolute annoyance:

https://tane.dev/2021/02/revisiting-dark-patterns-with-the-h...


On Firefox, go to View, Page Style, No Style to make it readable without JavaScript.


You can also try the Kill Sticky add-on for Firefox: https://addons.mozilla.org/fr/firefox/addon/kill-sticky/. It will also get rid of the annoying fixed-position banner at the top.
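The core idea behind that kind of add-on is simple enough to sketch as a bookmarklet: hide any element whose position is "fixed" or "sticky". A minimal sketch (isOverlay is a hypothetical helper name, not code from the add-on itself):

```javascript
// Treat any element positioned "fixed" or "sticky" as a candidate
// overlay/banner. isOverlay is a hypothetical helper for illustration.
function isOverlay(style) {
  return style.position === 'fixed' || style.position === 'sticky';
}

// In a browser, the same check could be applied page-wide, e.g.:
// document.querySelectorAll('*').forEach(el => {
//   if (isOverlay(getComputedStyle(el))) el.style.display = 'none';
// });
```

This is blunt (it can also hide legitimate fixed headers), which is why running it on demand, bookmarklet-style, tends to work better than applying it automatically.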


yeah but why is it on me to defeat a hostile user experience?


I actually did, until I saw page link and comments on HN discussing content.


Same here.



