World of Goo Update, 10 Years Later (tomorrowcorporation.com)
663 points by troydavis on May 1, 2019 | 278 comments



World of Goo was a great example from the early Android days of what people thought the future of mobile gaming would be: high-quality games that could be controlled with tapping and swiping.

Unfortunately, it turned out to be mostly games preying on the player's impatience to push microtransactions.


Flight Control, Doodle Jump, (early) Angry Birds, Fruit Ninja, Tap Tap Revenge, Cut the Rope, Fieldrunners. I really miss the good old days of iPhone gaming.


If you haven't seen it before, check out the "Battle of Polytopia". Basically mini-civilization with a few extra races you can buy to support the developer. http://midjiwan.com/polytopia.html


I used to love Polytopia, but once you figure out how to win, it loses replayability. Still, totally worth the $5 or whatever I paid.


My kids love Polytopia! I'm using it to lure them into Civilization.


Ha. My brother knows I like civ so he's using that to lure me into playing Polytopia.


Flight Control :(

I wish they'd update the game to make it compatible with the latest iOS.


Firemint was eaten by Electronic Arts (EA) and most of their games disappeared (including Puzzle Quest, which they acquired).

Companies should consider open-sourcing things like this to let greatness live on if they can't find a way to support them directly.

https://en.wikipedia.org/wiki/Flight_Control_(video_game)


Tiny Wings was a masterpiece.


Switched to Android years ago and am still sad that Tiny Wings is pretty much the only game that is Apple exclusive.


Maybe try Dragon, Fly! It is quite similar.


Tiny Wings still exists, and even came to Apple TV a couple of years ago, with some updates for newer iOS versions.


Since I don't have a landline at home, I keep a prepaid SIM in an old iPhone 4 as an emergency backup so I'm not SOL if I lose my phone. Every now and then I check on it to make sure it's still working, and I always end up playing a few rounds of Flight Control when I do.


There's a lawsuit-worthy knockoff of it on the App Store right now. No complaints.


Wow there are a lot of them. And of course they all have in-app purchases :/


Slightly related: Human Resource Machine, by these same guys, was the best mobile game I've played so far.

The game idea is just fantastic, and I think it's a great example of how to teach stuff (here, assembly) through video games.


Seconded. If you have any interest in CS and are looking for a cheap mobile game to pass the time, give this a look.


Only game where I have 100% achievements. Currently working my way through and optimising 7 Billion Humans.


Robert has been doing what I knew could be done with mobile gaming for many years now: http://www.nimianlegends.com

I don't know if he's picked up more people or is still solo, but BrightRidge is a beautiful open world with much to explore. While everyone else was making Candy Crush and Mario knockoffs, he was building that.

I even played the likes of Infinity Blade and it felt... underwhelming.

BrightRidge really leveraged the technology available.


I find controlling a game on the same device you're supposed to be looking at utterly frustrating - how are the controls? I'm all for supporting Android games that are games, and not Skinner boxes with "Angry Birdz" written on the side.


To me the pinnacle of games on Android is Clash Royale: it essentially shrunk the frenetic real-time action of many PC and console multiplayer games into a very watchable 1v1 quickfire game, with an engaging meta based on card updates and new cards, and even fads in tactics. It has microtransactions but no nagging and no ads.

Also, I think the best phone games use a "portrait layout" rather than landscape, but many, many game developers don't seem to think that way and keep creating games in landscape orientation, which often requires awkward hand positions to play.


If there's money to be made from subverting something, someone will do it.


It was on PC and Wii quite a while before Android and a bit before iOS. The gameplay is definitely touchscreen-friendly though.


Turns out most people use the ‘alternative app stores’ to avoid paying for games, and developers are just following the money. High-quality games are appearing on iOS.


That might have been true in the past, but today the App Store is just as buried in free-to-play/pay-to-win crapware as Android is.


There's still a ton of high-quality games there.


It's the same on the Windows Store as well: UWP crapware.


There are quite a few indie developers. I personally published Rob, a robot programming game - basically a port of RoboZZle (which, btw, exists on Android and iOS too).


Holy cow, I bought the game 10 years ago, found my old email with the "Secret World of Goo Download Location" and it worked :) No Linux update yet, but I'm going to play the original anyway. Such a fun game.

Edit: the .deb package didn't work on Xubuntu 19.04, but extracting the .tar.gz and running ./WorldOfGoo.bin64 worked like a charm.


I just remembered I too bought the game 10 years ago, as part of the Humble Indie Bundle. I also found the old email but it didn't "just" work.

Interestingly, the part of the chain that failed was their then email marketing provider, which hijacked all the links in the email to provide click tracking and no longer exists. So all the links are dead even though they would have redirected to pages that are still functioning.


look at the text version of the email, it contains the actual URL.

based on your post i checked if i got the bundle that included world of goo, and it turns out i did. it contained the key url as http://www.humblebundle.com/?key=... and that url is still working. as someone stated elsewhere in this discussion, the humble bundle version is not updated. it's downloading version 1.30 right now...


in the meantime i am playing osmos in the browser which the humble download page helpfully includes. i am pretty sure that can't have been there when the bundle was released initially


This is hilarious, and sad.


Have you tried making an account on Humble Bundle? It might be worth a try to see if it recognizes your old purchases.


Some of them encode the target URL in the tracking URL itself so it might be worth checking that. I think MailChimp uses a Base64-encoded JSON object in the tracking URL which you can decode and get the real URL even if the tracking link is no longer working.
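If you ever need to do that by hand, here's a minimal C sketch of the idea. The payload, the {"url": ...} shape, and the recollection about MailChimp above are assumptions, not documented behavior - inspect a real tracking link to see what it actually embeds:

    /* Decode a Base64 payload lifted from a tracking link's query string.
     * Build: cc decode.c -o decode */
    #include <stdio.h>

    /* Map one Base64 character to its 6-bit value (-1 for padding/junk). */
    static int b64val(char c)
    {
        if (c >= 'A' && c <= 'Z') return c - 'A';
        if (c >= 'a' && c <= 'z') return c - 'a' + 26;
        if (c >= '0' && c <= '9') return c - '0' + 52;
        if (c == '+') return 62;
        if (c == '/') return 63;
        return -1;
    }

    static void b64decode(const char *in, unsigned char *out)
    {
        int buf = 0, bits = 0;
        for (; *in; in++) {
            int v = b64val(*in);
            if (v < 0) continue;               /* skip '=' padding etc. */
            buf = (buf << 6) | v;
            bits += 6;
            if (bits >= 8) {
                bits -= 8;
                *out++ = (unsigned char)(buf >> bits);
                buf &= (1 << bits) - 1;        /* drop the consumed bits */
            }
        }
        *out = '\0';
    }

    int main(void)
    {
        /* Hypothetical payload; decodes to {"url":"https://example.com"} */
        const char *payload = "eyJ1cmwiOiJodHRwczovL2V4YW1wbGUuY29tIn0=";
        unsigned char json[256];
        b64decode(payload, json);
        printf("%s\n", (char *)json);
        return 0;
    }

From there the real destination is just the "url" field of the decoded JSON.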


This is actually super impressive. Managing to create an application bundle which doesn't break down every now and then with new system updates is super challenging. 10 years is just wild.

Unless you only have libc as a dependency, but I doubt that for a graphical application.


> 10 years is just wild.

And on Windows, a 10-year-old program NOT working is extremely rare.

Hell, a 20-year-old program will most likely run too (the most common issue would be the installer, but usually it can be worked around).

> Unless you only have libc as a dependency but I doubt that for a graphical application.

You can also have Xlib as a dependency and it'll work, since unlike whatever madness is built on top of it, the X11 API is stable (but of course modern developers dislike that stability and want to replace it with Wayland, which breaks everything again, so they have new toys to play with). Example: https://i.imgur.com/YxGNB7h.png (from an old toolkit of mine, showing the same binaries running in 90s RedHat and last year's Debian).
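To see how little such a program needs, here's a minimal sketch in the same spirit (illustrative only, not the toolkit from the screenshot): a C program whose only dependencies are libc and libX11, built with something like cc win.c -lX11. That dependency set is exactly why binaries of this sort keep running across decades of distributions:

    /* Open a bare X11 window and wait for a keypress, using nothing
     * beyond libc and libX11. */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);       /* connect to the X server */
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                         0, 0, 320, 200, 1,
                                         BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, KeyPressMask | ExposureMask);
        XMapWindow(dpy, win);                    /* make the window visible */

        for (;;) {                               /* minimal event loop */
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == KeyPress) break;
        }

        XCloseDisplay(dpy);
        return 0;
    }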


To be fair, X11 sans extensions is stable, but those extensions are how they provide a lot of the more complicated functionality that's been built since X11 was born.

Wayland originated, AIUI, because we've got layers upon layers of complexity and indirection to bend X11 into what we want on a more recent environment, and building a system with more awareness of what the constraints are in recent systems lets you implement things much more cleanly than those levels of complexity (or, at least, not much worse, if you're using XWayland).

(Of course, it's not implausible that in 20 years or less, we'll think of Wayland as requiring lots of kludges and hacks to do what we need then, but you can only plan for known knowns and known unknowns.)


[citation-needed]. Heck, I've gone as far as running Windows, a Linux VM inside, and running the old program in Wine: more likely to run that way than in Win10. I'm aware of all the impressive Windows compat work that went into keeping the Win32 ecosystem ticking, but recently (last ~2 years) this seems to have gone downhill.

In the long run, it always comes down to emulation, regardless of system (e.g. Win16 is no longer a part of current Windows). Perhaps ReactOS has some promise w/r/t old Windows executables...


Doesn't that mean Linux is entirely broken? Isn't the whole point of OSes that things that used to work, keep working?

I mean, lots of things are wrong with Windows but I can double-click a .exe from 1998 and in most cases it'll just run. I'd assume Linux, being used a lot for servers and long-running apps, would have this too?


Linux itself has this and is very strict about not breaking old userspace.

It's the userspace libraries that like to drop compatibility with old binaries left and right, and with most distributions it's not possible to install old versions of libraries.

Containers like Flatpak and Snap promise to solve this by just bundling the application with all the libraries it needs, frozen in time. The kernel can then handle the rest.


I have little Linux experience, so pardon me for asking silly questions. Doesn't Linux then choose a weird place to put the border between "OS" and "userspace"? If, say, an OpenGL library or an SSL library installed by the distribution breaks backward compat with earlier versions, how is that not the OS breaking backward compat? Sure, if a user hand-installs some new library version, globally in the system, --force, y y y, then I get it. Things will break. But that's not what's going on, right?

It feels a bit weird to see the kernel folks put so much effort into "never breaking userland" only to have the distributions throw that all away with a "yolo, the new libyadayada is 1% faster and who cares about old programs!" kind of attitude. Is that really how this works? Or am I just misunderstanding some nuance? I'm not trying to be critical on anyone btw, I'm just trying to understand what's going on :-)


> Doesn't Linux then choose a weird place to put the border between "OS" and "userspace"?

Your fallacy is assuming that "Linux" is a single entity with a single mind. Linus is very strict about backwards compatibility, but he only controls the kernel, not glibc or Mesa or KDE.


Sure, but that just means a failure of Linux as a product experience.

Users don't make that distinction, even a little bit.

This is also arguably a failure of the open source model in practice (though not in theory). It tends to generate projects that are really good at infrastructure and really bad at product. The ones who fill in the product experience as best they can are the companies trying to get paying customers, because those customers won't stand for a weird experience, regardless of the technical/governance reasons.


The open source model expects apps to be open source. And if they are, then an app can be recompiled to support new library versions.

It is the hybrid open source library / closed source app that causes problems like this.


Having the source code will not help you when the APIs change - see SDL1 to SDL2, Gtk1 to Gtk2 to Gtk3 to (soon) Gtk4, Qt1 to Qt2 to Qt3 to Qt4 to Qt5, etc., and all the desktop environments and tons of applications wasting their time "upgrading" from lib version N to incompatible lib version N+1 - because the libs' developers did not care about their users (both in the end-user and the library-user sense) and broke backwards compatibility, just so that these applications can do the exact same thing, now with a different API.

No, backwards compatibility isn't something you solve with open source; open source is perfectly capable of breaking it in all the fabulous ways possible, as has already been demonstrated many times.

Backwards compatibility is something you solve by actually caring about it when you are designing your version N+1 library (and preferably even when designing your version 1 library's API, to allow for it).


SDL1 can be installed alongside SDL2. Same with GTK and Qt. I really fail to see the issue here. If the developer of a Qt4 application doesn't feel like updating to Qt5, users can still use the Qt4 version with Qt5 installed on their system. Most Linux distributions still package Qt5, Qt4, Gtk3, Gtk2, Gtk1.2, SDL1.2, SDL2...

On top of that, if a binary-only application really wanted to stand the test of time, it could statically link the libraries.


At least Debian has dropped Gtk1 and Qt3 - it's not trivial to get old libraries on a modern system.


This doesn't help much when the specific application you want to run is not available on your system because it doesn't ship Qt3 or Qt4 or Gtk1 or whatever (note that, AFAIK, none of these ship in Debian sid anymore). It also doesn't help when the application you are trying to run uses version N of a library while a version N+Y (Y = 1, 2, whatever) already exists, providing essentially the same functionality to the application but in a backwards-incompatible way, along with extra user-facing functionality you'd like to use.

To make that last part clearer, imagine a theoretical "libopenfile" library that shows a file dialog. In version 1 it exposes a function like "open_file(const char* path);". In version 2 the developers improved the GUI a bit (e.g. making the dialog use screen real estate better, adding a create-folder button, etc.) and renamed the function to "lof_select_file(const char* path);" for arbitrary tidiness reasons. In version 3 they changed the signature to "lof_select_file(const char* path, unsigned flags);" and added the ability to load plugins that can provide more actions for each file in the displayed file list, show overlay icons, etc.

As things are right now, a modern Linux system would need to have all three versions of the library installed - assuming the distribution cared about backwards compatibility enough to provide all of them instead of just the latest and perhaps the previous one - meaning there would be three different libraries providing essentially the same functionality in three slightly different ways. Moreover, and also very important, version 1 applications do not get any of the improvements introduced in version 2, and neither version 1 nor version 2 applications can use the plugins introduced in version 3 - even though, again, all they want is to show a dialog to choose a file.

The way to do that right is for version 2 of the library to still export the "open_file" function but simply make it call "lof_select_file", and for version 3 to add a "lof_select_file2" (or "_ex" or whatever - if the reason not to do that is that you dislike the naming, you are part of the problem) that accepts flags, and to have "lof_select_file" call it with a default set of flags. Then everything would work, from version 1 to version 3, and all applications would get the functionality introduced in subsequent versions. (As a sidenote, no, I wouldn't remove open_file from the header files to encourage use of the renamed function, since backwards compatibility in source form is very important too - not everyone who gets to compile a piece of code is the developer of that code, or even a developer.)
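To make that concrete, here's a minimal C sketch of what such a backwards-compatible version 3 could look like. Everything here ("libopenfile" and all the function names) comes from the made-up example above, not from a real library:

    /* libopenfile.c, hypothetical version 3 of the library sketched above.
     * The old entry points stay exported from the .so as real symbols, so
     * v1/v2 binaries (ABI) and v1/v2 sources (API) keep working unchanged,
     * and all of them transparently gain the v3 dialog improvements. */

    #define LOF_DEFAULT_FLAGS 0u

    /* v3 entry point: flags select the new features (plugins, overlay
     * icons, ...). */
    int lof_select_file2(const char *path, unsigned flags)
    {
        /* ...the actual dialog implementation lives here... */
        (void)path; (void)flags;
        return 0;
    }

    /* v2 name, still exported: forwards to v3 with default flags. */
    int lof_select_file(const char *path)
    {
        return lof_select_file2(path, LOF_DEFAULT_FLAGS);
    }

    /* v1 name, still exported: forwards to v2. */
    int open_file(const char *path)
    {
        return lof_select_file(path);
    }

The wrappers cost a couple of lines each, while every generation of application keeps working and picks up the new dialog for free.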

Now the above is a bit of a strawman, but it is exactly the situation you get with Gtk1, Gtk2, Gtk3, Gtk4, Qt1, Qt2, Qt3, Qt4, Qt5, SDL1, SDL2, etc., and pretty much most of the libraries you find on a desktop system (notable and thankful - so far - exceptions being the X11 libraries and... Motif, although that isn't installed by default anywhere, so it isn't really something you can rely on).


In a worst case scenario, couldn't you simply bundle a static version of an old library with an old open source app to avoid the problem of a distro dropping old dependencies?


This works when it's only libraries that are changing. As soon as protocols change, you're fucked. Case in point, Wayland. Right now, most Wayland compositors run Xwayland (i.e. an entire X server) under the hood to enable legacy applications without Wayland support to run on them. This will eventually not be supported anymore. Not today, not tomorrow, but maybe in 10 years. Then it's gonna be absolute game over for everything that's compiled against X libraries.


In a Linux distro you specify dependencies by library version and you are never required (as far as I know) to upgrade to a new major library revision. Games compiled for SDL1 are not required to upgrade to SDL2.

There are some absolutely ancient games (Xgalaga, for example) that I used to play on Linux in 1996 that still play just fine today on my Debian machine.


Games compiled for SDL1 are not required to upgrade to SDL2, but games compiled for SDL1 will also not get any functionality introduced in SDL2 that could have been transparently made available to them if the API hadn't been broken - e.g. supporting alt-tab, Wayland, etc.

Xgalaga works because the X11 library developers have not broken backwards compatibility. This is a good thing. This is what I'm referring to when I talk about a library not breaking backwards compatibility, and what something like Gtk and/or Qt (and, IMO, SDL) should strive for.

EDIT: i give a more detailed example here - https://news.ycombinator.com/item?id=19813074


> Having the source code will not help you when the APIs change

Yes it will! If it's free software, you can update to use the new APIs!


Developer time is neither free nor infinite.


Your personal dev time is as limitless as your desire allows it to be.


Errr

1. users needing to think about recompiling things is definitely not the experience they want

2. It assumes the compiler, runtime, etc don't change API ever. It also assumes the libraries don't change API ever. Or don't change implementations of existing APIs in incompatible ways.

Instead, both happen all the time.

Having source doesn't fix this.

Distros don't fix this either unless the only stuff you use is in the distro (and even then they still get it wrong).


> users needing to think about recompiling things is definitely not the experience they want

I think the intended OSS experience would be for users to get everything from their distro’s repository, whose packages get recompiled by the distro maintainer.

I agree this doesn’t work in practice, but there’s a reason Debian tells people to only ever use packages from their official repos.


I mean, I get it, but it's somewhat ironic given how often I see folks making fun of Apple et al. for saying people are holding it wrong; yet if you look at the other comments in this discussion, it's literally the same thing.


> It is the hybrid open source library / closed source app that causes problems like this.

There can be good reasons to deliver an application as a binary even though it is open source.

The difference rather is: Microsoft cares about binary compatibility while the developers of userland libraries on GNU/Linux do not.


Agreed. As a former MS-focused engineer in a previous life, I was constantly surprised by how many executables just continued to work 20 years after authorship. It's a different set of tradeoffs than other OSs have made, but a really remarkable one.


GOG.com makes their living specifically by updating games that no longer work on modern versions of Windows.


Through emulation (DOSBox), in many cases.


> Gog.com makes their living specifically by updating games that no longer work on modern versions of Windows.

Many old games that GOG distributes are not written for Windows, but for DOS.


Most distros are distributed as binaries.

But since the source is available and the licenses allow distribution, it's much easier to just recompile the thing.

Binary compatibility is not valued if you can just recompile the source. Why waste brain cells maintaining compatibility when the compiler can bake you a new compatible binary?


Because your users might be actually using more software than what you deem worthy to put in your distribution.


Most, if not all, distros have no vendor-level restrictions on repository choice; you can make your own.


I think you are conflating kernel with distribution.


there's a reason people don't use Linux outside of the server space, and most of it is every piece of weirdness introduced above Linus. the more esoteric your Linux flavor (and the closer to bare metal it is), the better your Linux experience long term.


A dependency is a dependency. The distro does not matter. It matters what the game depends on and what the distro ships.

Or, you know, Snap and AppImage and Flatpak, where your only hard dependency becomes Linux. Which doesn't break userspace.

Some early Linux games made the mistake of using low-level graphics libraries (not one of the OpenGL levels); these might not work because the newest drivers just don't support them.


Except that Snap and Flatpak are over-engineered, inflexible garbage compared to how this is handled in sane environments. Flatpak can't even handle the concept of installing to different media without building a brand new repository on it, FFS (I don't know if this is true of Snap because it is far too Canonical-centric for my tastes). There's no reason it has to be like this.

Even AppImage is slightly over-engineered in my opinion, but otherwise the only real failing of it is that since there's no standard for ELF embedded icons (wtf is wrong with UNIX devs!?) it's a shit-show to get them to show up as anything other than a generic one.

This isn't hard, Linux people. Just standardize on a set of core libraries and stop breaking ABI outside of major versions, then you'll magically have this thing we call a "platform" upon which applications can run without being recompiled.


Your helping hand is welcome! Even coordinating developers toward a solution helps a lot, because we have no free but dedicated managers. We are developers, after all.

Create tickets in the bug trackers of all major distributions describing the problem and your proposed solution. If your vision is right and attractive, developers will join the movement. This isn't hard, it's just time-consuming.


I would, except I've seen others do that, and they are unsurprisingly ignored or ridiculed. Someone already made a standard proposal for ELF icons; nobody uses it.


Linux open source - you can contribute, but on your own fork, over there in the dark, far away from us enlightened folk that actually make worthwhile contributions to this project. I only contribute to things that clearly have no "stewardship" or vested interests; lots of useful proposals get shot down.


Wow, very constructive, great post, love it, keep it up.


To the end user none of that matters; all they know is that this thing does not "just work". Windows and macOS have that; no flavor of Linux does (currently). I look forward to the Snap/AppImage/Flatpak war, it's long overdue.


Eh, macOS breaks old apps like it makes Apple money.


Like Aperture from, what... 2014? Won't run on the next macOS... that's an Apple application that can't last more than a few years on an Apple system.

Even Linux desktops are better than that, but not much better. They seem so emotional.


And that was for me the main reason I stopped using macOS, as every new version of the operating system made software I had paid money for stop working - to the point that nothing works now.


Basically, it's a matter of being able to force the issue. Distros are largely volunteer effort and are already a lot of work. Forcing library devs to cooperate would be a gargantuan task and just rejecting the changes would kill most distros (basically no libraries do this). Instead the distro devs just try to fix the compatibility problems themselves but they occasionally miss stuff.


The only reason there are so many distros is because of how ridiculously complicated it is to build a functional Linux Desktop from the component software. If Linux were a more stable platform, you wouldn't need distros (or package managers for that matter).


Welp, this is where the issue and the meme lie. This is where the Stallman rant (https://www.gnu.org/gnu/linux-and-gnu.html) came from, and it is still a point of contention. The Linux kernel is the only standard part of any distro. Anything beyond that is optional, though it will most likely be GNU coreutils. Even then it may not be standard, as coreutils may be replaced by BusyBox or the BSD utils or any other core Unix utilities package.

So that's what we are left with. Linux is not an operating system. GNU/Linux is, Busybox/Linux is, and all the distros are operating systems, but Linux is not. People just like to clump all these different operating systems together and pretend that they are different versions of the same OS.

If you want a "Linux" operating system that is well defined, look for the ones without Linux in the name. The BSDs have actual UNIX support (i.e. a standard library of tools), and Guix is a Unix-like/Linux-like that focuses on reproducibility and dependency stability.


It's made worse by the common UNIX developer insisting on hard-coding fixed paths into things. The whole ecosystem is just plain hostile to the idea of applications that aren't recompiled every time something changes.


Got a list of such software?


Serious question: doesn't static compilation essentially do the same thing?


Kind of, but I guess the other solutions are slightly better at accommodating services that are more than a single binary. I think static binaries really should be evaluated again, as they seem to have been written off by most big players. Go(lang), though, has preferred static linking, which seems to have been appreciated.


Static linking has its own issues. GPL vs LGPL forces a lot of apps to link dynamically. Also, security vulnerabilities in libraries become unpatchable.


I’d rather be able to run something that’s security flawed than not run it at all. Those security issues in libraries are probably not such a big deal anyway, because you can run Unreal Tournament 1 multiplayer now and not immediately get hacked, I’m sure. Maybe I just got very lucky.


Apps are always the most secure when they can't run at all because of ABI changes in their dependencies.


Of course, as far as graphical desktop software goes, 99% of the dependencies are in userspace. Unless the userspace libraries (or parts of them) decide to follow Linus' mantra of not breaking the applications that rely on their code, the kernel not breaking userspace isn't very useful to graphical desktop applications.


The main difference is that most Windows apps are statically linked. A game from 1998 will very often come with its own set of .dlls that were built with the game, and unless the underlying Win32 API changed, they should "just work". On Linux it's also possible to statically link libs, but it's far less common to do so - most apps rely on the libs in the system. Which in theory is a great thing: your app will always use the latest, safest version of whatever library it needs (while the Windows app continues using the unsafe 1998 version of its libs, for instance). But of course in practice the people who make the libs can't test their changes with every single app in existence, so inevitably backwards compatibility gets broken, and Linux does have the issue of "oh, I installed some system updates and now this app that always worked doesn't work anymore". Again, in theory apps are supposed to say which libs they need and package managers should respect that. But theory and real life are not necessarily the same.


> The main difference is that most Windows apps are statically linked.

No, the main difference is that on Windows the GUI libraries (USER32.DLL, etc) are considered part of the OS and have an API that remains backwards compatible. The applications are not statically linking the GUI libraries, they are dynamically linked against them, pretty much like on Linux.

However, on Linux the GUI libraries do not care a bit about preserving backwards compatibility (see Gtk1, Gtk2, Gtk3 and soon Gtk4; see Qt1, Qt2, Qt3, Qt4, Qt5 and soon Qt6 - although in Qt's case they can't do much thanks to C++'s ABI issues, and TBH I wouldn't expect them to, since Qt is primarily made by a company with interests in providing a cross-platform UI library, not the native GUI library for Linux).

Windows simply provides a much richer API than Linux as a standard that applications can rely on.
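To make the contrast concrete, here's a minimal sketch of a program from that world - about as small as a USER32-dependent C program gets (build with e.g. cl hello.c user32.lib):

    /* Show a message box. The program is dynamically linked against
     * USER32.DLL, whose exported API has stayed backwards compatible
     * for decades - which is why ancient binaries like this still run. */
    #include <windows.h>

    int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmdLine, int nShow)
    {
        MessageBoxA(NULL, "Hello from a decades-stable API.", "Win32", MB_OK);
        return 0;
    }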


There are *-compat[1] packages for backward compatibility. You are welcome to support them, if you really need them.

[1]: https://pkgs.org/download/-compat


Compat libraries are a band-aid; the real problem is the developers of the libraries not caring about backwards compatibility, and the only fix is to cultivate a culture of caring about backwards compatibility instead of writing excuses. Compat libraries exist for only a few libraries (where are the Gtk1-compat libraries?) and all they do is freeze the API in time without any updates to the provided functionality (where are the SDL1-compat libraries that work seamlessly with alt-tab in desktop environments, like SDL2 does?).


> where are the Gtk1-compat libraries?

Here: https://github.com/openSUSE/gtk1-compat

> and all they do is freeze the API in time without any updates to the provided functionality

Yep.

> where are the SDL1-compat libraries that work seamlessly with alt-tab in desktop environments, like SDL2 does?

It's open source. If you need it, take care of it, OR pay somebody else to take care of it. Nobody will work for you for free.

It's like roads. Roads are free because lots of drivers use them every day and pay taxes to build and maintain them, but the road to your house in the middle of nowhere is not free, because nobody shares it with you. However, road-making tech is open source, so you can build your own road if you need it.

Of course, you can use a paid road if you want, because they provide better service for a small fee per ride, or you can hire a development company to build a private road to your house and then pay back the loan to a bank. It's up to you.

But statements like "free and open roads are bad because the road to my own house costs me a lot of money" are just childish. Roads are not free to build and maintain. They are free and open to use only because they are paid for by someone else. For your own private road, pay with your own labor or money.


> The main difference is that most Windows apps are statically linked. A game from 1998 will very often come with its own set of .dlls that were built with the game and unless the underlying Win32 api changed, they should "just work".

Delivering a set of DLLs is not static linking.


It's almost like the D in DLL stands for something.

That said, it's not strictly static linking, but it is closer to it in effect than the global assembly cache or the shared DLLs which resulted in the kind of "DLL hell" that Windows 95 was famous for.


Day-to-day Linux system updates won't change major version numbers of libs in most distros, though. These breaking changes only seem to occur in major releases in my experience.


> Doesn't that mean Linux is entirely broken? Isn't the whole point of OSes that things that used to work, keep working?

"Broken" is a strong word. "Different" might be more fitting.

While I don't think anyone in the world of Linux and Linux-distros work with the explicit aim of breaking compatibility, you have to realize that Linux and most Linux-distros are built entirely around open-source software.

Most users install software via a distro-provided package-manager which fetches binaries from a repo. These binaries have been built from source, by the distro-vendor, specifically for that distro.

That means Linux compatibility has traditionally been more about source-level and compile-time compatibility than about 100% uncompromised binary compatibility. If a change breaks binary compatibility but the distro/system as a whole still builds fine, that's still compatible in most day-to-day scenarios of a common Linux end-user.

And unconstrained by having to maintain strict binary compatibility, it also allows developers to clean up historical garbage and technical debt they otherwise couldn't.

This approach obviously has both drawbacks and advantages, but calling it "broken" seems unfair, overly opinionated and really just wrong.


> "Broken" is a strong word. "Different" might be more fitting.

No, "broken" is a perfectly fitting word, an application not running is broken, not different.


Linux is working perfectly on lots of servers, computers, and smartphones. It holds a steady third place worldwide for desktops, first place for smartphones, and is dominant in the server/embedded/supercomputer markets. If a developer releases an app which cannot run on these platforms under Linux, then the app is broken, of course. Why blame a perfectly working OS for a developer's fault?


Because it is the OS that broke backwards compatibility and caused the application to stop working.

That the kernel is used in "servers, computers, smartphones" and wherever else is a metric of popularity; it doesn't magically make applications run. Developers of the libraries that others rely on for basic functionality (like GUIs) not breaking their APIs is what makes applications run.


TLDR: The developers of the GNU/Linux userland libraries do not care about binary compatibility.


Binary compatibility is only a pressing issue if the majority of the software you run is closed source and not provided by your distro.

Why waste time maintaining something which is not needed?


> Binary compatibility is only a pressing issue if the majority of the software you run is closed source and not provided by your distro.

Binary compatibility is a very pressing issue if there exists software that is closed source and you want to have more users than just some nerds.


I don’t care about closed source software. I don’t care about massive non-nerd adoption of Linux. None of those things benefit me or drive the things I care about.

Either way: The enormous success of Linux despite these issues seems to be evidence contrary to your claim.


Yes and no. It's a different mindset. In many cases Linux can run CLI software from the 1960s just fine but the desktop moves fast and breaks things often.

The legacy support in Windows is admirable but does bloat the OS and create security and performance issues, for example. I suppose one could always fire up a VM with a 10-year-old version of Ubuntu to run older software, but it is indeed a hassle.

The correct answer depends where your priorities lie.


> Yes and no. It's a different mindset. In many cases Linux can run CLI software from the 1960s just fine but the desktop moves fast and breaks things often.

Serious question to all the GNU/Linux hackers: why can't we standardize the desktop on GNU/Linux similarly?


Two people can agree on a window manager if one of them is dead.


> Two people can agree on a window manager if one of them is dead.

But why is there no similar fight concerning CLI software?


There absolutely is.

bash vs zsh vs tcsh vs csh

vim vs vi vs nano vs joe vs emacs

screen vs tmux

and thats before you even get to all the configuration tweaks a given user has made to each.


Some are trying to, but there is a bazaar of interests. Microsoft has no competing interests, they just enforce it top to bottom.

Lennart Poettering is trying to standardize from the bottom up, correctly IMHO; there's some pushback, but mostly inertia and lack of resources.


Not exactly desktop, but for GUIs Linux has GTK and Qt. GTK is by far the most common and I'd say the standard. Anyone who argues is a KDE fanboy ;)


Qt sees far more industrial usage than GTK. It has a nice C++ API and the licensing situation has always been clear for companies. GTK was never really a success outside of the Linux open source desktop community.


Neither Qt nor GTK care about backward compatibility.


I'm ready. Will you pay for my work to achieve your goal?


The problem rather is that even if you do, many distributions will not obey "your" standard.

EDIT: So the problem is that it is not a problem of work to be done, but a problem of willingness to standardize.


> Doesn't that mean Linux is entirely broken? Isn't the whole point of OSes that things that used to work, keep working?

Not Linux; the libraries are the ones breaking compatibility over time. Or missing libraries as well...


As far as the user is concerned it is "Linux", when people say they use "Linux" they do not just mean the kernel, they mean the entire OS.

On Windows these libraries have APIs that remain backwards compatible (new methods can be exposed, but existing stuff remains there). Unless the equivalent Linux libraries also decide to provide backwards compatibility, they'll just make Linux look broken (even if it isn't the kernel's fault).

(and no, this isn't something you solve by being pedantic about what is userspace and what is kernelspace and who controls what, as that information doesn't solve the actual problem - you solve it by cultivating a culture that actually cares about backwards compatibility)


> Doesn't that mean Linux is entirely broken?

It depends on how you look at it. Linux does not break kernel-userspace compatibility, but userspace libraries, like the C libraries, constantly change as security holes and bugs are patched. This is a more secure approach, as updates bring in security and bug fixes. The flip side is that this approach can sacrifice backwards compatibility if an API changes or is removed. This forces the developer to constantly rebuild and check if anything is broken. The idea is that everyone is proactive and stays up to date. That's how Linux works.

Microsoft's approach is to let the application developer bundle the libraries with the application if necessary. That's what all of those visual c++ updates are for when you install certain software, they install the libs the application was linked against. So your computer is littered with multiple library versions, some of which may contain security flaws.

The latest approach in Linux is to bundle the application and libraries in containers. It's a hybrid approach that takes the MS idea of bundling the linked libs with the application but wrapping it up into a neat container that in theory should isolate the system from any vulnerabilities the libraries may contain.


No, Linux distributions do not work like that. The kernel interface is pretty stable, but userland is not. Dynamically linked executables will not work with new versions of the libraries they depend on in most cases. This is one of the problems flatpak, snap, appimage, docker and nix/guix are trying to solve.


Flatpak, Snap, AppImage, Docker, static linking, etc. are solving a different problem: how to bypass the package manager and ship outdated and buggy libraries to the user, because it saves some money for the developer at the expense of the user.

The whole GNU foundation was created to fight this "solution"; then the Linux kernel + GNU runtime were used to create Linux distributions, which are free from the binary-only distribution problem. But developers are creative.

Why not just ship a partially compiled binary or IR code, or obfuscated source with an open-source adaptation layer? It would make the adaptation problem much easier for end users on any distribution and/or architecture.


> I mean, lots of things are wrong with Windows but I can double-click a .exe from 1998 and in most cases it'll just run.

There are many old games that won't run properly on modern Windows.


Missing a runtime... missing DLL... missing Direct3D or a different version of it, compatibility with a certain version of Windows. It is all the same everywhere.


On Windows, you can easily install lots of parallel versions of libraries.


So can you on Linux. Libraries are installed with their full version in the filename (e.g. /usr/lib/libpurple.so.0.13.0) so you can keep lots of parallel versions.
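A minimal sketch of the mechanism (libfoo is hypothetical; dlopen/dlclose are the real libdl calls; build with cc versions.c -ldl): binaries bind to a versioned soname rather than a bare "libfoo.so", so an old app and a new app can each load their own major version side by side:

    /* Load two parallel major versions of the same (hypothetical) library. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        void *v1 = dlopen("libfoo.so.1", RTLD_NOW);  /* what an old app binds to */
        void *v2 = dlopen("libfoo.so.2", RTLD_NOW);  /* what a new app binds to */

        printf("v1 %s, v2 %s\n", v1 ? "loaded" : "missing",
                                 v2 ? "loaded" : "missing");

        if (v1) dlclose(v1);
        if (v2) dlclose(v2);
        return 0;
    }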


> 10 years is just wild.

Most 20 year old Windows applications still run on Windows 10.


I've got the same email, dated 23rd October 2008, and it's very impressive that the link still works and has the updated binaries listed.


You can play the windows version on WINE, right?


The .deb package not working is sad. Any idea why that is?


When I tried to run it from the terminal, it said something about a missing Gnome thing; maybe it would work on Ubuntu (I'm on Xubuntu). Either way, I already have a ton of other games I run directly without installing as a package, so it's not really a problem - I actually prefer it.


I loved playing World of Goo when it came out.

And nothing has made me feel older than finding out it's 10 years old. I would have been like... 4 years? 5 at the most? Wow.

Congrats to them!


Games like World of Goo, Braid and Super Meat Boy really paved the way for indie games. They showed that small teams can produce quality entertainment, and that there's a real demand for it.


back then the indie scene was super interesting. https://experimentalgameplay.com/ was full of gems, and the IGF had plenty of interesting concepts too, with a lot less focus on trying to make them marketable.


Do you know why the sector did not boom?


it did boom!

edit: rewrote the whole paragraph to be more clear and remove unnecessary negative connotation against mainstream indie which I do not share

thanks to early experimental successes and profitability, indie games managed to get a lot of visibility and the genre exploded. whole businesses now revolve around making indies meet buyers (Kickstarter, Indiegogo, Steam Greenlight); as a compromise they lost some of the original experimental nature and gained in terms of polish and content quality.

I'd say the sector transformed; it lost some innovation, gained some marketability. there are fewer explorative projects now and they are harder to find, but you can still find them; currently itch.io seems to be a trove of this kind of content, but the lack of curation makes it hard to find anything among the pile of crap (no offence intended, it's just Sturgeon's law)


It also got absolutely flooded - Steam is chock-full of indie stuff, but as everything always is, 99% of it is crap, and by making it easier for a developer to get visibility, it's now virtually impossible to sift through to find the promising stuff.


It's worth pointing out that there are still amazing gems in there; just last year we had games such as Into the Breach, Celeste, Obra Dinn, Gris, and many, many more. But yeah, since the bar has been lowered, there's far more crap too.


Another issue the mobile games market suffered greatly from was all of the copy-cat idea stealing. Any original idea that showed a little promise seemed to be quickly cloned by several shady studios, further complicating the issue of discovery others have mentioned.


It did! It boomed beyond anyone's expectations, and still is.

The problem now is that there's so many indie games out there that it's a lot harder to get discovered.


For me, that happened a bit earlier with Introversion's games. Theirs were some of the first third party games on Steam.


Introversion's story is pretty interesting. Early success with Uplink, then Darwinia didn't do so well, DEFCON sold amazingly well on Steam, then I'm pretty sure they were almost financially ruined by porting Darwinia to Xbox which sold extremely poorly, followed by announcing Subversion which they worked on for years but was eventually canceled, to finally the enormous critical and financial success of Prison Architect.


Their Wikipedia page has collected the story pretty well from news releases over the years. It seems that they've had a lot of "that saved the company" moments.

https://en.wikipedia.org/wiki/Introversion_games


Can't forget Limbo. That game had a huge reach.


Don't talk to us about feeling old at 15, haha!


His account was made in 2011, which makes me think maybe (hopefully?) he meant 'I would have guessed' and not 'been' 4 or 5 years.

Either that, or at 15 he is way more involved in software than I am now :P


I think he was guessing how old he thought the game was, not how old he was when he bought it.


Totally misread it, in all honesty.


You interpreted their written communication correctly; it was just poor communication.

They wrote as if they were speaking, and it would only have been understandable when speaking because of context and intonation. "I would have been, like..." works when you then act out what you would have been like.


It's entirely possible that you played it first only 5 years ago.


My brother got it for my daughters, which means they must have been 5 and 6 years old at the time. I remember they loved it, but it hasn't made it across the couple of new Macs we've had since then. I'll have to get on to him for the codes!


Would have thought the same, although I fondly remember the music, and some tracks are still on my more recent playlists.


Likely in preparation for the free giveaway on the Epic Games Store? https://www.epicgames.com/store/en-US/


This seems like a strategy everyone can appreciate: Epic pays indie developers to update their old games


> Epic buys Rocket League publisher, announces Epic Launcher exclusive, and goes silent after backlash

Seems like a better way to put it. They are buying IP left and right and pushing their unfinished, buggy client in a very unhealthy way.

[0] - https://www.eurogamer.net/articles/2019-05-01-epic-acquires-...


I've bought a few games on Epic, and had no problems. The lack of junkware like on Steam is great – I can honestly browse the store knowing that every game is high quality (even if not necessarily in the genre I enjoy).

Achievements not popping up helps me focus on the actual game.

Cloud save, family sharing etc. are of marginal use for me and I imagine 99% of consumers outside the vocal minority.


Well, I was about to comment that the lack of cloud saves is the absolute dealbreaker for me, as I play games at home and at work, so if I can't sync saves automatically then I won't buy a game from there (the #1 reason why I haven't bought Metro Exodus, for instance, a game I am otherwise super interested in).

But I guess that makes me a vocal minority :P


I used to “cloud save” my Borderlands 2 save files by symlinking the save game folder to a Dropbox folder


Yes, that would be an option if my work allowed any services like that (and yes, it's slightly odd that Dropbox/Google Drive are not allowed, but Steam cloud sync is).


Not sure if it helps any, but box.net allows webdav access and can be mounted natively by Windows.


Oh but it's not a technical issue - both dropbox and google drive would install and work fine, we have full administrative access on our machines with no restrictions. But it is against policy.


One man's junkware is another's charming indie. GOG is a curated store and, among others, initially denied Zachtronics' game Opus Magnum. I personally appreciate the fact that Steam and Itch don't curate.


The link you provided doesn't say what you claim it says. The parent post is discussing giving away free games, not acquisitions. What is unhealthy about the way Epic is trying to attract users? I know they've received some negative press but I have yet to see a good argument for this.


Buying IP in order to pull it off competing storefronts is anti-consumer.

> thereafter it will continue to be supported on Steam for all existing purchasers


From the update at the very top of the article the GP linked:

> Epic has offered an update on the confusingly worded statement issued to press earlier today regarding Rocket League's future status on Steam, now insisting that it has "not announced plans to stop selling the game there".

Even if they do pull future sales of Rocket League off of Steam why is that a problem? Valve has purchased game companies. I've never seen anyone complain that you can't buy Left 4 Dead on GoG.


IIRC, Valve games were available on Origin.

When EA pulled their games from Steam, Valve replied in kind.

One thing that irks me is that Valve seems way more committed to interoperability, open standards and its consumers' wellbeing than its competitors (likely as a result of being privately owned), while the competitors in question use every trick in the evil corporate playbook to try to increase their market share. We've seen how they play, and the friendly façade is likely to change to a more value-extracting one once/if they gain market share.


No shit. Linux gaming makes Valve a ludicrously small amount of money compared to the, frankly, herculean effort they've put into making it viable.


Valve made L4D2. Or at least did a significant portion of the work. The problem with Epic is that they are either paying publishers for timed exclusives, or purchasing publishers outright in order to create artificial monopolies, and restricting consumer choice.


Turtle Rock made L4D. Valve bought Turtle Rock.

If paying for exclusives is bad what viable alternative strategy should Epic be using to get users to their store?


But you are talking about different games. TR made L4D and Valve bought them, but Valve made L4D2.

And they released L4D2 a year after L4D, sparking a lot of community uproar and starting things like the L4D2 Boycott groups.

Man, just talking about the game makes me want to play it again. That was such an addiction for me xD


What Valve did on L4D2 isn't relevant. My point was that Valve did the same thing with L4D as Epic is doing with Rocket League. Talking about L4D2 does not show that this point was wrong.


I was just pointing out that you 2 (parent posters) were not talking about the same game. I'm not denying your point or anything lol


Making a good store that people want to visit?

Edit: I was being serious.

"what viable alternative strategy should Epic be using to get users to their store?"

What's preferable for the consumer: buying a monopoly, or competing on quality, price, customer service, etc., etc., etc.?


> competing on quality, price, customer service, etc, etc, etc.

What metric are you using to define quality, and how would it make people use this store over Steam? How much cheaper would you need to sell games to get people to go to the Epic store? Would developers put their games on the Epic store if the price had to be so low that they'd make more per sale on Steam? Good customer service being a factor requires having customers to experience it. It helps long term but won't get your first wave of customers.


Quality in this case could be website design, the overall quality of games, or discoverability of good games.

Price in this case would be margin. If Steam takes 10%, Epic could take 5%. Game makers could pass the savings on, or increase their own margins.

Yes good customer service requires customers. What's your point? Your first customer experiences good customer service, mentions it to their friends, who then shop with you, who then mention it to their friends. It's called word of mouth. It's a powerful marketing tool.


Website design won't drive customers to a new store instead of Steam. Epic is curating their store so that should already cover quality and discoverability. Apparently they think that isn't enough.

> Price in this case would be margin. If Steam takes 10%, Epic could take 5%. Games makers could pass the savings on, or increase their own margins.

Steam takes 30% and Epic takes 12%. Very few games have chosen to reduce their price to go with the increased percentage.

For a web based store customer service only matters when things go wrong. Most customers should get a fairly standard experience of paying money and receiving their product. Do you consider Steam's customer service bad? Would a marginal improvement in customer service convince you to deal with the hassle of installing another launcher, setting up a payment method on a new store, and splitting your game library?


The store is 4 months old.

All of these things don't happen instantly.

You asked for a viable strategy, this is it as evidenced by the fact they're already trying it. Is it guaranteed? No. But nothing in business, or life, is.


I asked for a viable alternative strategy that didn't also involve paying for exclusives to initially attract customers. Apparently Epic doesn't agree with you about this being viable by itself.


This is really a big part of it for me. I have no problem using multiple launchers, but Epic has given me little reason to think that they'll secure my payment info or account details.


How about releasing the games at a cheaper price, as Epic charges so much less? Cheaper prices for games show a real commitment to the end user.


I've seen several comparisons showing that Epic's prices are not actually lower than Steam's (in general).

Can't find them right now, and don't have the time to do a thorough comparison, but looking at the first three games on the Epic Store front page that also have Steam pages with prices seems to confirm this. That's "Oxygen not Included", "Vampire: The Masquerade® - Bloodlines™ 2" and "Outward", all of which are priced the same on both stores (€22.99, €59.99 and €39.99, respectively).


> If paying for exclusives is bad what viable alternative strategy should Epic be using to get users to their store?

Offering the games DRM-free (the audience that GOG is aiming at - though I am really concerned about what GOG is currently doing with GOG Galaxy and cloud synchronisation).


I don't have time to look it up right now, but I'm pretty sure GOG had layoffs a few months ago. DRM-free doesn't seem to be a winning strategy.


Epic are trying to do with gaming what Disney are doing with TV - not pay distributors when they can do the distribution themselves, and use a strong monopoly on a sector of popular material to leverage their way into distribution.

It sucks for consumers.


Well, I just saw that 'Super Meat Boy Forever' is going to be released soon on the Epic store, but not until 2020 on Steam.

Also, after Epic bought 'Rocket League' and removed it - for new buyers - from Steam, I'm starting to lose a bit of sympathy for Epic.

After all, Epic and Valve are both making a huge amount of money, so there doesn't seem that much of a need for such aggressive strategies.


I've loved Super Meat Boy ever since it was first released all those years ago and will be buying Super Meat Boy Forever - in 2020. There's plenty of other stuff out there to distract me until then.


Rocket League hasn't been removed yet. There's no real confirmation it's going to be removed; it's still for sale, and currently being review bombed on Steam.


Too bad they don't seem to provide a Linux version here.

I'd love to revisit this old game, but I only had it available on a now-defunct Android tablet.


Because of this post, I just found out about Human Resource Machine...

I’m hooked. Currently I’m trying to optimize my Level 17 solution.

I hope these people continue producing new games.


If you want a slightly quicker REPL for HRM, I've got a clunky-but-functional CLI version that I use when I'm at work for testing ideas - https://bitbucket.org/rjp/hrm/src/master/

Pull requests gratefully accepted, etc.


When following the link to the repository, I'm forced to login, then receive an access denied error from Bitbucket. Is the repository private?


Ah, bugger, sorry, I thought I’d made it public a while back but obviously didn’t. Apologies again, it’s definitely public now.


Is there a way to view the repo without signing up?


Should be possible now; I had mistakenly thought it was public already. Apologies for that.


They actually just recently released a sequel to Human Resource Machine called "7 Billion Humans". It builds on the gameplay but adds parallelism.


It's actually a different mechanism entirely. Human Resource Machine is about optimising a single-threaded, small-memory process (Single Instruction, Single Data). 7 Billion Humans is about massively parallel programming, more like code on a GPU (Single Program, Multiple Data). The ways to solve the puzzles are very different because of this, and it's interesting, but not a way of optimising that I'm used to (so I haven't finished it yet!).


Thanks for the heads up!


I love games like Human Resource Machine. If you’re interested in more games of that genre, check out TIS-100, Shenzhen I/O, and SpaceChem!


Also in a same-but-different kind of way, Opus Magnum (but then you can spiral out into Factorio and other games of its ilk)


It sounds like you guys aren't aware of EXAPUNKS, which is Zachtronics' newest game. It's my favorite of the whole bunch; you guys should check it out! You get a hacker zine to print out along with it!


The speed at which Zachtronics pumps out quality games is astonishing to me. Every time I buy one, it hooks me immediately and 5 hours later I'm still playing.


You’re right, I wasn’t aware of EXAPUNKS. Checking it out now!


I bought the Humble Indie Bundle in May 2010 that included World of Goo. I think I have the game on Steam as well through the Steam key that was included but can’t check right now. Anyway, since some are saying it may take a while before the Steam version is updated, I was wondering if the download that is accessible through my account on Humble Bundle has been / will be updated?


I got it in the Humble Bundle. No Steam key provided and I just downloaded it from the HB site and it's not updated. Pity, I really like this game and think my kids, who weren't born when it came out(!), would really love it too.


From this comment it looks like there is a different way to access the bundle that gets an updated link. In any case, they are aware of the Humble Bundle and seem to promise to update that too: https://tomorrowcorporation.com/posts/world-of-goo-update-10...


The Humble Bundle download link has been updated; it features version 1.51 now.


I'm surprised nobody here has mentioned the Wii version. I do own the PC release (from when it was in that first Humble Bundle) but this game was a brilliant fit for the WiiMote controller. I still play it today.


There's a Switch version that uses the accelerometer/gyro in the Joy-Cons to emulate a Wiimote, and it works surprisingly well - definitely worth a shot if you want to replay it!


Agreed, the Wii version was great.


The other games from this developer, Human Resource Machine and 7 Billion Humans, are fun little pseudo programming puzzle games. They're pretty good!


And if you like those, any of the Zachtronics series of games are perfect. SpaceChem is a personal favorite (machine-layout programming), but there's also TIS-100 (modular system programming), SHENZHEN I/O (modular digital logic programming), and Opus Magnum (machine-layout programming). The last two are actually upgrades of older games they made, KOHCTPYKTOP and The Codex of Alchemical Engineering. I'm hoping that Ruckingenur II gets an upgrade eventually (debugging! in a video game!).


SHENZHEN I/O was my favourite by a mile. Although poring over a datasheet half-written in Mandarin at 2am, then going into work at 6am to look through more datasheets half-written in Mandarin - well, I swiftly began to question my life decisions.


You should check out EXAPUNKS if you haven't. It's similar to SHENZHEN I/O, but I like it better.


I've played all his games and they're all fantastic. EXAPUNKS is probably my second favourite, but I'm an EE, so SHENZHEN I/O is especially satisfying to me. Zachtronics games are my vice. As a recovering, self-confessed video-game addict I only allow myself to play educational/puzzle games now, and they toe the line just enough - while being fun as hell.


There's a Steam bundle of all the games for folks who somehow don't already have them. I dunno if 9 hours of the HN spotlight shining on them created the bundle, which would be creepy-cool, or if it's been there forever. At any rate, it would be a good deal for programmers who don't already have the games. I enjoyed the experience of reading the TIS-100 manual; it's probably worth a significant fraction of the cost of the game itself, LOL. $70 will buy you a dozen hours of the latest FPS sequel, or hundreds of programming-puzzle hours...

Now, for another blast from the past: who remembers programming MSP430 code using the "Lockitall LockIT Pro"? It's a somewhat more serious "game" in the same genre. I found a printed manual for my Lockitall next to my TIS-100 printout. I see the website is still up. My printout has a copyright date of 2013. I remember I was inspired to buy actual '430 hardware to play with, and that dev board is still lying around in my basement somewhere.


Don’t forget EXAPUNKS!


Those two are available on the Switch as well. I highly recommend anyone check them out. My girlfriend and I love to play together.


Wow. Just a few bits from the update:

> To be super clear, there are no new levels, no new characters, no new battle royale deathmatch mode. This is just a gentle remastering we did for fun.

> The framework has been replaced. This is the thing that draws all the graphics onto your screen, and sends all the audio to your speakers, etc. This means the Win / Mac / Linux version should work on modern computers again without freaking out, and you can run the game on modern displays at whatever resolution you want.

> Game now runs at a hi-def widescreen 16:9 aspect ratio by default. The original ran at a squarer 4:3 ratio.

> Resolution of graphics is doubled. The original game ran at 800×600, and the tiny graphic files didn’t scale to huge monitors very gracefully. We used a few different high quality upscaling tools to start, and then went over each image by hand, tweaking each image further as needed. In a few lucky cases, we still had the original source files and were able to use those. But if you still really want the original flavor, there’s a setting to use the original graphics, also included with the game.

> Fabulous joke about looking good in hi-def has remained unchanged.

> Brought over graphical and UI improvements from releases on other platforms, like Nintendo Switch.

> No more encrypted assets or save files. We hope this makes the game more open and friendly to mod.

> The config.user.txt file is now located wherever your save file is stored. So no more editing that file in your Program Files folder. It has a bunch of new config vars exposed as well.

> We’ll be updating the game everywhere Win / Mac / Linux versions are available in the next few days.

These updates come 10 years after the initial release. These guys love their fans.


It's worth noting that they're launching on the Epic Games Store tomorrow, I believe, free for two weeks. So this is an update they were effectively just paid to do. Though of course, it also benefits the folks on Steam.


Will it benefit the folks on Steam? I was under the impression the update was not coming to Steam for a year, although the end of the blog post does seem to suggest it'll be coming out. Maybe unclear reporting when the patch was announced a few weeks ago?


My reading of the blog post would be yes - "We’re slowly updating the game everywhere it’s currently available over the course of this and next week." - but I wouldn't take it as a definitive statement.


The post says they're updating it on every platform it's published on. The "Secret World of Goo Download Location" already has the 1.5 build; if it hasn't rolled out to Steam yet, I expect it'll come soon.


Devs who update a game to make unrestricted modding easier. In 2019.

Really cool!


Given what happened to them it's even more impressive:

https://arstechnica.com/gaming/2008/11/acrying-shame-world-o...


Internet-wide piracy statistics are meaningless; they basically show you how popular a game is in countries where properly licensed software is far out of reach financially, even with moderate regional pricing applied. (You can't lower prices as much as you'd need to make it viable without risking massive losses from region-faking first-worlders, which is kind of like piracy but would not show up in the same stats.)


90% piracy is about par for the course. Doom's and Quake's rates were even higher, IIRC. It basically tells you that your game runs on cheap PCs, and that poor people won't spend more than $5-10 on a game. Steam sales and region-based pricing are one way of trying to capture that market.


I was reading a book that was essentially the memoirs of some BBS nerd from the 80s (I do not remember the name, sadly), and he quoted a magazine article from around that time - 9-out-of-10 piracy was a thing even in the 80s, more than a decade before Doom. It doesn't surprise me that it would be the same 10 and 20 years later.


Fair point - I was not being judgemental of the poor, just impressed that these indie developers are mature enough to understand and not be salty about it (I think they almost went bankrupt at one point).


The pirated copies aren't all lost sales. If you want some more statistics from the creator of the Humble Bundle, I suggest a short read on the topic: http://blog.wolfire.com/2010/05/Another-view-of-game-piracy


No data to back that claim up. From my POV, Steam was still seen as a burden in 2008 and wasn't nearly as popular as today; pirating was still easier. I bought the game on Steam around 2011/12.


IIRC the data they used was their own; that is, they seeded a pirated version onto the usual suspect sites themselves, and were able to detect it through the online functionality (IIRC a leaderboard for building the highest tower).
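
A rough sketch of how that kind of measurement could work (this is a hypothetical mechanism, not their actual code - the endpoint, field names, and flag are all invented): the seeded build phones home with its score like any other copy, but carries a marker the server can tally.

    # Hypothetical phone-home sketch; endpoint and fields are invented for illustration.
    import json
    import urllib.request

    LEADERBOARD_URL = "https://example.com/wog/leaderboard"  # placeholder endpoint

    def submit_tower_height(player_id: str, height_m: float, licensed: bool) -> int:
        # A seeded "cracked" build would ship with licensed=False baked in, so the
        # server can estimate piracy as unlicensed / total unique players.
        payload = json.dumps({
            "player": player_id,
            "height_m": height_m,
            "licensed": licensed,
        }).encode()
        req = urllib.request.Request(
            LEADERBOARD_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status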


After AoE II, Worms, and World of Goo, I wish for a similar update for Red Alert 2.


CnC (the original and Red Alert) is getting a remaster[1] - if that does well, I can't see RA2 being far behind. It's also quite playable via CnCNet.

[1] https://www.pcgamer.com/uk/command-and-conquer-red-alert-rem... (URL is somewhat misleading - release date is unknown)


Oooh - I wonder if there will be a revamped Mac release.


Have you taken a look into OpenRA? There is a RA2 mod you might be interested in.

https://github.com/OpenRA/ra2


Worms too? I missed that news. Do you have a link?


This one https://youtu.be/Dgcm6YrFglw and I think there is also Worms HD.


When I read that Worms was getting remastered, I thought that meant the original DOS game with that style of graphics. The cartoony DirectX versions that have been around for about 15 years now always felt like the remastered “high definition” versions in comparison (“HD” relative to the DOS game, rather than HD as in the 720p/1080p specification).


OpenRA is pretty excellent.


> To be super clear, there are no new levels, no new characters, no new battle royale deathmatch mode. This is just a gentle remastering we did for fun.

This update is great and all, but I was actually... really hoping for new levels. I've wanted them for the past ten years. Pretty please? Maybe some day? :'(


They did remove the encryption on the assets and save files, so maybe some modders will add new levels?


People cracked the crypto long ago.


There was (is?) a WoG modding site that contained a lot of new levels.


If anyone has custom level recommendations, I'm all ears! I tried a few in the past and found them of middling quality.

(I say this with all possible respect to the modders, of course, their efforts are appreciated.)


Very nice. But I wonder why they encrypted the assets and save files in the first place for a single-player game. Is this ever useful?


It may help stave off blatant rip-offs for a while.

Of course, if you strike it big, someone will eventually find enough time with a debugger to get around it, but as a small indie dev you'll probably be most interested in making sure you can break even first.
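
For a sense of how cheap such protection can be, here's a minimal sketch (assumed for illustration - the post doesn't say what scheme World of Goo actually used): a repeating-key XOR stops casual asset extraction, but anyone with a debugger recovers the key in minutes.

    # Toy repeating-key XOR obfuscation; illustrative only, not World of Goo's scheme.
    KEY = b"not-a-secret"

    def xor_bytes(data: bytes, key: bytes = KEY) -> bytes:
        # XOR is symmetric: the same call both scrambles and unscrambles.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    original = b"<level name='GoingUp'> ... </level>"
    scrambled = xor_bytes(original)          # what would ship on disk
    assert xor_bytes(scrambled) == original  # trivially reversed once the key leaks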


Assets are protected by copyright. If you are not concerned about copyright you can just redistribute the whole game.


Thus preventing users from exercising their fair use rights under US copyright law.


Everyone knows you can't prevent piracy/copycats forever, but it was likely built into whatever framework they built the game with, or wasn't too difficult to add.

Yes, a determined person could still get to the assets, but it helps buy you some time as your game launches.


Some frameworks do this by default to protect graphic assets from being stolen - or at least from being stolen too easily.

