Unfortunately it turned out to be mostly games preying on the player's impatience to push microtransactions.
I wish they'd update the game to make it compatible with the latest iOS.
Companies should consider open-sourcing things like this to let greatness live on, if they can't find a way to support it directly.
The game idea is just fantastic and I think this game is a great example of how to teach stuff (here assembly) through video games.
I don't know if he's picked up more people or is still solo, but BrightRidge is a beautiful open world with plenty to explore. While everyone else was making Candy Crush and Mario knockoffs, he was building that.
I even played the likes of Infinity Blade and it felt... underwhelming.
BrightRidge really leveraged the technology available.
Also, I think the best phone games use a portrait layout rather than landscape, but many game developers don't seem to think that way, and keep making games in landscape orientation, which often requires awkward hand positions to play.
Edit: the .deb package didn't work on Xubuntu 19.04, but extracting the .tar.gz and running ./WorldOfGoo.bin64 worked like a charm
Interestingly, the part of the chain that failed was their then email marketing provider, which hijacked all the links in the email to provide click tracking and doesn't exist anymore. So all the links are dead even though they would have redirected to pages that are still functioning.
based on your post i checked if i got the bundle that included world of goo, and it turns out i did. it contained the key url as http://www.humblebundle.com/?key=... and that url is still working. as someone stated elsewhere in this discussion, the humble bundle version is not updated. it's downloading version 1.30 right now...
Unless you only have libc as a dependency but I doubt that for a graphical application.
And on Windows, a 10 year old program NOT working is extremely rare.
Hell, a 20 year old program will most likely run too (the most common issue is the installer, but that can usually be worked around).
> Unless you only have libc as a dependency but I doubt that for a graphical application.
You can also have xlib as a dependency and it'll work since unlike whatever madness is built on top of it, the X11 API is stable (but of course modern developers dislike that stability and want to replace it with Wayland that breaks everything again so they have new toys to play with). Example: https://i.imgur.com/YxGNB7h.png (from an old toolkit of mine, showing the same binaries running in 90s RedHat and last year's Debian).
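To give a sense of how little that requires, here's a minimal sketch of an Xlib-only window (just the classic boilerplate, not code from that toolkit); its only non-libc dependency is libX11, and code like this has compiled against Xlib essentially unchanged since the 90s:

    /* minimal Xlib window - build with: cc xhello.c -lX11 -o xhello */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);            /* connect to the X server */
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         10, 10, 320, 200, 1,
                                         BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);                         /* show the window */

        for (;;) {                                    /* tiny event loop */
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == KeyPress)                  /* quit on any key press */
                break;
        }

        XCloseDisplay(dpy);
        return 0;
    }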
Wayland originated, AIUI, because we've got layers upon layers of complexity and indirection to bend X11 into what we want in a more recent environment, and building a system with more awareness of the constraints of recent systems lets you implement things much more cleanly than all those layers of complexity allow (or, at least, not much worse, if you're using XWayland).
(Of course, it's not implausible that in 20 years or less, we'll think of Wayland as requiring lots of kludges and hacks to do what we need then, but you can only plan for known knowns and known unknowns.)
In the long run, it always comes down to emulation, regardless of system (e.g. Win16 is no longer a part of current Windows). Perhaps ReactOS has some promise w/r/t old Windows executables...
I mean, lots of things are wrong with Windows but I can double-click a .exe from 1998 and in most cases it'll just run. I'd assume Linux, being used a lot for servers and long-running apps, would have this too?
It's the userspace libraries that like to drop compatibility with old binaries left and right, and with most distributions it's not possible to install old versions of libraries.
Containers like Flatpak and Snap promise to solve this by just bundling the application with all the libraries it needs, frozen in time.
The kernel can then handle the rest.
It feels a bit weird to see the kernel folks put so much effort into "never breaking userland" only to have the distributions throw that all away with a "yolo, the new libyadayada is 1% faster and who cares about old programs!" kind of attitude. Is that really how this works? Or am I just misunderstanding some nuance? I'm not trying to be critical of anyone btw, I'm just trying to understand what's going on :-)
Your fallacy is assuming that "Linux" is a single entity with a single mind. Linus is very strict about backwards compatibility, but he only controls the kernel, not glibc or Mesa or KDE.
Users don't make that distinction, even a little bit.
This is also arguably a failure of the open source model in practice (though not in theory). It tends to generate projects that are really good at infrastructure and really bad at product.
The ones who fill in the product experience as best they can are the companies trying to get paying customers, because those customers won't stand for a weird experience, regardless of the technical/governmental reasons.
It's the hybrid of open-source libraries and closed-source apps that causes problems like this.
No, backwards compatibility isn't something you solve with open source, open source is perfectly capable of breaking it in all the fabulous ways possible as has been already demonstrated many times.
Backwards compatibility is something you solve by actually caring about it when you are designing your version N+1 library (and preferably, even when designing your version 1 library's API to allow for it).
On top of that, if a binary-only application really wanted to stand the test of time, it could statically link the libraries.
To make that last part clearer, imagine a theoretical "libopenfile" library that shows a file dialog. In version 1 this library exposes a function like "open_file(const char* path);". In version 2 they improve the GUI a bit (e.g. making the dialog use screen real estate better, adding a create-folder button, etc.) and rename the function to "lof_select_file(const char* path);" for arbitrary tidiness reasons. In version 3 they change the signature to "lof_select_file(const char* path, unsigned flags);" and add the ability to load plugins that can provide more actions for each file in the displayed file list, show overlay icons, etc.
As things are right now, a modern Linux system would need to have all three versions of the library installed - assuming the distribution cared about backwards compatibility to provide all of them instead of the latest and perhaps the previous version - meaning there would be three different libraries providing essentially the same functionality in three slightly different ways. Moreover, and also very important, version 1 applications do not get to use any improvements introduced in version 2 and neither version 1 nor version 2 can use the plugins introduced in version 3 - even though, again, all they want is to show a dialog to choose a file.
The way to do this right would have been for version 2 of the library to still export "open_file" but simply have it call "lof_select_file", and for version 3 to add a "lof_select_file2" (or "_ex" or whatever; if your reason not to do that is that you dislike the naming, you are part of the problem) that accepts flags, with "lof_select_file" calling it with a default set of flags. Then everything would work, from version 1 to version 3, and all applications would get the functionality introduced in subsequent versions. (As a sidenote, no, i wouldn't remove open_file from the header files to encourage using the renamed function, since backwards compatibility in source form is very important too - not everyone who gets to compile a piece of code is the developer of that code, or even a developer.)
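Concretely, version 3 of that hypothetical library could keep the old entry points alive with something like this (every name here comes from the made-up example above, not from any real library):

    /* v3 entry point: `flags` controls the new plugin/overlay behaviour */
    int lof_select_file2(const char *path, unsigned flags)
    {
        /* ... the actual v3 dialog implementation would live here ... */
        (void)path; (void)flags;
        return 0;
    }

    /* v2 name, exported forever: forwards with a default set of flags,
       so v2 callers automatically pick up the v3 improvements */
    int lof_select_file(const char *path)
    {
        return lof_select_file2(path, 0u);
    }

    /* v1 name, also exported forever: v1 binaries (and v1 source) keep
       working and still get the improved dialog for free */
    int open_file(const char *path)
    {
        return lof_select_file(path);
    }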
Now the above is a bit of a strawman, but it is exactly the situation you get with Gtk1, Gtk2, Gtk3, Gtk4, Qt1, Qt2, Qt3, Qt4, Qt5, SDL1, SDL2, etc. and pretty much most of the libraries that you find on a desktop system (notable and thankful - so far - exceptions being the X11 libraries and... Motif, although that isn't installed by default anywhere so it isn't really something you can rely on).
There are some absolutely ancient games (Xgalaga for example) that I used to play on Linux in 1996 that still play just fine today on my Debian machine.
Xgalaga works because the X11 library developers have not broken backwards compatibility. This is a good thing. This is what i'm referring to when i'm talking about a library not breaking backwards compatibility and what something like Gtk and/or Qt (and, IMO, SDL) should strive for.
EDIT: i give a more detailed example here - https://news.ycombinator.com/item?id=19813074
Yes it will! If it's free software, you can update to use the new APIs!
1. users needing to think about recompiling things is definitely not the experience they want
2. It assumes the compiler, runtime, etc don't change API ever. It also assumes the libraries don't change API ever.
Or don't change implementations of existing APIs in incompatible ways.
Instead, both happen all the time.
Having source doesn't fix this.
Distros don't fix this either unless the only stuff you use is in the distro
(and even then they still get it wrong).
I think the intended OSS experience would be for users to get everything from their distro’s repository, whose packages get recompiled by the distro maintainer.
I agree this doesn’t work in practice, but there’s a reason Debian tells people to only ever use packages from their official repos.
There can exist good reasons to deliver the application as binary even though it is open-source.
The difference rather is: Microsoft cares about binary compatibility while the developers of userland libraries on GNU/Linux do not.
Many old games that GOG distributes are not written for Windows, but for DOS.
But since the source is available and the licenses allow distribution, it's much easier to just recompile the thing.
Binary compatibility is not valued if you can just recompile the source. Why waste brain cells maintaining compatibility when the compiler can bake you a new compatible binary?
Or, you know, Snap and AppImage and Flatpak. Where your only hard dependency becomes Linux. Which doesn't break userspace.
Some early Linux games made the mistake of using low-level graphics libraries (rather than one of the OpenGL variants); these might not work because the newest drivers just don't support them.
Even AppImage is slightly over-engineered in my opinion, but otherwise the only real failing of it is that since there's no standard for ELF embedded icons (wtf is wrong with UNIX devs!?) it's a shit-show to get them to show up as anything other than a generic one.
This isn't hard, Linux people. Just standardize on a set of core libraries and stop breaking ABI outside of major versions, then you'll magically have this thing we call a "platform" upon which applications can run without being recompiled.
Create ticket in bug trackers of all major distributions and describe the problem and your proposed solution. If your vision is right and attractive, developers will join the movement. This isn't hard, it's just time-consuming.
Even Linux desktops are better than that, but not much better. They seem so emotional.
So that's what we are left with. Linux is not an operating system. GNU/Linux is, Busybox/Linux is, and all the distros are operating systems, but Linux is not. People just like to clump all these different operating systems together and pretend that they are different versions of the same OS.
If you want a "Linux" operating system that is well defined, look for the ones without Linux in the name. The BSDs have actual UNIX support (i.e. a standard library of tools), and Guix is a Unix-like/Linux-like that focuses on reproducibility and dependency stability.
No, the main difference is that on Windows the GUI libraries (USER32.DLL, etc) are considered part of the OS and have an API that remains backwards compatible. The applications are not statically linking the GUI libraries, they are dynamically linked against them, pretty much like on Linux.
However on Linux the GUI libraries do not care a bit about preserving backwards compatibility (see Gtk1, Gtk2, Gtk3 and soon Gtk4; see Qt1, Qt2, Qt3, Qt4, Qt5 and soon Qt6 - although in Qt's case they can't do much thanks to C++'s ABI issues, and TBH i wouldn't expect them to, since Qt is primarily made by a company with an interest in providing a cross-platform UI library, not the native GUI library for Linux).
Windows simply provides a much richer API than Linux as a standard that applications can rely on.
> and all they do is freeze the API in time without any updates to the provided functionality
> where are the SDL1-compat libraries that work seamlessly with alt-tab in desktop environments, like SDL2 does?
It's open source. If you need it, take care of it, OR pay somebody else to take care of it. Nobody will work for you for free.
It's like roads. Roads are free because a lot of drivers use them every day and pay taxes to build and maintain them, but the road to your house in the middle of nowhere is not free, because nobody shares it with you. However, road-making tech is open source, so you can build your own road if you need to.
Of course, you can use a toll road if you want, because it provides better service for a small fee per ride, or you can hire a development company to build a private road to your house and then pay back the loan to a bank. It's up to you.
But statements like "free and open roads are bad, because the road to my own house costs me a lot of money" are just childish. Roads are not free to build and maintain. Roads are free and open to use, but only because they are paid for by someone else. For your own private road, pay with your own labor or money.
Delivering a set of DLLs is not static linking.
That said, it's not strictly static linking, but it is closer to it in effect than the global assembly cache or the shared DLLs which resulted in the kind of "DLL hell" that Windows 95 was famous for.
"Broken" is a strong word. "Different" might be more fitting.
While I don't think anyone in the world of Linux and Linux-distros work with the explicit aim of breaking compatibility, you have to realize that Linux and most Linux-distros are built entirely around open-source software.
Most users install software via a distro-provided package-manager which fetches binaries from a repo. These binaries have been built from source, by the distro-vendor, specifically for that distro.
That means Linux compatibility has traditionally been more about source-level and compile-time compatibility than about 100% uncompromised binary compatibility. If a change breaks binary compatibility, but the distro/system as a whole still builds fine, that's still compatible in most day-to-day scenarios of a common Linux end-user.
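To make the distinction concrete, here's a tiny sketch (a made-up "libfoo", not any real library) of a change that is perfectly source compatible yet breaks binary compatibility:

    /* Build with and without -DFOO_V2 to see the mismatch. */
    #include <stdio.h>
    #include <stddef.h>

    struct foo_options {
        int width;
    #ifdef FOO_V2
        int dpi;     /* v2 inserts a field: every app still *compiles* fine... */
    #endif
        int height;  /* ...but old binaries still expect height at its old offset */
    };

    int main(void)
    {
        /* an app compiled against v1 puts height at offset 4; against v2
           it sits at offset 8 - same source, incompatible ABIs */
        printf("offsetof(height) = %zu\n", offsetof(struct foo_options, height));
        return 0;
    }

Recompiled by the distro, everything keeps working; a prebuilt binary from ten years ago does not.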
And unconstrained by having to maintain strict binary compatibility, it also allows developers to clean up historical garbage and technical debt they otherwise couldn't.
This approach obviously has both drawbacks and advantages, but calling it "broken" seems unfair, overly opinionated and really just wrong.
No, "broken" is a perfectly fitting word, an application not running is broken, not different.
The kernel being used in "servers, computers, smartphones" and whatever other popularity metric doesn't magically make applications run. Developers of the libraries that others rely on for basic functionality (like GUIs) not breaking their APIs is what makes applications run.
Why waste time maintaining something which is not needed?
Binary compatibility is a very pressing issue if closed-source software exists and you want to have more users than just some nerds.
Either way: The enormous success of Linux despite these issues seems to be evidence contrary to your claim.
The legacy support in Windows is admirable but does bloat the OS and create security and performance issues, for example. I suppose one could always fire up a VM with a 10-year-old version of Ubuntu to run older software, but it is indeed a hassle.
The correct answer depends where your priorities lie.
Serious question to all the GNU/Linux hackers: why can't we standardize the desktop on GNU/Linux similarly?
But why is there no similar fight concerning CLI software?
bash vs zsh vs tcsh vs csh
vim vs vi vs nano vs joe vs emacs
screen vs tmux
and thats before you even get to all the configuration tweaks a given user has made to each.
Lennart Poettering is trying to standardize from the bottom up, correctly IMHO; there's some pushback, but mostly inertia and lack of resources.
EDIT: So the problem is that it is not a problem of work to be done, but a problem of willingness to standardize.
It's not Linux; the libraries are the ones breaking compatibility over time. Or going missing entirely...
On Windows these libraries have APIs that remain backwards compatible (new methods can be exposed, but existing stuff remains there). Unless the equivalent Linux libraries also decide to provide backwards compatibility, they'll just make Linux look broken (even if it isn't the kernel's fault).
(and no, this isn't something you solve by being pedantic about what is userspace and what is kernelspace and who controls what, as that information doesn't solve the actual problem - you solve it by cultivating a culture that actually cares about backwards compatibility)
It depends on how you look at it. Linux does not break the kernel-userspace interface, but userspace libraries, like the C libraries, constantly change as security holes and bugs are patched. This is a more secure approach, as updates bring in security and bug patches. The flip side is that this approach can sacrifice backwards compatibility if an API changes or is removed. This forces the developer to constantly rebuild and check if anything is broken. The idea is that everyone is proactive and stays up to date. That's how Linux works.
Microsoft's approach is to let the application developer bundle the libraries with the application if necessary. That's what all of those Visual C++ redistributables are for when you install certain software: they install the libs the application was linked against. So your computer is littered with multiple library versions, some of which may contain security flaws.
The latest approach in Linux is to bundle the application and libraries in containers. It's a hybrid approach that takes the MS idea of bundling the linked libs with the application but wrapping it up into a neat container that in theory should isolate the system from any vulnerabilities the libraries may contain.
The whole GNU foundation was created to fight this "solution"; then the Linux kernel + GNU runtime were used to create Linux distributions, which are free from the binary-only distribution problem. But developers are creative.
Why not just ship a partially compiled binary or IR code, or obfuscated source with an open-source adaptation layer? That would make the adaptation problem much easier for end users on any distribution and/or architecture.
There are many old games that won't run properly on modern Windows.
Most 20 year old Windows applications still run on Windows 10.
And nothing has made me feel older than finding out it's 10 years old. I would have been like... 4 years? 5 at the most? Wow.
Congrats to them!
edit: rewrote the whole paragraph to be more clear and remove unnecessary negative connotation against mainstream indie which I do not share
Thanks to early experimental success and profitability, indie games managed to get a lot of visibility and the genre exploded. A whole business now revolves around making indies meet buyers (Kickstarter, Indiegogo, Steam Greenlight); as a compromise they lost some of the original experimental nature and gained in terms of polish and content quality.
I'd say the sector transformed; it lost some innovation, gained some marketability. There are fewer explorative projects now and they are harder to find, but you can still find them; currently itch.io seems to be a trove of this kind of content, but the lack of curation makes it hard to find anything among the pile of crap (no offence intended, it's just Sturgeon's law).
The problem now is that there's so many indie games out there that it's a lot harder to get discovered.
Either that, or at 15 he is way more involved in software than I am now :P
They wrote as if they were speaking, and it would only have been understandable when speaking because of context and intonation. "I would have been, like..." works when you then act out what you would have been like.
Seems like a better way to put it. They are buying IP left and right and pushing their unfinished, buggy client in a very unhealthy way.
 - https://www.eurogamer.net/articles/2019-05-01-epic-acquires-...
Achievements not popping up helps focus on the actual game.
Cloud save, family sharing etc. are of marginal use for me and I imagine 99% of consumers outside the vocal minority.
But I guess that makes me a vocal minority :P
>thereafter it will continue to be supported on Steam for all existing purchasers
> Epic has offered an update on the confusingly worded statement issued to press earlier today regarding Rocket League's future status on Steam, now insisting that it has "not announced plans to stop selling the game there".
Even if they do pull future sales of Rocket League off of Steam why is that a problem? Valve has purchased game companies. I've never seen anyone complain that you can't buy Left 4 Dead on GoG.
When EA pulled their games from Steam, Valve replied in kind.
One thing that irks me is that Valve seems way more committed to interoperability, open standards and its consumers' well-being than its competitors (likely as a result of being privately owned), while the competitors in question use every trick in the evil corporate playbook to try to increase their market share. We've seen how they play, and the friendly façade is likely to change to a more value-extracting one once/if they gain market share.
If paying for exclusives is bad what viable alternative strategy should Epic be using to get users to their store?
And they released L4D2 a year after L4D, sparking a lot of community uproar and starting things like the L4D2 Boycott groups.
Man, just talking about the game makes me want to play it again. That was such an addiction for me xD
I was being serious.
"what viable alternative strategy should Epic be using to get users to their store?"
What's preferable for the consumer? Buying a monopoly, or competing on quality, price, customer service, etc., etc., etc.?
What metric are you using to define quality, and how would it make people use this store over Steam? How much cheaper would you need to sell games to get people to go to the Epic store? Would developers put their games on the Epic store if the price had to be so low that they made more per sale on Steam? Good customer service being a factor requires having customers to experience it. It helps long term but won't get your first wave of customers.
Price in this case would be margin. If Steam takes 10%, Epic could take 5%. Game makers could pass the savings on, or increase their own margins.
Yes good customer service requires customers. What's your point? Your first customer experiences good customer service, mentions it to their friends, who then shop with you, who then mention it to their friends. It's called word of mouth. It's a powerful marketing tool.
> Price in this case would be margin. If Steam takes 10%, Epic could take 5%. Game makers could pass the savings on, or increase their own margins.
Steam takes 30% and Epic takes 12%. Very few games have chosen to reduce their price to go with the increased percentage.
For a web based store customer service only matters when things go wrong. Most customers should get a fairly standard experience of paying money and receiving their product. Do you consider Steam's customer service bad? Would a marginal improvement in customer service convince you to deal with the hassle of installing another launcher, setting up a payment method on a new store, and splitting your game library?
None of these things happen instantly.
You asked for a viable strategy, this is it as evidenced by the fact they're already trying it. Is it guaranteed? No. But nothing in business, or life, is.
Can't find them right now, and don't have the time to do a thorough comparison, but looking at the first three games on the Epic Store front page that also have Steam pages with prices seems to confirm this. That's "Oxygen not Included", "Vampire: The Masquerade® - Bloodlines™ 2" and "Outward", all of which are priced the same on both stores (€22.99, €59.99 and €39.99, respectively).
Offering the games DRM-free (the audience that GOG is aiming for - though I am really concerned about what GOG is currently doing with GOG Galaxy and cloud synchronisation).
It sucks for consumers.
Also, after Epic bought 'Rocket League' and removed it - for new buyers - from Steam, I'm starting to lose a bit of sympathy for Epic.
After all, Epic and Valve are both making a huge amount of money, so there doesn't seem to be much need for such aggressive strategies.
I'd love to revisit this old game, but I only had it available for a now defunct Android-tablet.
I’m hooked. Currently I’m trying to optimize my Level 17 solution.
I hope these people continue producing new games.
Pull requests gratefully accepted, etc.
Now, for another blast from the past: who remembers programming MSP430 code using the "Lockitall LockIT Pro"? It's a somewhat more serious "game" in the same genre. I found a printed manual for my Lockitall next to my TIS-100 printout. I see the website is still up. My printout has a copyright date of 2013. I remember I was inspired to buy actual '430 hardware to play with, and that dev board is still lying around in my basement somewhere.
To be super clear, there are no new levels, no new characters, no new battle royale deathmatch mode. This is just a gentle remastering we did for fun.
The framework has been replaced. This is the thing that draws all the graphics onto your screen, and sends all the audio to your speakers, etc. This means the Win / Mac / Linux version should work on modern computers again without freaking out, and you can run the game on modern displays at whatever resolution you want.
Game now runs at a hi-def widescreen 16:9 aspect ratio by default. The original ran at a squarer 4:3 ratio.
Resolution of graphics is doubled. The original game ran at 800×600, and the tiny graphic files didn’t scale to huge monitors very gracefully. We used a few different high quality upscaling tools to start, and then went over each image by hand, tweaking each image further as needed. In a few lucky cases, we still had the original source files and were able to use those. But if you still really want the original flavor, there’s a setting to use the original graphics, also included with the game.
Fabulous joke about looking good in hi-def has remained unchanged.
Brought over graphical and UI improvements from releases on other platforms, like Nintendo Switch.
No more encrypted assets or save files. We hope this makes the game more open and friendly to mod.
The config.user.txt file is now located wherever your save file is stored. So no more editing that file in your Program Files folder. It has a bunch of new config vars exposed as well.
We’ll be updating the game everywhere Win / Mac / Linux versions are available in the next few days.
These updates come 10 years after the initial release. These guys love their fans.
 https://www.pcgamer.com/uk/command-and-conquer-red-alert-rem... (URL is somewhat misleading - release date is unknown)
This update is great and all, but I was actually... really hoping for new levels. I've wanted them for the past ten years. Pretty please? Maybe some day? :'(
(I say this with all possible respect to the modders, of course, their efforts are appreciated.)
Of course, if you strike it big, someone will eventually find enough time with a debugger to get around it, but as a small indie dev you'll probably be most interested in making sure you can break even first.
Yes a determined person could still get to them but it helps buy you some time as your game launches.