> Most commercial computer games are not only proprietary software, but also require proprietary platforms such as Microsoft Windows and Steam. This means you do not have the ability to understand or maintain your computer.
I understand my gaming machine very well. I don't often dive into the code behind my Linux boxes, or look up the Darwin source for my Mac either. Having access to source isn't required for running and maintaining a system.
> Most commercial computer games require very powerful computers.
False - check out LowSpecGamer or the Potato Masher PC build. Many PC games are quite capable of running on a wide range of hardware; you may just have to turn down the rendering resolution. The most popular genre of games today (MOBAs) is known to run on quite old hardware. The claim that you must buy a GPU more expensive than an entire laptop is disingenuous as well; for example, the Nvidia GeForce GTX 1050 Ti can run many modern games at 1080p quite well and costs about $160.
> gaming-hardware-fetish-industry where people spend all their disposable income on bullshit such as graphics cards that cost as much as a laptop.
What I spend my money on and get enjoyment from is my own business. I enjoy reading about and sometimes purchasing new hardware. You might see it as a waste of time and money, but I do not or else I wouldn't do it.
> that means they don't require bullshit like Windows and Steam
I am not a big fan of Windows, but it gets the job done. Steam is an excellent source of games and is a big part of why PC gaming is in such a good place right now.
edit: more info about hardware
As for "spurn the computer game industry", that advice should apply to jobs in it. The working conditions are terrible. Game developers need to unionize, like Hollywood did.
The problem is that a game is not just a piece of software. It's a creative work that people pour years of their lives into - not only on the code side, but also in design, art, audio, writing, etc.
Suggesting that games should be free is like suggesting that books or art should be free - and yet people wouldn't suggest that, right? We understand that art and creative work doesn't come for free, and that even artists have bills to pay.
(As for high hardware requirements - that's not right either, there are plenty of great games that will play just fine on an i3 with an integrated video card. Most games are not AAA.)
It's the same as in other areas of human endeavor - the larger the business, the more constraining it's going to be. An executive at EA is closer to an executive at McDonald's - your concern will be market fit, scalable processes, reproducible results, etc. Not creativity.
But if you're a designer / developer at a small studio (or a team inside a big company that somehow managed to secure independence - it does happen!) then you're more like the executive chef at a local gourmet restaurant - you can stretch your creative muscles and do interesting things. However, the price is that you will probably never become a household name and a global brand. And whether that price is fair is entirely up to you.
PS. And if you want games that don't conform to societal expectations, you should really check out all of the avant garde / zinester games that are being made outside of the mainstream. There's a lot of boundary-stretching work being made if you look for it.
Many people who contribute to OSS are doing it because they use it in their work.
No one uses games in their work.
More significantly, all professional game developers - every single one of them - are up to their eyeballs in work. It's the nature of making games: they are never finished, and there is always more to do.
Professional game developers are very unlikely to have any side project or contribute to OSS (I was one for years).
What's more realistic is that the money being spent on development also funnels into tools that everyone can benefit from, and to a degree that's already happening with stuff like Unreal.
The problem is, now that Unreal and Unity are so readily accessible to anyone, a FOSS competitor is unlikely to spring up. In my experience, game developers and artists are paradoxically willing to shackle themselves to proprietary software (and actively deride anyone who opts for the "inferior" open-source alternatives), whereas I can make a lot more money as a software engineer using FOSS languages, libraries, and platforms.
Unity has made commitments to open source and is open-sourcing more of its tech, but full source access remains a very, very expensive option reserved for AAA teams.
Open engines like Godot and Irrlicht are still around despite all that. Some of the building blocks for a AAA-class engine remain somewhat closed (due to licenses on platform SDKs or third-party libraries), but overall you can make some pretty advanced game engines using openly available graphics techniques.
Why is that paradoxical?
I understand that if no decent alternatives to tools exist, there is a cost-benefit justification for buying good software. But if artists and game developers are so passionate about what they do that they accept more work for less pay, why can't they be more passionate about FOSS? I'm passionate about FOSS precisely because it allowed me to explore and hone my creativity at a time when professional tools seemed well out of reach.
If open source really had a plan to provide income, food, and a roof for the people contributing, they would come.
But just standing at the mining pit with empty pockets, yelling "You are all wage-slaves" - that attitude wears thin pretty quickly.
However, I think by focusing on Steam and Windows the author misses a bigger threat.
> Most commercial computer games require very powerful computers. This means that a less-wealthy person cannot play online games with their more-wealthy friends. This is especially a problem for young people, for whom gaming is an important social activity. Also it's a problem for people who don't earn massive 1st-world salaries.
Most of the young people I know don't play PC games (exception: the ones who like Minecraft). They play on consoles or mobile devices. A PS4 or an iOS/Android device is way more restrictive. On top of that, mobile games are particularly pernicious in their design. Many of them may not even be designed with children in mind, since the goal is to trap a handful of whales, but they still lock children in the Skinner box.
The problem is advertising is what sells games now. When everyone can shovel a game onto Steam and burn a few people with a game in beta hell, people start to clutch onto their wallets.
The same thing happened with mobile dev and crapware.
While I think there are people in it to blame, I don't think "the industry" as a whole is the problem; they're as much a victim as the gamers at this point.
We're in a bubble and it's slowly deflating (see games like Watch Dogs 2, Titanfall 2, and Dishonored 2); I'm worried about the devs who will get caught in it.
I don't see why they should be free as in beer - developers need to get paid.
As for being open source, it depends on why you think that’s important. To me, the goals of open source are:
- Make sure that you aren't building your business on top of proprietary software. This isn’t relevant for gamers.
- To increase the productivity of all software developers. This is achieved by open source engines.
The author might as well say that we shouldn't use any proprietary platforms (like, say, Hacker News).
- Restrictive DRM ruining the experience
- Servers being suddenly shut down with no recourse
- People who implement their own servers being threatened with legal action
- Popular features being removed
- Communities shifting focus just to appease advertisers
- Spying features being (ineffectively) implemented in order to prevent cheating
- Exclusivity deals, and authors just flat out refusing to port things to other platforms
- High turnover, high stress working conditions, lots of duplication of effort for little gain
Not that any of these problems are specific to games, but the point is that these things don't generally happen with free software. I appreciate the vast improvements we've made in microeconomics to get games to the point where the average quality of a game is incredible and the price is ridiculously low, but the whole thing stinks of a bear market right now.
To me the golden era of gaming was about 10 years ago. Game engines weren't designed to make Hollywood blockbusters (see the current Battlefield engine and series). They were easily moddable. Maybe not surprisingly, from that era came Team Fortress Classic, Counter-Strike, Desert Combat (the precursor to Battlefield 2), and mods you've never heard of, like Tribes Football, which anticipated Rocket League by two decades.
In other words, almost everything that exists today existed a decade or more ago, and probably had more features. The current business model favors locked-down engines with semi-regular DLC releases or an entirely new game every year or two. This is horrible for building communities and pushes publishers to neglect games.
A more open view of game development would avoid a lot of this constant reinvention of the wheel and should create better content.
Now if they could take what they created and own it (and really own it, not just rent tech via licensing) and have a profitable business model going forward, there's no telling what that would do for the gaming industry as a whole.
If anything, now you can easily get a game engine for free so you don't have to mod a game to get one.
For now gaming communities have managed surprisingly well to keep old games alive, the biggest threat I see here is the growing move towards "always online" models and no provisions for local multiplayer in AAA titles.
Games are made for the systems people have, not the other way around. The only thing that could make people switch en masse to Linux for games would be to release the latest FIFA or CoD on that platform and nothing else. Of course, for EA that would be like killing the goose that lays golden eggs, so why would they?
I was recently told that we should develop our games for Linux. And while I agree with the sentiment, the truth is that if a game has PS4/X1/PC versions, the PC version will be 5% of all sales, if not less - it doesn't make any financial sense to spend time on various sub-segments of the PC market.
As for the comment that you need a super-expensive PC to play the latest games - also nonsense. I have a really old desktop PC with a Core i5-750 (a seven-year-old CPU at this point!) and a GTX 750 Ti, and it plays any new game on low-medium settings.
Development of GPGPU has been transformative for some areas of science and technology, but that development arguably has been paid for and made feasible only as a side effect of building what is mostly entertainment systems.
Also, I prefer desktop PCs to laptops and (by an incredibly wide margin) to those glitzy slates the general population seems to have been poking at for their limited computing needs over the last decade, so I'm actually pretty happy about incentives to keep desktop PCs available and more or less affordable, at least in rich countries.
Games greatly improved as people shared the effective techniques they used. Game engineers didn't really worry about this loss of proprietary info, because they were showing techniques, not code, and because in a quid-pro-quo relationship they expected that people who learned from their technique at one GDC would be presenting two years later, by which time the original presenter would be digging in on their next game.
This happens internally at large companies like Sony. For instance, the facial mocap tech behind Uncharted 4 went through early iterations at other studios as well. This works for big companies. I'm not sure if they talked about it at GDC.
Apparently a few things have harmed the information sharing at GDC. Studios might be getting more paranoid, and the pure/deep technical talks aren't as popular as "monetization strategy" talks. Really, there might only be 100 people in the world who want one of the deep technical talks.
Also games like overwatch play well on 10 year old computers and $50 graphics cards.
Devs spend millions even on a simple game. It's a lot of bespoke software.
> beautiful games like The Battle for Wesnoth
> TAKE BACK CONTROL OF OUR GAMES!
> let the computer games industry and their over-priced fetish-hardware languish!
Saying something doesn't make it so.
Some other largely overlapping lists of games whose creators collect no rents:
https://wiki.debian.org/Game / https://packages.debian.org/stable/games/
https://directory.fsf.org/wiki/Category/Game (site linked to in post, but not the relevant category)
This. I'd love to try out a variety of fun Half-Life 2 or Quake 3 mods (which all work just fine on an old, low-end laptop running Linux), but everyone I could conceivably play them with is too preoccupied with Overwatch, Battlefront, Call of Duty 36, or whatever closed-server, walled-platform, resource-intensive game came out within the past year or two. Open Arena? These graphics look old, it must suck.
The graphics card comparers and obsessors are paying lots of money for higher power requirements, more noise, and higher failure rates. They can be safely ignored as a market.
I can get relatively cheap used laptops for my little computers-for-poor-kids project partly because the gaming people spend a lot of money at the high end.
There are several great open source games, and older games that went open source after the fact (the id engines et al.).
Spurning the industry seems a bit like saying: Hey movie industry - I've already got enough films, thanks, you don't need to make any more. Or telling writers that there are already enough books.
(FWIW, I hope to make a living from creating video games one day. I'd feel hypocritical not to buy games I want, and try to content myself with a selection of older FOSS games.)
That is a reasonable opinion.
It's ugly and not very fun.