This humble, relatively small dev team has written their game on top of a barebones engine and ported it to three more platforms and architectures, yet multi-billion dollar studios claim supporting anything beyond x86_64 Windows is an impossible feat.
I have so much respect for Wube. Absolutely amazing quality and care in everything they do.
I was just saying to my wife last night when her Lego Hobbits game crashed that I don't recall a single crash in my 1000+ hours in Factorio. Not one!
Factorio is the standard against which I compare not only other games, but all other software. For polish, stability and craftsmanship. I hope they read this thread because the love for the game everywhere is truly well deserved.
It is probably very unpopular, but maybe one has to start asking in what way humanity as a whole benefits from the creation of such addictive games.
Theodore Kaczynski wrote in one of his books that computer games will increasingly distract younger generations (males are obviously more affected) from solving urgent issues, resulting in an erosion of freedom. Maybe he was right.
I hope you understand that if people were not spending 1000 hours (presumably over years) on "addictive" games, they would be spending those 1000 hours on something else that was available in their time.
Nobody can or does spend 12 hours a day solving "urgent issues"; we all need to do something to decompress and relax, and video games are a modern way of doing so.
Video games are dopamine machines like few others. I fear they might cause us to waste more mental capacity than older pastimes.
I'm inclined to think of it as stealing too much of people's brain power for too little return, like TikTok etc.
Doesn't it sound almost undeniable that we would be healthier as a society if people were forced to do sports, read a book, create something, or talk to their friends in order to relax instead?
(I play lots of video games myself so I guess I'm just theorizing here)
Why do you think spending time solving hard logistical optimization puzzles is such a worse hobby than sports or reading books or painting or whatever? Even if it’s not your own cup of tea, why do you care if other people enjoy it?
It’s certainly no less valuable to society than e.g. playing chess or go or poker, and on average probably a better use of time than reading the newspaper or chitchatting on this website. Many of the skills learned are largely transferable to solving other kinds of difficult technical problems.
You can make an argument that aerobic sports are valuable for general health, but beyond the exercise and some measure of social activity, the game part of the sport doesn’t have any a priori importance, and if people prefer to get their exercise some other way that doesn’t seem like an inherent problem.
“Forcing people” to “relax” with particular activities seems pretty authoritarian.
First of all: it certainly is my own cup of tea; like I said, I play lots of video games myself.
On a societal level I'd say there are basically two reasons why I think video games might be a worse pastime than the older ones. First, you are more likely to spend "too much" time on video games because they are more addictive: you don't really get tired, and you can play them at basically any time. Second, I think video games are a lower-value way to spend your time. Sure, you might improve a bit at problem solving, but I think the value per unit of time spent is still very low. This might even come at a cost, because really figuring out/performing well in a game can be very mentally taxing, at the expense of more productive uses of that mental capacity.
These two points apply to the examples you mentioned as well: reading news and talking on social media. I'm not saying video games are the only bad hobby. Apropos Ted Kaczynski, it might be tempting to say that most "post-industrial" hobbies are the bad ones.
Lastly, I'm not talking about "forcing people". Even if choosing video games as a hobby is a bad idea then people (including me) are free to make bad choices.
>> we would be healthier as a society if people were forced ...
* * *
People have been complaining since forever that other people waste too much time on board games / novels / playing or watching sports / playing or listening to music / traveling / attending live theater shows / going to the pub / gardening / hiking / stamp collecting / politics / mathematics / philosophy / cooking / going to restaurants / whatever other activity you can name.
It’s fine to say that many people would do well to prefer activities that are interactive and creative vs. passive, physically active vs. sedentary, social vs. individual, skillful vs. mindless, etc. But Factorio per se seems like pretty high-hanging fruit, especially if people are playing it together. (Disclaimer: I don’t really play computer games.)
Oh, my bad, I see how I miscommunicated the "forcing" thing. In my head I was imagining an alternate world where video games simply didn't exist, so there people would have no other option than to (be "forced" to) read a book etc.
And yes, I agree that Factorio might be one of the "smarter" games out there. But the point about addictiveness, and the point about being mentally draining rather than relaxing, still stands for Factorio.
Personal anecdote: I spend much of my time in high-ranked (think top 0.1% of players) video game matches, and I think it might be too mentally stimulating to the point where I don't have the energy or willpower to get other useful things done.
Another analysis is to go ahead and consider video gaming primarily as a compulsive & addictive pastime. If so, it is one of the least harmful ever.
If video games are scratching some dark and antisocial itch leading to hours spent with them, it's easy to think of far worse outlets -- drugs/alcohol, sex/porn addictions, gambling, food, etc -- that have been with us forever.
>Doesn't it sound almost undeniable that we would be healthier as a society if people were forced to do sports, read a book, create something, or talk to their friends in order to relax instead?
I disagree. Firstly, "video games" is a very broad term, so I'm not claiming that there aren't video games made solely for the purpose of making money off people addicted to the dopamine rush. Bad actors are unfortunately present in all sorts of industries, but anecdotally, they are outliers.
For me, the video games that I play or played have brought me closer to my friends and left me with memories that I can look back on and smile at the thought of.
In fact, Minecraft Redstone[0] was a massive catalyst in developing my interest in putting small parts together into a working system, which eventually led me to learn programming.
If I'm understanding you correctly, you are suggesting that video games and similar dopamine machines are replacing more traditional pastimes such as sports/reading etc. Are you suggesting that before the prevalence of video games, people only ever engaged in things like sports/reading books?
TV is a thing, and before TV, people who didn't want to engage with sports or reading spent time with their friends in ways which weren't exactly productive. There are undeniable changes to society since the introduction of the WWW/electronics in general, but I don't think they've made society unhealthier in general; they've merely changed it in ways that we are still getting used to.
One last thing to note is that video games are a great way for me to also talk to my friends and have a purpose for hanging out (albeit online). Lots of them live way too far away for us to feasibly meet in person on a regular basis, and playing a multiplayer game together is a great way to engage with each other and also have some fun on the side.
Basically, what I'm trying to say is that video games aren't necessarily unhealthier substitutes for the pastimes we engaged in before they became prevalent.
Factorio has provided a sandbox for me to visualize and acquire skills in managing bottlenecks in data flows; skills that I have applied, and am applying, as the point software engineer resolving critical bottlenecks in multiple e-commerce systems. I won't speculate as to the commercial value of resolving those data bottlenecks here, but whatever dollar number you're imagining is, I suspect, probably missing multiple zeroes.
Worth mentioning: 1000 hours over the six years since Factorio was released comes out to about 27 minutes per day, on average.
1000 hours since the 1.0 release in 2020 comes out to about 1 hour and 22 minutes per day on average.
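For anyone checking the math (taking "six years" and "about two years since 2020" at face value, and counting every day of the year):

    1000 h / (6 × 365 d) ≈ 0.46 h/day ≈ 27 min/day
    1000 h / (2 × 365 d) ≈ 1.37 h/day ≈ 1 h 22 min/day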
Regardless of whatever "benefits" Factorio might have to society as a whole, one and a half hours a day of recreation sounds completely reasonable to me.
I've played about 100 hours of Factorio and have often wondered if its approaches can be generalized beyond the game. Does anyone besides the poster above believe they can be?
Does he have children? Time for computer games dwindles if you have kids, except for the computer games you can play with your kids. :)
I think games are fine as long as you are living an otherwise full life. Games are only problematic if they are an opiate that prevent you from achieving goals outside of virtual entertainment. Games should not be a present day "soma" to paraphrase some words from https://en.wikipedia.org/wiki/Amusing_Ourselves_to_Death
Also, 1000 hours on a game can be similar to how many hours of TV many people watch in a year; at least games involve more active engagement and problem solving.
Sure, but perhaps it is worthwhile to ask when these games have become "soma," or at least become that for certain individuals. The above poster was reacting to someone saying they put 1000+ hours into a single game. Maybe we'll reach the conclusion that it's time well spent, but I don't think it's wrong to at least ask the question of what this impact is on society. People don't seem to have the same antagonism to the question when it gets asked about, say, social media addiction.
For what it's worth, I do think we should ask the same questions about TV, or browsing HN. There are a lot of things that it's easy to mindlessly do for hours that, if given some reflection, we might find we'd rather not spend so much time on.
Side note: If you want to teach your kid (age 3+) to read, with the assumed prerequisite that you read lots of books aloud together and the kid wants to learn, let me recommend Bloomfield’s book Let's Read from the 1960s, about which you can see tokenadult’s recommendation here: https://news.ycombinator.com/item?id=4665466
The book is organized so that in the first ~100 lessons only one new spelling–sound association is introduced per lesson and all words use strictly regular spellings. Lessons have (initially somewhat stilted and then gradually more natural) sentences constructed from previously seen words, so that there is a natural spaced repetition built in. This is much more efficient than most reading curricula, because despite many exceptions English is at its core a phonetic system.
It takes about 10–20 minutes per "lesson", you can do maybe 2–14 lessons per week (we aim for about 1/day), and there are ~250 lessons, so overall it ends up taking about 6–18 months from start to finish, maybe 50–100 hours in total. Afterwards, your kid will be ready to read pretty well anything they can understand, and after a further year or two of practice (reading whatever kind of material they want) will be a strong and fluent reader.
Only real prerequisites are that the kid is interested and can sit still for 10+ minutes at a time, can recognize the letters of the alphabet, and can more-or-less make all of the sounds of spoken English.
If you would like to feel this on a more visceral level, I have been playing "Tunic" recently and it is a really interesting simulation of "you are a child who can barely read one word in twenty in the manual but you sure are having a lot of fun bumbling around in Zelda".
Those 1000+ hours are dwarfed by the hours my grandmother spent on her jigsaw puzzles. Also, I think most people play games as a form of relaxation, time which they would not otherwise spend solving urgent issues.
At least games are interactive. I've never played, but Factorio looks quite healthy for the mind.
If you want to rail at wasted time, there are much better targets out there. Try corporate tax accountants, or insurance salesmen, or fossil fuel PR goons; people actively destroying value.
We have PFAS in the rain; plastic on Mt. Everest and in the Mariana Trench. We have a warming planet alongside proxy oil wars. You want to solve those problems by looking at addictive computer games? Really? Not even reality TV, or corporate media monopolies - games? ..... I think your high horse is pretty sickly looking tbh.
The problem is that there is some kind of evolutionary pressure to create increasingly addictive games, as every new game is in competition with every other game.
I'm not saying that games cannot be beneficial, especially for learning English, but if you speak with a boy you will understand how these games have taken his brain hostage, as many of his thoughts will be about the computer game he plays.
> The problem is that there is some kind of evolutionary pressure to create increasingly addictive games, as every new game is in competition with every other game.
This is a sweeping statement that doesn't apply to many games, Factorio included.
A game like Factorio being addictive gives little back to the developers, as there are no microtransactions; after the game is bought, the transaction is over. If one would like to be cynical about it, games like this only have to trick people into buying a copy (and playing just for long enough for a refund to not be possible).
In the case of games with microtransactions, GTA V being the most profitable example in history, then yes, addictiveness does bring more cash to the company who owns the game.
Now, this opinion you held was incomplete and thus wrong. What else are you wrong about in your mental model of videogames?
I think video games are very comparable to chess. I mean, chess:
1. Is dull
2. Promotes laziness
3. Is sedentary
4. Stunts societal progress and growth
5. Limits the mind
6. Promotes violence
The list goes on!
Yet can you believe that people can spend hours of their time on a game of chess? And parents allow their children to play it, which is most definitely irresponsible parenting. I would certainly never allow MY child to play chess when it's clear that it leads to such negative outcomes - nay, downfalls!
Everything in life is literally a game. Game design exists at every level of life and the most amazing ones are the games that bring people together and develop “real world skills.”
People do not need to be productive 100% of the time. Let people have fun without shaming them. Just think of how far science and technology has come in the last decade.
Heh, I wonder what Kaczynski would think of the game's message about the consequences of exponential industrial development, and resulting pollution (and specifically the dangers coming from cutting trees or polluting them so much that they die, at higher difficulty...)
> [...] yet multi-billion dollar studios claim supporting anything beyond x86_64 Windows is an impossible feat.
Who actually claims that? The issue is more that the ROI just isn't there. Remember that porting to a platform incurs additional support costs that need to be paid.
Behold, the actual reason AAA studios can't. Engineering excellence is not something you buy or otherwise invest in. It is your culture. It is who you hire. It is what kind of expectation you set internally for yourself and your peers. That is only marginally related to budget spent on salaries.
If you actually practice engineering excellence, the investment is small. If you don't, the investment is nonsensical.
Blizzard seems to disagree, as I mentioned in another comment. I’m just not sold on this argument tbh. It makes sense at face value, but that’s not enough reason for me. Especially given how many games ran on both PC and Mac from 2010-2020 and continue to do so. It’s not the empty desert for gamers it was in the 2000s.
And Blizzard also stopped trying to port more advanced titles like Overwatch. Even Diablo 2 Resurrected didn't get an Apple Silicon release (despite the fact that they already made an ARM version for Switch).
Maybe you're right, and the technical hurdle is fairly small relative to the money they stand to make from it. Whatever the case is though, Apple's current offerings are not really attractive to developers. People would rather target the APIs they already know than do Apple's dirty work for them. Maybe Apple should take a page out of Sony's book and actually compensate or assist the studios building on top of their technology.
Specifically, those two titles don’t have Mac releases for any architecture. Apple Silicon support is somewhat orthogonal, since there weren’t any major API removals between x86_64 and arm64. (Unlike in the old days, when, say, Carbon was removed in the transition from x86 to x86_64.)
Actually it does. Every time someone has bothered to give actual numbers for games released on Windows vs Linux on equal footing, Linux sales represented barely a few percent while its support requests represented 50+%.
1. I was trying to focus on the development cost, but a few percent of 50 million dollars can get you a good support team.
2. Isn't the topic at hand mac?
3. Are you sure you want to cite that kind of number? The most prominent one says this: "Though only 5.8% of his game's buyers were playing on Linux, they generated over 38% of the bug reports. Not because the Linux platform was buggier, either. Only 3 of the roughly 400 bug reports submitted by Linux users were platform specific, that is, would only happen on Linux." "The bug reports themselves were also pretty high quality, he said, including software and OS versions, logs, and steps for replication." That's free QA, not a burden.
Unfortunately this is not how many managers see it. They just see it as more work because people are "discovering" more issues that they now have to fix. The squeaky wheel gets the grease, or in this case, the squeaky community is considered annoying and gets neglected.
Not to say it's all positives; I'm sure a big title has a lot of Linux users in the annoying 'enthusiastic youngster' phase a lot of school-age PC gamers go through. But the kind of spreadsheet math that doesn't even classify support interactions is lazy.
I've worked with multiple companies on projects with large support infrastructure and teams. I've seen many foreseeably bad business moves called "data driven decisions" based on support metrics. Metrics aren't insight, but they do let a business team justify decisions well on paper. I'll avoid writing a whole rant about supporting a call center, but I will say their cost and ancillary nature mean that iterating with the data coming out of them (or just iterating on the design of the support system itself) is poorly prioritized and full of noisy signals.
I could be wrong, but maybe Linux users are more persistent (by necessity), and thus more likely to file bug reports, where your average Windows user is more likely to just go "F this" and stop playing your game.
I worked in small and large game companies for over 20 years and this does not surprise me at all. A modern AAA title involves 100s of people, whereas a small independent can be less than a dozen. At scale a company can spend millions on advertising reach, on graphics pipelines and content creators. What they are bad at is small things and risk. If you spend $1b developing and marketing a game you must ensure it appeals to the widest demographic possible.
Unless there is a clear market of hundreds of thousands, it's really hard to shift even the meagre amount of resources it would take to do the actual port. And then, as a large company, you need to train people to do support, and you need legal people working with the platform for all kinds of shit, from placement and proper use of logos to compliance with app store rules.
Is it really that complicated from a legal and support perspective? For legal/platform compliance, I feel it shouldn't be that different between platforms; most of the work should be the initial familiarization with the requirements - or are they changing all the time like crazy?
Support should be proportional to the number of users.
If the game is multiplayer/ live service I see that things might get seriously daunting if you don't have either a good number of sales or the capacity to coordinate the whole team.
I’m curious whether people who hold this view are graphics developers themselves who’ve developed games?
The reason I ask is because there are so many different graphics APIs, of which Vulkan is the one that has the lowest rates of direct targeting.
You’re looking at custom platform APIs everywhere but Linux. Windows and Xbox only officially support DirectX (including console variants here), PlayStation has its own, and the Switch supports its own plus Vulkan (and most people don’t target Vulkan there).
Even Metal has significantly more games targeting it due to iOS support. Before people say those are only mobile games, that’s a severe underestimation of what games are on mobile.
So every cross-platform game needs to support multiple APIs anyway, and even if you count every API, Vulkan is by far in last place.
Edit: I’ll also add that both macOS and Linux lacked games even when their only API was a cross-platform one: OpenGL. Yes, the Mac's GL is outdated today, but it wasn’t always, and just like Vulkan, very few games directly targeted GL anyway.
Windows supports Vulkan just fine, but engine support is lacking. IMO Metal prevents Vulkan from being universal in engines such as Unity, so instead a lot of effort goes into DX11 as the lowest common denominator to emulate, e.g. with shaders. As a result, both DX12 and Vulkan progress is held back, IMO, because shader performance is necessarily worse when you have multiple layers of inefficient transpiling instead of running SPIR-V everywhere.
To add to that, I don't think most mobile developers are going to Metal directly; a huge number of them make use of it through Unreal or Unity. With Unity, customizing the rendering backend is not really possible without an insane amount of internal knowledge about it (not to mention source access at the read-only level or above).
I think that Vulkan is not particularly behind as an API, it's closer to a usable "almost there" state in terms of developer access. If Unity threw their weight behind improving their shader compilation pipeline and building it around SPIR-V/Vulkan, I think it would rapidly become more universal.
SPIR-V can be transpiled to MSL just fine, so I’m not sure how Metal is holding Vulkan back.
Even then, none of these game engines use directly authored shaders anymore; they use shader graphs, which make it much easier to target any backend.
In the case of Unity they also support WebGL which is much more limited than any of the other APIs and somehow WebGL isn’t holding those back.
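For what it's worth, that SPIR-V-to-MSL step is essentially one library call. A minimal sketch using SPIRV-Cross's C++ API (the caller is assumed to have read the .spv words from disk already, and the MSL version pinned here is an arbitrary choice for the example):

    // Sketch: cross-compile a SPIR-V module to Metal Shading Language
    // with SPIRV-Cross. spirv_words holds the 32-bit words of a
    // compiled .spv file.
    #include <spirv_cross/spirv_msl.hpp>
    #include <cstdint>
    #include <string>
    #include <vector>

    std::string spirv_to_msl(std::vector<uint32_t> spirv_words) {
        spirv_cross::CompilerMSL compiler(std::move(spirv_words));

        // Optionally pin the MSL version the generated source targets.
        auto opts = compiler.get_msl_options();
        opts.set_msl_version(2, 1);  // MSL 2.1 - an assumption for this example
        compiler.set_msl_options(opts);

        return compiler.compile();  // MSL source, ready for Metal's compiler
    }

(This is the same conversion MoltenVK performs internally.)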
As a gamedev, what matters is ROI: can we recoup the costs of supporting this new platform/API/etc. by selling to a large enough consumer base?
The numbers have been run many times by game studios large and small: it only makes sense to support Windows/DirectX and the three major consoles. Maybe Metal if you're targeting iOS.
No open standards. Those only earn you Stallman good boy points and you can't pay your bills with those.
No one is using OpenGL or Vulkan to a significant extent any more.
Also, not every game has to target every platform, especially when we're talking about radically different inputs/outputs!
I'm honestly surprised about Factorio going to the Switch (especially since they supposedly started working on it before the Steam Deck, which I could have seen as having been a «tech demo stepping stone»), considering how much Factorio relies on precise mouse clicking!
Windows doesn’t officially support Vulkan. It’s exposed by third party drivers, but isn’t the official API (and was indeed limited in use for a while). One could argue that it’s moot though but there are ramifications.
PlayStation supports GNM not Vulkan.
AFAIK the Switch is the only recent console to support Vulkan, but it's not preferred; the NVN API is used instead. Not counting the Steam Deck, since it's a general-purpose computer.
> Windows doesn’t officially support Vulkan. It’s exposed by third party drivers, but isn’t the official API (and was indeed limited in use for a while). One could argue that it’s moot though but there are ramifications.
There aren't really any practical differences, and if anything, Direct3D being official never stopped it from rotting away at a faster rate than OpenGL and Vulkan - older games relying on Direct3D, even D3D9 despite how widespread it was, are way more likely to be broken in modern Windows than games using OpenGL, and gamers nowadays often end up using tools like DXVK (which implements Direct3D on top of Vulkan) to get their older games working properly even under Windows.
The main reason Direct3D is more widespread than OpenGL and Vulkan is that D3D was also available on Microsoft's consoles, which were themselves very popular. It isn't a coincidence that in the past games were as likely to use OpenGL as Direct3D (or even offered backends for both), but somewhere around the mid-2000s, when the Xbox and Xbox 360 consoles started taking off, Direct3D took off as well.
DX9 remained a mainstay because DX10 was both a highly breaking update and required a new, highly unpopular OS.
Unfortunately that meant DX9 had the most market share at the time and for many years after.
While I agree that consoles helped the adoption of DX10/11/12, I disagree that it was driven mainly by the popularity of the XBox.
The big reason IMHO, and it’s similar with Vulkan, is that GL was fractured with tons of vendor extensions, much higher variation of feature support etc… DX was a more stable and consistent target.
There’s also the renewed investment by Microsoft into DX at the time, with a lot of work put into abstracting the other parts of the OS. That was probably driven by the Xbox like you say, but I think it’s a subtle difference, in that DX became the better API surface to target vs GL plus other audio and input libs, rather than winning on the ubiquity of the consoles.
Vendor extensions aren't really an issue in practice though - if your program needs the functionality the extension provides, you'll use it and have it as a requirement. If it is good/essential functionality, it'll either become part of ARB or EXT at some point or at the very least will be implemented by multiple vendors.
It isn't like applications have to support all of the extensions or anything like that. If some functionality was provided by, e.g., D3D10, then the same functionality would also be provided by OpenGL plus some extensions.
If anything, extensions are a good thing, because it is thanks to them that OpenGL wasn't stuck at the OpenGL 1.1 that Microsoft shipped with Windows, and a lot of new hardware functionality was exposed to applications before even D3D had access to it - without being locked to a specific OS or OS version (like the Vista you mentioned). And unlike Direct3D you didn't have to do a D3D9->D3D10/11->D3D12 complete rewrite of your code; you just used the new functionality where it made sense.
Game engines need to support multiple APIs anyway: DirectX for PC/Xbox, a different API for PlayStation, and yet another API for the Nintendo Switch. From what I've heard from people in the industry, you really want to use the platform-specific APIs because they're developed by the same company that built the hardware, and as such there are a lot of exposed hardware-specific APIs and tricks that can increase performance by a significant amount. And those features are guaranteed to be there for every system the game will be played on. PC, and by extension Vulkan, doesn't get that kind of hardware access, since it's made to be generic and usable across a large range of hardware that may or may not have those same features. So while you may be able to target Vulkan on PlayStation and Nintendo... performance is likely to be terrible compared to using the native APIs.
Adding a Metal renderer isn't the big deal. As others have said, the deal is that they won't get enough revenue from Apple sales to justify the development and support of Apple hardware.
Speaking as a graphics engineer, Metal is THE most painful API to support bar none (including all the console vendors) because they just had to roll their own shader language as opposed to extending an existing one. The development cycle on Apple just plain sucks, and the tools are not as good as native tools provided by IHVs. I dread needing to support Apple platforms because they choose to be completely unhinged and make zero attempts to making porting easier.
No no, that would be Microsoft around '99. Apple is just trying to do the same as Microsoft did over 20 years ago.
SGI and Microsoft were in an API war between OpenGL and Direct3D (and the boids examples, ha!). They started a collaboration to create 'Fahrenheit' (low-level) and the Fahrenheit Extensible Scene Graph (XSG). Microsoft was sandbagging and simply kept working on Direct3D, because of the opportunity. Because Microsoft was still evil back then, because they had more power.
As far as I always understood it, OpenGL and whatever was left of Fahrenheit and the cooperation became the Khronos group.
There has arguably been no market for Mac games. Until the M1, Macs didn't have good enough GPUs for many games. There were no "gamer Macs" (Macs with GPUs good enough to run top-end games).
And, AFAIK, there's very little market even for games that don't require a high-end GPU on the Mac. My guess is the majority of people who want to play games on a desktop/laptop just know to get a Windows PC.
~2009 until ~2019 (when Apple dropped 32 bit support) was actually a pretty good time for gaming on the Mac. AAA games like Human Revolution and Witcher 2 were being released about a year after their PC counterparts, lots of Unity games were coming over, GOG was supporting old games on the Mac (bundled with DosBox), and Bootcamp was available for people who needed more.
If you were a casual gamer, you'd be pretty satisfied on a Mac for that decade.
Things got significantly worse once 32-bit support was dropped. Most companies aren't interested in going back to update a decades-old game to support Mac gamers, so a ton of games that were available suddenly became unavailable. And with the M1, Bootcamp no longer works.
We'll have to see what the future brings. I never would have guessed things would have been as good as they were from 2009-2019, or that things would suddenly reverse as quickly as they did after 2019. So who knows what's around the next corner for Mac gaming.
There are lots of Intel Macs with good GPUs, but most of the base consumer models didn't have them. (The MacBook Pro has reasonable Radeon graphics, the Mac Pro even better, but that's a lot of money to play games, and besides that, some games are broken on Xeon CPUs.)
There are lots of simple games in the App Store, some ported from iOS, some indie games built on generic engines (Unity etc). The problem is that Mac titles have to be continuously supported. Apple killed compatibility with older 32-bit games with Catalina, which decimated my Steam library. I presume that's one of the reasons developers are not exactly flocking to the Mac en masse...
I feel like at this point you can no longer afford to generalize about the market this way if you're a game developer or publisher. The gaming industry is so big and so incredibly over-saturated with games that it is better to evaluate the market for your game, individually, rather than gaming as a whole.
For example, games like Factorio are incredibly niche and loved by people with builder/engineer mindsets (not necessarily working as engineers). Now you've got to ask yourself whether potential Factorio players are more or less likely to own a Mac, compared to the 3% (cited by another commenter) of all Steam users on Macs.
Also, I wonder how this 3% figure was calculated. Is it based on monthly active users? I am a Mac user and I have Steam installed, but I don't leave it running all the time because the application itself is a dumpster-fire battery hog. I only start it up when I want to play a specific game, then I shut it down. Some games don't even require Steam to be running, so I launch them directly without bothering to start Steam. Am I excluded from the monthly active users because I might go months without launching Steam, despite playing games regularly?
Developers aren't interested when Apple keeps dropping APIs.
Develop your game for x86_32, suddenly Apple drops support and you have to rewrite your entire game again for x86_64.
Develop your game for OpenCL/GL, suddenly Apple drops support and you have to rewrite your entire game again for Metal.
Develop your game for x86_64, suddenly Apple drops support and you have to rewrite your entire game again for ARM64.
That's without counting the fact that most games nowadays are developed using Vulkan or DX12, which have to be rewritten for Metal.
People are installing Windows on their Macs to play games that used to run natively on macOS. Imagine how bad the backlash would be if this happened on Windows.
Fucking no one uses Vulkan except maybe Android Skinner-box vendors. Khronos should have just rubberstamped DX12 as the future.
Come to think of it, the fact that Fahrenheit -- next-generation "OpenGL" with a DX base and SGI-provided scene graph API -- failed probably set cross-platform graphics back years if not decades.
After all, Apple's Metal precedes Vulkan by two years, and the vast majority of gaming platforms do not support Vulkan, or Vulkan is a distant third-party API on them: https://news.ycombinator.com/item?id=33742823
Mantle was a proprietary, AMD-only, Windows-only API until AMD gave up and parts (not even the whole thing) of Mantle became parts of the future Vulkan standard sometime in 2015.
So sure, sure, Apple "went out of their way to not support standardized cross-platform APIs" that didn't exist even as an idea by the time Metal was released.
Version 0.1.0 was released in 2014, and version 1.0.0 was released in 2020. They are now working on an expansion. So yeah, as far as I know it's their only game.
As you might hope, the game is exceptionally well-polished, at least once you get over the initial UI complexity.
> yet multi-billion dollar studios claim supporting anything beyond x86_64 Windows is an impossible feat.
it's because multi-billion dollar studios don't hire nimble, competent teams that can turn on a dime.
They hire large numbers of replaceable cogs and use massive engines and other commercial libraries from various vendors, in the hopes of shortening development time without sacrificing production value.
Unfortunately, this only works to some degree, and makes turning on a dime hard. Not to mention that their leadership tend not to be technically informed enough that making such a change seems risky for low benefit.
it's why i do not buy games from such multi-billion dollar studios.
Ever wonder why Steam has so few Mac OS X users? Might it have something to do with how few major games are available at all on Mac OS X?
It doesn’t exactly take brilliant insight to see that this is a bit of a chicken-and-egg problem, and so them taking this justification to avoid porting (and I think you’re probably correct that this is their reasoning) creates quite a self-fulfilling prophecy.
One problem is there used to be more games. With Windows you can buy 10-year-old releases and they (usually) still work. With the Mac, there is very little chance a 10-year-old executable would run.
Yeah, the cost of porting to a new platform needs to be lower than the money you think you will make out of it.
I think Factorio's code quality allows the developers to make the ports relatively cheap.
Big studios have so many developers that come and go and just work on legacy code that seems to make assumptions about the platform on every level. The cost becomes too high and not worth it.
It's not because one is greedy, but because it's bloated.
If I remember correctly, the studio behind Mini Metro gave up on Linux for their new game because it was too time-consuming to run and test the game. Obviously, if you lack tests and need to have manual testers all the time, new platforms won't help you. (I'm making assumptions from the blog posts I read; I might be wrong about the real reasons.)
Every Mac game I own on Steam I could have (and honestly should have) bought outside Steam: the Mac App Store, GOG, Itch, direct from the publisher. It might have been more expensive, but Steam is such a piss-poor Mac citizen I am not surprised barely anyone uses it on Mac. It's not a clear-cut case where obviously no gamers play on Macs.
I mean, yes, and that goes to literally everyone who's all "web is great because cross-platform" too, but it is worth noting that Factorio doesn't exactly ask much of GPUs compared to the games multi-billion dollar studios are putting out.
There are a lot of different GPUs and drivers to account for on Windows, and adding in Linux to that compounds the problem[0] significantly and brings in a paltry number of users, so it is pretty understandable why they don't bother. And the Mac user base that cares about gaming is kind of a joke in terms of size.
[0] Steam Deck being relatively bespoke hardware might help there.
> Contemporary integrated GPUs are also significantly faster, and while it might not be as much of a challenge to render the game for them, they do share some resources with the CPU - be it the last level of cache, or CPU cooler, so the integrated GPU working hard may cause the CPU to slow down.
> However, the point I wanted to illustrate by this post is how broad a range of GPUs there is. People see a 2D game and expect to be able to play it on essentially anything. If we want to live up to that expectations, we have to impose a lot of limitations on ourselves, because 'anything' also includes a couple orders of magnitude slower GPU than is found in an average gaming computer of today. CPUs got a lot faster in the last decade too, but mostly due to increasing the number of cores and adding wider vector computation units. They didn't get that much faster when executing serial code, which is unfortunately most of Factorio's game code. So if you play the game on a laptop with a Core 2 Duo and GeForce 320M, you'll run into framerate issues due to the weak GPU much sooner than a UPS slowdown due to the old CPU.
But, yeah, Factorio is one of those games that don't impose a maximum limit on entities and are mostly bottlenecked by single-threaded CPU performance rather than GPU performance, like any simulation-heavy game with lots of entities, each of which can have a gameplay (and not just display) effect on another:
This is the only thing they have done in the past ~8 years, right? It's definitely a different business model than big studios. I do appreciate the depth of care this game has gotten, but it doesn't come for free.
No VC money, just three people developing a game they'd like to play themselves, and it grew organically from there, financed by the money made from selling the game.
They just ported their game to Switch (another ARM platform) so once they did the legwork to make code decently arch and GPU API independent, porting to macOS probably isn't a massive overhead.
> yet multi-billion dollar studios claim supporting anything beyond x86_64 Windows is an impossible feat.
It makes sense though, there are different circumstances in play.
One day back in ~2013 (or so), as I returned from work (a gamedev studio), I bought my first Android device (some cheap tablet that was sold in a basket) from a mall I often visited after work. Once I arrived at my place and checked it out for a bit, I decided to port my previous (current at the time) 3D game engine to it - downloaded the SDKs, read some docs and tutorials, etc., and banged out code until some hours later I got the test game running[0] on the device.
All it took was some spontaneous decision and a few hours of my time at home. That's about it.
However, if something similar (porting to a new system) were to happen for the game engine I worked on at work, it would take much more effort - even assuming the decision was already made. Different programmers worked on the different aspects of the engine that would need to run on the new platform (graphics, audio, and low-level/system support would be the least of it), and since the engine relied on middleware, we'd also need to ensure the middleware supported whatever we wanted to target; the programmer responsible for each piece of middleware would have to take that into account and ensure the legal side was covered too (some middleware licenses treat ports - or basically any executable you ship - as separate licenses). We'd also need the buildmaster to work on integrating the new platform into the automated builds and testing (and perhaps write some basic tests if needed). QA would need to allocate time to test on the new platform: not only the platform-specific functionality, but also functionality that had nothing to do with it, to ensure all new stuff worked as expected. This in turn could cause ripple effects - for example, some new engine functionality that worked on a powerful platform might prove too demanding or slow for the new one. At that point someone would have to decide (meaning meetings, etc.) whether the new functionality would remain, whether it would be altered to work on the new platform (meaning research time spent on this), or whether it would become an optional feature only available on the powerful platforms - which would require not only programmer time to implement the switching but, depending on the functionality, potentially also artist/designer time to specify where it would be used, and possibly some fallback functionality too (again, more programming time).
And that would be for a small-to-midsized AAA game developer at the time with a rather small engine team - in larger teams and companies there'd be way more people and friction involved. All that would translate to a lot of extra time and thus cost.
If you are an indie developer it is very easy to just add support for something, but it is much harder if you are a multi-billion dollar studio. A couple of years after the above, I joked I could port my engine (the same one as in the Android story) to Haiku if I felt like it - and then I did just that[1]. Meanwhile, that wouldn't even pass as a joke at the companies I worked for.
As a person who wants to play games on a Mac, sometimes I feel like Charlie Brown trying to kick the football. But Apple's custom silicon has me hoping again. I keep seeing more and more stories like this where ported games don't just work "acceptably", but actually work better on M1 and M2 chips.
Apple's hardware is unquestionably very good now, and their graphics APIs are actually seeing some uptake. The recent stories about Resident Evil Village especially sound positive.
Be careful; the only performance comparison they made was between the x86_64 builds and the arm64 builds. Both were run on the same hardware, with the x86_64 build necessarily running in emulation. This only proves that the ported game runs better than the non-ported game running in emulation, not that it runs better on an M1 Mac than on other computers.
The post actually mentions what map they used for benchmarking, and linked to a bunch of other benchmarks on the same map[0].
They quote around 200 UPS average. It's hard to compare to the linked benchmarks because those quote p75 numbers instead of average, but it seems like the results are in the same general ballpark as the Ryzen 9 5950x.
Wow thanks for this.
I noticed the 13700k results are surprisingly lackluster. The highest speed memory kit there is 5600. I just got a 13700k with Hynix A-die (now clocked to 7200). I have an AIO arriving in the mail in a few days to replace my NH-D14. The 13700k has quite a bit of headroom when not thermally constrained.
I think that the 13700k and 13900k with the same turbo ratio should perform almost the same in gaming workloads. The only difference should be the 36 MB of last-level cache vs. 30 MB. It's a modest difference, but Factorio is sensitive to memory subsystem performance.
I'll add a benchmark to that page in a few days with a 5.8 GHz clocked 13700k to test the theory.
Update for posterity: The ALF II 280 actually performed the same as my NH-D14 in thermal stress testing. I ran y-cruncher for an all-core workload that reliably thermal throttled at 100 °C @ 220 W. The clock frequency is dependent on the voltage given, which has been a pain to tune. I don't think this chip can go beyond 5.6 GHz stable without adding so much voltage that it actually performs worse in most workloads. 5.7 GHz can be made borderline for most workloads, 5.8 GHz is unstable, and 5.9 GHz does not boot. I know the adaptive voltage selection mode should be able to address this, but something about the BIOS is incorrect. These results refute my theory that the 13700k could match the 13900k; the 13900k is in fact well-binned.
That doesn't mean the 13700k can't match (or exceed) the out-of-box Factorio performance of the 13900k when given better memory, hence the score of 304 UPS.
Bonus: E-cores have thermal headroom at stock and can be stable at 4.5 GHz if given +0.1 V, but this cuts into the thermal headroom of the P-cores in all-core workloads and lowers the overall performance. Bumping to 4.3 GHz from 4.2 GHz with no voltage increase is stable.
Not sure about the performance of the NH-D14 and 13700K, but my NH-D15 (upgraded from an NH-U12A) is running OK for the 13700K with Intel's PL1/PL2 settings, hovering around 60-70 °C during full load at 25 °C ambient temperature. Unlimited PL2 is a different story and can go up to 100 °C during full load. My previous NH-U12A build ran at about 2-4 °C higher temperatures under load.
I've been experimenting with different settings and found that unlimited PL2 and undervolting the CPU by -150 mV give the best temperature-to-performance, at ~80 °C during full load. It has been running stable for a few days, and I'm pretty happy with the result so far.
For stability testing I like y-cruncher. It teases out edge cases that many XMP profiles are unstable with. I'd take a slightly less efficient system that I am confident will not error.
Also, those numbers aren't far off the out-of-box behavior I had, but I like to tinker. I throttle with PL set to 190 but not at 180.
I'd avoid Intel entirely if heat/power efficiency matter to you at all. AMD has had acceptable performance with far better heat/power use across the board for a few years now.
> with the x86_64 build necessarily running in emulation
Rosetta 2 is not emulation, at all, it's AOT, static binary translation, backed by hardware that implements Intel specific behaviour from the latest chips down to the oldest 8080 or something. It's eerily fast.
In fact, it happens that arm64-translated x86_64 running on Apple Silicon can often be faster than x86_64 running on the latest Macs with Intel processors.
So you really have to ask two questions here:
- does the x86_64 Factorio build run faster on Apple Silicon than on a comparable† Intel?
- on Apple Silicon, does the arm64 Factorio build run faster than the x86_64 Factorio?
>Rosetta 2 is not emulation, at all, it's AOT, static binary translation, backed by hardware that implements Intel specific behaviour from the latest chips down to the oldest 8080 or something.
Imagine you only speak English and you want to read a novel in French.
Emulation: you hire a translator to read the novel to you. They translate each word while reading.
Static translation: you hire a translator to transcribe the book from French to English. They give you a printed book purely in English. But simple French words like flâner and râler are expanded into lengthy passages because there is no simple English translation.
Rosetta 2: you hire the translator to transcribe the book to English, but they leave in unique French words and teach you what they mean so you can understand them in an English phrase without even noticing that the word isn’t “real” English.
Rosetta 2 isn’t emulation because no instruction is translated on the fly to a different ISA. It’s static translation plus ISA extensions. There is no lower level emulating anything.
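(Whichever way you classify it, Apple documents a sysctl for telling whether the current process is running translated; a small self-contained check:)

    // Apple-documented check for Rosetta 2 translation status.
    // sysctl.proc_translated is 1 for a translated x86_64 process,
    // 0 for a native arm64 one, and absent where Rosetta doesn't exist.
    #include <sys/sysctl.h>
    #include <cstdio>

    static int process_is_translated() {
        int ret = 0;
        size_t size = sizeof(ret);
        if (sysctlbyname("sysctl.proc_translated", &ret, &size, nullptr, 0) == -1)
            return -1;  // key missing: no Rosetta on this system
        return ret;
    }

    int main() {
        std::printf("running under Rosetta 2: %d\n", process_is_translated());
        return 0;
    }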
As a slight correction, I believe Rosetta 2 also has a JIT mode, which is a bit more like conventional emulators. But it's used infrequently, eg when dealing with x86_64 apps that themselves use a JIT.
It does have JIT translation (not a JIT "mode" though, as it always uses AOT translation, only relying on JIT translation at runtime for the parts that need it).
> which is a bit more like conventional emulators
Not at all†, Rosetta 2 does the same†† translation step on dynamic Intel code, whose arm64 output can be reused afterwards
> But it's used infrequently, eg when dealing with x86_64 apps that themselves use a JIT
Yes, although it's more like "exceedingly rarely" in practice since usually those interpreters are up to date enough to have a native arm64 release.
Thanks, that is definitely correct. Rosetta 2 can do JIT, and it gets exercised for native JIT / dynamic code.
I could probably extend the metaphor to an avant garde French novel that asks the reader to look up and include today’s headlines from Le Monde, but it was already stretched.
Factorio runs quite well on the M1. The graphics system (FPS) is partially decoupled from the factory simulation side (updates per second, or UPS), so there are two components to performance. UPS mostly depends on how big and complex your entire factory is, and on what mods you're running, and FPS mainly depends on how many sprites are on screen. FPS is limited to be <= UPS, since there's no point in redrawing until the game state changes, but UPS can be greater.
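(Schematically, that decoupling is just a fixed-timestep loop that refuses to redraw until the simulation has advanced - a sketch of the general pattern, not Wube's actual code; Game and Renderer are stand-ins:)

    // Fixed-timestep loop illustrating FPS <= UPS: a frame is drawn only
    // after at least one simulation tick, while multiple ticks per frame
    // let UPS exceed FPS when rendering is the slow side.
    #include <chrono>

    template <typename Game, typename Renderer>
    void run(Game& game, Renderer& renderer) {
        using clock = std::chrono::steady_clock;
        constexpr auto dt = std::chrono::microseconds(16667);  // 60 UPS target
        auto previous = clock::now();
        clock::duration lag{0};

        while (game.running()) {
            const auto now = clock::now();
            lag += now - previous;
            previous = now;

            bool updated = false;
            while (lag >= dt) {       // catch up: UPS may exceed FPS
                game.update();        // advance the simulation one tick
                lag -= dt;
                updated = true;
            }
            if (updated)              // no state change -> no redraw,
                renderer.draw(game);  // hence FPS <= UPS
        }
    }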
FPS: Despite being a 2D sprite game, sometimes it has trouble keeping FPS at 60, at least when running at max graphics and max zoom level with a graphically intensive mod. I would guess it's using OpenGL, and Apple's OpenGL stack isn't great. You can see the article mentioning the M1 Max only hitting 45 FPS in one of the tests, and this is without mods (but with a huge base and presumably a wide zoom level). In my experience, if you adjust the graphics settings appropriately (eg max sprite atlas size and max vram usage, since integrated graphics use unified memory), you can usually keep it at a smooth 60 FPS 99% of the time even in graphically-intensive setups with max or almost-max quality settings.
UPS: Scoring 199 UPS on the flame_sla 10k base puts the M1 Max above any other laptop processor for that benchmark. This matches my experience: the simulation part of the game almost never lags, except for unavoidably heavy operations (eg generating new worlds when playing with mods that do that). See a comparison at:
Yeah, Factorio is multithreaded, but in practice it usually only runs a few ways in parallel. Instead its performance is determined in large part by the memory subsystem, which is why the X3D processors do so well. It's probably also part of the M1's great performance: with a large cache and stacked DRAM, it has very competitive bandwidth and latency.
Some anecdata: I've played Factorio on both an Intel MBP (2018) and an M1 MBP (2021), the performance even under Rosetta blew away the Intel chipset. Being M1 native means even faster performance with a lower power impact.
The current Steam problem on macOS is more about 32 bit games that never received 64 bit builds, which means they are not playable anymore since Catalina independently of the Intel vs Apple Silicon hardware, notably the first-party GoldSrc and Source engine games, and all of their third party derivatives.
I would not really care if my game library was going through Rosetta 2, as I'd rather take a theoretical performance hit (vs a native arm64 build) than outright be unable to play.
At one point even Intel Macs broke compatibility with every game that was 3 years old, or something stupid like that. I remember that my working game simply refused to execute after a macOS version upgrade. Exact same machine, refusing to run my software overnight.
This kind of attitude just isn't conducive to gaming, where people like to build libraries in Steam and expect everything to keep working for a long time.
On my PC, I can fire up games from 20 years ago and they work perfectly. Witcher 3, a 7 year old game, is getting an overhaul. I expect no problems in downloading it on steam from my library and playing it seamlessly on my relatively new PC.
Yep - they killed 32 bit compatibility and it’s annoying.
IIRC win64 finally killed win16 support but that was rarely used for games and those games you can dosbox (which amusingly enough works fine on Mac in many cases).
Meanwhile I’ve been happy playing games in Parallels (virtual Windows arm64) like Against the Storm. Crusader Kings 3 has better perf in Parallels than the macOS build.
Does protonDB work for Mac titles, or is it only for Windows? I've been able to play basically every game in my steam library on linux using protonDB. I have hundreds of games.
An alternative would be using CrossOver (which pulls from Wine and adds stuff like MoltenVK), which is what Proton does as well (pulling from Wine and adding stuff, though not MoltenVK) and vendors internally; Valve "just"† doesn't pull from the CrossOver changes nor expose Proton on macOS.
† scare quotes because it may not be as easy as it seems
Proton helps you run Windows games on Linux. A few years ago, the problem with Mac games on Steam was running 32-bit Mac games on 64-bit-only macOS builds. Now, it's running x86 Mac games on arm64 Macs. Often, we're talking about those same 32-bit games that never got updated to 64-bit x86, let alone ARM.
IIRC from the benchmarks, the M1 has some truly great single-core performance. Then again, Apple stuff is so fucking expensive you can easily get faster for cheaper...
I get the impression that the lack of releases on the Mac is generally down to non-technical reasons.
Building for ARM is probably not a big challenge if you’re already building for Apple’s x86 toolset.
Metal would be more of a challenge I imagine, but a bridge probably worth crossing all else being equal (or one that you don’t need to cross at all if you’re using something like unreal or unity).
The Mac isn’t a games platform, as Apple hasn’t shown much interest in the mainstream gaming market, and I can’t imagine major publishers are eager to fork over a third of their revenue on the App Store for sales they’ll probably pick up elsewhere without more work and cost. Sure, there’s Epic and Steam on Mac, but they’re ghost towns, and publishers are likely waiting to see how the EU Digital Markets Act shakes out globally anyway (as other governments are pressured to provide the same freedoms).
There was talk at one point of Apple working on a game console (a more powerful Apple TV) but who’s the market for that?
They’ll not be cost-competitive with Xbox or content-competitive with PlayStation and Nintendo.
At best they’d be likely to produce a similarly powered box with little content and a high price tag in a market already retailing hardware below cost price.
The most important "goalpost" is: Every major cross-platform AAA game gets released simultaneously on Windows, Xbox and PlayStation. When macOS gets added to the list in a consistant manner, people will consider it a "major" gaming platform.
Having access to a lot of mobile games doesn't matter for hardcore gamers.
Gaming on a Mac is like gaming on a PC, just worse. There are fewer games; every game on Steam that supports Mac also supports Windows. Performance is usually worse too. For gaming it's just a worse PC. And it's usually more expensive. And there hasn't been an exclusive worth playing since like 1997. With consoles there is at least stuff you can only play there, plus an extra ease-of-use layer, plus they are cheaper. The Mac is just all downside for gaming.
The goalposts haven't moved. Where are the games? Macs will game when macs can game. The gaming industry has chosen the APIs they are willing to support. When Apple meets them there then macs will be able to game.
> The gaming industry has chosen the APIs they are willing to support.
The gaming industry will go where the money is.
The majority of industry revenue is now on mobile, and the lion's share of mobile revenue is on iOS and Metal.
I think Apple Silicon Macs will end up benefiting from studios having experience with Metal on iOS, in the same way that Xbox benefited from studios having experience with Direct3D on Windows.
Disagree on the last point, I'm a university student who spent 1k on a MacBook Air, and I don't have a console or gaming PC because I can't afford it.
I study Computer Engineering, so having a good laptop was important. Prior to this I owned a refurbished ThinkPad running Linux, which cost me around $500 but had multiple performance issues, to the point where buying an M1-class machine was almost a necessity for me.
From what I understand, Apple decided to force Metal upon everyone instead of using Vulkan like the rest of the industry. This causes friction with game development.
If you want to claim that Direct3D is king, I'll agree with you. I'm merely pointing out that all of the DirectX libraries are fairly irrelevant when they run just fine without DirectX.
Ooof, I've seen some bad takes but this one is a really bad one.
DXVK is currently only used as a drop-in replacement for DX9 games. DX10 has been forgotten (thank god), DXVK's D3D11 implementation is not good, and more and more games are moving to D3D12, which affords a hell of a lot of control.
Additionally, DirectX is not just a graphics API: it's also sound (XACT and XAudio2), ray tracing (DXR), input handling (XInput & DirectInput), CUDA-like computation (DirectCompute), storage handling (DirectStorage), etc. Most of these have either an alright equivalent (DXR has one in Vulkan with extensions, and that's about it; Valve's input implementation is _really good_, but it's not a usable API as far as I know) or a wildly inferior alternative (at least for the PC space that is Windows/Linux/macOS). D3D12 is also one of the drivers of new GPU programming features in the PC space (once again, the console side of things is a bit weirder, although MS does bring some lessons in from the Xbox side), while Vulkan is kind of stuck doing everything as extensions that may or may not be available, and Metal is still a piece of shit.
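To make the "more than graphics" point concrete, here's a minimal sketch (my own illustration, not from the article) of polling a gamepad through XInput, one of the non-graphics pieces listed above:

    #include <windows.h>
    #include <xinput.h>
    #pragma comment(lib, "xinput.lib")

    // Poll controller 0 via XInput, the DirectX-family input API.
    bool isAPressed() {
        XINPUT_STATE state = {};
        if (XInputGetState(0, &state) != ERROR_SUCCESS)
            return false; // no controller connected at index 0
        return (state.Gamepad.wButtons & XINPUT_GAMEPAD_A) != 0;
    }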
Remember, to start, Windows only officially supports DirectX. OpenGL and Vulkan come from your GPU vendor, and Microsoft waives all responsibility for them. Vulkan is, quite literally, a 3rd-party API that can run on Windows - not something Windows supports or endorses.
Xbox does not support Vulkan. DirectX or get rejected.
Only 60% of Android devices support Vulkan. Guess you’ll also need ANGLE or OpenGL for backwards compatibility.
PlayStation does not support Vulkan. Better learn gnm, gnmx, and PSSL.
Nintendo Switch has Vulkan but it is almost unusably slow, on a console that is already not known for speed. Better use NVN if you want anything decent.
iOS does not support Vulkan. Better use Metal.
So… what does Vulkan support, exactly? Windows, Linux, and not enough of Android. If your game only runs on desktop, it’s a good option - but why not target DirectX? Windows, Linux with Proton, and most of the way to Xbox support, all in one. For this reason, I have yet to see a Vulkan game that does not have a DirectX mode.
Blaming macOS for being proprietary is disingenuous in an industry full of proprietary APIs.
You're doing a bit of sleight of hand on Windows support.
Windows doesn't "support" the DirectX version shipped by your GPU vendor either. The drivers shipped by your GPU vendor, and all the APIs provided by them, are supported by your GPU vendor.
So the real thing we're talking about is hardware vendors. Nvidia and AMD support Vulkan, OpenGL, and DirectX where applicable. Apple only supports Metal. The console vendors have always had weird variant APIs based on the open standards but not identical to them, except MS, where the console is very close to desktop DirectX.
On mobile hardware vendors ubiquitously support OpenGL ES and there's widespread support for Vulkan.
So it's complicated. In the desktop space, as a percentage of market share, Vulkan is extremely widely supported. Same with mobile. Consoles have always been the odd man out.
So Apple, which doesn't sell a console, is absolutely breaking from the pack in the markets they target.
If you consider 60% of Android users, and 0% of iOS users, "widely supported," sure. That's less than half of mobile phones in use right now, making Vulkan the odd-one-out on mobile as well. You certainly can't build a mobile app right now that only uses Vulkan without cutting out huge parts of your audience.
> So Apple, which doesn't sell a console, is absolutely breaking from the pack in the markets they target.
Apple wants the same API on all of their devices, and I can't blame them. They are the odd-ones-out in Desktop only.
But does that really matter? It matters if you are making a game only for desktop, namely Windows, and weren't going to just use DirectX for some reason (which I think, nowadays, is a rare situation). But if you are targeting any game consoles, or any mobile phones, you're adding multiple graphics APIs anyway, and Metal is just another one.
> If you consider 60% of Android users, and 0% of iOS users, "widely supported," sure.
But that's what we're talking about: if it weren't for Apple, Vulkan would be a near-ubiquitous desktop and mobile API. Apple is the one standing in the way of that.
If Vulkan were a near ubiquitous mobile and desktop API, _maybe_ the console vendors would be more willing to tolerate it.
Microsoft can't be expected to just give up the gatekeeping that is DirectX on their own (well, maaaybe if antitrust were more effective?).
Sony only cares about their console(s), so they prefer to focus on optimization rather than compatibility. (While Nintendo cares more about gameplay than graphics.)
Isn't that 40% of Android devices old/very cheap?
Apple is one of the biggest (and especially, most profitable) companies in the world, and with great power comes...
So "Microsoft can't be expected", "Sony doesn't care" but we should only blame Apple. Gotcha.
Because only Apple has the power and hence the responsibility, not the tiny helpless companies Microsoft and Sony (~99% of consoles, 76% of desktop).
Edit: don't forget, Apple absolutely must support Vulkan (and the others don't have to) because Metal is proprietary and non-cross-platform (just like every other graphics API on all major platforms), even though Vulkan appeared two years later than Metal (but neither Microsoft nor Sony is expected to drop their APIs, which also appeared earlier than Vulkan).
If you're _really_ aiming for cross-platform support, an RHI (render hardware interface) is only the beginning of your problems, though. Replacing the rendering code may not be _trivial_, but if you're stuck at that point you're likely going to struggle with networking, filesystems, permissions, user accounts, and store requirements. Modern rendering APIs are _similar enough_ in feature sets and abstraction layers that it's not an insurmountable task.
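For anyone unfamiliar with the acronym, an RHI in the abstract looks something like the hypothetical interface below (names invented for illustration); each platform then ships a backend behind it:

    #include <cstddef>

    // Hypothetical minimal RHI: the game talks only to this interface,
    // and each platform provides a backend (D3D12, Vulkan, Metal, ...).
    struct Buffer;   // opaque handles, defined per backend
    struct Texture;

    class RHI {
    public:
        virtual ~RHI() = default;
        virtual Buffer*  createBuffer(std::size_t bytes) = 0;
        virtual Texture* createTexture(int width, int height) = 0;
        virtual void     draw(Buffer* vertices, int vertexCount) = 0;
    };

    // class MetalRHI  : public RHI { /* wraps MTLDevice... */ };
    // class VulkanRHI : public RHI { /* wraps VkDevice... */ };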
Total addressable market matters here. 100% of Android-based VR headsets support Vulkan. Granted, that's mostly the Quest 2, but it's not the only HMD in town anymore.
Also, a whole lot of Android devices are garbage-tier, sub-$100 hardware that you wouldn't want to target anyway because you won't get any sales on them. So the percentage of Android devices supporting Vulkan may be misleading, in the sense that you might be aiming for a segment of devices with much, much higher, if not complete, Vulkan support.
This seems disingenuous. The previous claim was that Apple ignored an existing emerging standard, so it is relevant. Your new claim is that Metal is a bad API, which I haven't heard anyone else say. What makes it bad?
My larger point is that Apple can obviously support both APIs. It doesn't matter if Metal is good, bad or even awful, Vulkan is what people are using and Vulkan can translate from DirectX. Apple is shutting themselves off from the rest of the industry with this move, which I would argue (judging by how many Mac users wish they could game) is a bad thing.
> It doesn't matter if Metal is good, bad or even awful
I'm confused. Your previous statement was saying the main point is that Apple have kept a bad API.
> Vulkan is what people are using and Vulkan can translate from DirectX. Apple is shutting themselves off from the rest of the industry with this move, which I would argue (judging by how many Mac users wish they could game) is a bad thing.
Possibly, FSVO translate, but I don't think anyone was commenting on this new point. More with the previous points.
It could cook my breakfast for me, and it would still be useless if it exclusively targets Apple products. People use Vulkan because it targets multiple platforms. There's nothing stopping Apple from providing both APIs, and if Apple's API is truly nicer to use than Vulkan, then Apple has nothing to worry about.
Vulkan is counter to Apple's vertical integration approach. It would be a foundational API, sitting between their graphics hardware and their OS, yet one they don't control.
Better for someone else to build a Vulkan API on top of Metal, which is what has happened. It's not perfect, but it's the only thing that can work. The pressure on Apple should be for Metal to better support Vulkan by providing APIs it needs to work optimally.
Beyond that, Apple might want to contribute to the Vulkan-on-Metal implementation... though they're only going to do that if it makes strategic sense, which I don't see. For cross-platform, what makes more sense is to encourage games to use a higher-level engine that supports Metal among its platforms, like Unity and Unreal.
Well, thus far it has failed. We have Factorio, Tomb Raider, and one of the Resident Evil games running on macOS - most of which had to implement Metal by hand. If you're right, Apple's strategy needs to change.
The thing is, you need a whole game to work on macOS, not just the graphics. Vulkan is just a cross-platform graphics API, not a cross-platform framework.
Typically, if a game maker wants to make cross-platform a priority, they wouldn't target just Vulkan; they'd target a cross-platform framework. That would be true whether Vulkan was supported by Apple or not. And if they don't make cross-platform a priority, the chances of a Mac port go down regardless.
So...
We're looking at the incremental gain of Apple providing first-party support for Vulkan vs. the existing third-party support. Looks like a lot of work for Apple for little gain; it just doesn't seem worth it to me. Also, the Vulkan version would always be out of date, since Apple would pin the supported version to an OS release and would need to be conservative about it - they aren't going to hold an OS release for Vulkan.
Really, Vulkan on macOS is much better done by the interested third parties, and the focus on Apple should be to get them to better support a Vulkan API on top of Metal.
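For what it's worth, the third-party route already has a defined opt-in: since Vulkan loader/SDK 1.3.216, an application must explicitly ask to see portability (non-fully-conformant) implementations like MoltenVK. A rough sketch, with error handling omitted:

    #include <vulkan/vulkan.h>

    // Create a Vulkan instance that is allowed to enumerate portability
    // implementations such as MoltenVK on macOS.
    VkInstance createPortableInstance() {
        const char* exts[] = { VK_KHR_PORTABILITY_ENUMERATION_EXTENSION_NAME };

        VkInstanceCreateInfo info = {};
        info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        info.flags = VK_INSTANCE_CREATE_ENUMERATE_PORTABILITY_BIT_KHR;
        info.enabledExtensionCount = 1;
        info.ppEnabledExtensionNames = exts;

        VkInstance instance = VK_NULL_HANDLE;
        vkCreateInstance(&info, nullptr, &instance); // check VkResult in real code
        return instance;
    }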
And a ton of other games because things like Unity make that easy. If you aren’t focused on AAA, you should have no problems finding more games than most of us have time for.
That's part of the problem. macOS actually had the same Proton compatibility layer as Linux a few years ago, but with Catalina Apple changed what you're allowed to run on your Mac, and it broke completely. Neither Valve nor CodeWeavers has gotten DXVK or Proton to run on macOS since, IIRC.
That doesn't sound right. CodeWeavers has been shipping fairly recent versions of DXVK alongside MoltenVK since CrossOver 20. On the community side there's Gcenx/DXVK-macOS[1], which patches DXVK to work around some Apple GPU quirks in certain games and closely tracks upstream.
Apple deprecated 32-bit support a year before Apple Silicon came out, which allowed them to target 64-bit processor speed without legacy baggage. Apple's chips had dropped 32-bit support on iOS years before, and they didn't want to bring it back for macOS.
Android is following with 32-bit deprecation this year, actually. The Pixel 7 doesn't support 32-bit apps.
Except for brief blips here and there, Apple seems not to put any priority on gaming. It's a bit disappointing, because they could have a really strong showing if they cared.
FWIW, the addition of MetalFX upscaling (comparable to DLSS/FSR 2) in Ventura has been a nice surprise and one of the biggest developments in Mac gaming in recent memory.
Digital Foundry did a review of MetalFX in Resident Evil Village[1] and was pretty positive about it. (From DF's findings, MetalFX has some problems with transparent textures, but detail preservation/restoration is pretty good.)
It's not so much technical as business focus. There's a big issue with their App Store cut on in-app purchases, and whether they want to set a precedent by lowering/changing the rules to accommodate games like Fortnite and third-party stores (like Steam and whatever Epic's is called). If they found a solution to that, they'd then need high-quality and timely ports, and that's a whole organization of marketing, bizdev, devrel, etc. people.
At a technical level it seems like they could get more console-like levels of tuning for their platforms. Very few chips to support, only a handful of thermal targets, all chips have a common CPU/GPU architecture, all devices have very fast storage. Conceivably they could field a winning platform for competitive gaming.
My guess is they look at Sony & Microsoft and don't see much value in reshuffling priorities to likely just be #3.
Edit: hajile above convinces me that, rather than being concerned about spending money just to be #3, they're probably already in the top few by gaming revenue and could have lots of reasons for not being more aggressive about taking more share.
With Apple being the largest game company by revenue, they do take it seriously, but for every dollar made from an AAA game, there's probably 20-50 to be made on less "hardcore" games. Thus far, Apple is serious about money rather than capability.
That seems to be changing though as there are pretty strong supply chain rumors that they are working on AR/VR headsets (allegedly delayed due to terrible market conditions and some supply chain issues). They've also made pretty big investments into their subscription game service.
I think Apple TV is very overlooked by developers too.
There's a lot of processing power in those things. The weak 2021 model has 15% more GPU power than a docked Switch (30% more than an undocked Switch). The older 2017 model uses the A10X processor, which should give it even more GPU power (almost double an undocked Switch). The latest A15 model (with a major price drop vs. the previous generation) has more GPU power than the Xbox One S and isn't so far off from the PS4.
  Nintendo Switch    500 GFLOPS (docked) / 390 GFLOPS (portable)
  Apple TV (A10X)    770 GFLOPS (2017)
  Apple TV (A12)     580 GFLOPS (2021)
  Apple TV (A15)    1500 GFLOPS (2022)
  PlayStation 4     1850 GFLOPS
  Xbox One S        1400 GFLOPS
Apple TV shipments since 2017 seem to be in the 50-80M units range. Compared to 25M PS5, 17M Series X/S, 111M Switch, and 117M PS4, that's a pretty significant number.
I looked for a quantitative comparison. If the numbers in [1] are accurate Apple App store gaming revenue was around $11B in 2021. Meanwhile Microsoft was around $15B [2]. That's way closer than I thought, big enough that they could reasonably be concerned about avoiding any appearance of strength that could trigger anti-trust action. Interesting.
Please don't use FLOPS as a comparison. They're meaningless numbers, especially across platforms, and especially across vendors for the devices pushing out said FLOPS. Same with texels per second or gigapixels per second or any comparison like that between 3D graphics accelerators. If those numbers meant anything, my Intel A750 would be one of the fastest graphics cards in my house right now.
> I think Apple TV is very overlooked by developers too.
Because no one knows if the platform will be there or not. Apple's commitment to it has been lackluster. And since there are no dedicated controllers, you'll need controllers from a system... that you probably already own, so why play on Apple TV?
Apple is in an active lawsuit with one of the biggest gaming companies in the world. Fortnite hasn't been available on Apple devices in 2 years. If they really cared, they wouldn't have shut one of the most popular games of today out of their devices.
This comment doesn't make sense to me. Epic broke one of the major terms of its contract with Apple. Apple faced losing a significant revenue stream if Epic's efforts were allowed to continue. In particular, since they intend to use the same model for future devices in other verticals, they would have also lost out on a simply colossal amount of future revenue.
The business case on this was a slam dunk. Other companies running virtual product stores with similar terms would have done the same irrespective of how popular Epic's products were.
False. iOS doesn’t support Vulkan, 40% of Android doesn’t support Vulkan, Windows does not officially support Vulkan, Nintendo Switch has a Vulkan implementation that is basically unusable, PlayStation doesn’t support Vulkan, Xbox doesn’t support Vulkan, shall I go on?
Contrary to widespread opinion, Vulkan is not an industry standard. It’s a 3rd-party DirectX alternative for Windows, the best API for Linux, and a curiosity on Android. And that’s literally it; nothing else supports it (except Switch, but it is so slow, almost no games use it, opting for the proprietary NVN).
Saying "Windows does not officially support Vulkan" is a completely blatant cope by Mac lovers. Nvidia and AMD, the people that make the cards, support it, and Windows "supports" their cards, so let's stop being dishonest.
"Vulkan is not an industry standard" I mean yeah, in the same way Microsoft word is not an standard.
"except Switch, but it is so slow" again that seems to be a lie, doom eternal orks way, way better on it than it would be on similar software on a different API.
> Saying "Windows does not officially support Vulkan" is a completely blatant cope by Mac lovers. Nvidia and AMD, the people that make the cards, support it, and Windows "supports" their cards, so let's stop being dishonest.
It is completely honest. On a fresh install of Windows, if you don't have graphics drivers, you can't run Vulkan or OpenGL. Windows washes their hands of any responsibility. You can at least run DirectX with software rendering regardless of hardware support. It is also for this reason that the locked-down Xbox, where Microsoft can assert more control, has zero tolerance for OpenGL or Vulkan.
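Concretely, D3D11 can be pointed at WARP, the software rasterizer that ships with Windows itself; a minimal sketch that succeeds with no vendor driver installed:

    #include <d3d11.h>

    // Create a D3D11 device on WARP, Windows' built-in software rasterizer.
    bool createWarpDevice(ID3D11Device** device, ID3D11DeviceContext** context) {
        HRESULT hr = D3D11CreateDevice(
            nullptr,               // default adapter
            D3D_DRIVER_TYPE_WARP,  // software rasterizer, no vendor driver needed
            nullptr, 0,            // no software module, no flags
            nullptr, 0,            // default feature levels
            D3D11_SDK_VERSION,
            device, nullptr, context);
        return SUCCEEDED(hr);
    }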
> "Vulkan is not an industry standard" - I mean, yeah, in the same way Microsoft Word is not a standard.
Microsoft Word, and the DOCX format by extension, has >90% market share. Vulkan has almost no presence on consoles, presence on less than half of smartphones in use, and mixed presence on desktop because macOS doesn't have it. Word is more of a standard than Vulkan.
> "except Switch, but it is so slow" - again, that seems to be a lie; DOOM Eternal works way, way better on it than similar software would on a different API.
DOOM Eternal is one of the few games that uses Vulkan. >90% of Switch games do not use Vulkan; their developers found it preferable to use the proprietary API. That developers would overwhelmingly opt not to use Vulkan on Switch tells you all you need to know about the state of it. If adding another graphics API (such as Metal) were such a big deal, why in the world would they do that if Vulkan were cross-platform and worked fine? It doesn't work as well as it needs to - and adding another graphics API isn't as much of a blocker as we like to think.
>It is completely honest. On a fresh install of Windows, if you don't have graphics drivers, you can't run Vulkan or OpenGL. Windows washes their hands of any responsibility. You can at least run DirectX with software rendering regardless of hardware support.
DirectX with software rendering doesn't actually result in games being playable, unless they are 2D games that barely touch the GPU to begin with. So the software-rendering fallback is completely irrelevant here, and what matters is which APIs work when you do have the GPU drivers correctly installed. And at that point, it doesn't matter what degree of support Microsoft provides for Vulkan, only the degree to which the GPU vendor provides that support. (The software-rendering fallback actually makes it less straightforward to diagnose why a game isn't running as expected when GPU drivers aren't installed. Plus, what game developer cares about the software-rendering fallback enough to even test their game against it?)
So no, it's not completely honest. It's a disingenuous red herring.
For me, one of the best things about PCs for gaming is access to lots of old games and long-tail stuff... in fact, I rarely play new graphically demanding titles. That said, OpenEmu on the Mac is the slickest emulation manager/front-end I've used on any platform, and DOSBox-X runs fine on my iMac. But I still keep a gaming PC around.
I've played about 60 hours of Factorio on the Nintendo Switch version, which came out 4 weeks ago. I'd never played it before.
This comment has very little to do with Apple Silicon, except to say that I imagine it's a faster platform than the Switch, and that Factorio is addictive fun. Here's hoping the promise of the post turns into reality.
Factorio has been Mac-compatible for over 7 years (citation: my embarrassingly large play time); this blog post is just about performance improvements (which several of my factories will appreciate).
Rosetta must be really efficient! That's my takeaway from this post. The native port runs only ~25% faster than the emulated x86 version. That's amazing! And Factorio is more CPU-intensive than most games, so this isn't just a case of the GPU being the bottleneck.
I don't mean to diminish the work of porting here; clearly a native version is better. I'm also impressed that the port didn't seem that difficult - the only technically hard thing he mentions is getting a copy of the arm64 libraries to link against. But mostly I'm impressed that the port wasn't strictly necessary, because Apple did such a good job on x86 emulation.
I had thought the same thing at first too, but they were comparing the x86 build running on an M1 against the native build running on an M1. See the machine list in the first screenshot.
I see, for the Mac Studio 2022. But the other two (the Macbook and Mini) are intel native.
What's confusing here is that he clearly says "After we had a functioning universal binary, we ran benchmarks and conducted manual tests to see what improved. We ran benchmarks on 3 different Apple Silicon machines, using the same compiled release binary and run settings."
So it's a universal binary. Surely it'd run natively on the Mac Studio. Maybe this was a typo?
> But the other two (the Macbook and Mini) are intel native.
The 2020 Mac Mini is not intel [0]. The 2021 Macbook Pro is also not intel [1].
> So it's a universal binary. Surely it'd run natively on the Mac Studio. Maybe this was a typo?
Yes, the new universal binary can run natively. That's what they are announcing: the new native support. They are comparing the native performance against using Rosetta [2] to run the x86 binary (which, in prior versions, was the only option). By using the same binary, they rule out that the difference comes from any other performance improvements between versions.
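Incidentally, if you want to verify which way a given process is actually running, Apple documents a sysctl key for exactly this; a small sketch:

    #include <sys/sysctl.h>

    // Apple's documented "sysctl.proc_translated" key:
    // returns 1 if this process runs under Rosetta 2 translation,
    // 0 if native, -1 if the key doesn't exist (e.g. Intel Macs).
    int runningUnderRosetta() {
        int translated = 0;
        size_t size = sizeof(translated);
        if (sysctlbyname("sysctl.proc_translated", &translated, &size,
                         nullptr, 0) == -1)
            return -1;
        return translated;
    }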
100%, I find it seems to activate the exact same regions of my brain that programming does. It's especially good at capturing the flow state that makes programming fun (which is the reason many of us got into programming in the first place). But on the other hand, I find I can't play it much these days, because it ends up feeling too much like work.
Maybe someday I'll retire and be able to spend 100s of hours playing Factorio, without it feeling like I'm just putting in extra hours at work.
I recall another comment on here which explained the duality, along the lines of:
- "Factorio feels just like programming ... but without the constraints of having to be reliable, or having to do what other people tell me to" => fun.
- "Factorio feels just like programming .... but I don't get anything out of doing it" => not fun.
I particularly enjoyed playing towards the constrained-achievements, such as: "launch the rocket with (near) minimal hand-crafting", or "launch the rocket on non-peaceful, without using solar power".
A bit off topic but I tried factorio once and the learning curve seemed pretty steep. I have been playing Mindustry which as far as I understand is simpler but a similar idea, is open source, and works great on my M1. Only complaint is it's pretty addictive.
> But I wasn't satisfied. As a software engineer and Factorio engineer, I had to ask myself: can I play Factorio for even longer with native Apple Silicon support?
Hmm... He could have tried to optimize his Factorio build to consume less power on the laptop running it... Or, as he did, to optimize the whole game for every ARM Macbook in the world.
Thankfully he chose the latter. Or, wait... I have a Macbook... Damn... Goodbye productivity :)
I love Factorio, it's a great game, and it's nice that they've ported it to run on Apple's new proprietary hardware. I'm just surprised that people are excited about software becoming available in such limited quantities. I suppose you take what you can get.
I love to see Factorio native on my Mac Studio, but the performance actually feels a little worse than it did in emulation. I’m getting little slowdowns here and there I never saw running the x86 build in Rosetta on the same machine.
I wonder what goes into porting an AMD64 C project to arm64. Do you just make clean && make on your new machine? Is C portable enough for that, provided there's no inline assembly involved?
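Mostly, yes - portable C recompiles fine, and the article suggests the hard part was tooling rather than code. The nastier hazards are the ones the compiler can't flag, such as code that silently relied on x86's stronger memory ordering. A contrived sketch (my own illustration) of that class of bug:

    #include <atomic>

    int payload = 0;
    std::atomic<bool> ready{false};

    void producer() {
        payload = 42;
        // With memory_order_relaxed this often still "works" on x86 (TSO),
        // but on arm64 the consumer could see ready==true before payload==42.
        ready.store(true, std::memory_order_release);
    }

    void consumer() {
        while (!ready.load(std::memory_order_acquire)) { /* spin */ }
        // The release/acquire pair guarantees payload reads as 42 here.
    }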
I feel like games like Factorio don't really need such low-level optimizations as to warrant C++. It would be possible to implement it in Go/Java/C#, which would make it easily portable to whatever platform.
Factorio, I think, is particularly poorly suited to memory-managed languages. My factory is enormous: hundreds of thousands of machines all doing their thing in real time, thousands of robots flying around delivering goods. I think that really does demand heavy optimization.
Don't see any problem there; how does it relate to portability?
Anyway, games are going to pre-allocate large arrays and work on them, for cache locality. That doesn't depend on C++ vs. Java. There's no need to create a separate object for each item in your factory; each item can just be a row in the array(s), as sketched below.
And even if you do create a new object for each item, it's better to do that on a heavily optimized GC heap than to manually allocate/deallocate memory in the same thread. You end up writing that heap yourself in C++ anyway.
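In the abstract, the "row in an array" style looks something like this hypothetical sketch (not Factorio's actual code, which is closed source):

    #include <cstdint>
    #include <vector>

    // One contiguous buffer of plain structs, updated in a tight loop,
    // instead of one heap-allocated object per item.
    struct BeltItem {
        std::uint16_t itemId;
        std::uint16_t position; // fixed-point offset along the belt
    };

    struct Belt {
        std::vector<BeltItem> items; // preallocated, reused across ticks

        void update(std::uint16_t speed) {
            for (BeltItem& it : items) // linear scan: cache-friendly
                it.position += speed;
        }
    };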
Factorio does really weird things with memory (the deterministic simulation effectively reads all memory every tick), which makes it not perform like many applications are expected to.
Why? It seems like the opposite is true - see the other comment under this post about buying a 13900k just to play Factorio.
It seems to me like Factorio is exactly the kind of game that would benefit from a low level language, as it's relatively simple graphically and all the complexity is in the simulation, making it CPU-bound.
We don't know exactly what techniques are under the hood, because it is closed source (for now... I really hope they open-source it one day, because I want to marvel at it), but from reading all their FFF blog posts, and just knowing how smoothly this game runs in spite of the mind-boggling number of discrete operations a megabase executes, I'm very confident they are doing a ton of optimizations to reduce allocation, reuse data structures, etc. I doubt they would be able to achieve that performance with a memory-managed runtime.
Besides, their game is already one of the most cross-platform-capable games I've ever seen. There aren't a lot of feasible platforms that Go/Java would open up.
Factorio is a great example of a game that benefits from having a custom engine. For comparison, Cities: Skylines (Unity) becomes CPU-bottlenecked with a large enough population, making large cities practically unplayable even on the most powerful consumer CPUs.
Also, the choice of programming language isn’t what makes porting games difficult. It’s the platform-specific graphics, input, window management, sound, etc. None of which is handled by switching to one of the languages you mentioned.
No, Factorio is a perfect example of a game that benefits from as much optimization as possible. It's an unlimited-scope game where expanding forever is the natural course of action, so any improvement in performance directly translates into an improved gameplay experience, because it means you can expand for longer before things start to degrade.
In this kind of game, no matter how well it runs, there will be someone who has built enough that they're wishing it would run even better.