Apple Silicon Games - 400+ Game Performance Reports for Apple Silicon Macs (applesilicongames.com)
118 points by epaga on Dec 8, 2020 | 119 comments

The games table seems to be very misleading. For example, for Alien Isolation you apparently get 7-12 FPS at 1440x900 resolution but somehow that's marked as playable.

In fact, there are tons of entries for first/third person 3D games where <30 FPS is apparently playable.

Looking at the games which have been marked as "playable: no", it seems that their definition of "playable" means that it runs without crashing/glitching. I think that's a perfectly sensible definition. A game can run at 60 FPS but be unplayable because it crashes when you switch scenes. And a game can run at 5 FPS but be "playable".

They should have another column called "recommended" though, since filtering by "playable" isn't very useful…

Yes, but placing Solitaire and Alien Isolation into the same "playable" category, where "playable" means it can be launched and interacted with at 7-12 FPS, is absolutely disingenuous. It wholly matters what type of game we're referring to.

Yes, Solitaire at low FPS is playable. I mean, it doesn't matter. At all. Playable for Solitaire is almost Boolean.

A suspenseful, atmospheric horror shooter? How it plays is absolutely important, so why has that context been disregarded?

I think that column should be "runs" rather than playable. Yes, a chess game could well be OK at 5fps, but no-one's going to find any game with action playable at 5fps. Of course playability is a more subjective scale, but having read some of the table, this looks much more disappointing than I was expecting - for instance, counting being able to play Among Us as a win seems to be a very low bar!

It actually used to read "runs", I should have kept that wording for now.

Initially I wanted to derive the value from the FPS column and from the comments, but like you already wrote this does not work well, as it is highly dependent on the genre or even the specific game.

Thanks a lot for your thoughts on this. It is one of the most important aspects to improve, apart from getting more reports. Appreciate it!

Nope, it has to run mostly without crashes and glitches and needs a certain FPS to be called playable.

> need a certain FPS to be called playable

What FPS exactly? Isn’t that very subjective? Some people might be happy with 15fps, others would want 60fps. I like the approach in this dataset because the reader can decide for themselves.

You are absolutely right! And gamers setting ever more enthusiastic targets (e.g. 120fps) is a problem. But the bare minimum should be movie-like FPS, so 24fps or 30fps. This will not be perfect, but the animation will look a little smoother than at 15fps.

Here's an interesting link on this subject: https://www.filmindependent.org/blog/hacking-film-24-frames-...

Ideally it should depend on the type of game (turn based games can be fine at 15 fps, a fighting game could be legitimately difficult to play at anything less than 60), but if you need a common baseline, 30 fps seems reasonable. 10 fps probably isn't.
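That per-genre baseline could be sketched as a simple lookup. The genre names and thresholds below are my own illustrative assumptions, not anything the site actually uses:

```python
# Hypothetical per-genre minimum-FPS baselines for a "playable" verdict.
# Genre names and thresholds are illustrative assumptions, not the site's rules.
GENRE_MIN_FPS = {
    "turn-based": 15,
    "strategy": 24,
    "third-person": 30,
    "first-person": 30,
    "fighting": 60,
}
DEFAULT_MIN_FPS = 30  # common baseline when the genre is unknown

def is_playable(genre: str, avg_fps: float) -> bool:
    """True if avg_fps meets the minimum baseline for the genre."""
    return avg_fps >= GENRE_MIN_FPS.get(genre, DEFAULT_MIN_FPS)

print(is_playable("turn-based", 15))  # a turn-based game is fine at 15 fps
print(is_playable("fighting", 45))    # a fighting game at 45 fps is not
```

With something like this, 10 fps would never pass for an action game, while a chess game still gets its green light.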

it is subjective but nobody is happy with a first person 3d game at 15fps

In gaming circles, playable has always meant "at an acceptable FPS". What you're talking about (the crashes/glitches) is whether it fully works or not (or somewhere in between).

The definition of "an acceptable FPS" has changed over time though. I certainly remember playing a lot of 3D games in the 90s that ran at 20FPS or less and were considered playable.

I remember playing Tomb Raider on the Saturn which ran at 15-20 FPS depending on where you are in the game.

It was absolutely breathtaking.


Thanks a lot for the comment and the suggestion. As a quick fix I will add context for the current definition of "playable" and also add this to the form used to submit reports.

As a next step I will look into how to best model and communicate something like a "recommended" context and most likely make that the default view (as I think this is what most people will be interested in knowing).

Thank you!

There are three entries for Alien Isolation: one gives 30 FPS (on a Mac mini), one gives 7-12 FPS (on a MacBook Air) with a note of "Very slow and blurry, only 7-12 FPS, Effectively unplayable". The third says nothing useful.

The game is rated as "playable". So you need to look at the notes and hardware carefully if this is important to you.

That's the point. Under absolutely no circumstance would anyone (I mean this literally) describe a modern-day shooter at 7-12 FPS as playable.

Does playable mean, it starts and loads? Or does it mean playable in the sense that, you feel satisfied and enjoy the experience?

It's absolutely the latter when talking in the context of video games. And not the former.

Edit: it even says so in the right hand side, Notes column.

"Very slow and blurry, only 7-12 FPS, Effectively unplayable"

That isn't "effectively", it absolutely _is_ unplayable.

Playable just means it works at an acceptable performance on some hardware, which this game does. Yes, there is other hardware that it won't run acceptably on, but this is true of all games.

Going forward I think I will be able to provide personalized/contextualized views. Imagine:

this is the list of games I own/care about: (5 games)
this is the list of Macs I own/care about: (2 Macs)

this would allow a personalized view where you can compare performance for the games you care about across the Macs you own, plus the Macs or games you intend to purchase

edit: This is part of the initial motivation for creating the spreadsheet

people right now are wondering whether their games (and apps) will run at all on the new Macs and, if so, how that compares to what they see on their current Macs

Thanks for your comment, I think right now having the game table list all the reports might be a bit confusing.

As a next step I want to group the reports by game and provide a detail view for each game. An added benefit is that every game gets a dedicated URL; it also allows for a more powerful UI (e.g. sort/filter by distribution platform or by the Mac specs you are interested in, and so on) while staying in the context of the specific game.

Thanks for your comment. Appreciate it.

thanks a lot for the feedback!

I've already switched some games that were reported as "playable" to "not playable" because they are only technically playable

I am currently thinking about how to best approach this and provide a definition of "playable" that makes sense as context for the reports

I'm also thinking about adding some kind of confidence indicator (e.g. 5 reports by different known reporters stating a game is playable vs 1 report from an anonymous user w/ incompletely filled out report)
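One way such a confidence indicator could be sketched - the report fields (known_reporter, complete), the weights, and the saturation at 5 reports are all my own assumptions, not a proposal from the site:

```python
from dataclasses import dataclass

@dataclass
class Report:
    playable: bool        # the reporter's verdict
    known_reporter: bool  # reporter has a track record (assumed field)
    complete: bool        # all form fields filled out (assumed field)

def confidence(reports: list[Report]) -> float:
    """Weight each report, take the weighted share agreeing with the
    majority verdict, and scale by report volume (saturating at 5)."""
    if not reports:
        return 0.0
    def weight(r: Report) -> float:
        w = 1.0
        if r.known_reporter:
            w += 1.0   # known reporters count double
        if r.complete:
            w += 0.5   # complete reports count a bit more
        return w
    total = sum(weight(r) for r in reports)
    agree = sum(weight(r) for r in reports if r.playable)
    agreement = max(agree, total - agree) / total
    volume = min(len(reports), 5) / 5
    return round(agreement * volume, 2)

# 5 agreeing known reporters -> high confidence; 1 anonymous, incomplete
# report -> low confidence, even though both say "playable".
print(confidence([Report(True, True, True)] * 5))   # 1.0
print(confidence([Report(True, False, False)]))     # 0.2
```

The exact weights matter less than the shape: agreement alone shouldn't produce a high score when only one thin report exists.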

I'm currently looking at best practices from other software compatibility databases out there.

Any pointers or thoughts very welcome. I want this to be as high quality as possible so you know that if you have a question about game compatibility or performance on Apple Silicon Macs that this is the website to check.

I strongly recommend getting in touch with the developer of ProtonDB [1]. That database encountered the same issues and went through a number of revisions for scoring/confidence and is at a great place right now. You can reach him on Discord [2].

[1] https://www.protondb.com/

[2] https://discord.gg/uuwK9EV

thanks a lot for the pointer!

I think what it needs is a filter you create yourself that defines "playable" and only those games get the green light.

I also think that would give users the most power; I will still try to come up with a sensible default definition for everyone
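A user-defined filter along those lines might look like this - the report fields and example numbers are hypothetical, just to show the shape of the idea:

```python
# Sketch of a user-defined "playable" filter over performance reports.
# Field names and values are assumptions about how a report row might look.
reports = [
    {"game": "Alien Isolation", "fps": 10, "crashes": False},
    {"game": "Among Us",        "fps": 60, "crashes": False},
    {"game": "CS:GO",           "fps": 45, "crashes": True},
]

def playable_for_me(report, min_fps=30, allow_crashes=False):
    """Each user supplies their own thresholds instead of a global one."""
    if report["crashes"] and not allow_crashes:
        return False
    return report["fps"] >= min_fps

green_lit = [r["game"] for r in reports if playable_for_me(r, min_fps=30)]
print(green_lit)  # only "Among Us" passes this user's bar
```

Someone happy with 15 fps would just pass `min_fps=15` and get a different green-lit list from the same data.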

In some cases the quality and resolution are higher than they need to be. Deus Ex: Mankind Divided is an exceptionally heavy (or poorly optimised) game.

24fps at 1080p/highest is indeed unplayable, but I'd expect it to be much better at 720p/medium.

but do you want to play at 720p on a Retina monitor? Seriously?

People play at reduced resolution on 4k monitors all the time. Display technology outpaced game performance a long time ago.

Yes, I'd rather play a game in potato than a slideshow in retina.

Nobody is discounting that somebody might have the opportunity to play their games on a better system, these tests just show what's possible... Which is a lot more than I'd have said a month ago.

imho the key thing about gaming on the M1 Macs right now is that many games that were unplayable on the previous generation of the same Macs (Air, entry model Pro 13", Mini) have flipped from unplayable to playable, which is what many people who use these systems (often not primarily for gaming) care about.

That said: I do think I need to come up with better wording or at least better definitions of what "playable" means (perhaps even for every game * settings * spec combination). With enough reports per game this might actually work.

For a while a long-tail of games will only have 1 report though and I wonder how to deal with those.

edit: e.g. in some cases it might make sense to provide augmented information: if you are interested in performance on a MacBook Air but there is no report for it, it might make sense to add the context that there are reports for the MacBook Pro 13", with the disclaimer that while it is in the same ballpark it is not the same system (that could give the reader enough context to guesstimate an upper/lower bound on what to expect)
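That fallback could be sketched as a lookup with an explicit disclaimer attached. The model names and the similarity table here are illustrative assumptions:

```python
# Sketch of a "same ballpark" fallback when a Mac model has no reports yet.
# Model names and the similarity table are illustrative assumptions.
SIMILAR_MACS = {
    "MacBook Air M1": ['MacBook Pro 13" M1', "Mac mini M1"],
}

def find_reports(reports, mac):
    """Return (matching_reports, disclaimer). Falls back to a similar
    model, with an explicit disclaimer, when there is no exact match."""
    exact = [r for r in reports if r["mac"] == mac]
    if exact:
        return exact, None
    for other in SIMILAR_MACS.get(mac, []):
        close = [r for r in reports if r["mac"] == other]
        if close:
            return close, (f"No reports for {mac}; showing {other} "
                           "(same ballpark, not the same system).")
    return [], None
```

The key point is that the fallback result always carries the disclaimer, so a reader can treat it as a bound rather than a measurement.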

kids these days get mad if their fps drops below 100!

> In fact, there are tons of entries for first/third person 3D games where <30 FPS is apparently playable.

first/third person 3D games are playable with 20 FPS (with drops to 15), I played a lot of them with that FPS on my old laptop (e.g. GTA IV, V, Sleeping Dogs).

I don't game on Macos, but if you get above 10fps, depending on the specific game, it can be playable.

For example, I played several hours of Hitman on a device which cannot push more than 15 fps at 1080p. In that game you have to plan, walk around, wait for opportunities... And you might not even have to fire your gun

I really wish Apple would just sort out their graphics APIs. Proton & Wine make gaming on Linux genuinely, surprisingly performant: I have enjoyed several AAA games (and am continuing to do so). I really don't know why they do not like Vulkan or don't want to provide a compatibility layer for it. One of the games mentioned is Fallout New Vegas, which apparently gets 15 FPS outdoors at the lowest graphics quality settings, in a game more than 10 years old. I've literally just finished it on my Linux box, under Proton, with the GOG copy installing brilliantly under Lutris and with a 3rd-party higher-resolution texture pack. Apple are really shooting themselves in the foot.

Also, from the article on a call of duty online game:

> Everything felt smooth, works flawlessly in highest settings available, but only few minutes until your account get banned :( SoC power consumption is around 3.5W

I'm not surprised that perhaps many devs naïvely assumed that if you're not running x64, you're a "cheater" [or in a VM, which apparently is often also "cheating"...sigh]. I'd be interested to know how many of the god-awful DRM schemes that BigGames like to run (like Denuvo) take to being emulated in crossover-or-wine-running-under-rosetta. I bet they don't like it!

Well, Apple did sort out their graphics APIs ;)

If anybody (e.g. Valve or Apple) saw the Mac as a viable gaming platform, they would offer an API wrapper solution like Proton (which would emulate DirectX and Win32 APIs on top of macOS APIs). But it already starts at the hardware: embedded Intel GPUs don't cut it for gaming.

But the 3D API is just one tiny piece when porting games to other platforms, and the importance of cross-platform 3D APIs is overrated. Game consoles always had their own non-standard 3D APIs, and this didn't hinder games being created or ported. And OpenGL was only a cross-platform API in theory; in practice every OpenGL implementation had its own performance and compatibility quirks, and Vulkan apparently isn't much different.

But the root cause is not a technological problem; there simply are not enough people interested in gaming on macOS. If there were a market, games would come to macOS, with or without Apple's involvement.

> Game consoles always had their own non-standard 3D-APIs, and this didn't hinder games being created or ported.

Game consoles offered developers benefits in return for that, though. Both technical (deeper hardware integration and finer control) and non-technical (a very large market and various financing offerings, for exclusives in particular).

Apple offers nothing here by contrast. Metal is a fine API on its own merits, but it's not offering the game developer anything but more work at the end of the day. And the market size is a joke for MacOS & games that would compete with consoles.

> But it's already starting at the hardware, embedded Intel GPUs don't cut it for gaming.

Macbooks have had above average GPUs for a long, long time. It hasn't helped with MacOS gaming so far, why would you expect it to now?

> Macbooks have had above average GPUs for a long, long time.

As far as I'm aware, the vast majority of portable Macs are equipped with pretty terrible integrated Intel GPUs. What's "above average" there?

The base option on the 16" MacBook Pro is an AMD Radeon Pro 5300M. Discrete GPUs on the 15" have been standard for, I believe, many years now. I don't know what the sales breakdown is between the 15" MBP and the 13" & Air, though.

I partially agree with you that the problem is not technical, although using the most famous and portable APIs definitely helps gaming companies port their games and make them available for macOS or iOS.

But I guess the main 2 problems are:

1) There are not enough Mac users to make it worthwhile for a AAA gaming company to port their games on day 1 and turn a profit.

2) Apple has not yet taken gaming seriously, maybe because they have not seen a viable business model for it. I guess this could change, as Apple is slowly putting their feet in with Apple Arcade (which is definitely not going to bring hardcore gamers), making iOS a good gaming platform (device stability, good GPUs, and good support for 3rd-party controllers), and now, for the first time, the most popular Macs are getting a good GPU. Perhaps this is also one of the reasons why they are blocking cloud gaming.

> 1) There are not enough Mac users to make it worthwhile for a AAA gaming company to port their games on day 1 and turn a profit.

I am a huge Apple portables fan: Macs, iPhones, iPads, etc. However, like a lot of people who casually play video games, I also have a Switch and a PC with a nice RTX 2070.

Apple is dead last on my chain of gaming machines. A lot of people I know with Macs also have a console or PC that serves as the gaming machine(s).

I'm probably wrong, but I also imagine there's a good number of people who just use Boot Camp with Windows. Which adds to the Windows player stats over macOS, yeah?

I wonder if the M1 could possibly change this, since (atm) you can't Boot Camp Windows. Not only that, but the M1 iGPU seems way more capable on entry-level Macs than Intel's iGPU.

> although using the most famous and portable APIs

That would be D3D11 (most famous/used, at least) ;) Metal is actually quite similar to D3D11 (or rather a hypothetical D3D11 successor, had Mantle not happened; Mantle in turn heavily influenced D3D12 and Vulkan). If you have a D3D11 backend in your engine, a Metal backend is fairly easy to derive from it.

> Game consoles always had their own non-standard 3D-APIs, and this didn't hinder games being created or ported.

It did, which is why both Sony and Nintendo made their hardware and APIs less special in recent years. (Which doesn't mean they are not special, just less so.)

I mean, as far as I know, both the PlayStation 4+ and the Switch support Vulkan. Which is one (of multiple) reasons why so many games were ported to the Switch. (The other, maybe bigger, reason was that Nintendo's policies got much better.)

I seriously doubt that Vulkan plays any significant role in getting games onto the PS4 or Switch, simply because there are not many games running exclusively on top of Vulkan. On PC by far most games render through D3D11 and D3D12, with the bulk still using D3D11 because Windows 7 is still a thing. In general you use the API that's "most native" on a platform because that usually gets you the best performance and driver quality.

I'm happy to be proven wrong though.

It doesn't help either that the old adage for people looking to game on their Mac was to use Boot Camp to install Windows and forget about running the Mac version of the game unless it was 10 years old. The most hardcore and devoted of Mac gamers aren't even using macOS.

> I really don't know why they do not like Vulkan or don't want to provide a compatibility layer for it.

Metal was released before Vulkan existed, and Khronos already has a compatibility layer available (https://github.com/KhronosGroup/MoltenVK) - I don’t think graphics APIs are the issue here.

Note that Fallout:NV is running via CrossOver, a (commercial) Wine version. This means it runs Intel-Windows binaries via Rosetta on the Mac.

I might be too optimistic but I'm hopeful that Steam can figure out a way to provide Proton-like compatibility to Macs going forward and/or that CrossOver will help here.

It certainly would be a lucrative market for Steam that just opened up. Most games w/ high performance requirements were not playable on entry-level Macs up until now.

They don't like it because allowing users to run in a VM would be amazing for cheaters. A VM provides direct memory access to the game from outside the VM and a trivial way to create a new computer if you are banned.

Does apple really care about cheaters in desktop games? That’s something for a games publisher or studio to care about. Especially in paid games cheaters would make apple money because they’d need to purchase new accounts to keep playing.

Apple doesn't care at all, game companies care so they block VMs.

I played through New Vegas on a Mac last year using Porting Kit, so that's an Apple Silicon issue not a Mac or graphics API issue.

This looks like a marketing piece. The M1 doesn't replace dedicated graphics power, nor is the Apple platform going to jump to 100W+ extra power usage to embed a proper Nvidia chip. Unless a game performs at 60fps at the display's standard resolution without having to rip its quality apart, it is not an enjoyable experience.

No doubt basic games like Terraria or vanilla Minecraft will work, same as Intel HD.

> 60fps at the displays standard resolution

This has been kind of problematic for MacBooks since the first retina model. They ship with HiDPI displays that are beautiful for text and video, but people don't seem to realize the resolution is just way too high for gaming on an ultrabook. I used to game in 900p comfortably on my Intel integrated 2013 MBP. In native 1600p the same titles would be a slideshow.

Blizzard games deal well with this — on my iMac 5k, the UI is rendered at full 5k resolution, while the game world is rendered at 1440p. 1440p on a 27” display is perfectly fine (it’s what I play on on my PC now) but the UI at 5k does make a significant difference in text legibility.

Not only Blizzard games, I've seen this in ArmA, Euro Truck Simulator 2, Simple Rockets 2 and others.

Consoles do this pretty often, with a FullHD/4K GUI and lower (or even dynamic) resolution for the 3D. Very comfortable.

I don't understand the logic behind this complaint. What is so different about playing a game rendered at 900p on a 900p display and the same game rendered at 900p on a 1600p display?

Imagine you have 16 bright circle lights in a straight line, and you want to fill the full string of lights, edge to edge, with 9 perfectly uniform and evenly spaced circles the same size as the lights.

You can't. You can only turn the lights on in a very non-uniform pattern. Extrapolate that to all the pixels on a screen and you can't get quite the same image fidelity, because some pixels would need to be partially activated for an ideal image. Best case, you render at exactly half the resolution (one fourth the total pixels) so the image aligns with your pixels, but at dramatically decreased image quality.
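In other words, it's the integer-scaling problem. A quick numerical sketch (my own illustration) counts how many source-pixel edges fail to land on a physical pixel boundary, which is why 800p maps cleanly onto a 1600p panel while 900p does not:

```python
# Count how many interior edges of the low-res image fall between physical
# pixels. An integer scale factor lands every edge on a pixel boundary; a
# non-integer one forces partially covered ("blended") pixels.
def fractional_edges(src: int, dst: int) -> int:
    """Integer arithmetic avoids float rounding: edge i is aligned
    exactly when i * dst is a multiple of src."""
    return sum(1 for i in range(1, src) if (i * dst) % src != 0)

print(fractional_edges(800, 1600))  # 0: clean 2x pixel doubling
print(fractional_edges(900, 1600))  # 800 of 899 edges need blended pixels
```

So at 900p on a 1600p panel, almost every pixel boundary is misaligned, which is where the softness comes from.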

What has any of that got to do with gaming in any practical sense? With the barest minimum of smooth scaling, fidelity wouldn’t be any worse on the monitor with higher pixel pitch.

Now you might have a point if talking about pixel art games, but none of those will tax the GPU to any extreme level—in which case you could render at a native 1600p or a perfect pixel doubled 800p.

Except a minor loss of sharpness, no difference.

I was pointing out the fact that expecting native-resolution gaming from a machine, whose display is totally overkill for its size (for gaming, that is), may not be the smart thing to do.

And it was even more absurd in 2012 when people built gaming rigs with 1080p screens and suddenly there was an ultraportable with a 2560x1600 screen (that was so-so usable for 720p or 900p light gaming).

High end display resolution has pretty much always outstripped gaming performance. Plenty of PC gamers play modern games on 4K monitors at reduced resolution even with high end graphics cards, so this is a completely bogus expectation.

Playing Batman Arkham City, which is a pretty 3rd person action RPG, at 1080p and 60fps on my base model Air.

To me, that's astonishing.

As someone with a 2020 intel macbook air (i5) that uses it occasionally for gaming here and there, I can tell you that the reported performance in this database, if accurate, is a substantial improvement.

it might not replace a dedicated GPU but people play games on integrated GPU and the M1 makes games playable that weren't playable on previous Macs w/ integrated GPU

e.g. take a look at Metro Exodus on a MacBook Air M1 (30 FPS+ @ 1080p):


That's exactly why I started the spreadsheet. All the early review benchmarks about the M1s were interesting but what is even more interesting to me is whether a specific game runs and how well and on which system under which conditions and settings.

e.g. if you are out and about, don't have access to a power plug (need to run on battery) it is interesting to know what the MacBook Air M1 can do (compared to previous Macs).

I’m sorry but anyone using 30fps as “playable” shouldn’t be making these judgment calls.

I've got a 2060 in a machine I built for under $2k, connected to a high-refresh-rate 1440p screen. I expect 144-165Hz, which is easily achievable even at high settings in many games.

For non-gaming, I’ve been a dedicated and exclusive Mac user going back to the early 90s. But gaming on them makes no sense.

Are you saying the bar for "playable" (for Metro Exodus in this case) is 144-165 FPS @ 1440p or what's the pre-condition you would define for the label?

Perhaps multiple labels other than just "playable" make sense (runs, playable, enjoyable, excellent, …)? Appreciate any input to make the info on the site relatable for people who care about gaming on Macs.

hi, I just woke up to this thread!

I am maintaining the website and will ship a few improvements this week to make it more usable and to improve context information (like OS version, game version, CrossOver version, etc.)

Thanks a lot for the feedback and for checking it out.

Two things I'm currently thinking about are

a) how to further improve the site (what do you care about? which questions do you have?)

b) how to reach people who might not be aware of the website yet (e.g. it got featured by @gruber on daringfireball recently and on /r/macgaming) but I guess there must be way more people interested in gaming on the Mac out there. Any ideas appreciated!

I'm also on twitter at @__tosh (two underscores)

>15 FPS, playable

That's funny.

And most of these are at the lowest graphic settings.

It is playable.

Not great, but I managed to finish a few 3D games at 15-20 FPS and it was fun.

Not everything needs to be 120 FPS to be fun.

Same here, I am a Mac user since pre-OSX

most of my Macs were entry-level models and/or low-performance laptops with integrated GPUs, like the 12" MacBook. I know Macs were not the ideal platform for gaming, but that did not stop me from enjoying games with low FPS, because the alternative (without a gaming PC) would have been not to play them at all.

That's where I am coming from and that's the main motivation for building this website. I am excited about what Apple Silicon Macs mean for gaming. This is a huge shift.

Sorry about the downvotes!

OK, so why would anyone think 15-20 FPS is not playable? Have you tried? I did and I assure you it is playable - I did it for a few years, finished a few games, and it was enjoyable.

So why I am downvoted?

30fps is widely regarded as the minimum playable FPS. If you personally found it OK that's great, but you're definitely in a small minority on that.

OK, for FPS games maybe (though nowadays FPS games are slower than in the Doom/Quake days - thanks to console controllers). I only played Doom (from 2016) on my laptop. I don't remember the FPS, but I doubt it was better than in GTA V.

For sure < 30fps is low for multiplayer games, but for single-player it is quite enough (unless you get 1 FPS :)

I would straight-up get a migraine beneath 60fps at this point.

People play games on PS4 and Xbox with 30fps and they manage.

I guess Cyberpunk 2077 will have to wait for the M1X.

0-1 FPS, playable.

It works on the PS4 which is pretty old so M1 may be fine.

Question is whether the M1X will support ray tracing.

Or a version that's optimised and natively compiled for the M1.

as ultimate escape hatch it will be available via Stadia which also works well on the M1 Macs

Yeah, I played a bunch of games on my Mac mini. Check out my convoluted thread from a little bit ago.


btw Hades should work by now, I just got a report about it

Wonder if any of these are suffering from the high polling rate issue for gaming mice on Big Sur [1]? Not sure if it affects M1s.

Just ran into this on my 16" Intel MBP after upgrading to Big Sur and attempting to play a game while using a Razer mouse.

Basically, many gaming mice, including my Razer, have upwards of a 1000Hz poll rate by default, and apparently macOS does not handle that well, causing high CPU usage in WindowServer. Big Sur handles it especially badly, hitting 150% CPU in WindowServer, and games become almost unplayable due to massive stutter. Supposedly there is some improvement in 11.1, but it's still recommended to lower the poll rate.

Had to plug my Razer into a PC to run their config software to lower the poll rate to 125Hz. Now everything runs great; I didn't realize how much it was affecting everything, and now games are smooth and playable again.

1. https://www.reddit.com/r/mac/comments/juqcrd/calling_all_big...

great question @ input hardware.

I also have a Razer on Big Sur right now and it "works" but I have no idea which polling rate is used and the official app by Razer seems unsupported.

Going forward I want to add more context about peripherals as well. The section about supported controllers is a first start (e.g. I was surprised that some controllers are supported on Macs but aren't on iOS etc)

I was looking at this for StarCraft 2 and Diablo III. Both have conflicting reports. Until they have like 90% success, I couldn't risk spending the money on Apple Silicon and not being able to play.

I also love playing these (rather older, now) games on modern hardware with very high settings and very fluid framerates.

Definitely envy the battery life of the new Macbooks, but until they have proper gaming, I cannot "have my cake and eat it, too."

I'm in the middle of trying to find a Ryzen/Geforce/huge battery combo that I like, knowing full well that I'll be lucky to get much more than 4 hours doing non-gaming things off cord (with brightness where I prefer it.)

From what I can tell both run better on the M1 Macs than on previous Macs with integrated GPU (if that helps). Other Blizzard games run extremely well like World of Warcraft, others like StarCraft Remastered do not yet run at all though.

I am working on getting more game reports as well as getting the accuracy (& confidence) of the reports up.

Perhaps I should add a feature for getting notified once a playability status changes from not playable to playable?

> knowing full well that I'll be lucky to get much more than 4 hours doing non-gaming things off cord (with brightness where I prefer it.)

Finding it in stock is a challenge but the Zephyrus G14 is exactly this. RTX 2060 GPU pairs perfectly with a 120hz 1080p display, and it still gets >8 hours on battery doing normal non-gaming stuff

My main problems are:

* I don't like when laptops get (too) hot on your lap

* Due to a medical condition 144Hz is now the minimum

* And my eyes definitely prefer larger screens

So I have an HP Omen 15 but I may return it. (I can play older games without it getting too warm, and the screen is really good, but not excellent. The barrel-port power on the side bothers me about 5 times a day.) And I have a Legion 5 17" on order... which may never arrive. (In which case I hope I still have the Omen.) In either case, the battery life isn't going to be great for non-gaming things though.

But a friend has the G14 and loves it!

> I was looking at this for StarCraft 2

SC2 works fine on non-M1 macs, so I assume on M1 it will run even better.

Just a UX suggestion for the site:

The header row of the table should be fixed when the user types in a game title at the top of the page. That way the column titles will be visible even as you scroll. When I typed in a game name and was immediately navigated to the table, I had to guess what the columns represented.

great feedback, thanks a lot!

The only real benchmark for me is Football Manager, where M1 seems to do really well: https://community.sigames.com/topic/515765-fm20-performance-...

So the only native games are iOS games? We'll have to wait awhile for Silicon gaming...

Rosetta 2 seems to work well for most games, with performance way beyond what Intel Macs with integrated GPUs were capable of.

There are a few games like World of Warcraft that already come with native support, and similar to apps there are many games where native versions are expected to become available within the next days and weeks.

So yes, it is still early but out-of-the box the M1 Macs do really well compared to the previous models.

> performance way beyond what Intel Macs with integrated GPU were capable of.

That doesn't say much though.

World of Warcraft has a native version now, apparently. But that's not exactly a long list.

Is something wrong with Rosetta for games?

There is if your goal is to make judgements upon hardware performance.

Diablo 3, playable yes, runs well.

Diablo 3, playable no, unable to launch.

So basically your mileage may vary, a lot, making this whole list a bit... questionable. =/

Yes, it is still early and "it depends" which is exactly why I started the spreadsheet in the first place. Some games like CS:GO don't start using Big Sur 11.0.1 but are playable using the Big Sur public beta 11.1

The idea of the spreadsheet is to provide this context and it gets frequently updated with more reports and more context (and as game developers ship new versions and as Apple ships new versions of the OS).

Then I would say THAT is the information you should highlight in the spreadsheet. THAT is actual, useful information. Two entries completely contradicting each other without any further information of why is not very useful. :)

This set of real-world performance tests demonstrates that the prior glowing reviews of the M1 were based on artificial benchmarks.

It turns out the M1 is not the revelation Apple claimed. Great battery life? Maybe, when not running games or other CPU-intensive tasks. But performance champion? Not even close.

>This set of real-world performance tests demonstrates that the prior glowing reviews of the M1 were based on artificial benchmarks.

Yes, like running regular workloads for coding (compiling, etc), video editing, audio editing, photoshop, full Windows virtualization getting better scores than a Surface X, and other such "artificial" stuff.

No, it just means the games are GPU-heavy and haven't been written natively for Apple Silicon, nor do they take advantage of its graphics APIs, plus the GPU is lightweight (no review said otherwise).

Not to mention the performance is a huge improvement over the exact same Macbook Air's Intel model (with Intel GPU).

You miss that most games are listed as running under Rosetta 2, not natively. This is very impressive, and you can now play games on the Air or mini that you couldn't before!

Yes: The M1 is not beating a dedicated GPU, but it does beat other integrated graphics.

The next challenge for Apple is to provide performance comparable to dedicated GPUs in the MacBook Pro and iMacs. This will be interesting, as Nvidia and AMD have tons of experience and IP. (That is, IF Apple decides to make its own dedicated GPU; maybe it won't.)

Keep in mind these are benchmarks done on low-priced, entry-level models of Apple hardware (about one third of them on a fanless MacBook Air), while running translated x86-to-ARM binaries and probably translated Direct3D/OpenGL graphics calls (correct me if I'm wrong).

Aside from that, the M1 would have been great even if Apple hadn't done the x86 support. The performance at native CPU-bound tasks is still incredible.

Keep in mind that many of these values are for a laptop (MacBook Air) with no fan at all (the MacBook Pro and Mac mini are only marginally better, even with a fan).

I doubt there is any other fanless laptop in the world that can run Baldur's Gate 3 smoothly at 30 fps without a hitch.

I’d interpret this as M1 Macs still being limited by an iGPU and lacking a proper dedicated GPU.

I'd interpret this as the inherent limitation of emulation and compatibility layers. It's pretty clear that the M1 GPU is pretty good as long as software natively targets it. Obviously it's not going to come close to dedicated GPU hardware, but I dare say it's going to be more impressive than people realise once we start seeing a few native ports.

One thing I find especially fascinating is that many gamers were sceptical about whether 8GB or 16GB of RAM would be enough for apps, and specifically for gaming.

The early reports show that when it comes to RAM, for most games there is no noticeable difference between an M1 Mac with 8GB and one with 16GB.

That in itself is remarkable to me, and part of why I started the spreadsheet: to get to real-world performance for specific games/settings/specs, and as a way to have something concrete in addition to the (also interesting) theoretical discussions about how much RAM is enough, whether you can game on an integrated GPU, and so on.

I hope this sheet helps answer some of those questions, and will answer more going forward.

We’re not going to see the real capabilities of these devices until the apps and games have been optimized for them, and I’d guess that most developers won’t have the resources to do so when porting from somewhere else. The big game engines like Unity et al probably will though, so it’ll be interesting to see what they manage to do in the coming year or so.

Or maybe graphics and gaming APIs haven’t been optimised in Rosetta 2.

Or maybe they optimized for specific workloads that cross-platform, not-built-for-Mac games don't hit very hard.

Something I read stated they spent a lot of time optimizing the CPU for specific bottleneck scenarios, like NSObject creation. I don't know how that would map out to apps not written in Swift or Obj-C, but my guess is "not very favorably".

It’s astonishing that this reply is being voted down. It is surely improper and unreasonable to judge a CPU architecture based on code running predominantly through emulation layers, on an architecture for which few if any serious optimisations have been attempted.

It’s worth remembering the experience of first-generation Ryzen, whose good general compute performance was hampered in games by micro-optimisations for Intel chips.

You really think so?

There are some very decent FPS recordings on Macbook Airs. Remember, that's a passively cooled 10W TDP chip. That's pretty incredible imo.

I guess there were more reviews of the CPU rather than the GPU.

Reading through the performance reviews, it's clear that performance on M1 laptops is CPU bound, not GPU bound.

How is that clear?
