Apple is fundamentally incompatible with "serious" gaming. Unlike most software, games don't receive endless updates and maintenance. Every few years Apple makes a breaking change and expects every app to update or break, which is fine for Photoshop and Electron apps, but most games just end up unplayable. This happened when Apple dropped 32-bit support in macOS Catalina, and tons of games that used to work on Mac never worked again.
It doesn't seem like a market they have any interest in. The real money is in mobile slop games with microtransactions.
Depends a lot on your hardware. I've got a ~2020 gaming PC; I installed Bazzite on it, moved my desktop to the TV, and only use it with an Xbox controller. Never opened the terminal or configured anything, and all my games just work.
I'd guess that the difference only matters if you have the latest, most expensive gear pushed to the limit. I have a 2019 RX 5700 XT and one of the DDR4 Ryzen 5 CPUs, and all of my games run flawlessly on Linux with great performance.
I've long since decided that buying the latest top-end hardware is just spending a lot of money to be upset by buggy drivers or by not getting 5000 fps in a benchmark, but has no real gains in how fun games are.
So you have very old hardware, can barely play modern AAA games (if at all), and are still happy. Good for you.
But your opinion is relevant to the average gamer who enjoys playing current-year releases in the same way that someone who drinks instant coffee can advise on coffee beans: it's all just caffeine in the end.
Not a power issue but a feature issue. The lack of ray tracing support blocks Indiana Jones and Doom: The Dark Ages (though you can do it in software on Linux): https://youtu.be/aU2qwlCLWm8 . Doom: The Dark Ages also added a check for Vulkan Variable Rate Shading, requiring a workaround to spoof it. A mesh shader requirement prevents Final Fantasy 7 Rebirth from running.
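To make the mechanism concrete, here's a rough sketch in C of the kind of capability probe these games effectively run at startup. The extension names are real Vulkan identifiers; the device_has_extension helper itself is just illustrative, not any game's actual code.

    // Sketch: scan the device extension list the way a startup check
    // effectively does. Extension names are real Vulkan identifiers;
    // everything else here is illustrative.
    #include <stdlib.h>
    #include <string.h>
    #include <vulkan/vulkan.h>

    static int device_has_extension(VkPhysicalDevice dev, const char *name) {
        uint32_t count = 0;
        vkEnumerateDeviceExtensionProperties(dev, NULL, &count, NULL);
        VkExtensionProperties *props = malloc(count * sizeof *props);
        vkEnumerateDeviceExtensionProperties(dev, NULL, &count, props);
        int found = 0;
        for (uint32_t i = 0; i < count; i++) {
            if (strcmp(props[i].extensionName, name) == 0) { found = 1; break; }
        }
        free(props);
        return found;
    }

    // Roughly the checks in question:
    //   device_has_extension(dev, "VK_KHR_fragment_shading_rate")  /* VRS */
    //   device_has_extension(dev, "VK_EXT_mesh_shader")            /* FF7 Rebirth */

If the driver doesn't advertise the extension, the game refuses to launch, which is why spoofing the extension is enough to get past the check.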
This. I understand that getting your desktop fps to ridiculous heights is a hobby in and of itself, an obsession I don't share at all, and good luck to those who do. But I'm colourblind and have the reaction speed of a slug. Anything over 25fps is wasted on me.
After building a few PCs over the years, I've noticed that every time I've bought the highest-end new part, I feel bad about the money spent, then bad every time there's a delayed frame or a missing feature, and then bad again when the next model comes out.
Every time I get something mid-range or second-hand, I feel good about what a deal I got, how I'm getting 98% of the features for 40% of the price, and how, realistically, as soon as you stop pixel-peeping screenshots you won't even notice your settings are on High instead of Ultra. You just take in the story, the sound design, and the actual game.
> all of my games run flawlessly on Linux with great performance.
Your definition of great performance is not mine, but it’s fantastic to watch Linux users continue to hand-wave away real issues whilst claiming the same or better performance across the board, which is provably false.
> but has no real gains in how fun games are.
It absolutely does for me. Modern displays are absolutely dogshit. I won’t play at anything less than 144Hz; where I can, I aim for 200Hz, and I want that with consistent frame times.
This is exactly the mentality I'm talking about. People have entertained themselves for all of human history without anything nearly as sophisticated as modern displays. At some point this unchecked desire will suck all of the fun out of a hobby and leave you constantly buying the latest thing, dissatisfied with anything that isn't the highest spec you can possibly acquire.
The game story, gameplay elements, and such have become secondary to the real hobby of consumerism. If people could have fun gaming 20 years ago, there is no reason it isn't possible to have just as much fun gaming on low- to mid-range hardware today.
I think this is similar to how buying books is a related but different hobby from reading books, or buying board games is a related but different hobby from playing board games. I know people who have hundreds of board games, thousands of dollars' worth, but rarely get to actually play them (for various reasons, mostly involving children).
The hobby of optimising your gaming desktop is a related but different hobby to actually playing games.
Completely agreed. I think most hobbies have this perverse side that is just themed consumerism. And it's so easy to get sucked into watching YouTube videos about the latest board games you just need to buy, while in reality you aren't even playing the ones you already have.
It's much harder to step back and realise you don't need the new thing most of the time. Sure, if you have a 15+ year-old desktop and can't run new games at all, an upgrade could be good, but I'd guess most hardware purchases come from people who already have great hardware.
It’s a bizarre assumption that because people happen to have different preferences or needs than you do, it must be “consumerism.”
I have very specific requirements for motion clarity in games on modern displays. Older display technologies like CRTs and plasmas achieved this naturally through the way they operated. Most modern sample-and-hold displays do not.
You may not notice or be affected by that difference, which is fine, couldn’t be more thrilled for you. However, I am affected. Anything below 120Hz on a sample-and-hold display causes me noticeable discomfort, and for a long time I stopped gaming entirely because I couldn’t work out why it had seemingly overnight become so uncomfortable. Eventually I realised the issue started when I moved away from CRTs and plasma TVs to modern sample-and-hold displays.
I was only able to comfortably return to gaming with very fast displays, 120Hz minimum and preferably 240Hz, because that gets closer to the motion quality I was used to from years of using PC CRTs. For games locked to 60Hz or below, I still prefer playing them on a CRT, which is exactly why I own a number of them.
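For anyone wondering why the refresh rate matters so much here, a common rule of thumb is that perceived smear on a full-persistence sample-and-hold display is roughly eye-tracking speed times frame hold time. A back-of-envelope sketch in C (the 1000 px/s tracking speed is just an illustrative assumption):

    /* Rough motion-blur estimate for a full-persistence sample-and-hold
       display: smear ~ eye-tracking speed * frame hold time.
       The 1000 px/s tracking speed is an illustrative assumption. */
    #include <stdio.h>

    int main(void) {
        const double track_px_per_s = 1000.0;
        const double rates_hz[] = { 60.0, 120.0, 240.0 };
        for (int i = 0; i < 3; i++) {
            double hold_s = 1.0 / rates_hz[i]; /* each frame is held this long */
            printf("%3.0f Hz -> ~%4.1f px of smear\n",
                   rates_hz[i], track_px_per_s * hold_s);
        }
        return 0;
    }

That works out to roughly 17px of smear at 60Hz versus about 4px at 240Hz; a CRT sidesteps the whole thing by flashing each frame for only a millisecond or two instead of holding it.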
> At some point this unchecked desire will suck all of the fun out of a hobby
You’re projecting. I think I’ve got what I enjoy from my hobby figured out after 35+ years, but thanks anyway.
> The game story, gameplay elements, and such have become secondary to the real hobby of consumerism.
You’re projecting.
> If people could have fun gaming 20 years ago
I didn’t have to endure sample-and-hold slop 20 years ago; now I do. You may accept or tolerate it, but I am under no requirement to do so, nor to accept that a significant performance loss is “ok” in any circumstance.
If I wanted less performance, I’d buy something with less performance to begin with.
They forked the slicer, then put the networking part in a plugin that gets downloaded after you first open it, purely so they don't have to share its source, since the rest of the slicer is GPL.
The last two jobs I've had ended up with teams spread across multiple offices and time zones. I don't hate the idea of coming in to the office, but every time I do I end up only talking with people from other cities on calls anyway.
That said, I completely agree. I learned most of what I know from being in the same room as senior developers and asking questions, something that just isn't happening these days.
I'm fairly sure Android used to have an internet permission back in the early days, but basically every single app requested it, so its utility was diluted. Then they switched away from a static list of permissions towards an ask-at-time-of-use model.
The old permission model was always a bit of an illusion of choice. The app presented a massive list of permissions and you could take it or leave it, but when every app asks for every permission you don't really get a choice; you just had to accept it. The new model, where you can install an app and then reject its permissions, is much better.
Stock Android has always classified internet access as a "normal" permission that can't be toggled by the user. I think the app still has to declare it (android.permission.INTERNET) in its manifest, and you can see it in the app details, but it has always been auto-granted with no way to turn it off.
My recollection is that I stopped seeing the “Internet” permission a couple of years before Android gained the ability to toggle permissions, or to do anything with them at all beyond displaying the manifest list during app installation.
Almost a decade ago, I wrote and published a small companion app for a game and set a hard rule for myself that it wouldn't request the internet permission (and thus wouldn't need stuff like a privacy policy). It still managed to be useful despite that, which made me pretty proud at the time.
The processing power is there, but the actual game support is not, which is the more important part. Some games do support it, but at least three quarters of my Steam library won't run on a MacBook.
Even games that used to run on Mac mostly stopped working after 32-bit support was dropped.
The Googlebook name won't stick around for that long. It'll be like the Nexus and the Pixel C: around for 1-3 revisions, canned, then brought back a few years later.
It seems pretty inconsistent. I tried GeForce Now on my gigabit internet and it was super laggy, with a lot of audio glitching. Maybe I just didn't have a datacenter nearby.