The industry, and the gaming community at large, is just long past being interested in graphics advancement. AAA games are too complicated and expensive; the whole notion of ever more complex and grandiose experiences doesn't scale. Gamers are fractured into thousands of small niches, even along timelines, with the 80s, 90s, and PS1 eras each having a small circle of businesses serving them.
The times of the console giants, their fiefdoms, and the big game studios are coming to an end.
I'll take the other side of this argument and state that people are interested in better graphics, BUT they expect an equally better simulation to go along with it. People aren't excited for GTA6 just because of the graphics, but because they know the simulation is going to be better than anything they've seen before. They need to go hand in hand.
That's totally where all this is going. More horsepower on a GPU doesn't necessarily mean it's all going towards pixels on the screen. People will get creative with it.
I'm almost certain that we'll see comments that GTA6 feels like a downgrade to big GTA5 fans, as there was a decade of content created for the online version of GTA5.
I disagree - current-gen consoles aren't enough to deliver smooth immersive graphics. I played BG3 on PS first and then on PC and there's just no comparing the graphics. Cyberpunk, same deal. I'll pay to upgrade to consistent 120fps/4K and better graphics, and I'll buy the games.
And there are AAA titles that make, and will keep making, good money with graphics front and center.
>aren't enough to deliver smooth immersive graphics
I'm just not sold.
Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game? Not to me certainly. Was cyberpunk prettier than Witcher 3? Did it need to be for me to play it?
My query isn't about whether you can get people to upgrade to play new stuff (always true), but whether they'd still upgrade if they could play on the old console with worse graphics.
I also don't think anyone is going to suddenly start playing video games because the graphics improve further.
> Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game?
Absolutely - graphical improvements make the game more immersive for me, and I don't want to go back and replay the games I spent hundreds of hours in during the mid-2000s, like say NWN or Icewind Dale (never played BG2). It's just not the same feeling now that I've played games with incomparable graphics, polished mechanics, and movie-level voice acting/mocap cutscenes. I even picked up Mass Effect recently out of nostalgia but gave up fast because it just isn't as captivating as it was back when it was peak graphics.
> it’s odd how quickly people handwave away graphics in a visual medium.
There is a difference between graphics as in rendering (i.e. the technical side, how something gets rendered) and graphics as in aesthetics (i.e. visual styles, presentation, etc).
The latter is important for games because it can be used to evoke a certain feel in the player (e.g. cartoony Mario games or dread-inducing Silent Hill games). The former, however, is not important by itself; its importance only comes as a means to achieve the latter. When people handwave away graphics in games they are handwaving away the misplaced focus on graphics-as-in-tech, not on graphics-as-in-aesthetics.
I don't know what these words mean to you vs. what they mean to me. But whatever you call the visual quality that Baldur's Gate 3, Cyberpunk 2077, and most flagship AAA titles are chasing - the one that makes them have "better graphics" and be "more immersive" - it is not the only way to paint the medium.
Very successful games are still being made that use sprites, low-res polygons, cel shading, etc. While these techniques still can run into hardware limits, they generally don't benefit from the sort of improvements (and that word is becoming ever more debatable with things like AI frame generation) that make for better looking [whatever that quality is called] games.
And not caring as much about those things doesn't mean I don't understand that video games are a visual medium.
This is just one type of graphics. And focusing too heavily on it is not going to be enough to keep the big players in the industry afloat for much longer. Some gamers care--apparently some care a lot--but that isn't translating into enough sales to overcome the bloated costs.
For me, the better the graphics, mocap, etc., the stronger the uncanny valley feeling - i.e. I stop perceiving it as a video game and instead see it as an incredibly bad movie.
> I don't want to go back and replay the games I spent hundreds of hours in during the mid-2000s, like say NWN or Icewind Dale (never played BG2). It's just not the same feeling now that I've played games with incomparable graphics, polished mechanics, and movie-level voice acting/mocap cutscenes. I even picked up Mass Effect recently out of nostalgia but gave up fast because it just isn't as captivating as it was back when it was peak graphics.
And yet many more have no such issue doing exactly this. Despite having a machine capable of the best graphics at the best resolution, I have exactly zero issues going back and playing older games.
Just in the past month alone with some time off for surgery I played and completed Quake, Heretic and Blood. All easily as good, fun and as compelling as modern titles, if not in some ways better.
- How difficult it must be for the art/technical teams at game studios to figure out, out of all the detail they are capable of putting on screen, how much of it will actually be appreciated by gamers. Essentially making sure that anything they budget a significant amount of worker time to creating isn't something gamers run right past and ignore, and that it contributes meaningfully to 'more than the sum of its parts'.
- As much as technology is an enabler for art, alongside the install-base issue there's the question of how well pursuing new methods fits how a studio is used to working, and whether the payoff is there if they spend time adapting. A lot of the gaming business is about shipping product, and a studio's concern is primarily getting content to gamers rather than chasing tech, because that is what lets the business continue; selling GPUs/consoles is another company's business.
Being an old dog that still cares about gaming, I would assert many games are also not taking advantage of current-gen hardware, being coded in Unreal and Unity - a kind of Electron for games as far as exploiting existing hardware is concerned.
There is a reason there are so many complaints on social media about it being obvious to gamers which game engine a game was written in.
It used to be that game development quality was taken more seriously, when games were sold on physical media and there was a deadline to burn those discs/cartridges.
Now they just ship whatever is done by the deadline, and updates will come later via a DLC, if at all.
It is pretty simple to bootstrap an engine. What isn't simple is supporting asset production pipelines on which dozens/hundreds of people can work simultaneously, and on which new hires/contractors can start contributing right away, which is what modern game businesses require and what Unity/Unreal provide.
Unreal and Unity would be less problematic if these engines were engineered to match the underlying reality of graphics APIs/drivers, but they're not. Neither of these can systematically fix the shader stuttering they are causing architecturally, and so essentially all games built on these platforms are sentenced to always stutter, regardless of hardware.
Both of these seem to suffer from incentive issues similar to enterprise software: they're not marketing and selling to end users or professionals, but to studio executives. So it's important to have - preferably a steady stream of - flashy headline features (e.g. Nanite, Lumen) instead of a product that actually works at the most basic level (consistently rendering frames). It doesn't really matter to Epic Games that UE4/5 RT is largely unplayable; even for game publishers, if you can pull nice-looking screenshots out of the engine or do good-looking 24p offline renders (and slap "in-game graphics" on them), that's good enough.
The shader stutter issues are non-existent on console, which is where most of their sales are. PC, as it has been for almost two decades, is an afterthought rather than a primary focus.
The shader stutter issues are non-existent on console because consoles have one architecture and you can ship shaders as compiled machine code.
For PC you don't know what architecture you will be targeting, so you ship some form of bytecode that needs to be compiled on the target machine.
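To make that concrete, here's a minimal sketch of the PC path (Vulkan, and a compute pipeline just to keep it short; the function and parameter names are made up for illustration): the game ships portable SPIR-V, and the driver only produces machine code for the player's particular GPU at pipeline-creation time. Do that the first time a shader is needed mid-frame and you get the hitch.

    #include <vulkan/vulkan.h>
    #include <cstdint>
    #include <vector>

    // Sketch: the shipped artifact is portable SPIR-V bytecode; the driver compiles it
    // for the player's specific GPU/driver inside the pipeline-creation call.
    // On console, the equivalent artifact ships as machine code for the one known GPU,
    // so there is nothing left to compile on the player's machine.
    VkPipeline compileAtRuntime(VkDevice device,
                                VkPipelineLayout layout,
                                const std::vector<uint32_t>& spirv)  // bytecode shipped with the game
    {
        VkShaderModuleCreateInfo smci{};
        smci.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
        smci.codeSize = spirv.size() * sizeof(uint32_t);
        smci.pCode    = spirv.data();

        VkShaderModule module = VK_NULL_HANDLE;
        vkCreateShaderModule(device, &smci, nullptr, &module);      // cheap: still just bytecode

        VkComputePipelineCreateInfo pci{};
        pci.sType        = VK_STRUCTURE_TYPE_COMPUTE_PIPELINE_CREATE_INFO;
        pci.stage.sType  = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO;
        pci.stage.stage  = VK_SHADER_STAGE_COMPUTE_BIT;
        pci.stage.module = module;
        pci.stage.pName  = "main";
        pci.layout       = layout;

        VkPipeline pipeline = VK_NULL_HANDLE;
        // The expensive part: driver-side compilation for this exact GPU/driver combination.
        // Done mid-frame, this is the stutter players feel.
        vkCreateComputePipelines(device, VK_NULL_HANDLE, 1, &pci, nullptr, &pipeline);

        vkDestroyShaderModule(device, module, nullptr);             // no longer needed once compiled
        return pipeline;
    }

Graphics pipelines (PSOs) are worse: on top of the bytecode they need the full render state up front, which is what the comments below get into.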
Agreed. I didn't mean to say consoles' popularity is why they don't have shader stutter, but rather it's why implementing a fix on PC (e.g. precompilation at startup) isn't something most titles bother with.
It's not just popularity, Epic has been trying really hard to solve it in Unreal Engine.
The issue is that, because of monolithic pipelines, you have to provide the exact state the shaders will be used in. There's a lot of that state, and a large part of it depends on user-authored content, which makes it really hard to figure out in advance.
It's a fundamental design mistake in D3D12/Vulkan that is slowly being corrected, but it will take some time (and even more for game engines to catch up).
That's why I said "precompilation at startup". That has users compile for their precise hardware/driver combination before the game tries to use those shaders for display.
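A rough sketch of what that can look like with a Vulkan pipeline cache persisted to disk, assuming the engine can enumerate its pipeline states up front (which, as the reply below points out, is exactly the hard part); the file name and function are made up for illustration:

    #include <vulkan/vulkan.h>
    #include <fstream>
    #include <vector>

    // Hypothetical startup step: read back the driver's pipeline cache from the last run,
    // build every known pipeline behind a loading screen, then persist the cache again.
    static std::vector<char> readFile(const char* path)
    {
        std::ifstream f(path, std::ios::binary | std::ios::ate);
        if (!f) return {};
        std::vector<char> data(static_cast<size_t>(f.tellg()));
        f.seekg(0);
        f.read(data.data(), static_cast<std::streamsize>(data.size()));
        return data;
    }

    std::vector<VkPipeline> warmPipelinesAtStartup(
        VkDevice device,
        const std::vector<VkGraphicsPipelineCreateInfo>& knownStates) // assumed enumerable up front
    {
        // Seed the cache with whatever this exact hardware/driver combination produced last time.
        std::vector<char> blob = readFile("pso_cache.bin");
        VkPipelineCacheCreateInfo cci{};
        cci.sType           = VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO;
        cci.initialDataSize = blob.size();
        cci.pInitialData    = blob.empty() ? nullptr : blob.data();

        VkPipelineCache cache = VK_NULL_HANDLE;
        vkCreatePipelineCache(device, &cci, nullptr, &cache);

        // Pay the compile cost now, behind a loading screen, instead of mid-frame.
        std::vector<VkPipeline> pipelines(knownStates.size());
        vkCreateGraphicsPipelines(device, cache, static_cast<uint32_t>(knownStates.size()),
                                  knownStates.data(), nullptr, pipelines.data());

        // Persist the cache so the next run (same GPU, same driver version) skips most of the work.
        size_t size = 0;
        vkGetPipelineCacheData(device, cache, &size, nullptr);
        std::vector<char> out(size);
        vkGetPipelineCacheData(device, cache, &size, out.data());
        std::ofstream("pso_cache.bin", std::ios::binary)
            .write(out.data(), static_cast<std::streamsize>(size));

        vkDestroyPipelineCache(device, cache, nullptr);
        return pipelines;
    }

The on-disk blob is keyed to the vendor/device/driver, so a GPU driver update invalidates it and the work gets redone, which is part of why the long waits come back after updates.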
Even this is just guesswork for the way these engines work, because they literally don't know what set of shaders to compile ahead of time. Arbitrary scripting can change that on a frame-by-frame basis, so shader precompilation in these engines mostly relies on recording shader invocations during gameplay and shipping that list. [1]
Like, on the one hand, you have engines/games which always stutter, have more-or-less long "shader precompilation" splashscreens on every patch and still stutter anyway. The frametime graph of any UE title looks like a topographic cross-section of Verdun. On the other hand there are titles not using those engines where you wouldn't even notice there were any shaders to precompile which... just run.
> In a highly programmable real-time rendering environment such as Unreal Engine (UE), any application with a large amount of content has too many GPU state parameters that can change to make it practical to manually configure PSOs in advance. To work around this complication, UE can collect data about the GPU state from an application build at runtime, then use this cached data to generate new PSOs far in advance of when they are used. This narrows down the possible GPU states to only the ones used in the application. The PSO descriptions gathered from running the application are called PSO caches.
> The steps to collect PSOs in Unreal are:
> 1. Play the game.
> 2. Log what is actually drawn.
> 3. Include this information in the build.
> After that, on subsequent playthroughs the game can create the necessary GPU states earlier than they are needed by the rendering code.
Of course, if the playthrough used for generating the list of shaders doesn't hit X codepath ("oh, this particular spell was not cast while holding down shift"), a player will then get a 0.1s game pause when they invariably do hit it.
If anything, I think PC has been a prototyping or proving ground for technologies on the roadmap for consoles to adopt. It allows software and hardware iteration before something is relied upon in a platform that has to stay stable and mostly unchanging for around a decade, from designing the platform through developers using it and, more recently, major refreshes. For example, from around 2009 there were a few cross-platform games with a 32-bit/DX9 baseline but optional 64-bit/DX11 capabilities, and given the costs and teams involved in making the kinds of games that stretch those capabilities, I find it hard to believe it'd be just one engineer or a small group putting significant time into optional modes that aren't critical to the game functioning, and supporting them publicly. Then a few years later that's the basis of the next generation of consoles.
Long first runs seem like an unambiguous improvement over stutter to me. Unfortunately, you still get new big games like Borderlands 4 that don't fully precompile shaders.
Depending on the game and the circumstances, I'm getting some cases of 20-40 minutes to compile shaders. That's just obscene to me. I don't think stutter is better but neither situation is really acceptable. Even if it was on first install only it would be bad, but it happens on most updates to the game or the graphics drivers, both of which are getting updated more frequently than ever.
Imagine living in a reality where the studio exec picks the engine based on getting screenshots 3 years later when there's something interesting to show.
I mean, are you actually talking from experience at all here?
It's really more that engines are an insane expense in money and time, and buying one gets your full team in-engine far sooner. That's why they're popular.
PCs cost a lot and depreciate fast. By the end of a console lifecycle I can still count on developers targeting it, while PC performance on 6+ year old hardware is guaranteed to suck. And I'm not a heavy gamer - I'll spend ~100h on games per year, but so will my wife and my son - a PC sucks for multiple people sharing it, a PS is amazing. I know I could concoct some remote-play setup over LAN on the TV to let my wife and kids play, but I just want something I spend a few hundred EUR on, plug into the TV, and it just works.
Honestly the only reason I caved with the GPU purchase (which cost the equivalent of a PS pro) was the local AI - but in retrospect that was useless as well.
> by the end of a console lifecycle I can still count on developers targeting it
And I can count on those games still being playable on my six year old hardware because they are in fact developed for 6 year old hardware.
> PC performance for 6+ year hardware is guaranteed to suck
For new titles at maximum graphics level sure. For new titles at the kind of fidelity six year old consoles are putting out? Nah. You just drop your settings from "ULTIMATE MAXIMUM HYPER FOR NEWEST GPUS ONLY" to "the same low to medium at best settings the consoles are running" and off you go.
Advancements in lighting can help all games, not just AAA ones.
For example, Tiny Glade and Teardown have ray traced global illumination, which makes them look great with their own art style, rather than expensive hyper-realism.
But currently this is technically hard to pull off, and works only within certain constrained environments.
Devs are also constrained by the need to support multiple generations of GPUs. That's great from the perspective of preventing e-waste and making games more accessible. But technically it means that assets/levels still have to be built with workarounds for rasterized lights and inaccurate shadows. Simply plugging in better lighting makes things look worse by exposing the workarounds, while also lacking polish for the new lighting system. This is why optional ray tracing effects are underwhelming.
Nintendo dominated last generation with the Switch. The games were only HD and many ran at 30fps. Some AAA titles didn't even get ported to it. But they sold a ton of units and a ton of games, and few complained, because people were having fun, which is what gaming is all about anyway.
> That is a different audience than people playing on pc/xbox/ps5.
Many PC users also own a Switch. It is in fact one of the most common pairings. There is very little I want from PS/Xbox that I can't get on PC, so there's very little point in owning one; I won't get any of the Nintendo titles elsewhere, so keeping a Switch around makes significantly more sense if I want to cover my bases for exclusives.
I agree, but exclusives are one way to differentiate an audience. I literally don't like any game Nintendo makes except maybe Zelda, and I wouldn't buy a Switch just for that.
I do have a Switch because I have kids though.
Exploding production cost is pretty much the only reason (e.g. we hit diminishing returns in overall game asset quality vs. production cost at least a decade ago), plus, on the tech side, a brain drain from rendering tech to AI tech (or whatever the current best-paid mega-hype is). Also, working in gamedev simply isn't "sexy" anymore since it has been industrialized into essentially assembly-line jobs.
Have you played it? I haven't so I'm just basing my opinion on some YouTube footage I've seen.
BF1 is genuinely gorgeous, I can't lie. I think it's the photogrammetry. Do you think the lighting is better in BF1? I'm gonna go out on a limb and say that BF6's lighting is more dynamic.
Yes I played it on a 4090. The game is good but graphics are underwhelming.
To my eyes everything looked better in BF1.
Maybe it's trickery but it doesn't matter to me. BF6, new COD, and other games all look pretty bad. At least compared to what I would expect from games in 2025.
I don't see any real differences from similar games released 10 years ago.