
The PS5 is already emulated on computers today to some degree. Kyty doesn’t run any commercial PS5 games but it can run some homebrew PS5 apps.


The PS5 is basically a regular old desktop computer. The PS3, on the other hand, had a quite unique Cell architecture.


I can highly recommend this tech talk on PS4 and PS5 hardware by their lead hardware architect: https://www.youtube.com/watch?v=ph8LyNIT9sg

some really interesting topics related to what kind of performance they wanted as a baseline, and how they optimized to make the PS5 do what it currently does... like no other platform.

a $700 console that:

- outputs 4k / HDR (upscaled from lower native res. ofc)

- renders games steadily at 30 / 60fps

- no hitches, framedrops

- always records your gameplay

- live video sharing of stream with PS friends

- live streaming to youtube

- updates of games being installed

- downloads of games/data

all simultaneously, with instant game switches and quick loading times. the experience cannot be replicated with a $1,000 gaming PC, not even a $2,000 machine. They really delivered a device which is, imho, more than just "an AMD gaming PC with a custom GUI and some DRM"

would have loved to have widespread keyboard/mouse support, as playing Far Cry with a controller is frustrating at best. And keyboard/mouse support is in the hands of each game dev, whether they want to support it or not.


Performance is vastly helped just by not having a fragmented ecosystem or a moving target. If I know all my customers are running my software on the exact same hardware, I can optimise the hell out of my software, and also I can see exactly how it will perform for the end user and polish the worst bits until either it's shiny enough for me or I run out of budget.

If every single user has a different setup with components chosen from a vast array of possibilities that can all do different sets of things at different speeds... well, I can get it to work on my machine, and I can try and guess what to degrade when things get bad, but ultimately I just have to throw it over the wall and hope it's not too terrible in the wild. It's impressive, really, that PC desktop games work as well as they do.


that's been the case for every console made after the 90s; here, the fact that gaming desktop PCs and consoles share similar hardware makes it, imho, all the more interesting to find use cases that deliver a better out-of-the-box experience at a fraction of the gaming PC budget.


- no hitches, framedrops

This statement alone is pure bullshit. So the rest should be taken with a pound of salt.


While that one line is incorrect, it’s a fallacy to use that to try and sow the seeds of doubt in the other points. People make mistakes in wording all the time and it’s a low debate tactic to do what you did.

The fact is some frame hitches are gone but obviously not all.

The ones that will primarily be gone or largely mitigated are

- shader compilation hitches, because console versions of games can ship with precompiled shaders.

- data transfer hitches because the consoles have shared memory and dedicated compression blocks to optimize transfer

- system resource scheduling contention because the OS and other processes won’t start interfering with the game process since they use dedicated resource allocation.
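The first category can be illustrated with a toy frame-cost model (all numbers here are invented for illustration; real compile times vary):

```python
# Toy model of the "first use" shader hitch: a cache miss pays a
# one-off JIT compile cost on top of the normal draw cost.
compiled_cache = {}

def draw(shader_id, jit_compile_ms):
    cost_ms = 1.0  # nominal draw cost (illustrative)
    if shader_id not in compiled_cache:
        compiled_cache[shader_id] = True
        cost_ms += jit_compile_ms  # the hitch lands on this one frame
    return cost_ms

# Console-style: shaders shipped precompiled, so compile cost is zero.
console_frames = [draw(f"s{i}", jit_compile_ms=0.0) for i in range(3)]

# PC-style JIT: the first draw with each new shader pays ~100 ms.
compiled_cache.clear()
pc_frames = [draw(f"s{i}", jit_compile_ms=100.0) for i in range(3)]

assert max(console_frames) == 1.0
assert max(pc_frames) > 16.7  # blows well past a 60 fps frame budget
```

The point being that precompilation doesn't make shaders faster; it just moves the one-off cost off the frame where the player would feel it.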


thanks, on an internet forum, you'd expect some leeway, but alas, it seems it's worse than a technical paper ;)

the OS and hardware working hand in hand to help overcome some of the causes for hitches and frame drops is what sets the consoles apart from the DIY PC builds where you simply don't have access to the custom design;

and it's also imho where in this generation Sony has pulled ahead of Microsoft even if both are using similar hardware


> the DIY PC builds where you simply don't have access to the custom design;

On Linux (or Steam Deck) you can precompile shaders for your specific hardware like Switch or PS5. There is nothing about "DIY PC builds" that prevent you from building an experience like this.


The Steam Deck is a single known entity of hardware. For all intents and purposes, it can be treated like a console in that regard.

But DIY PC builds, that’s a wide range of hardware to support. And it’s not just hardware, it’s driver versions, OS versions and firmware versions.

So it’s possible to do what Valve does where the first playthrough caches the shader compilations and then stores them by a configuration hash, so subsequent users get it. But the sheer number of hardware and software permutations makes it significantly harder.

It has nothing to do with Linux either.

The shaders are therefore not precompiled in the same way they are for console. It just means that the second playthrough of a section is a shared experience, taking advantage of the first user's resources.

If a game hasn’t been played first, or you encounter an area of the game that hasn’t been encountered before you, or you’re on a slightly different hardware/software combination than the previous shader cache, you’ll hit the stuttering again.
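As a rough sketch of that configuration-hash keying (all hardware names and version strings below are made up for illustration):

```python
import hashlib

def cache_key(gpu, driver, os_ver, shader_src):
    # A compiled shader binary is only valid for one exact
    # hardware/software combination, so the key must cover all of it.
    blob = "|".join([gpu, driver, os_ver, shader_src]).encode()
    return hashlib.sha256(blob).hexdigest()

# A console is one fixed configuration: one key covers every unit.
console = cache_key("custom-RDNA2", "fw-9.60", "console-os", "water.frag")

# On PC, each GPU/driver/OS permutation needs its own cache entry,
# so a cache shared between users often misses.
pc_a = cache_key("RTX 4070", "551.23", "win11-22H2", "water.frag")
pc_b = cache_key("RTX 4070", "552.12", "win11-22H2", "water.frag")  # a driver update alone invalidates it

assert console != pc_a and pc_a != pc_b
```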


That's not how Valve caches shaders on Steam. They accommodate those DIY builds by compiling them on-machine with Fossilize, converting them to system-optimized files. For DirectX titles like Elden Ring, this effectively eliminates all shader compilation stutter in-game. It also doesn't rely on fancy "first playthrough" setups, since it's translating and optimizing the original shaders wholesale.

> It has nothing to do with Linux either.

It's an out-of-box feature with Steam on Linux. You can run all of this stuff on Windows too, but you'd have to build it from source and configure DXVK environments for each game by-hand. On Linux it all happens automatically.


It’s a factor of Steam not Linux. They could in theory do it for other platforms too.

Fossilize does require at least one playthrough because shader permutations can be generated at runtime. There's no static shader setup that's common to all games. It just means that the first playthrough doesn't have to be by the same person playing it right now.


I believe Fossilize snapshots the entire pipeline configuration. It can then replay that and generate final hardware-specific binaries, not just SPIR-V, for the cache, completely ahead of time.

That's much better because it doesn't matter what hardware the first person used, the data can be used everywhere.
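A minimal sketch of that record-then-replay idea, with a hash standing in for the hypothetical per-GPU compiler (field names are illustrative stand-ins for a Fossilize-style archive):

```python
import hashlib
import json

# The "recording" is hardware-independent pipeline state plus the
# portable SPIR-V, not any compiled GPU binary.
recording = json.dumps({
    "spirv": "portable-shader-ir",
    "state": {"blend": "alpha", "depth_test": True},
})

def replay_compile(recording, local_gpu):
    # Ahead-of-time replay: turn the shared recording into a binary
    # for *this* machine's GPU (hash used as a compiler stand-in).
    return hashlib.sha256((recording + local_gpu).encode()).hexdigest()

# One recording, captured on anyone's machine, yields a valid local
# binary everywhere; only the final compile step is per-GPU.
bin_a = replay_compile(recording, "RTX 4070")
bin_b = replay_compile(recording, "RX 7800 XT")
assert bin_a != bin_b  # binaries differ per GPU; the recording is shared
```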


True, the replay aspect does help as long as nothing invalidates the pipeline, which is still a higher possibility on PCs than consoles


[flagged]


pointing out a problem that many bad PC games have does not magically give the ps5 extra performance and better frame rates.


It does when you are misinformed believing that the only cause for frame-rate drop is JIT shader compilation.


never said that shader compilation is the only cause, but I'm not wasting effort replying to somebody calling BS without bothering to RTFM, so ya. nothing of value lost


I'm not saying that the PS5 isn't a good performer for its cost. It is also clearly cost-optimized to do exactly those features without any resources wasted on extra hardware. But at the end of the day, even if an equivalent desktop computer cost 5x as much, the hardware and hardware architecture look identical.

Sure, this will make emulation hard right now because you don't have the huge compute advantage that you do when emulating a PS3 on modern hardware, but you shouldn't have much difficulty matching the architecture, because it already matches. Basically a PS5 emulator can look a lot more like Wine as opposed to hardware emulation like you see for NES, N64 and similar consoles which were completely custom hardware.
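A toy contrast between the two approaches, with a tiny interpreter standing in for hardware emulation and a made-up one-entry call table standing in for the Wine-like case:

```python
# 1) Hardware emulation (NES/N64 style): every guest instruction is
#    interpreted, so cost scales with the guest's instruction count.
def emulate(instructions):
    acc = 0
    for op, val in instructions:
        if op == "add":
            acc += val
        elif op == "mul":
            acc *= val
    return acc

# 2) Wine-style translation: guest code already runs natively on the
#    same x86 CPU; only OS/library calls are mapped to host
#    equivalents (the call name below is hypothetical).
call_table = {"gnmSubmit": lambda queue: f"vkQueueSubmit(queue={queue})"}

def translate_call(name, arg):
    return call_table[name](arg)

assert emulate([("add", 2), ("mul", 3)]) == 6
assert translate_call("gnmSubmit", 0) == "vkQueueSubmit(queue=0)"
```

The translation route skips the per-instruction interpretation cost entirely, which is why matching architectures matters so much.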


it's definitely done on purpose by Sony, to have PC-grade hardware so they can port their games easily to the PC platform and gain a larger install base for the first-party titles, which previously only existed on the PlayStation. All in all, I see it as a win for end users that the platforms are converging, as software titles become available across multiple platforms, so price competition is very relevant. In a world experiencing the worst inflation in decades, this is a very thin silver lining for sure.


> always records your gameplay

> video share

> live stream

it's called hardware video encoding. it's great that those features are present, but HW encoding (even in consoles) isn't new


never said it was "new"; I've had Nvidia ShadowPlay running forever, plus OBS as the better open-source solution on a gaming PC; but that doesn't take away that the all-in-one polished end-user experience is very nice and required some better planning.

PS3 multitasking performance was horrid


Consoles are a loss-leader, so saying that the price is $700 doesn't tell you what it costs without including how much of a negative margin Sony was willing to take.


The PS5 and Xbox Series X are basically AMD Ryzen 4000-series x86 cores and a GPU slapped on a die with some memory, so emulation isn't so far off


The original Xbox proved that hardware similarity doesn't necessarily make emulation easy. It took a long while for OG Xbox emulation to be decent, and that was with it using fairly commodity hardware and a very DirectX-like API.

The PS5 has custom APIs that would need to be reversed out, especially graphics APIs. It also has a large-ish pool of shared memory that makes it difficult to map to most PCs which don’t have that setup.

There are several custom hardware blocks for bespoke decompression that are routinely used, plus an equivalent to DirectStorage to speed up resource access.

It’s not impossible to port those games over as has been seen, but it’s also not easy to emulate that if the specific game builds make use of those features (and many many games do)


I think what we'll likely see is that it's (relatively) easy to get PS5 games running in a PC emulator, but running them _well_ will take ages. Primarily because PCs will have to be able to out-horsepower the PS5 by a wide margin to make up for things like the shared memory setup, texture streaming stuff, etc.


Yeah, PS5 games expect to have up to 16GB of VRAM available, and since GPU vendors are stingy with VRAM to upsell their higher tiers, getting that much means buying expensive high-end cards.

But that doesn't do anything to help with the PS5's shared memory architecture, where, because VRAM and RAM are one and the same, textures that need to be in memory aren't duplicated between RAM and VRAM like on bog-standard PCs, which has performance implications.
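The accounting difference can be sketched with toy numbers (sizes are illustrative, not measurements):

```python
# Byte accounting for one streamed texture working set.
texture_mb = 4096

# Discrete-GPU PC: assets are staged in system RAM and copied into
# VRAM, so the same bytes occupy (at least transiently) both pools,
# and the copy itself costs bus bandwidth.
pc_footprint_mb = texture_mb + texture_mb   # RAM + VRAM copies
pc_bus_traffic_mb = texture_mb              # RAM -> VRAM transfer

# PS5-style unified memory: one pool, one copy, no staging transfer.
unified_footprint_mb = texture_mb
unified_bus_traffic_mb = 0

assert pc_footprint_mb == 2 * unified_footprint_mb
assert pc_bus_traffic_mb > unified_bus_traffic_mb
```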

Windows has support for streaming assets directly from SSDs like the PS5 does now (at least if you have a fast enough NVMe SSD installed, SATA SSDs or older NVMe drives won’t cut it), but PCs still lack the hardware texture decompression of the PS5 which once again impacts performance.

The mass-market computers closest in architecture to the PS5 are actually M-series Macs, with how they also have a large pool of memory serving as both RAM and VRAM. Once the integrated GPUs on M-series SoCs achieve parity with the PS5's onboard Radeon, they might actually be the most straightforward machines to emulate a PS5 on, despite needing x86-to-ARM translation.


> PCs still lack the hardware texture decompression of the PS5 which once again impacts performance.

They might not implement it the same, but hardware-accelerated texture decompression has been around on PC for as long as SIMD has existed. With tech like ASTC floating around, I'm not sure if it's appropriate to say PCs really "lack" the technology.

> they might actually be the most straightforward to emulate a PS5 on despite needing x86-to-ARM translation.

The problem with Apple Silicon is that nobody wants to use Metal. The big Switch emulator Yuzu should have also been a perfect fit for Apple Silicon too, but it took years to get "ported" and the end result used MoltenVK for the GPU API. Now that it's here, systems like the Steam Deck are hitting 60fps where M2 struggles to hold 50:

https://youtu.be/pubEj1yLknI?t=414

https://youtu.be/5BeYYuLnS3I

It would be cool to see, but nothing I've witnessed surrounding these sorts of emulators suggests that will be the case.


At the end of the day it all depends on the will of the individuals involved with the projects. Dolphin got a native Metal port for instance.


You're conflating texture compression like ASTC with generic resource compression.

https://gamingbolt.com/former-frostbite-software-engineer-ex...

Kraken is a generic resource compressor, while Oodle Texture is the one closer to ASTC


> The problem with Apple Silicon is that nobody wants to use Metal.

iOS games market begs to differ.


The iOS games market speaks for itself. It's littered with freemium games and low-effort asset flips, the number of shitty 2D lottery/lootbox games outnumber Monument Valleys 100:1.

The vast majority of substantial game experiences are not getting ported to iOS. The reason for this is mostly Metal-related. Apple has acknowledged this themselves on many occasions, like the last WWDC with their Game Porting Toolkit.


Why weren’t they getting ported before Metal when OpenGL 3.1 was still at parity with the rest of the industry?

The graphics API is not the significant portion of the porting issue. It’s market share and the fact that until recently, very few Macs by market share had great GPUs.

The Game Porting Toolkit works alongside Wine and Rosetta to shorten time to first pixel, making it easier for developers to consider the platform.

Regardless of Metal or not, time to first pixel and consistency of hardware have always been the biggest hurdles. Most big engines support Metal just fine already, so it's not the primary hurdle people claim; otherwise we'd see more Unreal and Unity games running natively on Macs.

Now every Mac has a decent GPU (for some definition of decent) with very similar hardware.


> Why weren’t they getting ported before Metal when OpenGL 3.1 was still at parity with the rest of the industry?

They were. The number of OpenGL games was minuscule though, and Apple's underlying APIs have since broken, rendering most of these games unplayable. Apple doesn't really provide a stable gaming runtime, outside of the DirectX-like promise that if you use their proprietary APIs they won't deprecate them.

> The game porting toolkit works alongside wine and Rosetta to make time to first pixel easier for developers to consider the platform.

See, that's the thing. "time to first pixel" was an issue because of Apple's APIs. If you translate non-native program calls into native ones, then obviously you circumvent the problem.

Furthermore, the reason why Game Porting Toolkit didn't exist before now was because Apple had to write a Metal translator for DirectX. The community never wrote one like they did for Vulkan, likely because nobody wants to translate DirectX to a second proprietary API. Kinda defeats the purpose, at least for non-commercial contributors.

> Most big engines support metal just fine already, so it’s not the primary hurdle people claim otherwise we’d see more unreal and Unity games running natively on Mac’s.

Most big engines also support PS5 and Nintendo Switch as development targets. The reason why they are relatively unpopular for porting is the exact same as Apple's - the APIs are nonstandard and closed, with limited distribution and long-term support options. Why would anyone put in the majority of their effort to support a minority of the market?


The number of Mac Metal games is about the same as the number of Mac OpenGL games. Which is to say minuscule, like you said, but all that shows is that it's not about the APIs, or we'd see Unity/Unreal games aplenty.

It's just down to market share. Time to first pixel still matters for off-the-shelf engine-based games, because devs need to get over the hump of building it at all, let alone considering all the possible hypotheticals of how it works, even before they get to APIs.

The Game Porting Toolkit solves that. It's not meant as a general-purpose translator, just to get people over that hump.

And again, it's just down to market share. There are plenty of AAA games on iOS that use the same engines as PC games without having Mac versions. Take the Call of Duty games for iOS: why was there no prevalent CoD on macOS?

All that proves to me is that APIs aren’t the primary reason.


PS5 and Nintendo Switch unpopular?!?

The first and second champions of game sales of this decade!

What a joke, thanks for making my day.

By the way, game studios don't have any issue translating DirectX to LibGNM/X and NVN.


> PS5 and Nintendo Switch unpopular?!?

>> for porting

I don't think my statement is wrong. People don't like porting to Switch or Playstation 5, there's a significant amount of development and testing overhead required to support either platform. The Switch has a decently popular SDK backed with Nvidia drivers, but requires deliberate ARM ports and very carefully written Vulkan code (if any). The PS5 is a little friendlier to PC-first devs, but still has a unique runtime and zero options for DirectX code. Both platforms require fairly bespoke versions of your game, compared to the "press play" porting experience of the Deck or API parity of modern DirectX on Xbox.

I wish the situation was better for these platforms, but they reap what they sow when they make highly exclusive SDKs and resist open specification.

> By the way, game studios don't have any issue translating DirectX to LibGNM/X and NVN.

Are there DirectX translators à la DXVK for GNM and NVN? As far as I'm aware, porting from DirectX has to be done by hand unless you're using an IR language like SPIR-V (at which point you may as well use native Vulkan).


I would advise spending some time at developer conferences like GDC, GDCE, and PAX.

The only people that don't like porting across APIs are usually indie devs in some FOSS forums; proper game studios have hardly any issues dealing with multiple backends.

Doing game engines with pluggable backends has several decades of industry experience behind it, going back to the Atari and ColecoVision.

Game IP, game design, and getting good publishing deals are what matter, not the 3D API du jour.

As for shaders, usually there is either an internal language, shader graphs, or choosing a specific one, with a translation layer in the middle.

There is no native Vulkan on PlayStation, and as far as the Switch is concerned, Vulkan and OpenGL aren't as used as FOSS folks think.


I'm sorry, but this post reads like someone a little too dyed-in-the-wool on Linux who hasn't worked in the video game industry.

Except for indies, the PS5 and Switch get a ton of high-profile games. Very few companies have issues porting over, and most will have their engines able to target multiple platforms.

Very few people use Vulkan on the Switch. It, like the PS5, has its own graphics API.

Very few games, outside of the few indies that make their own engines, target DirectX or a specific API. They use an intermediary HGI that abstracts over various backends so that they can target the wide range of console behaviour that exists, from APIs to console-specific features.

Thinking about PS5 development from the perspective of DXVK or SPIR-V is the wrong way to think about it. Higher level abstractions coupled with low level backends make it easy for any well architected engine.

Like the sibling comment says, please spend some time perusing the GDC vault or among professional game devs. Your world view on the matter is not representative of those communities. It is more representative of the external view, common within the Linux gaming community, that holds Vulkan on a pedestal.


Exactly. They’ll do what they need to do for any market they deem to have an adequate ROI.

I always point to Linux when people mention APIs being the issue. Linux gaming was depressing before Proton, despite having both Vulkan and up-to-date OpenGL. Devs could have supported them but didn't. So the API isn't the big reason people make it out to be.


Linux actually seems like the antithesis of your point. It has the lowest ROI of any of the platforms we've mentioned, yet the highest degree of compatibility with PC and console games outside Windows. If openness and API support isn't the issue, then why didn't DXVK get written for Apple platforms first?


Not the antithesis unless you’re purposely ignoring that nobody ports games to Linux.

Almost the entire Linux gaming scene is dependent on the fact that Valve wanted to make consoles, failed with the Steam Machine, and then figured the formula out with the Steam Deck. That's why DXVK exists, between funding and direct development. It was a high ROI for Valve to have their own platform. Nobody else cares.

Linux is not a target gaming platform. Even though it has native Vulkan and OpenGL, nobody targets it, and nobody targeted it before Proton either.


Hardly any game studio that targets Android bothers with desktop Linux, despite the similarity of both being Linux-kernel platforms, and despite the NDK having ISO C, ISO C++, OpenGL ES/Vulkan, OpenSL, and OpenMAX.

Not even Valve managed to change their mind in this regard.


Yep. I think if anything Valve actually enshrined the state of Linux-targeted gaming as forever translated, since Proton is so good. There's no impetus to even bother making Linux ports and dealing with support when people are willing to run translations and blame the lack of native support if it doesn't work well for some reason.

I imagine that's the reason Apple doesn't allow studios to ship with the Game Porting Toolkit. They likely want to prevent the eternal-translation solution, especially since their GPUs are so different from the originally targeted ones.


Their Game Porting Toolkit is mostly about macOS.

There are plenty of AAA studios with iOS games, regardless if you like their business model or not.


Excellent points on memory architecture. Though I'd posit that the non-base M-series GPUs are actually already equal to or higher performing than the fairly aged Radeons in a PS4/PS5, depending on the respective SKUs.

The wild cards will be translation overhead, differences in TBDR access, and thermal headroom.


It would be interesting to me to see an emulator that specifically required an APU, so it wouldn't have to design around the unshared memory pool of a discrete GPU.

Either an AMD APU or, if Rosetta sticks around, the Apple Silicon chips.


OG Xbox emulation took so long because most of its exclusives were also available on PC. Halo 1 & 2, for example. So there wasn't much interest. Developers focused on more interesting systems like the PS2 and the GameCube at the time.


> most of its exclusives were also available on PC. Halo 1 & 2, for example

While I agree with your point about most Xbox "exclusives" being available on PC as well, Halo 2 didn't arrive on PC (Windows Vista only, too, iirc) until 3 years after the original Xbox release, which was after the Xbox 360 was already released and in full swing. So I think there was a bit more to it than just lack of interest. Especially considering how massively popular Halo 2 was.


Consoles usually get working emulators well after their generation ends. Recent Nintendo consoles are an exception to that rule due to their low power compared to their competitors.

When Halo 2 was released, it was too early for a working emulator to be developed. Back then, PS2 and GC emulators also weren't working properly yet, though they were better than Xbox emulators.


Also, OG Xboxes were cheap and trivial to mod, including converting into debug consoles, so the homebrew community that often drives emulation didn't have as much reason to care. They could just install a debug BIOS, find a copy of the XDK floating around the ol' interwebs, and have basically the same toolset commercial game developers had.

Now that unmodified OG Xboxes are being irreparably damaged by failing clock capacitors, and the used market is drying up as a result, the people who still care about the platform have more reason to want a good emulator.



