Hacker News

>These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

>We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.

they did absolutely zero benchmarking beforehand, just went with industry hearsay, and decided to double it just in case.

Nowhere in that does it say “we did zero benchmarking and just went with hearsay”. Basing things on industry data is solid - looking at the Steam hardware surveys is a good way to figure out the variety of hardware in use without commissioning your own reports. Tech choices are no different.

Do you benchmark every single decision you make, on every system, on every project you work on? Do you check that a Redis operation is actually O(1), or do you rely on hearsay? Do you benchmark every single SQL query, every DTO, the overhead of the DI framework, the connection pooler, the JSON serializer, the log formatter? Do you ever rely on your own knowledge without verifying the assumptions? Of course you do - you’re human, and we have to make some baseline assumptions, and sometimes they’re wrong.


They made a decision based on existing data. This isn't as unreasonable as you are pretending, especially as PC hardware can be quite diverse.

You'd be surprised what some people are playing games on. e.g. I know people that still use Windows 7 on an AMD Bulldozer rig. Atypical for sure, but not unheard of.


i believe it. hell, i've been in F500 companies and virtually all of them had some legacy XP / Server 2000 / ancient Solaris box in there.

old stuff is common, and doubly so for a lot of the world, which ain't rich and ain't rockin new hardware


My PC now is 6 years old and I have no intention of upgrading it soon. My laptop is like 8 years old and it is fine for what I use it for. My monitors are like 10-12 years old (they are early 4k monitors) and they are still good enough. I am primarily using Linux now and the machine will probably last me to 2030 if not longer.

Pretending that this was an outrageous decision ignores that the data and the commonly assumed wisdom at the time said there were still a lot of people using HDDs.

They've since rectified this particular issue, and yet there seems to be more criticism of the company after it fixed the problem.


>they did absolutely zero benchmarking beforehand, just went with industry hearsay, a

https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence

It was a real issue in the past with hard drives and small media assets. It's still a real issue even with SSDs: random IOPS on both HDDs and SSDs are still way slower than contiguous reads when you're dealing with a massive number of files.

At the end of the day it requires testing, which takes time, at a point in development when you don't have much of it.


This is not a good invocation of Chesterton's Fence.

The Fence is a parable about understanding something that already exists before asking to remove it. If you cannot explain why it exists, you shouldn't ask to remove it.

In this case, it wasn't something that already existed in their game. It was something that they read, then followed (without truly understanding whether it applied to their game), and upon re-testing some time later, realized it wasn't needed and caused detrimental side-effects. So it's not Chesterton's Fence.

You could argue they followed a videogame industry practice to make a new product, which is reasonable. They just didn't question or test their assumption that they were within the parameters of said industry practice.

I don't think it's a terrible sin, mind you. We all take shortcuts sometimes.


It's not an issue with asynchronous filesystem IO. Again, async file IO should be the default for game engines. It doesn't take a genius to gather a list of assets to load and then wait for the whole list to finish rather than blocking on every tiny file.
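A minimal sketch of that "gather the list, wait for the batch" pattern in Python, using threads to keep the IO queue full (the function names and worker count are illustrative, not from any real engine):

```python
import concurrent.futures
import pathlib

def load_asset(path: pathlib.Path) -> bytes:
    # One blocking read; in a real engine this would also decode the asset.
    return path.read_bytes()

def load_assets(paths, workers=16):
    # Issue every read up front and wait for the whole batch to finish,
    # instead of blocking on each tiny file one at a time.
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(load_asset, paths))
```

With thousands of small files this keeps many requests in flight at once, which is exactly where per-file blocking hurts the most.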

Those are two different things: application behavior versus disk behavior.

>wait for the whole list to finish rather than blocking on every tiny file.

And this is the point. I can make a test that shows exactly what's going on here. Write a random file generator that creates 100,000 4k files and writes them to the hard drive, interleaved with other data and activity. In another run of the program, have it generate the same 100,000 4k files but pack them into a single zip.

Now read the 100k loose files from disk, and then read the 100k files out of the zip...

One finishes in less than a second and the other takes anywhere from a few seconds to a few minutes depending on your disk speeds.
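That experiment is easy to sketch. Here's a scaled-down version in Python (file count reduced from 100,000 so it runs quickly; on a warm OS cache the gap shrinks, while on a cold mechanical drive it is dramatic):

```python
import os
import pathlib
import tempfile
import time
import zipfile

N, SIZE = 1000, 4096  # scaled down from the 100,000 files in the comment
root = pathlib.Path(tempfile.mkdtemp())
loose = root / "loose"
loose.mkdir()
payload = os.urandom(SIZE)

# Write N small files individually, and the same data into one zip
# (zipfile defaults to ZIP_STORED, i.e. uncompressed, contiguous data).
for i in range(N):
    (loose / f"{i}.bin").write_bytes(payload)
with zipfile.ZipFile(root / "packed.zip", "w") as z:
    for i in range(N):
        z.writestr(f"{i}.bin", payload)

# Time reading the loose files: one open/read/close per tiny file.
t0 = time.perf_counter()
for i in range(N):
    (loose / f"{i}.bin").read_bytes()
t_loose = time.perf_counter() - t0

# Time reading the same data out of the single archive.
t0 = time.perf_counter()
with zipfile.ZipFile(root / "packed.zip") as z:
    for name in z.namelist():
        z.read(name)
t_zip = time.perf_counter() - t0

print(f"loose files: {t_loose:.3f}s  zip archive: {t_zip:.3f}s")
```

The absolute numbers depend heavily on the drive and cache state, but the per-file open/close overhead is what the archive amortizes away.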


"Industry hearsay" in this case was probably Sony telling game devs how awesome the PS5's custom SSD was gonna be, and nobody bothered to check their claims.

the industry hearsay here is the concern about HDD load times tho

HDD load times compared to......?

What are you talking about?

This has nothing to do with consoles, and only affects PC builds of the game


HD2 started as a PlayStation exclusive, and was retargeted mid-development for a simultaneous release.

So the PS5's SSD architecture was what developers were familiar with when they tried to figure out what changes would be needed to make the game work on PC.


If what they were familiar with was a good SSD, then they didn't need to do anything. I don't see how anything Sony said about their SSD would have affected things.

Maybe you're saying the hearsay was Sony exaggerating how bad hard drives are? But they didn't really do that, and the devs would already have experience with hard drives.


What Sony said about their SSD was that it enabled game developers to not duplicate assets like they did for rotating storage. One specific example I recall in Sony's presentation was the assets for a mailbox used in a Spider-Man game, with hundreds of copies of that mailbox duplicated on disk because the game divided Manhattan into chunks and tried to have all the assets for each chunk stored more or less contiguously.

If the Helldivers devs were influenced by what Sony said, they must have misinterpreted it and taken away an extremely exaggerated impression of how much on-disk duplication was being used for pre-SSD game development. But Sony did actually say quite a bit of directly relevant stuff on this particular matter when introducing the PS5.


Weird, since that's a benefit of any kind of SSD at all. The stuff their fancy implementation made possible was per-frame loading, not just convenient asset streaming.

But uh if the devs didn't realize that, I blame them. It's their job to know basics like that.


By far the most important thing about the PS5 SSD was the fact that it wasn't optional, and developers would no longer have to care about being able to run off mechanical drives. That has repercussions throughout the broader gaming industry because the game consoles are the lowest common denominator for game developers to target, and getting both Xbox and PlayStation to use SSDs was critical. From the perspective of PlayStation customers and developers, the introduction of the PS5 was the right time to talk about the benefits of SSDs generally.

Everything else about the PS5 SSD and storage subsystem was mere icing on the cake and/or snake oil.


Yeah, that's what I was trying to get at. Sony was extremely deceptive in how they marketed the PS5 to devs, and the Helldivers devs don't want to admit how completely they fell for it.

It's incompetence if they "fell for" such basic examples being presented in the wrong context. 5% of the blame can go to Sony, I guess, if that's what happened.

And on top of any potential confusion between normal SSDs and fancy SSDs, a mailbox is a super tiny asset, and the issue in the Spider-Man game is very rapidly cycling city blocks in and out of memory. That's totally different from Helldivers level loading.


I don't really understand your point. You're making a very definitive statement about how the PS5's SSD architecture is responsible for this issue - when the issue is on a totally different platform, where they had _already_ attempted (poorly, granted) to handle the different architectures.

No. Please try reading more carefully.


