Hacker News new | past | comments | ask | show | jobs | submit login
Fallout 76 Day One Patch Is Larger Than the Game Itself (hothardware.com)
275 points by xhrpost on Nov 14, 2018 | hide | past | favorite | 281 comments

The original version is apparently 45GB. I have no basis for saying this (I have never worked in the game industry), but my guess would be that this "patch" is really just the complete version, which happens to be 11GB larger (still significant, but far less dramatic than the implied 54GB of additional content).

Put another way, let's say the actual delta between original and "patch" version is 20GB (11GB new + some changed stuff). There's two ways to ship this:

(1) Create a delta patch file that can be applied on top of the 45GB base version (20GB)

(2) Repackage the entire thing into a complete install (54GB)

If you do only (1), anyone installing for the first time after the "patch" version is released is downloading an extra 11GB (45GB base + 20GB patch vs. the 54GB complete package) and going through a longer installation process.

If you are expecting that only a small percentage of your total install base has downloaded the base game (45GB), it also really doesn't make sense to produce both (1) and (2): You're incurring extra dev and testing time to produce a package that saves you a tiny fraction of your overall bandwidth usage. You also end up with an install base that is partially installed fresh (2) and partially patched (1), which means subsequent updates also have to test both scenarios -- and if you had a bug in applying the patch, you have an even harder time later trying to reconcile it for the same reason.

If you do only (2), which is what I suspect this is, then anyone downloading after its release just downloads the 54GB version. The small percentage of people with the initial version have to download an extra 30-ish GB, but it saves you (as the developer) a bunch of testing time and risk.
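The tradeoff can be sanity-checked with back-of-the-envelope arithmetic. A sketch, using only the assumed numbers from this comment (45/54/20GB) plus a made-up split of the install base, not anything Bethesda has published:

```python
# Back-of-the-envelope bandwidth comparison for shipping a game update as a
# delta patch vs. a full repackage. Sizes are the assumptions from the
# comment above; the install-base split is made up for illustration.

BASE_GB = 45      # original install size
FULL_GB = 54      # repackaged "patched" install size
DELTA_GB = 20     # assumed true delta (new + changed content)

def total_bandwidth_gb(existing_players, new_players, ship_delta, ship_full):
    """Total GB served, given how the update is distributed.

    existing_players: users who already have the 45 GB base install
    new_players:      users installing for the first time post-patch
    """
    total = 0.0
    # Existing players take the delta if offered, otherwise the full package.
    total += existing_players * (DELTA_GB if ship_delta else FULL_GB)
    # New players always need a complete install. Without a full repackage,
    # they download base + delta and sit through a longer install.
    total += new_players * (FULL_GB if ship_full else BASE_GB + DELTA_GB)
    return total

# Assume far more installs are still ahead than behind, as the comment does.
existing, new = 1_000_000, 9_000_000
only_full  = total_bandwidth_gb(existing, new, ship_delta=False, ship_full=True)
only_delta = total_bandwidth_gb(existing, new, ship_delta=True,  ship_full=False)
both       = total_bandwidth_gb(existing, new, ship_delta=True,  ship_full=True)

print(f"full repackage only: {only_full / 1e6:.0f} PB")   # 540 PB
print(f"delta patch only:    {only_delta / 1e6:.0f} PB")  # 605 PB
print(f"both packages:       {both / 1e6:.0f} PB")        # 506 PB
```

Under these assumptions, producing both packages saves only about 6% of total bandwidth over shipping just the full repackage -- the "tiny fraction" referred to above.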

More like the complete version happens to be 50 GB larger than the shipped on-disc data, judging from the 96 GB install size. The maximum size of a Blu-Ray is about 50 GB, so we're hitting the point where games are too big to even fit on a single disc and it's almost just acting as a dongle for the download version.

On Xbox and PC, the free space requirement is 60 GB; on PC the game folder winds up at 49 GB.

The PS4 needs 96 GB free because its digital installers work by downloading everything, then copying things to where they need to go, and then deleting the download. At 99.99% installed, there are two copies of the game on disk, and if there's not enough space for that, the install fails.

Red Dead Redemption 2 recently released and required two discs. One was data only, which was copied off. The other disc was data plus the playable disc. This is the first PS4 game I have seen do that, but I don't know if there are others.

Oh man this brings me back to 5 CD jrpgs.

We had to know it was inevitable.

Don't get us old guys started on the number of floppies...

Lemon lists 11 games for the Amiga that had more than 10 floppy disks. I owned 4 of them (Indy 4, Monkey Island 2, Flight of the Amazon Queen, and Beneath a Steel Sky).

Luckily I had a hard disk. I don't even remember that BaSS had 15 disks! That must have taken a whole evening to install that thing. But it took only 12 MB on the hard disk although that was 10% of the whole disk capacity. Times sure have changed.


Beneath A Steel Sky was Amazing.

Time to dust off the Amiga emulator :-)

The PC/Mac/Linux version with full audio is available for free: https://www.gog.com/game/beneath_a_steel_sky

Might as well mention that it can be downloaded for free here, along with a bunch of other games that run on ScummVM:


Oh god yes. Back before hard disks (Amiga A500+ in my case), trying to play an N-floppy game on a 1-floppy system was... character-building.

I remember Microsoft Space Simulator had something like 11 discs to install.

Yeah, but (presumably) at least you only had to install it once.

Still remember Windows 95 with 21 floppies.

In 2002 I decided to install Windows 95 on a black & white Win3.1-based laptop, just for the heck of it. It had a one or two hundred megabyte hard drive, but only a floppy drive for I/O, so it felt like metaphorically building a ship in a bottle. Unfortunately by the 2000s the quality of floppy disks was atrocious. I had to go through three boxes (about 75 disks) to come up with a set of 21 Windows 95 install disks that had no write or read errors.

Later I realized I could have done the job, and faster, using just two disks: one in the computer writing the disk images and one in the laptop reading the disk (and then swap disks & overwrite with the next image). D'oh!

oh man, the dreaded "click click, click click,...".

You think that's bad? Try Office 97 on 46 floppies:


(Yes, this was a real, official thing that was available from Microsoft)

The difference between Win 3.1 and Win 95 was really life changing. The usability of Win 95 certainly sped up adoption of the Web by non-university/govt users.

Early Linux distros downloaded via 14.4kbps modem for me. I... don’t even remember how many floppies that took. Lots as I recall.

My first Linux distro was SLS, sporting Linux kernel version 0.99pl12, and it came on 15 5.25in floppies. I only had Usenet at the time, so I wasn't able to download it directly, and had to pay some random person to mail me the floppies.

I was then setting up my own little TCP/IP network with a NFS server on my two machines, connected via coax Ethernet using NE2000 network cards.

Thank goodness for CD-ROMs. Only a few years after that, I was able to just go into a computer store and just buy a single CD with Yggdrasil (or RH or Slackware) which made the whole process much easier. As long as your system could boot off of CD-ROM... sometimes you still needed to make a boot disk.

I had a friend that only used two floppies... one installing on a new computer, the other downloading the next disk... (isdn line) It actually worked out pretty well.

My first modem was 2.4kbps. Thankfully google didn't exist back then

I installed this a few times! :D

I still remember the 4 floppies of Monkey Island AND the inside joke when you find a stump and the game starts asking for disk #22!

I remember autocad 12 + extensions on floppies... loading for a lab. Man that was the series of installs that just wouldn't end. IIRC win9x on floppy was almost as bad.

Win95 + Office, at least 50 floppies, and no FEC (forward error correction), so any one got damaged, and that's it.

I think Wing Commander IV was the most discs of any game I owned (6 IIRC).

Either Wing Commander II or Gabriel Knight was the most floppies (both around 12).

One of my first games was Baldur's Gate 2. That was 6 cdroms iirc.

Oh well, look on the bright side: it's better than no patches. Bethesda has a history of ignoring bugs. They let Skyrim on PS3 wither on the vine.

If I remember correctly, the original Phantasmagoria release that I still have came on something like 7 discs.

5 CDs, with live switching as you progressed in game. Still like that one although I'd not revisit it.

Items on eBay that look identical to the box I have say it's 7 CDs, but it's possible that the game itself only took 5 and the other 2 are some bonus content.

Probably will check it myself in a week once I visit my father.

Or I'm remembering poorly after ten years, most likely

Takes me back to Riven

I used to have an original edition The Sims 2. 5 discs to install as well. What a long way we've come.

Or the Gamecube version of Resident Evil 4.

Yeah iirc Call of Duty 2 had 5 discs on PC

Unreal Tournament 2004 was like 7 CDs. I remember having to buy a second hard drive for that game. Worth it.

Redbox sent me an email that had a picture of RDR2 "Disc 1" in the list of new games. I wasn't sure what that meant, and figured that data disc might have had a gimped version of the game with only one or two chapters.

Turns out, you have to rent both discs to play! One to install (which took like two hours for me, if not more), and one to play. I guess you only need to rent one disc after that, but they're still getting at least $6 out of anybody that wants to play.

I think GTA5 had something like this.

The GTA5 box holds 7 dual-layer DVDs: https://www.reddit.com/r/GrandTheftAutoV_PC/comments/31vdpb/...

(edit: for the PC version, that is; and even that might depend on the year you got the thing. Also found a Reddit thread with the box on display)

The PS4 version, which is a Blu-ray, only has one disc.

> More like the complete version happens to be 50 GB larger than the shipped on-disc data, judging from the 96 GB install size

Compressed distributions that are expanded at install time have been in use at least since floppies were the main media for distributing software that installs to fixed disks.

Distribution media (and the fixed disks they install to) may be on the order of 5 orders of magnitude larger now, but that hasn't changed.

I don't think that's as true as it once was - the majority of that 96GB is almost certainly media of some kind - images, textures, music, etc. - they don't losslessly compress well at all.

I thought disk I/O was a major blocker for loading in games, which would imply you'd want to keep things compressed on disk.

Additionally, as another commenter pointed out, the majority of assets are in formats that support native compression (textures and audio) and thus won't compress too well a 2nd time.

BD-XL will hit 128GB, but it's unclear to me if those can be pressed, or are only available as writable discs.


The Wikipedia article seems to imply that BDXL (if the hardware even supports it) uses additional layers to reach 100GB(base10) for a quad layer disc.

There is Archival Disc, which is reader-compatible with Blu-ray and stores up to 1TB. Written by a 405nm laser like the original BD-R.

I can't find info about it being "reader-compatible". According to a brief note in wikipedia, it has a different metadata format from blu-ray, so would it at least need different firmware in the drive?

You can ignore ATIP if you just read it.

It's always amazed me how downright inefficient patching is, even though better algorithms have existed for literally decades. The typical implementation makes you re-download every single resource that's changed, even if the resource is gigabytes large and only a single bit in it has changed.

Someone should calculate how many petabytes of Internet transfer capacity are wasted annually solely due to bad algorithms used in performing software updates. It'd be a surprisingly large figure.
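For reference, the decades-old "better algorithm" is essentially rsync's: index the old file's blocks by a cheap weak checksum plus a strong hash, then scan the new file byte by byte so matching blocks are found even after insertions shift everything. A toy sketch (tiny blocks, naive search; real tools like rsync, xdelta, or bsdiff are far more refined):

```python
# Toy rsync-style delta: find which fixed-size blocks of the new file already
# exist anywhere in the old file, so only the truly new bytes travel the wire.
# The weak checksum is recomputed here for simplicity; the real trick is
# updating it incrementally as the window slides ("rolling").
import hashlib

BLOCK = 4  # absurdly small block size, just for demonstration

def weak_sum(data):
    # Adler-32-style weak checksum: cheap, and rollable byte by byte.
    a = sum(data) % 65521
    b = sum((len(data) - i) * byte for i, byte in enumerate(data)) % 65521
    return (b << 16) | a

def make_delta(old, new):
    """Return a list of ('copy', old_offset) / ('literal', bytes) ops."""
    # Index every aligned block of the old file; the strong hash guards
    # against weak-checksum collisions.
    index = {}
    for off in range(0, len(old) - BLOCK + 1, BLOCK):
        block = old[off:off + BLOCK]
        index.setdefault(weak_sum(block), []).append(
            (off, hashlib.sha1(block).digest()))
    ops, literal, i = [], bytearray(), 0
    while i + BLOCK <= len(new):
        block = new[i:i + BLOCK]
        hit = next((off for off, strong in index.get(weak_sum(block), [])
                    if strong == hashlib.sha1(block).digest()), None)
        if hit is not None:
            if literal:
                ops.append(('literal', bytes(literal)))
                literal = bytearray()
            ops.append(('copy', hit))
            i += BLOCK
        else:
            literal.append(new[i])  # no match at this offset; slide one byte
            i += 1
    literal.extend(new[i:])
    if literal:
        ops.append(('literal', bytes(literal)))
    return ops

def apply_delta(old, ops):
    out = bytearray()
    for kind, arg in ops:
        out.extend(old[arg:arg + BLOCK] if kind == 'copy' else arg)
    return bytes(out)

old = b"the quick brown fox jumps over the lazy dog!"
new = b"XX" + old[:20] + b"PATCH" + old[20:]   # insertions shift everything
ops = make_delta(old, new)
assert apply_delta(old, ops) == new
literal_bytes = sum(len(a) for k, a in ops if k == 'literal')
print(f"{literal_bytes} literal bytes instead of {len(new)}")  # 7 instead of 51
```

Here a 51-byte file with two insertions costs only 7 literal bytes plus copy instructions, instead of a full re-download; scale the block size up and the same idea works on gigabyte resources.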

One important issue is that if the patch program goes wrong, you can easily end up with customers having broken installations that you can't fix. The patching software is how you distribute updates, so it has to work. So there's strong incentive to deploy simple techniques with fewer failure modes as opposed to optimizing away every last byte of network traffic.

Firefox has historically shied away from complicated patch algorithms for this reason among others. (To be fair, of course, we're also dealing with several orders of magnitude less data than AAA games are...)

If the patch doesn't apply cleanly, don't apply it. The command-line `patch` utility works just fine for this. Transactional updates with rollback on failure are a solved problem.
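A minimal sketch of that transactional idea for a single file (names are made up; the key point is that `os.replace` is atomic on both POSIX and Windows, so a crash or failed verification leaves the old file untouched):

```python
# Transactional "patch or leave untouched" update for one file: build the
# patched result in a temp file, verify it against a known-good hash, then
# atomically rename it into place. A failure at any step keeps the original.
import hashlib
import os
import tempfile

def apply_patch_transactionally(path, patch_fn, expected_sha256):
    """Apply patch_fn to the file at path, or leave it completely untouched."""
    with open(path, 'rb') as f:
        original = f.read()
    patched = patch_fn(original)              # however the delta is applied
    if hashlib.sha256(patched).hexdigest() != expected_sha256:
        raise ValueError("patched result failed verification; original kept")
    # Temp file in the same directory, so the rename stays on one filesystem.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(path)))
    try:
        with os.fdopen(fd, 'wb') as f:
            f.write(patched)
            f.flush()
            os.fsync(f.fileno())              # ensure bytes are on disk
        os.replace(tmp, path)                 # atomic: old file or new, never half
    except BaseException:
        os.unlink(tmp)                        # roll back: discard the temp file
        raise
```

This is single-file; a real game patcher has to make the same guarantee across thousands of files, which is where the complexity (and the caution described above) comes from.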

Is patch generally applied to resources in the multiple gigabyte range, very possibly with the original resource size exceeding the total amount of installed RAM in the system?

Patching small to moderate sized resources transactionally might be a solved problem, but I imagine doing so on large resources in an acceptable time frame may not be for this use case.

> Is patch generally applied to resources in the multiple gigabyte range, very possibly with the original resource size exceeding the total amount of installed RAM in the system?

That's where bundling all your data files into a single large archive bites you in the butt. Individual files the game uses ain't gonna be that large.

The reason games have such large files is that the fastest way of loading a game is to do a serial read of the data.

Sure, but now we're talking about game developers making choices based on distribution and patching instead of what they would likely view as more important considerations, such as performance and ease of development.

Perhaps they could break things apart prior to shipping, but I imagine that could cause a lot of QA headaches. Sure, it should all work perfectly fine if a solution transparently supports both combined and split formats, but who wants to bet the ship date of a multi-hundred-million-dollar project on that?

Welcome to the launch of the Xbox One.

> you can easily end up with customers having broken installations that you can't fix

I sometimes wonder if games were distributed on cartridge again if they would be more thoroughly tested before being released.

Plus, they'd load a brazillion times faster.

TBH, I think that cartridges in this case could be replaced by USB thumb drive keys... effectively a 128GB USB drive with a hardware key, and maybe even an embedded OS. Updates download and apply into the Key. Take it with you, run on any common x86 system, it could be great.

Consumer expectations of scope, game length, release time tables, DLC, etc. have made this an impossibility.

SteamPipe breaks all bits of content up into ~1MiB chunks, and then fetches those chunks. As long as you align your packfiles to 1MB boundaries, and keep your content sorted, only new chunks will be downloaded. Unfortunately, a lot of developers don't follow these guidelines for a number of reasons, and basically generate a whole new set of chunks with every update.

This includes a game I work on, because it's built in a version of Unreal where we do not want to drastically change the packfile format and modify the build process to handle this case.
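The alignment point can be demonstrated with a toy version of fixed-size chunking (1 KiB chunks standing in for SteamPipe's roughly 1 MiB ones):

```python
# Why fixed chunking punishes unaligned changes: chunks are just fixed-size
# slices, so inserting bytes near the front shifts every later slice and
# changes every chunk hash. Padding the change out to a chunk boundary
# leaves all the old chunks reusable by clients that already have them.
import hashlib
import random

CHUNK = 1024  # 1 KiB chunks here; SteamPipe uses roughly 1 MiB

def chunk_hashes(data):
    return [hashlib.sha1(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def reusable_chunks(old, new):
    """How many chunks of `new` a client holding `old` already has."""
    old_set = set(chunk_hashes(old))
    return sum(1 for h in chunk_hashes(new) if h in old_set)

random.seed(0)  # deterministic "packfile" of 16 distinct chunks
pack = bytes(random.randrange(256) for _ in range(16 * CHUNK))

# Unaligned edit: one byte inserted at the front shifts every later slice,
# so no old chunk hash matches and the whole file is re-downloaded.
shifted = b"\x00" + pack

# Aligned edit: the change is padded out to a full chunk boundary, so all
# 16 original chunks survive and only the one new chunk must be fetched.
aligned = pack[:CHUNK] + bytes(CHUNK) + pack[CHUNK:]

print(reusable_chunks(pack, shifted), reusable_chunks(pack, aligned))  # 0 16
```

Same logical change in both cases; the aligned layout turns a full re-download into a one-chunk fetch, which is exactly the guideline developers skip when every update regenerates the whole chunk set.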

Do you know where I could find details about how to optimise for SteamPipe?

Only companies that are bad at tech implement it this way.

Bethesda is really bad at tech. They use an ancient engine that binds physics calculations to frame rate and can't go over 60 FPS. They warned Fallout 76 fans during the beta that they should not click the "Update" button (big download, game screwed). They trust the game client 100%: you can hack/cheat in this multiplayer game by changing INI files with a text editor.

Basic idea:

If the file is smaller than 10x the base patch unit size: don't bother; just treat the whole file as a single unit and send it all for each update.

Else: create a manifest of the file: size on disk, checksum of the file, and name of the file mapping to one or more lists of segments within the file, by segment offset, segment size, and segment checksum. A custom list might be provided by a resource packer that is aware of assets/code segments within the file.

A website will provide an interface that has a list of file names to target size/hashes.

There will also be a list by size/hash that enumerates the segment lists within the file.

To update, the file size/checksum would be computed and the manifests obtained from the server; these would be compared to the files on disk and invalid segments/files discarded.

The target version would then be selected and the target manifests collected. First limited by filename and then optionally project wide, segments would be matched first checking size then checksum. There are a couple different methods that might then be optimal depending on the desired outcome, which depend on disk space, operating system support for hard-linking/de-duplicating subsegments of a file, if the media is known to be an SSD or spinning rust, etc.

However, in any event, that infrastructure allows for easily checking and upgrading / repairing to any version from any version at the expense of storing a small quantity of metadata and having full copies via the actual delivery service for segments on file. The backend might also have de-duplication of the stored data as a bonus.

This is probably a very naive implementation of compression via de-duplication and should be obvious to anyone skilled in this form of art.
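A minimal sketch of the manifest idea above, with fixed-size segments and SHA-256 checksums standing in for the packer-provided segment lists:

```python
# Segment-manifest update scheme: describe each file as (offset, size,
# checksum) segments; to update, compare the local file's segments against
# the target manifest and fetch only the segments that don't match.
import hashlib

SEG = 1024  # fixed segment size stands in for packer-aware segment lists

def build_manifest(data):
    segs = [{'offset': off,
             'size': len(data[off:off + SEG]),
             'sha256': hashlib.sha256(data[off:off + SEG]).hexdigest()}
            for off in range(0, len(data), SEG)]
    return {'size': len(data),
            'sha256': hashlib.sha256(data).hexdigest(),
            'segments': segs}

def segments_to_fetch(local_data, target):
    """Compare local state against the target manifest; return needed segments."""
    local = build_manifest(local_data)
    if local['sha256'] == target['sha256']:
        return []                                  # already at target version
    have = {s['sha256'] for s in local['segments']}
    return [s for s in target['segments'] if s['sha256'] not in have]

v1 = bytes(i % 251 for i in range(8 * SEG))        # 8-segment "file"
v2 = v1[:3 * SEG] + b"\xff" * SEG + v1[4 * SEG:]   # one segment replaced
target = build_manifest(v2)
todo = segments_to_fetch(v1, target)
print(f"need {len(todo)} of {len(target['segments'])} segments")  # 1 of 8
```

Because the comparison is by checksum rather than by version number, the same machinery repairs a corrupted install and upgrades from any version to any version, as the comment describes.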

> It'd be a surprisingly large figure.

It would be a rather large figure, but I'd wager that superfluous video streaming (eg, video advertising, or youtube/netflix autoplaying to no audience) overwhelmingly dwarfs any bandwidth wasted on suboptimal update delivery.

Among others, there's the issue that patching may be a seek-heavy operation. Downloading a 40 GB file over a fiber connection is much faster than merging 8 GB of new textures into a 32 GB file on a console that has an HDD.

When downloading from a fiber connection to a PS4 HDD, the bottleneck is likely the HDD's speed. To merge a file you would need to at least read it and write it again, which is at least twice as slow as simply downloading it. Alternatively you can pay the cost by seeking constantly instead (like a texture per file), which may be even slower and probably sucks at runtime.
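Rough arithmetic makes the point concrete. The throughput numbers below are assumptions (a console-class HDD around 100 MB/s sequential, a gigabit fiber line around 125 MB/s), not measured figures:

```python
# Why merging can lose to redownloading on an HDD: the merge must read the
# old file and write the merged result through the same disk. Throughput
# numbers are assumptions, and real seek overhead would make the merge worse.
HDD_MBPS = 100   # assumed HDD sequential read/write, MB/s
NET_MBPS = 125   # assumed ~1 Gbit/s fiber download, MB/s

# Redownload: stream 40 GB straight to disk, bottlenecked by the slower leg.
redownload_s = 40_000 / min(HDD_MBPS, NET_MBPS)

# Merge: read the old 32 GB file and write the new 40 GB result (the 8 GB
# patch download can overlap with this, so disk traffic dominates).
merge_s = (32_000 + 40_000) / HDD_MBPS

print(f"redownload ~{redownload_s / 60:.0f} min, merge ~{merge_s / 60:.0f} min")
# redownload ~7 min, merge ~12 min
```

Even with zero seek penalty, the merge moves 72 GB through the disk versus 40 GB for the plain download; on a slower connection the balance tips back toward patching.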

Only a quarter of the US even has the option of fiber, much less actually have it. It's silly to use ideal fiber transfer rates as the argument for downloading whole-hog vs more targeted patching, and I doubt it's the reasoning they use.

And don't forget that Sony and Microsoft need the infrastructure in place for handling millions of people patching the game at the same time...

If it's one 32 GB file, there's already something wrong...

I've been pretty surprised at how good a job steam does at this. When I release an update for my game, the download size is typically quite tiny relative to all the files I've touched.

Steam itself, however, seems to download a full installation when it updates itself. Which is interesting.

If a game update goes wrong, they can just send down another update. If a Steam update goes wrong...

(This has sort of happened. Valve managed to roll out a version of the launcher for Linux a while back which used instructions not available on older CPUs, and crashed before it reached the point where it could check for updates. Was a pain. Normally, if Steam exits uncleanly they force an update check immediately on next restart, but if it never gets that far you're screwed.)

Self-updating on Windows is a horrible thing to get right. I wouldn't be surprised if they did this to keep the complexity of the process as low as possible.

The pirates seem to have mastered the art of compressing assets to create distribution packages of manageable size. Maybe legit publishers need to learn a few lessons from them.

That's because pirates care more about user experience and know more about digital packaging than the publishers do. People getting content from pirates can just switch to another source at any time if one pirate does a bad job. People getting content from official publishers are stuck with whatever they put out.

There was one game (I think one of the really old Unreal Tournament games) that had all the sound files as .wav files, which made the game huge. Many people were still on dialup at the time, so pirating it would have taken over a week to download it.

Well, one pirate group compressed all the .wav files into MP3s and wrote a quick batch script to decompress them all back into the .wav's. Turned a 700 megabyte download into a little under 100 megabytes.

Jeez that takes me back. CORE, CLS... Some games were compressed with UHARC or ACE which performed better than other compressors at the time. mp3unpack or WAVE injector would also run to decompress the mp3s or recombine them back in to large solid files. All that while having some banging chip tune and a snazzy animation play. CLASS!

At that time it might also have been illegal to distribute an mp3 decoder without paying royalties.

Apple has had delta app updating since iOS 6 and I know that Steam also does this. The real issue becomes a combination of, 1. How fast is your internet connection. 2. How fast is your HDD to read the newly downloaded data and patch the existing file. (When I used to have my steam games/windows on an HDD this took forever.)

Is there a way for public users to rsync (or whatever flavor of "better algorithm" you like) against traditional CDNs like Akamai?

zchunk, zsync, and casync spring to mind. zchunk and zsync require support for range queries but casync splits the chunks into multiple files. https://github.com/zchunk/zchunk http://zsync.moria.org.uk/ https://github.com/systemd/casync

The linux RPM world has "delta RPMs" which lets you ship much smaller packages when updating. I don't see any blocker to delivering these through a CDN. Not sure if this is quite what you were asking.

I believe the Debian world also has jigdo ("Jigsaw Download")

The closest you'd come to such a setup is a squid proxy with some prefetching (besides bittorrent, which CDNs don't support until IPFS is a first class citizen).

Internet transfer capacity is not wasted unless it goes unused (or data transferred caused other data that was needed urgently to be queued). For example, you could say any links not currently near full utilization are going to waste.

If a hard drive doesn’t need to spin up because the patch you’re serving a zillion copies of fits in cache, you’re saving the power to run it and the heat it would have generated.

Similarly with all the other resources that would be used on its journey between the data center and the end user. You’re wasting time if you could send an efficiently-generated 2 gig patch instead of a 60 gig file, especially for users in remote places. You’re taking up more space and cycles on your CDN. CPUs switch from idle power consumption to active. Every stage you can think of takes some amount of power and other resources, and if you’re delivering an unnecessarily huge file, you’re wasting everything it took to run all the steps along the way.

Human time is the most valuable resource compared to the various kinds of waste you mention (cycles on CDNs, power, heat, etc.).

And a lot of human time is being wasted waiting for update operations that take much longer than necessary.

Yes, the oft repeated mantra in this industry. It's a cute thing to say when it's about saving your (developer) time by externalizing energy use on other people.

The world is awash in energy. Time is fleeting. Sort of a cheap shot to call it "cute" when it's factually accurate. I'm sure your time is worth more than overly optimizing every technical process to save a few kWh of power.

A few kWh of power times a million users. I'm not complaining about individuals buying time by paying for energy. I'm talking about saving little time for yourself by burning a disproportionate amount of power, distributed among all your users.

11 GB isn't some bug fix; this is asset overhauls, be it models, textures, audio, etc. This is a result of Bethesda overworking employees and crunching like there's no tomorrow. The download is this size because they aren't technically capable of making it smaller, and they just wrongly assume everyone has the network capacity to download it. It would take 3 days on my non-fibre internet to download, though, so this is a purchase I won't be making.

It's an online only game. If this would really take you 3 days to download, you wouldn't be able to play it anyways...

> It's an online only game. If this would really take you 3 days to download, you wouldn't be able to play it anyways...

(54 GB) / (3 days) = 1.66666667 megabits per second

To really drive this point home, even the FCC agrees with the above poster:


Rule of thumb with ANY Bethesda game is just wait 6-12 months after release, and MAYBE it won't delete your saved game. No one ever learns with these guys. Cool games, horribly buggy for most of its early life.

They patch some bugs[0], but they don't patch the shallow characters and mediocre writing.

I played Fallout 4 and Witcher 3 side by side. By the end of Fallout 4, I realized that I had been mostly playing the same game for 10 years (all of Todd Howard's games since TES4[1]). I almost stopped playing near the end. Witcher 3 is one of the best games I've ever played. Todd Howard has lots of work to do if I'm going to get another one of his games.

I'm watching Cyberpunk 2077 intently.

[0] They leave the "entertaining" ones in. It's usually up to modders to fix what they can. https://www.escapistmagazine.com/news/view/112719-Bethesda-S...

[1] https://en.wikipedia.org/wiki/Todd_Howard#Works

Though not a Bethesda-developed game, New Vegas is one of the best-written games, coming of course from Chris Avellone. I have rarely felt that immersed in any other RPG. Its open world felt full of possibilities, excitement, and danger, unlike most other open worlds, which are mostly empty, sparsely populated with self-similar bland features.

On the subject of writing in RPGs, I will dock some points from Larian. I recently tried Divinity: Original Sin, and the setting and the writing were so generic.

>They patch some bugs[0], but they don't patch the shallow characters and mediocre writing.

They solved this in Fallout 76. Can't be accused of having shallow characters if you put no characters in the game!

Many people called them robots. They solved that one in the last game: http://fallout.wikia.com/wiki/Synth

What did you think of Fallout 3 and New Vegas in terms of writing? It's been a long time, but I found them very entertaining. Different league than Fallout 4. I'm surprised it's the same creator.

Then again, there was a large gap between trying Fallout 3/NV and Fallout 4, so maybe my perception changed.

> Different league than Fallout 4. I'm surprised it's the same creator.

Not quite. Fallout 3 was all Bethsoft, and Obsidian mostly did Vegas.

Fallout 3 has no big story twists, and clear good guys and bad guys. The main quest is entirely linear, with a large spectacle near the end, ending with a pathetic boss fight. Most everyone (except Dad and maybe 3 Dog) comes off as one dimensional with only one purpose in life. There are some cool side stories and places (a cave filled with orphans, and a mad scientist's escapist simulation), but I recall them being static for the most part.

New Vegas is complicated and more gray in comparison. Who's good or bad is murky. I don't recall any big story twists, but the characters had depth. Take for example, Caesar: I doubt Bethesda would have thought to give him a tumor. There's lots of powerful factions, which not only have stories, you significantly participate in them. The main story branches off into about 4 very different endings, something that Bethesda didn't do until F4.

Back in the day, I watched the Zero Punctuation review of New Vegas[0]. I remember literally everything he talks about! I drank out of that toilet!

[0] https://www.youtube.com/watch?v=7XsACXSVhWY

> but they don't patch the shallow characters and mediocre writing.

One of my reactions to Fallout 3 was that, unlike the original Fallouts, it wasn't funny.

F3 was made by Bethesda. New Vegas - by Obsidian. Not the same studio.

I cried a little when that Cyberpunk demo video was released. Beyond hyped for that game. CD Projekt is a top echelon developer.

Almost all of Bethesda's software problems originate from their insistence on sticking with their old Creation Engine. The engine has been falling to pieces since Skyrim. I gave up trying to get Fallout 4 to run acceptably on high-end modern hardware and refunded it. It looks like Fallout 76 literally just gives you a stretched picture if you try to run it on a 21:9 monitor (which a very substantial proportion of high-end PC gamers have in late 2018).

Fallout 4, Fallout 76, and any subsequent Bethesda games based on this engine are automatic no-buys for me at this point. I sincerely hope Fallout 76 sells poorly (relative to other Bethesda titles) so that they get the message where it hurts.

There is a recent post in the Elder Scrolls subreddit https://www.reddit.com/r/ElderScrolls/comments/9wmm98/bethes... discussing the Bethesda game engine and the following quote from Todd Howard was posted:

”I think most people that aren't making games use the word 'engine', you know, they think of 'engine' as one thing, and it's, we view it as technology, right? so there are lots of pieces, and every game, parts of that change. Whether it's the renderer, the animation system, the scripting language, the AI, the controls... so, some people talk about Gamebryo but that's, like, we haven't used that in a decade.”

”And a lot of it is, some of it is middleware, whether that's Havok animation here, and, so 76, we changed a lot of it. You know, it's an all new renderer, new lighting model, new landscape system, and then, when you go to Starfield, even more of it changes. And then Elder Scrolls 6 which is really out in the horizon, even more of that will change there.”

The quote is translated from a video interview in German here:


Here is an English translation of the video interview:


On the other hand there are players that find bugs from Skyrim in Fallout 4. The 3D engine still binds physics to FPS in Fallout 76 and can't go over 60.

I mean this is a company that currently releases a multiplayer game without any anti-cheat, anti-tamper technology. Cheating in Fallout 76 will be as hard as locating the config files and changing them with your text editor.

Todd is definitely right about how much has changed in their "engine" and how little the layperson understands about how it works under the hood. That said, there are some classic "Bethesda bugs" that have managed to hang around for a long time. For example, every one of their games going back to at least Oblivion has shipped with vsync on, and would have weird issues (typically with physics) when it was disabled and the framerate went above 60 fps.

This is still present in Skyrim and Fallout. If you use a 144Hz monitor you have to change the config file. I heard in Fallout 76 it is still present.

It would be a start if they automatically adjusted the value based on the refresh rate of your monitor.

Yes 'Engine' is an all-encompassing concept, taken from the car where of course the engine is made up of many components. It drives the vehicle forward, but the drivers don't see it or know how it works. Until something goes wrong.

I hope they're working on a new director component for their content creators, because Cyberpunk 2077's level of story scripting + animation + staging + etc. seems to be generating results that Bethesda games are _well_ short of.

People point at the right speech moments, look disappointed appropriately, a timer then fires for a story option that's only open for five seconds. It looks so much more _alive_ than a Bethesda game.

There are hints of it back in Witcher III if you know where to look, but I'm super excited for 2077.

> Almost all of Bethesda's software problems originate from their insistence on sticking with their old creation engine.

I'd expect to read that on /r/games but not on HN :-(. Engines are source code, they can be fixed and many are fixed (some even descend from codebases older than the NetImmerse/Gamebryo/Creation engine). The issue is that Bethesda doesn't fix their engine, not that their codebase's history goes back more years than whatever time delta the marketing teams of mainstream companies have convinced the gaming public is supposed to be "good" by arbitrarily changing the name or version of their engines.

Unless we're talking about a brand new studio or tiny developer, no engine is being made from scratch anymore - especially in the AAA space. Even id Tech 6 traces its lineage to Quake 1 in 1996 (Carmack said at some point in the mid-90s that he starts new games from scratch, but that was up to Quake 1 and since then every new engine is an improvement to the existing codebase). Same with Unreal, LithTech (latest version being LithTech Firebird and used in Shadow of War), RAGE (which is based on earlier AGE from before Rockstar bought Angel Studios), Serious Engine and of course NetImmerse/Gamebryo as well as a bunch of others (and note here that i'm only mentioning engines that started in the 90s - if we go forward a little bit, a ton of "modern" engines started in early 2000s).

The framerate bug everyone mentions? By keeping the fixed timestep that they already have with their framerate capping, uncapping the framerate, decoupling the game updates so they are still called at ~60Hz (or whatever they want), and interpolating the visual state between update states, they'd have the current best approach for 60+ Hz displays while barely touching most of their systems (mainly the rendering code), let alone rewriting everything from scratch.
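What's described here is the standard fixed-timestep loop with render interpolation. A minimal sketch in Python, where `game`, `renderer`, and `clock` are hypothetical stand-ins (an illustration of the technique, not Creation Engine code):

```python
# Fixed-timestep game loop with render interpolation (illustrative).
# The simulation always advances in fixed 1/60 s steps regardless of
# the (uncapped) render framerate; the renderer blends between the
# two most recent simulation states so motion stays smooth at any Hz.

UPDATE_DT = 1.0 / 60.0  # fixed simulation step, in seconds

def run(game, renderer, clock):
    accumulator = 0.0
    prev_state = curr_state = game.initial_state()
    while game.running:
        accumulator += clock.elapsed_since_last_call()
        # Run however many fixed updates the elapsed wall time demands.
        while accumulator >= UPDATE_DT:
            prev_state = curr_state
            curr_state = game.update(curr_state, UPDATE_DT)
            accumulator -= UPDATE_DT
        # Blend factor in [0, 1): how far we are between two updates.
        alpha = accumulator / UPDATE_DT
        renderer.draw(game.interpolate(prev_state, curr_state, alpha))
```

Because `game.update` always receives the same `dt`, physics and game speed stay identical at 30, 60, or 144 FPS; only the rendering path has to change.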

(as a sidenote i always find it funny when people think that Bethesda will make a brand new polished and bugfree engine when they do not give enough time to their programmers to fix issues in their existing engine that are often minor)

> which a very substantial proportion of high-end PC gamers have in late 2018

According to Steam's hardware survey, only a combined 1.28% of users have an ultra-wide monitor as of October 2018. For perspective, that is almost half of the combined 2.47% using non-widescreen resolutions, and more than twice as many users are on 1280x1024 as on 2560x1080 (the most common ultra-wide monitor resolution).

I mean, considering 160 million active users (or so, last time i checked) this is still an estimated ~2 million users, but that is only a tiny bit higher than the percentage of Linux users (0.72%) and less than half the percentage of Mac users (2.84%) that Bethesda is completely ignoring - so i don't exactly expect them to put any effort towards a tiny percentage of ultra-wide monitors either (and if we consider porting studios like Feral that handle everything themselves and only need a code dump from the original developer, ultra-wide support might actually need more work from Bethesda's side than making the game available on Linux and Mac).
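As a back-of-the-envelope check of those figures (the survey percentages and the ~160 million active-user count are taken from the comment, not independently verified):

```python
# Rough user-count estimates from the Steam survey percentages quoted above.
active_users = 160_000_000

ultrawide_users = active_users * 1.28 / 100  # ~2.05 million
linux_users     = active_users * 0.72 / 100  # ~1.15 million
mac_users       = active_users * 2.84 / 100  # ~4.54 million
```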

Not that i excuse their lack of support for ultra-widescreen, mind you (especially considering that it is trivial to support if you also don't screw up FOV tweaking - which, btw, Bethesda also does). But the proportion of PC gamers using ultra-widescreen isn't really significant enough to affect their decisions.

I don't think the points about the engine are very compelling. Just because something has a number of somewhat independent pieces doesn't mean it's also not a monolithic whole at the same time. That's why it's still called an engine, and that's why even Bethesda refers to it as such. That's also why they won't move away from it -- because there's an entire toolset built on top of it that every component has to interoperate with. It's the same idea with a car -- just because it has a separate transmission doesn't mean it's not a 1990 Honda Accord at the end of the day.

That said, I will admit that my language could have been more precise. It's less that they won't "move away" from their old engine, and more that they won't invest properly in updating it for modern times. Anyone who has played Oblivion can attest that, even if they've swapped out the terrain system or the lighting effects system, their new games are just reskinned copies of Oblivion, complete with all the same bugs, quirks, idiosyncrasies, and lack of decent PC support that's always been there.

In the past Bethesda has relied on modders dealing with things like supporting modern monitors. Now that's not possible with Fallout 76, so the poor PC support becomes much more pronounced.

For what it's worth, I think it's hard to draw too many conclusions from the Steam hardware survey, because that's including everyone with Steam. Instead of asking how many Steam accounts have a 21:9 resolution monitor, it would be better to ask how many potential PC buyers of Fallout 76 have a 21:9 resolution monitor, and I suspect the number is higher than 2 million. You have to keep in mind many of those 1080p 16:9 folks have Steam on an old desktop or a laptop that couldn't even run Fallout 76 decently -- there are plenty of 2D indie games that would still justify having Steam installed.

I didn't really argue about the engine being monolithic or not though, i argued that, in your own words:

> they won't invest properly in updating it for modern times

This is very different from "moving away" from their old engine and isn't just a matter of language precision: one requires rewriting everything from scratch in a multiyear project that is a big undertaking on its own (with results that are very likely to not be as stable and bug free as people would imagine) that practically no company with an existing codebase does anymore (especially at the stage and size of Bethesda), whereas the other requires fixing a few subsystems that are already there and a tiny fraction of their existing codebase that can be done as part of their updates between games. It is certainly a much smaller change than the renderer updates they've done so far.

About the Steam hardware survey: every PC gamer has Steam installed (there might be a few who do not, but they either only play one or two games, like Battlefield or somesuch, so they do not really matter as far as the PC audience goes, or they are so few that they are statistically insignificant - for all intents and purposes Steam has a monopoly on the market, and both developers and publishers treat every statistic it provides as pretty much The Truth). With that in mind, PC buyers of Fallout 76 - or any other AAA game - are inside this audience, either as a whole or as a subset, so this 2 million is actually the best case. In practice it is most likely lower, because not every single one of those 2m gamers would be interested in Fallout 76 (it isn't like having an ultrawide monitor is some sort of requirement to like the game :-P).

Bethesda is possibly the only AAA developer with this problem. What do you think all the other game developers in the world are doing? How exactly did RDR2 ever get made? How did BF V ever get made? How did AC: Odyssey ever get made? How did Nintendo ever make Breath of the Wild or Mario Odyssey? How did CD Projekt ever manage to make TW3? It's easy to defend Bethesda here until you play pretty much any other full-priced AAA $60 modern game. None of these games are one-off projects -- every single one is from a long-running franchise going back many years if not decades.

Not every PC gamer is in the potential target audience of Fallout 76. That doesn't increase the numerator -- it dramatically decreases the denominator, making those 2 million a much bigger percentage.

Either way, I feel vindicated with Fallout 76 now being the lowest rated Fallout game in history, selling >80% fewer copies than Fallout 4, etc. Hopefully Bethesda learns a lesson from this and starts putting in the same effort as their peers in the industry.

I'm not sure why you think i defend Bethesda... i wrote in my original message above that "The issue is that Bethesda doesn't fix their engine", which is also why i wrote that "in your own words > they won't invest properly in updating it for modern times" in my second message. My argument was really the part i originally quoted, specifically this:

> their insistence on sticking with their old creation engine

...because it is a common thing to say about Bethesda. But the reality is that they can stick with their engine just fine - they just need to fix their bugs. The problem isn't that they keep using their engine, everyone does (as you already wrote in the first paragraph here), the problem is that they do not give their programmers the necessary time to fix the issues with their engine.

The reason they're hellbent on improving their existing engine is that they have the biggest modder community out there. Their day 1 releases are always buggy as hell, and crash-to-desktop bugs are still plentiful a year or three after release, but I actually love their overall technology.

They do need to fix their physics and modernize frame rates and aspect ratios, but other than that I hope they never change to some completely different engine. Loading in Bethesda games is always much faster than in most other games too.

But FO76? I'm going to pretend it doesn't exist, they might as well have just released it as a DLC for FO4 because it seems completely pointless as is.

Now we're having to do this with OS updates too:


I finally got to PC Skyrim about 2 years ago and still ran into a game breaking bug in the main quest. It took me at least 5 hours of research, 2 mods, and a pile of console commands to bypass an event which wouldn't trigger.

From what I found it was a reasonably well known bug too.

I got hit by something pretty much just like that. Was it the one with the council on High Hrothgar and the Jagged Crown mission overlapping and blocking each other?

It was Esbern not coming out of his room in the Ratway Sewers when he was supposed to. I'm really fuzzy on the details now, but from what I remember there are multiple bugs related to him. Some have been patched, either officially or community, but the one I hit hadn't been.

OH! I hit that one too, and quit the game over it. Very obnoxious finding endless forum posts of people having the same bug as you, and not a solution to be had, and you're not even being paid for it.

That's basically a good rule of thumb with any software. It's just impossible to do full QA of everything that crazy users will do, on a wildly diverse spectrum of hardware, in projects that immense. Even if you had unlimited budget and manpower, you're never going to see all the things that are possible; there's just too many unknown unknowns.

While I think what you're saying is not wrong, it is definitely different these days.

Back in the day games used to release on carts and we could not patch them. Now that we have Day1 patches and people expect bugs, production companies shoot for earlier release dates because they can.

I had some rough experiences with Fallout 3 but New Vegas and Fallout 4 didn't really give me any trouble. Maybe they are improving?

New Vegas wasn't made by Bethesda. It was made by Obsidian.

Obsidian was founded by a lot of the people who worked on Fallout 1 and 2, and I thought New Vegas was about 10x better than Fallout 3.

> Obsidian was founded by a lot of the people who worked on Fallout 1 and 2

To be fair, Fallout 2 (my entry to the series, and a game I still love) is famous for being utterly bug-ridden and full of incomplete/broken quests. Disappearing car, anyone? :P

While Obsidian "made" it, it used the same engine/code/technology as Fallout 3.

It did, and Bethesda had time to iron out the kinks in the code that Obsidian used.

But Fallout 3 used the 3rd party Gamebryo engine anyway, so the engine wasn't really developed by Bethesda either.

New Vegas used the same engine as Fallout 3, and it was developed by Obsidian Entertainment. They fixed all the bugs in the engine, and a better development studio handled creating the game. I believe that Fallout 4 used the engine developed for Skyrim, so they had more than 5 years to work out all the bugs.

To this day I get a kick out of the fact that the best Fallout game was made by Obsidian.

Why? Remember, Bethesda was the outlier. They purchased the Fallout IP from Interplay after Black Isle Studios went under. And many of the former Black Isle Employees went to go work at Obsidian, so New Vegas was much closer to being under the stewardship of the original team than Fallout 3 was.

Breath of the Wild has a huge open world and was, in comparison, largely bug-free when it came out. Same goes for most Nintendo games. Also, when it comes to pure gameplay they are much more polished than all the GTAs and RDR2s. What are they doing differently?

A little insight here, not sure whether this is the full story: https://www.forbes.com/sites/olliebarder/2017/03/13/zelda-br...

Did the other recent big open world games (Witcher 3, Horizon) have similar issues when they were released?

I believe this has to do largely with Japanese game development culture. Japanese developers in general are known for rarely, if ever, updating their games. For example, the game Nier: Automata released with game-breaking bugs, and never got a single patch. Nintendo also doesn't patch their games often, but they have a stronger commitment to releasing things in a good state in the first place. Shigeru Miyamoto famously said, "A delayed game is eventually good, but a rushed game is forever bad."

I looked at the SNES development manual once and it said something along the lines of “if an area isn’t required for completing the game, go there anyways to ensure there’s no bugs”

Nintendo seems to treat all of their online services as an addition to their product. This appears in a few ways:

- No integrated voice chat for Switch, an existing mobile device is used instead

- Significant delay between product launch and paid online service availability

- System upgrades included on-cart for the system so you're not in a situation where you need a system update but don't have online access so you can't play the game you just put in your system

- DLC is actually bonus content, not an integral part of the game split off about 50% through development and stuffed behind a paywall to drive pre-order upgrades

- "Day one DLC"/"On disc/cart DLC" not the primary upsell mode on the Switch for Nintendo developed games - instead, free content packs and paid content packs are slowly released when they're done

Sony and Microsoft seem to treat online services as an integral part of their product, make money when a developer has to publish a patch (after the first submitted patch), and seem to encourage a ridiculous amount of microtransaction DLC (example: there are a ton of DLC packs for LittleBigPlanet 3, including packs of three Marvel, DC, and cartoon characters, all priced at about what a child would get for allowance in a week, with Sony naturally taking 20%-30% of each sale).

As a result, it seems that Nintendo really does their level best to release games without showstopping bugs because they aren't as reliant on the online component to patch their games. I remember there being a bug in The Legend of Zelda: Skyward Sword (on the Wii) that would cause you to lose your progress if you triggered it[1]. Nintendo apologized, issued updated discs, and put up a Wii channel that would fix your save data if it was corrupted by this bug.

[1] More info: https://zelda.gamepedia.com/The_Legend_of_Zelda:_Skyward_Swo...

I've heard on the grapevine that Nintendo executive management is super against online; they're apparently terrified of the idea of some pedophile picking up kids via their network and the Nintendo brand being associated with that in the public's eyes. That's why, for instance, Mario Kart doesn't let you say anything other than the pre-canned statements.

As a parent this is one of the reasons I like Nintendo's approach to online. Not for such extreme examples, but because there is a definite attention to safety online.

They've had issues in the past with Miiverse and explicit material, and more recently with SMO and explicit material in the player icons: https://www.reddit.com/r/NintendoSwitch/comments/8syjzw/psa_...

Bethesda has never released games that were complete. This is the worst of the lot, but Skyrim, Oblivion, Fallout 3, Fallout New Vegas, Fallout 4 were all basically unplayable at launch. Some were unplayable after being fully patched years later and require fan-made patches. At this point, I don't get it.

I'm not sure why people say this. With the exception of Oblivion, I played all of those games at launch day and did not have any issues in any of them. In fact, I rarely experience bugs at all in my games (Bethesda or otherwise). I'm trying to figure out why. I am either extremely lucky or people are making it seem like these bugs occur a lot more often than they really do. I don't know.

I've played Skyrim both as an up-to-date patched PC version and as a release version on my dad's unnetworked PS3. The difference was ridiculous. My dad never ended up finishing it because of game-breaking bugs. I couldn't handle playing it for long. It would lag or freeze randomly; sometimes you'd have to hold the power button on the PS3 until it rebooted. A bunch of missions had bugs that stopped them from being completed. Tons of clipping errors that would leave you trapped or outside the game map.

My updated one had a few of these but I could play for more than 20 minutes at a time.

Also, I never played the release version of it but apparently my dad gave up on fallout 3 on his ps3 when a hacking glitch gave him unlimited stats and made the game pointless.

In what ways do you feel BotW was more polished than RDR2? Genuinely asking. My only real gripe with RDR2 is the sluggish controls (by design in a Rockstar game) and the lack of fast travel late game.

RDR2 definitely has more going on than BotW. BotW is large, but mostly empty.

I'm genuinely astonished by how bad the entire introduction/tutorial to RDR2 is. It's several hours long, completely uninteresting, bogs you down in every control and feature that the game has to offer, and is so ungodly slow that I can't imagine anyone completing it in one sitting. I've played a lot of games; that was among the top 10 worst introductions I've ever seen.

I believe we are on the same page, and think you just missed my "when it comes to gameplay" here. The overall gameplay (of which controls and control flow are a huge element) of Rockstar Games titles has always left a lot to be desired. I've been wondering for some time how a AAA company got away with it for so long; it rarely ever gets mentioned in reviews.

On that note, if Bethesda releases yet another TES with that God awful first person battle system I am going to... well I suppose I just won't buy it.

> the lack of fast travel late game.

You can fast travel via stage coach or train (for a fee) or one-way out of camp with the fast travel map (if you've purchased it).

Trains are not what I'd consider fast travel and I've had the camp option since the opening of act 2, but it's one way. I want fast travel from wherever. Makes me not want to go far from camp because ugh that ride back.

I understand wanting players to experience the world, but later on I just want to get things done

Bethesda just makes buggy software, but as for comparing RDR2 with BOTW, I think the bugs in RDR2 would stem from it actually having stuff in it.

I'm certain that people more talented than me worked on BOTW, and it is definitely polished, but I think it's probably wrangling less complexity to sparsely populate miles and miles of grass with one of a half-dozen "attack player when get close" enemy types.

Wonder if the Nintendo corporate culture got built around cartridge releases, so once released they were pretty much frozen forever. Less of a need for that now but corporate cultures are very persistent.

I'm not sure if or when it was fixed but Witcher 3 had a lot of minor bugs that annoyed me and reminded me of mid-90s PC RPG gaming. Unclickable chests, uncompletable quests (because an item did not drop, or an NPC died instantly), and so on. All very minor inconveniences for my playthrough, it mostly just detracted from the immersion before I moved on.

> bug-free when it came out

Might want to double check with the speed running community on that.


There is a difference between convoluted glitches left in the published game and publishing a game near-unplayable without massive day one patches.

Most glitches exploited by speedrunning are not things that 99.9% of typical players are going to encounter while completing a normal playthrough of the game. Even something as simple as whistle sprinting is almost impossible to discover by accident, much less things like using shield swaps to reset animations or manipulating the physics engine.

Both Witcher 3 and Horizon required 45+ GB patches for me. Horizon was still an amazing game so I don't mind, but I don't know how TW3 became so popular, I couldn't get into it and had to return it, not to mention it ran awfully (on a PS4 Pro too).

Well, Fallout 76 is multiplayer, so that complicates things heavily in terms of bugs and polish compared to those other titles you listed.

As well, this is Bethesda's first multiplayer game, so expect it to have a lot more bugs than your typical Bethesda launch.

Could be, but then just give the devs more time. As with RDR2 and their 100h week statement these massive day one patches just scream bad management.

Oh, I agree it sounds poorly managed. I expect the first month or two to be a complete disaster in terms of bugs.

Not their first multiplayer game, Elder Scrolls Online?

'Bethesda Softworks' is the publisher for both, but ESO was made by a different studio -- 'ZeniMax Online Studios'. 'Bethesda Game Studios' is responsible for the mainline Fallout and Elder Scrolls games, including now Fallout 76. I think this is their first multiplayer title.

Just Cause 3 for the PC was in a terrible state when it first shipped.


And it's still horrible in terms of optimization, never having been fixed after launch.

> Also when it comes to gameplay they are much more polished than all the GTAs and RDR2s


I get that this is singling out Fallout 76, but this is not that uncommon of a practice in the modern era of console gaming. Heck, I think even the new Black Ops 4 PS4 game only came with a small portion of the game on the physical disk, and you had to download the rest.

For what it's worth, PlayStation does have a policy that says something along the lines of: games without an online component have to fit on the physical Blu-ray disc, but since this is an online-only game, it's okay for the disc not to have everything.

I don't love this policy, but as someone who rarely even buys physical versions of the game anymore, I don't really think it's that big of a deal. The game is online only, it's not that crazy that you have to download stuff on day 1. /shrug

EDIT: Plus, a lot of this comes from publishers breathing down the game devs' necks to get the game out ASAP, especially before the holidays. So they put super tight deadlines on the games, and that requires the devs to work out all the bugs and finalize code pretty much all the way up until the launch date. It's not like the past, where you had to get it right on the first try for the physical copy because there was no way for a user to download a patch.

That reminds me of a recent Visual Studio 2GB patch just to change all branding for VSO to Azure DevOps.

They should build a patch to update their documentation online, because they change the names of services so often and rarely go through and update docs, so you'll find confusingly outdated instructions and references without links when troubleshooting.

I love VS Code, but I'm really happy I'm mostly off of MS for development.

Almost 20 years old, and still relevant: https://www.penny-arcade.com/comic/1998/12/21

A reference to Half-Life 2 perhaps? That 120MB mandatory launch day patch was painful on dial-up!

Half-Life 1, you mean? 2 didn't come out until 2004, I think, and this comic's from '98.

A comic shockingly ahead of its time. While there were an increasing number of games that were borderline unplayable without their Day 1 patch, I think that comic was just extrapolating the trend. It wasn't until the mid-2000s that game boxes came with nothing more than a CD key in them.

Back around when that comic was written, Computer Gaming World had a policy of only reviewing the unpatched versions of games. That became increasingly untenable and they eventually had to scrap the policy.

The parent probably figured it was Half-Life 2 because the HL2 launch was a disaster. It was one of the first games requiring online activation and the servers were completely overloaded.

I bought HL2 (disk) on launch day, had to wait 3 or 4 days to play it. 56k dial up was not fun at the time.

Given the way the URL is formatted it would appear to be from Dec 1998, so not HL2.

I beta tested ultima online around then and it took ~8 hours to download the patches when I installed the day my CD arrived.

wow, straight from the dawn of the cable-modem era

ah, the bloated fix. As an engine mechanic, it reminds me of the nearly criminal level of disposability in Japanese cars.

I once had to service an Acura with a recall for failed half-shaft seals on the drivetrain. No big deal, about an hour of labor and a trip through the carwash, but the computer in the service tech office kept telling me the part was on order for 2-3 weeks.

2 weeks go by and there is an enormous fragile package on my workbench. It is an entire carbon fiber driveshaft assembly for the vehicle I'm working on, with a replacement axle shaft. Both are required to be installed, with the old parts sent back to the guys at the factory. Now my labor is 4-6 hours with an alignment, balance, and a test drive, because some factory worker decided picking up a final product was simpler than picking out a part from that assembly.

I bet the Fallout 76 change is the same. Massive CI, it's already being tested at the final assembly level, just reinstall everything and don't worry about it.

>now my labor is 4-6 hours with an alignment, balance and a test drive because some factory worker decided picking up a final product was simpler than picking out a part from that assembly.

Sounds like money in your pocket.

My friend used to work at a Honda service center, and he said some warranty repairs were great for him and others would screw him over. Like, there were some jobs that paid maybe 6 hours but in practice would end up taking most of your day even if you were really good and knew what you were doing. Other times it would be the other way: an easy job that paid more than it was worth, and people in the shop would fight over those because you could bank a lot if you got a bunch of 'em stacked up.

He works for Tesla now and he's happier with the fixed hourly pay. It fosters actual team work and cooperation instead of people fighting over who gets the best jobs.

There is more to life than money in your pocket.

Sure... but while you're at work and being paid by the hour what does that matter?


Personal attacks are a bannable offense here. Please read https://news.ycombinator.com/newsguidelines.html and don't post like this to HN again.

I did not read it as a complaint, just a commenting with a similar story of inefficiency in a different industry. I found his story to be interesting. Really no need for you to come across so angry/aggressive.

Wonder if they fixed the game speed being tied to the FPS: https://streamable.com/xd87p You look down = less to render = more FPS = you run faster.

It's not fixed. You can find videos about this on YouTube.
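The mechanism behind this class of bug is simple: a rate applied as a fixed amount per frame rather than scaled by elapsed time. A toy simulation (illustrative only, not Bethesda's actual code) shows how distance covered per second grows with FPS when the per-frame increment was tuned for 60 FPS:

```python
# Why frame-coupled movement speeds up with FPS: if each frame adds a
# constant increment (tuned for 60 FPS), the total distance over one
# second scales with the framerate. Scaling by the actual frame time
# (or running updates on a fixed timestep) keeps speed constant.

SPEED = 5.0  # intended units per second

def distance_per_second(fps, frame_coupled):
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(fps):                  # simulate one second of frames
        if frame_coupled:
            pos += SPEED * (1.0 / 60.0)   # buggy: assumes 60 FPS
        else:
            pos += SPEED * dt             # correct: uses real frame time
    return pos
```

At 60 FPS both variants cover 5.0 units; at 120 FPS the frame-coupled version covers 10.0, i.e. looking down to boost FPS literally makes you run faster.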

Guess that's the price for using Gamebryo in 2018...

They're using Gamebryo for TESVI...

That gamespeed issue in Fallout 4 was super annoying on my newer computer. Dialog animations went too fast and made talking quite jarring.

I remember downloading Xilinx FPGA tools some years ago. You HAD to download 14.1.0 (or whatever) first, at 6GB, and then get the patch from that to 14.1.1 (another 6GB), and then to 14.1.2 (guess how many gigabytes!). They also provided a combined download to go directly from 14.1.0 to 14.1.2... which was 12GB. Solid.

This sounds like the experience my kids had when they got their XBox One last Christmas. It took literally 2 hours before we could play anything on it.

Software is now "ship it quick and fix it later" and that's wrong.

Exactly, the project managers and management had to hit a date, the game wasn't ready.

So we live in the world of ship it broken/incomplete, just so a PM can hit their date, quality of the game be damned.

I prefer Valve Time, to make sure things are as high-quality, complete, and correct as possible [1].

The worst part about these 50GB games and 'patches' is that they can take a few days to download on the data-capped/throttled ISP 'networks' we have settled for, which milk us for transfer. Big downloads like this also let them hinder your 'network' for a few days (knocking you offline, causing data/packet loss, and more) because their 'smart' 'network' throttling thinks you are doing something malicious.

[1] https://developer.valvesoftware.com/wiki/Valve_Time

Ha! You expect that from a console, but I just bought a tv, and it had to fucking update before I could use it. And on top of that it wanted me to create an account. For a TV!

One of the several reasons I never even connect a smart-TV to the network. nothanks.

burn it with fire

Totally agree. I don't actually have a Smart TV now. I've got a dumbass TV and it's actually fast and reliable for once!

that was my exact experience when i bought one to play RDR2. It took forever to update, then when RDR2 came out that took forever to install and update. Not to mention all the updates chewed up 15% of my comcast data cap that month. It's sad that I have to think about my purchases in terms of how much of my data cap i'm going to go through.

This has been my reality since Comcast announced that awful data cap. I have to juggle streaming TV/movies, video game patches, and various cloud backups just to make sure we don't go over. This is at a house with just two adults that are at work 10+ hours a day. I throttle my TV's connection speed, think twice about sharing files from home->work computer, defer security patches on software, etc.

Even with auto-updates turned off, the PS4 will download patches when in rest mode. I used to love renting games at Redbox, but the rise of huge patches makes that impossible. I rented the Modern Warfare remaster thinking it was an old game and would mostly work out of the box. I was shocked to see it try to download a 40 GB patch. I canceled the download and played the game without the patch and returned the game the next day (it's not a remaster; they re-made everything and it stinks). The next day, I turned on my PS4 to see it had downloaded and installed the patch without me. 5% of my data cap, gone.

An aside: I stumbled across https://physicalgames.wordpress.com/ps4-update-sizes/ a few months ago and like to reference it before making any purchases. Fallout 76 is a hard pass for me.

i know, right?! i have two kids and they are constantly streaming stuff off the disney, nick jr, and pbs apps. i wish there was a way to locally cache the stuff they just watch over and over so i'm not getting hit over and over. Also, all the instant-on streaming ads! i know it's probably a small amount, but what if that is what pushes me over? I'm probably going to switch to centurylink soon.

Had this same experience with an Xbox One someone gifted me. Immediately soured me on it but I stuck with it. However I also don't use it all that frequently so every time I powered it up to watch a Blu-ray or play a game I had to do a 10-30 minute download and update. After a few times of that I just never turned it on again. Now I'll never use it again because the day I do will be a day of downloading and updating. Screw that.

I know you can set it to a mode where it can stay minimally powered on and update itself, but I use it so infrequently why would I waste all that power?

What I always do is unbox the product, do all the updates, charge the batteries (if portable), and then package it back up and wrap it for Christmas.

Not a bad idea. Kids are getting old now and I hadn't worked this out yet :)

Back in 1985 I got a BBC Micro for Christmas. I turned it on. Beep Boop and I had BASIC in front of me. Anything that doesn't do that magic will get my fist shaking while the onion on my belt wibbles around damn it :)

Yeah fortunately or not, best practice for a gifted game system is to update it before it's given.

Same here. I got the RDR2 PS4 Pro bundle and had to install the game for 2 hours first.

And I'm still waiting for a patch for Fallout 4, as it just closes when I press any key in the main menu, but only if I have an internet connection. When I unplug the network cable or disable the network card, everything works. I can enable it again once a save is loaded. The internet is full of people complaining about this for over a year or two... and still nothing happens; there isn't even official workaround information.

How does Blizzard/Activision do their patching? In WoW, the game will go from red (patching) to yellow (playable) usually before half of the patch is downloaded, and they are just modifying existing files that may be 10+GB. The patches are usually quite small by comparison (50MB to a few hundred MB).

1. Blizzard uses torrents where people downloading the patch also act like P2P seeders (you can control the download/upload speeds).

2. Assets are bundled via a zip-like mechanism, and individual "zone maps" are separated by hard loading screens. If you travel from one continent to another in WoW you get a hard loading screen as assets are unloaded and reloaded.

3. Which continent you log into is the only one you _need_ data for, so as you log in the client can prioritize certain assets higher than others.

4. Blizzard/Activision have a lot of experience with this. In 2008 Blizzard had 10 million+ players downloading 4GiB+ patches within 24 hours, so they've had a while to tune and make adjustments, as this is part of their core customer experience.

5. Large publishers don't generally optimize for this, because the people who whine about it largely already gave you their money and aren't likely to refund/return (in some avenues it is impossible).

>1. Blizzard uses torrents where people downloading the patch also act like P2P seeders (you can control the download/upload speeds).

They removed that functionality years ago.


> the people who whine about it largely already gave you their money and aren't likely to refund/return (in some avenues it is impossible).

Not only that, but they will buy the next game and the next one too. This is the reason so much game software is buggy, bloated, and user-hostile: people keep buying it, release after release. You’re rewarding the company for crap! The whole “buy, complain, buy the next one” cycle keeps products of all kinds awful.

At one point the high-res assets were nearly half of the download, so they put them at the end. If you logged in early you couldn’t run straight to the end-game content and some things were not as pretty as they could be. I think some other games have started copying this pattern.

That makes sense. Thank you! I can see why non-subscription games would not bother with doing this. Maybe they just minimize their cost using CDNs.

They don't do it because creating such a system that works well is a very, very hard problem.

They have a data format which essentially acts like a zip file. So they can swap in and out data on the fly I believe.
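If you're curious what that roughly looks like, here's a toy sketch of a chunk-manifest updater (a hypothetical illustration, not Blizzard's actual CASC/MPQ format): hash the archive in fixed-size chunks, and the client only downloads chunks whose hashes it doesn't already have.

```python
import hashlib

def manifest(data: bytes, chunk_size: int = 4 * 1024 * 1024) -> list[str]:
    """Hash each fixed-size chunk so client and server can compare versions."""
    return [hashlib.sha256(data[i:i + chunk_size]).hexdigest()
            for i in range(0, len(data), chunk_size)]

def chunks_to_fetch(have_manifest: list[str], want_manifest: list[str]) -> list[int]:
    """Indices of chunks in the new version the client doesn't already have."""
    have = set(have_manifest)
    return [i for i, h in enumerate(want_manifest) if h not in have]
```

With a scheme like this, a one-byte change costs one chunk, not the whole archive (unless the container is compressed as a single stream, which is a different problem).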

The 1TB SSD I bought for gaming in my new computer suddenly feels very small.

Destiny 2 was released for free a while ago.

I gave it a pass because it requires 100+ GB for a full installation.

For one game.

It's still free, and you can redeem it to your Blizzard account just by opening the Blizzard app.


It's also still 100GB.

I have a 500GB drive for my games. I can't justify burning 20% of it for a single game.

I have a 1TB main drive (SSD hybrid, IIRC) with all my games installed and a 2TB external drive with things that don't need to be fast, like video and audio. They're both full. I got down to 8GB on my main drive the other day and had to uninstall a bunch of crap.

I've had this drive a few years now. If I did it today, I'd definitely buy a larger one.

54GB is going to crowd your 1TB? Plus how much of that 54 is going to overwrite or replace the existing code?

The full install is 96GB. 10% of a disk for one game. That doesn't mean it's not worth it, but it's notable.

It's not; it is half that size. 96GB is only a thing when you have both the installer and the game files on the disk (but then the installer is deleted).

Even so, 5% is a lot for one game.

Even if you use the hard drive for only games, if all of the games you wanted to install were that size, you'd be limited to 20 games. On a 1TB hard drive.

This is ridiculous, companies and developers have to start taking some pride in their software. Games and software packages do not need to be 1/10th the size of a high end SSD.

Isn’t most of the size in a modern game artistic assets like textures, dialog, etc.?

That said, I don’t understand how bug fixes can be so big, because presumably the aforementioned art files wouldn’t change substantially in the case of software fixes.

Assuming that I’m not wrong about the proportion of a modern game being art, that lends credence to the idea mentioned elsewhere in this thread that there was straight-up missing content in the original release.

The way that Gamebryo content files work means that as you change more bits, there is an increasing chance that you will need to download the whole content file.
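A quick way to see why: if a content file is compressed (or packed) as one stream, a tiny edit to the source data changes nearly the entire packed file, so a byte-level diff is almost as large as the file itself. A minimal demonstration with zlib (zlib is just a stand-in here; I don't know what compression Bethesda's archives actually use):

```python
import zlib

# ~115 KB of highly repetitive fake game data
base = (b"mesh data " * 5000) + (b"texture data " * 5000)
patched = base.replace(b"mesh data ", b"mesh dita ", 1)  # a single-byte edit

a, b = zlib.compress(base), zlib.compress(patched)

# length of the common prefix of the two compressed streams
common = next((i for i, (x, y) in enumerate(zip(a, b)) if x != y),
              min(len(a), len(b)))
print(f"compressed: {len(a)} bytes; streams agree for only {common} bytes")
```

One changed byte in the input shifts the Huffman tables at the front of the compressed stream, so the two outputs diverge almost immediately, and a naive binary patcher ends up shipping the whole file.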

Didn't Bethesda stop using Gamebryo a long time ago?

Nope, just a fork and rename

It also takes 10% of a normal SSD.

But only 4% of a HDD.

Your problem is what again?

Just because there is an abundance of a resource doesn't mean it's a good time to go hog wild.

In the 21st century, with 4K, big screens, and high expectations for good-looking, polished games, I think 54GB is okay.

And this is after, on the first day of the beta on PC, they required players to download the whole game again. I cancelled my pre-order after I figured out I had bought the physical disc version from Amazon, but I'm not entirely convinced to buy the digital download I meant to get anymore.

I really didn't get enough time to play during the beta (sessions often fell during times I had appointments or work), and while I would see players like Shroud having tons of fun with friends, only a small number of my friends play on PC and few of those will get Fallout 76. There were a lot of bugs which I hadn't seen, but also I wasn't convinced this game is worth it.

On the other hand some of the gameplay I saw from Shroud showed that complaints of quests being boring and enemies being too weak, while they might have been true in the beginner areas, were not problems if you simply went to the right areas of the map.

Bad news: the physical disc is supposedly just a download code, so you'll be pulling 50+GB regardless.

Day 1 patch or no, I don't like the practice of the physical media just being a Steam/Uplay/Bethesda installer with a download code for the game.

This seems like some sneaky way to push out a couple more sprints of dev time while shipping ‘physical copies’

When I got Portal 2 on disc in 2011, putting it in my computer prompted me to download Steam so I could get it there. It's not like this is anything new.

Today we announced the size of our new game as being around 8GB. Someone responded with ‘that sounds a bit small’.

Turn tape over to side 2 and press play

If you’re a big publisher, what’s the cost for something like this?

And as a corollary - what does the cost look like considering it's being served by Bethesda directly and not by Steam, as would be typical for a AAA game?

just a reminder: if you read the small print your ISP probably has a monthly bandwidth cap after which they start charging you more. 54GB is probably a significant fraction.

If it's Comcast Xfinity, you'll get a text, email, phone call, and JavaScript injected into non-http sites at 70%, 80%, 90%, and beyond.

If you've got an ISP that considers 54GB a lot of data, it's time to get a new ISP, move to a new country, or protest how absurd the idea of bandwidth caps is.

My mind is blown by the fact that caps still exist for a stationary connection. I mean, how bad are the networks to require that?

Some ISPs slow you down instead, but same caveat.

You mean they took the "BETA" event, found bugs, and patched said bugs? On the first day of OFFICIAL release! NO! That's impossible! Why would they do something like that!?!? Everyone knows BETA means 5 years of an incomplete game that you have to pay for early on...

Seriously, they were pretty honest and upfront about their Beta and did it in the truest sense. "There are bugs in the game! It's total trash!". Yea... that's why you run betas... to find the bugs. I feel like I'm taking crazy pills.

More than likely there was an issue with their patch tool too, thus it's a whole new download. Haven't bought the game yet, but I see nothing they're doing as "ridiculous".

I think the "Ridiculous" part is the trend of shipping a physical copy of the game with a giant patch of all the stuff that wasn't tested/ready in time for the disc to to go print.

If you're going to just effectively re-release the whole game as a day one patch, why not skip the discs and mail people a nice decorative card with a redemption code for the digital download version?

If you have multiple people in a household, each with their own Live/PSN/whatever account, you can buy one physical disc and it unlocks the game for everyone in the household. If you buy the digital version, it's locked to only whichever account bought it.

That's why I still prefer physical versions of games. Both me and my wife have our own separate accounts on the services.

PSN/Xbox Live unlocks it for the _console_ (as well as the user). I have 4 accounts on my PS4, and they can all access content purchased by the other accounts.

Hmm, something is not right with our setup then, because I have absolutely been blocked from playing games my wife has bought on her account on our PS4.

You have to set your particular console as your PSN account's "primary" console. When a console is set as your primary console, any other accounts on that console can use software that you have bought using your PSN account. However, I think only one account can have a particular console set as their "primary" console at a time, so if your PS4 is your account's primary console, then it cannot be your wife's account's primary console, meaning if she buys software, your account cannot access it.


PS4 requires the physical disc in the drive to play. While yes, you can simply share that copy with multiple people, they cannot play together simultaneously.

With PSN you can have two people playing the one purchase of a game simultaneously.

I bought FO76 from Best Buy, and that's exactly what I did - I just bought the "Digital Download" edition, which gave me a code I could enter into the Bethesda launcher.

I honestly don't remember the last time I bought software on physical media.

Could physical copies still be traded, whereas digital downloads are typically tied to a single account?

It's funny how much the Xbox One team anticipated this, and tried to move to that model, but got so much flak they gave up. People seem to like their ownership Totems (discs) that they can continue to sell to Pawn Shops (Gamestop) for pennies on the dollar of the original purchase price.

It's okay to me that the nice decorative card rack keeps growing, right next to the Totem shelves in most stores. If I'm going to download the whole thing anyway, I might as well not waste shelf space in my home on the Totem.

What a silly way to trivialize stripping consumer rights. When you buy a game on disc, the disc is the product. You now OWN the game. You're free to lend it to a friend, sell it, give it away, trade it, or do whatever the hell you want, because it's your property. Microsoft was attempting to go the Steam route and force people to LEASE their game and tie it to an anti-consumer terms-of-service agreement (redundant, I know). With that, you're not allowed to do anything with the game that they don't approve of. And if you behave in a way they don't like, they can take away your entire library in the middle of the night. Naturally people don't like shit like that. And unlike Steam, the Xbox actually has competition, which is why people opted for other consoles instead.

On the contrary, I'm not trivializing the stripping of consumer rights; I'm suggesting that property rights in this matter have always been more complicated than "discs are property" and "codes are not", because in the digital realm discs are codes in a different form factor. The terms of service are the same on the Xbox for digital store downloads and discs. I refer to the disc as a Totem, because that physical artifact holds more spiritual value than it does physical value. Some property rights of that disc are implied, sure, by virtue of being a physical token, but they haven't been tested in court, and they aren't guaranteed. EULAs have been the name of the software game for ages now; they predate the digital stores (even Steam) and have always stated that software is leased, never purchased. Discs were always a physical token tied to a EULA. Now discs don't even have the actual software on them anymore, and it's getting increasingly silly to pretend that they do.

The original Xbox One plan included lending, giving away, and trading games, it just intended to cut out the Gamestop middle man and let you do it directly Gamertag to Gamertag. That's what Microsoft was really punished for, cutting out the Pawn Shops.

We should be demanding our rights in the digital stores (and securing them in copyright reforms and regulations, if need be), not fighting for bits of plastic that pretend to be rights.

While I think you're right that the Xbox One's Always Online features were mostly just mistimed (and mis-marketed), in addition to the Totem aspect, physical discs are also a way to keep a game longer than however long the service's servers exist, assuming there are no required day-one patches and the like.

The Wii shop for example will be going down for good soon, making purchases not already on the console inaccessible. A copy of a GameCube game, on the other hand, will work until the disc physically breaks.

Right, but there are also Wii games that won't work when the shop goes down because what's on disc isn't playable and needs a Day One patch it can no longer download.

We absolutely should be fighting for digital store rights, but focusing on physical discs isn't the right way to solve that.

I don't understand this line of thinking. If the game was as broken as you say this close to release, and they knew full well about it, why not delay it by a few weeks and incorporate this patch into the master!?

I'm lucky enough to have very high speed broadband with no monthly limits, and I gather from the tone of your comment that you are too. But it's not universal and patches like this cause a lot of problems for people who aren't as privileged as us.

Dude, it's a stress test. The only way you can fix it is by... STRESSING it. It's a really straightforward concept. That means lots of people doing lots of little things. You can't find these bugs without lots of people doing lots of things WRONG.

It's basic dev 101. Your user is stupid and will do everything wrong. It's flat-out amazing when you stand behind someone using a bit of software you built, where you took the time to make everything as straightforward as possible, and they still find ways to break it in crazy ways.

dude is right haha. no matter how hard i try to write nice documentation and make my code nice and user friendly, when i pass it off to someone else for usage they ALWAYS find a way to pass in the wrong arguments and misuse my methods -_-

I agree that this is the standard for modern online games... I had friends really into destiny 2 and they tried to get me to play. As soon as I realized the game was 'online only' i was like fug this man. I had been playing soulsborne games offline on an old xbox 360 for years with no complaints. Nowadays shit is fucked with most games, they expect you to be online and paying extra. No thanks!

As someone else said, i think the trend is 'game as a service' rather than a "game" in the traditional sense. Sony has put out some good single-player, offline-capable games; i just think the recurring revenue of these GaaS games (Fortnite, Fallout, etc) is SO damn high that the business value ends up guiding decisions.

People want the game now and don't mind downloading a patch. Those who do mind can wait a few weeks for the master to become the patched version.

Honestly, I don't see the big deal. I never buy a game when it first releases. From anyone. AAA or indie. I always wait a month or so. That's when the bugs the devs never saw coming typically get ticketed and patched. I don't understand the big deal of "OMG I have to have this game now!".

Personally, I like the fact that they are patching and continuously patch. This strikes me as the devs want to make sure there are as few problems as possible and will do everything they feasibly can. Instead of "Oh well, it's shipped, it is what it is."

It's a little unorthodox to have your week long beta after the game is already pressed to disc, and a week before the game launches.

Also, I agree with you - Their patch tool is pretty terrible..

Perhaps due to the way the game files are packaged up, but unless they edited every model and sound file, you shouldn't need to redownload the whole thing.

Maybe it's due to their DRM? But even rsync can handle better updates :)
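For reference, the rsync idea in miniature: the receiver hashes the blocks it already has, the sender scans the new file for matching blocks and ships only literal bytes plus "copy block at offset N" instructions. A toy sketch follows (real rsync uses a cheap rolling checksum so the byte-by-byte scan is nearly free; this version naively recomputes MD5 at each offset):

```python
import hashlib

BLOCK = 64  # real rsync negotiates block size; tiny here for illustration

def signature(old: bytes) -> dict[str, int]:
    """Receiver side: hash each block of the file it already has."""
    return {hashlib.md5(old[i:i + BLOCK]).hexdigest(): i
            for i in range(0, len(old), BLOCK)}

def make_delta(new: bytes, sig: dict[str, int]) -> list:
    """Sender side: emit ('copy', offset) for blocks the receiver has,
    and ('data', bytes) literals for everything else. Slides one byte at
    a time when no block matches (rsync uses a rolling checksum here)."""
    delta, literal, i = [], bytearray(), 0
    while i < len(new):
        block = new[i:i + BLOCK]
        off = sig.get(hashlib.md5(block).hexdigest())
        if off is not None and len(block) == BLOCK:
            if literal:
                delta.append(("data", bytes(literal)))
                literal.clear()
            delta.append(("copy", off))
            i += BLOCK
        else:
            literal.append(new[i])
            i += 1
    if literal:
        delta.append(("data", bytes(literal)))
    return delta

def apply_delta(old: bytes, delta: list) -> bytes:
    """Receiver side: rebuild the new file from the old copy plus the delta."""
    out = bytearray()
    for op, arg in delta:
        out += old[arg:arg + BLOCK] if op == "copy" else arg
    return bytes(out)
```

Only the inserted or changed bytes travel over the wire; everything else is reconstructed from the copy the client already has, which is exactly what a 50GB "re-download everything" patch fails to do.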

People want “games as a service”-level updates without having to download big patches. Or they get upset that a patch comes out every week or so.

There’s really no winning.

I think that there are different groups. One group wants constant updates. The other group is irritated by constant updates. Both are vocal on social media. I fall into the latter camp: I don't get a ton of time to devote to gaming, so when I do I'd like my games to be ready to play. Biweekly multi-GB updates make this difficult, and it's extra insulting when they're just adding cosmetics.

If you’re a casual player, I get it. To contrast, Destiny 2 originally launched with casual players in mind, resulting in casual players becoming less of a vocal audience after the first month or two (like you’d expect for most games) while the more hardcore players never stopped shouting their dissatisfaction.

At the end of the day, the hardcore side are the ones who usually determine if a follow up will be successful or not; balancing both audiences is likely an exercise in frustration and futility.

I don't own the game, so I don't have a dog in this race.. But it does seem a bit silly that you have to download the ENTIRE game again..

At that size, they should have some sort of delta update tool.

It was a beta. For all we know, they decided to switch DRM protection formats or make other sweeping changes in the months since the beta build was originally forked.

I don’t think it’s unreasonable for them to say “sorry, this beta must be deleted and the full game must be retrieved.” That’s the life of a beta tester.

If that's the case, then they probably ought to have had the beta before the stamped discs, right? :)

As I understand, the discs that need the huge update aren't beta discs, they're what you can go buy in the store.

> People want “games as a service”-level updates without having having to download big patches.

I usually hear "games as a service" derided as a combination of lazy bugfixing and wanting a constant stream of money. But anyway, "service"-level patches don't need to be very big. A code fix or a new item aren't going to cause any major churn.

That’s not necessarily true. If a new item introduces a new perk/mechanic/effect, adding the triggers for said effect may have wide-reaching effects. Texture files are often bundled as large bitmaps with dozens or hundreds of textures laid out in one binary, resulting in the entire texture file needing to be re-downloaded.

Right? They've been pretty up front and honest about the online only aspect due to the nature of the game. If they hid that fact, yes, lots to be pissed off about.

But it's like "I bought this package that said it's salt on it. It's not sweet at all! Why isn't this like sugar? This should be like sugar!"

Later, they complain a pack of sugar is too sweet.

"I'm so pissed over what EA did with Battlefront" What did they all do? Buy Battlefront 2. "I hate EA!" They still buy EA games. "I hate online only games!" They buy online only games. Gamers are such a whining bunch of babies and you can downvote me all you want.

There are alternatives and options out there.

I suspect the people who complain about various things are not the same people buying those same things. It's not a willpower or hypocrisy problem, it's a coordination problem.

I think you’d be surprised, honestly. Lots of people put hundreds of hours into games like Battlefront 2, Destiny 2, World of Warcraft, etc all while screaming at the top of their lungs about the good old days and how awful it is now. Meanwhile, they’re effectively paying $0.50/hour.

Updating the patch tool should only take megabytes. What do you mean?

As in it's possible the build can't interact with the patch tool. Thus, they needed to make some fundamental fixes that just require a flat-out new build. I don't know; I'm speculating. But I highly doubt they're going "Hey, let's throw in 50GB+ of stuff just to piss everyone off! That'd be funny because everyone is already complaining about online only." Shit happens. Especially when people also decide to try hacking the crap out of the builds and post the results online.

I mean, for a community of devs, I'm really trying to figure out all the "None of this makes sense." You'd figure on HN most people would start speculating why instead of throwing shit in the devs' faces. I work only on small to medium-sized projects, nothing at the scale of this game. But I've been contracted to fix, have seen, and have heard about far worse problems than just a new-build download patch.

I just assume it's ultra-low priority, and I bet if they bumped it up to medium-low priority they wouldn't have this problem.

I've seen criticism online related to how soon the beta happened before release.

But gamers are such a childish (usually literally) and entitled bunch that I've noticed it's hard to take most of their criticism seriously about anything. The only criticism I've agreed with lately is the rise of lootboxes. Everything else just reminds me of a non-technical manager trying to tell developers how easy something should be because it was easy to write it down as a Trello card.

For example, there's a common meme on /r/gaming about how lazy/selfish Bethesda is for not including a single-player mode. As if it's some trivial undertaking. Same with Bethesda reusing its engine instead of "just" building a new one.

What gamers never seem to realize is that their demands are no different from the ideals of the developer. There probably isn't a single person at Bethesda who wouldn't want to work <40hrs/week incrementally improving the game to perfection for a few more years, or working on a cool single-player campaign with fresh ideas.

Meanwhile, criticism is easily squished when the financials come out. If enough gamers bought and enjoyed the game to make the venture profitable, then criticism is exposed as nothing but a wishlist, not as the damning evidence gamers like to think they're circle-jerking over.
