
Very interesting article. Some of these are shocking:

> Use Source Control: Do you not use source control?

How do professional companies not use source control in the year 2021? The mind boggles.

> Smaller Commits, Better Commit Logging: Submitting a 2GB 2000-file change to Perforce with the description "did some work" is terrible

Also quite surprising. If you are going to log your progress, you might as well properly document what actually changed.

> Don't Ignore Your Lead Engineer if You Have One: If your lead engineer says that you need to spend $2,000 on SSDs or buy X license for Y, and your answer is immediately no because you believe your engineer is just trying to waste your budget, you either need to learn how to trust your engineer or find a new engineer. I have seen a company waste 70 man-hours a week simply because they had their workers on slow hard drives. Giving everyone faster SSDs would have cost the company ~60 man-hours in budget.

This one really hits home; it's something I personally had to deal with. I had to complain and fight for a long time to get the company to actually buy an SSD (or some other job-critical, time-saving piece of hardware). Usually it's not management saying no to such requests; it's more a strong general attitude of saving money and keeping expenses low. I quickly learned that this level of penny-pinching is a red flag and you should bail out as fast as you can, because it rarely gets better, and that attitude is rarely limited to one department.

Also related: sometimes it's about money, but in a very indirect way. The amount of time I have wasted waiting for slow builds to complete because the corporate anti-virus scans every newly generated file is staggering. What should have been a 10-minute build took over an hour, all due to the newly mandated corporate anti-virus. Corporate IT couldn't adjust the security policy because they are understaffed: management laid off 80% of the IT team, causing a huge backlog of work, which then cascades into every other part of the business. And then they wonder why they get negative feedback in workplace reviews.

---

A thing I quite liked: the hierarchy ranking of engineers and the distinction between senior and lead engineer.

---

Also, does anyone know of places where I can read similar war stories? These articles are always super interesting to read and learn from.




Regarding penny-pinching: at Amazon in 2013, the rule was that a sw dev could have one 24-inch monitor, but any second monitor had to be 19 inches or smaller. My manager gave me a second 24-inch monitor, and so I went to the hardware support guy to get an appropriate DVI cable. The support guy refused to give me a cable because I had an unapproved excessively large monitor. He could only give me the cable if he came to my desk to confirm that my second monitor did not exceed 19 inches.

All this haggling and wasted time over a monitor that cost about $100 more. I just bought my own cable (from Amazon!).

I guess they had a pile of ancient 19-inch monitors from years ago and were determined to "use them up" before spending more money buying new monitors. If that means that highly paid SW devs are less productive because they're squinting at late-1990s monitors, then so be it.

In my experience this was typical of the penny-pinching Amazon mindset. As a result, I did not work there for very long.


I was working as a contractor for AT&T years ago. They issued me the same laptop that everyone got. Of course, it didn't have much RAM, and I somehow got lassoed into supporting the current implementation of Business Objects. BO literally took long enough to start up that just two starts (let alone any actions) would have paid for a RAM upgrade. Over the course of two years, I was literally paid a couple thousand dollars to watch my computer load Business Objects.

They could approve overtime on a contractor, but couldn't approve a couple hundred for a RAM upgrade, what with it being non standard and all.


It’s possible they do other things as well, but is it _really_ true that a dev is more productive with a second monitor that’s 24” as opposed to 19”? I used to be a huge fan of multi-monitor setups, but I have gone through years of having two 4K 27” monitors, a single 4K, or just the plain old MacBook screen. I can definitely attest that a laptop screen alone is not good, but I'm honestly not sure about the added benefit of a second large monitor, let alone of its exact size. If anything, I was probably less productive when I had two large screens! Furthermore, the most prolific engineers I know rarely have more than one large monitor, if even that.

However, does having two large monitors make me feel good? Heck yes! So in terms of engineer morale it might help, but I doubt you were measurably less productive due to this handicap on an actual, non-emotional basis.


> So in terms of engineer morale it might help, but I doubt you were measurably less productive due to this handicap on an actual, non-emotional basis.

Except they probably were measurably less productive (if anyone bothered to measure), because software engineering runs on emotion and engineering morale. If you're doing anything more complicated than retyping code from printouts, how you feel will determine your speed and ability to focus. So even if the only thing a 24” screen did for GP was to not constantly annoy them, that's a real productivity improvement.


The point is that the discussion costs more than the purchase, so the only reasonable answer to a request to buy anything useful is "Yes, immediately, I'll buy one for every colleague who wants one". A manager who wants to save money on hardware is unprofessional (because they don't understand what is useful or not), an idiot (because they don't understand arithmetic and psychology), or both.

Being unexpectedly denied, through a three-person meeting, a $30 software upgrade was a significant factor in my last job change.


I bet eBay has some way of getting rid of old monitors. I'm surprised Amazon doesn't.


I think they were determined to squeeze as much "value" as possible out of those obsolete tiny monitors, by having someone use them for a few more years. Getting rid of them was not the problem.


Engineers used to swap war stories on the internet all the time. I think the switch from pseudo-anonymous to real-name social media has put a dent in that, but there are people still doing it on Twitter, e.g. @foone or @swiftonsecurity, among a lot of other output.

The trick now is limiting yourself to those war stories you'd be happy to have all your future job interviewers read :/


>> Smaller Commits, Better Commit Logging

In my current company (one of the top US financial institutions), a widespread habit is to use the Jira title as the commit message, whether it's a small Jira for a bugfix or something big spanning twenty commits and two months of work. Just because.

>> If your lead engineer says that you need to spend $2,000 on SSDs

Our more or less OK physical workstations (no SSDs, though) got replaced by thin clients connecting to virtual machines, which are slow in terms of both raw performance and video/mouse/keyboard I/O. At the same time, the big suits like to blah-blah about how important technology is, how "we are not a bank, we are a tech company with a banking department," and how much they invest in tech.

edits: grammar


Source control for UE games is a different beast. It should still be done, of course, but it's a lot trickier, so I understand why people don't.


Not sure where this is coming from. Most AAA studios have been using Perforce for decades. There is a smaller contingent using Git, mostly mobile and indie games. UE is not different in that respect. We have to deal with large, versioned binary assets in game development, something that Git struggles with (without things like LFS that are hacking around the architectural problem).

I don't know what to say about a studio not using source control in 2021. I see student projects regularly with full version control and CI.


That was my point. You need Perforce or Git LFS, so it's very different from what regular developers are used to. When I see "how can you not use source control in 2021?" it's mostly from people who don't understand that you can't just do a "git init" and be done with it.

I'm involved in the indie gaming scene and not using source control is still pretty common as far as I can tell.

UE and Unity both have further issues that make source control complicated, as illustrated by the sibling comments.


What makes it complicated? I don't do game development, so I have no idea what the project structure looks like.


In addition to large binary asset files, in Unity you have tons of yaml files that you don’t edit by hand. If you make a small change to a scene with the editor, it can change dozens of lines in the corresponding yaml file.

Now imagine 2 people working on the same scene. When you attempt to merge, you have to read and understand an auto generated file that was never intended for human consumption.

To a lesser degree, it’s analogous to using line based source control on a jpeg to handle merging edits made in photoshop.
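
Worth noting: Unity actually ships a semantic merge tool for exactly these files (UnityYAMLMerge, found inside the Unity editor's install directory), and Git can be pointed at it. A sketch of the documented Smart Merge setup, assuming the binary is on your PATH:

```ini
; .git/config (or ~/.gitconfig)
[merge]
    tool = unityyamlmerge
[mergetool "unityyamlmerge"]
    trustExitCode = false
    cmd = UnityYAMLMerge merge -p "$BASE" "$REMOTE" "$LOCAL" "$MERGED"
```

With that in place, `git mergetool` will attempt a semantic merge of scene/prefab YAML before dumping a raw text conflict on you.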


PHP's Composer package manager has this same problem. Basic Drupal updates used to be a total pain in the ass - any attempt at a Git merge would most likely corrupt Composer state as the default merge drivers don't have a clue how to generate valid JSON.

There's a custom Git merge driver for composer.json and composer.lock files. It can't generically merge all JSON but it's good enough to make merging upstream updates on a composer-managed project at least sort of tenable.
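
The mechanism underneath is Git's custom merge driver support: `.gitattributes` routes matching files to a named driver, and a config entry tells Git what command to run. A sketch (the `composer-json-merge` command below is hypothetical; the real drivers are third-party packages):

```ini
; .gitattributes:
;   composer.json merge=composer-json
;   composer.lock merge=composer-json

; .git/config -- %O, %A, %B are the ancestor, ours, and theirs versions
[merge "composer-json"]
    name = composer JSON merge driver
    driver = composer-json-merge %O %A %B
```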

This is entirely a tooling and time problem: it would be possible for Unity to ship a Git merge plugin for their file formats and add merge-conflict resolution to their editor, which would enable using Git with Unity. Unity is an ECS, which means all objects and the components of those objects should (and I stress, SHOULD) have unique identifiers that can be used for resolving merges. However, I imagine that isn't terribly easy to write, and they probably have better uses of their time since everyone likes using Perforce anyway.


So.. how is source control done then? Centralized/lock-based (subversion)? Everyone works off the same shared NFS drive? Manually copy "changed" files from multiple people over top of each other and hope for the best?


Centralized/lock-optional, and very large repositories.


Merging Unity files is somewhat tractable. You can try to resolve merge conflicts by hand (it sometimes works!) and you can split things apart into prefabs.


Sounds like the YAML shouldn't be in source control. I've always followed the guideline: if it's generated by the computer from some other thing, put the other thing in source control.


It's not generated from another thing. It's the editor's save format, so it's the direct output of human work. It's just not in a 'genuinely' mergeable format.

The fact is that we take VCS for granted as programmers, but the vast majority of people don't have access to workflows that allow true multi-user collaboration or branching/merging. Google Docs was a revolution because even though real-time joint editing is not that great, and no substitute for the "work independently, review, merge" workflow developers use, it's still better than emailing Word docs to each other. Which is, BTW, still extremely common even in firms that have Office 365; most people never learned about the sharing and collaborative editing features.

On a game, most people aren't devs. So VCS is less relevant to them, and git especially so, for the reasons someone else discusses below. Note the mention of Perforce in the article. Why are they using an expensive proprietary VCS? Well, Perforce is better at handling fully centralised workflows with large binary assets.


I'd be interested to hear the answer as well!

I am neither an Unreal nor games developer, but in case nobody provides a better answer...

From what I've heard, what makes source control tricky for game development is all of the non-text files.

Git's distributed "everybody has a local copy of the entire repo" approach is not well suited to binary assets, especially large and frequently changing ones. Nearly any "modern" game development project will have gigabytes if not terabytes of these. Imagine you're a game coder and every `git fetch` grabs another 400GB of textures from the level designers.

Last I heard many game dev shops still used Subversion instead of git for precisely this reason.

There are also workarounds/extensions for git; not sure of their maturity/adoption.
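
The main one people mean here is Git LFS: run `git lfs install` once per machine, then `git lfs track "*.uasset"` per pattern, which writes lines like these into `.gitattributes` (the UE extensions here are just an example):

```
*.uasset filter=lfs diff=lfs merge=lfs -text
*.umap   filter=lfs diff=lfs merge=lfs -text
*.psd    filter=lfs diff=lfs merge=lfs -text
```

Matched files then commit as small text pointers, while the actual binary content lives in separate LFS storage and is fetched on checkout.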


I've been in games for about 9 years. In my experience, virtually everyone uses Perforce. Studios that use Subversion seem to be the outlier.

You're dead on with the rest though. I'll add that UE4 is a large, hulking behemoth that generates a metric crap ton of intermediate files during builds and general development, which makes managing what to check in a bit of a bear. Additionally, a lot of teams try to avoid having artists re-compile their editor when they pull changes from P4, so some amount of compiled binaries get checked in as well (usually from an automated build system like Jenkins or TeamCity that runs after every source file commit).

There are also some parts of the engine that need to exist in order for other parts to run properly, but that don't get compiled when you're making changes to the engine itself (like helper programs, or platform-specific DLLs, which you need to specifically compile when you want them). Sometimes these binaries are checked into P4 for one reason or another, which leads to fun things like needing to remember that if you're checking in the DotNet binaries folder, you should explicitly NOT check in a couple of iOS-related DLLs, since the engine's build tool will try to recompile them every time you build a game for any platform, and will fail if those files aren't writeable. The engine is so massive at this point that there are several other things like this that need to be kept in mind when setting up version control on a project.

[edit: I originally said there were "at least 20" things like this and realized I couldn't think of that many, so I've revised]

The engine is very capable (there's a reason that AAA studios without an in-house engine default to it), but it's also got 20 years of legacy systems in it and a lot of pain points to deal with for projects of any real size.


Is there any room for a new company to make a clean break and offer something completely new, or would that take decades of engineering work?


More likely Epic themselves would offer source control. I run my indie studio off UE4 on git lfs, which I can do since we are small.

Perforce and Perforce-centric workflows tend to be broken not because of UE4 but because of bad workflow design. Take our parent poster's example: using source control for binary build distribution. It doesn't matter what engine you use; if you abuse source control to distribute full builds, your workflow will have pain points.

Same deal with the example of the iOS binaries. This is related to Perforce locking those files: "Doctor, it hurts when I stab myself, but my workflow requires it." "Don't check in build files" is the "no duh" answer. Such "no duh" answers from indies usually get "but you don't understand our unique insane required workflow" responses from big-team engineers.

In the future, when you hear about a weird nonsense workflow from a AAA team, keep in mind that part of the pain is self-inflicted. Big teams try optimizing for different goals, like not requiring artists to compile code, yet they do so in weird half-measures.


Are there "best practices" for dealing with the unique requirements of source control for game development shops?

Like for example, for non-game shops....

Back in the 2000s there were a lot of ways to use/abuse git, as it was fairly new. Gradually folks settled on the "git flow" workflow, and later the "GitHub flow".

It's maybe not the right workflow for everybody, but it's kind of a sane default for most projects/teams. And it generally pairs well enough with standard Jenkins/etc build pipelines.

Is there anything like that w.r.t. source control in the game dev world, or is it truly the Wild West with every shop having their own bespoke source control and build pipeline?


Thanks for the great answer!


Why not just split the assets and the code, have the code reference a version of the assets, which are on a separate server and only fetched when needed?


That's basically what git-annex is, but I guarantee that companies with this problem will be doing something disastrously incoherent with file shares and document_final_2 names instead.
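
The "coherent" version of that scheme is actually small: a manifest checked into the code repo pins each asset to a content hash, and a sync step fetches anything missing into a local cache. A toy sketch in Python (all names hypothetical; a real system would pull from an asset server or CDN instead of a local store directory):

```python
import hashlib
import json
import shutil
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Content hash used to pin and verify asset versions."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def sync_assets(manifest_path: Path, store: Path, cache: Path) -> list:
    """Fetch every asset named in the manifest into the local cache.

    The manifest (kept in the code repo) maps asset name -> expected
    content hash; 'store' stands in for a remote asset server keyed by
    hash. Returns the names that actually had to be fetched.
    """
    manifest = json.loads(manifest_path.read_text())
    cache.mkdir(parents=True, exist_ok=True)
    fetched = []
    for name, digest in manifest.items():
        local = cache / name
        if local.exists() and sha256_of(local) == digest:
            continue  # already present and matching the pinned version
        shutil.copy(store / digest, local)  # "download" by content hash
        fetched.append(name)
    return fetched
```

Because the code repo only versions the small manifest, updating an asset is a one-line diff, and unchanged assets are never re-fetched.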


I thought Git LFS had largely replaced git-annex these days? (I haven't used either, just going off [1].) Apparently git-annex never really supported Windows, which is a bit of a dealbreaker for gamedev.

[1] https://stackoverflow.com/questions/39337586/how-do-git-lfs-...


Such a separation makes sense to me (a non game dev) but the assets need version control too, though, so that doesn't really solve the problem.


Raw assets do; built versions of assets and binaries don't. Most workflows I've seen cache built data outside of source control. It needs to be in sync with the source, but not versioned. Downloading a 10GB+ built map data file over SMB is a sensible option: much faster than via Perforce. I've even seen BitTorrent solutions for this before.


That's the same thing as not using version control, given that the point of using UE4 is that you don't have to write much code. Note the repeated references to "blueprint spaghetti". Blueprint is a visual DSL for game designers. The assets end up encoding much of the game logic.



