Confessions of an Unreal Engine 4 Engineering Firefighter (2018) (allarsblog.com)
194 points by mooreds on April 11, 2021 | 119 comments



> By the Way, All Your Subordinates Are Lying to You (Because You Don't Listen)

> One very interesting thing I had found, when I came into an office as an outside objective evaluator and you had left me to do my job, was that approximately 2 nanoseconds later your employees were already telling me everything that was wrong about you, your company, and your development methodologies. It was not a matter of having a trusting face, or even promising them to make things better, it was about your employees who cared about the growth of the company, and more importantly about their own growth within a company despite feeling unheard and ignored. I just happened to be a new face that wouldn’t judge them for what they would say, and even if I did, often they are so tired of the problem being present that they would even just stop caring about any potential consequences of speaking out against their employer.

Whenever I read something like this I wonder if I've just been extremely lucky with my employers, because I see it so often that it sometimes feels like it's the norm (despite obviously not benefiting anyone). It should be noted that I don't work in the games industry, though.


I've encountered it a little bit, and in both cases I left as soon as I realized I wasn't going to make any more headway.

But the key is here:

> They immediately asked who was giving up this information and wanted to crack down on this perceived insubordination, rather than trying to address the issue

One of the key elements of the "boss" social role is to enforce subordination: people doing what you tell them to do. This is kind of intrinsic to the interpersonal dynamic, and it's quite hard to avoid doing at least some of the time even if you're aware of how it can be a problem.

The problem with people doing what you tell them to do is that what you tell them to do may be wrong or incomplete. You then have two choices:

- admit error. This is uncomfortable.

- increase the effort to subordinate the person so the boss does not lose face.

There are all sorts of examples and case studies of this kind of thing in command situations. Here's it causing a plane crash: https://www.cnbc.com/2014/02/10/asiana-airlines-to-pursue-co...


To a very large degree, the boss/management role is to support (through indirect control) the allocation of human capital to tasks in relation to objectives, with minimal inefficiencies such as conflicts, overlapping tasks, and miscoordination.

I would argue there is virtually no justification for managing people by means of direct control, for virtually any team/group size.


That's the organizational role. The interpersonal, social one is often a lot more primitive and has nothing to do with capital or resources.


Sorry for being so direct, but are we still talking about a small business/enterprise organization, or about family and friends?

> has nothing to do with capital or resources.

is the most confusing point for me.

In general, if you meant enforcing compliance, I would have a different response, but you clearly did not mean that in your original post.


In any organisation, although it's more likely to be bad in small ones. I'm talking about the difference between the formal, process-related aspects of management and the social/anthropological/dominance aspects. All the stuff that matters once voices get raised and feelings get high.

To tie this back to the original post: why is there information that subordinates have that would potentially improve the business, that they want to give to their seniors, but they don't? It's because that information is uncomfortable.

Also, all the stuff in the original article below "Incredible amounts of harassment of any kind". Why do business deals get done at strip clubs in the first place? It's not in the company's interest, it happens because those involved have enough power to use company resources for their own purposes and enjoy all the little displays of dominance involved in such an outing.


> social/anthropological/dominance aspects

These are highly subjective when compared to "formal" management functions.

> All the stuff that matters once voices get raised and feelings get high.

This is clearly an aspect of conflict resolution; if the management process is optimal, such situations are rare. Undoubtedly, there are cases where you have to change tone to get a point across, but this can only be rationally justified if all other means fail.

> why is there information that subordinates have that would potentially improve the business, that they want to give to their seniors, but they don't? It's because that information is uncomfortable.

This is only one of many reasons why this can happen. In general, learning, negotiations, and feedback are all uncomfortable processes to a degree. The major factor behind failures to improve organizational processes is the shared understanding that is the basis of communication, in addition to individual mechanisms around risk and responsibility.

> Why do business deals get done at strip clubs in the first place?

There is an important aspect of social interaction that precedes a business deal, similar to how there is dating before marriage.

> It's not in the company's interest, it happens because those involved have enough power to use company resources for their own purposes and enjoy all the little displays of dominance involved in such an outing.

Such a narrow view is simply false.


That seems extreme. Managers have to lead and make decisions. If they don't then you rapidly end up with chaos. That's how some firms end up with 10 different languages being used, where no engineer can work on anyone else's codebase because there's no consistency about anything, where progress is minimal because decisions don't get made due to some people disagreeing with each other and having no way to break the deadlock, etc.

I think it's become sort of fashionable to claim bosses are always clueless or shouldn't actually try to manage because their employees are always smarter than they are. The "servant manager" idea. That is dead wrong in my experience. Or rather, if the manager has no idea how to do the work their employees are doing and just sits out that part of their role, the team has much bigger problems.


>Managers have to lead and make decisions.

No, managers have to create the processes by which decisions are made and then make sure the processes are followed. One way of doing that is to make the process be to do whatever they say but it's not the only way.

A manager being the single point of final decision-making doesn't scale, since they will eventually not be knowledgeable enough to make a good decision and will become a bottleneck to any decisions being made. The only manager who is smarter than their team with regard to every decision is one who hires only idiots.


If that were true, the choice of CEO would be irrelevant or pointless as long as they had some sort of generic manager training. But in reality CEOs can make enormous differences. The textbook example of a non-servant manager is of course Steve Jobs, but most successful tech CEOs are famous for micro-managing some aspects of the business.


You seem to be really underestimating the difficulty of what I wrote. First of all, workable processes differ based on industry and the strategic goals of the company. If you apply the Google way of doing things to Pepsi you won't have a good time. Second of all, ensuring processes are followed when you have very intelligent managers under you trying to bypass them for their own benefit is amazingly difficult. Third of all, managers are also in charge of hiring and firing which ties into all of this.


> The only manager who is smarter than their team in regards to every decision is one who hires only idiots.

Reading this reminded me of a quote from a Skunk Works PM (if I recall correctly): "I only want to hire people who are smarter than me."


The concept of empowerment, within the context of mission command and the science of control, is arguably by far the most complex matter in human affairs.

> Managers have to lead and make decisions. If they don't then you rapidly end up with chaos. That's how ...

I thought I made this point clear with my example of inefficiencies.

While it is an oversimplification, I will attempt to exemplify this using a spatial navigation analogy.

Owners select the area and the resources.

Managers of managers make decisions on the approach to the area and, to a degree, with what resources.

Operational managers make decisions on routes and waypoints, given the resources.

Workers make decisions in between waypoints.

With such an example, one can argue this should be a top-down, one-way hierarchical approach: the old-school, inefficient approach (e.g. direct control).

From my perspective, the modern approach is to utilize indirect control by establishing conditional triggers (e.g. only specifying limitations on areas, approaches, routes, and waypoints) based on vertical feedback (up/down the hierarchy) and horizontal feedback (sideways communication).


Have you formally managed people before?


Throughout my life in small business environments of 2-50 people, and multiple times in emergency management exercises of up to ~100 people.


In any innovative field, if a manager is a full-time manager (not a TLM), then after a rather short amount of time they really have no idea beyond some universal process/common-sense things. This is not the same thing as “clueless,” provided he/she realizes this and goes to senior engineers for counsel.


I have explicitly instructed my team to let me know if I ask them to do something stupid so we can avoid the mistake. I was so excited the first time someone spoke up to alert me that I was overlooking a better option & made sure to celebrate the alert in order to keep the transparency going.

There may be teams where this direct approach wouldn’t work out, but so far it’s seemed to help.


The boss role does not have to enforce subordination; there are multiple styles of management which encourage self-leadership in terms of what people work on and how they do it. I'm personally a fan of David Marquet's "Leader-Leader" style.

But because so many people expect to be subjugated in their roles, it does take some work to get past normal societal expectations, and not everyone is a fit for it; some people just want a task list and to be told what to do. I think the world could be a better place if we could effectively help people sort into companies with the leadership style that best suits their working style.


> so many people expect to be subjugated

I was thinking something along the same lines. I work in an environment where a tiny bit of self-initiative does very well for one's work and career, and still some people can't get it through their heads. Unless something is very clearly stated somewhere, they won't do it no matter how obvious it is, even people who are trying to get a promotion. I swear, sometimes I think people would let the office burn down because yelling "fire!" is not among their stated position responsibilities.


This can primarily be attributed to how people perceive risk and to risk-aversion mechanisms; I don't think it would be false to assume it also relates to responsibility.

Self-initiating people are willing to take more risk, and thus more responsibility if things don't work out, whereas people without self-initiative take substantially less responsibility by default when things don't work out.


The old joke that "Consultants tell you what you already knew but didn't want to face." has been around forever for a reason. It's even an accepted tactic by business units to hire a consultant to document all these known problems as a means to force themselves to clean up their own act.


I have a close friend who's a director. I've always treated him like an equal but one day I told him that he was never getting the whole story. He felt devastated because he thought it was him but it's not him, it's just the title.

Part of getting the fancy title is understanding that you will never get the truth from subordinates; you will get a sugar-coated version of the truth that is close enough to the truth, but not so close that you'd rage and fire someone on the spot. Even if you would never do that in a million years, somewhere, somehow, someone who is giving you good and bad news will always sugarcoat the bad news yet hype up the good news.


It’s only natural though; one way to play the salary game is to maximize your perceived involvement and truck factor, while reducing the actual effort needed, all while avoiding upsetting colleagues.

Workloads at full capacity, with stakes distributed precisely according to power structures, with zero meaningful/impactful output and thus zero uncertainty, is Pareto optimal, if you view a corporation as a salary-distribution game that you’re a player of.


More often than not technical issues are in fact management issues. Plenty of evidence for that.


Sometimes it's an easy conversation starter. People like to bitch about work and talk about all the things they dislike at the workplace. Oftentimes it points to an underlying problem, though, and at the very least to some communication breakdown.


>It should be noted that I don't work in the games industry though

Well there's yer problem!

The games industry is basically terrible - the management is full of people who don't understand the industry they're in and see their employees as cheaper-than-slave labor and their customers as open pocketbooks to take. In management's defense, nobody understands the game industry; it's ridiculously hit-driven and producing a "good" game involves taking on as much unmitigated risk as possible. The only risk mitigation strategy you can adopt is diversification - funding as many projects as possible. But that went away along with the PS2 when game development budgets skyrocketed and audience numbers plateaued.

Each one of the factors I mentioned above can be gone into with great detail:

1. "Cheaper-than-slave labor": Game studio employees are frequently replaced with fresh hires to keep costs down; expertise is a liability in an industry that remakes how to build a videogame every 5-10 years. You can survive this but it's at the expense of your own health and sanity. You will be abused, mentally and physically, by your studio for the privilege of working at a games company until you quit.

2. "Ridiculously hit-driven": Game studios can spend literal billions of dollars on a project nobody likes. Management wants to do this because games that don't screenshot well don't market well, so you need literal armies of artists to texture every tiny detail in these games and you need them to work 996 so the project ships anywhere close to on time. But that also increases risk which means we need the games to cost more, both in terms of up-front price and after-purchase DLC and microtransactions.

3. "No diversification": Diversification works well when your audience is large relative to your production costs. This hasn't been true since the PS2 era; most companies struggle to afford to make even one game that's up to modern quality standards - now you want to make hundreds of them so we can just "see what works"!? In fact, if you look at the libraries of, say, the PS3 versus the PS4, you can almost spot the point at which publishers decided to just focus on one or two online games (which monetize very well) rather than tens or hundreds of projects (which are one-and-done things).

This has resulted in an extremely abusive working culture and I highly advise everyone I know getting into software development to not touch videogames with a ten-foot pole. Hell, I wouldn't even recommend it as a hobby: the "more, more, more, better, faster, and yesterday" mentality has even sunk into the people who play videogames. If your boss isn't berating you for not fixing up his fuckups fast enough, your customers will be literally threatening to kill you for having the balls to delay a broken game.

This has all been known since the early 2000s (look up EA Spouse on Wikipedia sometime if you want to know how much HASN'T changed); the entire games industry needs to crash, and it needs to crash yesterday, before any of this can be fixed.


Very interesting article. Some of these are shocking:

> Use Source Control: Do you not use source control?

How do professional companies not use source control in the year 2021? The mind boggles.

> Smaller Commits, Better Commit Logging: Submitting a 2GB 2000-file change to Perforce with the description "did some work" is terrible

Also quite surprising. If you are going to log your progress, you might as well properly document what actually changed.

> Don't Ignore Your Lead Engineer if You Have One: If your lead engineer says that you need to spend $2,000 on SSDs or buy X license for Y, and your answer is immediately no because you believe your engineer is just trying to waste your budget, you either need to learn how to trust your engineer or find a new engineer. I have seen a company waste 70 man-hours a week simply because they had their workers on slow hard drives. Giving everyone faster SSDs would have cost the company ~60 man-hours in budget.

This one really hits home and is something that I personally had to deal with. I had to complain and fight for a long time to get the company to actually buy an SSD (or another job-critical piece of time-saving hardware). Usually it's not management saying no to such requests; it's more a strong general attitude of saving money and keeping expenses low. I have quickly learned that this level of penny-pinching is a red flag and you should bail out as fast as you can, because it rarely gets better and that attitude is rarely limited to one department.
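To make the quoted SSD claim concrete, here is a quick back-of-envelope calculation. The $50/hour fully-loaded engineer cost is my own assumption; the $2,000 and 70 hours/week figures come from the quoted article:

```python
# Back-of-envelope check of the article's SSD claim.
ssd_cost = 2000            # one-time hardware spend, USD (from the article)
hourly_rate = 50           # assumed fully-loaded cost per engineer-hour, USD
hours_lost_per_week = 70   # article's figure for time wasted on slow drives

weekly_waste = hours_lost_per_week * hourly_rate   # $3,500 burned every week
weeks_to_break_even = ssd_cost / weekly_waste
print(round(weeks_to_break_even, 2))               # 0.57 -- pays for itself in under a week
```

Under any plausible hourly rate, the upgrade pays for itself almost immediately, which is the article's point.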

Also related: sometimes it's about money, but in a very indirect way. The amount of time I had to waste waiting for slow builds to complete because the corporate anti-virus scans every newly generated file is staggering. What should have been a 10-minute build took over an hour, all due to the newly mandated corporate anti-virus. Corporate IT could not adjust the security policy because they are understaffed, because management laid off 80% of the IT team, causing a huge backlog of work which then cascades to every other part of the business. Then they wonder why they get negative feedback during workplace reviews.

---

A thing I quite liked: the hierarchy ranking of engineers and distinction between senior and lead engineer.

---

Also, does anyone know of places where I can read similar war stories? These articles are always super interesting to read and learn from.


Regarding penny-pinching: at Amazon in 2013, the rule was that a sw dev could have one 24-inch monitor, but any second monitor had to be 19 inches or smaller. My manager gave me a second 24-inch monitor, and so I went to the hardware support guy to get an appropriate DVI cable. The support guy refused to give me a cable because I had an unapproved excessively large monitor. He could only give me the cable if he came to my desk to confirm that my second monitor did not exceed 19 inches.

All this haggling and wasted time over a monitor that cost about $100 more. I just bought my own cable (from Amazon!).

I guess they had a pile of ancient 19-inch monitors from years ago and were determined to "use them up" before spending more money on new monitors. If that means that highly paid SW devs are less productive because they're squinting at late-1990s monitors, then so be it.

In my experience this was typical of the penny-pinching Amazon mindset. As a result, I did not work there for very long.


I was working as a contractor for AT&T years ago. They issued me the same laptop that everyone got. Of course, it didn't have much RAM, and I was somehow lassoed into supporting the current implementation of Business Objects. It took long enough to start up BO that just two starts (let alone any actions) would have paid for a RAM upgrade. Over the course of two years, I was paid literally a couple thousand dollars to watch my computer load Business Objects.

They could approve overtime for a contractor, but couldn't approve a couple hundred dollars for a RAM upgrade, what with it being non-standard and all.


It’s possible they do other things as well, but is it _really_ true that a dev is more productive with a second monitor that’s 24″ as opposed to 19″? I used to be a huge fan of multi-monitor setups, but I have gone through years of having two 4K 27″ monitors, a single 4K, or just the plain old MacBook screen. I can definitely attest that a laptop screen alone is not good, but I'm honestly not sure about the added benefit of a second large monitor, let alone its size. If anything, I was probably less productive when I had two large screens! Furthermore, the most prolific engineers I know of rarely have more than one large monitor, if even that.

However, does having two large monitors make me feel good? Heck yes! So in terms of engineer morale it might help, but I doubt you were measurably less productive due to this handicap on an actual, non-emotional basis.


> So in terms of engineer morale it might help but I doubt you were measurably less productive due to this handicap on actual non emotional basis.

Except they probably were measurably less productive (if anyone bothered to measure), because software engineering runs on emotion and engineering morale. If you're doing anything more complicated than retyping code from printouts, how you feel will determine your speed and ability to focus. So even if the only thing a 24” screen did for GP was to not constantly annoy them, that's a real productivity improvement.


The point is that the discussion costs more than the purchase, so the only reasonable answer to a request to buy anything useful is "Yes, immediately, I'll buy one for every colleague who wants one". A manager who wants to save money on hardware is unprofessional (because they don't understand what is useful or not), an idiot (because they don't understand arithmetic and psychology), or both.

Being unexpectedly denied, through a three-person meeting, a $30 software upgrade was a significant factor in my last job change.


I bet ebay has some way of getting rid of old monitors. I'm surprised Amazon doesn't.


I think they were determined to squeeze as much "value" as possible out of those obsolete tiny monitors, by having someone use them for a few more years. Getting rid of them was not the problem.


Engineers used to swap war stories on the internet all the time. I think the switch from pseudo-anonymous to real-name social media has put a dent in that, but there are people still doing it on Twitter, e.g. @foone or @swiftonsecurity, among a lot of other output.

The trick now is limiting yourself to those war stories you'd be happy to have all your future job interviewers read :/


>> Smaller Commits, Better Commit Logging

In my current company (one of the top US financial institutions), a widespread habit is to use the Jira title as the commit message, be it a small Jira for a bugfix or something big spanning twenty commits and two months of work. Just because.

>> If your lead engineer says that you need to spend $2,000 on SSDs

Our more or less OK physical workstations (no SSDs, though) got replaced by thin clients connecting to virtual machines: slow in terms of both raw performance and video/mouse/keyboard I/O. At the same time, the big suits like to blah-blah about how important technology is, how "we are not a bank, we are a tech company with a bank department," and how much they invest in tech.

edits: grammar


Source control for UE games is a different beast. It should still be done, of course, but it's a lot trickier, so I understand why people don't.


Not sure where this is coming from. Most AAA studios have been using Perforce for decades. There is a smaller contingent using Git, mostly mobile and indie games. UE is not different in that respect. We have to deal with large, versioned binary assets in game development, something that Git struggles with (without things like LFS that are hacking around the architectural problem).

I don't know what to say about a studio not using source control in 2021. I see student projects regularly with full version control and CI.


That was my point. You need Perforce or Git-LFS so it's very different than what regular developers are used to. When I see "how can you not use source control in 2021?" it's mostly from people that don't understand that you can't just do a "git init" and be done with it.

I'm involved in the indie gaming scene and not using source control is still pretty common as far as I can tell.

UE and Unity both have further issues that make source control complicated, as illustrated by the sibling comments.


What makes it complicated? I don't do games development so I've no idea how the project structure looks.


In addition to large binary asset files, in Unity you have tons of yaml files that you don’t edit by hand. If you make a small change to a scene with the editor, it can change dozens of lines in the corresponding yaml file.

Now imagine 2 people working on the same scene. When you attempt to merge, you have to read and understand an auto generated file that was never intended for human consumption.

To a lesser degree, it’s analogous to using line based source control on a jpeg to handle merging edits made in photoshop.
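For what it's worth, Unity ships a "Smart Merge" tool (UnityYAMLMerge) that understands its scene/prefab serialization and can be registered as a Git mergetool; a rough sketch, where the tool path is an assumption that varies by platform and Unity version:

```shell
# Register UnityYAMLMerge as a Git mergetool (per Unity's Smart Merge docs).
# The path below is illustrative -- substitute your actual Unity install location.
git config --global merge.tool unityyamlmerge
git config --global mergetool.unityyamlmerge.trustExitCode false
git config --global mergetool.unityyamlmerge.cmd \
  '"/Applications/Unity/Unity.app/Contents/Tools/UnityYAMLMerge" merge -p "$BASE" "$REMOTE" "$LOCAL" "$MERGED"'

# On a conflicted scene or prefab, resolve with:
#   git mergetool path/to/Scene.unity
```

It won't save you from two people moving the same object, but it resolves the "dozens of spurious YAML line changes" class of conflict far better than a line-based merge.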


PHP's Composer package manager has this same problem. Basic Drupal updates used to be a total pain in the ass - any attempt at a Git merge would most likely corrupt Composer state as the default merge drivers don't have a clue how to generate valid JSON.

There's a custom Git merge driver for composer.json and composer.lock files. It can't generically merge all JSON but it's good enough to make merging upstream updates on a composer-managed project at least sort of tenable.

This is entirely a tooling and time problem: it would be possible for Unity to ship with a Git merge plugin for their file formats, and add merge-conflict resolution to their editor, which would enable using Git with Unity. They're an ECS, which means all their objects and components of those objects should (and I stress, SHOULD) have unique identifiers that they can use for resolving merges. However, I imagine that isn't terribly easy to write and they probably have better uses of their time since everyone likes using Perforce anyway.


So.. how is source control done then? Centralized/lock-based (subversion)? Everyone works off the same shared NFS drive? Manually copy "changed" files from multiple people over top of each other and hope for the best?


Centralized/lock-optional, and very large repositories.


Merging Unity files is somewhat tractable. You can try to resolve merge conflicts by hand (it sometimes works!) and you can split things apart into prefabs.


Sounds like the yaml shouldn't be in source control. I've always followed the guideline: If it's generated by the computer from some other thing, then put the other thing in source control.


It's not generated from another thing. It's the editor's save format, so it's the direct output of human work. It's just not in a 'genuinely' mergeable format.

The fact is that we take VCS for granted as programmers, but the vast majority of people don't have access to workflows that allow for true multi-user collaboration or branching/merging. Google Docs was a revolution because even though real-time joint editing is not that great and no substitute for the "work independently, review, merge" workflow developers use, it's still better than emailing Word docs to each other. Which is, BTW, still extremely common even in firms that have Office 365 - most people never learned about the sharing and collab-editing features.

On a game most people aren't devs. So VCS is less relevant to them, and git especially so, for the reasons someone else discusses below. Note the mention of Perforce in the article. Why are they using expensive proprietary VCS? Well, Perforce is better at handling fully centralised workflows with large binary assets.


I'd be interested to hear the answer as well!

I am neither an Unreal nor games developer, but in case nobody provides a better answer...

From what I've heard, what makes source control tricky for game development is all of the non-text files.

Git's distributed "everybody has a local copy of the entire repo" approach is not well suited for binary assets, especially large and frequently changing ones. Nearly any "modern" game development project will have gigabytes if not terabytes of these. Imagine you're a game coder and every `git fetch` grabs another 400GB of textures from the level designers.

Last I heard many game dev shops still used Subversion instead of git for precisely this reason.

There are also workarounds/extensions for git; not sure of their maturity/adoption.
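The best known of those extensions is Git LFS, which keeps small pointer files in the repo and stores the actual binaries out-of-band on an LFS server; a minimal sketch (the tracked extensions are just illustrative):

```shell
git lfs install                     # enable the LFS filters for this clone
git lfs track "*.uasset" "*.umap"   # route Unreal binary assets through LFS
git lfs track "*.png" "*.wav"       # likewise for raw art/audio
git add .gitattributes              # the track rules are stored in .gitattributes
git commit -m "Track binary assets with Git LFS"
```

Clones then only download the binary versions they actually check out, rather than the full history of every texture.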


I've been in games for about 9 years. In my experience, virtually everyone uses Perforce. Studios that use Subversion seem to be the outlier.

You're dead on with the rest though. I'll add that UE4 is a large, hulking behemoth that generates a metric crap-ton of intermediate files during builds and general development, which makes managing what to check in a bit of a bear. Additionally, a lot of teams try to avoid having artists re-compile their editor when they pull changes from P4, so some amount of compiled binaries gets checked in as well (usually from an automated build system like Jenkins or TeamCity that runs after every source file commit).

There are also some parts of the engine that need to exist in order for other parts to run properly, but that don't get compiled when you're making changes to the engine itself (like helper programs, or platform-specific DLLs, which you need to specifically compile when you want them). Sometimes these binaries are checked into P4 for one reason or another, which leads to fun things like needing to remember that if you're checking in the DotNet binaries folder, you must explicitly NOT check in a couple of iOS-related DLLs, since the engine's build tool will try to recompile them every time you build a game for any platform, and will fail if those files aren't writeable. The engine is so massive at this point that there are several other things like this that need to be kept in mind when setting up version control on a project.

[edit: I originally said there were "at least 20" things like this and realized I couldn't think of that many, so I've revised]

The engine is very capable (there's a reason that AAA studios without an in-house engine default to it), but it's also got 20 years of legacy systems in it and a lot of pain points to deal with for projects of any real size.


Is there any room for a new company to make a clean break and offer something completely new, or would that take decades of engineering work?


More likely Epic themselves would offer source control. I run my indie studio off UE4 on git lfs, which I can do since we are small.

Perforce and Perforce-centric workflows tend to be broken not because of UE4 but because of bad workflow design. Take our parent poster's example: using source control for binary build distribution. It does not matter what engine you use; if you abuse source control to distribute full builds, your workflow will have pain points.

Same deal with the example of the iOS binaries. This is related to Perforce locking those files: "Doctor, it hurts when I stab myself, but my workflow requires it." "Don't check in build files" is the "no duh" answer. Such "no duh" answers from indies usually result in "but you do not understand our unique insane required workflow" responses from big-team engineers.

In the future, when you hear about a weird nonsense workflow from AAA teams, keep in mind that part of the pain is self-inflicted. Big teams optimize for different goals, like not requiring artists to compile code, yet they do so in weird half-measures.


Are there "best practices" for dealing with the unique requirements of source control for game development shops?

Like for example, for non-game shops....

Back in the 2000s there were a lot of ways to use/abuse git as it was fairly new. Gradually folks tended to settle on the "git flow" workflow, and then later the "github workflow".

It's maybe not the right workflow for everybody, but it's kind of a sane default for most projects/teams. And it generally pairs well enough with standard Jenkins/etc build pipelines.

Is there anything like that w.r.t. source control in the game dev world, or is it truly the Wild West with every shop having their own bespoke source control and build pipeline?


Thanks for the great answer!


Why not just split the assets and the code, have the code reference a version of the assets, which are on a separate server and only fetched when needed?


That's basically what git-annex is, but I guarantee that companies with this problem will be doing something disastrously incoherent with file shares and document_final_2 names instead.


I thought Git LFS had largely replaced git-annex these days? (I haven't used either, just going off [1].) Apparently git-annex never really supported Windows, which is a bit of a dealbreaker for gamedev.

[1] https://stackoverflow.com/questions/39337586/how-do-git-lfs-...


Such a separation makes sense to me (a non game dev) but the assets need version control too, though, so that doesn't really solve the problem.


Raw assets do, but built versions of assets and binaries don't. Most workflows I've seen cache built data outside of source control. It needs to be in sync with the source but not versioned. Downloading a 10GB+ built map data file over SMB is a sensible option - much faster than via Perforce. I've even seen BitTorrent solutions for this before.
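The "in sync but not versioned" idea boils down to comparing a content hash of the local cache against whatever hash the build published, and only re-fetching on mismatch. A rough Python sketch, with the `fetch` callback standing in for the SMB copy or torrent pull:

```python
# Keep a local cache of built data in sync with a published hash,
# without putting the built data itself under version control.
import hashlib
from pathlib import Path

def file_sha256(path):
    """Hash a file in chunks so multi-GB built data doesn't blow up memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def sync_built_data(local, expected_sha, fetch):
    """Re-download only when the cached copy is missing or stale."""
    local = Path(local)
    if local.exists() and file_sha256(local) == expected_sha:
        return "up-to-date"
    fetch(local)  # e.g. copy over SMB or pull via BitTorrent
    return "fetched"
```

The published hash changes with every build, so the cache stays in lockstep with the source without any of the built data ever touching Perforce.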


That's the same thing as not using version control, given that the point of using UE4 is that you don't have to write much code. Note the repeated references to "blueprint spaghetti". Blueprint is a visual DSL for game designers. The assets end up encoding much of the game logic.


Since I don't often see UE4 related posts on the HN front page but love the technical community here, gonna go ahead and take the opportunity to ask what people in the industry think about current job opportunities for entry level folks transitioning from web development work.

I know the old story about conditions being worse and avoiding turning your passion into your job/hell, but anyone able to chime in?

I'm working on a VR game as a learning experience and using UE4 with the hope that as AR/VR grows more popular over the next decade I'll have set myself up for moving to the field, but I don't know how early I might be able to make that move or how quickly opportunities are growing. A few years ago my hope was that by now VR would have grown to the point that some of the digital agencies here in NY would have work doing branded VR experiences or some type of VR work for non-gaming related VR uses / for non-technology companies. But I haven't necessarily seen VR roles popping up at agencies / marketing firms.

Anyone with any insight on VR or UE4 job market general trends?


(I am not a game dev but work at a games startup with a bunch of big industry people) To be successful at finding employment in games with UE4, you have to be able to create game systems without using their blueprint feature. It's insanely powerful, and people really like it for rapid prototyping, but relatively frequently blueprint devs can't transition to C++/"raw" game code. The same is true for pretty much any game engine - the ability to make games beats the ability to use editor features.

If you're looking for entry level work, I would say having a wealth of items in your portfolio is helpful (across several genres/styles, meaning get some work out there that isn't VR/AR related). Also remember that paying gamedev work is usually nothing like making games for yourself. Think hard about what and why you enjoy making games. Are you comfortable working on something like a desktop version of Candy Crush? Or a poker game? In my experience (on the internet) you hear most about A.) AAA Crunch, and B.) Massively successful or massively failed games. The middle is where most employment happens. Mediocre games making modest sums of money.

If I was interested in working with UE4 for games I would kinda just tailor my resume to working at Riot. Valorant is written in UE and they're a huge name/very established business, so you'll be shielded from some/most of the garbage of the games industry.

The job market is good. Lots of AAA studios are moving away from custom engines into UE/Unity because of the advancements in the engines making them more hireable/popular.

Just my 2 cents, and I'm certain others have wildly different experiences.


>> To be successful at finding employment in games with UE4, you have to be able to create game systems without using their blueprint feature. It's insanely powerful, and people really like it for rapid prototyping, but relatively frequently blueprint devs can't transition to C++/"raw" game code.

You make it sound as if you would inevitably hit a brick wall when using blueprint. This is not true. Game design is different to programming and some coding skills are almost a requirement for blueprints. Lots of game logic is better implemented as blueprint than c++.


If your only, or primary, experience building games was scripting systems in blueprints (or analogous visual scripting language), you wouldn't get hired as an engineer at most large studios. There is a role for that on those teams: technical designer. The (gameplay) engineer exists to build large systems that might have hooks that allow designers to script/customize the functionality through a visual scripting system. Engineers provide their value lower in the stack.


> You make it sound as if you would inevitably hit a brick wall when using blueprint. This is not true.

In my experience, devs that spent all their time with blueprints did not transition well away from blueprints. YMMV.


Maybe they should not transition away from blueprint.

I get that being aware of pitfalls and managing teams to do things "the right way" can be very tricky.


I'm not really interested in arguing with you. There are times when Blueprints are insufficient, and if a team is comprised exclusively of Blueprint devs, that team will be unsuccessful. This post was about UE employability, to which I added my input that being versatile is important, which from an employability standpoint is pretty objectively true.

Blueprints are great! People should use them where appropriate, and use something else when not.


Visual coding is still in its infancy. We have decades of experience for how to design systems with textual code, and a ton of tools (IDEs, static analyzers, refactoring tools) that work with text.

The fact that some systems are better implemented as text is not an indictment of visual coding as much as it is a testament to the maturity of text as a language for coding.

Game design is only one part of the problem anyway; often a feature which is simple from a design standpoint is complicated to implement in code. This is why you don't just hire game designers who dabble in coding, but also programmers.


> Visual coding is still in its infancy. We have decades of experience for how to design systems with textual code, and a ton of tools (IDEs, static analyzers, refactoring tools) that work with text.

Visual coding has been around since at least the 1970s; we have decades of experience with it, too.


And yet, in that time period, it has remained in its infancy.


Or, extensive experience has shown that it is best used as an adjunct to textual programming.


Exactly - you hire both, and for good reason. The point is that there is absolutely no reason for somebody not to be successful in game design working with blueprint.


Ideally C++ programmers should know how to make blueprint components wrapping their C++ code, and understand blueprint programming enough to know how to make usable blueprint apis.


> but relatively frequently blueprint devs can't transition to C++/"raw" game code.

I agree that if you are looking to become a gameplay or engine programmer, demonstrable C++ (or C# for Unity) is essential. Bear in mind that js or python, and sometimes even C#, are often considered scripting languages for non-programmers.

Other roles might be closer to your skillset, though. Some backend/server/tools stuff might use python. Some engines use a streamlined form of javascript to drive the UI. Or there's UI/UX design roles.


Been in the industry for a decade and am a hiring manager for engineers. Overall, the industry has never been better and there are plenty of job opportunities available to work on a wide variety of games.

In terms of general advice for junior engineers, I wouldn't get too caught up on technology (ie engine). The sorts of things I am looking for are: do you have strong computer science fundamentals? have you taken game related projects from conception to completion? how well do you work within inter-disciplinary teams? and do you evaluate your work from a user's (ie player's) perspective?

Unreal dominates AAA for third party studios and I don't think that will change in the near future. Whether Unreal expands its dominance to mobile is an open question, but with more HD games going to mobile (or starting development with HD and mobile as platforms) I would probably bet on it. Unity probably isn't going anywhere in the near future but I don't expect it to shake Unreal's hold on large game production; they're simply too far behind.

I am fairly pessimistic on VR as a whole, but there will likely still be a small market there for the foreseeable future. With Sony recommitting to VR for PS5 and Oculus continuing to pump some (albeit much smaller) money into the space, I expect VR to remain a stable niche. AR is a lot harder to predict as Apple is the big unknown. There are a lot of companies ready to jump on that opportunity (ie Niantic) if someone can figure out the right hardware.


Don't shift your professional life towards VR or AR. You will be disappointed. I see lots of opportunities popping up with UE4 at the moment. If you enjoy that and have web design or coding skills, go ahead. You probably don't want to dive straight into character animation, but there are lots of other interesting fields around UE4.


“I know the old story about conditions being worse and avoiding turning your passion into your job/hell, but anyone able to chime in?”

I think you already know the general culture of the work you want to get involved in. VR projects are usually side projects and considered nice-to-have rather than need-to-have. There haven't been "killer apps" recently to drive massive adoption from industries with deep pockets such as infrastructure or transportation. As much as I respect media and entertainment, the pool of dollars is hotly contested. It's better to work for other industries.


I'm not convinced VR is set to grow a whole lot. I think that the VR we were looking forward to back in The Lawnmower Man / Matrix days is a long way off. All we have today is TVs that you glue to your eyes, which aren't really close to what we've been fantasizing about in sci-fi for decades.

* Most headsets probably aren't good enough to not cause headaches after about 2 hours. I also think they could lead to eye problems.

* Right now we only have a very limited subset of senses available. (Touch, Taste and Smell are still a long ways away from being available.)

* The way to interface with VR is still very crude. It still uses a traditional controller or at best some Kinect-like interface, which has the problem of needing an expensive treadmill-like device to prevent you from bumping into walls.

* The headsets are still expensive. The hardware it takes to run games capable of taking advantage of the headsets is expensive. The cost of developing those games is also crazy expensive.


VR has fundamental usability problems. If you and the world move separately, people get disoriented. Most of the successes involve a virtual world locked to the physical world (Beat Saber) or where the player sits in a vehicle. Still, there are dancers using VRchat with full body tracking, the closest thing yet to full dive.

AR, in some ways, has more potential. In AR, people can still see where they are. It's mostly a hardware cost problem. If you could get AR goggles down to swim goggle size and selling for $39.95, you could do Pokemon AR.

I'm interested in big virtual worlds. Lately, that field has been invaded by the Make Money Fast / NFT / ICO crowd. They've been building crappy virtual worlds with overpriced virtual assets. I'm not sure how this ends.

Second Life has found a new big success. They added an area of "premium homes" two years ago. If you sign up for a paid membership, which is about $100/year, you get a nice upper middle class American house in a nice neighborhood. The neighborhoods are well laid out, nicely landscaped, and have several distinct styles. You can decorate your house, invite people over, have parties, visit other people, walk or drive around, or just hang out in your house. They've built over 60,000 houses so far and are struggling to keep up with demand. The neighborhoods are all hand-built, not just stamped out from some template.

These are planned unit developments in a virtual world. It's something people stuck in some crappy apartment want - the American Dream. [1]

This isn't the only option - you can buy raw land outside the planned unit developments and build your own thing. But many people just want to buy a nice virtual lifestyle. Something nobody seems to have anticipated is that there's a market for rather boring virtual worlds.

[1] https://youtu.be/2peH7aeuwPI


>Something nobody seems to have anticipated is that there's a market for rather boring virtual worlds.

Philip K Dick predicted it: Accessorize your Perky Pat Layout!

https://en.wikipedia.org/wiki/The_Three_Stigmata_of_Palmer_E...

>The story begins in a future world where global temperatures have risen so high that in most of the world it is unsafe to be outside without special cooling gear during daylight hours. In a desperate bid to preserve humanity and ease population burdens on Earth, the UN has initiated a "draft" for colonizing the nearby planets, where conditions are so horrific and primitive that the unwilling colonists have fallen prey to a form of escapism involving the use of an illegal drug (Can-D) in concert with "layouts." Layouts are physical props intended to simulate a sort of alternative reality where life is easier than either the grim existence of the colonists in their marginal off-world colonies, or even Earth, where global warming has progressed to the point that Antarctica is prime vacation resort territory. The illegal drug Can-D allows people to "share" their experience of the "Perky Pat" (the name of the main female character in the simulated world) layouts. This "sharing" has caused a pseudo-religious cult or series of cults to grow up around the layouts and the use of the drug.

https://en.wikipedia.org/wiki/The_Days_of_Perky_Pat

>In this novel, survivors of a global thermonuclear war live in isolated enclaves in California, surviving off what they can scrounge from the wastes and supplies delivered from Mars. The older generation spend their leisure time playing with the eponymous doll in an escapist role-playing game that recalls life before the apocalypse — a way of life that is being quickly forgotten. At the story's climax, a couple from one isolated outpost of humanity plays a game against the dwellers of another outpost (who play the game with a doll similar to Perky Pat dubbed "Connie Companion") in deadly earnest. The survivors' shared enthusiasm for the Perky Pat doll and the creation of her accessories from vital supplies is a sort of mass delusion that prevents meaningful re-building of the shattered society. In stark contrast, the children of the survivors show absolutely no interest in the delusion and have begun adapting to their new life.


I'm going to go out on a limb and say you don't own a modern VR headset?

Although you still could and hold this opinion; I'm just curious.

Honestly, even if you don't own one, I'm not saying we should dismiss your opinion; it arguably means more, or at least could mean more.

The thing that sticks out to me is: do you think these are good arguments for "games won't grow" or "VR usage won't grow"?

I'm thinking about when video games first were gaining popularity since it's a really good comparison imo.

* tv/game eye problems
* limited controls
* very crude interfaces
* expensive hardware (thinking of original pc games to even the N64 $100+)

So if we're not close to the ideal, then it's not worth it? "Matrix days is a long way off" - yeah, it probably is... so?

If you want the VR of today, or even of 5 years from now, to be "the Matrix" or the holy grail of VR, then yes, you will be disappointed.

All that to say: the hypothesis you're proposing - "VR won't grow a whole lot because we're far away from the Matrix VR experience" - is not totally unreasonable, but I personally think it's really weak and has been proven wrong historically in so many industries.


> Still uses a traditional controller or at best some Kinect like interface. Which would have the problem of needing a expensive treadmill like device to prevent you from bumping into walls.

> The headsets are still expensive. The hardware it takes to run games capable of taking advantage of the headsets is expensive.

The most popular VR device right now is the Oculus Quest 2, which is $299, needs no additional hardware, and uses fully motion tracked controllers (along with hand tracking). There's a boundary tracking system to prevent running into walls.

Also, it is growing a lot on a year-over-year basis, although of course the base is still tiny compared to cell phones etc.

So some of your info is out of date. I think that some of the challenges being worked on right now are content, device size/weight, and also integrating eye and face tracking into headsets. [1]

[1] https://www.theinformation.com/articles/mark-zuckerberg-on-m...


$300 is not cheap for a lot of people. It is more than I've ever spent on a computer monitor, and I use those every day!


I don't think it's exactly expensive either, relative to other devices that have seen broad adoption. Computers, smart phones, TVs, (LCD/tube) monitors, tablets etc didn't reach that price point until they were in the market for quite a while.

Sure it might need to be cheaper to be affordable to 100% of the world, but it's not even close to saturating the market in high income countries yet (or the market share of much more expensive devices).

BTW the $300 device is not just a display, it includes a fairly high end smartphone chip in addition to a high res-screen, battery, lenses, controllers, 4 cameras etc. I wouldn't expect major cost reductions anytime soon.


People drop $1000 on smartphones all the time. It might be expensive, but it's not unobtanium anymore.


> People drop $1000 on smartphones all the time.

you live in a different universe


An iPhone 12 with bumped storage and sales tax is about $1000. Apple probably sells roughly 200 million phones a year that cost at least $800? I'm not sure how many premium smartphones Samsung sells these days.

I'm not sure how you can consider that another universe? It's a pretty big market right here on planet Earth?


??? but almost no one buys the iphone at their full price. e.g. here if I want an iphone 12, it'd cost me 249€ with a mobile plan (https://m.boutique.orange.fr/mobile/details/apple-iphone-12-...)


FWIW I think it is ludicrous too. I'm just saying how it is.


> Still uses a traditional controller or at best some Kinect like interface. Which would have the problem of needing a expensive treadmill like device to prevent you from bumping into walls.

It is a shame how many games developers don't want to let people use the analogue stick to move in VR. It's uncomfortable for some people for a bit, but for the majority it's fine and lets you do all the normal in-game things. HL: Alyx is the most recent example where Valve originally didn't offer the option for simply walking and turning smoothly in VR. I get that you want your game to be accessible to the greatest number of people, but you're limiting your design space this way and leaving aside a part of the market.


FB claims[1] Quest 2 sales drove Facebook’s Q4 non-advertising revenue to $885 million, a 156% increase year-over-year

More than 60 Oculus developers have reached $1M+ revenue

Quest 2 has sold more units than all other consumer headsets combined for the history of VR.

It’s possible we are early in seeing VR cross the chasm. Or perhaps people are adopting VR due to shelter in place and will abandon it after the pandemic.

[1] https://www.roadtovr.com/2021-vr-forecast-key-indicators/


GPU shortages have hit VR at a very unfortunate time. It was just starting to take off a little bit, but now you have to trade your first-born child to get a 3080 card to really drive VR.


Hasn't Facebook effectively killed PC VR anyway? (discontinuing the Rift S, forced FB logins, and focusing on closed-platform mobile VR)


Surprisingly, no, not really. Facebook has greatly improved the Oculus Link functionality for the Quest. You can now play PC VR wirelessly with an application called Virtual Desktop.

I beat Half-Life: Alyx this way and it was awesome. No link cable is my preferred way to play PC VR. They killed the Rift S mostly because the Quest 2 is the better unit overall.

I do suspect, though, that they will eventually stop supporting PC VR, but today it works pretty well.


The only companies making serious money in gaming are:

Epic, Apple, Google, Steam, Nintendo, Sony, Microsoft.

National Governments who charge 10-20% sales taxes on customer spending.

Game/app development would be ok if not for the huge taxes that Platform Owners and Governments charge.

You're typically left with about 50% of what customers actually spent on your app.


I don't understand what Epic is doing on that list. Your thesis seems to be that the platforms keep all the profit. Epic's platform is a giant pit that they're shoveling money into as fast as they gain it.

If you're referring to Fortnite, I don't think it's that big of an outlier.


Was the store's cut really lower back in the boxes-on-shelves days?


The margin on digital sales is a lot better than physical sales, and every large publisher has been pushing the transition to digital as fast as possible. A publisher is getting ~70% on every digital sale compared to ~50% for physical.


I think a major complaint is that for 30%, people expect some sort of marketing, but really it's nothing more than payment processing and download.

A store literally provided a box on a shelf for browsing and brought in customers to browse. The digital "stores" don't offer much in that respect.


Steam has the promoted carousel, "players like you" suggestions, curator suggestions, the discovery queue, reviews, new release announcements, the startup splash, seasonal sales, wishlist alerts, screenshots and promo videos...

It doesn't perform the same hard editorial winnowing as brick and mortar, no, and in a digital world it's hard to see why it should. The long tail has fans too and they were terribly served by the old system. But it's hardly fair to call Steam marketing "nothing".

Edit to add: the digital stores also provide an update channel, something which had no real equivalent back in the brick and mortar world. That has real business value.


VR is at its peak due to Oculus quest. VRChat has to turn off server features to handle the surge of users on the weekends now. There's competing social games like chilloutVR, xbox and playstation are coming out with new VR this fall (in conjunction with valve so these are probably real headsets).

Consoles can finally run VR, so I expect there to be a big boom there. Lenovo has some sort of prototype office vr glasses that allow you to have a bunch of virtual screens you work in.

One use for VR I would have is such an infinite screen feature. I'd like to be able to have multiple windows open when debugging so I can see all data at once. I don't believe this exists yet.


I'm an indie developer, so limited perspective for sure. Focusing on games and the consumer market.

The main issue I can see with VR is the same pattern that played out in mainstream PC gaming - where 20% of the market takes the vast majority of the market's dollars. That means the VR market can sustain far fewer companies, even though games cost more and are primarily developed by indies.

I'll bet Half-Life: Alyx and Beat Saber took in more than a third of all money spent on VR last year, whereas everyone else is left competing for scraps.

If you make or find a game studio that is breaking into that upper category then you may be in a very comfortable position for the riding future growth.


VR is going to be an inherently smaller market, not just because of the cost of entry but because it's hard to make it casual.


tbh, The Quest 2 selling like hotcakes as a portable Beat Saber machine makes me think otherwise. The same issues apply to regular consoles too.


I think that for VR:

buyFactor = price / howGoodTheTechIs

If either the denominator increases or the numerator decreases, it could become more popular in the future.


2019 was the Year of VR.


Thanks for asking this question, I'm in a similar situation and have had almost the same conundrum on my mind.


> Allowing reported issues to result in zero action is the quickest way to convert your employees, especially your engineers, from enthusiastic company-first workers into "clock in, clock out, it’s not my problem" workers. Most employees, again especially engineers, want to improve the company and themselves… until you prove that their thoughts do not matter.

The flip side of this, which I have also experienced, is that the entire company simply accepts the issue as "normal" and stops trying to deal with it. Generally this happens when people have tried many times to solve it and the team has lost a lot of time. People get annoyed when you bring it up because everyone is dealing with it all the time and if you really brought it up every time you saw it, every sentence would reference it. In a healthy company this is the behavior you expect around "the problem" the company is working on: everyone efficiently and implicitly references the problem in their work. There's a thin line between nibbling around the edges of a problem and biting the same spot without effect.


Very interesting read. Some things about the difference between middle/senior/lead-developers is a bit too generalized / controversial. But this quote stuck with me:

"What you do not want are employees who are blindly following you down a potentially self-destructive path because they believe anything the company does is what it should do."

What I find fascinating about the Unreal engine is how fast they are able to iterate and implement new features without the whole thing falling apart. Reading their release notes [1] is crazy. No idea how they manage to keep going at that pace.

[1] https://www.unrealengine.com/en-US/release-notes/


Much like humans and apes, my experience as a games-adjacent engineering leader is that game dev and non-game dev practices diverged somewhere in the early 90s, and are now two completely different animals. The ideas that feel so elementary to non-game devs are foreign to game devs, such as having a good git flow, continuous integration, and so on. This is, in part, why the games industry was caught off guard by the mobile revolution. Game devs weren't paying attention to what was happening in the non-game dev space.


> Game devs weren't paying attention to what was happening in the non-game dev space.

Well of course not, they were too busy crunching 70-hour weeks trying to hit that arbitrary release date so they could be fired and spend months unemployed before the next game studio ramped up for release. /s-only-not-really


As someone whose entire career was spent doing dev and dev management in the SaaS app world, but who's tempted to experience what it's like to work in a game studio, it seems like the jump is pretty hard to make. Not only is it a different type of software writing, but the processes and the expectations seem pretty different as well.

Seems like the two worlds are not that simple to switch between, which you're saying as well.


As a (primarily) web developer I think a huge difference is that games are immediate mode rendered as opposed to retained.

I think a lot of complexity that web devs deal with is due to it not being immediate mode.
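A toy contrast (my framing, not the parent's): in immediate mode the scene is re-described from state every frame, while retained mode keeps long-lived objects the app must find and mutate, which is where a lot of the sync complexity comes from.

```python
# Immediate mode: nothing persists between frames; the whole scene is a
# pure function of current state, re-emitted every frame.
def render_immediate(state, draw):
    for label, value in sorted(state.items()):
        draw(f"{label}: {value}")

# Retained mode: a persistent widget; the app must locate and update it
# whenever the underlying state changes, and keep the two in sync.
class RetainedWidget:
    def __init__(self, label, value):
        self.label, self.value = label, value

    def text(self):
        return f"{self.label}: {self.value}"
```

With the immediate style there is no stale widget to forget about; with the retained style, every state change implies a matching mutation somewhere, which is roughly the bookkeeping frameworks like React exist to automate.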


Discussed at the time:

Confessions of an Unreal Engine 4 engineering firefighter - https://news.ycombinator.com/item?id=16775166 - April 2018 (115 comments)


Check out this amazing link from within the article: https://blueprintsfromhell.tumblr.com/


I feel like without having worked on UE4 blueprints I'm not getting a whole lot of comedy here. Yes, sure, some of these are messy or elaborate, but from hell? For all I know, half of these could be best practice blueprints and I wouldn't be able to tell the difference.


Hiring someone with systems sense is something I wish a lot more companies could and would periodically try, emergency or no. Software devs are awash in their own experiences, and there should be a huge market for outside visitors to come in and re-assess, review, suggest, or dabble in some re-orientation.

This article is such an amazing, unspoken revelatory tale that applies to so many kinds of software engineering environments.


Not in the game industry, but this really hit home. Some great insights, and I have many horror stories of my own from software dev.


Brings back nightmares of my own working in the Los Angeles area games biz.


Care to share any war stories?



