The Strange, Unfinished Saga of Cyberpunk 2077 (newyorker.com)
170 points by mitchbob on Dec 11, 2021 | 255 comments




Source: "My friend who works w/ games knows someone who's at CDPR"

Rumour has it, management got extremely ambitious with AI. Every actor in the game world was to have a personality, daily routines etc.

The devs tried to explain how absolutely unfeasible that was, but management was convinced it was possible - and would indeed be ground-breaking.

After years of toiling with the problem, they finally saw the writing on the wall. The NPC AI was scrapped and replaced with crude hacks on the level of "we'll just spawn the cops around the player so we don't need a chase mode", to allow them to ship in 2020.

Now they're stuck in limbo, because the players are still waiting for those marvellous AI miracles that were promised, but they remain absolutely impossible to deliver. CDPR can patch things like the car physics, but people's expectations were thoroughly mismanaged and they'll forever be disappointed.


That's amusing because a game like Deadly Premonition managed to pull that off in a decently convincing way. We're not talking about hundreds or thousands of NPCs, of course, but everyone had a daily routine that they went about and the various places in town had opening hours. Most of the story would only progress if you were in the right place at the right time.

Not the most polished game but easily the best murder mystery and Twin Peaks style life sim I've ever played.

So if CDPR couldn't pull it off, they must have been asked to do it at an unreasonable scale or the requirements themselves were unreasonable.


A lot of other games (the first one that comes to my mind is Zelda:BotW) also have NPC routines, opening hours etc. but if you look closer it's just a simple O(1) mapping between time ranges and expected state. It gets much more complicated when you actually try to pull off NPCs that may decide to do different things based on their modeled traits, react to in-world events or player actions etc. which I assume is what grandparent was talking about.
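To make that concrete, here's a minimal sketch of such a time-range-to-state table (the schedule entries are made up for illustration, not from any actual game):

    # Static BotW-style NPC schedule: time-of-day ranges mapped to an
    # expected state. No decisions, no memory - just a cheap lookup
    # against the game clock for each NPC.
    SCHEDULE = [
        ((6, 8),   "eat_breakfast_at_home"),
        ((8, 18),  "tend_shop"),
        ((18, 20), "drink_at_inn"),
        ((20, 6),  "sleep"),  # range wraps past midnight
    ]

    def expected_state(hour):
        for (start, end), state in SCHEDULE:
            if start <= end:
                if start <= hour < end:
                    return state
            elif hour >= start or hour < end:  # midnight wrap
                return state
        return "idle"

    print(expected_state(9))   # -> "tend_shop"
    print(expected_state(23))  # -> "sleep"

That's the whole trick - and also why it falls apart the moment NPCs are supposed to react to events rather than the clock.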


Most players just want a decent looking, mostly bug free, interesting Witcher-3-but-it's-Cyberpunk game.


Exactly. I don't expect to be able to follow an NPC and see a life-like routine, just something convincing enough.


LOL, this reminds me of an episode of Mythic Quest where the lead engineer (Poppy) imagines "real-time, persistent, globally-reflected environment changes" for the MMO they built, gets told it's unfeasible by the entire dev team, plows ahead anyway, and ends up with a disastrous demo at the end when the system basically collapses on itself.

Funny how reel-life and real-life collide into each other! :D

---

As an aside, after Ted Lasso (which gets all the well-deserved hype), Mythic Quest is a great trope-filled show esp. for anyone who's worked in tech in general (and game dev specifically).


Sounds vaguely inspired by Ultima Online


I keep forgetting that Mythic Quest exists. I need to watch that.


I’m a developer, though not in game dev. This sounds feasible, what am I missing?


A game today which is struggling to accomplish this is Star Citizen, and they did a video on what they plan to do here: https://youtu.be/TSzUWl4r2rU

Anyways, I think the big challenge compared to real time apps like Google docs is that game simulation is significantly more complex than synchronizing bits of text. One problem is that at a minimum, you have the round trip to the server as latency between “submit command” and “receive simulation result”. And this has to happen many times per second so that it looks realistic. And then on the server, it has to be processing the changes from N connected players M times per second, and then simulate what happens in the universe. And if simulation (M) drops too low, you get desync problems where players are rubber-banding and the AI gets very dumb. (An example is that star citizen appears to have great AI in demos, but in the live server, the servers can’t process changes fast enough for the AI to work well.)

And then, if you want a lot of connected players or a big universe, this just gets more and more complex, and you'll eventually outgrow the ability of a single server to process changes. At which point you also need to handle data replication among multiple servers. For example, Star Citizen game servers don't have capacity for more planets at the moment, either because of memory or CPU constraints. So sharding the universe into multiple servers is a pre-req for the next stage.

So to more directly answer your question, it’s definitely possible to implement. Many games have basic versions working. Minecraft is an example, but it can’t scale to hundreds/thousands of players. New World is another recent example, but even its servers have player caps. No Man’s Sky is sort of an example, but it keeps players spread out so that they rarely encounter each other. Eve Online is also an example, but I think it reduces the amount of nitty-gritty simulation details. (So it might not have a very in-depth physics simulation, or first-person-level details.)

So it’s basically a scaling problem, and even then still has limitations based on the speed of light (for internet latency) which limit how good it can get.


That reminds me of the early work on the internet arguing that the amount of global connectivity wouldn't scale due to power laws. In a way, they were assuming a system like these global world sims. The equivalent of static routing (players that want to play together enter same server) exists, but the "dynamic routing" field is mostly a game of selecting which illusion to present the player (sometimes, it's okay to just stick people on a different instance but make sure their friend group is preferentially on the same one, etc).


I'm not sure you understand. Cyberpunk is a single player game.


I was responding in the context of a grandparent comment, which was talking about features for a fictional MMO which turned out to be infeasible, and then someone asked why they were infeasible.


Star Citizen and the fictional MMO in Mythic Quest are multiplayer.


I'm guessing you'd need Discord level tech to propagate changes in a multiplayer environment which is a bit of a distance from "core competencies".


> Rumour has it, management got extremely ambitious with AI. Every actor in the game world was to have a personality, daily routines etc.

They received some AI-related grant[1] from the local government and did whatever they could to avoid penalties.

[1] https://www.cdprojekt.com/pl/media/aktualnosci/30-mln-zl-dla... - original source but in Polish


What's really weird about it to me is that games like Oblivion and Skyrim were doing sandboxy open world AIs with daily routines over a decade ago. That franchise also had more than its share of bugs, but the AI did more or less work.

That CP2077 couldn't even reach that level remains kind of baffling.


In Oblivion and Skyrim IIRC, the characters were mostly individually configured by the level designers. That works fine for a world where you have what boils down to a few small towns but I don't think the same approach could be applied to a place the size of Night City.


In Skyrim you could place poisoned apples in your enemies' chests so that they'd eat them and die some days later... And that was 2011. Bethesda will need a miracle, because the expectations for TES6 are too high.


Bethesda's Elder Scrolls and Fallout series have far fewer NPCs in their worlds and much less densely populated areas compared to a game like Cyberpunk, where there are crowds of people around every corner. If they really did intend for every single NPC actor to be on a TES-level schedule, I can see where the difficulties came from.


The conceit being that I want to watch some McDonalds employee brush his teeth before heading out to work, or get the chance to talk to some guy about his mortgage at a bar.

I don't know why you'd build a game simulating more depth of interest in background characters than people actually possess, and it's not like occlusion culling is a foreign idea.


And the largest city in Skyrim has 74 inhabitants. This kind of stuff has a price.


Bethesda oversold their AI all the same though. Google for Oblivion and "Radiant AI", pretty big drama back in the day.


> Bethesda oversold their AI all the same though

The big difference was that Radiant AI was fun. You could join the Warriors' guild, jump across the road to the Mages' Guild, use an enrage spell on them, then run out into the street. Instant chaos ensued.


It was probably that they didn't have time after scrapping everything else and trying to hit deadlines.


Gothic (the game) introduced me to that concept 20 years ago.


You can go much further back, to e.g. the Ultima games, especially Ultima 7. The world was incredibly big and you had a decent approximation of a living world.


Another one from the same time frame (1989-1992) was Lure of the Temptress, built on the Virtual Theatre engine: https://en.wikipedia.org/wiki/Virtual_Theatre


Found the English version of that source[1]. It seems to have had five criteria[2], with AI being one of them. Would be interested if you had a link about the penalties you mention regarding GameINN.

[1] https://www.cdprojekt.com/en/media/news/cd-projekt-group-sec...

[2] https://www.pwc.pl/pl/pdf/alerty-innowacje/13-PwC-Alert-GAME...


I have found[1] the government research grant program "Intelligent Development Operational Program for the years 2014-2020" ("Programu Operacyjnego Inteligentny Rozwój 2014-2020 działanie 1.2") and funding number/code "POIR.01.02.00-00-0105/16", but am ill-equipped to search the surrounding[2] PDFs for penalties.

It was, and still is, my assumption that there are penalties.

Also, this seems to be another grant - "City Creation" is documented separately[1] and with different amounts from "Animation Excellence", "Cinematic Feel" and "Seamless Multiplayer"[3]

[1] https://archiwum.ncbr.gov.pl/fileadmin/gfx/ncbir/userfiles/_...

[2] https://archiwum.ncbr.gov.pl/programy/fundusze-europejskie/p...

[3] https://archiwum.ncbr.gov.pl/fileadmin/gfx/ncbir/userfiles/_...


I am not sure what you laid out is unfeasible, but perhaps the way they approached it was unfeasible and left them with no time to iterate or pivot.


CDPR doesn't iterate and test their code in the optimal way: most of the time they just go at it full on and patch things up at the end. This was (un)fortunately for them a project where the complexity rose so high that their approach couldn't handle it (covid didn't help either), and they were left with intertwined systems that were causing severe performance bottlenecks everywhere (think polynomial or even exponential). AI the way they planned it meant spawning hundreds of NPCs, all interacting with one another and the player, with badly designed systems for that interaction (because why spend upfront time on this if you can floor it), and almost no consideration of how that would impact the graphics and physics sides of the game. The whole development of this game is a case study in how to fail at keeping your externalities in check.


That doesn’t make a lot of sense though. In my country, first-year CS students build Game of Life versions with thousands if not millions of individual entities all going about their business and interacting with each other, in some shitty math-based programming language, using math they typically get from Google because it comes from biology.

How can that be a thing if what you say is true?


I would say, first, that's the power of exponential complexity. Second, AI in games tends to consist of somewhat complicated goal-based & fuzzy-logic behaviors. Third, game AI agents have interactions that take place within a 3D world that has to be modeled accurately, which means lots of expensive computations for things like line of sight, pathfinding, and so on. Finally, given that games are soft real-time and include remarkably expensive 3D rendering, the compute budget available for AI functions is, I would guess, somewhere on the order of 2 ms per frame (of a 16.7 ms frame, at 60 fps).

Though, points two through four can usually be handled without issue, provided the devs know what they're doing & architect the engine correctly. (Edit: and keep the scope in check, which is ultimately the responsibility of the leadership.) Unfortunately, #1 appears to be where things got messed up for them.
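A common trick for living inside a budget like that is time-slicing: give AI a fixed slice of each frame and round-robin agents across frames. A rough sketch (the 2 ms figure and the agent.think() API are assumptions for illustration, not any particular engine):

    import time

    AI_BUDGET_S = 0.002  # ~2 ms of a 16.7 ms (60 fps) frame

    class AIScheduler:
        # Round-robin AI updates under a fixed per-frame time budget,
        # so expensive queries (line of sight, pathfinding) amortize
        # across frames instead of spiking a single one.
        def __init__(self, agents):
            self.agents = agents
            self.cursor = 0

        def update(self):  # called once per frame
            if not self.agents:
                return
            start = time.perf_counter()
            for _ in range(len(self.agents)):
                if time.perf_counter() - start >= AI_BUDGET_S:
                    break  # out of budget; resume here next frame
                self.agents[self.cursor].think()  # hypothetical API
                self.cursor = (self.cursor + 1) % len(self.agents)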


You don't really need to rerun your AI logic every frame though. It's more than enough if you do it every ~second, and can compute it on a separate thread.


That depends on the type of game. The game I am working on would not work with an AI update every second.


Are we talking about the same Game of Life? In Conway's, the update logic for a cell depends only on the immediate neighbors of that cell, so the time to update an entire frame is O(m*n), which is not hard for any matrix size that would reasonably fit on a laptop screen.
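For reference, the entire update fits in a few lines of Python (assuming a wrap-around grid at the edges):

    def life_step(grid):
        # One Conway generation: each cell's next state depends only
        # on its 8 neighbors, so a full update is O(m*n) for an
        # m-by-n grid of 0s and 1s.
        m, n = len(grid), len(grid[0])

        def live_neighbors(r, c):
            return sum(grid[(r + dr) % m][(c + dc) % n]
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                       if (dr, dc) != (0, 0))

        return [[1 if live_neighbors(r, c) == 3
                 or (grid[r][c] and live_neighbors(r, c) == 2) else 0
                 for c in range(n)]
                for r in range(m)]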


I am pretty sure that given the constraints (must operate reasonably well on last generation consoles and minimum specs with decent graphics) the problem space is pretty much impossible.


why? SimCity-like games do this already to a degree.

definitely not trivially easy, but impossible!? nah.


SimCity games do not simulate individuals. Even when they do, the individuals are not very complex.


Cities Skylines seems to, but just like for the vast majority of systemic gameplay, it's going to be an approximation of some kind and that's fine. CDPR needed to find that compromise between full simulation and believable approximation, but it seems like they ran out of time to get it working.

Star Citizen's "Subsumption" and Quanta system is very similar in goal, and if it goes to plan it will be one of the most engaging "AI population" simulations to date. It doesn't individually simulate every person in real time; instead it simulates a set of tasks and demands to be done, all the resource nodes, and an approximate population of simple entities carrying out those tasks. When you spawn near said tasks, it spins up a "meta NPC" to represent, in full 3D space, an agent of one of those simplified population entities.

So for example, you fly to a mine site, the system knows there are currently 40 miners here mining aluminium to take to the factory, and it knows what point they're at with their tasks, so it generates 40 NPCs to represent those miners. At that point you can interact with them and influence the simulation - say, steal their ship and minerals. That updates the simulation, but the simulation need not care about the specifics of the NPC, just that he has no minerals and is no longer capable of mining.

You fly away with their ship, and all the other NPCs return to being simplified entities in the graph.
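In code terms, that two-tier pattern looks roughly like this - a hedged sketch of the general idea, not CIG's actual Subsumption/Quanta implementation:

    from dataclasses import dataclass

    @dataclass
    class AbstractMiner:
        # Cheap bookkeeping entity: numbers in a table, not a 3D actor.
        site: str
        cargo: float = 0.0
        has_ship: bool = True

    @dataclass
    class FullNpc:
        # Expensive 3D actor, spawned only while a player is nearby.
        backing: AbstractMiner
        has_ship: bool = True
        cargo: float = 0.0

    def coarse_tick(miners, dt):
        # Runs for the whole universe, all the time: pure arithmetic.
        for m in miners:
            if m.has_ship:
                m.cargo += 0.5 * dt  # made-up mining rate

    def hydrate(miners, site):
        # Player arrives: turn the local slice into real NPCs.
        return [FullNpc(backing=m, has_ship=m.has_ship, cargo=m.cargo)
                for m in miners if m.site == site]

    def dehydrate(npcs):
        # Player leaves: fold player-caused changes (say, a stolen
        # ship) back into the cheap entities; the 3D actors go away.
        for npc in npcs:
            npc.backing.has_ship = npc.has_ship
            npc.backing.cargo = npc.cargo

So "steal their ship" is a flag flip on one NPC while you're nearby, and a single boolean in the global graph once you leave.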


Star Citizen is notoriously considered vaporware; I wouldn't really give it as an example of "achievable with limited time, budget, and resources."


Yeah, probably not the greatest example of a lean operation. But the argument was that it's impossible on current hardware according to GP, which I don't think is true.

In regards to Star Citizen being vapourware, my recommendation would be to take a critical look at their timeline and what state the game is currently in and ask if you could really see it going any quicker. It's a controversial project, but I think calling it vapourware is just the industry being dismissive.

They had to build the team from nothing - now 700 people - and they are building two games, not one, both pretty groundbreaking in features and scope. Considering that, it is actually moving pretty quickly. They have already delivered a bunch of those groundbreaking technical hurdles to players. Of course the whole thing could still implode, but it's looking good.

They deliver consistent progress reports and updates to the playable alpha, which has enough content for me and my friends to play regularly for fun now. Updates every three months is the schedule, plus any in-game content events that get run.


> Every actor in the game world was to have a personality, daily routines etc.

There's a similar system in the S.T.A.L.K.E.R. saga, and maybe they could have built upon it? Of course making every NPC act for itself instead of as a sort of hive mind (I don't have a better term for it) sounds very difficult, but it seems rather weird that they couldn't do some sort of similar implementation, right?


They can make it work in a barren hellscape with fewer than a hundred characters, but not in a bustling urban setting as densely populated as Manhattan. I wouldn't call that weird at all.


> Every actor in the game world was to have a personality, daily routines etc.

I tried this on a limited scale in a text-based MUD in the 90s, and even then it quickly became an ungodly nightmare. What sounds good in your head isn't always feasible in reality.


Risking hubris:

With current GPT-3 technology (or even GPT-2, which was accessible by then), it should be possible to make each agent unique. But if the research started earlier than that, I am not surprised by the dissonance between marketing and devs.


The release date is not when work on the game starts, it's when work on the game ends.


The solution: paid overtime. Time and a half after 8 hours, time and a half after 5 days, and those multiply.
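To see how "those multiply" plays out, a toy calculation using exactly those rates (illustrative only, not any jurisdiction's actual rules):

    def ot_multiplier(hour_of_day, day_of_week):
        # Pay multiplier for one worked hour: time and a half past
        # 8 hours in a day, time and a half past 5 days in a week,
        # and the two stack multiplicatively.
        mult = 1.0
        if hour_of_day > 8:   # 9th+ hour of the day
            mult *= 1.5
        if day_of_week > 5:   # 6th or 7th day of the week
            mult *= 1.5
        return mult

    print(ot_multiplier(9, 6))  # 9th hour on a 6th day -> 2.25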

This is why film scheduling is a serious discipline and software scheduling is a joke.


This is a brilliant idea and I'm curious if anyone has any opinions on why it wouldn't work. Paid overtime shifts financial responsibility to the PM who ordered overtime, because arguably a more competent PM would either push back against unrealistic deadlines or manage the team more efficiently to avoid overtime crunch. It sets the stage for competent management to prevail while incompetent managers are forced out of the org. I think this is great incentive structuring.


Devil's advocate:

As paid overtime becomes more common, people start factoring it into production costs, and standard-hour salaries are adjusted accordingly. When people negotiate they get told "don't worry, with X hours of overtime you'll get pretty good pay" and we're back to square one.

For a real-world example of that, Japanese companies have had these kinds of deals for a few decades.


If that does happen then there was never going to be a way to fix this but no one is worse off. If that doesn't happen then the problem is at least somewhat fixed. Sounds like a good argument to give it a shot.

I'm generally all for Devil's Advocate, but when it comes to giving workers reasons to _not_ demand more, I need to see some really compelling evidence of serious harm before I'm willing to tell people to reconsider.


> I'm generally all for Devil's Advocate, but when it comes to giving workers reasons to _not_ demand more, I need to see some really compelling evidence of serious harm before I'm willing to tell people to reconsider.

makeitdouble isn't saying workers shouldn't demand more; they're saying that paid overtime doesn't solve the problem of crunch time. It makes crunch time acceptable, and (potentially) gets factored into remuneration packages in the long term.

If you want to solve the problem of crunch you need a solution that removes it, not one that pays people to continue doing it.


If crunch time continues but workers are making money commensurate with the fact that they're working outside reasonable hours, then I view that as progress. If base pay is depressed to make up for this change and workers make functionally the same amount, that's a bummer.

But either way: I don't necessarily view crunch time as a thing that can or will be solved. I just want the game to be more fair than it is now.


Crunch doesn't really exist in large areas of the software industry. There's no reason why it can't be solved for gaming, especially at large, well-funded studios like CDPR.


To add to onion2k’s point, a better “let’s give it a shot” alternative is to raise the base remuneration to account for the potential overtime.

That way if you’re overflown with overtime it’s in your compensation, if you manage to not do any overtime you’ll benefit from it.

It’s arguably harder to negotiate, but I think it would be the best way forward. Now, if the choice is between nothing and paid overtime, yes it’s worth it getting paid overtime.


Part of the problem though is that this needs to be phrased as a regulation or rule; it's not something companies are just going to do, as non-unionized workers have little ability to extract the real value of their labor in negotiations.

Creating a regulation that says “workers in this sector get overtime pay” is probably easier than creating an industry-specific minimum wage.


People are worse off because those who actually can protect their time make less if overtime is needed to reach competitive pay.


There are many countries where not paying at least 2x for overtime is illegal, and nothing like that is happening.


Most of these countries also have special statuses for skilled workers where the pay isn’t related to worked hours.

In France it was often a separate contract you had to sign, but most people above "junior" level were pushed into it. It works pretty well in some companies, and is heavily abused in others.


It can really depend. For example in the UK there is the Working Time Directive which limits employers to a 48 hour maximum working week. Except the employee can opt out of it voluntarily and the power relationship means this is almost never actually voluntary. No doubt some places have strong laws but many have so much wiggle room the laws on the books look nothing like reality.


Insightful comment.

And I will add that we constantly complain about how there are just a few hours daily of truly useful work. Adding more money will not change anything. Paid overtime also can give people the wrong incentives.

Edit: what I mean is that we should not plan for overtime. Of course it should always be paid.


Just claiming an opposing stance isn't enough to validate it. This is just a silly opinion for the sake of argument.

I live in a country with a 40-45 hour max work week. After that, you pay 1.5x rates. Saturdays are 1.5x rates for salaried workers and Sundays and public holidays are 2x. There are very serious legal consequences for trying to skirt the system.

As paid overtime becomes the norm, it becomes another metric to track. Project managers that routinely run into overtime are punished. The same as any other expense caused by analysis failures.

You seem to think paid overtime suddenly incentivises a company to just up the price of their product, like it was equivalent to the rising price of materials. It just doesn't work that way.

Not everyone is a piss poor planner who can't react to changes in environments. Plenty of projects I've worked for have been pretty close to budget. When overtime is required, we talk to the customer. Their money, their decision ultimately.

If your costs unexpectedly skyrocket, you're not keeping customers.

I'm getting a little bored of this "qualification by difference of opinion" reasoning. You've gotta actually qualify your opinion.


Your comment is overly dismissive and provides no better evidence than the comment you are responding to.

Not everyone is a poor planner, but deadlines become firm when large marketing spends are deployed, as they are in the gaming industry.


How do other kinds of software companies with large marketing spend deal with it? Hint: Often not with death march crunch time.


Some very badly run (although not unsuccessful) games companies scheduled in crunch as part of their project plan. Disturbingly some still might.


At least those devs get something back for their crunch time then...


The comment you replied to explained that the rates would be the same as now after some adjustment.

(Of course, you can disagree with that.)


We had paid overtime working on APB at Realtime Worlds, across the whole salaried team. I do think it curbed the crunch a bit, but it did nothing for the project management side of things. There were so many things wrong with that project, though, that it's probably not a great case study. The other side of it is that people would work extra hours even if they didn't need to, which you might think is great, but it actually leads to lots of downstream effects, because there is more to test that's been written by people past their best.

My personal take is that the overall discipline of managing software projects is so woolly and ill-understood to begin with that the incentives aren’t really there beyond the obvious fairness of actually paying people for the work they do. You don’t need a cost to competently push back against unrealistic processes.


It’s not that it wouldn’t work, it’s that in order for management to put this kind of restriction on their own power in place, they’d have to first admit they need it. People usually don’t like to tie their own hands.

Now, if the employees unionised like in the film industry, it might happen.


Well, CDPR had paid overtime (as is required by law of any company in Poland and nearby European countries - the notion of programmers being exempt from overtime rules is, well, mostly USA-specific) and this apparently didn't work, so the argument that "there should be paid overtime" does not seem relevant to this article at all.


This is exactly how it is in Poland, by law.

150% salary for weekday and Saturday overtime (200% if at night), and 200% for Sundays and bank holidays, with a hard limit of ~400 overtime hours a year.


In general, in Poland most senior people work as (pretend) "contractors", because the taxes are drastically lower for them than for full-time employment (it's essentially a massive country-wide tax evasion scheme which is, at least for now, tolerated by the tax authorities). None of the labor protection laws (high overtime pay etc.) apply to them. I don't know if CDPR also hires a ton of "contractors", but I would be very surprised if they didn't - they'd be saying no to a ton of experienced people who just won't work as full time employees.


It depends on the company; some corps do it, some don't.

I've worked in both types; usually contractors are treated as second-class citizens, e.g. some perks don't apply to them.

I have yet to see a company where contractors are the majority.


That's because most companies don't consist solely of very senior people (they can't afford them). Among very senior and valuable people, contractors are a majority I think.


I think this is more like cost cutting by some (cheaper?) companies. Companies that can afford normal work contracts don't play games with the other kinds.


Small correction: Saturdays are also 200% (for the first 8hrs, the remaining ones are 150% - yeah, strange but I was bitten by it once).


Given that CDPR are in Poland, I guess we can say it didn't work?


In the sense that it didn't pay off for CDPR. The developers were paid regardless.


I haven't seen the pay schedule detailed, but I have read that CDPR paid for the overtime, at least for the Polish employees (who are guaranteed that pay by law).


And yet the movie industry - which has over half a century's head start over gaming - still has its fair share of studio excess, cost overruns and expensive flops.


I quit the video game biz over crunch.

When I was made to work three nights without sleep for some imaginary deadline, and staff came in on the fourth morning and they had to pry the gold CD out of my hand as I was asleep on the floor under my desk - that was the end for me.

https://www.amazon.com/Abomination-PC/dp/B00001ZT3Z

I left and became the World's #1 seller of Beanie Babies. (True story)


Isn't that common in Europe, even required by law in most of its countries? I'm not sure about Poland though.


> Isn't that common in Europe, even required by law in most of its countries? I'm not sure about Poland though

Yes. I live in the Netherlands and we have this. I am a software engineer working in a free sector (meaning I don't have a union or collective branch contract), but it's normal that overtime pays extra. It's also the reason I rarely work overtime. Plus it's fairly common to be compensated in time: if I work on a Saturday I get 1.5 days off on my paid vacation sheet; Sunday pays out 2.


So it's a bit better than in Poland then; we have 1.5x / 2x pay for overtime but it doesn't translate into multiplied vacation days, i.e. if you work on a Sunday you may either receive 2x pay or one extra vacation day.


> The solution: paid overtime

The first step to getting there is a union.


What makes you think so?

Mainstream economics would suggest that high productivity and plenty of competition between employers gets you benefits. That's what worked for programmers rather well. Thanks also to low barriers to entry for new companies.

(Especially if you compare programmers to people in other industries, also in other countries and in other historical eras. That would make the impact of unions vs the factors I outlined above more apparent.)

In any case, feel free to form or join a union, if you want to try it.


The state of game development jobs - with near-universal "crunch", lower pay than much of the industry, and sexual harassment and discrimination everywhere - suggests that the market is not correcting for these things, doesn't it?

And, of course, all this while gaming is a booming industry attracting (and requiring) workers in droves.


Don't extrapolate to the whole sector from a few high profile failures. There are places like e.g. Hello Games, who don't do crunch, and my friends at Ubisoft tell me they do only a little overtime right before shipping. If you watch SIGGRAPH, you can see a lot of good real-time graphics research come out of DICE, Epic, Unity. I doubt those people are doing much overtime either. Not that I would go work there, being a game developer is below my pay grade, but not every place is as bad as ActiBlizz.


Sure, but it is much more the norm than in other industries, at the very least. And Ubisoft had their own sexual misconduct and discrimination scandal quite recently[0].

[0] https://kotaku.com/ubisoft-ceo-and-others-blamed-for-institu...


> The state of game development jobs, with near universal "crunch", lower pay than much of the industry, sexual harassment and discrimination everywhere, suggests that the market is not correcting for these things, isn't it?

It totally is!

So first, I suspect crunch time sort-of works. In the sense that it allows you to bring out a product a bit sooner, but burn out your people in the process.

The game industry is as bad an employer as it can get away with, because so many people are eager to join. That's why the game industry can mostly get away with low pay and crunch time.

Working your standard programming job at Boring Inc is less cool, so people demand more in return. More money, more perks, better treatment.

It's all about equilibria.

So if you want to 'fix' the game industry, the way to go about this is to make the alternatives more appealing.

There are plenty of examples where regulation (or similar) tried to fix unsavoury industries by fiat. Eg look at Orange in France. Orange is the former French telecoms monopolist. As such, they employ way more people than current management wants.

But firing people is extremely hard in France. As an unintended consequence, management has no incentive to make the lives of marginal employees bearable, and every incentive to avert their eyes from bullying, etc.

Cue the French complaining how 'neoliberalism' causes suicides at Orange. https://www.dw.com/en/france-orange-top-bosses-caused-employ...


> The game industry is as bad an employer as it can get away with, because so many people are eager to join. That's why the game industry can mostly get away with low pay and crunch time.

This is the whole thing; you've laid it bare here. The games industry is exploiting its workers' passion.

If there were game workers' unions, we'd see a lot less of this type of behavior, as has been shown time and time again.


There are already minimal barriers to entry for starting your own game company. So employees already get plenty of choice.

I have no clue how a union is supposed to help here. Perhaps they can restrict who can work in the games industry? Just like they keep non-union people from appearing in Hollywood movies?


Unions work by helping employees get leverage so they can actually negotiate working conditions with the company. Any individual developer is easily replaced, but the whole workforce is not.

The current state of employee "choice" between companies all colluding to offer the same conditions is no choice at all.


How can they all collude, when entry into the industry is so easy? It's hard to keep big conspiracies alive. You can literally form a new game company in your bedroom right now, and many people do.

Someone could break rank, hire programmers for a dollar more or an hour less crunch time, and still make a killing, couldn't they?


No, they couldn't. There are a handful of companies making almost all traditional games, a handful more of indie successes, and then a terrible morass of companies making gambling traps called "mobile games".

If you're not working for EA, Activision-Blizzard, Ubisoft, Sony, Microsoft, Bethesda, Valve, Bungie, Epic, Rockstar, or Riot Games, you are almost certainly not working on a major video game project, and certainly not on a game that will make hundreds of millions of dollars in profit.

And conversely, if all of the developers on, say, World of Warcraft were to strike tomorrow, Blizzard would not be able to afford to replace them. That's the power of a union.

I have no idea where you got this idea that it's easy to start a gaming company. Unless you plan on making a mobile game, it's a very hard industry to enter, like all other art industries.


I worked for 3 years as a unionized game developer at a public-sector company in Canada and can confirm it is absolutely the best situation - for developers. All other non-unionised companies I've worked for - except for the small boutiques - were deeply unhappy burnout factories. Yes, you can vote with your feet - but the journey to the door is over searing hot coals.

Unionise.


You can vote with your feet before you ever set your feet in the door in the first place.

Please unionise all you want, but don't force me to be part of your union or adhere by what you negotiate.


Paid OT is not something an individual can negotiate for when it is unprecedented in the industry.

It also wouldn't surprise me if major game studios have a handshake agreement to never offer paid OT. It would probably be easier to avoid a class action lawsuit for that than salary fixing a la Google and Apple et al.


You don't have to negotiate explicitly.

As a customer in retail you seldom negotiate explicitly with Burger King or Walmart or Amazon directly; you just vote with your feet (and wallet).

Investigate your potential employers before you take a job. If you like none of them, at least pick the lesser evil, or move into an adjacent industry. (Eg lots of game programmers are more than welcome in the Internet industry or in finance, etc.)


If you want to compel an industry to fundamentally change its compensation structure, history indicates that you do need to organize and negotiate collectively and explicitly. See the 8 hour workday and the 5 day workweek as examples.


There is paid overtime for some employees, including engineers, instead of the project completion bonus other employees get. It's a bit of a wash.


This solution is the law in probably 99% of countries?


All overtime at CD Projekt was paid, in full accordance with EU policies. This isn't Japan/Korea, where you can't leave your desk before your supervisor does, even if it means sitting until 23:00 doing nothing and not being paid for it.


> This is why film scheduling is a serious discipline and software scheduling is a joke.

How would you test / falsify this?


Look into how film completion bonds work. I wrote that up a few months back.[1] Film Finances has been doing completion bonds for video games since 2007, but I'm not finding much info about that. I wonder how the details work.

[1] https://news.ycombinator.com/item?id=27687265


Thanks.

Tangentially related: https://socialgoals.com/


Because films aren't delayed or go over budget?


I think it's less common. Especially for the shooting part.

But I suspect it has more to do with most big films being made in relatively similar ways.

It would be interesting to compare different film industries in different countries and different times.

Eg classic Hollywood studio system, modern Hollywood, Bollywood, etc.


Cyberpunk 2077 is one of my favorite games of all time.

It was unplayable on console and a buggy mess on release (even on PC), but some of the acting and storytelling in the game is really something else. I've been playing games for 30 years and nothing has really come close in terms of voice acting: Judy; V (both the male and female VOs are great, but I prefer the female one); Keanu Reeves doing his best work, totally at home in a melodramatic setting; and all the smaller characters having proper voice actors that are more than just an afterthought.

The story is interesting, there are limited, but real choices along the way, along with a plethora of non-impactful decisions, that merely change how your character interacts with the world. The world itself feels great, the architecture of the city feels real, and it's the first game that I've played that explored the vertical aspects of a city, though they definitely could've done more.

The action aspects are a little limited, surface-level stuff. It's such a shame the game wasn't given the scope it has right now plus another year or two of polish (pun unintended...); I think the systems in the game could have been fleshed out and been much more fun.

For anyone that has moderately decent hardware, I highly recommend a playthrough.

EDIT: forgot about the music. Might be the best soundtrack of anything ever. Movie/TV/Game/Other.


I agree. When I first heard the story was about a rock star turned terrorist I thought, what the hell, that makes no sense - but the story really does work and the characters are completely believable.

From a psychological perspective I found Johnny Silverhand fascinating; there is something about an insane, self-confident rock star that you can imagine leading a cult terrorist organisation in another life. I found myself thinking someone like Steve Jobs might have been a terrorist in another life - that unshakeable self-belief coupled with charisma, narcissism and actual skill is fascinating to me. And they made it work. I've not played a game with that depth before.

Possibly second favorite game to the original deus ex.


I feel similarly. The art is amazing and really sucks you in. But that is perhaps its downfall: an overfocus on aesthetics and creativity rather than gameplay. David Jaffe, the creator of God of War, mentioned that the lead developers of the game were creative people rather than people who have the qualities to finish things up. I forget his exact words, but that was the gist of it.


Makes sense, the game is 10/10 on all counts w.r.t art (architecture, music, character design, story, modelling), but relatively poor in terms of the 'business logic' of the game (the systems for gunplay/stealth/police/levelling etc are all average/poor).

Still fun though.


While it's not one of my favorite games of all time, I have to say it was a great opportunity for me to try Stadia. I preordered the game and it came with a 1-month Pro subscription, a controller and a Chromecast all included for free, and I enjoyed smooth, bug-free play from launch day (I have a gigabit connection, so it felt native). Highly recommend! Full disclosure: I haven't played in a while because the game selection is limited. I just want to play "quadruple A" titles like The Last of Us 2 and RDR2.


A drive-by pronouncement, one that is probably wrong because of how reductive it is, but it is visceral to me because I've been dealing with a similar situation at work: you can't trick computers. You can't intimidate them into working with groupthink, coercive meetings, aggressive talks, or vague threats of losing face, position, or job. They are binary. You either make something that produces the output you want, or you get a buggy, unworkable mess (to various levels of unworkable).

All the crunch and pain and buggy games are a result of humans trying to force the other humans who are programming the games and content, and indirectly, the computers themselves into doing a rush job.

You can't rush art. Programming is an art since whatever is being programmed has never been made before, else you could go Copy + Paste and be done.

Yes, there are millions of paintings of meadows, just as there are millions of copies of very similar but ever-so-slightly different boring line of business applications. But if you want THIS painting, this one that you have commissioned to show the meadow in YOUR back yard, you can't rush the artist, or else the painting quality will be reduced. Depending on how much rush you add, the quality can be subtly bad but close enough, or horrifically bad as if, say, this pretentious commenter were painting it.

Let the artists (programmers) take the time they need to produce the artifact. Yes, the artists can sometimes get stuck in cycles of tweaks (goldplating) and need to be told this artifact is done, ship it, but it always seems like we lean too far to the other side: feature factories, churn churn churn, crunch crunch crunch.

Let the artists paint / program in peace!


I'll go against the grain and say that you can rush a game, you can overwork people, and that in many cases, for many companies, doing so is the only path to success. We're talking about one of the most successful companies in Poland that you only know about because they managed to produce some of the most popular and profitable games ever created, and they didn't get to this point by letting developers and managers take their sweet time.

Are you going to try to tell me that The Witcher 3 was developed by having everyone work a standard 8 hour day and that the team was given as much time as they felt was needed to produce it? Of course not... it was developed with just as much of a crunch, pressure and "inhumane working conditions" [1] as Cyberpunk 2077, but because it was a resounding success on all platforms, no one batted an eye.

The issues with Cyberpunk 2077 have next to nothing to do with crunch time, they simply tried to make a game for a platform that was never going to be able to support it, regardless of how much time they took. The game on PC is a masterpiece of technology, it looks and feels amazing and it's a resounding success. The failure on PS4 and XBox is because those platforms were still very large markets and they wanted to take advantage of them and well push come to shove, there's just no way of making the game play well on them.

That's all there is to this story, making a game for a platform that can't support it is going to be a failure. Now that Cyberpunk has failed, people will use that failure as a way to argue about things that likely have nothing to do with the cause of the failure, and you know maybe that's a good thing and better working conditions come of it... I don't know... what I do know is even if CD Projekt Red decides to change its working culture for the next game they make, there's dozens of other new game studios rising up and it may be the case that the only way for them to compete with the existing giants is to work day and night to the bone on making the next masterpiece.

[1] https://www.gamebyte.com/cd-projekt-red-admits-crunch-period...


> I'll go against the grain and say that you can rush a game, you can overwork people, and that in many cases, for many companies, doing so is the only path to success

Take a look at the development of the game Hades. They took their time and polished everything, even having a pre-release game on Steam and incorporating community feedback. Rushing is not the only option.


Hades is a good game. You might enjoy it more than The Witcher 3. But it's multiple orders of magnitude smaller no matter how you cut it. If you want to build enormous games, you tend to need enormous effort.


That may be true, but it doesn't justify worker exploitation and abuse. Unless you'd like to claim that is the only way any large project gets done in any industry. Which would be a bold claim indeed.


What you call worker exploitation and abuse is reasonably well known about the game industry. Everyone who gets into that industry either does so willingly, or else with gross negligence.

What happens between consenting adults ain't anybody else's business.


Employers aren't adults, they are massively powerful corporations. Worker exploitation happens between one passionate programmer, artist, or QA and a massive corporate organization that holds all the power and has no qualms about deceiving, badgering, and shaming its way into what it wants.


Eh, companies have reputations. And would-be employees can read up on companies before they sign up.

Eg I love Amazon as a customer, but I wouldn't really accept a programming job with them.


That doesn’t make it at all acceptable for them to treat the people that work there the way they do.

Doing so makes you at least complicit in that, and at most an active perpetuator of it.


> Doing so makes you at least complicit in that, and at most an active perpetuator of it.

Doing what? And who is 'you' in that sentence?


Worker exploitation and abuse happens in many industries and was once the norm. It is in society's interests to prevent systematized exploitation of society's members.

"Consenting adults" is a fine heuristic for what happens in a private bedroom between individuals with equal power. It does not apply to for-profit exploitation done by organizations with special legal privileges. Privileges they have only because we see commerce as generally beneficial and to be encouraged.


> Worker exploitation and abuse happens in many industries and was once the norm.

What do you mean by exploitation? When was it the norm and where?


Sorry, there's no way I'm spoon-feeding you the whole history of labor. Maybe read Loomis's "A History of America in Ten Strikes" as a starting point. And then maybe Beckert's "Empire of Cotton" for the broader, systemic view.


Supergiant Games is a wonderful studio and they do not have the same kind of crunch that is endemic to other studios. Having said that, I don't know that you can compare the development of a 2D game with a very constrained and linear progression to a massive open 3D world like Cyberpunk 2077 or The Witcher 3. I love Transistor and Bastion, haven't tried Hades... they are great games and I look forward to whatever is next from them, but we're comparing a game that cost about 6 million dollars to make with a game that cost 400 million dollars.

While I appreciate your point and even agree to a large degree with your point, the stakes are very different between these two products.


> I don't know that you can compare the development of a 2D game with a very constrained and linear progression to a massive open 3D world like Cyberpunk 2077 or The Witcher 3

This is part of the issue I think. AAA games are so high flying in their technical and artistic scope, and they have all the business and power bullshit that comes from their now massive, 100mil+ budgets. Their failures are spectacular because when there is so much complexity inherent in the system and teams, lots can go unnoticed, and it falls apart in many different ways.

You can't even really compare the clean delivery of a AAA war shooter from 2005 with one from 2021 for example, the technologies and disciplines involved have grown so much.


From a far-removed distance, it seems to me that you can create a AAA game with good labor conditions, given enough time. But the underlying technology moves so fast, and the expectations along with it, that you end up needing to do so much in parallel, in lockstep, to ship early enough - and with this parallelism comes great cost when things inevitably crop up.


Hades was a real surprise - I hadn't heard anything about it and was expecting something pretty average but it might actually be my game of the year.


Cyberpunk 2077 is definitely a dramatically bigger game than Hades, but it also had 400+ people working on it for twice as long as the couple dozen Supergiant employees worked on Hades.


A team of 400 people working on a project do not work at the same speed as 33 teams of 12 working on 33 different projects.

https://en.m.wikipedia.org/wiki/The_Mythical_Man-Month


Yes, I am well aware. However, they do occasionally manage to get more done than 20 people working on the same project would. Not always, admittedly.


Which makes scheduling exponentially harder and mistakes costlier.


Hades is a much smaller game than either The Witcher 3 or Cyberpunk. Scale makes things harder.


Tons of high-quality games were created with a lot of crunching and squeezing (e.g. pretty much all id games). People think too much about 9-to-5 or 4-day work weeks, but what is much more important is giving developers the FREEDOM to create content. Of course flexible work hours are part of the deal, but more importantly you need management that understands the business and refrains from imposing unreasonable deadlines or requirements.

BTW I believe the best place for a game developer is something similar to id before Activision. It's small and flexible, but not too small to develop big titles. Nowadays you might need some 50-100 developers for a big title, but if you go over that number and/or go public, the productivity and, more importantly, the focus is going to be lost.


Or even ZA/UM and Disco Elysium for that matter


> even having a pre release game in steam and incorporating community feedback.

Ah, so they leveraged unpaid quality assurance and design labour in order to cut financial corners.


>>Are you going to try to tell me that The Witcher 3 was developed by having everyone work a standard 8 hour day and that the team was given as much time as they felt was needed to produce it?

Are you going to tell me that The Witcher 3 was a success because it was released May 19th 2015 instead of October 24th 2015 or even May 19th 2016?


I'm not saying there should be crunch but there is limited money (no idea for The Witcher). If you get funding for X amount of money you calculate your salaries plus overhead and you have a deadline. Miss the deadline then you renegotiate for more funding but at some point the funders will just pull the plug and your team are out of jobs.

It's unrealistic to believe there's no reason for deadlines. For a FAANG company, sure, they're making so much money they can -- mostly -- afford to do things without deadlines, but few video game studios are at a level of funding where they can spend another 3, 6, or 12 months.

The internet claims Cyberpunk 2077 had 500 to 5000 people working on it. I'm sure that 5000 number is not full time, but it's not hard to believe the 500 number. Just paying 500 people $15 an hour, 40 hours a week, is $1.2 million a month, not counting rent/power/taxes etc. And of course some large percentage of those people are probably getting closer to $30 per hour ($60k yearly salary) or $60 per hour ($120k yearly salary).

The Witcher 3 claims 150-250 employees, so your delay - shipping October instead of May - easily needs another $4 million, and I suspect that number is low. $12 million if it took an extra year as you suggested, and that's a low-ball estimate.
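For anyone who wants to check the arithmetic, a quick back-of-envelope version of the figures above (deliberately low-balled):

    # Payroll-only burn rate at the cheapest rate quoted above.
    people, rate, hours_per_week = 500, 15, 40
    weekly = people * rate * hours_per_week   # $300,000 per week
    monthly = weekly * 4                      # $1,200,000 per month
    print(f"${monthly:,} per month")          # before rent/power/taxes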


In retrospect no, The Witcher 3 would have been a success even if it were delayed by a year. But consider that CD Projekt almost went bankrupt over the development of Witcher 2 due to costs and delays and that video games are an incredibly risky business. There's a saying that businesses don't fail because they're unprofitable, they fail due to cash flow issues. Development costs only go up, so the longer a game takes the more expensive every single day of development ends up costing. Even if the game would ultimately prove to be successful, the huge risk that comes from having to fund its continued development for an extended period of time can put a company in jeopardy.

That The Witcher 3 ended up being a massive success is great but it was never a guarantee.


While I agree on the sentiment, what would have happened if it had gone out on Jan 2017 or 2018?

Development cost at least doubles, and the graphics would start to look dated relative to other titles. Competition with other open-world action RPGs increases.

The people responsible for making it the game that it is would have moved on, and other ideas would have come into the dev cycle.


I think you are missing what the comment's point was.

Cyberpunk could have been delayed again, but it still would have been bad. The Witcher 3 could have had less crunch and released a year later and been just as much of a success, but the studio would have lost way more money.

Labor is extremely expensive and a year more of labor isn't worth it, so they rush. It just worked that time.


Final Fantasy XIV was a massive failure. The original version 1.0 was buggy, tedious and not very fun. People still had fun with it, it wasn't New World levels of complete failure, but it did not live up to the brand. So they replaced the two directors, who had previously worked on FFXI, with one Naoki Yoshida. Naoki-san had perhaps the worst task he could ever have. He had to figure out a way to turn a burning trash heap of an MMO into something that could proudly carry the name Final Fantasy (and make real profit).

He spent three months just on planning the design for a completely new MMO to replace the current one. Then he went to upper management and told them to choose: they could have him just manage the current game, fix the bugs and issues, and make it a really polished but still fundamentally flawed game that would never have the kind of audience WoW has, one that would forever tarnish the brand; or he could make a completely new MMO from scratch in one year, while also supporting the existing one. They chose the latter. Today FFXIV is the world's most popular MMO.

The process of getting there has been documented in interviews and it is what Naoki-san says in this one [1], that I want to bring attention to. Paraphrasing, "we could only get there by aggressively cutting down features and micromanaging daily tasks for each employee, to the point that the tasks fit into a standard 8 hour work day, which generally has about 6 hours of actual work done, when you factor out meetings and breaks".

This is a man tasked with fitting 5 years of work into 1, while also supporting the existing product, and he didn't immediately reach for overtime. He absolutely worked himself to the bone, but he didn't push the team beyond their limits. It helps that he was an MMO fan and had a full design document for the game he wanted to build, but it still shows that with good management you can deliver a product better than everyone else's without doing overtime.

To be fair, 6 hours of actual work is still a bit more than I hit most days. Having to work at full capacity every day would still be straining, but nothing on the order of doing 12 hour days 6 days per week for months.

[1] https://youtu.be/aoOI5R-6u8k


I'm sorry, the game on PC was a crappy bugfest. And as for being "impossible" on consoles... isn't it effectively GTA in a different setting? GTA managed to run fine, and Cyberpunk could have too, but they weren't able to get there.


GTA is an 8 year old game, it was released for PS4 and XBox One 7 years ago. I’m not sure how that compares?


It’s still super popular, more so than Cyberpunk and did all the stuff Cyberpunk shipped with, better, a long time before it. It does help that Rockstar have been basically working on making that kind of game for decades whereas CDPR haven’t.


Believe it or not it was released on Xbox 360 and PS3 as well.


GTA is a crappy bugfest as well, the graphics are very outdated, and the NPCs are quite limited in what they do; the world isn't realistic at all and frame rates routinely drop for no real reason. I haven't played Cyberpunk 2077 (yet), but I highly doubt it's as simple a game and as limited as GTA, which I have played a lot. I think it failed in part because they tried to port the game to those consoles, and it was just too complex for them, after seeing gameplay footage from it a while ago.


GTA V is rock solid, better written, better tuned, and more immersive than Cyberpunk. Of any mechanic I'm aware of that both games have, Cyberpunk's implementation is not as good.

Compare being chased by the police in GTA vs Cyberpunk for example.


Really? It took a player, upset with its very slow load times, to figure out how the code works and make the loading times way better: a very simple fix if you had the source code, and the devs never bothered. Online is a horrible mess where everyone cheats, the first-person gameplay they added is horrible with its massive input lag, and even getting online in the first place was a joke.
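
(For context: the reported root cause there was an accidental quadratic. The loader parsed a roughly 10 MB JSON price list with sscanf, which calls strlen on the whole remaining buffer for every token, so parse time grew with the square of the input. A minimal Python sketch of the same shape of bug; the function names are made up for illustration:)

    def parse_tokens_slow(text):
        # Re-slicing the remaining buffer copies and scans the whole tail
        # on every token, so n tokens cost O(n^2) overall. Same shape as
        # the reported sscanf/strlen pattern in GTA Online's loader.
        tokens = []
        rest = text
        while rest:
            head, _, rest = rest.partition(",")
            tokens.append(head)
        return tokens

    def parse_tokens_fast(text):
        # One pass over the buffer: O(n).
        return text.split(",")

On an input that size, quadratic vs. linear is the difference between minutes and well under a second.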


GTA Online is a mess, but offline is completely fine and has quite a bit of art to it, whereas Cyberpunk's world just feels like it's barely keeping hold of the reins.

The game is simply better made than Cyberpunk. It's not really a knock against CDPR, since it's their first attempt up against Rockstar, but there is simply not enough polish, not enough game design, etc., for me to say anything else.

I wanted to love Cyberpunk, but the moment I picked my faction and then was immediately funneled into the same story as everyone else, it kind of set the tone for the rest of the game.

I sincerely hope they keep developing it, but at the moment I can't bring myself to play it at all, vs. GTA, which is very satisfying just to drive about in a bit before I go to bed.


The NPCs in Cyberpunk are extremely limited in what they do, the world isn't realistic at all, and the frame rates drop for no real reason.

It's a simple game, and as limited as GTA is.


You can try, but so many games have been released half-finished, never subsequently completed via patches, and as a result sank their studio and the whole franchise they were part of. Games that should have become classics instead remain half-finished, mostly forgotten messes, and all the work those people put into the game is wasted in the long run. Why? To meet a holiday ship date?


This is so patently absurd that it goes against around a century's worth of knowledge and scientific research. "Spending more time on a task without rest does not get a job done properly" is not only commonly accepted, but it has been put to the test repeatedly in controlled environments and shown to fail, every single time.

The first link is filled with early-to-mid-20th-century resources on this subject. The rest are all from the '80s to '00s because people... stopped studying this. Most of the studies covered the same thing, and it very quickly became an uninteresting problem in the field of clinical psychology because the answer was almost unanimously, unequivocally, "yes, this makes people output lower-quality work product". It's one of the most settled problems in modern psychology.

The only people still researching it are non-profits, and their research serves simply as advocacy, because otherwise businesses won't listen that this extremely, absurdly well-known and well-supported fact is true.

https://igda.org/resources-archive/why-crunch-mode-doesnt-wo...

https://www.tandfonline.com/doi/abs/10.1080/0267837890825693...

https://synapse.koreamed.org/articles/1125602

https://www.jstor.org/stable/40967587

Nevertheless, here is something specific to Game Development from 2016: https://www.takethis.org/wp-content/uploads/2016/08/CrunchHu...

Select quotations from papers quoted in that article:

"In a national Australian survey, employee cognition increased for hours worked up to 25 per week. Beyond that, cognitive abilities decreased. By 60-hours of work per week, cognition was lower than people who weren’t working at all (Kajitani, McKenzie, & Sakata, 2016)"

"In a large study of 4 US-based companies, productivity losses related to fatigue were estimated to cost the employers $1,967/employee each year (Rosekind, Gregory, Mallis, Brandt, Seal, & Lerner, 2010)."

"Lost productivity due to presenteeism (being at work while ill) is almost 7.5 times greater than that lost to absenteeism (Employers Health Coalition, 2000, p. 3)."

"In a study of a large, multi-employer, multi-site employee population, health care expenditures for employees with high levels of stress were 46% higher than those for employees who did not have high levels of stress (Goetzel et al., 1998)."

"People identified as workaholics were found to have higher rates of ADHD, OCD, anxiety and depression compared with those who weren’t ranked as being addicted to work (Andreassen, Griffiths, Sinha, Hetland, Pallesen, 2016)."


What I'm curious about is, if you can only be max productive at X for maybe 25-35 hours a week, can you also be max productive at Y and Z for 25 hours each in the same week, and how different do Y and Z have to be from X and from each other for cognitive fatigue not to carry over from one to the others?

Cos if you can be max productive at 3 challenging but sufficiently different cognitive/creative tasks for 15-25 hours per week each, that sounds like a super fun, productive, efficient, sustainable, healthy, and balanced life


Someone please answer this ASAP, ideally with multiple links to high-quality studies on the topic


Assuming what you say is true, and the studies apply etc: what do you think makes game companies ignore this research?

If what you are saying is true, you'd expect game companies that follow the advice to have a much easier time recruiting people (for less money even, probably) and deliver better games quicker?

What's going wrong? Is everyone involved an idiot or evil?


Games companies aren't ignoring this; many companies have no-crunch policies and try to make work-life balance a priority. I've done zero crunch in the last decade working in games.

There are still people who don't approach things rationally, though. There are ignorant people who get funding and run their teams extremely badly. There are also "evil" people who think crunch is necessary (forged through adversity, etc.) to make a good product.

I've heard more stories of long working weeks from tech startups than from games recently. Loads of people posting here seem to regularly clock long working weeks when the topic comes up.


OK. But why haven't the 'good' companies out-competed the 'bad' ones yet, if as fao_ says the benefits are so enormous?


I've not seen any study that shows what's happening, so the answer is that I don't even know that they aren't outcompeting the bad ones. I've simply seen no information about that.

I'd think, though, that the factors affecting the success of a company are very complex, and boiling it down to one point is insufficient when talking about competition within the multiple markets that comprise 'games'.


Existing games companies were founded early in the industry and have the money and existing capital to leverage on PR, tooling, staff, etc. They also have a constant influx of non-burned-out people, mostly interns, who want a Big Company on their CV. Smaller companies are able to create good games but can't throw people at problems in the same way. Worth noting that it's mostly only specific teams of developers that get treated as disposable, not the marketing staff, HR, or other executive branches.


A strong incentive to maintain appearances. Even if fewer hours meant delivering a better game quicker, it would probably be difficult to get buy-in, and you'd take a large reputational risk.

If you don't deliver a better game quicker, the change of process (i.e., fewer hours), and by extension you, will probably be blamed even if there are other legitimate reasons.


> Even if fewer hours meant delivering a better game quicker, it would probably be difficult to get buy-in, and you'd take a large reputational risk.

I see that this would apply to _most_ existing game studios. But if what fao_ says is true, studios that are explicitly set up to take advantage of the research should easily out-compete those that don't?


I went over your articles and they don't make a particularly strong case against crunch time. The first one tries to use construction work and other physically demanding jobs as a proxy for productivity. I think it's fair that working construction or manufacturing jobs for 60 hours a week for an extended period of time will result in lower productivity. It would be nice if the article had something more specific to software development or some other form of creative or technical endeavor. That said, it also says that crunch time for short periods of time on the order of two to three months does result in increased productivity.

The article relating loss of sleep to loss of productivity is fair and robust; it's worth taking seriously.

The article relating suicidal ideation to work hours is devastating, but it's hard to see how it relates to your point, since the highest percentage of suicidal ideation (17.2%) is among those working less than 40 hours. The study itself is in Korean, so I am not able to dig too deep into it, but there is a table on p. 342 which I think is pretty revealing. Basically, what it shows is that suicidal ideation is highly correlated with one's level of education: people with little to no education show the greatest level of suicidal ideation, whereas those with a college education show very little. My suspicion is that those with little education either work very long hours or very few hours, which is why suicidal ideation is bimodal. Those with a college education are most likely to work 40-60 hours a week, which had the lowest suicidal ideation.

The research linked on jstor.org is behind a paywall, but I managed to find another source for it at [1], and it contradicts your point. It specifically states that no difference was found between those working an 84-hour schedule and those working a 40-hour schedule, and I quote right from the study, which I link to below:

"In conclusion, working of one's own free will on an 84-hour workweek schedule with physically and mentally demanding job assignments does not necessarily induce greater physiological strain than working with similar tasks on an ordinary 40-hour workweek schedule."

It appears the problem with working long hours is not the duration itself, but rather the manner in which those hours are scheduled. The article points out that when people are able to schedule their own extended hours, they perform as well as those who work a given 40-hour schedule, at least as far as their physiology is concerned.

All told, the arguments you presented should not be quickly dismissed and need to be respected, but they are not as strong as you made them seem, and they certainly do not establish my point as patently absurd. On the contrary, one of your sources says that crunch time over short periods of 2-3 months can produce an increase in productivity, and another says that working long hours is fine so long as one is in control of one's schedule. It is somewhat strange that you'd present said sources while also claiming that I'm being absurd.

[1] https://pubmed.ncbi.nlm.nih.gov/17091202/


> The research linked on jstor.org is behind a paywall, but I managed to find another source for it at [1], and it contradicts your point. It specifically states that no difference was found between those working an 84-hour schedule and those working a 40-hour schedule, and I quote right from the study, which I link to below:

Yes, follow up on the citations (it was written in the '80s) and you'll find it quickly contradicted.

> All told, the arguments you presented should not be quickly dismissed and need to be respected, but they are not as strong you made them seem

As I already said, we stopped doing psychological research on this en masse because it's not an "interesting" problem; it's mostly been settled that it's unequivocally a bad idea. The idea that you think you can look at 5 links and suddenly emerge with a complete understanding is very interesting, though! Those were breadcrumbs I could gather in the shortest time possible, left for you to follow up on, not the complete picture.


You had every opportunity to present whatever justification you wanted to establish the categorical nature of your argument, and you chose to post some fairly mixed evidence and then turn around and make it seem like I'm the one who's supposed to come out of this an expert with a complete understanding of the topic.

I think posting a few links that you likely didn't bother to read through yourself (given that one is behind a paywall and another is in Korean) and then proclaiming absolute confidence and authority on a subject shows not only poor critical thinking skills on your part but also a propensity to discuss challenging topics in bad faith.


Lower efficiency/productivity doesn't mean it's not worth working more; it only means there are diminishing returns.


> Are you going to try to tell me that The Witcher 3 was developed by having everyone work a standard 8 hour day and that the team was given as much time as they felt was needed to produce it?

No, and it showed. The Witcher 3 was a success because of the previous games. 3 was a disappointment for a lot of people; they took the Ubisoft formula of repetitive content and diluted the experience until I got bored after about 100 hours.

W3 was a huge financial success, but the writing was on the wall, and only people who did not pay attention were surprised by the Cyberpunk disaster. They tried to force a good game out with W3, and even more so with Cyberpunk. The results were clearly different from W2, which I personally think was their peak: a human-curated story pace that left you dreaming of more.

Compare this with GTA games. Rockstar knows you can't rush things, nor can you force industry revolutions in AI overnight. Slow and steady wins the race. Which company do you think the investor money will go to next?


This was my personal experience as well. I discovered W2 by accident and enjoyed it so much I pre-ordered W3. While I did have a great time, I personally didn't enjoy it as much; the quests of varying quality clashed with each other as well as with the rather weak main plot. Cyberpunk was a real letdown for me and made me return my pre-order purchase : (

While it's great to see games/companies become widely recognized, I feel that they always slowly get poisoned by investors. Anyway, cheers all, stay safe.


Witcher 3 was the first of the series I played, and I enjoyed the writing and playthrough so much that I 100%'d the game and am playing it again from the beginning. There were lots of people like me who really enjoyed it, so saying it was a disappointment feels a bit weird.


If you want to, try Witcher 2. There is a good chance you will like it a lot.

I know there are a lot of people who enjoyed it. For me though it was a sign of things not going in the right direction.


I was hoping they'd do something more novel with the gameplay, like fourth-wall-breaking mechanics where I can hack the engine to hack the world: replace doors with walls, and have characters react to the state change appropriately. Like a god game in an FPS.

I've not played it, but I hear the AI is still dumb af; it's COD with a cliche Hollywood cyberpunk style.

It's pretty sad our art is still tethered to '80s-'90s fanboy stylings. I feel like the obligation to fit such a literal expectation constrains creators. Then again, consider the stories coming out about the dudes running these places. Ugh.


Like artists, some programmers are able to make the tradeoffs to get the product out the door, and others aren't. Compare Brandon Sanderson, who announces annually what to expect from him in the coming year and generally lives up to it, with Patrick Rothfuss, who lashes out at any fan who asks about the novel he has claimed to be working on for the last 10 years. It's similar with game titles: Minecraft vs. Star Citizen. Some people will never finish, or keep starting over, or keep expanding scope, and others have the gift of balancing conflicting needs just right.

It's more about getting the right personalities on your team than giving blanket license for artists to take as long as they please.


Everyone, every writer anyway, knows that Sanderson is a freak of nature. Much to be admired, but not a standard by which to measure yourself.


Certainly true. Both examples are on the extreme side of the spectrum.


I thought the magic of Halo died a decade ago, but I've seen the 180 that Infinite pulled in the past year, and I recently learned that Staten returned around that time. As Feynman said, there are no miracle men, but Staten knows the formula for making a good game. He was there when Bungie had creative freedom and was making games that they wanted to play. Having him come on as a leader with political power was the only way Infinite could ever have been anything other than a disaster.

The recent IGN interview was great. It was an hour of hot chocolate for my niche demographic. It also confirmed that the magic of Halo was what we thought it was all along. You want to make a great game? Watch the Halo Vid Docs and take notes.

https://youtu.be/EqfPclVRioI


could you please tl;dr the magic for those who are completely out of the Haloverse?

(I played about 3 levels of the first one; it was OK for its time. I know people really love it, but people love Doom and Diablo too, etc.)


Bungie in its day was very grounded and in tune with "the player experience". The team would play the game together at the end of every day and have fun doing it. Only things that made the game more fun were put in.

As for storytelling in the campaigns: it was a space opera, complete with mysticism and an epic journey. It's really hard to stretch that into a never-ending story; the books have done well, but the games have not. The first problem is making the self-insert Chief a human; it was obvious from the beginning. When I played Halo 4 I had a mental list of all the ways the creative direction dropped the ball. There's something that 343's culture has never understood about the Halo games: they think it's a Michael Bay film because the audience is young men, but it's more like Star Wars V.

This is a highlight reel of what I'm talking about.

https://youtu.be/6qfx9eoB-88


> You can't rush art. Programming is an art since whatever is being programmed has never been made before, else you could go Copy + Paste and be done.

Eh, if you want to make a comparison, I would put most corporate programming into the bin of 'craft', much more than 'art'.

It's of course all subjective, and there's no one true category. They are just different ways of thinking about the subject.


> I would put most corporate programming into the bin of 'craft', much more than 'art'.

A former corporate programmer, highly paid at the time he told me this, said that all they do is very fast, simple arithmetic. Add and subtract.

Great hackers on the PG spectrum may get struck by inspiration and may outperform the average corporate programmer by 10x or 100x or more and work crazy hours while the creative electricity buzzes.


> A former corporate programmer, highly paid at the time he told me this, said that all they do is very fast, simple arithmetic. Add and subtract.

It totally depends. Most corporate programming is CRUD (https://en.wikipedia.org/wiki/Create,_read,_update_and_delet...), so there's not much arithmetic involved, but a lot of bookkeeping and organising your information. Not rocket science; more listening to your stakeholders.
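
(To make that concrete, here's roughly the shape of it: a minimal, made-up sketch in Python with sqlite3. Real corporate versions just have more tables, more fields, and more meetings about what the fields mean:)

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

    # Create
    cur = db.execute("INSERT INTO customers (name, email) VALUES (?, ?)",
                     ("Ada", "ada@example.com"))
    customer_id = cur.lastrowid

    # Read
    print(db.execute("SELECT name, email FROM customers WHERE id = ?",
                     (customer_id,)).fetchone())

    # Update
    db.execute("UPDATE customers SET email = ? WHERE id = ?",
               ("ada@example.org", customer_id))

    # Delete
    db.execute("DELETE FROM customers WHERE id = ?", (customer_id,))
    db.commit()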

> Great hackers on the PG spectrum may get struck by inspiration and may outperform the average corporate programmer by 10x or 100x or more and work crazy hours while the creative electricity buzzes.

See also Carmack etc. But I was talking about the typical programming tasks.


Artists don’t get paid until their work is complete, do you think programmers would accept that model?


If the people who made that offer had a good track record, and the money was good enough, I would.

(In some sense, this reminds me of getting startup equity, which is basically worthless until the company goes public.)


2077 is a fantastic game, and the only thing I wish is that I had never let my opinion of the game be colored by reddit and ycombinator.


It is a great game, but marketed incorrectly. It isn't an open-world game; it is a narrative-driven looter shooter with amazing visuals. It isn't the game I wanted. I have zero desire to replay it (and you can't really play past the 'end'). It is worth the price of admission IF you have a computer that can run it, which is a significant hurdle.

Interestingly I played Psychonauts 2 and Cyberpunk 2077 back to back, both at times felt like playing a movie rendered into video game form.


Any game with voiced dialogue is either going to have one single story, a bunch of little threads that don't interact much, or both. Some games might give you a bit of wiggle room to get most or all of the content in a single playthrough, but usually that's going to be a fairly rigid solution, since any slack time translates to a slowed narrative pace for casual players. It will likely look like a movie, but with some choose-your-own-adventure cutting-room-floor edits and maybe player-controlled pacing.
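
(A toy sketch of why: with voiced dialogue, every branch is a line that has to be written, recorded, and localized, so the studio pays for the whole graph while any single player only hears one path through it. Everything here is made up for illustration:)

    # Hypothetical branching dialogue graph; each node carries a recorded line.
    dialogue = {
        "intro": {"vo": "vo/intro.wav", "choices": {"help": "ally", "refuse": "rival"}},
        "ally":  {"vo": "vo/ally.wav",  "choices": {}},
        "rival": {"vo": "vo/rival.wav", "choices": {}},
    }

    recorded_lines = len(dialogue)  # the studio pays for all 3 lines

    def path_length(node):
        # Walk one arbitrary path; a player only ever hears one.
        steps = 1
        while dialogue[node]["choices"]:
            node = next(iter(dialogue[node]["choices"].values()))
            steps += 1
        return steps

    print(recorded_lines, path_length("intro"))  # 3 recorded, 2 heard

Each extra branch point grows the recording (and localization) bill for content most players never see, which is a big part of why voiced games keep their threads short and mostly non-interacting.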


The latest Elder Scrolls games, Grand Theft Auto, and Saints Row all had voiced dialogue and didn't feel as "on rails" as Cyberpunk did. None of those games are recent.

The way the core narrative is structured is one of the core problems with Cyberpunk: the game's MacGuffin implies an artificial time constraint to drive the narrative to resolution, except that driving the narrative to resolution ends the game.


Those games all spent their budgets better and had better planning. They were still on rails, just the core story was better planned and there were many more fulfilling side quests (extra threads) that stood on their own for players to organically experience. TES is infamous for players wandering off and almost entirely ignoring the primary story to instead enjoy all of the dungeons and side quests.


I never read any hype about games and was perfectly happy with CP2077. Granted, I played it on an Xbox One X, which had enough power to actually run it.

I ran into a good dozen crashes during my 40-hour-plus playthrough and only one bug that affected my play in any way (some quest-related audio didn't play, so I missed some stuff).

From what I've gathered, there was a massive group of people, bolstered by what the Witcher 3 had become after years of development and multiple DLCs, who believed and took to heart every single word from the marketing department about three separate plotlines, AI NPCs with full lives, etc.

While in actuality the "separate plots" were three short intros and a few background-specific dialogue options.

This might be the worst case of mismanaged expectations in the history of gaming. CDPR shot for the moon, said they were gonna get to the moon and the result was a sparkler thrown off a roof. It was the prettiest sparkler in a long time, but that's not what the crowd was expecting.


The best way to foster and learn about a hobby is reading its subreddit. Also, the quickest and surest way to kill your love for a hobby... is reading its subreddit.


Amen. There's so much bashing going around, but when I played it, it blew my mind. The setting and atmosphere are phenomenal, the soundtrack is insane, and I just want more of it; that's my only issue. The main story should be longer.


I have seen some gameplay footage and it's almost always worse than what reddit complained about.


Same. I stopped reading articles and posts about it because they were all just a pile of whining, while I was truly enjoying the game.

I've now had the same experience with MTGA. It turns out discussion about games always seems to turn into complaining.


I waited for a good while and jumped into the game on a high-end PC around the time of the 1.3 patch. And I loved it. I don't usually clock over 90 hours in a game, but I easily did here.


I agree, it was really dope. I still like Witcher 3 better but I don’t get the hate.


Crunch is what happens when you announce a release date and realize you can't make it.

What happens when your release date is "when it's ready"? Star Citizen.

Surely there's some happy medium.


Valve seems to have an interesting solution of their own to this problem, which is just to not tell anyone that they’re working on a game until it’s basically done. This removes the pressure on them to meet a certain deadline, but it doesn’t look very good from a public perspective. But it might be better in some cases than a Star Citizen scenario.


Valve barely ever releases anything. This means that players don't get shitty rushed games, but it also means that players get no games and the company gets no money. That's only realistic for a company like Valve, which earns in a minute what Star Citizen earns in a year.

A company/studio that actually needs to make games to survive can't afford to be perfectionists.


Before Half-Life: Alyx last year, Valve hadn't released a major game since Dota 2 in 2013. I wouldn't call that a solved problem.


Valve also basically have the entire PC gaming industry pay them rent. They could just work on Steam for the next 100 years and not want for money. That creates some pretty bad incentives, which coupled with their very informal management style can lead to the current situation. If rumours are to be believed, something is being done internally to address the problem and turn them back into a videogame company. I would personally love to see the Steam team split off into a completely separate entity, but I can see why that won't happen. In short, Valve's premiere product is a software delivery platform, and they sometimes spend some money to make a videogame. Sometimes that videogame even gets released.


I believe this is the way to go. There's a cycle that sets in once you set expectations and open a project up for public feedback. In the SC case, it means you're never finished; in the CD Projekt case it means no one is satisfied when you ship. It's not that a game can't live up to expectations, it's that the goalposts move further the longer it is left to the public imagination. tl;dr I think fan service is what kills projects. In the end, it isn't what makes or breaks a great game. But the temptation to hype what you're working on - and raise money from pre-sales, which requires constant fan pleasing - is hard to resist.


> Surely there's some happy medium.

Hollow Knight's release schedule was (at least partially) determined by the founders running out of money. Seems to have worked ok for this one.

The sequel is still in the making.

You might like https://www.gwern.net/reviews/Anime#on-development-hell


Well, to be fair, Star Citizen has a massive financial incentive to trickle out paid content without releasing. The day they release is the day they kill the goose that lays the golden eggs. Sure, they'll sell the meat, but those golden eggs are never coming back.


Yeah, it's evolved into an absurd and mercifully very unusual/difficult to imitate business model. But in my (early) experience on the project, the model of selling content to a nonexistent game was something Chris Roberts very much stumbled into by accident. When I started working with him I'm sure it never occurred to him that such a thing could even be lucrative. CR was extremely deadline-driven until the crowd funding and the fan forums got into it. Prior to that, it was specific goals, limits, milestones, and a pathway to launching ASAP. In a sense, the crowd preferred being strung along, or at least sending resources to things that would delay development in exchange for some holy grail down the road. The shift was profound when it happened. SC is a bad example for most things, because it really was started with the best of intentions but became such a shitshow that most people assume it was intended as a con from day one.


Oh, I believe that they didn't start with this in mind, absolutely. To some extent, the whole thing seems to be a victim of its own massive crowdfunding success.

Basically nothing they release now could ever live up to the hopes, so even without the profit motive, the incentives would lead to endless moving goalposts...


> You could be you in a tabletop game and bring all the stuff that you wanted to bring into it,” he said. “A tabletop game is limitless. A video game, by its very nature of how it’s designed, has some limits.”

That's it? Such an uber-long article to reach such a feeble insight? Not impressed, New Yorker.


What a weird thing to say. Board games and video games are two very different media with different capabilities and limitations.


I'd imagine one has a tendency to consider what one is personally working on as superior in some way to what one is not.


A tabletop game is limited to (essentially) what you can make with cardboard and flavour text.

You can add flavour text to a video game as well. FTL is a great example of that making the game feel much bigger.


>A tabletop game is limited to (essentially) what you can make with cardboard and flavour text.

Maybe for board games. But for tabletop rpgs, I think it's different.

Games like Dungeons & Dragons are limited more by what you can describe with language and reasonably bookkeep without slowing down the pace of play.


Tabletop RPGs don't require any props. People do use rules, character sheets, dice, and anything from sketched maps to grids and miniatures, but essentially tabletop RPGs are less limited than, say, an author writing fiction.


RPGs are not regular tabletop games. Nobody was talking about Monopoly in that article. The whole point was about pen and paper RPG vs video games.


Did you read the letter that came with the Witcher 3? It was very touching; you could really see the care that was put into the game. Hard to believe the same company released such a mess of a game right after. I bought it for 10 dollars in a sale and even so I felt robbed of my money. If I was looking at an NPC and did a 360° with the camera, when I looked back at that same NPC the model would have changed.


Do you know what happened between the release of Witcher 3 and the release of Cyberpunk 2077? CDPR went public.


CDPR went public before the release of Witcher 2 back in 2009 when it merged with Optimus S.A. (Witcher 2 was released in 2011) [1].

[1] https://www.cdprojekt.com/en/capital-group/history/


An interesting take from Mike Pondsmith given his heavy involvement in the venture at the end of (at least in my opinion) well-written article: "[C]omparing the tabletop experience with its video-game incarnation, he noted that the latter doesn’t really compare to the former when it comes to self-expression. “You could be you in a tabletop game and bring all the stuff that you wanted to bring into it,” he said. “A tabletop game is limitless. A video game, by its very nature of how it’s designed, has some limits.” "


True, but to be honest, Cyberpunk itself was a problematic RPG for a number of reasons. It was deadly beyond most RPGs (well, except for Call of Cthulhu), and one of the archetypes, called a Solo, didn't lend itself to group play. Maybe it is actually better suited to being a video game.

On a side note, if you can get hold of Cyberpunk 2020, it will give you a chuckle comparing its tech with the real world. I think it fits into the cassette-punk genre.


Deadly combat was popular in most RPGs in the late 80s and early 90s. Even experienced characters could easily die from bad luck or poor judgment, and if wounds didn't kill them, infections might. The focus was more on role-playing (and a kind of realism), and combat was not supposed to be something you enter casually.

As far I remember, the problematic role was always the Netrunner, not the Solo. While the Solo was just your standard warrior, the Netrunner played their own subgame while everyone else was waiting.


The only major problem for group play in Cyberpunk 2020 was the netrunning rules, due to the time dilation: an hour-long netrunning session would zip by in a handful of combat rounds, meaning you couldn't really use the rules to do a break-in supported by a netrunner (like the Tessier-Ashpool run from Neuromancer).

You could of course work around it, like with all flaky rule systems: make stuff up and aim for reasonable balance/spotlight among the group. But the Solo ("fighter") might be one of the easiest roles for group dynamics (e.g., the Nomad is likely to be more occupied with pack/tribe/clan matters than with the group, assuming not all players are Nomads/part of the pack).

In general I'd say CP2020 is quite typical of "mature" systems: it works well for stories/groups where everyone is content getting their fair share of the spotlight and enjoys a bit of intrigue, conflict, and backstabbing...


Solo lends itself to group play just fine.

Just because your D&D character's class is "Fighter", it doesn't mean they need to fight everyone they meet :D

Solo is just the asskicker class of CP2020

Source: Played CP2020 in the previous millennium a bunch.


Tabletop also has limits, otherwise it wouldn't have any rules. By this logic pretend play is the ultimate game but good luck finding a willing player over 12.


> pretend play is the ultimate game but good luck finding a willing player over 12.

What do you think improv is? And pen and paper RPGs are a form of improv constrained by some rules. But those rules are what people agree for them to be and you can make them as restrictive or as loose as you can agree to.


My point is that taking it to the extreme and then stopping at tabletop as the natural best is nonsense. Tabletop has rules. If rules are the enemy, then tabletop is not the ultimate. People enjoy frameworks. On one end of the spectrum is pretend play and on the other is a movie theatre. Where on the spectrum of interactive entertainment you want to participate is purely subjective, and one point is not "better" than another. Having fewer limits is not a virtue. It is worth nothing by itself.


Most TTRPG groups ignore rules. The DM almost always fudges things behind the scenes and applies rule-of-cool. The only rules in TTRPGs are what the DM decides; the books are just guidelines.


I think tabletop came up here because it's the origin of Cyberpunk 2077, not because it is somehow the ultimate form of interactive entertainment. That last line is just claiming that Cyberpunk 2077 ultimately worked better as a tabletop game than a video game.

Mind you, I don't necessarily agree, but I also don't think it's such a grandiose claim.


> Tabletop also has limits, otherwise it wouldn't have any rules. By this logic pretend play is the ultimate game but good luck finding a willing player over 12.

TTRPGs are group pretend play with (a wide variety of different styles of) dispute-resolution systems and supporting guidance.


Many play-by-post RPGs are more focused on developing a shared narrative than on game mechanics, so they become precisely that: pretend play with few restrictions. These were popular on internet forums about 10 years ago (I think people now play them on Discord), and I met a few players over 30 back then.


Not all of us have held onto our childlike wonder and imagination, so it's nice to have graphics I must admit.


There's a sequel playing out with BF2042… the solution is simple: don't preorder games. In reality, I know that's unlikely. I'd be in favor of legislation forcing an automated return window for preorders.


Unpopular opinion incoming:

The solution is consumers paying way higher prices for games. The reason games are rushed is that labor is a huge cost, and it isn't cost-effective to keep bleeding money while waiting for revenue.

Easier to just release unfinished.


Why do consumers need to pay way higher prices for games?

In 2000 you could buy Deus Ex for $40. It was a fantastic and fantastically successful game, developed on a budget of ~$7 million. It needed 175,000 sales to break even, which it managed.

These days the standard price for a game is ~$60, so we are paying more. The market has also grown enormously. Video games used to be a thing played exclusively by "nerds" and now nearly everyone plays them in some form.

And that's not even touching the whole micro purchases thing.

So revenue has already increased, why do we need to increase it even further?


$40 in 2000 is about $65 in 2021 dollars, so if you pick that point of comparison, not much has changed. But IIRC, AAA games were already $60 by 2005, so compared against that, we are paying substantially less.
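
(Back-of-envelope, assuming a cumulative US CPI factor of roughly 1.6 between 2000 and 2021, i.e. about 2.3%/yr compounded:)

    # Rough sanity check; ~1.6 is the approximate cumulative US CPI
    # factor from 2000 to 2021 (about 2.3%/yr compounded).
    cpi_factor = 1.6
    print(f"${40 * cpi_factor:.2f} in 2021 dollars")  # ~$64

    # The Deus Ex break-even figure upthread also checks out at full retail:
    print(f"{7_000_000 / 40:,.0f} copies")  # 175,000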

At the same time, I really hate to pay $60 for pretty much any AAA title that comes out these days. AAA games are in a weird state: so much time and money has been spent patching and optimizing a fundamentally broken lighting model. Bigger, prettier open worlds are created every year, and they really are impressive, but 95% of the map feels like no human has ever actually thought about whether it would be fun to play in. I kinda wonder who is playing all these 100+ hour games with mind-numbingly basic mechanics.


A movie entertains for ~2 hours and costs ~$10, ~$5/h. In this perspective, games are very cheap.


Movies still have enormous budgets, and to break even they target a much larger audience. Plus, except for the A-list stars, the pay is not amazing, and the hours are probably even worse, though shooting is usually more varied than sitting at the same computer and IDE for years.


What's stopping them from pocketing the cash and still releasing it unfinished? After all, releases like Cyberpunk 2077 show that people would still buy it.


I believe Steam lets you return preordered games at any point before release. (At least, I was able to preorder New World, play the beta, and then, right before release, decide I didn’t want to keep it and got a refund.)


That proposed legislation sounds like bureaucracy for bureaucracy's sake.

All of a sudden, a company would need a compliance team to make sure it doesn't get sued if anything goes wrong with its pre-orders.


A preorder is a sort of contract: a supplier promises a product with features x, y, and z on date nnnn. If they fail to deliver, that's a breach of contract. Already supported in most jurisdictions, I assume.


Exactly. And that contract can specify whatever the parties want. Eg that you would accept a peppercorn in lieu of the promised game.

https://en.wikipedia.org/wiki/Peppercorn_(legal)


I used to have the same thought, but though it was expensive up front, GDPR seems to be going smoothly. If they can do that, then stopping what is in my mind the predatory practice of pre-release sales can also be done.


How is this predatory? Nobody forces you to pre-order anything.


I've always wondered why articles in The New Yorker are so lengthy. Do journalists there get paid by the word or what?


This game was marketed not as a coherent game but as a mix of the best tropes from RPGs and the open-world genre. That's why they only showed one faked vertical slice, and development of that was scrapped and restarted from scratch. Scope creep and false, too-good-to-be-true marketing destroyed this game.


What a weirdly apologetic article. Could it have something to do with the remarks near the bottom?

CD Projekt has spent the year since the game’s launch trying to win back the faith of gamers—and investors.


Very odd to read an article in 2021 written for an (American!) audience that the editor judged needs an explanation of what D&D is. Can The New Yorker's audience really be THAT out of touch?


Not everyone is familiar with all aspects of (pop) culture.

Giving a brief explanation helps the parts of the audience that need it, and everyone else can just quickly gloss over it.

I am sure there's plenty of stuff that I found obvious that you don't know, and vice versa.


I am not mad about it, I am just surprised. It's been a long time since I've met an American of any class or generation who hasn't heard of D&D. D&D isn't Firefly; it's Seinfeld.


Can't please everyone, the next commenter is going to complain how some article assumes everyone knows the basic rules of baseball.


As a non-American, I have no clue how Baseball works.

It's just chonky dudes chewing tobacco and hitting a ball really hard. Then sauntering over to a white pillow for a rest, right? =)


From what I've heard, Baseball is like a simplified, Americanified version of Cricket. Something like Baseball used to be called Rounders and was played by schoolchildren.

(Of course, I have at most the vaguest idea how Cricket works.)


Baseball is just golf for the working class


Golf used to be the golf of the common people, too.



