This article heavily uses sources such as Jason Schreier, journalist for Kotaku.com, and Jim Sterling, a YouTuber/writer in charge of The Jimquisition, a series regularly looking into the unsavory practices of the games industry. Some links contain profanity.
Video game crunch. It almost sounds like a name for a brand of tasty breakfast cereal, but it’s far more sinister than that silly joke I tried to make.
Now more than ever, we have witnessed incredible leaps in the design, graphics and scope of video games. Compared to the PlayStation 2 era, when I first started playing games religiously, the PlayStation 4 scoffs at the graphical achievements of titles such as Metal Gear Solid 3: Snake Eater, Shadow of the Colossus or Resident Evil 4.
Though amazing progress has been made in the presentation of games, the labor practices of the companies making them have grown increasingly repugnant. CEOs of major video game publishers like Activision and Electronic Arts (EA) continue to earn multi-million dollar salaries while paying developers a pittance in comparison and laying off swaths of their teams.
Monetization in games has become commonplace through loot boxes, purchasable items that make the game easier or give players an advantage in competitive multiplayer, and paid downloadable content (DLC). That's not even to mention the abusive individuals working at the top of several companies (examples include Randy Pitchford of Gearbox Software and several higher-ups at Ubisoft).
I love video games. Playing them is getting me through the COVID-19 pandemic. However, as these issues in gaming become worse and more noticeable, I can’t ignore the damage being done to the people sacrificing their health and well-being to ensure a game releases on schedule for no extra compensation.
Let’s dig deeper into the cost developers are having to pay so that horses in Red Dead Redemption 2 have realistic shrinking testicles and can poop in real-time.
What is Crunch?
Crunch has a very fluid definition, as it doesn’t just apply to the video game industry. It happens in Hollywood, TV, journalism, factories and just about any other business where deadlines and perfection are the most important milestones to meet.
“Crunch time” is usually the period right before the release of a product when staff regularly has to work 60-100 hours a week to ensure specifications are met. These are irregular overtime hours that go unpaid the majority of the time, as developers are typically paid on a salary basis.
However, in some places, crunch lasts through most of production in an effort to meet milestones, such as a showcase at a convention, playable demos at tech events, presenting the game to the publisher for review or, in some cases, reworking a part of the game that tests poorly with audiences and requires a complete retooling.
It is difficult to nail down a single definition of crunch because of the variables involved. The amount of crunch expected depends on the type of game being made, the budget available, the publisher in charge, the age and size of the staff and the tightness of the release date. Depending on these variables, crunch can last a couple of weeks or several months.
If a studio head asks to delay the release date, cuts features and other content from the title or outright refuses crunch, they risk the studio being shut down and putting their staff out of work. Yet crunch is all-encompassing in the industry, from huge productions like The Last of Us Part II and Red Dead Redemption 2 to indie titles.
Some publishers even factor crunch into the development schedule, using it as a cost-cutting measure to make games bigger and more expansive while sticking to a short timeframe.
In an interview with Schreier, Tanya X. Short, a co-founder at the indie studio Kitfox Games who has also worked in AAA development, bemoans the “short-sighted and disgusting” practice.
“If your milestone is more than two weeks away and you can tell you’re not going to make it, you have to cut features or extend the milestone,” Short said. “Those are your options. It hurts to cut what feels like limbs off your baby, but sometimes it’s necessary. Certainly more necessary than pointless, burn-out crunch—which, if you’re lucky—will only leave you sick (physically or creatively)… and if you’re unlucky will make you miss your milestone, get sick, and start you down a path towards bad production practices.”
At least in the video game industry, crunch is a normalized part of the culture, according to Schreier, as it has been going on for decades and first reached the public light in 2004. That year, game designer Erin Hoffman published a blog documenting the personal strain crunch put on her significant other, who worked at EA.
“No one works in the game industry unless they love what they do,” Hoffman said in the blog under the username “EA Spouse.” “The love of my life comes home late at night complaining of a headache that will not go away and a chronically upset stomach, and my happy, supportive smile is running out.”
Company executives such as Dan Houser of Rockstar Games have even bragged in interviews about how long their staff works.
However, there are ways to push back against the culture of crunch. Studios can work a consistent year-round schedule instead of crunching for long stretches and then taking a break, as the team behind the Hitman games does.
Developers can also unionize with the help of organizations such as Game Workers Unite, a grassroots organization that provides support and education to workers in the industry.
Customers can also speak out, both on social media and with their wallets, by condemning publishers who use crunch and boycotting games that were developed with it.
Video games are a multi-billion dollar industry, but as the budgets and scope of AAA titles continue to grow, so do the practices that leave countless passionate developers damaged mentally and physically.