

Sunday, May 29, 2016

The Origin of Crunch Time for Game Development -- and Why It Should Die

Recently, the issue of 'crunch time' in the games industry has come to the forefront of discussion – crunch time being the still-common practice of forcing employees to work 60-, 70-, or 80-hour weeks or more, sometimes for months on end, in order to complete games. Crunch time is in the news because of an IGDA report on the issue (and a promise to work with publishers on it, because according to an IGDA survey 37% of employees report they are not compensated for crunch time).

What really brought the issue to a boil was the derisive response to the IGDA report by industry veteran Alex St. John, who essentially said that crunch time is just part of the business – he concluded his screed by saying "Don’t be in the game industry if you can’t love all 80 hours/week of it — you’re taking a job from somebody who would really value it." A number of people, Rami Ismail among them, have already fisked St. John's article (his own daughter was among those who wrote in opposition), so I don't feel the need to address his points directly.

Still, I want to explore why crunch time arose in the games industry and became a standard part of the production process for many years – and why most of those reasons are no longer valid for most games. Let me make one thing clear up front: I think that when crunch time occurs, everyone should be well compensated for it, regardless of the reasons behind the crunch. And while crunch time is often due to avoidable problems like feature creep or bad management, there are times when it's the least bad alternative – but those cases should be rare.

The Origin of Crunch Time in the Games Industry
Crunch time arose in the games industry because of the need to hit specific ship dates – a need dictated by the technology and the business model of the time, the 1980s, as the initial videogame console era (led by Atari) came to a close. Whether games were being created for consoles (after 1985, this meant the Nintendo Entertainment System or NES, and later the Sega Genesis and other systems) or for personal computers (initially the Apple II and Commodore 64), they all shared some common features. The games were put onto some form of media (a cartridge or a floppy disk), packed into a box along with instructions, and then sold in retail stores.

This meant that manufacturing had to be prepared in advance for the final game code (the "gold master") to be ready for duplication, along with all of the packaging. Retailers had to be told of the game's ship date months in advance, so they could place orders. All marketing efforts, including advertising and PR, had to be created months in advance, with large amounts of money spent, in anticipation of a certain ship date that had to be chosen many months before the software was finished.

Add to all of this the fact that once the software was duplicated onto its medium, there was no way to change it or patch it – there was no online connection. Theoretically, you could perhaps recall all the packages you had shipped and replace the software, but the cost would be staggering – and I don't know of any cases off-hand where that was done.

Because of all these factors, the games had to be finished, tested, and debugged by a certain time or else the publisher would lose a lot of money. A missed ship date could mean advertising appeared months before release, rendering it useless (or worse than useless, if it got people mad).

Then there were the problems particular to early console games, which were all on cartridges (memory chips in a proprietary plastic case) manufactured only by the console maker. If you wanted to put out a game for Nintendo's Super NES, for instance, you not only had to have your game approved by Nintendo (which meant the game had to be submitted in final form), but Nintendo would manufacture the game as well. You had to tell Nintendo months in advance not only when you would deliver the game, but also how many cartridges you wanted made (and pay a large part of the bill up front, too). If you missed your delivery date, your manufacturing window might be gone for weeks or months – so you might well miss the all-important holiday season. Thus, a huge incentive to go into crunch mode and get the game done.

Don't forget, too, that manufacturing was only part of the issue – huge amounts of money were spent on time-sensitive marketing and PR. Advertising in magazines had to be created and scheduled months in advance. If you pushed your game's release back by a month or several months, you'd miss the issues where your game was advertised – or worse, where your game was featured on the cover or previewed in a feature article.

Many video game companies in the '80s and '90s were positioning themselves to go public through an IPO – that was the primary way companies paid back initial investors at the time, and employees could realize huge amounts through their stock options. Thus companies like Electronic Arts spent years focused on meeting quarterly financial goals in order to look good prior to an IPO – which meant meeting projected sales in a quarter, which meant doing whatever you could to ship software on the date you had promised many months earlier. Crunch time again, with the carrot that your crunching could make you rich someday through a more successful IPO.

Finally, there was typically an extremely limited window in which to generate revenue from your game. Retailers didn't keep games around for more than a few weeks unless a title was a very strong seller. Over 90% of a game's lifetime revenue might occur in the month after its release – so if that month was not perfectly timed with all marketing efforts and the peak game-buying season, you could lose millions. Hence, crunch time.

Now, none of those reasons to go into crunch mode meant that you shouldn't compensate employees for crunching. Some companies did compensate directly, through added pay, bonuses, or vacation time. Sometimes it was more indirect (the stock options might end up being worth more). Often, though, crunch time just wasn't compensated at all – merely expected.

Why Crunch Time Should Now Be Rare
If you think about the reasons I listed for the existence of crunch time, most of them just don't apply in today's game market. Many of the most lucrative games earn their revenue over years, and are essentially services – like League of Legends or Clash of Clans. New content is delivered constantly, every few weeks, but it really doesn't matter what day it ships on. There are no manufacturing times or long marketing lead times. New content is simply created and presented to players. Crunch time would be burning out your employees to no financial purpose.

As for getting the game perfect before it ships, that does matter to some degree – but it's nowhere near as important as it was when you couldn't patch a game. In fact, many games now make a point of having early open betas to let people test out the game and help refine the design, as well as add to the marketing excitement. Yes, games that are sold in retail stores still have some time constraints that might encourage crunch – but not to the degree they once did.

Really, crunch time should be rare if you are at all good at scheduling and don't succumb to feature creep in your game design. Yes, there are still good reasons for crunch – for instance, you're a small developer and you're going to run out of money unless you get your new game out the door and generating revenue. But managers at all levels should strive to avoid crunch whenever possible. When it's necessary, make sure you are compensating your employees as generously as you can for the immense sacrifice you're asking them to make.
