All Your History: The Video Game Crash of 1983: Continue?


The video game industry is one of the most powerful in all of entertainment globally, a multibillion-dollar juggernaut that is now larger than Hollywood.  From powerhouse consoles like the Xbox 360 and the PlayStation 3 to smartphones and Facebook, gaming has become truly ubiquitous and a major driver of innovation.  But the rise of the industry was never a sure thing, and in fact, at one point, the entire market cratered.  Major players were put out of business, the home console market vanished, and the entire art form nearly disintegrated.  The Video Game Crash of 1983 came close to wiping video games off the map for good.  But instead of being the death of a fad, the lessons learned from the astoundingly bad business practices of early gaming allowed for a renaissance of the medium.  In many ways, the gaming world we know and love today was shaped directly from the wreckage of 1983, when the entire industry ran out of lives.

-Game Over-

In the early ‘80s, these newfangled “video games” were changing the face of American business.  The biggest name in the industry was Atari, and in only three years, their annual revenue had gone from $75 million to $2 billion.  That made Atari the fastest-growing company in the history of the United States.  Games were making money hand over fist at a rate that defied belief.  Nothing had ever caught on that quickly with audiences before, or that profitably.

In fact, it had grown too quickly for its own good.  The actual business of gaming was still very young.  Nobody was really sure what it was that audiences wanted, how to properly monetize their products, or how to reward their own employees.  All they really knew was that, clearly, kids liked setting the high score.  So by the ‘80s, when the sheer money in the market was obvious, everybody jumped on the bandwagon.  Atari’s Video Computer System, also known as the Atari 2600, was dominant.  But it was competing with nearly a dozen other home consoles, including the Magnavox Odyssey 2, the ColecoVision, the Mattel Intellivision, and a host of others.  Some of these companies were electronics companies, some were toy companies, and some were actually switching markets completely just to get on board.  The market was completely oversaturated with consoles — and yet, more kept coming.  The industry was too young to realize that it was shooting itself in the foot.  And still, more kept coming.

Meanwhile, all of these consoles suddenly found themselves competing with another brand-new technology: the personal computer.  Unlike the specialized consoles, PCs were generalized devices that could be used as both business machines and gaming platforms, which made them a better overall investment for many households.  PCs also had long-term memory, which allowed players to save their progress instead of having to start over every time.  So not only were these numerous consoles competing with each other, but also the exploding PC market.

So clearly, the industry didn’t yet have the best business practices for their hardware.  Perhaps it wasn’t a surprise, then, that they didn’t have the best practices for their software, either.  At the time, Atari and the others did not want to recognize or reward their top programmers.  These were the guys who actually made the games, and in that era, each game was made by a single programmer.  A programmer would typically only make $20-30,000 a year, even if his game went on to sell millions.  And he wouldn’t even be appreciated for his work, because credits weren’t allowed.  Atari wanted Atari to be the only “creator” associated with any work.  Individuals were merely cogs in the machine.  By 1979, four programmers had had enough.  That year, David Crane, Larry Kaplan, Bob Whitehead, and Alan Miller all left Atari to start their own company, which they called: Activision.

It would become a watershed moment for the industry, and one of the key causes of the coming Crash at the same time.  Before Activision, every game was first-party, that is, all games for Atari were made by Atari, all games for the Intellivision were made by Mattel, and so on.  Because of that, Atari and the others sold their consoles for cheap, but expected to make their money back on software sales.  Activision was the world’s first third-party developer, and as such, if a consumer bought an Activision game for the Atari 2600, Atari wouldn’t see a dime.  The Atari console had no locks on it; anyone could make a cartridge to work on it.  Third-party development completely shattered the business model the entire industry was based on.  Terrified, Atari did what all big companies do: it sued.  Unfortunately for them, Atari lost.  In 1982, the US judicial system publicly affirmed third-party developers’ right to exist.  Which meant that, once again, everybody jumped on the bandwagon.

So not only was the market saturated with different consoles, now, it was absolutely flooded with games.  Companies that had no entertainment experience whatsoever — like Quaker Oats and Purina — started making video games, because they seemed like an endless gold mine.  With too many games came the inevitable: a dramatic drop in prices.  Entire companies sprang up with the intention of making bad games that they could sell cheap.  And none of these sales went back to the companies that actually made the consoles.

Still, it all might have worked out if Atari had maintained its standard of quality.  There will always be bad games, but if Atari could prove that their products were superior, audiences would remain loyal.  But after the success of Activision, Atari’s best programmers left in droves, fed up with low pay and no recognition.  And then, on top of the talent vacuum, the worst happened: business executives started losing touch with the real world.

For example, in 1981 Atari acquired the license to the arcade sensation of the year, Pac-Man.  So, the executives wanted a version of Pac-Man on the Atari 2600.  Seems like a good idea, right?  It is, unless you do it the way Atari did it.  First of all, Atari only got the license late in the year, but wanted the game out in time for Christmas, which gave the programmer almost no time to actually make the game.  Second, Atari calculated that the game would sell 12 million copies… but they had only sold 10 million consoles!  They actually believed that people would want the game so badly, that two million people who didn’t have a console would buy one just for Pac-Man.

On top of all that, they didn’t ask one of their own programmers to make the game, possibly because all their best talent had already left.  Instead, they contracted an outside freelancer.  The problem was, they agreed to pay him a royalty for every unit manufactured.  Not sold: manufactured.  And they’d already promised to make 12 million copies.  In other words, the programmer was already guaranteed a giant payday before he’d even made the game, which didn’t give him a big incentive to make the game any good, especially since he didn’t have enough time to make a good game anyway.

The result was a catastrophe.  It was slow.  It was ugly.  It barely resembled the game people had fallen in love with.  The game had been a success in arcades for its vibrant colors, smooth animations, and endearing sound effects.  This game… had none of those.  Atari printed its 12 million copies, and only sold 7 million.  And while that was still good enough to make it the top-selling game in their console’s history, it left them with 5 million unsold cartridges sitting in their warehouse.  5 million.  The console’s second best-selling game — Pitfall by Activision — only sold 4 million.  Pac-Man was a disaster.

And apparently, they didn’t learn from it.  The next year, the business executives signed another deal, this one with Steven Spielberg to make an ET game.  ET promised to be one of the biggest movie hits of the year, and it was thought that a game based on it would also sell really well.  Again, not a bad idea.  But once again, they gave their programmer almost no time to do it.  Specifically, six weeks to design, program, quality test, manufacture, and ship.  Even by the standards of the time, that was insane.

On top of this, to get the rights to ET, Atari paid Spielberg $25 million up front.  Then on top of that, Atari ordered 5 million copies to be produced.  Only Pac-Man had sold more than 5 million copies in Atari’s history.  Once again, the executives believed that the strong ET name alone would send people in droves to buy the game.

It didn’t.  After getting burned by Pac-Man, audiences didn’t buy the game just based on the name.  They waited to see if it was any good.  And it shouldn’t be a surprise that a game made in only six weeks resulted in one of the worst pieces of interactive entertainment ever produced.  It was so terrible… that almost nobody bought it.  Once again, Atari had made a huge investment that resulted in 5 million copies sitting in a warehouse.

So, Atari was making horrible games, and selling them at full price.  Meanwhile, competitors were also making horrible games, but selling them at bargain bin prices.  Therefore, no one bought Atari’s own games anymore.

It all led to Atari’s December 1982 earnings statement.  They told shareholders that, due to all these incredibly negative factors and horrible business decisions, the company… would still grow by 10-15%.  Which is actually great!  The problem was, before the announcement, they had told shareholders to expect 50% growth.  It was a huge shock to Wall Street.

And it sank the whole boat.  Stock in Atari’s owner, Warner, crashed.  Sales of consoles and games, both oversaturated, tanked across the board.  Rumor has it that, unable to sell their tens of millions of surplus cartridges, Atari sent them all to a landfill in New Mexico, bulldozed them, and then buried them in cement to cover up the embarrassment.  Atari had been the banner of the gaming revolution, once upon a time the fastest-growing company in American history.  Now, it was $500 million in debt.  This was the Video Game Crash of 1983.

And so, the gaming fad should have ended.  Analysts predicted that it was game over, that audiences had moved on, that investors and retailers alike should look elsewhere.  But the little medium struggled on.  PCs continued their meteoric rise to prominence, and computer games never suffered any recession; it was this platform that allowed Activision to survive.  Arcades, too, continued on strongly.  Eventually, a few ambitious businessmen in Japan dared to think the impossible.  Maybe home consoles weren’t just a boom-and-bust fashion.  Maybe audiences really did want dedicated, powerful entertainment devices.  Maybe Atari’s horrendous business decisions were more to blame than the medium itself.  Maybe it was time for their company to take the plunge.

The company was called… Nintendo.  They had already released a home console in Japan, the Famicom, which had sold well.  But to bring it to the jaded American population, they’d have to reinstill confidence in the very idea of home systems.  They did it by locking down their console.  Unlike the open Atari 2600, which anybody could make a game for, the Nintendo Entertainment System would only work with cartridges that had a specific chip in them.  Only Nintendo itself would hand out these chips.  In other words, Nintendo would have full approval over any games coming to its system, and could reject inferior products.  Aside from making sure that Nintendo games were fun across the board, it also helped them out from the business side.  Atari had never seen a dime from third-party developers.  But since only Nintendo had the magic chips, a third-party developer would have to agree to pay Nintendo royalties if they wanted to publish games on the NES.  Atari had treated third-party developers like the enemy.  Nintendo saw them as an untapped gold mine that could add to the diversity and quality of their console.
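The lockout scheme described above can be thought of as a lock-and-key handshake: matching chips in the console and the cartridge each generate the same secret stream, and the console only lets the system boot while the two streams agree.  Here is a minimal illustrative sketch of that idea in Python — the actual 10NES algorithm was a Nintendo trade secret, so the seed, stream, and function names below are all hypothetical stand-ins for how such a check works in principle:

```python
import random

def cic_stream(seed, n):
    # Lock (console) and key (cartridge) chips both compute the same
    # keyed pseudorandom stream from a shared secret seed.
    rng = random.Random(seed)
    return [rng.randrange(16) for _ in range(n)]

def console_boots(console_seed, cartridge_seed, checks=8):
    lock = cic_stream(console_seed, checks)    # chip inside the console
    key = cic_stream(cartridge_seed, checks)   # chip inside the cartridge
    # Any mismatch between the streams holds the CPU in reset.
    return lock == key

LICENSED_SEED = 0x10   # hypothetical secret handed out only by Nintendo
print(console_boots(LICENSED_SEED, LICENSED_SEED))  # licensed cart boots
print(console_boots(LICENSED_SEED, 0xBAD))          # unlicensed cart does not
```

Because only Nintendo distributed the matching "key" chips, passing this handshake required a license — which is exactly how quality control and royalties were enforced at the hardware level.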

And the rest is history.  The NES destroyed Wall Street’s expectations of failure by becoming a towering commercial success and an icon of the medium.  By proving that the home console market was never dead, just misunderstood, Nintendo began a legacy leading straight to modern powerhouse systems.

Today, third-party development is the industry-wide norm, and console makers monetize it through their locked systems.  The industry has enough experience under its belt to avoid the glaring mistakes of the past.  But remember, this is not because publishers are smarter now than they were before; merely that they have the benefit of hindsight.  The decisions that led to the crash of an entire industry have been studied and studied again, until they have become unthinkable in today’s boardrooms.  There will always be games that don’t sell as well as expected, but you will never see warehouses full of unsold product again.  The multibillion-dollar juggernaut is as strong and vibrant as it is today directly because of the Video Game Crash of 1983, the year the art form we all know and love seemed to die for good… and then, against all odds, respawned to fight another day.
