Porn and the Pinball Wizards

Video and computer gamers had sent waves of concern rippling through a nervous culture before Columbine. Entertainment activities—pool halls, pinball parlors, rock ’n’ roll, and even Dungeons & Dragons—had long been the focal point for underground youth subcultures, and like these predecessors, computer games had been periodically suspect in a wider culture that saw them as unfamiliar. In the medium’s early years, computer and online gaming avoided public scrutiny in large part thanks to its relative obscurity. The communities that had formed had done so on computer networks that were still years away from breaking into the mainstream popular culture. Arcade and home video games, which caught the public eye much earlier, were easier targets for criticism. Simpler and less community-driven than their online counterparts, video games triggered early concerns about possible ill effects on children as much for the arcade environment that grew up around them as for the games’ content.

These worries began in the mid-1970s—just a few years after Atari’s release of Pong, the simplistic Ping-Pong-like game that kicked off the arcade-game revolution—when a little San Francisco Bay Area video game company called Exidy released Death Race. Aside from the lurid skeleton-headed racers depicted on the side of its cabinet, the 1976 arcade fixture didn’t have realistic graphics. It was a driving game in which players used a big plastic steering wheel and foot pedals to guide little blobs of light around the screen. The game’s designer, Howell Ivy, had originally created it with a smash-up-derby theme, but contract issues and hopes of making a splash on the market had persuaded Exidy to modify it. In the new version, players drove their cars around the screen trying to run down little stick figures; success was indicated by the replacement of the figure with a cross-shaped grave marker.

The designers knew they were pushing the boundaries of what was acceptable in the market, but it was a call from a Seattle reporter that showed they might have stepped further across the line than they had anticipated. The figures were undead “gremlins,” not people, Exidy CEO Pete Kauffman explained to critics. That didn’t matter. The game quickly triggered national attention, garnering write-ups in the National Enquirer and other, more serious newspapers. It even prompted a segment on TV’s 60 Minutes probing the psychology of video game players.

Paralleling these fears over violent games, a national discussion about the potentially harmful effects of Dungeons & Dragons was underway, fueled in part by speculation that Michigan State University student James Dallas Egbert III had disappeared after going into the university’s steam tunnels to play D&D in August 1979. The school’s newspaper initially played up the D&D connection, and the popular press followed. Eventually Egbert was found in New Orleans, where he’d fled after unsuccessfully attempting suicide at Michigan State. In 1981, Rona Jaffe wrote Mazes and Monsters, a book ostensibly about the Egbert case, which was adapted into a 1982 made-for-television movie starring Tom Hanks.

However, the exact details of Egbert’s disappearance, which ultimately had nothing to do with D&D, wouldn’t be revealed until 1984—four years after the young college student actually committed suicide—when the private investigator hired by Egbert’s parents wrote The Dungeon Master. [1] Nevertheless, the event and the media attention following the disappearance and the suicide helped spark the creation of concerned-parent groups across the United States.

By the mid-1980s, the parents’ movement was also calling for the regulation of video arcades on the local level, in much the same way that localities from New York City on down had once banned pinball machines. With individual arcade machines now found everywhere from movie theaters to corner stores, parents worried that kids would skip school and be exposed to bad influences while playing. A Long Island mother and Parent-Teacher Association (PTA) president named Ronnie Lamm rose to national prominence as a spokeswoman for the anti–video game cause. Her activism started with petition drives, speeches to community groups, letters to state politicians, and even calls to the local fire department to ask them to check whether crowded local arcades were violating any fire-safety laws. Her own community of Brookhaven ultimately imposed a moratorium on new permits for arcades. [2] Other towns went further, making it illegal to place video game machines near schools, or barring video games from being used during school hours altogether.

While parents’ groups fought to stop the spread of arcades, many eyes turned to a legal case originating in Mesquite, Texas—coincidentally, the same Dallas suburb that would ultimately become the home of id Software. In 1976, in part fearing connections with organized crime, the Mesquite city council had targeted the arcade chain Aladdin’s Castle with a variety of regulations, including one that would have blocked children under seventeen years of age from playing the games. The Fifth Circuit in New Orleans ruled that playing games was protected by the First Amendment. In 1982, the Supreme Court declined to rule on the constitutional issues, effectively granting those under seventeen the right to play arcade games.

This wave of concern wasn’t wholly focused on arcade environments. Critics including Lamm bolstered their arguments with the opinions of psychologists who criticized these games for being simplistic, aggressive, and potentially damaging to children. At this point, little medical research had been conducted to study the effects of interactive games, but prominent doctors were nevertheless ready with opinions. In 1982, even Surgeon General C. Everett Koop weighed in, saying, “There is nothing constructive in the games. . . . Everything is eliminate, kill, destroy.” That opinion was widely quoted in later public debates, even though Koop clarified his remarks the following day, noting that his off-the-cuff opinion was “not based on any accumulated scientific evidence.” [3]

Science and facts, though, make for boring punditry. Some critics found it easy to identify provocative elements in games, even when their readings verged on the absurd. Creative interpretations of Ms. Pac-Man and Donkey Kong, for instance, found rape metaphors hidden in the games’ subtext.

That isn’t to say that some video games didn’t cross well over the sometimes hard-to-define line of poor taste. A game explicitly celebrating sexual violence was created by Mystique, a company that designed a series of games with sexual content for Atari’s home video game system. Released in late 1982, Custer’s Revenge featured a tumescent, pixelated General Custer fighting his way past a hail of arrows to a woman tied to a pole at the other end of the screen. Success meant that a player had guided Custer successfully through the arrows and raped the smiling Native American woman. Groups that included Women Against Pornography, the National Organization for Women, and the American Indian Community House picketed a preview of the game at the New York Hilton. A second game by the same company, Beat ’Em & Eat ’Em, featured similarly obscene content.

Yet those blatantly disturbing games often received a harsh and immediate rebuke from the industry. In the case of Mystique, Atari sued the distributor’s parent company for tarnishing the game system’s image by associating it with pornography. [4] A collapse of the console business in the mid-1980s drew attention away from the industry, but the respite was temporary. By the late 1980s, Nintendo’s home game system had wholly revitalized the game market, and sales were stronger than ever.

Grounded in the cartoonish world of the Super Mario Bros. titles, Nintendo catered primarily to teens and younger children, even as arcade games were becoming ever more violent. Sega, Nintendo’s chief rival in this new generation of consoles, looked to this arcade content as a way to set itself apart.

When the arcade mega-hit Mortal Kombat was released in 1992, the ultra-bloody fighting game found a huge audience. The game pitted two martial arts heroes against one another, featuring “finishing moves” that took the action definitively beyond the territory explored by similar games. Once an opponent was beaten, players had options such as setting the enemy on fire, punching an opponent’s head off with a single uppercut, or ripping a character’s heart out of the chest. Nintendo and Sega each wanted the game for their home systems, but didn’t agree on how to handle the violence. Nintendo took out the bloodiest parts of the game. Sega didn’t, and went on to sell far more copies than its more cautious rival.

In late 1993, Senators Joe Lieberman and Herb Kohl called a congressional hearing on violence in video games. While some in the industry muttered that the hearing had been spurred in part by complaints from Nintendo, angry at seeing rival Sega gain ground with the sale of its more violent games, the lawmakers’ attention was in fact focused across the industry. In the hope of defusing some of the criticism, a large group of leading game companies, including Sega and Nintendo, announced early on the first day of the hearing that they had agreed to create a rating system for their games.

This peace didn’t last long. In the hearing, a Nintendo representative attacked Sega for its release of violent games and said his own company had tried to mitigate the industry’s worst excesses. In response, the Sega representative pulled out a bazooka-style gun accessory used with some Nintendo games and wondered aloud whether it was an appropriate means of teaching nonviolence to children.

Nevertheless, this move toward self-regulation pacified the industry’s critics for several years, and the political and media spotlight was shifting elsewhere just as Doom and Quake were released in the computer world, kicking off a whole new genre of bloody games. The console world was no less bloodthirsty, and as computer graphics grew exponentially better and sound quality improved, the gore got gorier. Industry spokespeople countered criticism by arguing that violent games, which were rated “Mature” under the new system, constituted only a small percentage of the titles released, were not intended for children, and were outsold in any case by competing titles, such as sports games. For the most part, members of the growing game communities ignored the background hum of the outside world’s opinion. It had little relevance to their daily lives unless a rating prevented a young fan from getting a game.

Then came Columbine, and the outside world’s view, skewed or not, took on a new importance.


  1. William Dear, The Dungeon Master: The Disappearance of James Dallas Egbert III (Boston: Houghton Mifflin, 1984).
  2. Ellen Mitchell, “Video Game Room Targeted by Towns,” New York Times, December 13, 1981.
  3. C. Everett Koop, public statement, November 10, 1982, http://profiles.nlm.nih.gov/ps/access/QQBBCF.pdf.
  4. Tim Moriarty, “Uncensored Videogames: Are Adults Ruining It for the Rest of Us?” Videogaming and Computer Gaming Illustrated, October 1983.
