June 22, 2017
On Games

When Lore Bores
By Oliver Lee Bateman

Video-game developers continue to search for the golden ratio of game play to storytelling.

Still from Day of the Tentacle Remastered, an updated version of the 1993 game.

My first video-gaming memories are clouded by Amnesia. That game, which comprised nothing more than white text on a black background, haunted me for years. My father bought it for the PC because he saw it on sale at Sears, brought it home, installed it via the command prompt, and then abandoned it. My brother had no use for it, either. They both played Microsoft’s Flight Simulator 3.0, for which we had purchased a joystick, and Tetris, which appeared on the home computer long before its popularity exploded on Nintendo’s handheld Game Boy.

Tetris and Flight Simulator were “real” games, you see. Push the buttons in a skillful way and you would win. You could trump your high score or perfect your landing at Meigs Field. But Amnesia was just a story, a playable story, and from a game-play standpoint it wasn’t even a particularly good one. Like most text-based games, it relied on commands like “eat pizza” (always a favorite of mine) to advance the plot, and like most poor text-based games, it didn’t recognize many of the commands that the player typed.
January 7, 2016
On Games

Ready Player None
By Michael Thomsen

Talking to Jonathan Blow about his new game, The Witness.

From The Witness.

“Don’t print this,” Jonathan Blow tells me. I’ve just asked him how his game The Witness is going to end, having spent an hour playing it alone at the Bryant Park Hotel—in a suite I’d discovered was actually Blow’s personal room when I got a glass of water. He’d gone to the lobby so I wouldn’t feel like I was being watched as I played. I felt immediately conscious of being in someone else’s space as I stepped through the bedroom to reach the bathroom sink. The bed was still unmade; a small bag sat agape on a chair beside a pile of clothes in the corner. Blow’s games excel at making one conscious of these things: of being in someone else’s territory, at once intimate and opaque. Like unknowingly stepping into someone’s bedroom, it’s natural, when you play his games, to want to make sure you can find your way back out again, even as you think about going further in.

Blow is the designer of two commercial games—2008’s Braid and now The Witness, due out later this month—and he’s as much a point of fascination as his creations. A 2012 profile in The Atlantic by Taylor Clark called him “the most dangerous gamer.” Though Braid added, by his own admission, “a lot of zeroes” to his bank account, he lives in a largely unfurnished apartment in Oakland, displaying what Clark described as “a total indifference toward the material fruits of wealth.” His longtime friend and programmer, Chris Hecker, told Clark, “You have to approach Jon on Jon’s terms. It’s not ‘Let’s go out and have fun.’ It’s more like ‘Let’s discuss this topic,’ or ‘Let’s work on our games.’ You don’t ask Jon to hang out, because he’ll just say ‘Why?’ ”
March 25, 2015
On Games

It’s-A-Me, Ishmael
By Ted Trautman

Can Nintendo tell a proper story?

From The Legend of Zelda: Wind Waker.

Nintendo and Netflix may be developing a Legend of Zelda TV series, the Wall Street Journal recently reported; or, as Time reported even more recently, they may not. Behind the will-they-or-won’t-they speculation lies a more complicated question: Can they? Do games like these bear expansion into full-fledged stories?

At first glance, a Zelda series seems like a savvy move: HBO’s Game of Thrones has proven that there’s high demand for vaguely medieval fantasies, of which Zelda—a franchise that made its debut in 1986, and that’s grown to include roughly seventeen games—is a prime specimen. And since Nintendo has gradually been losing its share of the video-game market for the past fifteen years, it has every reason to find other ways to wring more value from its globally recognizable intellectual property.

But games don’t translate as easily to TV or film as you might think. In his 2010 apologia Extra Lives: Why Video Games Matter, perhaps the most thorough defense to date of video games as art, the journalist and essayist Tom Bissell explains why: “The video-game form,” he writes, “is incompatible with traditional concepts of narrative progression.” Unlike books and films, games require challenge, “which frustrates the passing of time and impedes narrative progression.”
February 2, 2015
On Games

A Question Without an Answer
By Dan Piepenbring

The cover of Amnesia.

Tom Disch, who would’ve celebrated his birthday today, is best known for his science fiction and his poems, some of which were first published in The Paris Review. But he also wrote, in 1986, a text-based video game called Thomas M. Disch’s Amnesia, which has become a kind of curio in the years since its publication—an emblem of a brief time when gaming and experimental fiction shared similar agendas, and when “interactive novels” seemed as if they might emerge as a popular art form.

Amnesia begins the only way such a project could: in a state of total confusion. “You wake up feeling wonderful,” Disch writes,

    But also, in some indefinable way, strange. Slowly, as you lie there on the cool bedspread, it dawns on you that you have absolutely no idea where you are. A hotel room, by the look of it. But with the curtains drawn, you don’t know in what city, or even what country.
January 6, 2015
On Games

Trust Issues
By Michael Thomsen

How The Evil Within and horror games manipulate their players.

A screenshot from The Evil Within.

Few relationships depend more on trust than the one you have with your computer. Without faith in the indifference of its automation, how could you share as much with it as you do? Video games are built around the fragility of this trust: they let us play with the horror in our dependence, experiencing the computer as a hostile entity within the safe, fictive frame of competition. To entertain us, games must defy our expectations. But their surprises can’t lapse into incoherence—if they do, our trust is violated, our fun spoiled.

Shinji Mikami’s games have tested the limits of that trust. He didn’t invent the horror video game, but in his twenty-plus-year career, he’s done more to popularize it than any other designer. His career began in the early nineties with a string of convivial family-oriented games, but it wasn’t until 1996’s Resident Evil that he made a name for himself. Combining graphic bodily horror and cryptographic claustrophobia—and set in a rotting mansion, no less—Resident Evil became a standard-setting high point. Playing the game felt like wearing a straitjacket, and this was part of the horror: its movement system was halting and cumbersome, and it used an incoherent array of fixed camera views, ensuring that even the basic rules for moving your character changed every few seconds, even during crises. The frustration informed the fear.

Nearly a decade later, in 2005’s Resident Evil 4, Mikami abused player trust by making the game’s fundamental action—shooting—unnervingly realistic. The animations of bodies taking bullets were lifelike to the point of inducing vertigo. Most games depend on some form of violent conflict, even if it’s only colored bits of candy exploding when they’re properly aligned, but we expect the games to have moral alibis for the violence they ask of us.
But in Resident Evil 4, you played the role of an alien invading an innocent foreign culture—and watched, say, a farmer stumble after being hit in the knee, then slowly rise again, pressing past the normal human threshold of pain. The game forced its players to violate moral and cultural taboos, while craftily reinforcing the adrenal joy that came with those sins. It unmasked the cruelty of play.

Now, another decade later, Mikami has returned to horror with The Evil Within, which combines those earlier templates with a kind of graphic violence and semiotic incoherence reminiscent of pink cinema, a rich, revolting tradition of Japanese filmmaking that dates to the early sixties. Though the term is often used to describe Japanese erotica, pink cinema’s aesthetic is broader, with no real equivalent in the West. The scholar David Desser has described it as a brand of Japanese modernism—“achronological, arbitrarily episodic, acausal, dialectical, anti-mythic and anti-psychological, and metahistorical”—that aims to cast off the “bourgeois individualism” of American storytelling.
August 29, 2014
On Games

The Dreams in Which I’m Dying
By Michael Thomsen

The vanity of the zombie apocalypse.

There are few things as narcissistic as an apocalypse fantasy. The apocalypse doesn’t mean the end of the world, just the end of humankind, and considering such a fate can lead us into a sentimental peace with the present day. Suddenly, in spite of all its flaws—flaws that might be harder to accept in less dire circumstances—the world seems worth keeping intact.

In recent years, zombies have been a catalyst of fictional doom in every conceivable manner, from popular horror and comedy to moral parable and literary send-up. They offer us freedom from death in exchange for our subjective consciousness and social identity. But we’d sooner have death, if it means our egos can be spared for a bit.

The Last of Us, a PlayStation game whose latest version was released last month, is a story about a zombie apocalypse, but it wasn’t supposed to be. Its creative director, Neil Druckmann, said in a 2011 interview that he wanted the game to be more of a love story, one between a middle-aged man and a fourteen-year-old girl. So maybe it’s more accurate to describe The Last of Us as a story about a kind of taboo love that requires a zombie apocalypse to normalize—and, by extension, a story that, through love, gives the fungal zombification of humanity a silver lining. Our species may be on the verge of extinction, but if we’re able to fall in love and learn a little about ourselves along the way, it can’t be all bad. Love is where all educated people go to bury their narcissism.