Posts Tagged ‘video games’
September 9, 2015 | by Dan Piepenbring
- Sherman Alexie chose a poem by Yi-Fen Chou, a Chinese American, for this year’s Best American Poetry anthology. But Yi-Fen Chou was a pseudonym, it turned out, for Michael Derrick Hudson, a white guy. Now that Alexie has elected to include the poem anyway, Poetry Twitter is inflamed. But “I did exactly what that pseudonym-user feared other editors had done to him in the past,” Alexie says: “I paid more initial attention to his poem because of my perception and misperception of the poet’s identity. Bluntly stated, I was more amenable to the poem because I thought the author was Chinese American.”
- Arthur Heming, the Canadian “painter of the great white north,” was diagnosed as color-blind when he was a kid; this motivated the strange palette of black, yellow, and white he used for most of his career in the early twentieth century. “Thematically, he worked with scenes whose colors were appropriately blanched: winter hunting and trapping expeditions that he took for the Hudson’s Bay Company and alongside people of the First Nations. His narrow focus in painting mirrored his work as a traveler, novelist, and illustrator, and the commercial nature of his output certainly influenced the mixed reception he received in the art market. In Canada he existed as an outsider of both the trapping communities he traveled with in the north and of his peers in the fine art world.”
- Rob Chapman’s new cultural history of LSD reminds us that psychedelia’s day in the sun wasn’t just some trippy bullshit in a kandy-kolored vacuum—it was a short-lived, potent moment with lingering political aftereffects. “Chapman insists that Hendrix, far from wandering up his own psychic fundament, ended up directing psychedelia’s transformative sonic potency against the state. ‘After Woodstock [in 1969], the atrocities of carpet-bombing and village burning were soundtracked by the symbolic flag-shredding that takes place during Hendrix’s extraordinary rendition of “The Star-Spangled Banner.” ’ ”
- And maybe, for a later generation, early video games were just as mind-shattering as a tab of good acid: “I think Super Mario World was altering our perception long before acid or psilocybin mushrooms … the player irrevocably changes the landscape of Super Mario World. Empty space becomes solid matter, and you can access new parts of the game. Within the blink of an eye, the world, as well as the player’s view of the virtual world, transforms … Thirteen years later, I’d discover that LSD could similarly expose sediment layers of reality that I didn’t previously know about, thereby changing my perception in both immediate and permanent ways.”
- In 1906, a New Yorker named Julia Rice founded the Society for the Suppression of Unnecessary Noise, one in a continuing line of noble but ill-advised measures against the sounds of the city. In this case, the culprit was tugboat noise. “The campaign was related to the idea of a neurosis called ‘Newyorkitis’—an illness that arose from an unhealthy addiction to noisy environs. Her campaign was crowned with success: in 1907 Congress signed a law reducing the frequency of ships’ whistles in federal waters … However, Rice seems to have enjoyed quite a bit of noise in her life: her six children played instruments and the family allegedly kept a number of cats and dogs.”
March 25, 2015 | by Ted Trautman
Can Nintendo tell a proper story?
Nintendo and Netflix may be developing a Legend of Zelda TV series, the Wall Street Journal recently reported; or, as Time reported even more recently, they may not. Behind the will-they-or-won’t-they speculation lies a more complicated question: Can they? Do games like these bear expansion into full-fledged stories?
At first glance, a Zelda series seems like a savvy move: HBO’s Game of Thrones has proven that there’s high demand for vaguely medieval fantasies, of which Zelda—a franchise that made its debut in 1986, and that’s grown to include roughly seventeen games—is a prime specimen. And since Nintendo has gradually been losing its share of the video-game market for the past fifteen years, it has every reason to find other ways to wring more value from its globally recognizable intellectual property.
But games don’t translate as easily to TV or film as you might think. In his 2010 apologia Extra Lives: Why Video Games Matter, perhaps the most thorough defense to date of video games as art, the journalist and essayist Tom Bissell explains why: “The video-game form,” he writes, “is incompatible with traditional concepts of narrative progression.” Unlike books and films, games require challenge, “which frustrates the passing of time and impedes narrative progression.” Read More »
March 16, 2015 | by Dan Piepenbring
- On SimCity and the value of games that dared to make complex systems their protagonists: “SimCity is a game about urban societies, about the relationship between land value, pollution, industry, taxation, growth, and other factors … the game got us all to think about the relationships that make a city run, succeed, and decay, and in so doing to rise above our individual interests, even if only for a moment. This was a radical way of thinking about video games: as non-fictions about complex systems bigger than ourselves. It changed games forever—or it could have … ”
- Philip Roth’s misogyny is treated as a given these days; “the women are monstrous because for Philip Roth women are monstrous,” Vivian Gornick once wrote. But: “Maybe Philip Roth loves women? Maybe he, who offers a three-page description of female masturbation, is in fact an advocate for female desire? … While misogynists try to shame women, Roth celebrates women’s sexual power. It’s the men he is out to get.”
- “That sentence is shit. It’s got to be better. You asshole.” Matt Sumell on writing and doubt.
- Today in German words that dearly need English equivalents: verschlimmbessert, which can be roughly translated as “ ‘ver-worsebettered.’ In essence, it’s a combination of verbessern (‘to improve’) and verschlimmern (‘to make worse’). Here, then, is a verb that is able to express the idea of something simultaneously improving and worsening.”
- In which Benjamin Percy attends the dreadfully named Man Camp and enjoys a surprisingly rousing encounter with masculinity: “When men get together, they tend to speak with irony or rough-throated braggadocio, but [here] there was an uncommon sincerity to everyone’s tone. It caught me off guard.”
February 25, 2015 | by Dan Piepenbring
- In a 1914 publicity stunt—back when poets were free to partake of the great PR machine—Ezra Pound, W. B. Yeats, and four others gathered at a luncheon to eat a peacock. “The papers were alerted, and news of the meal spread far and wide, from the London Times to the Boston Evening Transcript.”
- Karl Ove Knausgaard, your humble correspondent, is traveling across America for The New York Times Magazine: “The editor proposed that I travel to Newfoundland and visit the place where the Vikings had settled, then rent a car and drive south, into the U.S. and westward to Minnesota, where a large majority of Norwegian-American immigrants had settled, and then write about it. ‘A tongue-in-cheek Tocqueville,’ as he put it.”
- Beethoven, Brahms, Mahler, Wagner: the Romantic legacy of these composers lives on … in first-person shooters. “The grandiloquent sounds of the nineteenth century are still alive in the new millennium … but only when someone is getting bludgeoned, bloodied, blown up, or decimated with automatic weapons … Even heavy metal isn’t heavy enough for most composers seeking to juice up their combat scenes. We need something with a little more Sturm und Drang.”
- Starting to write a book is hard. Then there’s the whole middle part—also difficult. And finally there’s the end, which is no cakewalk, either. Can we learn anything from the last sentences in famous novels? “For writers, the last sentences aren’t about reader responsibility at all—it’s a once-in-a-lifetime chance to stop worrying about what comes next, because nothing does. No more keeping the reader interested, no more wariness over giving the game away. This is the game.”
- On rereading Eileen Simpson’s Poets in Their Youth, a 1982 memoir of her turbulent marriage to John Berryman: “For a long time I could not shake the belief that these poets, all of them dead before their time from madness, self-neglect or suicide, paid a noble price for their pursuit of truth and beauty … I don’t think that anymore. Now, it’s Simpson herself who seems to be the hero … Simpson, who became a psychotherapist and went on to publish several books, writes with an almost uncanny clemency and a kind of cerulean objectivity. Where there might have been bitterness there is, instead, compassion.”
February 2, 2015 | by Dan Piepenbring
Tom Disch, who would’ve been seventy-four today, is best known for his science fiction and his poems, some of which were first published in The Paris Review. But he also wrote, in 1986, a text-based video game called Thomas M. Disch’s Amnesia, which has become a kind of curio in the years since its publication—an emblem of a brief time when gaming and experimental fiction shared similar agendas, and when “interactive novels” seemed as if they might emerge as a popular art form.
Amnesia begins the only way such a project could: in a state of total confusion. “You wake up feeling wonderful,” Disch writes,
But also, in some indefinable way, strange. Slowly, as you lie there on the cool bedspread, it dawns on you that you have absolutely no idea where you are. A hotel room, by the look of it. But with the curtains drawn, you don’t know in what city, or even what country.
January 6, 2015 | by Michael Thomsen
How The Evil Within and horror games manipulate their players.
Few relationships depend more on trust than the one you have with your computer. Without faith in the indifference of its automation, how could you share as much with it as you do? Video games are built around the fragility of this trust: they let us play with the horror in our dependence, experiencing the computer as a hostile entity within the safe, fictive frame of competition. To entertain us, games must defy our expectations. But their surprises can’t lapse into incoherence—if they do, our trust is violated, our fun spoiled.
Shinji Mikami’s games have tested the limits of that trust. He didn’t invent the horror video game, but in his twenty-plus-year career, he’s done more to popularize it than any other designer. His career began in the early nineties with a string of convivial family-oriented games, but it wasn’t until 1996’s Resident Evil that he made a name for himself. Combining graphic bodily horror and cryptographic claustrophobia—and set in a rotting mansion, no less—Resident Evil became a standard-setting high point. Playing the game felt like wearing a straitjacket, and this was part of the horror: its movement system was halting and cumbersome, and it used an incoherent array of fixed camera views, ensuring that even the basic rules for moving your character changed every few seconds, even during crises. The frustration informed the fear.
Nearly a decade later, in 2005’s Resident Evil 4, Mikami abused player trust by making the game’s fundamental action—shooting—unnervingly realistic. The animations of bodies taking bullets were lifelike to the point of inducing vertigo. Most games depend on some form of violent conflict, even if it’s only colored bits of candy exploding when they’re properly aligned, but we expect the games to have moral alibis for the violence they ask of us. But in Resident Evil 4, you played the role of an alien invading an innocent foreign culture—and watched, say, a farmer stumble after being hit in the knee, then slowly rise again, pressing past the normal human threshold of pain. The game forced its players to violate moral and cultural taboos, while craftily reinforcing the adrenal joy that came with those sins. It unmasked the cruelty of play.
Now, another decade later, Mikami has returned to horror with The Evil Within, which combines those earlier templates with a kind of graphic violence and semiotic incoherence reminiscent of pink cinema, a rich, revolting tradition of Japanese filmmaking that dates to the early sixties. Though the term is often used to describe Japanese erotica, pink cinema’s aesthetic is broader, with no real equivalent in the West. The scholar David Desser has described it as a brand of Japanese modernism—“achronological, arbitrarily episodic, acausal, dialectical, anti-mythic and anti-psychological, and metahistorical”—that aims to cast off the “bourgeois individualism” of American storytelling. Read More »