Posts Tagged ‘technology’
March 23, 2015 | by Dan Piepenbring
- The word glitch “may derive from Yiddish words conveying slippage”—and glitch art explores the grating moments of slippage in our technology. It is, depending on whom you ask, new, old, incisive, crass, “beatified violence,” “the product of an elitist discourse and dogma widely pursued by the naïve victims of a persistent upgrade culture,” or just kind of neat to look at.
- If the art world is consumed by the effects of the Internet on our synapses, literary fiction is just the opposite: much of it seems unwilling—or unable—to engage with the texture of networked life. Novelists prefer to set their stories in technological vacuums, and it disadvantages them: “I don’t see these elements of contemporary life as destructive of narrative possibilities, but as sources for new. I’ve become something of a collector of fictional moments in which networked life matters. Not the simple inclusion of emails and other ‘found texts’ in a novel, nor casual mentions of characters owning phones and computers, but scenes in which these technologies allow writers to show something distinctly now.”
- Does a “safe space” have any chance of functioning as a truly intellectual space? “While keeping college-level discussions ‘safe’ may feel good to the hypersensitive, it’s bad for them and for everyone else. People ought to go to college to sharpen their wits and broaden their field of vision. Shield them from unfamiliar ideas, and they’ll never learn the discipline of seeing the world as other people see it.”
- Readers (or book buyers) in the UK have expressed a seemingly inexhaustible desire for nature writing—it sells well, it gets good reviews, it questions “the values of our current society.” “I know of nature books that are being released this year on the last Thursday in July, when [Helen Macdonald’s H Is for Hawk] was released. It’s now seen as the new magical date in publishing.”
- Mario Vargas Llosa on the state of literature: “The function of the critic was very important in establishing categories and hierarchies of information, but now critics don’t exist at all. That was one of the important contributions of the novel, once, too. But now the novels that are read are purely entertainment—well done, very polished, with a very effective technique—but not literature, just entertainment.”
January 23, 2015 | by Dan Piepenbring
- Resolve your literary feud the media-friendly way: (1) do it at a public event, (2) make sure there’s not a dry eye in the house, and (3) invoke the memory of Charles Dickens, just for the sport of it. More than fifteen years ago, V. S. Naipaul and Paul Theroux “fell out in a spectacularly bitter war of words, after Naipaul sold some of Theroux’s gifts at auction. The anger seethed for almost two decades. But on Wednesday the hatchet was resoundingly buried, with eighty-two-year-old Naipaul breaking down in tears after Theroux praised one of his most famous books at a literary festival in India, and compared the author to Charles Dickens.”
- Centuries ago, an excavation in Italy revealed a collection of some two thousand ancient Roman scrolls, most of them treatises on Epicurean philosophy. Unfortunately, the scrolls have a tendency to crumble in your hands, which makes it fairly difficult to read or even preserve them. People have tried taking knives to them (didn’t work), applying a gelatin-based adhesive (didn’t work), or just throwing them away (didn’t work). The latest solution: X-rays.
- The architect who bought Ray Bradbury’s Los Angeles house demolished it earlier this month, thus unleashing a furor from Bradbury fans. “It’s really been a bummer,” the architect said, adding in his defense that the home was exceptionally bland. “I could make no connection between the extraordinary nature of the writer and the incredible un-extraordinariness of the house.” Yesterday he hatched a new plan to honor the space: a wall.
- On Quvenzhané Wallis’s black Annie: “the fact that a black Annie has arrived on the scene at this particular cultural moment seems to me cruelly ironic … When it comes to persuading Americans about the virtue of selfishness, Ayn Rand has nothing on Annie … By making innocence seem invulnerable, Annie and other Teflon kids in fiction and film have helped to enable the widespread apathy about social inequalities that allows Americans to claim that our society is child-centered even though the percentage of children living in poverty in this country continues to grow.”
- Has technology accelerated life to the point of meaninglessness? On Judy Wajcman’s Pressed for Time: “Wajcman recalls seeing, at a nursing home, a daughter with one arm slung around her elderly mother, the other tapping on her smartphone. Though Wajcman acknowledges an initial negative judgment of this scene, she quickly reconsidered. The elderly mother was clearly not very aware of her surroundings and was likely comforted by her daughter’s presence. The daughter was able to provide this solace while engaging in other activities. (She could also have been reading a book or magazine.) Is this really to be condemned?”
January 21, 2015 | by Sadie Stein
On this date in 1976, the Concorde started flying commercial passengers on London-Bahrain and Paris-Rio routes. For the next twenty-seven years, this fleet of turbojets would ferry the rich and famous betwixt glamorous world hubs with unprecedented speed and luxury. And when the Concorde ended its reign, following a 2000 crash and a global post-9/11 flying slump, it was regarded as the end of an era.
For many—particularly anyone in its flight path—this was a relief. And since its inception, critics had regarded the gas-guzzling fleet as indefensible. In perhaps the ultimate eighties quote, Linda Evangelista declared, “If they had Nautilus on the Concorde, I would work out all the time.” It’s probably this tinge of decadence that’s burnished Concorde’s image in the years since its end. The tenth anniversary in 2013 spawned tributes and slideshows, images of spa-food menus and full bars, memories of the jet-setting clientele and the monogrammed napkins and crockery that these jet-setters famously stole as souvenirs. (Well, Andy Warhol anyway.) Read More »
January 9, 2015 | by Jason Z. Resnikoff
Watching the sixties and seventies through 2001 and Alien.
It was April 1968 and my father was sitting in a theater in Times Square watching 2001: A Space Odyssey, certain that what he was seeing wasn’t just a movie but the future. When it ended, he got up and walked out into Times Square, with its peep-show glitz and sleazy, flashing advertisements; he found the uptown subway beneath the yellow marquees for dirty movies like The Filthy 5; and through all of it, he thought that when humanity hurls itself into the depths of the cosmos, this is how we will do it. In the film’s iconic final shot, the space baby looks down at the planet to which it is no longer bound. Freedom, this shot says, is imminent.
My father was twenty-four then, and perhaps at his most world-historical: he was becoming an expert in computers. He’d worked for IBM in Poughkeepsie, New York, a corporate labyrinth of beige cubicles and epochal breakthroughs; a world of punch cards and reel-to-reel magnetic tape, where at least some of the employees were deadly serious about making sure to wear the company tie clip and then, once they were off duty, to switch to their own personal tie clips.
When 2001 premiered, he was working at Columbia University’s Computer Center, in the academic computing branch. I don’t think it’s unreasonable to say that the movie summed up everything my father was in April 1968. It became something of a talisman for him, a semisacred object invested with all the crazy hopefulness of his youth. For as long as I can remember, my father had talked about 2001. He told me often of HAL, of the monolith of evolution, of how glorious the future would be. Of course, when I finally saw the movie, well after the actual year 2001, it bored me out of my mind. Too slow, too bizarre. Ah, my father told me, that’s because evolution is slow, evolution is bizarre. It wasn’t until much later that I started to understand the movie—and, maybe, to understand my father. Read More »
December 22, 2014 | by Vikram Chandra
We’re out until January 5, but we’re re-posting some of our favorite pieces from 2014 while we’re away. We hope you enjoy—and have a happy New Year!
This is what ugly code looks like. This is a dependency diagram—a graphic representation of interdependence or coupling (the black lines) between software components (the gray dots) within a program. A high degree of interdependence means that changing one component inside the program could lead to cascading changes in all the other connected components, and in turn to changes in their dependencies, and so on. Programs with this kind of structure are brittle, and hard to understand and fix. This diagram was submitted anonymously to TheDailyWTF.com, where working programmers share “Curious Perversions in Information Technology.” The exhibits at TheDailyWTF are often embodiments of stupidity, of miasmic dumbness perpetrated by the squadrons of sub-Mort programmers putting together the software that runs businesses across the globe. But just as often, high-flying “enterprise architects” and consultants put together systems that produce dependency diagrams like this renowned TheDailyWTF exhibit. A user commented, “I found something just like that blocking the drain once.”
If that knot of tangled hair provokes disgust, what kind of code garners admiration? In the anthology Beautiful Code, the contribution from the creator of the popular programming language Ruby, Yukihiro “Matz” Matsumoto, is an essay titled “Treating Code as an Essay.” Matz writes:
Judging the attributes of computer code is not simply a matter of aesthetics. Instead, computer programs are judged according to how well they execute their intended tasks. In other words, “beautiful code” is not an abstract virtue that exists independent of its programmers’ efforts. Rather, beautiful code is really meant to help the programmer be happy and productive. This is the metric I use to evaluate the beauty of a program.
October 1, 2014 | by Dan Piepenbring
There’s a post over at Print Magazine about Frank Romano’s new book, History of the Linotype Company, which chronicles the rise and decline of the Linotype, a “glorious contraption” that was not so very long ago the industry standard for printing newspapers, magazines, catalogs, you name it. I’d be lying if I said I knew how it worked—to look at it is to imagine it taking your hand off—but fortunately there’s Wikipedia, which explains:
The linotype machine operator enters text on a ninety-character keyboard. The machine assembles matrices, which are molds for the letter forms, in a line. The assembled line is then cast as a single piece, called a slug, of type metal in a process known as “hot metal” typesetting. The matrices are then returned to the type magazine from which they came, to be reused later. This allows much faster typesetting and composition than original hand composition in which operators place down one pre-cast metal letter, punctuation mark or space at a time.
The machine was invented by Ottmar Mergenthaler, a German immigrant who set up shop in Brooklyn. At the height of its powers, it was used in eighty-six countries and in 850 languages. And the public domain is teeming with miscellany from the Mergenthaler Company, which produced an endless succession of handbooks, manuals, brochures, and pamphlets, among them Linotype’s Shining Lines, a sort of trade magazine with impeccably designed cover art: