Objects of Despair: Drones

By Meghan O'Gieblyn

Inspired by Roland Barthes, Meghan O’Gieblyn’s monthly column Objects of Despair examines contemporary artifacts and the mythologies we have built around them.

A drone (Photo: Pexels)

There was a big magazine story several years ago—I don’t remember where—about drone pilots who worked at an air force base in Nevada’s desert. The pilots spent their days in a windowless control room at this complex, which was some distance outside of Las Vegas, operating drones in Iraq—or maybe it was Afghanistan. I can’t seem to remember any of the details precisely. At the time, drones were still novel, and the central thrust of the article seemed to be the ethically troublesome fact that a strike could be enacted from a distance of 7,500 miles. One detail I remember clearly was that the base was deliberately remote, so that the pilots, after their shifts were over, were forced to drive several hours back to civilization. Whoever was in charge decided that humans who had been at war should not be allowed to simply zip home and eat dinner with their families, or grab drinks with friends. They needed time alone in their cars to decompress and segue back into ordinary life, to transform from soldiers into civilians.

After reading this article, I tried to write a short story about a drone pilot who worked at this base. The story took place entirely during his drive home, and was largely interior, unfolding in the character’s mind. It was the kind of premise that interested me at the time. I envisioned a claustrophobic moral drama unfolding against the desert landscape as the car hummed across the interminable highway and the sun went down, turning the mountains the color of blood. But in the end, I couldn’t finish the piece. I could not imagine myself into the pilot’s head. Had he truly been at war? Or had he spent the afternoon in a Naugahyde recliner, pressing buttons?

This is the enduring question of foreign policy in the age of the drone: Are we at war? A strike kills six civilians in Yemen. The headline scrolls across the ticker on an airport flatscreen, appears on a news app amid the noonday quiet in a corporate office park. There is little or no context, little or no commentary. Outside, the sky is a clear and endless blue. The drone embodies the remoteness of modern warfare, but more than that, its thoughtlessness. It is the symbol of wars that are without leaders, of conflicts so diffuse and underreported they seem to have no face, no soul. Drone is a type of bee that is believed to be entirely mindless. It also describes the monotonous hum that machines make—or humans, when they are speaking like machines. Both meanings reflect our era of perpetual war, which is so unvaried and automatic that it can transition seamlessly from one presidential administration to the next, radically different one. (As the bumper sticker on my neighbor’s car puts it: AMERICAN FOREIGN POLICY DRONES ON.) At the time, I thought my failure to write the story was due to an epistemological problem—that I, a civilian, could not understand the psychological demands of war. But the problem was actually ontological. I was looking for consciousness in the byways of bureaucracy, searching for thought and conviction where there was none.

Drones, of course, serve many other purposes besides strikes. They are used for mapping, farming, and surveillance. Children are often given drones as toys, which is strange. Parents who would never dream of giving their child a toy gun or military figurines allow them to play with drones. We insist that the toys are not modeled after the weapons, even though they share a name. Children are not dropping bombs; they are filming, exploring, collecting information. But this is merely doubling down on the rhetoric of contemporary warfare—we are not killing, we are policing, mediating, surveilling. The myth of American exceptionalism has always relied on conflating war with the more mundane work of regulation and monitoring; it’s only recently that our technology has caught up to our dogma.

News footage is increasingly shot by drones. In fact, the machines are so ubiquitous on major networks that it’s easy to forget how recently they appeared. I can still remember when CNN first acquired one, on the fiftieth anniversary of the civil rights march in Selma—mostly because Jon Stewart memorably satirized their obsession with the gizmo. Throughout the anniversary coverage, the reporters offered very little commentary on the march’s historic significance. Instead, they kept gushing about the drone, cutting to the drone, showing endless aerial shots of the Edmund Pettus Bridge, as though perspective could substitute for insight. There was a certain irony to the fact that these antics appeared on CNN, a network that claims, more than others, to be impartial, and which Stewart himself frequently criticized for its lack of substantive analysis. In mocking the drone, he was making a joke about objectivity—the idea that news anchors, if armed with enough data, could soar above partisan conflicts and assume a moral authority. In reality, all they could offer was a bird’s-eye view.

In the future, I think all news footage will be shot by drones, and will unfold in total silence. There is already one network of this kind in Europe. It features footage from different locations around the world, on rotation. There are no pundits, no commentary, only a small box in the corner providing the location and the date. My sister, who lives in Albania, had it on one afternoon when I was visiting her. “I find it so soothing,” she said, then left me in front of the television while she put her children down for a nap. I sat there for almost an hour, watching aerial shots of protests in Romania and floods in Southern California and crowds gathered outside the Vatican, waiting for an appearance by the Pope. It was soothing. There is something sublime and hypnotic about seeing the earth from above. Before drones, satellites and helicopters provided such views, but this God-like perspective was never so abundant, nor accompanied by such elegant silence. As I sat there, I fell into a kind of trance, such that the images began to seem removed not only spatially but temporally. At some point, I understood that I was in the future, long after our planet had been obliterated, watching scenes that had taken place many centuries in the past; I was watching the final dramas of a fallen civilization.

What I was experiencing was delusion. It was the kind of hallucination induced by acid trips, madness, and extreme sleep deprivation, in which a person often feels that he is floating above his own body, looking down on it from above. Charles Lindbergh experienced something like this during his flight across the Atlantic, after remaining awake for over thirty hours. At one point, he felt as though his consciousness had become completely untethered. “For immeasurable periods,” he wrote, “I seemed divorced from my body as though I were an awareness, spreading through space, over the earth and into the heavens, unhampered by time and substance, free from the gravitation that binds men to heavy human problems of the world.” This is an account of human consciousness leaving a body, leaving a plane. It is an account of a man becoming a drone.

Drones satisfy our desire for transcendence. They extend our senses beyond their normal range. They tempt us to believe that we can rise above the particulars of our historical situation and perhaps even abandon the limits of subjective consciousness. For this reason, the drone has long been linked in my mind with the algorithm, another mindless technology that ascends to a scale unfathomable to humans—the petabyte, the exabyte—allowing us to obtain what we believe to be divine objectivity. Literalists will protest that unlike algorithms, drones are not truly thoughtless—there is still a human controlling them, albeit remotely. But this is merely a provisional technicality that cannot survive long. Last year, Google partnered with the Pentagon to create machine-learning algorithms that would aid drones in counterterrorist operations. The algorithms were built to scan video footage taken by drones and identify targets for strikes, making decisions that are currently made by pilots. Critics of the technology argued that it would lead to fully autonomous, weaponized AI. In the end, thousands of Google employees signed a letter protesting the collaboration, and the contract was cancelled.

This kind of collective action necessarily calls upon the intense subjectivity of human interests. It requires waking from an automatic existence and attempting, as Hannah Arendt put it, “to think what we are doing.” Modern warfare, like all modern enterprise, is increasingly subject to the dominant myth of the information age: that insight and moral clarity will appear—like some emergent property—from the deluge of data we have collected. But thought does not arise from information any more than consciousness arises from the complexity of a computer. There is no ghost hiding within the gears—the homunculus piloting the machine is us.

Meghan O’Gieblyn is the author of the essay collection Interior States. Her work has appeared, most recently, in Harper’s Magazine, n+1, Tin House, and The Best American Essays 2017.