There’s a scene in Don DeLillo’s story “Midnight in Dostoevsky” that reflects on the current omnipresence of digital media and the relative oasis that the college classroom can be. Here we are in a laughably self-serious logic seminar, where the wizardly professor, Ilgauskas, utters one-line axioms before the small group of anxious, if intrigued, students:
“The atomic fact,” he said. Then he elaborated for ten minutes while we listened, glanced, made notes, riffled the textbook to find refuge in print, some semblance of meaning that might be roughly equivalent to what he was saying. There were no laptops or handheld devices in class. Ilgauskas didn’t exclude them; we did, sort of, unspokenly. Some of us could barely complete a thought without touch pads or scroll buttons, but we understood that high-speed data systems did not belong here. They were an assault on the environment, which was defined by length, width, and depth, with time drawn out, computed in heartbeats. We sat and listened or sat and waited. We wrote with pens or pencils. Our notebooks had pages made of flexible sheets of paper.
I don’t want to wax nostalgic for an earlier era when college students dutifully shunned digital technology or didn’t have it to begin with. I do want, as my university often encourages me, to meet my students “where they are.” But sometimes the imperative to digital mediation overwhelms me and makes me wonder about the threshold of these different ways of being: analog and digital. Of course, it’s never that simple, never a clear-cut binary.
Here’s a story that may sound apocryphal, but this really happened: One spring day on my campus, I saw a student who was staring into his smartphone walk straight into a light pole. He crashed into it, stumbled backward, and looked around to see who had seen him. (I was some distance away; he didn’t notice me.) Then he adjusted his course and went back to whatever he had been doing on his phone, unfazed. This is one of the often ignored, occasionally painful, and sometimes embarrassing consequences of what Ian Bogost discusses in an article for The Atlantic called “Hyperemployment, or the Exhausting Work of the Technology User.” Hyperemployment is the endless work we do for unseen agencies, owners, and conglomerations while seemingly merely tapping away at our phones, communicating or otherwise being entertained.
Around that time, I had been tuning into how hyperemployed people are on my campus. Just a few days before the student and the light pole, I had dropped my iPhone, and the screen shattered. The phone still worked, more or less, but after the fall, it lived on my desk in a Ziploc freezer bag, glass splinters crumbling away and accumulating gradually into tiny glinting dunes in the corners of the bag. So I had been reexperiencing my life without a smartphone and especially reconsidering how these things permeated my workplace, the university.
After a few weeks of being smartphone free, from this altered vantage point I noticed just how busy everyone seemed to be, all the time. Whether in class, in meetings, or in the hallways—everyone was on their phones. And I don’t say this from an easy standpoint of judgment, for I had grown so accustomed to being on my phone, justifying my near constant attachment to it by the fact that it was allowing me flexibility and freedom. I would draft essays and outline book chapters on my phone’s notepad in the middle of the night. I emailed frantic students at all hours, reassuring them about assignments, missed classes, or exams. I carried on committee work long after meetings had let out, hashing out the fine points of strategic planning and SWOT analyses. I networked with remote colleagues on Twitter and set energizing collaborations into motion. This all seemed worthwhile and productive—and I suppose it was, for the most part.
I’m fully conscious of my own cyborg existence, and I have always been a lenient professor when it comes to students and their technologies. I generally don’t police their use in the classroom and have called students out only a handful of times when their texting got too conspicuous or a facial expression suggested that they had become totally distracted by something on their phone. For the most part, I accept that these things have interpenetrated our lives so thoroughly that it is impractical and unrealistic to try to sanction their use in the classroom. Rather, figuring out the etiquette and subtleties of smartphone use in everyday life is one of the innumerable soft skills that should be learned over the course of college.
But that was before my smartphone hiatus. During those weeks, I found myself walking to work, feeling great. Why? Because I was not thumbing madly and squinting into my hand as I stumbled along, neck craned, tripping over the curb. I was swinging my arms and looking around. Between meetings on campus, I was processing things people said as I strolled back to my office, rather than going immediately to my email inbox, replying to messages as I marched upstairs. I wasn’t leaving my classes and getting directly on Slack to catch up with my collaborators; I was decompressing and thinking about what my students brought up in our discussions. I thought my smartphone was granting me freedom, but it was more like the opposite.
I began to see these things everywhere on campus, and they were increasingly disgusting to me. This has been a difficult piece to write because I am aware of how my criticism verges on hypocrisy, or almost depends on it: I appreciate what smartphones can do—are doing—on a daily basis. But seeing these things from a slight remove, they became revolting to me. I saw my students and colleagues tethered to their smartphones, and I wondered how these things were meshing with—or not—our ostensibly collective purpose of higher education: working together to make the world better, at least our human part in it. I realized how entangled with my smartphone I had become and how different—how refreshing—it felt to be without it. I started reading (books!) for uninterrupted minutes in ways I hadn’t been able to for years because I always felt the need to live tweet or cross-reference whatever I was reading.
I talked to my students about this at one point during this time, extrapolating that they, too, probably didn’t realize how supplementary they had become to their phones—to which they looked at me wide-eyed as if to say, Oh, yes, we well realize this. The look they gave me was tragic, their faces creased in quiet despair. I told my students I was writing a piece on my experience of being without my iPhone, and they viewed me with sardonic skepticism. Good luck with that, they seemed to be thinking.
One student later emailed me a timely New Yorker piece called “The Useless Agony of Going Offline,” in which Matthew J. X. Malady describes the pointlessness of going off his handheld devices cold turkey. He tries it for seventy-two hours and concludes: “I would like to say that I reached some time-maximization epiphany … but I’m not sure that I used my time any ‘better’ than I normally would have during that span. I just used it differently, and found myself frequently bored as a result.” Malady complains that he was basically less informed when off his handheld devices, and the piece ends with a sort of discursive shrug, as if to suggest that it is futile to resist the hegemony burning away in our hands, pockets, and brains. It is a persuasive and shrewd article, and my student seemed to be daring me to prove Malady wrong. But I’m not trying to make a wholesale pronouncement against these things. My relationship with my phone persisted during that time the screen was shattered—it’s just that I didn’t see the thing for hours at a time, particularly when I was on campus.
I told my colleague Tim Welsh about the shattering of my iPhone, and he quickly dialed up dozens of bizarre, hilarious YouTube videos testing various fall heights and reporting the damage incurred by different devices put under various forms of duress. Take after slow-motion take of smartphones crashing into the pavement, being dipped in miscellaneous liquids, and being run over by SUVs. But these were tutorials ultimately geared toward protecting one’s phone or purchasing the most durable model out there. I was watching these videos from the other side, my phone having already been smashed. And perhaps the videos served as yet one more layer of entertainment and seamless commerce, no matter why they were dialed up in the first place.
The weird thing is that I probably wouldn’t have done it on my own; I don’t have the self-discipline to simply use the phone less (some people do, I understand). It took an accidental fall. And then, not wanting to spend a few hundred dollars to replace it, or suffer through the ordeal of an average AT&T or Apple customer-service experience, I just let the phone lie there in its bag, mostly inert, for several weeks. It was functional but changed, limited in a new way. As I was checking my phone one day, sheathed in its plastic envelope, my partner Lara remarked how having it in a gallon-size Ziploc freezer bag made the ridiculousness of these things wickedly obvious: we’re all hanging around gripping and staring into these awkward containers full of junk.
In his 1996 novel, Infinite Jest, David Foster Wallace imagines an early-twenty-first-century technological stress point as phone calls become increasingly video projected to others around the world—in short, a surge of things eerily like Skype or FaceTime. Wallace goes on to ponder what would follow: a feedback loop of increasingly reflexive self-awareness brought about by the constant demand of face-to-face communication by screen. Wallace conjures an elaborate cottage industry of ultrarealistic masks and background dioramas that would bloom alongside “videophonic” devices. Little did Wallace know that the problem would not be rampant self-presentation or its artifice. Our handheld devices are far more insidious for how they seduce us into tuning out while believing that we are tuned in. What Wallace got right was just how much power such communicative media could come to have over our whole bodies.
Only when you no longer carry around a phone all day—or have it at your bedside all night, to look at when you get up to pee and first thing in the morning—will you realize how chained to it you’ve become. I realize I’m making something of an arbitrary distinction here. The Internet for many of us is so intertwined in our lives that it has become a ubiquitous dwelling space, the dispersed hearths of modern homes. Where one device ends and another begins is no simple matter. The smartphone is a special kind of device, though—not because it merely gives us more of the Internet but because the smartphone gets insinuated into our creaturely lives. It has thoroughly “extended our senses and our nerves,” to borrow a line from Marshall McLuhan. McLuhan wrote those words in 1964; he was concerned that an “Age of Anxiety” was in the offing, thanks to new communications and entertainment technologies. Wallace, in the nineties, was projecting this age’s next phases. This is basically where we are now: in this age of posttruth, where the reality of where my body ends and communicative technologies begin is relentlessly complicated by the very object in question. Being without my iPhone clarified many of McLuhan’s and Wallace’s observations and insights about the addictive, accelerative qualities of our latest electronic media.
Not that I have any easy solution, beyond my own personal revelation. And it was a short-lived one. By the time I was revising this piece for my book, I had a new iPhone sitting mere inches from my fingers as I typed these words, and I eagerly awaited its alerts and epiphanies. Our imbrications with these devices are complex and intricate, to say the least. However, there is something to be said for the jolt of a fresh perspective. I understand that not everyone is caught in the vicious circle of checking their phone every few minutes. I recognize that many people have more self-discipline when it comes to these things. Not everyone is staring into a screen—at least, not yet.
These days, I’ve got an even newer iPhone burning away in my pocket. I don’t feel good about this, and I would be glad to discard it, especially if I were forced to again. I realize that this is a strange sentiment to articulate while not seeming able to act on it. My campus, Loyola University New Orleans, recently went smoke-free. The effect was striking. Where smokers used to sit in a place called “smokers’ alley” in the central quad, now they group together on a road adjacent to campus and puff away, newly organized, if also visibly abject. I wonder, though, if a smartphone-free campus might be a far more radical—and perhaps ultimately healthier—move for a university these days.
Christopher Schaberg is the Dorothy Harrell Brown Distinguished Professor of English at Loyola University New Orleans. He is the author of The Textual Life of Airports (2013), The End of Airports (2015), Airportness (2017), and The Work of Literature in an Age of Post-Truth (2018), as well as the coeditor of Deconstructing Brad Pitt (2014), all published by Bloomsbury.
This piece was adapted from The Work of Literature in an Age of Post-Truth, publishing July 26 by Bloomsbury.