Ian Bogost | from The Geek’s Chihuahua | University of Minnesota Press | April 2015 | 22 minutes (5,539 words)
The following is an excerpt from Ian Bogost’s book The Geek’s Chihuahua, which addresses “the modern love affair of ‘living with Apple’ during the height of the company’s market influence and technology dominance,” and how smartphones created a phenomenon of “hyperemployment.”
***
Think back to 2007, when you got the first iPhone. (You did get one, didn’t you? Of course you did.) You don’t need me to remind you that it was a shiny object of impressive design, slick in hand and light in pocket. Its screen was bright and its many animations produced endless, silent “oohs” even as they became quickly familiar. Accelerometer-triggered rotations, cell tower triangulations (the first model didn’t have GPS yet), and seamless cellular/WiFi data transitions invoked strong levels of welcome magic. These were all novelties once, and not that long ago.
What you probably don’t remember: that first iPhone was also terrible. Practically unusable, really, for the ordinary barrage of phone calls, text messages, mobile email, and web browsing that earlier smartphones had made portable. And not for the reasons we feared before getting our hands on one—typing without tactile feedback wasn’t as hard to get used to as BlackBerry and Treo road warriors had predicted, even if it still required a deliberate transition from T9 or mini-keyboard devices—but rather because the device software was pushing the limits of what affordable hardware could handle at the time.
Applications loaded incredibly slowly. Pulling up a number or composing an email by contact name was best begun before ordering a latte or watering a urinal to account for the ensuing delay. Cellular telephone reception was far inferior to other devices available at the time, and regaining a lost signal frequently required cycling the antenna or the power. Wireless data reception was poor and slow, and the device’s ability to handle passing in and out of what coverage it might find was limited. Tasks interrupted by coverage losses, such as email sends in progress, frequently failed completely.
The software was barebones. There was no App Store in those early days, making the iPhone’s operating system a self-contained affair, a ladleful of Apple-apportioned software gruel, the same for everyone. That it worked at all was a miracle, but our expectations had been set high by decades of complex, adept desktop software. By comparison, the iPhone’s apps were spartan. The Mail application, for example, borrowed none of its desktop cousin’s elegant color-coded, threaded summary view but instead demanded inexplicable click-touches back and forth from folder to folder, mailbox to mailbox.
Some of these defects have been long since remedied in the many iterations of the device that have appeared since its 2007 debut. Telephony works well, and who uses the phone anymore anyway? Data speed and reliability have improved, both in wireless network infrastructure and in the smartphone itself. But other issues persist. For those who cut their computing teeth on desktops and laptops—the things that we used to mean when we used the word computer—manipulating mobile software still feels awkward and laborious. Those many taps of the original Mail app haven’t been altered or remedied so much as they have become standardized. Now, we use all software in the convoluted manner mobile operating systems demand, from email to word processing to video editing.
But to issue complaints about usability misses the point of the iPhone, even all those years ago, and certainly today. The iPhone was never a device one should have expected to “just work,” to quote Apple’s familiar advertising lingo. It is a device one has to accommodate. It taught us how to tolerate Apple making us tolerate it. It put us in our place before Apple. This was the purpose of the iPhone, and this is its primary legacy.
Then, as now, the iPhone demands to be touched just right, in precisely the right spot on menu, list, or keyboard, and with precisely the right gesture. Likewise, it demands not to be touched just after, when being pocketed or moved or simply turned to place at one’s ear. Doing otherwise erroneously launches, or quits, or selects, or deletes, or slides, or invokes Siri the supposedly intelligent personal assistant, or performs some other action, desired or not, slickly coupled to a touch or gestural control.
The iPhone resists usability, a term reserved for apparatuses humans make their servants. An iPhone is not a computer. It is a living creature, one filled with caprice and vagary like a brilliant artist, like a beautiful woman, like a difficult executive. Whether it is usable is not the point. To use the iPhone is to submit to it. Not to its interfaces, but to the ambiguity of its interpretation of them. To understand it as an Other, an alien being boasting ineffable promise and allure. Touch here? Stroke there? Stop here? Do it again? The impressive fragility of the device only reinforces this sense—to do it wrong by dropping or misgesturing might lead to unknown consequences. Unlike other portable devices—a Walkman or a traditional mobile phone— the iPhone embraces fragility rather than ruggedness. It demands to be treated with kid gloves. Even before you’ve first touched it, you can already hear yourself apologizing for your own blunders in its presence, as if you are there to serve it rather than it you. The iPhone is a device that can send you far out of your way, and yet you feel good about it. It is a device that can endear you to it by resisting your demands rather than surrendering to them.
Rather than thinking of the iPhone as a smartphone, like a Treo or a BlackBerry or, eventually, the Android devices that would mimic it, one would do better to think of the iPhone as a pet. It is the toy dog of mobile devices, a creature one holds gently and pets carefully, never sure whether it might nuzzle or bite. Like a Chihuahua, it rides along with you, in arm or in purse or in pocket, peering out to assert both your status as its owner and its mastery over you as empress. And like a toy dog, it reserves the right never to do the same thing a second time, even given the same triggers. Its foibles and eccentricities demand far greater effort than its more stoic smartphone cousins, but in so doing, it challenges you to make sense of it.
The BlackBerry’s simplicity and effectiveness yielded a constant barrage of new things to do. And eventually, so would the smartphone—social media feeds and status updates replaced work with play-as-work, with hyperemployment, a term I’ll explain soon enough. But that first iPhone resisted utility old and new. It acclimated us to the new quirks of touchscreen life, of attempting to accomplish complex tasks that would have been easy on a normal computer but laborious on a tiny screen that ran one program at a time. Today we’ve acclimated, accepting these inefficiencies as givens. But such an eventuality was never guaranteed, and iPhone had to train us to tolerate them. Just as the infirm must endure physical therapy to reform damaged limbs and tissues, so the smartphone user needed to be trained to accept and overcome the intrinsic incapacities of the handheld computer.
This was harder than it sounds in retrospect. That first iPhone receded into itself at times, offering its owner no choice but to pet it in vain, or to pack it away until it regained composure, or to reboot it in the hopes that what once worked might do so again. It was a beast of its vicissitudes. And it still is, albeit in different ways. To own an iPhone is to embrace such fickleness rather than to lament it in the hope for succor via software update. And even when one does come, it only introduces new quirks to replace the old ones: the slowdowns of an operating system upgrade launched to execute planned obsolescence, say, or the new sensors, panels, controls, and interfaces that render a once modernist simplicity baroque.
[pullquote align=”center”]The brilliance of the iPhone is not how intuitive or powerful or useful it is—for really it is none of these things.[/pullquote]
Indeed, when you would meet new iPhone users, they would share much more in common with smug, tired pet owners than with mobile busybodies. “Here, let me show you,” one would say proudly when asked how she liked it. Fingers would stretch gently over photos, zooming and turning. They’d flick nonchalantly through web pages and music playlists. As with the toy dog or the kitten, when the iPhone fails to perform as expected, its owners would simply shrug in capitulation. “Who knows what goes through its head,” one might rationalize, as she might do just the same when her Maltese jerks from sleep and scurries frantically, sliding across wood around a corner.
The brilliance of the iPhone is not how intuitive or powerful or useful it is—for really it is none of these things. Rather, the brilliance of the iPhone is in its ability to transcend the world of gadgetry and enter another one: the world of companionship. But unlike the Chihuahua or the bichon or even the kitten, the iPhone has no gender bias. It need not signal overwrought Hollywood glam, high-maintenance upper-class leisure, or sensitive loner solitude. iPhone owners can feel assured in their masculinity or femininity equally as they stroke and snuggle their pet devices, fearing no reprisal for foppishness or dorkship.
The Aibo and Pleo, those semirealistic robotic pets of the pre-iPhone era that attempted to simulate the form and movement of a furry biological pet, failed precisely because they did nothing other than pretend to be real pets. The iPhone got it right: a pet is not an animal at all. A pet is a creature that responds meaningfully to touch and voice and closeness, but only sometimes. At other times, it retreats inextricably into its own mind, gears spinning in whatever alien way they must for other creatures. A pet is a sentient alien that cultures an attachment that might remain—that probably remains—unrequited. A pet is a bottomless pit for affect and devotion, yet one whose own feelings can never be truly known.
The iPhone offers an excuse to dampen the smartphone’s obsession with labor, productivity, progress, and efficiency with the touching, demented weirdness that comes with companionship. Despite its ability to text, to tweet, to Facebook, to Instagram, perhaps the real social promise of iPhone lies elsewhere: as a part of a more ordinary, more natural ecology of real social interaction. The messy sort that resists formalization in software form. The kind that makes unreasonable demands and yet sometimes surprises.
And of course, the kind that overheats and flips into mania. Mania, it turns out, is what iPhone wants most. To turn us all into the digital equivalent of the toy dog–toting socialite obsessive or the crazy cat lady, doting and tapping, swiping and cooing at glass rectangles with abandon.
***
In 1930, the economist John Maynard Keynes famously argued that by the time a century had passed, developed societies would be able to replace work with leisure thanks to widespread wealth and surplus. “We shall do more things for ourselves than is usual with the rich today,” he wrote, “only too glad to have small duties and tasks and routines.” Eighty years hence, it’s hard to find a moment in the day not filled with a duty or task or routine. If anything, it would seem that work has overtaken leisure almost entirely. We work increasingly hard for increasingly little, only to come home to catch up on the work we can’t manage to work on at work.
Take email. A friend recently posed a question on Facebook: “Remember when email was fun?” It’s hard to think back that far. On Prodigy, maybe, or with UNIX mail or Elm or Pine via telnet. Email was silly then, a trifle. A leisure activity out of Keynes’s macroeconomic tomorrowland. It was full of excess, a thing done because it could be rather than because it had to be. The worst part of email was forwarded jokes, and even those seem charming in retrospect. Even junk mail is endearing when it’s novel.
Now, email is a pot constantly boiling over. Like King Sisyphus pushing his boulder, we read, respond, delete, delete, delete, only to find that even more messages have arrived while we were pruning. A whole time management industry has erupted around email, urging us to check only once or twice a day, to avoid checking email first thing in the morning, and so forth. Even if such techniques work, the idea that managing the communication for a job now requires its own self-help literature reeks of a foul new anguish.
[pullquote align=”center”]Like King Sisyphus pushing his boulder, we read, respond, delete, delete, delete, only to find that even more messages have arrived while we were pruning.[/pullquote]
If you’re like many people, you’ve started using your smartphone as an alarm clock. Now it’s the first thing you see and hear in the morning. And touch, before your spouse or your crusty eyes. Then the ritual begins. Overnight, twenty or forty new emails: spam, solicitations, invitations, or requests from those whose days pass during your nights, mailing list reminders, bill pay notices. A quick triage, only to be undone while you shower and breakfast.
Email and online services have provided a way for employees to outsource work to one another. Whether you’re planning a meeting with an online poll, requesting an expense report submission to an enterprise resource planning (ERP) system, asking that a colleague contribute to a shared Google Doc, or just forwarding on a notice that “might be of interest,” jobs that previously would have been handled by specialized roles have now been distributed to everyone in an organization.
No matter what job you have, you probably have countless other jobs as well. Marketing and public communications were once centralized; now every division needs a social media presence, and maybe even a website to develop and manage. Thanks to Oracle and SAP, everyone is a part-time accountant and procurement specialist. Thanks to Oracle and Google Analytics, everyone is a part-time analyst.
And email has become the circulatory system along which internal outsourcing flows. Sending an email is easy and cheap, and emails create obligation on the part of a recipient without any prior agreement. In some cases, that obligation is bureaucratic, meant to drive productivity and reduce costs. “Self-service” software automation systems like these are nothing new—SAP’s ERP software has been around since the 1970s. But since the 2000s, such systems have been able to notify and enforce compliance via email requests and nags. In other cases, email acts as a giant human shield, a white-collar Strategic Defense Initiative. The worker who emails enjoys both assignment and excuse all at once. “Didn’t you get my email?”
The despair of email has long left the workplace. Not just by infecting our evenings and weekends via Outlook web access and BlackBerry and iPhone, although it has certainly done that. Now we also run the email gauntlet with everyone. The ballet school’s schedule updates (always received too late, but “didn’t you get the email?”); the Scout troop announcements; the daily deals website notices; the PR distribution list you somehow got on after attending that conference; the insurance notification, informing you that your new coverage cards are available for self-service printing (you went paperless, yes?); and the email password reset notice that finally trickles in twelve hours later, because you forgot the insurance website password you last used a year ago. And so on.
[pullquote align=”center”]Its primary function is to reproduce itself in enough volume to create anxiety and confusion.[/pullquote]
It’s easy to see email as unwelcome obligation, but too rarely do we take that obligation to its logical if obvious conclusion: those obligations are increasingly akin to another job—or better, many other jobs. For those of us lucky enough to be employed, we’re really hyperemployed—committed to our usual jobs and many other jobs as well. It goes without saying that we’re not being paid for all these jobs, but pay is almost beside the point, because the real cost of hyperemployment is time. We are doing all those things others aren’t doing instead of all the things we are competent at doing. And if we fail to do them, whether through active resistance or simple overwhelm, we alone suffer for it: the schedules don’t get made, the paperwork doesn’t get mailed, the proposals don’t get printed, and on and on.
But the deluge doesn’t stop with email, and hyperemployment extends even to the unemployed, thanks to our tacit agreement to work for so many Silicon Valley technology companies.
Increasingly, online life in general overwhelms. The endless, constant flow of email, notifications, direct messages, favorites, invitations. After that daybreak email triage, so many other icons on your phone boast badges silently enumerating their demands. Facebook notifications. Twitter @ messages, direct messages. Tumblr followers, Instagram favorites, Vine comments. Elsewhere too: comments on your blog, on your YouTube channel. The Facebook page you manage for your neighborhood association or your animal rescue charity. New messages in the forums you frequent. Your Kickstarter campaign updates. Your Etsy shop. Your eBay watch list. And then, of course, more email. Always more email.
Email is the plumbing of hyperemployment. Not only do automated systems notify and direct us via email but we direct and regulate one another through email. But even beyond its function as infrastructure, email also has a disciplinary function. The content of email almost doesn’t matter. Its primary function is to reproduce itself in enough volume to create anxiety and confusion. The constant flow of new email produces an endless supply of potential work. Even figuring out whether there is really any “actionable” effort in the endless stream of emails requires viewing, sorting, parsing, even before one can begin conducting the effort needed to act and respond.
We have become accustomed to using the term precarity to describe the condition whereby employment itself is unstable or insecure. But even within the increasingly precarious jobs, the work itself has become precarious too. Email is a mascot for this sensation. At every moment of the workday—and on into the evening and the night, thanks to smartphones—we face the possibility that some request or demand, reasonable or not, might be awaiting us.
Often, we cast these new obligations either as compulsions (the addictive, possibly dangerous draw of online life) or as necessities (the importance of digital contact and an “online brand” in the information economy). But what if we’re mistaken, and both tendencies are really just symptoms of hyperemployment? We are now competing with ourselves for our own attention.
[pullquote align=”center”]Rather than just being exploited or duped, we’ve been hyperemployed.[/pullquote]
When critics engage with the demands of online services via labor, they often cite exploitation as a simple explanation. It’s a sentiment that even has its own aphorism: “If you’re not paying for the product, you are the product.” The idea is that all the information you provide to Google and Facebook, all the content you create for Tumblr and Instagram, enables the primary business of such companies, which amounts to aggregating and reselling your data or access to it. In addition to the revenues extracted from ad sales, tech companies like YouTube and Instagram also managed to leverage the speculative value of your data-and-attention into billion-dollar buyouts. Tech companies are using you, and they’re giving precious little back in return.
While often true, this phenomenon is not fundamentally new to online life. We get network television for free in exchange for the attention we devote to ads that interrupt our shows. We receive “discounts” on grocery store staples in exchange for allowing Kroger or Safeway to aggregate and sell our shopping data. Meanwhile, the companies we do pay directly as customers often treat us with disregard at best, abuse at worst (just think about your cable provider or your bank). Of course, we shouldn’t just accept online commercial exploitation just because exploitation in general has been around for ages. Rather, we should acknowledge that exploitation only partly explains today’s anxiety with online services.
Hyperemployment offers a subtly different way to characterize all the tiny effort we contribute to Facebook and Instagram and the like. It’s not just that we’ve been duped into contributing free value to technology companies (although that’s also true) but that we’ve tacitly agreed to work unpaid jobs for all these companies. And even calling them “unpaid” is slightly unfair, because we do get something back from these services, even if they often take more than they give. Rather than just being exploited or duped, we’ve been hyperemployed. We do tiny bits of work for Google, for Tumblr, for Twitter, all day and every day.
Today, everyone’s a hustler. But now we’re not even just hustling for ourselves or our bosses but for so many other, unseen bosses. For accounts payable and for marketing; for the Girl Scouts and the Youth Choir; for Facebook and for Google; for our friends via their Kickstarters and their Etsy shops; for Twitter, whose initial public offering converted years of tiny, aggregated work acts into seventy-eight dollars of fungible value per user.
Even if there is more than a modicum of exploitation at work in the hyperemployment economy, the despair and overwhelm of online life don’t derive from that exploitation—not directly anyway. Rather, it’s a type of exhaustion of the same sort that afflicts the underemployed as well, like the single mother working two part-time service jobs with no benefits or the PhD working three contingent teaching gigs at three different regional colleges to scrape together a still insufficient income. The economic impact of hyperemployment is obviously different from that of underemployment, but some of the same emotional toll imbues both: a sense of inundation, of being trounced by demands whose completion yields only their continuance, and a feeling of resignation that no other scenario is likely or even possible. The only difference between the despair of hyperemployment and that of underemployment is that the latter at least acknowledges itself as a substandard condition, whereas the former celebrates the hyperemployed’s purported freedom to “share” and “connect,” to do business more easily and effectively by doing jobs once left for others’ competence and compensation, from the convenience of your car or toilet.
Staring down the barrel of Keynes’s 2030 target for the arrival of universal leisure, economists have often considered why he seems to have been so wrong. The inflation of relative needs is one explanation—the arms race for better and more stuff and status. The ever-increasing wealth gap, on the rise since the anti-Keynes, supply-side 1980s, is another. But what if Keynes was right, too, in a way? Even if productivity has increased mostly to the benefit of the wealthy, hasn’t everyone gained enormous leisure, but by replacing recreation with work rather than work with recreation? This new work doesn’t even require employment; the destitute and unemployed hyperemployed are just as common as the affluent and retired hyperemployed. Perversely, it is only then, at the labor equivalent of the techno-anarchist’s singularity, that the malaise of hyperemployment can cease: then all time will become work time, and we will not have any memory of leisure to distract us.
***
At the start of 2015, fewer than eight short years since the first launch of the iPhone, Apple was worth more than seven hundred billion dollars—more than the gross national product of Switzerland. Despite its origins as a computer company, this is a fortune built from smartphones more than laptops. Before 2007, smartphones were a curiosity, mostly an affectation of would-be executives carting BlackBerries and Treos in unfashionable belt holsters. Not even a decade ago, they were wild and feral. Today, smartphones are fully domesticated. Tigers made kittens, which we now pet ceaselessly. More than two-thirds of Americans own them, and they have become the primary form of computing.
But along with that domestication comes the inescapability of docility. Have you not accepted your smartphone’s reign over you rather than lamenting it? Stroking our glass screens, Chihuahua-like, is just what we do now, even if it also feels sinful. The hope and promise of new computer technology have given way to the malaise of living with it.
Shifts in technology are also shifts in culture and custom. And these shifts have become more frequent and more rapid over time. Before 2007, one of the most substantial technological shifts in daily life was probably the World Wide Web, which was already commercialized by the mid-1990s and mainstream by 2000. Before that? The personal computer, perhaps, which took from about 1977 until 1993 or so to become a staple of both home and business life. First we computerized work, then we computerized home and social life, then we condensed and transferred that life to our pockets. With the Apple Watch, now the company wants to condense it even further and have you wear it on your wrist.
[pullquote align=”center”]The hope and promise of new computer technology have given way to the malaise of living with it.[/pullquote]
Change is exciting, but it can also be exhausting. And for the first time in a long time, reactions to the Apple Watch seem to underscore exhaustion as much as excitement. But even these skeptical replies question the watch’s implementation rather than expressing lethargy at the prospect of living in the world it might bestow on us.
Some have accused Apple of failing to explain the purpose of its new wearable. The wristwatch connoisseur Benjamin Clymer calls it a “market leader in a category nobody asked for.” Apple veteran Ben Thompson faults Cook for failing to explain “why the Apple Watch existed, or what need it is supposed to fill.” Felix Salmon agrees, observing that Apple “has always been the company which makes products for real people, rather than gadgets for geeks,” before lamenting that the Apple Watch falls into the latter category.
“Apple hasn’t solved the basic smartwatch dilemma,” Salmon writes. But the dilemma he’s worried about proves to be a banal detail: “Smart watches use up far more energy than dumb watches.” He later admits that Apple might solve the battery and heft problems in a couple generations, but “I’m not holding my breath.” Salmon reacts to the Apple Watch’s design and engineering failings rather than lamenting the more mundane afflictions of being subjected to wrist-sized emails in addition to desktop- and pocket-sized ones. We’re rearranging icons on the Titanic.
After the Apple keynote, the Onion joked about the real product Apple had unveiled—a “brief, fleeting moment of excitement.” But like so much satire these days, it’s not really a joke. As Dan Frommer recently suggested, the Apple keynote is no less a product than are its phones and tablets. Apple is in the business of introducing big things as much as it is in the business of designing, manufacturing, distributing, and supporting them. In part, it has to be: Apple’s massive valuation, revenues, and past successes have only increased the street’s expectations for the company. In a world of so-called disruptive innovation, a company like Apple is expected to manufacture market-defining hit after hit.
Indeed, business is another context we often use to avoid engaging with our technological weariness. We talk about how Apple’s CEO Tim Cook must steer the tech giant into new waters—such as wearables—to ensure a fresh supply of desire, customers, and revenue. But the exigency of big business has an impact on our ordinary lives. It’s easy to cite the negative effects of a business environment focused on quarterly profits above all else, including maintaining job stability and paying into the federal or municipal tax base. In the case of Apple, something else is going on, too. In addition to being an economic burden, the urgency of technological innovation has become so habitual that we have become resigned to it. Wearables might not be perfect yet, we conclude, but they will happen. They already have.
I’m not so much interested in accepting wearables given the right technological conditions as I am prospectively exhausted at the idea of dealing with that future’s existence. Just think about it. All those people staring at their watches in the parking structure, in the elevator. Tapping and stroking them, nearly spilling their coffee as they swivel their hands to spin the watch’s tiny crown control.
A whole new tech cliché convention: the zoned-out smartwatch early adopter staring into his outstretched arm, like an inert judoka at the ready. The inevitable thinkpieces turned nonfiction trade books about “wrist shrift” or some similarly punsome quip on the promise-and-danger of wearables.
The variegated buzzes of so many variable “haptic engine” vibrations, sending notices of emails arriving from a boss or a spammer or obscene images received from a Facebook friend. The terrible battery life Salmon worries about, and the necessity of purchasing a new, expensive wristwatch every couple years, along with an equally costly smartphone with which to mate it.
The emergence of a new, laborious media creation and consumption ecosystem built for glancing. The rise of the “glancicle,” which will replace the listicle. The PR emails and the B2B advertisements and the business consulting conference promotions all asking, “Is your brand glance-aware?”
These are mundane future grievances, but they are also likely ones. Unlike those of its competitor Google, with its eyeglass wearables and delivery drones and autonomous cars, Apple’s products are reasonable and expected—prosaic even, despite their refined design. Google’s future is truly science fictional, whereas Apple’s is mostly foreseeable. You can imagine wearing Apple Watch, in no small part because you remember thinking that you could imagine carrying Apple’s iPhone—and then you did, and now you always do.
Technology moves fast, but its speed now slows us down. A torpor has descended, the weariness of having lived this change before—or one similar enough, anyway—and all too recently. The future isn’t even here yet, and it’s already exhausted us in advance.
It’s a far cry from “future shock,” Alvin Toffler’s 1970 term for the postindustrial sensation that too much change happens in too short a time. Where once the loss of familiar institutions and practices produced a shock, now it produces something more tepid and routine. The planned obsolescence that coaxes us to replace our iPhone 5 with an iPhone 6 is no longer disquieting but just expected. I have to have one has become Of course I’ll get one. The idea that we might willingly reinvent social practice around wristwatch computers less than a decade after reforming it for smartphones is no longer surprising but predictable. We’ve heard this story before; we know how it ends.
Future shock is over. Apple Watch reveals that we suffer a new affliction: future ennui. The excitement of a novel technology (or anything, really) has been replaced—or at least dampened—by the anguish of knowing its future burden. This listlessness might yet prove even worse than blind boosterism or cynical naysaying. Where the trauma of future shock could at least light a fire under its sufferers, future ennui exudes the viscous languor of indifferent acceptance. It doesn’t really matter that the Apple Watch doesn’t seem necessary, just as it didn’t matter that the iPhone once didn’t seem necessary either. Increasingly, change is not revolutionary, to use a word Apple has made banal, but presaged.
Our lassitude will probably be great for the companies like Apple that have worn us down with the constancy of their pestering. The poet Charles Baudelaire called ennui the worst sin, the one that could “swallow the world in a yawn.” As Apple Watch leads the suppuration of a new era of wearables, who has energy left to object? Who has the leisure for revolution, as we keep up with our social media timelines and emails and home thermostats and heart monitors?
When one is enervated by future ennui, there’s no vigor left even to ask if this future is one we even want. And even if we ask, lethargy will likely curtail our answers. No matter, though: soon enough, only a wrist’s glance worth of ideas will matter anyway. And at that point, even this short book’s worth of reflections on technology will be too much to bear, incompatible with our newfound obsession with wrist-sizing ideas. I’m sure I’ll adapt, like you will. Living with Apple means marching ever forward, through its aluminum- and glass-lined streets and into the warm, selfsame glow of the future.
***
Ian Bogost is Ivan Allen College Distinguished Chair in Media Studies and professor of interactive computing at Georgia Institute of Technology, where he also holds an appointment in the Scheller College of Business. His books include How to Do Things with Videogames (Minnesota, 2011) and Alien Phenomenology, or What It’s Like to Be a Thing (Minnesota, 2012).