iPhone Archives - Longreads
https://longreads.com/tag/iphone/

A Second Life for My Beloved Dog (11 Jan 2024)
https://longreads.com/2024/01/10/a-second-life-for-my-beloved-dog/

A short but beautiful essay on how an iPhone feature helped Charlie Warzel grieve for his dog, Peggy. An insightful reflection on grief and a heartwarming affirmation of the power of happy memories.

On the day she died, I set my phone’s wallpaper to my favorite photo of Peggy—appearing to smile on a ridgeline trail in Missoula, Montana, the bright-yellow balsamroot flowers in bloom behind her. But a month later, I told myself that it was time to stop wallowing. Instead of a memorial photo of Peggy, I opted to try a newer, “dynamic” wallpaper feature called “Photo Shuffle.” Every so often, my iPhone would change my wallpaper and home screen to an image it had grabbed from my camera roll. To help it along, I could offer parameters for the photo choice. Knowing that Apple’s Photos app uses image-recognition software to identify cats and dogs in the camera roll, I chose a “Pets” filter.

You Have a New Memory (17 Apr 2023)
https://longreads.com/2023/04/17/you-have-a-new-memory/

I came away from this essay by Merritt Tierce feeling … many things. A bit of confusion. Some unease. Unexpected mental fatigue. Tierce makes interesting observations about her and our Very Online lives, and the relationships we have with our phones, the internet, and one another.

The feeling of the internet has become such a feeling, a feeling of continuous vulnerability, and you can’t turn it off, it never ends. Even if my phone is off, is elsewhere, even if my computer is in a different country, the internet is there wherever I am, because it’s in me now. I’m talking about the lingering psychic, psychological, and physiological connection that I can no longer shut off, that has changed my mind. It manifests as a minor but noticeable discomfort, a permanent buzzing in my mind, like a leaf blower that never moves on down the street. Or consider the feeling of having your mouth stuck wide open at the dentist’s, or your breast smashed by the mammographer, or your legs spread for whatever consensually chosen activity you’d like to imagine; you may want what’s happening, you may have voluntarily paid for it or requested it, for reasons that fall along a spectrum from necessity to deep desire, but part of your original want includes the assumption that the experience will end, you will be able to relax your jaw and have your boob back and curl up into a ball.

‘iPhones Are Made in Hell’ (14 Feb 2023)
https://longreads.com/2023/02/14/iphones-are-made-in-hell/

It’s been more than a decade since Foxconn made international headlines after several workers committed suicide at the manufacturer’s iPhone factory in Shenzhen, China, which prompted revelations about inhumane working conditions. Now Foxconn’s facility in Zhengzhou, which produces about half the world’s iPhones, is under the media microscope. Viola Zhou of Rest of World kept in close contact with a few Foxconn assembly-line workers over the course of three months to capture what life is like in the mega-factory during peak production:

In December, as Western holiday shoppers were preparing Christmas presents, Foxconn renewed efforts to rev up its iPhone 14 Pro production. To attract a new crop of workers, the company again raised its pay. One contract seen by Rest of World promised a monthly bonus of 6,000 yuan ($885) if recruits worked at least 26 full days in December and 23 days in January. On social media, people described the proposition as the “60-day Foxconn challenge.”

Hunter had planned to return home once his quarantine ended, but the bonus made him reconsider. Going through a routine he was well familiar with, he lined up at the factory’s recruitment office, had his blood taken as part of a mandatory health check, and carried his belongings into an eight-person dorm room. The next day, he completed a mental health questionnaire, which asked whether he had insomnia or relationship issues — a practice that dates back to the spate of suicides in 2010 — and spent eight hours watching orientation videos on his phone. A frequent pop-up asking for a facial scan made sure he was paying attention. After three more days of quarantine, he started his most recent role — working the screws on the iPhone 14 Pro assembly line.

Inside the workshop, Hunter said he felt a kind of oppression he had never experienced in his previous Foxconn jobs, which were away from the factory floor. With no windows, he said that it was impossible to tell day from night without checking a clock. Managers required such a high tempo that he felt he could not stop for a second. Hunter even witnessed one colleague getting his pay reduced for spending too long drinking water. The constant scolding was humiliating, he said, even though he was rarely the target. Colleagues broke into tears under the stress.

To Be an Instagram-Ready Restaurant, Don’t Forget Your Selfie-Optimized Lamps (27 Jul 2017)
https://longreads.com/2017/07/27/instagram-restaurant-design/

Sleek-kitschy idiosyncrasy is all the rage.

Back in the 1970s, memorabilia-heavy restaurants became popular as they facilitated the loosening-up of sexual mores. These days, colorful tiles, bold wallpaper, and the occasional (ironic?) taxidermy piece can all trace their origins to our need to capture and broadcast our well-curated pleasures. As Casey Newton shows at The Verge, Instagram is the driving force behind the current vogue for easily reproduced, sleek-kitschy idiosyncrasy — including adjustable lighting that allows diners to take the most flattering selfie possible.

Few restaurants have taken photo-friendliness as seriously as Bellota, a Spanish restaurant that opened in San Francisco last year. The entryway is enclosed, creating a pleasing shadowbox effect as you look into the dining room. The kitchen is open, and encourages patrons to take 360-degree videos of the space. Many Instagram posts feature pictures of “the ham wall,” which is just what it sounds like: a window that looks into the temperature-controlled room where Bellota stores $50,000 worth of Spanish jamón ibérico.

The most striking thing about Bellota may be the custom lamps at its 25-seat bar, which let patrons adjust the lighting in order to get the perfect shot. “I’m probably the most avid Instagram user of the group, so I kept bringing it up,” says Ryan McIlwraith, Bellota’s chef. He wanted the lighting to do justice to the restaurant’s tapas plates and signature paellas. “It turned out these lamps we got were just perfect for it,” he says. The lamps can be tilted or turned 180 degrees, and the light’s intensity can be adjusted up and down. An “advanced feature” allows patrons to rest their phones on the lamp’s neck so as to take a selfie. (I did, and must admit the lighting was lovely.)

Read the story

Instagram Is Pushing Restaurants to Be Kitschy, Colorful, and Irresistible to Photographers (24 Jul 2017)
https://longreads.com/2017/07/24/instagram-restaurant-design/

How the popular app has transformed the way diners, designers, and marketers approach restaurants. (Hint: that bold wallpaper pattern isn’t there by accident.)

When Your Lost Phone Ends Up in Yemen (27 Sep 2015)
https://longreads.com/2015/09/27/22774/

In the summer of 2013, a New York yuppie lost her iPhone in the Hamptons. A few months later, she got an alert saying that her phone had been turned on in Yemen, and then candid pictures of a Yemeni family started filling her iCloud account. The phone was soon updated under the name of its new owner, a teenage boy named Yacoub. In this essay for The Atlantic, Will McGrath writes about the saga of his friend’s lost phone, guns, shared humanity, and how the photos provided his friend with a strange keyhole into another world:

Flipping through these pictures is like watching Yacoub muddle through adolescence in time-lapse. He is deep into the age of identity-building, trying to document and establish his place in the world. He travels through the countryside taking landscape shots: stunning mountains with verdant terraced fields, clusters of houses that stair-step down toward a valley floor. Now he is at a construction site looking supercool behind the wheel of a forklift.

Another picture has him posing before a shuttered storefront with an AK-47 (the safety is on and the gun’s stock is folded under so he can’t touch the trigger). In late August Yacoub writes the Shahada—the standard Muslim declaration of faith—in the phone’s Notes app, where Maura discovers it while making her IKEA shopping list. There is no god but God, and Mohammed is the messenger of God. He writes cheesy love poetry into the Notes app (How does the heart forget you, the taste of sugar that is lost?), he tries to visit Sex.com, he takes selfies with qat wadded in his cheek, he is every teenager in the history of teenagers.

Read the story

How Apple’s Transcendent Chihuahua Killed the Revolution (16 Jun 2015)
https://longreads.com/2015/06/16/how-apples-transcendent-chihuahua-killed-the-revolution/

Few are excited about the Apple Watch—its burdens are too easily imagined. And yet we treat it as an inevitability. How did this happen?

Ian Bogost | from The Geek’s Chihuahua | University of Minnesota Press | April 2015 | 22 minutes (5,539 words)

The following is an excerpt from Ian Bogost’s book The Geek’s Chihuahua, which addresses “the modern love affair of ‘living with Apple’ during the height of the company’s market influence and technology dominance,” and how smartphones created a phenomenon of “hyperemployment.”

***

Think back to 2007, when you got the first iPhone. (You did get one, didn’t you? Of course you did.) You don’t need me to remind you that it was a shiny object of impressive design, slick in hand and light in pocket. Its screen was bright and its many animations produced endless, silent “oohs” even as they became quickly familiar. Accelerometer-triggered rotations, cell tower triangulations (the first model didn’t have GPS yet), and seamless cellular/WiFi data transitions invoked strong levels of welcome magic. These were all novelties once, and not that long ago.

What you probably don’t remember: that first iPhone was also terrible. Practically unusable, really, for the ordinary barrage of phone calls, text messages, mobile email, and web browsing that earlier smartphones had made portable. And not for the reasons we feared before getting our hands on one—typing without tactile feedback wasn’t as hard to get used to as BlackBerry and Treo road warriors had feared, even if it still required a deliberate transition from t9 or mini-keyboard devices—but rather because the device software was pushing the limits of what affordable hardware could handle at the time.

Applications loaded incredibly slowly. Pulling up a number or composing an email by contact name was best begun before ordering a latte or watering a urinal to account for the ensuing delay. Cellular telephone reception was far inferior to other devices available at the time, and regaining a lost signal frequently required an antenna or power cycle. Wireless data reception was poor and slow, and the device’s ability to handle passing in and out of what coverage it might find was limited. Tasks interrupted by coverage losses, such as email sends in progress, frequently failed completely.

The software was barebones. There was no App Store in those early days, making the iPhone’s operating system a self-contained affair, a ladleful of Apple-apportioned software gruel, the same for everyone. That it worked at all was a miracle, but our expectations had been set high by decades of complex, adept desktop software. By comparison, the iPhone’s apps were barebones. The Mail application, for example, borrowed none of its desktop cousin’s elegant color-coded, threaded summary view but instead demanded inexplicable click-touches back and forward from folder to folder, mailbox to mailbox.

Some of these defects have been long since remedied in the many iterations of the device that have appeared since its 2007 debut. Telephony works well, and who uses the phone anymore anyway? Data speed and reliability have been updated both on wireless network infrastructures and in the smartphone itself. But other issues persist. For those who cut their computing teeth on desktops and laptops—the things that we used to mean when we used the word computer—manipulating mobile software still feels awkward and laborious. Those many taps of the original Mail app haven’t been altered or remedied so much as they have become standardized. Now, we use all software in the convoluted manner mobile operating systems demand, from email to word processing to video editing.

It put us in our place before Apple. This was the purpose of the iPhone, and this is its primary legacy.

But to issue complaints about usability misses the point of the iPhone, even all those years ago, and certainly today. The iPhone was never a device one should have expected to “just work,” to quote Apple’s familiar advertising lingo. It is a device one has to accommodate. It taught us how to tolerate Apple making us tolerate it. It put us in our place before Apple. This was the purpose of the iPhone, and this is its primary legacy.

Then, as now, the iPhone demands to be touched just right, in precisely the right spot on menu, list, or keyboard, and with precisely the right gesture. Likewise, it demands not to be touched just after, when being pocketed or moved or simply turned to place at one’s ear. Doing otherwise erroneously launches, or quits, or selects, or deletes, or slides, or invokes Siri the supposedly intelligent personal assistant, or performs some other action, desired or not, slickly coupled to a touch or gestural control.

The iPhone resists usability, a term reserved for apparatuses humans make their servants. An iPhone is not a computer. It is a living creature, one filled with caprice and vagary like a brilliant artist, like a beautiful woman, like a difficult executive. Whether it is usable is not the point. To use the iPhone is to submit to it. Not to its interfaces, but to the ambiguity of its interpretation of them. To understand it as an Other, an alien being boasting ineffable promise and allure. Touch here? Stroke there? Stop here? Do it again? The impressive fragility of the device only reinforces this sense—to do it wrong by dropping or misgesturing might lead to unknown consequences. Unlike other portable devices—a Walkman or a traditional mobile phone—the iPhone embraces fragility rather than ruggedness. It demands to be treated with kid gloves. Even before you’ve first touched it, you can already hear yourself apologizing for your own blunders in its presence, as if you are there to serve it rather than it you. The iPhone is a device that can send you far out of your way, and yet you feel good about it. It is a device that can endear you to it by resisting your demands rather than surrendering to them.

Rather than thinking of the iPhone as a smartphone, like a Treo or a BlackBerry or, eventually, the Android devices that would mimic it, one would do better to think of the iPhone as a pet. It is the toy dog of mobile devices, a creature one holds gently and pets carefully, never sure whether it might nuzzle or bite. Like a Chihuahua, it rides along with you, in arm or in purse or in pocket, peering out to assert both your status as its owner and its mastery over you as empress. And like a toy dog, it reserves the right never to do the same thing a second time, even given the same triggers. Its foibles and eccentricities demand far greater effort than its more stoic smartphone cousins, but in so doing, it challenges you to make sense of it.

iPhone Original/3G/4/5. Photo by Yutaka Tsutano

The BlackBerry’s simplicity and effectiveness yielded a constant barrage of new things to do. And eventually, so would the smartphone—social media feeds and status updates replaced work with play-as-work, with hyperemployment, a term I’ll explain soon enough. But that first iPhone resisted utility old and new. It acclimated us to the new quirks of touchscreen life, of attempting to accomplish complex tasks that would have been easy on a normal computer but laborious on a tiny screen that ran one program at a time. Today we’ve acclimated, accepting these inefficiencies as givens. But such an eventuality was never guaranteed, and iPhone had to train us to tolerate them. Like the infirm must endure physical therapy to reform damaged limbs and tissues, so the smartphone user needed to be trained to accept and overcome the intrinsic incapacities of the handheld computer.

This was harder than it sounds in retrospect. That first iPhone receded into itself at times, offering its owner no choice but to pet it in vain, or to pack it away until it regained composure, or to reboot it in the hopes that what once worked might do so again. It was a beast of its vicissitudes. And it still is, albeit in different ways. To own an iPhone is to embrace such fickleness rather than to lament it in the hope for succor via software update. And even when one does come, it only introduces new quirks to replace the old ones: the slowdowns of an operating system upgrade launched to execute planned obsolescence, say, or via new sensors, panels, controls, and interfaces that render a once modernist simplicity baroque.

The brilliance of the iPhone is not how intuitive or powerful or useful it is—for really it is none of these things.

Indeed, when you would meet new iPhone users, they would share much more in common with smug, tired pet owners than with mobile busybodies. “Here, let me show you,” one would say proudly when asked how she liked it. Fingers would stretch gently over photos, zooming and turning. They’d flick nonchalantly through web pages and music playlists. As with the toy dog or the kitten, when the iPhone fails to perform as expected, its owners would simply shrug in capitulation. “Who knows what goes through its head,” one might rationalize, as she might do just the same when her Maltese jerks from sleep and scurries frantically, sliding across wood around a corner.

The brilliance of the iPhone is not how intuitive or powerful or useful it is—for really it is none of these things. Rather, the brilliance of the iPhone is in its ability to transcend the world of gadgetry and enter another one: the world of companionship. But unlike the Chihuahua or the bichon or even the kitten, the iPhone has no gender bias. It need not signal overwrought Hollywood glam, high-maintenance upper-class leisure, or sensitive loner solitude. iPhone owners can feel assured in their masculinity or femininity equally as they stroke and snuggle their pet devices, fearing no reprisal for foppishness or dorkship.

The Aibo and Pleo, those semirealistic robotic pets of the pre-iPhone era that attempted to simulate the form and movement of a furry biological pet, failed precisely because they did nothing else other than pretend to be real pets. The iPhone got it right: a pet is not an animal at all. A pet is a creature that responds meaningfully to touch and voice and closeness, but only sometimes. At other times, it retreats inextricably into its own mind, gears spinning in whatever alien way they must for other creatures. A pet is a sentient alien that cultures an attachment that might remain—that probably remains—unrequited. A pet is a bottomless pit for affect and devotion, yet one whose own feelings can never be truly known.

The iPhone offers an excuse to dampen the smartphone’s obsession with labor, productivity, progress, and efficiency with the touching, demented weirdness that comes with companionship. Despite its ability to text, to tweet, to Facebook, to Instagram, perhaps the real social promise of iPhone lies elsewhere: as a part of a more ordinary, more natural ecology of real social interaction. The messy sort that resists formalization in software form. The kind that makes unreasonable demands and yet sometimes surprises.

And of course, the kind that overheats and flips into mania. Mania, it turns out, is what iPhone wants most. To turn us all into the digital equivalent of the toy dog–toting socialite obsessive or the crazy cat lady, doting and tapping, swiping and cooing at glass rectangles with abandon.

***

In 1930, the economist John Maynard Keynes famously argued that by the time a century had passed, developed societies would be able to replace work with leisure thanks to widespread wealth and surplus. “We shall do more things for ourselves than is usual with the rich today,” he wrote, “only too glad to have small duties and tasks and routines.” Eighty years hence, it’s hard to find a moment in the day not filled with a duty or task or routine. If anything, it would seem that work has overtaken leisure almost entirely. We work increasingly hard for increasingly little, only to come home to catch up on the work we can’t manage to work on at work.

Take email. A friend recently posed a question on Facebook: “Remember when email was fun?” It’s hard to think back that far. On Prodigy, maybe, or with UNIX mail or Elm or Pine via telnet. Email was silly then, a trifle. A leisure activity out of Keynes’s macroeconomic tomorrowland. It was full of excess, a thing done because it could be rather than because it had to be. The worst part of email was forwarded jokes, and even those seem charming in retrospect. Even junk mail is endearing when it’s novel.

Now, email is a pot constantly boiling over. Like King Sisyphus pushing his boulder, we read, respond, delete, delete, delete, only to find that even more messages have arrived while we were pruning. A whole time management industry has erupted around email, urging us to check only once or twice a day, to avoid checking email first thing in the morning, and so forth. Even if such techniques work, the idea that managing the communication for a job now requires its own self-help literature reeks of a foul new anguish.

Like King Sisyphus pushing his boulder, we read, respond, delete, delete, delete, only to find that even more messages have arrived while we were pruning.

If you’re like many people, you’ve started using your smartphone as an alarm clock. Now it’s the first thing you see and hear in the morning. And touch, before your spouse or your crusty eyes. Then the ritual begins. Overnight, twenty or forty new emails: spam, solicitations, invitations, or requests from those whose days pass during your nights, mailing list reminders, bill pay notices. A quick triage, only to be undone while you shower and breakfast.

Email and online services have provided a way for employees to outsource work to one another. Whether you’re planning a meeting with an online poll, requesting an expense report submission to an enterprise resource planning (ERP) system, asking that a colleague contribute to a shared Google Doc, or just forwarding on a notice that “might be of interest,” jobs that previously would have been handled by specialized roles have now been distributed to everyone in an organization.

No matter what job you have, you probably have countless other jobs as well. Marketing and public communications were once centralized; now every division needs a social media presence, and maybe even a website to develop and manage. Thanks to Oracle and SAP, everyone is a part-time accountant and procurement specialist. Thanks to Oracle and Google Analytics, everyone is a part-time analyst.

Photo by Leo Chen

And email has become the circulatory system along which internal outsourcing flows. Sending an email is easy and cheap, and emails create obligation on the part of a recipient without any prior agreement. In some cases, that obligation is bureaucratic, meant to drive productivity and reduce costs. “Self-service” software automation systems like these are nothing new—SAP’s ERP software has been around since the 1970s. But since the 2000s, such systems can notify and enforce compliance via email requests and nags. In other cases, email acts as a giant human shield, a white-collar Strategic Defense Initiative. The worker who emails enjoys both assignment and excuse all at once. “Didn’t you get my email?”

The despair of email has long left the workplace. Not just by infecting our evenings and weekends via Outlook web access and BlackBerry and iPhone, although it has certainly done that. Now we also run the email gauntlet with everyone. The ballet school’s schedule updates (always received too late, but “didn’t you get the email?”); the Scout troop announcements; the daily deals website notices; the PR distribution list you somehow got on after attending that conference; the insurance notification, informing you that your new coverage cards are available for self-service printing (you went paperless, yes?); and the email password reset notice that finally trickles in twelve hours later, because you forgot your insurance website password from a year ago. And so on.

Its primary function is to reproduce itself in enough volume to create anxiety and confusion.

It’s easy to see email as unwelcome obligation, but too rarely do we take that obligation to its logical if obvious conclusion: those obligations are increasingly akin to another job—or better, many other jobs. For those of us lucky enough to be employed, we’re really hyperemployed—committed to our usual jobs and many other jobs as well. It goes without saying that we’re not being paid for all these jobs, but pay is almost beside the point, because the real cost of hyperemployment is time. We are doing all those things others aren’t doing instead of all the things we are competent at doing. And if we fail to do them, whether through active resistance or simple overwhelm, we alone suffer for it: the schedules don’t get made, the paperwork doesn’t get mailed, the proposals don’t get printed, and on and on.

But the deluge doesn’t stop with email, and hyperemployment extends even to the unemployed, thanks to our tacit agreement to work for so many Silicon Valley technology companies.

Increasingly, online life in general overwhelms. The endless, constant flow of email, notifications, direct messages, favorites, invitations. After that daybreak email triage, so many other icons on your phone boast badges silently enumerating their demands. Facebook notifications. Twitter @ messages, direct messages. Tumblr followers, Instagram favorites, Vine comments. Elsewhere too: comments on your blog, on your YouTube channel. The Facebook page you manage for your neighborhood association or your animal rescue charity. New messages in the forums you frequent. Your Kickstarter campaign updates. Your Etsy shop. Your eBay watch list. And then, of course, more email. Always more email.

Email is the plumbing of hyperemployment. Not only do automated systems notify and direct us via email but we direct and regulate one another through email. But even beyond its function as infrastructure, email also has a disciplinary function. The content of email almost doesn’t matter. Its primary function is to reproduce itself in enough volume to create anxiety and confusion. The constant flow of new email produces an endless supply of potential work. Even figuring out whether there is really any “actionable” effort in the endless stream of emails requires viewing, sorting, parsing, even before one can begin conducting the effort needed to act and respond.

We have become accustomed to using the term precarity to describe the condition whereby employment itself is unstable or insecure. But even within the increasingly precarious jobs, the work itself has become precarious too. Email is a mascot for this sensation. At every moment of the workday—and on into the evening and the night, thanks to smartphones—we face the possibility that some request or demand, reasonable or not, might be awaiting us.

Often, we cast these new obligations either as compulsions (the addictive, possibly dangerous draw of online life) or as necessities (the importance of digital contact and an “online brand” in the information economy). But what if we’re mistaken, and both tendencies are really just symptoms of hyperemployment? We are now competing with ourselves for our own attention.

Rather than just being exploited or duped, we’ve been hyperemployed.

When critics engage with the demands of online services via labor, they often cite exploitation as a simple explanation. It’s a sentiment that even has its own aphorism: “If you’re not paying for the product, you are the product.” The idea is that all the information you provide to Google and Facebook, all the content you create for Tumblr and Instagram, enables the primary business of such companies, which amounts to aggregating and reselling your data or access to it. In addition to the revenues extracted from ad sales, tech companies like YouTube and Instagram also managed to leverage the speculative value of your data-and-attention into billion-dollar buyouts. Tech companies are using you, and they’re giving precious little back in return.

While often true, this phenomenon is not fundamentally new to online life. We get network television for free in exchange for the attention we devote to ads that interrupt our shows. We receive “discounts” on grocery store staples in exchange for allowing Kroger or Safeway to aggregate and sell our shopping data. Meanwhile, the companies we do pay directly as customers often treat us with disregard at best, abuse at worst (just think about your cable provider or your bank). Of course, we shouldn’t just accept online commercial exploitation just because exploitation in general has been around for ages. Rather, we should acknowledge that exploitation only partly explains today’s anxiety with online services.

Hyperemployment offers a subtly different way to characterize all the tiny effort we contribute to Facebook and Instagram and the like. It’s not just that we’ve been duped into contributing free value to technology companies (although that’s also true) but that we’ve tacitly agreed to work unpaid jobs for all these companies. And even calling them “unpaid” is slightly unfair, because we do get something back from these services, even if they often take more than they give. Rather than just being exploited or duped, we’ve been hyperemployed. We do tiny bits of work for Google, for Tumblr, for Twitter, all day and every day.

Today, everyone’s a hustler. But now we’re not even just hustling for ourselves or our bosses but for so many other, unseen bosses. For accounts payable and for marketing; for the Girl Scouts and the Youth Choir; for Facebook and for Google; for our friends via their Kickstarters and their Etsy shops; for Twitter, whose initial public offering converted years of tiny, aggregated work acts into seventy-eight dollars of fungible value per user.

Even if there is more than a modicum of exploitation at work in the hyperemployment economy, the despair and overwhelm of online life don’t derive from that exploitation—not directly anyway. Rather, it’s a type of exhaustion cut of the same sort that afflicts the underemployed as well, like the single mother working two part-time service jobs with no benefits or the PhD working three contingent teaching gigs at three different regional colleges to scrape together a still insufficient income. The economic impact of hyperemployment is obviously different from that of underemployment, but some of the same emotional toll imbues both: a sense of inundation, of being trounced by demands whose completion yields only their continuance, and a feeling of resignation that no other scenario is likely or even possible. The only difference between the despair of hyperemployment and that of underemployment is that the latter at least acknowledges itself as a substandard condition, whereas the former celebrates the hyperemployed’s purported freedom to “share” and “connect,” to do business more easily and effectively by doing jobs once left for others’ competence and compensation, from the convenience of your car or toilet.

Staring down the barrel of Keynes’s 2030 target for the arrival of universal leisure, economists have often considered why the economist seems to have been so wrong. The inflation of relative needs is one explanation—the arms race for better and more stuff and status. The ever-increasing wealth gap, on the rise since the anti-Keynes, supply-side 1980s, is another. But what if Keynes was right, too, in a way. Even if productivity has increased mostly to the benefit of the wealthy, hasn’t everyone gained enormous leisure, but by replacing recreation with work rather than work with recreation? This new work doesn’t even require employment; the destitute and unemployed hyperemployed are just as common as the affluent and retired hyperemployed. Perversely, it is only then, at the labor equivalent of the techno-anarchist’s singularity, that the malaise of hyperemployment can cease. Then all time will become work time, and we will not have any memory of leisure to distract us.

***

At the start of 2015, fewer than eight short years since the first launch of the iPhone, Apple was worth more than seven hundred billion dollars—more than the gross national product of Switzerland. Despite its origins as a computer company, this is a fortune built from smartphones more than laptops. Before 2007, smartphones were a curiosity, mostly an affectation of would-be executives carting BlackBerries and Treos in unfashionable belt holsters. Not even a decade ago, they were wild and feral. Today, smartphones are fully domesticated. Tigers made kittens, which we now pet ceaselessly. More than two-thirds of Americans own them, and they have become the primary form of computing.

But along with that domestication comes the inescapability of docility. Have you not accepted your smartphone’s reign over you rather than lamenting it? Stroking our glass screens, Chihuahua-like, is just what we do now, even if it also feels sinful. The hope and promise of new computer technology have given way to the malaise of living with it.

Shifts in technology are also shifts in culture and custom. And these shifts have become more frequent and more rapid over time. Before 2007, one of the most substantial technological shifts in daily life was probably the World Wide Web, which was already commercialized by the mid-1990s and mainstream by 2000. Before that? The personal computer, perhaps, which took from about 1977 until 1993 or so to become a staple of both home and business life. First we computerized work, then we computerized home and social life, then we condensed and transferred that life to our pockets. With the Apple Watch, now the company wants to condense it even further and have you wear it on your wrist.

The hope and promise of new computer technology have given way to the malaise of living with it.

Change is exciting, but it can also be exhausting. And for the first time in a long time, reactions to the Apple Watch seem to underscore exhaustion as much as excitement. But even these skeptical replies question the watch’s implementation rather than expressing lethargy at the prospect of living in the world it might bestow on us.

Some have accused Apple of failing to explain the purpose of its new wearable. The wristwatch connoisseur Benjamin Clymer calls it a “market leader in a category nobody asked for.” Apple veteran Ben Thompson rejoins Cook for failing to explain “why the Apple Watch existed, or what need it is supposed to fill.” Felix Salmon agrees, observing that Apple “has always been the company which makes products for real people, rather than gadgets for geeks,” before lamenting that the Apple Watch falls into the latter category.

“Apple hasn’t solved the basic smartwatch dilemma,” Salmon writes. But the dilemma he’s worried about proves to be a banal detail: “Smart watches use up far more energy than dumb watches.” He later admits that Apple might solve the battery and heft problems in a couple generations, but “I’m not holding my breath.” Salmon reacts to the Apple Watch’s design and engineering failings rather than lamenting the more mundane afflictions of being subjected to wrist-sized emails in addition to desktop- and pocket-sized ones. We’re rearranging icons on the Titanic.

After the Apple keynote, the Onion joked about the real product Apple had unveiled—a “brief, fleeting moment of excitement.” But like so much satire these days, it’s not really a joke. As Dan Frommer recently suggested, the Apple keynote is no less a product than are its phones and tablets. Apple is in the business of introducing big things as much as it is in the business of designing, manufacturing, distributing, and supporting them. In part, it has to be: Apple’s massive valuation, revenues, and past successes have only increased the street’s expectations for the company. In a world of so-called disruptive innovation, a company like Apple is expected to manufacture market-defining hit after hit.

Indeed, business is another context we often use to avoid engaging with our technological weariness. We talk about how Apple’s CEO Tim Cook must steer the tech giant into new waters—such as wearables—to ensure a fresh supply of desire, customers, and revenue. But the exigency of big business has an impact on our ordinary lives. It’s easy to cite the negative effects of a business environment focused on quarterly profits above all else, including maintaining job stability and paying into the federal or municipal tax base. In the case of Apple, something else is going on, too. In addition to being an economic burden, the urgency of technological innovation has become so habitual that we have become resigned to it. Wearables might not be perfect yet, we conclude, but they will happen. They already have.

I’m less interested in accepting wearables given the right technological conditions than I am prospectively exhausted at the idea of dealing with that future’s existence. Just think about it. All those people staring at their watches in the parking structure, in the elevator. Tapping and stroking them, nearly spilling their coffee as they swivel their hands to spin the watch’s tiny crown control.

You can imagine wearing Apple Watch, in no small part because you remember thinking that you could imagine carrying Apple’s iPhone—and then you did, and now you always do.

A whole new tech cliché convention: the zoned-out smartwatch early adopter staring into his outstretched arm, like an inert judoka at the ready. The inevitable thinkpieces turned nonfiction trade books about “wrist shrift” or some similarly punsome quip on the promise-and-danger of wearables.

The variegated buzzes of so many variable “haptic engine” vibrations, sending notices of emails arriving from a boss or a spammer or obscene images received from a Facebook friend. The terrible battery life Salmon worries about, and the necessity of purchasing a new, expensive wristwatch every couple years, along with an equally costly smartphone with which to mate it.

The emergence of a new, laborious media creation and consumption ecosystem built for glancing. The rise of the “glancicle,” which will replace the listicle. The PR emails and the b2b advertisements and the business consulting conference promotions all asking, “Is your brand glance-aware?”

These are mundane future grievances, but they are also likely ones. Unlike those of its competitor Google, with its eyeglass wearables and delivery drones and autonomous cars, Apple’s products are reasonable and expected—prosaic even, despite their refined design. Google’s future is truly science fictional, whereas Apple’s is mostly foreseeable. You can imagine wearing Apple Watch, in no small part because you remember thinking that you could imagine carrying Apple’s iPhone—and then you did, and now you always do.

Photo by LWYang

Technology moves fast, but its speed now slows us down. A torpor has descended, the weariness of having lived this change before—or one similar enough, anyway—and all too recently. The future isn’t even here yet, and it’s already exhausted us in advance.

It’s a far cry from “future shock,” Alvin Toffler’s 1970 term for the postindustrial sensation that too much change happens in too short a time. Where once the loss of familiar institutions and practices produced a shock, now it produces something more tepid and routine. The planned obsolescence that coaxes us to replace our iPhone 5 with an iPhone 6 is no longer disquieting but just expected. I have to have one has become Of course I’ll get one. The idea that we might willingly reinvent social practice around wristwatch computers less than a decade after reforming it for smartphones is no longer surprising but predictable. We’ve heard this story before; we know how it ends.

Future shock is over. Apple Watch reveals that we suffer a new affliction: future ennui. The excitement of a novel technology (or anything, really) has been replaced—or at least dampened—by the anguish of knowing its future burden. This listlessness might yet prove even worse than blind boosterism or cynical naysaying. Where the trauma of future shock could at least light a fire under its sufferers, future ennui exudes the viscous languor of indifferent acceptance. It doesn’t really matter that the Apple Watch doesn’t seem necessary, no more than the iPhone once didn’t too. Increasingly, change is not revolutionary, to use a word Apple has made banal, but presaged.

Our lassitude will probably be great for the companies like Apple that have worn us down with the constancy of their pestering. The poet Charles Baudelaire called ennui the worst sin, the one that could “swallow the world in a yawn.” As Apple Watch leads the suppuration of a new era of wearables, who has energy left to object? Who has the leisure for revolution, as we keep up with our social media timelines and emails and home thermostats and heart monitors?

When one is enervated by future ennui, there’s no vigor left even to ask if this future is one we even want. And even if we ask, lethargy will likely curtail our answers. No matter, though: soon enough, only a wrist’s glance worth of ideas will matter anyway. And at that point, even this short book’s worth of reflections on technology will be too much to bear, incompatible with our newfound obsession with wrist-sizing ideas. I’m sure I’ll adapt, like you will. Living with Apple means marching ever forward, through its aluminum- and glass-lined streets and into the warm, selfsame glow of the future.

***

Ian Bogost is Ivan Allen College Distinguished Chair in Media Studies and professor of interactive computing at Georgia Institute of Technology, where he also holds an appointment in the Scheller College of Business. His books include How to Do Things with Videogames (Minnesota, 2011) and Alien Phenomenology, or What It’s Like to Be a Thing (Minnesota, 2012).

]]>
16811
How Apple’s Transcendent Chihuahua Killed the Revolution https://longreads.com/2015/06/16/how-apples-transcendent-chihuahua-killed-the-revolution-2/ Tue, 16 Jun 2015 15:00:29 +0000 http://blog.longreads.com/?p=16811 Few are excited about the Apple Watch—its burdens are too easily imagined. And yet we treat it as an inevitability. How did this happen?]]>

Ian Bogost | from The Geek’s Chihuahua | University of Minnesota Press | April 2015 | 22 minutes (5,539 words)

The following is an excerpt from Ian Bogost’s book The Geek’s Chihuahua, which addresses “the modern love affair of ‘living with Apple’ during the height of the company’s market influence and technology dominance,” and how smartphones created a phenomenon of “hyperemployment.”

***

Think back to 2007, when you got the first iPhone. (You did get one, didn’t you? Of course you did.) You don’t need me to remind you that it was a shiny object of impressive design, slick in hand and light in pocket. Its screen was bright and its many animations produced endless, silent “oohs” even as they became quickly familiar. Accelerometer-triggered rotations, cell tower triangulations (the first model didn’t have GPS yet), and seamless cellular/WiFi data transitions invoked strong levels of welcome magic. These were all novelties once, and not that long ago.

What you probably don’t remember: that first iPhone was also terrible. Practically unusable, really, for the ordinary barrage of phone calls, text messages, mobile email, and web browsing that earlier smartphones had made portable. And not for the reasons we feared before getting our hands on one—typing without tactile feedback wasn’t as hard to get used to as BlackBerry and Treo road warriors had feared, even if it still required a deliberate transition from t9 or mini-keyboard devices—but rather because the device software was pushing the limits of what affordable hardware could handle at the time.

Applications loaded incredibly slowly. Pulling up a number or composing an email by contact name was best begun before ordering a latte or watering a urinal to account for the ensuing delay. Cellular telephone reception was far inferior to other devices available at the time, and regaining a lost signal frequently required an antenna or power cycle. Wireless data reception was poor and slow, and the device’s ability to handle passing in and out of what coverage it might find was limited. Tasks interrupted by coverage losses, such as email sends in progress, frequently failed completely.

The software was barebones. There was no App Store in those early days, making the iPhone’s operating system a self-contained affair, a ladleful of Apple-apportioned software gruel, the same for everyone. That it worked at all was a miracle, but our expectations had been set high by decades of complex, adept desktop software. By comparison, the iPhone’s apps were barebones. The Mail application, for example, borrowed none of its desktop cousin’s elegant color-coded, threaded summary view but instead demanded inexplicable click-touches back and forward from folder to folder, mailbox to mailbox.

Some of these defects have been long since remedied in the many iterations of the device that have appeared since its 2007 debut. Telephony works well, and who uses the phone anymore anyway? Data speed and reliability have been updated both on wireless network infrastructures and in the smartphone itself. But other issues persist. For those who cut their computing teeth on desktops and laptops—the things that we used to mean when we used the word computer—manipulating mobile software still feels awkward and laborious. Those many taps of the original Mail app haven’t been altered or remedied so much as they have become standardized. Now, we use all software in the convoluted manner mobile operating systems demand, from email to word processing to video editing.

It put us in our place before Apple. This was the purpose of the iPhone, and this is its primary legacy.

But to issue complaints about usability misses the point of the iPhone, even all those years ago, and certainly today. The iPhone was never a device one should have expected to “just work,” to quote Apple’s familiar advertising lingo. It is a device one has to accommodate. It taught us how to tolerate Apple making us tolerate it. It put us in our place before Apple. This was the purpose of the iPhone, and this is its primary legacy.

Then, as now, the iPhone demands to be touched just right, in precisely the right spot on menu, list, or keyboard, and with precisely the right gesture. Likewise, it demands not to be touched just after, when being pocketed or moved or simply turned to place at one’s ear. Doing otherwise erroneously launches, or quits, or selects, or deletes, or slides, or invokes Siri the supposedly intelligent personal assistant, or performs some other action, desired or not, slickly coupled to a touch or gestural control.

The iPhone resists usability, a term reserved for apparatuses humans make their servants. An iPhone is not a computer. It is a living creature, one filled with caprice and vagary like a brilliant artist, like a beautiful woman, like a difficult executive. Whether it is usable is not the point. To use the iPhone is to submit to it. Not to its interfaces, but to the ambiguity of its interpretation of them. To understand it as an Other, an alien being boasting ineffable promise and allure. Touch here? Stroke there? Stop here? Do it again? The impressive fragility of the device only reinforces this sense—to do it wrong by dropping or misgesturing might lead to unknown consequences. Unlike other portable devices—a Walkman or a traditional mobile phone— the iPhone embraces fragility rather than ruggedness. It demands to be treated with kid gloves. Even before you’ve first touched it, you can already hear yourself apologizing for your own blunders in its presence, as if you are there to serve it rather than it you. The iPhone is a device that can send you far out of your way, and yet you feel good about it. It is a device that can endear you to it by resisting your demands rather than surrendering to them.

Rather than thinking of the iPhone as a smartphone, like a Treo or a BlackBerry or, eventually, the Android devices that would mimic it, one would do better to think of the iPhone as a pet. It is the toy dog of mobile devices, a creature one holds gently and pets carefully, never sure whether it might nuzzle or bite. Like a Chihuahua, it rides along with you, in arm or in purse or in pocket, peering out to assert both your status as its owner and its mastery over you as empress. And like a toy dog, it reserves the right never to do the same thing a second time, even given the same triggers. Its foibles and eccentricities demand far greater effort than its more stoic smartphone cousins, but in so doing, it challenges you to make sense of it.

iPhone Original/3G/4/5. Photo by Yutaka Tsutano
iPhone Original/3G/4/5. Photo by Yutaka Tsutano

The BlackBerry’s simplicity and effectiveness yielded a constant barrage of new things to do. And eventually, so would the smartphone—social media feeds and status updates replaced work with play-as-work, with hyperemployment, a term I’ll explain soon enough. But that first iPhone resisted utility old and new. It acclimated us to the new quirks of touchscreen life, of attempting to accomplish complex tasks that would have been easy on a normal computer but laborious on a tiny screen that ran one program at a time. Today we’ve acclimated, accepting these inefficiencies as givens. But such an eventuality was never guaranteed, and iPhone had to train us to tolerate them. Like the infirm must endure physical therapy to reform damaged limbs and tissues, so the smartphone user needed to be trained to accept and overcome the intrinsic incapacities of the handheld computer.

This was harder than it sounds in retrospect. That first iPhone receded into itself at times, offering its owner no choice but to pet it in vain, or to pack it away it until it regained composure, or to reboot it in the hopes that what once worked might do so again. It was a beast of its vicissitudes. And it still is, albeit in different ways. To own an iPhone is to embrace such fickleness rather than to lament it in the hope for succor via software update. And even when one does come, it only introduces new quirks to replace the old ones: the slowdowns of an operating system upgrade launched to execute planned obsolescence, say, or via new sensors, panels, controls, and interfaces that render a once modernist simplicity baroque.
[pullquote align=”center”]The brilliance of the iPhone is not how intuitive or powerful or useful it is—for really it is none of these things.[/pullquote]
Indeed, when you would meet new iPhone users, they would have much more in common with smug, tired pet owners than with mobile busybodies. “Here, let me show you,” one would say proudly when asked how she liked it. Fingers would stretch gently over photos, zooming and turning. They’d flick nonchalantly through web pages and music playlists. As with the toy dog or the kitten, when the iPhone failed to perform as expected, its owners would simply shrug in capitulation. “Who knows what goes through its head,” one might rationalize, as she might do just the same when her Maltese jerks from sleep and scurries frantically, sliding across wood around a corner.

The brilliance of the iPhone is not how intuitive or powerful or useful it is—for really it is none of these things. Rather, the brilliance of the iPhone is in its ability to transcend the world of gadgetry and enter another one: the world of companionship. But unlike the Chihuahua or the bichon or even the kitten, the iPhone has no gender bias. It need not signal overwrought Hollywood glam, high-maintenance upper-class leisure, or sensitive loner solitude. iPhone owners can feel assured in their masculinity or femininity equally as they stroke and snuggle their pet devices, fearing no reprisal for foppishness or dorkship.

The Aibo and Pleo, those semirealistic robotic pets of the pre-iPhone era that attempted to simulate the form and movement of a furry biological pet, failed precisely because they did nothing other than pretend to be real pets. The iPhone got it right: a pet is not an animal at all. A pet is a creature that responds meaningfully to touch and voice and closeness, but only sometimes. At other times, it retreats inextricably into its own mind, gears spinning in whatever alien way they must for other creatures. A pet is a sentient alien that cultures an attachment that might remain—that probably remains—unrequited. A pet is a bottomless pit for affect and devotion, yet one whose own feelings can never be truly known.

The iPhone offers an excuse to dampen the smartphone’s obsession with labor, productivity, progress, and efficiency with the touching, demented weirdness that comes with companionship. Despite its ability to text, to tweet, to Facebook, to Instagram, perhaps the real social promise of iPhone lies elsewhere: as a part of a more ordinary, more natural ecology of real social interaction. The messy sort that resists formalization in software form. The kind that makes unreasonable demands and yet sometimes surprises.

And of course, the kind that overheats and flips into mania. Mania, it turns out, is what iPhone wants most. To turn us all into the digital equivalent of the toy dog–toting socialite obsessive or the crazy cat lady, doting and tapping, swiping and cooing at glass rectangles with abandon.

***

In 1930, the economist John Maynard Keynes famously argued that by the time a century had passed, developed societies would be able to replace work with leisure thanks to widespread wealth and surplus. “We shall do more things for ourselves than is usual with the rich today,” he wrote, “only too glad to have small duties and tasks and routines.” Eighty years hence, it’s hard to find a moment in the day not filled with a duty or task or routine. If anything, it would seem that work has overtaken leisure almost entirely. We work increasingly hard for increasingly little, only to come home to catch up on the work we can’t manage to work on at work.

Take email. A friend recently posed a question on Facebook: “Remember when email was fun?” It’s hard to think back that far. On Prodigy, maybe, or with UNIX mail or Elm or Pine via telnet. Email was silly then, a trifle. A leisure activity out of Keynes’s macroeconomic tomorrowland. It was full of excess, a thing done because it could be rather than because it had to be. The worst part of email was forwarded jokes, and even those seem charming in retrospect. Even junk mail is endearing when it’s novel.

Now, email is a pot constantly boiling over. Like King Sisyphus pushing his boulder, we read, respond, delete, delete, delete, only to find that even more messages have arrived while we were pruning. A whole time management industry has erupted around email, urging us to check only once or twice a day, to avoid checking email first thing in the morning, and so forth. Even if such techniques work, the idea that managing the communication for a job now requires its own self-help literature reeks of a foul new anguish.
[pullquote align=”center”]Like King Sisyphus pushing his boulder, we read, respond, delete, delete, delete, only to find that even more messages have arrived while we were pruning.[/pullquote]
If you’re like many people, you’ve started using your smartphone as an alarm clock. Now it’s the first thing you see and hear in the morning. And touch, before your spouse or your crusty eyes. Then the ritual begins. Overnight, twenty or forty new emails: spam, solicitations, invitations, or requests from those whose days pass during your nights, mailing list reminders, bill pay notices. A quick triage, only to be undone while you shower and breakfast.

Email and online services have provided a way for employees to outsource work to one another. Whether you’re planning a meeting with an online poll, requesting an expense report submission to an enterprise resource planning (ERP) system, asking that a colleague contribute to a shared Google Doc, or just forwarding on a notice that “might be of interest,” jobs that previously would have been handled by specialized roles have now been distributed to everyone in an organization.

No matter what job you have, you probably have countless other jobs as well. Marketing and public communications were once centralized; now every division needs a social media presence, and maybe even a website to develop and manage. Thanks to Oracle and SAP, everyone is a part-time accountant and procurement specialist. Thanks to Oracle and Google Analytics, everyone is a part-time analyst.

Photo by Leo Chen

And email has become the circulatory system along which internal outsourcing flows. Sending an email is easy and cheap, and emails create obligation on the part of a recipient without any prior agreement. In some cases, that obligation is bureaucratic, meant to drive productivity and reduce costs. “Self-service” software automation systems like these are nothing new—SAP’s ERP software has been around since the 1970s. But since the 2000s, such systems can notify and enforce compliance via email requests and nags. In other cases, email acts as a giant human shield, a white-collar Strategic Defense Initiative. The worker who emails enjoys both assignment and excuse all at once. “Didn’t you get my email?”

The despair of email has long left the workplace. Not just by infecting our evenings and weekends via Outlook web access and BlackBerry and iPhone, although it has certainly done that. Now we also run the email gauntlet with everyone. The ballet school’s schedule updates (always received too late, but “didn’t you get the email?”); the Scout troop announcements; the daily deals website notices; the PR distribution list you somehow got on after attending that conference; the insurance notification, informing you that your new coverage cards are available for self-service printing (you went paperless, yes?); and the email password reset notice that finally trickles in twelve hours later, because you forgot the insurance website password you last used a year ago. And so on.
[pullquote align=”center”]Its primary function is to reproduce itself in enough volume to create anxiety and confusion.[/pullquote]
It’s easy to see email as unwelcome obligation, but too rarely do we take that obligation to its logical if obvious conclusion: those obligations are increasingly akin to another job—or better, many other jobs. For those of us lucky enough to be employed, we’re really hyperemployed—committed to our usual jobs and many other jobs as well. It goes without saying that we’re not being paid for all these jobs, but pay is almost beside the point, because the real cost of hyperemployment is time. We are doing all those things others aren’t doing instead of all the things we are competent at doing. And if we fail to do them, whether through active resistance or simple overwhelm, we alone suffer for it: the schedules don’t get made, the paperwork doesn’t get mailed, the proposals don’t get printed, and on and on.

But the deluge doesn’t stop with email, and hyperemployment extends even to the unemployed, thanks to our tacit agreement to work for so many Silicon Valley technology companies.

Increasingly, online life in general overwhelms. The endless, constant flow of email, notifications, direct messages, favorites, invitations. After that daybreak email triage, so many other icons on your phone boast badges silently enumerating their demands. Facebook notifications. Twitter @ messages, direct messages. Tumblr followers, Instagram favorites, Vine comments. Elsewhere too: comments on your blog, on your YouTube channel. The Facebook page you manage for your neighborhood association or your animal rescue charity. New messages in the forums you frequent. Your Kickstarter campaign updates. Your Etsy shop. Your eBay watch list. And then, of course, more email. Always more email.

Email is the plumbing of hyperemployment. Not only do automated systems notify and direct us via email but we direct and regulate one another through email. But even beyond its function as infrastructure, email also has a disciplinary function. The content of email almost doesn’t matter. Its primary function is to reproduce itself in enough volume to create anxiety and confusion. The constant flow of new email produces an endless supply of potential work. Even figuring out whether there is really any “actionable” effort in the endless stream of emails requires viewing, sorting, parsing, even before one can begin conducting the effort needed to act and respond.

We have become accustomed to using the term precarity to describe the condition whereby employment itself is unstable or insecure. But even within the increasingly precarious jobs, the work itself has become precarious too. Email is a mascot for this sensation. At every moment of the workday—and on into the evening and the night, thanks to smartphones—we face the possibility that some request or demand, reasonable or not, might be awaiting us.

Often, we cast these new obligations either as compulsions (the addictive, possibly dangerous draw of online life) or as necessities (the importance of digital contact and an “online brand” in the information economy). But what if we’re mistaken, and both tendencies are really just symptoms of hyperemployment? We are now competing with ourselves for our own attention.
[pullquote align=”center”]Rather than just being exploited or duped, we’ve been hyperemployed.[/pullquote]
When critics engage with the demands of online services via labor, they often cite exploitation as a simple explanation. It’s a sentiment that even has its own aphorism: “If you’re not paying for the product, you are the product.” The idea is that all the information you provide to Google and Facebook, all the content you create for Tumblr and Instagram, enables the primary business of such companies, which amounts to aggregating and reselling your data or access to it. In addition to the revenues extracted from ad sales, tech companies like YouTube and Instagram also managed to leverage the speculative value of your data-and-attention into billion-dollar buyouts. Tech companies are using you, and they’re giving precious little back in return.

While often true, this phenomenon is not fundamentally new to online life. We get network television for free in exchange for the attention we devote to ads that interrupt our shows. We receive “discounts” on grocery store staples in exchange for allowing Kroger or Safeway to aggregate and sell our shopping data. Meanwhile, the companies we do pay directly as customers often treat us with disregard at best, abuse at worst (just think about your cable provider or your bank). Of course, we shouldn’t accept online commercial exploitation just because exploitation in general has been around for ages. Rather, we should acknowledge that exploitation only partly explains today’s anxiety with online services.

Hyperemployment offers a subtly different way to characterize all the tiny effort we contribute to Facebook and Instagram and the like. It’s not just that we’ve been duped into contributing free value to technology companies (although that’s also true) but that we’ve tacitly agreed to work unpaid jobs for all these companies. And even calling them “unpaid” is slightly unfair, because we do get something back from these services, even if they often take more than they give. Rather than just being exploited or duped, we’ve been hyperemployed. We do tiny bits of work for Google, for Tumblr, for Twitter, all day and every day.

Today, everyone’s a hustler. But now we’re not even just hustling for ourselves or our bosses but for so many other, unseen bosses. For accounts payable and for marketing; for the Girl Scouts and the Youth Choir; for Facebook and for Google; for our friends via their Kickstarters and their Etsy shops; for Twitter, whose initial public offering converted years of tiny, aggregated work acts into seventy-eight dollars of fungible value per user.

Even if there is more than a modicum of exploitation at work in the hyperemployment economy, the despair and overwhelm of online life don’t derive from that exploitation—not directly anyway. Rather, it’s a type of exhaustion of the same sort that afflicts the underemployed as well, like the single mother working two part-time service jobs with no benefits or the PhD working three contingent teaching gigs at three different regional colleges to scrape together a still insufficient income. The economic impact of hyperemployment is obviously different from that of underemployment, but some of the same emotional toll imbues both: a sense of inundation, of being trounced by demands whose completion yields only their continuance, and a feeling of resignation that no other scenario is likely or even possible. The only difference between the despair of hyperemployment and that of underemployment is that the latter at least acknowledges itself as a substandard condition, whereas the former celebrates the hyperemployed’s purported freedom to “share” and “connect,” to do business more easily and effectively by doing jobs once left for others’ competence and compensation, from the convenience of your car or toilet.

Staring down the barrel of Keynes’s 2030 target for the arrival of universal leisure, economists have often considered why he seems to have been so wrong. The inflation of relative needs is one explanation—the arms race for better and more stuff and status. The ever-increasing wealth gap, on the rise since the anti-Keynes, supply-side 1980s, is another. But what if Keynes was right, too, in a way? Even if productivity has increased mostly to the benefit of the wealthy, hasn’t everyone gained enormous leisure, but by replacing recreation with work rather than work with recreation? This new work doesn’t even require employment; the destitute and unemployed hyperemployed are just as common as the affluent and retired hyperemployed. Perversely, it is only then, at the labor equivalent of the techno-anarchist’s singularity, that the malaise of hyperemployment can cease. Then all time will become work time, and we will not have any memory of leisure to distract us.

***

At the start of 2015, fewer than eight short years since the first launch of the iPhone, Apple was worth more than seven hundred billion dollars—more than the gross national product of Switzerland. Despite its origins as a computer company, this is a fortune built from smartphones more than laptops. Before 2007, smartphones were a curiosity, mostly an affectation of would-be executives carting BlackBerries and Treos in unfashionable belt holsters. Not even a decade ago, they were wild and feral. Today, smartphones are fully domesticated. Tigers made kittens, which we now pet ceaselessly. More than two-thirds of Americans own them, and they have become the primary form of computing.

But along with that domestication comes the inescapability of docility. Have you not accepted your smartphone’s reign over you rather than lamenting it? Stroking our glass screens, Chihuahua-like, is just what we do now, even if it also feels sinful. The hope and promise of new computer technology have given way to the malaise of living with it.

Shifts in technology are also shifts in culture and custom. And these shifts have become more frequent and more rapid over time. Before 2007, one of the most substantial technological shifts in daily life was probably the World Wide Web, which was already commercialized by the mid-1990s and mainstream by 2000. Before that? The personal computer, perhaps, which took from about 1977 until 1993 or so to become a staple of both home and business life. First we computerized work, then we computerized home and social life, then we condensed and transferred that life to our pockets. With the Apple Watch, now the company wants to condense it even further and have you wear it on your wrist.
[pullquote align=”center”]The hope and promise of new computer technology have given way to the malaise of living with it.[/pullquote]
Change is exciting, but it can also be exhausting. And for the first time in a long time, reactions to the Apple Watch seem to underscore exhaustion as much as excitement. But even these skeptical replies question the watch’s implementation rather than expressing lethargy at the prospect of living in the world it might bestow on us.

Some have accused Apple of failing to explain the purpose of its new wearable. The wristwatch connoisseur Benjamin Clymer calls it a “market leader in a category nobody asked for.” Apple veteran Ben Thompson faults Cook for failing to explain “why the Apple Watch existed, or what need it is supposed to fill.” Felix Salmon agrees, observing that Apple “has always been the company which makes products for real people, rather than gadgets for geeks,” before lamenting that the Apple Watch falls into the latter category.

“Apple hasn’t solved the basic smartwatch dilemma,” Salmon writes. But the dilemma he’s worried about proves to be a banal detail: “Smart watches use up far more energy than dumb watches.” He later admits that Apple might solve the battery and heft problems in a couple generations, but “I’m not holding my breath.” Salmon reacts to the Apple Watch’s design and engineering failings rather than lamenting the more mundane afflictions of being subjected to wrist-sized emails in addition to desktop- and pocket-sized ones. We’re rearranging icons on the Titanic.

After the Apple keynote, the Onion joked about the real product Apple had unveiled—a “brief, fleeting moment of excitement.” But like so much satire these days, it’s not really a joke. As Dan Frommer recently suggested, the Apple keynote is no less a product than are its phones and tablets. Apple is in the business of introducing big things as much as it is in the business of designing, manufacturing, distributing, and supporting them. In part, it has to be: Apple’s massive valuation, revenues, and past successes have only increased the street’s expectations for the company. In a world of so-called disruptive innovation, a company like Apple is expected to manufacture market-defining hit after hit.

Indeed, business is another context we often use to avoid engaging with our technological weariness. We talk about how Apple’s CEO Tim Cook must steer the tech giant into new waters—such as wearables—to ensure a fresh supply of desire, customers, and revenue. But the exigency of big business has an impact on our ordinary lives. It’s easy to cite the negative effects of a business environment focused on quarterly profits above all else, including maintaining job stability and paying into the federal or municipal tax base. In the case of Apple, something else is going on, too. In addition to being an economic burden, the urgency of technological innovation has become so habitual that we have become resigned to it. Wearables might not be perfect yet, we conclude, but they will happen. They already have.

I’m less interested in accepting wearables given the right technological conditions than I am prospectively exhausted at the idea of dealing with that future’s existence. Just think about it. All those people staring at their watches in the parking structure, in the elevator. Tapping and stroking them, nearly spilling their coffee as they swivel their hands to spin the watch’s tiny crown control.

[pullquote align=”center”]You can imagine wearing Apple Watch, in no small part because you remember thinking that you could imagine carrying Apple’s iPhone—and then you did, and now you always do.[/pullquote]

A whole new tech cliché convention: the zoned-out smartwatch early adopter staring into his outstretched arm, like an inert judoka at the ready. The inevitable thinkpieces turned nonfiction trade books about “wrist shrift” or some similarly punsome quip on the promise-and-danger of wearables.

The variegated buzzes of so many variable “haptic engine” vibrations, sending notices of emails arriving from a boss or a spammer or obscene images received from a Facebook friend. The terrible battery life Salmon worries about, and the necessity of purchasing a new, expensive wristwatch every couple years, along with an equally costly smartphone with which to mate it.

The emergence of a new, laborious media creation and consumption ecosystem built for glancing. The rise of the “glancicle,” which will replace the listicle. The PR emails and the B2B advertisements and the business consulting conference promotions all asking, “Is your brand glance-aware?”

These are mundane future grievances, but they are also likely ones. Unlike those of its competitor Google, with its eyeglass wearables and delivery drones and autonomous cars, Apple’s products are reasonable and expected—prosaic even, despite their refined design. Google’s future is truly science fictional, whereas Apple’s is mostly foreseeable. You can imagine wearing Apple Watch, in no small part because you remember thinking that you could imagine carrying Apple’s iPhone—and then you did, and now you always do.

Photo by LWYang

Technology moves fast, but its speed now slows us down. A torpor has descended, the weariness of having lived this change before—or one similar enough, anyway—and all too recently. The future isn’t even here yet, and it’s already exhausted us in advance.

It’s a far cry from “future shock,” Alvin Toffler’s 1970 term for the postindustrial sensation that too much change happens in too short a time. Where once the loss of familiar institutions and practices produced a shock, now it produces something more tepid and routine. The planned obsolescence that coaxes us to replace our iPhone 5 with an iPhone 6 is no longer disquieting but just expected. I have to have one has become Of course I’ll get one. The idea that we might willingly reinvent social practice around wristwatch computers less than a decade after reforming it for smartphones is no longer surprising but predictable. We’ve heard this story before; we know how it ends.

Future shock is over. Apple Watch reveals that we suffer a new affliction: future ennui. The excitement of a novel technology (or anything, really) has been replaced—or at least dampened—by the anguish of knowing its future burden. This listlessness might yet prove even worse than blind boosterism or cynical naysaying. Where the trauma of future shock could at least light a fire under its sufferers, future ennui exudes the viscous languor of indifferent acceptance. It doesn’t really matter that the Apple Watch doesn’t seem necessary, any more than it mattered that the iPhone once didn’t either. Increasingly, change is not revolutionary, to use a word Apple has made banal, but presaged.

Our lassitude will probably be great for the companies like Apple that have worn us down with the constancy of their pestering. The poet Charles Baudelaire called ennui the worst sin, the one that could “swallow the world in a yawn.” As Apple Watch leads the suppuration of a new era of wearables, who has energy left to object? Who has the leisure for revolution, as we keep up with our social media timelines and emails and home thermostats and heart monitors?

When one is enervated by future ennui, there’s no vigor left even to ask if this future is one we even want. And even if we ask, lethargy will likely curtail our answers. No matter, though: soon enough, only a wrist’s glance worth of ideas will matter anyway. And at that point, even this short book’s worth of reflections on technology will be too much to bear, incompatible with our newfound obsession with wrist-sizing ideas. I’m sure I’ll adapt, like you will. Living with Apple means marching ever forward, through its aluminum- and glass-lined streets and into the warm, selfsame glow of the future.

***

Ian Bogost is Ivan Allen College Distinguished Chair in Media Studies and professor of interactive computing at Georgia Institute of Technology, where he also holds an appointment in the Scheller College of Business. His books include How to Do Things with Videogames (Minnesota, 2011) and Alien Phenomenology, or What It’s Like to Be a Thing (Minnesota, 2012).
