The Roblox app in the App Store displayed on a smartphone screen, with the Roblox logo in the background. (Thiago Prudencio / SOPA Images/Sipa USA)
Notifications incessantly ping our mobile and desktop screens. Algorithmic social media feeds consume vast quantities of our time. Simple online tasks require users to traverse minefields of unfavorable default options, all of which need to be laboriously unclicked. To address these daily annoyances of digital life, some might suggest updating smartphone notification settings, practicing better personal discipline, and doing less business online—in short, emphasizing personal responsibility and digital hygiene. But digital hygiene falls far short of systematically addressing the way in which technology is capturing an increasingly large share of our limited stock of attention.
Software does not get bored, tired, or overwhelmed, but we do—and when we do, software is often designed to prey on us. Without recognizing, and potentially regulating, engagement maximization in technology, we may increasingly lose de facto ownership of our own attention through seemingly minute but pervasive digital incursions. In a white paper recently published by UC Berkeley’s Center for Long-Term Cybersecurity, I propose a two-part solution to tech’s attention problem. First, we need to measure the attention costs imposed by digital products, so as to better understand just how much tech’s engagement maximization practices are costing us as we navigate ubiquitous digital infrastructures. Second, we need to develop measures to reduce attention costs when they are unacceptably high.
Maximizing for engagement, maximizing for attention
Digital products consume vast quantities of our attention, often for profit, as part of a larger practice the scholar Tim Wu fittingly christened “attention harvesting.” Digital distractions may summon us by our names or private interests, as with personalized advertising and behavioral targeting. Digital interfaces may trick us into taking actions we don’t intend, through the use of so-called “dark patterns.” Many, or even most, consumer-facing digital technologies incorporate design elements intended to maximally engross or distract us—a practice known as “engagement maximization.” Such engagement is defined with respect to observable metadata: how many times we click on content, how much time we spend interacting with that content, and how often we come back for more. From such measures has grown a well-developed science of making digital products addictive.
But maximizing our engagement with digital products doesn’t necessarily increase consumer welfare and may even harm us. Researchers have shown correlations between smartphone addiction and low workplace productivity, as well as a link between depression and use of social media networks such as Facebook, a master of engagement maximization. Digital products provide highly engineered distractions that may even undermine the physical safety of children whose caregivers are digitally distracted. These harms may disproportionately affect lower-income communities, since affluent people have already begun taking steps to protect themselves from digital engagement. Of course, examples of harms do not prove that the net result of using a technology is negative, but evidence of such harms calls for more careful study. Internal company documents made public by Facebook whistleblower Frances Haugen illustrate that the potential harms of large social-media platforms remain understudied, in large part due to researchers’ lack of access to quality data.
Of course, digital products can and do enrich our lives. They have been essential to maintaining our social lives, work, and education during the pandemic. But the increased use …