Regaining Focus by Enlarging Context – Part 2
How social media applications are designed to distract.
Johann Hari, in his book Stolen Focus: Why You Can't Pay Attention – and How to Think Deeply Again, describes how the very architecture of social media apps is designed to divide our attention and strip us of context. As he writes:
[Nathan Spreng, a neuroscientist at McGill University] is constantly trying to figure out how to get his phone to stop sending him notifications for things he doesn't want to know. All this frenetic digital interruption is "pulling our attention away from our thoughts" …

At Google, [Tristan Harris, a former Google design ethicist] quickly learned, success was measured, in the main, by what was called "engagement"—which was defined as minutes and hours of eyeballs on the product. More engagement was good; less engagement was bad. This was for a simple reason. The longer you make people look at their phones, the more advertising they see—and therefore the more money Google gets. Tristan's co-workers were decent people, struggling with their own tech distractions—but the incentives seemed to lead only one way: you should always design products that "engage" the maximum number of people, because engagement equals more dollars, and disengagement equals fewer dollars. With each month that passed, Tristan became more startled by the casualness with which the attention of a billion people was being corroded at Google and the other Big Tech companies. One day he would hear an engineer excitedly saying: "Why don't we make it buzz your phone every time we get an email?" Everyone would be thrilled—and a few weeks later, all over the world, phones began to buzz in pockets, and more people found themselves looking at Gmail more times a day. The engineers were always looking for new ways to suck eyeballs onto their program and keep them there …

[Tristan] suggested [to Google] some modest changes as a place to start. Instead of notifying someone every time they have a new email, he suggested, we could notify them once a day, in a batch—so it'd be like getting a newspaper in the morning, instead of constantly following the rolling news. Every time we prompt somebody to click over to a new photo their friend has posted, we could warn them—on the same screen—that the average person who clicks on a photo is pulled away for twenty minutes before they get back to their task. We could tell them: You think it'll only take a second, but it won't. He suggested giving users a chance to pause every time they click to do something potentially seriously distracting, to check: Are you sure you want to do this? Do you know how much time it will take from you? "Humans make different decisions when we pause and consider," he said. He was trying to give his colleagues a sense of the weight of the decisions they made every day: "We shape more than eleven billion interruptions to people's lives every day. This is nuts!" The people sitting around you in the Googleplex, he explained, control more than 50 percent of all the notifications on all the phones in the whole world. We are "creating an arms race that causes companies to find more reasons to steal people's time," and it "destroys our common silence and ability to think." He asked: "Do we really know what we're doing to people?" …

As part of this work, [Aza Raskin] designed something that distinctly changed how the web works. It's called "infinite scroll." Older readers will remember that it used to be that the internet was divided into pages, and when you got to the bottom of one page, you had to decide to click a button to get to the next page. It was an active choice. It gave you a moment to pause and ask: Do I want to carry on looking at this? Aza designed the code that means you don't have to ask that question anymore. Imagine you open Facebook. It downloads a chunk of status updates for you to read through. You scroll down through it, flicking your finger—and when you get to the bottom, it will automatically load another chunk for you to flick through. When you get to the bottom of that, it will automatically load another chunk, and another, and another, forever. You can never exhaust it. It will scroll infinitely. Aza was proud of the design. "At the outset, it looks like a really good invention," he told me. He believed he was making life easier for everyone. He had been taught that increased speed and efficiency of access were always advances. His invention quickly spread all over the internet. Today, all social media and lots of other sites use a version of infinite scroll. But then Aza watched as the people around him changed. They seemed to be unable to pull themselves away from their devices, flicking through and through and through, thanks in part to the code he had designed. He found himself infinitely scrolling through what he often realized afterward was crap, and he wondered if he was making good use of his life …

The logic of the underlying system was being laid bare for Aza. Silicon Valley sells itself by articulating "a big, lofty goal—connecting everyone in the world, or whatever it is. But when you're actually doing the day-to-day work, it's about increasing user numbers." What you are selling is your ability to grab and hold attention …

One day, James Williams—the former Google strategist I met—addressed an audience of hundreds of leading tech designers and asked them a simple question: "How many of you want to live in the world you are designing?" There was a silence in the room. People looked around them. Nobody put up their hand.
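To make the mechanism Aza describes concrete, here is a minimal sketch of how infinite scroll is typically wired into a web page. The /feed endpoint, the data shapes, and the element ids below are hypothetical, chosen for illustration; only fetch and IntersectionObserver are standard browser APIs.

```typescript
// Minimal infinite-scroll sketch. The key point: loading the next chunk
// requires no decision from the user -- reaching the bottom *is* the request.
// The /feed endpoint and its response shape are assumptions for this sketch.

interface Post { id: string; html: string; }
interface FeedPage { posts: Post[]; nextCursor: string | null; }

let cursor: string | null = "";
const feed = document.getElementById("feed")!;       // container for posts
const sentinel = document.getElementById("sentinel")!; // empty div at the bottom

async function loadNextChunk(): Promise<void> {
  const cur = cursor;
  if (cur === null) return; // feed exhausted (in practice, it never is)
  const res = await fetch(`/feed?cursor=${encodeURIComponent(cur)}`);
  const page: FeedPage = await res.json();
  for (const post of page.posts) {
    const el = document.createElement("article");
    el.innerHTML = post.html;
    feed.appendChild(el);
  }
  cursor = page.nextCursor;
}

// Fire whenever the bottom-of-feed sentinel scrolls into view.
new IntersectionObserver((entries) => {
  if (entries.some((e) => e.isIntersecting)) void loadNextChunk();
}).observe(sentinel);
```

The pagination button that older readers remember sat exactly where the observer now fires: the shift from "click to load the next page" to "load automatically on scroll" is precisely the removed pause Hari describes.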
Humans are social animals, but people today, especially younger ones, are increasingly becoming social media animals, cut off more and more from direct human contact. As Hari writes:
Tristan said to me that if you want to understand the deeper problems in the way our tech currently works—and why it is undermining our attention—a good place to start is with what seems like a simple question. Imagine you are visiting New York and you want to know which of your friends are around in the city so you can hang out with them. You turn to Facebook. The site will alert you about lots of things—a friend's birthday, a photo you've been tagged in, a terrorist attack—but it won't alert you to the physical proximity of somebody you might want to see in the real world. There's no button that says "I want to meet up—who's nearby and free?" This isn't technologically tricky. It would be really easy for Facebook to be designed so that when you opened it, it told you which of your friends were close by and which of them would like to meet for a drink or dinner that week. The coding to do that is simple; Tristan and Aza and their friends could probably write it in a day. And it would be hugely popular. Ask any Facebook user: Would you like Facebook to physically connect you to your friends more, instead of keeping you endlessly scrolling? So—it's an easy tweak, and users would love it. Why doesn't it happen? Why won't the market provide it? To understand why, Tristan and his colleagues explained to me, you need to step back and understand more about the business model of Facebook and the other social-media companies. If you follow the trail from this simple question, you will see the root of many of the problems we are facing. Facebook makes more money for every extra second you are staring through a screen at their site, and they lose money every time you put the screen down …

Once you understand all this, you can see why there is no button that suggests you meet up with your friends and family away from the screen. Instead of getting us to maximize screen time, that would get us to maximize face-to-face time. Tristan said: "If people used Facebook just to quickly get on, so they could find the amazing thing to do with their friends that night, and get off, how would that [affect] Facebook's stock price? The average amount of time people spend on Facebook today is something like fifty minutes a day…. [But] if Facebook acted that way, people would spend barely a few minutes on there per day, in a much more fulfilling way." Facebook's share price would collapse; it would be, for them, a catastrophe. This is why these sites are designed to be maximally distracting. They need to distract us, to make more money. "Their business model," he says, "is screen time, not life time."
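Hari's claim that "the coding to do that is simple" is plausible: as a sketch, the feature is little more than a proximity-and-availability filter over a friends list. Everything below is assumed for illustration; no real Facebook API exposes this data.

```typescript
// Hypothetical sketch of the missing "who's nearby and free?" button.
// All types and fields here are assumptions, not any real social-network API.

interface Friend {
  name: string;
  lat: number;
  lon: number;
  freeForMeetup: boolean; // an opt-in flag the user would set
}

// Great-circle distance in km between two points (haversine formula).
function distanceKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 6371 * 2 * Math.asin(Math.sqrt(a));
}

// Everyone within radiusKm who has said they'd like to meet up.
function nearbyAndFree(
  me: { lat: number; lon: number },
  friends: Friend[],
  radiusKm = 10
): Friend[] {
  return friends.filter(
    (f) => f.freeForMeetup && distanceKm(me.lat, me.lon, f.lat, f.lon) <= radiusKm
  );
}
```

The obstacle is not the filter; it is that every successful meetup this button produced would subtract from the "minutes and hours of eyeballs on the product" that the business model rewards.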
And what you see on social media screens is not arranged chronologically, grounded in wider context, or even presented at random. It is chosen by algorithms engineered to steer your attention toward whatever they predict will keep you hooked to your screen:
[W]hen Tristan and Aza said that these sites are designed to be as distracting as possible, I still didn't really understand how. It seemed like a big claim. To grasp it, I had to first learn something else embarrassingly basic. When you open your Facebook feed, you see a whir of things for you to look at—your friends, their photos, some news stories. When I first joined Facebook back in 2008, I naively thought that these things appeared simply in the order in which my friends had posted them. I'm seeing my friend Rob's photo because he just put it up; then my auntie's status update comes next because she posted it before him. Or maybe, I thought, they were selected randomly. In fact, I learned over the years—as we all became more informed about these questions—that what you see is selected for you according to an algorithm. When Facebook (and all the others) decide what you see in your news feed, there are many thousands of things they could show you. So they have written a piece of code to automatically decide what you will see. There are all sorts of algorithms they could use—ways they could decide what you should see, and the order in which you should see them. They could have an algorithm designed to show you things that make you feel happy. They could have an algorithm designed to show you things that make you feel sad. They could have an algorithm to show you things that your friends are talking about most. The list of potential algorithms is long. The algorithm they actually use varies all the time, but it has one key driving principle that is consistent. It shows you things that will keep you looking at your screen. That's it …

Facebook, in 2015, filed a patent for technology that will be able to detect your emotions from the cameras on your laptop and phone … Aza warns, "our supercomputers are going to test their way to finding all our vulnerabilities, without anyone ever stopping to ask—is that right? It'll feel to us a little bit like we're still making our own decisions," but it will be "a direct attack against agency and free will." …

[T]hese sites learn—as Tristan put it—how to "frack" you. These sites get to know what makes you tick, in very specific ways—they learn what you like to look at, what excites you, what angers you, what enrages you. They learn your personal triggers—what, specifically, will distract you …

Remember: the more time you look, the more money they make. So the algorithm is always weighted toward figuring out what will keep you looking, and pumping more and more of that onto your screen to keep you from putting down your phone. It is designed to distract. But, Tristan was learning, that leads—quite unexpectedly, and without anyone intending it—to some other changes, which have turned out to be incredibly consequential. Imagine two Facebook feeds. One is full of updates, news, and videos that make you feel calm and happy. The other is full of updates, news, and videos that make you feel angry and outraged. Which one does the algorithm select? The algorithm is neutral about the question of whether it wants you to be calm or angry. That's not its concern. It only cares about one thing: Will you keep scrolling? Unfortunately, there's a quirk of human behavior. On average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm. You will stare at a car crash longer than you will stare at a person handing out flowers by the side of the road, even though the flowers will give you a lot more pleasure than the mangled bodies in a crash. This has been known about in psychology for years and is based on a broad body of evidence. It's called "negativity bias." There is growing evidence that this natural human quirk has a huge effect online. On YouTube, what are the words that you should put into the title of your video, if you want to get picked up by the algorithm? They are—according to the best site monitoring YouTube trends—words such as "hates," "obliterates," "slams," "destroys." A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate will go up by 20 percent on average, and the words that will increase your retweet rate most are "attack," "bad," and "blame." A study by the Pew Research Center found that if you fill your Facebook posts with "indignant disagreement," you'll double your likes and shares. So an algorithm that prioritizes keeping you glued to the screen will—unintentionally but inevitably—prioritize outraging and angering you. If it's more enraging, it's more engaging. If enough people are spending enough of their time being angered, that starts to change the culture. As Tristan told me, it "turns hate into a habit." You can see this seeping into the bones of our society …

[B]ecause of the way the [social media] algorithms work, these sites make you angry a lot of the time. Scientists have been proving in experiments for years that anger itself screws with your ability to pay attention. They have discovered that if I make you angry, you will pay less attention to the quality of arguments around you, and you will show "decreased depth of processing"—that is, you will think in a shallower, less attentive way. We've all had that feeling—you start prickling with rage, and your ability to properly listen goes out the window. The business models of these sites are jacking up our anger every day. Remember the words their algorithms promote—attack, bad, blame. These sites make you feel that you are in an environment full of anger and hostility, so you become more vigilant—a situation where more of your attention shifts to searching for dangers, and less and less is available for slower forms of focus like reading a book or playing with your kids.
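What these passages describe is, at bottom, a scoring loop: estimate how long each candidate post will hold a user's gaze, sort by that estimate, and serve the top of the list. The toy ranker below is a deliberately simplified stand-in, not anyone's production system; real rankers learn engagement predictions from behavioral data, but the hard-coded 20 percent lift per outrage word here mirrors the per-word retweet effect in the NYU study Hari cites.

```typescript
// Toy engagement ranker. Everything here is an illustrative assumption:
// real feed-ranking models are learned, but they are trained toward the
// same target this toy hard-codes -- predicted time on screen.

interface Candidate {
  id: string;
  text: string;
  baseDwellSeconds: number; // predicted watch/read time, pre-adjustment
}

const OUTRAGE_WORDS = ["attack", "bad", "blame", "hates", "slams", "destroys"];

function predictedDwell(post: Candidate): number {
  const words = post.text.toLowerCase().split(/\W+/);
  const outrageCount = words.filter((w) => OUTRAGE_WORDS.includes(w)).length;
  // Negativity bias as a multiplier: each outrage word compounds a 20% lift,
  // echoing the per-word effect Hari cites. Two such words: 1.2^2 = 1.44x.
  return post.baseDwellSeconds * Math.pow(1.2, outrageCount);
}

// The algorithm is "neutral" about calm vs. angry -- it only sorts by
// predicted screen time. Outrage wins as a side effect of the multiplier.
function rankFeed(candidates: Candidate[]): Candidate[] {
  return [...candidates].sort((a, b) => predictedDwell(b) - predictedDwell(a));
}

const feed = rankFeed([
  { id: "flowers", text: "Neighbor handing out flowers today", baseDwellSeconds: 8 },
  { id: "crash", text: "Driver slams into barrier, bad scene, who's to blame?", baseDwellSeconds: 8 },
]);
// feed[0] is "crash": identical base interest, but three outrage words
// give it 8 * 1.2^3 ≈ 13.8 predicted seconds vs. 8 for "flowers".
```

Note that rankFeed never mentions anger or calm; the tilt toward outrage falls out of maximizing a single number, which is exactly the "neutral algorithm" point in the quote above.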
Speaking of kids, the next essay in this series will explore how modern internet media has affected children, and what might be done about it.