Media and Developing Minds

How to Stop Technology From Destabilizing the World: The Digital Assault on the Human Mind

October 17, 2018

 
Tristan Harris, PhD

Co-Founder and CEO, Center for Humane Technology

 

Overview

Self-Knowledge and Persuasion.  We don’t know how our own minds work.  The practice of magic suggests why that matters.  “… [W]hy is it that a magician can do something and it works on your mind every single time?  It’s because they know more about your mind in that particular niche than you do.”  More broadly, in any competitive interaction, whoever knows more about the other’s mind and cognitive processes wins.  That asymmetry is the rationale for embedding the principles of persuasion into how technology works.

The Race for Attention.  Human attention is a finite resource, and many activities and entities compete for it.  Our experiences with digitally induced distraction illustrate the phenomenon: we go online to watch a short video sent by a friend, then find ourselves two hours deep in other videos suggested by YouTube.  “… [W]hat’s really going on is that in that moment you didn’t realize that you had a supercomputer pointed at your brain.”

Audiences and Algorithms.  This process repeats across numerous content providers and their vast audiences.  Google has 1.9 billion daily users.  Facebook has 2.2 billion users.  Apple estimates that people check their phones 80 times a day.  These measures of engagement suggest the extent of technology’s psychological surface area.  Content providers acquire and exploit it so effectively because their ability to gather, analyze, and deploy information about our tastes and tendencies gives them an insurmountable advantage over sovereign human choice.  The fact that 70% of YouTube’s traffic is driven by its recommendation algorithms reflects that advantage.

Processes that Defy Control.  The engineers behind the algorithms cannot control the ultimate consequences of their creations: the algorithms act in languages and in cultural and political contexts that the engineers do not understand, and the details of what the algorithms do are beyond the ability of their human creators to comprehend.  One thing that does seem clear is that algorithms designed to capture attention introduce into the world an intrinsic bias toward sensationalism – a “race to the bottom of the brain stem”.  “[N]o matter where you land, if YouTube wants you to watch more it always steers you, it tilts the playing field toward the radicalizing stuff.”  This leads to the proliferation of paranoia and conspiracy.  It explains why a new mother looking for peer support on Facebook gets referred to anti-vaccine groups, and from there to other groups dedicated to various conspiracies.  This threads paranoia into many lives, which in turn spills out into the social fabric – not an intentional consequence of the social networks’ design, but a predictable one.  It explains how the United Nations could conclude that Facebook played a determining role in the Burmese genocide of the Rohingya people.  And when this vulnerability is intentionally manipulated, you get results like the Russian propaganda that led the parents of Syrian humanitarian workers to suspect that their own sons might be terrorists.  The potential effects on young audiences are particularly concerning: “[L]et’s have algorithms actually figure [out] what new children’s videos we should generate.  Let’s actually algorithmically generate titles of videos and then we’ll automatically game the recommendation systems.”  All of our cognition is limited, but in children, self-worth, identity development, and the capacity for discernment are limited, too.
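A minimal sketch may make that structural tilt concrete.  The items, scores, and ranking function below are invented for illustration and do not represent any platform’s actual system; the point is only that when a recommender’s sole objective is predicted engagement, the most arousing content wins by construction:

```python
# Toy illustration of an engagement-maximizing recommender.
# All titles and scores are invented; no real platform's ranking
# logic is shown here.

candidates = [
    # (title, predicted_watch_minutes)
    ("Calm explainer: how vaccines work",       4.2),
    ("DEBUNKED?! What doctors WON'T tell you", 11.7),
    ("Local news recap",                        3.1),
    ("The conspiracy behind it all (part 9)",  14.5),
]

def rank_by_engagement(items):
    """Rank purely by predicted watch time. The single objective
    stands in for 'engagement'; there is no term for accuracy,
    wellbeing, or the viewer's own intentions."""
    return sorted(items, key=lambda item: item[1], reverse=True)

for title, minutes in rank_by_engagement(candidates):
    print(f"{minutes:5.1f} min  {title}")
```

Run as written, the sensational items surface first on every refresh.  The “tilt” is a property of the objective function, not of any individual recommendation, which is why no engineer reviewing one video at a time would see it.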

Seeking Solutions.  “Going grayscale” – removing color cues from our screens – might blunt social media’s hold on human behavior.  There’s no scientific evidence yet to support that idea, but it extends what we know about the role color plays in attracting and holding attention.  (For example, “… red is the color reward that makes it a little bit more addictive.”)  That won’t reach the broader, deeper problem, though; something more systematic is required.  We need fundamental changes to how social media are designed.

Just as astronauts’ photos of Earth from space launched the environmental movement, we must look more closely and deeply at ourselves to find a solution to the uncontrolled influence of social media.  “[T]he next phase of evolution is actually taking an honest look at the vulnerabilities of human nature – a manipulator’s view of a human animal.”

“[O]ne example is, we have to stop screwing with kids.  We can just say we need to do that.  We don’t need to study it for 20 years.”  That does not mean insulating children from technology; FaceTime, for example, promotes a kind of face-to-face contact.  “[I]t’s not that social media is bad.  It’s that there is a certain form that is guided by this model of maximizing attention, and that is the problem.”

Change is Possible.  Change may be possible because no one wants the dystopian alternative.  That is even reflected in the reluctance of some engineers to join the “engagement-maximizing machines” of the social media industry.  One step toward effecting change is the compilation of a “Ledger of Harms” – an inventory of the external costs generated when companies privately profit from people’s attention.  (It’s being created at http://ledger.humanetech.com.)  Other forms of public awareness may help, too: a Common Sense Media report shows that 72% of teens now believe that social media are manipulating them.  Developments at companies such as Facebook, Apple, and Google suggest that public concerns are starting to influence corporate behavior as well.  “All of this stuff is happening because we’re realizing that there is a massive problem.  You actually have political power to do something about it.”

Questions and Answers

Audience Question: Social media business models separate “users” from “customers” (those who buy access to user data).  What about the alternative business model proposed by Jaron Lanier, in which users are the customers: the best data providers among them are compensated for the data they contribute, while other users support the service with fees?

Answer: That’s a longer conversation.  Government policies and legislation may need to change to reveal to people how much they are worth as the product in social media transactions.  That information would induce cultural change and make the objectionable business model more expensive to operate.

Audience Question: What are the challenges encountered in trying to reach out to engineers and awaken companies to these issues?

Answer: Inside companies, people’s salaries depend on their not seeing these issues.  Sen. Mark Warner, Sen. Richard Blumenthal, and others convened a hearing in November 2017 that surfaced the real number of people reached by Russian propaganda (126 million Americans).  This illustrates the importance of political pressure from outside the industry for opening minds within it.
