
Shedding Light in the Darkness

The Rise of the Attention Economy

[Image: a group of kids on their phones]

“Everyone is distracted – all of the time,” says Justin Rosenstein, the Facebook engineer who created the “like” button. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” says former Google strategist James Williams.

The Guardian has posted a brilliant analysis of the dark side of the internet explosion. Here is an extract:

Justin Rosenstein had tweaked his laptop’s operating system to block
Reddit, banned himself from Snapchat, which he compares to heroin, and
imposed limits on his use of Facebook. But even that wasn’t enough. In
August, the 34-year-old tech executive took a more radical step to
restrict his use of social media and other addictive technologies.

Rosenstein purchased a new iPhone and instructed his assistant to set
up a parental-control feature to prevent him from downloading any
apps.

He was particularly aware of the allure of Facebook “likes”, which he
describes as “bright dings of pseudo-pleasure” that can be as hollow
as they are seductive. And Rosenstein should know: he was the Facebook
engineer who created the “like” button in the first place.

A decade after he stayed up all night coding a prototype of what was
then called an “awesome” button, Rosenstein belongs to a small but
growing band of Silicon Valley heretics who complain about the rise of
the so-called “attention economy”: an internet shaped around the
demands of an advertising economy.

“It is very common,” Rosenstein says, “for humans to develop things
with the best of intentions and for them to have unintended, negative
consequences.”

There is growing concern that as well as addicting users, technology
is contributing toward so-called “continuous partial attention”,
severely limiting people’s ability to focus, and possibly lowering IQ.
One recent study showed that the mere presence of smartphones damages
cognitive capacity – even when the device is turned off. “Everyone is
distracted,” Rosenstein says. “All of the time.”

Drawing a straight line between addiction to social media and
political earthquakes like Brexit and the rise of Donald Trump, these
Silicon Valley insiders contend that digital forces have completely
upended the political system and, left unchecked, could even render
democracy as we know it obsolete.

In 2007, Rosenstein was one of a small group of Facebook employees who
decided to create a path of least resistance – a single click – to
“send little bits of positivity” across the platform. Facebook’s
“like” feature was, Rosenstein says, “wildly” successful: engagement
soared as people enjoyed the short-term boost they got from giving or
receiving social affirmation, while Facebook harvested valuable data
about the preferences of users that could be sold to advertisers.

It was Rosenstein’s colleague, Leah Pearlman, then a product manager
at Facebook and on the team that created the Facebook “like”, who
announced the feature in a 2009 blogpost. Now 35 and an illustrator,
Pearlman confirmed via email that she, too, has grown disaffected with
Facebook “likes” and other addictive feedback loops. She has installed
a web browser plug-in to eradicate her Facebook news feed, and hired a
social media manager to monitor her Facebook page so that she doesn’t
have to.

“One reason I think it is particularly important for us to talk about
this now is that we may be the last generation that can remember life
before,” Rosenstein says. It may or may not be relevant that
Rosenstein, Pearlman and most of the tech insiders questioning today’s
attention economy are in their 30s, members of the last generation
that can remember a world in which telephones were plugged into walls.

It is revealing that many of these younger technologists are weaning
themselves off their own products, sending their children to elite
Silicon Valley schools where iPhones, iPads and even laptops are
banned.

One morning in April this year, designers, programmers and tech
entrepreneurs from across the world gathered at a conference centre on
the shore of the San Francisco Bay. They had each paid up to $1,700 to
learn how to manipulate people into habitual use of their products, on
a course curated by conference organiser Nir Eyal.

Eyal, 39, the author of Hooked: How to Build Habit-Forming Products,
has spent several years consulting for the tech industry, teaching
techniques he developed by closely studying how the Silicon Valley
giants operate.

“The technologies we use have turned into compulsions, if not
full-fledged addictions,” Eyal writes. “It’s the impulse to check a
message notification. It’s the pull to visit YouTube, Facebook, or
Twitter for just a few minutes, only to find yourself still tapping
and scrolling an hour later.” None of this is an accident, he writes.
It is all “just as their designers intended”.

Tristan Harris, a 33-year-old former Google employee turned vocal
critic of the tech industry, agrees. “All of us are jacked into
this system,” he says. “All of our minds can be hijacked. Our choices
are not as free as we think they are.”

Harris, who has been branded “the closest thing Silicon Valley has to
a conscience”, insists that billions of people have little choice over
whether they use these now ubiquitous technologies, and are largely
unaware of the invisible ways in which a small number of people in
Silicon Valley are shaping their lives.

“A handful of people, working at a handful of technology companies,
through their choices will steer what a billion people are thinking
today,” he said at a recent TED talk in Vancouver.

“I don’t know a more urgent problem than this,” Harris says. “It’s
changing our democracy, and it’s changing our ability to have the
conversations and relationships that we want with each other.” Harris
went public – giving talks, writing papers, meeting lawmakers and
campaigning for reform – after three years struggling to effect change
inside Google’s Mountain View headquarters.

An internal Facebook report leaked this year revealed that the
company can identify when teens feel “insecure”, “worthless” and “need
a confidence boost”. Such granular information, Harris adds, is “a
perfect model of what buttons you can push in a particular person”.

Harris believes that tech companies never deliberately set out to make
their products addictive. They were responding to the incentives of an
advertising economy, experimenting with techniques that might capture
people’s attention, even stumbling across highly effective design by
accident.

A friend at Facebook told Harris that designers initially decided the
notification icon, which alerts people to new activity such as “friend
requests” or “likes”, should be blue. It fit Facebook’s style and, the
thinking went, would appear “subtle and innocuous”. “But no one used
it,” Harris says. “Then they switched it to red and of course everyone
used it.”

That red icon is now everywhere. When smartphone users glance at
their phones, dozens or hundreds of times a day, they are confronted
with small red dots beside their apps, pleading to be tapped. “Red is
a trigger colour,” Harris says. “That’s why it is used as an alarm
signal.”

The most seductive design, Harris explains, exploits the same
psychological susceptibility that makes gambling so compulsive:
variable rewards. When we tap those apps with red icons, we don’t know
whether we’ll discover an interesting email, an avalanche of “likes”,
or nothing at all. It is the possibility of disappointment that makes
it so compulsive.
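
To make the “variable rewards” idea concrete, here is a small illustrative sketch of my own (not code from the article, and the probabilities are made up): each check of an app is modelled as a draw from an unpredictable payoff, the same intermittent-reinforcement schedule that keeps slot-machine players pulling the lever.

```python
import random

def check_app(rng: random.Random) -> str:
    """Simulate one check of an app built on a variable-reward schedule.

    The probabilities below are invented for illustration; the point is
    that the payoff is unpredictable, which is what makes checking
    compulsive.
    """
    roll = rng.random()
    if roll < 0.60:
        return "nothing at all"          # most checks come up empty
    elif roll < 0.90:
        return "an interesting email"    # small, occasional reward
    else:
        return "an avalanche of likes"   # rare jackpot

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so the demo is reproducible
    for i in range(10):
        print(f"check {i + 1}: {check_app(rng)}")
```

A fixed, predictable reward would not have the same pull; as Harris explains, it is the uncertainty that keeps people tapping.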

The designer who created the pull-to-refresh mechanism, first used to
update Twitter feeds, is Loren Brichter, widely admired in the
app-building community for his sleek and intuitive designs. Now 32,
Brichter says he never intended the design to be addictive.

“I’ve spent many hours and weeks and months and years thinking about
whether anything I’ve done has made a net positive impact on society
or humanity at all,” he says.

“Smartphones are useful tools,” he says. “But they’re addictive.
Pull-to-refresh is addictive. Twitter is addictive. These are not good
things. When I was working on them, it was not something I was mature
enough to think about. I’m not saying I’m mature now, but I’m a little
bit more mature, and I regret the downsides.”

Roger McNamee, a venture capitalist who benefited from hugely
profitable investments in Google and Facebook, has grown disenchanted
with both companies, arguing that their early missions have been
distorted by the fortunes they have been able to earn through
advertising.

He identifies the advent of the smartphone as a turning point, raising
the stakes in an arms race for people’s attention. “Facebook and
Google assert with merit that they are giving users what they want,”
McNamee says. “The same can be said about tobacco companies and drug
dealers.”

“The people who run Facebook and Google are good people, whose
well-intentioned strategies have led to horrific unintended
consequences,” he says. “The problem is that there is nothing the
companies can do to address the harm unless they abandon their current
advertising models.”

James Williams does not believe talk of dystopia is far-fetched. The
ex-Google strategist, who built the metrics system for the company’s
global search advertising business, has had a front-row view of an
industry he describes as the “largest, most standardised and most
centralised form of attentional control in human history”.

The same forces that led tech firms to hook users with design tricks,
he says, also encourage those companies to depict the world in a way
that makes for compulsive, irresistible viewing. “The attention
economy incentivises the design of technologies that grab our
attention,” he says. “In so doing, it privileges our impulses over our
intentions.”

That means privileging what is sensational over what is nuanced,
appealing to emotion, anger and outrage. The news media is
increasingly working in service to tech companies, Williams adds, and
must play by the rules of the attention economy to “sensationalise,
bait and entertain in order to survive”.

It is not just shady or bad actors who exploit the internet to change
public opinion, Williams argues. The attention economy itself is set
up to promote a phenomenon like Trump, who is masterly at grabbing and
retaining the attention of supporters and critics alike, often by
exploiting or creating outrage.

Williams was making this case before the president was elected. In a
blog published a month before the US election, Williams sounded the
alarm bell on an issue he argued was a “far more consequential
question” than whether Trump reached the White House. The reality TV
star’s campaign, he said, had heralded a watershed in which “the new,
digitally supercharged dynamics of the attention economy have finally
crossed a threshold and become manifest in the political realm”.

“We’ve habituated ourselves into a perpetual cognitive style of
outrage, by internalising the dynamics of the medium,” he says.

“The dynamics of the attention economy are structurally set up to
undermine the human will,” he says. “If politics is an expression of
our human will, on individual and collective levels, then the
attention economy is directly undermining the assumptions that
democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram
and Snapchat are gradually chipping away at our ability to control our
own minds, could there come a point, I ask, at which democracy no
longer functions?

“Will we be able to recognise it, if and when it happens?” Williams
replies. “And if we can’t, then how do we know it hasn’t happened
already?”
