Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
There's quite a lot of ads on our
0:02
podcasts these days. We need them to pay
0:04
for our team. But if you find them
0:06
a little distracting, annoying maybe, there's a simple
0:08
solution: back us on Patreon. That means
0:10
no ads and uninterrupted listening. Follow
0:12
the link in the show notes to find out more.
0:14
If you have
0:16
kids or pets, you know
0:18
stains and odors in your
0:20
carpet and upholstery are inevitable.
0:23
But the experts at Chem-Dry
0:25
can help. Chem-Dry removes odors
0:27
and stubborn stains by sending
0:29
millions of carbonating bubbles deep
0:31
within your carpet. Chem-Dry lifts
0:34
dirt, urine, and stains to
0:36
the surface to then be
0:38
extracted away, giving you a
0:40
cleaner and healthier home. Call
0:42
1-800-CHEM-DRY or visit chemdry.com today.
0:44
This episode
0:47
is brought to you
0:49
by Enterprise Mobility. From
0:51
fleet management to flexible
0:53
truck rentals to technology
0:55
solutions, Enterprise Mobility helps
0:57
businesses find the right
0:59
mobility solutions so they
1:01
can find new opportunities.
1:03
Because if your business
1:05
is on the road
1:07
to success. Enterprise Mobility:
1:09
moving you moves the
1:11
world. Find your road at
1:13
enterprisemobility.com. You may get a little
1:16
excited when you shop
1:18
at Burlington. They have
1:20
a whole new Burlington!
1:22
I can buy two!
1:24
I'm saving so much!
1:27
Burlington saves you up
1:29
to 60% off other
1:31
retailers' prices every day.
1:33
Will it be the low
1:35
prices or the great brands?
1:38
You'll love the deals. You'll love Burlington. I told you so!
1:57
Maybe you liked something today,
2:00
as in click the little thumbs up
2:02
or a heart icon on a social
2:04
media app. You might have wanted to
2:06
show your approval for a news story
2:08
or a political post or dole out
2:10
some little dopamine hit to a friend
2:12
or you might have just thought that
2:14
a photo or a video was funny.
2:16
But in doing so you've added in
2:18
the tiniest way to a vast and
2:20
growing data set that's changed the way
2:22
economies, political parties, technology and entire countries
2:24
operate and may be rewiring human psychology
2:26
at a fundamental level. The authors of
2:28
the new book, Like: The Button That
2:30
Changed the World, estimate that a thumbs-up
2:33
like button, or one of its many
2:35
progeny, like a smiley or a heart,
2:37
gets clicked seven billion times a day.
2:39
That's a whole lot of like, and
2:41
as Fast Company magazine puts it, in
2:43
terms of sheer impact, the like button
2:45
was one of the most successful pieces
2:47
of code ever shipped. Where did the
2:49
commoditisation of like come from? What is
2:51
it doing to our world and to
2:53
ourselves? And where is like leading us?
2:55
Joining us today is a man who
2:57
co-wrote that book with his friend, a
2:59
think tank marketing guru called Martin Reeves.
3:01
Bob Goodson was part of the founding
3:03
engineering team at Yelp, the early recommendation
3:05
site that's still going, and which I
3:08
still use now and again. He's a
3:10
Silicon Valley OG. And he has a
3:12
very special relationship with the like button.
3:14
Bob Goodson, welcome to The Bunker. Thanks,
3:16
Andy. So on page two of the
3:18
book there's a very crude sketch by
3:20
you from the early 2000s of a
3:22
thumbs up and a thumbs down. Tell
3:24
us how you personally with no aid
3:26
from anybody else created the like button
3:28
without any influence from the outside world.
3:30
Yes of course these things always evolve
3:32
and what we were trying to do
3:35
at Yelp as you say this is
3:37
a local review site popular in the US
3:39
now. In the early days, we were
3:41
trying to help create incentives for people
3:43
to write content, and
3:45
so we had a feature called Send
3:47
a Compliment, which took a few clicks, and
3:49
back then that was a meaningful amount
3:51
of time because bandwidth was just much
3:53
slower and so we we were looking
3:55
for a way to allow people to
3:57
interact in the most fluid way possible.
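For illustration only, and not Yelp's actual code: the kind of one-click, no-page-reload interaction being described can be sketched in a few lines of TypeScript. The /api/like endpoint and its payload here are invented for the example.

```typescript
// Hypothetical one-click "like": a single tap sends the vote in the
// background and updates the UI immediately, with no page reload.
async function onLikeClicked(itemId: string, button: HTMLButtonElement): Promise<void> {
  button.disabled = true; // guard against double-clicks while the request is in flight
  try {
    const response = await fetch("/api/like", { // endpoint name is an assumption
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ itemId }),
    });
    if (!response.ok) throw new Error(`like failed: ${response.status}`);
    button.textContent = "Liked"; // minimal instant feedback
  } catch (err) {
    button.disabled = false; // allow a retry if the request fails
    console.error(err);
  }
}
```

The point of the sketch is interaction cost: a single click, handled in the background, replaces the multi-click Send a Compliment flow just described.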
4:00
And our co-founder and CTO said to
4:02
me one day, could you come up
4:04
with a design? I'm going to look at
4:06
the JavaScript and see if we can
4:08
make this as smooth as possible. And
4:10
so I spent the afternoon just looking
4:12
around at other sites and thinking about
4:15
what it might look like. And there's
4:17
a little drawing on page two of
4:19
like the very first like button. That
4:21
was a bit of
4:23
a joke question, because a key point
4:25
in the book is that no one
4:27
person ever invents these things. They're kind
4:29
of, you know, almost unconsciously
4:32
crowdsourced: everybody thinking about
4:34
a similar problem and coming towards almost
4:36
the same answer; it's always predestined. Exactly,
4:38
we dedicated the book to all the
4:40
tinkerers and makers that history will never
4:42
record. And in many ways the book,
4:44
in tracing the origins of the
4:46
like button and trying to answer this
4:49
question of who really invented it, we
4:51
found ourselves going on a journey of
4:53
research over a 30-year period, and
4:56
looking into and finding all kinds of
4:58
characters that made little you know tweaks
5:00
around the edges and this thing really
5:03
evolved on its own. And such is
5:05
the way with almost all technology, you
5:07
know, in the media we often get
5:09
fed the cartoon version of innovation where
5:12
there's one person heroically coming up with
5:14
something without outside influence and and one
5:16
of the things we're trying to do
5:18
with the book is reveal how Silicon
5:21
Valley really works and how technology really
5:23
gets developed because there are some important
5:25
lessons for how we think about like
5:27
new waves of innovation. Tell me, why
5:29
do humans, and this in a way sounds
5:31
like a ridiculous question, but why do
5:33
humans like liking things? Because you talk
5:35
about homophily, in that you like
5:37
things that are like yourself, and hypersociality.
5:40
We like being connected to all the
5:42
people and we like sharing values with
5:44
them. Why do we like these things?
5:46
Whenever something is as successful
5:48
as this feature has been, I think
5:50
we have to ask ourselves, what can
5:52
it tell us about us? What does
5:54
it reveal about the way we're wired
5:56
and our motivations? And there's no doubt
5:58
this has been an incredibly successful feature, perhaps
6:00
the most successful interaction element
6:02
created in the last 20 years,
6:04
and having spoken to a number
6:06
of experts in psychology and neuroscience
6:08
and various other fields, a couple
6:10
of the key takeaways for us
6:13
are that there is a dopamine
6:15
mechanism which is becoming increasingly well
6:17
understood that liking connects to, and
6:19
there's an element of the sort
6:21
of sociality that we're wired for
6:23
as well, which is... you know,
6:25
you see, there's the approval-seeking mechanism
6:27
within those close to us that
6:29
it's also wired into. So we
6:31
make the case that
6:33
this feature evolved in
6:35
response to the way people behave
6:37
and interact with it through many,
6:39
many tweaks and changes over many
6:41
different sites. And there are some
6:43
important lessons that are revealed
6:45
about some of the fundamental motivations
6:47
of day-to-day life.
6:49
Yeah, it's like getting, it's more
6:51
than just getting a seal of
6:54
approval. It kind of like makes
6:56
you feel that you're part of
6:58
a community, and that's something that
7:00
seems to go right back to
7:02
the primal mind. There
7:04
were a lot of astonishingly
7:06
interesting nuggets in this. I am
7:08
quite fascinated by the idea that
7:10
the thumbs up itself, the myth
7:12
that the Roman Emperor would give
7:14
a thumbs up or a thumbs
7:16
down, turns out to be almost
7:18
certainly untrue or, you know, distorted
7:20
through the lens of history. Can
7:22
you tell us a little bit
7:24
about that? Because that's actually quite
7:26
fascinating where the actual thumbs up
7:28
gesture comes from. Yeah, we thought
7:30
going into this, or at least
7:33
one of the first things we
7:35
wondered is how far back does
7:37
this mechanism go? And so we
7:39
have a chapter on the pre-digital
7:41
history of liking and we explore...
7:43
the significance of the thumbs-up symbol
7:45
and where that came from and
7:47
how that evolved over time to
7:49
become a popular icon really in
7:51
sort of, you know, post-Second World
7:53
War actually, or when it hit
7:55
a new level of popularity. But
7:57
we went back and thought about
7:59
the... gladiatorial arenas. One
8:02
of the things that we realized
8:04
is that it was a new type
8:06
of architecture which permitted
8:08
visual signaling within
8:11
a crowd. And there were
8:13
certainly visual signals that
8:15
were, you know, used. Whether it
8:17
was thumbs up, sideways, or down and
8:19
so on, like that's much murkier
8:22
territory and probably our idea of
8:24
that came from a famous painting
8:26
that became popular in the US
8:28
100 years ago or so. But
8:30
what is fascinating to me about
8:33
that time is that if you
8:35
think about it, before an amphitheater
8:37
structure, you couldn't see thousands of
8:39
people at once. When you put
8:42
everyone in a circular structure and
8:44
you put them on, you know,
8:46
tiers, then everyone could see each
8:48
other and suddenly it creates the
8:51
possibility for... real-time voting in
8:53
a visual way across thousands
8:55
of people. And so there
8:57
is an interesting parallel because
9:00
this is an innovation or an architectural
9:02
shift that made a type of voting
9:04
possible and in the same way the
9:06
creation of the internet and the
9:08
browser created another type of voting potential
9:10
right? But the mechanism
9:13
and the appeal of it goes back
9:15
way before the digital era. Yeah I
9:17
just like the story of how you
9:19
know, there's a notion that there was a
9:21
source of signal with the thumb in
9:23
the Roman arena. But it's far from
9:26
certain that it was thumbs up, thumbs
9:28
down. In fact, it may have been
9:30
drawing the thumb across the throat to
9:32
indicate kill him or drawing the thumb
9:34
into the pocket to indicate sheath your
9:36
dagger. And this gets bowdlerized by, I
9:38
think, a preacher called Talmadge in the United States,
9:41
who infers the upturned thumb, and therefore
9:43
infers from the downturned thumb that there
9:45
will be an upturned thumb. So the
9:47
idea that like something that's misunderstood, then
9:49
goes around the world and becomes the truth.
9:51
It becomes a truth. That is actually very
9:54
internet, isn't it? Maybe
9:56
we've uncovered some of the... But that's the
9:58
nature of history, isn't it? I mean, in
10:00
a way, it's the crux of history,
10:02
isn't it? Trying to figure out
10:04
where did these ideas come from?
10:06
And can we get back to
10:09
the source? And so many layers
10:11
of interpretation mean history is a
10:13
very difficult subject. And the harder
10:15
you think about it. The more
10:17
you say, why do you think
10:19
about it? The more you say,
10:21
why should a thumbs-up mean, no,
10:23
anyway? It doesn't make any. Well,
10:25
I mean, go back to Silicon
10:28
Valley. mode of early age and
10:30
broadband and so on. After that
10:32
early age, why did developers, why
10:34
did large tech platforms
10:36
realize that like was so valuable
10:38
so why do they want this
10:40
functionality more and more?
10:42
So one way to think about it
10:44
is that the era of Web
10:47
2.0 was really about user-generated content
10:49
and the shift that happened during
10:51
that era is that we went
10:53
from a time when the web
10:55
was read by... 90% of people and
10:57
maybe 5 to 10% of people
10:59
were the ones who were doing
11:01
the writing and the content creation.
11:04
So we sort of had a
11:06
90-10 scenario; that's how we all
11:08
treated it and understood it looking
11:10
at the numbers. And what happened
11:12
during this era in the early
11:14
2000s is that people figured
11:16
out how to bring more and more
11:18
people into a content creator role. And
11:21
nowadays we all assume that we're going
11:23
to both consume and create content. But
11:25
that really wasn't the case in the
11:27
early web. And so in some ways the
11:30
like button accelerated this because it
11:32
created the atomic unit of user-generated
11:34
content. There was and there still
11:36
is no simpler way to create
11:38
content on the internet. And people
11:40
don't think of liking as a
11:43
content creation act. They think of
11:45
it as a reaction or a
11:47
type of reading actually, but it
11:49
pulled people in on a massive
11:51
scale, moving a percentage of those
11:53
people from reading to writing, into content
11:56
creators. And liking was just one
11:58
of those mechanisms, but really... social
12:00
media itself made it so easy to
12:02
create content, share photos, send messages that,
12:05
you know, it pulled people into content
12:07
creator roles. Why is it so addictive?
12:09
Well, I mean, we mentioned like how,
12:12
you know, we want to feel that
12:14
we're part of the community, we want
12:16
to feel that the people around us,
12:19
we want to learn from the people
12:21
around us, but there's a kind of...
12:23
almost an addictive feel to both giving
12:26
your approval, but also receiving approval.
12:28
We all know people and sometimes we
12:30
do it ourselves. You get very antsy
12:32
about the fact that they might have
12:35
stuck something up on, you know, on
12:37
a social media platform. I'm like, well,
12:39
where are my likes? So nobody's liked it.
12:42
Nobody cares about me. I am nobody.
12:44
Without likes I have nothing. Where does
12:46
this addictive thing come from? Well, dopamine
12:49
is a highly addictive substance. And I
12:51
mean, this is my interpretation of why it's so
12:53
addictive. If you think about what dopamine
12:56
is meant for and how it's always
12:58
served humans, it's the thing that motivates
13:00
action. So I think about lunch, right?
13:03
You get to 11:30 or something and
13:05
you start to get hungry and you
13:07
have this thought, I can make a
13:10
chicken sandwich, and that thought feels quite
13:12
good and you imagine the taste of
13:14
that chicken sandwich and that's a little
13:17
dopamine hit and the nature of dopamine
13:19
is it gives you a little hit
13:21
in anticipation of the big hit of
13:24
getting a reward. Now you're motivated to
13:26
take action, go and make the sandwich,
13:28
you know, there's some physical effort required
13:31
in that, and now you eat the
13:33
sandwich, and it closes the loop because
13:35
if the sandwich is good or better
13:38
than you expected, you're going to get
13:40
a really big dopamine hit. If it's
13:42
disappointing, you'll get a much smaller dopamine
13:45
hit. So we're constantly in this cycle
13:47
of anticipation of reward, and it
13:49
should be hard work to get dopamine.
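A side note for the technically minded: the loop being described here, a small hit in anticipation and a bigger or smaller hit depending on how the outcome compares with expectation, matches the classic reward-prediction-error idea from reinforcement learning. A minimal sketch, offered as an assumed illustration rather than anything from the book:

```typescript
// Reward-prediction-error sketch: the "dopamine-like" signal is
// delta = reward - expected. Better-than-expected outcomes give a large
// positive delta; disappointing ones give a small or negative delta.
function updateExpectation(expected: number, reward: number, learningRate = 0.1) {
  const delta = reward - expected; // size of the "hit"
  const newExpected = expected + learningRate * delta; // expectation drifts toward reality
  return { delta, newExpected };
}

// The sandwich example: you expect a 5/10 sandwich...
const expected = 5;
console.log(updateExpectation(expected, 9)); // great sandwich: delta = 4
console.log(updateExpectation(expected, 2)); // disappointing: delta = -3
```

Under this framing, each swipe or like notification is another cheap pass through the same anticipation-and-outcome loop.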
13:51
This is just speaking from personal opinion.
13:54
It should be hard work, right? It
13:56
should, dopamine rewards things like working, having
13:58
relationships. being a parent, you know, creating
14:01
things, cleaning up, and you know, all
14:03
the things that we... Rather than just saying
14:05
something funny on the internet and watching
14:08
the likes pile up, yes. Right, throughout
14:10
all of human history, dopamine has been
14:12
hard work to attain, and it's a
14:15
very, very powerful motivator for action. And
14:17
one of the things that's happened in
14:19
the past, particularly the
14:21
past five years in the way social
14:24
media has evolved, is it has become
14:26
increasingly easy to access dopamine. And I
14:28
think this is... just a fascinating shift.
14:30
I don't think there's ever been a
14:32
time in history where dopamine has been
14:34
so easy to access. It's got down
14:36
to a point where you can just
14:39
swipe your finger and you can get
14:41
a dopamine hit and then you swipe
14:43
again and you get another one. And
14:45
it was never really designed. I don't
14:47
think that mechanism evolved to be so
14:49
easy to access and for you to be
14:51
able to access it so frequently. When
14:56
you think about super successful businesses
14:58
that are selling through the roof
15:00
like Heinz or Mattel, you think
15:02
about a great product, a cool
15:05
brand and brilliant marketing. But there's
15:07
a secret. The business behind the
15:09
business, making selling simple for them
15:12
and buying simple for their customers.
15:14
For millions of businesses, that business
15:16
is Shopify. Upgrade your business and
15:18
get the same checkout as Heinz
15:21
and Mattel. Sign up for your
15:23
$1 per month trial period at
15:25
shopify.com/promo. All lower case. Go
15:27
to shopify.com/promo to upgrade
15:30
your selling today. shopify.com
15:32
slash promo. Lowe's knows
15:35
how to help pros save.
15:37
That's why the new MyLowe's Pro Rewards
15:39
program lets you unlock exclusive member deals
15:41
on the things you need every day
15:43
on the job. Plus, MyLowe's Pro Rewards
15:46
members can get volume discounts on eligible
15:48
orders through a quote of $2,000 or
15:50
more. Join for free today. Lowe's, we
15:52
help, you save. Exclusions, terms, and
15:55
restrictions apply. Program subject to terms and conditions;
15:57
details and terms subject to change.
16:03
This
16:05
episode is brought to you by Lifelock.
16:08
It's tax season and we're all a
16:10
bit tired of numbers, but here's one
16:12
you need to hear. $16.5 billion. That's
16:14
how much the IRS flagged
16:16
for possible identity fraud last
16:18
year. Now here's a good
16:21
number. 100 million. That's how
16:23
many data points Lifelock monitors
16:25
every second. If your identity
16:27
is stolen, they'll fix it.
16:29
Guaranteed. Save up to 40% off
16:31
your first year at lifelock.com/Podcastas.
16:33
Terms apply. The
16:35
other thing that becomes very clear
16:37
from the book and in the
16:40
story of the whole mechanic is
16:42
that technology companies and platforms realize
16:44
very quickly that they can harness
16:46
that search for dopamine to create
16:48
one of the most powerful
16:50
objects that's ever existed, which
16:52
is the giant corpus of big
16:54
data where you have knowledge
16:56
of individuals' behavior right down to
16:58
individual decisions and you have
17:00
the methods by which to use
17:02
that. So previously, no human
17:04
being could ever have fully understood the
17:06
activities of large numbers of people.
17:08
Now, the activities of billions of
17:11
people could be fully understood by
17:13
big data models and they can
17:15
be exploited to sell you things
17:17
to persuade you of certain political
17:19
arguments to deceive you. And it
17:21
can go really dark. So for
17:24
instance, as lots of listeners will
17:26
know, Facebook was held responsible for
17:28
prompting the genocide in Myanmar against
17:30
the Rohingya. This is to go
17:32
from, and I'm not blaming
17:34
you personally, obviously, but to go from
17:36
like a positive review, little thumbs up
17:38
on Yelp to this kind of mass
17:41
manipulation of the way humans think
17:43
is quite momentous, isn't it? Yeah,
17:45
there are a lot of
17:47
important lessons for how to
17:50
manage new technology. I mean,
17:52
ultimately, technology is a word that sort of
17:54
has a spell to it and a power
17:56
to it, but really what it is
17:58
is a new way to do... something.
18:00
And as soon as there is a new
18:02
way of doing something, then there can
18:04
be benefits and harms. In fact, there
18:07
are always benefits and harms. The idea
18:09
that you could ever create a
18:11
technology and it only ever be
18:13
used for good or only ever
18:15
create positive outcomes is ludicrous when
18:17
you think about it. Because it's
18:19
just a way of doing something
18:22
differently. So as soon as something
18:24
exists that's a new way of doing something,
18:26
there are new potential harms to
18:28
people. And that's where regulation and
18:30
governments have a role, to find those harms
18:33
and protect people, and they do. It's
18:35
a matter of time, usually as
18:38
these things get developed. Sometimes governments
18:40
react very quickly and preemptively to
18:42
new technologies. Other times it takes years
18:45
and sometimes it takes decades. Well perhaps
18:47
we can get to that in a little
18:49
more detail in a minute, but you
18:51
know, the fact is that the use of
18:53
the like mechanic became pretty amoral
18:55
and driven by... the growth of companies
18:57
and the ability to do things like
18:59
transform the advertising market. So, you know,
19:02
just print ads and display ads have
19:04
largely gone and they've been replaced by
19:06
personal targeting of advertising. The UK information
19:08
commissioner, as you quote in the book,
19:11
says that reward loops or positive reinforcement
19:13
techniques, nudge or encourage users to stay
19:15
actively engaged with the service, allowing the
19:18
online service to collect more personal data.
19:20
So the more we use it, the
19:22
better it gets at getting us to use
19:25
it. And it seems to race
19:27
ahead of the regulation that you've
19:29
just described. You're a very
19:31
experienced Silicon Valley person:
19:33
are people in Silicon Valley worried
19:35
about what they're doing, or
19:37
are they simply riding this to yet
19:40
more fame and fortune? It's a
19:42
good question and what's been interesting
19:44
to me is to see the sort of a
19:46
pattern or the shifts in the dialogue over
19:49
the years, over the last 20 years in
19:51
Silicon Valley, something that perhaps doesn't
19:53
get... talked about so much is
19:55
how idealistic people were in the early
19:57
days of the web and certainly in
20:00
the early 2000s as, you know,
20:02
the second surge
20:04
of the web came into
20:07
effect. I think back
20:09
to being in that community
20:11
where so many of these
20:13
companies were getting started
20:16
and just knowing so many
20:18
of those founders. And I would
20:20
say, hand on heart, about
20:22
half of the dialogue was
20:25
about the future of education,
20:27
you know. We take for granted now
20:29
that we all have access to
20:32
an amazing encyclopedia for free, for
20:34
example. But when I arrived in
20:37
San Francisco in 2004, if you
20:39
wanted an encyclopedia in your
20:41
home, you had to be somewhat wealthy
20:43
to have one. They weren't for everybody.
20:45
I yearned for an encyclopedia when I
20:48
was a kid. So, you know, the
20:50
idea that information would be democratized,
20:52
that the ways we could learn
20:54
things would be democratized, you know,
20:57
the fact that, I mean,
20:59
all the conversation around
21:01
politics was that democracies would get
21:03
stronger as information became free. And
21:05
so, but what I also observed
21:08
is that from those quite idealistic
21:10
foundations in say 2003, 2004, within
21:12
about four or five years, you
21:14
just didn't hear so much of
21:17
those conversations. And I honestly don't
21:19
think anybody involved at that time
21:21
realized that they were at the start
21:23
of the creation of a multi-trillion dollar
21:25
industry. I think that everyone was
21:28
aspirational and I think people were trying
21:30
to build big things. I don't think they
21:32
realized how much money was going to be
21:34
involved and once that was clear, then there
21:37
was a shift in the dialogue. I think
21:39
a lot of listeners will be hearing the world
21:41
that you just described there and saying, well,
21:43
it's gone forever because we look at the
21:46
way tech behaves now, you know, one of
21:48
the most powerful men on the planet is
21:50
using it entirely for ill. Zuckerberg, who is,
21:52
you know, not quite as powerful as Elon
21:55
Musk, but has basically abdicated any responsibility for
21:57
Facebook to moderate and pull back on
21:59
the harmful uses of Facebook.
22:01
It's as if the platforms
22:03
began being interested in like, and
22:05
then quickly realized that any
22:07
engagement is good engagement. And anybody
22:10
who has been using Twitter
22:12
until recently will have seen the
22:14
horrendous For You page, which
22:16
is designed solely to get you
22:18
angry, because anger is also
22:20
engagement. Is it just the profit
22:22
motive that pulled developers and
22:24
strategists away from the idea of
22:26
like, let's just quantify what
22:28
people like, and into darker and
22:30
more dangerous forms of engagement?
22:32
I think there's a new wave
22:35
of responsibility and development, which
22:37
I'm observing now, right? There are
22:39
folks who were part of
22:41
the generation that created some of
22:43
these applications who are now
22:45
going into, like,
22:47
you know, the second wave
22:49
of their careers, building new types
22:51
of applications, and doing so
22:53
with a new level of intention
22:55
and a sense of responsibility.
22:58
So I certainly wouldn't want to
23:00
characterize it that there aren't
23:02
a lot of people who are
23:04
in technology who think very
23:06
deeply about these questions. They absolutely
23:08
do. That said, you know,
23:10
I always like to look at
23:12
these systems from an incentive
23:14
standpoint. And what's worth understanding
23:16
is that when there is
23:18
a competitive dynamic in an industry,
23:20
it's very, very hard for
23:23
companies not to pursue the outcome
23:25
that's best to compete. And
23:27
this is just the nature of
23:29
capitalism. And I don't see
23:31
that changing. Again, speaking from my
23:33
personal perspective, you look at
23:35
the incentives, you know, look at
23:37
what happened when TikTok came
23:39
into play with a very different
23:41
approach to the algorithms to
23:43
feed you content that would keep
23:45
your engagement, right? Completely different
23:48
approach. They didn't care if you
23:50
and I know each other
23:52
to show each other content. They
23:54
just cared about the attention
23:56
graph essentially, which is an evolution
23:58
from the social graph, perhaps
24:00
to the like graph, to the
24:02
attention graph. And that changed the game for
24:04
social media, because those algorithms were so
24:06
successful the other social media platforms had to adopt the
24:09
same techniques in order to compete. If
24:11
they hadn't focused on
24:13
adopting those techniques, then you've got
24:15
one player who's running away with it
24:17
for attention. So there's this massive,
24:19
in the case of social media it's
24:22
a massive competition for attention, and if
24:24
one player finds something that works
24:26
and gets a bit more attention, the
24:28
others kind of have to follow to
24:30
keep up. And so can you design into
24:32
that system a sense of withholding,
24:34
or, you know, moralistic responsibility, when the
24:37
science isn't there yet, when the
24:39
studies aren't there yet in terms
24:41
of the harm? And you know
24:43
and I've looked into this deeply
24:45
and you know we're still at
24:47
early stages when it comes to
24:49
you know the scientific research around
24:52
this, you know, expecting companies to
24:54
hold back is really not that
24:56
realistic, which is where governments
24:58
and regulations come in, to
25:00
protect citizens. Yeah, it's
25:02
possibly unrealistic to expect companies to
25:04
hold back but it is realistic to
25:06
expect regulatory authorities and government to intervene, which
25:09
the social media platforms have fought doggedly to
25:11
prevent from happening. They don't want that to
25:13
happen. I mean, every social media user recognizes
25:15
the buzz you'll get from getting a lot
25:18
of likes and the annoying feeling of being
25:20
ignored. These are kind of primal things. And
25:22
it looks like where they've driven us to
25:25
is a place... again, I'm not blaming the
25:27
like button for this, it's a tiny
25:29
bit-part player in a bigger drama, but
25:31
it seems we've come to a different place, where the kind
25:34
of social technology that was going to
25:36
bring about the utopia that you described
25:38
has actually just become a big rage
25:40
machine and a big hatred machine and
25:42
a big division machine. Silicon Valley stood
25:44
against the kind of regulation and still
25:46
tries to stand against the regulation of
25:48
its product. And yet we have, you
25:50
know, situations where, you know, smartphone bans
25:52
have been seriously considered for kids because we
25:54
don't know what it's doing to kids' minds.
25:56
We have New York declaring a mental health
25:59
crisis caused by social media, which you
26:01
describe in the book. Is
26:03
it enough for Silicon Valley to say,
26:05
well, we just build this stuff, and
26:08
what it reveals about human nature is,
26:10
sadly, what's hard-wired into human
26:12
nature? Yeah, I'll leave others to comment
26:14
on, you know, the art
26:17
and science of regulation, and this
26:19
system is no doubt complex. I think
26:21
what is helpful for everyone to understand
26:24
is the system itself. I
26:26
think that a better level of understanding
26:28
of how innovation happens, how these things
26:30
get created, who the players are, you
26:33
know, I think that's healthy. I think
26:35
that more education and more information out
26:37
there about how these things work can
26:39
only be a good thing. And it's
26:42
one of the reasons that I was
26:44
motivated to write the book. I mean,
26:46
it took four years to write and
26:48
it's been a passion project, but it's...
26:51
I felt that there were lessons that
26:53
I could see that are not out
26:55
there in the public domain and that
26:58
myself and my co-author Martin were very
27:00
keen to put out there. I'll add
27:02
one more thing to the comment you
27:04
made that the utopia is gone. I
27:07
don't think the utopia is gone. I
27:09
believe that it's a matter of timeline.
27:11
So we're 20 years, let's say, into
27:13
the era of Web 2.0, 30 years into
27:16
the era of the web or so.
27:18
And I believe that when we look
27:20
back on a 100-year time frame, there
27:23
will be no doubt of the power
27:25
and benefits that the internet has brought
27:27
to humanity. I think that when you
27:29
look at things on a six-month time
27:32
frame or one-year time frame you say
27:34
how is it doing right now, that's
27:36
a different question and you're looking at
27:38
a different time frame. But the arc
27:41
of what's getting created and the value
27:43
getting created for people, I think is
27:45
overall a positive one and is going
27:48
to be reflected that way over a
27:50
long period of time. For example, I
27:52
like to think about what YouTube does.
27:54
Right? Now, as an adult, I was
27:57
kind of interested in music as a
27:59
kid, but I had a few piano
28:01
lessons and then, you know, stopped having
28:03
lessons. But I didn't play instruments as
28:06
a kid, but I've learned to play
28:08
guitar and piano in my adult years
28:10
entirely with YouTube. And you can access
28:12
music lessons essentially, right, through this free
28:15
platform. If you think about how many
28:17
skills people have learned on YouTube for
28:19
free, it's an enormous contribution to education.
28:22
And again, thinking back to pre-internet, if
28:24
you wanted to learn an instrument, you
28:26
had to have enough money to pay
28:28
for a music teacher and most people
28:31
cannot afford to pay for a music
28:33
teacher. And this is just one example,
28:35
but you can say the same about
28:37
language acquisition and physical skills and training
28:40
and all sorts of other fields. In
28:42
fact, you know, we're having this conversation
28:44
today, thanks and people are listening to
28:47
it thanks to the internet. Yeah, I
28:49
mean, it's enabled so many things, but
28:51
also you have to put in the little asterisk
28:53
that it could have, by the way,
28:56
also destroyed democracy. And, you know,
28:58
without being
29:00
too facetious about it,
29:02
it is a tradeoff. You know,
29:05
we can't resolve this here,
29:07
but I just think one particular point
29:09
that you made was really interesting.
29:12
You pointed out, I mean, you could
29:14
have called this book Like and the
29:14
Law of Unintended Consequences, because so many
29:16
things that you describe generate consequences
29:18
nobody ever envisaged. One thing you
29:23
point out is that the speed at which
29:25
these innovations take off has encouraged an
29:27
idea among the public that technological transformation
29:30
can take place really really quickly and
29:32
with almost instantaneous changes and suddenly you're
29:34
living in a different world and things
29:36
of, kind of, you know, destruction begets
29:39
creation immediately. And now we're seeing that
29:41
applied to, like, the bricks-and-mortar
29:43
real world by a guy who's
29:46
entirely steeped in the world of the
29:48
internet. This is what gets you the
29:50
DOGE mindset. Has the speed
29:52
of innovation in Silicon Valley encouraged people
29:55
in the false belief that all change
29:57
can be quick, radical, instantaneous and ultimately
29:59
positive? Location:
30:45
the lab. Quinton only has
30:47
24 hours to sell his
30:49
car. Is that even possible?
30:51
He goes to carvana.com.
30:54
What is this? A movie trailer?
30:56
He ignores the doubters, enters
30:58
his license plate. Wow, that's
31:00
a great offer. The car
31:02
is sold, but will
31:04
Carvana pick it up in time
31:07
for? They'll literally pick
31:09
it up tomorrow morning. Done
31:11
with the drama. Done with
31:13
the drama. The
31:15
new KFC Duncan Bucket with juicy
31:18
original recipe tenders, new mashed potato
31:20
poppers, crispy fries, plus three sauces
31:22
that fit right on top of
31:24
the lid, so you can dunk
31:26
anywhere. You can dunk at the
31:28
game. Dunk while security points to
31:30
the no outside food sign. And
31:32
Dunk as 20,000 people watch you
31:35
and your Dunkin' Bucket get removed
31:37
from the stadium. Dunk almost anywhere
31:39
with the new $7 KFC Dunkin'
31:41
Bucket or get the new $7
31:43
KFC Dunkin' Bucket. slash
32:13
Spotify. We're not even 30
32:15
years into the social web, as you
32:17
just said, you know, it's really 20 years into
32:19
Web 2.0. At the end of the book,
32:22
you talk about potential futures and the idea
32:24
of knitting technology deeper into our lives
32:26
so that it will have a
32:29
deeper understanding of what we like, what you
32:31
call our revealed behaviors. You know if you
32:33
ask me what I do I'll tell you
32:35
something it might not be true but if
32:38
you watch me you'll see what I really
32:40
do what my real preferences are. Some of
32:42
these things are quite disturbing like the idea
32:44
that you know will be you know our
32:46
future generations of smart watches or phones will
32:48
not just be monitoring our choices and activities
32:51
and the things that we buy and
32:53
look at but will be you know, monitoring our
32:55
body temperature and,
32:57
you know, pulse rates and, you know,
32:59
hormonal balances and kind of reading our
33:01
minds, kind of anticipating our behavior before
33:03
we know we're going to do it
33:05
ourselves. And it seems foolish to say,
33:07
is this going to be a good
33:09
thing or a bad thing because it's
33:11
going to happen or it isn't? Well,
33:13
do you see a future where ultimately everything
33:16
we do can be read and drawn into
33:18
the big data corpus and we will live
33:20
in a world where everything operates at a kind of...
33:23
predictive level? That's a
33:25
good question. First of all, I
33:27
think it's inevitable that there
33:30
will be widespread integration
33:32
of brain computer interfaces.
33:35
I think it's the next
33:37
iteration of devices after
33:40
the smartphone. And the research
33:42
and the results are
33:44
already there with minimally
33:46
invasive techniques. We can
33:49
already get incredibly
33:51
accurate information about
33:53
what someone is thinking to
33:55
the point where you can
33:57
extract the image they're thinking
33:59
of. And so the technology is already
34:01
there. And I think, when you
34:03
look at the arc of computing and
34:05
the adoption of computing, the number one
34:07
thing is that there are always
34:09
huge incentives for people to compete
34:11
by bringing down the barriers to
34:13
interacting with compute, let's say. Right? So
34:15
we look at you know originally you've
34:17
got computers in a lab,
34:19
and they're huge, and they fill, you
34:21
know, a space bigger than the room we're in
34:23
today, and you have to show up
34:25
and you have to be in that
34:27
part of the city in that particular
34:29
room to access that computer and you
34:31
have your own terminal at it, but
34:33
you have to be in that university
34:35
building to access that machine and it's
34:37
big and fixed and you know and
34:39
then you move to personal computers where
34:41
you can have one in your home,
34:43
you can have one in your office, you
34:46
can have your own one, but you
34:48
still have to be in that location,
34:50
then you move to smartphones where you
34:52
can carry the computer around with you.
34:54
the trend is to bring down the
34:56
barrier of interaction with compute and brain
34:58
computer interfaces are a natural evolution of
35:00
that and I think it's worth people
35:02
you know realizing early that's where this
35:04
is going and there will be a
35:06
huge amount of information that's drawn out
35:08
from those devices including what we think
35:10
how we're thinking how we're feeling and
35:12
all of our bio you know sensory
35:14
information and that will be used to
35:16
bring advantages to people, which is why,
35:18
you know, people will
35:20
opt in to that system. So
35:22
when I say that I don't want
35:24
the implant I'll be like the Luddite
35:26
of the future I'll be the kind
35:28
of refusenik. It'll be the same
35:30
reason you've got a smartphone sitting next
35:32
to you right now you know I
35:34
have some friends, by the
35:36
way, technologists, who don't own a smartphone.
35:38
Well, I mean, just one. But
35:40
he insists, his name is Paul
35:42
Bragel, he insists on using a
35:44
laptop for communication, and it's
35:46
a fascinating choice that goes against the
35:48
grain. But he wants to message
35:50
when he's ready and on
35:52
his own terms, and he doesn't want
35:54
to be followed around by the information.
35:56
I'm sure there are many people out
35:58
there where that's the case. But they
36:00
are outliers, right? They're making a conscious
36:02
decision. The
36:08
rest of us carry a smartphone very
36:10
willingly and there will be exactly the
36:12
same incentives in place to accept. Four
36:14
years is a very long time in
36:16
internet time, isn't it? Did writing the
36:18
book change the way you think about,
36:20
you know, when you open up a
36:22
page and you start, you're going to
36:24
tap the smile on it? Yeah, I've
36:26
learned so much. You know, it was
36:28
like this, it was really a thread
36:30
that we started to pull and it
36:32
just went and went and went. I
36:34
mean, four years ago, Martin and I
36:36
had this chat in a cafe about...
36:38
about this idea and we thought maybe
36:40
we'll write an article about it together
36:42
and we presented the idea to Harvard
36:44
Business Review to see if they'd like
36:46
us to write an article. We'd written
36:48
one for them before and they came
36:50
back and said we think it would
36:52
make a great book and so what
36:54
started as an article outline turned into
36:56
a book outline and then I think
36:58
we could have written a whole book about
37:00
every chapter in this book. it was
37:02
this little thread and once we pulled
37:04
it it just kept going and like
37:06
I say we never really intended to
37:08
write a book but it's been a
37:10
fascinating journey and and it's become almost
37:12
like a keyhole into so many different
37:14
fields that we've learned about and we've
37:16
been lucky to interview over a hundred
37:18
people in the book, and that's why
37:20
it took so long we just wanted
37:22
to speak to everybody and everyone gave
37:24
such interesting perspectives. So we're really, really grateful
37:27
to the hundred people that contributed, and
37:29
the book ends up being really a
37:31
collection of ideas that others have shared
37:33
with us. Bob Goodson, thanks so much
37:35
for joining us in the bunker to
37:37
discuss the weird world of the like.
37:39
Thanks for having me on. Not at
37:41
all. Listeners, please leave a positive review
37:43
or press the little smile or something
37:45
like that. Like: The Button That Changed
37:47
the World is published on April 29th.
37:49
There is a button to preorder it
37:51
in the show notes. Yet more buttons.
37:53
Listen, if you enjoyed this podcast, there
37:55
is no like button as such.
37:57
but you can show
37:59
your appreciation by
38:01
supporting us on Patreon.
38:03
Independent podcasts are the plucky underdogs of the information-saturated
38:05
age and we need your help to
38:07
keep going. Follow the show notes to find
38:09
out how to back us on Patreon
38:11
and keep us going. Thanks for
38:14
listening. We'll see you next time. The
38:27
Bunker was written and presented
38:29
by group editor Andrew Harrison.
38:31
Audio production was by Tom
38:33
Taylor and Charlie Duffield, music by Kenny
38:35
Dickinson, and artwork
38:37
by Jim Parrot. The managing editor is Jacob Jarvis.
38:40
The Bunker is a
38:42
Podmasters production.