Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:15
Pushkin. I
0:20
want to start this episode by telling you just
0:23
the very beginning of a story I recently
0:25
heard about. A guy named
0:27
Alex Cogan born in nineteen
0:29
eighty six into a Jewish family in the Soviet
0:32
Union. After
0:34
the collapse in nineteen ninety one, the government
0:36
loses control and Jews are even less
0:38
safe than before. Alex's
0:41
dad starts getting death threats, so
0:44
he up and moves his entire family four
0:47
generations of Cogans, to New York City.
0:50
In nineteen ninety four, Alex
0:52
enters first grade in a Brooklyn public school.
0:56
He's conspicuous, way
0:58
taller than the other kids. He speaks
1:00
no English. He's
1:02
also got a talent for math and science.
1:06
Once his teachers can understand him, they
1:09
think he has the makings of a gifted physicist.
1:12
Life's not hard for him, but as
1:14
he grows up, he begins to see that it isn't
1:16
always easy for everybody else. Six
1:20
months after they've arrived in the United States,
1:22
his great-grandmother jumps from their apartment
1:24
window to her death. His parents,
1:27
the loves of each other's lives,
1:29
split up. Alex cries
1:31
every night until they get back together. He
1:35
enters high school and one of his close
1:37
friends attempts suicide, another
1:40
becomes clinically depressed. Alex
1:42
begins to read psychology. He's
1:45
a math and science kid, but
1:48
he's getting more and more curious about human
1:50
nature. And
1:52
the first time I met him, and I really remember it very
1:54
distinctly, because he almost
1:56
always wore these giant basketball shorts no
1:59
matter what the weather. You know, he's terribly
2:01
dressed, like a lot of Berkeley undergrads, and
2:03
you know, and basketball shoes. That's
2:06
Daker Keltner, the psychologist
2:08
at the University of California, Berkeley who
2:11
runs something called the Greater Good Science Center
2:13
where they study human emotion. We
2:15
heard from him in episode one. Alex
2:19
Cogan was a shambolic six foot four
2:21
inch freshman back in two thousand and five when
2:23
he knocked on Daker's office door and
2:26
said he'd like for Daker to teach him. Emotions
2:29
fascinated him. He'd
2:32
come to Cal to study physics, but
2:34
he'd been thinking about love, about
2:36
the distinction between loving
2:39
and being loved. He wanted to
2:41
study it the way you'd study a quark. And
2:44
Alex came in and he said, you know, I have seven
2:46
kinds of love that I'm going to put people into.
2:49
I was like, wow, that's interesting.
2:52
And then there are twelve variations
2:55
of, I forgot what the other
2:58
factor was, or set of conditions,
3:00
that he wanted to create. And there are eighty four different
3:02
conditions in his study. So he's going to study seven
3:05
different kinds of love, and he's going
3:07
to study all these different variables that would
3:09
maybe predict the force
3:12
of the love, the power of the love exactly.
3:14
So he's about to make love more complicated
3:16
than it's ever been made. So it sounds
3:18
like, right, he was gonna
3:20
confound our understanding of love. Daker
3:23
talks Alex out of that idea,
3:25
but this kid is so smart and original
3:28
and full of energy, and so
3:30
Daker takes him in and it isn't
3:32
long before Alex is finding things to do
3:35
that no one else is doing. For
3:37
instance, the thing that he does after they discover
3:39
a gene that's associated with human
3:41
kindness. And Alex did this
3:43
cool paper where he showed if
3:46
you present videotapes of people
3:48
who have that gene or this variant
3:50
of a gene that makes them kind
3:53
and I am an observer and I see one
3:55
of those people for twenty seconds on video.
3:58
I trust them, right, I'm like this guy.
4:00
I'd go to battle for this guy, right, I
4:02
trust this guy. By the time Alex
4:05
graduates from Cal he's established
4:07
himself as the most promising student in
4:09
the entire psychology department and
4:12
the most unusual. Just
4:14
this big, sweet natured guy with
4:16
a serious talent for math and statistics
4:19
and a desire to study huge questions
4:22
like what is love? When
4:24
he left and he's so unconventional,
4:26
Michael, he could have gone to any
4:29
graduate program in the country, and he chooses
4:31
the University of Hong Kong. Hm, what? Because
4:33
he met this woman or got engaged and
4:35
fell in love? Yeah, fell in love. But
4:37
Daker and Alex stay in touch. They
4:40
collaborate on a few papers. They're
4:42
both interested in big questions about human
4:44
nature. At the same time, social
4:47
media has started to create a new way to study
4:49
those questions. In
4:51
late twenty twelve, Facebook
4:54
invites Daker to visit and asks
4:56
him to create a bunch of new emojis, ones
4:58
that better convey actual emotions.
5:01
When Daker sees what Facebook knows about
5:03
its users, he's blown away.
5:06
This could be the greatest data source that will ever exist,
5:10
and it would help us answer
5:12
questions from the scientific perspective, like
5:15
how does disease spread in
5:18
some neighborhoods but not others? What
5:20
predicts heart attacks? Where
5:23
does hate crime? Where is it likely to
5:25
happen? Right? That was all tractable
5:27
with the data that they had. Meanwhile,
5:29
Alex had moved to England to teach
5:31
at Cambridge University. He was still
5:34
researching the same stuff, the positive
5:36
emotions, and he too was seeing
5:38
possibilities in the new social media data.
5:41
And I was at Facebook doing my consulting
5:43
work and I saw Alex there.
5:45
I was like, what are you doing here? And
5:48
he's everywhere, you know, So he's like, oh, I'm working
5:50
on this other project and he told me about it. Alex
5:53
Cogan told Daker that he wanted to
5:55
use Facebook to study things
5:57
like love and happiness. For
5:59
example, you might be able to take a fairly small
6:02
sample of data, say the likes
6:04
of ten thousand Facebook users, to
6:06
make discoveries about those emotions
6:09
in entire countries. The math
6:11
was complicated enough that Daker himself
6:14
didn't fully understand it. He then forgot
6:16
all about it until one day
6:18
a year or so later, when Alex
6:21
Cogan called him up. He calls me after
6:23
Trump's elected and he says,
6:25
I think I've done something that
6:28
was part of this election. And
6:30
I was like, okay, well, let's talk what is it? And
6:32
he said, I created this mechanism
6:34
that was purchased and
6:38
used in the Trump campaign. He
6:40
was worried that he actually had had some effect, or that
6:42
he'd be perceived to have had some effect. I
6:44
don't think he made that distinction. I just
6:47
think he thought, oh no. Alex
6:49
Cogan sensed that he might have a problem.
6:51
He just had no idea how big it was
6:54
going to be. I'm
7:03
Michael Lewis, and this
7:05
is Against the Rules, a show
7:07
about the decline of the human referee in American
7:10
life and what that's doing to
7:12
our idea of fairness. Today,
7:15
I want to talk about an entire species
7:17
of refs, one that's nearing
7:19
extinction, whom no one
7:21
will miss until it's too late.
7:27
I used to be a referee in the big leagues
7:30
of dictionaries. The American
7:32
Heritage. You've heard of it. The
7:34
American Heritage has something called the Usage
7:36
Panel, and I was on it, along
7:38
with a couple of hundred other word people. Every
7:41
year we'd get this mass email asking
7:43
us to judge the latest word controversies
7:47
how certain words should be defined, or
7:49
spelled or pronounced. English
7:51
is always changing, and the dictionary wanted to keep up
7:53
with the times and sometimes resist them.
7:56
Was it okay to use unique to mean unusual?
7:59
Should you say banal or banal or
8:02
both? This year I got
8:04
a different sort of email, saying I've been
8:06
fired. They fired the whole
8:08
panel, so I didn't take it personally, but
8:11
I still want to know why. As far
8:13
as I could see, we'd done nothing wrong. Our
8:15
definitions were still definitive. I
8:18
call the guy who'd been my boss as
8:21
head of the usage panel. What did you do? I
8:24
advised on people
8:26
to include on the usage
8:29
panel. Occasionally people die, or
8:31
occasionally people would simply not respond to the questionnaire
8:34
for several years running, and we'd want to replace
8:36
them. His name is Stephen Pinker. Yes,
8:39
that's Stephen Pinker, Harvard psychologist
8:42
and author of many best selling books. In
8:45
the case of disputed usage,
8:48
where people wonder what
8:51
is the correct use? Can I use decimate
8:54
to mean destroy
8:56
most of, or, as rumor has
8:58
it, should only mean destroy one tenth of? Or
9:02
what's the best way to use epicenter. Is it
9:04
just the center of something or does it have to
9:06
mean propagating outward? Of course,
9:08
if you want to know what epicenter means, you can now just google
9:10
it. The Internet has been bad
9:12
for dictionaries. They don't sell
9:14
the way they used to. But the Internet
9:16
doesn't explain why our panel was fired. We
9:19
didn't cost the dictionary a dime. We
9:22
all work for free. Why did
9:24
they cut it? You know, I haven't
9:26
gotten to the bottom of this. Maybe
9:29
I'll just let someone else chase this one down. I
9:31
mentioned this whole situation because it's not unique,
9:34
which, by the way, should only be used
9:36
to mean one of a kind. Nothing
9:38
can be very unique, or most
9:41
unique, or even rather unique.
9:43
A thing can be either banal or banal.
9:46
But it's either unique or it's not
9:49
anyway. The death of the word referee is not even
9:51
all that unusual. They're members of
9:53
the species of refs that the world now has
9:55
no use for. The culture refs,
9:58
the people who referee our most basic
10:00
interactions: how we should talk, who
10:02
we should trust, or whom we should trust.
10:06
No one particularly mourns their death until
10:09
they really need one.
10:15
Our bags. We are
10:17
in a suburb of Dallas,
10:21
at the home of Brian Garner, who
10:23
has set himself up as a referee of the English
10:25
language. When and what should
10:27
you hyphenate? He's the author of
10:30
Garner's Modern English Usage. Why
10:32
people shouldn't use flaunt when they mean flout.
10:35
We've been standing out here for three or four minutes and there's
10:37
no sign of life. We're gonna go knock on his door. All right,
10:39
all right? What's the difference
10:41
between specious and spurious?
10:44
Does it really matter if you, at this
10:46
very moment are filled with angst or
10:49
angst? We're
10:53
a weird g. Garner's Usage manual
10:55
is now more than twelve hundred pages long.
10:58
The late novelist David Foster Wallace called
11:00
it a work of genius. This book is so big.
11:04
Did you bring your copy? No, I Xeroxed
11:06
those pages that I want, just the front. Yeah,
11:09
it wouldn't fit. You don't
11:11
really expect to find guardians of the
11:13
English language in Dallas, Texas. Then
11:16
again, you don't really expect to find them anywhere.
11:18
That's why I'd bothered to find him. It's
11:21
like flying to Indonesia to see the
11:23
last of the Sumatran rhinos, and
11:25
so here we are between a giant
11:28
golf course of a lawn and
11:30
a Monticello of red bricks
11:33
and Doric columns. We're
11:36
parked in the right place, Michael
11:38
Lewis. Brian Garner, very good to meet you. Thank
11:41
you for letting us intrude. Are
11:43
we welcome? Garner's house does
11:45
have a kitchen and bathrooms, almost
11:47
like a normal house, but it feels
11:49
like an excuse for him to live in what amounts
11:51
to a massive library. Floors
11:54
of books with little ladders so you can climb
11:56
up and reach them. Thousands upon
11:58
thousands of mostly very old books
12:01
about the English language. I've
12:03
had my coffee all round. It
12:08
looks like a Robber Baron's collection
12:10
of books, except they look like they've
12:12
been read. They look like they
12:15
aren't... they aren't books bought by the yard,
12:17
and they also have plastic covers
12:19
on them, which is a little unusual.
12:22
How many Usage
12:25
Experts books do you have in
12:27
this library? I mean, how many different? He
12:31
published his first Usage Guide back in
12:33
nineteen ninety eight, partly as a protest
12:36
against the way people talked on TV, which
12:38
sounds a bit snooty, but Garner's
12:41
genius was not to set himself up as some
12:43
kind of elite speaking down to the illiterate
12:45
masses. His judgments
12:47
felt like common sense. They were
12:49
relied on data. He classified
12:52
any change in the language into five stages,
12:54
ranging from weird new usage to a
12:57
totally accepted new use of the word. He
13:00
had lots of information on how people were
13:02
actually speaking and writing the English
13:04
language. So this is
13:07
Webster's first dictionary
13:09
six and this just kind of shows the evolution
13:11
over the nineteenth century. But I have
13:14
so upstairs. These
13:16
are books on writing, a
13:18
beginning all the way over here,
13:22
so that this whole, that whole wall
13:25
is linguistics,
13:27
and look on usage and writing. I
13:30
think I just assumed that anybody who went this
13:32
far out of his way to tell other people
13:34
how to speak and write must have something wrong
13:36
with him. That if you tracked his interest
13:39
back to its source, you'd finally
13:41
arrive at the desire to feel superior.
13:44
But that's not Garner. His source
13:46
energy isn't snobbery. It's
13:48
outrage at an idea
13:51
cooked up by academic linguistics,
13:53
an idea he had encountered back as a student
13:56
at the University of Texas. Descriptivism,
13:59
it was called: a
14:01
native speaker of English cannot make a mistake,
14:05
and so, ipso facto,
14:07
if a native speaker says it, it is
14:09
correct. That
14:12
is a very extreme position
14:14
to take, and I think an indefensible one, and one
14:16
that I have pretty much set my face
14:18
against. He set his face
14:20
against descriptivism, and his face
14:23
is set against it. Still. Do you
14:25
consider yourself a referee? Yes?
14:30
Yeah, I'm making
14:34
judgment calls about and
14:37
there is a lot of judgment involved, But I'm
14:39
trying to be a helpful
14:41
guide to writers
14:43
and speakers of English. We're
14:45
now up in a balcony gazing down at an
14:47
amphitheater of books about the
14:49
English language. He's got a whole
14:52
other collection of books out back where
14:54
the poolhouse should be, in a building
14:56
that's an exact replica of the room
14:58
in England in which the Oxford English Dictionary
15:00
was created. I pulled down an
15:02
especially decrepit looking book by
15:05
someone I've never heard of, Lindley
15:07
Murray. Murray, now,
15:10
he's kind of a hero of mine. Interesting
15:12
guy. He was a New York lawyer.
15:15
In seventeen eighty four he moved to York,
15:18
England because he didn't like the Revolution, and
15:21
a lot of Americans actually moved
15:24
to England because they didn't appreciate
15:27
what was going on. Lin-Manuel Miranda
15:29
left that out of Hamilton. I
15:31
guess so, and so these
15:33
two shelves are all the various editions
15:36
of Murray's Grammar. Yeah, and Brian
15:38
Garner seems to have all of them. So
15:40
Murray. In seventeen ninety five he stopped
15:43
practicing law and he wrote Murray's
15:45
English Grammar for a Quaker
15:48
girls school in York, and it became
15:50
the best selling book in
15:53
the English language other
15:55
than the Bible for the first
15:58
fifty years of the nineteenth century. He
16:00
sold over thirteen million copies
16:03
of his English Grammar. Every household
16:06
needed an English Grammar
16:08
and a Bible. Thirteen
16:11
million copies. The joint
16:13
population of Great Britain and the United
16:15
States in eighteen hundred was only fifteen
16:18
million, But back then people
16:20
threw money at language refs. Noah
16:22
Webster got rich from his dictionary, so
16:25
did Fowler and Follet and Partridge and
16:27
scores of others from their grammars
16:29
and usage guides. Strunk and White
16:32
have sold ten million copies of this style
16:34
manual. There was a time
16:37
not long ago when a writer could get paid to
16:39
write about how to write, and the
16:41
American Heritage Dictionary used to brag
16:43
about its usage panel. But
16:46
Brian Garner is in the wrong century. How
16:49
many copies of Garner's Modern English
16:51
Usage has he sold? I don't know exactly,
16:54
but it's fewer than that haul, and paltry.
16:56
Brian Garner has a really nice house, but
16:59
his usage manual doesn't pay his mortgage.
17:01
He gives writing seminars for lawyers. The
17:04
rest of his market has mostly vanished. I
17:07
mentioned Barnes and Noble, but I haven't
17:10
singled anybody out in particular, although
17:12
I kind of did when the first
17:15
two editions of my Usage
17:17
book came out. The usage book has passed;
17:19
we're not going to stock it. I
17:21
mean that has a major effect,
17:24
and they said, no, we've made the decision that
17:26
really this category is defunct. The
17:29
usage book is a defunct category. I
17:31
grab another one of his old books and flip through
17:33
it. Some nineteenth century guide
17:35
to pronunciation. The
17:37
idea that anyone would write, much less
17:40
pay money for a pronunciation guide,
17:42
well, it's preposterous and preposterous.
17:46
It is an interesting fact, and one not
17:48
sufficiently realized that a person who has
17:50
a pronunciation of his own for a word
17:52
is very apt to take it for granted that he
17:55
hears all others pronounce it in the same manner,
17:57
when in fact his own method is
17:59
entirely peculiar to himself.
18:03
It doesn't
18:06
make it true at all. So, talking about making people incredibly
18:08
uncomfortable, fearful of what was coming
18:10
out of their mouths, that's what he's doing. People
18:12
used to feel uneasy about how they use the language.
18:15
They didn't want to sound stupid or uneducated.
18:18
Now they feel uneasy about anyone who would presume
18:20
to judge how they're using the language.
18:23
An old anxiety has been replaced by something else,
18:26
a suspicion of the individual ref. People
18:29
still judge other people by what they say and how
18:31
they say it, but they do it differently, without
18:34
reference to a higher authority, but to
18:36
the crowd. My own bank here
18:38
in Dallas, every
18:40
time there would be in any activity
18:42
on one of my accounts, I'd get an email message
18:45
dear mister Garner semicolon.
18:49
And I called
18:52
my banker and I said, by the way, you know,
18:54
you got hundreds of these things, presumably thousands
18:57
going out by the day, dear customer,
18:59
semicolon, And I
19:03
said, you know, it's got to be either a comma or a
19:05
colon. He said, could you put that in writing? And
19:08
I said, or I'll even give you some authorities, and
19:11
I cited Garner's Modern English usage
19:13
and a
19:16
couple of other authorities
19:18
on this point of punctuation. It's a pretty
19:20
elementary point. Yep, he
19:22
did that. I mean, who else is there
19:24
to cite? But the incorrectly
19:26
punctuated letters just kept coming. Still,
19:29
I was getting dozens every week of dear
19:31
mister Garner semicolon, and it was,
19:33
I was about to change banks over this, because
19:35
it's, it's a little upsetting to
19:37
think I'm doing business with
19:41
people who are doing
19:44
something so egregiously bad.
19:47
And they didn't change it for about
19:49
a month, and so I called him and I said, what's going on? He
19:51
said, well, you know, I showed
19:53
it to some of the people here
19:55
at the bank, but we have a dispute about whether it should
19:57
be a semicolon or a colon,
20:00
and so we just left it. But
20:02
that, that is a demotic view. Well, your
20:04
opinion is as good as mine. Anybody's
20:06
opinion is as good as anybody else's. Demotic.
20:10
Now, there is a word derived
20:13
from an ancient Greek word meaning popular.
20:17
That's how the language is generally refereed
20:20
by popular opinion. Inside
20:22
Garner's bank, by popular opinion,
20:25
it was okay to send out letters teeming
20:27
with semicolons that didn't belong. It's
20:30
obviously not that big a deal. I
20:32
mean, you can still understand what
20:34
the bank was trying to say. Plus,
20:37
it's sort of freeing to rid ourselves
20:39
of this expert language ref
20:42
this annoying little schoolmarmie
20:44
voice in your head. On
20:46
the other hand, what happens when that
20:49
little voice ceases to exist? And
20:51
not just that little voice, but the other
20:53
little voices like it. I'm
21:00
Margaret Sullivan and I was the
21:03
public editor of the New York Times. And
21:05
what's a public editor? I just asked
21:07
that to loosen her up. I knew the answer. The
21:09
public editor is the ombudsman,
21:12
the neutral party inside the news organization
21:15
whose job is to make judgments
21:17
about the news in the same
21:19
possibly irritating way that Brian
21:22
Garner makes judgments about the language,
21:25
to call out the paper when it screws up. Sullivan
21:28
did that at the New York Times from two thousand and
21:30
twelve until the spring of two thousand
21:32
and sixteen. When she left
21:35
a year later, the Times just got rid of its public
21:37
editor altogether. So I would love
21:39
for you to explain to me the
21:42
importance of ombudsmen, why
21:44
they exist in the first place. So,
21:47
for example, and this is not the only role,
21:49
but let's just say someone thinks
21:51
a correction should be made in a news
21:53
story and the
21:55
people who are in charge of that say, well,
21:58
nope, we're not going to do that because we're
22:00
convinced it's right. So then they could
22:02
come to the ombudsman and say,
22:06
what do you think here? The
22:08
thing about the role is that it has
22:10
to be independent. I
22:13
had no editor. I mean I had a copy
22:15
editor, and my copy editor, great
22:17
person, would say to me, are you sure you want
22:19
to say it that way? Or don't you think
22:22
you're going a little too far there? But he
22:24
couldn't tell me not to do it. Sullivan
22:27
was not just a good ombudsman. She
22:29
was a famously good one. She made
22:31
a big deal about reporters who let sources
22:33
approve their quotes. She called
22:35
out The Times for its policies allowing anonymous
22:38
sources, especially in stories
22:40
about national politics. Everyone
22:42
in the news room read and feared her, and
22:45
that probably prevented a lot of distorted
22:47
or unfair stuff from ever
22:49
getting into print. But the
22:51
role she played is dying. The Washington
22:53
Post got rid of their ombudsman in twenty
22:56
thirteen, and the New York Times
22:58
in twenty seventeen. Even
23:00
ESPN had one and got rid
23:02
of it. And why so,
23:05
why has it been in decline? If
23:07
you ask the media organizations, the news
23:09
organizations who have discontinued
23:12
their ombudsperson
23:14
roles, they would say, almost
23:17
to a person, they would say, it's
23:19
not necessary anymore because there's
23:22
so much criticism in the digital
23:24
world on Twitter
23:26
and elsewhere. There's so many voices,
23:29
there's so many ways to get a complaint or
23:31
a point of view out there that we don't
23:33
need to have someone that we pay to
23:36
criticize us internally. You
23:38
don't need a news ref anymore because
23:41
in the new media market, the crowd
23:43
can do the reffing. The Times only
23:45
created the ombudsman role back in two thousand
23:47
and three. The reasoning then
23:50
was the modern media market, the Internet,
23:52
cable TV, the speeding
23:54
up of the news cycle, that was all creating
23:57
pressures that led to some really sensational
23:59
screw ups by the New York Times. They
24:01
printed a bunch of stories on the front page by
24:04
a reporter named Jayson Blair. He
24:06
later confessed that he just made up quotes
24:08
and entire scenes. They printed
24:10
stories saying that Saddam Hussein possessed weapons
24:13
of mass destruction when he didn't.
24:15
Did you, while you were there, did
24:18
you have a sense that there was a decline in the
24:21
need for you to do
24:23
this job? Were there, like, fewer
24:25
things coming in? Oh no, more if
24:27
anything? But there was this belief
24:29
in the air that the crowd could
24:32
do the job. And why
24:34
pay a genuinely independent news referee
24:37
when you could get the crowd to do the job
24:39
for free? Do you ever read
24:42
it? Does anything ever cause
24:44
a story to smell for you? You go, there's
24:46
something wrong. It's the kind of thing that if
24:49
I were there in my job, I'd be getting emails
24:51
about. Oh, yes, absolutely, you
24:53
can see those coming a mile
24:55
away. Now
25:00
I'm going to finish the story of Alex Cogan,
25:03
the young psychologist born in the Soviet
25:05
Union who started out in physics and
25:07
ended up in love. Along
25:11
with a bunch of other researchers and app builders, he'd
25:13
signed an agreement with Facebook to
25:15
study its users. It
25:18
wasn't cheap to do. Alex
25:20
paid the subjects of his studies through some
25:22
survey company. He asked permission
25:24
to let him study overall patterns
25:27
of what they liked and how they used
25:29
emojis. He hoped that the
25:31
data might yield all kinds of insights or
25:34
help address the odd questions that Alex had
25:36
a talent for raising, like what
25:39
is the difference between loving and being
25:41
loved? Fast
25:44
forward to, I'd say, winter
25:47
of twenty fourteen, and one
25:49
of the PhD students in my department at
25:51
Cambridge says, Hey, I've been consulting for
25:53
this company. They'd really love
25:55
to meet you and get like a little consulting
25:58
help from you. Would you be interested? I'm like, sure,
26:00
meet Alex Cogan, student
26:03
of Love. The big
26:05
carrot for me here was that they were going to
26:07
pay for a really big data collection. So
26:10
they're going to pay something like eight
26:12
hundred thousand dollars so we could get all
26:14
this data and I could keep it to do my research.
26:17
And that was really exciting to me because hey,
26:19
this was a really fast way
26:21
to get a really nice grant. So
26:24
I set up a meeting with this company called SCL,
26:26
which would eventually become Cambridge Analytica.
26:29
Yes, that Cambridge Analytica.
26:32
It has nothing to do with Cambridge University.
26:35
It was just a little known political consulting
26:37
firm trying to horn in on the lucrative business
26:40
of advising presidential campaigns. Yeah,
26:43
so we're really looking at page likes. And
26:45
the reason we focused it on page likes was
26:47
there's a few papers published at that point
26:50
that showed that, hey, you could take people's
26:52
page likes and use them to
26:54
predict their personalities with some level
26:56
of accuracy. The company
26:58
asked Alex if he could classify people
27:00
by five personality traits, extraversion,
27:03
agreeableness, openness,
27:06
and so on, use their Facebook
27:08
data to determine which little personality
27:10
buckets they fell into kind
27:12
of routine stuff for him. What
27:15
caught Alex's interest was the chance to
27:17
make other studies of the same people. Why
27:19
do you need that much money to collect the data? Paying
27:21
participants. So the way we usually recruit
27:24
participants is we'd say, like, hey, please
27:26
answer twenty minutes of questionnaires for us, and
27:28
we'll give you a few dollars for your time. And
27:31
in this case we got something like two hundred
27:34
thousand people to go and give
27:36
us twenty minutes of their time, and we paid them around
27:38
four bucks each. He
27:42
didn't even need to go find these people. They
27:44
found him through websites where
27:46
people offered to be lab rats for researchers
27:49
in exchange for cash or prizes. Alex
27:52
gave them cash. They gave
27:54
Alex access to their Facebook data, which
27:56
I guess tells you that a lot of people are happy to put
27:58
a price on their privacy. Anyway,
28:01
Cambridge Analytica's idea wasn't even all
28:03
that original. The Obama
28:05
campaign claimed to have done the same thing with Facebook
28:08
data back in twenty twelve, though
28:10
on a smaller scale. But
28:12
Alex figured out pretty quickly just how
28:15
hard it was to do what his client wanted.
28:18
You couldn't really predict much about people
28:20
using their Facebook data, or
28:22
at least he couldn't. We
28:25
started asking the question of like, well, how often
28:27
are we right? And so there's
28:29
five personality dimensions, and we
28:31
said, like, okay, for what percentage
28:33
of people, do we get all five
28:35
personality categories correct?
28:38
We found it was like one percent. How
28:40
did you even check that, though? How do you find out
28:42
whether someone is an extrovert?
28:45
The two hundred thousand that provided
28:47
us with the personality scores. Because
28:49
those two hundred thousand people who authorized that app
28:51
filled out the personality quiz, we could
28:54
be like, okay, let's go
28:56
and see how these people actually answered, and
28:58
let's see what we predicted and we could compare it. So,
29:00
assuming they know their personality and
29:03
that was right, you got it right one percent of
29:05
the time. One percent of the time. I'm
29:08
going to break that down for you. Cambridge
29:10
Analytica had Alex Cogan collecting
29:13
and compiling Facebook data
29:16
in a way that was incredibly useless.
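A hedged back-of-the-envelope sketch shows why "one percent" isn't surprising: if each of the five traits were predicted correctly about forty percent of the time, and the five predictions missed independently, all five would come out right for only about one percent of people. The numbers here are illustrative assumptions, not figures reported in the episode.

```python
# Illustrative sketch only: the per-trait accuracy is an assumed number,
# not one reported in the episode.
per_trait_accuracy = 0.40  # hypothetical chance one trait is predicted correctly
num_traits = 5             # the Big Five personality dimensions

# If each trait's prediction succeeds or fails independently, the chance of
# getting all five right for one person is the product of the per-trait odds.
all_five_correct = per_trait_accuracy ** num_traits
print(f"All five traits correct: {all_five_correct:.1%}")  # about 1.0%
```

The point of the arithmetic: modest per-trait accuracy compounds badly, so a classifier can look respectable one trait at a time and still almost never get a whole personality profile right.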
29:19
I think we got halfway through the project and realize,
29:21
you know, this probably doesn't work that well.
29:24
But at that point, you know, we're contractually
29:26
obligated to give them the data and they
29:28
were still interested. But here was the crazy
29:31
thing. The consulting firm didn't
29:33
care whether it worked or it didn't. They're
29:35
getting paid pots of money by Ted Cruz's
29:38
presidential campaign, who were trying
29:40
to reach voters on social media.
29:42
The Cruz campaign didn't seem to know that this stuff
29:44
didn't work. With a heavy
29:47
heart, but
29:49
with boundless optimism. Then
29:52
Ted Cruz lost the Republican
29:54
primary to Donald Trump. We
29:58
are suspending our campaign. Cambridge
30:00
Analytica had used Alex's useless
30:02
predictions to help the loser to lose.
30:05
Now amazingly, they sold
30:07
their services to the winner. Alex
30:10
never learned whether the Trump campaign actually
30:12
ever used his data, but in the
30:14
end that didn't matter. And
30:16
when Donald Trump became president, a
30:19
lot of folks thought something incredible had happened.
30:22
So they started looking for incredible explanations.
30:25
Could the same data have been
30:27
possibly used to win this election?
30:29
Because like, how else could this possibly have happened?
30:31
So folks are looking for, like, where's the
30:33
evil genius that could have possibly
30:36
caused all this? That
30:39
was the moment Alex called his old teacher,
30:41
Daker Keltner, who gave him what
30:43
sounded like good advice. I told him like
30:45
keep a low profile and just
30:48
try to stay out of the conversation. And
30:50
that advice mostly worked right
30:52
up until early twenty eighteen. First,
30:55
our chief business correspondent Rebecca Jarvis
30:57
has the latest. He's the scientist
31:00
at the heart of the Facebook privacy scandal,
31:02
and then the drama unfolded a researcher
31:05
at the University of Cambridge. They finally
31:07
realized that I was born in the
31:09
Soviet Union to collect the
31:11
data of millions of users. About
31:19
a week before the stories break, the New
31:22
York Times and The Guardian email me with
31:24
a bunch of questions about like the project
31:27
and also whether I might be a Russian spy. Now,
31:30
I didn't want to ask them, like, guys, if
31:33
I am actually a Russian spy, do you think, like,
31:35
a direct question was going to trip me up, and I'm
31:37
gonna say, you got me, yes, I'm a Russian spy?
31:40
It's now April twenty eighteen.
31:43
Alex Cogan's thinking, surely
31:46
someone will step in and sort this out, some
31:48
neutral third party, some grown
31:50
up inside the New York Times. Maybe
31:53
someone would just stop and think about it.
31:56
He was an academic using
31:58
some political consulting money to
32:00
make useless predictions about people's personalities,
32:03
while also funding his own studies on the side.
32:07
He signed this agreement with Facebook,
32:09
the one that spelled out how he could interact
32:11
with its users, and the company was
32:13
okay with everything he'd been doing. Facebook
32:16
had explicitly agreed to let him use
32:18
Facebook data not just for academic
32:21
research, but for commerce if he could
32:23
find some business use for it. When
32:26
reporters called him, he'd say, look
32:28
at the agreement. Call Facebook, they'll tell
32:30
you the truth. But
32:32
it's clear now that we didn't do enough to
32:35
prevent these tools from being used for harm
32:37
as well, and that goes for fake
32:39
news, for foreign interference in elections,
32:41
and hate speech, as well as
32:43
developers and data privacy. That's
32:46
Mark Zuckerberg on TV, not
32:48
looking like he wants to tell anybody the truth. Facebook
32:51
goes on the defensive. They
32:54
do a press release basically saying, like,
32:56
we've banned Cambridge Analytica, we've
32:58
banned Cogan. They basically also
33:01
say that, you know, Cogan here told
33:03
us it was for academic research and that's why
33:05
we let him do it, which wasn't true at all. We
33:07
need to make sure that people aren't using it to harm
33:09
other people. Facebook wanted people to believe it
33:11
was a victim of this data thief, when
33:14
in fact, it had given Alex permission to do exactly
33:16
what he did. But then Facebook
33:19
was created to be an unrefereed space.
33:22
It allowed its users to do and say pretty
33:24
much whatever they pleased and took
33:26
no responsibility for the consequences.
33:29
Now, the world was furious with Facebook
33:32
for not reffing itself, and
33:34
so it panicked and looked for
33:36
someone else to blame. Alex
33:38
Cogan had set out in life to study our positive
33:40
emotions. He now got his lesson in
33:42
the other kind: anger, mistrust.
33:46
All these reporters were now calling him to ask
33:49
these very weird, hostile questions,
33:51
like why he'd changed his last name after
33:54
he'd gotten married. We
33:56
wanted to find something that
33:59
symbolized both our religious sides and our scientific
34:01
sides, because we're both scientists and religious,
34:04
and we landed on this idea of light,
34:07
and we're like, oh, spectrum, like,
34:09
and then we heard the last name Specter, and I'm like, oh, that's
34:11
really cool, let's do that. So we changed our
34:13
last name to Specter. As bad
34:16
luck would have it, Specter is also
34:18
the evil organization from James Bond.
34:21
I got a lot of questions from
34:24
a lot of journalists saying like, Hey,
34:26
this whole Specter thing is mighty suspicious.
34:31
I just say this that if you're planning
34:33
to do something sinister, if you're
34:35
even vaguely considering the possibility,
34:38
the last thing you should do is change
34:41
your last name to Specter. It's
34:44
like naming a restaurant Sam
34:46
and Ella. Maybe that's just me. All
34:49
the little details of Alex Cogan's
34:51
life had now become evidence
34:53
for the prosecution. No
34:56
one even had to come out and say that Alex
34:58
Cogan was a spy. The Guardian
35:01
ran graphics and little arrows pointing
35:04
from a picture of red Square to a
35:06
picture of Alex Cogan. What,
35:09
the Russia connection? I woke up that
35:12
day too, like two hundred emails
35:14
from pretty much every outlet in the world. CNN
35:18
starts trying to track me down, Like
35:20
I started getting phone calls from like my old
35:22
house in San Francisco that CNN is like poking
35:24
around trying to find me, and then
35:26
they show up at my door. The
35:29
story of Alex Cogan and Cambridge
35:31
Analytica went viral before
35:34
it ever really got checked for whether it
35:36
made any sense. It was reffed
35:38
by the crowd. The crowd
35:41
just decided that it liked the story and
35:43
ran with it. The
35:45
US government started knocking on my door. We got,
35:48
you know, questions from the US
35:50
Senate, the House, and
35:52
etc. Etc. The British
35:55
Parliament reached out and I learned you can't
35:57
really talk to the government as a private citizen.
36:00
So like financially like completely wiped
36:02
me out and, like, massive debt now,
36:04
in terms of the legal bills. As
36:07
far as the academic career, pretty
36:09
much over. A promising academic
36:11
career went poof, just like
36:14
that. All he's got left
36:16
is the possibility of writing a memoir of the experience
36:19
and a lawsuit against Facebook, accusing
36:22
the company of defamation, which he filed
36:24
a few months after we spoke. I
36:28
met with a guy who is
36:31
doing a documentary about all of this, and he's like,
36:33
you know, it's crazy. I was warned,
36:35
and I'm not gonna tell you by who, but it's somebody prominent. But
36:38
I was warned when I'm talking to
36:40
you to be really careful because you're a trained
36:42
covert agent from Russia and you would
36:45
out my phone. I
36:49
think of Alex Cogan as a curious kind
36:52
of victim, even if he refuses
36:54
to sound anything but cheery about his situation.
36:57
He's what happens when the refs are banished from
36:59
the news, when people are encouraged
37:02
to believe whatever it is they want to believe. It's
37:05
not that the news was once perfectly refereed
37:07
and now it's not, or that there
37:09
weren't ever fake stories, or that people
37:11
haven't always believed all kinds of bullshit.
37:15
But there's an obvious antidote, the
37:17
neutral third party, the independent
37:20
authority, the referee
37:22
who makes it more difficult, if
37:25
only just a little bit, for an easy
37:27
lie to replace a complicated truth.
37:31
Yet the job doesn't exist. The
37:34
market doesn't want some neutral third party
37:36
interfering with our ability to create our own
37:38
truths, to render our own
37:40
meanings, to construct our own
37:43
realities as
37:45
we decline, stage
37:47
by stage. Against
38:00
the Rules is brought to you by Pushkin
38:02
Industries. The show's produced by
38:04
Audrey Dilling and Catherine Girardeau,
38:07
with research assistance from Zoe Oliver Gray
38:09
and Beth Johnson. Our
38:12
editor is Julia Barton. Mia
38:15
Lobell is our executive producer. Our
38:18
theme was composed by Nick Britell, with additional
38:20
scoring by Seth Samuel, mastering
38:23
by Jason Gambrel. Our
38:25
show was recorded by Tofa Ruth at
38:28
Northgate Studios at UC Berkeley.
38:31
Special thanks to our founders, Jacob Weisberg
38:34
and Malcolm Gladwell.
38:55
Do you mean an example of the stage, something that's at
38:57
stage one now? Using
39:02
climatic in the sense of
39:04
climactic, this was the
39:06
climatic point of the play.
39:09
Well, climatic, yeah,
39:13
they're both words, absolutely.
39:17
And if you,
39:19
if you take the phrase, so anti-climactic
39:22
is the word, it's an anti-climax.
39:25
But if you search anti-
39:28
climatic versus anti-
39:30
climactic, the ratio,
39:33
and that's the thing, you have to contextualize these searches.
39:35
There's no reason to use anti-climatic
39:38
at all. But it's twenty-eight
39:40
to one in print sources
39:43
in favor of anti-
39:45
climactic. But the fact that the other
39:47
one appears once every twenty
39:49
eight times, that yeah,
39:51
it is. So this is like linguistic
39:54
epidemiology. It
39:56
begins to spread. A lot of us
39:58
have snakes in the
40:00
grass. We call them garter
40:02
snakes, and garter snakes
40:05
have little stripes on them that look
40:07
like garters. But a lot of people
40:09
misheard that and started saying garden snake.
40:11
They thought it was, it's a garden, it's a regular,
40:14
harmless garden snake. Well
40:16
it's a garter snake. That
40:19
is uh wow,
40:22
Well that's a problem. That's eight to one because if that
40:24
snake in the garden is a rattlesnake, that's
40:27
right, there could be a real I've
40:29
got I've got a garden snake out there. Oh
40:31
good, I don't have to wear any protective
40:33
clothing. I'll go catch it. Well, you know that
40:36
these are problems. People
40:38
would say you and I just made that up. Give
40:40
me an example of the stage, stage four.
40:43
Um, misspelling minuscule as
40:45
if it were miniskirt: miniscule, m-
40:47
i-n-i-s-c-u-l-e. But
40:49
that's two to one in print now. Or
40:53
antivenin. Now here's one: anti-
40:56
venin. If you get bitten, not by
40:58
a garter snake, but by a rattlesnake,
41:01
you need antivenin: v-e-
41:03
n-i-n. But the
41:06
noun for what the snake puts
41:08
into you is venom. And so a
41:10
lot of people you
41:12
know this is is it really worth
41:15
preserving? I don't know. It's
41:17
traditional English antivenin,
41:19
and it comes from a Latin form.
41:23
But people have started saying antivenom,
41:26
and that one is one
41:28
point two to one in favor of
41:30
antivenom. But that's one where I continue
41:33
to recommend the traditional form
41:35
antivenin. So you go into
41:37
the garden and you pick up the snake because you think
41:39
it's a garden snake, and you're bit by the rattlesnake,
41:42
and you go, you're bitten. You're bitten.
41:44
Bit, thank you very much, bitten by the rattlesnake,
41:46
and you're taken to the hospital, and by the time they figure
41:48
out what you're trying to ask for, because you're asking for
41:50
antivenom and they don't have any, you're
41:53
dead. Yeah, because you mispronounced it. Sorry, we're not
41:55
giving you any. All we have is antivenin.
41:58
We don't have any antivenom. And
42:00
by the way, I don't normally correct people, but
42:02
forgive me for that bitten thing,
42:04
Thank you very much. Sure,