Episode Transcript
0:02
Hello, everybody. Welcome to the
0:04
Radical Candor Podcast. I'm Kim
0:06
Scott. And I'm Jason
0:08
Rosoff. And today we're going to
0:10
be discussing a philosophy that's become
0:12
almost mythical in tech circles and that
0:14
we're watching play out in real
0:16
time in the US government. This
0:19
philosophy is often referred
0:21
to as the move fast
0:23
and break things approach to
0:25
work, and when this approach is effective
0:27
and when it might lead us astray. So
0:30
with that, let's get going. I have
0:32
to say, Jason, that one of
0:34
my favorite photos, and we will put
0:36
it in the show notes, comes
0:38
from a friend of mine who used
0:41
to work at Facebook, which is
0:43
the company that coined this, Move Fast
0:45
and Break Things. And the
0:47
new sign was, slow down and fix
0:49
your shit. So
0:53
Move Fast and Break Things
0:55
is, in my book, sort
0:57
of obnoxiously aggressive. However, I
0:59
will say, at Google,
1:01
we called it launch and
1:03
iterate. Yeah. And
1:05
that seems to me to
1:07
be a better way
1:10
to say what the good
1:12
part of this is, which is that if
1:14
you're so afraid of making a
1:16
mistake, you can't innovate, you
1:19
can't fix things. It's sort of
1:21
the ethos behind whoopsie daisy,
1:23
which is publicly saying, oh,
1:25
I messed that one
1:27
up. And I'm going to
1:29
do better next time. I mean,
1:31
another way to say the good
1:33
part of this is something that
1:35
is at the bottom of a
1:37
friend of mine's every email:
1:39
make new mistakes. So
1:42
I think that
1:44
it needs to be
1:46
OK to make a mistake.
1:48
You can't innovate if it's not OK to
1:50
make a mistake. And it even
1:52
needs to be okay to
1:54
admit mistakes in situations where it's
1:56
really not okay to make
1:59
mistakes like hospitals. This is
2:01
kind of what's
2:03
behind Amy Edmondson and
2:05
her book, The
2:07
Fearless Organization. A lot
2:09
of the research she did behind
2:11
psychological safety is if you can't
2:13
admit a mistake, if you can't
2:15
share a mistake, then you're
2:17
doomed to make
2:20
it over and over and over
2:22
again. That's right, yeah. And paradoxically,
2:24
the hospitals
2:26
where the most mistakes were reported
2:28
were also the safest hospitals, which
2:30
was not what you'd expect. So
2:32
that's my blink response. When
2:34
I hear move fast and
2:36
break things, especially in the
2:38
context of firing lots of
2:40
people, it feels evil to
2:43
me. Yeah. There
2:45
was a very concrete example of
2:47
this, which was there was
2:49
an excerpt which we'll try to
2:51
find for the show notes
2:53
of one of the cabinet meetings
2:55
where Elon Musk is talking
2:57
about making mistakes. And first of
2:59
all, what's he doing in
3:01
a cabinet meeting? He's an unelected
3:03
bureaucrat. The
3:06
fact that he was in the
3:08
room is an example of moving
3:10
fast and breaking things when they
3:12
should be slowing down and fixing
3:14
their shit. Yes. But he
3:16
said, jokingly,
3:19
we're going to
3:21
make mistakes,
3:23
like for example we cut
3:25
all the funding to Ebola research
3:27
and then we realize, whoops,
3:29
we probably should have kept the
3:31
funding for Ebola research, so
3:33
we turned that funding back on.
3:35
And in the moment,
3:37
that clip makes
3:39
it seem sort of reasonable,
3:42
but it doesn't capture the whole
3:44
picture, because part
3:46
of what's happening is that
3:48
the uncertainty of grant money
3:50
is unwinding programs. So
3:52
it's not just the
3:54
absence. It's not that
3:56
a short-term absence of
3:58
funding has no effect. It's
4:00
in fact the case that
4:03
in some cases they are
4:05
so dependent on the next
4:07
distribution of money that they
4:09
literally had to stop an
4:11
experiment midstream. And so, if there
4:13
were live samples of something,
4:15
for example, they couldn't afford to
4:17
keep those frozen, so those
4:19
samples went bad. There's
4:21
a real effect. There's
4:24
a real cost to that mistake. Correct.
4:27
And so, on
4:29
the one hand, credit
4:31
where credit's due, realizing that was
4:33
a terribly... stupid decision and it needed
4:35
to be undone. And on the
4:38
other hand, one
4:40
of the things that I would like
4:42
to add to your list of, you
4:44
know, things to look
4:46
out for when you're considering
4:48
a sort of ship-and-iterate approach
4:50
is: what is the cost?
4:52
What are the side effects of
4:55
negative outcomes? Yes. Because if
4:57
the side effects are significant or
4:59
potentially very costly, then
5:02
it's
5:04
much more prudent to slow
5:06
down and make sure you're making
5:08
the right decision. Yeah, exactly.
5:10
I mean, launch and iterate was
5:12
about search results. It
5:14
did not apply to
5:16
a nuclear power plant,
5:18
for example. You
5:21
would not want to launch
5:23
and iterate at a nuclear power
5:25
plant. You want to make
5:27
damn sure that you're not going
5:29
to, you know, blow up
5:32
a major metropolitan area, or any
5:34
area. If you
5:36
launch, you've got to test
5:38
and test and test. And
5:40
even when the
5:42
consequences of failure are not
5:44
that dire: Apple was
5:47
not a launch and iterate
5:49
kind of culture, because they were
5:51
making hardware, so
5:53
you couldn't just push a fix
5:55
out. And once you
5:57
have sold this phone to someone,
5:59
you can't really fix
6:02
it very easily, very cheaply anyway.
6:04
It's not just a matter
6:06
of pushing a software patch out.
6:09
And so they tested it and
6:12
tested it. Apple was much
6:14
more of a measure 100 times
6:16
cut once kind of culture, which
6:19
is not to say that
6:21
it wasn't innovative. Apple's obviously incredibly
6:23
innovative. So you don't even
6:25
have to launch and iterate to
6:28
be innovative and you certainly
6:30
don't have to move fast and
6:32
break things. Go
6:36
ahead, you were going to say something. No,
6:39
I think the
6:41
cost of failure
6:43
calculation, I think
6:45
you're right. There
6:48
are some times
6:50
in human history
6:52
where, in
6:54
fact, we
6:59
took great risks with people's lives
7:01
because the cost of failure was even
7:03
higher. I think it's important to
7:05
put everything on a spectrum. When
7:08
we were testing aircraft, for example,
7:10
there were all these people who
7:12
put their lives at risk to
7:14
test this aircraft. Even though they
7:17
were testing to try to get
7:19
it right, the cost
7:21
of failure was still high,
7:23
like test pilots died in
7:25
testing these aircraft. But
7:28
the work that they did wound
7:30
up putting us in a position
7:32
to be able to turn the
7:34
tide of World War II. So
7:36
the potential for success was great.
7:38
But the thing that they did
7:40
really well was they were very
7:42
clear headed about the cost of
7:44
failure. Meaning they understood that they
7:46
were putting people's lives at risk.
7:49
And they said, we're measuring that
7:51
against the good that we think
7:53
this can do. And it's not
7:55
that when people do that, they
7:57
always get it right. But I
7:59
think that's a pretty big difference
8:01
than the sort of like, har
8:03
har we, you know, we undid
8:05
Ebola funding. You know what I'm
8:07
saying? Like the joke. Right.
8:09
Exactly. Because he's not going
8:11
to bear that cost. Musk
8:14
is not going to. And
8:16
part of the problem
8:18
with what's happening is that, you
8:20
know, part of the American
8:22
experiment was checks and balances, right?
8:24
Yeah. And so we were
8:26
putting checks and balances in place.
8:29
Our governmental system put checks and
8:31
balances in place so that
8:33
the president didn't have too much
8:36
power. The Congress didn't have
8:38
too much power. The judiciary didn't
8:40
have too much power. They
8:42
could check each other's power. And
8:45
unfortunately, those checks and balances
8:47
are being undone before our
8:49
very eyes. And then
8:51
it becomes impossible to
8:53
hold people accountable. Part
8:55
of the cost of
8:58
failure means that if you
9:00
fail, there should be
9:02
some accountability for failure. And
9:06
in this case, there's
9:08
not. That's the problem of
9:10
having an unelected bureaucrat
9:12
in these cabinet meetings. One
9:15
of many.
9:18
I think the other thing that's
9:20
on my mind is just
9:22
to bring back a topic we've
9:24
talked about many times, which
9:26
is one of the issues with
9:29
move fast and break things,
9:31
even if you're aware of the
9:33
potential cost, the
9:35
potential negative impact of
9:37
failure, is that
9:39
it's often quite hard
9:42
to measure the full cost
9:44
of failure. And so
9:46
if your attitude is like,
9:49
there isn't really, there doesn't need to be
9:51
accountability for failure. You know, if you
9:53
have this like sort of careless attitude. Yes,
9:57
like careless people. Yes,
9:59
you wind up, you wind up being
10:01
much more susceptible to the measurement
10:03
problem, which is that it is very
10:05
hard to measure the things that
10:07
really matter when you're doing these calculations.
10:10
And so it
10:12
does take slowing down to really
10:14
consider what the externalities of what
10:16
we're doing actually are for you
10:18
to realize, oh, it
10:21
may not be so simple as
10:23
just turning the funding back on. We
10:26
may be setting ourselves back if
10:28
we do this in this way. I
10:30
think size also matters. I
10:32
think a lot of the launch and
10:34
iterate and even the move fast
10:36
and break things kind of culture came
10:39
from a world in which
10:41
these companies were small,
10:43
right, when they started saying
10:45
launch and iterate like it
10:47
didn't really matter if Facebook
10:49
made a mistake in the
10:51
early days because Facebook didn't
10:53
really matter in the early
10:55
days and now as we
10:58
have seen it's having a
11:00
huge impact on the psychological
11:02
well-being of all of
11:04
us, polarization, you know. And the
11:07
Myanmar genocide
11:09
was planned on
11:11
Facebook. And
11:14
it really does matter.
11:16
Mistakes really have real
11:18
world, terrible human consequences.
11:21
And I've been
11:24
thinking, I've been
11:26
doing a fair amount of
11:28
retrospection about my career
11:30
in tech and what has
11:32
gone wrong in
11:34
recent years. And I think,
11:37
in fact, I may even have to
11:39
go back to the first job
11:41
I had out of business school at
11:43
the FCC in 1996. And
11:45
this was when the
11:47
Telecom Act came out and
11:49
Section 230 is a
11:52
section of the Telecom Act
11:54
of 1996. Section
11:56
230 explicitly lets social
11:58
media platforms, lets tech
12:00
platforms off the hook
12:02
for any content,
12:05
for any accountability for
12:07
content moderation. And in 1996, I
12:09
mean even then it was sort
12:11
of questionable writing this blank check
12:13
to this new industry. But,
12:15
you know, I understand
12:18
what we were thinking at the
12:20
time. We were thinking it doesn't
12:22
make sense to regulate
12:24
companies that don't even
12:26
exist yet. I mean in 1996
12:28
Google hadn't even been founded
12:31
and I think Zuckerberg was in like
12:33
third grade or something. And
12:35
so these platforms
12:38
didn't exist at
12:40
the time. And
12:42
so there was some sense
12:45
to it. But I wish
12:47
that we hadn't, in retrospect,
12:50
written this blank check for
12:52
all time because now this
12:54
check is coming due and
12:56
we can't afford to pay
12:58
it. And so at this
13:00
point, it is time to
13:02
hold these companies accountable for
13:04
the harm that the content
13:06
on these platforms does. And
13:08
I think also you raised one of
13:10
my favorite topics of all time, the
13:12
measurement problem. Part
13:15
of the issue is
13:17
that if you
13:19
just are measuring engagement,
13:22
which is, sort of,
13:24
all they
13:26
really measure. I shouldn't say
13:28
that's all they measure,
13:30
but engagement is very
13:32
important to Facebook's business,
13:34
because the more
13:36
engaged users are the
13:38
more likely people are to
13:40
contribute content to Facebook
13:42
and to Instagram and to
13:44
all these other platforms.
13:46
And people
13:48
tend to engage more
13:50
with negative content with negative
13:52
emotions, which is why
13:54
these platforms tend to sort
13:56
of spew more FOMO,
13:58
fear of missing out, more
14:01
deeply enraging content that
14:03
polarizes us all, more content
14:05
about eating disorders, all
14:07
of these kinds of things,
14:09
because we pay attention
14:11
to these things. And so
14:13
they're just kind of
14:15
tracking what they can measure.
14:17
It's very hard to measure
14:19
the value of a well
14:21
-functioning society. Believe
14:24
me, it doesn't factor in
14:26
clearly. Even
14:28
though in the end, if
14:31
society dissolves, it'll kill the
14:33
goose that laid the golden
14:35
egg. It seems like you
14:37
would factor it in. I
14:40
spent many of the formative
14:42
years of my career in Silicon
14:44
Valley working at a nonprofit.
14:46
On the other side, at Khan
14:48
Academy. At Khan Academy, yeah.
14:51
Yeah. And I will say, we
14:53
thought a lot about
14:56
these externalities. What does
14:58
it mean to give this away? Can
15:00
I interrupt? Can you explain to people who
15:02
don't know what an externality is? Oh,
15:04
sure. Yeah. But just a side
15:06
effect is maybe a simpler way
15:08
to say it. The side effects of
15:10
what we were doing. So
15:12
Khan Academy was providing
15:14
free, world-class
15:17
educational content. And
15:20
we were mostly fine with that
15:22
because most of the other people who
15:24
were providing content were educational publishers. And
15:27
we didn't think the educational publishers were
15:29
doing a great job of providing great
15:31
content for teachers and students to use.
15:34
But we were conscious of
15:36
the fact that there
15:38
may be some side effects.
15:41
So for
15:43
example, as much
15:45
as we might not like to
15:47
think about it this way, a
15:49
textbook you can use for many
15:52
years and many students can use
15:54
a textbook and you don't need
15:56
a computer and you don't need
15:58
internet access to use a textbook.
16:02
We were conscious of the fact
16:04
that if we were successful
16:06
and we undermined this other way
16:08
of getting access to stuff,
16:10
it had the potential of creating
16:13
or exacerbating inequities in the
16:15
system. Even though
16:17
it would be done for all the best
16:19
reasons, and there wasn't
16:21
a profit motive, so we were giving it
16:23
away for free, but there were these
16:25
potential side effects of what we were doing, and
16:28
we thought about that a lot.
16:30
And what that caused us to do
16:32
was to build partnerships with people
16:34
who were bringing free or very inexpensive
16:36
internet access, free or very inexpensive
16:38
computing to schools in the US
16:40
and around the world. Like, we
16:42
thought about, how do we make
16:44
sure that, if
16:47
we're successful, we don't wind
16:49
up just sort of you know
16:51
giving really great access to stuff
16:53
to rich kids who already had
16:55
all the things that they already
16:57
needed. And
16:59
I think that... How
17:01
much time do you think Facebook
17:04
spends worrying about that? I
17:06
mean I'm sure there are people I'm
17:08
sure there are but let me pause
17:10
I'm sure there are people at Facebook
17:12
who care deeply deeply about this like
17:14
I do not mean to dismiss everyone
17:16
who works at Facebook I have friends
17:18
who work at Facebook or Meta. So
17:22
I certainly don't mean to
17:24
cast aspersions on all these people.
17:26
But if you're making that
17:28
argument, and meanwhile these other metrics
17:30
are making all the
17:32
money, the argument about
17:35
this thing that you care passionately about
17:37
is likely to get lost in
17:39
the noise. Yes. And I
17:41
was going to say the
17:43
exact same thing. I think,
17:45
in part, there's a
17:47
benefit of... You know,
17:49
the Khan Academy relied on
17:51
donations, so that limited
17:53
our scope and our ability
17:55
to grow. And
17:58
that meant it was easier to keep
18:00
people aligned around the mission and it
18:02
was easier to sort of bubble up
18:04
or center these conversations around making sure
18:06
that we were actually achieving the mission
18:08
and not just sort of reinforcing inequities
18:10
that were already there. And
18:13
so I do imagine that there
18:15
are some people at Facebook who are
18:17
every day having a discussion about
18:19
how can they help to fix some
18:21
of the problems that social media
18:23
as a medium has created and all
18:25
this other stuff. As
18:27
there are at Google, this great
18:29
program Grow with Google where
18:31
Google is trying to offer content
18:33
to people to learn skills
18:35
that will help them get jobs.
18:38
Right. It exists
18:40
everywhere. It's sort
18:42
of not at the core of
18:45
the machinery. It exists on
18:47
the periphery as these
18:49
organizations grow. I
18:52
think that back to the
18:54
measurement problem, what's measured is
18:56
managed. If short-term
18:58
profits are measured, if
19:01
engagement is measured, those
19:03
are the things that actually wind
19:05
up being managed against. Yeah,
19:07
I think you're totally right that the
19:09
long-term play is a bad one, which
18:11
is, like, if it actually undermines society and
18:14
the health of people, people become so disgusted
19:17
with it that they, you know, they
19:19
leave these platforms, then they have no users
19:21
left so they can't make any more
19:23
money. Yeah, but again,
19:26
I think a consideration of
19:28
the periphery as opposed to like at the
19:30
core. Yeah, well, I mean, I
19:32
think part of the
19:34
problem, to be fair,
19:36
is not only the
19:38
metrics that drive Facebook's business,
19:40
but also the market.
19:42
If the market rewards quarterly
19:44
earnings, it's really hard to
19:47
worry about the downstream impact
19:49
of your product. Although
19:51
it's not impossible. Again,
19:53
I don't mean... for
19:56
this to be a
19:58
Facebook-bashing, Google-promoting podcast.
20:01
But like in the S1 letter, when I
20:03
took the job at Google in 2004,
20:05
and you can feel free to push back
20:07
and tell me I'm being, you
20:09
know, I had drunk the Kool-Aid. But, you
20:11
know, Larry and Sergey said, we are
20:14
not a normal company. We don't intend to
20:16
become one. And we're going to invest
20:18
a lot of money in things that are
20:20
not going to make short term return
20:22
for shareholders. And if you're not comfortable
20:24
with that, you know, don't buy our
20:26
stock. You know, and we're
20:28
going to continue to reward and treat our
20:30
people well. And if you're not, if
20:32
you're not happy with that, don't, you know,
20:34
don't buy the stock. And that to me
20:36
was really important.
20:38
So let's talk about
20:40
sort of the importance of
20:42
debate. Like another problem of move
20:44
fast and break things is
20:46
that there's no time for discussion.
20:48
And in
20:51
order to really innovate you need to
20:53
create time and space for discussion. Yeah, everything
20:55
we're just talking about is
20:57
exactly what was on my mind. Like, there
20:59
are people who want to have these
21:01
debates who have really good arguments for why,
21:04
you know, we should or shouldn't
21:06
do something. And
21:08
to your point, like...
21:11
I don't think it's just the attitude. Like,
21:13
to some degree, it's
21:15
also size that makes a
21:17
difference here. Like the larger
21:19
the organization gets, the harder
21:21
it is to have debates
21:23
with the people who really
21:26
matter, like whose arguments are
21:28
going to move the needle. Maybe a better
21:30
way to say it: it's not about the
21:32
people who matter, but it's about the arguments that
21:34
they're able to make. And so as a
21:36
result, you wind up with, and
21:38
I think this was some of what
21:40
was in Careless
21:42
People, is you wind up with
21:44
these sort of
21:46
tiny echo chambers inside the
21:48
company where there's reinforcement of
21:50
bad behavior, in part because they're not
21:52
listening, you know what I'm
21:55
saying? Like, they're not inviting the
21:57
disagreement, they're not inviting the debate.
21:59
and for folks who are not familiar
22:01
with Careless People, this is a
22:03
book. You want to talk about Careless
22:05
People for a second? Oh, no,
22:07
you go ahead. You got it. It's
22:09
a book. It's
22:11
sort of a memoir
22:13
written by a
22:15
former Facebook employee. And
22:18
she really describes in
22:21
great detail some of
22:23
the problems with the
22:25
way the systems worked
22:27
and, again,
22:30
the negative externalities, the negative
22:33
impact on all of us, the
22:35
negative side effects that all
22:37
of us are bearing the cost
22:39
of these negative externalities that
22:41
are created by the way that
22:43
Facebook's system works. And
22:46
Meta sort of prevented, you
22:48
know, sued the author and
22:50
prevented her from talking about
22:52
her book. So we're trying
22:54
to talk about her book
22:57
for her. And
22:59
by the way, it also prompted
23:01
me to read Frances Haugen's book,
23:03
The Power of One, which
23:05
I hadn't read before. It's
23:08
also really a great explanation
23:10
of how these systems work
23:12
and what we could do
23:14
to make them not create
23:16
these terrible negative side effects
23:18
for society. And it begs a
23:20
lot of questions like why Facebook
23:22
isn't already doing these things. And I
23:24
think part of the answer is
23:26
that there's no public debate about
23:28
how their algorithm should
23:30
work. And there's no sense
23:32
that there should be
23:34
a public debate about that.
23:37
And I think even to some
23:39
extent, there's very limited internal
23:42
debate. Debate, yes. And
23:44
so if
23:46
you're a
23:48
free market
23:50
capitalist believer,
23:53
in theory, the idea
23:55
is that people can have
23:57
an opinion about this and they
23:59
can vote with their money.
24:01
They can basically say, I'm not
24:03
going to give my money.
24:05
But that, to some extent, is
24:07
undone by the business models
24:09
that they've created, which is essentially
24:12
monetizing. They don't
24:14
charge us to access
24:16
the content. They
24:18
charge advertisers, essentially, to
24:20
support the platforms. And
24:23
so it's very hard. Right
24:25
before we got on the podcast,
24:27
we were talking about, you know, should
24:29
we continue to post on Facebook?
24:31
Should we continue to contribute content to
24:33
Facebook and Instagram? And
24:36
I think part of
24:38
what's interesting about that
24:40
debate is like, it's
24:42
hard to effect a
24:45
real protest when the
24:47
consequences are fairly delayed
24:49
for a company like
24:51
Facebook. It would take
24:53
massive, you know, movements
24:56
of people away from
24:58
the platform to start
25:00
to really have a
25:02
noticeable negative impact on
25:04
their revenue. And that's
25:06
different, right? Like, you
25:08
know, when Tylenol had to
25:10
recall, this is like in the
25:12
70s or something, this thing,
25:15
people stopped buying Tylenol, like
25:17
the money dried up,
25:19
like it went away fairly
25:21
immediately. So there was
25:23
actually like a market response.
25:25
There was like a consequence. Yeah,
25:28
although if everybody, let's
25:30
say if even 10% of
25:33
the people who contribute, I
25:35
mean, most Facebook users are readers,
25:37
not writers. That's right. If
25:39
only 10% of the people,
25:41
and we're writers on the thing,
25:43
we're giving away our content
25:45
to this platform. If
25:47
only 10% of the people
25:49
who actually post to Facebook
25:51
quit posting, that would
25:54
be a huge and
25:56
very quick problem. In fact,
25:58
this is the thing.
26:00
I had two ahas
26:02
when I read these
26:04
two books, Careless People and
26:06
The Power of One.
26:08
One aha was that part
26:10
of the reason why
26:13
Facebook started advocating, Frances Haugen
26:15
explains this, started sort
26:17
of pushing more polarizing content
26:19
on their platform is that
26:21
it got more engagement. People
26:24
who are contributing content were
26:26
slowing down. They weren't contributing
26:28
as much content because they
26:30
weren't getting so many reactions
26:32
from people, likes and whatnot.
26:36
They found that when
26:38
they promoted more extreme
26:38
content in their algorithm, it
26:40
got more engagement. And then people started,
26:44
and it was really a vicious
26:46
cycle because then even if you didn't
26:48
believe these extreme things or if
26:50
you were writing headlines for example you
26:52
started writing these clickbait headlines and
26:54
so that was
26:56
one real problem. The other thing,
26:59
and maybe this is just my own
27:01
stupidity that I didn't know. I've
27:03
always sort of wondered like why do
27:05
they allow all these political ads
27:07
like, it's not that
27:09
much money. Why don't they just
27:11
disallow them? And what
27:13
I realized was they allow
27:16
them because now they are
27:18
kingmakers. Now they
27:20
can help you get elected
27:22
or unelected. And so no
27:24
official dares regulate Facebook because
27:26
Facebook can prevent them from
27:29
getting elected. So this is
27:31
a huge dampening on democratic
27:33
debate about the ways that
27:35
we should regulate the
27:38
content on Facebook
27:40
slash Instagram slash Meta. Yeah,
27:42
and that was like, I
27:44
don't know why I didn't
27:46
realize that before, but I'm
27:48
like, oh, of course, you
27:51
know. So that, I
27:53
think, is important to
27:55
think about. Yeah,
28:04
so, like, there are multiple layers at which
28:06
we needed debate, and we didn't have
28:08
it. Like, internally, I'm sure there were
28:11
people who were like, we should not be
28:13
doing this, like, you know, the engagement
28:15
thing is good in the
28:17
sense that we're getting more people to
28:19
write, but it's bad in the sense that,
28:21
like, the content is worse and people
28:23
feel worse about it. Like, I'm sure there
28:25
are people making that argument inside of
28:28
Facebook as they were deciding, yes,
28:30
of course, to do this. And then there
28:32
was, like, to your point, the public
28:34
debate, where, like, the public having
28:36
an opinion about whether or not this is,
28:38
uh, or the broader market having
28:40
an opinion about whether or not what they're
28:43
doing is good or should continue, that
28:45
doesn't really happen, or doesn't feel like
28:47
it should happen. Um, and then there's
28:50
also, like, the government-level
28:52
debate. Yes, the public, the
28:54
democracy level debate. It
28:56
seems increasingly, it feels
28:58
to me anyway, like there's
29:00
a move to just
29:02
do away with, I don't
29:04
want to have the
29:06
debate at all. So let's
29:08
end it, how about we
29:10
just end democracy? And that
29:12
obviously is very worrisome. But
29:15
I want to go back to,
29:17
there was a key moment in
29:19
my career where somebody raised the
29:22
issue of the importance of having
29:24
sort of public debate about decisions.
29:26
This is when I was
29:28
working at Google. I
29:31
went, so I was
29:33
managing AdSense, the AdSense
29:35
online team. And
29:38
we, in
29:41
addition to sort
29:44
of trying to grow the
29:46
business, I was also in
29:48
charge of policy enforcement and
29:50
creating policy. And
29:52
that meant that at the
29:55
same time that I
29:57
was trying to grow revenue,
29:59
I was also in
30:01
charge of terminating AdSense publishers
30:03
who violated Google's policies. Got
30:05
it. And this goes back
30:07
to the measurement problem. I
30:10
don't think it would have mattered. In fact,
30:12
I can tell you for sure, it did not
30:14
matter how fast it grew, how
30:17
much money AdSense
30:19
made, did not specifically
30:21
impact my compensation. And
30:25
that may seem sort
30:27
of nutty for someone
30:29
who was in charge
30:31
of sales and operations,
30:34
but there was an understanding at
30:36
Google that, if we
30:38
measured things that narrowly, we were
30:40
going to get the wrong kinds
30:43
of behaviors. And so
30:45
I was equally as
30:47
passionate about taking down
30:49
the bad sites, or
30:52
the sites that were violating policy, I
30:54
shouldn't call them bad sites, as
30:56
I was about growing revenue. And
30:58
I was really excited about
31:00
growing revenue, believe me. But
31:02
I really believed that
31:04
you weed your garden. And
31:06
if you allow your
31:08
garden to get overrun by
31:10
weeds, that's not good
31:13
for your garden long term.
31:15
And I don't understand why
31:17
that doesn't happen more at
31:20
Facebook, because it's possible to
31:22
create a system where people
31:24
are caring about both of
31:26
these things. So I
31:28
thought I was doing a great job. That's
31:31
the TLDR there. But
31:35
then I was invited to
31:37
speak at this class called
31:39
Liberation Technology. It was
31:41
a class taught at Stanford, and it was a
31:43
class taught by Josh Cohen, who is an old
31:45
friend of mine, a person I like a lot.
31:48
And I was describing to him
31:50
content moderation challenges and this
31:52
big debate I had had. And
31:55
we had had the debate
31:57
at Google. This was actually around
31:59
Blogger, which I also managed for
32:01
a while, the policy enforcement. And
32:04
there was somebody who had
32:07
written something calling for
32:09
genocide, basically, kill all
32:11
the X people. And
32:14
I shouldn't even say X,
32:16
kill all the ABC people.
32:18
And I wanted to take
32:20
it down. I believed that
32:22
that kind of content had
32:24
no part in the AdSense
32:26
network. And I was just
32:28
going to pull it down.
32:31
But I didn't have unilateral decision-making
32:33
authority. This was something that had
32:35
to be discussed more broadly. So
32:37
I was in this big meeting
32:39
at one point, you know, Eric
32:41
Schmidt sort of agreed with me.
32:43
He said, you know, if we
32:46
had a dance hall and not
32:48
blogger, you know, we would not
32:50
rent it out to the KKK.
32:52
Like why would we allow this
32:54
kind of content? And
32:56
yet both Eric and
32:58
I got overruled by kind
33:01
of the free speech crowd.
33:03
And in retrospect, I think I
33:05
was right, and I wish
33:07
I had fought harder. But these
33:09
are hard questions, because their
33:11
point of view was that you're
33:13
better off knowing who believes
33:15
these things and who's saying it
33:17
than not knowing and you
33:19
know forcing this kind of stuff
33:21
to go underground. And
33:23
so in the end, I
33:25
think, if memory serves, which it
33:28
often doesn't, the older I
33:30
get. But I think what
33:32
happened was we put a content
33:34
warning. We left the site
33:36
up, but we put a
33:38
content warning saying something like,
33:40
this is bad. And although
33:42
I'm sure that's not what it said. And
33:45
so I was talking about this
33:47
in the class and thinking that we
33:49
had a pretty good debate process. And
33:51
I'll never forget Josh looked at
33:53
me and he said, you are making
33:55
those decisions. And at first, I
33:58
was kind of insulted, you know,
34:00
I'm like, but what's wrong? What? Of
34:02
course. Like, why am I
34:04
not? And he was like, you are
34:07
totally unqualified to make this.
34:09
And, you know, once I got
34:11
over feeling kind of offended, because
34:13
this was the most interesting part
34:15
of my job hands down, you
34:17
know, I really did care about
34:19
it. But Josh said, and
34:21
now I think he's right, like,
34:23
these decisions, there needs to
34:26
be some democratic oversight for these
34:28
kinds of decisions because they have
34:30
such a huge impact on our
34:32
whole society. So this
34:34
is a big one: Josh was right,
34:36
I was wrong. But at the time, I
34:39
was like, oh, Josh, you don't
34:41
understand, the government could never be
34:43
involved in these kinds of decisions.
34:45
You know, like, I have people,
34:47
we had a policy about no
34:49
porn sites. And some clever person
34:51
took a picture of himself in
34:53
front of a toaster. But
34:55
of course, the toaster was very shiny
34:57
and he was naked. And
35:00
it's complicated to manage
35:02
all the content. And
35:04
there was another moment
35:06
where there were these
35:08
ads for bestiality showing
35:11
up on a, this
35:13
is all a long
35:15
segue, but they're funny
35:17
stories. But there was
35:19
an ad for bestiality
35:21
that kept showing up
35:23
on this parenting magazine
35:25
that was an AdSense
35:27
publisher. And obviously they
35:29
were very upset about
35:31
this bestiality ad. And
35:34
I called up
35:36
the ad content moderation
35:38
team, which was
35:40
in another country, but
35:42
anyway, they
35:44
were reviewing all the ads and I was
35:46
saying, why are these ads even showing up
35:48
anywhere? Like we have a policy against porn
35:50
ads. And this person
35:53
claimed that bestiality didn't count as
35:55
porn. I was like, gosh,
35:57
that was not on my bingo
35:59
card today. The argument about
36:01
whether. So anyway, I told these
36:03
stories and, yeah, they
36:05
all got a big laugh and
36:07
nobody really stopped to think
36:09
about should we have democratic oversight
36:11
over some of these decisions.
36:14
And I now believe we should.
36:16
Josh Cohen was right. I'm not
36:18
arguing against it, but I'll
36:20
just say like I think
36:23
that it's so much harder
36:25
in practice to achieve the
36:27
goal of oversight in a
36:29
way. Here
36:31
just to give your team credit for
36:33
a second like I think your team
36:35
probably thought about these things very seriously
36:37
and maybe even put in, like, days or
36:39
hours of time, and it went all
36:41
the way. I mean, you know, the
36:43
CEO of the company was willing to
36:45
spend his time on this like Google
36:47
took this very seriously right and I
36:49
guess, like, my experience with... Like, I
36:52
don't know exactly how you encourage that
36:54
kind of debate and at what level
36:56
and who gets involved outside of these
36:58
companies. I'm like, I'm open to the
37:00
idea, but I'm just recognizing a practical
37:02
challenge of the fact that everybody else
37:04
who would be contributing to that conversation
37:06
would either need to be paid by
37:08
Google or would be doing it on
37:10
a volunteer basis. You know what I'm
37:12
saying? They'd be volunteering their time to
37:14
do this thing. And that, I
37:16
think, is part of the reason why the
37:18
public debate hasn't been vigorous about this
37:20
stuff is because it's like, how do you
37:23
make the time to really understand what's
37:25
going on? And instead of public debate, what
37:27
we get is hot takes. You
37:29
know what I'm saying? What we
37:31
get is like one person saying,
37:33
you know, something which is missing a
37:35
whole bunch of context, but it's
37:37
sort of punchy. And so as
37:39
a result, a bunch of people are like, yeah,
37:41
I agree with that person. And then the
37:43
other person sort of fires back and their thing
37:45
also misses a bunch of context. And as
37:47
a result, we're not really having a debate. We're
37:49
just sort of like throwing barbs at each
37:51
other. The debate certainly should not happen on social
37:53
media. Yeah, for
37:55
sure it needs to happen
37:57
in a different way. And I think
37:59
I guess what I would say is, like,
38:01
it seems to me what
38:04
you're describing is sort of the third pillar
38:06
of avoiding the pitfalls of move fast
38:08
and break things, which is, like, having
38:10
the right kind of culture. Now,
38:12
the systems were imperfect. There
38:14
probably should have been some external involvement,
38:16
but I do think this idea that we're
38:20
going to have a public debate, we're
38:22
not going to give unilateral decision-making
38:24
power to you, to the person who's
38:27
technically in charge of this. We're going
38:29
to force a public debate on this.
38:31
And there's going to be a record
38:33
of that debate. We're actually going to
38:35
record this thinking for posterity. Those
38:37
are important cultural rituals that
38:39
I think don't exist in
38:42
a lot of organizations. And
38:45
going back to your original point, it's one
38:47
of the reasons why, without
38:49
those types of rituals, without,
38:51
like, ritualizing debate, for
38:53
example, and saying this is an
38:55
important part of how
38:57
we make decisions as an
38:59
organization, without removing unilateral
39:01
decision-making power, for example,
39:03
and setting that as a
39:05
cultural touchstone, I think
39:07
it's very easy for, you
39:09
know, the necessity of
39:11
the moment to overtake
39:13
good thinking.
39:17
Things feel urgent. And as
39:19
a result, people don't slow down to have
39:21
the debate. And I think more than
39:23
that, what you were saying at the very
39:25
top of the podcast was like,
39:27
without creating those
39:30
cultural norms, it also
39:32
discourages people from talking
39:34
about mistakes that they
39:36
make. And
39:38
when you combine those things together,
39:40
when you have no culture of
39:42
debate and no discussion of mistakes, it
39:45
becomes very easy to see how you
39:47
could go very deep down a rabbit
39:49
hole of bad things happening. And people
39:51
sort of looking around and like, whose
39:53
job is it to put the brakes
39:55
on this? What
39:57
is the way that
39:59
we respond collectively to
40:01
these bad decisions or
40:04
bad behavior that we're
40:06
observing? And I think
40:08
a lot of cultures
40:10
would benefit from, you know,
40:12
the removal of unilateral decision-making
40:14
power and a push
40:16
toward public, you know, as
40:18
public as you can make
40:20
it, debate on important decisions.
40:22
Yeah, there is a
40:24
conference room at Meta right
40:26
next to Zuckerberg's office that
40:28
says good news only, or
40:30
only good news. Like, that's
40:32
a disaster. That's an example
40:34
of the wrong kind of
40:36
culture, like, don't tell me
40:38
what's wrong, you know.
40:40
You've got to have that
40:42
culture where leaders are soliciting
40:45
feedback and are eager to
40:47
hear the bad news and
40:49
that contrary point of view, not
40:51
the good news only
40:53
kind of culture. And
40:56
I think it's really important
40:58
that companies be willing, and
41:00
this is something that
41:02
tech companies have not traditionally
41:04
been willing to do,
41:06
to be held accountable by
41:08
the public, by the
41:10
government. There is a reason
41:13
why we have all
41:15
these NDAs and agreements that
41:17
don't allow you to
41:19
sue, these forced arbitration agreements.
41:21
That is an example
41:23
of a culture that is
41:25
trying to avoid being
41:27
held accountable by our government,
41:29
by the systems that
41:31
we have in place to
41:33
hold wealthy big companies
41:35
accountable. All right.
41:38
Well, let's try
41:40
to summarize our guidance here
41:42
for people who want to
41:44
be able to move fast
41:46
but don't want to break
41:49
things in an irreparably bad
41:51
way. Whoopsie daisy is one
41:53
thing; destroying democracy is
41:55
another. All right, so tip
41:58
number one: consider the cost
42:00
of failure, don't skip the
42:02
debate phase and build a
42:04
team culture that supports
42:07
both speed and
42:09
learning. Yeah. Tip number
42:11
two, moving fast and breaking
42:13
things isn't inherently good or
42:15
bad. It's about applying a
42:17
thoughtful approach in the right
42:19
context with the right process.
42:21
So making sure that you
42:23
slow down enough to debate
42:25
something to make sure that
42:27
you're still on the right
42:29
track. And tip
42:31
number three: always focus on
42:33
learning. The whole point of
42:35
moving fast and potentially breaking
42:38
things is to learn fast. It's
42:40
not to create
42:43
a land grab
42:45
and establish a
42:47
monopoly that then destroys
42:49
democracy. And
42:51
with that... We invite you to head
42:54
over to radicalcandor.com slash podcast to see the
42:56
show notes for this episode. Remember, praise
42:58
in public and criticize in private. If you
43:00
like what you hear, please rate, review,
43:02
and follow us on whatever platform you like
43:04
to listen to your podcasts. If
43:06
you have feedback, please
43:08
email it to us
43:11
at podcast@radicalcandor.com. Until next
43:13
time. And by the way, if
43:15
you have feedback: we're going
43:17
to invite some public
43:19
debate. Do you think that we should
43:21
stop posting on Instagram and Facebook?
43:23
Let us know your thoughts. We are
43:26
eager for them. Thank you.