How NDAs Protect Power Instead of People 7 | 15

Released Wednesday, 9th April 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

Hello, it is Ryan, and I

0:02

was on a flight the other

0:04

day playing one of my favorite

0:06

social spin slot games on Chumba

0:08

casino.com. I looked over the person

0:11

sitting next to me, and you

0:13

know what they were doing? They

0:15

were also playing Chumba Casino. Coincidence?

0:17

I think not. Everybody's

0:19

loving, having fun with

0:22

it. Chumba casino.com. That's

0:31

Chumba casino.com and live

0:33

the Chumba Life. No

0:35

purchase necessary. Void where

0:37

prohibited by law. I'm Jason

0:39

Rosoff. I'm Amy Sandler and

0:41

today we're going to be

0:44

tackling a critical issue that

0:46

Kim, you write about in

0:48

Radical Respect. The question is, what

0:51

happens when organizations silence employees

0:53

instead of removing the obstacles

0:55

that hinder their success? What's

0:58

bringing this up to

1:00

mind is Meta's recent arbitration

1:02

case against former employee Sarah

1:05

Wynn-Williams, whose new book,

1:07

Careless People, exposes misconduct

1:09

within the company and

1:12

has prompted us

1:14

to really explore the

1:17

broader implications of

1:19

non-disclosure agreements,

1:21

NDAs, and forced arbitration

1:24

in the workplace. So let's get

1:26

into it. Yes. This is

1:28

such an important issue, I think,

1:30

when, I mean,

1:32

non-disclosure agreements are supposed

1:35

to prevent people from

1:37

sharing technical secrets,

1:39

you know, trade secrets.

1:42

They were not supposed to be used

1:44

as a way to hide wrongdoing.

1:47

And yet, that is how

1:49

they're being used, as

1:52

are non-disparagement agreements,

1:54

which is a whole

1:56

other thing. And so I think

1:58

that this book is really important.

2:00

And I think it's not

2:02

just about this book or

2:04

meta. We have a systemic

2:07

problem in the workplace

2:09

and workplace culture that

2:11

leads leaders to try to

2:13

dodge the checks and balances

2:16

that our society has

2:18

put in place to prevent

2:20

leaders from harming individual

2:22

employees or even all of

2:24

us. And in the case

2:27

of this book, I feel like we're

2:29

all, the author was harmed, but we're

2:31

all being harmed. I'm excited. I read

2:33

the book, the whole book. I know you

2:35

all didn't necessarily read the whole book yet,

2:37

but I'm excited to chat with you all

2:39

about it. Great. I will, I will put

2:41

the book up. Yeah, I have not yet read

2:44

it. I have purchased it. And Kim, do

2:46

you want to just mention for folks

2:48

why you felt so strongly about encouraging

2:50

people to take a look at the

2:52

book? Yeah, not just take a

2:55

look, but buy the book. Yeah,

2:57

because as an author, I'm very

2:59

sympathetic for what's happening to this

3:01

author, because when you publish

3:04

a book, your publisher will tell

3:06

you the author sells the book.

3:08

And so by silencing the author

3:10

of this book, from speaking

3:12

about her book, they're making

3:14

it really difficult for her to

3:17

sell the book. I mean, of course,

3:19

it seems the strategy seems to have backfired

3:21

and now everybody's buying the book. It's the

3:23

Streisand effect. Yes, the Streisand

3:25

effect in action. But, which is good, but

3:28

that, let's like get it to the top.

3:30

Let's get it to number one on the

3:32

New York Times bestseller list. And this is

3:34

specifically because of a legal arbitration case

3:37

that's forcing her to not be able

3:39

to speak about it. I just wanted to make sure for folks

3:41

who weren't aware where we were in

3:43

this moment.

3:45

Yes, you're probably better at explaining. Oh, I don't

3:47

know. I'm going to throw that one to Jason. He's

3:50

our legal analyst, even

3:52

though I don't know if he has a JD, but

3:54

I feel like he's played the role of someone who

3:56

has I just got into law school That's as far

3:58

as I go. I didn't I did so

4:00

badly on the LSAT I gave

4:03

up. I think you basically covered

4:05

it, essentially as part of the

4:07

author's employment or as part of

4:09

the separation agreement, at some point

4:12

she signed something that basically said

4:14

that she wasn't allowed to disclose

4:16

some set of things about her

4:18

experience as an employee. And in

4:21

addition to that, there was a

4:23

clause that said that disputes must

4:25

be decided by arbitration as opposed

4:27

to being able to go to

4:30

court. And what I read about

4:32

it basically said that an arbitrator,

4:34

and arbitration is binding just like

4:36

going to court. This is one

4:39

thing that people may not understand

4:41

is that it's another legal process,

4:43

but the judgment of the arbitrator

4:45

was that there would be irreparable

4:48

harm caused by allowing Williams to

4:50

actually promote the book. And so

4:52

she was enjoined from doing that.

4:54

She was forbidden from doing it.

4:57

And the thing about arbitration is

4:59

that it really tends to favor

5:01

employers over employees, because

5:03

the companies wind up paying these

5:06

arbitrators. They're not totally neutral, I

5:08

think. I mean, they try to

5:10

be. They try to be neutral,

5:12

but they're not, to my

5:15

mind. Yeah, it's one of those

5:17

things that in theory, like arbitration

5:19

in theory is better for employees

5:21

because you don't absolutely need an

5:24

attorney in order to represent you,

5:26

like it could be less costly,

5:28

it could take a lot less

5:30

time in order to reach a

5:33

resolution. But Kim, to your point,

5:35

In many cases, the deck is

5:37

stacked against the employee. It's similar to the

5:39

legal, we've recreated the legal system

5:42

in arbitration by stacking the deck

5:44

against the person with less money.

5:46

And we've, we've dodged the checks

5:48

and balances that the government

5:50

should be putting on the power

5:53

of any one company or any one wealthy

5:55

individual with the legal system. So

5:57

anyway, it's a big problem. Now

5:59

I want to talk about irreparable

6:02

harm because it seems to me

6:04

reading this book that there's an

6:06

awful lot of evidence that Facebook

6:08

has done irreparable harm to a

6:11

bunch of us. So we can't

6:13

read the whole book, but I'd

6:15

love to just share a couple

6:17

of passages from a chapter

6:20

towards the end of the book

6:22

called Emotional Bargaining. And we can

6:24

talk about those and then we

6:26

can talk about NDAs and forced

6:29

arbitration and non-disparagement agreements and why

6:31

it's a problem, why these things

6:33

are problems. How does that sound?

6:35

Sounds good. All right, I'm going

6:38

to jump in. In April 2017,

6:40

a confidential document is leaked that

6:42

reveals Facebook is offering advertisers the

6:44

opportunity to target 13-17-year-olds across its

6:47

platform, including Instagram, during moments of

6:49

psychological vulnerability when they feel quote-unquote

6:51

worthless, insecure, stressed, defeated, anxious, stupid,

6:53

useless, and like a failure. Or

6:56

to target them when they're worried

6:58

about their bodies and thinking of

7:00

losing weight. Basically, when a teen

7:02

is in a fragile emotional state.

7:05

As a parent, I have to

7:07

say, this fills me with rage,

7:09

that they would offer this, or

7:11

that advertisers would agree to do

7:14

it. So what do you all

7:16

think? Yeah, I am not a

7:18

parent, but I am someone who's...

7:20

you know, been a teenager, I

7:23

think we've all been teenagers. And

7:25

I can just speak to my

7:27

own, you know, sort of journey

7:29

around like food and body and

7:32

psychology and, you know, certainly I

7:34

was obsessed with television and consuming

7:36

mass media, but it was not

7:38

targeted individually to me based on

7:41

my state. And so it feels

7:43

like we're already dealing in a

7:45

consumer society, but then to have

7:47

it targeted at your most vulnerable

7:50

is infuriating and enraging. So that's

7:52

my emotional, like personal reaction to

7:54

it. And then on behalf of

7:56

our most vulnerable, like at the

7:59

most vulnerable moment, I find it

8:01

enraging. Yes. Jason, do you have,

8:03

can you bring us down? From

8:05

our rage? I don't, I don't

8:08

think so. I think, I think

8:10

that it is, it is, it's

8:12

the right, I mean, like, I

8:14

think rage is the right reaction

8:17

to something like this. One of

8:19

the most difficult parts of being

8:21

alive today for people who are

8:23

connected in some way, shape or

8:26

form to the internet, is dealing

8:28

with the information that is

8:30

coming at us. Like

8:32

we just aren't made for it.

8:35

We haven't developed the skills to

8:37

deal with it. And then to

8:39

throw into that mix, like not

8:41

only is it just completely overwhelming

8:44

and hard to parse, but we're

8:46

using the information that we can

8:48

gather about someone to inject into

8:50

that stream of... difficult to parse

8:53

information, essentially like poison, you know

8:55

what I'm saying? Like, like, it's

8:57

bad. So it was like, it

8:59

was already bad. Yeah. And then

9:02

it was like, we haven't, now

9:04

we're just gonna make it worse.

9:06

It's more precisely bad. Exactly. Yeah,

9:08

the amplification of it. Yeah. I

9:10

mean, this is commonly referred to

9:13

like as dark patterns, right? That

9:15

there are these dark patterns and

9:17

how people design software that hooks

9:19

into our brains at a very

9:22

sort of close-to-the-metal level,

9:24

like close to the hardware, yeah,

9:26

and take advantage of various vulnerabilities

9:28

that human beings have: when

9:31

you feel sad, upset, worthless, or

9:33

whatever, we're more susceptible to messaging

9:35

about that same thing. Yeah. And

9:37

by the way, it's not only

9:40

teenagers who have these feelings, like

9:42

I have them all, but we all

9:44

do. I mean, we want to

9:46

protect our children, but we should

9:49

also protect all of our, you

9:51

know, protect all of us. And

9:53

so one of the

9:55

things about this book that reads as

9:58

very credible because I can imagine

10:00

like having worked at a big

10:02

tech company like you can't believe

10:04

this is happening and then you

10:07

look into it, and they thought

10:09

at first maybe this

10:11

is just a one-off bad actor

10:13

and then as they look into

10:16

it they find this is systemic

10:18

so I'll read another passage at

10:20

first we think the leaked document

10:22

is one Facebook made to

10:25

pitch a gum manufacturer targeting teens

10:27

during vulnerable emotional states. Then eventually

10:29

the team realizes no. The one

10:31

that got leaked was from a

10:34

bank. So now they're finding a

10:36

bunch of these. There are obviously

10:38

many decks like this. So like

10:40

this is systemic.

10:43

This is not just kind of

10:45

a one-off thing. Facebook does work

10:47

for a beauty product company tracking

10:49

when 13 to 17 year old

10:52

girls delete selfies. So it can

10:54

serve a beauty ad to them

10:56

at that moment. We don't know

10:58

what happens to young teen girls

11:01

when they're targeted with beauty advertisements

11:03

after deleting a selfie. Nothing good.

11:05

There's a reason why you erase

11:07

something from existence. Why a teen

11:10

girl feels that it can't be

11:12

shared. And surely Facebook shouldn't be

11:14

using that moment to bombard them

11:16

with extreme weight loss ads or

11:19

beauty industry ads or whatever else

11:21

they push on teens feeling vulnerable.

11:23

The weird thing is that the

11:25

rest of our Facebook co-workers seem

11:28

unbothered by this. Reactions? Like, a

11:30

moment of empathy for other Facebook

11:32

workers. Like I suspect, as is

11:34

the case where there are a

11:37

lot of questionable slash bad things

11:39

happening around you, it becomes difficult

11:41

to like tune your reaction to

11:43

these events appropriately. And I bet

11:46

a lot of them were horrified.

11:48

Correct. And then there's also this

11:50

question of like, what's it okay

11:52

to express? Right? Like, do you

11:55

feel like it's safe to express

11:57

your horror at this particular thing?

11:59

Or you feel like if you

12:01

say you're horrified that you're taking

12:04

some other kind of risk with

12:06

your job or career, whatever else?

12:08

Yeah. And at the same time,

12:10

like, I think, it's just, it's

12:13

just hard, I think, to be

12:15

in a situation where you feel

12:17

like you have uncovered something that

12:19

is so obviously horrifying or obviously

12:22

awful and then to have the

12:24

reaction, like a sort of nonplussed

12:26

sort of reaction from the people

12:28

around here of like well you

12:31

know it's just that's Tuesday for

12:33

you. Yeah so here's another moment

12:35

of empathy for then-Facebook,

12:37

now-Meta employees. What does Facebook

12:39

do about this? A junior researcher

12:42

in Australia is fired for making

12:44

that deck, even though lots of

12:46

people are obviously making these decks.

12:48

I'm now reading, even though that

12:51

poor researcher was most likely just

12:53

doing what her bosses wanted. She's

12:55

just another nameless young woman who

12:57

was treated as cannon fodder by

13:00

the company. And that rings so...

13:02

true to me, like we're going

13:04

to find one person who did

13:06

this, even though we know this

13:09

is a systemic thing and blame

13:11

that one person. What was coming

13:13

up, Kim, as you were asking

13:15

that question and reading about it,

13:18

was even just like the Milgram

13:20

experiment of just like when people

13:22

are following rules and feeling like,

13:24

A, my boss told me to

13:27

do this, so it must

13:29

be okay because this is my

13:31

boss right or this is the

13:33

rule and then to what Jason

13:36

was saying do I feel safe

13:38

speaking up against it so even

13:40

you know people might have had

13:42

moral concerns but was there the

13:45

safety and all of those questions

13:47

about being a whistleblower so I'm

13:49

just thinking as as people are

13:51

listening to this the places that

13:54

we might be in whether we're

13:56

the junior research or feeling like

13:58

do I have the power to

14:00

to speak up and the person

14:03

who's forcing the junior researcher to

14:05

do that to do that work.

14:07

Yeah, we're not forcing them but

14:09

like this

14:12

kind of stuff gets normalized and

14:14

when when you're young and early

14:16

in your career and this is

14:18

what everyone around you is doing

14:21

like maybe you haven't asked the

14:23

questions. And it's not your fault; you've been

14:25

put in a difficult situation and

14:27

and it is the job of

14:30

your leadership to lead, not

14:32

to blame you for something that

14:34

is a systemic problem that they

14:36

have created. But firing, it turns

14:39

out that firing that one junior

14:41

assistant doesn't solve all the problems.

14:43

Facebook has to come out with

14:45

more sort of PR to do

14:48

damage control. And so they offer

14:50

another statement that this author

14:52

says is a flat-out lie: "Facebook

14:54

does not offer tools to target

14:57

people based on their emotional state,"

14:59

even though obviously they do. And

15:01

then after that statement comes out,

15:03

the author goes on to say,

15:06

one of the top, and now

15:08

I'm reading again, one of the

15:10

top ad executives for Australia, calls

15:12

me late one night to complain.

15:15

Why are we putting out statements

15:17

like this? He wants to know.

15:19

This is the business, Sarah. We're

15:21

proud of this. We shout this

15:24

from the rooftops. This is what

15:26

puts money in all our pockets.

15:28

And these statements make it look

15:30

like it's something nefarious. So now

15:33

all of a sudden somebody is

15:35

getting mad, somebody else at Facebook

15:37

is getting mad at her for

15:39

distancing the company from what he's

15:42

selling. It's outrageous. Yeah, I think

15:44

that I see, it's a form

15:46

of rationalization that I feel like

15:48

happens a lot in tech in

15:51

particular, which is: there's a

15:53

tendency to look at the potential

15:55

of something to like be very

15:57

good and weigh that much more

16:00

heavily than the actual harm that

16:02

the thing is doing right now.

16:04

Yes. Like that is that is

16:06

a thing that gets repeated over

16:08

and over and over again in

16:11

sort of the history of all

16:13

technological advancements: we

16:15

oversell the good. And I

16:17

think part of what we're saying

16:20

is: why does it have

16:22

to be one or the other?

16:24

It's a false dichotomy. Like why

16:26

can't we address the harm that

16:29

the thing is doing and realize

16:31

the upside potential? It doesn't have

16:33

to be one or the other.

16:35

And when you create an environment

16:38

that makes it seem like it's

16:40

one or the other, then you

16:42

wind up with a harm perpetuating

16:44

machine. Yes. Because there's nothing, there's

16:47

no brakes on the thing, you

16:49

know what I'm saying? There's nothing

16:51

to actually slow it down and

16:53

stop it from doing the harm

16:56

until, and that's why we wind

16:58

up with a book like this,

17:00

which is many years later, a

17:02

set of well-documented allegations of like

17:05

all of this wrongdoing. Over and

17:07

over and over again, this repeated

17:09

pattern of wrongdoing. And I feel

17:11

like it's what you say in

17:14

Radical Respect: the

17:16

reason why this seems so expensive

17:18

to deal with... and I

17:20

mean, who knows what Meta's reasons

17:23

are for trying to enforce this

17:25

particular non-disclosure agreement, or what their

17:27

precise reasons are. But a

17:29

problem is the system is set

17:32

up wrong. It was impossible to

17:34

talk about these things and to

17:36

deal with them and to say

17:38

hey, wait a second, actually this is

17:41

really harmful like can we why

17:43

don't we have a conversation like

17:45

that wasn't possible yeah there was

17:47

there was a culture of institutional

17:50

betrayal not institutional courage and also

17:52

Kim, what's coming up is, you

17:54

know when we've had some conversations

17:56

about this when we were in

17:59

business school and everything that mattered

18:01

the only thing that mattered was

18:03

the shareholder and

18:05

maximizing you know wealth for the

18:08

shareholder so it goes back to

18:10

your measurement problem. Because the passage

18:12

that you are reading from to

18:14

continue this person this top ad

18:17

executive says you know he's out

18:19

there every day promoting the precision

18:21

of these tools that Hoover up

18:23

so much data and insight on

18:26

and off Facebook so it can

18:28

deliver the right ad at the right

18:30

time to the right user and

18:32

this is what headquarters is saying

18:35

to the public. Quote, how do

18:37

I explain this, he asks. And

18:39

13-to-17-year-olds, quote,

18:41

that's a very important audience advertisers

18:44

really want to reach them and

18:46

we have them. We're pretending we

18:48

don't do this. And so for

18:50

me when I read that, that's

18:53

basically saying you're optimizing

18:55

just for those metrics, but you're

18:57

not measuring harm, which I think

18:59

is what Jason is saying, like there's

19:02

no metric of harm caused, and

19:04

that's why the incentives I think

19:06

are so misaligned. Yeah, totally agree

19:08

with that. It's such an important

19:11

point. Related to that, I did write

19:13

this novel called The Measurement Problem,

19:15

which is all about like, you

19:17

know, capitalism is really good at

19:20

rewarding what it can measure and

19:22

very bad at rewarding what it

19:24

values. What it values, like our

19:26

teenagers, our children, you know, like

19:28

how do we bake that into

19:31

the metrics? Also, Jason going back

19:33

to what you're saying about optimism.

19:35

So Careless People is a quote

19:37

from The Great Gatsby, and it

19:40

just so happens my son just

19:42

read The Great Gatsby and is

19:44

writing a paper about it. I

19:46

was reading his paper yesterday, and

19:49

I'm like, oh my God. And

19:51

the paper he's writing is about

19:53

how we have this sort of

19:55

rosy... view of the past as

19:58

Americans. And that then sort of

20:00

pushes us to have this glorious

20:02

view of the future and to

20:04

ignore the present reality that we're

20:07

in. I'm like that. That is

20:09

what is happening here. That is

20:11

exactly what is happening here is

20:13

a refusal, you know, to admit,

20:16

oh, Daisy loves Tom and married

20:18

him. Therefore I cannot be married

20:20

to her. Now I'm referring back

20:22

to The Great Gatsby. I think there's

20:25

just like so many problems, it

20:27

feels like to me, so many

20:29

problems in a group

20:31

of humans is the unwillingness to

20:34

talk precisely about what is really

20:36

happening. Yes. To like give, to

20:38

put real clear unambiguous words to

20:40

like what is actually happening. So

20:43

many problems are rooted in either

20:45

an unwillingness or a fear of

20:47

doing exactly that. And I think

20:49

this is a great example of,

20:52

it was literally not okay to

20:54

say the word, you know what

20:56

I'm saying? Like, there was a

20:58

complaint that was being raised by

21:01

the executive that we were just

21:03

quoting from the book, was basically

21:05

like, like, no, the... You're saying

21:07

this is bad and I'm saying

21:10

it's not even okay to say

21:12

that this is bad. This is

21:14

so good that you can't disparage

21:16

it. You cannot say that there's

21:19

something wrong with this because this

21:21

is what puts all the money

21:23

in our pockets. It's literally,

21:25

I don't know, like, that's

21:28

sort of the... what I

21:30

mean is, I'm sure that

21:32

the executive who said that to

21:34

her was in a hard place.

21:37

He's like I'm out here selling

21:39

this and you're telling the press

21:41

we're not selling this, like, what

21:43

are my customers going to say.

21:46

And I guess what I'm saying

21:48

is like the temptation to basically

21:50

squash disagreements, like any sort of

21:52

disagreement, and I know we're talking

21:55

about the public statement versus what

21:57

was privately actually happening. But like

21:59

that effort to squash disagreement in

22:01

order to preserve harmony, you know

22:04

what I'm saying? Yeah, only good

22:06

news for me, only good news

22:08

for the customers, like we only

22:10

want to talk about the good

22:13

that this thing can do, is

22:15

like the source of a lot

22:17

of not intentional evil in the

22:19

world, right? Because you're not thinking

22:22

about the consequences of the actions

22:24

you're making possible. Yeah. Hello,

22:30

it is Ryan, and we could

22:32

all use an extra bright spot

22:34

in our day, couldn't we? Just

22:36

to make up for things like

22:38

sitting in traffic, doing the dishes,

22:40

counting your steps, you know, all

22:42

the mundane stuff. That is why

22:44

I'm such a big fan of

22:46

Chumba Casino. Chumba Casino has all

22:48

your favorite social casino style games

22:50

that you can play for free

22:52

anytime anywhere with daily bonuses. That

22:54

should brighten your day a little. Actually,

22:56

actually a lot. So sign up

22:58

now at chumba casino.com.
