Kara Swisher’s Big Tech Burn Book

Released Thursday, 14th March 2024

Episode Transcript


0:00

When providing

0:08

you with high quality sleep every night.

0:10

Sleep next level. J.D. Power ranks Sleep

0:12

Number number one in customer satisfaction with

0:15

mattresses purchased in store. And now the

0:17

Queen Sleep Number C4 Smart Bed is only

0:19

$1,599. Save $300 only

0:21

for a limited time. For

0:24

J.D. Power 2023 award information, visit

0:26

jdpower.com/awards, only at a Sleep Number

0:29

store or sleepnumber.com. Having

0:31

seen my name in a burn book as a

0:33

kid, I can tell you it's not a good

0:35

thing. Technically, it's a way of

0:37

naming and shaming people in your social circle. But

0:40

most people don't run in the same circles as

0:42

Kara Swisher. The former

0:44

tech reporter for The Wall Street Journal is on

0:47

a massive book tour. Her memoir

0:49

is titled Burn Book: A Tech Love

0:51

Story. Her onstage moderators include

0:53

people she's covered as a journalist, Laurene

0:55

Powell Jobs, the widow of

0:57

Apple co-founder Steve Jobs, Bob

0:59

Iger of Disney, Sam

1:02

Altman, the current CEO of

1:04

OpenAI. And

1:06

while those interviewers wanted to talk to her about

1:08

her broken relationship with Elon Musk

1:10

or her views on artificial general

1:12

intelligence, otherwise known as AGI, I

1:14

wanted to know

1:17

how she got to be her.

1:20

The definitive beat reporter of the consumer

1:23

internet age with a Google executive

1:25

for an ex-wife, a journalist

1:27

turned entrepreneur who co-launched major high

1:29

tech and media conferences, All Things

1:32

D and Code, a

1:34

podcast star with high wattage guests,

1:36

a punchy interviewer who never backs

1:39

down, and now an

1:41

important voice in the public debate

1:43

over whether and how to regulate

1:45

tech companies. I'm Audie

1:48

Cornish, and this is The Assignment.

1:56

Kara Swisher and I spoke before a live

1:58

audience of students and professionals at

2:00

the Sine Institute of Policy and Politics

2:03

at American University. That's where Kara is also

2:05

a 2024 fellow. Hi,

2:07

everybody. How are you? Good morning. Good

2:10

morning. Come

2:13

on. You don't want to bring this energy to

2:15

Kara Swisher, okay, because that's not what you're going

2:17

to get back. Good

2:19

morning. Good morning. Nice. And

2:22

that's where we met for this conversation, which

2:24

I began with an admission. We're

2:27

friends. And I have to say, it

2:29

made me not want to do this interview with

2:31

you today. Okay. All right. Good. Because I feel

2:34

like I'm too close to you

2:36

to do the interview properly. Okay.

2:38

But I felt like you were then the

2:40

right person to talk to because your whole

2:43

career is talking to people that you might

2:45

be a little too close to. Some

2:47

of them. Some of them. Some of

2:50

them, yeah. Yes, that's a good point.

2:52

So let's go. The other thing

2:54

about Kara, which I've told her, is you

2:56

talk to men the way men

2:58

talk to women. Some

3:00

men. Some men. Some men.

3:03

Do any of you guys know what I'm talking about? When

3:05

I say that, does that mean something to you? She said

3:07

that. She was a

3:09

guest host on Pivot. We're like, that's

3:12

exactly what's happening here. And

3:14

as I thought about it, it was really interesting because

3:17

I was talking to my wife, Amanda, who is

3:19

here also, and she was talking to a man. And

3:21

I said, this is how you do it. You compliment

3:23

him. Men aren't used to physical compliments.

3:25

And you say, that's a nice shirt. So

3:27

I do kind of do that a lot. But

3:30

then if I really want to really get

3:32

stuff out of them, I actually neg a lot

3:34

of these men I covered. And I'm

3:36

like, did you lose weight? Because you were fat last

3:38

time I saw you. Stuff like that.

3:40

And I realized I was just in this.

3:43

I came in from San Francisco last night

3:45

where I did a series of sessions with

3:47

Reed Hoffman, Sam Altman, Gavin Newsom. And

3:49

I did it. I was negging them without even

3:51

trying. And it was really interesting. So

3:54

I'm fascinated by this. And I was

3:56

reading the book looking for your villain origin

3:58

story, like trying to figure out how

4:01

you got to be what, you

4:03

know, as a student, she identified,

4:05

yeah, strident, confident in the book.

4:07

Sometimes you use the word arrogant,

4:09

talking about yourself even as a young

4:12

person. And I think that there's something

4:14

to the fact you're running

4:16

against a lot of cultural

4:18

programming as kind of

4:20

a femme, you know, woman identifying person,

4:23

you're going to hear a lot about how

4:25

you should be. Yes. And you

4:27

aren't any of those things. No, I would say

4:29

I'm the exact opposite of Katie Britt as

4:32

her performance last night. Who

4:35

was the Senator from Alabama? Who

4:37

gave the State of the Union response? I

4:39

was literally like, she, to

4:41

me in that thing, was ChatGPT:

4:44

show me a Stepford

4:46

wife in a kitchen who didn't win the

4:48

lead role in Our Town in high school.

4:50

And so, with ChatGPT, you do have

4:53

to be distinct with the

4:55

prompts. Yes, very distinct actually. So you

4:57

want to know how I got the

4:59

way I got? Well, yeah. Okay. The

5:01

word, what are your theories? I

5:03

think I was born this way. Because when I

5:05

was a kid, again, the names were put on

5:07

you as a girl, if you were

5:09

like this headstrong is the word they

5:11

might use. And then later when

5:13

I was in school, it was bossy or,

5:16

and it's almost always

5:18

from men, not always, but almost

5:20

always. But an interviewer

5:22

who I competed with quite a bit back in

5:24

the day and used to be on a regular

5:26

basis, said I had uncommon

5:29

confidence, which I found incredibly

5:31

irritating. And I said to him right

5:33

when he did it, I was like uncommon

5:35

why? And he was like, well,

5:37

not everyone's confident. I'm like, why is that

5:39

unusual? What am I supposed to be, you

5:42

know, retiring? Am I supposed

5:44

to be submissive? Like, what is

5:46

going on here? One, I was

5:48

like this. Two, it may

5:50

have been about being gay. I knew I was

5:52

gay at an early age. And so I didn't

5:55

feel like I needed to have the approval of

5:57

men. I think girls get into that

5:59

quite a bit. I like men, I do

6:01

like men. And

6:04

so I got along with, I always got

6:06

along with men at every stage of my

6:08

career and teachers and everything else. And

6:11

then probably my dad dying, I think,

6:14

made me tougher because you sort of had to

6:16

be, my dad died when I was five. And

6:18

so if I think back on

6:21

it, really, I really did get steeled

6:23

during that period. My

6:25

mom married someone who wasn't very nice. I'm

6:27

not Cinderella here, but you know. No, you

6:29

do the same thing in the book though,

6:31

as you start to speed past that point of the story.

6:33

I do, I do. And I want to know,

6:35

I mean, you were only five. We both have

6:38

kids around that age now. And it

6:40

does bring you back to that

6:42

place. It does. I had,

6:44

and before that, I have older kids

6:46

too. And I remember when my, there's

6:48

two points in my life that were

6:50

really important. When I passed the age

6:52

my dad died at. That

6:54

was interesting because I didn't think I'd live past that. So

6:56

I kind of lived my life in a rush because

6:59

I thought I would die at that age.

7:01

I think that's very common among people whose

7:03

parents die early. My brothers are the

7:05

same way. We're like in a rush. We

7:07

have no time for this. And then when

7:09

my son turned, my oldest son, he's 21

7:12

now, turned five, I realized the

7:14

devastation. I really did. I was like, God, he knows

7:16

me so well. And you know, you don't have memories.

7:18

I don't have a lot of memories of my dad.

7:20

I always thought I should get hypnotized or something, which

7:23

I haven't done. But I have little

7:25

pieces of memory. Yeah. But that's the

7:27

age where those start to be shaped. Right. My

7:29

four year old, my

7:32

two year old, like, would they remember me?

7:34

Maybe not. But we have clear relationships

7:37

every single day, and they're deep.

7:40

And so the devastation was very clear

7:42

to me. And I remember being struck

7:44

when he turned the age I was because

7:47

it was clear that I was

7:49

much more hurt than I realized.

7:52

But you covered it up pretty quickly. A lot

7:54

of people whose parents died at a young age,

7:57

become something that Irvin Yalom called

7:59

highly functional. He was a

8:01

psychiatrist. He wrote a really good

8:03

book called The Loss That Is Forever about people

8:05

whose parents died at a young age. And

8:08

I do remember being, that was

8:10

when I realized how good I was

8:12

at moving forward. Moving forward,

8:14

I don't think about it. Years ago,

8:16

a friend of mine was in one of those

8:18

therapies where you talk to someone every day. And we

8:21

were talking about it. And I was like, what in

8:23

the world do you find to talk about? Like, I don't

8:26

know, I'm not that interesting. Like, what parts of your life

8:28

could you find? And they turned to me

8:30

and said, they were going to become a therapist. I

8:32

said, that's why they do it. And I

8:35

said, I just don't have that much to talk about. I

8:37

just don't feel like I would have something to talk about

8:39

every day for hours about myself.

8:41

And they said, you're

8:44

blocking. And I said, it's working. I'm

8:48

pretty happy. I was like, where's the depth?

8:50

And my wife talks about it a lot.

8:52

And I said, when I met her, I

8:54

said, I wasn't neurotic.

8:58

I said, I don't have any neurosis. That's

9:00

not your vibe. No, it's not. But it

9:02

really isn't either. It's there. And I've hid

9:04

it so beautifully. Like I'm a mafia Don

9:06

and the cement is deep. Yeah,

9:08

the sunglasses help. And Jimmy Hoffa. I'm

9:11

trying to avoid intimacy with you. I

9:13

actually, I forgot my other glasses. But

9:15

it does make me think about this

9:17

stepdad. Because you have this person who

9:19

comes into your life, but you describe

9:21

him in extremely specific ways in the

9:23

book. His phrase is like casually cruel,

9:25

manipulative. You don't describe him as abusive.

9:28

So I'm not... I don't want to

9:30

give that to you. But you do feel that now?

9:32

I do. I do. I tried to figure out what it

9:34

was because you were so young. So by the

9:36

time you're 10, 11, what is

9:38

it? Physically abusive. It's you

9:40

know, everyone always, you know,

9:43

you tend to minimize mental

9:45

abuse, right? That I vaguely

9:47

recall, but I remember this dog was a basset

9:50

hound. He

10:01

made up stories of what he did with it. He's like,

10:03

oh, I dropped it in the river and all this stuff.

10:06

It was so cruel. And are these happening when

10:08

the kid's with him? Is

10:11

mom around? Mom's around. And she let

10:13

it happen. Yes, she absolutely let it

10:15

happen. And to be kind to her

10:17

is she's this young person with three

10:19

kids and their husband died and she

10:21

got married pretty quickly. I get it.

10:23

I get that kind of thing. But

10:26

she let him do it. She let him do it. He

10:28

was very, you know, one thing is he

10:31

was a game player. He loved strategy games, whatever

10:33

it was. And so I got very good at

10:35

like backgammon. But I don't play it. I don't

10:37

like it. And I'm good at

10:39

it. It's really weird because it was so

10:42

aggressive. He taught me in such an

10:44

aggressive way. And I was also interestingly

10:46

really good at it. Like I actually did

10:48

have those qualities and sort of killer

10:50

like sense of game playing. By

10:52

the way, we just heard the word killer in the context

10:54

of backgammon. Yeah. I have an idea

10:56

of what these games are like. You can do it.

10:58

But it is. It is. I

11:01

must win and dominate. So he did teach me those. He

11:03

had those qualities, which is a good thing. He had a

11:05

very strategic mind. And I would say you can

11:07

learn good things in a bad way.

11:10

Right. And he would do all

11:12

kinds of things. He bugged our house. He was paranoid. And

11:15

it was kind of we used to find humor in

11:17

it because my brother and I would always make fake

11:19

drug deals over the phone. Like,

11:21

did you get the pound of hash? You know, because

11:24

you knew he could hear. Yes. In

11:27

that like 80s and 90s way of your parents

11:29

are listening on the other line or he had

11:31

bugged your home. He put bugs. Yeah. Like

11:34

Nixon. He's like Nixon. Yeah. Whoa.

11:38

So I'm not going to armchair-diagnose the guy for this. But

11:40

it does to me uniquely prepare you

11:43

for Silicon Valley. It did. It

11:45

did. But I say I'm an optimistic pessimist.

11:47

I really do think the worst is going

11:49

to happen. And then when it doesn't, I'm

11:51

like, yay. It did. I

11:54

know. You know, you describe a

11:56

number. I want to talk about

11:58

some of the people you encountered in the early part

12:00

of your career, you're a young person, you're

12:02

at school, you decide for a variety of

12:04

reasons as you are kind of cut off

12:06

from doing what you want to do, right,

12:09

joining the armed services and, specifically, the CIA. I

12:11

did. That was really a disappointment for me.

12:13

I wanted to be in the military very much. My

12:16

dad was in the military. He had just gotten out when he died. He was

12:18

in the Navy. I wanted to be in the Navy. I

12:21

have a lot of family members on his

12:23

side in the Navy, actually. And I really

12:25

wanted to. I have a great patriotic streak.

12:28

I have a real belief in America. But

12:30

you not doing it is somehow at odds

12:32

with how I think of you. Really? Like,

12:35

why didn't you just do it? Because I had to lie. This

12:37

is even before Don't Ask, Don't Tell. It

12:40

was you got kicked out and then dishonorably

12:42

discharged. And then Don't

12:44

Ask, Don't Tell was another opportunity. And I

12:46

was like, I ask and

12:48

I tell. So that's a problem. I

12:51

thought, you know, you wouldn't be able to do it. It was

12:53

so offensive. Even more than just kicking people out.

12:55

That was Clinton. That was a Clinton era thing. That

12:58

was so offensive to me, Don't Ask, Don't Tell. So

13:01

I couldn't do that. And then years later when

13:03

they finally got rid of everything, I wanted to

13:05

join the reserves, but I was actually too old.

13:08

So I never got a chance to do it. I

13:11

thought about joining the CIA. I went to the

13:13

Foreign Service School, which was a feeder to the CIA

13:15

and the State Department. Both things. At

13:18

the time, there was this Clayton Lone

13:20

tree scandal. You should look it up.

13:22

There were all these Russians manipulating

13:25

Americans who lived in Russia. I didn't even speak

13:27

Russian, so I don't know why they were so

13:29

concerned. But when I went for some of the interviews,

13:31

they were like, you could be

13:33

blackmailed. I'm like, I'm out. And

13:36

they were like, but you could be blackmailed. And I was

13:38

like, but I'm out. It was like these discussions. But

13:41

they were very obsessed with gay people

13:43

at that time. And finding

13:45

out. And I was like, there's nothing to find

13:47

out. At one point, they were like,

13:50

well, what if you went to Saudi Arabia? They

13:52

kill gay people there. And I'm like, I

13:54

don't speak Arabic. Why would you send me

13:56

there? How did they shape how you think about...

14:00

Discrimination? How do you talk about it?

14:03

It reminds... it was serious because it was

14:05

also the era of AIDS,

14:07

the '80s, and it was during the

14:09

Reagan administration. I was in college. And

14:12

I was furious about how they treated people

14:14

with AIDS, and I think a lot of

14:16

people became activated then. People were

14:18

like... Also, when you face discrimination,

14:20

I don't know about you, but it makes

14:22

you feel that identity more acutely. Yes. Even if

14:24

you were guarded about it your whole life, they're telling

14:27

you that's what you are. Right. Being gay is

14:29

different: you can hide it, right? With

14:31

race and age and being

14:33

Jewish, you can't hide as

14:35

easily. And so, two different problems, and

14:37

I'm not going to stack rank anyone's

14:39

discrimination. But one, you

14:41

can hide, but you have to be

14:43

furtive, which really was terrible. I can't even

14:45

tell you how terrible that was. And I wasn't

14:48

good at furtiveness, and I was very outspoken.

14:50

And the other thing was your family

14:52

can dislike you, your family rejects you,

14:55

so it creates this weird thing with

14:57

your friends and, you know, a lot.

14:59

It was such a different time. It

15:01

really was.

15:04

It's shocking, the changes. And

15:07

wanting to have kids was not open

15:09

to you. That was...

15:11

I

15:13

said, I'm having a kid, and,

15:15

you know, which is really hard:

15:18

to be not who you are. And I

15:20

really was who I was. And I came out pretty

15:22

quickly, and the AIDS thing was it. I was like,

15:24

that's enough. That's enough

15:26

with you people. People are dying.

15:28

The Reagan administration was pretending

15:30

people didn't have it.

15:32

It was such

15:35

nonsense.

15:37

And

15:39

you

15:41

know when things like Angels in America came out

15:44

I was like, exactly. Yeah. But I think

15:46

it's maybe hard for this generation

15:48

to understand because they've grown

15:50

up with activism at their

15:52

fingertips. Yeah. Like, very

15:54

young people are running major

15:56

nonprofits. Yeah. You know, and

15:59

I was, like, coming up, and

16:01

that was its own set of challenges. But

16:04

for sure it was hard.

16:06

It was really hard and it was and

16:08

you didn't have the internet. Like my ex

16:10

ran Planet Out, which was one of the

16:12

first. AOL was one of the first gay communities.

16:15

Where people could talk to each other and

16:17

find each other. I remember Megan who

16:20

went on to join Google and then was the CTO

16:22

of America. She had

16:24

like seven members of Planet Out that

16:26

were from Vatican City which was really

16:28

interesting. But they were

16:30

everywhere. That was the thing. People were everywhere and

16:33

dying to meet each other. Which

16:36

is why the online space held such promise for

16:38

me. Especially around gay people. It was

16:40

people that couldn't meet. Everyone

16:42

from gay people to one of the first group

16:44

I encountered was a group of quilters at AOL.

16:47

Who had never met. They

16:49

were just like quilting and they were sort

16:51

of alone in their little communities. And then

16:53

they met online and they made a giant

16:56

AOL quilt with each other. And then Steve

16:58

Case brought them all in to meet each

17:00

other. They never met. And

17:02

I remember thinking this is a magical medium.

17:07

You're listening to Kara Swisher. I spoke

17:09

with her last week at American University. We'll

17:11

have more after the break. We

17:20

all do things our own way. And since the way that each of us sleeps is unique,

17:23

you need a bed that fits you

17:26

just the right way. Sleep Number Smart

17:28

Beds make your sleep experience as individual

17:30

as you are. Using cutting edge technology

17:32

to give you effortless, high quality sleep

17:34

every night. J.D. Power ranks Sleep Number

17:37

number one in customer satisfaction with mattresses

17:39

purchased in store. And now during Sleep

17:41

Number's President's Day Sale, save 50% on

17:43

the Sleep Number Limited Edition Smart Bed

17:45

plus Special Financing for a limited time.

17:48

For J.D. Power 2023 award information, visit

17:50

jdpower.com slash awards. Only at

17:52

Sleep Number stores or sleepnumber.com. See store

17:54

for details. When you work,

17:56

you work next level. When you play, you play

17:58

next level. And when it's time to sleep, you sleep next level. The

18:05

night? Sleep next level. J.D. Power

18:07

ranked Sleep Number number one in

18:10

customer satisfaction with mattresses purchased in

18:12

store. And now during Sleep Number's Presidents'

18:14

Day Sale, save fifty percent on the Sleep

18:16

Number Limited Edition Smart Bed plus special financing

18:19

for a limited time. For J.D. Power

18:22

2023 award information, visit jdpower.com/awards,

18:24

only at Sleep Number stores

18:26

or sleepnumber.com. See store for

18:28

details. Welcome back. I'm

18:32

speaking with Kara Swisher about her memoir Burn

18:35

Book: A Tech Love Story. It's

18:37

about her career covering the rise

18:39

of Silicon Valley's biggest leaders from

18:41

Tesla founder Elon Musk, whom she

18:43

calls her biggest disappointment, to Salesforce

18:45

CEO Marc Benioff, whom she lists

18:48

in a chapter called The Mensches.

18:50

Why do you think they liked

18:52

you? Did they see you as one of them?

18:55

No, and this was interesting. I don't think

18:57

that because I had been, you know, I was

18:59

at the Wall Street Journal or The Washington Post

19:01

first and they needed me. Right. I

19:04

wasn't an entrepreneur until 10 years later. I didn't start.

19:06

I didn't leave the Wall... the safety

19:08

of the big newspaper until 2002, which

19:11

was a long time. But the fact that you could call

19:13

up someone and say, this is what I think you should do.

19:15

This is what I think you should buy. This

19:18

is what I said. I'm going to write this. I

19:20

mean, I would tell them what I was going to

19:22

do. They never, they, they often didn't take my advice.

19:24

I was writing the story of how I thought Microsoft

19:26

was going to take over Yahoo. And I said

19:29

to Jerry, no, these guys are going

19:31

to eat you. And I said,

19:33

you should take the offer. And I didn't think

19:35

he was going to do it. They never did. They never

19:37

did what I said. But I

19:40

think they appreciate it. Again, we talked about how

19:42

you talk to men. Was there something about the

19:44

way you had those conversations? Because

19:47

now I see an industry that's quite closed.

19:49

Yeah, very hard to talk to any of

19:51

them for any reason. And

19:53

I think that you, your stature and

19:55

you have the history. Yes. Now

19:58

we do it in public. Like I had a public argument.

20:00

with Marc Benioff, I was like,

20:02

Elon's doing this anti-Semitic, this

20:04

anti-gay, this anti-trans. These

20:07

are the values you said you held dear. And

20:09

he goes, well, he can land a rocket on

20:11

a surfboard. And I was like, I

20:14

don't understand. So he's a racist who

20:16

can land a rocket on a surfboard? What

20:18

I was trying to get at there was like, you

20:20

should say something. And I sent it to him

20:22

publicly in front of a crowd of a thousand

20:25

people. I said, why aren't you saying something publicly?

20:27

I think you should. If you say you support

20:29

gay people, this guy just said Paul

20:31

Pelosi's in a gay love triangle. And

20:33

that's how he got beaten up. Why

20:35

won't you say anything? And his excuse

20:37

was, well, he's really innovative. Like

20:40

we must excuse his sins for that.

20:43

And so that's what it turned into. When I was at

20:45

the journal, I was a beat reporter, right?

20:47

You know, at some point, you have to say Google

20:49

bought this, this happened. And the reason

20:52

I left it was because I got so

20:54

frustrated at doing an

20:56

enormous amount of reporting. One particular time

20:58

was Webvan. And I was

21:00

like, this ain't going to work. And I wanted

21:02

to say it in the Wall Street Journal. I wanted

21:05

to just let that be the way I communicated it.

21:07

And the editor said, get someone else to

21:09

say what you think. And I, you know

21:11

that, right? Very common, which is really common.

21:13

And then they always had a thing in

21:16

the journal, they use this term to be

21:18

sure some people say, you know, that kind

21:20

of thing, which I hate, which is, and

21:22

I was like, to be sure some people are

21:24

idiots. Not true. And so that's why I started

21:26

All Things D because the minute we did All

21:29

Things D, if you go back, we

21:31

were like, this is a

21:33

goat rodeo. Here's the reporting,

21:35

here's the scoop. And then let us tell

21:37

you why it's a goat rodeo. Because we

21:39

did the reporting for you. What

21:41

was more valuable... The memoir uses the

21:43

analogy of love, you know, and love

21:45

clouds your judgment. It

21:48

does. It can, it can, but I love

21:50

the tech, not the people. I love

21:52

the possibilities of technology.

21:55

I really did believe, and I

21:57

probably was from being gay, now that I think about it: we

22:00

can come out and meet each other now. Were

22:02

there red flags that you missed industry-wise

22:04

in terms of how that culture was

22:06

changing? No, because I think we were on

22:09

that a lot earlier when Google, I wrote

22:11

a particular tough piece on Google about their

22:14

need to be the Borg and take over

22:16

everything. We wrote that. The antithesis of

22:18

Don't Be Evil. No, yes, right, exactly. So

22:20

we wrote a lot of those in 2001,

22:22

2002, saying

22:24

this company is starting to turn into Microsoft

22:26

and they got mad at me for doing

22:29

that. And we were saying, if they

22:31

do that, they will have unlimited power because

22:33

this is where advertising is going and everything

22:35

else. We were super

22:37

tough on Yahoo's mistakes, Twitter.

22:40

Uber, we got, I think

22:42

we helped get Travis Kalanick ousted over that.

22:44

That was later. That was in the

22:47

studio. There's a way that given when you

22:49

started covering it and where you

22:51

are now, that your experience kind

22:53

of mirrors our own cultural experience

22:55

of the industry. Yes, which I was

22:57

trying to get at. Good, I did the homework. But

23:00

it just goes from utopia. I

23:02

was not a fanboy. I definitely was

23:04

not those people. But we all were, right?

23:06

Like we were, I do think

23:09

that we were sold, not a bill of

23:11

goods, but definitely there was a

23:13

reverence for young tech

23:15

leadership. They were bringing up something. We

23:18

were not talking about them like they

23:20

were robber barons or railroad guys.

23:22

So it was very different. Whereas

23:24

now, people do see

23:27

them as insular, reckless. That's

23:29

correct. Self aggrandizing billionaires.

23:31

Yes. And I do feel

23:33

like that is a shift. But I don't think

23:35

people heard it. We did an interview with Mark

23:38

Zuckerberg where we ran over him so much that

23:40

he sweat. He almost fainted, right? And

23:42

the flop sweat. Yeah, and The Social Network did

23:44

that. Yes, but we had been pressing him about

23:46

privacy, and then Beacon. We were particularly tough about

23:48

Beacon. And we're like, what are you doing? This

23:50

is worse. Even Steve Case

23:52

at one point when he was collecting all

23:54

this information, we were talking about the hacking,

23:56

the possibilities of hacking it. And

23:59

he was at one investor meeting and

24:01

he was bragging about how he made $63

24:03

off each person, right? $63.

24:06

I mean, that's what the data that they got, that

24:08

they know about you, is worth. You're the

24:11

product. That's correct. And I raised

24:13

my hand and I said, where's my $30?

24:16

Like, why are you taking my things? And

24:18

I think one of the problems that it led

24:20

to is this sort of thing which Scott Galloway

24:22

calls the idolatry of innovators and I think that's

24:25

true. We're in this

24:27

strange moment now with generative artificial

24:30

intelligence, AI. And you were recently

24:32

talking to Sam Altman who you

24:34

know... Yesterday. Last night, 12 hours

24:36

ago. But you also said there

24:39

needs to be a certain kind of

24:41

personality in this next generation. There does.

24:44

There also needs to be an aggressive

24:46

person too. Like you said, contemplative; also,

24:48

he is much more. And this is

24:50

not to pick on him in particular, but I feel

24:52

like AI is being greeted with far

24:54

more skepticism. Yeah. In some

24:56

corners, panic. Panic. It's a very different science fiction,

24:58

but go ahead. Yeah, yeah, sure. But it's a

25:00

very different... like, this is gonna be so cool,

25:03

we're gonna get to do... but it's much more

25:05

like, whoa, yeah, are the robots

25:07

gonna kill us all? And what

25:10

are those lessons that you take from

25:12

the last couple years of reporting

25:15

into this next phase? You know, it's interesting, because

25:17

I'm in the middle. I'm like Dr. Fei-Fei Li: I'm

25:19

in the middle. And people are surprised,

25:21

or, like, don't you want to stop this? I'm like,

25:24

we could solve cancer. No, I

25:26

don't... again, I don't think I'm too hopeful,

25:28

because I'm very aware of the

25:31

dangers I think what's good about this

25:33

year is the actual makers of these

25:35

things are contemplating the dangers of them

25:37

and honestly the first person who talked about it with

25:39

me 10 15 years ago was

25:42

Elon Musk was very worried about

25:44

that he was sort of deep in the sci-fi

25:46

part of it and it shifted a little bit

25:48

over time. At first it was Terminator, it

25:51

was the Terminator, this is it. And

25:53

then it became, well, they'll treat

25:55

us like house cats and then it was

25:57

like, oh, it's really like you're

25:59

building a road, and you

26:01

go over an anthill and, to kill it, you

26:03

don't think about it. Like it became that, which

26:05

I think is probably more accurate. It'll

26:08

only do what we tell

26:10

it to do. Right. But how

26:12

should reporters approach this now based

26:15

on what you went through those early years? I

26:18

am putting heavy pressure

26:20

on regulators right now. I spent a lot

26:23

of time texting regulators saying, what

26:25

the are you doing? Do you think people

26:27

didn't do that enough in the past, not

26:29

you, the broader tech press? Yes, I think it

26:31

was... we just let it go. It is an astonishing thing

26:33

to think about the most powerful companies in the

26:35

world and the richest people in the

26:37

world in the history of the world. Not

26:40

having any regulation applied to them and the

26:42

regulation that applies to them is advantageous, which is

26:44

Section 230. It's really unusual for

26:46

the richest people in the world not to be

26:48

able to be sued at all, unlike

26:51

pretty much everybody else. Yeah. But

26:53

again, we've talked about this in Congress.

26:55

It's not like banking regulations where there's

26:58

a constituency of lawmakers who want to

27:00

do more regulations and the banks push

27:02

back. But there's a push pull. But

27:05

there are regulations. With tech, it's

27:07

often like we don't want to squash

27:09

innovation. Right. That's what they do. Silence.

27:12

Silence. Right. We won't squash innovation. And

27:14

so that's been their excuse: we

27:16

won't want to. And honestly, I think

27:18

what has squashed innovation is our reliance

27:21

on them to self-regulate and therefore disaster

27:23

after disaster. And so are we headed

27:25

in the same direction? A couple of

27:27

years ago in the New York Times,

27:29

I compared Mark Zuckerberg... it's like, he's

27:31

got a butcher shop. And he's

27:33

like, some of the meat is probably going to

27:35

kill you. But I don't know which one.

27:37

But it's not my responsibility to worry about

27:40

the rancid meat. You're just going to have to.

27:42

Good luck. That kind of thing. And he got mad

27:44

at that. But I felt like that was

27:47

a good thing. But to me, ultimately, now

27:49

this guy just wants to make money. And that's why the

27:51

first line of the book is, 'So it was capitalism after all.'

27:54

And I think that our regulators are now on

27:56

the hook because these people are here for the

27:58

shareholders. But they look outmatched, frankly. There

28:00

are a lot of them. And I just

28:02

think, I don't know a

28:05

single lawmaker that is going to

28:07

go toe to toe with this person, that can

28:09

be toe to toe with that. Well, that's the

28:11

thing. It's astonishing to think the federal government

28:13

does not have enough power, and they don't.

28:15

They really don't. I mean, these people have,

28:17

I jokingly said, it was around some

28:19

Facebook people. I was like, what are there? Like 80 PR

28:22

people just on me? And they're like, no,

28:25

40. And I was like, oh, yeah.

28:27

You know what I mean? But Joe Biden in a

28:29

state of the union, the president talked about trying

28:31

to do regulations, I think specifically

28:34

against voice imitation.

28:36

Yeah. Well, that's because

28:38

he got got. He got got. They did all these

28:40

things with him. But is that a moment for

28:43

someone to? There have all been

28:45

moments. You know, when there was a

28:47

moment, the insurrection, like, look, I, you

28:49

know, sorry, it was, it was

28:51

one of the best pieces I wrote was in 2019.

28:53

I wrote a column in the New York Times,

28:56

my first column in

28:58

the New York Times in 2017. I called

29:01

them digital arms dealers. That's why I did

29:03

that column. So I wanted to alert public

29:05

officials to what was happening. And so I said,

29:07

these people are really hurting us. I

29:09

wrote a column before COVID saying they're going to

29:11

have more power than ever after COVID, because we're

29:13

gonna have to rely on them so much. And

29:15

then they'll be richer and they'll be trillion dollar

29:17

companies and then we can't stop them. But

29:20

during 2019, I

29:22

had argued with Mark about Alex

29:24

Jones, and he moved into the Holocaust. Now, I

29:27

was worried about anti-Semitism. That was a very

29:29

famous interview where he said Holocaust

29:31

deniers don't mean to lie and he wasn't going to

29:33

take them off the platform. And I said, you

29:36

are going to reap the rewards of what

29:38

you're doing here in a very bad way. But

29:40

the column I wrote was saying, when

29:43

he loses in 2020,

29:45

he's going to say it was stolen. And

29:47

then he's going to make sure he says

29:49

it over and over again, he's going to

29:51

radicalize his people, he's going to, it's going

29:54

to go up and down the right wing echo chamber. It's

29:56

all over the place, the right wing echo chamber. It's a story

30:00

about people who rise

30:03

to power virtually unchecked. Yeah.

30:06

And once they get there, they don't want any kind of accountability. They

30:08

don't. You and I have talked about how

30:10

hard it is for lawmakers to get it together. And

30:12

now we're staring down this dark tunnel of AI.

30:16

This is the most important computing change

30:18

in a generation. And you've

30:21

moved to Washington. Yes. To

30:23

talk more about regulation. Yeah. Are

30:26

there willing ears? There is.

30:28

Is there a constituency in all of us? Do

30:30

we care enough or is it more convenient for

30:32

us to order food and helicopters

30:34

or whatever, rather than to

30:36

actually deal with this? Because I

30:38

don't want to be in a love story with flames

30:41

on it 10 years from now. You know what I

30:43

mean? I would like to get this. Haven't

30:45

we all? I would like to get it right. Well,

30:48

here's the problem because it's already

30:50

inbuilt. The lobbying is

30:52

astonishing. There is a

30:54

feeling by, you know, you saw those

30:56

parents face Mark Zuckerberg the other day. Right.

30:59

They're like, what are you doing? But

31:01

even that wasn't enough. He didn't apologize. He

31:03

said, I'm sorry for what happened to you.

31:06

That is not an apology as far as

31:08

I can understand it. And I

31:10

saw his eyes and he was quite upset. You

31:12

know, he's a person. I know he doesn't

31:14

seem like one, but he is. And

31:16

I think one of the problems is it just it's

31:18

always pushback. And by the way, it's not just Republicans

31:21

who just spend their time just

31:23

cosplaying like TikTok. Let's

31:26

get them. Like, that's not our biggest problem. It is a problem.

31:28

We are exactly going through this fight on

31:30

Capitol Hill right now about TikTok. But guess

31:32

what? It is a problem. I wrote about

31:34

it five years ago. But

31:37

what we really need is basic things, artificial

31:39

general intelligence, safety regulation. A privacy

31:41

bill would be nice this 10,

31:44

20 years into it, a national

31:47

privacy bill, a data transparency bill,

31:49

an algorithmic transparency bill. Like

31:51

there's 10 of them. I wrote a story a long time ago

31:53

called the Internet Bill of Rights and named the 10. You

31:55

can say the EU, the European Union has acted,

31:58

started to act and made a difference. No,

32:00

they have, actually. Why do you think they are able to

32:02

do it in a way that the US can't? Because

32:04

they don't get the benefits from these companies

32:06

that our country does, right? Meaning? The US

32:08

companies, the financial benefits, they are getting the

32:10

benefits, they're getting a lot of the negatives.

32:12

Their governments aren't bought and paid for by

32:14

these tech companies. And so

32:16

they are not under as much pressure. So Margrethe

32:18

Vestager, I'm actually talking to her next week, is

32:21

able to put these fines on, and

32:23

she's not getting pushback. It's a much

32:25

more privacy-oriented world over there in Europe,

32:27

it just is. Are you

32:29

in your activist era? No.

32:33

You know, I think I have been, if

32:35

you go back, I was, I think if

32:37

you... But this is different. Moving to Washington,

32:39

you're burning the relationships. Yeah. Not

32:41

that hard, because they're like all on the

32:43

book tour, but you're speaking more publicly.

32:46

No, only the people

32:48

I like are on the book tour. Elon and

32:50

I are not going to be doing appearances together.

32:52

That's fair, but you trust all of them more than I

32:54

do right now, let's put it that way. Well, it's a

32:56

low bar. It's a low bar. If

32:59

I can impact him, sure, I

33:01

want to impact him. I did say to him last

33:03

night. That's why I'm asking, do you see

33:05

that also now as part of your work?

33:08

Yes. I said to him, I

33:10

said, I have great hopes that you'll try to at least

33:12

do... I know it's a big money-making thing for all of

33:14

you, and you're already a billionaire, so

33:16

I suppose you have enough money. I

33:19

said, if you start down the ugly road of Elon

33:21

Musk, I am going to turf you. I think I

33:23

said that to him, I'm going to turf you. I'm

33:26

going to try to turf you, at least, kind of

33:28

thing. So yeah, I think I can have this... Turf

33:30

you, meaning? Put you down. Put you down. Like,

33:33

don't get up. Don't get up,

33:35

that kind of thing. But my idea is

33:37

I'm watching you, and I'm hoping for the

33:39

best from these people. And look, there's a

33:41

lot of entrepreneurs that are doing climate change.

33:43

You're hoping for the best, but you're ready to take

33:45

them down for the worst. Yes, and I think

33:48

by nature, because the money involved is

33:50

so big, it's so massive, they're going

33:53

to want to advantage shareholders. That's

33:55

what they do. That's what they do.

33:58

But in doing this, I'm hoping for the best, that

34:00

some of them and I would say you

34:03

can find people who do

34:05

understand the gravity of the situation like

34:07

a Satya Nadella at Microsoft. It's

34:10

not activist. It's more, you know, speaking

34:13

of CNN, I think Christiane Amanpour,

34:15

I really like that saying, truthful,

34:17

not neutral. I'm not neutral about

34:19

this. And I think this new

34:21

era, if Trump wins... he

34:23

understands the uses of online

34:26

to manipulate and

34:28

quash people. And

34:31

so I see like this could go real

34:33

bad, real fast. And that's the whole point

34:35

that I'm trying to make, which

34:38

I did, I think with the Star Trek Star Wars

34:40

metaphor, which is we could go Star Trek,

34:42

which is good, or we could go

34:44

Star Wars, which is bad. But it's

34:47

not the tech that's the problem. It's the

34:49

people manipulating the tech. So I guess

34:51

you could say I'm an activist. I'm

34:53

an activist for unaccountable

34:55

power, not being unaccountable,

34:58

and that we should not

35:01

have private companies running space. Governments

35:03

and citizens should be involved.

35:05

We should not have giant companies

35:07

control AGI. And that's what

35:09

they're doing right now. So if I keep saying, hey,

35:12

big companies are controlling AGI

35:14

government, what are you going to do

35:17

about it enough times that someone will listen

35:19

and they will also listen because they think

35:21

the people that can do something about it

35:23

are listening to me, which is often the

35:25

case, right? That happens a lot.

35:29

Kara Swisher. Thank you so much. Amazing.

35:32

Wonderful. Kara Swisher

35:34

is the host of the podcast

35:36

On with Kara Swisher and Pivot

35:38

with her co-host Scott Galloway. Her

35:41

new memoir is called Burn Book: A Tech

35:43

Love Story. We spoke at

35:45

American University in Washington, D.C. Special

35:49

thanks to director Amy Dacey and

35:51

everyone at the Sine Institute who made

35:54

us feel so welcome. That's

35:56

all for today. We'll be back with a

35:58

new Politics episode on Monday. The

36:02

Assignment is a production of CNN

36:04

Audio. This episode was produced

36:06

by Dan Bloom. Our senior

36:08

producer is Matt Martinez. Our

36:10

engineer is Michael Hammond. Dan

36:13

Dzula is our technical director. And Steve

36:15

Lickteig is the executive producer

36:17

of CNN Audio. We got

36:20

support from Haley Thomas, Alex

36:22

Manissary, Robert Mathers, John Deonora,

36:24

Lenny Steinhardt, James Andrus, Nicole

36:27

Pessaroo, and Lisa Namaril. Special

36:30

thanks to Katie Hinman. I'm

36:32

Audie Cornish, and thank you for listening.

36:38

We all do things our own way. And since the way

36:40

that each of us sleeps is unique, you need a bed

36:42

that fits you just the right way. Sleep

36:44

Number Smart Beds make your sleep experience

36:46

as individual as you are, using cutting-edge

36:49

technology to give you effortless, high-quality sleep

36:51

every night. J.D. Power ranks Sleep Number

36:53

number one in customer satisfaction with mattresses

36:55

purchased in store. And now, the Queen

36:57

Sleep Number C4 Smart Bed is only

36:59

$1,599. Save

37:01

$300 only for a limited time.

37:04

For J.D. Power 2023 award information,

37:06

visit jdpower.com/awards, only at a Sleep

37:08

Number store or sleepnumber.com.
