A device that records your entire life? Yeah, sure great idea...

Released Friday, 11th April 2025

Episode Transcript


0:00

Hey everybody, just a

0:02

quick note before we get to

0:04

the rest of the show, you

0:07

will notice that we do not

0:09

really talk about the big story,

0:11

tariffs, which is of course rocking

0:13

the markets globally, but that dear

0:15

listener is by design. We want

0:17

to give a little bit of

0:19

time for this story, which is

0:21

just changing so much by the

0:23

day and come with a more

0:25

fulsome discussion next week. We have

0:27

a great guest lined up to

0:29

talk about it, and hopefully there'll

0:31

be also a tad bit more clarity.

0:34

But now, let's get back to the

0:36

rest of this week's episode. Hello

0:39

and welcome back to the Times

0:41

tech podcast with me, Katie Prescott,

0:43

and you, Danny Fortson. Katie here

0:45

in the city and Danny in

0:47

the valley. Hello Katie. Hello, good to see

0:49

you. So what do you have for us

0:51

today? I have a question. Go on.

0:54

Have you ever watched Black Mirror

0:56

and thought, this terrifying bit of

0:58

technology that is ruining the fabric

1:00

of society? We should do that. This

1:02

is a really embarrassing admission

1:04

for a tech journalist, but I've actually

1:06

never seen Black Mirror. Oh my gosh.

1:09

I feel like I'm going to get

1:11

fired for admitting it. Yeah, keep on

1:13

doing it. Sorry. Well, so this obviously

1:15

will answer my next question, which is

1:18

going to be, have you ever watched

1:20

the episode called The Entire History of

1:22

You? Sounds very good, but no. So

1:24

I've asked ChatGPT to summarize this

1:27

episode for you. Big spoiler alert

1:29

for any listeners. For you, Katie,

1:31

I will have to spoil it

1:33

because it's important for what we're

1:35

about to discuss today. Okay, spoil

1:37

away, go for it.

1:39

Okay, so I've asked

1:41

ChatGPT to make

1:44

this summary dramatic, insert

1:46

tense music, producer

1:48

Callum, in a world where

1:51

every moment is

1:53

recorded through a brain

1:55

implant called the grain.

1:58

I'm serious here. Keep it

2:01

together. One man rewinds

2:03

his memories to uncover

2:05

the truth about his

2:07

marriage. But the more

2:09

he sees, the less he

2:11

can forget. And the truth

2:13

tears everything apart. In the

2:16

end, to forget is the

2:18

only freedom. Did I sound like

2:20

a dude who's like, you know,

2:22

on the movie trailers? Yeah, you

2:24

could be on Netflix.

2:26

There's another job waiting

2:28

for you. So would you watch

2:31

that? Would you watch what I

2:33

just described? To uncover the truth.

2:35

Yeah, it sounds gripping. Good

2:37

story. Yeah. Question two,

2:39

the grain, the device that allows

2:42

Liam in Black Mirror to replay his

2:44

entire life. What do you reckon?

2:46

Do you think this is a

2:48

good idea? Think you'd buy one?

2:51

Rewinding your memories to uncover

2:53

the truth about your marriage.

2:55

I mean, sounds like a

2:57

disaster waiting to happen. I

2:59

don't know that it's that

3:01

different to what we're doing

3:03

now, carrying mobile phones around,

3:05

but maybe I'm wrong. Fair, fair. Well,

3:07

today's guest I've been speaking

3:10

to is Dan Siroker. He's

3:12

the founder of a company

3:14

called Limitless AI, and Limitless

3:16

have created something they call

3:19

the world's most wearable AI.

3:21

It's a magnetic brooch called

3:23

the Pendant. And according to

3:25

their website, it remembers what you

3:28

say throughout the day from

3:30

in-person meetings, impromptu conversations and

3:32

personal insights. Personal insights.

3:35

Wow. That's amazing. Yeah, so many

3:37

questions. What's not to like? Now

3:39

in fairness to Dan, the Pendant, unlike

3:41

the Grain in Black Mirror, doesn't

3:44

film what you see, it only

3:46

records what you hear, and he

3:48

takes issue with the Black Mirror

3:50

comparison, but we'll get into that.

3:52

Okay, well I'm desperate to hear what

3:54

he has to say. Should we go

3:56

straight into it? Yeah, so let's do it.

3:59

Here he is, Dan Siroker of Limitless AI.

4:01

In my 20s, I started to go deaf.

4:03

I have a genetic condition called otosclerosis. And

4:05

when I tried a hearing aid, it was

4:07

magical. And it was magical for a few

4:09

reasons. One, I could hear again, which is

4:11

great. But the second, which is this big

4:13

insight I realized, is my hearing had gone

4:15

so bad, so gradually, that I didn't even

4:17

know what I was missing. For example, the

4:19

first time I turned a faucet on after

4:21

getting a hearing aid, I was like, in

4:23

shock, like, that makes sound, this high-pitched sound.

4:25

What's that racket? Yeah, if you didn't know

4:27

that faucets make sound, get your hearing checked.

4:29

But yeah, you turn a faucet on and

4:31

it makes this high-pitched sound. And like in

4:33

that moment, I had this epiphany. We are

4:35

living our lives with biological limitations, and we

4:37

don't even realize they're there. Now, 12 years

4:39

later, I've been looking for ways that technology can augment human

4:41

capabilities and give us superpowers. And that's what

4:43

it feels like to overcome your biological limitations.

4:45

That led me to memory. 90% of memories

4:47

are forgotten after a week. It gets worse as

4:49

you get older; typically people's memory peaks when

4:51

they're 20 years old, and then every year

4:53

after that it's worse. So that's what we

4:55

mean by overcoming your biological limitations, your mind's

4:57

limitations, is to do the same thing a

4:59

hearing aid did for my hearing, but for

5:01

your mind, for your memory instead. So what

5:03

is the thing? The thing that you are wearing,

5:05

for people who won't see

5:07

the video, you're wearing this little kind of,

5:09

it's about the size of a quarter-ish? Yep.

5:11

That's kind of clipped, it looks like, magnetically

5:13

to your collar. That's right, yeah, it's clipped

5:15

magnetically to my collar. We also have a

5:17

necklace that goes with it. You can put

5:19

it on a little necklace, you know, jawbone

5:21

design here. And the idea behind the pendant

5:23

is it's a wearable AI device. You put

5:25

it on in the morning, you set it and

5:27

forget it, I've worn it now every day,

5:29

every moment of every day for months, and

5:31

just like my hearing aid, I put it

5:34

on in the morning, I can't imagine living

5:36

a day without it. The same is probably

5:38

true for people with glasses. Could you imagine

5:40

going around your life without glasses? Maybe there's

5:42

some moments you would. And so the same thing happens with the pendant.

5:50

It could be as simple as a cute

5:52

thing your kid said. I have three young

5:54

kids and I capture cute things they say

5:56

all the time to maybe an action item

5:58

in a meeting or maybe you're listening to

6:00

a podcast like this one and a few

6:02

hours later think, what was actually that point

6:04

Dan was saying, what was the analogy he

6:08

was making? And now you can ask those

6:10

questions, anything that's on the tip of your tongue. If you've ever had this feeling of racking your brain, like where was that, what was that, what did I hear, or maybe connecting the dots in your thoughts, in your past. So in other words, it's effectively a listening device that

6:42

transforms your daily life into a searchable document.

6:44

Yeah, it's searchable, not just as searchable from

6:46

like a full-text search, but really powered by

6:48

AI. You can ask questions of your past

6:50

in a way you can never do it

6:52

before. For example. Yeah, for example, you could

6:54

ask it, you know, I ask every morning,

6:56

actually, I've set it up to automatically give

6:58

me a push notification where I have this

7:00

prompt that says, be a better

7:02

co-worker, be a better parent, be a better,

7:04

you know, husband. Give me examples throughout my

7:06

day of ways where I could do that.

7:08

Give me, you know, one key takeaway. And

7:10

it'll give you these magical moments that you

7:12

don't even realize. You, you know, we all

7:14

can. Can you give me an example of

7:16

something that the AI told you recently

7:18

that kind of judged you and said you

7:20

need to do this better? Sure, yeah. So

7:22

I actually put this out when I was

7:24

demoing this feature when we launched these daily

7:26

notifications, I ran it live on the recording

7:28

for the first time, so you could actually

7:30

even see my reaction when I see it

7:32

for the first time. And the advice it

7:34

gave me, the one concrete thing was, you

7:36

know, you could give more praise to your

7:38

team. And it gave three examples from just

7:40

that day, where in stand-up in the morning,

7:42

an engineer is talking about, boy, this is

7:44

really challenging, we overcame this, and I responded

7:46

with, okay, cool, and I moved on, where

7:48

it gave me specific language, saying, hey, look, if you had just responded with, wow, that sounds really challenging, you know, really appreciate you putting in the extra effort, that would land with them. As a leader, as a manager, those

7:59

moments of praise are things that I know,

8:01

have a huge positive impact. It's something as

8:03

a manager, as a leader, I could have

8:05

done better and I didn't know in the

8:07

moment, I'm just moving on to the next thing,

8:09

I'm not paying attention where you know if

8:11

you're watching a sporting event and

8:13

you know in American football somebody throws an

8:15

interception, that quarterback is within minutes on the sideline looking at a printout of what

8:21

they missed what did they miss? Where was

8:23

that safety? They're just watching the game tape

8:25

on the iPad, but for your life. And many

8:27

times, the ones that hit home for me

8:29

are ways I could be more, pay more

8:31

attention as a dad, opportunities where my kid

8:33

says something and maybe I missed it and, you

8:35

know, where I could go deeper. And so

8:37

yeah, it's become a key part of it.

8:39

It's a life coach for me, it's a

8:41

work coach. And until you have those concrete

8:43

examples, you don't even realize, I bet you

8:45

most people listening to this think, I'm doing

8:47

a pretty good job. There's nothing I could

8:49

do to improve. You'd be shocked if you

8:51

just look at one day of your life

8:53

and you could ask AI what you could

8:55

do better, you'd be shocked at what it

8:57

would come up with. And yeah, that's not

8:59

for everyone. Not everyone wants to be better

9:01

in life. But if you do, this is

9:03

magic. It feels like there's so much judgment

9:05

implicit in that. Well, you know, not everybody

9:07

wants to be better. Some people want to

9:09

be, you know, not very good. I would

9:11

say there's a spectrum of self-awareness and self-improvement.

9:13

And I'll give you a concrete example. When

9:15

we launched this feature, it pissed people off.

9:17

People got a push notification in the morning

9:19

saying, how dare you? How dare you send

9:21

me a message? And you know, if that's

9:23

how they feel when they go back to

9:25

the conversations they've had in the past, then okay, maybe

9:29

we're not for them, but we would rather

9:31

lean into the people who are, you know,

9:33

wearing an Oura ring. They're trying to be

9:35

better at what they do, and if they're

9:37

open to that, you know, this is, this

9:39

is not, most tech doesn't tell you ways

9:41

you can be better. Most tech tries to

9:43

play it safe. This is an example where we're kind of leaning into

9:49

the edginess and the ability for us to

9:51

help people. How does it work? Just because I

9:53

think the mechanics of this you know so

9:55

much of this is about design and how

9:57

it kind of lands on people as you

9:59

say when you launched a lot of it

10:01

made a lot of people angry. So

10:03

let's say yesterday I wear the pendant all day,

10:05

the AI is surveilling, and then it comes

10:07

back, man, when Danny was hanging out with

10:09

his boys, he should have done this or

10:11

that, or he could have done this better,

10:13

whatever. I wake up, what does that push

10:15

notification look like? What is it called? Is

10:17

it like your daily self-improvement notice, and is

10:19

it, and how is that delivered? Yeah, so

10:21

you said listening device earlier

10:24

and surveil just now, and so you have

10:26

a connotation here that I just want to

10:28

address. Which is, and I'll answer your question,

10:30

but first I just want to say, your

10:32

privacy and the privacy people you meet with

10:34

are very, very important. It is critical, critical

10:36

for our product to do well for you

10:38

and society at large. And I just want

10:40

to head that off, that this is not

10:42

a device that's surveilling

10:44

you. There's no AI sitting there, like, you

10:46

know, twiddling its thumbs, oh, I wonder what

10:48

Dan's up to. This is a machine that

10:50

you're asking questions of. There's no nefarious activity;

10:52

the things, the conversations you have with others

10:54

are shareable, you know, very often when I

10:56

meet somebody for the first time and I

10:58

say, hey, can I, can I wear this

11:00

thing? I'll capture the conversation, I don't have

11:02

to take notes, I'm happy to share it

11:04

with you afterwards. 100% of the time, people

11:06

are fine with it. I haven't even

11:08

had a single person say to me, oh,

11:10

that's weird, I don't feel comfortable. That's because the

11:12

perception of a device like this is so

11:14

different than the reality, especially when I can

11:16

share the artifact. I can share the summary,

11:18

I can share the action items, I can

11:20

share the conversation, and that neutralizes it. Most

11:22

people get afraid of recording devices when it's

11:24

about control, when it's about cancelling me. Well,

11:26

what are you going to do with this

11:28

voice, you know, this conversation afterwards? And when

11:30

you tell them, look, it's just because

11:32

I care what you say, dude? I want

11:34

to know what you're going to say. I

11:36

want to remember it. I've got a kind

11:38

of a bad memory. It'd be really great,

11:40

you know, to be able to go back

11:42

to it afterwards if it's useful. And, you

11:44

know, maybe you say something serendipitously now that

11:46

I might not remember later. And I just,

11:48

I care about you and I want to

11:50

remember it later, and I'll share it with you. I've had friends give me shit about it, and they're

12:16

like, one guy says, hey, are you a

12:18

narc? But that's about it, literally, every single

12:20

moment, every single day, for months, this

12:22

has not been an issue. But so since

12:24

we're here, because I was looking on your

12:26

website, and there's a lot about privacy and

12:28

how important it is, and the lengths

12:30

to which you are going to kind of

12:32

ensure that this is kind of encrypted, and

12:34

nobody can hack it, etc. But how does

12:36

that work? Because if you're wearing that, is

12:38

it on you as a wearer to be

12:40

like every time you have a conversation, every

12:42

time you walk into the supermarket, every time

12:44

you walk through the office to tell everybody,

12:46

hey, I've got this listening device, and is everybody

12:49

cool with this? In other words, what is

12:51

the system or is there one? Because it

12:53

feels like right now it's relying on people

12:55

to be very proactive, and most people just aren't like

12:57

that. Yeah, so we have a bunch of

12:59

stuff in our help center all about consent,

13:01

if you search for consent, we've got articles

13:03

and the thing in the app, we teach

13:05

users how to do this. And so I'll

13:07

say a couple things. One, there's a legal

13:09

bar for consent and then there's a much

13:11

higher moral, ethical, behavioral, social acceptance bar,

13:13

which is the one we want people to

13:15

adhere to. And the two magical things that

13:17

we have learned about that higher bar is

13:19

what I did earlier. I tell you why

13:21

I'm wearing it. It's because I want to

13:23

remember what you say. It's not about me

13:25

trying to cancel you. And I'm willing to

13:27

share it with you. Those two things I

13:29

think are the magic ingredients to social acceptance.

13:31

People are like, OK, cool. You're just an

13:33

honest person trying to be better, trying to

13:35

remember, and you're not going to lord this

13:37

over me. It's something we have a shared

13:39

artifact for. We have found that actually,

13:41

we had all these technology solutions to this

13:43

problem. We thought about consent mode and all

13:45

the technology, and we have built a lot

13:47

of that, but we realize it's more of

13:49

a human thing. It's more of a the

13:51

way you introduce it thing, and that is

13:53

what unlocks the social acceptance. So when you

13:55

talk about those, there's a moral bar of

13:57

like just being straight with people. Yeah. And

13:59

then there's the legal bar. What's the legal

14:01

bar? So the legal bar in every

14:03

country and state is different. But part of

14:05

the legal part is there's a notion of presumption

14:07

of privacy. So if you're at a restaurant

14:09

or if you're at a sporting event, there's no presumption of privacy there; anybody could take a

14:19

picture of me. So all the legal stuff

14:21

happens when there is a presumption of privacy.

14:23

In America, it's state by state. So in

14:25

America, you know, there are some states that are called one-party consent states, where the person who's wearing it is the party and they're the only one who has to consent. And then there are other states with a higher bar, where you have to

14:47

tell them legally that you're going to record

14:49

them and get their consent. And so that

14:51

is, that's sort of the legal bar, but

14:53

again, that's much lower than the higher bar.

14:55

And the higher bar again is just being

14:57

a good person. It's being a good person

14:59

and explaining the why behind the device and

15:01

you know most people you tend to think

15:03

I mean maybe you as you know you

15:05

interview a lot of new people most people's

15:07

worlds are pretty small; they interact with their family, their coworkers, their friends. It's not like a thousand new people you meet every day. This is why, for most

15:24

people listening, we're like, oh, this is weird.

15:26

Why would I do that? Imagine a world

15:28

before Uber. You got in. And I told

15:30

you, look, I've got an idea for a

15:32

company. You're going to get into a random

15:34

person's car that you were matched with on

15:36

the internet. A random person you never met, and you're going to get in their car, matched on the internet within minutes. That

15:48

would have been shocking. And the first time

15:50

you ride in an Uber, you're like, oh,

15:52

this is kind of neat. So all

15:54

your perception, your concerns, your fears go away

15:56

as soon as you try it. And the

15:58

same thing is true of the pendant.

16:00

As soon as you feel that magical moment

16:02

of being able to go back to a

16:04

moment in your past and see the ways

16:06

you could have been better or done better

16:08

or even just remember something that you would

16:10

have otherwise forgotten or that cute thing. The

16:12

first time your kid says, I love you,

16:14

that was one of the first things we

16:16

got from a customer is like, hey, I

16:18

was wearing this pendant with my one-year-old on

16:20

a swing. They turned around and they said,

16:22

I love you for the first time. They

16:24

were able to capture that moment with the pendant. No, the whole

16:52

world. Yeah. Whole world. And so are there

16:54

any, and I don't know if you know

16:56

this off the top of your head because the

16:58

world's a big place, but like what, what,

17:00

if any, differences there are in terms of

17:02

what you can and can't do or have

17:04

to and have to not do in the

17:06

UK in terms of privacy consent, etc. Yeah,

17:08

I'm not an expert on the UK. I

17:10

have seen, though, that America tends to be

17:12

actually more restrictive and more legalistic here. And

17:14

even socially, it's like, even the word consent

17:16

is actually a very loaded word. There's a

17:18

lot of other connotations for the word consent that have nothing to do with recording. So that

17:24

brings in sort of this baggage, this conception

17:26

that I do think creates a very high

17:28

perceived concern, but not in reality once you've tried it. Take the

17:30

phone call. And that's totally fine. You just

17:32

say, oh, I'll record this call. It

17:34

doesn't do the beeping thing. It just records

17:36

the call. People realize, okay, cool. I'm going

17:38

to record the call. So it is, you know,

17:41

it is kind of this social, contractual artifact.

17:43

It's actually, the roots of it are in

17:45

government surveillance. It's nothing to do with me

17:47

capturing. It's the government. It's protecting you from

17:49

your own government. You know, that's why you

17:51

need one-party consent, so the government can't surveil you. I

17:56

want to talk about the business a bit.

17:58

So you, when did you launch this company

18:00

or when did you start this whole thing?

18:02

So this company actually is five years old.

18:04

We started with a very different idea and

18:06

we've pivoted now a few times. We just

18:08

announced the pendant in October '23. We're now just

18:11

delivering on these pendants. So it's been a

18:13

while since we announced it. We've built it, we've

18:15

mass produced it, we have a whole factory

18:17

set up in China, mass produced these things,

18:19

shipping them out now. So we've got thousands

18:21

of these going out, you know, every week.

18:24

Well, I was going to say, so how

18:26

many are like out in the wild and

18:28

like how far along in that kind of

18:30

trajectory are you and actually getting this out

18:32

in the world and kind of, because

18:34

the proof of the pudding is in the eating, right?

18:36

Yeah, I would say we unequivocally have product

18:39

market fit and having now searched for that

18:41

for five years, it feels really

18:43

good to be able to say that. Oh

18:45

my gosh, were we out in the desert

18:47

looking for ways to get there? And this

18:49

is not my first rodeo, so my first

18:52

company, Optimizely, actually got product market fit

18:54

very quickly. So I have not been used

18:56

to spending years and years of my life

18:58

trying to build something that people want and

19:00

failing. So finally, I think we've achieved that.

19:02

And you know, obviously there's degrees of product

19:04

market fit. This is not, you know, ChatGPT

19:07

that's got hundreds of millions of users

19:09

in a few months. But it's tapping into

19:11

something really human and we've been thrilled with

19:13

the results. You know, we've got thousands of

19:15

these out in the wild. And people realize that this isn't

19:17

just a tape recorder. This isn't just a

19:20

tape recorder in the cloud. There's a lot

19:22

of products out there that you can use

19:24

voice memos. We sort of made this conscious

19:26

bed. If we build a product with this

19:28

premise that people are going to set and

19:30

forget it, they want to carry it and

19:32

wear it without having to annotate their day,

19:35

without having to turn it on and off,

19:37

it just sort of organizes your life for

19:39

you. That's something I didn't mention. It sort

19:41

of organizes everything you've said and heard into this life log. But passively, it just

20:09

captures my day. And that premise has really,

20:11

really paid off. I think people really value that.

20:13

They're busy. They want something that just can

20:16

help them be a little bit better and

20:18

we can do that. What's the gender breakdown

20:20

of people buying this thing? Is it mostly

20:22

dudes? It is mostly dudes. That is a

20:24

good question. I'm curious why you ask. It

20:26

does skew male. And that might be a

20:28

byproduct of just social media. And like, you

20:31

know, I'm not on Instagram or other, I'm

20:33

mostly on Twitter slash X and that's now

20:35

skewed very male. So I do think that's

20:37

more a byproduct of where we've been. We

20:39

had a couple of early adopters use us

20:41

and put videos up that went viral on Instagram. So maybe that's where we're getting more of a gender balance. A personal question. Are you married or have

20:50

a partner? I am married. Yeah, with three

20:52

kids. What does your wife think of this?

20:54

Great question. My very first thought when I

20:56

had this idea for the pendant many, many

20:59

years ago was fear. I was terrified. My

21:01

wife would hate this idea. And so, she's very honest,

21:05

very, like, she's both my biggest fan and I think my most honest critic. It captures

21:09

conversations. And my fear was she would hate

21:12

it because now I would finally win arguments

21:14

and she would lose a few. And it

21:16

turned out it was the opposite. She, I

21:18

think, before I did, realized the power of

21:20

clarity, that conflict and miscommunication comes from misremembering.

21:22

Oh, no, you didn't say you're going to

21:25

do that. Oh, I did say that I

21:27

was going to do that. When you have

21:29

a record of a conversation, you can

21:31

go back to it. You have this magical

21:33

ability to remove ambiguity, to remove miscommunication. And

21:35

so she saw it long before I did,

21:37

and she's the most private, if she watches

21:40

it, she's gonna be, so she's the most

21:42

private person I know, she doesn't want me

21:44

to talk about her, but like that was

21:46

actually one of the first people I talked about this idea with, and she kind of lit up, like,

21:50

actually maybe this is better for relationships in

21:53

society. Well, I was gonna say because to

21:55

your point, often conflicts are, well, you said

21:57

this, like, You know, being right doesn't mean

21:59

you've always, quote unquote, won the argument.

22:01

And so I'm just wondering like how this

22:03

fits, like, have you used this to solve

22:05

conflicts in your own life? Because there is

22:08

that kind of, let's call it a misalignment

22:10

of like the record around what was actually

22:12

said or not said. I will give you

22:14

a very concrete example. I tweeted this out

22:16

six days ago. Me. You said you would

22:18

do your homework right after dinner. My six-year-old.

22:21

No, I didn't. Me. Do you want me

22:23

to play back what you said from the

22:25

pendant? Six-year-old. Okay, fine. I'll do my homework.

22:27

So, one example, you know, of getting your

22:29

kid to follow through on a commitment when there's

22:31

no ambiguity in what was said. You know,

22:33

that was a small example, but I do

22:36

think, you know, he should know that when

22:38

he says something, his word should be impeccable.

22:40

It's, he can't just make it up and

22:42

change it after the fact. The pendant can

22:44

help hold him accountable. And how much money have

22:46

you guys raised? We've raised 33 million from great

22:49

investors. Our first investor was Sam Altman, before he

22:51

became Sam Altman, although he's always been Sam Altman. Then First Round Capital did our, I guess, pre-seed, then Andreessen Horowitz at

22:57

our seed, and NEA led our series A.

22:59

Do you think you're ever going to get

23:01

past the Black Mirror effect? Because I mean,

23:04

I'm sure you've been asked a version of

23:06

this question a billion times, and

23:08

I'm sure you've seen the famous episode I

23:10

think it's in the eye, it's

23:12

a recording device in someone's eye and

23:14

their whole life is basically one long

23:17

video recording. The guy goes back and sees there

23:19

is a little something between his wife and

23:21

a friend at a dinner party and that

23:23

leads to a bunch of questions, and ultimately it comes out she's cheated, and he ends up, like, alone, depressed. It's all very dystopian because he's

23:29

like, I can see the evidence. Anyway, it

23:32

gets to that idea that I think, you

23:34

know, a lot of companies out here, there's

23:36

a question of like, well, can we build

23:38

this thing? The answer is often yes. The

23:40

next question is, should we build this thing?

23:42

Do people want it? Or is this going

23:45

to be too creepy? Whatever it may be.

23:47

So how do you guys think about that

23:49

kind of social piece? Is this a thing

23:51

that people are actually going to want? Beyond

23:53

like a very small core of enthusiasts. I

23:55

will tell you with 100% certainty this is

23:57

the future. I have felt the power. With 100%

24:00

certainty there will be millions if not billions

24:02

of people on this planet wearing this or

24:04

one of our competitors in years to come.

24:06

Why are you so convicted? Because of the

24:08

fundamental insight I have, having now worn it,

24:10

is that we all live our lives with

24:13

blinders on, we don't even realize it. Once

24:15

you feel what it's like to go back

24:17

to a moment you've had from your past, that you didn't realize you were living life to its fullest in the moment, you can't go back. I cannot imagine going back. No, you could argue,

24:58

OK, is that good for my brain? Is

25:00

that to be outsourced? But the same thing

25:02

is I don't know people's phone numbers anymore.

25:04

I just type in their name. That's fine.

25:06

Whatever. So I'm convinced with 100% certainty that

25:09

millions, if not billions of people will wear

25:11

this. It's early. And every new technology goes

25:13

through these phases. This idea that

25:19

the wife carried on this cheating affair, you know, without this, and then lied about

25:24

it where it's like if you know in

25:26

a world where there's a track record, just

25:28

like, you know, with my wife now, I know that, hey, if something was

25:32

said I have a record I can go

25:34

back to it. I don't lie, I never lie, I know that there's an impeccable truth behind it. It changes your behavior in a

25:41

positive way. You know, you don't gaslight people, and if you've been gaslit, you can go back and say, hold on, what actually was said? So I

25:49

think the benefits to the person wearing it

25:52

are clear. The benefits to society at large

25:54

are clear. But it's weird. I know it's

25:56

weird. I'm not I'm not immune to that.

25:58

It just like it sounds like what you're

26:00

saying is like on this idea that millions

26:02

and eventually billions of people are going to

26:05

wear this thing. What you're talking about is

26:07

that part of

26:09

that bargain is changing kind of the fundamental

26:11

nature of humans and how they interact. Yeah,

26:13

fundamentally. It helps you in a way. This

26:15

is like glasses and hearing aids. It's like, you

26:17

know, you could live your life without glasses,

26:20

but why would you? I could live my

26:22

life without perfect memory, but why would I?

26:24

And I think the thing that was a

26:26

surprise to me was how clear the disconnect is

26:28

between perception and reality once you wear it,

26:30

and the other major surprise, I didn't know

26:33

how, having worn it, the whole team wears

26:35

it. I was expecting maybe half my conversations

26:37

to be awkward, and it's not. Every time, I just explain why I'm wearing it, and it completely

26:50

neutralizes that. So once I had seen that,

26:52

I realized, okay, the bar to this being

26:54

accepted isn't that one-to-one kind of awkwardness. It's

26:56

people getting over their own perceived concerns, just

26:58

like the Uber. You try Uber one time,

27:01

you get over the fact that it's a

27:03

random guy from the internet. And that same

27:05

thing is true of this pendant. So, you

27:07

know, I hope it's us, I don't know

27:09

if it's us, I'm sure at some point

27:11

Apple and Google are all going to

27:13

make these things. Right now, the beautiful thing is, you know, we've built

27:46

a new category of technology that people realize,

27:48

hey, there's some utility here, and I can't

27:50

imagine life without it. If this

27:52

ages like milk, I would be very

27:54

surprised. Whoever's listening, if the year is 2035

27:57

and there are millions of people wearing devices

27:59

like this, maybe not from us, maybe from

28:01

our competitors, feel free to send me a note,

28:03

you know, hey Dan, this aged like milk,

28:05

your prediction was wrong, your 100% confidence was

28:07

ill-founded. I just think the, you know, to

28:10

your point, there is, you know, who would

28:12

have thought that, you know, 25 years ago

28:14

when the internet first arrived that all of

28:16

a sudden we'd be walking around with super

28:18

computers that we're just obsessed with and staring

28:20

at all day in our pockets. That then

28:22

would have seemed like an impossibility, you know,

28:25

because it was like a big deal when

28:27

Bill Gates said, you know, we're gonna have

28:29

a desktop on every desk and in every

28:31

home. People were like, no, that's crazy. So I'm

28:33

willing to accept a premise that, you know,

28:35

we don't know how things are going to

28:38

go. But I think the fundamental issue around,

28:40

oh, just like, you know, it's a bit

28:42

like CCTV in the UK. So maybe this

28:44

is an apropos example, you know, CCTV is

28:46

literally everywhere in the UK. You know, there's

28:48

like the famous saying, like, if you want

28:50

to go through central London, there's

28:53

almost nowhere where you can go where you're

28:55

not on film. But this

28:57

fundamental issue around consent and just relying

28:59

on people to be good people about

29:02

it and to be really assiduous about

29:04

it every time they go and

29:06

meet a new person, every time they're

29:08

going out on a date, whatever, they'll

29:11

be like, hey, do you mind if

29:13

we record this conversation? But I'm only

29:15

doing this because I think, you know,

29:18

I care about you. I just don't

29:20

think that's realistic. But maybe I just

29:22

don't have enough faith in humanity. Yeah,

29:24

maybe. I don't think it's a faith

29:27

in humanity thing, because I would even

29:29

argue that, what's the worst that could

29:31

happen? Like, honestly, people have, there's a

29:34

psychological trait, people have, that the whole

29:36

world revolves around them. Like, what you

29:38

say really doesn't matter. You know, honestly,

29:40

what if somebody posted on the internet,

29:43

like... it's not that I'm cynical

29:45

about it, but I do think they

29:47

like to think they're more important than

29:49

they really are, and they think what

29:52

they say is super important, and your

29:54

words are going to get you canceled, when

29:56

AI could have made it up. Automatically generated voice clones are just as good

30:01

as humans. So honestly, even if we

30:03

don't... We're going to live in a

30:05

world where like what you say doesn't

30:08

really matter like it almost like you're

30:10

yelling into the wind and

30:12

nobody really hears you. So I just

30:15

think there's so much it's a little

30:17

bit of I think a Luddite mindset

30:19

that like oh like you know. And

30:21

it's also not the government. It's not

30:24

sort of government surveilling you. And that

30:26

to me is much more scary than

30:28

your friend who just wants to sincerely

30:30

remember your conversation with you. I much

30:33

more trust my friends than the government

30:35

with CCTV. And the fact that CCTV

30:37

even, that kind of, that would surprise

30:40

me more that societies would be acceptance

30:42

of that, than individuals who are just

30:44

trying to be better and remember what

30:46

you say. So maybe I'm wrong. Who

30:49

knows? I'll just say this. This

30:51

is the worst this technology will ever

30:53

be. Every day it's getting better. Soon,

30:56

by being able to capture what you

30:58

say, there's a fundamental thing that I

31:00

didn't mention that I think is important.

31:02

LLMs are getting smarter and smarter, but

31:05

still know nothing about you. You know,

31:07

maybe they remember your chat history, but

31:09

like they don't know the conversations you

31:11

have. Soon, the pendant will enable you to

31:14

basically draft emails exactly like I would.

31:16

So instead of me having to write

31:18

from a blank piece of paper, just

31:21

drafts the email just like I would. Maybe after this conversation

31:30

it'll draft me an email, say, hey,

31:32

thanks for having me, we talked about

31:34

this, this, and here's the link you should look at, knowing the context of what I've actually said. Eventually,

31:41

it'll do the boring half of your

31:43

job for you. I'm a big believer

31:46

in LLMs augmenting human intelligence, not replacing

31:48

human intelligence, and eventually it should just

31:50

do agentic behavior identically to me. And

31:53

if we get to those use cases,

31:55

oh my gosh, then how can you

31:57

imagine living life without it? I don't

31:59

know what to do with that. There's

32:02

a lot to talk about. Can we

32:04

start with his enthusiasm? I don't think

32:06

I've ever heard anyone who's so... Completely

32:08

optimistic and I don't know what the

32:11

word is, you know, sort of self-driven

32:13

in the, in really, really bloody believing

32:15

in this thing. And he didn't seem

32:18

to be able to, even when you

32:20

were talking about the consent issue, see

32:22

the issue, see the problem. Well, the

32:24

thing that struck me is he's

32:27

both totally optimistic about, yeah, hey, just

32:29

ask nicely and it's all gonna work,

32:31

but also totally nihilistic as well, him

32:34

just being like, well, in the future,

32:36

there's gonna be so much AI and

32:38

nobody's going to know who's really saying

32:40

anything. So you're not that important. What

32:43

you say doesn't matter. Yeah, you're

32:45

just shouting into the wind. And actually

32:47

that's total nonsense. A lot of what

32:49

people say, a lot of the time

32:52

matters a lot. It might not if

32:54

you're just wandering around in your day-to-day

32:56

personal life, sure, but in the workplace,

32:59

it's hugely important a lot of the

33:01

time. People are dealing with really sensitive

33:03

conversations. Yeah. And the idea that it

33:05

just completely depends on the individual wearing

33:08

it to put their hand up and

33:10

say, would you mind? Well, this is

33:12

what I found so interesting. Yeah, because

33:15

he was like, look. We tried all

33:17

these different technological kind of fixes we'll

33:19

call them to kind of handle the

33:21

consent issue and Lo and behold we

33:24

couldn't figure it out or nothing quite

33:26

worked and so the solution we ended

33:28

up on was, hey, person who's spent

33:31

$200 on this device anytime you meet

33:33

somebody just say how much you care

33:35

about them and you just are really

33:37

trying to be a better person and

33:40

that's why you want to record this

33:42

conversation. Like, that's just so wildly unrealistic,

33:44

it feels like to me. And the

33:46

fact that like, oh, we couldn't actually

33:49

crack this very, very hard problem, so

33:51

we're gonna shift the responsibility back on

33:53

the user in a way that's completely

33:56

informal and not trackable and not enforceable.

33:58

It just, I was just like, oh.

34:00

And very discreet. I mean, you could

34:02

argue, well, and he might argue, well,

34:05

if you were going to record somebody

34:07

secretly, you could do it any way

34:09

you wanted. You could. And of course

34:12

you could, but the fact is this

34:14

is actually deliberately a device that's made

34:16

to do that. It's made to capture

34:18

the whole of your life. And if

34:21

you think about Meta's glasses, they have

34:23

a little flashing light, because they record

34:25

everything as well, can do, but they

34:27

have a light to show that they

34:30

are recording. You'd think that they would

34:32

have designed something on the little badge,

34:34

like a red light that went bing.

34:37

You would think, you would think. And

34:39

I also think it's just so interesting

34:41

that all, you know, this company is

34:43

not alone. There's a bunch of companies

34:46

trying to figure out what is there

34:48

gonna be the quote unquote, you know,

34:50

the AI wearable, the thing that's going

34:53

to like supplant or at least become

34:55

a companion to our smartphones or

34:57

whatever. I mean, his prediction that in

34:59

2035, there would be billions of people

35:02

wearing an AI microphone. What do you

35:04

think about that? I can't, I can't

35:06

see it happening. Can you? But I

35:09

do find it fascinating that he got

35:11

money from Sam Altman before he was

35:13

Sam Altman. Yeah, yeah, yeah. Because actually

35:15

that when I heard him talking, that

35:18

was my question. I thought, wow, Danny's

35:20

gone and interviewed Tigger. I don't know,

35:22

but he's got serious people behind him.

35:24

Yeah, he's got real money. Real money

35:27

and real people. Yeah, totally. And when

35:29

I reported out a story on this,

35:31

so I talked to a few people

35:34

and I found another company who he

35:36

referred to that didn't make it into the

35:38

cut. They're called Friend.com. And it's the

35:40

same thing. It's like a little circular

35:43

listening device, but you can wear it

35:45

as a necklace. And the founder is

35:47

22 years old. And they put out

35:50

this like 90 second ad. And it's

35:52

all young people and it like opens

35:54

with this like woman, she's hiking, she

35:56

reaches a peak of something and then

35:59

she just starts talking to like, she's

36:01

by herself, she's talking to nobody or

36:03

whatever. And then she says something and

36:05

then she gets a text. And the

36:08

text is from her listening device, her

36:10

friend, who has commented on what she

36:12

said and then she responds. And then

36:15

the next vignette is like somebody playing

36:17

video games and like the little friend

36:19

says, oh you're getting your butt kicked

36:21

or whatever. Ha ha ha ha. The

36:24

friend necklace. Yes. And then the third

36:26

one is this woman is on a

36:28

date and she was about to start

36:31

talking to her friend because I guess

36:33

you can kind of touch it which

36:35

activates it or says something or whatever.

36:37

And then she stops because she's having

36:40

such a wonderful time on this date.

36:42

And it's supposed to be

36:44

this profound moment. But the whole point

36:47

is, it's like an AI wearable friend

36:49

that has a personality, it has a

36:51

name, and it listens to your life

36:53

because you're wearing it as a necklace,

36:56

and then sends you text messages kind

36:58

of throughout the day or whatever to

37:00

kind of have a conversation as if

37:02

your friend is with you. Do you

37:05

think, then, the younger generation maybe has

37:07

different views on privacy? I mean is

37:09

that, do you think, one of the things

37:12

here, because the idea for us of

37:14

walking around and recording our entire lives

37:16

is terrible, but maybe if you're more

37:18

used to growing up in a social

37:21

media era where you document everything and

37:23

you carry your phone everywhere, it's less

37:25

of a deal. Your whole life is

37:28

mediated by a screen anyway, and so

37:30

the idea of like an AI friend,

37:32

for example, that's constantly talking to you.

37:34

And the interesting thing about the friend,

37:37

and we'll get back to Limitless, but Friend is, if you lose the device, you lose

37:43

everything. There is no backup. I.e. it's

37:46

kind of supposed to be like a

37:48

person. Oh, so it literally dies and

37:50

that's it. Basically, yeah. Well, that's one

37:53

of the things I like about technology.

37:55

That it doesn't do that. Oh, well.

37:57

But back to Limitless, I just think

37:59

the, uh, I'm trying to figure out

38:02

how they're trying to sell it because

38:04

he's like... it's a life coach, it's making me a better person,

38:09

it's making me a better like worker

38:11

or whatever, and you're like... That bit was wild, by the way. What was that about? I mean, I think he uses it because he's taken an optimistic view of it, in a different way to the way most people would use it. Well, that's what I'm

38:27

saying. Like, is this a solution looking

38:29

for a problem? I mean, billions of

38:31

people, I just don't buy it. Like,

38:34

you know, millions of people, I don't

38:36

buy it. It's very easy to just

38:38

laugh it off. I can see why

38:40

a hundred percent recording all your life

38:43

could be really, really useful. You know,

38:45

like all of the work conversations you

38:47

have. I know sometimes I speak to

38:50

sources and I would love to be

38:52

able to recall what they say, but

38:54

I don't because they don't want me

38:56

to and it's not ethical, but I

38:59

can see how, you know, or just

39:01

generally when you're engaging with your kids,

39:03

like I can see why it could

39:06

be useful. I just, it makes me

39:08

feel really squeamish. And yeah, just the

39:10

idea that like when he gave the

39:12

example of talking to his kid, and

39:15

it's like, do you want me to

39:17

call up God and say, hey God,

39:19

what did it, what did little Timmy

39:21

say? But also the idea in an

39:24

argument with your partner that, you know,

39:26

when you go, well, you said this,

39:28

you said that, oh, well, let's get

39:31

the pendant out and check what you

39:33

really did say. Ah, I mean, most

39:35

people know in that situation. Well, that's

39:37

what I was saying, like, even if

39:40

you win, you're gonna lose that argument,

39:42

I think. You might be right. But

39:44

you're going to be sleeping on the

39:47

couch, my friend. Life is a bit

39:49

more subtle. I found him quite endearing.

39:51

It's like boundless enthusiasm for this idea.

39:53

Well, the other thing I would say

39:56

is that he has a previous company

39:58

Optimizely. Like, I think that when you

40:00

talk about, you know, getting backing from

40:03

Sam Altman, Andreessen Horowitz and other

40:05

companies, like, nothing will get you funded

40:07

quicker than a previous success. So, his

40:09

other company did well. This is his

40:12

next go around and so again it

40:14

kind of just it's a little insight

40:16

into how this place, Silicon Valley, works, which

40:18

is that even if your idea is

40:21

totally loony tunes, if you've shown that you can do this before, you're

40:25

very likely to get money. People get

40:28

their wallets out. Yeah, exactly. Maybe it

40:30

is something that if everybody wore one

40:32

and if there is that universality to

40:34

it that he thinks there's going to

40:37

be, I'm looking forward to you going

40:39

back to him about the milk. I

40:41

think, honestly, I think it is going

40:44

to age like milk. But if you

40:46

knew everybody was wearing one, then it

40:48

would make you act in a very

40:50

different way, wouldn't it? Well, that's what

40:53

he's saying. He's like, you know, basically

40:55

everybody would be different. And you're kind

40:57

of like, okay, this is, we're doing,

40:59

we're trying to do too much here.

41:02

We're not changing humans. No. Because all

41:04

of a sudden, oh, no one's gonna

41:06

gaslight anybody anymore. And no one's

41:09

gonna lie. And you're like. Yeah, so

41:11

I, but again, it's a little

41:13

insight into the kind of that uber

41:15

optimism that you often find in this

41:18

place. He's awesome. I think he

41:20

used to be based out here, but

41:22

um either way So I thought it

41:25

was a kind of a fun conversation

41:27

Obviously, it was a great conversation Thank

41:29

you for sharing it. So you're not

41:31

gonna go get you're not gonna rush

41:34

out and get one. That's what you're

41:36

saying not quite yet. Maybe when everyone

41:38

else has one in ten years time.

41:41

How about you? Did he give you

41:43

one? No. No. Well, he did. He

41:45

encouraged me to like, you know, have

41:47

a go. But I'm a real late

41:50

adopter of everything, basically. Are you? Yeah.

41:52

When did you get an iPhone? Oh,

41:54

late. I didn't get my first cell

41:56

phone, mobile phone, until I was 25?

41:59

At all. Not even like a dumb

42:01

phone. No. My first Nokia brick was

42:03

in my mid 20s. Wow. Yeah. My

42:06

best friend told me the other day

42:08

she's still got her AOL account. Yeah,

42:10

see that's the thing. If you have

42:12

an AOL account, you... There's a lot.

42:15

You need to be conscious of what

42:17

that says about you. I know. It's

42:19

like having a Hotmail account. Or

42:22

driving a really old car. That's fine.

42:24

Before we go, a very important bit

42:26

of news to cover. What's happened with

42:28

the tadpoles? Oh, the tadpoles. They're almost

42:31

growing legs. Weren't they almost growing legs?

42:33

Two weeks ago? I thought they were

42:35

but it turned out that was poo.

42:37

Put that on a t-shirt. So I

42:40

now put little bits of bacon into

42:42

the tank and they go for it

42:44

like devils. Oh wow. It's horrible. What

42:47

are you going to do with all

42:49

these frogs once they're frogs? Well, while

42:51

you are starting your coding company, I'm

42:53

going to start my frog selling business.

42:56

Oh, right, right, right, right. Massive to

42:58

large. Yeah, yeah. Millions or billions of

43:00

people will have frogs in the future.

43:03

Everyone in Southeast London.
