Episode Transcript
0:00
While you're listening to this podcast, you're probably
0:02
doing something else too. It's cool. We get
0:04
it. When you're having conversations with your customers,
0:06
the same is probably true for them. They're
0:09
messaging their teams, they're mentally planning date
0:11
night, so growing conversations beyond the moment
0:13
can be challenging. So HubSpot
0:15
helps you go beyond the moment, connecting you and
0:17
your teams, giving you access to the same
0:19
exact data and helping you see the
0:21
full customer picture. With powerful
0:23
tools that connect marketing, sales, ops, and
0:26
service, HubSpot's powerful CRM
0:28
platform powers you and your team to transform your
0:30
customers' moments into extraordinary customer
0:32
experiences. Learn how HubSpot can
0:34
help your business grow better at hubspot dot
0:37
com.
0:39
Blueprint to me is the best idea
0:41
I've ever come up with and the most practical
0:43
idea I've ever come up with to address that.
0:45
And so from the outside perspective, it
0:47
appears to be health and wellness
0:49
and anti aging and whatnot. That's
0:51
all true.
0:52
but really it's a philosophical endeavor
0:55
in the future of intelligence.
0:58
I feel like I can rule the world. I
1:00
know I could be what I want to.
1:03
I put my all in it like no days
1:05
off, on the road less traveled, never
1:07
looking back. We're officially
1:09
live. So Bryan, I've been messaging
1:11
you for, like, six or
1:13
eight months now. But basically, I'll
1:15
give, like, a very brief background and
1:17
you can kinda, like, tell us a little bit more
1:19
because inevitably I'll miss something, but you started
1:22
a bunch of stuff. The most well-known, the
1:24
biggest thing is Braintree. I think
1:26
you bootstrapped that. Right? I did.
1:28
So you bootstrapped that, sold it for something like
1:30
eight hundred million dollars to PayPal. You
1:32
guys also bought Venmo, which I think
1:35
is like the greatest acquisition. One of
1:37
the best acquisitions of all time because you bought
1:39
it for kinda nothing compared to what it
1:41
is now. And then you've done a
1:43
bunch of other things. You've done Kernel, which is interesting,
1:45
and you have this fund that's kinda interesting.
1:47
But the thing that I started reading
1:49
is your new thing called Blueprint.
1:52
I'm kind of an idiot, and so
1:54
the stupid way of describing it is like,
1:57
you have your biological age, and then
1:59
you have your chronological age. Your chronological
2:01
age is just how many years old you are, and
2:03
then you have your biological age, which measures
2:06
a bunch of different things like your organs,
2:08
your blood, and you're basically trying to
2:10
reverse your biological age
2:12
faster than the chronological
2:15
age goes up which inevitably means you
2:17
live forever. I mean, is that basically it? And
2:19
you're blogging and, like, sharing everything along the
2:21
way. Is that right? Great job. Bryan,
2:24
what is your chronological age and what
2:26
is your current biological age? I
2:28
left my mother's womb forty five
2:30
years ago. And
2:33
biologically,
2:34
I'm a few hundred
2:36
different ages. And so, for example,
2:38
if you're looking at the age of your heart, you
2:41
can characterize the age
2:43
of the heart through a few dozen markers. You
2:45
can do the same thing with other parts of the body.
2:47
And so you're actually a collection
2:49
of some very large number of
2:51
markers because different parts of the body
2:53
age at different speeds, and then your
2:56
life choices and environment also
2:57
affect that.
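For readers, a minimal sketch of that "collection of ages" idea; the organ names, marker values, and the simple averaging are illustrative assumptions, not his actual methodology:

```python
# Illustrative sketch only: "biological age" as a collection of organ-level ages.
# Organ names, marker values, and the naive averaging are made-up assumptions.

organ_marker_ages = {
    "heart": [42.0, 39.5, 44.1],  # ages implied by individual heart markers
    "liver": [37.2, 40.8],
    "lungs": [45.0, 43.3, 41.9],
}

def organ_age(marker_ages):
    # One naive summary: average the age implied by each marker.
    return sum(marker_ages) / len(marker_ages)

ages = {organ: organ_age(m) for organ, m in organ_marker_ages.items()}
overall = sum(ages.values()) / len(ages)  # naive whole-body summary

for organ, age in ages.items():
    print(f"{organ}: {age:.1f} years")
print(f"overall (naive average): {overall:.1f} years")
```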
3:00
So I wanna ask you all about this Blueprint thing because I think
3:02
it's amazing. But can I ask you a few questions
3:04
about Bryan first, please?
3:07
So Braintree. I mean, like, you know, you
3:09
guys are owned by PayPal now. Another competitor
3:13
of yours, I think, is Stripe. These
3:15
are like high-tech companies, you know, pretty
3:17
complicated things. How on earth
3:19
do you bootstrap a business like that? I mean, I
3:21
think by like year three or year four,
3:23
you're doing like eight or nine million in revenue. I
3:25
mean, you guys kinda took off.
3:27
So I understand, like, how you were able to
3:29
bootstrap it once you got to maybe ten million in revenue.
3:31
But how on Earth do you make something like that from scratch?
3:34
I
3:34
guess it started when I was twenty one.
3:36
I decided that I
3:38
wanted to try to do something meaningful for
3:41
humanity. I grew up reading a whole bunch of biographies
3:43
about people who had done big things,
3:45
and I admired
3:47
people who tried to identify the thing on
3:49
the horizon that was barely reachable
3:51
during their lifetime and they went after it.
3:53
And
3:53
at the age of twenty one, I didn't know what that
3:56
was, and I didn't know how I could do
3:58
it. And
3:58
so I thought, you know, given
4:00
my options, I said I might as well start a career,
4:02
make a whole bunch of money by age thirty. And
4:04
then at that point, I'll try to go after something.
4:07
And so it was a naive contemplation of how
4:09
to go about doing things. I'd grown up in a
4:11
small town basically, you know, with
4:13
my grandpa on a farm. I didn't meet
4:15
an engineer until I was twenty
4:18
one or twenty two years old. It was
4:20
very much a farm boy,
4:22
you know, deeply religious community.
4:25
And so I did a bunch of startups
4:27
and I just accidentally fell into payments
4:30
because I was
4:31
building a startup, I was struggling to pay my
4:33
bills, I had a child at the
4:35
time,
4:36
And I would do anything for money.
4:38
I applied for sixty jobs. Nobody would even contemplate
4:41
hiring me. And
4:41
so I found this job to sell credit card
4:44
processing services door to door. And so I agreed
4:46
to do it.
4:46
And it was a hundred percent commission, and I became
4:49
the company's number
4:49
one salesperson in a matter of months
4:52
doing it part time while building my startup. And
4:54
so I just accidentally stumbled
4:56
into payments and learned there was this big opportunity.
4:58
PayPal had
4:59
grown up through the Internet, but they had stopped really
5:01
innovating for a couple years. And so
5:03
developers didn't have the tools they'd like. And
5:05
so I started Braintree. And
5:07
we landed a big deal early on with OpenTable.
5:10
They were accepting credit cards
5:12
to increase the likelihood that a person would
5:14
show up for the reservation. But they
5:15
didn't wanna store the credit card data because they had
5:17
compliance issues. And
5:18
so we built out a custom solution for
5:21
them that allowed us to store credit card
5:23
data on our side instead of theirs so they didn't
5:25
have the compliance burden but could still accept credit cards.
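For readers, what he's describing here is the card-vaulting, or tokenization, pattern: the payments provider holds the raw card, the merchant keeps only an opaque token. A minimal sketch of that idea; the class and method names are illustrative assumptions, not Braintree's actual API:

```python
# Illustrative sketch of card vaulting ("tokenization"). The provider stores
# the raw card; the merchant keeps only a token, shrinking compliance scope.
# Names and structures are illustrative, not Braintree's real API.
import secrets

class CardVault:
    def __init__(self):
        self._cards = {}  # token -> raw card data, held by the provider

    def store(self, card_number, expiry):
        """Merchant sends card data once; gets back a token to keep."""
        token = secrets.token_hex(16)
        self._cards[token] = {"number": card_number, "expiry": expiry}
        return token

    def charge(self, token, amount_cents):
        """Merchant charges later using only the token, never the raw card."""
        card = self._cards[token]
        # ...submit card + amount to the card networks here...
        return {"ok": True, "last4": card["number"][-4:], "amount": amount_cents}

vault = CardVault()
token = vault.store("4111111111111111", "12/29")   # at reservation time
print(vault.charge(token, 2500))                   # later, e.g. a no-show fee
```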
5:27
And so from scratch, we built this payment
5:29
system first for them that we expanded out to more
5:31
general merchants. And we
5:32
got a few customers like Airbnb, GitHub,
5:34
Uber. We helped Uber
5:36
do their, you know, the payment experience where
5:38
you get in the car, you
5:40
arrive at your destination, you leave the car, no
5:42
exchange of payment information, no signing of
5:44
receipts, no printing of receipts. We started
5:45
doing a few things like that, and we
5:48
really made headway into high-tech
5:49
companies very quickly because they preferred
5:52
to use our software. You just said a
5:54
bunch of things that were all super interesting.
5:56
First of all, you're kinda like Elon
5:58
Musk without the fame. You know, like,
5:59
you had your kinda payments, he
6:02
had x dot com and PayPal. You had
6:04
your payments thing. Now you're doing like a
6:06
brain interface and stuff like that. You
6:08
do these moonshot projects, trying to live forever, that
6:10
sort of stuff. So I think you're a
6:12
fascinating dude. You said something about door to
6:14
door sales. And on the pod, we've talked about
6:16
this before, which is that, you know, our producer
6:18
who's not here today because he's having a baby.
6:20
You know, he's Mormon and he did his mission
6:22
and we talked about, you know, what that's like.
6:24
We've talked about, you know, Cutco and some of these,
6:26
like, door to door textbook companies where
6:28
it really breeds this like amazing
6:31
entrepreneur because you have to learn sales, you have to
6:33
be able to work hard, face rejection all
6:35
the time, you know, that sort of thing. And it's
6:37
like this rite of passage. I think if you come out the
6:39
other side of that, you were successful at door to door sales.
6:41
I would bet on you with any role in my
6:43
company if you're successful at door to door
6:45
sales, but I've never done it. So I'm just
6:47
talking out my ass here. Is that accurate
6:49
in your view? And how you know, I guess,
6:51
how do you think about door to door sales? And how did
6:53
you become the number one when
6:55
you don't seem like the most, you
6:57
know, charismatic, you know,
6:59
salesperson? Exactly. Yeah.
7:03
I mean,
7:05
I don't know if you had the kind of the same haircut
7:07
back then. But, like, I don't know. What
7:09
did it for you? You're
7:11
not saying it.
7:12
That's funny. I mean, I
7:15
guess the, like, the
7:16
one thought on this, you know, my
7:18
kids are
7:18
nineteen, seventeen, and thirteen.
7:21
And they're
7:22
currently going through these
7:24
important life decisions on what they study in school
7:26
and what they try to do. I'm doing
7:28
everything I can to help them focus
7:30
on
7:31
CS, math, and physics. Like,
7:33
these are the tools, these are the
7:35
languages that you wanna be fluent in to
7:37
be architects of the future.
7:39
And in
7:40
many ways, my choice of
7:42
doing the door to door sales was
7:44
just it
7:45
was my hacker attempt at
7:48
paying
7:48
the bills with a child
7:50
while I bought time before I started something
7:52
new. And
7:53
it was out of desperation. It wasn't like
7:55
I was seeking it out. And so
7:57
it was also a case of just
7:59
dealing with the reality
7:59
of my skills. I had grown up in this
8:02
farm-like community and just
8:04
didn't have any engineering background.
8:06
And so the thing that
8:08
I enjoyed the most about the sales
8:10
was it's not about doing high
8:12
pressure sales tactics and it's not trying
8:14
to manipulate somebody.
8:16
It's not trying to perfect the skill.
8:19
It's about getting in and figuring out
8:21
the system.
8:22
Like, what is really going on? And if
8:24
you
8:25
jump into the world of payments in the year two
8:27
thousand seven when I started this, it
8:30
was defined by deep distrust.
8:33
It was
8:34
a game where credit
8:35
card payments are really expensive. And when
8:37
a business owner gets their monthly credit
8:40
card
8:40
invoice,
8:41
it's so complicated. They have
8:43
no idea what's going on.
8:44
And the providers make it even
8:46
more complicated in how they report things.
8:48
And
8:48
so it creates this opportunity for people to
8:50
be extremely deceptive and create high
8:52
commissions. And so you
8:53
look at that system: opportunity number
8:56
one, be
8:56
honest, be
8:58
transparent, and be trustworthy.
9:00
And then number
9:01
two is because there was so much
9:03
skepticism on this, businesses
9:05
didn't know how to differentiate:
9:07
why should
9:07
I work with this company versus that
9:09
company? When in reality,
9:10
most companies were mostly the same.
9:12
It's very hard to differentiate payments.
9:15
So
9:15
two is making that known. So again, the
9:17
customer has a very clear understanding.
9:20
And then
9:20
three, it's just being reliable and
9:22
competent. Like, you know, when
9:24
the customer interacts with you and your team, they
9:27
say, what an amazing
9:29
experience. And
9:29
so once you figured
9:32
out how the system worked, it was
9:34
very easy to solve. And so I would just walk
9:36
in and, like, the moment you walk
9:38
in the store, they can tell
9:40
you're
9:40
not a customer by the way you're dressed and maybe
9:42
the way you're walking or whatever, and they immediately
9:45
hate you.
9:46
And so you have to
9:48
overcome this animosity from the get
9:50
go. And
9:51
so I would take out a hundred dollar bill and say,
9:53
I will give you this for one minute of
9:55
your time. And if you say
9:56
no to me, you can keep it. And
9:59
they'd be like,
9:59
alright. Whatever. This sounds fun. What do
10:02
you want? And I would just
10:03
walk them through these basic principles, like,
10:05
here's what's going on. Here's what they're
10:07
doing. I'm really
10:08
no different than anyone else. You're just
10:10
gonna find something clean and transparent and
10:12
reliable with me. And
10:13
most people would be like, okay. I just want
10:15
it to be done. Like, I don't want to deal with
10:17
any more deception. I don't want to have to change
10:20
again. I don't want these machine leases.
10:22
And so
10:22
it was really just, again, system
10:24
deconstruction and reconfiguration.
10:26
And it
10:27
was the skill set that I tried
10:29
to build again and again through
10:31
every business I built, walking
10:33
into a new world trying to figure
10:35
out what is really going on, how it's constructed,
10:37
and then maneuver within it. Was
10:39
the early product just like
10:41
an agency where you were getting your friends to
10:43
help install these credit card processors?
10:45
Or, you know, what was that early
10:47
v one of Braintree? Because you said you're not
10:49
an engineer. What did that look like?
10:51
Because this is pretty complicated stuff, it seems.
10:53
The first product was for OpenTable. It was just allowing
10:56
someone to make a reservation, put in their credit card
10:58
number, and have it stored. So
10:59
to the user, it appeared as if I was
11:01
entering my credit card information, in
11:03
the
11:03
OpenTable system when in
11:05
fact they're Internet into our
11:07
system behind the scenes. And so OpenTable and
11:09
who built it? I had a team of engineers
11:12
software engineers do it. And how'd you
11:14
fund that? I had made enough money
11:16
from selling this stuff door to door that
11:18
I could bootstrap it and hire them.
11:20
How
11:20
much did you make roughly? Did you do
11:22
it for, like, a year or something like that to
11:24
cover the bills and all that and, say, create
11:26
a stash? Yeah. I did about eleven
11:28
months, and I remember at the eleven month
11:30
mark, my
11:33
portfolio of customers was generating
11:36
I
11:36
think it was, like, fifty nine thousand a
11:38
month
11:39
of revenue. And I
11:41
thought that's interesting. Right? Like,
11:43
I mean, I'm coming from this world
11:45
where my
11:46
family, we would decide
11:49
whether to spend our five dollar
11:51
family date budget on going through a car
11:53
wash or,
11:53
you know, going and getting
11:56
something in a restaurant. Like, we grew
11:58
up in such a frugal
11:59
environment. And
12:01
then seeing that it was like fifty nine
12:03
thousand dollars a month. Now, I'd always been wanting to
12:05
build. I
12:07
was not willing to trade my time
12:10
for money. You know, if
12:10
someone wanted to say I'll pay you fifteen dollars an
12:12
hour to do blank, I didn't
12:14
wanna make that exchange. I wanted
12:16
to say I am willing to take zero
12:18
for
12:18
an indefinite period of time
12:21
and
12:21
in exchange for the opportunity to make a whole
12:23
lot more. And that
12:24
was true. Like, I didn't make any money till I was really thirty
12:26
four years old. The entire time I
12:28
was working for basically zero. But
12:29
that's when I started seeing what kind of money you
12:31
could make in payments on this residual revenue
12:33
basis. The
12:34
fifty nine thousand, that was what you were
12:36
getting as residuals? So that's what the company was getting.
12:38
And they were giving me a cut. Yeah. And so
12:40
you're getting a cut of that. And so you
12:42
you're saying these things where you're like,
12:44
I knew I didn't wanna trade time for money or,
12:47
like, I wanted to, like, do
12:49
the biggest, like, technological breakthrough, and I didn't
12:51
know what that was. So I first decided to make some
12:53
money. And by thirty, I'll have that figured out. You're
12:55
saying these things as a twenty one year old.
12:57
Most people don't know
12:59
or have the perspective or wisdom
13:01
to think that way. That's pretty
13:03
profound. And you're also saying you grew up kind of like on
13:05
a small farm in a deeply religious community. So it's
13:07
not like you were surrounded by these other, like,
13:09
you know, by other technologists, or
13:12
business sort of like mentors. So
13:15
where is this coming from? And even
13:17
this, like, hundred dollar bill trick, like, you
13:19
know, did you read, like, Think and
13:21
Grow Rich or, like, did you read any
13:23
biographies or books that, like,
13:25
changed your ways? Or how the heck did you do this as a
13:27
small farm boy to, like,
13:29
get this type of thinking in your brain?
13:31
I can
13:31
probably make up an answer. It
13:34
seems
13:34
to me right now, I have no idea. What
13:36
books, what biographies? You said you read
13:38
a lot of biographies. What were you reading that
13:41
changed your life? I've
13:42
probably read over a hundred,
13:45
maybe even two hundred biographies at this
13:47
point. Like, for
13:47
example, I would go on deep dives
13:50
of trying to understand certain world
13:52
history events like World War two. And
13:54
so one of my favorites is
13:56
a gentleman named Dietrich Bonhoeffer.
13:59
He was trying to assassinate
14:01
Hitler and
14:02
he was deeply religious. You understand
14:04
world war two and Germany
14:07
and
14:07
nazism through the frame of this
14:10
individual and his plans and
14:12
his observations about what other people were doing in the
14:14
community. And so I found, like,
14:16
these biographies provided
14:18
this back door on how to
14:20
understand events as they
14:22
were told to me in school. In school, you
14:24
have this highly compressed version of history of,
14:26
like, alright, everybody just get on the same page and
14:28
understand these big things that happened,
14:30
but you really miss out on the nuance. And we
14:32
all know how flawed
14:34
historical
14:34
accounts are because of
14:36
just the
14:36
nature of humans and the
14:39
way people write
14:40
history. And so these
14:41
biographies helped me start
14:43
to piece
14:44
together an understanding of
14:46
reality that was much more nuanced,
14:49
sometimes contrary
14:51
to primary narratives. And so it
14:53
invites me to
14:54
always reject
14:56
the first narrative that's offered
14:58
and understand
14:59
it not as a factual statement
15:02
but as a
15:03
wishful attempt to be
15:06
understood, to be accepted.
15:08
You said
15:08
earlier, you said I wanted to make a certain
15:10
amount of money by age thirty. What was
15:12
your number? What was your target? Well,
15:15
on
15:15
the lower end, it was
15:17
seven million dollars. I had built
15:19
out my spreadsheet model and assumed a certain
15:21
rate of interest and basically said, if I
15:23
make a certain amount of money, this is an annuity that
15:25
will be good enough for my entire life assuming
15:27
I don't need capital to do anything.
15:29
Like, just if it's time, like,
15:31
I'm writing
15:32
or something. Then if I do
15:34
something in the world, I had mapped out
15:36
something like a hundred and fifty, three hundred
15:38
million
15:38
as a
15:40
basis that would get me started
15:42
on that path.
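For readers, a minimal sketch of the spreadsheet annuity math he's describing; the interest rate and spending figures are illustrative assumptions, not his actual numbers:

```python
# Illustrative sketch of the annuity logic: does a lump sum, earning interest
# and drawn down for living expenses, last a lifetime? The rate and spend are
# made-up assumptions, not Bryan's actual spreadsheet inputs.

def years_lasted(principal, annual_rate, annual_spend, horizon=80):
    """Count years until the principal is exhausted, capped at `horizon`."""
    years = 0
    while principal > 0 and years < horizon:
        principal = principal * (1 + annual_rate) - annual_spend
        years += 1
    return years

# At a 4% return, $7M yields $280k/year, more than a $250k/year draw,
# so the principal never runs out and the loop hits the 80-year cap.
print(years_lasted(7_000_000, 0.04, 250_000))  # -> 80
```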
15:44
Alright. Today's episode is brought
15:46
to you by Imperfect Action, hosted
15:48
by Steph Taylor. It's a podcast on
15:50
HubSpot's podcast network, the
15:52
audio destination for business
15:54
professionals. Imperfect Action is a bite
15:56
sized online marketing podcast for
15:58
business owners. So join Steph Taylor as she answers
16:00
all your business marketing questions and deep dives
16:02
into the nitty gritty of online marketing,
16:04
content marketing, social media marketing,
16:06
and marketing strategy for
16:08
business owners. A few recent episodes include some
16:10
of the biggest mistakes you can make with your
16:12
launch. Another one is why growing your audience
16:14
feels so hard in two thousand twenty two, and
16:16
another one is five ways to make content
16:18
creation less time consuming. So check it
16:21
out. It's called Imperfect Action. You
16:23
can look it up wherever you get your podcast.
16:25
When you sold, you were
16:27
thirty four. Right? Yes.
16:29
And what
16:30
did you what were you able to walk away with? Were you able to
16:32
hit the north end of your
16:34
target? Yeah. I got three hundred
16:36
million. What does that you know, you're
16:38
you're a religious
16:40
farm kid who doesn't know much
16:43
and then in a matter of eleven or
16:45
twelve years, you're able to walk away with, you
16:47
know, north of three hundred million dollars. What
16:49
does that feel like? And what do you do with that
16:51
money once it hits your account? You
16:52
don't seem like a victory dance kind of guy.
16:55
Wait a second. Is that
16:57
true? That's true.
17:00
That's true. That's true. That's true. that
17:02
is
17:02
very true. It's
17:04
sobering because you
17:05
know it's bigger
17:07
than what you realize.
17:10
but
17:10
you don't know in what ways.
17:12
And so it
17:12
wasn't the case that I had a
17:15
long list of things I wanted to
17:17
buy and was just waiting for this cash to come in. I don't think
17:19
I've spent any
17:19
money for a long
17:21
time. And I think
17:23
now, looking back, even my
17:27
most
17:27
aggressive expectations on how life would
17:30
change weren't even
17:30
close to how
17:32
significantly my reality would change over the years
17:34
with that event. Well, what
17:36
changed? I mean,
17:39
your
17:39
relationship with the
17:41
world fundamentally changes.
17:44
I mean, in
17:47
any relationship, there's power
17:49
dynamics of
17:50
wealth and
17:51
power and status and
17:54
age,
17:54
there's all sorts of things that
17:56
shape human interactions. And
17:59
it
18:01
creates a different entry point for everything because
18:04
now we all know this
18:06
from our
18:06
experiences. When you engage with people of different
18:08
powers of different levels,
18:10
it
18:11
changes the dynamics of your relationship with them.
18:14
Expectations, interests, rationale,
18:17
justification, it just
18:19
alters everything. I mean, I
18:21
remember
18:21
one of the first stories that I
18:23
heard somebody share with me is Larry Bird.
18:25
I don't know if it's true. It's second or
18:27
third hand. But Larry had a group of good
18:29
friends that he went back and saw. And
18:31
he
18:31
had just made money, I think, signing for
18:33
the NBA, sat down to have dinner with
18:36
his friends. And Larry was like, you
18:36
know, I got it. Don't worry about
18:38
it.
18:39
And everyone's like,
18:40
Great, Larry. It's amazing. Thank you so much.
18:42
Second time happens.
18:44
and
18:45
everyone's quiet. Assuming Larry's gonna pick
18:48
up the bill,
18:49
he does. Third time, it's just like, of course
18:51
Larry's gonna pick this up and probably take it somewhere
18:53
else. And so, Larry, from his perspective, just like,
18:56
if, you know, the fun was
18:58
removed because now instead of me doing
19:00
something generous with
19:02
someone, I'm
19:03
now in this expectation, and so it deterred him
19:06
from wanting to interact
19:07
with people because there was this expectation on
19:09
him that anything he did
19:11
there
19:11
would be this expectation. And
19:13
so it just I think anyone who's
19:16
experienced fame or anything of the sort, there
19:18
are these underlying dynamics of human interactions, which
19:20
are just a reality for everybody. And, yeah,
19:22
it
19:22
would have been helpful, I think, thinking back if
19:24
I could have spoken to somebody and if they
19:26
said, hey, like, let me just share five really important
19:28
things with you on what it
19:30
means to have money and how you can best navigate this
19:33
because it's taken me some time to
19:35
learn. Well, there's probably gonna
19:36
be like a hundred fifty thousand people that listen
19:39
to this. You could be that mentor for them. So what
19:41
would you say? What are those five things? But
19:43
give us three at least. One
19:45
is
19:45
transparency of intent.
19:48
you
19:48
know, when you're with somebody, it's very
19:50
important that you establish why you're doing
19:52
what you're doing, and the roles you're gonna
19:54
play with each other. If
19:56
it's ambiguous, then it creates complications in
19:58
your relationship. And so
20:00
it's
20:00
unpleasant for anyone to be surprised in a relationship
20:03
of what somebody really wants in
20:05
the relationship. So
20:06
just transparency, we're doing this on these conditions.
20:08
Two, I'd say that money
20:10
is not a resource that
20:12
is necessarily valuable for
20:14
the things that life requires. It
20:17
is
20:17
most valuable for the time
20:19
it creates, in that you can
20:21
solve problems with
20:24
money.
20:24
utilize it wisely, not on
20:27
acquiring frivolous
20:29
things, but on solving fundamental problems
20:31
of time. And then
20:32
three is there's a weird
20:34
psychological relationship with it
20:36
where you are not that,
20:39
and
20:39
it is not you, and
20:41
to
20:41
have an identity independent
20:43
of that because it can
20:45
get very confusing if you don't maintain
20:47
those clear boundaries. What did
20:49
you so I know that I read that you invested
20:51
a hundred million dollars into your fund, which
20:53
you guys have invested in all types of cool stuff, and it
20:55
seems like you've had some really good outcomes. And then
20:57
you started a company called Kernel that we could talk about a
20:59
little bit. What did you do with the rest? This is
21:01
a good question that Sean always asks. He goes, what do you do
21:03
with your money? You know, like, if you had a pie chart, like,
21:06
where would it be? Like, just boring
21:08
index funds and bonds? For Sean, for a
21:10
long time, he was, like, heavy in crypto
21:12
and holding cash. What about you? It
21:14
depends on
21:14
what your objectives are. I mean, I think
21:16
good
21:16
advice for me at that point would have
21:19
been you are an entrepreneur. You're always
21:21
going to be an entrepreneur.
21:22
Cash is king. Like,
21:24
don't put your money in anything that's gonna be
21:26
illiquid. So
21:26
there's been times in the past couple of years
21:29
where
21:29
I desperately needed cash and I
21:31
didn't have the liquidity levels that I wanted. And
21:34
so liquidity for entrepreneurs
21:36
is really important. Number two
21:38
is that the move with the OS
21:40
Fund was like, okay. Braintree Venmo was
21:42
primarily a software engineering
21:44
objective within an established industry of
21:46
payments. I was moving
21:47
into science.
21:49
And the question that I was trying to solve
21:51
for example,
21:52
could we build a
21:55
global biological immune system? So we
21:56
all know that if a problem
21:58
in the world arises, and that problem
22:00
can be addressed by
22:02
software
22:02
engineers coding at their
22:05
computers to solve
22:06
the problem, we're
22:07
pretty good at that as a species. If
22:10
a problem arises in the
22:12
world that
22:12
requires the engineering of
22:14
biology, of atoms, and molecules, and
22:17
organisms, we're
22:17
not there yet. We don't have the ability
22:19
to deploy millions of people who
22:21
can just engineer biology at
22:23
a moment's
22:23
notice and solve problems. Like, is the
22:26
coral reef dying because the water's too
22:28
acidic? We need carbon
22:30
capture, like, we need, you know, whatever the
22:32
problem is. I wanted
22:33
to invest in companies that
22:36
would basically serve as the foundation
22:38
of building blocks for humanity of
22:40
building this infrastructure so
22:42
we could actually engineer
22:44
with reliability atoms, molecules, and organisms. So
22:47
I wrote this
22:47
in this blog post,
22:48
like, if, for example, a pandemic happened,
22:50
it would be amazing
22:51
if we had these capabilities
22:54
to
22:54
build up the biological infrastructure
22:56
detection,
22:57
vaccine creation, you know, remediation
22:59
and whatnot. And that actually was
23:01
true with Ginkgo Bioworks, one of my
23:04
first investments. They ended up
23:04
working on the mRNA vaccine. But
23:06
it was just this
23:08
idea of, like, we've
23:10
done
23:10
very well
23:11
mastering programming
23:14
bits.
23:14
We are
23:15
emerging now powerfully in the
23:17
engineering of biology. And so some of the companies like
23:19
we are doing synthetic biology engineering.
23:22
One company is doing their storing
23:25
information using DNA. So instead of a hard
23:27
drive made of the material
23:28
we're accustomed to, they stored on
23:30
DNA because that's Nature's Hard Drive. Another
23:32
company is doing Nanotech, building
23:35
these
23:35
structures atom by atom, like literally assembling
23:37
them like Legos. And so the
23:39
companies have been successful. Like, some of
23:41
these are breakeven, some of these are
23:44
profitable. And so I wanted
23:45
to do this because Braintree
23:47
Venmo was really nice to teach me about
23:49
software engineering. I wanted
23:51
to understand science and
23:53
engineering of science. And so it was an
23:56
educational experience for me of getting deep in the
23:58
trenches with a bunch of PhD
23:59
entrepreneurs across the
24:01
range of all these different scientific
24:03
disciplines. How much do
24:04
you think I wanna talk about the health stuff in a second,
24:06
but how much do you think Venmo alone is
24:08
worth right now? You guys had bought that
24:10
for twenty eight million, I think. like,
24:12
only, like, five or six or
24:14
seven years into Braintree, like, pretty early
24:17
into Braintree's existence. Did you pay mostly
24:19
cash for that? I mean, how did you finance
24:21
that deal? What do you
24:23
think that's worth now? Because people keep saying Braintree Venmo as if,
24:25
like, Venmo is as powerful or as
24:27
valuable as Braintree. I don't know the current
24:29
values. I mean, I sold the company
24:32
several years ago. I know it's very valuable, and
24:34
so is Braintree. And so this was
24:36
a decision I made when
24:38
I did sell Braintree
24:40
Venmo for eight hundred million. It was
24:42
a
24:42
decision of, okay, so I'm now thirty four years
24:44
old. I want to move on to this
24:47
next life. Accumulation of money
24:49
was not my objective. And I
24:51
could have
24:51
stuck with the company and we could have
24:53
made it even
24:54
more valuable. But
24:56
it was like, okay. So if I do this, it's three hundred million, that's a
24:58
good enough starting base to go off into
25:00
these other things. And because they were going to be
25:02
in the areas of you
25:04
know,
25:04
deep tech. I knew they
25:06
would take a decade or so to start. It
25:08
was just gonna be a long
25:10
startup process. And so looking at
25:12
that, you know, the prime of my years, I thought, I'll
25:15
take this.
25:15
I'll have a go at this and
25:18
try to
25:18
do something meaningful there. But it was just a calculation of
25:20
time and reconfiguring
25:23
my life towards that. And so you've
25:25
now, let's say, that's ten years ago,
25:27
ten, eleven years ago. We're
25:30
basically eleven years and a hundred million
25:32
dollars in. And you probably at the beginning
25:34
had a bunch of things that were bets that you
25:36
thought were interesting, might pan out, predictions
25:39
maybe of what the world might look like in twenty
25:41
twenty two. Can you as best as you can
25:43
give us the summary of, like, you know,
25:45
What were you right about? What were you wrong about?
25:47
And where do you think the
25:50
big sort of like promise is
25:52
now? The
25:52
venture fund has been remarkably successful
25:55
especially as a newbie into
25:57
this world. The number
25:59
of good
25:59
investments relative to bad investments is extremely
26:02
high. So I'd say we've done
26:04
remarkably well on the deep tech side
26:06
with synthetic biology, genomics,
26:08
Nanotech. On Kernel, I think
26:10
we nailed the
26:12
technology selection.
26:12
Initially, the idea was
26:15
could we make brain
26:17
measurement
26:17
ubiquitous in
26:18
society? And
26:20
we can measure almost everything about
26:23
ourselves in a fairly routine
26:25
way
26:25
except for our brains. We don't
26:27
have that. And is there gonna be a Kernel waitlist?
26:29
People listening are probably gonna get on a Kernel waitlist. So, like, get in.
26:31
I mean, I've just seen it's, like, a helmet that you put
26:34
on like, you think something and
26:36
you could change, like, something on a computer
26:38
because of your brainwaves. So
26:39
wearables are a familiar concept.
26:41
We put this thing on our fingers or our
26:44
wrists. And it gives us data,
26:46
light, sleep stats,
26:48
respiration rate, heart
26:50
rate, cardiovascular, expenditure,
26:52
and exercise. Like, we get this
26:55
set of data, and it's pretty easy to
26:57
acquire. And then
26:58
we can use this information to
27:00
help us understand our health and wellness. We
27:02
currently can't do that for our brain. So if
27:04
I have a question, am I
27:06
in the early stages of cognitive decline. Do I have
27:09
anxiety, if so, what kind of anxiety? Do
27:11
I have depression? Or what kind of depression?
27:13
Is my lifestyle conducive to states
27:15
of focus or not? What is
27:16
my emotional reaction to things? And
27:19
just really basic questions. And
27:21
most people think that their
27:24
self awareness is basically the sensor
27:26
system that captures their brain.
27:27
Like because I'm conscious and
27:30
because I can feel when I have a headache,
27:32
it
27:32
basically is a robust enough sensor
27:34
system to do
27:35
it. And that's not correct. So much
27:37
happens in our brain that we are unaware
27:39
of, and there's so much data in our brain that
27:41
is informative for what we wanna do.
27:43
And so what we at Kernel, what
27:45
we've done is we've built a neuroimaging
27:48
helmet, like you said, Sam. You just put
27:49
it on your head. It takes one minute to
27:52
set up. It
27:53
uses light to
27:54
measure the brain activity. And
27:56
these activity patterns are
27:58
extremely
27:59
informative. So for
28:00
example, I was a pilot
28:03
participant for a ketamine study.
28:05
Ketamine has been used for the treatment
28:07
of depression. We used it in an off label study with
28:09
healthy people. But the
28:10
question is, what
28:11
does ketamine do to your
28:14
brain? And, you
28:15
know, of course, someone can do ketamine and you can ask,
28:17
like, hey, Sam, how was ketamine? Like, I
28:19
don't know. I was in a different dimension, and I think I feel better,
28:21
but I'm not sure. But it's kinda like, you
28:23
know, how was your sleep for the past week?
28:26
It's an extremely imprecise answer. You're going on
28:28
subjective self assessment, your memory. You don't
28:30
really know, and it's like a
28:32
disaster. And so this
28:34
measurement system is basically meant to
28:37
standardize the
28:38
measurement of the brain.
28:40
And so
28:42
part of the challenge was building a device,
28:44
identifying: is there a technology in
28:47
existence that can be
28:49
built that makes brain
28:51
measurement mainstream. So everyone does it for
28:53
everything. And then
28:54
second is, can we find applications
28:56
for early markets? And so we built the
28:58
tech. We have
28:59
a few papers coming out, and now we're
29:01
in the product market fit stage,
29:03
finding the first application for the technology.
29:05
When you
29:06
so for the listeners, if
29:08
you just Google Bryan Johnson Blueprint, you'll see it. I don't know, what's
29:10
the URL? Is it just slash blueprint? What
29:12
is it?
29:13
Bryan Johnson dot co. Sorry,
29:16
Sean. One thought of a comment on this, just
29:18
maybe an attempt to make
29:20
it intuitive. When people begin experiencing
29:23
cognitive decline, and this may be true with what
29:25
happens with intoxication with
29:27
alcohol too. We just did an
29:29
alcohol study. When
29:30
your brain is impaired,
29:32
your
29:32
brain compensates
29:34
for the deficiency. And
29:35
so you can't pick up the
29:38
impairment. But you
29:38
can record it and identify
29:40
it. But
29:41
at a certain level of intoxication,
29:43
your brain can no longer make up for that impairment. And
29:45
so it reveals itself in impaired
29:47
behavior. And so that is
29:49
true with somebody who may be experiencing
29:52
cognitive decline. You may be
29:54
along the path of cognitive decline
29:56
and you
29:56
may say, I feel great, I seem great,
29:58
I'm moving great,
29:59
everything's great, but you
30:01
just can't pick it up. And so there's the
30:03
value of like, oh, wouldn't it be amazing
30:05
if I have
30:06
the ability to measure my brain on a
30:08
routine basis that
30:09
informs me of these things
30:11
that I myself cannot identify. And
30:13
wouldn't it be neat if everyone did it
30:15
and it was just incorporated into standard
30:17
of care across all things and how we
30:19
dealt with our mental health and wellness, you know,
30:21
all the above. Is
30:22
there anything people can do if they're in cognitive
30:24
decline? Or is it just sort of like, well, I've
30:27
measured this. Sad
30:29
news. Alright. You know, well, I
30:31
don't know if there's a way to solve it. Sean,
30:33
have you guys ever seen on twenty three and
30:35
me or something like that? They used to have, like, an
30:37
Alzheimer's thing. So I guess Alzheimer's is genetic and,
30:39
like, there's like a particular type of gene that you
30:41
could have that increases the likelihood that you're gonna
30:43
have it. And twenty three and me used to
30:45
do this thing where they said, alright, you
30:48
have it. But before we could even tell you if you have it,
30:50
you have to sign this paperwork saying you're not gonna assume this
30:52
is a diagnosis and you can't flip
30:54
out. Well, I have it. And then, like, I did flip out.
30:56
And then they eventually removed that. I believe I
30:58
don't think they have it anymore because they said people were just
31:00
flipping out too much and one of the
31:02
reasons they removed it is they tell you that you
31:04
have this and they're like, good luck.
31:06
Like, maybe put, like, frames. Like,
31:09
they're like, there's all these, like, I don't know if they said this, but, like, people were,
31:11
like, if you, like, have frames of
31:13
pictures upside down and so you have to, like, work
31:15
harder to, like, figure out who's in the picture, that's
31:17
gonna, like, help you, like, get a stronger
31:19
brain. But in general, when I was researching,
31:21
I was like, oh shit, I have this gene. Well, what do I do?
31:23
People were like, good luck. Just, like,
31:25
I guess hopefully you'll be alright. But
31:27
there weren't, like, that many things. So, yeah,
31:29
what can you actually do to fix
31:32
any of these things? So we
31:34
are accustomed to the idea of society having
31:36
engineering standards. So
31:37
we know that when we buy an appliance,
31:39
it's going to fit through our front
31:42
door. We don't have
31:42
to go buy the right front door or look at the dimensions on
31:44
the website. Like, it's just gonna fit because
31:46
we know that the
31:47
door size is a standard, the appliance size is a
31:49
standard, it can be moved into my house.
31:52
That's true for everything we do, you know, so many
31:54
things we do in life. We just know these
31:56
standards. When
31:56
we agree,
31:57
we build societies, we have, you
31:59
know,
31:59
millions of invisible
32:02
standards. We have very
32:04
few standards about our brains
32:06
because we can't measure it. we
32:07
know the timing that's appropriate for green
32:09
lights, yellow lights, and red lights
32:11
because we know the reaction time of humans to
32:14
lights. We know braking power. We
32:16
know people's stopping times. So when
32:18
we
32:18
do have data, we can actually
32:20
determine that. We
32:21
do not have engineering standards
32:23
around
32:25
the brain: depression,
32:26
anxiety, cognitive decline, because we have
32:28
no measurement. And so
32:30
fundamentally, the way to
32:33
how could
32:33
we actually
32:36
create a
32:36
step function change in the world in how we
32:38
deal with our minds? You
32:40
begin with measurement. And once
32:42
you have numbers, science begins with numbers and
32:45
counting. And then ecosystems form around
32:47
that. So genetics, I think, is
32:49
kind of like that. It's not as numerical as
32:51
what the brain measurement could be, but that was
32:53
a fundamental thing is if you give everyone
32:56
the numbers, you get an opportunity to
32:58
build solutions around
33:00
those problems.
33:01
So this blueprint, your
33:04
your blog, I don't know what you're calling it,
33:06
your experiment. It's pretty wild because, you know,
33:08
I saw what you used to look like.
33:10
You weren't bad looking, but you definitely were
33:13
thicker than you are now. Like, your
33:15
jawline is like crazy cut right now. And,
33:17
like, you just look way different than you used to
33:19
look. I mean, it's pretty fascinating. And, you know, Braintree
33:21
is amazing. You've built something amazing.
33:23
But this Blueprint thing is, like, way crazier and
33:25
and unique and odd, and it's awesome.
33:27
And, like, I've read it and you're
33:29
basically, if I remember correctly.
33:32
Like, I've been following it for a bunch of months
33:34
and you do regular updates. And at
33:36
first, I think you got down to, like,
33:38
six percent body fat and you're, like, Okay.
33:40
I think six percent is a little bit
33:42
too low. Let's go to seven and a half percent.
33:44
And then you're also, like, eating this, like,
33:46
nutty pudding, I think you called it. So you have
33:48
a vegan diet and you're like just eating, like,
33:51
nuts and, like, tons of vitamins and,
33:53
like, what appears to be not
33:55
tasty, boring food. Like, you've gone
33:57
all in. And the premise behind
33:59
this is I think you had this blog post that said, like,
34:01
late night Bryan no longer gets to make decisions
34:03
or something. Like, you know, you're at
34:05
home at night and you're hungry and you just would go and
34:07
snack and eat bad food, and you're like, I'm no longer
34:09
letting that guy make any decisions. He
34:11
no longer has a say. We're letting
34:13
experimental Bryan make the decisions from now on. This thing's crazy,
34:15
man. What? Why are you doing
34:17
this? And what have
34:19
you found to be actually meaningful versus
34:22
not meaningful? And by the way though, you're
34:24
doing the knees over
34:26
toes guy thing, I noticed. You're, like, walking backwards. You're doing the
34:28
tibialis raises. This guy was onto
34:30
something. You're doing it. It looks
34:32
like, I
34:34
am. Yeah. I mean, so Blueprint for me, this goes back
34:36
to the age of twenty one.
34:38
And to me, this is the
34:40
best answer I've ever come
34:42
up with in my life. If
34:45
you
34:45
basically, if you pose a question,
34:47
how can we
34:47
imagine the human race
34:49
and
34:49
intelligence generally
34:52
surviving
34:52
itself and thriving? Like, what is
34:54
our plan
34:55
as a species
34:58
to thrive?
34:58
Blueprint
34:59
to me is the best idea
35:01
I've ever come up with and the most practical
35:03
idea I've ever come up with to address that. And
35:05
so
35:05
from the outside perspective, it appears
35:08
to be health and wellness and anti aging and
35:10
whatnot. That's all true. But really,
35:12
it's a
35:13
philosophical endeavor in the future
35:15
of intelligence. And so
35:18
the way this began:
35:21
basically, I had a problem
35:23
of overeating every day. Every night
35:25
at seven PM,
35:27
I would
35:27
overeat. I'd either have a
35:29
second serving for dinner or a third
35:31
serving or have desserts
35:33
or do something that I would consider to be
35:35
self harm. Like eating too much food, the wrong food, and it was just
35:38
causing
35:38
bad things to me. I couldn't sleep well.
35:40
I was overweight, like, all the
35:42
above. And
35:44
so I tried everything to fix it and I couldn't. And
35:46
so I playfully said I'm going to
35:49
fire evening Bryan because
35:50
morning Bryan,
35:52
who wakes up in the morning, he exercises. He does really well
35:54
eating, same with lunch Bryan. But
35:56
this five PM, ten PM Bryan,
35:58
like, that Bryan, he's
35:59
the problem. He's
36:01
always making the
36:03
wrong choice. Like, I
36:05
could absolutely rely upon him to make the
36:07
wrong choice. And he always had an
36:09
infinite number of reasons on why
36:12
today was okay to
36:14
do
36:14
the thing. And so I was like, you know
36:15
what? I've had it. Like, he's done.
36:17
He's absolutely out. And so I played through
36:19
this thought process of, like, all the Bryans got together. We had
36:20
a discussion: morning Bryan, evening Bryan, you're making this awful. And
36:23
so I just revoked any
36:25
authority from five PM to
36:27
ten PM to eat. And so
36:29
what started off is, like, this playful thing now turned into
36:31
what I
36:32
basically have done to my
36:35
entire system
36:36
is
36:38
now
36:38
I only eat
36:40
what
36:41
my body
36:44
asks for according
36:45
to data and science. Have you had any splurges since you've started this? And
36:47
do you ever intend to do that?
36:49
Have you had
36:51
any
36:51
infractions? Yes. Although I
36:53
think
36:55
this is the most interesting part
36:58
of the entire thing. So
36:59
just to be clear, the
37:01
starting point. This is a big deviation from
37:03
how society is structured right now. Right
37:06
now, our
37:07
minds
37:09
have unquestioned authority
37:12
in deciding what we eat. So if you think about your
37:14
daily life, you go to the store and you
37:16
walk down aisles, you're like, yeah, maybe this, maybe
37:18
that. You decide how much you
37:20
put on your plate, you decide if you go to
37:22
a restaurant, you're presented the menu, you decide
37:24
if you're going to
37:25
have a pizza party, you decide if you're
37:27
going to have Doritos. Like,
37:28
you're making these decisions all the time. And it's a combination of how
37:31
you feel and what you want. Like, you know, you're
37:33
trying to be whatever, but you're basically
37:35
giving your mind
37:37
unquestioned authority to do it.
37:40
Blueprint
37:40
flips that and it says
37:42
my mind has zero
37:44
authority,
37:45
my body has a hundred percent authority. Through
37:48
measurement of my heart
37:48
and liver and lungs and
37:51
DNA methylation patterns. It directly
37:53
asks for
37:53
what it
37:56
wants via data, and
37:57
I can never override it. And
37:58
so in this idea,
38:00
so the
38:00
thought experiment is, if
38:03
you could
38:04
achieve health
38:06
and maintain perfect health, but
38:08
it required you to
38:10
accept basically
38:11
what an algorithm is doing to deliver
38:13
what you eat and when
38:15
you eat, would you do it? And
38:16
then in that
38:17
thought experiment, Sam,
38:18
and I've had this conversation
38:20
hundreds of times now, the
38:22
reaction people have is almost
38:25
this response. And their conscious
38:27
mind panics. And it
38:30
it's, like,
38:30
I see it like a computer screen
38:33
scrolling by that infinite number of questions. But, like,
38:35
what if Cheetos, pizza party, seven to eight, like,
38:38
whatever. And then it like,
38:39
the mind is
38:42
panicking over
38:42
the contemplation of loss of control, that it
38:45
can't do the things that
38:46
it thinks, that the
38:49
mind says, the only
38:50
way I can be happy in existence is if I still
38:53
get to choose what I do
38:54
when I do it and the mind cannot
38:56
get over
38:56
it. It cannot just say,
38:59
hold tight. Like,
39:00
is it possible let me just
39:02
contemplate.
39:02
Is it possible that
39:04
I am a
39:05
self harm machine?
39:08
I cannot stop myself from committing self harm. I probably will never
39:10
be able to do it.
39:11
And if I keep on doing this, it's probably
39:13
gonna lead to a
39:16
predictable outcome. And so
39:17
it's such an interesting interaction, rolling through this
39:21
thing. And I think it's really on
39:22
par, if you say,
39:22
like, one of the major
39:25
things that have impacted humanity: is the earth the center
39:27
of the universe? Or, you know,
39:30
is
39:30
there an evolutionary force
39:32
creating all things on earth? I
39:36
think this one could be on par,
39:37
a
39:39
societal understanding that our
39:39
unquestioned granting of authority to our
39:42
conscious minds
39:44
is at the
39:44
root of all of our problems
39:46
in society. And so the contemplation
39:48
here
39:48
is, if I imagine this, could
39:52
I stop self harm from
39:54
happening inside of Bryan?
39:55
Because
39:56
just like there's wars going on, there's
39:58
all these tribal factions in
39:59
the world, the same thing is going on inside
40:02
of me with my own body and my cravings and whatever else, I've
40:04
achieved goal alignment within
40:05
myself on
40:08
this program. And
40:09
so that's really what this whole thing is
40:11
about. It is trying to
40:13
think through how could
40:14
I achieve goal alignment. And we hear
40:17
a lot about AI goal alignment with humans being an
40:19
existential threat, but I think the more interesting
40:21
starting point is not for me to look on the
40:23
other side of my eyeballs and say, let me find everyone
40:25
else who's got a problem in
40:27
the world. Let me look at myself and say, what
40:29
is my own internal chaos and
40:31
war? And can I even try to resolve
40:33
conflict within myself? There's
40:35
a part of me that's like, wow,
40:37
this is incredible. I'm
40:40
going to the Blueprint site,
40:42
looking at the routine. I'm looking at the photo. I mean, you're completely
40:44
shredded. This is amazing. So there's part of me that's, like,
40:46
wow. This is incredible. And then there's a part of me
40:48
that's, like, you're doing the
40:50
thing a little bit where it's like, is this
40:52
mayo? No, he's calling it aioli. Or it's
40:54
like, oh, wow. This is not only just great
40:56
for your health. This
40:58
is transcendence for the race and the society. And I'm like, okay. Maybe
41:00
I could see how that's true. But I mean, it does
41:02
seem like you're not actually it's not that your
41:04
body's deciding. Your brain has decided
41:06
I'm gonna use data about what my body
41:08
wants instead of impulse, you know,
41:10
whatever my impulse driven, you know, brain was
41:12
trying to do before. But like, I guess regardless,
41:15
I look at it and I'm like, this is amazing, but man, you know,
41:18
this looks like a full
41:20
time
41:22
effort plus PhD
41:24
level intelligence, plus a bunch of money to
41:26
be able to do this. What have you He just said
41:28
his cost, by the way. His cost is, I think, like, three
41:31
grand a month. Like, it's not, like Well, I think it's three grand a month, but it's also, like,
41:33
the mental energy that you would have to put towards doing this
41:35
is, like, you know, the real cost.
41:37
So but
41:39
you did it in service of other people too. So, like,
41:41
you know, what is the eighty twenty? What have you
41:44
discovered have been the
41:46
highest leverage, you know, changes? I know
41:48
it's in this, but say it out loud because not
41:50
everyone's gonna read the whole thing. So, like, what are
41:52
the highest leverage changes that you
41:54
were able to make during this experiment? The
41:56
first thought
41:58
on this, maybe just a reflection: why
41:58
is it
41:59
in
41:59
society
42:01
do we
42:03
accept this
42:04
ferocious system that invites everyone to
42:06
commit
42:06
self harm? Like, when you
42:08
walk into the
42:09
grocery store, I mean,
42:11
it is violence,
42:12
visual violence,
42:14
outright violence, through the representation
42:16
of advertising and ingredients and
42:18
sugar and you're in there and
42:20
you're supposed to be on
42:23
equal footing with that. Like, no way. We're outmatched. And the
42:26
same thing when we're sized up against
42:28
algorithms, it's a totally unfair match,
42:30
and society just
42:32
gleefully allows the self
42:34
harm. And so the individual is pitted
42:36
against algorithms and
42:38
capitalism
42:38
like, good luck, individual,
42:41
on trying
42:41
to keep your shit together. And so it's just
42:44
it's
42:44
an unfair thing, and I think
42:46
it's just bad for everyone to
42:48
be in this game. In terms of, like,
42:50
the basics for people, it's really
42:52
understanding that trying to
42:54
win this game with willpower
42:57
is a
42:58
losing game. If you put yourself
43:00
in a situation where you
43:02
have option a and option b,
43:04
you're probably
43:05
going to lose fifty
43:07
percent of the time or more. And that's
43:09
the whole thing I've been trying to do with
43:11
Blueprint is yes. It's expensive right
43:13
now. Yes.
43:14
Like, it's difficult. This always
43:15
happens with innovation. It's always
43:18
expensive and inaccessible, and then over time it gets
43:20
better. And that's why I openly blog
43:22
about all the things I'm doing is I'm
43:24
trying to get
43:24
this out so others can build upon it. I think the most
43:27
important thing
43:28
that someone
43:30
can
43:30
do to win here
43:32
would
43:32
be to accept the basic principle that it's
43:35
a system
43:35
that drives what you eat.
43:38
It's not
43:39
your decision making. But what
43:41
about the, like, the specific truths? Are
43:44
there any, like, truths or any, like,
43:46
hypotheses that you believe to be
43:48
true, like, for, you know, you're, like,
43:50
well, even though you said you did a great analogy with the door frame, how
43:52
it's standard, and how bodies aren't necessarily
43:54
like that. But, like, has going
43:56
vegan, like, made a huge difference to you? Is there
43:58
anything about,
44:00
like, for example, a lot of us sit at chairs, you know,
44:02
eighty hours a week, staring at screens, and
44:04
we work really hard. Is there any, like, really
44:06
It's actually thirty hours a week.
44:10
doing this whole interview. Yeah.
44:12
Are you like, well, you know, like, thirty
44:14
hours is I think what's the
44:17
the Israeli guy, he wrote the book Homo Sapiens. In
44:19
one of his books, he was like, you know, like,
44:21
hunter-gatherers worked only thirty hours a week, and
44:23
that kinda seems like an ideal number.
44:25
Is there any Sapiens? No homo. Okay?
44:28
My bad. My bad. I'm
44:30
inclusive. So yeah. Is there
44:33
any, like, is there anything that you've discovered for
44:35
you that, you know, it maybe is not right
44:37
for everyone else? Yeah.
44:39
Yeah. So
44:40
I got my pilot license several years
44:43
ago. And
44:44
in doing so, I was
44:45
assessing the risk of
44:47
death.
44:48
And one of
44:49
the stats that stood out
44:51
to me was that over seventy
44:54
percent of
44:54
incidents in aviation
44:56
were
44:57
attributable to
45:00
amateur pilots. And so
45:00
while I went through the certification of every plane I flew, I was
45:02
typewriter in every plane and, you know, whatnot,
45:05
I refused to fly alone. because
45:07
I knew the risk of
45:10
error was just the math was there.
45:12
The stats were there. The same is
45:14
true with health and
45:16
wellness. I tried to do this on my own, you know, almost like
45:18
going around
45:18
this little bag, listening to
45:21
podcasts, reading books, reading
45:24
literature, and try to put little gems of insight into my bag and
45:26
try to piece together my
45:27
own protocol. It's
45:29
the same as trying
45:31
to airplane
45:32
by myself. Even
45:33
though I study it, I get typed in
45:36
it, my error rate just gonna be
45:37
very high.
45:39
And so, then, three: the value here in this conversation would not be somebody feeling motivated in this moment to do something good, because tomorrow they're going to fail. And the value here is also not debating whether a vegan diet is better than a carnivore diet. To me, the real essence of this conversation, the only way this conversation can be of value amidst all the chatter around the world, you know, everyone else talking about this, is: one, is there an engineered solution that actually solves this from a certain perspective? And two, can you do so with data? It doesn't do anyone any good to debate whether carnivore is better than vegan. It's a meaningless conversation.
46:23
Data is the only thing that matters. And so that's why I publish all my data. Like, I mean, I am vegan for ethical and moral reasons, but I'm not vegan because I think it's, you know... At a layer of abstraction, we looked at the data; we're agnostic to these inputs. And so that's really what I'm trying to say: this is not a health and wellness gig. This is not a diet trend. It is really trying to get at the structural foundation of what it means to be human: our relationship with food, our relationship with happiness, and, you know, like, how we structure our lives. And it's also trying to say, let's get past these silly binary debates at these layers of abstraction, which are meaningless and just confuse everyone. Because then people immediately say: first they say eggs are bad for you, then they're good for you. No one knows, and they just stop at that point. When in reality, there are more right answers than wrong in terms of doing this in a methodical way. You know, before
47:17
we started recording, Sean was just, like, gushing about how, like, jacked and ripped you are and how he thought you looked great. Are you getting any more attention from women? Or is it all just guys like Sean, just, like, high-quality compliments? Because whenever I get in shape, I'm like, oh yeah, my wife is gonna love this. And she's like, I guess you look alright. It's always dudes. It's wholly men. I think, Sam, I would say the person who's happiest is me.
47:48
You know, like, I had terribly complicated emotions looking at myself in my worst years. I felt so much shame and guilt and lack of respect, because I just felt out of control and powerless. And now, when I look at myself, I have such positive emotions: that I'm stable, that things are reliable, that I trust myself. Yeah, I trust the systems that I've built. It has transformed my relationship with myself. It transforms what I think about, what I can become as a person, the relationships I have. So it's just hard to articulate how significant the psychological shift has been for me, and my own understanding, my own identity. Wow. You just
48:37
gave
48:37
a really great answer to a really dumb question. That was amazing. I'm glad you did that, because we set you up with sort of a goofy question, and I think you said something really profound there. I wanted
48:48
to ask you... you said something that's a bit of a cliffhanger. You were like, you know, willpower is not the answer. You said, you know, the good thing that could come out of this is not, oh, you listen to this podcast and you're motivated to go, you know, eat two pounds of veggies tomorrow. Because, you know, you'll revert, and you're up against, you know, the grocery store, which is this sugar casino, basically. And then, you know, all your social media is just this, like, algorithm designed to hook you, and, you know, you are sort of... Yeah, David. First of all, I guess, bad analogy: David won. But, like, you know, you are powerless compared to the onslaught that's coming at you, trying to get you to make a certain decision. And, you know, but you didn't quite say it. You're like, you know, there's a system. But
49:27
okay, if I take that, I'm like: tomorrow I wake up, Brian's gone, and I'm like, alright, today is about a system. And I'm gonna be like, what the fuck was he talking about? What's the system? What am I supposed to do? That's where I would stop. So I wanna make sure we don't leave it at that. What would you say is the approach that you advocate for? Did you say it's the data? Is it, oh, I should begin tracking certain markers, and that's the step to take, and then let my own intuition ride? Is it, I gotta fire evening Sean? Because I have the same problem now that you described you had then. What would you say is the actionable way forward, the sort of sustainable, successful path? Yep. I'd say
50:08
three things. One: fire the worst version of yourself. So whatever you do, whatever it is, like, whatever time, whatever the circumstances are, identify that person and fire them. You'll be better off for that one act. Step number two is: make a firm commitment on one step towards the system. So, for example, for me, that's calories. I eat one thousand nine hundred and seventy-seven calories per day. That's it. Not any more. And that is my absolute budget, and I can't go over it. And so set
50:43
a firm boundary that that's what you're going to do, and you're gonna stick with it. And then three is: you can start refining the details of what those calories are and when you eat them. You wanna pack more nutrition in over time. But just acknowledge that you yourself are powerless to win in a moment-by-moment willpower game, and structurally set up the path to win. Fire the worst version of yourself to get that muscle to build, set very clear boundaries, and then you
51:16
can refine. Don't worry about taking on the whole thing all at once. Take just this baby step, and then you'll build this muscle.
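A rough sketch of that "firm boundary" idea as a system rather than willpower. The 1,977-calorie budget is Bryan's figure from above; the function and names below are purely illustrative, not anything from Blueprint:

```python
# Illustrative sketch: a hard daily calorie budget as a rule, not a choice.
# The 1,977 kcal figure comes from the conversation; everything else is assumed.
DAILY_BUDGET_KCAL = 1977

def fits_budget(eaten_so_far_kcal: int, next_item_kcal: int) -> bool:
    """True only if the next item stays inside the fixed daily budget."""
    return eaten_so_far_kcal + next_item_kcal <= DAILY_BUDGET_KCAL

# Example: 1,800 kcal already eaten today; a 250 kcal snack breaks the boundary.
print(fits_budget(1800, 250))  # False -- the system says no, so you never deliberate
```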
51:22
Like, my seventeen-year-old, he does the identical things I do. And,
51:28
you know, Sam, you can ask, like, how many times have you made exceptions? And, like, you know, so he went through this whole thing of how many times did he have to make the error of eating too much, or making an exception, or breaking the protocol, or whatever. And it was something like, I don't know, I forget what we talked about, it was, like, something in the thirties range. You know, where in the morning he was, like, Dad, I felt like... fine. Right? But he finally got to the same point where I did, where you can get to that moment, and
51:57
you're
51:57
tempted to do something, and you can model out exactly how it's going to feel to do that thing, and you can model out exactly what it's gonna feel like after you've done that thing. You can model out your sleep, and you can model out your next morning and how you feel about yourself. And pretty soon, the simulation becomes so clear in your mind. You're like, yeah, no. Like, there's nothing in that series of events where I win. Like, literally nowhere. Why am I going to do it? And then it just gets to a point where it's,
52:25
you know, people ask me, like, do you have cheat days? And, like, no. Like, a cheat day sounds awful to me. It sounds like the worst feeling in the world, to be full and to be regretful. Like, no, that's the last thing in the world I wanna do. You need baby steps, and you build up your muscles, and then they soon just become the way of being. You're
52:43
clearly incredibly unique. You're very insightful, you're very wise, you're a person who, like, you talk to me and I'm like, oh man, this guy has a lot of things figured out. You're, like, very precise, and you're, like, I follow the data. Were you a good manager? Did people like working with you and for you? Or did they find you to be just, like, challenging, because you're just so unique -- Yeah. -- like, hyper-rational? Yeah. I joked with my
53:09
team at Braintree that I didn't care what they thought; I only cared to learn what their significant others thought, because it's very hard to get someone's real opinion of you. It's difficult for someone to be honest. But we all know, when we go home from work and we're talking to our significant others, like, that's the truth serum in action. You know, like, that's where we really say something. And so I do have some data where people said I'm the best boss they've ever had in their life. I'm sure other
53:40
people dislike the way I do certain things, but I certainly care deeply about being a high-value person in these people's lives. And so not only, you know, a good steward of the business, but creating an environment where they become their best selves. Was
54:02
that a learned behavior? Like, were you like, well, if I treat people right, I'm gonna get my outcome? Or was it just, like, how you were raised? Like, what motivated that? Yeah. It just feels like the right
54:13
way to do things. I mean, like, if you think about it more structurally: in society, we accept this exchange of spending our life points for a system of rewards that includes status and wealth and whatever else. And I suppose why I bring this up is, I think we all know how much money is in our bank account. We know how much we weigh. We know how many social media followers we have. But we don't know, for example, our speed of aging. Like, how fast are you aging in this moment? And if you had aging points, like your bank account, would you spend them a certain way?
54:48
And so what I'm saying is, we have fundamentally accepted that we are on this decline in life. We're going to spend our life points; we're going to the grave. And the things that may live on may include our reputation or, like, our contributions or whatever. And I suggest now may be the opportunity for us to flip it. So if you're in the year twenty twenty-two, looking at the horizon of possibilities for humanity, like, what is the thing you can barely see, that is barely imaginable? You would say, basically: don't spend your life points recklessly. If you
55:21
can live long enough, there may be a
55:24
new wave here. Like with Blueprint, I'm trying to do two things. I'm trying to maximally slow my rate of aging, because entropy is very strong; you're not gonna beat it. I'm currently aging at a rate of point seven six. So for every three hundred and sixty-five days in a year, I age two hundred and seventy-seven days. I basically get, like, October, November, December for free. And then, for the months I do age, the project is to reverse the aging that has happened, so that I can stay the same biological age.
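The arithmetic behind those numbers, as a minimal sketch. Only the 0.76 speed-of-aging rate comes from the conversation; the helper below is illustrative:

```python
# Speed of aging: biological days accrued per chronological day.
SPEED_OF_AGING = 0.76  # the rate Bryan quotes above

def biological_days(chronological_days: int, speed: float = SPEED_OF_AGING) -> float:
    """Biological days accrued over a span of chronological days."""
    return chronological_days * speed

aged = biological_days(365)  # 0.76 * 365 ≈ 277 days
free = 365 - aged            # ≈ 88 days "for free" -- roughly three months
print(f"aged {aged:.0f} of 365 days; {free:.0f} days free")
```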
55:54
And so if we have aging points, if we have an aging-point count, then society could shift. Instead of us saying we're going to be a martyr for wealth or status or whatever, would that balance change? And would it be more about humans, so we become obsessed about what we can become as a species, not what our technology can become? Right. And let
56:14
me
56:14
ask you: a lot of people who listen to this are entrepreneurs, and I think one of the cool things when we have somebody like you come on versus, like, you know, our buddy who's doing, like, vending machine arbitrage, and, you know, he tells a different story, which is like, oh, this is great, I got this income stream coming in, I was able to quit my job, inspiring story. This is a different one, which is, like, you know: spend your creative and entrepreneurial energies on things that really matter, both to you and your lifespan, as well as to, like, you know, human civilization. What do you wish people were working on? What opportunities do you see, where you're like, man, we need more talent and brains going and trying to solve X, or, this breakthrough just happened and really nobody's doing Y? Do you have any ideas like that, where, like, you know, you think somebody could go do X? I do.
57:03
A lot of them, but I would say I'm really obsessed with one. That's goal alignment, or cooperation. At the basis of everything that exists on planet Earth, there's a singular question at play: can it cooperate? And so with this Blueprint thing, the question is: can I bring world peace to Brian, in my body? The answer is yes, I did. Now, my mind is a whole other thing. Right? The negative self-talk, like, all the stuff that goes on in my brain, it's an entirely different project. But Blueprint is
57:36
applicable to climate change. It's the same thing: if we were to measure the world with millions of data points and let the Earth speak, and then we work within those constraints, that's the solution for how we can coexist with a healthy planet. Versus right now, our minds overrun the planet: we do what we want, when we want, and how we want, and it comes at the expense of our planet. So just like we're committing self-harm to ourselves, we're committing harm to the Earth. It's likely the same problem.
58:08
And so if you could think about goal alignment within ourselves, between each other, with planet Earth, and with AI, it's a gigantic computational goal-alignment problem. And so the idea, like, this AI alignment problem we've talked about, between AI and humans, like, what? We don't have goal alignment within ourselves, let alone between humans, and then between humans and AI. It's kind of a crazy notion.
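To make the "computational goal alignment" framing concrete, here is a toy cooperation game in the John Nash sense he brings up later in the conversation. This is an illustrative sketch only, not anything from Blueprint or Kernel: two agents each choose to cooperate or defect; mutual cooperation pays better overall, yet each side is individually tempted to defect.

```python
from itertools import product

# Toy payoff table (a prisoner's dilemma): (row player, column player) payoffs.
PAYOFFS = {
    ("C", "C"): (3, 3),  # both cooperate
    ("C", "D"): (0, 5),  # row cooperates, column defects
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # both defect
}

def is_nash(row: str, col: str) -> bool:
    """Neither player can gain by unilaterally switching strategy."""
    r, c = PAYOFFS[(row, col)]
    row_stays = all(PAYOFFS[(alt, col)][0] <= r for alt in "CD")
    col_stays = all(PAYOFFS[(row, alt)][1] <= c for alt in "CD")
    return row_stays and col_stays

for row, col in product("CD", repeat=2):
    print(row, col, "<- Nash equilibrium" if is_nash(row, col) else "")
# Only (D, D) is stable: cooperation fails by default, which is the misalignment.
```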
58:34
And then, you know, like, if you look at the number of disconnects on that entire stream, the starting point in most people's assumptions is: let me start with what I can see, and how I can change people's behavior from the outside looking in. And so I'd say: yes, I have ideas, but none of them matter. Because unless we can solve cooperation, what is it that we have to look forward to as a species? So what's an
58:58
example of that? You talked about, basically, like, goal alignment first with your body, right? And you were able to solve that; Blueprint is a good example. What's another example of how you take this idea of cooperation or goal alignment? And what would be, like, a more specific point of attack, or, like, a product or service or whatever, that would be created along those lines? I mean,
59:21
the brain. Like, so I would love to tackle my brain next, and that's what Kernel is. If I have this device and I can measure my brain... if you think about it, like, what we eat today is dietary input to our bodies. And then you think about: what is the diet for my brain? Like, the news sources, the social media sources, my friends, my environment. We have no idea what's happening. We haven't measured it. And so is it possible that we basically eat ninety percent junk food in a given day for our brains, because that's just the way society is structured? And so I'd love to address that, getting the baseline of where my brain is. And then, more broadly, I've been talking to several people who apply math and
59:59
computational methodologies, to try to figure out: are there mathematical approaches? In the same way that John Nash came up with, you know, game theory, are there mathematical approaches to think about how we might solve this between humans, between humans and AI, between humans, AI, and the planet? And so, you know, I can do this with my body, it's a pretty straightforward thing to do now, but it's just gonna get more complicated the more agents you have in the game. Man, so
1:00:28
I've, like, worked with Tim Ferriss in the past, and, like, I remember, just as an example, we'd be walking our dogs together, and I'd be like, oh, Tim, that's a cool dog leash. And he'd be like, oh, this dog leash, you know, the reason it's interesting is it's made of horse's hair, which is good for the dog for this reason and that reason; I found it in Japan. Then we had him on the podcast, you know, and we were just asking, like, what he had for breakfast, and he would just give us this, like, complex answer, and it was actually pretty profound. Palmer Luckey was kinda like that too. And
1:00:56
you are a hundred percent in that same ballpark, where I make a stupid joke and I ask a dumb question, or we ask something that seems straightforward and simple, and you give, like, a pretty profound answer.
1:01:06
And what's interesting is, you actually do this thing that Elad does, where you ask them a question and, most people don't do this, they pause. They'll actually not talk for, like, ten seconds, and they just think. And I actually try not to interrupt, because most people are uncomfortable with that. Once I interrupted you when you got silent. By the way, the key to good interviews: you don't interrupt that. So I kinda screwed that up.
1:01:27
But you, like, are really thoughtful, and, like, in one way it's exhausting, because, like, everything you say is, like, profound, and I'm, like, thinking about it. And I'm, like, well, I wanna ask you all these questions about that too now, and this, and this. On the other hand, it's refreshing. I mean, it's just enlightening. You're just an interesting human, and your intensity, I can see it being off-putting for some people, but for me, like, I'm into it. You're just a very unique person. I think it's, like, really interesting to hear your perspective. I don't think you've even said anything that I agree or disagree with, but maybe you have. And it's like, well, that's okay, but you're just, like, an original thinker. I dig it. Thank you, Sam.
1:02:03
Appreciate that. Before we go, give us: what's one of these contraptions behind you? What is one of these interesting things? That one looks like a SodaStream or something. A genetic sequencing machine. That'd be funny. Like, you got me. Like, you know, I've got Coke and Pepsi and... Yeah, I'm kinda drinking a red wine too. What's going on with me? Yeah.
1:02:26
So, last week, like, weeks ago, we bought a medical-grade, hospital-grade ultrasound machine. And so part of this has just been buying the kind of equipment that allows us to do the stuff we're trying to do. So, for example, ultrasound: if you look at the pyramid of measurement, wearables are, like, part of the first category, and then you get to biofluids, like blood draws and, you know, urine and whatever else. Imaging is an entirely different quality class. And so if you wanna get really good data on yourself, it's ultrasound and MRI and, you know, everything internal. And so we really had to build up this infrastructure. So one of the reasons why this is so expensive is we've basically built out, like, a mini clinic-hospital here with all the stuff we have. It serves a lot of different outcomes. But yeah.
1:03:12
And are there, like, four dudes that are your physicians, that are just kind of monitoring you at all times? Or what's your team? What's your stack of people for that? Yeah. For
1:03:24
the ultrasound, we have five sonographers. So one sonographer specializes in the heart. One does lungs, pancreas, liver, and kidney. Another one does musculoskeletal. Another one does Doppler for the brain. And so, for example, one of the ways we quantify working out is we're using ultrasound to measure tendons, ligaments, like, all the component parts of my joints: ankles, knees, hips, elbows, shoulders. And then we implement these exercise regimens, and we look at the changes. Like, are they working, and to what degree?
1:04:00
So everything we do is quantified. And so it's not like, oh, this thing makes me feel better. Feelings rarely matter with anything we do. And so, yeah, it's the hardware and the specific team. And just like we have five sonographers to run the ultrasound machine, we have specialists in the lungs, specialists in the heart, specialists in other areas. So the team is, like, twenty-five or so, maybe, you know, altogether, with different specialties. Man, this
1:04:24
is crazy. Sean's answer to that question is, like: I just got, like, a poster of dogs playing poker, one shoe, some AirPods, a couple of empty Diet Cokes... It's like, we asked you... It's like, oh, you mean that all comes out? Actually, I don't need all this measurement. Like, if you just export my DoorDash history, it'll tell you how I'm aging. Yeah, I'm aging out right now. Chick-fil-A, does it make you older or younger? What are your thoughts on the brand?
1:04:56
Dude, Brian, this has been awesome. Thanks for coming on. We talked about you way back as, like, this awesome guy, and it's amazing to get to meet you, hear some of the stories, and ask our questions firsthand. So appreciate you coming on. Yeah. Hey, thanks, guys.
1:05:08
[Outro music]