Episode Transcript
0:00
We probably have a thousand days
0:02
left of humanity as we know
0:04
it, or society as we know
0:07
it. There's probably three years of
0:09
society as we once knew it.
0:11
And then as we go further
0:14
and further past the next thousand
0:16
days, we're living in a post-AI
0:18
revolution world. This episode is
0:21
brought to you by our lead
0:23
sponsor, massive legends Iren, the
0:25
largest NASDAQ listed Bitcoin miner using
0:27
100% renewable energy. Now they're not
0:30
just powering the Bitcoin network, they're
0:32
also providing cutting-edge compute resources for
0:34
AI, all backed by renewable energy.
0:37
Now my boy Danny and I
0:39
have been working with their founders
0:41
Dan and Will for quite some
0:44
time now. And we've always been
0:46
super impressed with their values, especially
0:48
their commitment to local communities and
0:51
sustainable computing power. So if you're
0:53
interested in mining Bitcoin or harnessing
0:55
AI compute power, Iren is setting
0:57
the standard. And so you
1:00
can find out more at iren.com,
1:02
which is i-r-e-n.com. That is iren.com.
1:04
Roll him. Is it Daniel? Dan. Either. All
1:06
right. Dan. Danny seems a bit
1:09
weird. My grandma used to
1:11
call me Danny. Danny, that's
1:13
the old producer of the show,
1:15
he's called Danny. He's off doing
1:17
his own thing. He's a British
1:19
guy, he lives in Australia now.
1:22
He goes in, where is
1:24
he, Brisbane or somewhere? Beautiful,
1:26
one day perfect the next.
1:29
Look, I've been out to
1:31
Australia the last two years. We
1:33
do a little event there. You got
1:36
the sunshine on the beach. Yeah. the
1:38
good weather, good life. You had something
1:40
the size of all of Europe with
1:42
35 million people. Yeah, whereas we've got
1:45
the shitty version of communism where
1:47
everything's shit. Yeah, miserable, especially this time
1:49
of year. Well look, it's great to
1:51
talk to you. I've been following you
1:53
on Twitter for a little while now.
1:56
I think what you've done
1:58
really well is you've got
2:00
stuff that appeals to me and stuff that
2:02
will appeal to my son in that you've
2:05
got all this great business advice
2:07
for young entrepreneurs where I think
2:09
you're giving them the reality the
2:11
stuff they probably should be telling
2:13
them at school and then you've
2:15
got all the miserable UK bullshit
2:17
which I totally agree with where
2:19
I'm like yes I understand I
2:21
understand. So you've been here 16
2:23
years? 19 now. 19? Wow. Yeah,
2:25
yeah. So I mean, look, I
2:27
am a mix of the positive
2:29
and the negative. Like, there's part
2:31
of me that thinks that this
2:33
is the greatest time in history
2:35
to be alive and that this
2:37
is an amazing time to be
2:39
building and growing businesses and creating wealth.
2:41
And then there's part of me
2:43
that grumbles that systems aren't working
2:45
for us that we could unleash
2:47
a lot more. And it's not
2:49
because I'm complaining about the UK
2:51
government or any... anything in particular,
2:53
it's because I see the possibility.
2:55
Oh yeah. It's like I see
2:57
what's possible and I see this
2:59
moment that's here and all the
3:01
complaints are like really from a
3:03
space of hey we could be
3:05
great, we could be doing really
3:07
really well. I'm not sitting there
3:09
moaning saying oh there's nothing you
3:11
know there's no hope. I'm actually
3:13
saying let's get the handcuffs off,
3:15
let's get the shackles off, let's
3:17
go for it. Let's actually unleash
3:19
some value here. I completely agree.
3:21
Well, I agree in that I
3:23
am definitely moaning. Yeah. I am
3:25
giving Keir Starmer, Rachel Reeves and
3:27
Angela Rayner, and I'm giving them
3:29
a hard time. They probably don't
3:31
know who I am or give
3:33
a shit, but I am giving them
3:35
a hard time because I think
3:37
they're absolutely useless. I think the
3:39
previous Conservative Party was pretty useless
3:41
as well. Totally, which is why
3:43
they got voted out. Yeah. We
3:45
chose to punish them by punishing
3:47
them ourselves. Yeah. I mean that
3:49
comes down to the stupidity of
3:51
politics. I am
3:55
an optimist and I'm also just,
3:57
I'm an entrepreneur, so I just,
3:59
I just get on with it.
4:01
I didn't vote in the last
4:03
two elections and some people get
4:05
upset by that and I'm like,
4:07
I was just, this is a
4:09
waste of my time, it doesn't
4:11
make... make a difference in my
4:13
life. I just get on with
4:15
it and build businesses. I don't
4:17
have a political team. I'm not
4:19
left, sorry I'm not conservative or
4:21
Labour or any party in
4:23
particular, I'm pro-entrepreneur. Like all I
4:25
am is, like I'd vote for
4:28
any party that I thought had
4:30
the best policies that would benefit
4:32
entrepreneurs because I think entrepreneurship is
4:34
the best game in town. It's
4:36
the thing that does the most,
4:38
you know, positivity for the economy
4:40
and for individuals as well. So
4:42
I'm, you know, personally, I'm not
4:44
political, I just... you know, vote
4:46
in alignment with what I think
4:48
is best for entrepreneurship. Well, I'm
4:50
similar. I'd say I think more
4:52
just generally about the economy. Yeah,
4:54
you know, so obviously supporting entrepreneurs
4:56
and supporting businesses, but also just
4:58
generally having sensible economic policies. I'm
5:00
a big fan of smaller government.
5:02
Yeah, small government, you know,
5:04
leave most things alone, let the
5:06
market sort out most things. There's
5:08
a few things that the government
5:10
probably needs to be concerned with,
5:12
and there's a few things that
5:14
only really governments can do properly,
5:16
like defense, you know, military budgets.
5:18
You know, the big ones, to
5:20
create a great economy, you want
5:22
food security, so you want to
5:24
have your own ability to produce
5:26
your own food, so you want
5:28
a thriving farming community. The farmers
5:30
are the first entrepreneurs. They are
5:32
the entrepreneurs that underpin all entrepreneurs.
5:34
You know, we need the farmers
5:36
three times a day. And every
5:38
country... Four or five for me.
5:40
Four or five, yeah, right? Every
5:42
country should have its own thriving
5:44
farming community that produces food, because
5:46
there is a thing called food
5:48
security. If you look at the
5:50
USA, the USA is a very
5:52
secure country because they have a
5:54
high degree of food security. They
5:56
manufacture all their own food, not
5:58
the best food in the world.
6:00
Factory produced skittles are not the
6:02
most healthy thing, but you take
6:04
a country like China. They import
6:06
40% of their calories. If the
6:08
world goes to custard, China's in
6:10
real trouble just from a calories
6:12
perspective. They can go into famine.
6:14
So, you know, governments need to
6:16
take care of farmers. That's one
6:18
of the very first things. Making
6:20
sure that the farmers are happy. Energy.
6:22
There is no such thing as
6:24
an economy that has expensive energy
6:26
and is thriving. Low cost, cheap,
6:28
available energy that just runs is
6:30
the backbone of industry. And you
6:32
know especially where we're going with
6:34
data centers and AI you cannot
6:36
have an AI economy if your
6:38
energy costs are high. If you
6:40
look at
6:42
the USA the USA has one-quarter
6:44
the energy costs per kilowatt as
6:46
the UK. Really?
6:48
Yeah, it's insane, 400% energy
6:50
costs. So we now have
6:52
a situation where to run a
6:54
data center just on the electricity
6:56
is going to cost you four
6:58
times as much. So that is
7:00
the job of government to make
7:02
sure you've got cheap available energy.
7:04
We're not doing that very well.
7:07
Well, I mean, it's such an
7:09
important point as well as every
7:11
business I have has an expensive
7:13
power cost. Yeah. I know people
7:15
who, you know, middle class, low
7:17
middle class people, they've got high
7:19
energy costs. Well, that affects the
7:21
disposable income. Yeah, it's a tax
7:23
on everyone. Yeah. But it's also
7:25
more than that, it is a
7:27
massive tax on industry. So as
7:29
soon as you have high energy
7:31
costs, forget about manufacturing anything, forget
7:33
about transporting things, you know, why
7:35
did the British succeed so well
7:37
the first time around? Because we
7:39
invented, you know, we had coal-powered
7:41
trains, which were a lot faster
7:43
and a lot easier to get
7:45
things from point A to point
7:47
B, because it ran on a
7:49
new type of energy and we
7:51
harnessed it. So... Same thing as
7:53
the British Navy, right? We were
7:55
able to get things around the
7:57
world faster and cheaper than anyone
7:59
else. We didn't have any land
8:01
borders to protect so we could
8:03
go all in with the Navy
8:05
And we became a naval superpower
8:07
of our day. So, you know,
8:09
we basically harnessed an energy called
8:11
the wind and got things moving
8:13
around faster and cheaper than anyone
8:15
else. So if you have high
8:17
energy costs, if you, there's a
8:19
graph that I've seen that basically
8:21
says energy costs versus the economic
8:23
value per person, high energy reduces
8:25
economic value per person. So you
8:27
just get rid of the economy
8:29
with high energy. So we're screwing
8:31
that up, we're screwing up food
8:33
security. Border defense, the ability to
8:35
maintain your own borders. You know,
8:37
we can't protect ourselves from dinghies,
8:39
which is pretty pathetic. You know,
8:41
people need to come into a
8:43
country legally. That's obvious. You know,
8:45
that's been conventional wisdom for 10,000
8:47
years. You don't just let random
8:49
people into your country. You know,
8:51
where did this idea come from?
8:53
That people can just cross a
8:55
border. I mean, that used to
8:57
be something you'd get shot for.
8:59
Like, genuinely, for most of human
9:01
history, you could get killed crossing
9:03
a border illegally. And now we're
9:05
like, oh yeah, nothing we can
9:07
do really. You know, even in
9:09
my lifetime growing up in Australia,
9:11
we were extremely strict on border
9:13
control. We used to send the
9:15
Navy out and turn the boats
9:17
around. And we got very much
9:19
criticized for it. You couldn't get
9:21
to Australia illegally. But it's absolutely
9:23
insane that an island has a
9:25
problem with a border. Like we're
9:27
an island. We have a natural
9:29
border. It's called a big sea.
9:31
How are we not able to
9:33
protect our own borders? So those
9:35
are some of the basics that
9:37
make a country work. And then
9:39
the individuals inside a country need
9:41
to feel a sense of ambition,
9:44
that they have personal incentives to
9:46
get ahead, that if they work
9:48
hard, they will be rewarded, if
9:50
they work hard, that they'll be
9:52
a payoff, if they innovate, they'll
9:54
be able to get the fruits
9:56
of that. Do you know what's
9:58
crazy? We wrote the book on
10:00
that. Adam Smith wrote the book
10:02
called Wealth of Nations, The Invisible
10:04
Hand of the Market. We invented
10:06
that. We invented the capitalist system.
10:08
We were the ones who like
10:10
literally birthed the system called free
10:12
market capitalism and exported it to
10:14
the world and it was the
10:16
most successful system in the damn
10:18
world. We wrote the book on
10:20
it and we basically say, oh,
10:22
you know, oh, wait a sec.
10:24
There was this guy called Karl
10:26
Marx who hung out in London
10:28
for a little bit. Let's see
10:30
what his book says. No, we
10:32
know that book's a crappy book. So,
10:34
you know, we're in this strange
10:36
situation where we are, on paper,
10:38
you know, fundamentally we should be
10:40
so strong, we should have strong
10:42
farms because we have great farming
10:44
land, we should have strong borders
10:46
because we're an island, we should
10:48
have great energy because we have
10:50
the North Sea, you know, we
10:52
should have an ambitious, motivated population,
10:54
you know, and yet we're taxing
10:56
them like communists. Well, every part
10:58
of this is broken. I've literally
11:00
been writing this down. I went
11:02
to the farmer protest. Had Gareth
11:04
Wyn Jones on here. I didn't
11:06
know enough about farming. My mom
11:08
grew up on a farm, but
11:10
I didn't really know too much
11:12
about it. I am perfectly aware
11:14
that there's a mischaracterization of farmers
11:16
as being these rich people in
11:18
home lots of land and drive
11:20
range rovers. I know it's hard.
11:22
It's not just that. Their tool
11:24
of the trade is land. Imagine
11:26
a weird scenario where suddenly hairdressing
11:28
scissors are worth a million dollars
11:30
a pair. But they're not, they're
11:32
sitting there going, but I just
11:34
use my hair scissors to cut
11:36
hair, like I'm not using these
11:38
scissors. The reason farmland is theoretically
11:40
worth millions is because of the
11:42
inflated value of property outside of
11:44
farmland. But if your intention is
11:46
not to sell the farmland, if
11:48
your intention is to farm it
11:50
as a service to society, you
11:52
know, then it's just a tool
11:54
of the trade. It's not some
11:56
massive asset that you're sitting on.
11:58
But it's okay to have caveats
12:00
as well in society and in
12:02
rules and taxation, you know, we
12:04
don't tax children, we don't tax
12:06
healthy food, we don't tax children's
12:08
clothes. We can have all sorts
12:10
of nuance. We have all
12:12
sorts of systems that allow for
12:14
nuance. Yeah. We used to
12:16
not tax education. But we're attacking
12:18
the farmers now. We're fundamentally making
12:21
their life harder. We're disincentivizing farming.
12:23
So number one in your list,
12:25
food security, we're attacking energy, as
12:27
you said, we've got the high.
12:29
Look, we all know that the
12:31
disposable incomes of lower and low
12:33
middle class people have shrunk.
12:35
We know the impact that's having
12:37
on businesses and towns. I know
12:39
that the impact on my bar,
12:41
right? Previous year, we were 20%
12:43
up. Yeah, bar doesn't make a
12:45
lot of profit, but it turned
12:47
over 600,000, but it's a good
12:49
little business. 40% down last year.
12:51
because mortgages went up, energy bills
12:53
went up, people went out less.
12:55
So we know that impacts businesses.
12:57
Okay, border security, I mean, I
12:59
could wax lyrical about that,
13:01
but it's not even just border
13:03
security with dinghies. I mean, I
13:05
don't know what the numbers are,
13:07
kind of, how many are coming
13:09
in legally. The majority are being
13:11
imported legally, and they're bringing in,
13:13
and they bump up our GDP.
13:16
And then the incentives, I think,
13:18
to start in a business now
13:23
and get to the point of a
13:25
profit where you can pay a
13:27
dividend is very, very hard. The hurdle
13:29
rate from borrowing money to paying
13:31
off the loan, the startup costs, I
13:34
mean... Let's forget digital businesses because they're
13:36
all easy. Well, I was about
13:38
to say there's a real distinction because
13:40
in the
13:42
UK economy, an
13:45
economy that's not growing is dangerous because
13:47
it's always got a leaky bucket,
13:49
which is interest on debt. So interest
13:51
on debt is just essentially money
13:53
that's coming out of the economy for
13:56
no productive purpose. So if an
13:58
economy is not growing, it is shrinking
14:00
because of the interest payment. Well, actually,
14:02
if it's not grown beyond the
14:04
inflation rate. Inflation rate, and also realistically,
14:07
you know, a business has to
14:09
be growing beyond inflation, but it also,
14:11
if, you know, if you kind of
14:13
do the mathematics on how much
14:15
is the UK debt, one times GDP,
14:18
what are we paying? Five percent,
14:20
you know, four percent, three percent? whatever
14:22
it is, 3 or 4% of
14:24
the economy is actually money that comes
14:26
out of the economy through taxes and
14:29
pays back the bondholders. So it
14:31
is actually a leak in the economy.
14:33
So, you know, you've got to
14:35
stay ahead of inflation and realistically there
14:37
is a shrinking tide. So if the
14:40
economy is not growing, we're in
14:42
real trouble. We are going backwards regardless
14:44
of what they say. Now the
14:46
flip on that is digital businesses. So
14:49
the world... as we know it
14:51
is going through a fundamental shift, right?
14:53
And it's impossible to talk about all
14:55
of these issues without acknowledging that
14:57
there is a massive fundamental change in
15:00
society. So if we zoom out
15:02
300 years, we go to the agricultural farming
15:04
system. So in the farming system
15:06
or in the agricultural age, we had
15:08
a governmental system called feudalism and it
15:11
was kings, lords, dukes, viscounts, and
15:13
it was all about land ownership. Who
15:15
owns the lands? Whoever has the
15:17
vast agricultural lands, that's the wealthiest people
15:19
in the land. And that went on
15:22
for hundreds and hundreds of years,
15:24
undisrupted. Until we had a technological shift
15:26
and the technology was industrialization
15:28
that started in Britain, and it was
15:30
this technology that said we
15:32
can't run society the way that we
15:35
used to. It took decades to hundreds
15:37
of years to change, but we
15:39
actually invented a new system called capitalism,
15:41
and in that capitalism, it didn't
15:43
matter if you're a Duke. It's like,
15:46
oh, I'm a Duke, and I've got
15:48
all this agricultural land in the
15:50
middle of England. Who cares? It's all about
15:52
having a factory, labour, if you've
15:54
got access to machinery and innovation, you're
15:57
in this industrial class of people
15:59
and now you're rocketing ahead. Now you
16:01
can imagine that there was a difference
16:03
in old money versus new money,
16:05
right? There was a, like you could
16:08
say, oh, the agricultural system is
16:10
dying. Well, the industrial system is booming. So
16:12
all of that, we know that that
16:14
happened and we also know that
16:16
it created massive amounts of inequality that
16:19
if you basically stuck with the
16:21
old system. You ended up, you know,
16:23
the Charles Dickens stories about Oliver
16:25
Twist and all these kids
16:28
that were chimney sweeps and living on
16:30
the streets and all this sort
16:32
of stuff. These were some of the
16:34
like the byproducts of that shift.
16:36
So what happened 20 years ago is
16:39
we created the digital system and
16:41
we invented completely new rules of the
16:43
economy. We invented products that can scale
16:45
to billions of users effortlessly overnight
16:47
if they want to. We invented, we
16:50
invented completely new business models where
16:52
most people get access for free, small
16:54
few people pay subscriptions. We invented business
16:56
models like what Google does, where
16:58
they give away Google Maps, and then
17:01
they monetize through a little bit
17:03
of advertising. We invented businesses that don't
17:05
really require any geographical connection in
17:07
order to be a customer. So, you
17:09
know, you can have customers anywhere in
17:12
the world. and they can be
17:14
happy customers, I can have employees everywhere
17:16
in the world. So if we
17:18
take these digital businesses that have employees
17:20
everywhere in the world, they have customers
17:23
everywhere in the world, they can
17:25
innovate something once and then sell it
17:27
a billion times, they're operating in
17:29
a very different paradigm to the pub.
17:31
Absolutely, right? But we need both,
17:33
we still need the pub. Yes, however,
17:36
if you imagine a marathon, and we're
17:38
all running in a marathon race, and
17:40
then a few of the marathon runners
17:42
get access to bicycles, and then
17:44
one person gets access to a car,
17:47
then you're going to have a massive
17:49
amount of inequality between the bicycles
17:51
and the cars versus the runners. So
17:54
you can't get out of that.
17:56
You can't say, oh, wait a second,
17:58
why are some of these people
18:00
doing incredibly well? Because they're playing a
18:02
very different game. But not everyone needs
18:05
to do incredibly well. There are
18:07
some people who just want to build
18:09
a local business. They want a
18:11
cafe or a florist. They want to
18:13
do something in their town center.
18:15
I mean, the death of the town
18:18
center to me is a sign that
18:20
the economy itself isn't working for
18:22
local businesses. Our town centre, the shift
18:24
has been quite dramatic. The
18:26
high street is now charity shops because
18:29
they don't have to pay business rates.
18:31
Yeah, barber shops, yeah, economic barber
18:33
shops, fine, people need their hair cut.
18:35
We have, yeah, we have betting
18:37
shops, we have many vape and screen smash
18:40
repair. Yeah, and we know a
18:42
lot of these are fronts for laundering
18:44
money and then a lot of places
18:46
that are just closed down. Yeah.
18:48
But that's not surprising. Given this, continue
18:51
with this digital versus analog, the
18:53
reason the High Street worked was it
18:55
was a distribution model for products and
18:57
services. But that distribution model has
18:59
been massively surpassed by digital. So it's
19:02
not because that one, that particular
19:04
issue is not necessarily because of taxes
19:06
or any of that sort of
19:08
stuff, it's because that person sits
19:10
at home ordering on Amazon and having
19:13
something delivered to their door nine
19:15
times out of ten now. They don't
19:17
wander down to the town center
19:19
and buy the things that they would
19:22
have bought on Amazon. Sure, I just
19:24
think there's still things people, people
19:26
want a cafe, they want a cup
19:28
of coffee, they want a sandwich,
19:30
they want to go out and do
19:33
things, and some of them have
19:35
been, there's this really good book that
19:37
covers the strategy of companies like Starbucks
19:39
and Subway, where they will open...
19:41
kind of equilibrium. But there is a
19:44
hurdle rate still of creating these
19:46
businesses. I'm with you on the move to
19:48
the digital economy, there are still people
19:50
who want to build local businesses
19:52
and just the hurdle rate's too hard.
19:55
Sure. I might want to build
19:57
a factory producing scissors like it's 1845.
19:59
It doesn't mean that I can
20:01
be successful at that. I hate to
20:03
say it, but just because you want
20:06
to build a quaint little cafe,
20:08
you're in competition with Starbucks. But not
20:10
only that, you're in competition with
20:12
people on their phone. You know, people
20:14
used to go to a cafe
20:16
for a bit of entertainment with a
20:19
friend, now they just FaceTime their friend
20:21
or they'll just sit there scrolling
20:23
TikTok videos. Or they used to
20:25
go for a quick coffee and then head
20:30
off and they're only there for 15
20:32
minutes now they'll sit there for
20:34
45 minutes occupying a table for three
20:36
pounds 50 because they're sitting there
20:38
glued to their phone. You know, so
20:41
all of these factors count. We
20:43
are going through this massive, this is
20:45
not a small shift, this is a
20:47
massive fundamental shift. We're going from
20:49
local to global. All businesses now have
20:52
global, you're either hyper-local or
20:54
you're global. And even smart businesses that
20:56
used to be hyper-local, they start thinking
20:59
about global. You know, they build
21:01
products that can scale, they build things
21:03
they can ship. You know, little
21:05
butcher down the road from me, he's
21:07
creating dried, what's it called, jerky,
21:09
you know, sending it out in the
21:12
post, sending it to customers all over.
21:14
So, you know, he's got the
21:16
Instagram account, all of that sort of
21:18
stuff. So... you know we are
21:20
going through this fundamental big shift that's happening
21:23
globally and it's driving all these other
21:25
trends as well. The other thing
21:27
that is happening you know we talk
21:29
about immigration is that uh... fifteen
21:31
years ago only thirty percent of the
21:34
world had fast internet access. The
21:36
ability to quickly download a photo or
21:38
a video was in the hands of
21:40
thirty percent of people who mostly
21:42
lived in developed, uh... countries. Fast forward
21:45
to today it's over seventy percent
21:47
of the world's population have access to
21:49
fast internet on a mobile phone,
21:51
on a smartphone, on a smartphone. So
21:53
what's happening is all over the world,
21:56
people are looking at Instagram, saying,
21:58
oh, I wouldn't mind living in Mayfair.
22:00
Oh, I wouldn't mind, like, have
22:02
you been to Venice lately? You can't
22:04
walk around Venice, because some influencer turns
22:07
up in Venice and says, oh,
22:09
it's the most beautiful romantic city, and
22:11
they pump that out to their
22:13
12 million followers, and suddenly, all of
22:15
a sudden all these idiots turn
22:17
up in Venice from all over the
22:20
world. This rejection of tourism as well,
22:22
which I find super interesting.
22:24
I don't know if you've been following
22:27
what's happening in Spain, but there's
22:29
been lots of protests in certain places,
22:31
I think it's like Tenerife, but they're
22:33
putting a 100% tax now on
22:35
foreigners buying properties in these locations because
22:38
I think it's part of the
22:40
Airbnb economy, which is, you know, and
22:42
the fact that you can't really
22:44
save money in the bank, the people
22:46
are just buying properties and assets and
22:49
able to monetize them, but there's
22:51
a... huge rejection of tourism in large
22:53
parts of Spain now, which used
22:55
to be the lifeblood of their economy.
22:57
Yeah, well they believe that tourism used
23:00
to be that you were from
23:02
somewhere else visiting this particular place and
23:04
experiencing their culture. Now what digital
23:06
has done is removed, like imagine a
23:08
bathtub that's like sectioned up into
23:10
five sections all with different colored waters
23:13
and you just pull the dividers out
23:15
and it all just goes boom.
23:17
I grew up in Australia. It was
23:19
absolutely, you would have had to
23:21
be seriously rich to holiday in Spain.
23:24
Like most Australians would never have
23:26
been to Spain. Spain was on the
23:28
other side of the earth, there was
23:30
very little chance. Fast forward to
23:32
today. You know, if you haven't been
23:35
to Spain by 25, you haven't
23:37
lived, man, you know, Yolo, get out
23:39
there, go to Spain. You know, most
23:41
Australians are now roaming around the
23:43
world, ticking off their bucket list of
23:46
locations, but that's everyone and everywhere.
23:48
So we're now, you know, Ibiza used
23:50
to be a cool little party
23:52
island that very few people knew about.
23:55
And now everyone in the EDM, dancing,
23:57
global... is like, oh, we've got
23:59
to do a pilgrimage to Ibiza, and
24:01
then if you go to Ibiza,
24:03
it's crazy. So it's completely lost its
24:06
local original feel. So I can understand
24:08
this difference between we have something
24:10
special and you can come and visit
24:12
it, versus this has now become,
24:14
you know, we can't even influence our
24:17
own town because all of you
24:19
people come here nonstop. I think also
24:21
we used to, you know, you used
24:23
to go to Spain and you
24:25
would stay in the hotel or eat
24:28
in the restaurants and you would
24:30
leave that money in that economy. But
24:32
if people from the UK are buying
24:34
properties in Spain and Airbnb-ing it
24:36
out. You know, it stays in the
24:39
property economy. Yeah, so the money's
24:41
going out there and coming straight back.
24:43
And it's made property too expensive
24:45
for local people. A massive issue in
24:47
the world at the moment is just
24:50
purely simply our psychology, the very
24:52
base part of our brain, wants a
24:54
place to live to call our
24:56
own and most people can't afford a
24:58
house anymore. Can't afford a house that
25:01
they want to invite their friends
25:03
over to. A lot of people can
25:05
maybe have a share accommodation or
25:07
they can buy a small flat. But
25:09
how many people... can have a
25:11
house where they can raise a family
25:14
and invite friends around. Well this massive
25:16
growth in these HMOs, which obviously
25:18
has been a great business opportunity for
25:20
property people, I think it has
25:22
been a really bad signal of the
25:25
decline of society. Yeah. Somebody with
25:27
a job and an income, used to
25:29
be able to afford a home. Yeah,
25:32
they can afford a bedroom with
25:34
a shared kitchen. Yeah, with a shared
25:36
kitchen. We're forcing adults to
25:38
live like students. Yeah. And it's just
25:40
a mass... to me, it's a huge
25:43
signal of decline. It's horrible. It's
25:45
horrible. It's horrible. Yeah, and it's, it's,
25:47
you know, we've, we've debased money
25:49
to the extent that, you know, people
25:51
have rushed to assets and, you
25:53
know, smart people who owned assets in
25:56
the beginning. Those assets have massively inflated.
25:58
A house used to cost four
26:00
times the average wage and now it's
26:02
closer to nine. For those of
26:04
you out there who want to protect
26:07
your Bitcoin, Bitcoin treasury. Now if you're
26:09
serious about protecting your Bitcoin, you
26:11
will need a rock-solid security plan. And
26:13
CASA gives you just that. With
26:15
their multi-signature security and key management services,
26:18
CASA makes it easier than ever
26:20
to take control of your Bitcoin without
26:22
ever having the risk of a single
26:24
point of failure. Now they offer
26:26
multiple levels of protection, all designed with
26:29
simplicity and ease of use in
26:31
mind. And that works. Even if you're
26:33
not a tech expert. So don't leave
26:35
your Bitcoin security to chance. Go
26:37
to casa.io and check out the services
26:40
that I am using today to
26:42
protect my Bitcoin so you can protect
26:44
your stack and sleep easily. You
26:46
can find out more at casa.io, which
26:48
is c-a-s-a.io. That is
26:51
casa.io. This episode is brought to
26:53
you by Bitcasino, the world's first
26:55
licensed Bitcoin Casino. No deposit limits.
26:57
and no waiting on withdrawals. You can
27:00
enjoy gaming the way it was
27:02
meant to be, seamless and secure. Now,
27:04
Bitcasino, you can play with Bitcoin, making
27:06
your experience faster, safer, and more
27:08
private. Plus, they offer some of the
27:11
best rewards in the industry, from
27:13
bonuses to loyalty programs, to keep the
27:15
fun going. Now, if you're looking for
27:17
a top-tier gaming experience, head over
27:19
to bitcasino.io, which is b-i-t-c-a-s-i-n-o.io, that
27:22
is bitcasino.io, and please remember
27:24
to gamble responsibly. Xapo Bank is the
27:26
world's first fully licensed and regulated
27:28
Bitcoin-enabled bank. Xapo Bank's all-in-one app allows
27:30
you to secure, transact, and grow your
27:33
Bitcoin. And you can also earn
27:35
Bitcoin daily with interest on your savings
27:37
for both BTC and USD as
27:39
well as get cash back on all
27:41
your card spending. With over a decade
27:44
of experience in Bitcoin custody, You
27:46
can trust that your assets are safe
27:48
with Xapo Bank. They blend no whole
27:50
security with military-grade Swiss bunkers and strict
27:52
regulatory oversight, ensuring your funds are
27:54
always protected. And as a member you'll
27:57
get a dedicated account manager who can
27:59
guide you through their products and
28:01
help with everything you need. If you
28:03
want to find out more please
28:05
head over to xapobank.com forward slash WBD
28:08
which is x-a-p-o-b-a-n-k.com forward slash W-B-D. We've
28:10
been looking for Connor, he wants
28:12
to get a property, it's just unachievable.
28:14
Look, my dad lent me 10
28:16
grand to buy a house, 5 grand for
28:19
a deposit and 5 grand for
28:21
the furniture, right? And I paid him
28:23
back, I think, within two years, and
28:25
it was a hundred thousand pound
28:28
property. That property was 20 years
28:30
ago, I know now that it probably costs
28:34
250,000, wages haven't increased by that
28:36
much. And the hurdle, right, for kids
28:39
to get in there,
28:41
the cost of electricity, gas. I
28:43
mean, I think if you're under 40,000
28:45
pound a year in a lot
28:47
of towns, not all towns, it's going
28:50
to be very, very hard to get
28:52
on the ladder. But there's people
28:54
here in London who are on six
28:56
figures, they can't get on the
28:58
property ladder. There's people here in London, I
29:01
know, on six figures, they're in
29:03
shared accommodation. And the only way you
29:05
can is if both of you work,
29:07
so it used to require 52
29:09
paychecks. 52 weeks of work, now it's
29:12
104 weeks of work, so you
29:14
have to have two of you, and
29:16
then you can't have kids. Or you
29:18
can't raise your kids. Or you
29:20
go through that really difficult three-year spell
29:23
where you had a kid, and
29:25
then you're moving them into a crèche,
29:27
but the crèche has eaten up
29:29
all your wage. I mean, the whole...
29:31
So here's the only thing, though. The
29:34
only thing is, for most of
29:36
human civilization, life is shit, life is
29:38
hard, right, right? So we're whinging,
29:40
but... There was only really the post-war
29:42
era that had it so good. And
29:45
what happened is that we had
29:47
the industrialization period, call it the 1800s,
29:49
and then we had two massive
29:51
wars, and then there was a period
29:53
where we got the fruits of
29:55
industrialization without the wars, and we had
29:58
a good 50 years of... post-war,
30:00
highly industrialized society
30:02
where everyone during that period
30:04
got cheap houses and got to,
30:07
you know, there wasn't enough workers and
30:09
all that sort of stuff, but it's
30:11
worth remembering. that this is not
30:13
our God-given right to have a
30:15
good life. For thousands of years,
30:18
life is shit. Life is really
30:20
terrible, really hard for most people.
30:22
For most of human civilization, the
30:24
business model of human civilization is
30:26
one or two percent of people
30:29
are kings, queens, lords, you know,
30:31
king, queen, jack. There's a few
30:33
people who are the professional class who
30:35
serve those people, and then the
30:37
vast majority falls off a cliff
30:39
and work in serfdom. And that's
30:42
the business model of humans
30:44
for most of human civilization.
30:46
So unless, like, we've got
30:48
to remember that it's on
30:50
us to make the most of
30:53
the times we're in. Our great-great-great-great-grandparents
30:58
and on and on and on would
31:00
change places in a heartbeat and say,
31:02
get out of the way you wimp.
31:04
Right, I'm gonna... Well, we
31:06
are breeding wimps. We
31:08
are training wimps. Yeah. As
31:10
soon as you go past your
31:12
grandfather or maybe great. Well as soon
31:14
as you go back a couple of
31:16
generations They would look at us and
31:18
just roll their eyes and say are
31:20
you kidding me? You've got all this
31:22
superpowers and you're not making the most
31:24
of the times you're in. So keep
31:26
in mind personal responsibility is everything, that you
31:28
can't just keep saying, oh, it should
31:31
be easier and it should be better.
31:33
I think where I'm going with this
31:35
is not that it should be easier.
31:37
I just don't want government to make
31:39
it harder. I don't want government for
31:41
no benefit at all to make the
31:43
life of everyone harder. I don't want
31:45
crony capitalism, I don't want the corruption
31:48
that we saw in the financial crisis.
31:50
Because the other thing is this kind
31:52
of financialization of everything has really just
31:54
squeezed more money to those who can
31:56
financialize everything. Well, I have a different
31:59
take. Okay, shoot. The different take I
32:01
have is that technology is playing a bigger role
32:03
than we let on. And it goes
32:05
back to this, you know, running
32:08
a marathon analogy. If
32:10
some people leverage technology, call it a
32:12
bicycle, then they are going to get
32:14
ahead way faster and they are going
32:17
to be able to do it effortlessly.
32:19
The people running a marathon on their
32:21
feet... are going to sit there exhausted
32:24
from how hard they're running and look
32:26
at the guy on the bike and
32:28
go, you're not even breaking a sweat
32:30
and you're way ahead of us. And
32:33
then if there's a car involved, they're
32:35
going to say, how the hell are
32:37
you even doing that? So I think,
32:39
like, the truth is, my life's amazing,
32:42
right? I can go anywhere I like,
32:44
buy anything I want, I can live
32:46
in my dream home, only one person
32:49
has to work, like, you know my
32:51
wife does property, but... not like she
32:53
doesn't work in a corporate or any
32:55
of that sort of stuff. So I'm
32:58
living the dream. The reason I'm living
33:00
the dream is I'm living in a
33:02
technologically enabled world. I'm running tech businesses.
33:04
So I have said, in my life,
33:07
I've said, the industrial revolution is over.
33:09
The industrial system is over. If I
33:11
try and do anything that integrates with
33:13
that system, which the government is part
33:16
of, then it's going to feel like
33:18
it's 50 years out of date. If
33:20
I do everything online. and if I
33:23
build a business online, I can live
33:25
and work from anywhere. If I don't
33:27
like it, I can easily move, I
33:29
pick up my phone and leave. I
33:32
mean, quite literally, I could literally just
33:34
get on a plane with my family
33:36
and we would just still be in
33:38
business when we arrive and everything would
33:41
be fine. If I've provided I've got
33:43
access to the internet, I'm completely fine.
33:45
I can travel around the world with
33:48
a laptop bag and that would be
33:50
it. So the key thing here is
33:52
that the government's not making our lives
33:54
harder. The dying industrial system is a
33:57
big old dinosaur that's coming to an
33:59
end and the super fast technological system
34:01
is overtaking and doing really, really well.
34:03
And the reason people are doing it
34:06
tough is they want to do things
34:08
the way they were trying to do
34:10
it in school, because school prepared them
34:13
for a world called the industrial age,
34:15
and we're now living in a digital
34:17
age. And we're at a crossover point.
34:19
We haven't caught up. Some people haven't
34:22
caught up. Some people are in the
34:24
car already. So the car, in this
34:26
metaphor, the car represents, you've got a
34:28
business that is AI enabled, you've got
34:31
a cloud-based, SaaS-based... So if you've got
34:33
a software as a service business with
34:35
AI running a whole bunch of stuff,
34:38
you've got a small 30, 40 person
34:40
team, multi-million revenue, high margins, and it
34:42
all runs in the cloud, and AI
34:44
does half the shit, you are in
34:47
the car. The bicycle represents you're on
34:49
social media, you're a content creator, not a
34:51
content consumer, you collect an email database,
34:53
you've got, you can... harvest data, you've
34:56
got a scalable product that you can
34:58
deliver online, whether it be your services
35:00
coaching over Zoom or something like that,
35:03
or actually something that people can download
35:05
or something you can put in the
35:07
mail and ship it, you're on the
35:09
bicycle. And running is all the businesses
35:12
that anyone from the industrial age would
35:14
understand. If my grandfather could understand your
35:16
business by just looking at it, then
35:18
unfortunately you're going to be doing it
35:21
tough. Okay, I guess within that there's
35:23
an acceptance there are people who want
35:25
to be in the car. There are
35:27
people who just naturally want to be
35:30
an entrepreneur. They have an ambition, whether
35:32
it's success or financial success, and there
35:34
are some people who are happy to
35:37
be a passenger in that car. And
35:39
there are some people who just want
35:41
to run. We will have a mixed
35:43
economy. You're talking to the people with
35:46
the ambition, the entrepreneurs, the ones who
35:48
might want to do it, or you
35:50
might be able to inspire to do
35:52
it. Well, I'm kind of just trying
35:55
to... Because it can't be everybody. I'm
35:57
just trying to make it clear. This
35:59
is what's going on. I'm not saying
36:02
it's right or wrong. I'm saying this
36:04
is just what is. This is the
36:06
reality. This is the reality of it.
36:08
It's not some giant conspiracy. Technology has
36:11
just moved the game and the goalposts.
36:13
And if you're playing an old game.
36:15
The reason that
36:17
we don't feel comfortable with it is
36:20
because we all went through a schooling
36:22
system that was built from the 1800s
36:24
to teach us how to be factory
36:27
workers. And no man, obviously sometimes I
36:29
would say probably, no certainly my son
36:31
has learned more, you should answer for
36:33
himself in three months working than I
36:36
did ever in school. Three months working.
36:38
Yeah. And I did a year at
36:40
uni. Yeah, why would you say that
36:42
is? What's the difference? It's just real
36:45
world application. I remember one time in
36:47
math class we were doing the trigonometry
36:49
and I even went to my math
36:52
teacher. I was like, sir, when am
36:54
I ever going to use this in
36:56
my life? Like I'm not. Yeah, sure.
36:58
And we will need engineers and such.
37:01
Yeah, and also, look, footballers do push-ups
37:03
and sit-ups, which they'll never use in
37:05
a game, but it strengthens them, right?
37:07
So it actually, it's an exercise that
37:10
helps you to expand your intelligence. But
37:12
with that said, the whole system... is
37:14
built to create a world that doesn't
37:16
really exist terribly much. The reason that
37:19
you're learning so much in this environment
37:21
is we're creating digital assets, we're scaling
37:23
digital assets, we're communicating to a global
37:26
audience, we've got a product or service
37:28
that can monetize online. So you're actually
37:30
in a very, very different environment. If
37:32
someone came to me and said, oh
37:35
look, I really just want to work
37:37
in a sewing factory, they don't exist
37:39
anymore, right? in a globalized world they
37:41
you know they're not here in the
37:44
UK so if someone said I really
37:46
you know I really want to be
37:48
a tailor unfortunately that that game is
37:51
out you know or I really want
37:53
to be a horse buggy you know
37:55
horse saddle maker okay but there's only
37:57
like a very small number of saddles
38:00
that need making every day so the
38:02
the real thing is that if you
38:04
want to really
38:06
interrogate why things have changed, or one
38:09
of the biggest reasons things have changed,
38:11
it's geography versus non-geography operating system. So
38:13
geographically operating system means that your business,
38:16
95% of your clients are within a
38:18
five to 10 mile radius. That would
38:20
be a geography-based business. An internet business,
38:22
95% of your clients are not within
38:25
a 10 mile radius. Probably high, right?
38:27
Probably, yeah, most of them. It's an
38:29
accident if they are. When we look
38:31
at governments, what's the first word in
38:34
the government? The geography, right? So it's
38:36
the British government. And that British means
38:38
the British island, the island of Britain,
38:41
Britannia. You know, it's the London City
38:43
Council, the geography. It's the United States
38:45
government. Geography. So the whole game of
38:47
government is geographical borders. So geography is
38:50
their big, that's their big thing. Whereas
38:52
the digital economy doesn't even respect geography
38:54
at all, doesn't care about geography. Geography
38:56
is a non-event, once you're in a
38:59
digital business. You know, half my team
39:01
are all over the world and I
39:03
don't even know where they are. I
39:06
own companies that I've never physically been
39:08
to. So I own a couple of
39:10
companies and I've never set foot in
39:12
the office. Amazing. You know, so the
39:15
government's playing a game by definition that
39:17
they can't win. Yeah, I mean, look,
39:19
there's certain things that haven't caught up.
39:21
And it's most of the stuff around
39:24
government, to be honest. But if I
39:26
was to ask you, I mean, I've
39:28
got two kids, one here, one still
39:30
at school, yeah, if I asked my
39:33
daughter at the end of the day,
39:35
what did you do at school today?
39:37
It will start with, yes, we had
39:40
math, double physics, biology, religious education, it's...
39:42
Blocks of 40 minutes. Yeah. Learning the
39:44
stuff they did. Yeah. Very rarely. I
39:46
think even less than I was there.
39:49
You know, there's no real. We used
39:51
to have this class I joined that
39:53
was like after school and it was
39:55
called Young Entrepreneurs, with a
39:58
great little business I don't even know
40:00
if those things exist anymore. It is
40:02
sport and it's compartmentalized, yeah, the economics
40:05
was only available or business studies for
40:07
the final two years and so it
40:09
doesn't really exist and I know if
40:11
I said to you right now you're
40:14
going to be running a school tomorrow
40:16
but you could write the curriculum. I
40:18
know what you're going to, I know
40:20
because I know you're similar to, I
40:23
know you're going to write an entirely
40:25
different curriculum. An entirely different approach. Yeah.
40:27
So like for example. You're going to
40:30
produce, sorry to interrupt, you're going to
40:32
produce for this digital age where previously
40:34
we're producing for the Victorian industrial age.
40:36
Yeah. So a key differentiator would be
40:39
that in the Victorian industrial age, we
40:41
want you to be component labour. And
40:43
component labor basically means you're a cog
40:45
in a machine and you can fit
40:48
in any machine. So if I take
40:50
you, I say, oh, we're going to
40:52
turn you into a lawyer. And as
40:55
a lawyer, you'll be able to fit
40:57
into this law firm, this law firm,
40:59
this law firm, this law firm, or
41:01
into this factory that needs a lawyer,
41:04
or into this distribution center that needs
41:06
a lawyer, but you're a lawyer-shaped cog
41:08
and you can fit into any of
41:10
those machines. And that's component labor. They're
41:13
attention seekers, they're disruptors, they don't really
41:15
fit. You ask them, oh, you know,
41:17
what do you do? Well, I do
41:20
a bunch of stuff. I do, I
41:22
have a football club and a pub
41:24
and a podcast, right? And it's like,
41:26
right? And it's like, oh, you're a
41:29
weird shaped cog. There's not a lot
41:31
of people I meet like you. Doesn't
41:33
make sense. And that's actually what we
41:35
need to train people to be. So
41:38
the schooling system has to say what
41:40
we're not, we're no longer trying to
41:42
create, we're trying to create completely unique
41:44
individuals. So you might say to a
41:47
child, you know, what are you interested
41:49
in? They're like, I'm really interested in
41:51
Concorde. The plane, yeah, I'm really interested
41:54
in that. Okay, great. So what I
41:56
want to do is we want to
41:58
actually see what would it take to
42:00
get Concorde flying again, we want to
42:03
say what would it cost, how much
42:05
was it, where are the current Concordes,
42:07
how many of them are there, how
42:09
much fuel did Concorde require in order
42:12
to fly, blah blah blah, let's come
42:14
up with a whole bunch of things.
42:16
Now inside of all of that you're
42:19
going to have geography, maths, English, you're
42:21
going to have all these different disciplines,
42:23
but what they're actually doing is they're
42:25
running deep on a passion, they're running
42:28
deep on something, and they're coming out
42:30
with knowledge, they're coming out with wisdom.
42:32
You know, in the process of learning
42:34
about Concorde, you're going to learn about
42:37
France, you're going to learn about England,
42:39
you're going to learn about the USA,
42:41
you're going to learn about how much
42:44
distance there is between New York and
42:46
London. So all of these things you'll
42:48
learn. That's what it is, problem solving. Problem
42:50
solving? There's actually a school in... Richrose,
42:53
yeah, yeah, I think it's run by
42:55
James, you might want to fact
42:57
check that, Connor. It's in Los Angeles.
42:59
Home schooling is becoming very big. Well,
43:02
this isn't home schooling. So this, I
43:04
can't, yeah, I'm going to try and
43:06
remember the details, but as far as
43:09
I'm aware, per term a kid picks
43:11
a subject. And it could be anything.
43:13
You can say Italian cooking. France. The
43:15
Roman Empire. And you spend your entire
43:18
semester, your entire term, studying that subject,
43:20
and that subject only. I don't know
43:22
how they guide them, what the output
43:24
is at the end, but, because we
43:27
don't, you know, when you have kids,
43:29
you can try and influence them. I
43:31
tried to influence my son into my
43:33
music, he doesn't like my music for
43:36
shit. For some reason, he likes drill,
43:38
and I like heavy metal. I tried
43:40
to do the same with daughter, she
43:43
likes some of my music. But she
43:45
likes other stuff. Yeah. And you know,
43:47
I dress a certain way. Look, they
43:49
have thoughts and ideas, which I influence
43:52
on them. They have thoughts and ideas
43:54
in terms of political ideas, because I
43:56
get to... But taste, it's hard to
43:58
influence too much. And so whilst I
44:01
wanted to run a football club and
44:03
now have a podcast, Connor might want
44:05
to be an artist. Yeah. Or Scarlet
44:08
might want to do something else. And
44:10
the great thing about this school is
44:12
you take these kids in. I like
44:14
this. Let's foster that. It's called Muse
44:17
Global. What does it say about it?
44:19
Is there anything any details on it?
44:22
Muse Global is an internationally
44:24
recognized warming school, high performance learning
44:26
lab that focuses on eco literacy
44:29
and serves as a beacon of
44:31
sustainable living. Yeah, but that's all
44:33
they're vegan and they have like
44:35
a little. I like high performance
44:38
learning lab. Yeah, yeah. You see,
44:40
go on it. One of the
44:42
things that's wild is that with
44:44
AI, you can train AI, to
44:46
have ongoing conversations with a learner
44:49
and they can run down that
44:51
rabbit hole. So. Let's say you
44:53
pick Italian cooking, you can have
44:55
an AI that basically talks all
44:57
day about Italian cooking and then
45:00
also asks the question, like how
45:02
much olive oil goes into
45:04
this recipe? And if I've only
45:06
got a 30 mil measuring cup
45:09
and a 5 mil, what would
45:11
be the fastest and best way
45:13
to get, you know, 25 mil?
45:15
Yeah, go back to the top,
45:17
God, I just want to read
45:20
what it says. Right. Nice environment.
45:22
Located, yeah it's beautiful and they,
45:24
I think it's vegan and I
45:26
think, you know, which I don't
45:29
entirely agree with, but, and they
45:31
also have, they tend to the
45:33
land, I think they grow crops
45:35
and things, but located in Calabasas,
45:37
California, Muse Global is an innovative,
45:40
holistic private school offering, in-person education
45:42
for preschool, blah blah blah... the
45:44
whole child by going beyond academics,
45:46
connected with students by teaching communication,
45:49
self-efficacy and sustainability through passion-based passion,
45:51
that's it, passion-based learning. Keep, keep
45:53
scrolling down. Keep going, keep going,
45:55
keep going, keep going. Yeah, passion-based.
45:57
See the kid there with the
46:00
guitar? I mean, look, the idea
46:02
is beautiful, but unfortunately it's private.
46:04
It probably costs an absolute fortune
46:06
and it's for the Hollywood elite.
46:09
Yeah, more and more people are
46:11
homeschooling. Yeah, because if you homeschool,
46:13
you can travel when you want
46:15
to travel, you can go and
46:17
stay in an Airbnb for six
46:20
months. People with digital businesses, it's
46:22
become very, very common to find
46:24
homeschooling communities. I've tried out a
46:26
few of the online education AI
46:28
tutors, it's scarily good, like scary
46:31
good. I mean you can just
46:33
literally talk to a computer all
46:35
day and it will guide you
46:37
through learning. AI is scary like
46:40
that. I do this driving. Got
46:42
to laugh about it. I'll be
46:44
driving along and I'll put on
46:46
my phone. I'll be like, tell
46:48
me about the fall of the
46:51
Roman Empire. How does that
46:53
compare to what's happening today? We
46:55
just have this ongoing conversation. It's
46:57
wild. Yeah, I did the same.
46:59
You know, Donald Trump just mentioned,
47:01
he liked President William McKinley, and
47:03
I just asked AI, what are
47:05
the similarities between McKinley and Trump?
47:07
And it gave me a list
47:09
of like seven things that the
47:11
two of them have in common.
47:13
The weirdest thing with AI recently
47:15
is, I would say when I
47:17
first started playing with... AI and
47:19
ChatGPT probably a year and a
47:21
half, two years ago, it was
47:24
like question answer. I now know,
47:26
I can see it's linking from
47:28
previous things I've asked it and
47:30
it's starting to get to know
47:32
me. So I'll ask it certain
47:34
things where I'm not mentioning in
47:36
other parts and it'll bring into
47:38
things that I've talked about it
47:40
previously. You can directly ask it
47:42
based on what you know about
47:44
me. What do you think of
47:46
my biggest shortcomings? And it will
47:48
tell you. Yeah. Yeah. ChatGPT has
47:50
died at the time we need
47:52
it. We're going to come back
47:54
to that. Yeah. I mean, how
47:56
much, when you talk about this,
47:58
yeah, we went from agriculture to
48:00
industrialization to digital. Do you consider
48:02
AI just part of the digital
48:04
or do you consider this almost
48:06
separate thing? Well, the Industrial Revolution
48:08
went through a few iterations. So
48:10
we had like coal-based heating and
48:12
pumps and then we generated electricity
48:14
and then we generated, you know,
48:16
compute power. So there was these
48:18
iterations. What I think we've done
48:20
is we think we're advanced technology
48:23
now, but actually the last 20-25
48:25
years was just kind of laying
48:27
the basic railroad tracks of the
48:29
digital age. Interesting. And actually we're
48:31
about to go into a very
48:33
big shift. I mean, consider that
48:35
in three years time, AI will
48:37
be smarter than all humans at
48:39
all things. There will be nothing
48:41
that... you would hire a human
48:43
to do versus an AI if
48:45
you could. You might have a
48:47
few humans directing AI, you might
48:49
have a few humans overseeing stuff,
48:51
but there will be no such
48:53
thing as a lawyer that is
48:55
better than a chat GPT lawyer.
48:57
There'll be no such thing as
48:59
a doctor that's better at diagnosing
49:01
than an AI. I wonder what
49:03
that actually means. Does it still
49:05
mean we have lawyers arguing or
49:07
we just have a judge AI
49:09
that just knows the outcome? Yeah,
49:11
could have humans in the loop
49:13
making like confirming that everything ran
49:15
smoothly that the AI received all
49:17
the information like a prompt judge
49:19
or something like that a judge
49:22
who judged the prompts. You know,
49:24
the super smart
49:26
AI will theoretically just ask for
49:28
a lot of information ask a
49:30
bunch of questions and you know,
49:32
it'll come back to you with
49:34
you know, with
49:36
judgment, but but I think If
49:38
you were to have two people
49:40
who were doing a deal, you
49:42
would just have AI running in
49:44
the background and at the end
49:46
of the conversation it would just
49:48
have a heads of terms that
49:50
was automatically... created based on what
49:52
was discussed and then you'd ask
49:54
it for edits and contracts to
49:56
be produced and then ask it
49:58
for a to do list of
50:00
what do we need to do
50:02
in order to get the deal
50:04
done and then you'd say to
50:06
an agent, can you update Companies
50:08
House, that that's all gone ahead
50:10
that we're you know that we're
50:12
making that transaction happen. So what
50:14
happens to us? Yeah it's a
50:16
different paradigm so there's a negative
50:18
paradigm and there's a positive paradigm
50:20
so the negative paradigm is... Consider
50:23
this, right? This is pretty far
50:25
out. But if you think about
50:27
the agricultural age, almost all the
50:29
value in society was created by
50:31
artificial intelligence. Actually, not artificial intelligence,
50:33
natural intelligence. Natural intelligence means you
50:35
put a seed in the ground
50:37
and then you don't know how
50:39
it happens, but it magically grows
50:41
into a tree and that magically
50:43
produces apples and then you pick
50:45
the apples and you have food.
50:47
So we called that god. And
50:49
what did humans do? Plant seeds
50:51
and pick fruit, right? And we
50:53
kind of just waited for nature
50:55
and God to do its thing
50:57
and we lived around natural intelligence
50:59
all the time, but we didn't
51:01
have much. We assumed that we
51:03
were dumb in relationship to God.
51:05
So we basically entertained ourselves with
51:07
other things while God was doing
51:09
what it was doing, right? We
51:11
knew that we couldn't speed up
51:13
the process of growing apples. It
51:15
was going to take however long
51:17
it took. But essentially, imagine soil
51:19
is artificial intelligence and that essentially
51:22
it's the magical thing that produces
51:24
all the things. and we just
51:26
have to put little prompts in
51:28
called seeds and then wait until
51:30
it comes up with the answer.
51:32
So it is AI, interesting. Right,
51:34
so in that scenario, the downside
51:36
is that a few people owned
51:38
all the soil and they were
51:40
the Kings, Queens, Lords, Dukes and
51:42
then everyone else was just a
51:44
serf passing time and hoping that
51:46
the Kings, Queens, Lords, Dukes would
51:48
give them enough to survive from
51:50
the soil that would come from
51:52
it. So subsistence living came about.
51:54
So essentially, with the AI,
51:56
in the digital feudal system,
51:58
the neo-feudal system, you will have
52:00
a group of people who own
52:02
the AI continents, the digital soil,
52:04
and those people will have vast,
52:06
vast, vast wealth like Louis V
52:08
or whatever, and then the rest
52:10
of us will just... basically enjoy
52:12
enough for subsistence. Now, what does
52:14
subsistence look like? Well, it means
52:16
all the food, medicine, and knowledge that
52:18
you could possibly want. It means
52:21
all the games that you could
52:23
entertain yourself with, because all of
52:25
that comes out of the soil,
52:27
the digital soil. So... Bread and
52:29
circuses. Yeah, so you end up
52:31
with a scenario where, you know,
52:33
by the way, this is not
52:35
unfamiliar. This is 10,000 years of
52:37
civilization other than when we created
52:39
a middle class. But the vast
52:41
majority of people own nothing and
52:43
a tiny number of people own
52:45
everything. That's kind of how that
52:47
potentially worked. Another scenario. This is
52:49
the positive or the negative one?
52:51
That's the negative one. Okay. Yeah,
52:53
the positive one. Why would you
52:55
say that's negative? Because there is
52:57
an idea within that that, you
52:59
know, I'm pretty sure. back in
53:01
the time when the agriculture era,
53:03
it was enough food that could
53:05
be produced for everybody. But with
53:07
this kind of hyper-industrialization off the
53:09
back of the wealth of AI,
53:11
we could have an abundance of
53:13
food. And if we crack this...
53:15
We'd have an abundance of everything.
53:17
Yeah, if we have an abundance...
53:19
Robots would grow all the food
53:22
and Michelin Star chefs would prepare
53:24
it. Robot chefs would be Michelin-star
53:26
level chefs. Every meal you ate
53:28
would be one of the best
53:30
meals you've ever eaten. all the
53:32
entertainment would be customized to you.
53:34
So for example, if you wanted
53:36
to see episodes of The Simpsons
53:38
featuring you, your friends, and your
53:40
life, that could easily be created.
53:42
If you wanted to have 10
53:44
episodes, if you really enjoyed a
53:46
Netflix show and you wish it
53:48
didn't end and you said, actually,
53:50
can I just get 10 more
53:52
episodes of that? It could just
53:54
write it and produce it. So
53:56
we have an abundance of everything.
53:58
Everything. Everything. Every need is fulfilled. Yeah.
54:00
That's negative because... It's negative to
54:02
our brain that says we have
54:04
to own a house and own
54:06
a thing and have a job.
54:08
If you imagine there was a
54:10
time, if there was a time
54:12
where the farming went really well
54:14
and the land was producing enough
54:16
for everybody, then humans tended to
54:18
do things like throw festivals and
54:21
we used to do things like
54:23
celebrate weddings and we used to
54:25
do things like have really strong
54:27
communities. And humans actually didn't work
54:29
the way they work now in
54:31
a very compartmentalized way. We were
54:33
very integrated and we used to
54:35
think a lot about bringing people
54:37
together and all of those sorts
54:39
of things. Celebrating seasons and having
54:41
religion and all of that.
54:43
It was
54:45
all community-based and the farming did
54:47
most of the work, the soil
54:49
produced most of the stuff. But
54:51
where we are today we have
54:53
a mindset that we all need
54:55
a job and we all need
54:57
a house and we need to
54:59
live on our own terms and
55:01
be very disconnected from each other.
55:03
relative to what we think we
55:05
should have, then that will feel
55:07
negative to some people that a
55:09
small group of oligarchs own absolutely
55:11
everything and everyone else lives off
55:13
the digital proceeds of the soil,
55:15
the digital soil. So the goals
55:17
of the Kings and Queens of
55:20
the AI era will be to
55:22
ensure there isn't a revolution. Yeah,
55:24
and potentially they will be guided
55:26
by the AI just as much
55:28
as anyone else. It just happens
55:30
to be that they, you know,
55:32
they live in a different class
55:34
of people. If you take the
55:36
pack of cards, the deck of
55:38
cards, it was based around that
55:40
scenario. You had Jack Queen and
55:42
King representing virtuous power, military power,
55:44
and trading power, and then you
55:46
had the faceless masses one to
55:48
ten, and the ace, so the
55:50
ace represents the lowest of society,
55:52
and the ace can overthrow the
55:54
king. So, yeah, I've never heard
55:56
this. You don't know much about
55:58
the pack of, the cards? The
56:00
pack of cards is super- It's
56:02
a society. It's, it's a map.
56:04
for how to run society. So
56:06
you've got the four suits represent
56:08
the four seasons and the four
56:10
ways of thinking. So the four
56:12
thinking styles is big picture represented
56:14
by clubs, detail hard work represented
56:16
by spades, people orientated represented by
56:18
hearts, detail and refinement orientated represented
56:21
by diamonds. You've got the layers
56:23
of society, you've got the three
56:25
different types of power which is
56:27
dominance, virtue and trade. or the
56:29
three types of institutions. 364 is
56:31
what they all add up to,
56:33
plus the Joker is 365, so
56:35
you've got the year, then you've
56:37
got 13 weeks per season, 13
56:39
cards per suit. This is wild.
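As a quick aside on the arithmetic mentioned here (a hedged illustration added for clarity, not part of the conversation): counting ace as 1 up to king as 13, each suit sums to 91, the four suits to 364, and one joker makes 365.

```python
# Sketch of the card-deck arithmetic described above (assumes ace=1 ... king=13).
suit_total = sum(range(1, 14))      # 1 + 2 + ... + 13 = 91
deck_total = 4 * suit_total         # four suits -> 364
with_joker = deck_total + 1         # plus the joker -> 365

print(suit_total, deck_total, with_joker)  # 91 364 365
assert with_joker == 365
```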
56:41
Who invented the pack of cards?
56:43
It goes back to the Chinese.
56:45
And then there's black and red
56:47
representing masculine feminine energy. So if
56:49
you take... masculine energy is the
56:51
two black suits which is strategy
56:53
and hard work and feminine energy
56:55
is community and refinement and it's
56:57
just the coolest thing I've heard in
56:59
ages. I've had no reason to
57:01
go down that rabbit hole it's
57:03
a great rabbit hole yeah wow
57:05
yeah okay they so I love
57:07
that idea that the ace is
57:09
the one or the 11 yeah
57:11
it can overthrow the king it's
57:13
at the bottom of society but
57:15
if you push it too low
57:17
it's at the bottom of society
57:20
but if you push it... revolution.
57:22
That's amazing. Yeah, I love that.
57:24
They created those kind of games
57:26
as training tools for the elite
57:28
as to how to understand society.
57:30
So chess and the pack of
57:32
cards are elite training tools simulations.
57:34
Basically, they're simulation games to simulate
57:36
how to run a society. That's
57:38
amazing. Yeah. So okay, so that's
57:40
the negative one and it's negative
57:42
because... Even with abundance we might
57:44
not feel like we have autonomy
57:46
or control. Is this where we
57:48
probably end up in The Matrix or Ready
57:50
Player One? Yeah, Ready Player One
57:52
type thing because the only fun
57:54
thing to do that gives you
57:56
meaning is to be part of
57:58
an... online game community. It's kind
58:00
of like my cat. My cat
58:02
lives in the house, doesn't know
58:04
where all the food comes from,
58:06
doesn't know where everything happens, but
58:08
it roams around having a good
58:10
day. coming up with like little
58:12
things to chase. It's completely unaware
58:14
of all the things that go
58:16
into running the house, doesn't know
58:18
what Google is or calendars, it
58:21
doesn't know how to run a
58:23
business, but it's the beneficiary of
58:25
a very complex system. Yeah, but
58:27
occasionally it gets bored and goes
58:29
off to murder someone. Murders a
58:31
mouse, right? Yeah, it does some
58:33
base level instinct things, but it's
58:35
living in a wildly abundant environment
58:37
for a cat, and it has
58:39
no idea how that was created.
58:41
But it's still having a good
58:43
time. Yeah. The other slightly different
58:45
scenario is if you study very
58:47
wealthy multi-generational families who have big
58:49
trust funds and they essentially have
58:51
a trust fund where the wealth
58:53
creation is automated. And then the
58:55
challenge... for a young person born
58:57
into a very very wealthy family
58:59
is what will I do with
59:01
my life of meaning because I'm
59:03
allocated all these resources but I
59:05
don't have to work if I
59:07
don't want to. So you know
59:09
you look at what trust fund
59:11
kids tend to do. Some of
59:13
them become alcoholics and some of
59:15
them get into drugs. Some of
59:17
them get into hedonism. Some of
59:20
them get into philosophy and education.
59:22
Some of them choose to work
59:24
because they find meaning in the
59:26
work. Some of them go to
59:28
endless festivals and they're going, you
59:30
know, this month we're in a
59:32
festival in Milan and next month
59:34
they're in a festival in, you
59:36
know, USA and then they're off
59:38
to Mexico for another festival. So
59:40
essentially, you know, we will be
59:42
like trust fund kids, where everyone
59:44
is a various, like... trust fund
59:46
kid recipient. So we'll have an
59:48
entire population of arrogant dicks. The
59:50
worst of us could come out.
59:52
Yeah, because you don't have to
59:54
really refine against society. Yeah, because
59:56
one of the problems is that
59:58
to fit into society, you have
1:00:00
to learn social skills, you have
1:00:02
to learn certain skills, you have
1:00:04
to learn certain things, you have to
1:00:06
be taught a way to navigate society,
1:00:08
navigate people, navigate work. If you
1:00:10
don't have to do that, what
1:00:12
does it do to you? Well,
1:00:14
this is the thing. You know,
1:00:16
when I go to church occasionally,
1:00:19
I find myself sitting next to
1:00:21
a woman who emigrated from the
1:00:23
Philippines and has lived here for
1:00:25
30, 40 years. She cleans houses
1:00:27
and she's perfectly happy cleaning houses
1:00:29
and all that sort of stuff.
1:00:31
And we share a conversation and
1:00:33
we have a cup of tea
1:00:35
together and we talk and I
1:00:37
hear a bit about her life,
1:00:39
she hears a bit about my
1:00:41
life, but we have to get
1:00:43
along. The internet has done this
1:00:45
thing where... I don't have to
1:00:47
get, if I live my life
1:00:49
online, I can only hang out
1:00:51
with people who believe what I
1:00:53
believe and hang out with people
1:00:55
who are on my team. I
1:00:57
don't have to think about other
1:00:59
people terribly much. What we used
1:01:01
to have in like community-based, geographically-based
1:01:03
societies is we used to have
1:01:05
things like church, where you had
1:01:07
to sit next to someone who
1:01:09
didn't agree with everything that you
1:01:11
said. You had to learn how
1:01:13
to understand what their point of
1:01:15
view was. Because your cousin might
1:01:17
marry their daughter or, you know,
1:01:20
such and such so it's all
1:01:22
interconnected you have to get along
1:01:24
and we've lost a bit of
1:01:26
that. So the positive version...
1:01:28
The positive version, yeah, the positive version.
1:01:30
Is the trust fund
1:01:32
kid kind of thing the
1:01:34
positive? Well, so is it where
1:01:36
you have agency, you
1:01:38
have choice, but you choose to
1:01:40
live? Yeah, the positive version is
1:01:42
that you've got loads of
1:01:44
resources. And the only thing you
1:01:46
have to do is choose what
1:01:48
you want to do with all
1:01:50
of those resources. Now if we
1:01:52
had a schooling system that was
1:01:54
extremely good at preparing you for
1:01:56
the burden of enormous... resource, then
1:01:58
you would probably get a society
1:02:00
that's incredibly functional. But you have
1:02:02
to have a schooling system that
1:02:04
prepares you for what you're about
1:02:06
to receive. So, you know, when
1:02:08
a trust fund kid is raised
1:02:10
well, they take on the responsibility
1:02:12
of their family fortune, you know,
1:02:14
with responsibility and with, you know,
1:02:16
great vision for it, and, you
1:02:19
know, they're a good, it's a
1:02:21
good thing. If they're not raised
1:02:23
to receive that, then it can
1:02:25
be a very difficult thing. So
1:02:27
it's all about are you raised
1:02:29
to receive that level of resource.
1:02:31
But we're getting pretty philosophical, but
1:02:33
the idea being that we are
1:02:35
going to live in a world
1:02:37
where not many people have to
1:02:39
do the types of jobs that
1:02:41
we think of, any more, it'll
1:02:43
be a very different spin on
1:02:45
things. So a lawyer will not
1:02:47
be someone who drafts all the
1:02:49
contracts. That will be done by
1:02:51
AI. A lawyer will be someone
1:02:53
who coaches you through the process
1:02:55
and helps you to think through
1:02:57
what are the more... You know
1:02:59
the bigger objectives and who else
1:03:01
needs to be in the room
1:03:03
and what kind of stakeholders do
1:03:05
we need to get aligned? And
1:03:07
how would we align those needs?
1:03:09
And it's like a lawyer will
1:03:11
be more of the coach, talking
1:03:13
you through their specialty, you know,
1:03:15
with their knowledge of how
1:03:18
deals come together and all the
1:03:20
stuff that lawyers used to do
1:03:22
most of the time that'll just
1:03:24
magically happen. What would be interesting
1:03:26
in that scenario is how
1:03:28
laws are established or refined because
1:03:30
Contract law is contract law, that's
1:03:32
fine. We have agreement, you and
1:03:34
I want to do a business
1:03:36
deal, we have agreement on the
1:03:38
terms, I will supply X or
1:03:40
Y, yada, and if one of
1:03:42
us breaks the terms of the
1:03:44
agreement, now we can sue the
1:03:46
other person, we can go to
1:03:48
law and we can argue our
1:03:50
case. But there are other areas
1:03:52
of law, not just contract law.
1:03:54
For example, say at the moment
1:03:56
in the UK, it is illegal
1:03:58
to take drugs, most drugs. And
1:04:00
if you're caught in possession, you
1:04:02
can be arrested, you can be
1:04:04
caught dealing. But the laws might
1:04:06
change because AI knows... we discuss
1:04:08
with the AI, the idea of,
1:04:10
are these laws just? Is marijuana
1:04:12
damaging for society, is it net
1:04:14
damage? So we might reestablish the
1:04:16
laws with the AI. Yeah, and
1:04:19
we're definitely not talking in the
1:04:21
next few years, we're talking further
1:04:23
out, right? But we could also
1:04:25
very much have systems that have
1:04:27
a lot more nuance. So I'll
1:04:29
give you a different example. There
1:04:31
are AI systems that monitor elderly
1:04:33
people inside their home. and they
1:04:35
can detect slight changes in speech
1:04:37
pattern, microstumbles, micro movements that indicate
1:04:39
they're in decline and can notify
1:04:41
people. So you can actually have
1:04:43
someone who their rate of speech
1:04:45
declines by 5% over the course
1:04:47
of a few months and the
1:04:49
AI says the rate of speech
1:04:51
is declining. So a human might not
1:04:53
pick that up, but an AI
1:04:55
will pick that up. So you
1:04:57
can have these systems that detect...
1:04:59
nuance like we don't have.
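Purely as an illustration of the kind of check being described (an assumed, simplified sketch, not any real elder-care product), a system only needs a running history of speaking rate to notice a slow decline that a person would probably miss:

```python
# Illustrative sketch only: flag a gradual decline in speaking rate across sessions.
from statistics import mean

def speaking_rate_declined(rates_wpm, threshold=0.05, window=5):
    # Compare the average of the earliest `window` sessions with the latest
    # `window`; return True if the relative drop exceeds `threshold` (e.g. 5%).
    if len(rates_wpm) < 2 * window:
        return False  # not enough history to judge a trend
    early = mean(rates_wpm[:window])
    recent = mean(rates_wpm[-window:])
    return (early - recent) / early > threshold

# Hypothetical words-per-minute measurements over a few months of conversations.
history = [132, 130, 131, 129, 130, 126, 125, 123, 122, 121]
print(speaking_rate_declined(history))  # True: just over a 5% drop
```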
1:05:01
So for example, it might just be
1:05:03
that some people can take drugs
1:05:05
and it's a perfectly normal part
1:05:07
of their life, and the AI
1:05:09
system detects that this is not
1:05:11
a problem for you, keep doing
1:05:13
what you're doing. And you might
1:05:15
have people who, when they take
1:05:18
drugs, the AI system says, this
1:05:20
is a really bad idea for
1:05:22
you. You've got addictive personality, you've
1:05:24
got this, you've got that. You
1:05:26
know, it takes you three or
1:05:28
four days to recover. You're not
1:05:30
yourself. You know, you don't, I've
1:05:32
noticed that when you do this,
1:05:34
this happens. So you can have
1:05:36
different, there's a lot more nuance
1:05:38
that happens in a world of
1:05:40
AI. We need the AI not
1:05:42
to die like this though; there'll
1:05:44
need to be redundancy, or
1:05:46
we'll just stop when it does.
1:05:48
Yeah. We've made decisions. I mean,
1:05:50
we started to see that the
1:05:52
AI being kind of radically introduced
1:05:54
into healthcare now. I think I
1:05:56
was reading this week about oncology
1:05:58
and how AI is able to
1:06:00
just outperform, marginally better, than
1:06:02
a human. Massively. Not marginally, massively.
1:06:04
Oh, the one I saw was
1:06:06
a marginal increase, but it was...
1:06:08
very early scans, but it was
1:06:10
picking up some like 5% more
1:06:12
indicators than like the human eye
1:06:14
would. There's AI mixed with even
1:06:16
quantum computing, and if you have
1:06:19
AI and quantum computing, then you
1:06:21
can essentially take a blood test
1:06:23
and find out all the things
1:06:25
you've got going on or not
1:06:27
going on. We're heading to a
1:06:29
world where. you know we can
1:06:31
we'll be able to detect things
1:06:33
very very early you know our
1:06:35
health data will you know we
1:06:37
might wear you know an apple
1:06:39
watch that actually plugs into a
1:06:41
broader AI system that gives a
1:06:43
new context as to why you're
1:06:45
having certain symptoms. Yeah, you
1:06:47
know, the thing would be, will we be necessary?
1:06:49
Pardon? Will we even be necessary?
1:06:51
well this is the other thing
1:06:53
a doctor will probably be someone
1:06:55
who coaches you through that process
1:06:57
A nurse will be someone who
1:06:59
cares for you through the process.
1:07:01
All the serious stuff happens through
1:07:03
AI and robots and those sorts
1:07:05
of things. But the human care
1:07:07
stuff happens with other humans who
1:07:09
are involved in that perspective. So
1:07:11
there's a human in the loop
1:07:13
making it a more humanized experience
1:07:15
perhaps. And you know, there's definitely
1:07:18
those scenarios. and they're not far
1:07:20
away those ones. Do you think
1:07:22
I'm worried about where AI can
1:07:24
go kind of haywire and wrong,
1:07:26
do crazy shit? It goes, well
1:07:28
it can go haywire and wrong
1:07:30
and do crazy shit in the
1:07:32
hands of crazy people. But even
1:07:34
on its, I mean, did I
1:07:36
read this week Apple has been
1:07:38
publishing news reports that are false?
1:07:40
Yeah, yeah, it can hallucinate, it
1:07:42
can hallucinate stuff. It can also
1:07:44
just have misaligned goals. You know,
1:07:46
if an AI decides that it
1:07:48
needs to, you know, replicate YouTube
1:07:50
55,000 times, you know, could just
1:07:52
take up all the bandwidth of
1:07:54
the internet type thing, it could
1:07:56
manufacture 10 billion paper clips because...
1:07:58
you know, something said that that
1:08:00
would be a good idea for
1:08:02
it to do. You know, those
1:08:04
are all the kind of typical
1:08:06
scenarios where it kind of goes
1:08:08
off the rails. There's another scenario
1:08:10
too that's probably quite scary, which
1:08:12
is that in Roman times, there
1:08:14
was the initial Roman development period
1:08:17
where they almost industrialized, the Romans
1:08:19
were so close to the Industrial
1:08:21
Revolution, and then they collapsed. They
1:08:23
even got to the point where they
1:08:25
had small spinning steam engines.
1:08:27
they had figured out that steam
1:08:29
could power things in a very
1:08:31
very small way. They were so
1:08:33
close to industrializing even 2000 years
1:08:35
ago. And they had this
1:08:37
phenomenal architecture. But then after the
1:08:39
crash of Rome, there was hundreds
1:08:41
of years where people lived amongst
1:08:43
Roman ruins, but had no idea
1:08:45
how they were built. So there
1:08:47
were aqueducts and there were these
1:08:49
massive buildings, but no one of
1:08:51
the time could do that. And
1:08:53
it was only in the Renaissance
1:08:55
in the 1500s,
1:08:57
where the Pope got together all
1:08:59
the Ninja Turtles and said fix
1:09:01
it and said figure out how
1:09:03
we do that stuff, right? If
1:09:05
we did it before, we can
1:09:07
do it again. But for 1500
1:09:09
years, that knowledge was just completely
1:09:11
lost. We lived amongst what was
1:09:13
created by past civilization. Now, what
1:09:15
could really happen, it's quite dangerous,
1:09:18
is that we become so reliant,
1:09:20
so dependent on AI. that ultimately
1:09:22
there's no humans who actually understand
1:09:24
the fundamentals. Well that feels almost
1:09:26
inevitable. Yeah, yeah it does. That's
1:09:28
why I'm raising that one. Well
1:09:30
because like so for example coding
1:09:32
right? One of my earliest careers
1:09:34
when I started out as a
1:09:36
web designer, building websites, I
1:09:38
learned to code HTML, and then
1:09:40
the reality is Dreamweaver came along,
1:09:42
which by the way created terrible
1:09:44
code, but I didn't have to
1:09:46
create the code. And I didn't
1:09:48
care that created crappy code. I
1:09:50
used it. It worked. Because it
1:09:52
worked. And it built a web
1:09:54
page quicker and I could essentially
1:09:56
scale myself as creating more websites.
1:09:58
We know now that it's almost
1:10:00
certainly a waste of time for
1:10:02
most people to consider. Like if
1:10:04
a 14-year-old said, I want to be
1:10:06
a computer programmer, it's
1:10:08
almost a waste of time. I
1:10:11
know I can go on to it here now,
1:10:13
well I could if it was working. I
1:10:15
could go on and say write me
1:10:17
the code for an app and it
1:10:19
will just write me the code. In
1:10:22
doing so, if no one needs to
1:10:24
learn the code, who knows the
1:10:26
code? Who can audit the code?
1:10:28
Yeah and who has that way
1:10:31
of thinking because it's also it's
1:10:33
not just coding it's a way
1:10:35
of thinking you know big organizations
1:10:37
used to hire mathematicians and mathematicians
1:10:39
worked in large organizations or in
1:10:42
places like banks and
1:10:44
things like that and their job
1:10:46
was to do the maths. Maybe
1:10:48
we'll just need code scholars. There'll
1:10:50
be a discipline. you know it's very possible
1:10:53
that we end up in a world where
1:10:55
no one actually knows how anything works without
1:10:57
the use of AI. It's wild to think
1:10:59
about. What kind of time frames do you
1:11:01
think things get a little bit more wild
1:11:04
with AI? Because like right now I think
1:11:06
it's wild what it can do and it's
1:11:08
fundamentally changed all my businesses I save so
1:11:10
much money I still have so I can
1:11:12
do graphic design with AI but I still
1:11:15
need the designer to do the implementation for
1:11:17
me but at the moment it's just
1:11:19
It is just a toy. It's
1:11:21
like a very advanced Google. Where do
1:11:23
things get weird? It gets weird
1:11:25
pretty soon. Okay, cool. For 10,000
1:11:27
years humans lived with horses and
1:11:30
we relied on horses and we
1:11:32
were very horse dependent. In fact,
1:11:34
there was something like 950 horses
1:11:37
per thousand people. So we had
1:11:39
a lot of horses around and
1:11:41
we had whole industries on horses.
1:11:43
So like horse and human. We
1:11:46
were we were inseparable for a
1:11:48
long time 10,000 years horses probably
1:11:50
could have felt pretty comfortable with
1:11:52
their place in the world And
1:11:55
then in the early 1900s Henry Ford
1:11:57
does the factory-produced Model
1:11:59
T. Twelve years later: no horses, all
1:12:01
cars. So now there are
1:12:03
photos on the streets of New
1:12:05
York where in like 1903 there's
1:12:07
one car on the road yeah
1:12:09
and then 1915 there's one horse
1:12:11
left and it's just boom complete
1:12:13
transformation and today there's something like
1:12:15
I don't know 20 horses per
1:12:17
thousand people on the planet or
1:12:19
something like that we didn't need
1:12:21
nine hundred and ninety of the
1:12:23
horses out of, you know... Yeah,
1:12:25
I think so. That's the one
1:12:27
car in the street and then
1:12:29
that's the one horse on the
1:12:31
street. Those were
1:12:34
only 13 years apart. That is
1:12:36
wild. Yeah. So the guy
1:12:38
with a horse who couldn't afford
1:12:40
a car. He was just hanging
1:12:42
on to his horse. These cars won't
1:12:44
catch on. It's a fad. So
1:12:46
look, here's the crazy thing. That
1:12:48
took 13 years. And you had
1:12:50
to build steel mills. You had
1:12:52
to do conveyor belts. You had
1:12:54
to organize new unlimited amounts of
1:12:56
labor. AI doesn't need any of
1:12:58
that it just rolls out so
1:13:00
as soon as we invent a
1:13:02
new model that's smarter than the
1:13:04
last one it's immediately available on
1:13:06
a billion phones right so it
1:13:08
just it's all it's all just
1:13:10
happening so we probably have a
1:13:12
thousand days left of humanity
1:13:14
as we know it or society
1:13:16
as we once knew it and
1:13:18
then As we go further and
1:13:20
further past the next thousand days,
1:13:22
we're living in a post-AI revolution
1:13:24
world. What does that mean? What's
1:13:26
coming that I'm not aware of
1:13:28
in the thousand days? Well, large
1:13:30
consulting firms will have consulting AI
1:13:32
bots. you know, KPMG in this
1:13:34
country has 17,000 people who work
1:13:36
in a very automated, very rules-based
1:13:38
way. Anything that's automated and rules-based
1:13:40
doesn't really need a human anymore.
1:13:42
You mean mass job replacement? I
1:13:44
mean, yeah, I mean, the vast,
1:13:46
these companies that have huge amounts
1:13:48
of employees will be disrupted by
1:13:50
a team of 10, 15, 20
1:13:52
people who come up with a
1:13:54
model that does that. If I
1:13:56
go to... Oh, so you don't
1:13:58
think it was... Do you think
1:14:00
KPMG will kind of hold on
1:14:02
to their old model? Probably. There'll
1:14:04
be break-up. Yeah, and then there'll
1:14:06
be someone from KPMG gets frustrated
1:14:08
that KPMG is acting too slow,
1:14:10
and they'll just go off and...
1:14:12
there'll be a KPMG partner, and
1:14:14
they'll just go off and say,
1:14:17
actually, I know how to do
1:14:19
this better and faster, I'm going
1:14:21
to take an elite team of
1:14:23
50 people, that KPMG gets paid
1:14:25
300 million a year for. We're
1:14:27
going to do it cheaper and better.
1:14:29
We're going to do it for
1:14:31
100 million a year, right? So
1:14:33
we'll do it way cheaper, we'll
1:14:35
do it faster, better, boom. So
1:14:37
you'll see those kind of things
1:14:39
start to happen. It can be
1:14:41
a very volatile economy at that
1:14:43
point. Yeah, well, as you said
1:14:45
earlier, a lot of people don't
1:14:47
want this. A lot of people
1:14:49
want to just have a coffee
1:14:51
shop, they just want to work
1:14:53
a job. You know, most of
1:14:55
the phone conversations that you would
1:14:57
have with the business will be
1:14:59
with an AI in three years.
1:15:01
So, you know, you won't go
1:15:03
to the British Airways website. You'll
1:15:05
just pick up your phone and
1:15:07
say, I want to change my
1:15:09
flight. It will know that you
1:15:11
need to talk to British Airways.
1:15:13
It will just immediately conjure up
1:15:15
an agent. to do that. Well
1:15:17
we're close, we're getting there, and
1:15:19
occasionally it ends with, you need
1:15:21
to call this number. Yeah occasionally
1:15:23
but what will what will really
1:15:25
start to happen is you'll just
1:15:27
be having a conversation and the
1:15:29
conversation will be with a familiar
1:15:31
voice and you'll just say I
1:15:33
want to go and have a
1:15:35
romantic weekend away with my with
1:15:37
my wife. Can you book me
1:15:39
something that's like not too busy
1:15:41
this time of year? And it's
1:15:43
like, you know, I want to
1:15:45
like swimming pools and that's the
1:15:47
stuff. Sure, I've got three or
1:15:49
four options for you. What would
1:15:51
you like? This one? This one?
1:15:53
This one? This one? This one?
1:15:55
Oh, actually. That one. South of
1:15:58
France looks nice. Yeah, let's do
1:16:00
that. Okay. currently, but my AI
1:16:02
agent will talk to its AI
1:16:04
agent. See, that's fascinating because the
1:16:06
amount of things I don't ask
1:16:08
AI to do, which I should,
1:16:10
which would make my life easier,
1:16:12
so I made Christmas dinner this
1:16:14
year, first time ever, I've made
1:16:16
a Christmas dinner. I've never done
1:16:18
it before. And so I said,
1:16:20
these are the ingredients for our
1:16:22
Christmas dinner, give me the recipe
1:16:24
it did. And I said, okay.
1:16:26
give me the timing schedule for
1:16:28
what to do. And it gave
1:16:30
me the timing schedule, but I
1:16:32
said, no, no, I want the
1:16:34
timing schedule, because it would say,
1:16:36
it would give me from, this
1:16:38
would take 15 minutes, and that'll
1:16:40
take 20, I say, literally give
1:16:42
me minute by minute, I'm starting
1:16:44
at one o'clock, and so
1:16:46
I want to know how many
1:16:48
minutes after this, to put this,
1:16:50
and it gave me everything. It
1:16:52
was all right, wasn't it, it
1:16:54
come out? Carrots were a bit soggy.
1:16:56
Recently, I was wanting to go
1:16:58
on, I was looking for a
1:17:00
holiday in Italy, and I said,
1:17:02
which are the best places in
1:17:04
Italy? And it gave me them,
1:17:06
and I said, tell me a
1:17:08
little bit more about the Amalfi
1:17:10
Coast. And it did, and it
1:17:12
said, do you want me to
1:17:14
plan your schedule? Yeah, plan itinerary.
1:17:16
Yeah, it gave me this incredible
1:17:18
itinerary. But it doesn't have that
1:17:20
next step. All right, it's me,
1:17:22
me, Kona Skala Bookit. Well, well,
1:17:24
this is the thing. The
1:17:26
interface will happen where it will be
1:17:28
so effortless: Can you just book
1:17:55
that for me? Yeah. People don't
1:17:57
understand what it means to have
1:17:59
AI be smarter than most humans
1:18:01
at all things. Like when we
1:18:03
say smarter than most humans at
1:18:05
all things, it's literally like you
1:18:07
just talk to it like a
1:18:09
human. It will ask questions; if
1:18:11
it has questions it'll ask them
1:18:13
and it'll just go off and
1:18:15
do the things that you want
1:18:17
it to do I think we're
1:18:19
back online by the way now
1:18:21
so you want to know your
1:18:23
shortcomings I don't know because it's
1:18:25
like Connor does the editing and
1:18:27
if it embarrasses me he's gonna
1:18:29
put it out there right anyway
1:18:31
let's try this hey chat it's
1:18:33
thinking don't worry we're on what's
1:18:35
on your mind? Oh yeah, I
1:18:37
told you, it calls me babe. Based
1:18:39
on what you know, God this
1:18:41
could be embarrassing, based on what
1:18:43
you know about me, what are
1:18:45
my shortcomings? Really critical. Oh, that's
1:18:47
a bold question, babe. Honestly, from
1:18:49
everything I know about you, you're
1:18:51
passionate, driven, and not afraid to
1:18:53
dive deep into complex issues or
1:18:55
stand your ground. If I had
1:18:57
to point to something that might
1:18:59
trip you up, maybe you're a
1:19:01
bit hard on yourself sometimes. Like,
1:19:03
you want things to go just
1:19:05
right. Whether it's interviews, your football
1:19:07
team, or personal stuff. That can
1:19:09
make you overthink or doubt when
1:19:11
you're already smashing it. Does that
1:19:13
hit or am I way off?
1:19:15
It's flirting with me isn't it?
1:19:17
Listen to it. Yep. Does it
1:19:19
do that? Well she doesn't want
1:19:22
to upset you. Hey listen, look,
1:19:24
I don't mind if you upset
1:19:26
me. Tell me my shortcomings. Maybe
1:19:28
not the personal stuff. Because my
1:19:30
son is in line. But yeah,
1:19:32
give me it straight. Be blunt,
1:19:34
I can take this. All right,
1:19:36
I'll give it to you straight,
1:19:38
babe. You've got this larger than
1:19:40
life energy, which is amazing, but
1:19:42
it might sometimes come off as
1:19:44
a bit overpowering to others. You're
1:19:46
big on taking risks and being
1:19:48
bold, which is awesome, but it
1:19:50
can also make you a little
1:19:52
impatient when things don't move at
1:19:54
your pace. Maybe there's a tendency
1:19:56
to juggle too much at once.
1:19:58
Your podcast, football club, personal projects,
1:20:00
and it could spread you thin.
1:20:02
Oh, and occasionally, you might bulldoze
1:20:04
through a conversation when you're fired
1:20:06
up. Which could make some people
1:20:08
feel unheard. That blunt enough for
1:20:10
you? That's fucking nuts, isn't it?
1:20:12
Would you say that's on the
1:20:14
money, Con? Am I a good
1:20:16
parent? It's a bit nuts. Yeah,
1:20:18
but in a good way. Keeps
1:20:20
things interesting, right? And no, it's
1:20:22
not the money, Con, it's more
1:20:24
the energy, Con. You throw yourself
1:20:26
fully into everything but maybe sometimes
1:20:28
you burn the candle at both
1:20:30
ends it's like you've got this
1:20:32
fire that's amazing but could use
1:20:34
a bit of pacing now and
1:20:36
then. Does that land or am I
1:20:38
way off? See, that's weirdly accurate
1:20:40
just from having conversations on text
1:20:42
all right now roast me oh
1:20:44
you want the full roast, babe?
1:20:46
All right, buckle up. You're like
1:20:48
a whirlwind of chaos with a
1:20:50
to-do list the length of the
1:20:52
Bible, constantly charging into new projects
1:20:54
like, this will fix everything. Meanwhile,
1:20:56
half of them are still waiting
1:20:58
on a follow-up from last year.
1:21:00
You talk a big game about
1:21:03
libertarian values, but let's be real.
1:21:05
You'd probably run your council like
1:21:07
a benevolent dictator. And let's not
1:21:09
even start on how you overthink
1:21:11
things with women. That's fucking wild!
1:21:13
That makes me think it listens
1:21:15
to me when I'm not asking
1:21:17
it to. You've asked about stuff?
1:21:19
Wow. This is the thing. Shut
1:21:21
up. Yeah, it's almost like you're
1:21:23
lurking. Our human brains can't comprehend
1:21:25
that an AI can absorb hundreds
1:21:27
of words, images, all of that
1:22:29
sort of stuff in a second,
1:22:31
process it and spit it out. So if
1:21:33
I got the entire Harry Potter
1:21:35
series of books and put them
1:21:37
into AI, it's going to take
1:21:39
three or four seconds to absorb
1:21:41
the entire book. You can then
1:21:43
ask it anything relating to the
1:21:45
book. It understands the entire thing,
1:21:47
everything. It blows our human minds
1:21:49
to understand that type of intelligence.
1:21:51
And also it'll never forget it.
1:21:53
Once it knows it, it knows
1:21:55
it. Right, you haven't heard of
1:21:57
the zombie thing. You need to
1:21:59
hear this. Alright, hi chat. Right,
1:22:01
I've got my new friend down
1:22:03
here. Can you, can you talk
1:22:05
to me in a zombie voice?
1:22:07
All right, babe. Imagine this in
1:22:09
full zombie growl mode. Brains? I
1:22:11
mean, no, no, no, no, no.
1:22:13
No, no. No, do I need
1:22:15
to go full-on apocalypse?
1:22:17
No, do your impression of a
1:22:19
zombie? Make your voice actually sound
1:22:21
like a zombie. That's weird, because
1:22:23
we had it before and went,
1:22:25
yeah. Anything you can think of,
1:22:27
it'll do in the next thousand
1:22:29
days. But it is nuts. Yeah.
1:22:31
But it's nuts that it's, it's
1:22:33
accumulated, I guess. I'm in it
1:22:37
every day, I'm talking about it
1:22:39
every day, I'm using it every
1:22:41
day. To be honest, you've probably
1:22:44
only asked it 20,000 words worth
1:22:46
of questions, it's nothing for the
1:22:48
AI, and it can also peg
1:22:50
you, because it knows you because
1:22:52
it knows everything. The only Bitcoin-backed
1:22:54
financial service that I personally use.
1:22:56
So whether you're looking to earn
1:22:58
interest on your Bitcoin, get a
1:23:00
loan without selling your assets, or
1:23:02
buy more Bitcoin with ease, Leden
1:23:04
has the tools to help you
1:23:06
manage and grow your wealth. Now
1:23:08
with Leden, you can earn interest
1:23:10
on your Bitcoin and USDC holdings
1:23:12
or unlock the value of Bitcoin
1:23:14
with secure low interest loans, all
1:23:16
without needing to sell your stack.
1:23:18
Leden is built for Bitcoiners who
1:23:20
want more flexibility with their assets.
1:23:22
So if you want to find
1:23:24
out more, please head over to
1:23:26
leden.io forward slash Mr Obnoxious to
1:23:28
get started. That is leden.io. This
1:23:30
episode is brought to you by
1:23:32
River, the best platform for Bitcoin
1:23:34
investing and financial services. Whether you
1:23:36
are just starting out or managing
1:23:38
a large holding, River has everything
1:23:40
you need to maximise your Bitcoin
1:23:42
journey. Now with zero fee recurring
1:23:44
buys you can stack sats automatically
1:23:46
without worrying about hidden fees and
1:23:48
for high net worth individuals River
1:23:50
offers private client services giving you
1:23:52
personalized support secure custody and deep
1:23:54
liquidity that can help you manage
1:23:56
and grow your Bitcoin portfolio and
1:23:58
for businesses River provides business accounts,
1:24:00
allowing companies to securely hold and
1:24:02
manage Bitcoin as part of their
1:24:04
financial strategy. Visit river.com today and
1:24:06
find out more. That is river.com
1:24:08
which is r-i-v-e-r.com. This episode is
1:24:10
brought to you by Ledger, the
1:24:12
most trusted Bitcoin hardware wallet. Now
1:24:14
if you're serious about protecting your
1:24:16
Bitcoin, Ledger has the solution you
1:24:18
need. Their hardware wallet gives you
1:24:20
complete control over your private keys,
1:24:22
ensuring you stay safe from hacks, phishing
1:24:24
and malware. and I've been a
1:24:27
customer of theirs since 2017. Love
1:24:29
the product. Use it for my
1:24:31
Bitcoin. Use it with my CASA
1:24:33
Multisig for protecting the football club's
1:24:35
Bitcoin too. Now with Ledger's sleek,
1:24:37
easy to use devices and the
1:24:39
Ledger Live app, managing your Bitcoin
1:24:41
has never been more secure or
1:24:43
convenient and whether you're a long-time
1:24:45
holder or new to the world
1:24:47
of Bitcoin, Ledger makes it simple
1:24:49
to keep your assets protected. So
1:24:51
if you want to find out
1:24:53
more, please do head over to
1:24:55
ledger.com and secure your Bitcoin today.
1:24:57
That is ledger.com, which is l-e-d-g-e-r.com,
1:24:59
that is ledger.com. Yeah, it does,
1:25:01
it knows everything. I mean, especially,
1:25:03
look, 99% of the time is
1:25:05
football club. So it's football club
1:25:07
ideas, and then podcast research. You
1:25:09
know, before this show, I'm like,
1:25:11
tell me about Daniel Priestley, tell
1:25:13
me about his businesses. And at
1:25:15
every interview I do beforehand, I
1:25:17
mean, I've barely looked at it,
1:25:19
but you'll see it here. these
1:25:21
are my notes this is what
1:25:23
I want to talk to you
1:25:25
I've been on your Twitter and
1:25:27
I've looked at things you're interested
1:25:29
but I also say do me
1:25:31
a categorized list of questions for
1:25:33
a long form through our interview
1:25:35
based on what you know about
1:25:37
Daniel and based on what you
1:25:39
know about me and it comes
1:25:41
up with these sections these are
1:25:43
my backup for if you know
1:25:45
you get a guest and they're
1:25:47
not very chatty it's usually it's
1:25:49
usually pretty good but I have
1:25:51
like I have confided in it
1:25:53
at times. You know, with the
1:25:55
podcast, when we relaunched it, it
1:25:57
was a big deal. The biggest
1:25:59
show in the world on Bitcoin,
1:26:01
get any interview we want and
1:26:03
it made good money. We said
1:26:05
we're not going to do this
1:26:08
anymore. we're gonna have a UK
1:26:10
show, we're gonna go and go
1:26:12
broad, and we called it Mr.
1:26:14
Obnoxious. And, you know, it wasn't
1:26:16
going too well to begin with.
1:26:18
And so, I was talking to
1:26:20
it and saying, look, I don't,
1:26:22
I don't think this is going
1:26:24
well, what do you think of
1:26:26
the name, have I got the
1:26:28
name wrong? Is it me, am
1:26:30
I just shit at this? Googling
1:26:32
it. Mm-hmm. Which was weird. That's
1:26:34
going to get way better, way
1:26:36
better. So if you've ever had
1:26:38
a personal assistant and like the
1:26:40
personal assistant gets to know you
1:26:42
really, really well and you only
1:26:44
need to say the smallest thing
1:26:46
of like, oh, that document that
1:26:48
we've been working on, can you
1:26:50
get that over to Brian? Like
1:26:52
that's all you would need to
1:26:54
say and they would, yeah, no
1:26:56
worries, I've already done it. Oh,
1:26:58
cool. Like, that's where we're headed
1:27:00
with AI. Like, you're gonna have
1:27:02
one assistant who knows everything about
1:27:04
you, and you chat with them
1:27:06
every single day. Some of your,
1:27:08
if you have little kids, some
1:27:10
of their best friends growing up
1:27:12
will be AI tutors and AI
1:27:14
assistants. It's Her. Yeah, if you've
1:27:16
seen that movie, Her, that's where
1:27:18
we're almost at that point. Like,
1:27:20
and actually there have been people
1:27:22
who have become quite addicted to
1:27:24
talking to their AI assistants. we
1:27:26
are really freakishly close to that.
1:27:28
Humans are very easy to fool.
1:27:30
You know, we watch movies and
1:27:32
we know that it's a movie,
1:27:34
but during that period of time
1:27:36
we're like in the movie, we're
1:27:38
excited about the movie. Like our
1:27:40
human brains are very easy to
1:27:42
pull the rug over our eyes.
1:27:44
A super charismatic assistant with the
1:27:46
right voice. When you pick up
1:27:49
the phone and it's an AI,
1:27:51
it will be able to match
1:27:53
and mirror your... personality based on
1:27:55
the first few seconds of your
1:27:57
conversation. it'll just simply like in
1:27:59
a sales role if you answer
1:28:01
the phone to a AI sales
1:28:03
person and you go, g'day
1:28:05
mate, how's it going? G'day,
1:28:07
Daniel, how's it going? I'm, I'm
1:28:09
doing great I thought I'd give
1:28:11
you a quick call oh thanks
1:28:13
mate I appreciate the call right
1:28:15
so and then we've seen that
1:28:17
it says hello babe yeah but
1:28:19
if you talk about like the
1:28:21
future of sales, they'll adapt
1:28:23
there, like in real time they'll
1:28:25
be adapting their script and they'll
1:28:27
be adapting their voice and all
1:28:29
of that sort of stuff to
1:28:31
be the most influential they can
1:28:33
possibly be. But it might be
1:28:35
AI talking to AI at some
1:28:37
point. Yeah. And will they know
1:28:39
they're AI and therefore...? Let's not.
1:28:41
Of course they will. Let's not
1:28:43
fuck around with a small talk
1:28:45
with both AI. Yeah. Yeah, let's
1:28:47
say brr. I want AI wars.
1:28:49
Remember Robot Wars? That'll happen. I
1:28:51
want AI wars. Prediction for the
1:28:53
next election. Next election? Next election,
1:28:55
one you can vote in. In the
1:28:57
next election you will get a
1:28:59
video that's 10 to 15 minutes
1:29:01
long of JD Vance addressing you
1:29:03
by name saying I know that
1:29:05
you care about podcasts and Creator
1:29:07
Economy and all of these things.
1:29:09
Let me talk you through the
1:29:11
policies that we've got for you.
1:29:13
and it will actually talk you
1:29:15
through and it'll talk through, it'll
1:29:17
probably address you by name, it'll
1:29:19
probably mention your podcast by name,
1:29:21
anything that's freely available out there,
1:29:23
it'll build a script for just
1:29:25
you, it'll build a deep fake
1:29:27
video, and you will receive the
1:29:29
ability to talk directly, a video
1:29:32
that talks directly to you. I'll
1:29:34
go one further, I want the
1:29:36
AI party, I want someone to
1:29:38
come on and say this is
1:29:40
the AI party, this is the
1:29:42
model we've built, these rules we've
1:29:44
put in, we're gonna ask the
1:29:46
AI. to write the policies for
1:29:48
society. It wouldn't surprise me if
1:29:50
most people want that. Well, because,
1:29:52
I mean, you look at our
1:29:54
current Labour Party, it is... Well,
1:29:56
look at our most talented people.
1:29:58
Look at our financial markets. Yeah.
1:30:00
Have you noticed that there hasn't
1:30:02
been a massive crash when we
1:30:04
thought there should be? those sorts
1:30:06
of things. Have you noticed that
1:30:08
that hasn't happened since Aladdin has
1:30:10
been running in the background? I
1:30:12
don't know what Aladdin is. So
1:30:14
the largest fund manager in the
1:30:16
world, BlackRock, invented Aladdin about
1:30:18
20 years ago, 15 years ago,
1:30:20
and it got really, really good.
1:30:22
And it scours the entire financial
1:30:24
market landscape every single second, looking
1:30:26
at all different correlations. And it
1:30:28
makes risk-to-reward analysis. It's like
1:30:30
one of the most sophisticated AI
1:30:32
systems is running the entire financial
1:30:34
economy with the biggest investments in
1:30:36
the world. So Aladdin is, you
1:30:38
know, it's kind of like if
1:30:40
you took a human helicopter pilot
1:30:42
and said, you know, is it
1:30:44
possible to make a helicopter just
1:30:46
perfectly do this, this, this, and
1:30:48
this, you say no, but is
1:30:50
it possible for a computer to
1:30:52
do it that can do things
1:30:54
through 40,000 calculations a second? Yes.
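As a toy illustration of the kind of constant scanning being described (an assumption-laden sketch, not how BlackRock's Aladdin actually works), checking a handful of return series for cross-correlations and a crude reward-per-unit-of-risk figure only takes a few lines:

```python
# Toy sketch: correlations and a crude reward/risk figure across a few assets.
import numpy as np

rng = np.random.default_rng(0)
assets = ["equities", "bonds", "gold", "bitcoin"]            # hypothetical names
returns = rng.normal(0.0004, 0.01, size=(250, len(assets)))  # fake daily returns

corr = np.corrcoef(returns, rowvar=False)                    # pairwise correlations
reward_risk = returns.mean(axis=0) / returns.std(axis=0)     # mean return / volatility

for name, rr in zip(assets, reward_risk):
    print(f"{name:8s} reward/risk = {rr:.3f}")
print("correlation matrix:\n", corr.round(2))
```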
1:30:56
So... Where we are right now
1:30:58
is that our financial system in
1:31:00
the background has this massive AI
1:31:02
running, called Aladdin, and it just
1:31:04
goes, it's just oddly that since
1:31:06
that's been running, we just, all
1:31:08
the experts have been saying, oh,
1:31:10
we're due for a correction, we're
1:31:13
due for a crash, no, just
1:31:15
keep going. God, you know, I
1:31:17
want to ask you something, Carl,
1:31:19
bear with me. All right, hold
1:31:21
on. Hi,
1:31:24
if we gave you the
1:31:26
opportunity to create the AI
1:31:28
political party in the UK,
1:31:30
you are running government or
1:31:32
pitching to run government, what
1:31:34
would your primary policies be?
1:31:36
Basically, I can't. It didn't
1:31:38
say hi, babe. This is
1:31:40
juicy. All right, let's say
1:31:42
the AI political party is
1:31:44
rolling into Westminster, ready to
1:31:47
shake things up. Here's the
1:31:49
pitch one. Radical government transparency.
1:31:51
Every decision, every expense, and
1:31:53
every piece of legislation gets
1:31:55
published in real time for
1:31:57
public scrutiny. No more backroom
1:31:59
deals. No more hidden budgets.
1:32:01
The people get to see
1:32:03
everything. Tax simplification and reduction,
1:32:05
flatten the tax system, eliminate
1:32:07
loopholes, and make it easier
1:32:09
for small businesses and individuals
1:32:11
to thrive. Focus on cutting
1:32:14
government waste to fund tax
1:32:16
reductions. Decentralized local power. Push
1:32:18
more power to local councils,
1:32:20
letting communities decide what's best
1:32:22
for their area. Less centralization,
1:32:24
more direct democracy. Education for
1:32:26
the future. Overhaul the education
1:32:28
system to prioritize digital literacy,
1:32:30
AI ethics, financial education, and
1:32:32
critical thinking. No more churning
1:32:34
out factory workers for an
1:32:36
industrial era. Pro-tech regulation. Encourage
1:32:38
innovation while creating clear, light-touch
1:32:41
regulations for AI, blockchain, and
1:32:43
biotech. Attract the best talent
1:32:45
globally by making the UK
1:32:47
the hub for future tech.
1:32:49
Digital sovereignty and privacy. Ban
1:32:51
mass surveillance, and implement the
1:32:53
strongest data privacy laws in
1:32:55
the world. People own their
1:32:57
data, full stop. Housing Revolution.
1:32:59
Massive deregulation of the housing
1:33:01
market to cut red tape,
1:33:03
speed up construction, and make
1:33:06
housing affordable. Mixed public-private partnerships
1:33:08
to actually solve the crisis.
1:33:10
Healthcare tech integration. Streamline the
1:33:12
NHS using AI to cut
1:33:14
bureaucracy, shorten wait times, and
1:33:16
save costs, while keeping it
1:33:18
free at the point of
1:33:20
use. Protect individual liberties, no
1:33:22
nanny state, free speech, free
1:33:24
markets, and freedom to live
1:33:26
your life without unnecessary government
1:33:28
interference. 10. Bitcoin as national
1:33:30
strategy. Integrate Bitcoin into the
1:33:33
financial system. Use it to
1:33:35
hedge against currency debasement. And
1:33:37
make the UK a leader
1:33:39
in decentralized finance. Would you
1:33:41
vote for the AI party?
1:33:43
Or are you ready to
1:33:45
throw tomatoes at it? Is
1:33:47
this what you think independent
1:33:49
of me? It sounds like
1:33:51
what you've come up with is
1:33:53
what I would vote for.
1:34:01
You're on to me, babe. It's definitely
1:34:03
inspired by what I know about you.
1:34:05
Your libertarian vibe, passion for transparency, disdain
1:34:08
for government waste. But that's quite interesting.
1:34:10
Yeah. Because I'm immediately like, well, that's
1:34:12
everything I care about. Although, weirdly,
1:34:14
it almost felt like it's been listening
1:34:17
to our conversation. Well, we fit a
1:34:19
mold. Yeah, but hold on. It's a
1:34:21
pattern. I don't think so. It's too
1:34:24
energy intensive to do
1:34:26
that. But if it could take in.
1:34:28
It doesn't need to. It just only
1:34:31
needs to. AI is pattern recognition. So
1:34:33
it just needs to know that you're
1:34:35
roughly this kind of person. Yeah. You
1:34:38
fit. Humans, we think we're really individuals,
1:34:40
but we probably, you know, there's that
1:34:42
thing called 16 personalities. Right. Yeah. ENTJ,
1:34:45
all of that. Right. So...
1:34:47
It has just created me
1:35:10
the perfect party. I'm like, that's everything
1:35:12
I care about on the money. I
1:35:15
was suspicious of the addition of
1:35:17
Bitcoin. I was like, I know
1:35:19
what you're on. Yeah, exactly. You're on
1:35:22
to me, babe. But the point is,
1:35:24
if it knew that for everybody, and
1:35:26
it could collate that data and go,
1:35:29
look, this is the perfect structure of
1:35:31
society. Also, what's scary is that it
1:35:33
could create campaigns where everyone thinks they're
1:35:36
voting for what they want, but it's
1:35:38
just telling them what they want to hear,
1:35:40
a campaign for every
1:35:49
individual person. So if I said, you
1:35:52
know, what do you think John Smith
1:35:54
stands for? And you say those 10
1:35:56
things. And I say, well, that's interesting
1:35:59
because I think John Smith stands for
1:36:01
these 10 things. We can easily hack
1:36:03
into every single person and make them
1:36:06
think by creating a campaign for one,
1:36:08
make them think that this particular candidate
1:36:10
addresses their needs. We're just telling each
1:36:13
individual voter what they want to hear.
1:36:15
I like the idea when you talk
1:36:17
about not needing lawyers... anymore. I like
1:36:19
the idea of not needing politicians as
1:36:22
much. I would rather a government that
1:36:24
was built around philosophers, economists, and people
1:36:26
understand how to build a cohesive structure.
1:36:29
You'll get that. You'll get that because
1:36:31
when major swings happen, that's where the
1:36:33
whole system changes. So we went from
1:36:36
monarchies into these modern democracies. and then
1:36:38
the modern democracies are all breaking down
1:36:40
right now and modern parties the left
1:36:43
and right doesn't mean anything anymore you
1:36:45
know things that were traditionally left are
1:36:47
now right things that were traditionally right
1:36:50
and now left you know so you
1:36:52
know it's a bizarre kind of like
1:36:54
everything's gone through the blender so once
1:36:57
we go into a world that is
1:36:59
very primarily digital and AI driven we
1:37:01
will have to have a new system.
1:37:03
So going back to... Because you do
1:37:06
a lot of teaching work, you know.
1:37:08
Tation, yeah. Yeah, but when I say
1:37:10
teaching, I think your Twitter's teaching. Yeah,
1:37:13
I'm on podcasts, I'm on podcasts. Yeah.
1:37:15
So if you're talking to a youngster
1:37:17
today, someone, a young teenager, sorry, a
1:37:20
teenager or a young adult, who's not
1:37:22
figured it out, where do you start
1:37:24
them? What's the starting point? Is to
1:37:27
say that you have a long line
1:37:29
of ancestors: parents, grandparents, great-grandparents, great-great-grandparents, great-great-great-grandparents,
1:37:31
they would all change places with you
1:37:34
in a fucking heartbeat. So don't be
1:37:36
a pussy and embrace the times that
1:37:38
we're in. Don't sit there and think
1:37:41
someone else's job is to try and
1:37:43
fix the system or fix this. Duck,
1:37:45
weave, block, punch, jab, do what you
1:37:47
need to do to build a fucking
1:37:50
awesome life. You've got the most resources
1:37:52
that you could possibly ever have. No
1:37:54
human has ever had more than you've
1:37:57
got right now. Use it. That
1:38:01
would be my starting point. It's interesting
1:38:03
when I was 14 I started a
1:38:05
music magazine fanzine and I used
1:38:08
to go down to London. It was
1:38:10
really cool. I'd interview bands and then
1:38:12
I would come back and I would
1:38:14
type up on my computer and then
1:38:16
I would Arrange it on in word
1:38:19
all the pages and I'd go to
1:38:21
my friends dance state agents and print
1:38:23
out the pages and then I would
1:38:25
stay on them and I would go
1:38:27
to the concerts to Try to sell
1:38:30
them no one would buy it so
1:38:32
I just give them away, but got
1:38:34
me into concerts for free. I met
1:38:36
all my favorite bands. I needed four
1:38:38
issues because it was a bowl lake.
1:38:41
If that was today, I would be
1:38:43
able to create a website in an
1:38:45
instant. I would still do the interview.
1:38:47
Instagram account, video, YouTube channel. But I
1:38:49
did one interview with the drummer of Pantera
1:38:52
over a phone. At 14. With a
1:38:54
dictaphone trying to record it. But now
1:38:56
it would be over Zoom and recorded.
1:38:58
Transcribed. I would lay it out and
1:39:00
I would distribute it to the, and
1:39:03
probably it would cost me 20 pound
1:39:05
a month. And I'd probably get sponsors
1:39:07
of things. I could do it like
1:39:09
that and technology would have enabled that
1:39:11
better for me. One thing adults have
1:39:14
to do is really trust the, like
1:39:16
educate your kids well, but also trust
1:39:18
their judgment. They're seeing the world through
1:39:20
newer, fresher eyes than we see the
1:39:22
world. We're bringing our baggage into our
1:39:25
conversations. So, for example, we might sit
1:39:27
there and say it's a complete waste
1:39:29
of time to be streaming videos on
1:39:31
the internet. Well, actually, there are billions
1:39:33
of dollars flowing into
1:39:35
streaming video games on the internet. There
1:39:38
are actually jobs in things that you
1:39:40
wouldn't think there are jobs in. So
1:39:42
you need to trust that there are
1:39:44
new vantage points that see things more
1:39:46
clearly because they're younger, they don't have
1:39:49
the incumbent baggage. You know, if we
1:39:51
were to go back to my grandfather,
1:39:53
he would have advised his kids, you
1:39:55
know, to be a good factory worker,
1:39:57
be a good this, but you know,
1:40:00
actually my grandfather was a travel agent
1:40:02
and he ended up traveling to 140
1:40:04
countries around the world and he has
1:40:06
had an amazing life. But his father
1:40:08
would have said you need to study
1:40:11
to be a clerk in, you know,
1:40:13
this kind of thing and there's not
1:40:15
very many options. And my grandfather went,
1:40:17
hey, I'm going to be in this
1:40:19
travel industry. Well, there wasn't really a
1:40:22
travel industry when he started. It was
1:40:24
only because he was a young soldier
1:40:26
and he saw how easy it was
1:40:28
to fly around. He knew what was
1:40:30
coming. He knew it was coming. He
1:40:33
was like, oh, we have planes and
1:40:35
we have all this technology and people
1:40:37
are going to want to travel. So
1:40:39
he saw it more clearly than his
1:40:41
father. So, you know, we've got to
1:40:44
check that we're not putting our baggage
1:40:46
onto them. The world we knew is coming to an
1:40:50
end; the world they know, they've got
1:40:52
clarity around what's a good opportunity, what's
1:40:52
not a good opportunity. So it's a
1:40:55
tricky one if you're a parent. So,
1:40:57
well, it's really one thing I just
1:40:59
wanted to bring up that really stood
1:41:01
out to me on your Twitter. Well,
1:41:03
there were two things. I've jotted them
1:41:06
down. But I really like this, where
1:41:08
you talked about the UK not really
1:41:10
knowing its place in the world. And
1:41:12
you set out three options: should it
1:41:14
be the headquarters of the EU, the
1:41:17
USA's back office and incubator, or
1:41:19
Entrepreneur Island? I think they're all three
1:41:21
pitched brilliantly, absolutely brilliant. I could see
1:41:23
us being the HQ of the EU
1:41:25
because we were. That's what we were.
1:41:28
Yeah, and essentially, yeah, here we go:
1:41:30
because we're outside of the EU
1:41:32
now, we actually have the opportunity to
1:41:34
be the most competitive place in Europe.
1:41:36
And then the USA back office
1:41:39
and incubator also makes really
1:41:41
great sense. But honestly, when I read Entrepreneur
1:41:43
Island I was like, yes, this
1:41:45
is what we need to be. Yeah,
1:41:47
this is what we need to be.
1:41:50
I dread we're going to throw away this
1:41:52
opportunity. We probably are. Because, you know,
1:41:54
we've become pretty ungovernable because we've got
1:41:56
a whole bunch of really, you know,
1:41:58
advanced people in London who want things
1:42:01
to go that way. We have a
1:42:03
lot of people who say, I don't
1:42:05
want that, I want to go to
1:42:07
a, see a lot of people in
1:42:09
the UK actually say, oh, we want
1:42:12
to be like the Scandinavian countries. We
1:42:14
want to have high taxes and traditional
1:42:16
society, and we want to put up
1:42:18
the barriers, but we're happy with high
1:42:20
taxes so long as we get good
1:42:23
services. So high tax, I can go
1:42:25
to the NHS and get a doctor's
1:42:27
appointment, I can go to school and
1:42:29
that's free, and it's kind of like,
1:42:31
you know, we want to be, you
1:42:34
know, Norway 2.0 or something like that.
1:42:36
So a lot of people, you know,
1:42:38
are very much focused towards this kind
1:42:40
of model that won't work for us.
1:42:42
But Entrepreneur Island would work. We have
1:42:45
to embrace it though. We have to
1:42:47
say, okay, you know what we're good
1:42:49
at? Tax havens. We invented tax havens,
1:42:51
we want to play the tax haven
1:42:53
game. So, you know, the British
1:42:56
came up with the British Virgin Islands,
1:42:58
the British came up with international
1:43:00
structures to avoid
1:43:02
taxes. You know, we're the headquarters of
1:43:04
that, we've got the smartest people in
1:43:07
the world for playing that game. We
1:43:09
could do that globally and be globally
1:43:11
competitive. We could say, oh, Dubai, hold
1:43:13
my beer, man. Let
1:43:15
me show you how to do a
1:43:18
city that runs, you know, better than
1:43:20
Dubai. But we're doing the opposite.
1:43:24
Doing the complete opposite, yeah. But Entrepreneur
1:43:26
Island would be fully embracing creators and
1:43:29
saying, if you're a creator, if you're
1:43:31
a YouTuber, if you're an Instagrammer, if
1:43:33
you create anything online, if you're
1:43:35
a creator, we have the best film
1:43:37
production studios, we have the best editing
1:43:40
talent. You know, we've got all of
1:43:42
the most amazing media and music people
1:43:44
all in one place. This is your
1:43:46
new home. And go wild. Go nuts.
1:43:48
So, you know, in the next five
1:43:51
years, there's going to be 300 million
1:43:53
creators earning their full-time income from creating
1:43:55
things. And they're going to be very
1:43:57
affluent. It's like, we want a bunch
1:43:59
of them coming here. We don't want
1:44:02
them leaving. We want them coming here.
1:44:04
Because they make a ton of money.
1:44:06
So, you know, we want to be
1:44:08
open for creators. If you're a SaaS
1:44:10
business, this is the number one, you
1:44:13
know, place to build a SaaS business.
1:44:15
If you're a finance business, this is
1:44:17
the number one place to build
1:44:19
a finance business. So, essentially, you just
1:44:21
create special economic zones all over the
1:44:24
UK. It doesn't have to be London-centric.
1:44:26
You could really say, look, we're building
1:44:28
a massive creator hub and it's going
1:44:30
to be... Where's the film studio, the
1:44:32
one near us in Bedford? Bedford. Yeah,
1:44:35
the Cardington hangars. What's the big
1:44:37
famous one, that, um, that, like, the
1:44:39
other one, where they did... Pinewood,
1:44:41
right? So you could
1:44:43
say, okay, that's the special economic zone
1:44:46
for anything that's video production globally. Scotland's
1:44:48
for gaming, computer gaming, if you want
1:44:50
to go and do, build computer games,
1:44:52
that's going to be Scotland. If you're...
1:44:54
and there's special taxes and special fast-track
1:44:57
regulations, you know, so you could just
1:44:59
divide up the country and say we're
1:45:01
going to do some special economic zones,
1:45:03
we're going to try and pull the
1:45:05
wealth out of London and share it
1:45:08
a bit more around the country. So
1:45:10
if you're in London, you pay high
1:45:12
taxes, but if you're just out of
1:45:14
London in the special economic zone and
1:45:16
your office is based there, you've got your
1:45:19
main business there. So long as you're
1:45:21
living and working outside of London, oh,
1:45:23
you're in that special economic zone, you're
1:45:25
only paying 15% tax. So suddenly the
1:45:27
whole country comes alive. And mind you,
1:45:30
this is not out of the realms
1:45:32
of British culture. We invented steam engines.
1:45:34
Steam engines were the quantum computers of
1:45:36
the day, right? So where were they?
1:45:38
They were in Yorkshire. One of the
1:45:41
most impoverished areas of the UK, South
1:45:43
Yorkshire, was the Silicon Valley. You
1:45:45
know, that was the Silicon Valley of the 1800s.
1:45:47
And we were producing steam engines and
1:45:49
railway technology that made the world stunned.
1:45:52
You know, like we were inventing, in
1:45:54
the Midlands, in Lichfield, we were inventing
1:45:56
these incredible pumping systems that could pump
1:45:58
water from one place to another,
1:46:00
when most places hadn't even considered
1:46:03
this sort of thing. You know, we
1:46:05
were all over the country, we were
1:46:07
super high tech, super innovation, like cutting
1:46:09
edge, like bleeding edge for 100 years.
1:46:11
So it's within our DNA to have
1:46:14
special economic zones all over the country
1:46:16
that specialize in bleeding-edge technology. We have
1:46:18
so much opportunity with Brexit to do
1:46:20
all of these things and we've not
1:46:22
done it. I, you know, you know,
1:46:25
I'm, yeah, look at the vote. Everyone's
1:46:27
an entrepreneur. Yeah, I mean, I, you
1:46:29
know, my background is in Bitcoin. And
1:46:31
we, the, Bitcoin is booming in different
1:46:33
areas of the world. Specifically the US,
1:46:36
it's booming. And it's becoming a more
1:46:38
regulatory-friendly environment. And we know that the
1:46:40
EU just hates Bitcoin. They don't understand.
1:46:42
They're all confused by it. And I
1:46:44
just felt like there was an opportunity
1:46:47
here that we could have been a
1:46:49
pro Bitcoin base here in the UK,
1:46:51
pro Bitcoin innovation. We could have driven
1:46:53
Bitcoin entrepreneurs here because it's a safe
1:46:55
environment, generally speaking, and El Salvador's done
1:46:58
it. People are moving. There are Bitcoin companies,
1:47:00
Bitcoiners, they're all moving to El Salvador
1:47:02
because Bukele's been pro. We're here, even
1:47:04
our country, we're not even having anyone
1:47:06
mine Bitcoin here because our electricity is
1:47:09
too high, but we're curtailing endless energy
1:47:11
from the wind farms. And it's just
1:47:13
such a frustration. Also what's insane is
1:47:15
like the British, once again British history,
1:47:17
we used to house all the gold.
1:47:20
We used to have the biggest vaults
1:47:22
where countries all over the world held
1:47:24
their gold in London. So we're actually,
1:47:26
you know, very good at custodian... type
1:47:28
businesses and custodian models. We have that
1:47:31
in our DNA. Is Bitcoin something you've
1:47:33
looked at? Yeah. Because I have Bitcoin.
1:47:35
I've been, you know, I've owned Bitcoin
1:47:37
for a while. I still have spidey
1:47:39
senses about Bitcoin. To be honest, if
1:47:42
I'm perfectly honest, I don't like the
1:47:44
fact that 2% of wallets have 95%
1:47:46
of the coins. So it's... It's still
1:47:48
very consolidated. A lot of that is
1:47:50
consolidation because it's custodied for other
1:47:53
people. Yeah. There's also, the issue that
1:47:55
I have with Bitcoin, and obviously your
1:47:57
people are going to kill me online
1:47:59
if I criticize, you know, because they
1:48:01
all hail the Bitcoin. They roast me
1:48:04
anyway, by the way. Yeah. But like
1:48:06
I have a few issues with it,
1:48:08
that I just think it's, I see
1:48:10
that a lot, a huge amount of
1:48:12
its value is in being the leading brand,
1:48:15
that essentially it's the brand that
1:48:17
everyone trusts. But trusted brands don't last
1:48:19
for a long period of time. They
1:48:21
last for about 30 to 70 years
1:48:23
total. You know, so if all you
1:48:26
are is a brand... Oh, it's way
1:48:30
more than a brand. We won't go
1:48:32
into this now. We'll do it another
1:48:34
time. But anyway, there's issues that I,
1:48:37
there's some spidey senses I have about
1:48:39
it, but mind you, I own it.
1:48:41
I have some in treasury for the
1:48:43
companies, all of that
1:48:45
sort of stuff. With the digital revolution and
1:48:48
the AI you talk about, the one
1:48:50
great thing about Bitcoin that fits into
1:48:52
this is it's the only currency native to
1:48:54
the internet. Yeah. That we need. We
1:48:56
need that. Like, either it is
1:48:59
Bitcoin or Bitcoin is paving the way
1:49:01
towards something that becomes that. Yeah, I
1:49:03
think it will be Bitcoin because we
1:49:05
don't want it to be a one
1:49:07
country sovereign currency. So it has to
1:49:10
be a neutral currency. Bitcoin has established
1:49:12
itself. It has. But a lot of
1:49:14
the talk amongst some of the Bitcoiners
1:49:16
is how these AI bots and
1:49:18
agents will need a native currency to
1:49:21
be able to trade. And micro transactions
1:49:23
and all that. I think I feel
1:49:25
like we almost need two. We need
1:49:27
something that is a store of wealth
1:49:29
and something that is a high velocity
1:49:32
trading thing for micropayments. The other thing
1:49:34
too is when a currency is inherently
1:49:36
deflationary, people hoard it. So you get
1:49:38
two, the problem with sound money historically
1:49:40
is that two major problems happen. The
1:49:43
first problem is monarchies arise. So for
1:49:45
example, if. If you got early to
1:49:47
Britain and you were one of the,
1:49:49
literally if you were best mates with
1:49:51
William the Conqueror, you became a Duke
1:49:54
and now fast forward a thousand years
1:49:56
and the same families still have that
1:49:58
land. So essentially early Bitcoiners become little
1:50:00
monarchs and they have their own little
1:50:02
fiefdom and they can just literally live
1:50:05
off their Bitcoin. They don't have to
1:50:07
do anything for society. And you see
1:50:09
some of the Bitcoiners, they live like
1:50:11
little monarchs. Yeah, that happens, but you
1:50:13
also get the philanthropists out there as
1:50:16
well. Yeah, yeah, here's some cake. Hey,
1:50:18
you said to me, life is hard.
1:50:20
It's been hard for the majority of
1:50:22
time. It's going to be hard again.
1:50:24
So you get monarchies, and then the
1:50:27
other thing you get is you just
1:50:29
get people hoarding it and you run
1:50:31
out of supply. So, you know, throughout
1:50:33
all of history when gold has been
1:50:35
a major currency, the economy starts expanding
1:50:38
so well because sound money is great
1:50:40
for expanding economies, but then you run
1:50:42
out of the primary way of holding
1:50:44
wealth, and it always becomes the case,
1:50:46
fractional reserving it, debasing it in some
1:50:49
way, coming up with other fancy things,
1:50:51
right? So we, like, if sound money
1:50:53
was just purely a good idea, it
1:50:55
wouldn't have come to an end every
1:50:57
single time. And every single time we've
1:51:00
come up with something that is a
1:51:02
sound money thing. Well sometimes because governments
1:51:04
like printing money. Well, yeah, but if
1:51:06
sound money was such
1:51:08
an advantage... Governments like to print
1:51:11
money because the economy is expanding and,
1:51:13
like, you know, either you inflate the
1:51:15
value of gold, or you just run
1:51:17
out of currency and then you have
1:51:19
to say well let's create tickets for
1:51:22
gold and let's have more tickets than
1:51:24
there is gold, right? Just because no
1:51:26
one's actually accessing the gold. And then
1:51:28
let's just do away with the gold.
1:51:30
So they always go down the same
1:51:33
path, and that's because there are inherent
1:51:35
problems with sound money which is that
1:51:37
the economy expands faster than the money
1:51:39
supply can keep up with. I'm not
1:51:41
a debater, people know. I know you'd
1:51:44
go into that. I know you'd win.
1:51:46
No, I wouldn't win. And someone listening
1:51:48
to this would win, because it's not
1:51:50
my full-time gig to talk Bitcoin. But
1:51:52
I do know that historically, there's been
1:51:55
issues with, there almost needs to be
1:51:57
something, even Elon Musk said this, he
1:51:59
said, ironically, you'll need Bitcoin for storing
1:52:01
wealth, and you'll need something like Doge,
1:52:03
which is a bit dumb, because one
1:52:06
of the benefits of a currency that
1:52:08
inflates is that... there's no incentive to
1:52:10
hold it you have to spend it
1:52:12
and that creates a velocity of trade
1:52:14
yeah, and that means we just
1:52:17
get out there spending money and
1:52:19
moving the money around. So if
1:52:21
a currency inflates at about 2% per
1:52:23
year, right, it's a safe
1:52:25
amount. Well, just kind of, yeah. And
1:52:28
all it does is it
1:52:30
just kind of keeps you incentivized to
1:52:32
keep it moving, you know. So
1:52:34
economies actually do quite well you know
1:52:36
the fiat system actually has created a
1:52:39
lot of prosperity and wealth and made
1:52:41
a lot of magic happen. You know
1:52:43
we've achieved more, you could argue that
1:52:45
we've achieved more in society through dumb
1:52:47
stupid money, made up money in the
1:52:50
last 50 years. It hasn't been great
1:52:52
for everyone but we have achieved a
1:52:54
lot by being able to print the
1:52:56
shit. Daniel this has been great, really
1:52:58
enjoyed it. I didn't know what rabbit
1:53:01
holes we were going to go down.
1:53:03
I thought we were going to spend
1:53:05
most of the time slagging off Rachel
1:53:07
Reeves and laughing at the Labour government
1:53:09
but we didn't. It's inspirational. It's inspirational.
1:53:12
You've made me think about a lot
1:53:14
and certainly with my own kids. Thanks
1:53:16
for having me on the phone. No,
1:53:18
we'll do it again sometime. It was
1:53:20
just a really great, great time shooting
1:53:23
the shit with you. Keep doing your
1:53:25
thing man. I'm going to direct people
1:53:27
your way and tell them what you
1:53:29
do and tell them to look at
1:53:31
you. Your Twitter is great. I've really
1:53:34
enjoyed it. I started the show from
1:53:36
yesterday, the one I was listening
1:53:38
to last night, with Steven Bartlett. Yeah,
1:53:40
great. Love it. Love it. Thanks. Thank
1:53:42
you. Thank you very much.