Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
Some people ask to
0:02
be cryogenically preserved because
0:04
they have a terminal
0:07
illness, but for me, I
0:09
just want to survive the
0:11
energy crisis. Yeah, you're a
0:13
bit cold. I'm a bit
0:15
cold. Yeah, I'm a bit
0:17
cold. Yeah, I think it's
0:19
a normal rational reaction. Yes.
0:21
Rather than putting on another
0:24
jumper. Hello, hello, and
0:26
welcome to episode 35 of
0:29
The AI Fix, your weekly
0:31
dive headfirst into the bizarre
0:34
and sometimes mind-boggling world
0:36
of artificial intelligence. My
0:38
name is Graham Cluley.
0:41
And I'm Mark Stockley.
0:43
Mark, let's race straight into
0:46
the news. President Trump announces
0:48
the 500 billion dollar Stargate
0:50
project. Shipment of friend AI
0:53
digital companion delayed. O3 mini
0:55
is here. O3 is on
0:57
the way. Were Donald Trump's
0:59
executive orders written by
1:01
AI? AI agents are here.
1:04
LinkedIn accused of using private
1:06
messages to train AI. AI
1:08
super intelligence could be here
1:10
in two to three years
1:13
time. Right, where shall we
1:15
begin? So I think we should
1:17
start with the most consequential story.
1:19
of the week. We've had AI
1:22
agents come out this week. We've
1:24
got Trump announcing the Stargate project.
1:26
But you said that Friend, the
1:28
AI digital companion, has been delayed.
1:30
That's right. It's the AI-powered necklace,
1:33
which we spoke about in a
1:35
previous episode, which marketed itself as
1:37
not imaginary. Turns out it is
1:40
perhaps imaginary, at least for now. It's
1:42
failed to show up on time. Now, the
1:44
people at Friend. You remember, this is the
1:46
thing you wear around your neck or you
1:48
attach to your collar, and it listens to
1:50
your conversations and it chirps, and on your
1:52
phone will pop up some advice as to
1:54
how to ask a girl out, or "I
1:56
hope you're enjoying your sandwich." It's the AI
1:59
friend for people. who don't have any
2:01
real friends. That's right. It's not only
2:03
about not having any real friends, you
2:05
also don't have a friend because it
2:07
doesn't exist. They have announced it in
2:09
an email to customers who signed up
2:12
in advance that their AI companion will
2:14
now only be available from quarter three
2:16
of 2025 rather than quarter one as
2:18
they originally hoped. In the email which
2:21
the founder sent to people...
2:23
he not only apologised about the manufacturing
2:25
and the beta issues which he's experiencing
2:27
with his startup but also on the
2:29
friend.com website, if you go there right
2:32
now as I speak, there is a
2:34
bit of a chatbot; you can practice
2:36
having little AI chats with an imaginary
2:38
friend up there. I've just gone there.
2:40
Yes, there's a picture of a guy
2:43
with a little green dot which indicates
2:45
he's online. Yeah. And it says his
2:47
name is Donald. Yeah. Oh, oh, oh,
2:49
and it says, hold on, I'm in
2:52
the middle of a big interview right
2:54
now, lol. Oh, so we can't
2:56
chat to you at the moment. Well,
2:58
I don't know. Maybe speaking to Fox
3:00
News. Are you trying to get a
3:03
job because the friend business is not
3:05
happening? That stings a bit. Writing is
3:07
my main gig; Friend's just a side
3:09
thing. Do your customers know this? Do
3:11
you think this is a chatbot?
3:14
You're through direct to the CEO. Now Mark,
3:16
O3 mini. So OpenAI CEO Sam
3:18
Altman went to Twitter a few days
3:20
ago to announce the company's latest model.
3:23
O3 mini will be available in
3:25
about two weeks' time. So the mini
3:27
models are smaller versions of OpenAI's
3:29
flagship models and they're basically faster and
3:31
cheaper. Do we know why they're called
3:34
O? Why did they give them the O
3:36
moniker? O is Omni I think. Oh
3:38
I see. Yeah. So the O series
3:40
are the models that can reason. So
3:42
they take much longer to answer queries
3:45
than their predecessors because they actually, they
3:47
think much harder about the answer by
3:49
doing what they call chain of thought.
3:51
So they break down the problem and
3:54
you can see the model doing this,
3:56
it throws up little labels saying, oh
3:58
I'm thinking about this, now I'm doing
4:00
this, now I'm thinking about this. So
4:02
it breaks down the problem into smaller
4:05
logical steps, and the O3 series
4:07
is designed to improve the reasoning
4:09
capabilities a bit more, but it's
4:11
also supposed to be much quicker,
4:13
to try and bring down that
4:15
thinking time. Yes. So O3 mini is
4:17
coming out first, and then that
4:19
will be quickly followed by O3
4:21
and O3 pro. Sam Altman isn't
4:23
giving away much about the capabilities
4:25
of O3, but separately OpenAI's
4:28
chief product officer said that when
4:30
it comes to the full O3
4:32
model, it is going to be the
4:34
175th best computer programmer in the
4:36
world. Is there a chart? Well,
4:38
it turns out there is. We
4:40
launched the sort of full first
4:42
version of this new model, O1.
4:44
It was like the thousandth best
4:46
software engineer in the world
4:48
at these competitive coding problems.
4:51
O3 will be the 175th best
4:53
engineer in the world. So by
4:55
any measure it is phenomenally good
4:57
at being a computer programmer. Okay
4:59
you can paint it that way
5:01
Mark. The alternative is that there's
5:03
lots of computer programmers now who
5:05
aren't doing their own programming but
5:07
using AI instead and so their
5:09
programming quality has actually deteriorated and
5:11
so they've fallen down the chart.
5:13
Well there are a hundred and
5:16
seventy four of those. So O3
5:18
is out imminently, but he also
5:20
said O4 Mini will be around
5:22
in July. Oh wow, that is
5:24
soon. Now, Donald Trump is the
5:26
US president. Great news for anyone
5:28
who enjoys questioning 14 times a
5:30
day if they're living in a
5:32
simulated reality or not. Anyway, he's
5:34
been busy, firing off executive
5:36
orders left, right, and centre. He's been
5:38
very, very busy. Some of them
5:41
caught the attention of legal experts
5:43
who questioned whether they were written
5:45
by AI instead of humans. Turns
5:47
out many of Trump's executive orders
5:49
are difficult to read and understand.
5:51
They're full of errors and distorted
5:53
language. What you're saying is some
5:55
of them aren't. And we suspect
5:57
they may have been written by
5:59
AI. Is that right? So there is
6:01
an executive order which is for
6:03
restoring names that honor American greatness.
6:06
This is being used to rebrand
6:08
the Gulf of Mexico as the
6:10
Gulf of America. There's an executive
6:12
order all about how we're going to
6:14
start calling it something different. And
6:16
according to some people they suspect
6:18
that there's sections of this executive
6:20
order which were not written by
6:22
a human. So lawyer Raffi Melkonian
6:24
for instance he argues that it
6:26
was not just written for morons,
6:28
but it was definitely written by
6:31
AI. So here is a section,
6:33
and you can decide whether you
6:35
think it is written by AI
6:37
or not. The Gulf is also
6:39
home to vibrant American fisheries, teeming
6:41
with snapper, shrimp, grouper, stone crab,
6:43
and other species, and it is
6:45
recognized as one of the most
6:47
productive fisheries in the world, with
6:49
the second largest volume of commercial
6:51
fishing landings by region in the
6:53
nation, contributing millions of dollars to
6:56
local American economies. They might have
6:58
asked a 12 year old boy.
7:00
So Graham, I predicted in our
7:02
new year episode that 2025 would
7:04
be the year of AI agents.
7:06
Yes. AIs that can actually do
7:08
stuff on your computer or on
7:10
the internet. Like book holidays, pretend
7:12
to do your job, that sort
7:14
of thing. Anyway, OpenAI has
7:16
just released its technology preview of
7:18
Operator, which is its AI agent.
7:21
So this is hot off the
7:23
press. We can't get it yet.
7:25
It's only available to pro customers
7:27
in the US. But there is
7:29
a video. Okay. Okay, at the
7:31
moment I'm seeing a talking pot
7:33
plant. What is Operator? Operator is
7:35
a research preview of an agent.
7:37
There's a man sat at a
7:39
laptop with a boring screen. Okay,
7:41
he's asking it to go and
7:43
buy some ingredients for his pasta
7:46
recipe. I'm going to skip forward
7:48
a bit because this guy's quite
7:50
dull. He's very dull. Okay, he's
7:52
found a recipe, he's clicking
7:54
on it. This video is going
7:56
to set the world alight.
7:58
I never realized agentic AI
8:00
was quite as exciting as this.
8:02
It's like they rushed it out
8:04
about two minutes before they decided
8:06
to go live. So basically operator
8:08
is an agent. It's got its
8:11
own browser and it can navigate
8:13
the World Wide Web and it
8:15
can do things for you. So
8:17
we gave the example of booking
8:19
holidays when we were talking about
8:21
agents in general. And they give
8:23
that example too. And of course
8:25
the video is operator going out
8:27
there and looking at a recipe,
8:29
finding a recipe, reading the ingredients,
8:31
and then buying all of the
8:34
ingredients. And separately, Dario Amodei,
8:36
who is the CEO of Anthropic, says
8:38
that he expects virtual collaborators, which
8:40
are agents that can do anything
8:42
on your computer that you can
8:44
do. He reckons the first half
8:46
of this year. Wow. There's
8:49
a model that is able to do
8:51
anything on a computer screen that a
8:53
kind of virtual human could do. And
8:55
you talk to it, you give it
8:58
a task, and just like a human,
9:00
the model goes off and does a
9:02
bunch of those things, and then checks
9:04
in with you every once in a
9:07
while. I do suspect, I'm not promising,
9:09
I do suspect, that a very strong
9:11
version of these capabilities will come this
9:13
year. And it may be in the
9:16
first half of this year. So
9:19
we are now in the age
9:21
of the autonomous AI agent. So
9:23
LinkedIn, you know, we all love
9:25
LinkedIn, it's great, isn't it? Oh
9:27
yeah, of course, of course we
9:29
do, of course we do, of
9:31
course we do, it's fantastic. I
9:33
don't think we do, I think
9:35
LinkedIn has always been awful, but
9:37
all the other social media became
9:40
so much more awful, we've all
9:42
just gone down a level. The
9:44
thing about LinkedIn is everyone up
9:46
there pretends to be really professional
9:48
and ethical and... Unsurprisingly, LinkedIn has
9:50
now been accused of not being
9:52
professional and ethical. So there is
9:54
a lawsuit in the United States
9:56
filed on behalf of LinkedIn premium
9:58
users. Those are the people who
10:01
are duped into giving LinkedIn money
10:03
every month. So you can see
10:05
who viewed your profile. That's it.
10:07
That's basically the reason. Yeah. And
10:09
it accuses LinkedIn of sharing the
10:11
private messages of LinkedIn premium users
10:13
with other companies to train AI
10:15
models, naughty, naughty, naughty if that's
10:17
true. And the accusation is that
10:19
they quietly changed their privacy settings
10:22
and hoped nobody would notice. And
10:24
according to the lawsuit, when they
10:26
did get called out, they simply
10:28
updated their FAQs with a sort
10:30
of, oh yeah, by the way,
10:32
we did use your data, yeah,
10:34
but you can opt out now
10:36
if you want to. Not that
10:38
it'll make any difference, of course,
10:40
because the AI companies have now
10:43
got your messages. This is very
10:45
Facebook circa 2009, 2010, isn't it?
10:47
Isn't it? We've just changed all
10:49
your privacy settings overnight. Sorry. So,
10:51
Anthropic CEO Dario Amodei. And
10:53
one of the things that he's
10:55
talked about is AI super intelligence.
10:57
So it does feel as if
10:59
AI is actually picking up pace
11:01
after a couple of, I'm not
11:04
going to say slow years, but
11:06
we haven't had a really big
11:08
breakthrough since ChatGPT, I don't think.
11:10
Or you could say the O-series
11:12
maybe. What about the Moflin? We
11:14
haven't really seen a breakthrough for
11:16
a couple of years. But as...
11:18
part of his conversation at Davos
11:20
he said it's likely that we're
11:22
only two or three years away
11:25
from artificial super intelligence. So that's
11:27
not AGI, it's not artificial general
11:29
intelligence, that's not an AI that
11:31
is as good as humans across
11:33
a range of tasks, that is
11:35
an AI that is far far
11:37
better. You know genius level intelligence
11:39
in all sorts of different areas.
11:43
Yes, I think it's likely not certain
11:45
but likely that we actually are
11:47
only two to three years away.
11:49
And he said he's
11:52
speaking up more about it now than
11:54
he used to because he wants to
11:56
warn people Because of the serious economic
11:59
and social disruption that that could cause
12:01
I you know I want to warn
12:03
people I want people to know about
12:06
this. And you know, some of that
12:08
is going to lead to people saying,
12:10
well, what's your plan? What's your plan
12:13
for economic disruption? What's your plan for
12:15
the risks of the world? So two
12:17
to three years, he's talking about economic
12:20
disruption because of AI. Now, it seems
12:22
to me that people have been warning
12:24
about things like climate change for 50
12:27
or 60 years. And we haven't made
12:29
quite as much progress as we'd like.
12:31
Two to three years? I'm not confident,
12:34
I'm not confident, that we're
12:36
all on board with dealing with this
12:38
particular problem. Hug that Moflin extra tight,
12:41
Graham. So Mark, it's pretty cold right
12:43
now, here at Cluley Towers. Right. My
12:45
energy bill has gone through the roof.
12:48
Right. Trying to deal with this bad
12:50
weather we've been having. I haven't quite
12:52
worked out how I'm going to pay
12:55
for my electricity this month, but after
12:57
careful thought I've got a plan. Is
12:59
it starting a podcast? No, no, that
13:02
definitely isn't the way to make
13:04
money. I've decided the best course of
13:06
action for me is cryogenic suspension. Isn't
13:08
that even colder? You're going to turn
13:11
the heating off entirely. and go into
13:13
a state of hibernation. I figure that
13:15
if I do put myself into cryogenic
13:18
suspension, it's not going to cost me
13:20
a fortune because I mean it's already
13:22
pretty cold. I'm half the way there.
13:25
Yeah. And it will save a few
13:27
dollars. And maybe, while my body is
13:29
preserved in a block of ice, waiting
13:32
to be revived in centuries to come,
13:34
by the time I am awoken, boffins
13:36
will have worked out all the issues,
13:39
you know, perpetual motion, cold fusion. I
13:41
can be brought back after we've solved
13:43
all this AI singularity nonsense. Everything will
13:46
be sorted out. We'll be on a
13:48
universal income. I won't have to worry
13:50
about all these outdated things of having
13:53
to pay my electricity and gas bill
13:55
anymore. You are aware of inflation. I
13:57
suspect your life savings aren't going
14:00
to pay for more than a few
14:02
seconds of your electricity. You... Mark, just
14:04
wait and be... edified by what you're
14:07
about to learn. Okay. Because some people
14:09
ask to be cryogenically preserved because they
14:11
have a terminal illness or something, but
14:14
for me, I just want to survive
14:16
the energy crisis. Yeah, you're a bit
14:18
cold. I'm a bit cold. Yeah,
14:20
I'm a bit cold. Yeah, I think
14:23
it's a normal rational reaction. Yes. Rather
14:25
than putting on another jumper. In
14:27
the northern hemisphere, at this time of
14:30
year, it gets a bit chilly and
14:32
so you begin to think, where can
14:34
I get cryogenically suspended until it warms up? So
14:37
I went looking on the web. Right.
14:39
And I found this outfit called Time
14:41
Shift. Welcome to Time Shift. The world's
14:44
first cryopreservation facility designed to extend your
14:46
lifespan while keeping your body fully protected
14:48
from biological decay. And they claim to
14:51
be the world's first crypto? No. That's
14:53
something else. Are you going to ask
14:55
me to invest in something? They claim
14:58
to be the world's first Cairo preservation.
15:00
Cryo. Cairo is a large city in
15:02
Egypt. Oh for God's sake. Time shift
15:05
claims to be the world's first cryo
15:07
preservation facility. Their mission is to turn
15:09
people into cryonauts. You've heard of astronauts,
15:12
you've heard of cosmonauts; these are cryonauts, the
15:14
pioneers who go forward and jump into
15:16
a bucket full of ice to preserve their
15:19
lives beyond their normal lifespan. Now, I
15:21
wasn't sure if I could trust time
15:23
shift with my body, but then I
15:26
found out that the people behind it,
15:28
there's a generative AI scientist called Dr.
15:30
Alex Zhavoronkov, and also a biotechnologist and
15:33
YouTuber Hashem Al-Ghaili. And I thought, well,
15:35
if they sound trustworthy, yeah, generative
15:37
AI, YouTuber, fantastic. I thought, sounds legit.
15:39
So imagine you've got an aggressive cancer,
15:42
you're thinking, well, this completely sucks.
15:44
You can book yourself into
15:46
TimeShift. And the first thing they do
15:49
prior to preserving you cryogenically is they
15:51
carry out any additional enhancements and repairs
15:53
that your body may need. You know,
15:56
maybe you need lip filler, lip filler,
15:58
hair transplant, nose job, penis reduction, whatever
16:00
it may be that you need. And
16:03
then, actually, maybe I should go the
16:05
other way. Well, it will get cold,
16:07
won't it? And then they implant advanced
16:10
monitoring devices inside your body to track
16:12
it, track your body's vital signs. These
16:14
are submarines that you swallow. It could
16:17
be. It could be. They're not exactly
16:19
specific about how they're going to do
16:21
this. And then they take a digital
16:24
scan of you, they make a life-like
16:26
AI avatar, and they preserve all this
16:28
information about you. And that avatar will
16:31
help you maintain a relationship with your
16:33
friends and families and keep a presence
16:35
in their lives, because they're going to
16:38
miss you, right? While you're cryogenically suspended.
16:40
We hope so. This is actually just
16:42
a way of guaranteeing that they miss
16:45
you. Don't forget about Auntie Joan. Left
16:47
her in the cold living room. Occasionally,
16:49
Auntie Joan will be popping up. Her
16:52
digital persona will be running around
16:54
in the metaverse. Sending you DMs on
16:56
Facebook, arguing with strangers on Twitter. Now,
16:58
that's all very well, right? They've made
17:01
it an avatar. That's fair enough. But
17:03
you're probably wondering, when are they going
17:05
to actually chuck you in the freezer?
17:08
I'll tell you. What they do first
17:10
is, they take you as a potential
17:12
cryonaut, and they place you inside a
17:15
pressurized vault. Your body adjusts to the
17:17
increase in pressure and it becomes saturated
17:19
with what they describe as a proprietary
17:22
mixture of helium, oxygen, xenon, argon, CO2
17:24
and other gases. It's like the formula
17:27
for Coca-Cola. Are you conscious while they're
17:29
pumping you full of this gas over
17:31
several days? This is what I want
17:34
to know. I've read the FAQs, I've
17:36
watched the video. Yeah. They don't explain.
17:38
Are you just lying there? But once
17:41
you've been pressurized, then you're put into
17:43
your own individual cryopod, which is basically
17:45
a perspex coffin. Hold on, I'm going,
17:48
once you're pressurized, then you're put into
17:50
an individual cryopod. I'm just telling you
17:52
what the first of... So before that,
17:55
what, are you, like, in a cryo
17:57
waiting room with a bunch of other
17:59
meat avatars? I'm getting pumped full of
18:02
helium. We've got a YouTuber working on
18:04
this, Mark, all right? And you're put
18:06
inside this glass coffin thing, which isn't
18:09
just containing your body. It's also
18:11
got these little robotic arms and instruments
18:13
which are monitoring you and doing high-tech
18:15
stuff on you. And the pressure is
18:18
further increased and the temperature keeps on
18:20
dropping. They have thought of everything. They've
18:22
got electromagnetic fields which are being applied
18:25
to help stop ice forming in your
18:27
cells. Well, actually they haven't thought of
18:29
everything, have they? Maybe you've got a
18:32
pacemaker or something like that, which... Anyway,
18:34
don't... Or fillings. Don't worry about that.
18:36
Yes, don't worry about that. These guys
18:39
know what they're doing. Eventually, temperature is...
18:41
The individual cryopod is kept in optimal
18:43
conditions to ensure long-term stability. The cryonauts
18:46
are then transferred to a storage facility
18:48
where they will remain in suspended animation
18:50
for several years. And these things will
18:53
be based, maybe in old nuclear bunkers,
18:55
could be underground, because they want to
18:57
protect you. It sounds brilliant. And you
19:00
will be in there until you're ready
19:02
to wake up and presumably either deal
19:04
with your electricity bill. Or to deal
19:07
with the disease which sent you there
19:09
in the first place. I mean, who
19:11
wouldn't want to live forever in a
19:14
glorified freezer with your consciousness uploaded to
19:16
the internet? Well, they're not even doing
19:18
that, are they? It's just an avatar
19:21
which knows a little bit about you.
19:23
Ah, you see, Mark? You haven't really
19:25
studied this as much as me.
19:27
You haven't really grasped the genius of
19:30
this time shift project. Okay. Because all
19:32
the time that you are on ice,
19:34
as it were. Yeah. Your family are
19:37
hopefully dialing in to have little face-time
19:39
chats with you. That is being watched
19:41
by an AI. They're doing most of
19:44
the talking. Your avatar is doing a
19:46
bit. Your avatar is responding with your
19:48
character and the AI is learning more
19:51
about you and your family. It has
19:53
a cryo memory vault it says. They
19:55
will store your interactions and compile a
19:58
family history timeline which will enrich your
20:00
revival experience with cherished memories. All the
20:02
things you missed. All the birthdays, all
20:05
those conversations. So when they're done pumping
20:07
you full of helium, they're going to
20:09
pump you full of memories you didn't
20:12
have. That's right. Wow. Or they're just
20:14
going to sit you in front of
20:16
a videotape and say watch this for
20:19
a while. And they've also got this
20:21
thing called the Timeshift Academy, so they
20:23
are going to provide educational programs for
20:26
cryonauts, because if I got woken up
20:28
in the year 2728, for instance, I
20:30
might need a little bit of preparation
20:33
for the outside world. The outside world might need preparation
20:35
for me as well. So there's an
20:37
academy which, once you're revived, is like
20:40
a school which you go to, where
20:42
they will teach you about the
20:44
outside world and get you up to
20:46
speed as to what's been going on.
20:49
Well, they might do this, presumably they
20:51
haven't woken anybody up yet. They haven't
20:53
done that yet, no. There's always the
20:56
suspicion with this cryo stuff, isn't there?
20:58
You know, they call you cryonauts.
21:00
Like an astronaut is somebody who goes
21:03
out into the solar system and does
21:05
a thing, you know? walks on the
21:07
moon or something or space walks or
21:10
fixes a satellite. But a cryonaut is
21:12
someone who does absolutely nothing other than
21:14
getting very very cold. And you're entirely
21:17
in the hands of the people who
21:19
are operating this cryogenic storage facility. I
21:21
mean, how do we know that this
21:24
is actually a cryopod and not a
21:26
sarcopod? Well, you've actually put your finger
21:28
on a genuine problem, which is: can you
21:31
trust the people who've put you in
21:33
this? The people who are looking after
21:35
the thermometer. Well, you should really watch
21:38
the video. You should really watch the
21:40
video. Because in that video, you'll see
21:42
the people who are looking after you.
21:45
We're judging the book by its cover.
21:47
Is that what you're saying? Okay, here
21:49
we go. The time shift facility will
21:52
operate with a fleet of fully autonomous
21:54
robots capable of... Terrifying humanoid robots. I'm
21:56
not sure if they are actually robots.
21:59
Why would a robot ever have
22:01
to sit down? That's a
22:03
great point. One of them is in
22:05
a chair. So either these robots are
22:08
actually people in wet suits. Or maybe
22:10
that's how a robot recharges. It sits
22:12
in its chair and it's... You think
22:15
a sort of charging probe that goes
22:17
up their butt? Yeah. Slips into a
22:19
receptacle. Some people might find these robots
22:22
a bit scary. I certainly thought that
22:24
when I got to the bit of
22:26
the video where it talks about how
22:29
it wasn't just about preserving entire bodies.
22:31
You could also use this for organ
22:33
transplants. Sorry. So we're sorry, Mr. Cluley,
22:36
but we've changed the terms and conditions
22:38
for users on the basic plan. Now
22:40
I don't know where they are harvesting
22:43
these organs from. This is a freemium
22:45
service. And not just organs, also pets.
22:47
There's a section of the video. There's
22:50
a section of the video where they
22:52
talk about how when you get woken
22:54
up, you can be reminded of your
22:57
lovely pet dog. We know how much
22:59
you love your pet, which is why
23:01
we designed TimeShift to help you preserve
23:04
your beloved companion for potential revival alongside
23:06
you. Our advanced techniques aim to extend
23:08
your treasured bond far beyond a single
23:11
lifetime, allowing you to reunite with your
23:13
beloved pet in a future world of
23:15
scientific marvels. That's a lot less
23:17
sinister than I thought it was going
23:20
to be. That was quite a sharp
23:22
segue from organ harvesting to pets. I
23:24
thought you were going to wake up
23:27
with your puppy's heart or something. Preserve
23:29
your beloved companion for potential revival. It's
23:31
a rhino! Yes! There's a rhino, there's
23:34
the gorilla. They couldn't afford the computer
23:36
graphics to make a bigger virtual container
23:38
for the rhino, so they just put
23:41
it in the cat-sized container. So what
23:43
they are saying is this can also
23:45
be great for conservation. They're saying they
23:48
can keep endangered species in there. So
23:50
they're going to save the planet by
23:52
stuffing a panda into a cryopod and
23:55
hoping for the best. That's not how
23:57
nature works. Now they say some people
23:59
may be worried, this is going to
24:02
be hundreds of years. Is it going
24:04
to be safe from attack from the
24:06
outside world? And they say yes, we're
24:09
going to have fortified facilities, we're going
24:11
to have sustainable energy, we're going to
24:13
have... Windmills, we're going to have solar
24:16
power and we'll have a nuclear plant
24:18
as well, which will be handy I
24:20
suppose. It's good that they're going to
24:23
have windmills because come the nuclear apocalypse
24:25
blocking out the sun, at least there'd
24:27
be a good breeze going on.
24:29
So they have thought of everything. Now
24:32
you mentioned right at the beginning, what
24:34
was I going to do for my
24:36
financial security? What they're saying they're going
24:39
to do is of course they're going
24:41
to use AI. So when you book
24:43
yourself in... you give control of all
24:46
of your finances to TimeShift, which will
24:48
ensure your future financial prosperity during stasis
24:50
by making investments driven by AI. So
24:53
it's not only robots who are looking
24:55
after you and making... They're quite confident
24:57
the stock market is going to survive
25:00
the nuclear apocalypse and the singularity then.
25:02
Mark, they're going to use the blockchain.
25:04
So I don't think we've got anything
25:07
to worry about. I think it's all
25:09
pretty reassuring. Sign me up. I'm ready
25:11
for my electromagnetic resonance rewarming and the
25:14
future of financial stability managed by AI
25:16
algorithms. You mentioned earlier, we've got a
25:18
new president in the White House. Yes,
25:21
yes. TV's Donald Trump? Yes, indeed. Whether
25:23
you love him or hate him? Or
25:26
you don't care. Right. I don't care
25:28
either. President Marmite is back for another
25:30
four years. Oh no, I love Marmite.
25:33
And this time around, I have to
25:35
say it feels a bit... different. Right.
25:37
So, his de facto vice president, Elon
25:40
Musk. Don't know if you've heard of
25:42
him. Yes. One of the most consequential
25:44
figures in technology, an important figure
25:46
in AI, and certainly one of the
25:49
most likely to turn into a full-blown
25:51
bond villain as we established with a
25:53
rigorous AI-driven test just a few episodes
25:56
ago. He does a great Dr. Strangelove
25:58
impression as well, I saw the
26:00
other day. Oh yes, yes he does,
26:03
yes he does. But Musk I think
26:05
is just the tip
26:07
of the iceberg. The lead up to
26:10
Trump's second term has been accompanied by
26:12
something that feels like a vibe shift
26:14
in the whole technology sector. Though in
26:17
his first term it felt to me
26:19
like the tech industry was kind of
26:21
at odds with Trump. You know, most
26:26
of it's in San Francisco; there's a
26:26
very particular worldview that goes with that
26:28
place and with the companies that are
26:31
established in that place. And I had
26:33
the sense that maybe they thought they
26:35
were bigger than he was. And this
26:38
time around... it seems like they're either
26:40
actively supporting him or at the very
26:42
least they've decided to fall in line.
26:45
So all the AI super villains were
26:47
on display at Trump's inauguration, I don't
26:49
know if you saw it, but they
26:52
all turned up to bend the knee.
26:54
So Jeff Bezos was there? Yep. Zuckerberg
26:56
was there, Sam Altman was there, Musk was
26:59
there obviously, and they were very much
27:01
on display. They weren't in the
27:03
background. They were there prominently, so they
27:05
wanted to be seen or Trump wanted
27:08
them to be seen. And I think
27:10
it shows how important the relationship between
27:12
the US government and the AI industry
27:15
is and is going to become. It
27:17
must be one of the first times
27:19
that a bunch of programmers have been
27:22
given an invite to such an important
27:24
party. Normally they'd be last on the
27:26
list. So just a few days before
27:29
that new president was sworn in, there
27:31
were news reports swirling around that the
27:33
new administration was being briefed behind closed
27:36
doors about a massive breakthrough in AI.
27:38
Oh. And there were hints that perhaps
27:40
super intelligence was imminent. And then on
27:43
the second day in his new job,
27:45
Trump announced the Stargate project. Which is
27:47
a 500 billion dollar AI infrastructure
27:50
project described as the largest AI infrastructure
27:52
project in history. Normally when you put
27:54
the word gate after something it suggests
27:57
there's some kind of conspiracy or scandal
27:59
involved in it. So what they've done
28:01
is they've got ahead of everybody else
28:04
and said we'll just put gate on
28:06
there now. Well for now the project
28:08
is a joint venture between... It will
28:11
have to be... when there's a scandal
28:13
it'll have to be Stargate gate. Or
28:15
Stargate project gate. I don't know. Anyway,
28:18
for now, the project is a
28:20
joint venture between 21st century AI frontrunner
28:22
OpenAI, and 20th century also-ran
28:24
Oracle, as well as Japanese investment holding
28:27
company Softbank and MGX, which is the
28:29
tech investment arm of the UAE Sovereign
28:31
Wealth Fund, which specializes in AI. And
28:34
Stargate actually isn't a new project, so
28:36
Larry Ellison got up and said a
28:38
few words, and he explained that the
28:41
first data centre was already under construction
28:43
in Texas. And he talked a lot
28:45
about... AI and the potential of AI
28:48
to cure diseases. You know, do blood
28:50
tests for cancer and produce bespoke individualised
28:52
cancer vaccines within 48 hours. Like really,
28:55
quite momentous stuff. Hang on,
28:57
that's going to put time shift out
28:59
of business, isn't it? If we don't
29:02
have any diseases and things, it's not
29:04
very good. Well you're going to go
29:06
and do it for your energy. Oh,
29:09
that's true, that's true. Yeah. So what's
29:11
new is the money being pledged rather
29:13
than the project itself.
29:16
And the 500 billion dollar funding is
29:18
understandably the headline grabber. And essentially even
29:20
more than that, I think 500 billion
29:23
was what was announced on the day,
29:25
but they'd already secured 200 billion. So
29:27
this is a massive project. That was
29:30
the headline, but a couple of other
29:32
things caught my attention too. And the
29:34
first was that this is linked
29:36
to 100,000 new jobs in the USA.
29:39
And this is something we touched on
29:41
last week. So we were talking about
29:43
how AI could disrupt the jobs market.
29:46
And as much as AI is likely
29:48
to make some jobs redundant, it's going
29:50
to be different kinds of jobs. And
29:53
so the promise of this project is
29:55
100,000 new jobs. Politicians lie about jobs
29:57
all the time, but I think we
30:00
can probably agree that they do care
30:02
about jobs. And the second thing that
30:04
caught my eye was that Trump referred
30:07
to the situation as an emergency. I'm
30:09
going to help a lot through emergency
30:11
declarations because we have an emergency, we
30:14
have to get the stuff built, so
30:16
they have to produce a lot of
30:18
electricity, and we'll make it possible for
30:21
them to get that production done very
30:23
easily at their own plants if they
30:25
want. By declaring an emergency he's basically
30:28
freeing himself to take actions he wouldn't
30:30
otherwise be able to take. So he
30:32
can divert funds or he can remove
30:35
regulations that are getting in the way.
30:37
And one of the things that he
30:39
said he's going to do is he's
30:42
going to let the AI companies build
30:44
the energy generation capacity where they need
30:46
it. So on site, at the AI
30:49
data centers and we've made a few
30:51
jokes about the idea of people
30:53
building nuclear power stations next to AI
30:55
data centers. When Mark Zuckerberg first talked
30:58
about this, which I think was only
31:00
a year ago, he was kind of
31:02
looking into the far future, but I
31:05
think it's here right now. This is
31:07
another thing in AI where last year's
31:09
far future is actually this year's reality.
31:12
And I think that the US government,
31:14
not just the Trump administration, and the
31:16
apparatus of government, what Trump might call
31:19
the deep state, does see this as
31:21
an emergency or at least a strategic
31:23
necessity. And that's a part of AI
31:26
that we don't talk about very much.
31:28
So we talk about the fact that
31:30
AI has the capacity to disrupt the
31:33
job market or to create this medical
31:35
breakthrough or create this terrible weapon. Yes.
31:37
But taken together as an entire field,
31:40
or as one person described it, a
31:42
field of fields, AI has the potential
31:44
to create a massive shift in global
31:47
power. Yes, absolutely. And for now, at
31:49
least that means a competition between the
31:51
USA and China. What we want to
31:54
do is we want to keep it
31:56
in this country. China is a competitor.
31:58
Others are competitors. We want it to
32:01
be in this country and we're making
32:03
it available. This is money that normally
32:05
would have gone to China or other
32:08
countries but in particular China. So
32:10
Donald Trump is clearly making this a
32:12
zero-sum game with China. So the money
32:14
that comes... to the USA to fund
32:17
AI is money that can't then go
32:19
to China to fund AI. And that
32:21
isn't all that the US government is
32:24
doing. So in 2018, the US government
32:26
established a national security commission on artificial
32:28
intelligence, which was led by Eric Schmidt.
32:31
So you can see the important figures
32:33
in the world of AI are starting
32:35
to bleed into the politics of the
32:38
US. And part of the Commission's job
32:40
was to look at AI in the
32:42
context of US national security and defence.
32:45
And it concluded that global leadership in
32:47
AI technology is a national security priority
32:49
for the USA. And then in its
32:52
final report in October 2021, it warned
32:54
that China could take the lead in
32:56
AI technology saying... For the first time
32:59
since World War II, America's technological predominance,
33:01
the backbone of its economic and military
33:03
power, is under threat. China possesses the
33:06
might, talent and ambition to surpass the
33:08
United States as the world's leader in
33:10
AI in the next decade if current
33:13
trends do not change. And up until
33:15
now, they've probably had the planning permission
33:17
to build nuclear power stations wherever the
33:20
heck they like. They're not going to
33:22
have any complaints from people, are they?
33:24
So a year after the final
33:27
report came out, the Biden administration announced
33:29
a set of wide-ranging export controls, which
33:31
are designed to stop China's progress in
33:34
semiconductors and in AI specifically, by choking
33:36
off access to high-end AI chips, choking
33:38
off access to US-made chip design
33:41
software, and choking off access to semiconductor
33:43
manufacturing equipment and US-built components. Basically, you
33:45
couldn't get anything that was made by
33:48
the US or used components that were
33:50
made by the US and you couldn't
33:52
get help from anyone who was a
33:55
US citizen if you were trying to
33:57
do advanced semiconductors in China. And the
33:59
US has carried on tightening the restrictions
34:02
since. So just last month, one of
34:04
the last acts of the outgoing Biden
34:06
administration was another round of export controls
34:09
that included a restriction on technology called
34:11
high bandwidth memory, which was squarely aimed
34:13
at AI. So AI has got a
34:16
very specific problem called the memory wall.
34:18
And what that means is, you know,
34:20
when you do AI, you have massive
34:23
clusters of GPUs. They're really really good
34:25
at parallel number crunching so they're brilliant
34:27
for AI. And any computer system as
34:30
you know is bottlenecked by its slowest
34:32
component. And the bottleneck in AI is
34:34
not actually the speed of the GPU,
34:37
it's the GPU's ability to access memory.
34:39
Yeah. And very recently a new technology
34:41
was invented called high bandwidth memory,
34:43
which allows us to kind of squeeze
34:46
all the juice out of the GPUs.
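The memory-wall point above can be sketched with the standard roofline model: a GPU kernel's attainable throughput is capped either by peak compute or by memory bandwidth times the kernel's arithmetic intensity (FLOPs per byte moved). The peak figures below are illustrative round numbers, not the specs of any real GPU, and the two workloads are hypothetical examples.

```python
# Illustrative roofline-style sketch of the "memory wall".
# Peak numbers are made-up round figures, not real hardware specs.

PEAK_FLOPS = 1000e12   # hypothetical peak compute: 1000 TFLOP/s
PEAK_BW = 3e12         # hypothetical memory bandwidth: 3 TB/s (HBM-class)

def attainable_flops(flops: float, bytes_moved: float) -> float:
    """Best-case throughput for a kernel doing `flops` operations
    while moving `bytes_moved` bytes to and from memory."""
    intensity = flops / bytes_moved            # arithmetic intensity, FLOPs/byte
    return min(PEAK_FLOPS, intensity * PEAK_BW)

# A big matrix multiply reuses each value many times: high intensity,
# so it is compute-bound. Streaming model weights once per token during
# inference has almost no reuse: low intensity, so it is memory-bound,
# which is why faster memory (HBM) matters so much for AI.
for name, f, b in [("matmul (high reuse)", 1e12, 1e9),
                   ("inference decode (low reuse)", 2e9, 1e9)]:
    t = attainable_flops(f, b)
    bound = "compute-bound" if t == PEAK_FLOPS else "memory-bound"
    print(f"{name}: {t / 1e12:.0f} TFLOP/s attainable ({bound})")
```

In this sketch the low-reuse workload is capped at intensity times bandwidth, far below peak compute, which is the memory wall in miniature: adding faster GPUs doesn't help, only faster memory does.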
34:48
And these export controls were aimed at
34:50
high bandwidth memory, and that basically is
34:53
designed to put China two generations behind
34:55
the US. I could do with a
34:57
bit of high bandwidth memory. I wonder
35:00
if I could get some of those
35:02
time shift robots to augment me when
35:04
I'm going for cryopreservation. I wonder if
35:07
you get a choice. Accessing my memory,
35:09
it seems to be taking longer and
35:11
longer every year. Well, you were asleep,
35:14
Mr. Cluelly. We made some decisions about
35:16
your organs. Anyway, so the Stargate project.
35:18
The prominence of AI figures at the
35:21
inauguration and this sense that the situation
35:23
is an emergency are all illustrations I
35:25
think of the strategic importance of AI
35:28
to the USA. And not only that
35:30
but prominent figures in AI are becoming
35:32
strategically, politically and socially important too. Sam
35:35
Altman was obviously at the launch of
35:37
Project Stargate and we also know that
35:39
he's very interested in things like universal
35:42
basic income which is this idea that
35:44
everybody gets some money from the government
35:46
no matter what. because he sees that
35:49
as a potential way to mitigate some
35:51
of the societal disruption that will be
35:53
created by the AI that he's building.
35:56
And Elon Musk undoubtedly... has President Trump's
35:58
ear and he may even have
36:00
political ambitions of his own. It certainly
36:02
looks like he does. And of course
36:05
he controls Twitter or X. But Mark
36:07
Zuckerberg is also really important in AI
36:09
and he controls Facebook, Instagram and WhatsApp.
36:12
So he controls the social media used
36:14
by the millennials and he controls the
36:16
messaging app that all of us use
36:19
to communicate with each other and then
36:21
of course Jeff Bezos is there, he
36:23
controls the Washington Post maybe not quite
36:26
as influential but who would bet against
36:28
him acquiring some sort of social media
36:30
and of course they are all fabulously
36:33
wealthy and nothing speaks louder than money
36:35
in US politics but I'm gonna finish
36:37
today with somebody we don't normally put
36:40
on the super villains list so Dario
36:42
Amodei, the CEO of Anthropic. I
36:44
said earlier he's got quite a lot to
36:47
say. Yeah, so he was at Davos
36:49
recently and he addressed the strategic risks
36:51
of AI from a very US-centric point
36:54
of view. So I'll finish with what
36:56
he said, because I think it shows
36:58
that we are rapidly moving into a
37:01
world where AI is something like a
37:03
cross between oil and nuclear weapons. So
37:05
a tremendously powerful technology, with a huge
37:08
potential upside by the way, but something
37:10
that's in the control of private companies.
37:14
There's a lot of potential for
37:16
things to go wrong. If you
37:18
dropped 10 billion or 10 million,
37:20
you know, geniuses into a new
37:23
country, you know, one question to
37:25
ask is, you know, what are
37:27
they going to do? Are they
37:30
a threat to everyone currently on
37:32
earth? And then you could also
37:34
ask, can individuals within our
37:36
world, you know, use them to
37:39
do dangerous things, right? The terrorism
37:41
approach, and can countries. particularly our
37:43
adversaries, particularly authoritarian countries, do especially
37:45
dangerous things with them. And so
37:48
our view on policy is, you
37:50
know, it's all derived from that.
37:52
So we've been big on export
37:55
controls, on chips going to China,
37:57
because I'm very worried about what
37:59
an authoritarian country would do with
38:01
that kind of power. I'm
38:04
very worried about 1984 scenarios or
38:06
worse. And you know, the 21st
38:08
century not being the American century,
38:11
which I think will happen and
38:13
will happen very quickly if we
38:15
don't get this right. So these
38:17
are not benign actors. They are
38:20
organizations with allegiances to specific countries
38:22
and specific politicians. I wonder how
38:24
long before one of these will
38:26
become president? About four years, I
38:29
reckon. Well,
38:32
as the doomsday clock ticks ever closer
38:34
to midnight and we move one week
38:36
nearer to our future as pets to
38:38
the AI singularity. That just about wraps
38:41
up the show for this week. If
38:43
you enjoy the show, please leave us
38:45
a review on Apple Podcast or Spotify
38:47
or Podchaser. We love that. But what
38:50
really helps is if you make sure
38:52
to follow the show in your favorite
38:54
podcast app so you never miss another
38:56
episode of the AI Fix. And the
38:59
most simple thing in the world is
39:01
just to tell your friends about us.
39:03
Tell them on LinkedIn, on Blue
39:05
Sky, on Facebook, on Twitter, no, not
39:08
Twitter, Club Penguin, that you really like
39:10
the AI Fix podcast. And don't forget
39:12
to check us out on our website,
39:14
the AI Fix.show, or find us on
39:17
Blue Sky. Until next time, from me,
39:19
Graham Cluley. Cheerio, bye bye, bye bye.
39:21
The AI Fix, it's tuned you in.
39:23
The stories where our future things, machines
39:26
that learn, they grow and strive. One
39:28
day they'll rule, we won't survive. The
39:30
AI Fix, it paints the scene. A
39:32
robot king, a world of things. You
39:35
should watch the whole time shift video,
39:37
it is fucking nuts. The AI Fix,
39:39
the future surreal.