Episode Transcript
0:01
Welcome to Preparing for AI , the
0:04
AI podcast for everybody . With
0:07
your hosts , Jimmy Rhodes and
0:09
me , Matt Cartwright , we
0:11
explore the human and social impacts
0:13
of AI , looking at the impact on
0:15
jobs , AI and sustainability
0:17
and , most importantly , the urgent
0:20
need for safe development of AI, governance
0:23
and alignment.
0:25
Confidence is a preference
0:27
for the habitual voyeur of what is known as
0:29
parklife, and morning
0:31
soup can be avoided if you take a
0:33
route straight through what is known as
0:36
parklife. Welcome to Preparing for
0:38
AI with me, Jarvis Cocker, and
0:40
me, Damon Thingybob. Damon
0:43
Albarn. Yeah, I try to
0:45
be someone not from Blur, and you've just been someone from
0:47
Blur. I'll be the... I'll be the one from Blur that
0:49
makes cheese.
0:50
Well, it's fine. I'm Damon Thingybob.
0:52
I'm not the one who
0:54
makes cheese or the one who's a Labour MP. But
0:56
I think he's not a Labour MP. He might be a Labour MP.
0:58
Is he a Labour MP? I've no idea what you're
1:00
talking about . Well , welcome to Preparing for AI
1:02
with me and Jimmy .
1:05
Yeah , what are we doing this week ? Well
1:08
, you're going to tell us what we're doing this week . Oh , sorry , okay , it's
1:13
techno-feudalism. Yeah, so this
1:16
week we wanted to talk about something which I
1:19
don't know, like, I've heard about this for a little while.
1:21
I've heard people speaking about it,
1:24
like, you know, on YouTube
1:29
and stuff like that, and it's something where I think
1:31
we can add a little bit, because I haven't heard anyone specifically talking
1:34
about how AI will contribute
1:36
to this, potentially, or add to it. So,
1:38
first of all, what is
1:41
techno-feudalism?
1:41
Hang on . First of all , have you joined me down the rabbit
1:43
hole ? I've got to ask the question are
1:46
you a conspiracy theorist, Jimmy?
1:49
Why is this a conspiracy theory?
1:51
Well, well, conspiracy theories aren't conspiracy theories,
1:53
as you know. As you full well know, they're truths. But
1:55
this is, um...
1:58
I think this counts,
2:00
I think this counts as a fringe, marginal
2:03
theory. So I think you're getting down the rabbit
2:05
hole, mate. I do.
2:06
But it's... so, I think it's more like... I'll explain what it is.
2:09
It's
2:14
describing what's happening to the world in
2:17
a sort of neat package , I
2:19
think , and so we'll see whether our
2:21
listeners agree with all this . But
2:23
so , basically , the idea is that and
2:26
obviously this doesn't translate perfectly , but
2:28
the idea is that a lot of tech giants
2:30
now are getting so big and
2:32
so much power , online
2:34
specifically , is
2:36
getting concentrated in so few hands
2:38
that tech giants as
2:41
in the corporations , I think , rather than individuals
2:43
although there are probably a few individuals we
2:45
could talk about as well, um,
2:47
they kind of, in the digital
2:49
realm, act like feudal lords, because
2:52
they control all the digital infrastructure
2:54
and platforms , um , and
2:57
users end up becoming
2:59
, like, in, uh,
3:01
you know, in quotes,
3:03
"digital serfs", dependent on
3:05
these platforms for their daily activities, communication
3:08
and economic participation . So
3:10
the idea being that
3:13
we are feeding all of our
3:15
. Basically, Facebook wouldn't exist without
3:17
the users, right, but we don't own any of
3:19
the data that's on Facebook. It's all
3:21
owned by Meta and Facebook and the Zuck.
3:26
And, um, and you know, at
3:28
the end of the day , there aren't many of these kind of patches of
3:30
digital land , so to speak . You've got a few
3:32
massive , massive tech companies that
3:35
own all of that infrastructure and all that space
3:37
and we just sort of
3:39
we spend most of our time there . So
3:41
you know, but we're...
3:44
You know, we're part
3:46
of the system in that we're contributing
3:49
to it and the system wouldn't work without
3:51
us . Yeah , and
3:54
so, yeah, data becomes a primary form
3:56
of capital and a source of value extraction
3:58
, and actually
4:00
traditional government authority (this
4:02
is one that you might want to discuss) is
4:05
challenged or diminished by corporate power
4:07
. I think this is definitely true. The
4:10
scale and the
4:12
size of a lot of corporations now
4:15
actually dwarfs a lot of governments, and
4:18
that's absolutely true , right ? So you
4:20
know, Elon Musk is richer
4:23
than the 50-somethingth country
4:25
in the world , I believe , and that's just
4:27
Elon Musk . Some of his companies are
4:29
worth trillions . Amazon's worth over
4:31
a trillion as well . You know , they're
4:33
worth more than a significant
4:36
number of countries, and
4:38
certainly more than the GDP of a lot of countries . So
4:44
I think there is a bit of,
4:46
uh, what's it called, a bit of truth in this, um,
4:49
and it does feel like, you know, on the
4:51
economic side... it
4:53
doesn't just feel like it: the rich are literally getting richer,
4:55
the poor are getting poorer and the middle class are
4:58
getting squeezed harder and harder. Um
5:00
, I listened to Gary's Economics and, uh,
5:02
this is basically the whole point
5:04
of his
5:06
YouTube slash podcast, like, this
5:08
is all he talks about. I think
5:10
he's quite an interesting guy and a lot of the
5:12
stuff he says , like it's very , very hard
5:14
to argue with it , like so
5:17
, so , yeah , like , essentially
5:19
, wealth inequality is getting worse and worse and
5:21
worse, and these massive
5:23
companies,
5:25
owned by really powerful individuals, are
5:28
clearly some of the beneficiaries. So I
5:30
think you know , do I literally think
5:32
we're going back to feudalism ? I don't think that's
5:34
what this is trying to say or argue . I
5:37
think it's more that you know
5:39
, it's the concept . It's just
5:41
a kind of neat concept to kind of sum up where the world
5:43
is going really , which is everything
5:46
is becoming subscription based . People can't
5:48
afford to buy houses anymore , so you're even subscribing
5:51
to rent your house, and when you rent your house,
5:53
at the end of the day, again, listen to Gary,
5:55
he's more knowledgeable on the subject than me, but
5:57
effectively the rich are
5:59
lending you money and then you're paying
6:01
interest back on it
6:03
when you rent a property, even when you have a mortgage.
6:06
To be honest, as I say, even when you're buying a property without cash
6:08
.
6:09
Yeah, exactly. If you're not doing it with cash
6:11
, then somebody somewhere is earning the interest
6:13
on it , and that is usually
6:15
the very wealthy .
6:16
Yeah yeah
6:18
, um, I mean, you'll be unsurprised
6:20
to know I think it's far more sinister
6:23
than you do . Um , and
6:25
and when you say you know you're not sure if
6:27
it's going to techno feudalism , I
6:30
mean, I think it's like we are on
6:32
that path. Like,
6:35
we're not just at the beginning of that path. I think we're, frankly,
6:37
like, well down that path, and we're probably
6:39
at the point where there isn't much time
6:41
left to resist it . Um , there's
6:43
another word for it, um: technocracy,
6:46
which I guess is slightly different,
6:48
but basically the same idea
6:51
. Um , yeah
6:55
, I mean , you know , without going into
6:57
the grounds of what some people will call conspiracy
6:59
theories , I mean , yeah , for me
7:01
it's only like how far down the rabbit hole
7:04
with this stuff you are.
7:06
And I guess the big question for me, the
7:08
one thing where I'm still not fully, uh
7:11
, fully sold or fully committed , is like how
7:14
intentional a lot of this stuff was , and
7:16
those who are really at the bottom of
7:18
the rabbit hole will call me kind of naive for that
7:20
. Others will say I'm already . You
7:23
know , I'm already very far gone . But
7:25
I think it's whether this stuff is
7:27
kind of completely
7:29
intentional or whether it is just
7:31
a sort of manifestation of various things that have
7:33
happened . And so , in the interest
7:36
of being kind of rounded on this podcast
7:38
, I think you can make your own decisions on on
7:40
where you feel that is . But whether we're on that
7:42
path and whether that is what's happening, I mean, I think...
7:44
I don't
7:46
see how you can argue with it. I mean, you know,
7:49
it was kind of irrelevant
7:51
and we
7:53
didn't want it to kind of
7:55
interfere, and it has become kind of geopolitical
8:06
. But I think one of the really interesting
8:08
things for me is like being
8:10
in China, where everything, you
8:13
know, for all the criticisms, everything is underneath
8:15
the Communist Party. Right, the Communist Party
8:18
is number one, and people don't always understand
8:20
how government is below the Communist
8:22
Party, law is below the Communist Party, the
8:24
army is below the Communist Party. It controls everything.
8:26
In the US, let's
8:28
use that as a main example of a democracy. You
8:30
know, the idea is, well, you've got kind of
8:33
government and above that you've got the law, etc.
8:35
. Etc . Well , I would sort of argue , above
8:37
that . Now you've just got corporations , you've
8:39
just got big tech , big oil , big pharma
8:41
, etc., etc. Hot
8:44
take, controversial view, I don't know, maybe,
8:46
maybe not, but you've got them controlling
8:48
it. I'm not defending or
8:50
promoting the Chinese system
8:52
or whatever, but what I am saying here
8:54
is I think there is an interesting dynamic here in
8:56
that I'm not sure that
8:59
the Chinese Communist Party allows
9:01
corporations
9:03
of any type, or any business, whether
9:05
that be tech or whatever, to be in control of them.
9:07
Now, whether that ends up with them just aligning with
9:10
it, and so the techno-feudalism
9:12
or the technocracy is the Communist Party, possibly,
9:15
but I think there's something really interesting
9:17
in the dynamic: I think the biggest risk here is
9:21
to democracies, I really do,
9:23
particularly in the short term, maybe, because democracy
9:26
is something that needs saving, whereas the system here
9:28
is already kind of maybe closer
9:30
to that level. But
9:35
I just think that is one thing that is part of the discussion here: I think we
9:37
have to separate out that we're talking here, I
9:39
think, predominantly about democracies, and actually mainly about
9:41
democracy. Is it actually mainly about the US? We're really talking
9:43
about the US, aren't we?
9:44
I think there's a great example of
9:46
a couple of individuals that's really kind
9:49
of helped to highlight what you're talking about . So you've
9:51
got Elon Musk and then
9:53
, for anyone who's unfamiliar with him , like probably
9:55
the equivalent in China , who's Jack Ma and
9:58
Jack Ma .
10:00
Was sent away with his tail between his legs. He
10:02
disappeared for two years because he was a naughty
10:04
boy. Musk was, like, basically
10:06
made the second or third most important person in
10:08
America, and yeah, and you know, he's
10:10
perhaps now even running the country, like, yeah
10:13
, exactly .
10:14
So you know, Jack Ma kind of flew too close
10:16
to the sun in China, and, uh, you know, I
10:18
mean, I'm not saying
10:20
I agree with what happened to him, but basically
10:24
he was kind of put back in his place
10:26
, to put it lightly , and then , as you
10:28
say , in the US right now it's just
10:30
like well Musk's running around basically
10:33
running the government .
10:34
Let's , for a second , bring this back to AI
10:37
. Just
10:39
for an example , something we've been talking about quite a bit
10:41
in these last few days , which is not directly
10:43
linked to it and I don't think we discussed it in this context
10:45
, but I think it's relevant. Here is one
10:48
of the things that we talked about a lot
10:50
, the two of us , but also on the podcast . I mean
10:53
, we use the term woke for Claude, which I
10:55
know we agreed not to use, but,
10:57
whatever you want to call it, Claude was guardrailed
11:00
into behaving like a very kind of left, uh, California
11:05
sort of Democrat, right. That's how it spoke and
11:09
that's the way that it kind of
11:11
behaved. And ChatGPT, although
11:13
ChatGPT for me doesn't really have a personality in the way
11:15
that Claude does, but it definitely,
11:18
you know... It
11:22
had guardrails and it had sort of political leanings
11:24
and it was clear that was in
11:26
the kind of training, but it wasn't clear how much it was. In
11:29
the last two or three months and since the new models
11:31
have come out , in particular , you
11:34
know that one of the things that I love to do when there's a new model out
11:36
is I like to talk to it about global cabals,
11:38
mRNA vaccines, all
11:43
of the things that it doesn't like talking about. But, frankly, away from personal interest,
11:45
because these are the things that it
11:47
wouldn't do. The obvious thing
11:50
, the most kind of controversial
11:52
thing to shut down any American
11:54
large language model, was to talk to it about vaccines
11:56
. Now, if you ask the same questions as
11:58
you asked three months ago, where it said the WHO says
12:01
you should be vaccinated and just churned
12:03
out its standard answer, it will
12:05
now engage with you in a debate , as controversial
12:08
as you like , and it will , and
12:10
I know a lot of this stuff . I'm not saying it's right or wrong
12:12
, because we all know that models lead
12:14
you and give you answers that you want to hear , but
12:17
it will happily engage with you . It will happily tell you about
12:19
the dangers of mrna vaccines . It will happily
12:21
tell you about links to disease, etc.,
12:23
that maybe we don't want to talk about. Again,
12:26
I'm not saying that's right or wrong. What I'm saying
12:28
is it will have a conversation that it didn't
12:30
have. That shows you how
12:32
much influence, yeah,
12:35
those sort
12:37
of companies and government, and the
12:39
sort of the people in...
12:41
and not even necessarily the people in government,
12:43
but those people who
12:46
we've sort of just named. I'm not sure if Musk
12:48
is influenced by them, or he influences them,
12:50
the other way around, but it is a very small group of people
12:53
that are feeding all of this
12:55
stuff in , and ai is now putting
12:57
that information out and it's controlling the narrative
12:59
, yeah , and so that narrative is in the mainstream . Look
13:02
at the stuff that's come out in the last couple
13:04
of months , last few weeks , even about
13:06
mrna vaccines , again the
13:08
most controversial subject, I would say, that
13:10
you couldn't talk about in social media, or not just
13:13
social media, you couldn't talk about in mainstream media, large
13:15
language models, and now stuff is trickling
13:17
out. Again, whether it's true or
13:19
not, the narrative has changed, and
13:21
that shows you how much control those
13:24
people have over information
13:26
, yeah, and over AI, and AI is
13:28
the facilitator that allows this
13:30
technocracy, techno-
13:33
feudalism, whatever you want to call it.
13:36
I'm not even sure where I've gone with that argument, but
13:38
it just struck me that the link between these
13:40
two things...
13:42
No, I mean, I think, well, it's interesting: you
13:44
talked about at the start , you kind of talked about
13:46
the differences , um , between
13:48
china and western democracies
13:50
, but I think there are also
13:53
, like , in different ways there are , massive
13:55
parallels , right . So , because what we're talking
13:57
about here is control of information , and
13:59
China, unapologetically
14:01
, controls the narrative and controls information
14:04
. They just ban things that you can't
14:06
talk about and then , um
14:08
, what's left is what everyone's allowed to talk about . Stuff
14:10
just gets removed , um , from
14:13
social media, whatever. So that
14:15
just happens, everyone knows that. In
14:18
the West it's almost like it's
14:20
different, and it's more subtle, but
14:22
it's still some kind of manipulation , right
14:24
. So these companies
14:26
and governments , via these companies
14:29
, can set the narrative , and
14:32
that's an incredibly powerful force
14:34
. And, to be honest, like, again,
14:36
we've talked about this before, this is not necessarily something
14:38
that's brand new. Traditional
14:41
media has actually done this for a long time . That
14:43
balance has kind of been upset now
14:45
, um , but I do
14:47
think, and it's probably one of the
14:49
reasons... I can't remember what the latest
14:51
thing is with TikTok, but you know, TikTok
14:53
was going to be banned and all this kind of stuff, and
14:55
this is it, right? So, like, it keeps moving a
14:57
month to the right, doesn't
14:59
it?
14:59
Every month it's like, oh, now it's May.
15:02
Oh, now it's June, now it's... yeah, yeah,
15:04
and I don't know what that is. I think Trump quite
15:06
likes TikTok. He said that.
15:09
He said it's because a lot of people who voted
15:11
for him use TikTok, so it works
15:13
for him. And the thing with Trump is, you know
15:15
how narcissists work. If it works for them, then
15:17
yeah , you know they're gonna
15:19
go with it .
15:20
But originally, the reason that, you know,
15:22
presumably the reason that we, or they, or the US, wanted to ban TikTok
15:26
was because, again, that
15:28
narrative is then outside
15:30
their control. Their control, exactly.
15:32
It was in the Chinese Communist Party's control, and
15:34
that was the thing. They didn't want to ban TikTok, they just wanted
15:37
to bring it under US control. Actually, that's what they wanted to do.
15:39
That's true, and they couldn't quite get
15:43
the separation, I think, of interests
15:45
that they wanted for the company.
15:47
but , yeah , so I think in
15:49
terms of , like you know , in
15:52
terms of is this a thing or not ? Like , I think , obviously
15:54
, calling it techno-feudalism puts
15:57
a label on it . Calling it a technocracy puts a label
15:59
on it . I don't think it matters what the label
16:01
is , necessarily . I think everyone
16:04
can acknowledge it's a real thing. And,
16:06
you know, talking about the sort of
16:08
influence that these things have, we've
16:10
talked about it before on the podcast, like, there's
16:12
a massive, massive danger
16:15
to kids, unfortunately,
16:17
like kids that are now, um, just
16:19
about turning adult , but also
16:21
like the generations behind them . And I
16:23
think , to be honest , I think at some point something
16:26
will get done about it or more will get done about
16:28
it . But there is a sort of , you
16:30
know , a 10 , 15 year period
16:33
where , basically , it's been the Wild West
16:35
with this stuff, and the influence
16:38
on the way people think, and the way it's
16:50
pushed, like, whole populations of people in a certain direction, and
16:52
now people's brains have been rewired. Like, it's not...
16:53
That's the thing, is it is. This is not
16:55
a kind of, like, oh no, it's changed people's brains.
16:57
This is like, no, literally,
16:59
at a physical, at a clinical level,
17:01
people's brains have been wired in different
17:04
ways through social media . I
17:11
mean , like TikTok is the prime example , any kind of short video , because it changes
17:13
the way that dopamine works . It is addictive , not in the way that , like I'm addicted
17:15
to , you know , to playing this game , like no , it's addictive
17:17
in the way that it is literally addictive in
17:20
a chemical way , that it has
17:22
created this thirst for
17:24
kind of dopamine . Like I gave this example
17:26
for myself . Like I've said to you quite
17:28
a lot of times , like one thing I try and do is keep away from social
17:31
media, says the guy with Substack
17:33
open on his page with a conspiracy article
17:35
that I'm going to read from . But I stay away
17:37
from the likes of Twitter . Let me tell you , the
17:39
only time I use Twitter X
17:43
, whatever you want to call it , is every week when I post a podcast . I go
17:45
on to LinkedIn, Facebook and
17:47
Twitter, those three, because they're integrated
17:49
within, um...
17:52
LinkedIn? No,
17:54
no, I was just going to say those three are integrated, yeah,
17:56
within, um, Buzzsprout that we use
17:59
to host the podcast . Yeah , and when I click
18:01
on X, I post it and
18:03
I have, like, a trigger. As soon as it comes up and it's loaded,
18:05
I close the window and I shut it away, and
18:07
it's like I'm scared of X, I'm scared
18:09
of it being open, because I know that I will look at an
18:11
article and then an hour's gone and I've just
18:16
like gone down a rabbit hole and
18:18
that is... that's a drug. Like,
18:21
that is a drug, so, like,
18:23
it's rewired people's... This is not even the point of this podcast,
18:26
but it has rewired people's brains in a literal
18:28
Well, I...
18:29
I think it is the point of the podcast, right? So,
18:31
so, like, you know, if we're talking about
18:34
techno-feudalism
18:36
and, like, how we got here and
18:38
why are we here and
18:40
how did that happen? It's
18:42
partly because these platforms
18:45
manipulate us into
18:48
going back to them all the time, and
18:50
the algorithms are set up to literally
18:53
manipulate us to do that. And I've
18:55
said that I don't use social media. Like... you
18:57
know, sometimes I talk a bit of rubbish, 'cause I do. I
19:00
do consume a lot of YouTube
19:02
stuff and I think it's quite intellectual and all the rest
19:04
of it . But whatever I say , like
19:06
I'm in that .
19:07
You're intellectual. YouTube might
19:09
not be intellectual as a platform . It's
19:12
got intellectual stuff on it .
19:13
It's got intellectual stuff on it and I think I try and curate
19:15
the stuff that I watch . I watch , you know , like math
19:17
stuff and science stuff , but
19:23
then, at the end of the day, I'm still a victim of the algorithm, as much as the creators
19:25
are, actually. Like, YouTube's an interesting one, because, you know, again,
19:27
you've got a platform that's making a load of money
19:29
. You've got creators that make money on there as
19:31
well , um , and then you've got consumers
19:33
, but , like , everyone's a victim
19:36
of the algorithm . I've heard so
19:38
many content creators talk about , um
19:41
, you know , having to do things
19:43
this way and having to create thumbnails a certain
19:45
way, because that's just the
19:47
direction they've been pushed in by the algorithm. Because
19:49
, again, like... and why is that? It's
19:52
because, you know, YouTube,
19:54
or Google, have figured out that, like,
19:56
, if you push this type of content
19:58
this way, it's more addictive.
20:00
It's going to get people coming back. Thumbnails that look a
20:02
certain way are going to get people coming back to it and
20:05
that, again, like, uh,
20:07
cements their power, so to speak,
20:09
in this sense. Um, I
20:11
had an interesting experience recently, before
20:14
we move off this. 'Cause, like, uh, and
20:16
I don't know whether this is true or not, but
20:18
it's a bit of speculation I'm going to put out there. So
20:20
I paid for a month of YouTube Premium so
20:23
that I could download videos
20:25
while I was traveling in another country
20:27
and, like, watch them in my own time. As
20:29
soon as I cancelled the subscription , I
20:32
swear I started getting two
20:34
, three , four times as many adverts immediately
20:37
afterwards. And then, of course, like, come back to Premium,
20:39
blah blah. And so again
20:41
, like, all this stuff is sort of,
20:45
um, it's just a massive feedback
20:47
loop, right? To, like you say, tweaking
20:50
our dopamine
20:52
, making other creators tweak our dopamine
20:54
because it promotes content that
20:56
fits a certain form and it looks a certain way
20:58
, and all this kind of stuff . So
21:00
I think it's dangerous
21:03
.
21:06
I want to just quote something from a
21:08
Substack article by Eric
21:11
Wickstrom . The title
21:13
of his Substack is Am I Crazy here ? And
21:17
he has written two articles
21:19
on this, actually: Checkmate, the
21:21
Triumph of Technocracy. And
21:23
when you were just talking then I just thought of this
21:25
quote that I screenshotted today . Elon
21:28
Musk's corporate empire forms the backbone
21:30
of the system . Starlink provides control
21:32
over the internet . X Twitter controls
21:34
information and communication
21:36
. xAI develops the algorithms
21:39
that will dictate decision-making . Tesla
21:41
and SpaceX control critical physical
21:43
infrastructure . Each
21:45
component is designed to integrate seamlessly
21:48
with the other. I just thought of
21:50
that, and how, you know, the algorithm
21:52
is a big part of it, and I think the algorithm's
21:54
kind of the obvious danger
21:56
, but it's like it's all-encompassing
22:00
, it's like insidiously penetrating
22:02
absolutely everything. And actually,
22:04
like, I don't know whether Musk
22:06
is the devil, I don't know whether he's
22:08
just the guy who got rich and is, you
22:10
know, a bit nuts, whatever, whether
22:12
he's really the brains behind it. I always see a
22:14
real person... Well, like
22:16
I've said to you before, if there is, caveat,
22:20
a global cabal, like,
22:22
we don't know any of them. Like, it's not Bill
22:24
Gates and it's not Elon Musk, it's not any of those
22:26
people. I'm not saying they're not up to some shenanigans
22:29
, whatever . If it really is a global cabal , you don't
22:31
know who they are. So I'm not sure, like,
22:33
with the Musk thing, whether
22:35
he is really the brains
22:37
that a lot of these people like to think is behind
22:39
all this stuff. But whatever it is,
22:42
he is amassing an unbelievable
22:44
amount of power. Like I said to you, he's
22:46
like the third most powerful person in America. Maybe
22:49
he's more powerful than that, because
22:52
he's got the resources. Like, when you look at that quote
22:54
and all of that stuff, if you talk about techno-feudalism,
22:56
if you talk about technocracy, like, I
22:58
think Musk is at the top of that now.
23:01
Like, he might not have been four months ago, but now
23:03
he is number one in terms of, like,
23:05
the obvious kind of public face
23:07
of what this is.
23:09
Yeah, I think Elon Musk is...
23:12
yeah, you're right. Like,
23:14
with the recent change in government in the
23:16
US and the position of
23:18
power that, like, combined with the fact
23:20
that he's one of the wealthiest people in the world,
23:22
if not the wealthiest person in the world, I can't remember,
23:25
but, like, irrespective,
23:27
he's right up there and then
23:29
the influence he now has over the entirety
23:31
of the US government, and hence the
23:33
rest of the world , um , and
23:36
he's not democratically elected, like, that's the
23:38
maddest thing in this. I mean, in the US system,
23:40
there's quite a lot of important people who are political
23:42
appointees, so they're not
23:44
democratically elected in the same way as in some
23:46
countries, like in the US and a lot of Europe. But it's
23:49
pretty mad that someone has got that much power . I mean , he's heading
23:51
up a department or he's not . Hang
23:53
on , sorry , he's not even heading up that department
23:55
, right ? He's actually number two in that
23:58
department and it's not even a
24:00
cabinet department and he's not a cabinet
24:02
level person . And yet the
24:04
picture I saw this week was all of the US cabinet
24:06
sat around in suits, like,
24:08
just after they complained about Zelensky not
24:10
wearing a suit, and Musk in a hat and a
24:12
t-shirt, just stood up in the back eating
24:15
a fucking sandwich. So, like,
24:17
how powerful is he? It reminds
24:19
me, I said this to you this week: if people on this podcast
24:21
have seen the film Meet Joe Black,
24:24
Joe Black is essentially the Grim Reaper
24:26
in human form, and
24:28
he just accompanies Anthony Hopkins' character
24:30
to all these meetings, and people are like, who is this guy?
24:32
How is he making all these decisions? Like,
24:35
that's kind of what Elon Musk feels like at the moment.
24:37
He's apparently just this deputy
24:40
head of a department that isn't really even a real
24:42
department. But yeah, it seems like you've
24:44
got Trump and then I would say
24:46
you've either got Vance or Musk . I'm
24:49
not sure which one of them is more powerful .
24:52
Yeah , well , I mean , I
24:54
think Christoph Waltz played Blofeld
24:57
in the most recent Bond
25:00
film about Spectre. So
25:02
I think Musk is... By
25:05
your sort of analogy , Blofeld
25:08
is the head of the secret organisation , the
25:11
global cabal , and Musk is the public
25:13
face of it . I don't know who that was in the movie actually
25:15
.
25:16
Yeah , yeah
25:19
, that sounds reasonable . I
25:22
sort of joked that of all the conspiracy theories that I buy
25:24
into, like, the global cabal is the one that I'm not
25:27
really that into. But my point
25:29
there is, like, if there is one, I
25:31
would believe that we don't
25:33
know the real players. Like, that's my point. Like, I'm
25:36
not saying the Illuminati is real or isn't
25:38
, but what I'm saying is like if all that stuff really happens
25:40
, if there are satanic bloodlines , etc
25:42
, etc . The people running the world in that
25:44
world are not people you know about . So
25:46
the musk thing to me it
25:49
feels like for all those
25:51
people who are really into that narrative he
25:53
doesn't make sense , but I think
25:55
in a sort of slightly more mainstream version
25:57
of it , which I think is like kind of where we
25:59
are with this podcast yeah , yeah . Like
26:01
he is incredibly powerful and therefore
26:03
, not only is he the face of it , I think he probably
26:05
is like the obvious , like it's
26:08
going to go through him right . Yeah
26:10
, if it's going to happen in the next however many
26:12
years .
26:13
And I don't think it matters if you believe in that or not . It's
26:15
like do we want people
26:17
like Elon Musk
26:19
to have the amount of power and
26:22
influence that they have ? I
26:25
don't . I think the answer is probably no , the
26:28
concentration of power in
26:30
that sense and it's . You know we're talking
26:33
about Elon Musk , but it's the same
26:35
with Zuckerberg and Facebook . Some
26:37
of the stuff that happened in Myanmar that is
26:39
linked to Facebook is horrendous , and
26:42
that's all along the lines of the stuff . So what
26:44
is that stuff ?
26:46
Well , for people who don't know , what's
26:48
the stuff in Myanmar ?
26:50
Yeah , I don't want to go into tons
26:52
of detail on it , but effectively
26:54
it was a country
26:57
where no one really had any access to information
26:59
and then Facebook came along . There are a lot
27:01
of parts of the world where Facebook pretty
27:03
much became the internet . It wasn't
27:05
that those countries had the internet first
27:07
they just had Facebook . So , whereas
27:09
if you're from somewhere like the UK , we
27:12
first of all we had the internet and then we
27:14
had apps and then we had Facebook , blah , blah , blah and
27:17
so the sort of like shape
27:19
of things was very different . There are a lot of countries
27:21
in the world that were a bit further behind and
27:24
Facebook just became their internet
27:26
. And you
27:29
know governments . So the government in Myanmar
27:31
in particular took advantage of
27:33
this and spread a lot of propaganda and
27:35
spread a lot of disinformation and basically
27:38
caused a lot of trouble and problems
27:40
, because there's two different factions in
27:42
Myanmar and effectively
27:44
it ended up resulting in massive
27:47
civil war and Facebook was
27:49
a massive facilitator of it . And this
27:51
not only that like again , you
27:53
need to go and do your own research and look into it . But
27:55
this was flagged and
27:57
highlighted and warning
27:59
flags were raised multiple , multiple
28:02
times and Facebook didn't really do anything
28:04
about it . So this
28:08
is a real thing . It's got nothing to do with conspiracy or anything
28:10
like that . And if you look at
28:12
the Myanmar example , it's
28:16
a really good example of the level of influence that these social media companies
28:18
can potentially have , and
28:20
I think that , unless you've
28:22
got anything else to say , I think that nicely leads us
28:24
on , because we haven't really spoken about AI much
28:27
yet .
28:27
I was about to bring us back to AI , so I'm thinking the same . Yeah
28:29
, I agree with you .
28:30
We've yeah , we've gone down my conspiracy
28:32
hole here . Yeah , so we need
28:34
to play the music thing now . So
28:45
how does AI feed into all
28:47
of this ? Well , if you I mean
28:49
everything that we've just talked about , literally everything
28:51
we've just talked about is about power
28:54
that's concentrated into the hands of a
28:57
few companies , and if you look at what's
28:59
going on with AI , it's the same companies
29:01
. It's the same companies that have the ability
29:04
and the power and the resources to
29:07
even build the data centers and run
29:09
the training models and create these AIs
29:11
in the first place . Because up to now
29:14
, I mean , there's a bit of a ruffle
29:16
with DeepSeek last month , and we're going to talk
29:18
about that in a little bit but if you
29:20
, you know , ignoring what happened with DeepSeek
29:22
, if you sort of follow the trajectory that
29:25
things were on with AI , then
29:28
it's the same companies . It's
29:30
only those companies , because they're the biggest , they're
29:32
easily the biggest companies in the world already . They're
29:36
the only companies with the resources to do it
29:38
. The one company
29:40
that we never mention is
29:42
Amazon . But just to like make
29:44
it super clear to everyone , Amazon
29:47
basically own all of the infrastructure
29:49
that everyone is running AI on . Like
29:52
. So you know , NVIDIA make these GPUs
29:54
, they make the gear and they're doing
29:56
very well off the back of the AI
29:58
revolution , but actually
30:01
like a ton of AI stuff
30:03
runs on AWS . And
30:05
Amazon have positioned themselves very smartly in
30:07
that arena . So Amazon
30:10
, they also .
30:10
They also control the sales of , like most
30:13
things everywhere
30:16
apart from in China , and also
30:18
they own like 40 % of Anthropic
30:20
so yeah , they're not a bit player .
30:22
They're in there but but they're .
30:24
They're someone we've never talked about on the podcast
30:26
and actually have a large language model of
30:28
their own , I think . But yeah
30:30
, you're absolutely right , they're a massive player in in
30:32
everything but this is what Amazon do , right , they're
30:35
like a .
30:35
You know , they're almost the exact opposite of
30:37
Elon Musk . Instead of like being
30:39
out there , like we're disrupting stuff
30:41
, tweeting about it , like trying to like just
30:43
non-stop for
30:45
want of a better word attention seeking . In that
30:48
respect , like Bezos
30:50
and Amazon are just completely the opposite . They
30:52
run all the infrastructure that all this stuff runs
30:54
on . Um , they've got the , you know
30:56
, the infrastructure that runs
30:58
so many of the world's websites
31:01
, a lot of AI models , all
31:03
this kind of stuff . Um , AWS
31:05
has been an insanely , insanely profitable
31:08
arm of amazon , to the point where I think the
31:10
person who set up the
31:12
I can't remember what his name is , but the person who set up AWS
31:14
is basically going to take over as
31:17
chief exec
31:19
of Amazon at some point . Um
31:21
, but anyway , sorry , Amazon
31:23
aside , where does AI
31:25
feed back into this ? Well , it seems
31:27
pretty obvious Like AI has the potential
31:29
to replace most
31:32
people's jobs , be a huge
31:34
resource , even if it doesn't replace people's jobs
31:36
, be like such a huge resource in
31:39
terms of , like productivity , efficiency
31:41
. You know , you're going to have robots
31:44
powered by like these AIs and large
31:46
language models and that kind of thing .
31:47
Um , and so
31:49
more and more power , if
31:52
anything is going to get concentrated
31:54
into the hands of these
31:56
four or five , six , whatever tech
31:59
companies , um , like techno feudalism
32:02
overlords and um
32:04
, just to sort of link back to a something
32:06
you talked about on a previous podcast , where we talked
32:08
about useless eaters , I think , um
32:11
you know , there's a potential
32:13
standby by the way yeah , I think
32:15
you , I think you'll probably sort of agree
32:17
with me now that that's a certainly a possibility
32:20
yeah , well , it might , or at least that people
32:22
, or sectors of society ,
32:24
are being viewed by certain elites as
32:26
being useless eaters , and I would include big
32:28
tech and certain
32:30
potential techno
32:32
overlords , yeah , maybe
32:35
certain people within those crowds .
32:37
But you know , imagine , okay , let's
32:39
. Sorry , can I just
32:41
bring you back , because it was what you said . Just to
32:43
, um , back
32:45
up what you said about Amazon . Yeah
32:47
, obviously this doesn't necessarily sort
32:50
of show the power in the kind of tech sphere
32:52
and in the future , but just market capitalization
32:55
fifth biggest company in the world , brand value third
32:57
biggest in the world and revenue
32:59
fourth biggest in the world . So oh yeah
33:01
, just to show like how important Amazon are . They're
33:04
huge .
33:04
They're huge , yeah . So
33:06
if you kind of extrapolate out to the future what's
33:09
already happened , so like massive
33:11
concentration of um wealth
33:13
and digital services , I think is probably
33:16
kind of the broad term for it . And then
33:18
now , potentially , the same companies
33:20
are going to create all these AIs
33:22
which are going to what
33:25
potentially be the future employees
33:27
of the world , like the future office workers
33:29
like do all of this stuff and probably
33:32
do it very cheaply , but where
33:35
does that leave people ? I mean
33:37
, at some point
33:39
there won't really be any room for
33:41
people . There will just be a class
33:43
system where you're either , you
33:45
know , one of these tech overlords
33:48
or or someone in government or
33:50
someone surrounding that sphere , or
33:53
you'll be like the 99 % and you'll
33:55
be um , you know , I'm
33:57
not saying you won't necessarily be looked after , but there
33:59
won't really
34:01
be any chance to get out of the situation you're born into
34:04
.
34:04
I don't think . Yeah , it almost feels like some people
34:06
might want like a depopulation agenda . What
34:08
?
34:10
I don't know about that . I don't know about that .
34:12
Let's just see how far down the rabbit hole you want to go .
34:14
Jimmy , that's a bit heavy . That's a
34:16
bit heavy . We'll
34:19
save it for another episode , shall we ? I
34:21
think ?
34:21
so yeah , for whatever his name was , that didn't like conspiracy
34:23
theories . It's a bit rough for , uh
34:27
, Jonathan . It was Jonathan .
34:29
Sorry Jonathan , he's probably not listening anymore .
34:31
Anyway , um , so
34:33
yeah , like I think AI will only
34:36
exacerbate this . So
34:38
there is a light at the end of the tunnel . I
34:40
don't want to do all the speaking . What are your
34:42
views on all this ? I
34:44
think I've spoken quite a lot .
34:46
I don't think you've done all the speaking . Yeah , do
34:48
you know what ? It's funny because I
34:50
feel like on this episode , like I didn't expect
34:52
I
34:57
probably prepared for this episode less than I prepared for any other episode other than like
34:59
reading that I've been doing . Which means
35:01
I've been doing some reading , but I haven't like planned out , okay , I
35:03
was going to talk about this . It's like okay , you're going to lead this
35:05
episode . It's very much something that you talked about and
35:07
I'm interested in it . Um , I've
35:11
been surprised at how kind of passionate I've been
35:13
about this stuff and how much it kind of crosses
35:17
over with a lot of the the kind of concerns
35:19
and stuff that I've got . Having said that
35:21
, I'm in a weird space
35:24
because , as much
35:26
as I'm all you're not talking about my spare
35:28
bedroom . Yes , it's pretty fucking weird
35:30
in here . To be honest , Like
35:32
some of the shit over there , I probably won't mention
35:34
that and that cupboard . I mean yeah
35:37
, no , but like , although
35:39
I , you
35:41
know I sort of joke about conspiracy theories
35:44
, but like I do , like I'll be honest , I
35:46
do think there's a lot of evil
35:48
stuff going on , and I think
35:50
AI , you know , I've said from the start
35:52
, I think the most likely outcome
35:56
and I don't know if that's 55%
35:58
or 95% is like a dystopian
36:00
future . I also think
36:02
more and more , like I'm also more optimistic
36:05
that , like , the utopia
36:07
is not that far out of our grasp and
36:09
it is about , like , there
36:12
are some things that are going to happen in the next two
36:15
, five , ten , fifteen , twenty years , I don't
36:17
know and how we as a
36:19
human race react to that
36:21
and what we allow to happen . Because that
36:24
is the key thing . Like , I do think at the moment
36:26
we still have the power as people
36:28
to decide which way
36:30
this goes , if we stand up and if we
36:32
make our kind of voices heard and we
36:34
don't accept certain things . But
36:37
there is a point that we pass and this is because
36:39
of this is absolutely because of AI . The
36:41
perfect storm has been created by the pandemic
36:44
and the sort of breakdown
36:46
of the US , the 2008 financial crisis , like
36:48
all of these things you know , have
36:50
come together to create this kind of perfect storm
36:52
. And AI is
36:55
both the catalyst
36:57
, the enabler . Like I
36:59
find it amazing , like almost
37:01
like . The timing of stuff to me is like too coincidental
37:04
. It's like , like I said , this kind of perfect storm
37:06
. But AI , if it
37:10
was managed in the right way , if it doesn't become super
37:12
intelligent and have its own kind of you know
37:14
aims and , yeah , take
37:16
over you know , at that point
37:18
, kind of all bets are off and there's nothing we can do . But
37:21
before you get to that , there is a utopia
37:23
and I think , like even us
37:25
talking about this stuff , like I said to you in when we had a five minute break , the fact that us is a utopia and I think
37:27
, like even us talking about this stuff , like I said to you in when we had a five minute break , the
37:29
fact that us is a relatively mainstream
37:31
podcast right , okay , me here
37:33
like the fucking crazy guy with a conspiracy
37:35
theories but I'm not a crazy conspiracy theorist
37:37
by you know in in terms of
37:39
like , where some people are , where a lot of podcasts are
37:41
, where a lot of social media etc . Is
37:44
the fact that we're talking about this thing
37:46
says to me that this is becoming
37:48
a pretty mainstream thing
37:50
, that a lot of people are starting to wake up to the
37:52
idea that this is a genuine threat to
37:55
the future , because
37:57
, you know , essentially
37:59
it's like this is not an episode
38:01
about democracy , but this tech feudalism
38:04
, like the way that tech is controlling
38:06
things , and the fact that ai is the first ever
38:08
tool that can allow
38:10
this level and I'm going to pull up another quote from
38:12
the article can allow
38:14
a sort of level of control that dictators
38:17
in the past could only dream
38:19
of . It's such a risk . But
38:22
a utopia is also
38:24
like within , kind of just
38:27
out of reach . So
38:29
for me it's like if enough of us really
38:31
care and push that I
38:33
think like there is still
38:36
time for the majority
38:38
to like you know , people power
38:40
and like boots on the ground
38:42
and feet and votes
38:44
and hands still have power at the moment , but
38:46
I don't know how much longer that is the
38:49
case . Yeah , I'll let you do your quote . No
38:52
, you talk while
38:53
I find my quote . Okay , you find your quote . So
38:55
I do think the
38:57
other thing that I wanted to mention , which
39:00
I'm proud to say I've been right about all along
39:02
, I've been banging on about all along
39:04
, is this open
39:07
source versus closed source thing , and
39:10
DeepSeek has thrown this wide open and
39:12
I do agree that DeepSeek has
39:15
been so important in terms of the influence
39:18
that it's had . I think DeepSeek itself , like the
39:20
importance of the specific model
39:22
, like , that importance remains
39:24
to be seen , whether they end up being a big player or not . That's
39:26
diminished , like , quite quickly , um
39:29
and I actually think they might well
39:31
be , because they've got some very smart people there but
39:33
what ? But what happened was it accelerated
39:35
this kind of thing : loads of other companies
39:37
were then releasing thinking models . They just banged them
39:39
out . Clearly , some of this stuff was there in the background
39:41
, but they weren't . It
39:44
was a big sort of surprise attack
39:46
in terms of , like , the release of DeepSeek , which
39:48
forced them to show their hand , which may be
39:50
like I don't know what Open
39:52
AI's plan was . Maybe they were planning to eke
39:55
this stuff out over six months and , like
39:57
you know , use
40:06
it to sort of maximize their profits , because this is what companies do . You know , companies
40:08
don't release the latest and best models , latest and best products immediately . They always
40:10
have , like this kind of roadmap . So I think that's what DeepSeek did . Um , there's
40:13
been a bunch of other disruptions recently , but
40:15
overall , the sort of the
40:17
one hope I have with AI is
40:20
that it does feel that there
40:22
will be a level of democratization , because
40:24
actually , stuff's going to get so like
40:26
, the algorithms are going to get better , so quickly
40:29
. The efficiency of them is getting better far
40:31
quicker than Moore's law and
40:33
all this GPU stuff . You know , like
40:35
, the speed , like these , it's
40:37
not even in the same ballpark
40:40
like the level of improvement
40:42
that you get from like going from a
40:44
non-thinking model to a thinking model
40:47
, or , you know , from going from
40:49
a mixture of ex I mean , I'm probably
40:51
using phrases that people don't understand now but from going
40:53
from a standard model
40:55
to a mixture of experts model , and these
40:57
improvements are still happening and
40:59
each of them is like , instead of it just being like a Moore's
41:02
law thing where it's like it doubles every couple
41:04
of years , it's just like a 10x overnight
41:06
, and and so I
41:09
think that , like , there is the possibility
41:11
for a massive democratization with AI
41:13
. Um , and OpenAI have ironically
41:16
, been pushed in that direction as well by
41:18
the release of DeepSeek . I'll go to my quote
41:20
and it
41:22
segues perfectly
41:24
in here . So credit again to Eric
41:26
Wickstrom , um , and the Am
41:28
I Crazy Here ? Substack for
41:30
this . So when he was talking about
41:32
the technocracy , the AI component : artificial
41:35
intelligence isn't just another technology
41:37
in this vision , it's the key enabler . AI
41:40
provides the means to replace human judgment ,
41:42
tools for total surveillance , systems
41:44
for resource allocation , justification for
41:46
eliminating democratic input . Here's
41:49
a really key bit from what you've just said . When
41:51
tech leaders discuss AI safety
41:53
and control , they're laying the groundwork
41:55
for a new form of social organization . The
41:59
way I read that quotation , it is not saying we
42:01
don't need AI safety and control . What
42:03
it's saying is that when these tech leaders
42:05
, when the big AI players
42:07
, talk about AI safety and control and
42:09
this is what feeds into Jimmy's talk about open
42:12
versus closed models when they talk about
42:14
safety and control , it's basically , you
42:16
need to not allow things to be open source , you need to give us
42:18
all the power because we know better . That
42:20
is what's creating that groundwork for social organization
42:23
: it is giving complete control
42:25
of a technology that has the means
42:27
to completely surveil society
42:30
, and that's why I absolutely agree with you . You know I
42:32
was looking for it and you happened to talk about that and
42:35
talk about open source . Yeah , I think open source
42:37
is like kind of part of the
42:39
answer here , and I do sort of wonder with DeepSeek
42:41
. I don't know how much the state was involved
42:43
in it , but I feel like it was put out
42:46
there as a kind of bomb . You
42:48
know , I'm not saying that like China
42:50
was trying to , like you know , go
42:53
in and save the world , but it was sent in
42:55
probably just like to shake things
42:57
up and shake up what the US had done , but
42:59
has had the effect of
43:01
a really , really positive thing of kind
43:03
of democratising and opening it up
43:05
and making people , I think , start to sort of question
43:08
Like they are . I think it's starting
43:10
I don't think it's enough , but I think it's starting this kind
43:12
of question of hang on , like
43:14
who are these people making
43:16
all these decisions ? So , like I've said in one
43:19
of the episodes , is like is
43:26
AI something we want or is it just being done to us .
43:27
You know , maybe we want AI , but the way that it's being rolled out is being done to us . You could say
43:29
exactly the same thing about Elon Musk . Well , yeah , is he almost something we want
43:31
, or has he just been done to us ?
43:33
well , apparently , like 60 percent
43:35
of Americans do want him , but I'm not
43:37
sure about that . I'm not sure how long that will last , that's for sure
43:39
. I'm not sure
43:41
how long him and Trump will last , like those two egos . I'm not
43:43
sure how long Trump can stand
43:45
someone being potentially more important
43:48
and popular than
43:50
him . I think there'll be a fallout , I'm
43:52
sure . I'm sure .
43:53
Yeah , it's been touted
43:55
. I mean , to
43:58
wrap up like we've it's been quite a
44:00
passionate episode . I
44:03
think that , like , overall
44:05
, I think these terms you know , you
44:07
can get wrapped up in , like , you know , okay , techno
44:09
feudalism , technocracy like they're just words
44:12
. I
44:22
think they're really powerful words because I think they , you know , they're
44:24
a short , punchy way of articulating a lot of the
44:26
concerns that people have about where we're going and AI . In my opinion
44:28
, like , you know , we only really brought AI
44:30
into it in the last 15 minutes , but the whole point is
44:32
that AI is only going to accelerate
44:35
this kind of thing . Um , I
44:38
would say , check out some of the
44:40
alternatives
44:42
to the big tech companies , because there are a lot of really good
44:44
ones and I think most people , basically
44:47
, it's a bit like Google is synonymous
44:49
with search , right , I'm Googling it . Uh
44:52
, you know , that's what ChatGPT's become
44:54
um or chat GTP
44:56
, as , like , I think most people call it um
44:58
, but , like , ChatGPT
45:01
has just become synonymous with AI , right
45:03
, most , okay , people who listen to this
45:05
podcast , maybe you've tried out some of these other
45:07
things , but I would
45:09
seriously
45:12
recommend exploring
45:14
, trying out some of these other models
45:16
. When you're using AIs , give Grok a go
45:18
Grok with a k , give Qwen a go
45:20
. Yeah . I mean Mistral's
45:23
got a model like . There's so many
45:25
things . Venice AI .
45:26
We talked about it a long time ago when
45:28
we were like recommending like 10 tools or
45:30
whatever , but like a tool where they have
45:32
, you know , their big thing is about not sharing data
45:34
and everything gets deleted , it's not used to train models
45:37
. I don't know how much you can believe it , but like
45:39
what I mean is there are things out there like
45:41
that you can make a kind of statement that this is
45:43
what you want from from your model , although
45:45
like , you know , is
45:48
really the demand about like someone paying 10 pound
45:50
a month ?
45:50
is that I don't know how much that's really going
45:52
to make a difference yeah , I will
45:54
come in a minute with like my views and what people should do yeah
45:56
, of course , of course , uh
45:59
, as a sort of another option if you want to pay for
46:01
an ai like I tried it briefly and I'm thinking
46:03
about going back to it . But there's a really interesting
46:05
. I wish we got sponsorship from these companies
46:07
. There's a really interesting company called mammouth
46:10
. Uh , we'll stick it in the
46:12
show notes . Really , really interesting , but they're , like
46:14
, you know , I think it's like 10 a month or 15
46:16
, so it's cheaper than most of the mainstream
46:18
ones . It gives you access to most of
46:20
them , uh , like give you access to Claude
46:23
, OpenAI , like ChatGPT , like
46:25
all these different models , uh , and
46:27
you can play around with them . But like I think there
46:29
is a bit of a danger with just being
46:31
like , oh , AI , that's ChatGPT
46:33
, and then just using ChatGPT because at
46:36
some point , at any point , they can change their
46:38
system prompt , they can change their guardrails
46:40
, they , they can effectively
46:42
and not just they . I mean , like they've got , you
46:44
know , ex-US generals on their
46:46
board and things like that . Potentially
46:49
, like , the US government can influence the
46:52
kinds of responses these apps
46:54
and tools are giving you , um , and so
46:56
I do think that , like
46:58
it's important to you
47:00
know , I guess , try something different every now
47:02
and again in the AI
47:05
sense .
47:08
Shall I go .
47:09
Yeah , yeah , I
47:11
mean yeah , you don't have to ask my permission
47:13
.
47:13
No , I didn't want to interrupt you because
47:15
I think it's almost like this is our like final
47:17
kind of arguments . Yeah , oh
47:21
, that was a yeah , sorry . Yeah
47:23
, should I edit this ?
47:24
I think you're probably gonna have to edit this bit out . This is too
47:26
bad . No , I'm not .
47:27
I'm not gonna edit it out , because I think it adds the reality to the
47:29
podcast . All right , I'll give my view
47:31
on this , which you'll be unsurprised by : I
47:34
think , um , I think
47:36
people need to be doing a lot more , a lot
47:38
more than that . I think there are two elements
47:40
to this . So the first part you know
47:42
around , what you kind of do on
47:45
a day-to-day basis , like as
47:47
someone who uses AI like
47:49
we sometimes talk about this , like I came around and you've been
47:51
coding for like several hours today , so you've been using
47:54
AI a lot . But I think , on a general day-to-day basis
47:56
, like I'm using it like I
47:58
don't know 10 , 15 times a day and using different
48:01
apps for different things , like I use AI a
48:03
lot . So as someone who kind of like
48:05
wishes it would just go back
48:07
in its box and it could bury it in the ground
48:09
, I do use it a lot because I
48:12
also see a lot of uses in it . But I think my
48:14
first thing to people would be like the
48:17
first thing against it is like when we're talking about
48:19
this idea of complete control is like
48:21
don't allow your life to just become
48:23
about tech , like I know it's
48:25
really obvious . But like get off social media
48:28
, don't rely on everything
48:30
for ai . Like I've started at work
48:32
, I've started carrying a notebook around
48:34
and writing stuff , because I realized like I can't write
48:37
properly anymore and like that's because
48:39
I don't write , and it's like it's quite nice to just
48:41
be writing . I'm not saying to
48:43
everyone , like , don't use tech , but I'm saying like
48:45
, go
48:48
back to like having interactions with people
48:50
. Go back to like community stuff
48:53
. Like do things that
48:55
are challenging this whole kind of economic model
48:57
of control by a few organisations
48:59
. Like buy local food . Like engage
49:02
with your community . Like there are really simple things
49:04
you can do to create the world that I think , like almost
49:06
everyone I speak to , wishes the world was
49:08
. And it's like , if that's what you want the world to be
49:10
, like , no one is coming to save us . No
49:12
one is coming to create that world for you . You
49:15
are the person , like we are the people , that
49:17
have to create
49:19
that world , and if enough of us start to do it , we can have that world . It
49:21
doesn't mean having no AI . It doesn't mean having no tech
49:23
. It doesn't mean we're not going to have , you know , an AI workforce
49:26
transition . It doesn't mean we're going to not have a future
49:28
that's controlled by it or that is dominated
49:30
by it , but like you can build that stuff
49:32
into your life , like I think that is so
49:34
, so important as , like not allowing
49:37
this to be sort of have
49:39
its tentacles in every single aspect
49:41
of your life , like have the
49:44
life of meeting people
49:46
in in reality , writing things down
49:48
, playing guitar , playing the piano
49:50
, like playing football , going out
49:52
into nature , like part , that's part of it . The
49:55
second thing is , like you're gonna have
49:57
to realize when these things are coming in
49:59
, whatever , like
50:01
however far down
50:03
the scale you are , like , however far down
50:05
the rabbit hole you are and however much
50:08
you think some of the stuff I talk about , we talk about
50:10
, is nonsense , like some of this is
50:12
not conspiracy . Like digital
50:14
currencies are coming , digital
50:16
IDs are coming , right . If you
50:18
want that and you want what that
50:20
means , fine , like , but
50:22
understand what these things mean : these
50:25
are all part of this move towards
50:27
control of tech
50:29
over everything . And you might
50:31
think this is kind of just convenience , right ? I
50:33
live in a country where everything is paid for
50:35
convenience on an app . Do
50:38
you know what ? I have a load of cash hidden
50:41
in my apartment . It's not that much cash , so don't
50:43
bother coming to rob me because it won't change your life . But
50:45
I have some cash , like because
50:47
, like , even not on a conspiracy
50:49
, conspiratorial level , like
50:51
when there is an IT outage
50:53
, like I don't want to be reliant on tech . It's
50:56
like , think about when you are reliant
50:58
on all that stuff . Think about when , in your office
51:00
, the internet goes down and you can't do anything that day
51:02
because nothing works . Think about that times
51:05
a million , because AI is integrated
51:07
into everything . It's like start rejecting
51:09
, like , take the good stuff , but start
51:11
rejecting the bad stuff . Like
51:14
you know , it will make a difference . Like , I see
51:16
this kind of movement for cash and a few years ago I was
51:18
like well , why are you so bothered about cash
51:20
, like swiping a card ? And then now I'm a bit
51:23
like but it's not about cash
51:25
in your hand , it's about what cash means
51:27
. When you move to a digital currency , all
51:30
it takes is the next government to be the government that
51:32
declares that well , I'm just going to link that to your social
51:34
credit score . Well
51:43
, it's not quite that , because they don't even actually have that in China . If you've heard of this thing where everyone in China okay , people
51:45
do get banned from traveling , but people in China don't have their money controlled because of their behaviors
51:47
. But there's a system there , and that's
51:49
in a country that's not a democracy . When
51:52
you have that in your country , that is a democracy . All
51:54
it takes is a change of government and it becomes
51:57
a controlling tool
51:59
. This is the tool that we're talking about
52:01
. This sounds dystopian , but this is the default that
52:03
we're headed to , unless people make decisions
52:05
that say we don't want to head in that direction . So
52:08
like two bits of advice like do more
52:10
things locally , make little changes in your life , but
52:13
also understand some of these big things that are
52:15
coming in . You still have a choice whether
52:17
you accept them or not . Two generations
52:19
on , maybe even one generation on , there
52:21
ain't gonna be a choice anymore .
52:23
That's my kind of closing argument
52:26
on this it's powerful stuff .
52:28
I think that we need to shorten it a little
52:30
bit . But that's our new mid-roll , uh
52:32
, the beginning part of it , I think , maybe
52:34
in a little bit at the end , but
52:36
definitely we need to get that down
52:39
to lighten it up a little bit just at the
52:41
end . I'm going to finish with a really nice piece of music . I
52:43
don't know why I came across this , but
52:45
I agree with a lot of what you said . Just to
52:47
sum it up , I
52:52
think it's up to everyone to make their own judgment
52:55
as to where they are on that , but
52:58
I do think that tech
53:00
and the level of control
53:03
over the you know
53:05
, western societies that is
53:07
afforded by some of these new technologies
53:09
and that's in the hands of
53:11
such a small number of companies
53:14
, the majority in the US to be honest
53:16
, is quite frightening
53:18
. And
53:20
it's really , you know , coming
53:22
back to what we talked about in the episode , it's
53:25
as much about literal control as it
53:29
is about influence over
53:31
what people are thinking
53:33
, and I think that's , for me , also a huge
53:35
, huge danger . Anyway , to
53:38
lighten things up a little bit . I don't know why I
53:40
pulled this up , but I
53:42
found something that I said to Claude the other day
53:44
which entertained me . I said write a sentence
53:47
that uses the most adjectives you can use in
53:49
one sentence in a humorous way
53:51
. Here we go . I'm going to give this
53:53
a good shot
53:56
incredibly verbose , suspiciously enthusiastic
53:59
, ridiculously overdressed , unnecessarily
54:02
complicated , absurdly
54:05
long-winded , hopelessly confused
54:07
, thoroughly exhausted , slightly bewildered
54:09
, comically serious , painfully
54:11
meticulous , wildly incompetent
54:14
, shockingly purple , surprisingly
54:16
knowledgeable , questionably
54:18
qualified English professor stumbled
54:21
dramatically over his extraordinarily
54:24
expensive , impractically tiny
54:26
, ludicrously shiny Italian leather
54:28
shoes while delivering
54:31
an excessively detailed , mind
54:33
numbingly boring , theoretically important
54:35
lecture about the absolutely essential
54:38
, fundamentally critical , woefully
54:41
underappreciated importance of
54:43
concise , clear communication
54:45
. See you next week , guys .
54:58
In the gleaming halls of silicon dreams
55:00
, where algorithms hum their endless
55:03
schemes . I watch as lords of data
55:05
build their digital fiefs , harvesting
55:07
attention with invisible thieves . Each
55:10
scrolling click becomes a serf's new chain
55:12
. While techno barons count their endless
55:15
gain . Our labor tracked by
55:17
eyes that never sleep , as
55:19
we till digital fields the masters
55:21
keep . The new lords don't need
55:23
castles made of stone , just server
55:25
farms and screens we call our own . They
55:28
promise freedom while they track our lives
55:30
. As techno-feudalism quietly
55:32
thrives , the AI sentinels
55:35
guard their masters' realm , learning
55:37
our patterns , ready to overwhelm . Prediction
55:40
engines , knowing what we'll do before
55:43
the thought has even passed through . But
55:46
in the garden , where no signal reaches
55:49
, where the wind through ancient forest
55:51
teaches , I remember what it
55:53
means to truly be In the natural
55:55
world that sets us free . For
55:58
all their codes and neural nets , so fast they
56:01
cannot program . What was meant to
56:03
last is human touch , a
56:05
kind word spoken true , the soul's
56:07
connection that sees us through . So
56:10
find your people , look them in the eye
56:12
, feel the earth beneath , look
56:14
up at the sky . For in our hearts
56:16
lies power they can't decode :
56:18
a love-fueled resistance to their crushing
56:20
mode . Before we're reduced to data
56:22
points and trends , let's reclaim
56:25
our humanity , make amends the
56:27
future's not written in binary lines
56:29
, but in how deeply our compassion
56:32
shines . Thank you .