Episode Transcript
0:00
You also might have
0:02
heard about MongoDB in certain ways.
0:04
This is a thing that's around
0:06
and I always heard about it
0:08
and yeah, what is it? And
0:11
it turns out it's a
0:13
database system that is quite important
0:15
for AI to work. So
0:17
I was curious what MongoDB actually
0:19
does. And I talked to Richmond
0:21
Alake, he is developer advocate at
0:23
MongoDB. And he explained a lot
0:25
about the three essentials of AI,
0:27
which is compute, algorithm and data.
0:30
And this last thing, data, the
0:32
important thing is how to access
0:35
the data and MongoDB helps. So.
0:38
Welcome to another episode of the
0:40
Beginner's Guide to AI. It's Dietmar
0:42
from Argo Berlin at the microphone
0:44
and I can talk a lot,
0:46
but let's just give the microphone
0:48
to Richmond and see what he
0:50
has to say about how to
0:52
make data an essential part of
0:54
your AI. Let's go! Yeah,
1:02
today we have Richmond Alake
1:05
from MongoDB. And before
1:07
I talk too much about him, people
1:09
know the drill. I give a microphone
1:11
to him because he can say much
1:13
more about himself. And actually, first of
1:15
all, Richmond, welcome to the podcast. And
1:18
my first question would
1:20
be, why AI? What
1:23
brought you to AI? Yeah,
1:26
that's a very good question.
1:29
Dietmar, thank you for having me on
1:31
this podcast. I'm hoping we have a
1:33
very good conversation and speak to your
1:35
audience about AI, agentic systems
1:37
and what we're seeing today, but
1:39
more importantly, MongoDB's relevance
1:41
in the current state of AI
1:43
and in the future of
1:45
AI. So to
1:47
answer your question, why
1:49
AI? What brought me to
1:52
AI? So, in
1:54
my career journey, it was a
1:56
natural progression. So my
1:58
undergrad was in software engineering.
2:01
After my undergrad, I
2:04
then became a software
2:06
developer. I became a
2:08
web developer building websites,
2:10
building mobile applications. Then
2:12
I got bored. So
2:15
I thought to myself, what's
2:17
a good challenge where
2:19
I can use my existing
2:21
skills and build upon it.
2:24
And AI was the next
2:26
level, the next
2:29
logical choice. So I tried
2:31
to teach myself AI. I
2:34
wouldn't say I tried to
2:36
teach myself machine learning, but
2:38
I realized how difficult it
2:40
was at the time and
2:42
I scared myself into going back
2:44
to university. So I got
2:46
a master's in computer vision,
2:48
deep learning and robotics. So
2:51
I guess that's what we call AI today. So
2:53
masters in AI. And yeah,
2:55
that is really why
2:57
AI won: because it's
2:59
a solid challenge,
3:01
it's going to be
3:03
probably humanity's
3:05
last invention if
3:07
we do it right. And it's
3:09
relevant, very relevant.
3:12
Yeah. That's great. We'll
3:14
come to the topic of the
3:16
last invention later, but let's
3:18
go really simple, because I don't
3:20
do software, unfortunately, but
3:22
machine learning. I mean, this is
3:25
like the machines learn like the brain,
3:27
but how in detail does that
3:29
work? Can you go a little bit
3:31
deeper into that?
4:53
And for this you need special chips.
4:56
Everybody talks about them, I guess.
4:58
Well, you know, NVIDIA chips,
5:01
but we didn't start at NVIDIA
5:03
chips, right? Machine
5:05
learning, as a field or
5:07
as a general pursuit
5:09
of mankind, is something
5:11
that we've been doing
5:13
since I guess as
5:16
a field since the
5:18
50s, 1950s. And then
5:20
GPU became very relevant
5:22
to the conversation in
5:24
2012, when, I'm
5:26
going to butcher the surname,
5:28
I think Alex Krizhevsky
5:30
and his team released AlexNet.
5:33
I'm
5:36
sure I'm getting the name of
5:39
the network, the name
5:41
of the convolutional neural
5:43
network, right. If
5:45
not, then we'll correct
5:47
it. I think it's AlexNet.
5:49
It was a neural
5:52
network that, essentially,
5:55
was trained using GPUs, which
5:57
means that the
5:59
parameters and the
6:02
weights and biases, the
6:04
information, the knowledge
6:06
of this neural network
6:08
was actually placed
6:10
on GPUs instead of
6:12
CPUs or whatever
6:15
processing system was used
6:17
at the time,
6:19
because GPUs had the
6:21
ability to compute
6:23
mathematical operations
6:25
in parallel. It
6:28
made things a lot quicker. So
6:30
it made the training of these systems a lot
6:32
quicker, and then you could do it at
6:34
what at the time was scale. And
6:38
by scale, I mean you can make these neural
6:40
networks bigger. I
6:42
won't go too much
6:44
into the weeds, but the
6:46
long story short is
6:49
GPUs were one of the
6:51
key aspects of the
6:53
journey of AI that really
6:55
accelerated a lot of
6:57
things that we're seeing today.
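
To make the parallelism point concrete, here is a minimal sketch, not from the conversation itself, that times the same matrix multiplication on a CPU and, if one is available, on a GPU; the use of PyTorch and the matrix size are illustrative assumptions.

import time
import torch

# Minimal sketch: the same matrix multiplication on CPU vs. GPU (if available).
# PyTorch and the matrix size are illustrative assumptions, not from the episode.
def time_matmul(device: str, n: int = 2048) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure setup work on the GPU is done
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the GPU kernel to finish before stopping the timer
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.4f} s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.4f} s")  # typically much faster for large matrices
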
7:01
And there are three commodities in AI:
7:03
compute, model, and
7:05
data. So GPUs sit in that
7:07
compute bucket. When you
7:09
solve for compute, you
7:11
can actually move a lot
7:13
quicker on that innovation
7:15
journey. Yeah, yeah, yeah,
7:17
this is, it makes sense. The more compute
7:20
you have, and this is the race for
7:22
data centers and stuff like that, and the
7:24
other parts come behind. That's
7:26
actually directly a way to talk
7:28
about MongoDB, because data is the
7:31
next step. The compute is nice,
7:33
but you have to do something
7:35
with it. And you are actually
7:37
the ones that provide the infrastructure
7:39
for accessing data as far as
7:41
I, with my half
7:43
knowledge, would say. Could
7:46
you explain what you actually do? Data
7:50
is very crucial. Again,
7:52
one of the key commodities
7:54
within AI that makes AI
7:56
work. And you're absolutely
7:58
correct. MongoDB is at the center
8:00
of this. We've been at the
8:02
center of this. MongoDB is very
8:04
relevant. And MongoDB is
8:07
a general purpose database that
8:09
is feature rich. And
8:11
what it does is it
8:13
stores and retrieves data.
8:15
It allows you to build
8:17
an application that can
8:19
store data and retrieve data.
8:21
That's what MongoDB does,
8:23
but there are different ways
8:25
you can store data
8:27
and there are different ways
8:29
you can retrieve data.
8:31
We've seen that evolve over
8:33
time, but the unique
8:35
aspect of MongoDB has been
8:37
and still is the
8:39
way we structure and format
8:41
data, then store it
8:43
within our systems is in
8:45
a format that is
8:48
different from what databases were
8:50
initially thought of, which
8:52
is tables, the relational
8:54
model. MongoDB has the
8:56
document data model. And not to
8:58
go into the weeds of
9:00
things, the TLDR
9:03
is MongoDB stores data
9:05
in the way that
9:07
developers think. And
9:09
that is something called
9:12
JSON. This is
9:14
JavaScript Object
9:16
Notation. It's a data
9:18
structure that has a key
9:21
value pair format. And
9:23
this is common. You tell
9:25
any developer JSON, and they understand
9:27
what that is. And even
9:29
some non-developers understand what
9:31
JSON is. And it doesn't
9:33
matter what programming language, what
9:35
application type they're using, JSON
9:38
is the most common
9:40
interchange data format within
9:42
the application landscape.
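
As a small aside for readers who want to see what that looks like in code: the sketch below is my illustration rather than anything shown in the episode. It stores and reads back a JSON-style document with the PyMongo driver; the connection string, database, collection, and field names are all made up.

from pymongo import MongoClient

# Minimal sketch of the document model: a JSON-like dict stored as-is.
# Connection string, database, collection, and fields are illustrative assumptions.
client = MongoClient("mongodb://localhost:27017")  # assumed local MongoDB instance
db = client["demo_db"]

user = {
    "name": "Ada",
    "interests": ["ai", "databases"],                # arrays are fine
    "address": {"city": "Berlin", "country": "DE"},  # so are nested objects
}
db.users.insert_one(user)

# Query with the same shape you think in: key paths rather than table joins.
print(db.users.find_one({"address.city": "Berlin"}))
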
9:44
But wait, now we have AI. That
9:46
has not changed. And
9:48
one thing that we saw
9:50
was that these LLMs, large
9:53
language models, actually had
9:55
a natural affinity
9:57
for understanding JSON, which
10:00
means that not only does
10:02
MongoDB actually allow developers to build
10:04
applications and store application data in
10:06
the way that they already think,
10:08
but we also
10:10
allow you to store information in
10:13
the way that LLMs already
10:15
think as well. So it's
10:17
just very natural, very natural
10:19
where we are and where
10:21
we're going. That sounds like
10:23
you build an AI and
10:25
that is like structured a
10:27
little bit like the human
10:29
brain and coincidentally, MongoDB works
10:31
like with the human brain
10:33
and then you can also,
10:35
AI can also work with
10:38
it. That's a really, really
10:40
practical thing. Yeah. Yeah. It's
10:42
basically, again, I'll go
10:44
a bit technical, but
10:46
not too technical. One
10:48
thing is, these LLMs, they gave out
10:50
a
10:52
lot of responses, right?
10:55
And we loved that
10:57
when they first came up,
10:59
but then we started
11:01
to realize that we required
11:03
structure in the outputs
11:05
of these LLMs because it
11:08
allowed us to make
11:10
what was very probabilistic become
11:12
more deterministic, which means
11:14
that we can actually steer
11:16
these LLMs to produce
11:18
outputs that are predictable. And
11:21
the best way that
11:23
we know how to do
11:25
it for developers, for
11:27
application developers, for architects of
11:29
the future is through
11:31
JSON. So now there are
11:33
modes that are called
11:36
JSON mode, or maybe structured
11:38
outputs, which put the
11:40
LLM in a state of
11:42
providing outputs that are
11:44
JSON formatted or follow a
11:46
JSON schema. And MongoDB
11:48
stores that data perfectly.
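
A rough sketch of that flow, again my illustration rather than the episode's: an LLM in JSON mode returns structured text, which is parsed and stored as a document without any reshaping. The call_llm_json_mode helper is a hypothetical placeholder, since the exact structured-output API depends on the provider.

import json
from pymongo import MongoClient

def call_llm_json_mode(prompt: str) -> str:
    # Hypothetical placeholder for a provider's JSON-mode / structured-output call.
    return '{"product": "headphones", "sentiment": "positive", "score": 0.92}'

raw = call_llm_json_mode("Summarize this review as JSON with product, sentiment, score.")
doc = json.loads(raw)  # structured output -> Python dict, i.e. a JSON document

client = MongoClient("mongodb://localhost:27017")  # assumed local MongoDB instance
client["demo_db"]["review_summaries"].insert_one(doc)  # stored exactly as produced
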
11:51
I was always trying to figure out
11:53
the connection, because you have an LLM,
11:55
it's basically language output, but then
11:57
you have kind of a switch and
11:59
you can let it program or
12:01
work statistically. And I think
12:03
then this is the other mode that
12:05
comes in. Oh yeah, yeah, okay. And
12:08
then there's another thing
12:10
is that the structure
12:12
of MongoDB supports that. The
12:15
basic thing is: what's so important
12:18
about data then? Why do I
12:20
need to store data? Yeah,
12:23
well, one thing is you
12:25
can answer that question at
12:27
any point in time and
12:29
the answer would relatively remain
12:31
the same. It doesn't matter
12:33
if it was the big
12:35
data era or maybe the
12:37
computer vision era or this
12:39
generative AI era. The question
12:42
is, Why do I need
12:44
to store data? One,
12:46
you need to store data
12:48
to allow for things within
12:51
the application, such as personalization.
12:54
That's one thing that is very
12:56
understandable to most people. When
12:58
you use an application, you
13:00
want to be able to
13:02
feel that the application was built
13:04
for you and is able
13:06
to meet your needs. And the
13:09
way that application developers have
13:11
done that or have done that
13:13
in the past is to
13:15
be able to collect data on
13:17
either the user or provide
13:19
data relevant to the user, which
13:22
requires you to store the
13:24
data somewhere. Now, let's take
13:26
that into the age of
13:28
generative AI that we have now.
13:30
There's a lot of data
13:32
on the internet. But why do
13:34
we need to store data?
13:37
These LLMs can provide an output,
13:39
but they need relevant data,
13:41
domain specific data that allows them
13:43
and their output to be
13:45
personalized to you. And that's
13:47
where we come in. We
13:49
are the
13:51
data layer where you can
13:54
actually store that data and we
13:56
give you different methods of
13:58
retrieval where you can bring the
14:00
data to the LLM to
14:02
create that personalized experience for your
14:04
customers. And we store different types
14:06
of data, unstructured and structured
14:08
data. And one of the
14:11
common types of data that is
14:13
very popular today is something
14:15
called vector data or vector
14:17
embeddings, which is basically a
14:19
numerical representation of
14:21
a data object, like a piece of
14:23
music or an image, that
14:25
is stored and
14:27
used for things such as semantic
14:30
search. MongoDB stores that as
14:32
well. That is why
14:34
data. We can go into
14:36
the details, but that's generically why data.
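
To make "bring the data to the LLM" concrete, here is a minimal retrieval-style sketch; it is my illustration, and retrieve_relevant_docs and call_llm are hypothetical placeholders for a database query and an LLM call. The point is only that stored, domain-specific data is fetched and placed into the prompt so the answer can be personalized.

def retrieve_relevant_docs(query: str) -> list[str]:
    # Hypothetical placeholder: imagine a lookup in your database
    # (metadata filters, full-text search, or vector search).
    return ["Order #123 shipped on 2024-05-02.", "Customer prefers email contact."]

def call_llm(prompt: str) -> str:
    # Hypothetical placeholder for any chat/completions API.
    return "Your order #123 shipped on 2 May; we'll email you the tracking details."

def answer(query: str) -> str:
    # Bring the stored data to the LLM by putting it into the prompt.
    context = "\n".join(retrieve_relevant_docs(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)

print(answer("Where is my order?"))
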
14:38
Yeah. Actually, this vector
14:40
database, I heard a lot about
14:42
this. It's the new thing, the
14:44
new shiny toy, because it's better
14:46
to store data like this
14:48
and you have things like retrieval-
14:50
augmented generation, where you use
14:52
those vector databases. Is
14:55
it? Why is it
14:57
better? Can you say that in simple
14:59
words? Why the vector? I mean,
15:01
vectors, we might have had that in school,
15:03
but that's a long time ago. Yeah. You
15:06
hit the nail on the
15:08
head, which is you've heard of
15:10
vector a long time ago.
15:12
Vector is nothing new, right? The
15:15
way we use it or
15:17
the high fidelity of the
15:20
vector data that are being
15:22
generated might be
15:24
new now. We can capture
15:26
more information in these numerical
15:28
representations of data objects. So
15:30
vector data, for your
15:33
listeners, is this: let's imagine I
15:35
have an image. I
15:37
can pass this image through
15:39
what we call an embedding
15:41
model, which is a machine
15:43
learning model that has been
15:45
trained to understand some of
15:47
the patterns within any input.
15:49
It could be an image or
15:51
text, if you have a
15:53
multimodal embedding model, and it then provides
15:55
a numerical output that captures
15:57
the patterns or the features
15:59
of this image. With
16:02
that numerical output, you can
16:04
actually do something which is interesting.
16:07
You can take a text, let's
16:09
say the text I
16:11
am searching for a
16:13
red image, then pass
16:15
that through the same
16:17
embedding model, then do
16:19
what we call vector
16:21
search. And that
16:24
embedding, that
16:26
vector data of the
16:28
text, which will be your
16:30
prompt, will within
16:32
this high-dimensional space be
16:34
closely related in distance
16:36
to an image or
16:38
images of maybe something
16:40
containing
16:42
the color red. That
16:44
is the whole premise
16:47
of vector search and
16:49
embedding.
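
A small sketch of that premise, not from the episode: embed stands in for whatever embedding model you use, the toy vectors are made up, and cosine similarity is one common way to measure the "closeness in distance" being described.

import numpy as np

def embed(item: str) -> np.ndarray:
    # Hypothetical placeholder: a real embedding model returns hundreds or
    # thousands of dimensions; these tiny vectors are made up for illustration.
    fake_vectors = {
        "photo of a red car": np.array([0.90, 0.10, 0.00]),
        "photo of a blue lake": np.array([0.10, 0.90, 0.20]),
        "I am searching for a red image": np.array([0.85, 0.15, 0.05]),
    }
    return fake_vectors[item]

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = embed("I am searching for a red image")
for name in ["photo of a red car", "photo of a blue lake"]:
    print(name, round(cosine_similarity(query, embed(name)), 3))  # the red car scores higher
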
16:51
Now, it's not new; it's been
16:53
around before. But
16:55
what's new is the way
16:57
we're searching and the
16:59
fidelity of the embedding model
17:01
output. Vector databases
17:03
are something that is emerging. We've
17:06
seen some examples of vector
17:08
databases, and people are adopting
17:10
this. But what we see
17:12
is that nothing has changed about the
17:14
way we store and retrieve
17:16
data, the general purpose of a database,
17:18
which means that MongoDB is
17:20
even more relevant than ever, which
17:22
means that we're not just
17:24
subscribing to one way of storing
17:26
data. We can
17:28
store different types of data and
17:31
provide you the means
17:33
to retrieve and store that data
17:35
effectively, including vector data. So with
17:37
some databases, you get the ability
17:39
to store just vector data. But
17:41
with MongoDB, you have the general
17:43
purpose database that is optimized
17:46
for AI workloads, including the storage
17:48
and retrieval of vector data. That
17:50
is a lot of information, but happy
17:52
to deep dive wherever you want to.
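
For readers who want a rough picture of what such a query can look like, here is a sketch of a vector query using MongoDB Atlas Vector Search's $vectorSearch aggregation stage; the cluster URI, index name, collection, field names, and the embed helper are all assumptions of mine, and the exact options may differ by version, so treat it as a sketch rather than a reference.

from pymongo import MongoClient

def embed(text: str) -> list[float]:
    # Hypothetical placeholder for a real embedding model call.
    return [0.12, -0.04, 0.33]  # real embeddings have many more dimensions

# Assumed Atlas cluster URI; replace with your own connection details.
client = MongoClient("mongodb+srv://user:password@cluster0.example.mongodb.net/")
collection = client["demo_db"]["images"]

pipeline = [
    {
        "$vectorSearch": {
            "index": "image_vector_index",      # assumed pre-created vector index
            "path": "embedding",                # field where the stored vectors live
            "queryVector": embed("a red image"),
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"caption": 1, "score": {"$meta": "vectorSearchScore"}}},
]
for doc in collection.aggregate(pipeline):
    print(doc)
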
17:54
No, that's great, because actually
17:56
I now get what it means
17:59
and so if I have
18:01
a prompt and I want to
18:03
get something and I mean it
18:05
might be a general output but
18:07
now I have like what you
18:09
said with a red picture I
18:11
want to I don't know create
18:14
a picture that is connected
18:16
to my pictures that I have
18:18
in my firm and now it
18:20
can retrieve all pictures that are
18:22
similar because they are somehow red
18:24
and generate something and maybe have
18:27
a style for my firm and
18:29
then suddenly I have the
18:31
style with those red pictures, and
18:33
what I create now is not
18:35
a general picture, but would be
18:37
possible, but one that is based on
18:39
my own stuff and with vectorization
18:42
the model can find it. Did
18:44
I explain that correctly? Very
18:46
good high-level explanation of
18:48
it. And then MongoDB is
18:51
that database where you store
18:53
that vector data and have
18:55
the mechanisms to retrieve vector
18:57
data efficiently. And we
18:59
store other types of data, because this
19:01
is what a lot of people are
19:04
starting to realize. You
19:06
need different types
19:08
of data to make AI
19:10
work, not just vector data.
19:12
So I talked to a lot of customers
19:15
that are building the future. They're
19:17
building an AI application, they're
19:19
building an agentic system, and
19:21
they start to realize that
19:23
they don't just want to
19:25
do vector search, but they
19:27
want to do full-text
19:29
search, together with vector search,
19:31
to increase the accuracy, to
19:33
make the outputs of the
19:35
LLM more relevant to the
19:37
user's query, and to make
19:39
the entire system that they're
19:41
building more reliable and scalable.
19:44
So you don't just go
19:46
out there looking for a vector
19:48
database. You go looking
19:50
for MongoDB, which is
19:52
that general purpose database,
19:54
which means, you know,
19:56
you're building a good AI
19:59
application, you're building a very
20:01
solid stack and you have a
20:03
trusted data layer that is
20:05
relevant regardless of what new data
20:07
types emerge.
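
As a sketch of combining the two kinds of search just described, the snippet below runs a text search and a vector search separately and merges the ranked results with reciprocal rank fusion in application code; text_search and vector_search are hypothetical placeholders for your actual queries, and this is only one way such a hybrid can be wired up.

def text_search(query: str) -> list[str]:
    # Hypothetical placeholder: ranked document ids from a keyword / full-text query.
    return ["doc_red_logo", "doc_brand_guide", "doc_old_banner"]

def vector_search(query: str) -> list[str]:
    # Hypothetical placeholder: ranked document ids from a semantic / vector query.
    return ["doc_brand_guide", "doc_red_logo", "doc_team_photo"]

def reciprocal_rank_fusion(result_lists: list[list[str]], k: int = 60) -> list[str]:
    # Documents ranked high by several searches accumulate a higher fused score.
    scores: dict[str, float] = {}
    for results in result_lists:
        for rank, doc_id in enumerate(results):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

query = "red images for our brand"
print(reciprocal_rank_fusion([text_search(query), vector_search(query)]))
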
20:09
Oh yeah, but you talked about the customers.
20:12
I'm really interested in which areas,
20:14
are there certain areas where the
20:16
people come from, or is it just different
20:19
industries, everything, or
20:22
do you have some areas where more
20:24
people come now for developing AI? Yeah,
20:26
MongoDB is in a privileged
20:28
position where we get to see
20:31
a lot of the excitement
20:33
that is happening in AI today.
20:35
That's because you need data and
20:37
you need somewhere to store that
20:39
data regardless of what industry
20:42
you're serving or what customers your
20:44
application is serving. So
20:46
the short answer to
20:48
that question is everyone is
20:50
playing with AI in
20:52
some sense, which means we
20:54
get to work with
20:56
people in the manufacturing industry,
20:59
in the telecommunication industry,
21:01
in the healthcare industry. It
21:04
really is a very privileged
21:06
position and they trust us to
21:08
be that data layer
21:10
where they can build the future
21:12
of their applications.
21:15
So
21:17
that's the straight answer to that
21:19
question. It's everyone. Crazy.
21:22
And I read on
21:24
the website every month
21:26
there's 175,000 new
21:28
developers starting with MongoDB
21:30
and this is like... But
21:35
this is just the new
21:37
people. How many people
21:39
work with MongoDB? Can one
21:41
say something like that?
21:44
I can't give you an
21:46
exact number, but I
21:48
can give you an understanding
21:50
or perspective, which is
21:52
MongoDB is one of the
21:54
most popular databases today. There's
21:58
a lot of people
22:00
that are familiar with MongoDB.
22:02
Most developers are. So
22:04
that's the scale at which
22:06
we are relevant and
22:08
which we have an opportunity
22:11
to help developers and
22:13
businesses build these long-lasting
22:15
applications within this AI
22:17
era. So long story short,
22:19
it's hard to put a number on it, but
22:21
there are
22:23
hundreds of thousands of developers
22:25
that are aware of MongoDB and
22:27
there are thousands of businesses
22:29
that trust MongoDB as that data
22:32
layer for their applications. Yeah,
22:34
that's a good answer because I
22:36
knew MongoDB as a name. I didn't
22:38
know the advantages and you told
22:40
me something about this, but yeah, this
22:42
is a brand name that everybody
22:44
knows somehow. You go a little bit
22:46
into programming and then you come
22:49
there and see your people. It's
22:52
a good branding. It's
22:54
good branding, but also the
22:56
technology itself is very useful. I'll
22:58
tell you a personal experience
23:00
of mine, which is when I
23:02
was in university, I
23:05
said earlier, my degree was
23:07
in software engineering, but I
23:09
don't tell people this. I
23:11
hated coding because to build
23:13
an application, you have to
23:15
learn the front end, the
23:17
back end, the database and
23:19
they were just different worlds.
23:22
In fact, in the
23:24
web application space back in the
23:26
day, we used to have three different
23:28
developers building one
23:28
application, at least
23:30
three, because you needed someone that
23:34
knew HTML, CSS, and JavaScript. You
23:37
needed someone that understood the
23:39
backend. It could be Java
23:41
or whatever backend language. You
23:43
needed a database administrator that
23:45
understood your database. But this
23:48
is what MongoDB did for
23:50
me. It unified the application
23:52
stack, which means with the
23:54
mindset of JSON and thinking
23:56
about JSON, it made it
23:59
easy for me to just
24:01
understand how things connect through
24:03
the application stack, the front
24:05
end, the back end to
24:08
the data layer. And
24:10
it was so significant
24:12
that it changed my career.
24:14
So I became a
24:16
full stack web developer because
24:18
I understood MongoDB. I
24:21
embraced that JSON, that object-oriented
24:23
programming mindset, that JSON mindset,
24:25
and used it all across the
24:27
stack. So we saw the
24:29
emergence of the full stack web developer.
24:31
You probably have heard of
24:33
different stacks like the MEAN stack,
24:36
the MERN stack. And those
24:38
are things that MongoDB helped within
24:40
the application space. But we're
24:42
not stopping there because that same
24:44
thing we did for people
24:46
like myself when I was a
24:48
bit younger is what we
24:50
are doing for the AI engineers,
24:52
for the AI application developer
24:55
today in the world of
24:57
generative AI. It's the same
24:59
aha moment. Oh,
25:01
yeah. So that is something
25:03
you didn't want to program. Actually,
25:05
do you still program or is
25:07
it AI or do you just
25:09
manage your agents and they do the programming? Well,
25:12
I wish I had a bunch
25:14
of agents that were coding for me.
25:16
That would allow me to do
25:19
more podcasts and talk to
25:21
you more. But yes, I still
25:23
code on a daily basis. I'm
25:27
in a unique position
25:29
where I enjoy building things
25:31
and having a very
25:33
good understanding of what's going
25:35
on in the application
25:37
development landscape and then communicating
25:39
that information to developers and
25:41
to our customers. We do
25:44
that in MongoDB in
25:46
different ways. We have an
25:48
educational platform. We have
25:50
a developer platform where you can
25:52
see different articles. And we
25:54
actually speak to our customers one -to
25:56
-one and their specific AI team. So
25:58
long story short, I
26:00
code more than ever. So
26:02
it's not yet there. You said at
26:04
the start, AI may be our last
26:06
invention we make. Can
26:12
you say more about that, for programming
26:14
or in general, like your
26:16
vision for that? I'm
26:18
curious to hear that. One
26:21
thing is, the
26:24
promise of AI has
26:26
always been to replicate
26:28
human intelligence. The question
26:30
is why, right?
26:32
Why have we wanted to do
26:34
this? And it's because we can get
26:36
a bit philosophical here. Great,
26:39
yeah. Life can
26:41
be laborious, which is you're coming,
26:43
you're born into life, you're expected to
26:45
go into work, you work for
26:47
a number of years, then you retire
26:49
at age 60 something, and then
26:51
you get to enjoy the money. It's,
26:54
for lack of
26:56
a better word...
26:59
there is more to life than
27:01
that, but... We need a way or
27:03
we need some form of entity
27:05
that can help us with this laborious
27:07
part of life that can then
27:09
allow us to experience the creative part
27:11
of life or aspects of life
27:13
we have not seen. I
27:16
think there is a
27:18
future where AI allows us
27:20
to be very productive. It
27:23
allows us as humanity to
27:25
explore different areas
27:28
of the experience of what
27:30
it means to be humans
27:32
that we don't have the
27:34
opportunity or the scope to
27:36
explore today. It
27:39
can be
27:41
humans' last
27:43
invention if we do it right
27:45
and also if we do it
27:47
wrong. It
27:50
goes both ways. But I'm
27:52
not going to bring the
27:54
doomerism vibe. I think AI
27:56
has a lot of benefit
27:58
to what we do today
28:00
as humans and it can
28:02
improve a lot of productivity
28:04
for a lot of people
28:06
within different domains. I
28:09
actually have a typical question here in
28:11
the interview and that would be how probable
28:13
do you think is a Terminator or
28:15
Matrix scenario? So a little bit of the
28:17
doomer angle is what I want to get at. I
28:20
think there is
28:22
always a potential for certain outcomes
28:24
right? It
28:27
would not be intellectually honest
28:29
to say that there isn't
28:31
a certain percentage for that
28:33
outcome happening. Does
28:36
it happen exactly
28:38
like Terminator? I
28:40
doubt it. It
28:42
might be less cinematic
28:44
or dramatic. I
28:47
think any technology
28:49
in the hands of
28:51
the wrong people
28:53
can cause harm. And
28:57
that's the same for AI. That's
29:00
the same for the internet.
29:02
That's the same for any technology
29:04
we've ever invented. Yeah.
29:07
Yeah. No, it's like, this
29:09
is a probability. No, but
29:11
it's a possibility. Yes. Yeah. That
29:14
is good. Yeah. One thing
29:17
about yourself. Do you
29:19
use ChatGPT, and for what?
29:21
What things do you use AI for?
29:23
Programming, obviously, probably, but do
29:26
you have other things that are
29:28
funny, interesting or helpful? I
29:31
use ChatGPT. Again,
29:33
I need to have very
29:35
strong opinions of the things that
29:38
are coming out of LLMs. So
29:40
I use ChatGPT. I use
29:42
Claude. I use Gemini. I use
29:44
pretty much all the chat interfaces
29:46
that are coming out. I
29:49
use it for different things, right?
29:51
For my side projects when I'm
29:53
programming, I use it for thinking
29:55
as well, for debating, for looking
29:57
at other perspectives of some opinions.
30:00
I use it for pretty
30:02
much a good amount of
30:04
what I do today. But
30:07
something I'm realizing more and more
30:09
is that there is an
30:11
aspect of when I use
30:13
these LLMs, where I begin
30:15
to over-engineer the problem. I've
30:20
never seen a computational
30:22
entity or even an
30:24
intelligent entity think about
30:26
2 plus 2 so
30:28
deeply as much as
30:30
an LLM will. But
30:34
there are times where I
30:36
see myself over-engineering when I'm
30:38
working with LLMs when the answer
30:40
is actually quite simple. But
30:44
yeah, to answer your question, I use
30:46
it for a lot of things. And there's
30:48
some things that I don't think I can
30:50
use it for yet. But
30:53
it's really interesting, this point
30:55
where one gets tempted to...
30:57
ask the machine everything,
30:59
even if it's simple, like
31:01
2 plus 2, use a
31:03
calculator. But then people are
31:05
asking those questions to those huge
31:07
machines. Great, yeah, that's a good
31:09
thing. And what can come out
31:11
of this is paralysis through analysis. If
31:14
you analyze too much, you
31:16
don't do. I think that's
31:18
over-engineering. And that
31:20
happens regardless, with an LLM
31:22
or without an LLM, which is
31:24
One thing that a lot
31:26
of people are going to
31:28
realize is LLMs and the
31:30
era of AI and the
31:32
AI models, they amplify a
31:34
lot of things, right? Productivity,
31:36
but they can also amplify
31:38
a lot of things that
31:40
we don't want them to
31:42
as well. So we have
31:44
to use them with, not
31:46
with caution, but use them
31:48
with good habits. I find that
31:50
it's better for me to
31:52
have an objective that I
31:55
want to achieve when I'm working with
31:57
these LLMs and then maybe time-box
31:59
myself, because you could go down the
32:01
rabbit hole. Definitely
32:03
those long chats. Yeah, great. Yeah,
32:05
yeah. And now, starting with
32:07
voice mode and so you're even more
32:10
tempted. Yeah, one has to control their
32:12
own use. Yeah, it does. And one
32:14
thing I use LLMs for, through
32:16
these new interfaces like the
32:18
voice mode you mentioned,
32:20
is that I use them
32:22
for learning different
32:24
languages as well. So I'm learning
32:26
a certain language at the moment
32:28
in time. I have a human
32:30
tutor, but at the
32:33
same time, sometimes I go
32:35
into ChatGPT, I put it
32:37
in voice mode and I
32:39
say, speaking in Japanese,
32:41
let's speak in Japanese now and
32:43
we go back and forth in
32:45
Japanese very terribly on my part.
32:47
Excellent on the ChatGPT part. And
32:50
I use it as a tutor
32:52
as well. But again, it hasn't replaced
32:54
my human tutor. I
32:56
totally get that because for
32:58
myself the listeners may know I'm
33:00
married to a Cuban wife
33:03
and so Spanish is a choice.
33:05
I also learn languages with the
33:07
voice mode and it's really, yeah. This
33:10
is something so interesting that you
33:12
now suddenly can go there and talk
33:15
in a language and it helps
33:17
you and it understands your problems and
33:19
can correct you. Really, really great. Japanese.
33:22
Oh, yeah. But that's a
33:24
harder language than Spanish. Spanish is quite easy
33:26
compared, I think. Japanese. Whoa.
33:30
Yeah. Okay. Yeah.
33:33
Long ago, I did some
33:35
kendo-like fighting
33:37
thing and they had these terms
33:40
and I tried to remember them
33:42
and it was not easy,
33:44
but great. It's like, yeah, a cool,
33:46
cool challenge. I'm terrible now at
33:48
my Japanese and I'm just doing
33:50
it because it's fun, right? And
33:53
this is what I was talking
33:55
about with the human experience. If
33:57
we had more time
33:59
to do certain things. What would
34:01
you do? For me, I
34:03
would learn some of the languages
34:05
of cultures that I'm interested
34:07
in. I
34:10
would love to do that full
34:12
time, but we have this laborious
34:14
aspect of work that then gives
34:16
us the permission to do the
34:18
things that we actually want to
34:20
do. But I'm in a
34:22
fortunate position where I made a conscious
34:24
decision to go into AI because I
34:26
wanted to do it and I was
34:28
good at it, I understood it, but
34:30
now I want to learn Japanese as
34:33
well. Yes,
34:35
it's actually, this is really like
34:37
this philosophical thing is a good
34:39
thing to end on because what
34:41
do you really, really
34:43
want to do? And what should
34:45
you do if there's no work
34:47
anymore? Many people will have
34:49
to think about that, they will
34:51
have to develop
34:53
hobbies or passions and so on.
34:56
I think
34:58
there is
35:00
a future, there
35:04
is a possible future
35:06
where the word work disappears.
35:11
Wow. Yeah, great. Yeah, I
35:13
love it. I
35:15
think we have to stop now
35:17
because you can't top that
35:19
because this is really something. I
35:22
mean, yeah, the work is
35:24
done by machines, by
35:26
thinking machines, they are great.
35:28
Yeah, but Richmond, tell
35:31
us where we can find
35:33
you, where can we find
35:35
MongoDB, and where can we
35:37
find some information? We put everything in the
35:39
show notes, but Richmond, tell me. Yeah,
35:41
we put everything in the show notes. You
35:43
can find me on LinkedIn,
35:46
right? Type in my
35:48
name, Richmond Alake, and hopefully there isn't
35:50
an impostor or an LLM trying to
35:52
impersonate me, but I will show up.
35:54
And you can find me on
35:56
LinkedIn, reach me on LinkedIn. Well,
35:59
you can also see some of the stuff, good
36:01
stuff I'm doing over at MongoDB. We'll
36:03
put some stuff, some link
36:05
in the show notes, especially a
36:07
piece that we use to
36:09
explain the AI stack and agentic systems
36:11
and AI agents. I'll put
36:13
some articles that will help your
36:16
listeners understand what's going on
36:18
without going too much into the
36:20
details. And
36:22
yes, find me on LinkedIn, connect with
36:24
me, reach out to me and
36:26
let's learn together. Perfect,
36:28
perfect. So, Richmond, I really
36:30
thank you for this. And I also
36:32
thank you that you explained things not
36:35
on a deep tech level where I
36:37
was like blanking out here. I understood
36:39
what you said and it was great having
36:41
you. Thank you very much
36:43
and look forward to speaking to you
36:45
soon. Yeah, we do that. Thank you.
36:48
Yeah, another thing I learned, JSON,
36:50
I always heard that, but that
36:52
is the data format that makes it
36:54
easy to access data. Okay,
36:56
more than that I don't need to know,
36:58
but now I know there's
37:00
a thing that all programmers like
37:02
and that makes the communication
37:04
between LLMs and data easier. Great
37:07
for us. And it seems
37:09
to be quite successful. 175,000
37:11
new developers each month. That's quite
37:13
a number. Yeah, so I
37:15
hope you, like me, learned a
37:17
lot about neural networks today
37:19
and machine learning and had fun
37:21
listening to Richmond. And the
37:23
last thing, don't forget to subscribe
37:25
to the newsletter and
37:30
also follow us on your podcast
37:32
app. We would be happy to have
37:35
you there and in the next
37:37
episode as well. So signing off, Dietmar
37:39
from Argo Berlin.