Episode Transcript
0:00
Babies is the topic
0:03
today. Yes, babies,
0:05
not financial securities, banking,
0:07
accounting, management, strategy. As
0:10
I know you: yeah, you work
0:12
in firms, you need AI
0:14
for work, but most of
0:16
you are also mothers, fathers, grandparents, something
0:18
like that. And today we talk
0:20
to our guest. He
0:22
is co-founder and CMO of
0:24
Little One Care, which produces Elora,
0:26
the baby tracking device,
0:28
and it is really something that opened
0:30
my eyes to what is possible with
0:32
AI and with
0:34
children. It's something you should
0:36
really listen to, to learn a
0:38
lot about how AI works, because
0:41
we don't just cover the baby
0:43
stuff. But before I talk
0:45
too much, if you're curious
0:47
to learn more about Elora and
0:49
how babies, the new
0:51
AI natives, can be tracked, then
0:54
just listen to the episode. It's
0:57
Dietmar on the microphone again, it's the
0:59
interview episode and welcome to the
1:01
beginner's guide to AI. Let's
1:03
start with the interview. But first, a
1:05
quick thank you to our sponsor,
1:07
Sensei. Sensei is an AI
1:10
-powered wisdom engine that helps your
1:12
organization capture and share its most
1:14
valuable knowledge. Imagine
1:16
easily preserving advice from your
1:18
best experts or creating
1:20
interactive trainings your teams can
1:22
access anytime. Sensei
1:25
makes all of this
1:27
possible using AI -driven
1:29
digital replicas. check out
1:31
how Sensei can help
1:33
your team share knowledge smarter
1:35
at sensei.io. And there's
1:37
also an episode where I interview
1:39
Dan Thompson, the CEO of
1:41
Sensei; just look at the
1:43
show notes and listen to
1:45
him talk about
1:47
his vision. Thanks, Sensei, and
1:50
now back
1:52
to the
1:54
show so
1:57
I mean, welcome to
1:59
the podcast. I could talk a
2:01
lot about you and it would
2:03
all be PR bubbles, so I
2:05
give the microphone
2:07
to you: what
2:09
got you into
2:11
AI? Well,
2:13
thank you very nice and
2:15
thank you for hosting
2:18
me. What made us as
2:20
a company to establish
2:22
a product based on AI?
2:24
Well, it's a little
2:26
bit It's
2:29
a story about
2:31
prediction and
2:34
gambling. I will
2:36
explain. When we
2:38
started to discuss the opportunity
2:40
to develop the product we
2:42
developed, the year
2:44
was 2019. We
2:47
established the company in 2020. And
2:49
when we started to draw what we were
2:51
about to do,
2:54
you know, it was kind
2:56
of a draft of what we were supposed to
2:58
do. We had a lot of
3:00
blank places where we said, okay, this technology
3:02
does not exist yet, but we do believe,
3:04
and this is the gambling part, we do
3:06
believe that it will be fixed in the
3:08
next two, three years. And this thing maybe
3:10
four years. This thing next year for
3:12
sure is gonna happen. And
3:15
that way we started the
3:17
business, where we
3:19
knew that if the market
3:21
and the technology and
3:23
the algorithms and the public,
3:26
beyond everything the
3:28
public needs and the public
3:30
maturity, won't be there, we
3:32
will fail. But on
3:34
the other hand, what is
3:36
innovation if you will
3:38
not be aiming and
3:40
running into a place
3:43
through a path that nobody
3:45
has been down before? And
3:47
you actually don't really know
3:49
if everything, all the
3:51
stars, will be aligned. And
3:55
thank God everything happened. So
3:57
besides everything we have done,
4:00
it's a lot about predicting or
4:02
gambling. If you won, it's
4:04
predicting. If you failed, it was
4:06
gambling, and I hope that
4:08
this gamble will be... it will
4:10
be okay because we believed that it's
4:13
going to happen. So we didn't
4:15
want to use AI, but AI is
4:17
the only way to solve some
4:19
of the things. Understanding the secret world
4:21
of babies without using cameras and
4:23
understanding what exactly they
4:26
do and tracking them, which
4:28
is, it's like, it's
4:30
not practical. So you
4:32
have to put something, to
4:34
attach something to the baby, a wearable
4:36
device. And without using
4:38
visuals, in our case, we
4:40
track the baby's sounds and activities,
4:42
like the motion of the
4:45
baby, and convert it,
4:47
using
4:50
algorithms, into
4:52
understanding, bringing
4:54
to these sounds and motions
4:56
context, and to
4:58
transform this context into a
5:00
table where in one screen
5:03
we tell you what had
5:05
happened all day: when the
5:07
baby was asleep, playing, laughing,
5:09
people playing with the baby, crying,
5:11
feeding the baby. To make
5:13
that you need a lot of AI
5:15
power and what we have done
5:17
we understood that with that kind of
5:19
tiny device we are
5:22
not going to be able to
5:24
do everything. So we split our
5:26
neural networks into two different neural networks,
5:28
one of them in the device.
5:31
It's tiny, it's agile, it
5:34
doesn't use a lot of energy because AI
5:36
needs a lot of energy and you
5:38
don't want to have a massive
5:40
device on your baby's shirt. And
5:43
in this neural network with the
5:45
baby, we decided that we are going
5:47
to cut things out and use it only
5:49
for the real things you need to
5:51
know in real time: emergencies. If the
5:53
baby falls from a changing table, if
5:55
somebody screams at your baby, if
5:57
the baby is crying and nobody approached
5:59
the baby, if, for
6:02
instance, during the night
6:04
sleep the device doesn't
6:06
recognize motion, okay? So
6:08
you would like to
6:10
know about that. If
6:12
somebody shakes your baby, like
6:14
rude and aggressive
6:17
treatment, you don't want that. So
6:19
all the algorithms that
6:21
we have in
6:23
our Elora track that kind of
6:25
thing in real time. And we allow you as
6:27
the parents to decide what you want to know, what
6:30
you don't want to know. And trust me, everybody
6:32
wants to know everything in this area. But
6:34
knowing when the baby was
6:36
crying, was sleeping, playing tummy time,
6:38
rocking the baby, that kind
6:40
of things, for
6:42
different reasons, especially
6:45
the need to make
6:47
the parents more mature and
6:49
not to peep at and track the
6:51
babies all the time, what is going
6:53
on. We split off that kind of things,
6:55
and these things we actually analyze with
6:58
another neural network that we
7:00
have in the cloud. At
7:02
the end of the day, when you dock
7:04
the Lora in this docking station to
7:06
recharge it, we use this period of time
7:08
to upload the data. Over
7:10
there, we can play with it
7:12
more. We have more tools,
7:14
we have more resources, more energy,
7:16
and a couple of minutes
7:18
later, after you, for instance,
7:20
take your baby, give them a bath, put
7:24
your baby back in the bed in the
7:26
evening time, you take
7:28
the Elora back from the docking station since
7:30
you charged it, and you can sit in
7:32
the living room, open your cell phone, and
7:34
you see what had happened the whole day. So
7:37
we found a way
7:39
to use our limitation: to
7:41
split it into two neural
7:43
networks, where one works in
7:45
real time, a tiny one, and
7:48
one works in the cloud.
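For readers who want a concrete picture of the split described here, below is a minimal Python sketch of the idea: a tiny on-device model that only flags urgent events in real time, while raw feature windows are queued for a larger cloud-side analysis when the device is docked. All class names, thresholds, and event labels are invented for illustration; this is not Little One Care's actual implementation.

```python
# Hypothetical sketch of the two-network split described above.
# Names, thresholds, and event labels are invented, not the real product.
from dataclasses import dataclass, field
from typing import List, Optional

URGENT_EVENTS = {"fall", "scream", "unattended_crying", "no_motion_during_sleep", "shaking"}

@dataclass
class SensorWindow:
    timestamp: float               # seconds since midnight
    sound_features: List[float]
    motion_features: List[float]

class TinyOnDeviceModel:
    """Small, low-power classifier: only decides 'is this urgent, and what kind?'."""
    def classify(self, window: SensorWindow) -> Optional[str]:
        # Placeholder rule standing in for a quantized neural network.
        if max(window.motion_features, default=0.0) > 0.9:
            return "fall"
        return None

@dataclass
class Device:
    model: TinyOnDeviceModel
    buffer: List[SensorWindow] = field(default_factory=list)

    def on_new_window(self, window: SensorWindow) -> None:
        self.buffer.append(window)             # stored for later cloud analysis
        event = self.model.classify(window)
        if event in URGENT_EVENTS:
            self.notify_parent(event)          # real-time alert, parent-configurable

    def notify_parent(self, event: str) -> None:
        print(f"ALERT: {event}")

    def on_docked(self) -> List[SensorWindow]:
        """While recharging, upload everything for the bigger cloud model."""
        data, self.buffer = self.buffer, []
        return data

def cloud_daily_report(windows: List[SensorWindow]) -> dict:
    """Larger cloud-side analysis: the 'important, not urgent' end-of-day summary."""
    return {"windows_analyzed": len(windows)}   # sleeping, playing, words heard, etc.
```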
7:51
And parents ask, no, no, no, I would
7:53
like to know if my baby is crying
7:55
now. No, it's not important. Trust your
7:57
nanny. Trust the people who are with your
7:59
baby and focus on what you need
8:01
at your work later on, at
8:04
the evening. Come back, you have more
8:06
energy, play with your baby, do what
8:08
you need to do. Now you can
8:10
analyze the whole day, like how many
8:12
words your baby heard today. It's not
8:14
important if you did not get the
8:16
whole day right. This is what we've
8:19
done. Yeah, this totally reminds me
8:21
of the fast thinking and
8:23
slow thinking of Daniel Kahneman, who said
8:25
that we have some quick decisions to
8:27
make, and this is in the Elora,
8:29
and we have some slow ones that
8:31
need thinking through. It's perfect, yeah.
8:34
It's so important what you said. There
8:36
is a book, and the
8:38
name of this book is The 7 Habits of
8:40
Highly Effective People. This
8:42
book is a well-known book. I think most
8:45
of the people didn't read it to the
8:47
end; that's another problem. However,
8:50
one of the things that this
8:52
book mentioned is the difference
8:54
between urgent and important. Yes,
8:57
what we do in real time
8:59
is urgent. But what you
9:01
see in the evening, this is the
9:03
important part. Why? Well,
9:05
you can play with your baby
9:07
today, or you cannot do that.
9:09
Nobody will complain. Nobody will judge
9:12
you. And we won't judge you
9:14
either. But if you look at
9:16
any research that is dealing with
9:18
the importance of the first 24
9:20
months of baby's life and how
9:22
these 24
9:24
months are so crucial
9:27
for the baby's development as
9:29
an adult? The mental
9:31
health of this person, the
9:33
mental strengths of this
9:35
person as an adult. How
9:38
these people communicate with
9:40
others, compromising with communication with
9:42
partners, friends, spouses? There
9:45
is a connection between
9:47
the behavior of these adults.
9:50
and how much money they're going to make,
9:52
how they're going to behave at school
9:54
and the grades at school to the fact
9:56
what you have done in the first
9:58
24 months. So the funny thing is it's
10:00
not urgent. It's important to play with
10:03
your baby, but it's not urgent. You didn't
10:05
play with your baby today. It was
10:07
fine. You worked very hard. You were busy.
10:10
You didn't talk to your baby enough
10:12
today. Fine, nobody judge you, but
10:14
you need to see these reports all
10:16
the time to make you, we
10:18
call them call to action insights. Why
10:20
call to action? It's fine
10:22
that you didn't do that today, but maybe
10:24
in the weekend you play more with your
10:26
baby. Tomorrow you will think about it and
10:28
you will communicate with your baby and you
10:30
will describe things to your baby. You change the diaper
10:32
of your baby, share with the baby the
10:34
story: oh, let's do this; oh, this happened; grandma
10:37
just came. And it's practicing the
10:39
parents. It's not urgent, it's important, and
10:41
this is the things we would like
10:43
to present you in the evenings with
10:45
the call to action reports to change
10:47
your manners, because your baby is going
10:50
to do the same things either way:
10:52
they will cry, they will sleep,
10:54
they will eat and they will poop,
10:56
sorry, with and without Elora; this is
10:58
the only thing they do. But you,
11:00
it's your decision to make i'm going
11:02
to play with my baby more I'm
11:04
going to sing to my baby, I'm
11:06
going to read to my baby, I'm
11:08
going to talk to my baby, I'm
11:10
going to play with my baby on
11:12
the carpet. I'm
11:15
going to put my baby in tummy
11:17
time, so important in the early stages. And
11:20
we are here to be
11:22
a reminder because these 24
11:24
months are going so fast
11:27
and we don't have second
11:29
chance in these 24 months.
11:31
And sorry, yes, it's not
11:33
urgent, it's just important. This
11:37
is what we do. Yeah,
11:39
this is, I get deep thoughts
11:41
about this, because with
11:43
AI, with a lot of
11:45
this, it is kind of a
11:48
democratizer. Because, I mean, people who
11:50
like really read about
11:52
what a baby can and cannot do,
11:54
what you need to do, and there's this,
11:56
I think it's a Harvard study, that
11:58
children of higher educated people hear like millions
12:00
of words more in the
12:02
first one or two years. But
12:04
now you can give people who don't
12:07
have the luck to have a higher
12:09
education a tool and say,
12:11
okay, here's what you do, here's
12:13
what you have to do, and those children
12:15
get better chances in life. I like
12:17
that Dietmar, I would like to
12:19
share with you two things about what you just said.
12:21
It was so important you mentioned that. Any
12:26
smartwatch, Apple Watch or any other
12:28
smartwatch will tell you how many
12:30
steps you had today. 8 ,000,
12:32
it's the minimum. But
12:34
there are days that you sit like
12:36
today. I'm going to sit all day
12:38
in front of the Zoom and talk
12:40
to people and several face -to -face meetings.
12:44
If I will not find the one and a half
12:46
hour to do what I need to do for my
12:48
practice, I will
12:50
miss my 8
12:52
,000 steps, the minimum
12:54
8 ,000 steps that I supposed to do
12:56
today. And the guilt feeling will be
12:58
there. This is what happened. And we use
13:00
these smart watches to improve our lifestyles
13:02
and well -being for the next couple of
13:05
weeks or months. With babies,
13:07
it's quite different. Two reasons. The
13:10
equivalent to how many steps you had
13:12
today, we share with parents how many words
13:14
your baby was exposed to today. Any
13:17
speech therapist will tell you
13:19
that the average, the daily average,
13:21
is supposed to be 15,000 words
13:23
a day. It's like, it's
13:25
a lot. And it's not
13:28
about your baby, it's not their decision
13:30
to make how many words they're gonna hear
13:32
today. And I'm not talking about being
13:34
stuck in front of the telly, I'm talking
13:36
about someone talking to you.
13:38
one thing I would like to share with
13:40
you. Do you have any idea how
13:42
many neural connections happens in the baby's brain
13:44
the moment you talk to him? One
13:48
million neural connections, it's
13:50
a firework every second, every
13:52
second, one million neural
13:55
connections. It's like you blow
13:57
their minds, this is,
13:59
and you need to talk to them. So this
14:01
is one of the things I wanted to share
14:03
with you. That's why it's one
14:05
of the most important. things
14:07
that we do in our
14:09
product because the very same
14:12
babies later on will be
14:14
able to communicate, explain the
14:16
world to themselves in their
14:18
mind or to others much
14:20
better than the other babies
14:22
who didn't communicate with their
14:24
parents. There are a
14:26
lot of research in this area
14:29
that shows very interesting things about
14:31
the type of the dialogues you
14:33
should do with your baby and
14:35
how these dialogues impact their ability
14:37
to communicate and rule other toddlers
14:39
later on, couple of years later
14:41
on in the daycares. They're ruling
14:43
them. They are the natural leaders
14:45
because they know how to communicate
14:47
and not just to cry or
14:50
start using their hands. Okay, this
14:52
is one thing I wanted to share. The
14:54
other thing people do not
14:56
talk about is that many parents
14:58
learn how to become parents.
15:00
They read a lot of
15:03
books, take part in many
15:05
workshops, and they will do
15:07
everything to prepare the nest
15:09
before the due date, especially
15:11
first-time parents, which is
15:13
a very natural thing to do.
15:15
But none of us pay
15:18
attention to the fact that
15:20
the moment you become parents, it
15:22
doesn't mean that you stop being
15:24
a kid. All of
15:26
us bring into this new
15:28
relationship with this new newborn, anything
15:31
we know about our relationship
15:33
with our parents, the
15:35
traumas, the things we
15:37
missed, the things we liked, the
15:39
things we hated. And
15:41
sorry, this baby doesn't need all of this.
15:43
I don't want to use bad words, but
15:46
they don't need to use, they don't want
15:48
to be part of your background and your
15:50
history. Please bring into
15:52
this new relationship, especially in the
15:54
first 24 months, a white
15:57
page, a clean
15:59
relationship. And many people bring
16:01
in the guilt feeling, the problems
16:03
they used to have. And,
16:06
oh, my father was all day at work. I don't
16:08
want to be all day. No,
16:10
your baby doesn't need that.
16:12
And what we created is
16:14
an ability to help you:
16:16
we reflect the reality. This
16:19
is what you have done this day. Do better
16:21
tomorrow. I don't care what you had with your
16:23
parents. I don't care what you had in your
16:25
background. I don't care how much you work hard
16:27
or doesn't work at all. I don't care. This
16:30
is the amount of time you spend playing with
16:32
your baby. That's it.
16:34
Do whatever you need to
16:36
do the day after. And we're
16:38
simplifying everything in order to
16:40
provide you an ability to see
16:42
the reality without the noises, without
16:44
the problems that you might
16:47
bring into the relationship, in order
16:49
to make you the best
16:51
version of yourself to bring into
16:53
this relationship the best parents
16:55
you can be because your baby
16:57
has no other chance to
16:59
spend these 24 months. Yeah,
17:01
this is really great, because of
17:04
all the baggage you carry around, and now
17:06
you have an objectifier, like,
17:08
you have an objective view
17:10
on this: this is not what you need
17:12
here, this is what it is. It's like a small
17:14
consulting you get, most likely this is
17:16
the idea, you have a small consultant in
17:18
your pocket. Dietmar, I have to, I,
17:20
first of all, you're right, but we are
17:22
not a consultant, because we are the scale you
17:24
will meet in the gym, or if you
17:26
go to your nutritionist, we are the scale
17:29
in the room. We will tell you what
17:31
is your weight; we will never ever tell
17:33
you what to do about it. If you
17:35
want to learn what to do about it,
17:37
talk to your experts, talk to yourself. It's
17:40
your decision to make. We will not
17:42
judge. In our reports, there is no
17:44
good, bad. No, we just reflect the
17:46
reality. Because one of the
17:49
most, parenting is an
17:51
occupation that nobody really needs to
17:53
go to college to receive
17:55
a degree in order to become
17:57
a parent. Each
17:59
one of us brings into our
18:01
parenting behavior our
18:04
values, history, understanding, reading;
18:08
our spouse has an impact either way. So it's
18:10
a very wide and wild
18:12
area, so we cannot say this
18:14
is good is bad we
18:16
don't want to be there we
18:18
can tell you this is
18:20
your weight and this is your
18:22
weight yesterday and this is
18:24
your weight two weeks earlier do
18:26
whatever you want to do
18:28
with that this is the trend
18:30
that's it simple This
18:32
is funny because I have a Samsung
18:34
and I have to only do 6
18:36
,000 steps. So
18:38
is it better? Is it worse? This
18:41
is just the thing. You have to
18:43
make your own opinion on it. Yeah,
18:46
that's good. And we enable, in our
18:48
application, we enable you to
18:50
decide what are your goals by yourself.
18:52
We never ever decide for you these
18:54
the goals. If you ask people or
18:56
you read articles, you will find numbers.
18:59
But if you will dig into
19:01
that more, you will find
19:03
different numbers from different disciplines and
19:05
different methodologies. It's
19:09
not what you need. I suggest that
19:11
each one of us finds,
19:14
internally: what are my values, my
19:16
spouse's values, what are we going
19:18
to do together. And the
19:20
moment we decide that as couples, this
19:23
is what we want to do. Our
19:25
Elora will reflect what we have done, simple.
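As a rough illustration of "we only reflect the reality against the goals you set yourselves", here is a tiny Python sketch; the metrics, numbers, and field names are made up, not taken from the app.

```python
# Hypothetical sketch: the app reflects your own goals back at you, it does not judge.
daily_goals = {"words_heard": 15000, "active_minutes": 90, "play_sessions": 3}   # set by the parents
measured_today = {"words_heard": 9200, "active_minutes": 120, "play_sessions": 1}

def reflect(goals: dict, measured: dict) -> list:
    """Return plain statements of goal vs. measurement, with no 'good'/'bad' labels."""
    return [f"{k}: {measured.get(k, 0)} today, your goal is {v}" for k, v in goals.items()]

for line in reflect(daily_goals, measured_today):
    print(line)
```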
19:29
Yeah. But
19:32
there is a thing I
19:34
can see what happens. You said
19:36
there's words I can see.
19:38
What are other things you look
19:40
for? Well, we have several
19:42
sensors. in our tiny device
19:44
and one of them tracks
19:46
the activity level. I think this
19:49
is one of the most
19:51
important things, because there is a lot of
19:53
research, and especially the World
19:55
Health Organization has published on many occasions
19:57
announcements that we do not
20:00
play enough with our newborns and
20:02
babies and they point out
20:04
that this is the main reason
20:06
to have overweight kids
20:08
at the elementary school. We didn't
20:10
have that in the past and
20:12
they want us to play more
20:14
with the babies. In the past,
20:16
kids, babies couldn't sit in the
20:19
crib by themselves and just be
20:21
stuck; they would scream. Now parents, babysitters,
20:23
nannies, I'm not blaming them, but the
20:25
technology enables you to put in the baby's
20:28
hands an iPhone or iPad, and they can
20:30
play all day and they are not screaming
20:32
and yelling, and it's good for everyone.
20:34
It's not good for the baby from the
20:36
physical point of view: they need to
20:38
be all over the place, they need to
20:40
break your house, they need to
20:42
be on the carpet, and you need to
20:44
play with them, sorry. So I
20:47
think this is the most, this is one
20:49
of the reports I do like:
20:51
to present parents how much your baby was
20:53
active when they were not asleep. This
20:55
is one thing. The other thing we do,
20:58
it's about engagement: how much you
21:00
and any other caregiver engage with
21:02
the baby over the day, during the
21:04
week, during the month; how
21:06
many words your baby was listening to, was
21:08
exposed to. And we
21:11
have another sensor: it's
21:13
a sensor that tracks the air
21:15
quality around your baby every couple
21:18
of seconds, or like every minute
21:20
I think it samples, and it
21:22
monitors the air around
21:24
your baby and tells you if
21:26
your baby is exposed to pollution
21:28
or not. In most of the cases
21:30
it's fine, but I would like to
21:32
be aware that my mother-in-
21:35
law is still smoking next to my
21:37
baby. And it's important because many
21:39
people live in an area where they
21:41
need to know what
21:43
to do about it, and there are
21:45
paths, so that even
21:47
in a place
21:49
where the air is not
21:51
so clean, there is a path
21:54
that you can walk where your baby
21:56
doesn't need to be exposed to air pollution.
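A minimal sketch of what periodic air-quality sampling with a parent-facing alert could look like; the sensor call, interval, and threshold are assumptions, not the real product's values.

```python
# Hypothetical sketch of periodic air-quality sampling around the baby.
import random
import time

POLLUTION_THRESHOLD = 100.0   # made-up index value
SAMPLE_INTERVAL_S = 60        # "every minute or so", as described above

def read_air_quality_index() -> float:
    return random.uniform(10, 150)   # stand-in for the real sensor driver

def monitor(samples: int) -> None:
    for _ in range(samples):
        aqi = read_air_quality_index()
        if aqi > POLLUTION_THRESHOLD:
            print(f"Air quality alert: index {aqi:.0f} near the baby")
        time.sleep(SAMPLE_INTERVAL_S)

# monitor(3)  # example call
```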
21:58
These are the main sensors. By
22:01
saying that, we have a
22:03
touch sensor: when you touch the
22:05
logo of the
22:07
Elora in a
22:09
very gentle way, because the Elora is attached
22:11
to the baby's lower belly,
22:13
any time you touch it, it changes
22:16
its color to a very gentle blue,
22:18
which gives you four seconds to say
22:20
whatever you want. For instance, oh,
22:23
let's have carrot, first time carrot. Oh,
22:25
let's have banana, first time banana.
22:28
Oh, grandma just came. Whatever
22:30
you say will be, or let's change
22:32
the diaper. Oh, it was dry, it was
22:34
wet. Anything you're gonna say
22:36
will be presented later on in
22:38
the app using speech-to-text technology.
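A small sketch of the tap-to-record diary flow just described; record_audio and speech_to_text are placeholders standing in for the real audio capture and ASR components.

```python
# Hypothetical sketch of the tap-to-record baby diary flow.
from datetime import datetime

diary = []

def record_audio(seconds: int) -> bytes:
    return b""                      # stand-in: capture from the device microphone

def speech_to_text(audio: bytes) -> str:
    return "first time carrot"      # stand-in: any speech-to-text service

def on_logo_tapped() -> None:
    audio = record_audio(4)                           # four-second window after the gentle blue light
    entry = {"time": datetime.now().isoformat(), "note": speech_to_text(audio)}
    diary.append(entry)                               # shown later in the app as the baby diary

on_logo_tapped()
print(diary)
```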
22:41
Why? We changed
22:43
the way you should treat
22:45
your baby diary. From the
22:47
parent's point of view, the
22:50
annoying part is opening the telephone
22:52
and typing in that you were
22:54
breastfeeding, your baby was breastfed for
22:56
20 minutes on the left side. It's
23:00
fine, but you cannot stick to
23:02
that for a long time. For the
23:05
parents, we enrich the world of
23:07
what they can log, because
23:09
anything you're going to say is
23:11
going to be presented in the app. But
23:14
it's very interesting from
23:16
our side. What
23:18
is AI? AI is
23:20
not just fancy and
23:22
sophisticated algorithms. No way. AI
23:25
is about
23:27
collecting quality
23:29
data. If
23:31
you know, in the area of AI
23:33
there is a term saying garbage in,
23:35
garbage out. It's not just about developing
23:38
the right and the most sophisticated algorithms;
23:40
it's about having quality data. Our
23:42
data is the data from the moment you tap
23:44
the Elora and say what you have done: let's
23:47
have carrots. It
23:49
enables us to have
23:49
data that synchronizes
23:52
the sounds with the
23:54
motion of what it is to
23:56
feed your baby with
23:58
different types of foods.
24:00
Like, maybe right now your baby is in
24:03
a sitting position, or your baby is
24:05
nestling in your hands and you're cradling the
24:07
baby while you're breastfeeding them. Maybe it's
24:09
a situation where you change the diaper, and
24:11
these are the patterns that we
24:13
need to track, to develop around them algorithms
24:15
that will enable us to understand it
24:17
later on without you needing to say it.
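A hedged sketch of how a parent's voice tag could become a label for the sensor window recorded around the same moment, which is the "quality data" idea described here; the data structures are hypothetical.

```python
# Hypothetical sketch: a voice tag labels the sensor data recorded around that moment,
# yielding clean training examples over time.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SensorWindow:
    start: float            # seconds since midnight
    end: float
    sound: List[float]
    motion: List[float]

@dataclass
class LabeledExample:
    label: str               # e.g. "feeding: carrot", "diaper change"
    window: SensorWindow

def label_window(tag_time: float, tag_text: str,
                 windows: List[SensorWindow]) -> Optional[LabeledExample]:
    """Attach the parent's spoken tag to the window that contains it."""
    for w in windows:
        if w.start <= tag_time <= w.end:
            return LabeledExample(label=tag_text, window=w)
    return None
```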
24:22
This is one aspect but the other
24:24
aspect is: oh my goodness, you had
24:27
carrot one time and nothing happened, but
24:29
the second time... in many occasions allergies
24:31
start only later, not the first
24:33
time you feed the baby with
24:35
something. And it's so complicated to understand
24:37
what is the reason for these allergies.
24:39
And we would like to connect it
24:42
with the sleeping patterns and the crying
24:44
and the annoying activities that happen later
24:46
on. And you shared
24:48
with us very honest and
24:50
simple data, but in the future,
24:52
we will enable you to
24:54
learn from it much more things,
24:57
but we need the data.
24:59
So two things may be earned: a
25:02
parent who was practiced in describing
25:04
the reality of the world, because
25:06
you talk to your baby, oh,
25:08
grandma just came; and because
25:10
you tapped it on the Elora,
25:12
you earn as a parent a
25:14
very detailed baby diary. We earn
25:16
very clean and quality data that
25:18
tells us more, that enables us
25:20
to develop algorithms. So this is
25:22
one of the things we have
25:24
done. We need more data all
25:27
the time because more data gives
25:29
us more tools to develop better
25:31
algorithms. I love it because
25:33
there's always this discussions, let's say not on
25:35
the positive side with carrots, but on
25:37
the negative side. You gave
25:39
the baby some sweets, some
25:42
stuff. And then
25:44
the parents have an argument about
25:46
is the baby getting aggressive now because
25:48
of that. And if you have
25:51
some data, because three, four times this
25:53
happens and every time, like
25:55
an hour later, the child can't sleep or
25:57
something like that, you have a connection
25:59
there. something
26:01
very important about the level
26:03
of sugar in the baby's
26:06
blood. In the past,
26:08
we didn't have all of
26:10
these sweets in our life. These
26:12
sweets, people don't know,
26:14
sugar in the form we have
26:16
it nowadays is something that
26:19
we have had for around 400, 500 years.
26:21
Actually, people don't
26:23
know, but the Industrial
26:26
revolution. Everybody talk about the
26:28
industrial revolution. It started in England
26:30
actually. People don't know that.
26:32
People don't know that this revolution
26:34
was based on the sugar
26:37
and the energy that it gave.
26:39
You took 7, 8, 9,
26:41
15 years old kids. You
26:44
gave them a glass of
26:46
tea with a few spoons of
26:48
sugar, because sugar was the
26:50
energy that made these kids
26:52
work 20 hours a day.
26:54
And this is how the revolution started:
26:56
you could keep these kids going
26:58
with this cup of sugar and
27:01
butter or whatever you
27:03
want, and a little
27:05
piece of bread and
27:07
jam that was based on sugar,
27:09
and this was the
27:11
real engine of the industrial revolution.
27:13
Nowadays, we have sugar all
27:15
over the place, and you said
27:18
something very important about the
27:20
level of activity, crying patterns, and
27:22
I cannot fall asleep because of the
27:24
sugar I gave my baby, which is not
27:26
good, not good. I'm sorry, I'm
27:28
saying that, I'm not supposed, again, I'm not
27:30
supposed to tell you what is good
27:32
parenting, but I'm talking from my pain points
27:34
right now. So I take it back,
27:37
I'm taking back my words. Do whatever you
27:39
want with your kids, because this is our
27:41
methodology. But please look at
27:43
the connection between A and B: sugar
27:45
before night sleep. Thank
27:47
you. That is great. No,
27:49
this is, I mean, there's always something
27:52
like, like if you have the data
27:54
you can make, like it's so simple.
27:56
Like now the people say, don't give
27:58
your child sugar. It's okay.
28:00
But you can, this is like this
28:02
product thing. The sugar is the product,
28:04
but the marketing would be the need.
28:06
You want to sleep. Yeah,
28:08
so then your baby has to sleep
28:10
and then don't give sugar I
28:12
have to share with you another thing
28:14
people talk about the AI AI
28:16
is going to be the next revolution
28:18
if we talked about the industrial
28:21
revolution The industrial revolution was about kind
28:23
of conversation with the horses Horses
28:25
listen, we are going to create a
28:27
technology that replace your muscles. We
28:29
might not need you anymore Oh, the
28:31
horses said my goodness. What we're
28:33
gonna do? We're going to be... nobody's
28:35
gonna need us. The AI
28:37
revolution, people think it's about the same change,
28:39
but it's a different revolution because it's
28:41
about a bit more: we
28:43
might not need your brain and creativity,
28:46
we're going to have AI that will
28:48
do it instead of you. Well,
28:50
maybe it's true, maybe it's not true,
28:52
nobody can predict the future. I
28:54
can tell you another thing, though. The
28:57
real revolution for the average
28:59
person like me and you is
29:01
about the ability to
29:04
understand that we have an opportunity to
29:06
collect data that we couldn't collect before
29:08
in much more practical and cheap way.
29:10
I will explain. In the past, in
29:12
order to collect data, you need to
29:14
have a paper and pen and to
29:17
document everything you do. Have a diary,
29:19
later on, Excel sheets, later on, application.
29:21
No, no, no, no, no. You
29:23
can have a device like the Elora, attach
29:25
it to your baby, and you are going
29:27
to have And to
29:29
digitize so many things that later
29:31
on in the future, the very same
29:33
psychologist and people that are going
29:35
to analyze it, the physicians will go
29:37
back and say, and ask the
29:39
question, how, nowadays, when you
29:41
started to give your baby calf, oh
29:43
yeah, I wrote it down, but
29:45
nobody knows what is the connection with
29:47
that carrot to the crying patterns
29:49
and the sleeping pattern, the activity level
29:52
and the pooing. Nobody connected yet. We
29:54
are gonna do that. This revolution
29:56
is about The ability
29:58
to collect data and with the
30:00
future AI is the ability
30:02
to use the very same data
30:04
and to prevent and predict
30:06
things that nobody even imagine that
30:09
you can. I
30:11
totally get that and this is
30:13
like not only the data,
30:15
but the time when you collect
30:17
the data, you focus on zero
30:19
to two years. That means this
30:21
is when parents are most stressed
30:23
and forgetful. Everybody
29:25
knows, let's say, mothers forget
29:27
things. So, 'yeah, later
29:29
I will write down that she got
29:31
carrots', huh, no, you forget that.
29:33
And so now you have a possibility
29:36
even more. Just one thing I find
29:38
really, really interesting: because if I take
29:40
out my phone, the child is already
29:42
looking at the phone and wants the
29:44
phone. True. So
30:46
true. It's so true.
30:48
I will tell you more than that.
30:50
Right now, when you develop a product,
30:52
you have to think in terms of
30:54
MVP, minimal viable product. You cannot develop
30:56
everything. Otherwise, you will never ever launch
30:58
it. We started our product
31:00
and it was designed for the first
31:03
six months. And the moment we launched
31:05
it, we started to develop the other
31:07
features to make it fit through the
31:09
first two years of life. But our
31:11
real challenge is to provide you, as
31:13
the parents, reasons
31:15
to keep using the product later on.
31:17
How are we gonna do that?
31:19
I would like to present you
31:21
the connection between the book
31:24
you read to your baby, who is
31:26
gonna be a toddler later
31:28
on, and their ability to recognize
31:30
purple. The word purple
31:32
doesn't exist in the world; it's
31:34
an imaginary word, it's an
31:36
imaginary term that they will never
31:38
ever say if you
31:40
didn't practice with your baby. So you
31:42
need to have life experiences,
31:44
to play with your baby or
31:46
to read to your baby, something
31:49
that we'll mention or we'll point, this
31:51
is purple. And later on your baby
31:53
will say purple, purple. And we would
31:55
like to show the connection and to...
31:58
You know, people say you need to
32:00
have quality time with your baby. I
32:02
would like to tell you how this
32:04
quality time is quality. What
32:06
is the quality in the quality time? I would
32:08
like you to see the connection between what you
32:10
have done with your baby and their ability to
32:12
count one, two, three, five, to ten. It's
32:15
about their thinking in math world,
32:17
which is very important, not just
32:19
vocabulary. If they start to count,
32:21
you have done something very important
32:23
that will impact their ability to
32:26
translate the world to themselves or
32:28
how much you love your baby.
32:30
We do not have enough data
32:33
of laughing and we are not good
32:35
enough with recognizing it. It's not about
32:37
the AI and the algorithms, it's about
32:39
the data sets of baby laughter because
32:41
people don't laugh enough with the babies
32:43
and they should be laughing. And
32:45
we try to do that by analyzing
32:47
data from the public data that you
32:49
see all over the YouTube and everything,
32:51
but it's not enough. It's
32:53
not enough, okay? So we
32:56
would like to make you
32:58
use the Elora more and more
33:00
by providing you with more
33:02
tools that will show you how
33:04
you as a parent contribute
33:06
to your baby's development, cognitive development,
33:08
activity, behavior, and so on. So
33:10
two years is just the first stage. Later
33:12
on, over the very same hardware, we would
33:14
like to update the algorithms over there and
33:16
the software to enable you use it even
33:19
more. For the
33:21
people who watch the video, I'm
33:23
nodding my head crazily because as
33:25
an economist, I'm like, yes, yes,
33:27
yes, exactly. I
33:29
have to advertise.
33:33
There's an author. She's called Emily Oster.
33:35
She wrote two books on how to
33:37
raise a child. And the funny thing,
33:39
she's an economist. And she said, yeah,
33:41
we economists are not doctors, but doctors
33:43
are good at caring for people, which
33:46
is what they should do, but economists are
33:48
good at data. So it's really the
33:50
thing. And if you have this data, you can
33:52
do so much about it. This is like, it's
33:55
really great. It's not just that you can
33:57
do so much about it. It's
33:59
about not bringing, it's not to
34:01
do, not to bring your traumas,
34:03
your issues, your background, your guilt
34:05
feeling. It's about dealing with
34:07
it in a very simple way; we
34:09
simplified everything because your baby doesn't
34:12
need this shit
34:14
you bring into the relationship.
34:17
Because you are still a kid of
34:17
other parents, the grandparents, and
34:19
on many occasions you think you
34:22
solved everything, but when you have a
34:26
baby in your life, everything comes back
34:28
to you. Sorry, I
34:31
would like you to keep it out of the
34:33
relationship with this lovely newborn. It doesn't mean that. If
34:36
one imagines the parents, the
34:39
own parents, grandparents, they grew up
34:41
in different times. Things were not
34:43
like we have. Absolutely. True. And
34:46
I have to share with
34:48
you. I raised my kids without
34:50
any baby monitors, any technologies. We
34:53
raised them. They have their
34:55
issues. They complain, like I complained
34:57
to my parents too. And
34:59
trust me, I have very good
35:02
reasons; my kids think they have
35:04
good reasons; there is no proportion between my reasons
35:06
and theirs. My mom didn't want
35:08
to raise me, my mom
35:10
rejected me when I was a
35:12
baby. People didn't even, we didn't have
35:14
it in the language, we didn't,
35:16
we didn't use the term depression,
35:18
but nobody knew that, and it
35:20
was a shame, it was a shame. So she
35:22
didn't want to raise me. She
35:24
had depression and it was shame. When
35:26
I was trying to talk to
35:28
my mom about it, she didn't want
35:30
to discuss it till today. It's
35:33
impacted me in a way.
35:35
So I brought into the relationship
35:37
with my kids different issues
35:39
because of these issues and they
35:41
have issues with me. I
35:43
would love them to raise their
35:45
kids using only facts. That's
35:47
it. And to bring into
35:49
that their emotion and love and
35:51
values and cultural code, everything, but
35:54
based on facts and not
35:56
their issues as kids that I
35:58
raised maybe in a bad
36:00
way. Sorry. I was not
36:02
perfect. Nobody's perfect. Nobody's perfect. This is the
36:05
thing. It's important to know. But if we
36:07
can and people want to do it, and
36:09
like, this is the thing. If I think
36:11
about reading a book, the child is... say
36:13
six months doesn't sleep so much or starts
36:15
and I have to go out in the
36:17
night with the child or whatever. I won't
36:19
read a book at that moment. I
36:21
know. I know. But what we have
36:23
done, besides of presenting you
36:25
the real facts that had happened
36:27
during the day, week, months, we
36:30
understood that technology
36:33
is not everything.
36:35
Within our app, with one click,
36:38
you can access
36:41
a world of experts. Any
36:44
baby expert can join for free
36:46
on the littleone.care
36:48
app to create an account,
36:50
write their own tips. And
36:52
what you see when you access the
36:54
app and you look over your
36:56
baby's data sets, when they were sleeping,
36:59
feeding, crying, whatever, when you click on
37:01
it, different kinds of
37:03
tips are presented that
37:05
were written in advance by the very
37:07
same baby experts. And these tips
37:09
are, what we have done, they
37:11
are not just random tips,
37:13
because they are filtered by the
37:15
age group of your baby
37:17
and the event type you just
37:19
clicked on.
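A tiny sketch of serving expert tips filtered by the baby's age group and the tapped event type, as described; the tip fields and age buckets are invented.

```python
# Hypothetical sketch of filtering pre-written expert tips.
tips = [
    {"age_group": "0-3m", "event": "sleep",   "author": "sleep expert",  "text": "..."},
    {"age_group": "4-6m", "event": "feeding", "author": "nutritionist",  "text": "..."},
    {"age_group": "4-6m", "event": "sleep",   "author": "sleep expert",  "text": "..."},
]

def tips_for(age_group: str, event: str) -> list:
    """Return only tips written for this baby's age group and the event the parent tapped."""
    return [t for t in tips if t["age_group"] == age_group and t["event"] == event]

print(tips_for("4-6m", "sleep"))
```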
37:21
And you can communicate with this expert; you can
37:23
chat with them through the
37:25
app. If you are going to
37:27
have a longer relationship with them,
37:30
you need to pay them separately,
37:32
because here it's free. But we
37:34
understood that data, that AI, will never
37:36
ever solve all problems. Sometimes you
37:38
need to hear someone that has been
37:40
there, that expert mom
37:42
or speech therapist or sleeping experts,
37:44
breastfeeding experts, nutritionists, baby care
37:46
specialists. All of them are waiting
37:48
for the parents over there. And
37:50
you need to hear it from them. Dietmar,
37:53
you are fine. It's just the
37:56
challenge for the next couple of weeks. Keep
37:58
doing what you're doing. You need to hear
38:00
that. It's not an AI. It's
38:02
not a pop -up that will tell you
38:04
you're good. You need to hear someone
38:06
and you need to question it in your
38:08
personal way. And then
38:10
it works. This connection. We created
38:12
the hub, and we are actually
38:12
trying, through this app, to
38:16
bring back the whole village that
38:18
used to raise one baby. You
38:20
know, you need a village to
38:23
raise one baby and this is
38:25
what we're trying to do through
38:27
the app, but over real data,
38:29
real facts and not just your
38:31
stories and translation of reality. Totally
38:33
makes sense to me because at
38:35
the moment, I think it is not
38:37
yet there; maybe at a
38:39
certain point it will have compassion
38:42
or can connect things, but at
38:44
the moment we need hybrid solutions
38:46
where the people are still a
38:48
big part of it. It cannot
38:50
replace your facial expressions to a
38:52
baby, your kisses and hugs, and
38:55
your talking to a baby;
38:57
it will never ever replace that.
38:59
don't even imagine it will
39:01
never ever replace it I'm afraid
39:03
of the day people will
39:05
start think it will it will
39:08
never ever replace it. I,
39:10
I have to share something about
39:12
dialogues with a baby: you
39:14
cannot put a baby in front
39:16
of the telly and think that
39:18
this is exposure of the baby to words,
39:20
no way. Because we talk
39:22
in the baby language, and
39:25
because of
39:27
our face expression
39:29
and the way we pronounce
39:32
the verbs, the vowels, we make
39:34
the baby understand and learn
39:36
the language we talk, German,
39:38
Hebrew, English, whatever. This is
39:40
the time when the brain,
39:43
as I said, the neural
39:45
connections happen, one million every
39:47
second, one million a second. Then,
39:49
you know, if
39:51
birds need to teach the
39:55
other birds how to fly,
39:57
and in their brain there
39:59
is already a neural connection
40:01
that tells them: this is
40:03
the way we fly and
40:05
open our wings and what
40:07
we need to do, then we
40:09
have an algorithm in our
40:11
brain we were born with
40:13
that tells us: you need
40:15
to learn, Chinese, English, it
40:17
doesn't matter what language, Japanese,
40:19
it works And the funny
40:21
thing is that in these
40:24
early stages, this
40:26
neural connection, this world in your
40:28
brain can learn several languages at
40:30
the same time. There are
40:32
many babies that learns German
40:34
and Russian because they moved
40:36
from Russia to Germany to
40:38
Berlin and they learned both
40:40
languages. The number of
40:42
words they know is the very
40:44
same number of words that a German baby
40:47
knows or an American baby knows. But
40:50
they are learning different
40:52
dialects, different accents, different
40:54
grammars. And they will learn
40:56
two languages. It's fine. If
40:58
you can do it, go for it.
41:01
They will know less words, but they
41:03
will know both languages. And later on,
41:05
they will be even better than others. In
41:08
these early stages, we
41:10
have incredible abilities that I would
41:12
say it in a different way.
41:16
Kids that learn a second
41:18
language in the
41:20
elementary school, it's a miracle. To
41:23
learn a second language in the
41:25
elementary school or later on in the high
41:27
school, it's a miracle. When
41:29
you're doing it, when you are in the early stages, it's
41:32
like natural thing to do. Yeah,
41:36
yeah. It's really. I mean,
41:38
at the moment I learn Spanish and
41:40
it's hard. My child,
41:42
she grows up with the German
41:44
and Spanish, maybe even English or
41:46
so. And it's like natural for
41:48
her. The brain is able to
41:50
do it. And this actually, the
41:52
basic thing of Elora is to
41:54
help the people know what's possible
41:56
and what's not possible and like
41:58
words where the thing, there's a
42:00
really important thing. Not TV, but
42:02
real words, like also moving and
42:04
all those things. You collect
42:06
You collect this data and I have
42:08
to ask this question. You
42:10
obviously, I assume you
42:13
have a data protection
42:15
strategy behind. Yeah, absolutely.
42:17
Listen, listen. First of
42:19
all, one of the
42:21
most important barriers with
42:23
wearable devices. It's not
42:26
just the behavioral thing. Yeah, to attach it
42:28
to my baby. What is it? Why
42:30
do you need that? I don't need it.
42:33
Beside of that, there are several aspects
42:35
that you need to take into
42:37
consideration. Otherwise, it's a big failure. Safety.
42:40
It has to be a
42:42
product that went through any
42:44
possible regulation. That's, for
42:46
instance, the material it's made of, the
42:48
size of it, the shape of it, how
42:50
you attach it, and so on and
42:53
so on. So you have to go through
42:55
that with a lot of understanding. You
42:57
need to work with entities that develop, and
43:00
this is what we have done, toys
43:02
for babies under the age of three.
43:04
Because it's a very interesting and very
43:06
professional world. You cannot do whatever you
43:08
want and there is no space for
43:10
mistakes. You cannot make mistakes. You need
43:12
to use other people 20, 40 years
43:15
experience and to work with them. This
43:17
is one thing. In
43:19
terms of privacy policies, we
43:21
have the GDPR. You
43:23
have California's rules, California being
43:25
the most advanced
43:27
state in the US. So
43:30
there are two ways. of working in
43:32
this area. As an entrepreneur, most
43:35
of them, if technical people are
43:37
listening to this podcast, I have
43:39
to tell you something very interesting.
43:42
There are two ways of dealing
43:44
with privacy policies. One, to
43:46
learn what is going
43:49
on, what is the
43:51
last version, regulation, rule,
43:54
statements, and to follow them. and it's kind
43:56
of a mission that you need to do
43:58
in order to check the box and to move
44:00
on in your life. It's another annoying mission.
44:02
Many people see that that way. There is
44:04
another way to see that and this is
44:06
what we have done. Try
44:09
to access the brain of the
44:11
people who wrote this regulation and try
44:13
to see the world through their
44:15
eyes, empathy. The moment
44:17
you do so, it's open
44:20
a new world and
44:22
instead of see the very
44:24
same issue as
44:26
a problem that you need to solve as
44:28
a mission, it gives
44:30
you a lot of freedom. Because
44:32
what is, for instance, GDPR?
44:34
The only thing they want
44:36
to do is they would
44:38
like you to explain the
44:40
simple user, what do you
44:42
collect? Why you
44:44
collect this data? Why you
44:47
need this data? What you're going
44:49
to do with this data?
44:51
And if you will understand that
44:53
and you will be curious,
44:55
do it, true curiosity, it
44:57
will enable you not to
44:59
write legal and annoying, boring
45:01
papers that people don't even
45:03
read. It will enable
45:05
you to make these people partner
45:08
with you on this. And
45:10
the most important thing is like
45:12
yourself: if you don't want to
45:14
remain a partner with
45:16
someone, you wanna have a red
45:19
button, one click, done. I don't
45:21
want to be partnering with you?
45:23
Fine. Provide this button. You
45:25
don't have to... but if they
45:28
read your policies in the right way, they're
45:30
going to lose. They are going to
45:32
lose if they are not partnering with you
45:34
anymore. make them feel like they're going
45:36
to lose because it's not just about feeling,
45:38
share what you're really doing. So
45:40
when we ran at the beginning
45:42
with the beta trials, alpha trials,
45:45
and the early adopters programs, we
45:47
released terms of use and privacy
45:49
policies. They were documents
45:51
with pictures, not just text,
45:53
because we wanted people to
45:55
read them to the end. We loved
45:57
it. Later on the lawyers
46:00
said pictures are not a good way, it's
46:02
not good, people might understand them in different
46:04
ways, but we wanted them to be read.
46:06
We forced our team of lawyers,
46:08
they had asked for that, to write
46:10
it in a way that people would
46:12
love to read it. It was very
46:14
complicated because it's a world of terminology
46:16
that you cannot twist it all over
46:18
one night. I would love
46:20
for the very same GDPR people,
46:23
the very same California people, all the
46:25
people who lead this world, to
46:27
force companies: don't use legal
46:29
words. Sorry, I wish there was that,
46:31
because I want to read it, and
46:33
don't write more than one A4
46:35
paper. Because I
46:38
do believe in that; it's about
46:40
respecting your clients. That's so great.
46:42
Yeah That's it. This is what
46:44
we have done. And we work
46:46
with the parents. So privacy policy
46:48
is an issue. No advertisements. We
46:50
don't sell your data. We
46:52
don't need that. What we want to
46:54
do is the best for you. And
46:57
in the future, we will provide
46:59
more details like if you would
47:01
like to use your data. For
47:05
instance, we would like to tell
47:08
parents, after we
47:10
have a lot of babies in the US, we
47:12
would like to tell parents a lot of things
47:14
that nobody knows nowadays. The
47:16
secret world of babies is
47:18
telling parents which car
47:20
seat to pick: car seats
47:22
that you buy for your car,
47:24
you don't know what to buy. But
47:26
we would like to distinguish and
47:28
to tell, okay, these car seats, the
47:30
average baby falls asleep immediately. The
47:33
other car seat, that we compared
47:35
between the two, oh my goodness, although
47:37
they have amazing influencers that tell you
47:39
it's the best one, it's
47:41
a nightmare. You drive and
47:43
you don't... and the
47:45
average baby doesn't stop crying. So
47:48
in order to share this data: can you
47:50
please tell us what car seat you use? You don't
47:52
want to? No problem, fine. Baby formulas,
47:54
which brand do you use?
47:56
We can tell a lot around
47:58
it: how much time it takes you to
48:01
feed, if the baby sleeps
48:03
well, if the baby plays nice
48:05
and is smiling; with the other baby formula,
48:07
Oh my goodness. It's a jungle. You
48:09
cannot... But you have to do
48:09
that by explaining to parents why you do
48:11
it, how you're gonna use it,
48:16
what they're gonna gain back. A win-win
48:18
relationship. Otherwise, it won't work. Yeah Not
48:20
only this, there are two things:
48:22
one thing is, if I don't pay
48:24
for it, I pay with my
48:26
data. So there is
48:29
somebody tricking. We say:
48:31
you pay with your data?
48:33
You don't have to pay with
48:35
your data. If you pay with
48:37
your data, this is what you
48:39
and other parents are going to
48:42
gain. Fair. Exactly. If so, you
48:44
can donate your data basically in
48:46
this case. And if you don't
48:48
want it: a red button in the app,
48:50
one message, we will delete the
48:52
data, and then you receive an email saying:
48:54
why? Just to understand what went wrong,
48:56
what you don't like, and so on. This
48:59
is what we do.
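A minimal sketch of the "red button" flow as described: one action deletes the family's data, followed by a single question by email. Function names and the storage shape are hypothetical.

```python
# Hypothetical sketch of one-click data deletion with a follow-up question.
def delete_all_data(user_id: str, storage: dict) -> None:
    storage.pop(user_id, None)          # remove everything tied to this family

def send_followup_email(address: str) -> None:
    print(f"to {address}: We deleted your data. May we ask what went wrong?")

storage = {"family_42": {"events": []}}
delete_all_data("family_42", storage)
send_followup_email("parent@example.com")
```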
49:01
And this comes to a pure ethical
49:03
thing. It's not about lawyers and
49:05
this is what I like.
49:07
Do the right thing and customers
49:09
will love it. Because you
49:11
need to understand the logic behind
49:13
the passion that made these
49:15
people to dedicate their life to
49:17
write these privacy policies. Listen,
49:19
in the past, and
49:21
even now, in many states, in
49:23
many countries in the world, it's
49:25
still a wide world, and it's
49:27
not fair. It's not fair as
49:29
a user, from the user point
49:32
of view. Yeah? Totally.
49:36
Now we have to... come to the last
49:38
question of the interview. Although, I mean,
49:40
people out there know by now that I
49:42
have a child and I'm like totally
49:44
interested in the topic. Obviously I could go.
49:48
She's three. She's unfortunately not yet
49:51
in the area where you have
49:53
a product. But I definitely
49:55
get that. But before we
49:57
talk about that, the most important question
49:59
of the whole podcast, maybe, how
50:01
probable do you think is the Terminator
50:03
or Matrix scenario? We just talked
50:05
about ethics. So that is a good
50:07
segue. Will the machines take over? If
50:12
the machines will take over? Will
50:15
they take over the world?
50:17
Will they put us in
50:19
zoos or exterminate us like
50:21
we are cockroaches? I will
50:23
tell you. I think that
50:25
during revolutions, predicting
50:27
the future is the
50:29
most stupid thing to
50:31
do. However, during revolutions,
50:33
two things are happening. two
50:36
types of people, and
50:38
we are divided to two types of people.
50:41
And I will use the COVID
50:44
to share it with you,
50:46
okay? I will use the COVID.
50:49
All of us are split. I mean, I
50:51
will use the, if you remember the
50:53
cell phones, we, the cell phone access at
50:55
the beginning, it was in the cars
50:58
later on, people have massive, massive phone in
51:00
their hands, but they could walk and
51:02
to talk like they're walking and people look
51:04
at them, oh my goodness, what he's
51:06
doing. And we split
51:08
it to two different populations. One, oh
51:10
my goodness, I want, I want to have one of them. It
51:13
was not, it was not a big group. And
51:16
the other group, me,
51:19
I will never take this
51:21
vaccination. Me, I will never
51:23
have my boss to bring me
51:25
or buy me a phone that
51:27
everybody will be able to call
51:29
me out of the working hours.
51:31
No way, no way. What
51:33
I'm trying to say, see
51:36
in which group you are, ask
51:39
yourself, why am I in that group?
51:41
This is the very first question. Then
51:44
ask yourself, okay,
51:46
what are my
51:48
skills? What
51:50
about my knowledge?
51:53
What are my
51:55
abilities, resources, that will
51:57
make me a leader in that group?
51:59
Because the leaders will make the money. Sorry
52:03
for saying that. I'm very,
52:05
very, very bad person right now.
52:08
I'm not saying if it was good or
52:10
not good to have vaccination. I'm not saying
52:12
if it was good or not good to
52:14
purchase to be the very first person
52:16
to be connected all day to the internet.
52:19
The people who led
52:21
each approach made the
52:23
money, became leaders,
52:25
because the majority are
52:27
people who are led by
52:29
others and paid for
52:31
the iPhone and suffered
52:33
from taking or not
52:35
taking vaccination. Again, I don't
52:37
care what was right,
52:39
but I'm talking from how to see
52:41
revolution. Revolution is about leading it. And
52:44
I don't care what you're going to lead in the
52:46
revolution. Some
52:48
people still ride their horses nowadays and
52:51
didn't even pay attention to the industrial
52:53
revolution and they have a happy life.
52:55
I'm not saying what is good or
52:57
bad and they have ranches and they
52:59
will teach you how to ride your
53:01
horse and they make good money of
53:03
it. I'm not saying what is good
53:05
or bad. I'm saying, look at
53:07
yourself. Where am I in this revolution?
53:09
Prediction is a waste of time. Lead
53:11
it. Some people
53:13
talking about complaining or talking about
53:15
what is going to happen, but
53:17
some people create new reality, be
53:19
part of it. This is
53:21
actually great, because
53:23
this is the Beginner's Guide
53:25
to AI podcast, meaning it's
53:27
about learning AI, it's about connecting
53:30
to AI, using it, starting
53:32
to use it. And actually, yeah,
53:34
starting with zero
53:36
to two years: they will
53:38
be AI natives, those children.
53:40
This is coming. This
53:42
is the next wave. Absolutely.
53:44
I think that the formal educational
53:46
system is going to suffer. more
53:49
than any other industry, because
53:51
it's easy to talk about
53:54
many industries, but think
53:56
about the average teacher, 50
53:58
years old, 55 years old, she's
54:00
good, she's good with kids. And
54:02
now she asks the kids: write me, do
54:05
this and that; they go to ChatGPT
54:07
right there and send it to her with
54:09
a couple of pictures from Gamma AI, and:
54:11
what do you want from us?
54:14
And she needs to learn
54:16
from these 12,
54:18
15 years old, 12, 15 years
54:20
old kids, who know
54:23
better and will do better. The
54:25
toddlers and the babies we are raising
54:27
today. Oh, my
54:29
goodness. I'm afraid. I'm
54:32
afraid of them. Yeah. They're
54:34
going to be quite different than us. Yeah.
54:36
That's crazy. Yeah. As
54:39
the parents, as the parents, I think we
54:41
need to present them the world in a
54:43
way to see it as an opportunities. Yeah.
54:46
Yeah, I just think of
54:48
an Isaac Asimov story where
54:50
it's about teacher and the
54:52
school and the school is
54:54
like the robot that teaches
54:56
each child individually. I
54:59
don't think we will
55:01
need the robots because the
55:03
world, you don't need
55:05
robots, you don't need the...
55:07
entity to teach kids. You
55:09
need to create experiences and
55:11
learning through experiences is going
55:14
to change the learning from
55:16
technology that will provide you
55:18
information. Experiences are going to
55:20
replace because you don't need
55:22
to send your baby to
55:25
kids to school and they
55:27
can go and they
55:29
can learn informative world. the
55:31
informative world, it already exists
55:33
through OpenAI and any other
55:35
competitors, you don't need
55:38
that. Experiences, well, I
55:40
think we need animals and
55:42
human beings to experience the world.
55:44
Ride a horse, playing with
55:46
the cat and talking to a
55:48
human being is going to
55:50
be, and not just talking, you
55:53
know, playing in the mud with
55:55
another human being. I get goosebumps because
55:57
this is the end of the
55:59
story where the girl then says would
56:01
have been so great to be
56:03
in school with other kids and it's
56:06
exactly this. It's exactly nobody will
56:08
be able to take these experiences from
56:10
us whatever is going to happen. Especially
56:14
in the next let's say
56:16
20 years later on I
56:18
don't know. But I would
56:20
like to be part of
56:23
the leaders, because instead of
56:25
blaming or complaining or discussing
56:27
other people's activities, take
56:29
responsibility and do something. And this
56:31
is what I'm trying to do.
56:33
Maybe I will fail, but I've
56:35
tried. Great
56:39
last word for the podcast. The
56:41
only thing is, I mean, where
56:43
can we find you in the internet? Oh,
56:45
my goodness. Elora Baby
56:48
Wellness Monitor can be found on
56:50
Amazon. Elora Baby Wellness Monitor.
56:52
But we have a lovely website
56:54
named littleone.care,
56:56
after the company's name. And
56:58
over there, besides of
57:00
having a website with an
57:02
ability to purchase online
57:04
the product, And we
57:07
have a lot of
57:09
content that explains a lot
57:11
to parents and explains
57:13
to baby care specialists how
57:15
they can join the game. And
57:18
a lot of knowledge. And
57:20
over there, you can find the
57:22
groups, Facebook groups, we are
57:24
running for parents. Because we understood
57:26
that it's not just about
57:29
the AI and technology. You have
57:31
to create a holistic solution. A
57:33
to Z, that's
57:35
in the journey of
57:38
pregnancy and birth, you
57:40
will find anything you should
57:42
look for and any question you
57:44
might have to have shortcuts
57:46
to the right person in the
57:48
right community to talk to
57:50
and so on. So
57:53
I will put everything in the show notes
57:55
that people can find it because it's great.
57:57
It's a good thing. And yeah,
57:59
thank you for being here. Thank you
58:01
for hosting me. It was so nice to
58:03
talk to you. It was a pleasure
58:05
and I learned a lot. And
58:07
let's see in one or two
58:09
years you may have a product. Every
58:12
couple of months we have another idea
58:14
for another patent. Every couple of months we
58:16
learn something about this world. Oh my
58:18
God, I would love to talk to you
58:20
again. Yeah, then I definitely invite you
58:23
back and we see how things develop then.
58:25
So have a nice day, have a
58:27
lovely day and see you soon on
58:29
the podcast again. Thank you, sir. Thank
58:32
you. Yeah, you know
58:34
that I have a child by now probably
58:36
and yeah, unfortunately, she's too old for
58:38
Elora, because I really would love to
58:40
test it as an economist. I love
58:42
to have the data to make data driven
58:44
decisions. And what, as you said, quality
58:46
time means, in the sense that you have
58:48
really an ability to say what's quality
58:51
time. I loved the interview, and yes,
58:53
connect to me look what they have
58:55
there for a product if you have
58:57
a baby that age or a toddler
58:59
look at the product it's really great
59:01
it's really something where they thought about
59:03
I think um yeah so great that
59:05
you listen to the podcast until here
59:07
if you're still here then there's something
59:10
you can do that's go to the
59:12
newsletter page argoberlin .com slash newsletter or
59:14
just click follow on your podcast app
59:16
And it would be great to have
59:18
you in the next podcast again here.
59:20
Bye from Berlin. It's Dietmar
59:22
from Argo. And don't forget
59:24
to quickly check out sensei .io and
59:26
see what a digital replica can
59:28
do for you.