#67 The AI Race: ChatGPT's New Web Search, Meta’s Llama AI Scaling Efforts & Python 3.13's Upgrades

Released Thursday, 7th November 2024

Episode Transcript


0:00

Let's do it .

0:03

You have taste in a

0:05

way that's meaningful to software people .

0:08

Hello , I'm Bill Gates . I

0:13

would recommend TypeScript

0:15

. Yeah , it writes a lot

0:17

of code for me and usually it's slightly

0:19

wrong . I'm reminded , incidentally

0:21

, of Rust here Rust , rust .

0:25

This almost makes me happy that I didn't

0:27

become a supermodel .

0:29

Cooper and Ness . Well

0:31

, I'm sorry guys , I don't know

0:33

what's going on .

0:35

Thank you for the opportunity to speak to you today about

0:37

large neural networks . It's really an honor to be

0:39

here .

0:40

Rust Rust Data Topics Welcome to the Data Topics

0:42

.

0:42

Welcome to the Data Topics Podcast . Rust

0:44

Data Topics . Welcome to the Data Topics Podcast

0:46

. Hello and welcome to Data Topics Unplugged

0:49

, your casual corner of the web where we discuss

0:51

what's new in data every week , from

0:53

llamas to AI licenses

0:55

, everything goes . Check

0:58

us out on YouTube . I don't

1:00

even know where we are else anymore , but we

1:02

do have a video version of this , so feel free to go

1:04

there check us out . We also share

1:06

the screen here and there on the episode , so

1:08

feel free to have a look or have a look at the show

1:10

notes . Feel free to leave a comment

1:13

or question or send us via email

1:15

. We'll try to get back

1:17

to you in a timely manner

1:19

, but no promises there . Today

1:21

is the 4th

1:24

of November of 2024

1:26

. My name is Murillo . I'll be hosting

1:28

you today , joined by the one and only Bart

1:30

, hi . Woohoo , he's back

1:33

. Actually , yeah , but I feel like , for

1:35

the people listening , I'm not sure if no , maybe , yeah

1:37

, it's been two episodes , right that Bart hasn't

1:39

been there . Two released episodes

1:42

, no . Yeah that's true . Yeah , yeah , yeah

1:44

, and a recorded one and

1:47

one of the recording has to be released right indeed

1:49

, indeed , indeed , indeed . So , uh , what's

1:51

your absence ?

1:52

you want to explain yourself , mister I went to

1:54

uh , Crete — an island in

1:56

Greece — to do

1:58

a trail run , basically . Nice . How

2:00

was it ? It was fun , yeah

2:02

, did you beautiful island ? Did

2:04

you ? Um , how do you say your performance

2:07

? No performance . I don't know how did you place in the run

2:09

? I was , uh , eighth place

2:11

oh okay , so the

2:13

seventh loser , basically .

2:15

Well , exactly maybe next time , bart just

2:18

kidding , that's kidding .

2:19

How many people ? Eight , no , just nine

2:22

. Just

2:24

kidding . No

2:26

, I'm just kidding , I'm just , I'm just jealous because I couldn't

2:28

do it .

2:29

I couldn't do it , if I'm being honest , but it's really

2:31

nice , like uh , it was uh , like

2:34

you can do a lot of elevation there . Like it goes from

2:36

— Crete goes from sea level to

2:38

2500 meters , wow , yeah , I didn't know it was that high actually

2:40

.

2:40

Wow , but it's crazy , because you said like , oh , it's really fun .

2:42

No , wait — what did you say

2:45

in the beginning ? How did you describe it ? I said it's really

2:47

nice , it's a beautiful island . No

2:49

, no , no . You said it's really nice because you have a lot of elevation

2:51

, but to me that sounds like the opposite . You know

2:53

, it's like fuck .

2:54

That is true ?

2:54

Yeah , they say that . No , never mind , I'm not going to

2:57

go there . So cool . We

3:06

also had Halloween last week .

3:08

Did you do anything special ? I went

3:11

trick-or-treating with the kids .

3:12

Nice , nice , nice , nice , nice . What about

3:14

you ? We went . It

3:17

was a long weekend , right ? So me

3:19

and my wife , we rented a sleeper

3:21

van . We went camping as well . Oh

3:25

, I heard this . Yeah , you heard . We'll talk later

3:27

about it . It was fun

3:29

. It was fun , it was a different experience , it

3:32

was cool .

3:33

Where did you ?

3:33

go In the German community

3:35

of Belgium . So we didn't go far , because also it

3:37

was the first time . We're like , yeah , let's see how it goes

3:40

. And so we rented a van

3:42

with the black sheep van . So they have

3:44

a lot of different ones , like the sleeper vans

3:46

. They have like a kitchen and some , some of them have

3:48

like a shower and bathroom and stuff . Um

3:51

, so we just rented one and went to the german community . It

3:53

was good weather . It was a bit

3:55

like it didn't rain no , it was

3:57

good weather to camp I thought

4:00

it was a bit cold , but

4:03

I'm from brazil , so what do I know ? But

4:05

yeah , it didn't rain . It didn't rain

4:07

. We were able to do some hikes For November in .

4:09

Belgium , it was good weather .

4:10

Yeah , that's true , that's true , but I think the best day was

4:12

Sunday , when it was sunny , but that's the day we're coming back

4:14

, so it was like okay . But , yeah , it was good it

4:28

was good .

4:29

What do we have for today , bart ? I see here uh gpt search . What is this about ? Uh

4:31

, chat gpt release a new functionality which basically

4:33

is a search functionality , right like how you would

4:35

uh use google . Yeah

4:38

, you can now use chat gpt and you can

4:40

basically like . There's a small icon which you can

4:42

click — and you're showing it on the screen now

4:44

. It's like a

4:46

globe with search next to it . If you

4:48

click that , it will actually search

4:51

the interwebs instead of just

4:53

relying on its Training

4:56

weights , exactly , yeah

4:58

, what I understand is that they

5:01

leverage Bing quite a lot to do this .

5:04

But it's all under Microsoft umbrella , because Bing

5:06

does that too .

5:07

No , Well , Bing is just a search engine .

5:09

But it does also use LLMs behind and all these things

5:11

.

5:11

Right , yeah well they now also have an

5:13

AI version to that yeah .

5:16

Have you tried it ?

5:18

I tried it over the weekend and

5:21

I guess it works quite kind

5:23

of okay .

5:25

Better than Google .

5:26

Just recap that Google had the like

5:28

things that you would uh , that you , I think

5:31

, where you before you

5:33

had a lot of hallucinations like , if you have a specific

5:35

search , like I want to know uh

5:37

, which , uh which , shops

5:40

are open on Sunday that also

5:42

provide this service , like very specific , yeah

5:44

, right , like you , you save a huge amount of hallucinations

5:46

. Um , and

5:49

now it's just uses actual search

5:51

results to come to uh

5:53

an answer . Yeah , I actually use it . Like

5:55

it was very random , like when I tried it , it

5:57

uh , it was just released and I was looking

5:59

for uh climbing

6:02

areas in the Ardennes that

6:04

had a lot of routes that were suitable for kids

6:06

. So , very specific , right

6:09

. If I would have done this before , I'm sure there

6:11

would be a lot of hallucination .

6:12

And now it's quite okay . Oh

6:14

, great that's good .

6:16

When you go to the page that you're showing now . You

6:18

also have different types

6:20

of widgets .

6:22

Sometimes they show a map and stuff like that but

6:30

I couldn't really reproduce that , to be honest , like so yeah , for the people that are just listening

6:32

, on the announcement page that we'll put on the show notes they have some like tabs kind of , so they

6:34

show different examples for weather and then there's

6:36

like icons for weather stock . So

6:38

you have the classic time series plot

6:40

thingy , sports with

6:42

the , the game schedules , news and

6:45

maps . But I couldn't get it

6:47

to work well with the example

6:49

search I did .

6:50

I asked can you show this on a map ? And didn't work

6:52

. But maybe you need to have a specific prompt

6:54

to to show results on a map , right

6:56

?

6:57

yeah , I think one thing that I I'd like using

6:59

chat gpt's for , almost for like brainstorming . So actually

7:01

when we're camping we're like , okay , what are

7:03

? Like me and my wife , we have two

7:05

dogs . Um , it's a rainy

7:08

day , can you give me 20 ideas of things that we

7:10

can do around here ? And then , yeah , like a

7:12

lot of times it gives some hallucinations . But if I say

7:14

20 , even if 10 are non-hallucinated

7:17

I think it's also good , but a lot of times

7:19

they still say I need to check with the local establishment

7:21

to see if they accept dogs and all these things and

7:23

I think maybe this could be , uh , this should be more

7:25

, should be yeah , yeah . So it's cool

7:27

, really , really cool . Um , maybe

7:31

also one thing I did use chat gpt

7:34

for , while we were on the chat

7:36

gpt topic , um , last

7:40

week I had a Not

7:45

an issue , I guess , it was a bit of an issue . So

7:48

it was about networking stuff , right

7:50

, and I'm not I think I'm not super comfortable

7:52

with networking and all these things . So I just kind of said

7:54

, okay , there's an issue , and then someone I'm

7:56

just putting the conversation here , I'm

7:59

not going to read through it all , but basically

8:01

that was an issue Someone was trying to deploy

8:03

, maybe before that setting the

8:05

scene , someone was trying

8:07

to deploy um two

8:12

applications in one vm , so it's just a vm

8:14

on the cloud . Okay , uh

8:16

, back end and front end , so like FastAPI

8:18

and streamlit , right , um

8:20

, they're using docker compose and when they're

8:23

trying to deploy , both

8:26

addresses were working , so you can

8:28

access the documentation on the FastAPI

8:31

, you could access the streamlit . But whenever they

8:33

tried to talk to each other , something

8:36

was going wrong — it just said connection refused , okay

8:39

, and I was

8:41

like , okay , I had to kind of solve

8:43

it and I wasn't sure where to start

8:45

, right . And I was like Googling stuff and I was like you know what ? I'm

8:48

just going to ask ChatGPT , so also

8:51

as like a learning tool , right , and

8:53

this is the chat that I was showing before . So then

8:55

the first thing I just kind of said , hey , you're an

8:57

expert , blah , blah , blah , and then it gives

8:59

me a blob about ports

9:01

and addresses and all these things there were some questions

9:04

about , like the port , because sometimes the port was specified

9:06

and sometimes it wasn't . So then it explained a bit

9:08

some things like that . And then here it kind

9:10

of explained exactly what I mentioned here

9:12

with some dummy information

9:14

, right . And then it gave me a whole

9:17

bunch of examples , like , of possible

9:19

issues , right . And the

9:21

first thing they said is CORS — cross-origin

9:23

resource sharing configuration , which

9:25

I've heard a few times but I never

9:27

quite understood . And

9:30

then they say firewall , group settings , protocol mismatch , and

9:32

I was like , ok , this is probably not correct , not correct

9:34

, but let's try this one , right . And

9:36

then I try some some things Right , like

9:38

, for example , I try to go in the

9:40

VM and curl the endpoints and I say , well , that's

9:42

working . Does that prove that it's CORS ? And

9:44

then I said , oh , actually , no right , because

9:47

this only happens on the browser . Blah , blah , blah

9:49

. And

9:54

then they also mentioned to go in the developer tools . I tried that

9:56

. I couldn't see anything . But then I also went on the browser

9:58

and I used the , the yeah , basically the console stuff , right

10:01

, to do a curl request and then

10:03

I got the CORS error and then I said , does this

10:05

confirm the issue ? And then it was like , yes , this confirms

10:07

the issue . So I was like , okay , now I need to . Okay

10:09

, how can I solve this now ? And then he gave

10:11

me instructions for FastAPI . You can

10:13

do this , you can add the middleware , blah blah . And it actually

10:16

worked .
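
For reference, a minimal sketch of the kind of fix being described — enabling CORS on the FastAPI side with the built-in CORSMiddleware so a browser-based front end on another origin can call the API. The origin URL below is a placeholder, not the actual setup from the episode.

```python
# Minimal sketch of the CORS fix described above (FastAPI's CORSMiddleware).
# The allowed origin is a placeholder; use the address your front end is served
# from, and keep the list as narrow as you can rather than "*".
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://my-vm-address:8501"],  # hypothetical Streamlit origin
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

@app.get("/health")
def health():
    return {"status": "ok"}
```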

10:18

thought it was a bit of a . It

10:20

was a bit of a mix , right . Like it did instruct me quite

10:22

a bit . But I also had to use a bit of what I already

10:24

knew to try to confirm the issue and

10:26

then take steps there . And

10:28

I was actually super happy . I don't think I would have been able

10:30

to do it without something like ChatGPT to just give

10:33

me ideas , to just bounce back , to just like rubber

10:35

duck my way through it kind of I mean more than rubber

10:37

duck really , you know , yeah

10:46

, I thought like from it was really like wow , like that that's what it was , you

10:48

know , and something that I yeah , like you know , I don't know it was it felt very

10:50

magical . You know , it was just like oh , this

10:53

is super cool , um , and I think

10:55

it's also mixed a bit with like what I knew

10:57

already and trying some different things to confirm

10:59

. And I also think chat , gpt is really good for this

11:01

task that you can kind of verify yourself

11:04

right . And I think coding is a very

11:06

good example of these things because it gives you something and

11:08

you can try and either it will work or either it won't work

11:10

right . And I think for things

11:12

like I don't know if you're sick and

11:14

you ask ChatGPT , it's probably not a good example because

11:17

you cannot verify the things that it's saying

11:19

right . So I think there's

11:21

a big use case where we talked about before , about

11:23

brainstorming , like giving ideas , but

11:26

also about these verifiable questions

11:29

, that if it hallucinates , and also

11:31

low stake a bit . So I think a doctor

11:33

is also something that is high stakes For coding . Maybe

11:36

it's lower stake , like if

11:38

ChatGPT gives me a command to drop a database

11:41

, maybe I wouldn't just use it , I would do

11:43

some more research . But I think for like low stakes

11:45

thing that you can verify yourself , chatgpt is a really good

11:47

use case for it .

11:48

Yeah , and especially like how you use it here , like very

11:50

iteratively right , Like not just

11:53

generate me the answer , because that is often wrong

11:55

.

11:55

Yeah , indeed , indeed . But I feel like whenever

11:57

you hit a wall , sometimes it's good to just kind

11:59

.

12:00

Yeah , to get a bit of feedback . You would normally

12:02

reach out to a colleague .

12:03

Exactly you can with a lower threshold

12:06

ask something like ChatGPT

12:08

. Faster response as well . Right , If you send a message

12:10

, sometimes you have to wait .

12:12

But I think , like how you

12:14

describe it , a bit rubber ducking approach , that really

12:16

helps . Yeah , that's valuable .

12:17

Indeed , and I think , even

12:20

if the ChatGPT hallucinates and it gives you something way off

12:22

maybe that's some , that's something that is way off

12:24

gives you an idea of something that is more relevant

12:26

. Maybe

12:29

, maybe so , for example , when I was uh

12:31

doing something with rust and then they said

12:33

, so this , this , two traits , and I was like , ah , that actually

12:36

is not right , but maybe I should look more

12:38

into the type system and maybe this and maybe that , you

12:40

know , and actually it gave me a new

12:42

perspective , you know , and I think sometimes , at

12:44

least for me , I get so focused on one thing

12:46

, like , okay , I think that's the problem

12:48

, so how to try to solve this , this and this and this and this , and I

12:50

couldn't get it . And then it's like , well , what if this is not the

12:52

problem ? And that is the problem , and

12:55

having something that just like

12:57

a parrot , that just says something , uh

12:59

, helps me a lot . Okay , yeah

13:02

, yeah , so quite happy with it , not

13:04

, yeah , I think there's other AIs , right , like Phind.com

13:06

as well , that gives more um , because

13:09

it also searches the web , right . So

13:11

that may be better , depending on what you're trying

13:14

to look for , because I also think that this was very

13:16

generic , right , it was like web

13:18

standards and stuff like that . So it's

13:20

only when you use it consciously to

13:23

debug or something it works .

13:24

I think where it doesn't work is when you

13:26

use it as a , as a bit of a shortcut yeah

13:28

.

13:28

Yeah , yeah , yeah , yeah , yeah , yeah , yeah yeah .

13:29

Yeah , yeah , yeah , yeah , yeah , yeah , yeah , yeah , yeah , yeah , yeah

13:39

, yeah . Something like javascript and html

13:42

intermixed yeah and I just asked gpt

13:44

to to add a feature to it , change

13:47

it , and it didn't work at all , but

13:49

you knew what that ?

13:51

you knew that it wouldn't work when you read it I

13:53

didn't read it , like it was a big component like I just

13:56

tried to copy paste it just to see it just

13:58

as a test to see what it gives .

13:59

Yeah it doesn't work at all , yeah it was way

14:01

off , but then I tried it with Claude and

14:03

it was actually better . Oh really , yeah , I think

14:05

the performance was better , but still not not very

14:07

difficult but I think , as again , like if you

14:09

do this a bit more consciously , not trying to have a

14:11

shortcut , yeah , like it works to to very

14:14

easily adjust something that you're

14:16

not like . I don't , I never write liquid right

14:18

like this helps me to very quickly

14:20

uh make adjustments by by

14:22

interactively asking stuff .

14:24

Yeah , true , I also think that I don't know how

14:26

specific liquid components are

14:28

.

14:28

Well , the problem , I think , with liquid is that there's

14:31

probably not a huge amount of open source

14:33

liquid out there , right , exactly , there

14:35

is on Python .

14:36

Yeah , yeah , there is on Rust , but it's also like what

14:39

is in the training set right

14:41

. Yeah , indeed . Yeah

14:44

, I also wonder these things , things like if I'm asking something super

14:46

specific or something that I feel like it's very new

14:49

, um , maybe

14:52

I don't ask hbt because I know that

14:54

I need the freshness you know , like the

14:56

fresh information , um , but

14:59

yeah , I think indeed , it's kind of like knowing what

15:01

to what , to prompt where , right

15:04

, right , but I

15:06

think it's a valuable tool . But

15:09

, yeah , I wouldn't ask ChatGPT

15:11

for something like Python 3.13 , right , because

15:13

it's very new . Well

15:16

, with search you could do it , right . Ah , that's true

15:18

, with search you could do it , but actually that's what

15:20

I was using Phind.com for , right

15:22

, which is a bit the search . But I think

15:24

Phind , I think they brand themselves as , like , for developers

15:27

.

15:27

Let's see . Yeah , Phind.com to me

15:29

is a bit like perplexity , but then

15:31

for developers what do ?

15:33

you mean perplexity for developers

15:35

like perplexity . Yeah , like it's just advanced

15:37

search ah , yeah , yeah , yeah , yeah , but

15:39

that's the thing so here . So I'm just showing the ui

15:41

for Phind.com

15:43

Just put your question here but it says from

15:45

idea to product , but they also

15:48

have like a playground in code , so it does

15:50

feel more catered towards

15:52

developers , right , so that's what I was

15:54

using , but maybe it's like chat , gpt search will

15:56

actually fulfill that need

16:00

now , right , but

16:02

yeah , maybe , as I mentioned

16:04

, Python 3.13 — Python 3.13

16:06

has been out since October

16:10

2024

16:12

, so it's been a little while , but I don't

16:14

think we got the opportunity to talk about it . It

16:17

made some noise on the python community for

16:19

two main reasons , I would say the

16:22

free threaded mode and

16:24

the just-in-time compiler . I

16:27

think those two things they made a quite a bit of

16:29

noise : PEP 703 and PEP 744

16:31

um

16:34

, why is this a big deal ? Do you should

16:36

I can I throw you under the bus , or should I

16:39

just take a crack and then you can correct me ?

16:42

um , it's , uh

16:44

, it's a big deal because it's probably

16:47

should be a

16:49

performance enhancer the

16:52

free threaded , or probably both . Yeah

16:54

, this should both um , the free threaded is

16:56

. Uh is maybe a bit more specific

16:59

, because python has always had

17:01

a global interpreter lock , which

17:03

means that you're that if you do multi-threading

17:05

, that you're basically limited

17:07

to a single core .

17:10

Yeah , so basically , the global

17:12

interpreter lock basically locks

17:15

objects , right . So if you have a

17:17

value you can think of like a shoebox , right

17:20

, there's something in that shoebox , and

17:22

then if you have multiple processes

17:25

trying to access or change

17:27

what is in that box , then you have an

17:29

issue of what to do . So the lock basically

17:31

says I have that box , no

17:33

one else can touch it . But that also means that

17:36

only one thing can run at a time . So

17:39

Python is known for having

17:42

this issue . Yeah , and

17:44

yeah , like there are workarounds , right , like python's written

17:46

in c .

17:47

Some people write some stuff in other languages that don't

17:49

have this limitation , and but it's not really an issue

17:51

like it's a certain design

17:53

choice which allows that to be a very efficient garbage

17:55

collection and yeah , that's true , that's , um

17:57

, but because of this , indeed , like

17:59

, like , you had this where you can

18:02

, you can discuss whether or not that was true

18:04

threading or not , um , or that

18:06

it was actually single threaded . But , uh

18:08

, when you use multi-threading , it was , it

18:10

was scoped to a single core and now , with free

18:13

threading , you can basically thread across multiple cores . Yes

18:15

, um , and you had ways around this

18:17

with multi-processing and stuff like this , but , uh , this

18:19

is the first time , um , that we can do this

18:22

. That should , uh , especially

18:24

for multi-threaded stuff , it should speed stuff up .

18:27

And the multiprocessing you mentioned ?

18:29

Basically you start different Python processes

18:31

, yeah , so basically each

18:33

process will have its lock , but then

18:35

you just run it in parallel . Basically , exactly

18:38

.
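
To make the contrast concrete, a small sketch (ordinary CPython, nothing 3.13-specific): with the GIL, CPU-bound threads effectively take turns on one core, while a process pool sidesteps the lock because each worker process has its own interpreter and its own GIL.

```python
# Sketch: the same CPU-bound work run with threads vs. processes on a GIL build.
# Threads are capped at roughly one core's worth of work; processes are not,
# at the cost of starting separate interpreters and copying data between them.
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def busy(n: int) -> int:
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(executor_cls) -> float:
    start = time.perf_counter()
    with executor_cls(max_workers=4) as pool:
        list(pool.map(busy, [5_000_000] * 4))
    return time.perf_counter() - start

if __name__ == "__main__":  # required for process pools on spawn-based platforms
    print("threads:  ", timed(ThreadPoolExecutor))
    print("processes:", timed(ProcessPoolExecutor))
```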

18:39

So this is today an option . It's

18:41

not on by default . You need to enable it and

18:46

it's there . Let's see

18:48

what it gives . Yeah , it gives . Yeah , the big question is

18:51

that there are no guarantees

18:53

for uh , backwards compatibility on

18:55

all the libraries that were not set up , but

18:57

I think the community today has a very good view

19:00

on what the impact will

19:02

be . This is a bit of a let's

19:04

see what it gives . Yeah , the benefits

19:06

are more than downsides ?

19:09

Yeah , indeed , I did hear that

19:11

just because free

19:14

threaded mode is available doesn't mean that your

19:16

code is compatible with it . So

19:19

people need to change their code to keep these

19:21

constraints in mind .
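
A classic illustration of the kind of constraint meant here: code that only looked safe because the GIL serialized it can expose races once threads genuinely run in parallel. A minimal sketch — the unlocked variant is deliberately wrong, and the lock is the fix on any build.

```python
# Sketch of a data race: `counter += 1` is a read-modify-write, so concurrent
# threads can lose updates. An explicit Lock keeps it correct whether or not
# the GIL happens to paper over the problem.
import threading

counter = 0
lock = threading.Lock()

def increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:          # remove this `with` block to see the race
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 with the lock; usually less without it
```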

19:23

And I also heard that this is not

19:25

a final thing

19:27

, like it's still experimental , right , but I think

19:30

python has a yearly release

19:32

cycle kind of , so this is still like

19:34

half baked , but it's a bit on purpose

19:36

because they want people to get their hands on early

19:38

. Yeah , file bugs and all these

19:40

things , right , um

19:42

, so , yeah , very cool . I haven't I

19:45

haven't heard a statement , let's

19:47

say , a personal experience with the free-threaded

19:49

Python , but , uh

19:51

, it looks cool . What about you have you ? Do you know anyone

19:53

that tried this or any first thoughts

19:55

, experiments , disappointments , maybe ?

19:58

I haven't tried it myself , yet and you know anyone that tried

20:00

.

20:00

Do you have you heard any ? Any statements

20:03

? I don't know if a statement is the right word , but , like the for any

20:05

testimonials , um

20:08

, no , no

20:10

Yeah , not yet , aside from the online

20:12

people .
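
For anyone who does want to try it, a small sketch of how to check what you are actually running — this assumes Python 3.13's sys._is_gil_enabled() and the Py_GIL_DISABLED build flag, both of which are new and could still change while the feature is experimental.

```python
# Sketch (assumes Python 3.13): report whether this interpreter was built
# free-threaded and whether the GIL is actually disabled right now.
import sys
import sysconfig

free_threaded_build = bool(sysconfig.get_config_var("Py_GIL_DISABLED"))
gil_enabled = getattr(sys, "_is_gil_enabled", lambda: True)()  # fallback on older builds

print(f"free-threaded build:   {free_threaded_build}")
print(f"GIL currently enabled: {gil_enabled}")
```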

20:15

And the just-in-time compiler — what is this

20:17

?

20:19

uh , it does a code optimization just

20:22

in time , and it should make sure that your code

20:24

runs faster for

20:26

certain scenarios . Again , that

20:29

is also

20:31

again not on by default . There

20:33

might be backwards compatibility issues . Let's

20:37

see . Yeah , To

20:40

me these things are most likely . Both

20:42

, by the way , are a bit more suited

20:45

for low-level libraries

20:48

. You're probably not going to get

20:50

much of a performance enhancement from

20:52

your typical use case

20:54

. Your typical use case is not

20:56

a huge optimization algorithm , right ?

20:58

Yeah , yeah , yeah .

20:59

Where this is really key , these last-minute

21:01

enhancements . Yeah , I

21:04

also feel a bit like that If you're building a data pipeline

21:06

, you're probably not going to benefit from this right , but

21:09

maybe the underlying libraries that you use too Indeed

21:11

, but I think that's a bit .

21:12

The Python is a slow language

21:15

, blah , blah blah . But the things that need to

21:17

be fast , people figure out a way already to make it

21:19

fast . I feel Like machine learning is very compute

21:21

hungry , right , but most of

21:23

the libraries underneath , they work with C++

21:26

or c , right . So

21:28

, um , yeah , but I

21:30

agree , I mean , I think it's . It's

21:33

not bad , right , like it's not . There's

21:37

no downside . Let's say this

21:39

is experimental , it's an option exactly so

21:41

faster is better , indeed , and I

21:43

think it's a bit of an experiment , like people say , like

21:45

let's do this , let's see what gives . Yeah , right , um

21:48

, and yeah , let's see what gives

21:50

. I think typing hints was a bit the same , right ? They just kind

21:52

of put it there and then they all found these like very

21:54

cool use cases for it . So so I'm excited

21:56

, and maybe , uh , just talking about , uh

21:59

, jit compilers , so just in

22:01

time , I think the most famous

22:03

one is PyPy , right

22:05

, um , which

22:07

uh , basically takes your , your code and

22:09

right before it runs it

22:11

will compile to something that is very specific

22:13

. So an example is if you have

22:16

X is one , so

22:18

basically it's a number then the computer

22:20

thinks it needs to allocate certain memory and then afterwards

22:22

it realizes that it's 1.227

22:25

, whatever , and then you have to allocate another place

22:27

. If you can actually scan your code one time

22:29

, you can allocate it once and then it can make it faster

22:31

and all these things .
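
A tiny sketch of the pattern being described: the same code path sees an int on one call and a float on the next, so the types are only known at run time — which is what specializing JITs like PyPy's (and now CPython's experimental one) watch for before compiling a fast path.

```python
# The same bytecode, handed different types at run time. A specializing JIT
# observes what actually flows through and optimizes for the common case.
def scale(x):
    return x * 2 + 1

print(scale(1))      # int
print(scale(1.227))  # float -- same source, different machine-level work
```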

22:33

Well , I'm not an expert in these things , but that's how I understand

22:36

them , but , um , so I think it's cool , let's see

22:38

. Let's see what gives as well , because I also think even

22:40

pipe has a lot of trade-offs , right ? So that's what held

22:42

people back from implementing python

22:45

, but uh , now

22:47

it's like an alternative python

22:49

. Yeah , implementation right , indeed

22:52

, indeed , indeed . So yeah , maybe

22:54

I don't want to get too much

22:56

into it , but like python

22:59

, I guess , is like the language syntax , right

23:01

, the way you write code and how you

23:03

understand it . There are different implementations

23:05

that try to comply to this . Pypy

23:08

is one of them , but I'm not sure if it follows

23:10

everything . So I think the core

23:12

is there , but maybe , if I don't know , like

23:14

a Walrus operator , I'm not sure if it's supported

23:17

, right . So there's some different implementations

23:19

in the different languages .

23:20

And I think the typical one that everybody uses

23:22

, if you just get started with Python , is CPython

23:25

. It's by far the most popular

23:27

and that's written in C .

23:29

But then this PyPy is written in RPython

23:32

, I think , which is something that looks a lot like Python

23:34

itself . There's also a

23:36

Rust implementation of Python . There's a NET

23:38

implementation of Python . There are different ways

23:41

, right ? So

23:45

basically , you write a program that reads a file that

23:47

looks like a Python file and then you , yeah , but that language underneath

23:50

can be different things . Some other

23:52

small things that I came across

23:54

Python 3.13 , what didn't make

23:56

the headlines ? So

23:58

a lot of people made a lot of noise about these things , but

24:01

I thought , well , I'll get to it

24:03

in a bit . Yeah

24:05

, basically there were some changes to the PDB , right

24:08

, which is the debugger thing , right

24:10

? What does PDB

24:12

stand for ? Actually , completely

24:14

forgot , sorry , the

24:16

PDB . I

24:19

know it's for debugging , yeah , but basically there were some

24:21

issues on the REPL that made it nicer

24:23

to work with PDB . Shutil

24:25

, which is something to work with your file systems . There

24:28

was also some fixes there , small concurrencies

24:30

, uh , what I wanted to bring here

24:32

? The new annotation syntax allows

24:34

comprehensions and

24:37

lambda , so this is type annotations , okay

24:41

, um , now the

24:44

annotation change that nobody asked for . So if

24:46

, if you go here , you see class name . And

24:49

now in classes you can also add type hints , right

24:52

, so not just functions . And you add it with this brackets

24:55

syntax . Here you

24:59

have the star operator and

25:02

then on the function you actually have a walrus operator thingy . Yeah , actually , I'm not

25:04

sure actually if this is the type hint or is this the type hint

25:06

. Anyways , for

25:09

the people just listening , basically have two classes definition

25:11

nested and then you have a whole bunch of names and a whole

25:14

bunch of stuff with if statements , lambdas

25:16

, walrus operators and all that

25:18

, and apparently this is valid

25:20

Python code now in Python 3.13 .
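
As a much smaller illustration than the snippet on screen — and hedged, since this is based on the 3.13 relaxation that allows comprehensions and lambdas inside annotation scopes nested in class bodies — something along these lines now parses where older versions raised a SyntaxError:

```python
# Contrived sketch (assumes Python 3.13): a comprehension inside an annotation
# scope (here, the value of a `type` alias) that is itself nested in a class
# body. Earlier versions rejected this kind of nesting with a SyntaxError.
class Outer:
    type Names = [str for _ in range(1)]  # evaluates lazily to [str] if ever resolved
```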

25:23

So and what is the hint that

25:25

it gives me ?

25:26

I have no idea

25:28

, because again , so the

25:32

type hints , they've been relaxed

25:34

to allow comprehensions and lambdas . Basically

25:36

, and actually the bug ticket is

25:39

exactly this example . They're like , oh , this doesn't

25:41

work . They're like , oh , this is a problem , we

25:43

should fix it . And then they fix it

25:45

. So I thought it was a bit yeah

25:47

.

25:51

But you can basically — because you're just showing an example

25:54

of a class , but if I understand you correctly

25:56

, you can have a function , and that the

25:58

, the output of the function is a , is

26:01

a lambda , I think

26:03

so .

26:03

So a generator , I guess ? Okay

26:06

, I guess , or maybe just to say like , or

26:09

maybe just to say if this , if you give this

26:11

, then I'll give you that , or I'm

26:13

not sure exactly what's the use case , sure , when I would use

26:15

it because I don't know , that's not really

26:17

a type right like , it's something that generates

26:19

yeah , to be honest , I'm not

26:22

sure either . Um , I also came across

26:24

this because of this reddit post as well , and I think they

26:26

also mentioned I don't think I have any news

26:28

for this except typefire , which is pretty sweet

26:30

, but it's a pity that a person read blah . But

26:33

I think it's also a good example of , okay

26:38

, it starts to add more things to your brain , right , like

26:40

to understand this . You're

26:43

probably gonna have more of a headache than if you just didn't

26:45

add types at all , right , but

26:47

uh , yeah , they're making it pretty , pretty

26:49

flexible , so hope it doesn't get as

26:51

far . I don't see this in any code base , but

26:54

it's . It's something that is there today

26:56

, all righty

26:58

, um , maybe

27:00

something quickly as well . That I also saw

27:03

talking about python 3.13

27:05

. We talked about uv a

27:07

lot in the past . Um

27:09

, I saw this

27:11

on linkedin . So

27:14

Sebastián Ramírez , the guy from FastAPI

27:16

, um , he mentioned now that uv

27:18

supports dependency groups . But what I wanted

27:20

to just highlight is that the , the pep , was accepted

27:23

on the 10th of october and 16

27:25

days afterwards it was available on uv

27:27

. So I

27:31

guess it's like . My

27:33

first thought is like uv is really trying to be

27:35

almost a

27:38

synonym to python standards , right

27:40

? So everything that gets accepted , they

27:43

will make a push to to

27:45

add it there , um , which

27:47

actually I think is a it's . I think it's

27:49

a nice way to go , in a way , you know , like

27:51

if you become the Python standard , then

27:53

I think you're pretty like

27:56

, you're not , like no one's going to accuse you for being too

27:58

opinionated , right , because you're just following the

28:00

community guidelines . Really .

28:02

And what are dependency groups ?

28:04

There's a definition . So I think it's like if

28:06

you have , normally you have the dependencies

28:08

and you have the dev dependencies , but then you

28:10

can have something like test dependencies . You

28:12

can have something like it

28:14

was already there right , but for uv I

28:17

don't think so . And there

28:19

was no standard . So , for example , poetry implemented

28:21

it . Okay , there was no standard ?

28:23

No standard , exactly — not

28:26

finalized .

28:26

So the pep Okay , because

28:29

this

28:31

already existed , then If

28:34

this library with the Postgres

28:37

backend , for example , and instead

28:39

of the Postgres I want to use the DB backend , exactly

28:41

. And now there is a formal

28:44

definition of DB .

28:45

So I think it existed already

28:47

. Indeed , many tools implemented because there

28:49

was a need , but the community didn't agree

28:51

on what to do , right , maybe

28:54

something on that line , and I feel like I'm hijacking

28:56

the whole pod . We'll

28:58

get back to your topics

29:04

. For example , are these lock files , right

29:06

, requirements.txt and all these things , um

29:09

? But turns out that actually there's no convention

29:11

about this , right . Even requirements.txt

29:13

is something that someone just did and everyone just

29:15

kind of did it as well , but it's not a pep

29:18

. It's not like something quote-unquote accepted

29:20

by python , right , um

29:23

? One thing that it was shared in our slack

29:25

, it was . It was a while ago as well . That was was shared

29:28

. There was another PEP and actually let's see if this

29:30

is . It

29:32

was already rejected . File

29:35

format to list dependencies for reproducibility

29:37

for an application . So basically a lock file , right

29:40

? Is

29:42

this the one actually superseded

29:45

by no ? Maybe this one yeah

29:47

it's a draft . Basically

29:49

, there's a lot of attempts to make this

29:51

lock file , well

29:55

, a standard for lock files , and

29:57

I think that's the biggest beef I have with UV

29:59

, because they do have a lock file , but

30:02

that if you use the uv lock file

30:05

, you're stuck with uv . If you use the Poetry one

30:07

, then you're stuck with Poetry .

30:09

You're not stuck with Rye , because it uses the requirements.txt

30:11

standard , right . But I

30:13

do feel like , if there is a standard lock

30:15

file format and uv adopts it and

30:18

other tools start to adopt it , then I would have no

30:20

reason to not go for uv , for example . That's , that's

30:23

the only thing . But it's not even uv's fault in

30:25

a way , right , um

30:27

, so yeah , there's a lot of tries for this and

30:30

one thing I thought it was funny in a way is

30:32

that I went through discussions

30:34

, the Python discussions , and they

30:36

actually reached out to Poetry to talk

30:38

about this proposal and Poetry already

30:40

said from the beginning like we're not going to support this because

30:42

the way we've set up our

30:45

tool is too different and we cannot make any changes now

30:47

. So

30:49

, poetry , I think it got really popular because it was one of the first ones , but at the same time

30:52

I think it made it very hard to change because

30:54

they built on top of that and

30:56

I do feel like Poetry is being left

30:59

behind more and more because they don't comply

31:01

to the standards , like the pyproject.toml from

31:03

Poetry . It's not standard , uh

31:05

lock file . They wouldn't be able to adopt it right

31:07

.

31:08

so well

31:11

, it's a , it's a design choice , right , like they want

31:13

to be to remain some somewhat

31:16

backwards compatible . Yes

31:18

, I mean they can say fuck everything . Like from

31:20

now on we do it a different way , right , like they could say that's

31:23

true , like there's also some value

31:25

in the maturity , that there is

31:27

stability , right ? Yeah

31:29

. But I also , most likely

31:31

, if you use uv today for a production project

31:34

and you look and you want to upgrade

31:36

it , uh , next year there's

31:38

probably not going to be an easy upgrade path , yeah

31:41

that's true poetry probably will have .

31:44

Yes , that's true , but at the same time . So I

31:46

mean , I fully agree .

31:48

The only thing I have a bit of , and the thing is like

31:50

none of these things matter when it's your personal

31:52

pet project , right ? Yeah , you don't care

31:54

about these things . Yeah , from the moment

31:56

that you build something for a production

31:58

environment , you want to have some stability

32:01

in terms of years , not

32:03

weeks yeah , I think the the

32:05

issue for me is when you take that a bit too far

32:08

.

32:08

So I do think there's some changes . I do think poetry

32:10

should have had a breaking change by now . The

32:13

reason why I say this uh , like

32:15

poetry , they used to have like dependencies and dev dependencies

32:18

and then they also implemented group dependencies , but

32:20

then dev dependencies became a group and

32:23

then I see a lot of pyproject.toml files

32:26

that have two Poetry sections , one for

32:28

dev-

32:31

dependencies , and then one for group

32:33

.dev . Yeah

32:35

, and that's because , like , people update the tool in

32:37

the middle , but like the actual like , I

32:40

guess for me the thing is like they buy , they

32:42

don't want to make it a breaking change , so

32:44

they just kind of keep adding stuff to it .

32:46

But I'm like , I'm not even debating that poetry is good

32:48

or bad , or any

32:51

other packaging tool . But

32:53

there are two sides to this coin and to being fast

32:56

and agile . A PEP is released

32:58

and 16 days later it's

33:00

a new thing . You can question

33:02

how good were these reviews ? How sure

33:04

are we of this implementation ? How clear

33:07

was this implementation ? Is this something that you can do one-to-one

33:09

, or do you need to have some discussions about

33:11

how the standard is actually implemented

33:14

, like there's pros

33:16

and cons to being very fast versus very

33:18

stable ?

33:19

No , that's true . That's true , I guess . For me

33:21

, my main point was I

33:24

think it's good to have backwards compatibility

33:26

, but I also think you

33:28

shouldn't .

33:30

I think there should be a limit to it , right ? Like , if you want to change the API , there

33:33

should be a breaking change , and you shouldn't just support

33:35

two different versions of the same API , just

33:37

so you don't make things breaking changes , right

33:40

. So ? But I agree , it's

33:42

always a trade-off . It's

33:45

always a trade-off . What

33:49

else ? What else is new ? Maybe back to the

33:51

ai stuff , bart ?

33:54

The what — the AI stuff ?

33:54

Meta Llama training to

33:56

be bigger than ever

33:59

. Is that what you said ?

34:01

oh yeah , there was an article on uh on

34:03

wired the last week

34:05

yes that um said

34:07

. Actually Meta today has the biggest

34:10

, biggest server

34:14

park not sure if that actually directly translates

34:16

to a server park , but at

34:19

least the most amount of resources to

34:21

do training of their Llama model versus

34:23

their competitors , like the most notable , of course

34:26

, being OpenAI . So the resources that Meta has

34:28

available today are

34:30

the most

34:32

of all the big ones , which

34:36

um questions

34:39

a bit like what ? How does the ecosystem look like

34:42

? I think a year ago everybody

34:45

thought llama was very cool but

34:47

no one really took it really seriously as a competitor

34:50

. Today the performance has become

34:52

really good . You

34:55

see it being used in

34:57

the industry , in the community , a lot because it's

34:59

way more open , you

35:02

can build upon on top of it . So

35:07

I'm wondering what Meta's position

35:09

will be like two years from now I'm also wondering

35:11

what's the end game here ?

35:13

because meta , are

35:15

they profiting from these models , and how

35:17

much ? Because

35:19

the , the models , are open

35:22

source . Well , maybe we can talk a bit about that

35:24

in a bit . But , like the , the weights are available

35:26

and I think , uh , I

35:29

think the only restraint is that , as long as you're

35:31

not competing with meta , you can use them

35:33

for whatever you want , kind of uh

35:37

, well , there are some limitations on there , uh

35:39

, in terms of number of users and stuff like

35:41

that okay , yeah , but it's it's very

35:43

permissive . It's very permissive , yeah

35:45

and um , so

35:48

I'm assuming they don't make a lot of money

35:50

from these models , but if they

35:53

have the largest cpu , gpu

35:55

, rack — I mean , there's

35:57

probably a lot of investment right , and

36:00

is it just to train the models or is it to also

36:03

use it in the meta products ? Probably

36:05

both right , yeah , probably both I

36:07

like .

36:08

It's . For me it's hard to like they . They

36:10

have a lot of features . I think there is now a model

36:12

. I don't think it's available in europe actually , but in whatsapp

36:14

as well there's . Uh , there is a

36:16

an ai model available . Um

36:18

they we don't know how it

36:21

is being used behind the scenes , for example , to optimize

36:23

ads , stuff like that we don't know . Yeah , meta

36:25

is very much driven on ads as well . Um

36:28

, with

36:30

all the resources they have available , I could see

36:32

them at some point providing a service yeah

36:36

, and a service for their lm . Um

36:39

, let's see , they're

36:41

still also very much on on the

36:44

uh augmented reality

36:46

side .

36:47

I would see something like this generative

36:50

ai also playing a role there yeah

36:53

true , so let's

36:55

um but I think it's like for

36:57

me , this , this

36:59

news , puts me a bit on the edge of my seat , in

37:01

a way , um , because

37:05

I feel like there's something I'm missing . You know , I

37:07

feel like , yeah , maybe they will release some gen

37:09

AI for , uh , augmented

37:12

reality , or , yeah

37:14

, maybe they'll release a proprietary model soon

37:16

, but there's something that I feel like it's , you

37:18

know , something's in motion that I don't see yet .

37:22

Let's see . Let's see . I think

37:24

it's always a bit

37:26

difficult with meta . They did a lot of the VR stuff

37:29

, the AR

37:31

stuff and they invested more versus

37:34

what the result is today . I think that everybody

37:36

is thinking at the same time . If

37:39

anyone is going to pull off AR in the near

37:41

future , it's probably going to be meta .

37:43

Yeah , what do you think of that ?

37:47

Do

37:50

you think there is a future for AR ? I think so . Yeah

37:52

, I think so . I think the question is like how

37:55

will it look like and what kind of devices

37:57

, what kind of peripherals ? But I

37:59

think it's just waiting for

38:01

it to be available .

38:02

And do you think , and what kind of who's

38:05

the classical user of this ? Do you think it's more

38:07

for entertainment ? Do you think there is a

38:09

business

38:12

in terms of working people to use

38:14

? Because I also saw the Apple Vision . You know they had a

38:16

huge screen on the wall and they had this and they had that

38:18

, but where

38:20

do you see that they are fits there

38:23

? Maybe for you ? What are

38:25

the things that you say ? Ai would be a good use case

38:28

for this thing that I'm doing today um

38:32

AR , you mean yeah , uh .

38:37

I don't see myself using it okay today

38:39

. Yeah , especially not with Apple Vision

38:41

. I think Apple Vision is way too it's clunky

38:43

, you know it's too much in my way yeah

38:46

, yeah , yeah um , I think

38:48

it's very cool if you , if you like these type of gadgets

38:50

yeah , yeah , yeah , it's more . It has a very high gadget

38:52

factor . Um , I think with

38:54

the what's called again the glass

38:56

that were introduced very recently by meta the

38:59

, the . It's like the ray-ban thing , no yeah

39:01

, but also another one , um , which is it's

39:04

more or less comparable to apple

39:06

, uh , to apple's apple

39:08

vision pro orion maybe

39:10

yeah yeah , but much

39:13

more uh , accessible

39:15

, uh . At the same time , you can

39:17

. You can debate , like it's not much , much

39:20

, not very far off , on the google glasses

39:22

that there were 10 years ago yeah , yeah

39:25

, indeed , yeah

39:28

to me , I don't know . I think like yeah

39:30

, so maybe I'm showing here on the screen for people

39:32

just listening the orion , uh

39:35

, announcement right I

39:37

think what it wants you to do is to and today

39:39

, the only way that we really

39:41

have to do this is glasses is to add

39:44

information to real life

39:47

, right ? Yeah , like what you do already

39:49

with with if you have an alexa home

39:51

, if you have a smart home , these type of things , like you

39:53

can interact in a digital way

39:55

with your environment . What we , today

39:57

, do not really have yet is like

39:59

have these augmented things ? Like

40:02

we have notifications , we have

40:04

extra information popping up , yeah

40:06

, or glasses

40:08

the way ? I don't know , do I like them to be the way

40:11

? Probably not , like I think it's a privacy horror

40:13

, but I do think

40:15

we were going in that direction . I'm

40:17

not sure how it will look like . Yeah , I

40:19

think , and I think if there's a company that

40:21

today has a huge amount of knowledge invested

40:24

and capital invested in that , it's Meta . Yeah , that's

40:26

true .

40:26

I feel like , if the , if this becomes a big deal

40:28

tomorrow , they are the

40:30

leading one , right , right .

40:31

They even changed the

40:33

company name to reflect this a bit more .

40:36

If tomorrow they can come up

40:38

with contact lenses that look normal , that you don't

40:40

see , and people wear them . I think

40:42

you will see yeah .

40:43

Yeah , yeah . But I also

40:45

think it's interesting because to me , I always associate

40:47

this with gaming . I

40:49

guess , like . So

40:54

when I think of augmented reality , I think , okay , some gaming stuff , like for

40:56

fun , right , that's also what we see today , like when you have the I mean , it's not

40:58

really augmented reality , but like when you have the vr

41:01

headsets , um

41:03

, it's more for like game-like environments

41:05

, right for vr , yeah , and uh

41:08

, yeah , I think they pretty much

41:10

are . They are advertising this very

41:12

much for the everyday user , for someone

41:15

like business . So in the video

41:17

here there's like someone having a call and like floating

41:20

in the middle of the living room , right .

41:21

But if you could have that with contact lenses .

41:24

But I feel like , even if it's like a sleek glasses , I

41:26

could buy that . I mean these glasses right now for

41:29

people that are just listening .

41:30

They're not sleek eh .

41:36

They're not sleek . The Google Glasses were better than this . Yeah , they look like , uh

41:38

, there was a . I saw on the saturday night live . I think they were comparing this with the , the minions

41:40

. You know , the banana , you know , you saw that

41:43

. Yeah , it's really funny , um , but

41:45

yeah , they're a bit , they're a bit clunky , but this is much better

41:47

than uh , than what we would see three

41:51

years ago . Would you wear

41:53

something like this , alex , because you wear glasses normally

41:55

?

41:56

No , they're too clunky .

41:57

They're too clunky Okay . But

41:59

if you could have , like you're wearing glasses now this

42:04

functionality would be possible in your glasses , but then

42:06

maybe it's just a matter of time , right , but

42:09

that's what I'm saying . This

42:16

is not how it's going to look , look like , but I think , if anyone is ready to do this , it's meta

42:18

, yeah . But I do think — because Alex's glasses , for people that are just listening , it's not on the

42:20

screen — they're just , uh , like a thin

42:22

frame , right . And I think that's the

42:24

main difference , right , like , uh , the

42:27

frames of these glasses are super , super thick . So

42:29

, but yeah , yeah

42:32

, true , true , I guess , uh , to

42:35

be seen what happens I think also with contact

42:38

lenses , would you see where your eyes

42:40

are looking .

42:40

Would it look ?

42:42

weird , it's

42:44

a good question .

42:45

Well , it doesn't exist in contact lens today , so no , but

42:47

I guess because imagine

42:50

the , the contact lenses .

42:51

I guess , like it follows your eyes right like you get stuck

42:53

to your . So I guess it's like because

42:56

right now on the image you saw like oh , the top left

42:58

there's this and the bottom there's this , and

43:00

you're looking here and you're doing there , but I guess if it just

43:02

follows your eyes it will always be fixed right to

43:04

in your view pane . I

43:06

guess it's hard to describe right like what

43:09

you look .

43:10

It would always be the top quarter , this will ever be possible

43:12

, a contact lens , but I think with the orion

43:14

, if I'm not mistaken . I think it projects it on your

43:16

eyes actually , so it's only virtual

43:19

that you're looking in a , and that's just

43:21

for where your eyes look to , how to , to how

43:23

it gets projected , is it ?

43:25

I'm wondering if this is okay for the eyesight if

43:28

you just have stuff projecting your eyes well

43:30

, that's what's happening all day long , I guess yeah

43:33

, but like photons , did you always hear like , oh

43:35

yeah , don't , don't watch tv in the dark because it's bad

43:37

for your eyesight and this and that ? But

43:39

is it that's what ? That's what I heard growing up ? I'm

43:41

not sure . Uh , that's what our moms told us .

43:43

yeah , right , yeah . So if your

43:46

mom , if you're listening , did you ? Did

43:48

you lie to me , mom ? Okay

43:52

, to be seen , to be seen , I think yeah

43:55

could be very exciting , though I

43:57

could buy into it if it

44:00

didn't look clunky indeed . Maybe

44:02

more on the AI tech , but

44:05

this is from OpenAI , actually not from Meta . I

44:08

saw this . This is experimental . It's called Swarm . Have

44:10

you ever did you come across this at all ? Part no

44:12

, so experimental , slash , educational

44:15

, so I guess not something for to

44:19

be picked up tomorrow , uh , but uh

44:22

, basically it's . Uh , how do you say orchestrating

44:25

agents , handoffs and routines ?

44:26

so basically , oh , it's the agents

44:28

talking against each other .

44:29

Yeah , yeah so this is from

44:32

OpenAI , so I think that's why I made a lot of noise . Openai

44:35

released a package which is not

44:37

just a wrapper around your models . Basically

44:40

, it's so you can create multiple agents

44:42

with different prompts

44:45

and then you can have them interacting

44:47

with each other . So this is supposed to help that . So

44:51

yeah , for example , the

44:54

example they have here , the , is like what's the weather , new york ? And then

44:56

there's a triage system and assistant

44:59

, and then there's

45:01

a function that says , well , transfer to weather assistant , and then

45:03

you'll go to the weather assistant and then the weather assistant

45:05

has a different set of prompts

45:07

and expertise and functions and all these things

45:09

, and it goes to 67 degrees . So

45:11

I guess you can also think if you had a bot in

45:14

the DataRoot's webpage , for example , and

45:16

then the first step is to know what

45:19

kind of question are you asking , so classifying what

45:21

these intents are , and then this is a

45:23

way to do it . So I think it's something

45:25

really cool . I think it's something that is

45:28

much needed , but

45:30

I wouldn't use it yet , just because they are

45:32

saying that it's experimental and educational , right , so

45:34

maybe just something for fun , but you did

45:36

see something like this .

45:39

I think I read this on Reddit . There

45:41

was a discussion on this .

45:43

Do you know if there are any other quote-unquote competitors

45:45

for this or no ?

45:52

I think the question is a bit about what is the definition of multi-agent

45:54

orchestration . There are definitely multiple frameworks

45:57

that do multi-agent yeah , that's true . I

46:00

think the interaction between those agents is a bit

46:02

the what the discussion is about

46:04

true

46:08

, true , true .
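For readers who want to see what the triage-to-weather handoff described above looks like in code, here is a minimal Python sketch based on the examples in OpenAI's Swarm repository at the time of recording. The package is explicitly experimental/educational, so treat the argument names as indicative rather than definitive.

```python
# Sketch of a triage agent handing off to a weather agent with OpenAI's
# experimental Swarm package (installed from its GitHub repo).
from swarm import Swarm, Agent

def get_weather(location: str) -> str:
    # Stand-in for a real weather lookup.
    return "67 degrees"

weather_agent = Agent(
    name="Weather Assistant",
    instructions="You answer questions about the weather.",
    functions=[get_weather],
)

def transfer_to_weather_assistant():
    # Returning another Agent from a function is how Swarm expresses a handoff.
    return weather_agent

triage_agent = Agent(
    name="Triage Agent",
    instructions="Figure out what the user wants and hand off to the right assistant.",
    functions=[transfer_to_weather_assistant],
)

client = Swarm()  # uses your OPENAI_API_KEY under the hood
response = client.run(
    agent=triage_agent,
    messages=[{"role": "user", "content": "What's the weather in New York?"}],
)
print(response.messages[-1]["content"])  # e.g. something like "It's 67 degrees."
```

The handoff is just a tool call that returns another Agent with its own prompt and functions, which is what makes the "classify the intent first, then route" pattern cheap to set up.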

46:09

So just something I wanted to share , because one wise

46:11

man once said that a week keeps the

46:13

mind at peak , um

46:16

, and then , uh , maybe more

46:18

on the AI , or

46:21

maybe RAG , to be more specific , pg ,

46:23

p , pgrag , uh

46:26

, or pig rag , yeah

46:28

pgrag .

46:29

This was released

46:31

by Neon , the

46:34

company behind the

46:36

managed Postgres database , with Serverless

46:39

right , that's the big Exactly , yeah , serverless

46:41

. So they have a way to separate out

46:43

storage and compute for Postgres . I

46:46

use it a lot actually , Neon . You like it ? I

46:48

like it , and

46:50

the reason I know of this is that they sent an update mail

46:52

, I think last week , where they said that

46:55

they launched pgrag , the pgrag extension

46:57

, and it's a Postgres

46:59

extension and it allows you to do big

47:03

parts of the RAG pipeline within

47:06

Postgres . So

47:10

you already have some of these things . I think the most notable

47:12

is pgvector . Yeah , but it's just

47:14

a vector database , it's just a way to

47:16

hold vectors , um . With pgrag ,

47:18

what you can actually do is that you can , uh

47:20

, you can , convert text

47:23

like HTML , like Markdown

47:25

, like these type of things , to regular text . You

47:27

can chunk it , um

47:30

, and then you can . Also , there is a local

47:32

embedding model that they

47:34

use , so you can immediately from that embed it

47:37

in a vector space that

47:39

you can then query , and all

47:42

that in your Postgres database so

47:44

it takes a step further from just a vector database

47:46

. It also processes documents

47:48

and chunks them exactly and retrieves them based

47:51

on ... Exactly , yeah . Whereas , with just

47:53

the vector database or

47:55

just vector storage , if

47:59

you simplify it like that , it's just the storage of those vectors and the querying

48:01

of those vectors . Wow . This is also doing everything

48:05

up to that storage , all from Postgres

48:08

, which is interesting

48:11

.
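To give a feel for what "big parts of the RAG pipeline inside Postgres" looks like in practice, here is a rough sketch driven from Python. The pgvector pieces (a vector column and the <=> cosine-distance operator) are standard; the pgrag function names used below are assumptions based on Neon's announcement rather than verified signatures, so check the Neon docs before relying on them.

```python
# Rough sketch: chunk a document, embed the chunks, and retrieve the closest
# ones to a question, all via SQL in Postgres. The pgrag function names
# (rag.chunks_by_character_count, rag_bge_small_en_v15.embedding_for_passage,
# rag_bge_small_en_v15.embedding_for_query) are assumptions, not verified.
import psycopg  # pip install "psycopg[binary]"

markdown_text = "# Serverless Postgres\nNeon separates storage and compute ..."

# Assumed table: CREATE TABLE docs (id bigserial, chunk text, embedding vector(384));
with psycopg.connect("postgresql://user:pass@host/db") as conn:  # hypothetical DSN
    with conn.cursor() as cur:
        # 1. Chunk the document and store each chunk with its embedding,
        #    inside a single SQL statement.
        cur.execute(
            """
            INSERT INTO docs (chunk, embedding)
            SELECT chunk, rag_bge_small_en_v15.embedding_for_passage(chunk)
            FROM rag.chunks_by_character_count(%s, 800, 100) AS chunk
            """,
            (markdown_text,),
        )
        # 2. Retrieve the chunks closest to the user's question.
        cur.execute(
            """
            SELECT chunk
            FROM docs
            ORDER BY embedding <=> rag_bge_small_en_v15.embedding_for_query(%s)
            LIMIT 5
            """,
            ("How does Neon separate storage and compute?",),
        )
        top_chunks = [row[0] for row in cur.fetchall()]
        print(top_chunks)
```

The retrieved chunks would then be pasted into a prompt for whatever model sits on top; the point of the extension is that everything up to that step stays in the database.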

48:11

Yeah , interesting . Also

48:14

, they're showing some examples on how to use it . You can literally

48:16

just put the prompt here , the query , right

48:18

, what is dot dot dot ? How does it work ? And then you

48:20

can actually , in your select statement , embed

48:24

it for query and then you put this in the query . That's how

48:26

I imagine it works . So it

48:29

looks pretty cool . You haven't tried this , I'm assuming ?

48:32

I haven't tried it . No , no . And Neon , is it

48:34

like , who can

48:37

use this ? Only if you're using Neon ? Only

48:39

if you're using Postgres . If you're

48:41

using Postgres ? Yeah , this is like a regular Postgres

48:43

extension . Okay , because it's on the Neon

48:45

docs , but since it's open source , I guess it's

48:47

open source . Yeah , is Neon open source as well ?

48:50

Uh , that's a good question .

48:52

I

48:54

know that they have open source parts of

48:56

their implementation . I'm not sure if everything

48:58

is open source , to be honest , or maybe

49:00

are they source open ? Which

49:02

is a bit , I think they're actually open source , but

49:04

I'm not really educated

49:07

on this . It's been a long time

49:09

. I looked into

49:12

it a bit when it was just released . That's why I'm quite sure that big parts

49:14

of it are open source . But really cool

49:16

.

49:16

I'm not sure of the latest status uh , and

49:18

maybe , uh , yeah , I mentioned open

49:20

source versus source open . Um

49:22

, I guess the main

49:24

difference , the way I see it , and maybe correct me if I'm wrong

49:26

here open source is something that is community

49:29

driven right , so people can contribute

49:31

with code and then it can

49:33

get accepted and can get incorporated in their

49:35

product . Source open is

49:37

more like the code is there and

49:39

you can see it , but it's

49:41

not like you're going to interact with it necessarily . You

49:43

can report bugs , but who

49:46

builds it and maintains it is more them .

49:48

Is that a good way to put it ?

49:50

Uh , I think it's a bit of a simplification .

49:53

I think source available just means that

49:53

you can look at the source , yeah , and

49:56

maybe you can do other stuff with it . Open source

49:58

is a definition , um

50:01

, it really depends on the license

50:03

that you're using , and it's maybe a good

50:05

segue uh into the Open

50:07

Source Initiative , which is , I think , the most

50:10

well-known organization that has has

50:13

built a number of these uh of

50:15

these open source licenses and

50:18

manages them , has discourse on

50:20

them to to let it evolve , and

50:22

last week they actually uh , after , I

50:24

think , more than a year of uh of

50:26

uh research , they uh

50:29

released version 1.0

50:31

of the open source AI definition

50:33

so maybe , why

50:35

is this a big deal and why is this different

50:37

from just regular open source ?

50:39

what's the tricky thing ?

50:41

regular open source . What we're typically talking about

50:43

is source code , so just

50:45

written lines

50:47

of source code , which is different

50:50

from ai models , because an ai model typically

50:53

has it has source code , it

50:55

also has data , it has trained weights

50:57

. Um , it's

50:59

much more , let's

51:01

say , just with with just the source code , you

51:03

can't really talk and speak about a model because

51:06

it needs to be trained , etc . Yeah , and

51:08

there is a lot of chatter in the in the community

51:10

about what are open source models and whatnot . I

51:13

think Llama very much states that

51:15

it's open source .

51:17

But I think they play a bit because , again , this

51:19

is the first , as far as I know , the first

51:21

open source AI definition , exactly

51:24

, and I think people before , because there was no third

51:27

party that was saying what is open source and what was

51:29

not .

51:29

It was a bit of like anyone

51:37

can say it's open source because yeah , exactly , yeah , exactly . Well , and to be honest

51:39

, like it's not that , uh , OSI , the Open Source Initiative , has really like a legal standing and

51:41

like , if you , if you use the definition wrong , you're

51:43

gonna , we're gonna sue you . They don't do this

51:45

right , but they , they try

51:47

to uh create a community-wide

51:51

consensus about what is open source and it has

51:53

been used .

51:55

These open source licenses have been used in legal

51:57

settings as well .

51:59

Yeah , yeah , that definitely has , so it

52:01

has like a legal .

52:02

it is a legal instrument as well .

52:03

It's like it's not just From the moment that you use a certain

52:05

license , there is a legal

52:08

meaning to it .

52:10

Yes .

52:10

But it's not because I say what

52:13

I do is open source yeah , yeah , yeah that

52:16

you can say I have it or not . And from the moment that I

52:18

say my code is GPL , yeah

52:20

, and I do something that is not GPL , yeah , that is

52:22

not okay , right .

52:23

Indeed so . But just to say , like the , like

52:25

these licenses they

52:36

show , it's like there is a very legal , practical ... Exactly . There's an implication , there are consequences

52:38

. It has to be thought through . Like the , the , even the HashiCorp thing that we talked about , like the OpenTofu HashiCorp thing , they , they changed licenses

52:41

in the way that the source was still available , but they're restricting some things and there were like

52:43

very big legal implications from

52:45

the moment that you , let's say , implement the licenses

52:47

, which is what HashiCorp did . I think it was , was it

52:49

GPL ? I'm not sure .

52:51

Yeah , I'm not sure , but from

52:54

the moment that you implement it , there is , there is an implication

52:57

, like it has implications , right , but

52:59

from the moment that you just like you don't care , but

53:05

, you just . So

53:08

this is uh , this is interesting in the sense that

53:10

, uh , I think , when just browsing

53:13

through this , I don't think Llama is open source . Can

53:17

we have the hot , hot , hot hot ? I don't know if it's really

53:19

hot

53:24

, but , um , I think , uh

53:26

, the uh , the

53:31

interesting thing about this is that it

53:33

also says

53:35

something about the way it was trained . So you

53:37

don't need to ... you need to publish how

53:40

you trained it . I read that

53:42

as the source code . Okay

53:44

, you need to . You need to be

53:46

very transparent on what data

53:49

you used , how you acquired the

53:51

data . If there are ways

53:54

to acquire the data , either for free or

53:56

paid , okay , so

53:58

that , even if you don't have access to the data

54:00

set , that you could reproduce it

54:02

with the right amount of effort and

54:09

that typically , like these open source models , they

54:11

also have their weights included

54:14

.

54:14

Yeah , what they mentioned here is parameters , right .

54:16

Yeah , um , there

54:21

are some ... the definition , and I don't have it exactly

54:23

fresh in my head , but

54:25

it says that users can

54:28

freely use and reuse

54:30

this without specific , uh

54:32

, limitations . I think the

54:34

limitation that Llama defines

54:37

and again I don't know exactly how

54:39

it is anymore , but it says something about from

54:41

a certain number of users , you can't

54:44

use this anymore or you need to inquire

54:46

. That is a

54:48

bit arbitrary . That

54:50

is not something like . You need to show

54:53

that you use this as a base right .

54:54

Yeah .

54:55

Then still everybody can use this . Yeah , but this

54:57

is very arbitrary , like from that moment on what , you can't

54:59

use this anymore or you need to ask for permission

55:01

. Yeah , I see I think that falls out

55:03

of the scope of this open source definition

55:06

.

55:08

This is really cool . I

55:17

think it's much needed . I also think ... you mentioned that they've been working on it for one year , you said , or

55:19

something like that , yeah , which I also think is good . I think if you have

55:21

something like this , like this is uh , well , I think you need to take time

55:24

to study and see how it is

55:26

before releasing something

55:28

right , this is kind of like it is

55:30

a one-way door in a way . Right , like , once

55:32

you release it , you create expectations

55:34

and to kind of go back or modify this

55:37

may not be as simple , because people are going to build

55:39

on top of this . So I also think it's it's

55:41

nice to hear that it was a well

55:43

thought-out , yeah , you know , decision , and

55:46

I do think it's much needed , right . Like you said , we we

55:48

see a lot of stuff from models and all these things . I think

55:50

there are a lot of people that want to do as

55:52

well , but they don't know exactly what to do . Um

55:54

, how to say like , yeah , this is proper

55:56

open source and this and that kind of agree .

55:58

So , even if this is a proper , proper

56:00

way to educate the community on what

56:02

is indeed , and what not

56:04

?

56:04

and also why not , indeed , indeed

56:07

and I think , even if it's , even if people don't

56:09

fully agree , at least it gives

56:11

a common ground to discuss these things Exactly and

56:13

that you can evolve from this . Indeed

56:16

, indeed , really cool , really

56:18

cool . Maybe . I also saw here that we

56:21

can endorse the open source AI definition . So

56:25

look at that . We have quite a lot of companies

56:27

here . Probabl . This

56:30

is the company that is now behind scikit-learn

56:32

, Mercado Libre , big

56:35

retailer in South America , Bloomberg

56:37

, Mozilla . Really

56:41

cool , really cool . All

56:44

right , and now maybe to something a bit lighter

56:48

topic , maybe still

56:50

in the food for thought .

56:56

A lighter topic maybe um

56:59

, still in the food for thought . Someone used v0 by Vercel and has some

57:01

thoughts to share . Uh , yeah , just think

57:03

it's cool . Okay , all right , thanks everyone . V0 by Vercel is

57:05

uh , it's like their GenAI , like a

57:08

GenAI tool by Vercel that allows you to

57:10

, with prompts , it generates

57:12

basically web sort

57:14

of web components for you . Yeah , I

57:18

think it leans very much towards React

57:20

components and

57:23

I actually , to be honest , I thought that you can only

57:25

do React components . Maybe I can share

57:27

my screen actually . Okay , there we go , I'll let

57:29

you do it if that's possible . I

57:32

thought you could only do React components and

57:34

I was playing with this over the weekend and you can actually

57:36

, but it's less performant . Like you can also say no

57:40

, this UI find the React

57:42

component looks cool , but please implement this with

57:45

HTMX .

57:48

And it does that to some extent as well , but it's probably because it probably uses

57:50

a base model that has some knowledge of HTMX , but it's

57:52

probably fine-tuned on the ... Because Vercel

57:55

is the company behind Next.js . Yeah

57:57

, which is also like . I

58:00

don't know if they're built on top of React . It's

58:03

built on top of React . Built on top of React .

58:05

So we're looking at a page now and it says

58:07

what can I help you ship and maybe

58:12

generate a

58:15

login page specifically

58:18

geared towards

58:21

a Brazilian

58:23

guy , you were going to say this . I was just

58:25

waiting , called Murilo , make

58:34

it super fancy and sleek . I

58:37

haven't tried this yet , so Is this free

58:40

or paid ? Well , what I'm using now is free . Okay

58:42

, I haven't like Brazilian

58:48

login , so it's generating . You see : Bem-vindo ,

58:52

Murilo . Faça login

58:55

para continuar .

58:55

Can we get an applause ? That was pretty good .

59:01

Yes , and then the email already

59:03

holds like a placeholder muriloexemplarcom

59:06

. There's a password placeholder . There's an enter

59:08

button . How do you

59:12

know it's a password ? It just says senha . How

59:14

do you know what that means ?

59:16

It's I get it from the context .

59:20

It's very yellow with green .

59:22

Yeah , yeah , yeah .

59:24

It looks very Brazilian . Right , it looks very Brazilian

59:26

and the Portuguese is perfect . Bem-vindo

59:29

, faça login .

59:30

And if you look just at the aesthetics

59:32

, it looks really nice , right yeah ?

59:33

the aesthetics , but it does look very . I

59:36

mean , I guess it's very modern yeah yeah , yeah , that's what

59:38

. I was going to say . I feel like the style is very

59:40

much like Vercel , which is the modern style

59:42

. Right , like that's how you see the borders

59:44

are round , there's a bit of shadow

59:46

, like on the background . Right

59:49

Like the font , it looks sleek , it

59:51

looks really cool . I think it also it's TypeScript

59:53

.

59:54

It's TypeScript . Well , again , this

59:56

is a TSX file so it's TypeScript

59:58

, but if I ask it to just

1:00:00

do it in JSX , it will do it .

1:00:02

It will do it , I think , if you look

1:00:04

at the underlying code , you would say yeah .

1:00:07

If you look at the underlying code and the locations

1:00:10

of that stuff , it looks a bit like it's

1:00:12

been trained on shadcn , which

1:00:17

is a typical , very modern , sleek-looking component

1:00:20

library and it has the same bit of principles , right

1:00:22

Like shadcn ,

1:00:24

from what I remember is a bit like that copy-paste

1:00:26

philosophy which is kind of what they're encouraging

1:00:28

you to do here .

1:00:29

I'll make it even better , make it even

1:00:31

more sleek

1:00:33

brazilian . Wow , it's gonna be

1:00:35

like samba football like

1:00:37

the brazilian times

1:00:40

10 , let's

1:00:43

see what it is and now it starts

1:00:45

making this very brazilian let's

1:00:48

see what it does first .

1:00:51

But uh , one thing I also like I think is

1:00:53

interesting like the , the web page is

1:00:55

in portuguese , yeah , but , um

1:00:57

, the prompting is all in English as well . So

1:00:59

I also , because one time I was writing something for

1:01:02

like portuguese , and

1:01:04

olá Murilo , ah

1:01:11

, this is so funny . You know , jogadores , no

1:01:13

, it's like a football player , like it's

1:01:15

just player , right , but like e-mail , do jogador is

1:01:17

like . And

1:01:19

then murilo at canarinho canarinho

1:01:22

is like a canary , which is like a bird , which is the

1:01:24

mascot of the brazilian national football

1:01:26

team . So it's like and

1:01:28

then it says in the bottom novo , no samba

1:01:30

.

1:01:31

Like new at samba , join us

1:01:33

at blah blah blah , there's actually a palm tree , but you don't see

1:01:35

it . With the overlay , we can hold

1:01:37

on .

1:01:37

We can change this . Can we take

1:01:39

this ? Hold on

1:01:41

we can do this .

1:01:43

Let's remove this and

1:01:46

there's a small palm tree . Yeah , jumping .

1:01:49

There's a bit of sun on the top left , so if you do

1:01:51

something .

1:01:52

Brazilian time 10 times 10 , you

1:01:54

end up with football players . Yes it's

1:01:57

no yeah that's the conclusion of today it

1:01:59

just becomes more stereotypical .

1:02:00

Right , it talks about samba about

1:02:04

football just to dance , you know ? So

1:02:07

, yeah , so it becomes more stereotypical

1:02:09

, but yeah .

1:02:11

You want to one-up this ?

1:02:13

Let's see how racist you get .

1:02:14

Make it a thousand times

1:02:17

more Brazilian

1:02:19

. I

1:02:25

wonder if , at one point , you would just say but would

1:02:27

you be more comfortable filling this in versus a regular

1:02:30

login form ? For sure , Credit

1:02:32

card yeah , I'll do it all If you do A-B testing .

1:02:34

Yeah for sure Interesting . I

1:02:36

like how they said import coffee , sun , palm

1:02:38

tree , music , umbrella , flag feather , so

1:02:42

you can already tell what it's going to be . Oh

1:02:44

wow , it's

1:02:46

a lot of animations .

1:02:50

Yeah , they have a lot of animations also

1:02:52

I see they

1:02:55

went with carnival .

1:02:55

Now what does the feather thing do ? It doesn't mean

1:02:58

anything , I'm not sure actually . Ah

1:03:00

, email the sambista . So usually you dance samba

1:03:02

at carnival . So that's why , like within the

1:03:04

carnival , sambista folia

1:03:07

is all like caia na folia . Wow , this is

1:03:09

a novo , no bloco . Oh

1:03:11

, this is funny . So it means like new at the block , because

1:03:13

usually in carnival you go like in little

1:03:15

blocks , you know . So it's like this

1:03:18

is this is great .

1:03:19

There's a football , football

1:03:22

in the background

1:03:24

as well but aside from I mean aside

1:03:26

from the Brazilians , it looks good , right , it

1:03:28

looks good like with minimal effort it looks good yeah

1:03:30

that's what I really liked .

1:03:31

I didn't expect it to be so mature

1:03:34

in terms of uh this

1:03:36

almost makes me feel like

1:03:39

I could do something like this , right . But

1:03:41

I'm also wondering how , because I guess like

1:03:43

yeah , it's the same thing when you take a template and try

1:03:45

to modify it , right like as

1:03:47

soon as you try to get the two things to interact .

1:03:49

I don't know how easy it will be to

1:03:51

put everything together , to me the challenge with

1:03:53

Gen AI when it comes to visual stuff and

1:03:56

it would be a good test to do this here as well but

1:03:58

especially with images and videos and stuff , it's like

1:04:00

to generate something in the same style

1:04:02

, so you have this consistency . I think that

1:04:04

is very hard If you now say , do a login page

1:04:06

and then do a homepage and then do that page .

1:04:14

What did

1:04:14

you pass in as an input ? Can you , like , say ... Well , with images , and maybe with something

1:04:16

like v0 . It would be easier , because a lot of this visual aspect

1:04:18

is expressed in code exactly , so it's easier

1:04:20

to say like following this , this style

1:04:22

probably . Same

1:04:27

style , make a calendar

1:04:30

application

1:04:35

. Let's try

1:04:37

it . I

1:04:39

hope I don't reach my limits , my

1:04:42

credit limit .

1:04:43

But it's good , it's also free . You said right .

1:04:45

Well , what you're seeing now is all free

1:04:48

and it's super easy to use . It's

1:04:50

like literally just copy this , this TSX

1:04:52

file , here , um and when , when

1:04:54

it it also generates . When I , when I was doing

1:04:56

it over the weekend with HTMX , it generates

1:04:58

all the files , so it also generates

1:05:00

multiple files that you can all copy

1:05:02

paste . This is really cool yeah

1:05:05

, um now we see the calendar

1:05:07

page . The same type of looks very similar

1:05:09

, same type of . Also a lot of animations

1:05:12

, the same background very brazilian

1:05:14

from

1:05:16

the yeah it's actually quite , uh , quite consistent

1:05:18

styling oh wow , look at that .

1:05:20

They have the , the . It

1:05:22

looks great , huh , cool . Yeah

1:05:24

. Well , I'm not sure I'll make the same stylistic

1:05:27

choices , but but like

1:05:29

it's very impressive as a as a product , huh

1:05:31

yeah so , yeah , I was

1:05:33

excited about it when I tried it . No , this is really

1:05:35

cool . This is really cool . Maybe I'll well

1:05:37

, I don't do a lot of things like this , but , uh

1:05:39

, if I ever have a use , maybe I'll give it a try , see how

1:05:41

far I can get I'll

1:05:44

stop the sharing yeah , do you think

1:05:46

there's a danger of doing things like this because , uh

1:05:48

, you don't understand the code necessarily .

1:05:50

And then I think not more than any other

1:05:52

AI-supported code generation

1:05:55

yeah to be honest okay no

1:05:58

, but this is uh , this is really

1:06:00

cool .

1:06:01

I gotta I'm gonna jump on that chip I think

1:06:03

the possibility even , but like you said

1:06:05

, it's not specifically with uis like

1:06:07

you can generate very

1:06:09

shitty functional code now without

1:06:11

knowing how you're doing it and without being opinionated

1:06:14

on how stuff needs to be done at all right , yeah , and

1:06:16

I think also like this , for UIs it's like

1:06:18

yeah , if it looks good on the UI , it's

1:06:20

fine , right , yeah , and then it's

1:06:22

a mess underneath , it's a bit uh , yeah

1:06:25

, I feel like there's always a bit of a danger , right , but yeah

1:06:28

, that's , that's , that's the truth , for I

1:06:30

think the challenge is very much like with very limited

1:06:32

effort , you can make something that appears

1:06:34

very functional , yeah

1:06:36

, which might not do what you're expecting

1:06:38

yeah , I think the moment

1:06:40

there's a bug or you need to change something ... Before ,

1:06:42

if you didn't have the knowledge , making something that appears

1:06:45

to be very functional was very hard , yeah

1:06:47

, so that barrier was completely removed so

1:06:49

.

1:06:49

So I guess from yeah , yeah , but

1:06:52

I do think one use for ChatGPT is

1:06:54

like when I say I don't know , write a recursive

1:06:56

function of this X and Y with this type of elements

1:06:58

and I kind of know what I want to do , but

1:07:00

it's just a matter of saving the time for me to type it . So

1:07:03

if you know what you want to do , I feel like for this maybe it's

1:07:05

too big if you know

1:07:07

what you kind of wanted to do and you can read the code and

1:07:10

you understand and you can make changes , I

1:07:12

think that's also valid . I think the issue is when you don't

1:07:15

really know what's happening

1:07:17

underneath and

1:07:19

then it's almost like a house of cards , right , like if

1:07:21

something , if you need to change one thing later

1:07:24

, if there is a bug , or if you there , I don't know then

1:07:26

kind of everything falls on you , right

1:07:28

. So , but there are there , it's a valid

1:07:30

, uh , there are valid use cases for

1:07:33

it . All righty

1:07:35

, but really cool . I feel

1:07:37

like uh we covered quite a lot of stuff today

1:07:39

. We did have a few

1:07:41

more topics , but we can leave it for

1:07:43

next week as well . We

1:07:46

also have a long weekend ahead of us . Ahead of us ,

1:07:48

yeah . Right , Monday is also a holiday . Oh wow , I didn't even realize it . Again

1:07:50

a long weekend . We just came from a long week . I

1:07:55

know .

1:07:55

That's why I think a lot of people are taking this week as

1:07:57

a holiday , because then you have friday , saturday

1:07:59

, sunday , and then you take this week off , and then you have saturday

1:08:02

, sunday , monday off . Why do you tell ? Me this now

1:08:04

I'll

1:08:07

tell you earlier next time . I'm so sorry

1:08:09

, so

1:08:14

I'm assuming you don't have any plans .

1:08:16

No , not really no .

1:08:18

Okay , what about you , alex ? Anything

1:08:21

? No , I also

1:08:23

forgot . Okay , yeah , I didn't know

1:08:25

either . To be honest , I forgot I

1:08:28

was filling in my holidays and then I saw like , oh yeah

1:08:30

, a lot of people are taking holidays . Oh yeah , okay , there's a long

1:08:32

weekend , long weekend , that's smart . But

1:08:35

yeah , I also feel like in the end of the year , cause

1:08:37

you have to use your holidays in

1:08:40

the year cycle right In Belgium

1:08:42

. So I also feel like either

1:08:44

you plan well and then you allocate this time

1:08:46

, or you just didn't use holidays throughout the

1:08:48

year , and then you have to use it at some point . Oh yeah , I can

1:08:50

use it here , or you used it before and then you realize

1:08:52

you don't have enough . So it's a bit so

1:08:55

, okay , but cool , then

1:08:57

I guess I don't have anything planned either , but

1:09:00

I didn't . Well , I kind of knew about this , but

1:09:03

I think that's

1:09:05

it . I think we can call it a pod . Stay

1:09:07

warm everyone . Stay

1:09:09

warm everyone . Thanks , you have taste . See you next week In

1:09:11

a way that's meaningful to self-improvement Next week . We have to think while we're

1:09:13

recording . Hello , good morning

1:09:15

sir . I'm Bill Gates . I just

1:09:18

talked about it .

1:09:19

I would recommend TypeScript . Yeah

1:09:22

, it writes a lot of code

1:09:24

for me and usually it's slightly wrong

1:09:26

. I'm reminded , incidentally , of

1:09:28

Rust here , Rust , Rust .

1:09:32

This almost makes me happy that I didn't

1:09:34

become a supermodel .

1:09:35

Cooper and .

1:09:37

Netties .

1:09:38

Well , I'm sorry guys , I don't know

1:09:40

what's going on .

1:09:41

Thank you for the opportunity to speak to you today about

1:09:44

large neural networks . It's really an honor to be

1:09:46

here .

1:09:46

Rust Rust Data topics . Welcome to the

1:09:48

data . Welcome to the data topics podcast

1:09:51

.
