#75 Developer Productivity in 2025: AI Replaces Engineers, Biden’s AI Chip Regulations, UV’s Killer Feature, and Doom in a PDF

Released Thursday, 16th January 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

This is how we do it . You

0:04

have taste In a way

0:06

that's meaningful to software people .

0:09

Hello , I'm Bill Gates . I

0:15

would recommend TypeScript . Yeah

0:17

, it writes a lot of code

0:19

for me and usually it's slightly wrong

0:21

. I'm reminded, incidentally, of Rust here.

0:23

Rust.

0:26

This almost makes me happy that I didn't

0:29

become a supermodel .

0:30

Kubernetes. Well

0:33

, I'm sorry guys , I don't know

0:35

what's going on .

0:36

Thank you for the opportunity to speak to you today about

0:39

large neural networks . It's really an honor to be

0:41

here .

0:41

Rust. Data Topics. Welcome to the

0:44

Data Topics. Rust, Rust, Rust. Welcome to the Data Topics.

0:46

Welcome to the Data Topics podcast. Doom. Hello

0:48

and welcome to Data Topics Unplugged

0:50

, your casual corner of the web where we

0:52

discuss what's new in data every week

0:55

. From doom to sins

0:57

, everything goes . A

0:59

very dark episode . Check

1:02

us on YouTube . Feel

1:07

free to leave a comment, um, LinkedIn, all the works, or talk to us via email at

1:09

datatopics at dataroots.io. Today

1:11

is the 14th

1:14

of January of 2025. My name is Murilo

1:16

. I'll be hosting you today, together with Bart

1:19

hi, and Alex

1:21

behind the scenes making everything happen . She's

1:23

waving. No, she's not really waving today, but uh, she's

1:25

waving now . There we go . One day

1:27

, alex will join us on the

1:29

pod one day . Um

1:32

, how are you doing, Bart? Good

1:34

, yeah , how was your weekend ?

1:37

um quiet , quiet

1:39

, quiet is good. Yeah

1:41

, played in the snow oh yeah we don't often have

1:43

snow here but it's not a lot .

1:45

I feel like , um , because I went to

1:47

bed it was snowing. I woke up, it was snowing. There was like a

1:49

good layer of snow , which is

1:51

, uh , not very common . But actually last year

1:53

the same thing happened like around the same time . I

1:56

know because I went to , uh , tenerife

1:58

or gran canaria

2:00

, something like that better year . Yeah , but

2:02

like I remember , remember , I put the Christmas tree the day before

2:05

and I think it snowed so much that the

2:07

tree fell . There was a pile of snow or whatever

2:09

. They didn't collect my tree , like the Christmas

2:11

tree .

2:12

Oh , like that , yeah , yeah , you put it in the garbage

2:14

outside .

2:15

Yeah, you put it outside for them to collect. In Belgium,

2:17

for people that are not aware, they

2:20

come and collect your Christmas tree, and

2:22

this was actually yesterday , or

2:24

in my neighborhood , um , so that's

2:26

why I kind of know the timing for the snow . It

2:28

was a bit the same , because I remember it's not a lot when we're leaving

2:30

, so so , yeah , but

2:33

yeah also the . I also noticed that these are getting the

2:36

sunlight time . It's already

2:38

getting longer and , uh , still very

2:40

cold , though it's good as the sun is getting

2:43

longer , like the daylight is getting , because I wake up in the morning

2:45

and it's like , oh , it's light . And it's

2:47

like , oh , this is . It's like

2:49

it's a little thing in life , you know . It's like , okay , it's

2:51

going to be a good day , oh , yeah

2:53

, cool . Last

2:56

weekend , and maybe already segueing to maybe

2:58

the first topic , there

3:09

was the brussels motor show . It started last friday . It went

3:11

over the weekend. Um, and my project is also at the car expo,

3:13

Brussels, the car expo, yeah. So, yeah, it's an auto show, so different

3:16

brands they come they . They showed their latest

3:18

models , their prototypes , to prove concepts

3:21

and all these things . There are some games as well

3:23

, uh , and apparently

3:25

I didn't notice . But uh , quite big

3:28

these days in the european car

3:31

market yeah , I heard because , like

3:33

it's not , like not all , not a lot of places they

3:35

have all the brands coming together

3:37

it used to be Geneva.

3:39

There was also a car uh , exposition uh

3:42

, which was really a bit like also the networking event

3:44

where all the CEOs of the car brands went, etc

3:46

, etc., but it didn't survive COVID

3:48

ah , really yeah , and the

3:51

brussels car expo did , and so it's becoming a bit the de

3:53

facto new event of the year

3:55

for , uh , car manufacturers in europe .

3:57

Yeah , yeah so there are some interesting things

3:59

. So I went there yesterday so I'm

4:01

working on a project for an ai configurator

4:03

. I'll get into that in a bit . I

4:07

learned some things . So some people actually try , like they

4:09

buy the cars on the spot , like

4:11

they just see , like they see the car , they go in and they're like , okay

4:13

, I want this Because they also have , like , better deals

4:16

, you get a lot of discounts , right .

4:17

Yeah .

4:18

And sometimes people they say like , oh , this

4:20

model , because , yeah , a

4:22

lot of people go in , they open , blah , blah , they touch the

4:24

stuff so they also can get another discount on

4:26

top of that . So they all there are these things

4:28

that happen . Tesla was there with the Cybertruck

4:31

and the Cybercab . Oh yeah , it was there . Yeah , it was there

4:33

it's not road-legal here, right? No

4:35

, so they didn't even have so . For

4:37

some cars they actually had an iPad , which probably had you

4:40

can click and interact . For the cyber truck

4:42

it was just a paper , just with some information

4:44

, some specifics and um

4:47

, have you ever seen a

4:49

cyber truck in real life ? Not in

4:51

real life now and

4:53

you've seen like uh in uh images

4:56

and stuff online I've seen one explode next to

4:58

a Trump hotel? Not really, no, I didn't

5:00

see that the video no , no , no I didn't see

5:02

it was filled with fireworks. Yeah, oh well, no, I didn't

5:04

know . Segue , segue , yeah , did

5:07

you like it ? By the way , did you like the design of the cyber

5:09

truck ? I'm

5:11

uh neutral about it , but

5:13

would you buy one for yourself ? No , yeah , if it's the same

5:15

price as the car you have now , would you buy it same

5:18

price ? So what was the design ?

5:19

this changes just the design . I'm not

5:21

that sensitive to car design , so I'm probably probably the wrong

5:23

person to ask .

5:24

I feel like I'm not very either . I'm very functional

5:26

when it comes to cars , but to me it felt like

5:28

if you're playing a video game

5:30

, that person's internet is very

5:33

slow so it's very pixelated . You see

5:35

all the edges of the blocks and all these things . It's like

5:37

a Nintendo 64 , right . Yeah , something like that

5:39

. It's a bit strange . They also had a

5:41

cyber cab , so basically it's like a car but

5:43

there's no steering wheel , so it's like two passenger seats

5:46

, okay , cool yeah , so I also thought it was interesting

5:48

.

5:48

I also like only two seats .

5:50

Two seats and then the screen . So it's a small car

5:52

, uh there's , no , there's no backseat . There's

5:54

no backseat , but it was just like normal size okay

5:57

, so you can uh extend your legs I

5:59

guess , yeah , but it's like , I guess , supposed to be just a cab

6:01

, like a Waymo , but a car just for

6:03

this , so

6:07

yeah , and there are some games and stuff . But there's also the

6:09

AI configurator hub, so that's the project that I helped

6:11

with. Nice.

6:32

, to try to do this with ai , so the

6:34

first prototype mvp

6:36

was also displayed there and it's

6:38

text .

6:39

Or I mean , is it text or his voice or it's text

6:41

?

6:41

the voice could be an easy iteration , exactly . But

6:43

I mean the adding voice is just

6:45

saying , yeah , now we have voice to uh

6:48

voice , speech to text , right

6:50

, um , and then you kind of go with the

6:52

same flow as you're there so

6:54

, uh , it was it was .

6:55

So it's like uh , I'm , uh , I'm

6:58

a guy with a family of three kids , uh , I

7:00

need, uh, I have only one car. Uh

7:02

, what kind of car would you recommend ? Is it like that ?

7:04

yeah , exactly . And then you can say like , oh , I just want

7:06

the cheapest , or I want something sporty , or I want

7:08

something xy , I want hybrid , like I think

7:10

that's , that's actually very important . You know like

7:12

, do you want diesel petrol

7:15

um ?

7:16

and you get this like could

7:18

it also be like ? I hear this type of van

7:20

or that type of van or that type of , uh

7:22

, yeah , and then you say

7:24

, okay , let's go for that one , and you can also configure it like

7:26

I want that color , yes so a lot of the times

7:29

if it's too broad .

7:30

So the way we set it up , if it's

7:32

too broad you would ask some more guidance questions

7:34

so you can narrow a bit . So actually in

7:36

the back we have the possibilities , right

7:38

. It doesn't come up with stuff from its head , so it's like

7:40

you have already the possibilities pre-configured . If

7:43

it's too broad , it will say , okay , maybe

7:45

select one of these things , cool . And

7:47

then sometimes , if it's narrow enough , you

7:50

have , uh , like tiles that you can

7:52

click on and then after that you can still

7:54

customize it , add the sun roof or add

7:56

this , or okay , I like this , but can you change

7:58

this for that ? So that is also possible . So

8:01

it was very interesting project . This happened

8:03

, uh , yeah , still happening technically

8:05

because the , the motor show , the brussels motor show , is still happening

8:08

. And , um , I

8:10

was talking to to jonas actually , and , uh , we're

8:12

discussing how it took me back to my

8:14

very first project at interviews , which

8:17

was uh , for another car manufacturer

8:19

, which was , uh , it was actually

8:21

also that back then it wasn't called

8:24

gen ai , but it was kind of Gen AI , it

8:26

was a Q&A maker . So

8:28

that's the first thing that I wanted to kind of reflect upon so

8:31

a Q&A chatbot , or really a Q&A generator

8:33

.

8:33

Text generator .

8:35

It was a chatbot , but it was very much

8:37

like it

8:40

wasn't really a chat , right ? Like you ask something , you give

8:42

an answer , Then you ask something , you give an answer . Then you ask something , you give

8:44

an answer . It's like . It's almost like if you restart the conversation after every

8:46

question and answer , it was almost the same . Okay , I

8:48

see , Right . So I think nowadays with ChatGPT

8:51

, the context and the cohesion

8:53

of the conversation is way better . It's

8:59

more of a natural dialogue , Exactly so the algorithms

9:02

back then , but there

9:04

was probably a lot of rule-based stuff right A lot

9:06

of like deterministic tree traversal

9:09

. Exactly . So that's the first thing I wanted

9:11

to bring up . Maybe

9:15

I'll share . Put this on the screen

9:17

. Yeah , for people that are not familiar , this is the auto

9:19

show that is happening now . Like

9:21

, as Bart mentioned , I should have put this before , but

9:32

What I wanted to show is the QnA Maker, which still exists, apparently. Um, oh, it's actually

9:34

called that, yeah, QnA Maker, and actually now it's called, uh, AI services within Azure, but

9:36

before it was called LUIS, which is like Language

9:38

Understanding Intelligent

9:41

Service, something, yeah. So it was called LUIS

9:43

. Um , so yeah it

9:45

was . They had like a pretty ui , basically actually

9:47

the , the work , the engine

9:49

was kind of there . You just kind of feed questions

9:51

and answers and you have metadata tags and all these things

9:53

and then you just had the , the nice ui

9:55

that actually azure provided , right . So basically

9:57

you just have a whole bunch of questions , you have a whole

10:00

bunch of documents that you provide and then the

10:02

idea is that it would match stuff for

10:04

you , right , and you had like the formats

10:06

and all these things very interesting , um

10:08

. But I still remember , even back then , that there was

10:11

a lot of um , it

10:14

was a lot of challenges , right . If you say , like , what's the

10:16

best company in the world , it would probably say the company

10:18

that I was working for , uh . But if you say

10:20

what's the worst company in the world , it also says the same company

10:22

company , right , because it's just matching keywords . So

10:25

, like it was very A

10:27

few rules that are not set up correctly , exactly right

10:29

. So you need a lot of examples

10:31

in the Q&A stuff and actually like looking

10:33

back... is this

10:36

RAG? No, it's R-A,

10:38

retrieval-

10:41

augmented generation. So

10:43

there was no generation, but there was the retrieval

10:45

part. Retrieval-augmented,

10:47

yeah, retrieval, yeah, something

10:49

like that.
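To make that keyword-matching style concrete, here is a minimal, illustrative Python sketch (not from the episode; the Q&A pairs and names are made up). It shows why "best company" and "worst company" can retrieve the same stored answer when matching is purely keyword overlap, and how the retrieval part of RAG is essentially this lookup step without any generation:

# Illustrative keyword-overlap "retrieval" with no generation step,
# in the spirit of the old Q&A-maker style systems discussed above.
def keyword_score(question, candidate):
    # Fraction of shared words between two questions (Jaccard overlap).
    q_words, c_words = set(question.lower().split()), set(candidate.lower().split())
    return len(q_words & c_words) / len(q_words | c_words) if q_words and c_words else 0.0

# Hypothetical knowledge base of (question, answer) pairs.
knowledge_base = [
    ("What is the best company in the world?", "Acme Corp, of course."),
    ("How do I reset my password?", "Use the 'Forgot password' link on the login page."),
]

def answer(question):
    # Return the answer whose stored question shares the most keywords with the query.
    _, best_answer = max(knowledge_base, key=lambda qa: keyword_score(question, qa[0]))
    return best_answer

print(answer("What is the best company in the world?"))   # Acme Corp, of course.
print(answer("What is the worst company in the world?"))  # Same answer: the keywords match anyway.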

10:52

Do you have a...? We don't have a...? Sorry

10:55

, uh , but yeah

10:57

, but that was it back then . Right , um

10:59

, and I think , yeah , fast forward now to 2025

11:02

, gen ai , ai . I think

11:04

it's much advanced , but

11:07

it was an interesting reflection . You know , it's

11:09

like this is where we were . Have

11:12

you been to the motor show now A

11:15

few years ago ? Yeah , I think , even pre-COVID , to

11:17

be honest .

11:17

Yeah , it was when I was in the market

11:19

for a new car .

11:21

Ah , wow . And

11:23

then , since then , you chose a car . You're happy with

11:25

it now . So you went to the motor

11:28

show to look at cars and like to to

11:30

not have to visit uh 101 dealerships

11:33

, yeah , brands and that's

11:35

a nice thing , right like you see what's out there , what uh ? yeah

11:37

, indeed , I think if you really

11:39

like the , the cars

11:42

, though , if you're a , if you're

11:44

very fanatical , like

11:46

yeah , it's cool , but at the

11:48

same time I feel like maybe too busy for you . Like

11:50

, maybe you want to go to dealership because you have a one-on-one

11:52

attention , like , and probably if you're very enthusiastic

11:55

about cars , you don't want to go to dealerships , take

11:58

a chair, put it in front of the Renault Clio and really

12:00

stare at it, just silence, appreciate it?

12:01

Yeah, no, but maybe, like, go in,

12:03

take it

12:04

for a test

12:06

drive. You know, maybe talk

12:09

to someone that can give you all the information about this car

12:11

without splitting the attention , you know . So

12:14

you know , um pum pum . Now

12:16

maybe more for the timely

12:19

news bizdev thingies

12:21

. I see here that the Biden

12:23

administration proposes new

12:25

rules on exporting AI chips , provoking

12:27

industry pushback .

12:32

What is this about , Bart ? Let me open the link

12:34

because I want to say it's two days ago . Is

12:36

that correct ? One

12:43

day ago , the 13th . So

12:48

it's yesterday that the Biden administration proposed

12:50

a new set of rules for exporting ai chips . Okay , um

12:52

, there's a lot of uh . I think the discussion is mainly

12:54

based on um

12:56

national security okay

12:59

and then mostly

13:01

aimed against China , from

13:04

the US , okay , and

13:15

it's a regulation that basically limits the amount of chips quote-unquote

13:17

AI chips that can be exported , and

13:19

it's very much like it impacts the big

13:21

US companies like AMD

13:23

, like NVIDIA , yeah , like it impacts like

13:25

the big , uh , us companies

13:27

like amd , like nvidia , yeah , um , and they , they have to limit their sales basically

13:30

to non-allied countries and

13:33

the biden's administration .

13:34

But trump is the president-elect , but he's

13:36

not . He hasn't taken over office .

13:38

I don't know how easy it is for them to reverse

13:40

it . To be honest and like

13:42

, like the title says , it's a proposal , so I doubt it's

13:44

going to be final and yeah um

13:47

, but it would mean that, for example, to

13:49

non-allied countries , uh , it would

13:51

limit what they can order

13:54

to 50,000 to 320,000 chips

13:56

, okay , and

13:58

if you want to go beyond that , you need to have some

14:00

sort of license . There are also key allies

14:03

and they mentioned the UK, Japan; there are a

14:05

number of others . They get unrestricted

14:07

access . Okay , I

14:10

think there was a big backlash from

14:13

the US landscape , especially from NVIDIA . I

14:15

think NVIDIA was very explicit about it , yeah

14:17

, that it

14:20

would be bad for innovation

14:22

. It would be bad

14:24

for innovation . It

14:31

would hamper the competitiveness

14:34

of the US chip-generation landscape. And the other point, I guess

14:36

yeah , I think so . I think , from the moment

14:38

that you say you're not allowed to

14:40

export , nvidia

14:45

is not allowed to export this anymore , or in such a limited way that it basically

14:47

handicaps anyone that wants to buy it . It creates a

14:49

temporary pause

14:51

for these countries like

14:54

china , yeah , but it basically signals

14:56

this like get your shit in order . You need to do

14:58

this yourself . Yeah , maybe it will take

15:00

them two years , maybe it'll take them five years , maybe it'll

15:02

take them 10 , but that means that after the time , they

15:04

don't need us anymore . True

15:06

, that's true .

15:07

I mean , it's a very , it's a very short term actually

15:09

, yeah , yeah , yeah , I feel like , yeah , it's a bit short-sighted , right

15:12

like in the , it doesn't fix

15:14

the . Yeah , I see , and what do

15:16

you think of the ?

15:16

because you said it's , you speculated

15:19

that it's a security thing well

15:21

, I think that is more

15:23

or less how I would describe it.

15:24

Yeah and like the security thing . Is it

15:26

like because wasn't there

15:28

on

15:30

the supply chain or something that , uh

15:33

, the people were tampering

15:35

with the chips ? There was a story

15:37

or something like that. Is it, in that sense, the security,

15:39

or like, what were the security concerns of them

15:41

getting the chips? I

15:44

think it's mainly uh the building up the ai

15:47

abilities and potentially

15:49

to be able to use them in uh , in

15:51

uh settings where uh

15:53

where they can be used in an adversarial way, like

15:56

in in warfare and cyber , cyber

15:58

attacks , etc . I see , okay

16:01

, I think that is a more explicit point that is

16:03

being , uh , discussed in the community .

16:05

I think the less explicit point is also just competitiveness

16:08

on ai right yeah

16:10

, yeah , yeah , yeah , yeah , and

16:12

you do see that , like we , I think last week we talked

16:14

about, uh, the DeepSeek V3

16:16

, which is a chinese model , which is a chinese

16:18

model indeed , but like deep seek is a good

16:21

example .

16:21

Like they use

16:24

much , much less resources but in the end they're training on

16:26

nvidia chips , right ?

16:27

yeah , yeah , yeah , yeah it would hamper like

16:29

if they , if , if they can't just order

16:31

chips like that

16:33

, like you can't really invest in right , like you can't really

16:35

build the capabilities yeah , yeah

16:38

, indeed , indeed , and uh

16:41

, now

16:43

maybe play a bit on the words , because you mentioned

16:45

AI development and

16:49

I saw here a topic AI development , but

16:51

instead of developing AI , ai development

16:53

teams . I guess that's what you meant , because I put a link

16:55

here .

16:56

AI development teams . Yeah

16:58

, there were two notable things , I think

17:00

, this week . First

17:03

, zuckerberg I think it was in the news a lot Most

17:05

people will notice , but he uh announced that

17:08

Meta plans to replace mid-level

17:10

engineers with AIs this year. Yeah

17:14

, but uh .

17:16

So what I'm reading when

17:18

I see this is he basically

17:20

wants to replace people

17:22

with AIs . Well

17:24

, I guess technically , in

17:27

practice it's like I'm going to give

17:29

you coding assistants and

17:31

agents and all these things and I'm going to fire

17:34

three people , because now you can develop as much

17:36

as four people .

17:39

I think that is a step up . I think what he's

17:41

also alluding to is, with

17:44

agentic AI, that at some

17:46

point he

17:48

tries to really completely

17:50

eliminate the need for some developers, not

17:53

all , of course , but yeah , I'm

17:55

also surprised that not just to supercharge

17:57

existing developers . What

17:59

he's really hinting at is like to to

18:01

replace yeah

18:03

, and why do you think mid-level

18:06

here ?

18:06

Why not juniors ?

18:08

I think what he just means here is like it's more

18:10

than just junior skills Interesting

18:13

, it is mid-level skills .

18:16

Interesting . What do you think of that ? Do

18:18

you think that's possible ?

18:20

Well, we discussed Bolt.new a bit last

18:22

week , which is like this LLM

18:26

that allows you to very easily build

18:28

front-end and back-end applications . Focus

18:30

on front-end , but you can do back-end . I

18:33

think what

18:35

this shows is that , with

18:39

the right setup, agentic AI on

18:41

code development will get you very

18:43

far .

18:45

Yes .

18:46

That is a bit , and of course there's a lot of discussions

18:49

on how maintainable is it , et cetera , et cetera , et

18:51

cetera , but

18:54

that we are today much closer to having

18:58

agentic AI that can do code

19:00

development based on just describe

19:02

the feature and add it to this, uh,

19:04

code base , we're much closer to

19:07

that , like way closer than we were two

19:09

years ago , for sure . So

19:11

to me , like that approach like you

19:13

have an existing code base, you ask an agentic AI

19:15

to build a feature on top of that, be it front-end or back-end

19:17

or whatever it will come yeah

19:20

, yeah , that's true .

19:22

I think so . I think well

19:26

, and I think there's another article that I

19:28

wanted to link with this , but I

19:30

agree , I don't know if you can go as far . Well

19:32

, we also talked a bit last

19:34

week . I want to say that

19:36

the expectations for people to be productive will increase

19:38

. And maybe if you read the Mark Zuckerberg's

19:41

announcement , it's a bit like that as well

19:43

, right , like ? if you're going to replace mid-level engineers with

19:45

AI. Basically, another,

19:47

less clickbaity,

19:50

I guess, way to think of it is like you're expecting

19:52

the people you have to be more productive so you don't need as many

19:54

people . Well , yeah , you could phrase it like that

19:56

right I feel like it's just in a

19:58

way , it's like we're kind of saying the same thing , but I feel like one is more

20:00

catchy um the

20:03

other , the other thing that I linked there is a very

20:05

similar one .

20:06

It's from uh microsoft , um

20:08

, where

20:10

uh microsoft

20:13

basically announced that they will uh

20:15

form an internal dev focused

20:17

, development focused ai organization

20:20

and it will be aimed

20:22

at building AI solutions , but

20:24

also aimed at AI

20:27

development within Microsoft to

20:29

basically fast-forward AI-supported

20:32

co-development . So

20:35

they're doing the same thing . I

20:38

think the way that they

20:40

report on it is in

20:43

a smarter way, in a more formal way,

20:45

in a bit more thought-through way. Yeah, Zuckerberg

20:47

is basically saying, uh, all the people that are building

20:49

these skills for me, I'm gonna fire them, because it will be cheaper

20:51

but they're more or

20:53

less working towards the same goal .

20:55

You should see this yeah , yeah

20:57

, indeed , I feel like yeah . In

20:59

a way , it's a bit different

21:02

ways to say the same story . Yeah right , one

21:04

is more like more responsible , maybe

21:06

like, we're forming a dev-focused

21:09

organization, like, let's adapt

21:11

more to this , and the other one's saying , yeah , adapting

21:13

means letting people go yeah , yeah do

21:18

you think this is the future ? do you think this is gonna

21:20

? Do you think this is gonna stick, I guess? Or do you

21:22

think this is feasible for any organization

21:24

? Can you reflect a bit on ?

21:26

well , we're here talking really about code development . Yes

21:28

, um

21:31

well , like I was saying , I think we are much closer

21:33

to that today , especially if you start like I have

21:35

an existing code base and I want to start

21:37

building on that with new features , and

21:40

then if you have tools that

21:42

do code generation , that also can execute

21:44

that code , that can see at tracebacks

21:47

, that can build tests for you , et cetera , then you're

21:50

very close to that . And I think with

21:52

stuff like Bolt and stuff like Create.xyz

21:54

, we see that today , but

21:56

it's only the first generation of those tools .

21:58

Yeah , that's true .

21:58

So to me it's that... I

22:02

think up to six months ago I was very skeptical on this

22:04

, but

22:06

Bolt really changed

22:08

my viewpoint on this a bit. Um, and

22:11

I think if we just see this as the first iteration and we're

22:13

three , four generations further , yeah we're

22:15

very close to this , but and

22:17

I and I think the the

22:20

big difference for an engineer is a bit

22:22

like what

22:25

we typically do is , then we tend to be , we try to be

22:27

very opinionated , like this is building good code

22:29

and this is what we do , and I think what the

22:31

focus will shift towards if

22:34

at least you adopt these tools . Right , if you adopt these tools , the

22:36

focus needs to shift towards very

22:38

clearly expressing, like, these are the features that I want

22:40

built, these

22:44

are the things that you also want to test, and

22:46

on what functionality

22:48

you want to test them, and to really very

22:50

clearly express that. Yeah

22:52

, and leave

22:54

the code generation a bit up to the tool.

22:56

Yeah , yeah , indeed . No , I agree

22:59

, I was also thinking , linked a bit to what

23:01

you were saying . If I'm reviewing someone else's

23:03

code these days because I

23:05

guess also I'm thinking about this , because I review the

23:07

ai generated code in a way right

23:09

, but uh , I've

23:11

always wondered , like , what are the things I should pay attention to

23:14

? Should I have to understand every single line , like

23:16

every dependency ? Do I need to understand everything you know

23:18

? Like and I think today

23:20

is more I look at something

23:22

. I can kind of get what it's like if you have a function

23:24

right , do I get what the function is doing ? Yes

23:27

, okay, then I don't need to understand all

23:29

the inside of it , right , because

23:31

I know that if I need to change something , I know it's going to be there . And

23:34

I feel like , in a way , you kind of start looking more

23:36

at , uh , how the pieces

23:38

connect and like , are things entangled

23:40

or not entangled ? You start thinking , like , if I

23:43

need to make a change , is this a change that I'm just going

23:45

to have to make here or I have to make in different ?

23:46

places right and you want to

23:48

test functionality .

23:49

I know , to test the functionality . But , like I guess , like if you

23:51

have a function that you know what the function is doing , you have

23:53

tests for it . Like the insides

23:56

it matters

23:58

less in a way , because even good . But you

24:00

know , like , yeah , I'm seeing a function that it's

24:03

very , maybe very vega , like every

24:05

function should do one thing and only one thing , right ? So

24:08

if it's a nice function that has a very good scope

24:10

of what it's doing , has a good test coverage

24:12

, even if the code inside is a bit shitty

24:14

, it's like you can always go back and change

24:16

it if needed , right , like you can refactor

24:19

it right .

24:19

Like yeah , and then typically well

24:21

often people express this let's say the outcome is

24:23

good , but then it comes down to performance

24:26

.

24:26

Indeed .

24:27

Like it really depends on the use

24:29

case whether or not performance is a thing right , and if

24:31

it becomes a bottleneck you fix it

24:33

.
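As a minimal sketch of the "review the contract, test the functionality, refactor the insides later" idea from this exchange (illustrative Python with pytest; the function and the tests are hypothetical, not from the episode):

# discount.py: imagine the body was AI-generated and a bit messy inside;
# what matters for review is the contract: price after a percentage discount, never below zero.
def apply_discount(price, percent):
    discounted = price - price * (percent / 100.0)
    return max(discounted, 0.0)

# test_discount.py: the tests pin down observable behaviour only, so the
# internals can be refactored (or regenerated) later without fear.
from discount import apply_discount

def test_basic_discount():
    assert apply_discount(100.0, 25.0) == 75.0

def test_never_negative():
    assert apply_discount(10.0, 150.0) == 0.0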

24:34

Yeah , indeed , indeed indeed , indeed no , I fully

24:36

agree . I fully fully agree . Do

24:38

you think there's any caveats to this ? Like Meta and Microsoft

24:40

are doing this . Is there any reason

24:43

? Why ? Are there any exceptions , maybe Like

24:45

the startups ? Or do

24:47

you think it makes a difference the

24:51

type of product that people are using or the size of the organization or the skill

24:54

sets of the developers ? I think

24:56

.

24:56

maybe I

24:59

think the tool chain that you need for this is

25:01

not there yet . Honestly

25:03

, like I said , like Bolt created the

25:05

first generation , I think we probably need to be on

25:08

the fourth generation for wide scale adoption

25:10

. But , for example

25:12

, like what I think a lot

25:14

of developers are playing with , is GitHub

25:16

Copilot or Cursorai

25:19

? Is it ai ? I don't know .

25:22

Cursor .

25:23

AI-focused IDE and

25:25

that is like still miles away from

25:28

what bolt is doing , for example . Yeah

25:30

, because, like, um, with Cursor

25:33

or GitHub Copilot you need to... you need

25:35

to specify, like, I want these files from my code base

25:37

in the context when I ask a question or want it to edit the code

25:40

. Um, it has very limited

25:42

ability to actually execute the code

25:44

and fix based on that. It has very limited

25:46

ability , if any , to integrate with

25:48

a database and to uh , to apply migration stuff

25:50

like this . So it's very limited um

25:53

. And then , if you look at things

25:55

like bolt that are much further advanced , they

25:58

are still very much a proof of concept

26:00

, more or less like it's narrow right

26:02

it's narrow and also also in the sense that like it

26:04

doesn't integrate nicely with Git

26:06

, for example, with version control. Like, you can build your

26:08

proof concept , but it's very , very hacky , tacky

26:10

to get it into a versioned repo and stuff like that so it's

26:13

not like it's . It shows the

26:15

direction that we're going , but it's not ready for wide

26:17

scale adoption , while at the same time

26:19

, like these companies like microsoft , like

26:22

Facebook, like Meta. They have built

26:24

and trained huge LLMs in-house

26:26

. I mean , they're skills-wise probably

26:29

miles ahead of most other companies and

26:31

so they can probably more easily build these tools in-house as

26:33

well , to make it very specific to their own tool

26:36

chain that they have going on . So I think there

26:38

is a competitive advantage that

26:41

hopefully we'll get closer through time when we make

26:44

iterations on these .

26:44

Yeah , yeah also , as

26:47

you mentioned as well , I feel like cursor , copilot

26:49

, they're going really from the developer side and

26:51

Bolt is really going from the functional

26:54

application side right

26:56

and I feel like ideally they

26:58

meet right at one point . Like you can

27:00

have a bit of both worlds

27:02

, right? Like, you can get started quickly with something like Bolt

27:04

, but

27:07

at the same time you can still be a bit opinionated , like I don't

27:09

, because I think Bolt, you mentioned, uses Supabase.

27:11

Yeah, the integration with

27:13

Supabase, right? Yeah. So it's like, but if you say, you

27:15

can still be , it's not as going to be as narrow . You

27:17

can say I want to do this and I want to do that and

27:19

you can do this , and then , like , you can maybe go

27:21

shift a bit more towards the developer side

27:24

and maybe do change it a bit more to

27:26

your liking, instead of having to prompt over and over. You know, and

27:28

it kind of shifts it back and forth . But

27:30

, uh , interesting , um

27:32

, we'll link to this before

27:34

we cover the the other business dev topics

27:37

um , developer

27:39

productivity in 2025 , more

27:41

ai , but mixed results and I need to share

27:43

the different tab . Um

27:46

, I , yeah , I , I

27:48

looked , I read through this article and this

27:50

is from january 2nd , so not

27:52

that old um . But basically

27:54

, it's just looking at developer productivity in 2025

27:57

and no big surprise . I

27:59

guess the AI and AI-assisted things are

28:02

there , right . So maybe just to I'm

28:04

just going to skim through the subtitles . So

28:06

, for example , they mentioned new security risks emerge

28:08

for AI , which I

28:12

wanted to ask a bit of your opinion . I do think this is relevant

28:14

, but I don't know how relevant it is today

28:16

, to be honest , because when

28:20

I read this , I think of AI

28:22

will pull in a dependency that has some

28:24

security vulnerabilities , right , and

28:26

because you're not really vetting the code , then

28:28

now your code is less secure , right

28:31

. But to be honest

28:33

, I feel like I'm not sure how big of an issue

28:35

it is if you have a developer that is reviewing

28:38

this stuff , or if you have a developer

28:40

that says write this

28:42

using this framework , because that's the framework

28:44

that I know that I like and the one that I'm using

28:47

on this project . Do you think this is a relevant

28:49

uh concern ?

28:51

so the concern here is , like ai is gonna

28:54

generate a lot of code for me , specify dependencies

28:56

, and this code or the dependencies

28:58

might have security vulnerabilities

29:01

, right ?

29:01

yeah , that's , that's what I'm , that's what I'm thinking um

29:04

, I

29:06

agree to some extent .

29:07

I agree when , when you use something

29:10

for code generation , um , like

29:12

really to build an mvp , like to from

29:15

a to z building like a minimal application , I

29:17

agree to it . Uh , when you compare this to

29:20

having a single

29:22

very experienced developer , building is . Yeah

29:26

that's interesting , Then I would . If the

29:28

outcome needs to be the most secure application

29:30

, I would put my

29:32

money on the very experienced developer

29:34

versus AI . If

29:39

it's , we're going to build this thing over the course of the next year with a team of 25

29:42

people and no one knows the full code base

29:44

and maybe , if we're

29:46

in the JavaScript world , all these things

29:48

that we pull in now are going to be fully outdated

29:50

at the end of the year . I'm not

29:52

sure if it makes a big difference security-wise . Maybe

29:55

right To be honest , Because

29:58

you can do a bit of , let's say

30:00

, patching of

30:02

security vulnerabilities by including some scans

30:04

in your CI et cetera , stuff like this , which

30:07

you can do , whether it's AI-generated or

30:10

person-generated , whatever . You

30:12

can have these safeguards in place . So

30:15

I'm not sure if you look at , because from

30:17

the moment that you're working as

30:19

a team on a big code base , there are very

30:21

few people that have a full view on it . Right

30:23

, I think so too . Depends very

30:25

much on the context , of course .

30:27

Yeah , I think that security could

30:29

be well . I was thinking a bit more as you

30:32

were also discussing , I think , security

30:34

maybe also, I don't know, if you have SQL injections or something

30:37

like that. Maybe it doesn't need to be

30:39

dependencies

30:41

, right ? But I guess that I think people say security a lot

30:43

because , like , you have something that is generated from

30:45

another place and there is security

30:47

is always a the big risk , right , like

30:50

no one's gonna say no to security

30:52

. But to be honest , like I don't see

30:54

, like in practice , I don't see how

30:56

using ai will

30:59

make it a bigger risk .

31:01

Well , I think , like the example of dependencies , I

31:03

think is a very good one . I think if

31:05

you use AI today, it will by default,

31:08

probably improving in the

31:10

future, but by default,

31:12

pull in outdated

31:14

dependencies. True, that's

31:17

what you see right like . So that is a very clear something

31:19

. That is not okay . Yeah , it should be better

31:22

. Um , but again , they're

31:24

like you . I think you can have safeguards

31:26

in place within your CI where you check a

31:28

bit on: are there any high-risk, outdated

31:30

dependencies?
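A minimal sketch of that kind of CI safeguard, assuming pip-audit is installed and dependencies are pinned in a requirements.txt (both assumptions, not from the episode); the gate fails the build on known-vulnerable dependencies regardless of whether a human or an AI added them:

# ci_dependency_check.py: minimal CI gate around pip-audit.
import subprocess
import sys

result = subprocess.run(
    ["pip-audit", "-r", "requirements.txt"],  # assumes this file exists in the repo
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    # pip-audit exits non-zero when it finds known vulnerabilities.
    print("Dependency audit failed; blocking the merge.", file=sys.stderr)
    sys.exit(1)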

31:31

but I also feel like the

31:34

risk... if I ask, uh, Chat

31:36

GPT to write a function for me and

31:38

it brings in an outdated dependency

31:40

, it's still up to me Like

31:42

I feel like the accountability still is on the developer that

31:44

accepts this code , right , and I feel like to say like

31:46

, oh yeah , but Well , but that's like you have

31:48

a lot of different , like it's a very

31:51

wide spectrum just from , like when you discuss

31:53

this , from the security risk of using AI .

31:55

Like you have people that go full-out

31:57

AI and "I only prompt", yeah, slash

31:59

Bolt.new. Or

32:02

you say, I use GitHub Copilot

32:04

a bit as a fancy autocomplete . Yeah , which

32:07

are two completely different things when discussing this

32:09

context , of course .

32:09

But I think , in this context , I'm looking at , like developer

32:12

productivity , right , so I'm

32:14

thinking of he's a developer

32:16

and he's being more productive because of AI . So

32:19

it's not

32:21

, um , I don't know . Like you're still expected

32:23

to code , right , like , like the idea is to

32:25

to assist you and be more productive rather than replacing

32:28

you . That's , that's a bit how I'm phrasing this . That's

32:30

how I'm looking at these things , because I fully agree with

32:32

you , right , maybe there are people that , uh

32:34

, they they just

32:36

want to prototype something quickly, or an MVP

32:39

, or maybe they don't have the technical skills to do something

32:41

in JavaScript , and maybe that's fine . But when I look

32:43

at this and I think of security for developer

32:45

productivity , I

32:48

don't know, I feel like that accountability is still with the developers, right

32:50

, the AI is supposed to make you more productive .

32:52

But that's what I would say today as well .

32:54

Yeah , that's what I would say , I agree with , and also for

32:56

this year at least right , because again thinking of

32:58

2025 .

33:00

But I think the simple fact is and you

33:02

can't really go around it like if you , instead

33:05

of writing every line yourself and being conscious

33:07

of every line in other words , even

33:10

though it takes a lot , much more time like you

33:12

are conscious of every line , versus

33:15

now these 300 lines get auto-generated and

33:17

you read through it , you skim through it , but you're less conscious

33:19

of each individual line . That's true . So there might

33:21

like , objectively , there is probably a

33:25

big risk versus a very experienced developer , but

33:27

that assumes that the developer is very experienced .

33:29

But I agree with it's a good point you're making . But

33:31

then I think for me the question is , from

33:33

skimming the code , how many security

33:36

vulnerabilities there are , like

33:38

you know , because I feel like , again

33:41

, like not just you scanning right , but there's also linters

33:43

, there's also CI , there's also this , there's also that

33:45

. So I feel like the risk is very okay

33:48

, maybe not negligible, but it's

33:50

not something that really concerns me. Not necessarily

33:53

concerns me, but I don't think that's something that takes

33:55

a lot of mental space for developers that are using gen

33:57

AI these days. If you use it as a fancy

33:59

autocomplete, then no, we are

34:02

okay

34:04

, but then , moving on , next thing , it says

34:06

observability . We need to shift

34:08

further left and I guess , to

34:11

be honest , I didn't really understand this . But uh , they

34:13

do mention that the gen-, the

34:15

AI-generated code, becomes a bit of

34:17

a black box . So

34:20

, yeah , you need to increase observability

34:23

. So I'm not sure if there's anything you want to

34:25

comment here , but and

34:27

what is the statement ?

34:28

How would you interpret this ?

34:31

They just say , like the

34:34

code now becomes a bit of black boxes

34:36

, because people don't fully understand what the code is doing

34:38

. It's just like a box that you plug it in . So

34:41

you have to increase observability in the developer

34:43

toolchain. But I'm not sure exactly what this means, to be

34:45

honest .

34:45

But yeah

34:48

, I'm not sure if this is what they're hinting towards

34:50

, but I do think a bit what I was saying

34:52

. I think , from the moment that

34:54

you're not writing every line anymore , you

34:56

can be less opinionated on how you want your code to

34:58

be structured , but you need

35:00

to be much more conscious

35:03

on how do I make sure that what is generated

35:05

also works and works

35:07

well . So that

35:10

means , uh , having the right testing

35:13

in place . That means , uh , having

35:15

the right uh observability

35:17

in place so that you actually have let's

35:20

say , you're building a web application and

35:22

you see some features are very slow that

35:24

you have in place , that you

35:27

can monitor with

35:29

application performance monitoring

35:31

, that you can monitor where is

35:33

the bottleneck and that you can quickly drill down to that

35:35

and then improve that function . Because you didn't write

35:37

that function and even though maybe if

35:39

you wrote that specific function , you

35:42

would have known function functional wise like this needs

35:45

to be fast .

35:45

Yeah , I see . So it's like . It's almost like now

35:48

we understand a bit the code , more based on the

35:50

observability metrics , because

35:53

we are less familiar

35:55

on the code itself .

35:56

Writing the code itself like yeah

35:59

, um , exactly

36:01

, and because of that , like you need to be like

36:03

logging , monitoring , um , early

36:05

validation , like it needs to be much earlier

36:08

on in the process , where normally , like

36:10

you , like , maybe not the best practice

36:12

, but what happens often is you

36:14

develop a proof concept it works

36:16

, okay , let's go for this . And only then

36:19

you start thinking of these things . Yeah , and

36:21

well now , and I'll just give an example of

36:23

something that I count on myself like you

36:25

get an error working with

36:27

bolt , like you get an error , it doesn't function , but

36:30

it's not clear what is happening . So you ask you

36:32

prompt to build in logging

36:35

, uh , at specific functions , so

36:37

you get a trace back , so you can reason a bit

36:39

with the model , like what is going wrong . But

36:41

this logging becomes much more important early on I

36:44

see what you're saying Versus when I'm writing the code myself

36:46

. Just to understand , like what

36:48

is going right or what is not going right ?

36:50

So basically, it's almost like, because

36:52

you didn't write it, you need to pay

36:54

closer attention to the metrics too.

36:56

You didn't write it,

37:01

but

37:02

you do want an overview of how it behaves. Basically, it's almost like

37:04

signal probes throughout the code that you need now, because you understand the

37:06

internals less, or maybe you're not as close. Yeah
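A minimal sketch of those "signal probes": wrapping functions you did not write yourself with logging, so you can see what they are doing and how long they take (illustrative Python; the decorated function is a hypothetical stand-in for AI-generated code):

import functools
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("probes")

def probed(fn):
    # Log duration and failures of a function we did not write ourselves.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        except Exception:
            log.exception("%s failed with args=%r kwargs=%r", fn.__name__, args, kwargs)
            raise
        finally:
            log.info("%s took %.1f ms", fn.__name__, (time.perf_counter() - start) * 1000)
    return wrapper

@probed
def generated_feature(x):
    # Hypothetical stand-in for an AI-generated function.
    return x * 2

generated_feature(21)  # logs: generated_feature took ... ms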

37:09

, it's a good point . I you know , I can see that , that

37:11

I kind of agree and I think maybe we should

37:13

um say

37:16

that again .

37:17

I think there are still a lot of people that are

37:19

highly , highly , highly skeptical about this

37:21

. Yeah , yeah like highly

37:23

skeptical. Like the threads on Hacker News. You see a lot

37:25

of discussion on this and I and

37:28

I agree with all points , but to me it's a bit like

37:30

, if you see the evolution

37:33

that we've made so far and how quickly it is going

37:35

, I think you , if you , you

37:37

, you should not ignore this . Yeah , I

37:40

think that is a that is a reality .

37:41

It's better to adopt this this way of thinking as

37:43

, and even if you don't use it yourself , but to be able

37:45

to do it right yeah , maybe I'll jump a

37:47

bit , uh , a bit ahead , because

37:50

I think there's something that , yeah , there's

37:52

a lot of what you just said . One of the points is that everyone

37:54

will need to upskill um

37:57

, which I think is kind of what you're saying , right , I think

37:59

the GenAI... uh, these

38:02

tools, they're here to

38:04

stay right and if you're not adopting them , you're

38:06

falling behind . So I do think teams

38:08

need to upskill . They also mentioned , like not just in

38:10

the um . I

38:13

think they mentioned not only on the the gen ai part

38:15

, but also like organizationally

38:18

, yeah

38:21

, but um , all

38:23

in all , I , I , yeah . This , I think , is very

38:26

like you said . A lot of people are still skeptical

38:28

. I think this is very commonplace , that you

38:30

need to adopt these things . It's like you're not going to

38:32

be coding on text editors

38:34

still right , they're tools and they're

38:37

here to help you exactly . So

38:39

, um , now going back a bit , bouncing back a bit up

38:42

again , the next thing they mention

38:44

is building at scale will be more complicated

38:46

. I think it's also we touched a bit upon

38:48

that like , maybe you can move very fast , but

38:51

because you can move so fast and so easy to add code

38:53

, that if you have something that is big

38:55

, maybe maybe

38:58

it's not gonna be maintainable , right ? Maybe because , yeah

39:01

, like , maybe you , you , you were too quick to accept

39:03

all the co-pilot suggestions , but

39:05

now you need to make a change and copilot cannot help

39:07

you anymore and you don't know what to

39:09

do , right ? So that's also something

39:11

we discussed a bit internally on our slack , right

39:13

?

39:13

like , uh , yeah , I think

39:15

there

39:18

is also like there are a lot of components to building at scale

39:20

. But I think if you because what

39:22

we've discussed so far is mostly building

39:24

MVPs right , I

39:27

think typically when you're in a large corporation , you have this code

39:29

base with 20 years of legacy , which

39:33

it will probably be a large code base

39:35

. I think that is also like it's still challenging

39:38

for most IDEs

39:41

to have a very , very large code base . In

39:43

the context , that's true . I think

39:45

with Cursor and with GitHub

39:48

Copilot , you still

39:50

need to specify , like this is

39:52

the scope of the code base that I want to have in the context . I

39:56

think , again , we will see improvements there Based

39:58

on your query . There will be an

40:01

intermediary step to determine

40:03

what should I keep in my context to answer this . So

40:05

we will see improvements there . So

40:09

that challenges the size of it . I think another

40:11

challenge is the requirements

40:13

of a specific

40:16

organization that has probably tons of regulation

40:18

and compliance measurements

40:21

is very specific

40:24

. Right Typically

40:27

involves a very long QA process to

40:29

get a feature to production and

40:32

I think this requires maybe it's

40:34

not a challenge of the tool per se

40:36

, because the nice thing with GenAI is that you can

40:38

inject this into context like what are

40:40

all these requirements ? that's like maybe it's easier than

40:42

the training a junior on all these requirements

40:45

, uh , but it's this

40:47

, this , uh , non-deterministic

40:49

approach will feel very like

40:52

. It will feel very risky to , probably to

40:54

people that will manage it, that are responsible

40:56

when it comes to these regulations and compliance matters . Yeah

40:59

, that is so , that is . I think it

41:01

requires a bit of a different

41:03

way to look at development .

41:06

I think so too . I think , um , yeah

41:09

, they also mentioned like in the end , like maybe the

41:11

, the reviews , have become a

41:13

bigger part of the development part the

41:15

cycle right there's also ai reviews

41:18

. Right , maybe you can actually have , but everything needs to

41:20

be taken with a bit . I mean the humans do be on the

41:22

driver seat , yeah , but

41:24

I do think that , uh , yeah , the

41:26

tools are going to be all around and even on the reviews

41:28

and even all these things well to

41:31

me .

41:31

I think the AI review that is mentioned there, like

41:33

the AI review, is also a very interesting one. Like

41:36

it's uh , because we all I think you

41:38

and me both have worked in a very corporate environment as

41:40

well . We know that

41:42

, uh, that, for example, PR

41:44

reviews are very often a formality. Yeah

41:47

right , and I think then , for in

41:49

those scenarios where pr reviews

41:52

are not taken seriously , an

41:54

AI review will add a lot of value, because

41:56

you can specify all these requirements in

41:58

the ai review I think also the .

42:01

I think sometimes the reviews become a

42:03

formality because it's it's

42:05

yeah , it almost becomes personal

42:07

, right like . But I think if

42:09

an ai saying that you're going to shit , then it's

42:11

not me right like , it's not like you should don't be mad at me

42:14

, you know . So I think there's

42:16

also a bit of that . I think even the CI test

42:18

and the LinkedIn , I think it also helps a bit with that . It's

42:20

a bit less , it's more neutral . That's

42:22

why it's easier to say , ah , it's best practices , it's

42:24

not what I want , it's what the best practices

42:26

are . Think

42:34

it could help . One thing I also saw on github that whenever there are issues , there was like a

42:36

bot that will search, and I don't know if it uses AI for sure, I don't know if it uses the

42:38

GenAI part, but it would also go through

42:40

the similar issues that exist and link

42:42

stuff, like maybe this one, almost like a RAG kind

42:44

of thing , yeah , and say , yeah , this is what the issue is

42:46

. These are two links that maybe are

42:48

the same , so maybe consider closing

42:50

this issue already so we don't have duplicates . Yeah

42:53

, um , so not only on the ai review , but also they

42:55

were adding this .
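A minimal sketch of that duplicate-issue idea, using simple text similarity from the Python standard library to suggest possibly related issues (illustrative only; the issue titles are made up, and this is not how GitHub's actual bot works):

from difflib import SequenceMatcher

# Hypothetical existing issues (number, title).
existing_issues = [
    (101, "App crashes when uploading a large CSV file"),
    (117, "Dark mode toggle does not persist after restart"),
]

def similar_issues(new_title, threshold=0.6):
    # Return (number, title, score) for issues whose titles look close to the new one.
    matches = []
    for number, title in existing_issues:
        score = SequenceMatcher(None, new_title.lower(), title.lower()).ratio()
        if score >= threshold:
            matches.append((number, title, round(score, 2)))
    return sorted(matches, key=lambda m: -m[2])

print(similar_issues("Crash when uploading large CSV files"))
# Suggests linking/closing as a possible duplicate of #101; the unrelated issue scores too low.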

42:56

So I thought it was very interesting , and it's also

42:58

and maybe another thing I think

43:00

is also interesting to see how to tackle

43:02

this , but probably something probably that will improve

43:04

it over time as well. It's that AI

43:07

code generation (I'll take

43:09

the example of Bolt) is a bit of an unguided missile

43:13

. Normally , if you say I want to build this feature

43:15

, it builds that feature. If

43:18

you prompt I want to build this , it interprets

43:20

it a bit not

43:23

exactly like that . I had an example where I

43:25

said I want to add

43:28

Google SSO authentication

43:30

to my username

43:32

, username password authentication mechanism

43:34

and Bolt

43:37

did it very flawlessly , but

43:39

there was a side effect, that it basically

43:41

did . Oh , I added this Google

43:44

SSO authentication but I removed all

43:46

the other forms of authentication because you just only

43:48

asked me to . So it's

43:50

a bit this . Sometimes

43:53

it's more than just the thing that you want to build . Yeah

43:55

, and that's maybe linking

43:57

back to the .

43:58

What is in the text here is that this review

44:00

cycle will become longer , even though

44:02

the development cycle will be much shorter true

44:04

, true yeah which yeah

44:06

, I , but I think , I , I do

44:09

think that it

44:12

will be the case sooner or later , and I also

44:14

even wonder if , when you're

44:16

hiring , if you should focus more on people

44:18

that understand , that can read code , understand

44:20

code, rather than write perfect code, because I think

44:22

that's where the job is going to shift a bit more

44:24

towards . The

44:28

next point here is teams will be organized differently , which I think alludes to the previous

44:30

points from Meta and Microsoft . That's

44:32

why I wanted to link this article here

44:34

, which yeah . I don't know if we need to discuss

44:36

more , because I think we discussed in the previous ones , but I

44:38

agree there will be changes Linked

44:42

a bit to the Meta . Junior

44:44

developers will be most vulnerable . I

44:46

think we also mentioned this a bit last week

44:48

how Bolt

44:50

is a very powerful tool , but I

44:52

also think it's very powerful especially to you , bart , because

44:54

you know where the pitfalls are and you know what things

44:57

is , you know what you want to do , you understand , you can reason about

44:59

things , but if you're more junior and it's like oh , this

45:02

works , like I wanted to go from A to

45:04

B and it goes from A to B , it doesn't matter that

45:06

as well . And also I think

45:19

they also mentioned here that , like computer science curriculum

45:22

includes a python class or two , but , uh

45:24

, probably someone that is a junior python

45:27

developer will not be as

45:30

knowledgeable as Claude , right

45:33

, so there's also that . Do

45:37

you have anything you want to comment on this ?

45:41

No , I think this is something that we've touched

45:43

on a few times . I

45:45

think that is a very fair point . And I think

45:49

there's also

45:51

a challenge for the

45:53

education system , right Like this needs to become

45:55

part of the curriculum .

45:57

Yeah , I think that is it . I think even : what

45:59

is cheating , right ? Like if someone is using ChatGPT to do

46:01

their coding assignments , is it

46:03

cheating ?

46:04

Well , the whole discussion , of course , but yeah

46:06

, it's not .

46:07

But yeah , I do think the tools are there , People

46:09

are going to use it . You need to adapt . Okay , everyone , we need to upskill . We already covered burnout will still threaten developers

46:22

. So I think this is more , yeah , talking about how people are expected

46:24

to be more productive , so burnout will still loom . Let's

46:26

say um , yeah

46:29

, I'm not sure if there's anything you want to add there , pressure

46:34

to automate everything will increase . I think

46:36

I think that's actually

46:38

. I think it's true . I think with ai and

46:40

people seeing the possibilities of automating things

46:43

with ai , people are going to start questioning

46:45

more like why are we doing all these things ? So I

46:47

do think there will be more attention to automating stuff , and I think we're still , let's be honest , a little bit early . Um , because

46:59

a lot of tools do not have the , the easy

47:02

access to integrations with gen AI

47:05

, but a lot of things I

47:07

mean . Just a stupid example

47:09

on our , on our payroll , we were doing

47:11

a like it was an ad hoc task , but like to fill

47:13

in a very , a very uh

47:16

manual excel with information

47:18

that came from various sources yeah

47:20

like from the moment that you have access

47:22

to these contexts uh

47:24

, like these , to these sources in the context

47:26

, and you can actually easily alter

47:29

your excel with an llm and

47:31

the performance is good enough , like these by

47:33

default become , yeah , a gen-AI-supported

47:35

task , and then instead of two

47:38

hours , it takes five minutes .

47:39

Yeah , I remember a bit Nico when

47:41

he was here , like how he

47:43

was also a bit not complaining

47:45

, but he was a bit . People push

47:47

the LLMs too much Like that

47:49

. Even simple things , people

47:59

use LLMs . But I also think it's a bit the power of it , right ? Like you kind

48:01

of have this , this tool that you can like automate excel . You could do this before by scripting

48:03

.

48:03

But now with the LLM it's super easy .

48:04

Yeah , it's super easy , you just say do it , right ? Um

48:06

, which I also think is very powerful , right , because you

48:08

don't , you don't need it to be like you're

48:10

not trying to replace people , necessarily , right , but

48:12

if you just say people just look like okay , okay , okay , okay , that's

48:14

much faster than having to type everything else , so I

48:17

agree . So AI's

48:19

wish list for 2025 . Just going through this

48:21

quickly Documentation

48:24

and code analysis . So basically they're saying they want

48:26

AI to help them more with documentation

48:28

and code analysis . Technical debt

48:30

cleanup , code testing

48:34

, easier provisioning of cloud infrastructure

48:36

, and I think that's it . Any of

48:38

these four things speak to you ?

48:42

Anything that gen AI doesn't tackle that you wish it did ?

48:44

To me as a wish list for 2025 , all

48:47

of these things are valuable to documentation

48:49

, code analysis , tech debt cleanup . What were the others

48:52

? Code testing , Code testing and easy

48:55

provision of cloud infrastructure . I think you

48:57

could have said this for the last 10 years .

48:59

Yeah , but I don't know , the cloud infrastructure one doesn't really

49:01

resonate with me . I feel like the other

49:05

stuff I think yeah , but I also feel like technical debt cleanup

49:07

. I don't think it's really going to happen .

49:10

No , but I mean these are very generic things and of course they are important , and they may be even more important with gen AI , but to me these are very good

49:18

feeling , very logical things to say in a wish list

49:20

so I mean I agree with them .

49:21

But yeah , that's true . Okay , then

49:24

we can move on . We can move on .

49:26

I think maybe the easier cloud infrastructure , you said you didn't agree . Um , I do agree with that , like easier provisioning of cloud infrastructure . Why don't you agree

49:35

? You think it's easy enough ?

49:37

I just feel like if it's too easy , you're

49:39

gonna miss other stuff , right ?

49:41

like if you make it too easy , there are gonna be

49:43

side effects underneath that you're gonna miss . Like I

49:45

feel like , it's a bit like , it is hard because

49:47

it is hard , right ? Like you need to know what policies people have . You need to know how this policy will impact this . You need to know that you

49:54

cannot simplify it . Someone needs to know these things .

49:59

Yeah , I think you cannot simplify it when you

50:01

want to have a very generic cloud

50:03

environment where you want to be able to do everything .

50:07

Like a big AWS or something you mean .

50:10

Yeah , that's why people go for the big cloud providers , but

50:12

for a lot of , let's say , more

50:14

smaller , specific solutions , you

50:16

go to Fly.io , you go to Render , you

50:19

go to like these type of things right like you

50:21

say this is good enough , and it actually

50:23

takes care of everything that I do yeah typically

50:25

for large corporations that have their own um

50:28

data lake , for example

50:30

. Like this wouldn't be enough , right , like you don't have enough

50:32

controls , but like for for

50:35

smaller scale applications .

50:36

That is often definitely

50:38

good enough yeah , but

50:40

then I think gen AI is not gonna play a role there , right ? Like

50:42

Fly.io , as far as I know , is not specific to gen AI .

50:44

Yeah , but that's what I'm saying , gen AI ,

50:46

like it's not gonna help with these things but

50:49

uh , the other , the other three

50:51

, yeah , maybe

50:54

. Well , code testing I think is already here . Technical

50:57

debt cleanup , I don't think it's happening . I don't think it will happen

50:59

with AI . It will , uh , create

51:01

a lot of technical debt . I feel like you would add more , right ,

51:04

because you don't know what's happening exactly and the documentation

51:06

and code analysis , I agree , but , uh , I also think it needs

51:08

to be guided . And that is it . That

51:11

is it for this . Um , do you

51:13

want to change gears a bit ?

51:14

Bart ? Uh , depends on

51:16

to what gear you want to shift . You

51:20

have a preference ? I'm typically all for changing

51:22

gears with you , but you're always surprising me with

51:24

what gear you're gonna go to .

51:25

Um do you want to talk

51:27

about tech a bit ?

51:29

uh , I thought we were talking about tech no , like

51:31

uh , stuff from the tech corner .

51:33

Okay , yeah , go go . All right , so

51:35

maybe what do you got ? Maybe we'll start with our

51:37

very own valatka you

51:40

mean lucas ? Yes , well

51:42

, lucas valatka mr valatka yes

51:45

, that's .

51:45

It's the guy um sir

51:47

valatka sir valatka , yeah , it takes

51:50

a bit more , you need to be knighted to be a sir . Right

51:52

, that's true , you do but you are . No

51:54

, you are lord right

51:57

, I don't think officially , but I did get

51:59

a , get a .

52:01

No , you got a certificate , certificate yeah

52:03

yeah , as a birthday gift you

52:05

need more efficient than that , so maybe

52:07

the best . What's the backstory ? But maybe just quickly .

52:09

Oh , it was for a birthday . You can get , you can gift

52:11

like these , uh , I don't know what it is , like a square of Scotland or something , and then , because it's part of some heritage land , you become a lord

52:23

. Yeah , don't think it has much actual

52:25

legal value , but it's a nice

52:27

story , so everyone but

52:30

, if you run into Bart , everyone needs to say Lord

52:32

Bart . Let's go to the blog of soon-to-be-Sir

52:36

Valatka . Okay , UV

52:39

has a killer feature you should know about . Yeah

52:41

, and I didn't know about it .

52:43

You didn't know about it . Yeah , I thought oh , this is going to be . I

52:45

think you didn't know . I thought we talked about

52:48

it as well , Specifically

52:50

with specifying the Python environment Maybe

52:53

not the Python environment , but

52:56

maybe explain . Uv is

52:58

Python-aligned

53:01

package manager , let's say so

53:03

. UV can be compared with Poetry , PDM

53:06

, Hatch , etc . So

53:09

that's UV . One

53:11

thing that UV has , and it's also a PEP , is

53:13

that if you have a script , you

53:16

can actually add dependencies

53:18

to that script , right

53:20

? So you don't have a whole Python project , you just

53:22

have a script . You just define the dependencies as a comment

53:24

on the top and you can run it with UV .

53:28

But that's already a bit further than this

53:31

, right , like . It's one step further , like what you're describing

53:33

is , there is a PEP out there

53:35

that says if you have this py

53:37

file , this script , at

53:40

the top you can add some comments where you can specify

53:42

this script needs these dependencies , this

53:44

Python version . And then when you run

53:46

it with UV , uv implemented this PEP

53:48

, it

53:52

will actually set up this environment for you on the fly

53:54

and run the script with the dependencies .
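(For readers following along : a minimal sketch of the kind of script being described here, using the inline script metadata comment block from the PEP that UV implements. The file name and the pandas dependency are just illustrative, not something from the episode.)

    # example.py -- a hypothetical script; run it with: uv run example.py
    # /// script
    # requires-python = ">=3.12"
    # dependencies = [
    #     "pandas",
    # ]
    # ///

    import pandas as pd

    # UV reads the comment block above, builds a temporary environment with
    # Python 3.12 and pandas on the fly, and runs this script inside it.
    print(pd.DataFrame({"a": [1, 2, 3]}).sum())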

53:58

Yes , exactly

54:00

. And not only that , not only the dependencies , but also the Python version . And the Python version , exactly . So actually

54:02

, that's also what lucas is alluding to here . So if you're doing some ad hoc

54:04

scripting , let's say Python 3.12 , and you

54:06

want one python and this and this , and you want to pull a dependency

54:08

, and yeah , so , um , typically

54:12

what you would have to do is to pip install pandas

54:14

maybe . Uh , ideally

54:17

, you create a virtual environment , you need to activate virtual environment

54:19

, install pandas and then run python . Uh

54:22

, but yeah , if you have to install the

54:24

new python version , then it goes another

54:26

step . Right , you have to pyenv install 3.12 ,

54:29

make sure that you're using 3.12

54:31

. Create a virtual environment using that python , activate

54:33

the virtual environment , install pandas

54:35

and just run python . But with uv

54:37

and I think it kind of goes along the same lines

54:39

of what was happening before um

54:42

, uv

54:44

can create that virtual environment with the python

54:46

and dependencies on the fly for you , right

54:48

? So this is a bit different from what we're

54:51

saying before , like you're not necessarily running a script .

54:55

Yeah , so this is because we're showing a screen . So

54:57

what Lucas is showing here is you can type uv

55:00

, run , and then you can do dash

55:02

dash , python . So you specify what Python version

55:04

you want and he's saying 3.12 , so he specifies

55:06

Python , dash dash with . So he specifies

55:08

dependencies : with pandas , I want to run Python . Exactly . And then the REPL starts . So with one

55:15

line in the cli , you get basically an environment that

55:17

you specified .
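(For reference, the one-liner being described looks roughly like uv run --python 3.12 --with pandas python : UV resolves the requested Python version and the pandas dependency, builds a temporary environment, and drops you into the REPL. Check Lucas's post or the UV docs for the exact flags.)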

55:19

Yeah , I think it used to be like six commands

55:21

that you have to do and two tools . Well

55:23

, one tool probably most important

55:25

.

55:25

What he's doing now is like starting a

55:27

repl with a specific uh environment

55:30

, probably most relevant

55:32

when you're doing some ad hoc analysis

55:35

of something right , yeah , I just I

55:37

want to quickly look into this , this CSV

55:39

or whatever in this , in this example of the pandas

55:41

. Yeah , um , because

55:43

of this is , if this is something that you would repeat , you

55:45

would probably add this , like we were discussing , as a comment to the script , or maybe even start a project . Or start a project , because I know how you feel about the scripting .

55:57

But yeah , indeed , and also , yeah , uv

55:59

will use 3.12 if you have installed . If you don't

56:01

have it , it will install it for you . Same thing with pandas

56:03

and that Python version , so it kind

56:06

of takes care of all these things . Indeed

56:08

, as he mentions here , easier to remember

56:10

and no trace left behind happy scripting

56:12

. So , yeah , also shout

56:14

out to lucas , one of our colleagues here at data roots

56:16

, and if you want to have a look at his other posts , we'll also include

56:18

this blog post there and his blog post is trending

56:20

on Hacker News . Yes , yes , that's

56:24

fancy right .

56:24

Yeah , it is right . He's like he's probably

56:26

now going around to his friends . Yeah , I'm trending on Hacker News

56:29

. It's like , yeah , what you do I'm a .

56:30

I'm a blogger .

56:31

Part-time machine learning engineer you know

56:33

, when I'm not trending on

56:35

Hacker News . Yeah , we need to check his LinkedIn after , maybe he

56:37

already changed it , right trending

56:45

hacker news .

56:45

Uh , contributor , exactly yeah , and uh yeah , when I'm not , when

56:47

I'm not trending , I just yeah . I help companies

56:49

. Yeah , um , cool , so maybe on the same lines

56:52

as scripting

56:55

. I think I talked to you about this

56:57

sometime , marimo , I

57:01

don't know . To be honest , marimo is

57:03

oh yeah yeah , it's , it's

57:06

not . I mean it's , it's comparable

57:08

ish to Jupyter notebooks . So

57:11

in a way you can also say , yeah , it's also for scripting

57:14

, also for exploration . Um , I

57:16

tried it a while ago . It's

57:18

interesting . But

57:20

basically , in a nutshell , tldr

57:23

is like they try to reimagine what notebooks

57:25

could be . So notebooks we

57:27

mentioned the repl before which basically

57:29

you just have one line , you write Python

57:31

code and then you have the output . Jupyter notebooks are very similar , but you can save that ,

57:35

which is basically just a JSON , marimo

57:39

, it is a bit like that . It

57:41

is way newer than Jupyter

57:46

notebooks , but they are reactive

57:48

by nature . So

57:50

actually , well , you can turn it off . But for example

57:52

, like on the gif that I have here

57:54

, if you have x is

57:57

equal to 2 and then you actually

57:59

change the value of x wherever

58:01

x appears , regardless

58:03

of the order of cells . So

58:06

it just kind of keeps track of the references . It

58:08

will update the values above as

58:10

well . So it basically , quote unquote , makes sure that there's no , I don't even know if there is a term for it in Python , but like variable shadowing

58:22

kind of thing , right . So

58:24

whatever you see on the notebook is actually what is there

58:26

. It's like it's almost like an application now , because if you

58:28

change that value , everything else will get trickled

58:31

down yeah , and if you compare this to

58:33

, uh , traditional , well , jupyter notebooks

58:35

, it's very much .

58:36

You run this and you see the output

58:38

, and the output stays consistent

58:40

. So you have the danger , if

58:42

you don't execute all your cells

58:45

in a linear way , that

58:47

you create side effects by not respecting

58:49

the exactly , the order , exactly and

58:51

here you're saying um , okay

58:53

, I don't necessarily need to do this in a linear

58:55

way , because if I change a variable somewhere , everything

58:59

that depends on that will update . Yes , which

59:01

is well very

59:03

similar to if you are used to working with

59:05

frontend frameworks like Svelte , React ,

59:08

like yeah .

59:08

Exactly so . That's kind of what the premise

59:11

is . So it's actually like the notebook is really an

59:13

app . If you look at the file

59:15

underneath , it's just a Python script

59:17

that every cell is

59:19

actually a function .

59:20

Okay interesting .

59:22

And then it returns the outputs . All the variables are there

59:24

, so it kind of keeps track of everything . So if you're

59:26

actually committing and having

59:28

a pull request in the end , it's like Python scripts

59:30

, right .
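(A rough sketch of what a saved marimo notebook looks like on disk, based on the description here : each cell is a function, its parameters are the variables it reads and its return values are the variables it defines, which is what lets marimo track dependencies and re-run cells reactively. The cell contents below are made up, and details of the generated file may differ between marimo versions.)

    import marimo

    app = marimo.App()

    @app.cell
    def _():
        x = 2              # change x and every cell that uses it re-runs
        return (x,)

    @app.cell
    def _(x):
        y = x * 10         # depends on x, so marimo re-executes this cell
        print(y)
        return (y,)

    if __name__ == "__main__":
        app.run()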

59:33

This reactivity doesn't need to be automatic ; you can also just have cells marked as stale

59:37

. The downside a bit is like if you're

59:39

doing compute intensive things , you change one thing . Now

59:41

we have to wait all the other things . So it's a bit of a

59:43

yeah , uh . But they also have the

59:45

, the marimo , so that's the , the jupyter

59:48

notebook competitor , let's say . They

59:50

also come with some um , with

59:52

widgets , so you

59:54

can have sliders if you want to change stuff . They

59:56

come with the graphs and all these things . So if you want

59:58

to , they also have a way . You have the classic

1:00:01

quote-unquote like this , like notebook view , but

1:00:03

you can also turn this into a powerpoint presentation

1:00:05

, not powerpoint , like slide presentation or

1:00:07

like an app , so every cell becomes like a bit

1:00:09

of a tile that you can add to your

1:00:11

application . So it's a

1:00:14

interesting there . There are some things that

1:00:16

, yeah , in the end I just kind of went back to jupyter

1:00:18

notebooks , to be honest . So I tried it . I was like , yeah , this is interesting

1:00:20

. But I

1:00:22

think , if you're doing a report , if you're having like a

1:00:24

little app , I think this could work well . But

1:00:26

, um , yeah , another thing

1:00:29

that is interesting why not go for this

1:00:31

? Um , I think it was like

1:00:33

let me remember , because

1:00:35

one thing like async doesn't work , and

1:00:37

I was trying to write some stuff async . So I

1:00:39

was doing stuff on the notebook and I always had to go

1:00:41

back to a script to run the last part . I

1:00:44

also think sometimes rerunning all the cells

1:00:46

sometimes it gets a bit in the way . So

1:00:49

I was really just looking

1:00:51

at some results , right

1:00:54

. Okay . Right .

1:00:56

So it wasn't something like , uh , if I was building an app or a report that I really wanted to show

1:00:58

something interactive , I think this would be very , very nice

1:01:00

well , I would be a bit uh cautious

1:01:02

about adopting something like this . Like ,

1:01:05

Jupyter is very much established , right , you see it everywhere

1:01:07

, everybody knows it , and like

1:01:09

, if it would be compatible , what it

1:01:11

generates is , uh , ipynb notebook

1:01:13

files , but it does not , right ? No , like , if

1:01:15

it generated ipynb notebook files , then you could

1:01:18

, you can test it out and

1:01:20

still be able to migrate back to jupyter . Yeah

1:01:22

, here that is harder right ?

1:01:23

yeah , you kind of put it ah , yeah , another thing

1:01:26

too , that , uh , I was just looking here

1:01:28

and I remembered they do

1:01:30

have a um

1:01:33

, uh vs code like extension

1:01:35

, but it didn't work really well . So I think the

1:01:37

one thing that I really missed is like , yeah , now I have the

1:01:39

IDE with the AI assistant

1:01:41

, but if you have the web browser , you kind of

1:01:43

lose all that . That

1:01:45

was a big hit as well . So there

1:01:47

are some things like that that they're still working on . But I

1:01:50

thought the premise and the exercise of reimagining notebooks

1:01:53

, I thought it was very , very

1:01:56

valid , right . One

1:01:59

other thing that I thought is maybe a side like

1:02:02

a fun bit you

1:02:04

can also , every notebook , if you open it in the

1:02:06

browser , everything comes with a token , right

1:02:08

. So it's a bit more secure , let's say , but you can

1:02:10

also expose it and you can

1:02:12

actually run with WebAssembly .

1:02:15

Ah , yeah , okay .

1:02:16

So people could just , with the link and all these things they can actually

1:02:18

go and run in your browser . So I thought that was a nice

1:02:20

, nice , fun touch . All

1:02:23

righty , I

1:02:26

see also , indeed , we spent a lot of time on the previous

1:02:29

thing , my bad maybe . Do

1:02:33

you want to change gears again , bart ? Um

1:02:35

well , maybe you're

1:02:39

very sketchy , you want to doom

1:02:41

this ?

1:02:45

big leap . Uh , yes , let's doom

1:02:47

this um I'm

1:02:50

just looking here .

1:02:51

What else can we cover before

1:02:54

we call it a pod ? What

1:02:58

is this Doom thing ? I saw it before , but I'll let

1:03:00

you explain . I thought it was pretty cool do

1:03:06

you know Doom ? I know Doom

1:03:08

. You should play Doom . Where

1:03:11

do you want to play this , bart ? But

1:03:13

have you played it ? I've played a bit . But

1:03:17

also last week we talked about the

1:03:19

gallery .

1:03:21

Ah , yeah , that's true . Yeah , yeah , yeah , that's true , that was

1:03:23

fancy , right , that was very fancy

1:03:25

. What was it called ? Again , I forgot the name . It

1:03:27

was basically Doom , but instead of shooting

1:03:29

monsters , you were walking around in the

1:03:31

gallery , drinking wine and collecting cheese

1:03:34

, looking at art , it's very nice , but

1:03:36

I think you're too young to have played actually

1:03:38

played doom when it came out right .

1:03:40

Well , I think I played it already as a I

1:03:42

say a retro , thing .

1:03:43

Okay , okay yeah , this is the way

1:03:45

to call me old , but

1:03:48

what is now there ? and I think it's actually a

1:03:51

reaction to something that came out earlier

1:03:53

, a

1:03:56

bit earlier we saw Tetris in a PDF , but

1:03:58

what I'm going to put on the screen now

1:04:00

is Doom running

1:04:02

in a PDF which is mind-blowing

1:04:06

. It's bananas , it's bananas

1:04:09

, it's very cool . So

1:04:11

you just opened a PDF link . I think you need a Chromium-based

1:04:13

browser to do it . Okay , because

1:04:17

it uses the PDFium engine and

1:04:19

apparently the PDF engine of most

1:04:22

browsers not necessarily

1:04:24

all PDF readers , but

1:04:26

the PDF readers of most browsers . They

1:04:29

do support a very

1:04:31

, very limited set of JavaScript . Wow

1:04:33

, and

1:04:35

he leveraged that

1:04:38

in order to basically

1:04:40

get a text-based Doom running

1:04:43

in a PDF engine , which is crazy

1:04:45

, and he renders the images

1:04:48

, the world , basically by

1:04:50

converting the graphics

1:04:52

line by line to ASCII characters

1:04:55

. It's crazy . It's crazy and

1:04:57

uh , I think but I'm not

1:04:59

100 sure I mean you need to . It

1:05:01

really depends on like this , security

1:05:04

wise , would not be able to function without user

1:05:06

interaction . So that's , I think that's also the

1:05:08

reason why there are like explicit buttons . You need

1:05:10

to interact for it to be

1:05:13

able to do something . I see , I

1:05:15

think the JavaScript engine that is in a PDF is very limited in what

1:05:19

it can do . You need to have user

1:05:21

interaction , a bit like in a

1:05:24

browser , like audio doesn't just start

1:05:26

playing without user interaction . Like there

1:05:28

are a lot of safeguards in place

1:05:30

and with pdf it's probably much , much , much trickier

1:05:32

. So it's really cool to see that it

1:05:34

can even run . Doom has

1:05:36

run on a lot of places , even

1:05:39

on a pregnancy test

1:05:41

, but apparently it can also run in the PDF

1:05:43

.

1:05:44

This is cool , right ? I think some people were

1:05:46

asking , like why ? But I also think he's like , just

1:05:54

try to push it a bit . You know , yeah , why not , right ? I mean , why not ? And it's like maybe someone will look

1:05:56

at it . Oh yeah , maybe this is useless , but I have this idea , which is actually nice , you know . So

1:05:59

, yeah , I thought it's pretty cool

1:06:01

. Have you played it ? Actually on the

1:06:03

PDF ? I played a bit . On the

1:06:05

pregnancy test .

1:06:06

No , but the pdf

1:06:08

. I didn't have the pregnancy test with me , but

1:06:12

, uh , the PDF is very easy

1:06:14

to play . It's just pushing the buttons .

1:06:16

Yeah , it's nice , it's cool . Makes

1:06:18

you think as well , like how the

1:06:21

compute power , how it advances

1:06:23

right ? Before , you needed something else . Now it's just in the PDF , it's

1:06:25

just in the browser , it's just there . It's there

1:06:27

. Okay , maybe last thing

1:06:30

to close it off , if that's okay , unless

1:06:35

there's something you want to cover . Yeah , let's go . And I saw this . It's

1:06:38

a bit of a food for thought corner

1:06:40

. This is a

1:06:43

meme from

1:06:45

yeah , it's actually I don't know . It's

1:06:48

a picture of a news . You want to describe what

1:06:50

you see there , bart for the people that are just listening .

1:06:52

It's a picture of a news article

1:06:54

like in a in a physical newspaper

1:06:56

, a picture of a

1:06:58

woman and an X post which says , there's a quote :

1:07:00

I want ai to do my laundry and dishes so

1:07:02

that I can do art and writing , not

1:07:04

for ai to do my art and writing so that I can

1:07:06

do my laundry and dishes yes

1:07:09

, so , the full quote :

1:07:10

there is a tweet where she says , yeah , the issue

1:07:12

of ai is direction , because

1:07:15

now they're saying that it's AI

1:07:18

is shifting more towards the creative arts . But

1:07:20

that's the thing that quote unquote gives

1:07:23

people more pleasure and

1:07:25

AI should be optimized to allow

1:07:28

people to do what they want and to know um

1:07:31

, yeah , enjoy

1:07:33

life in a way like automate the boring stuff

1:07:35

, let me do the stuff that I like , but

1:07:37

that's not what we are seeing , according

1:07:39

, well , what I don't really according to

1:07:41

her , um , maybe

1:07:44

.

1:07:44

Why don't you agree , maybe , um

1:07:46

so

1:07:49

I think it's true that ai is

1:07:51

like generative AI , specifically , is very

1:07:54

active in the art

1:07:58

scene . Yeah , if you call this

1:08:00

maybe already a sensitive statement

1:08:02

, because a lot of people say that's not art , but in

1:08:04

the creative scene let's maybe put it like that in the creative

1:08:07

scene yeah but

1:08:10

it doesn't stop you from painting , right ? True

1:08:13

it's you choose whether or not you want to use

1:08:15

that as a tool .

1:08:16

It doesn't inhibit you from doing anything

1:08:18

. Yeah , yeah , yeah , I see what you're saying , in the creative space , yeah

1:08:21

that's true .

1:08:22

Maybe as a professional it does . Maybe as a professional

1:08:24

you're forced to pick up these

1:08:26

tools , but as a individual , as a hobby

1:08:28

that's how she's describing it you

1:08:30

choose what you do , that's , and I think

1:08:32

, when it comes to doing the laundry and dishes , I

1:08:36

think we're getting closer

1:08:38

to that as well .

1:08:40

Yeah .

1:08:40

And I think we've and

1:08:42

actually maybe something that we need to discuss , but I think

1:08:44

there have been a lot of hype

1:08:47

is maybe overstating it , but a lot more

1:08:50

news on robots in AI

1:08:52

, and I think the interesting thing

1:08:54

of Gen AI , when you

1:08:56

look at the field of more autonomous

1:08:58

things that also can do something in the real

1:09:00

world , is that the

1:09:06

way that you need to program these things is much less deterministic

1:09:08

like it's much less rule-based , and

1:09:11

it's very hard to do things rule-based because if I develop

1:09:13

a robot that does the dishes in your

1:09:16

house , I don't know how your house is going

1:09:18

to look like . So I'm going to try to imagine something and

1:09:20

I'm going to build tons of rules . And

1:09:24

if you have something with an LLM layer in between , you

1:09:26

can be a bit more descriptive on

1:09:29

what you want to get done versus specifying

1:09:31

all the rules together . So I

1:09:33

do think we will see advances

1:09:35

there as well .

1:09:38

Which I think is interesting because before the Gen AI boom

1:09:40

, I feel like people will look at reinforcement

1:09:42

learning for these things right . I

1:09:44

think even you mentioned

1:09:46

as well that there was a little reinforcement

1:09:48

learning project , that you replaced the

1:09:50

thing with LLM right and

1:09:52

yeah it worked . It worked , I mean , it was easier

1:09:55

right , to get to a good point

1:09:57

.

1:09:58

So and it's probably not one or the

1:10:00

other right I think it's gonna be a combination . I think so

1:10:02

but what I'm trying to say is

1:10:04

like , like the whole uh

1:10:06

evolution that we're seeing also brings us closer to

1:10:08

getting it to do our dishes yeah

1:10:10

, yeah , yeah , yeah , no , uh , true , I

1:10:13

agree , I agree .

1:10:14

One other thing I was thinking when I was looking at that

1:10:16

um , again

1:10:18

a bit food for thought . The jetsons

1:10:20

, you know , the jetson part the

1:10:23

animated series .

1:10:24

Yes , yeah , of course .

1:10:25

So basically , uh , do you know jetsons , alex

1:10:27

? Okay , never mind . Uh , so

1:10:30

basically like a family from the future , right

1:10:32

? And one thing that I always thought it was interesting

1:10:35

is that only

1:10:37

the husband worked . Well , this is an old show , but

1:10:39

maybe I don't know , but his

1:10:41

work week was an hour

1:10:44

a day , two days a week , and

1:10:46

the hypothesis was that in

1:10:48

the future , machines will

1:10:50

automate everything and all the stuff

1:10:52

will be so efficient that people don't need to work as much , right

1:10:55

, and so people can actually spend more time doing the things

1:10:57

that they like . So that's why I thought it linked a bit to the

1:10:59

previous one .

1:11:00

But , um , yeah , like

1:11:02

that's not

1:11:05

how we see things evolving , necessarily

1:11:07

I also don't even know if that's something that people really

1:11:09

I think that's an optimistic outlook .

1:11:11

Let's , let's stay optimistic , right yeah

1:11:13

, I think so too much doom and gloom already but

1:11:16

I also think that I

1:11:19

don't know like you know , like ikigai , you know

1:11:21

what ikigai is the

1:11:25

way to look at work the japanese yeah , it's not

1:11:27

necessarily the way to look at work , but like it's what ? like

1:11:30

they say , yeah , something like that

1:11:32

is like , uh , like um , the

1:11:35

one of the ideas for having a

1:11:37

, a happy , long life is to

1:11:39

find purpose , and a big part of it is also in

1:11:41

your work , right ? So , yeah , that's why

1:11:43

they say that finding a work that you find purposeful

1:11:45

and something like this will lead to a better quality

1:11:48

of life and a longer life . So I also feel

1:11:50

like I

1:11:52

do think that maybe the , the ideally the

1:11:54

work would shift to something that is more enjoyable , but I also

1:11:56

feel like not working is not the answer

1:11:59

, because I also feel like we need a

1:12:02

purpose and we need , like we need to

1:12:04

feel like we're contributing somehow and not just

1:12:06

existing .

1:12:09

Yeah , yeah , I see what you mean . Yeah , but I fully agree

1:12:11

with that .

1:12:11

Yeah , yeah , I think yeah , so I was thinking

1:12:13

like when I looked at the Jetsons .

1:12:15

It's like one and a half workday a week , I think is maybe

1:12:17

should be more like the ultimate , like , let's

1:12:19

say , the optimistic way of looking at all this is that

1:12:21

, uh , people

1:12:24

in general , and not just the chosen few

1:12:26

, like the wealth , will increase and people

1:12:28

become more wealthy and more at ease in life , right

1:12:30

yeah , I think that's the

1:12:32

, that's the optimistic view , for sure and if

1:12:34

, even if you cannot work for whatever reason

1:12:36

, that you're not poor ?

1:12:38

yeah , you have a good life . Yeah , you have a good life .

1:12:40

You don't have that . The optimistic

1:12:42

way of looking and thinking that machines will make

1:12:44

everything efficient , automated

1:12:46

and less manual labor is

1:12:48

required . And do you think AI , in

1:12:50

three years , will take

1:12:53

a step towards that direction ? I

1:12:59

think within three years we will see the effects of what

1:13:01

is going on now in terms of the

1:13:05

effects in the job market , that some stuff

1:13:07

will become automated and I think how

1:13:09

the world reacts to that is the

1:13:11

big question mark .

1:13:13

Indeed , we'll see . Indeed

1:13:15

, we'll see , we'll see , and with that

1:13:17

I think we can call it a pod . Unless there's something you want to plug

1:13:19

, Bart ? No . And that's it , I think that's

1:13:21

it for today . Thanks y'all . Thank you . You

1:13:26

have taste in a way

1:13:28

that's meaningful to software people hello

1:13:31

, I'm bill gates to

1:13:34

sell to people .

1:13:36

Hello , I'm Bill Gates . I would recommend TypeScript

1:13:38

. Yeah , it writes a lot

1:13:40

of code for me and usually it's slightly

1:13:42

wrong . I'm reminded , incidentally

1:13:44

, of Rust here Rust , rust .

1:13:48

This almost makes me happy that I didn't

1:13:50

become a supermodel .

1:13:52

Kubernetes .

1:13:54

Well , I'm sorry , guys , I don't

1:13:56

know what's going on .

1:13:58

Thank you for the opportunity to speak to you today about

1:14:00

large neural networks . It's really an honor to be

1:14:02

here .

1:14:03

Rust , rust , rust , rust . Data Topics . Welcome to the Data

1:14:05

Topics . Welcome to the Data Topics Podcast

1:14:07

.

1:14:13

Are you Alex ?
