405: DeepSeek: The Open-Source AI Challenger to ChatGPT?

Released Monday, 3rd February 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

This week we are going to

0:02

discuss how open source crashed the

0:04

stock market. That's a big deal.

0:06

Good job. A lot of big

0:08

tech companies are feeling really shameful

0:10

right now. We're going to get

0:13

into it later in the show.

0:15

Welcome to Destination Linux where we

0:17

discuss the latest news, hot topics,

0:20

gaming, mobile and all things open

0:22

source and Linux. And we discuss

0:24

all of those things if we

0:27

have time to finally get to

0:29

all the different stories that we

0:31

have. We're doing pretty good this

0:33

week. This past couple weeks. Yeah,

0:36

we did. We did good one

0:38

week, Michael. Two weeks, I think

0:40

two weeks in a row. Two weeks

0:42

in a row. Oh wow, look at

0:45

that. We got this. Let's go for

0:47

three. Let's go for three. Three.

0:49

And with me today are

0:51

the floppy disks of podcasting

0:53

Jill and Michael. Are you

0:55

calling us obsolete? Ooh, Rah!

0:57

Do you see that? I'm

0:59

unhackable. She's like, ooh, rah.

1:02

Floppiest. I love it. I

1:04

knew Jill was saying that

1:07

it's a compliment. You know,

1:09

absolutely. Of course, of course

1:11

she went. We're also going

1:13

to be discussing Apple

1:15

being forced to open

1:17

its magical gates to

1:19

its garden. That's really

1:22

exciting there. Now let's get

1:24

this show on the road toward

1:26

Destination Linux. Our

1:28

feedback this week comes from

1:31

Allen. They have this to

1:33

say. I work in a

1:35

public defender's office and I

1:37

am of course the computer

1:40

systems administrator.

1:43

Administrations. Is that

1:45

an official position, Michael,

1:47

an administrator? I

1:49

think it's pronounced administrator. I

1:51

thought it was a minute.

1:54

So it's a combination of administration

1:56

and '-er', so administrator.

1:58

I'm pretty sure. Right. And

2:00

let us know by going

2:02

to destinationlinux.net/feedback and/or comments

2:05

that's it. Oh good job

2:07

Michael. I'm messing up all sorts

2:09

of stuff. Anyway he goes on

2:11

to say I love your show

2:13

and love Linux. I am setting

2:15

up Ollama and Open Web

2:17

UI for an internal AI for

2:20

our attorneys to use with the

2:22

intention of the AI being arguing

2:24

the prosecution's case so our attorneys

2:26

can be better prepared.
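For anyone curious to try a setup like Allen's, here is a minimal sketch of querying a local Ollama server from Python. It assumes Ollama is running on its default port with a model already pulled; the model tag and case summary are illustrative placeholders, not anything Allen specified.

```python
# Minimal sketch: ask a local Ollama server (the kind of setup Allen
# describes) to argue the opposing side of a case summary.
# Assumes Ollama is serving its documented REST API on the default
# port 11434 and that a model such as "llama3" (hypothetical tag here)
# has already been pulled with `ollama pull`.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

case_summary = "Defendant is charged with ...; the key evidence is ..."  # placeholder

payload = {
    "model": "llama3",  # use whatever tag you actually pulled
    "prompt": (
        "Act as opposing counsel. Given this case summary, outline the "
        "strongest arguments the prosecution is likely to make:\n\n"
        + case_summary
    ),
    "stream": False,  # return one JSON object instead of a token stream
}

resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])  # the model's full reply
```

Open WebUI then sits in front of the same local server as a browser chat interface, so the attorneys never have to touch code like this.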

2:28

My first computer was back in the 1980s

2:30

and it was an Epson

2:32

8088 with an aftermarket 20

2:34

megabyte hard drive. That's right.

2:36

The insanely large 20 megabyte

2:38

hard drive that sounded like

2:40

a snare drum and a

2:42

CGA monitor. Yeah, he says I am

2:44

a little bit on the older side. Keep

2:47

up the good work. So first of

2:49

all, I love those sounds of

2:51

the older machines like to me.

2:53

That's just a symphony. It's an

2:55

orchestra. And literally there's like a,

2:57

that video, there's a YouTube channel,

2:59

I don't know if it still

3:01

exists, but at one point they

3:03

were making like songs using computer

3:06

hardware from like those days.

3:08

Yeah, hard drives, floppy drives. It's

3:10

awesome. That's one of my favorite.

3:12

Yeah. Those were the days out

3:14

there. And you know, the whole

3:17

idea of utilizing AI for law

3:19

practice and things is it's huge.

3:21

I have family members that practice

3:23

law. AI has been a massive

3:25

time saver for them. Just as

3:27

an example, here are some use

3:29

cases that are being used with

3:31

AI when it comes to law.

3:33

Obviously legal research, but they have

3:35

specialized AI tools like Ross and

3:37

LexisNexis, which provide case law. So

3:39

when you think about a lot

3:41

of times when they're going through

3:44

different cases, they're trying to

3:46

find other cases that set the

3:48

president of how it should be,

3:50

what should be ruled on it.

3:52

Right? And so in this case you

3:54

have an AI tool. Well, not

3:56

the president, the precedent. That's why

3:58

I said, didn't I? It sounded like

4:01

president. What? Yeah. And you're

4:03

giving me a hard time

4:05

for pronouncing things. Not again,

4:07

Jill. You did this last

4:09

week, Jill. You were a troll

4:11

last week. I want to go

4:13

back and look at the footage.

4:15

I'm pretty sure I said precedent

4:17

for sure. 100%. Well, when the

4:20

president is going through laws. Okay.

4:22

So the precedent set the precedent so

4:24

they go back and it can look

4:27

through millions of cases at a time

4:29

right to find similarities versus having somebody

4:31

sitting there and you know looking at

4:33

case by case by case so the

4:36

amount of the speed of being able

4:38

to find other laws or other rulings

4:40

on certain cases is just incredible, so

4:42

you could see that would be a

4:45

huge help. Also like going through discovery

4:47

because they give you like mountains of

4:49

things and all that sort of stuff.

4:51

like you know any like those kinds of

4:53

things and then going to like trying to

4:56

find case law and all that stuff like

4:58

there is so much potential for this

5:00

also let's hope it doesn't hallucinate

5:02

Yes, it starts talking like a pirate

5:04

in the middle of court. It can't

5:06

obviously argue for you. But, well,

5:09

people will probably try, I'm sure.

5:11

There are lawyers and there's really

5:13

bad lawyers and there's some really

5:15

bad lawyers are going to use

5:17

AI for their whole, their whole

5:19

practice is just going to be

5:21

one big AI readout. There's already

5:23

one that happened and it already

5:25

blew up in their face. Okay.

5:27

Good shop. That sounds exactly right.

5:29

Contract analysis. Michael, we have to

5:31

do contracts all the time and

5:33

that's, you know, looking at these

5:35

contracts can be daunting because Michael

5:38

does the best impression of the

5:40

contract wording. It's like, heretofore,

5:42

hereinafter, all of those special

5:45

words. And it can make it difficult

5:47

to understand what it's saying. So you could dump

5:49

that contract into an AI and say, hey, what

5:51

does this really mean? What are the implications

5:54

of this? And it can spot different clauses

5:56

that might pose risks. Well, Michael A. I

5:58

says the party of the first... part is

6:00

invalid. I don't know. Michael AI

6:02

is great for law cases. If you

6:05

want to use it for that, it

6:07

is out there for free. I gave

6:09

it to the world. Definitely. Don't. Don't

6:11

use that. If you're going to

6:14

represent yourself using Michael AI,

6:16

it will totally help you

6:19

out. You're legal castrics. We

6:21

are kidding. Don't do that. by

6:23

quickly processing vast amounts of key

6:25

data like you mentioned and then

6:27

the judicial decision support jurisdictions use

6:30

AI to assist judges in predicting

6:32

outcomes based on past cases,

6:34

so again you can load so

6:36

much information into it and they

6:39

can process it so quickly that

6:41

it's really a perfect use case

6:43

for a profession. I do wonder

6:46

though will paralegals be impacted by

6:48

this a lot? You know, because

6:50

paralegals used to do a lot of

6:52

that research and work. It's a great job.

6:54

You make good money with it. So hopefully,

6:57

you know, AI continues to be

6:59

an asset to make things quicker.

7:01

They can work on more cases

7:03

more efficiently. But Michael, to your

7:06

point, when you deal with the

7:08

hallucination in AI, the idea that you're just going

7:10

to use AI and get rid of your

7:12

paralegals would be career law firm suicide I

7:14

feel like. You know, just you got to

7:16

have a human there. You may not need

7:18

as many potentially but you definitely still

7:21

need some humans there. I think

7:23

a lot of people are

7:25

looking at this in like a negative way because

7:27

a lot of people are not ambitious

7:29

enough and I mean that in the

7:32

sense of like people who run businesses

7:34

and companies. look at it like oh

7:36

we can replace people and like why

7:38

you could instead supplement them and have

7:41

them do more. So instead of

7:43

having fewer people on board, you

7:45

can actually get more customers because

7:47

they can handle more workload because

7:49

they have this why would you

7:51

not it's weird that people don't

7:53

look the other way and they

7:56

look like well I think everyone's

7:58

been jaded by corporate America and

8:00

the fact that every company was

8:02

drooling at the idea of getting

8:04

rid of employees when this first

8:06

happened. I mean, before AI even got

8:08

to the point it is, companies were

8:10

firing people, getting rid of their

8:13

staff in anticipation of what AI could

8:15

do. It hadn't even done anything yet.

8:17

You know, they just were like, oh,

8:19

we don't need all this anymore. We're

8:21

going to get rid of like, they,

8:24

because they're so beholden to their stock

8:26

holders, who, you know, only care that you're

8:28

able to double the stock every year.

8:30

They're drooling at the idea of getting

8:33

rid of their employees. They can't wait.

8:35

They are so excited. It looks like

8:37

they can kind of pretend that they

8:39

did something of value. Yeah. Oh, they're

8:42

just, they're drooling at it. They're like,

8:44

oh my God, if I could just

8:46

get rid of them. I mean, they're

8:48

like family. We're all one big family

8:51

here, but I cannot wait to

8:53

get rid of them. You know,

8:55

you know, you know?

8:57

And it's actually kind of perfect

8:59

because I never thought about it.

9:01

They always talk about, oh,

9:03

we're a family here and

9:05

I've never, I never, I

9:07

don't know why it never like

9:10

popped in my head.

9:12

But there's literally a phrase

9:14

that says never do business with

9:16

friends and family. You don't want

9:19

that. It doesn't make any sense.

9:21

Excuse of a business that's ever

9:23

existed in humanity. Which is proof

9:25

that we're like a family because

9:28

we it's it's a mess. It's

9:30

just one big slop. Yeah. Well,

9:32

thank you so much, Alan,

9:34

for writing in and letting

9:36

us know and sharing a

9:38

little bit of your journey

9:40

with computers. Yeah, the Epson

9:42

8088 was part of the

9:44

Epson Equity series of computers,

9:46

and they were wonderful. I've

9:48

actually got an Epson Equity

9:50

2 Plus with an Intel

9:52

286 in my collection. Of

9:54

course you did. Actually, one of

9:56

my computer labs when I

9:59

was teaching, we used Epson Equities

10:01

for a while. So I got

10:03

to set all those up and

10:06

deploy them and they were great

10:08

machines at half the price of

10:10

the IBM compatible. So that was-

10:12

I smell a treasure hunt because

10:14

I kind of want to see

10:17

this thing. Is it too big to

10:19

hold up, Jill? Yeah, and heavy. It'll

10:21

have to be on the cart, huh?

10:23

Yeah, it would. Well, no, I really appreciate

10:25

you writing in, Alan. This feedback is

10:28

awesome because it allows us to kind

10:30

of talk about what's going on in

10:32

the AI world, different use cases for

10:34

the AI world. If you have use

10:36

cases in the AI world that you want

10:38

to send to us, how you're using it,

10:40

how it's making your life better or easier,

10:43

or making your life worse, if

10:45

if you are in a corporation

10:47

and it's all one big family

10:49

and AI is a threat of doom

10:51

and gloom, let us know your

10:54

thoughts on that. Go to destination

10:56

linux.net/comments or destinationlinux.

10:58

net/forum. Also,

11:00

speaking of the corporation: if

11:02

they're like family, let us

11:04

know if Bob's your uncle. Very nice.

11:07

And as Linux users, we know

11:09

what's up. I mean, security is

11:11

non-negotiable. But with threats getting smarter,

11:13

your security tools need to keep

11:15

pace without dragging your systems down.

11:17

Traditional agents, they're just

11:19

slow. They'll slow you down. They'll

11:22

leave blind spots. It's just a

11:24

mess. It's time for a smarter

11:26

approach. And that's why Destination Linux

11:29

is proud to be sponsored by

11:31

Sandfly, the revolutionary agentless platform designed

11:33

for Linux. Sandfly doesn't just

11:36

detect and respond. It revolutionizes security

11:38

with SSH key tracking, password auditing,

11:40

and drift detection, covering threats from

11:43

every angle, whether your systems are

11:45

in the cloud, on-premises, or embedded

11:48

devices. Sandfly ensures they're all

11:50

secure without the headaches of

11:52

agent-based solutions. Listen to what

11:54

Ken Kleiner, the senior security

11:56

engineer at the University of

11:58

Massachusetts, has to say. He says: Sandfly is

12:00

the first product I've seen that accurately

12:03

and quickly detects thousands of signs of

12:05

compromise on the Linux platform.

12:07

Its unique method automates tasks which

12:09

would be manually impossible. Automation is

12:12

key with detection and Sandfly completely

12:14

fits this and other requirements. If

12:16

your organization is using Linux, this

12:18

should be part of your cybersecurity

12:21

tool set. Experience security

12:23

that's not just effective,

12:25

but gives you peace

12:27

of mind. No agents,

12:29

no downtime, just cutting

12:31

edge protection. Dive into

12:33

the future of Linux

12:35

security at destinationlinux.net/sandfly. That's

12:37

destinationlinux.net/S-A-N-D-F-L-Y. And see

12:39

how Sandfly can transform

12:42

your security strategy. Michael, did

12:44

you know? You can actually test

12:46

this for free. Even if you're

12:49

not a business yet, you're thinking

12:51

about opening one. Twenty. Wow. That's

12:53

all your fingers and all your

12:56

toes, Michael. All my fingers and

12:58

all my toes can run your servers.

13:00

That's 20. That's 20 of them right

13:02

there. It's a very efficient typing

13:04

process though. The interface to

13:07

the software, its capabilities are

13:09

truly awesome. Check out Sandfly,

13:11

destinationlinux.net/sandfly. Sandfly really is

13:13

an amazing tool. You gotta

13:15

check it out. Michael, this episode,

13:17

you know, you give me some crap

13:19

sometimes that I write episodes about AI, but

13:22

you're going to see that I was

13:24

like, took your feedback and was like, no

13:26

more AI. We're not talking about AI. And

13:28

then something this week happened, and then

13:30

I literally message you saying, hey,

13:33

this better be on the show. Yeah,

13:35

this is going to be on the

13:37

show. All right, so unless you're living

13:39

under a rock like Michael because he

13:41

doesn't listen to any news at all.

13:43

Except for this, I did hear about it.

13:45

It somehow popped up on his feed somewhere.

13:47

If it's tech news, I'll pay attention;

13:49

if it's regular news, then I don't

13:51

pay attention. Yeah, the tech stock sell

13:54

off caught a lot of people by

13:56

surprise. Like, yesterday the stock just completely

13:58

tanked. We recorded this on Wednesday.

14:00

Nvidia, Microsoft, AMD, they

14:02

all slid, like 15% in

14:05

some cases, like these were

14:07

huge stock slides. I think like

14:09

the whole, like the whole number,

14:11

now this could be just like

14:14

what I heard, it might not

14:16

be accurate, I'll just give a

14:18

round figure, but there's people saying

14:21

that the drop of the stock

14:23

market was like $500 billion.

14:26

Yeah. Like collective. Yeah, I

14:28

heard Nvidia was down up to

14:30

17% or more. Yeah. This is how

14:32

stupid shareholders are. Like, you can't

14:34

use them to run your business. If

14:37

companies could understand this, like, when I

14:39

say stupid, I mean, as a group

14:41

mentality that shareholders have, they hear any

14:43

little news. and they have no they

14:45

don't even know the industries they're investing

14:48

in they barely understand what they're working

14:50

on they just heard AI they put

14:52

a bunch of money in it and

14:54

then they're like i don't know what

14:57

it is but it sounds cool and

14:59

then if any news hits they don't

15:01

take any time to like what's the

15:03

overall impact what's the long-term impact what's the

15:05

long-term impact what are other things they're coming

15:08

out and other they don't know any of

15:10

that they're stupid as a whole and if

15:12

any of those people are listening to this

15:14

episode. Contact me, I'm

15:16

starting my own company

15:19

called Blockchain Bitcoin slash

15:21

AI. So, you got all the

15:23

other words in there? Yeah, all of

15:25

them. AI, oh, dot AI, yeah. Very

15:27

good, Michael, very good. I'm going to

15:29

invest in that because meme coins are

15:31

all the rage right now. I don't

15:34

know why, but they are. I don't

15:36

know. Like, there's literally not one

15:38

that is of any value, but

15:40

for some reason people still are.

15:42

doing it, I lie. I mean, I

15:44

want to be clear that buying shares in

15:46

the company and investing in the

15:49

company is awesome. It's such a

15:51

great idea and it helps so

15:53

many companies out. The problem is

15:55

companies listening to their shareholders

15:57

and doing everything their shareholders

16:00

want without any gatekeeping. And because

16:02

as a whole they're stupid and

16:04

you can see it with this

16:06

like they literally are like acting

16:08

like the world is on fire

16:10

because there's another competitor to AI

16:13

as if nobody else in the

16:15

industry was ever going to create

16:17

competition for the US. We were the only

16:19

ones who were ever going to have AI

16:21

that's it. Yeah. I have no idea where

16:23

this is from it's from a movie or

16:26

TV or something where it's like it's such

16:28

a great saying, where it's: a person

16:30

is smart, but people are stupid.

16:33

Yes. It's like it's like a

16:35

collective when as a collective

16:37

we get worse and

16:40

worse the more you get us all

16:42

together. We're stupid. You put us

16:44

all in a room, we're morons,

16:46

you know, individually, brilliant. Yeah.

16:48

So, you know, the stock market

16:50

freaked out and why are they

16:53

freaking out because this Chinese startup

16:55

AI tool or company known as

16:57

DeepSeek gained 1.6 million

16:59

downloads on its debut of its

17:02

AI. They said it's not only

17:04

cheaper, it requires fewer resources

17:06

to run than the

17:09

other models. It also utilizes

17:11

reasoning-based AI to solve problems,

17:13

which means it's on par

17:15

with ChatGPT's latest model, which

17:18

of course costs hundreds of

17:20

billions when you look at the

17:22

whole amount of investments that have

17:24

been made into AI. And so... you know,

17:27

Meta's chief AI scientist stated,

17:29

it's not that China's AI

17:31

is surpassing the US, but

17:33

rather that open source models

17:36

are surpassing proprietary ones. Booyah,

17:38

booyah. I'm kind of... That's

17:40

the best part. I think

17:42

that this is really interesting

17:44

news and I think there's

17:46

definitely some pros and cons, you

17:48

know, to everything, but this is

17:51

wonderful news because it's open source

17:53

and maybe... I'm a little bit

17:55

bitter about it, but OpenAI,

17:58

even being called OpenAI, makes

18:00

me hate them more because

18:02

they didn't change their name

18:04

when they abandoned open source.

18:06

So they confuse people by

18:08

having that name and they

18:10

probably do it on purpose at this

18:12

point. But I am, I'm kind of

18:15

like a kid in a candy

18:17

store in the enjoyment of how

18:19

delicious that backlash for them is

18:21

because it's like, well, you know

18:23

what, open source just kicked you

18:25

in the teeth. Yeah, it did. Yeah, leave

18:27

it to open source, AI to

18:30

be more cost effective, take

18:32

less energy to run, and

18:34

be more memory efficient. Whoo,

18:36

win win. Yeah, absolutely. You

18:38

know, it's interesting because the, when

18:41

I talk about the investors not

18:43

doing their research, when you

18:45

really look into Deep Seek, so

18:47

these were all the headlines that

18:50

were everywhere. It only cost

18:52

six million dollars to produce

18:54

the whole thing, like. That's

18:56

shocking if that's the case,

18:58

right? But if you just dig a little

19:00

deeper, you find out that that's

19:03

not really the case at all.

19:05

Like we know for sure it's

19:07

somewhere north of $500 million have

19:09

been invested since the start of

19:11

this company. Probably billions behind the

19:13

scenes that are, you know, not

19:15

having because China does things a

19:18

little different than the U.S. Things

19:20

don't necessarily... A little different. We're

19:22

not above board either, but we

19:24

pretend to be a little better

19:26

and they're, you know, they don't even

19:28

try to pretend, I think, in a lot

19:30

of cases. Yeah. But also, like, I think

19:33

there's reports saying that it's based a lot on

19:35

Llama or... Because, I mean, however

19:37

much money that meta put into that

19:39

definitely should be applied

19:41

to how much money was required. This

19:43

is why again shareholders don't understand anything

19:46

is that it's like saying that you

19:48

when Ford you know was creating a car

19:50

and doing the research and development

19:52

to ever create the assembly line and

19:55

all of that type of stuff it

19:57

cost him a billion dollars of research

19:59

and development. And today, Toyota can

20:01

make a car for 23

20:04

grand. Like, there's no comparison

20:06

because they've obviously, Deep

20:08

Seek, has utilized all of

20:11

the work that all these

20:13

companies have done. And I

20:15

can tell you that with

20:17

certainty, in my opinion, because

20:19

its responses are

20:21

darn near identical. You should

20:24

say it with a question mark.

20:26

I can say it with certainty.

20:28

I mean, I tested deep seek

20:30

out in all kinds of different

20:32

contexts. I mean, writing code,

20:35

you know, asking it general

20:37

questions, seeing where it's guardrails,

20:40

we're at, telling jokes, doing

20:42

show notes. I tested all

20:45

kinds of different things. And

20:47

number one, it's super cool.

20:49

I think it's amazing. And I

20:52

love that it's open source. But

20:54

I see nothing that makes me go,

20:56

this has surpassed anything that exists

20:58

already. Its responses are

21:01

so similar that I can tell

21:03

it's got to be trained on

21:05

nearly the same identical data that

21:07

the other models were trained on.

21:09

And, you know, when you ask

21:11

it about things like, I think

21:13

one of the fears people had

21:15

is, hey, this means the Chinese

21:17

chip manufacturing has caught up with

21:20

the US, but the reality is you

21:22

find out that they in fact

21:24

utilized Nvidia GPUs. And when

21:26

I asked Deep Seek itself,

21:28

I said, did you train on

21:30

Nvidia GPUs? It said, yes, we

21:32

had some H100s and some others,

21:35

which are the latest, you know,

21:37

some of the latest ones. But

21:39

it also mentioned that we may

21:41

have utilized things like AWS cloud

21:43

services and Google cloud services, which

21:45

allow us to access Nvidia

21:48

hardware. It suggested that that may

21:50

be the case. So I like

21:52

how it said may do that.

21:54

We may have done may. Yeah.

21:56

If the investors just breathe

21:59

a little. Maybe. This

22:01

call may be recorded. It's always recorded.

22:03

May be recorded. So if they just

22:05

didn't freak out and just put a

22:08

little thought behind it, they'd be like,

22:10

hey, this is really cool. And competition

22:12

pushes innovation. And so, of course, I

22:15

want to see more amazing AI models

22:17

and things. And what do you think

22:19

ChatGPT and Gemini and all these people

22:22

were doing right now? They're freaking out

22:24

trying to figure out how to get

22:26

one of their better models out to

22:29

surpass what's being done here because you

22:31

have to. And to me, that's always

22:33

a good thing. It's a terrible idea

22:36

in my mind that ChatGPT or any

22:38

other duopoly or triopoly of a company

22:40

will be completely in control of AI.

22:42

That scares me more than anything else

22:45

because these tools are incredibly powerful. They're

22:47

the iteration of like a digital army.

22:49

you know like digital warfare technology that

22:52

can be utilized from these things and

22:54

to have one single entity control all

22:56

of that is never good in human

22:59

society all through history when you've got

23:01

a concentration of power in one area

23:03

this is bad. This would be like if

23:06

the internet was controlled by a single

23:08

group. That would be like the same

23:10

level, or worse. Yeah.

23:13

That's a good point. I mean

23:15

I hadn't thought about that but I

23:17

definitely had the same fear of like

23:20

a couple even a few companies dominating

23:22

it in the sense of like because

23:24

that stupid article we covered many

23:27

episodes ago, I have no idea when we

23:29

covered it, that stupid article about the

23:31

dangers of open source AI and here's

23:34

here's the thing: we did find out that open

23:36

source AI is dangerous to terrible people

23:38

who make decisions on stocks and stuff like

23:41

that. Definitely, it is dangerous for those

23:43

people, so yes. Yeah, we don't want

23:45

terrible people. I mean like terrible decision-makers.

23:48

Oh, yeah. Yeah, we don't want T-1000s

23:50

walking around. So that's what we had.

23:52

Unless it's open source too. Yes, that's

23:54

the thing. See, open source creates a

23:57

system of checks and balances. And with

23:59

open source we won't have T-1000s. We'll

24:01

have OS-1000s. Yes, there

24:04

we go. But they'll

24:06

run some arbitrary version of like Arch

24:08

Linux that requires tons of compiling and

24:11

things to make it do anything. And

24:13

so it won't be that dangerous after

24:15

all. Yeah. We'll have to ask them.

24:18

We'll see if they, what happens when

24:20

you rm -rf an OS-1000? Oh yeah.

24:22

We can hopefully wish that they're all

24:25

running Linux From Scratch? That would help.

24:27

That would make sure no one uses it.

24:29

Yeah. So what was interesting when I

24:32

was doing, I was reading and doing

24:34

research, I noticed an interesting tweet or

24:36

an X post. Not a tweet. It's

24:39

a tweet. It's still a tweet to

24:41

me. Yeah. So a tweet. I don't

24:43

care what the rename. I don't know.

24:46

I don't care. If those in the

24:48

audience like the X... You don't exist.

24:50

No one likes X. Yeah, no one

24:53

likes X. I don't have to worry

24:55

about offending you because no one likes

24:57

it. Oh no. Michael, there's somebody in

25:00

there who's sitting in their underwear eating

25:02

Cheetos going, oh I'm like, oh excuse

25:04

me, they're Cheetos, they're just, Cheetos went

25:06

everywhere, they're getting to the keyboard right

25:09

now. I think the only people who

25:11

are going to get comments, but they're

25:13

going to be in like a sarcastic

25:16

way. Yeah, because

25:41

it's like, tweeting is so

25:44

much better than Xing. I mean, tweeting

25:46

wasn't great either, but it's better than Xing.

25:48

I'm gonna send you an X. Tweeting

25:51

at least made sense because it's like

25:53

what birds send out, small signals, and

25:55

the idea of like a tweet would

25:58

be like a small message and you

26:00

know, it kind of makes sense. It's

26:02

just kind of weird. Tweet, tweet, tweet,

26:05

tweet, tweet, tweet, tweet, tweet. How dare

26:07

you say that on air? That

26:09

is so offensive. Sorry Jill, go

26:11

ahead. Well anyways, back

26:13

to the tweet, Tech

26:16

investor Marc Andreessen actually

26:18

wrote on Twitter on

26:20

January 24th: Deep Seek

26:22

R1 is one of

26:24

the most amazing and

26:26

impressive breakthroughs I have

26:28

ever seen and as

26:30

open source, a profound

26:32

gift to the world. I

26:34

just love that. Yes. Michael,

26:37

the fear with these things

26:39

is, is it really open

26:41

source? Is it really open

26:43

source? Well, I mean, the

26:45

answer to this is clear.

26:48

It's 100% kind of. All right, so,

26:50

you know, the, I think one of

26:52

the big things that made people freak

26:54

out to is that Berkeley, who has

26:56

a research tool that performs rating on

26:58

chatbot capabilities and it ranked this immediately

27:00

in the top 10 of chatbots in

27:02

there. In regards to open source, you

27:04

know, the term gets thrown out a

27:06

lot. So I tried to dig in

27:08

the best I could. I mean, Deep

27:10

Seek so new they barely had time

27:12

to put up information

27:14

about their stuff. So I assume more

27:16

will come out on this. The source

27:18

code associated with the DeepSeek models,

27:20

DeepSeek V2 and V3,

27:22

is released under the MIT license.

27:25

However, they do have restrictions

27:27

on other models and require

27:29

customers to pay to connect

27:31

through their API.
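As a rough illustration of that paid path: DeepSeek's hosted API follows the OpenAI-compatible chat-completions shape, so the standard openai client can be pointed at it. The base URL and model name below are assumptions to check against DeepSeek's current docs, and the key is whatever a paid account issues.

```python
# Hedged sketch of calling DeepSeek's hosted (paid) API using the
# stock OpenAI Python client, since the API is OpenAI-compatible.
# base_url and model name are assumptions; verify against DeepSeek's docs.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",      # paid-account credential
    base_url="https://api.deepseek.com",  # assumed OpenAI-compatible endpoint
)

reply = client.chat.completions.create(
    model="deepseek-chat",  # assumed name; an R1-style reasoning model may differ
    messages=[{"role": "user", "content": "Summarize the MIT license in two sentences."}],
)
print(reply.choices[0].message.content)
```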

27:33

However, they're certainly more open source than ChatGPT, which

27:35

is zero. Which used to be,

27:37

you know, super open source. I

27:39

think it's amazing. I'm with you,

27:42

Michael. I am so not... a fan of

27:44

ChatGPT because of the fact that

27:46

they took the work of the open

27:48

source community, then the moment there

27:50

was billions available to be made, closed

27:52

it down, kept that work, closed it

27:55

down, and then, you know, made this,

27:57

and then have done, I haven't seen

27:59

them do anything that's made the

28:01

news anyway, open source wise, since. Like,

28:04

they may not really open source at all.

28:06

But I don't think, since they

28:08

did like a code dump when they

28:10

did the transition, I don't think they've

28:13

done anything since that I've seen it

28:15

all. So if they have, it's certainly not

28:17

notable enough that it's ever made any

28:19

news that I've seen in there. And

28:22

I would just like to give you

28:24

a reenactment of when I heard about

28:26

this and how it was hurting chat

28:28

GPT and OpenAI. That's what Jesus

28:31

said. Very good reenactment. There is a

28:33

difference between their model license though and

28:35

their code license. The model license governs

28:37

the use distribution and modification of the

28:40

AI model itself, including its weights, architecture,

28:42

and the data used for training. The

28:44

code license applies to the source code

28:46

related to the models, including any software

28:49

libraries, tools, or scripts. That part of

28:51

V2 and V3 is open source under MIT.

28:53

The other part is under far more

28:56

restrictive license. So the model

28:58

has restrictions such as, and these are

29:00

terrible restrictions, it makes me really

29:02

upset: prohibitions on use for

29:05

military purposes, so you can't

29:07

use it for military purposes I

29:09

mean, oh darn.

29:12

There goes the T-1000. Yeah, yeah,

29:14

restrictions against generating

29:16

harmful content. I mean, again...

29:18

but there you are. Yeah, how

29:20

dare you. And preventing the use

29:23

of the model to exploit vulnerabilities.

29:25

So you can't go use it

29:27

to hack and stuff like that out

29:29

there. These are...

29:31

these are unacceptable

29:34

and also good. Yes. Now, I will

29:36

say this, the problem with tools

29:38

like Deep Seek is that China

29:40

being the way its government is

29:43

run is extremely restrictive

29:45

of what information,

29:47

even historical events that it

29:49

will allow to have any AI

29:51

run against. So there are certain

29:54

events in history that I'm sure

29:56

just like in any other country

29:58

that China's not proud of, and

30:01

when people test those certain events

30:03

to see if they can get

30:05

information from the AI about them

30:07

it completely does not answer it

30:09

doesn't allow you to have that

30:11

says it can't help with those

30:13

It's very censored. It's extremely censored,

30:16

in fact, and it's got guardrails,

30:18

and it's only updated to October

30:20

2023, so it's not current. So

30:22

when you play with it, is

30:24

it impressive? Totally. Totally. I think

30:26

it's amazing. I love that it

30:28

has open source to it. I hope

30:31

Deep Seek doesn't do what

30:33

ChatGPT does and close itself

30:35

down now that it's gotten

30:37

all this popularity. At the

30:39

same time, is there anything

30:41

in this tool that's not

30:44

already been done before? No.

30:46

I mean, ChatGPT has far more

30:48

than it. And in open source, well, if

30:50

we've got Llama, we've got lots

30:52

of open source. I think that

30:55

what they did was able to make

30:57

an open-source thing that is at least

30:59

on par with ChatGPT. So I

31:01

think that in itself is cool. In parts.

31:04

Because it doesn't have voice

31:06

integration. So you can't go talk

31:08

to it. If you go to

31:10

ChatGPT, you can sit there

31:12

and have a chat with it

31:14

and it's going to interact like

31:17

a human back with you. That's

31:19

a huge innovation leap. That took

31:21

a lot of technology to do.

31:23

It takes a lot of processing

31:25

to do. And Grok and others

31:27

are being integrated into call centers.

31:29

They're being integrated into workplaces to

31:32

as bots to handle people's issues

31:34

to resolve problems real time. And

31:36

that means that hallucination cannot exist. Right.

31:38

So these are these are very specialized

31:40

models that are built off of Chat

31:43

GPT that run through APIs for all

31:45

these corporations and things. And so from

31:47

those aspects, Deep Seek has none of

31:49

that. And so what it is is

31:52

a really good large language model

31:54

that was trained on what

31:56

seems like very similar data

31:58

to other models. It runs really

32:00

well. But again, to freak out and

32:03

act like it's game over, China

32:05

wins, pull all your money out, is

32:07

just... I mean, the idea... like, your

32:10

rebuttal for what I said is

32:12

like, you've convinced me 100%. But

32:14

like the idea that people were seeing

32:17

this and just instantly reacting like you

32:19

know the sky is falling type of

32:21

stuff. Yeah, it's kind of hilarious in

32:24

a way, in a way. Because if

32:26

you just take a moment and research,

32:28

which is to be fair, people taking

32:31

any moment to do any research these

32:33

days, is quite rare. So at least

32:35

we understand. It's tough with the internet

32:38

and the library in your hand. It's

32:40

really tough. No, no. So you don't

32:42

understand. We have easy access to the

32:45

information superhighway. Remember it was

32:47

called that at one point. Yeah, sure

32:49

was. Yeah, see that was that was

32:51

great. But the problem is we don't

32:54

want to get in the car and

32:56

actually go on that superhighway. We

32:58

just want it to be like kind

33:01

of like read the sign to get

33:03

on the on ramp and be like

33:05

that's enough. Yeah, I read the sign.

33:08

It's just like traveling on it. I

33:10

read the sign where it's like roughly

33:12

going around there. Yeah. Yeah. That means

33:15

headlines, people. It's interesting because, so, I'm

33:17

you know I'm trying to take the

33:19

position of both: I love that this

33:22

exists, I think it's super exciting, I

33:24

love that it's open source. Also, stop

33:26

freaking out because it's not that exciting

33:29

it's not like they, you know, paved

33:31

new roads with this and I think

33:33

the claims that have been made about

33:35

The $7 million and all this ridiculous

33:38

stuff need to be thoroughly investigated and

33:40

have already been debunked. Yeah, there's been

33:42

people talking about how the amount of

33:45

information about the money has gone from

33:47

like 6 million or under 6 million

33:49

to 6 million, then to 10 million,

33:52

then to 20 million, to 100 million,

33:54

and probably billions by the Chinese government,

33:56

behind the scenes. Yeah. And also all

33:59

the billions that were done by the

34:01

other stuff that it was using, for

34:03

sure. So like, it was a lot

34:06

of money that went into it. But

34:08

there's also another AI model that people

34:10

are not talking about. And actually, it's

34:13

nicknamed because it's like the presidential model.

34:15

And that's because it's called, it's like

34:17

based on Llama. You've heard of O-

34:19

llama, right? Ollama. And that is... Barack-a-llama!

34:22

Is that real good? No, it's not.

34:24

But it should be. Oh good, Michael.

34:26

That joke is stupid good. Barack-a-llama. Oh

34:29

my gosh. Oh man. I'll be here

34:31

all... Well, I'll be here next week

34:33

too. Next week, too. Indeed. I won't

34:36

be here next week, though, so if

34:38

you hate everything I just said. You

34:40

don't have to hear me next week.

34:43

You will have victory for one week

34:45

and he'll be back. For one week

34:47

and then I'll be back. Yeah. You

34:50

win the battle but I'll win the

34:52

war. So that's how this works. Exactly.

34:54

That's kind of our take right now.

34:57

Obviously Deep Seek's got a lot of

34:59

road ahead of it, a bright future.

35:01

There's a lot of things that can

35:04

come out over in the next few

35:06

weeks about this. One thing I want

35:08

to mention is like all these AI

35:10

models, the privacy policy is a big

35:13

deal here. What you put into AI

35:15

when you're going in and testing

35:17

this, you have no rights

35:20

to at all. It's assumed that it's

35:22

being read and is now theirs. It's going

35:24

to be tied to you in every

35:27

form and fashion. They're going to buy more

35:29

data on you and tie that to

35:31

your thoughts and everything that you're putting

35:34

in there. So just assume you're typing

35:36

a public X, that everyone can see

35:38

and read with your actual full name

35:41

in it. And that's how you should

35:43

use AI because yes. By the way,

35:45

DeepSeek's no different than Chat

35:48

GPT and the others in this way. There

35:50

is no privacy expected or guaranteed and

35:52

people are reading it manually. A slight asterisk

35:54

to what you just said and that

35:57

is because you can technically, not all

35:59

of it, not everything, but you can

36:01

get the DeepSeek thing, because it

36:04

is open source and you can run

36:06

it locally if you have powerful enough

36:08

computer which is cool because you don't

36:11

necessarily need the most powerful computer ever

36:13

in order to do it and so

36:15

in those cases there have been people

36:18

testing to say that some of a

36:20

lot of the restrictions are taken off

36:22

not everything but like the stuff that's

36:25

information related in terms of like historical

36:27

events that stuff is those restrictions are

36:29

removed so you the local versions you

36:32

could do and you could argue that

36:34

they're not going anywhere because you control

36:36

where it's going if as long as

36:38

the code doesn't have some kind of

36:41

you know phone home thing or whatever

36:43

but We don't I don't know if

36:45

it is because I'm not sure if

36:48

anybody's actually done an audit of it,

36:50

but if it's if not and you

36:52

have it locally you could argue that

36:55

you have that benefit, but if you're

36:57

using the one that's online in the

36:59

like API and whatever, then just know

37:02

it's not any better. They're all taking

37:04

your data and even if you put

37:06

it in the public and not on

37:09

the chat system, they're still probably trying

37:11

to take your data. Yeah, exactly. That's

37:13

the best way to look

37:16

at this. And so, you know, there's

37:18

obviously, the governments can't get along: five-

37:20

by-five sandbox, one shovel. And so

37:23

you got to deal with that aspect

37:25

as well, that that stuff could be

37:27

used by either government nefariously. And so

37:29

you've got to keep that type of

37:32

stuff in mind as well. So and

37:34

sometimes it feels like a litter box

37:36

and not a sandbox. Yeah, we'll use

37:39

a litter box. More like a litter

37:41

box for sure. All right. So happy

37:43

news, Jill. I want some happy news.

37:46

I mean, this was happy news too.

37:48

But, uh, the Pebble watch. Do you remember

37:50

the Pebble smartwatch? Oh, absolutely. In fact,

37:53

I had one. And she did, too.

37:55

Yeah. I had one because they were

37:57

that year, I went to CES,

38:00

and they were showing them off.

38:02

And I bought one. Oh yeah, you went

38:04

to CES, that's right. Yeah. So

38:06

that was cool, but unfortunately that's one

38:08

of the few, one of the few

38:11

tech in my collection that kind of

38:13

got destroyed because it got wet in

38:15

rain and I forgot and I was

38:18

wearing it and then we had a

38:20

thunderstorm and it ruined it. That's a

38:22

bummer. Yeah. So I need to get

38:25

another one. It was a

38:27

really good looking watch. Like for

38:29

at the time it was released,

38:31

does anybody remember like about the

38:33

year that came out? 2012?

38:35

2012? Somewhere around there? That sounds

38:38

about right. I think it was

38:40

also like the first one that

38:42

was like a smartwatch that actually

38:45

was good. Yeah, it had the

38:47

whole package. It was had really

38:49

good battery life. It was kind

38:51

of the first time using e-ink

38:53

on a commercial level. It was

38:56

2013. Yeah, okay, 2012 was

38:58

close. Very close. Yeah. And

39:00

it had really good,

39:02

it was functionally really

39:05

great because it was

39:07

smart, but not too

39:09

smart. So it wasn't like, you

39:11

know, an Apple watch or a

39:13

Samsung watch where there were too

39:15

many bells and whistles built into

39:17

it. It was just very simple.

39:20

It had your watch faces, you

39:22

could keep track of your footsteps.

39:24

It was very simple and well-designed.

39:26

I would argue it looks better

39:28

than some of the watches today. Yeah,

39:30

exactly. I think so too. It was a

39:33

very good-looking watch. It had some

39:35

cool features for the time. Again,

39:37

keep in mind when it was

39:39

released, but it had, I think

39:41

it used an ARM processor. At

39:43

least one of the iterations did

39:45

Bluetooth connectivity. Very fitting for

39:47

a watch. Yeah. Three-axis accelerometer,

39:50

gesture detection. Yeah. Yeah, yeah, yeah.

39:52

And then an ambient light sensor as well.

39:54

So it had some pretty cool features in

39:56

there. Not all the health stuff that we

39:59

have today though. A pedometer, I believe,

40:01

and I don't think it had heart

40:03

rate detection either. It didn't have

40:05

any of that type of stuff. But, you

40:07

know, think of the year that this thing

40:09

came out. Yeah. Oh, and so

40:11

many people loved it because the

40:13

notifications worked really well. They worked

40:15

really good with the phone, the

40:17

smartphones. So that was, that was a

40:20

thing. Well, you know, it's a shock

40:22

that it went away, because Google bought

40:24

the company and Google's known for never

40:26

getting rid of great things that it

40:28

produces. It's what they're

40:30

known for. No, they're not

40:32

known for that. Interesting. So

40:34

here's the thing. I feel like

40:37

it's actually a little more than

40:39

that because Google did buy

40:41

it, but didn't Fitbit

40:44

discontinue Pebble? Yeah, Fitbit

40:46

bought it first and then Google

40:48

bought it and then Google bought

40:50

Fitbit. But who

40:52

actually discontinued Pebble? Was it

40:55

Google or was it Fitbit? Who

40:57

knows? I mean, that's

40:59

the only one that's in question. Is

41:02

this like that game? What is

41:04

it? 12 layers to 12 layers?

41:06

No, it's six degrees of

41:08

Kevin Bacon. Yeah, who? I mean,

41:10

I know it first. Well, I know

41:12

Fitbit utilized a lot

41:14

of the technology from Pebble, I

41:17

think that's what they did. They

41:19

just took everything and put it

41:21

into Fitbit. I think that's what

41:24

they did. I can't guarantee that

41:26

because it's been many years, but

41:28

I have a feeling. We should

41:30

use AI to find out,

41:32

maybe. We should open up DeepSeek

41:34

and find out, see what it

41:36

says. Yeah. So there's good news. If

41:39

you were a fan of the Pebble

41:41

as well, Google did make a really

41:43

good decision here to open source. Yeah.

41:46

The Pebble OS. So that's cool. That

41:48

means that there's a whole community out there

41:50

by the way that are keeping these watches

41:52

alive. That's one of the things I love

41:55

about the open source community. These companies come out,

41:57

they create a really cool product, they either

41:59

get bought out or they don't innovate

42:01

and they go away, but then you still

42:03

have this product you spent a lot of

42:06

money on, and/or that you, you know,

42:08

became a huge fan of. And now if

42:10

it has web services and other things which

42:12

the Pebble watch did, they're dead unless

42:14

the hackers come in and fix that

42:16

for you and that's what a few

42:19

of these movements do to keep these

42:21

things alive and so there's a whole

42:23

thriving community of people out there utilizing

42:25

these Pebble watches. And they were, I

42:28

think it's called Rebble. Yeah, where I

42:30

call it Rebble. Yeah, Rebble, yeah,

42:32

there that allows you to still take

42:34

the functionality of this watch and keep

42:36

it alive for any of the web,

42:38

not all of the web services are

42:41

available, as I was reading through their

42:43

page. It looked like they have some

42:45

of the core, all the core ones,

42:47

though, that the Pebble originally had working

42:49

there with the Rebble project. So I

42:51

think it's super cool. And now with

42:53

this, with the code being released, they

42:56

can actually probably turn on all of

42:58

those features that they wanted that worked

43:00

back in the day because they have

43:02

the code to generate that, which is

43:04

super cool. I mean, I'm not going to

43:06

go get one, to be honest. Like, I

43:09

love my Garmin watch, but if I had

43:11

a Pebble, I would think this is really

43:13

cool. And if they make a new version that's

43:15

like, you know, I don't know if

43:17

the name Pebble can be used, it's

43:19

probably owned by Google at this point,

43:22

but like... if they make a new

43:24

watch and there's a modified updated version

43:26

of the OS and then that could

43:28

be cool or if like they could

43:30

take it and then like all the

43:33

work for, like, the

43:35

PineTime, for example, and all that

43:37

work could be you know put into

43:39

it because it's open source and all

43:41

that. I think that would be great.

43:44

and I have a suggestion for

43:46

the new name for that watch instead

43:48

of Pebble: we'll call it Gravel.

43:51

I got a better one for

43:53

you. Applestone would work. Raspberry Pi

43:55

Foundation comes out with a watch

43:58

and they call it Fruity

44:00

Pebble. There we go. That's perfect.

44:02

That's good, Ryan, that's good.

44:04

See, that's great.

44:07

But there's

44:09

two trademark things to deal

44:11

with in that case. True, but it's

44:14

a genius name it's a great name

44:16

is it's a tell me you would

44:18

not own a fruity pebble if I would

44:20

go out and buy one right now

44:22

yes it's powered by raspberry pie chip

44:24

I mean it's powered by dad jokes

44:26

too that has to be an app

44:29

Yeah, raspberry pie foundation, we released this

44:31

to you with no cost to us

44:33

that idea. As long as you send

44:35

us, number one, all of us get

44:37

a free, fruity pebble. And number two,

44:39

you program an app that tells

44:41

dad jokes in the OS. That would

44:43

be great. And I also just realized

44:46

that technically speaking, I should make a

44:48

t-shirt for this because I'm powered by

44:50

dad jokes. Yeah, it's a good one. Oh,

44:53

I like that shirt. That would be

44:55

a good shirt. I'm powered by dad

44:57

jokes. I like it. Speaking of fruit,

44:59

Michael, you know that company

45:01

out there? Oh, they make phones

45:03

and stuff, I think. Phones, laptops

45:06

and tablets, and, like, own ridiculous

45:08

amounts of money. They have lots

45:10

of money. Yeah, you're talking about

45:13

the, are you doing the one

45:15

that's Macintosh or are you

45:18

talking about Fuji apples? Fuji

45:20

apples, of course. Those are delicious,

45:22

by the way those are delicious

45:24

those are the best. So Apple

45:27

is being forced to open its

45:29

gate to the garden. The European

45:31

Union is pushing Apple to make

45:33

its iOS features including air drop

45:36

and airplay more interoperable with other

45:38

platforms like Android. And the document

45:40

released by the EU outlines several

45:42

changes they want Apple to implement

45:45

to ensure effective interoperability with third-party

45:47

devices. Including AirDrop: Apple must provide

45:49

a protocol specification for

45:51

third parties to allow file sharing

45:53

between the devices. And then AirPlay:

45:55

they want to enable obviously the

45:57

AirPlay receiver, to be able to receive

45:59

the files and things like that

46:01

or be able to receive the

46:04

casts, I guess, from your phone,

46:06

so you can AirPlay on different

46:08

devices. And then other features: a proposal

46:10

covering iOS notifications working with third-party

46:13

smartwatches, so that other companies

46:15

like Fruity Pebble can

46:17

have that integration with Apple as

46:19

well as Android when they do

46:22

that. So here's my take on

46:24

this. EU is actually making Apple

46:26

a better company and Apple doesn't

46:28

realize it. Like when I think of

46:30

USB-C as an example, the EU forcing Apple

46:33

to go to USB-C made Apple a better

46:35

product. It's a better product now because

46:37

I don't have to have stupid lightning

46:40

cables plus USB-C cables plus all this

46:42

waste, you know, of different wires and

46:44

things. And then we know the lightning

46:47

cables break off inside iPads all the

46:49

time. That's the number one repair I

46:51

do for family members, by the way,

46:53

is pull out the lightning, broken piece

46:56

of lightning port out of the charging

46:58

of an iPad, because the things always

47:00

snap, they just break. It was a

47:02

cool design, it allowed fast charging, but

47:05

it was outdated when USB-C came, you

47:07

know, got to... And they were very

47:09

quick at the same, they were like

47:11

roughly around the same time, I think.

47:13

Roughly, lightning was first, I'm pretty sure

47:15

it was out there. But like at

47:17

least it was marketed better if it

47:19

wasn't. But it's also kind of funny

47:22

because you're saying that they're making them

47:24

a better company with that. I feel

47:26

like they know it, but they're like,

47:28

it's more like when you are taking

47:30

a dog to the vet. They don't like

47:32

it. They're gonna fight, but it's good for

47:35

them. That's a great analogy, man. That's a

47:37

great analogy. Apple is so stubborn with this

47:39

stuff, but it's really anti-competitive. Like nobody else,

47:42

like Garmin, for instance, you know, can't have

47:44

access to make their watch as functional as

47:46

it could be because Apple wants to keep

47:48

that stuff closed down. And of course they're

47:51

always going to say it's for security and

47:53

all the other stuff. Well then how do

47:55

you do it on your own watch? Dummies.

47:58

So you know you've got to allow. You've

48:00

got to allow companies to get in

48:02

there and also eat. Let somebody else

48:04

to the dinner table and eat, Apple.

48:06

Like you don't have to hoard all

48:08

the food yourself. Let some other companies

48:10

eat, you know, let us have some

48:13

food. We're not saying we want a

48:15

part of the apple. We just want

48:17

to be a part of the orchard. There

48:19

you go. Let us in the orchard. Oh,

48:21

Ryan, I fully agree with you,

48:23

actually. And as an Android user,

48:25

it would actually be so convenient

48:27

to partake in the original Apple-walled

48:30

garden of sharing data and

48:32

casting data? Especially on my

48:35

Applewald Garden. It'd be the Apple

48:37

Garden. Our Apple Garden, yes.

48:39

There we go. So especially on

48:42

my smart TVs that almost always

48:44

have AirPlay built in for casting,

48:46

and it is annoying having to

48:49

sometimes use third party apps to

48:51

achieve this on Android, to cast

48:54

to certain manufacturers

48:56

of TV. Samsung. Yes. And also

48:58

Samsung, you're terrible at so

49:00

many things. But the other

49:02

thing is, there's Apple Car

49:04

Play. In my opinion, you're

49:06

terrible, yes. But the fact

49:09

that they got caught having

49:11

microphones on your TV, you don't

49:13

need that. But the CarPlay stuff,

49:16

like I've used the Android version

49:18

and I've used the Apple versions

49:21

for the car stuff and the Apple

49:23

ones, it's better. Yeah. Like, I

49:25

don't know why Android can't even

49:27

like, like, it's so annoying to

49:29

use the Android one, and the

49:32

Apple one is just like, it's,

49:34

it works smoothly, and you can

49:36

even do it like wirelessly hooking

49:38

up and stuff, like, anyway, so

49:40

I think Apple makes a lot

49:42

of cool products, they have a lot

49:45

of good stuff, but also they're

49:47

terrible, in my opinion. So

49:49

I think the EU, just them

49:51

doing this, let me show you my

49:53

reaction. I'll reenact it. Reenact it. When, uh,

49:56

here we go. Another reason you

49:58

gotta watch the video version of this show,

50:00

for Michael's reenactments. Yeah,

50:02

exactly. You don't want

50:04

to miss this. This is when

50:06

I heard the news that the EU

50:09

is forcing Apple again to do

50:11

something that they should have

50:13

already done years ago. Wow,

50:15

it's very similar to your reaction

50:18

to ChatGPT. I mean, yeah,

50:20

it's interesting. It's, it

50:22

had the same feeling. Now I

50:24

just picture you in your house

50:27

all day reading headlines going. Yeah,

50:29

but not just that, but doing

50:31

it to a camera. Yeah, okay,

50:33

good, good, good. That's not on

50:36

apparently. You look at the market

50:38

share, you know, generally you want

50:40

to say, hey, companies are free

50:42

to do what they want. It's

50:44

their products, they innovate, blah,

50:46

blah, blah, but there are monopoly

50:49

laws for a reason. And so

50:51

when you look at Apple's share,

50:53

you know, they have 28%

50:55

of the global smartphone market share.

50:57

Guess who owns the rest of

50:59

that, like nearly all of it

51:02

is Android. So you've got a

51:04

duopoly, right? Yeah. In the

51:06

United States, however, it's not

51:08

necessarily like that for Google,

51:10

because Google doesn't have the

51:13

most phones sold. That's

51:15

Samsung. So like, I think

51:17

they have like a little bit of

51:19

an out there. But it's

51:21

utilizing Android. It's all Android.

51:23

And Apple has a significant lead

51:25

in the market share in the

51:27

US of 57%. So globally, 28%.

51:29

But in the US, they darn

51:31

near dominate the entire market. I

51:34

mean, 57% of the entire smartphone

51:36

market is huge, like, because everyone

51:38

has a smartphone. Like, everyone. And

51:40

some of us have many. Yeah. Now,

51:42

now when you have a newborn, the hospital

51:44

gives you a smartphone for the newborn. Like,

51:47

it's just, you get one right out of

51:49

the womb, right out of the womb. No,

51:51

you know, actually, you

51:54

have to make so many decisions

51:56

when you have a kid. You've got to name

51:58

the kid and pick which you want, iPhone

52:00

or Android? So hard. So hard. Well,

52:03

yeah, I mean, the naming the kid

52:05

is actually second to picking the phone.

52:07

Of course, yeah, yeah. You know, do

52:09

you want a kid walking around with

52:11

an Android or you want a kid

52:14

walking around with Apple? That is

52:16

a tough decision. Their future, you

52:18

know, you're basically, you're forcing

52:20

them to have this particular device

52:23

until they're at least four, you

52:25

know. Eligible for the next upgrade.

52:27

Yeah, so they love their closed

52:29

garden. It obviously gives them a competitive

52:32

advantage. And, you know, normally I

52:34

would be like, hey, they're allowed to

52:36

do that. But when you cut off

52:38

all competition from being able to compete

52:41

with you at all, and you have

52:43

a monopoly or duopoly, then, you know,

52:45

that's where governments come in to break

52:47

that kind of stuff up. And I

52:49

think that's kind of what's happening here,

52:52

is they're being forced to open that

52:54

garden because they are so dominant, and

52:56

they're dominant because they made a lot

52:58

of great decisions with hardware, let's be

53:00

honest. Like, they, you know, so

53:02

many people to this day will sit

53:05

there and argue with me that

53:07

people don't care about privacy but

53:09

you know who cared about privacy

53:11

enough to actually trumpet it? And

53:13

I'm not saying they actually, you know,

53:15

are the best in privacy, because they're

53:18

far from it, but they spent at least

53:20

hundreds of millions, if not a

53:22

billion dollars, on ads in newspapers,

53:24

billboards, TV advertising, talking about the

53:26

fact that we're more private than

53:29

Android. And so if you think

53:31

people don't care about privacy, I

53:33

would say yes they care about

53:35

convenience more, but privacy is up

53:37

there and Apple's marketers knew it and they

53:39

capitalized on it. And if you ask anyone

53:42

on the street which one's more secure. they're

53:44

going to tell you Apple, even if they're

53:46

not a big technology person, because Apple spent

53:48

a ton of money getting that message out.

53:51

So it doesn't. And it's also not even

53:53

about, like, you know, there's the argument,

53:55

some people would say, the terrible take

53:57

of, like, "I have nothing to hide," blah, blah.

54:00

There's people like

54:02

that who would still choose an Apple device

54:04

because they get the

54:06

privacy on top. Like, they

54:08

would still want privacy if

54:10

the convenience is not lost.

54:12

It's more of, like, the fact

54:14

that if you're saying that

54:16

I have to do all this work

54:18

to get privacy, then never mind. But

54:20

as long as you get both

54:22

sides, they're gonna always choose

54:24

the lesser evil, which

54:27

is having at least some privacy. Well,

54:29

Jill, that's not the only fruit in

54:31

our show. We got more fruit to

54:33

talk about, don't we? We sure

54:36

do. And speaking of apples, have

54:38

you ever wondered what would happen

54:40

if an apple became sentient?

54:43

Like a T-1000. I'm not eating

54:45

one. With AI. Every time I'm eating

54:47

one, I'm like, what if this

54:49

apple was sentient and knew I

54:51

was eating it? And then am I a

54:53

cannibal? It's not the same

54:55

species that's alive. You're not

54:57

an apple, so it's

54:59

fine. Oh, no. You know, yeah, an apple's

55:02

alive, just like a carrot.

55:04

You wouldn't be vegan at

55:06

that point, but you know, like,

55:08

uh, so. Would

55:10

that count? But I'm not,

55:13

I'm not vegan.

55:15

Would that count as, like,

55:17

not vegan or not vegetarian?

55:19

Maybe it doesn't? You said earlier

55:21

you've never thought about: What

55:23

if an apple became sentient?

55:26

Have you really ever thought

55:28

about that? I can happily

55:30

and gladly tell you no. I've

55:32

never- That is shocking. What

55:35

about bananas? Yeah? Well, naturally,

55:37

because, you know, Peanut Butter

55:39

Jelly Time. Peanut Butter Jelly

55:42

Time. Okay. All right, good.

55:44

All right, Jill, continue. Well,

55:46

if apples became sentient, what

55:48

would they have? Would apples

55:51

get along with bananas?

55:53

A form of locomotion?

55:55

Well, you may not get

55:58

all those answers to

56:00

your burning questions, but you

56:02

can at least play this game

56:05

and experience what it's like

56:07

to be both Apple and

56:10

Man. So our game this

56:12

week is Apple Man. The

56:14

game on steam describes itself

56:17

like this. Apple Man is

56:19

no ordinary fruit. Due to

56:21

a mysterious force beyond anyone's

56:24

comprehension, he has been blessed

56:26

with two unbreakable legs. Yes,

56:28

thus the locomotion joke earlier.

56:31

He's been blessed with two

56:33

unbreakable legs and a myriad

56:35

of special abilities. With his

56:38

newfound power, he is absolutely

56:40

determined to accomplish his lifelong

56:42

dream of reaching to the

56:45

sky. His lifelong dream of

56:47

like two days. So Appleman

56:49

is actually a really fun

56:51

romp of a 2D physics

56:54

driven platformer and it's only

56:56

$4.99 on Steam. So do

56:58

give it a try. It's

57:00

really a fun game. A

57:02

mysterious force beyond anyone's comprehension.

57:05

Is it a man that

57:07

became an apple because

57:09

it was

57:11

bitten by a radioactive apple?

57:14

Or is it an apple

57:16

that became a man because

57:18

it was bitten by a

57:20

radioactive man? Pay the $4.99 and

57:23

find out, Michael, you know? It's

57:25

an apple with fra-gee-lay legs.

57:27

Yeah, fra-gee-lay. Fra-gee-lay, Jill,

57:29

your dad jokes need help.

57:31

Like they are... Hey! They're so...

57:34

Like, Michael, that's from A

57:36

Christmas Story. Why are you

57:38

being so cold? You're basically

57:40

acting like a freezer right

57:42

now. This is unbelievable. So

57:44

look, I know a lot

57:47

of you out there have

57:49

the same burning questions we

57:51

do about apples becoming

57:53

sentient. And so $4.99

57:56

is a small price to pay. You put it in

57:58

the fridge. Yeah. Oh my gosh. Can you turn off his

58:00

mic remotely? Is that like a thing?

58:02

You know, luckily it's powered by a

58:05

mixer so only I have control. If

58:07

I had the ability to have a sentient apple

58:09

I would send it over there to turn

58:11

off your mic as well. I would

58:13

be very terrified and let it do it.

58:16

Yeah, you'd be scared of a little

58:18

Apple man. Really? Unbreakable legs? What if

58:20

it tried to kick me? Yeah, that's

58:23

true. It would try

58:25

to kick you. But

58:27

its head's not unbreakable.

58:30

It's just its legs that are

58:33

unbreakable. You just

58:35

smash. Oh, that's true.

58:37

That's true. Okay, fair

58:39

enough. Yeah, yeah. Make

58:41

some applesauce. As you

58:43

all know, I have

58:45

a lot of nicknames

58:47

for Michael, most of which

58:49

I can't say on here.

58:51

Ryan, we're all on this show,

58:54

so we're all technically eggheads. So

58:56

it's funny because

58:58

you might think that Ryan

59:00

saying this is, like, oh,

59:02

it's just a joke and

59:04

he never actually called me that.

59:06

He has. Yes. Yes. All right, so

59:08

it also happens to be our

59:11

software spotlight this week: Egghead.

59:13

Egghead is a trivia app, Michael,

59:15

that lets you learn and have

59:18

fun at the same time. Well, I

59:20

mean, obviously I would play this

59:22

and I would get everything right

59:24

automatically. Yeah, because you are an

59:26

egghead. It has tons of categories

59:28

like history, science, geography, sports, art,

59:30

celebrities, sentient apples. Celebrities? Guaranteed,

59:32

guaranteed, guaranteed, I get stuff wrong.

59:34

Yeah, I don't think I would

59:36

do very well in that category

59:38

either. It has the ability to download

59:40

quizzes, save downloaded quizzes, delete saved

59:42

quizzes, and switch to dark, light, or

59:44

system mode. And to me, if

59:46

it doesn't have that... Having the

59:48

dark mode in there. Oh, it's cool.

59:50

You know, it's just, I think

59:53

we need to, now, this is

59:55

a perfect app that we can

59:57

do a live stream challenge

59:59

with. Like, who can get the most

1:00:01

answers of this quiz? And I have

1:00:04

AI open because I'm not getting it.

1:00:06

You cannot. Yeah. I don't think it

1:00:08

will help you anyway because it's probably

1:00:10

timed and if it isn't I'll add a

1:00:13

timer anyway. Ryan you could put it on

1:00:15

easy mode and that would that could

1:00:17

help a lot. Yeah. No, really, I'm

1:00:19

on super expert hardcore awesome mode. I

1:00:22

mean, even things like who was the

1:00:24

first US president, don't ask me stuff

1:00:26

like that. No clue. Like, Bob's your

1:00:28

uncle. Yes. There you go. And

1:00:31

it's another, that's just, if

1:00:33

you didn't know, that's

1:00:35

what George Washington's nickname

1:00:37

was, Bob, and he was

1:00:40

somebody's uncle. Yes. We're so,

1:00:42

we're so bad with this type

1:00:44

of data, it's unbelievable. All right.

1:00:47

Well, at least it is right

1:00:49

now. The point is that it's

1:00:51

far better to spend your

1:00:54

time with an app like

1:00:56

this where you're learning something

1:00:58

rather than doom-scrolling posts on

1:01:01

social media. That's true. Fill

1:01:03

your brain instead of killing

1:01:05

it with politics. That's how

1:01:07

I feel anyways. Fill your

1:01:10

brain. Mm-hmm. Jill, give us a tip

1:01:12

of the week. Which I've got. Oh, cool.

1:01:14

So last week we talked about

1:01:16

the awesome Nmap, the network mapper.

1:01:19

So this week, check

1:01:21

out a tool called

1:01:24

Lynis. And it's not

1:01:26

Linus, as in L-I-N-U-S;

1:01:28

it's L-Y-N-I-S. Lynis is

1:01:30

a powerful command line

1:01:32

tool for auditing, system

1:01:34

hardening, and compliance testing.

1:01:36

And it works on

1:01:38

systems running Linux, macOS,

1:01:40

or a Unix-based operating

1:01:42

system like BSD. It

1:01:44

performs an extensive health

1:01:46

scan of your systems

1:01:49

to support system hardening and

1:01:51

compliance testing. And the Lynis

1:01:53

project is actually open source

1:01:56

software with a GPL license.

1:01:58

Something we love here. And it

1:02:00

has been available since 2007.

1:02:02

And there's so many use cases

1:02:05

for Lynis. Developers can use

1:02:07

Lynis to test Docker images

1:02:10

or improve the hardening of

1:02:12

their deployed web application. System

1:02:14

administrators can use Lynis to

1:02:17

run daily health scans to

1:02:19

discover new weaknesses. IT auditors

1:02:21

can use it to show

1:02:23

colleagues or clients what can

1:02:25

be done to improve security,

1:02:27

and penetration testers can use

1:02:29

it to discover security weaknesses

1:02:32

on systems of their clients that

1:02:34

may eventually result in system

1:02:36

compromise. This is actually a really

1:02:39

cool tool that you can have fun with,

1:02:41

even if you're none of the above. So,

1:02:43

you know, make sure to check this one

1:02:45

out and see how you can harden your

1:02:47

system. And it's conveniently available

1:02:50

to install in your

1:02:52

distro's software repository.
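
(To make the tip concrete, here is a minimal sketch, not from the show, of driving a Lynis scan from Python and pulling out the hardening index it scores you with. It assumes Lynis is installed from your distro's repository, that the script is run as root, and that the report lands in the default /var/log/lynis-report.dat location.)

    import subprocess

    REPORT = "/var/log/lynis-report.dat"  # Lynis's default report file

    # "lynis audit system" is the standard full system scan; --quick
    # skips the interactive pauses so it can run unattended.
    subprocess.run(["lynis", "audit", "system", "--quick"], check=True)

    # The report is plain key=value text, one entry per line.
    with open(REPORT) as fh:
        for line in fh:
            if line.startswith("hardening_index="):
                print("Hardening index:", line.split("=", 1)[1].strip())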

1:02:54

You know what's really cool about

1:02:56

this, Jill, is I didn't know

1:02:58

this existed. In my cybersecurity

1:03:00

course that I'm taking in college

1:03:02

for cybersecurity, this came

1:03:04

up. This was one of the

1:03:06

tools that I had to utilize

1:03:08

in a virtual machine when looking

1:03:10

for compromise. Ryan, this tool has

1:03:12

been around since 2007. And you have

1:03:14

not heard of it? Let me tell you when I

1:03:17

heard of it. I

1:03:19

heard of it when we covered it

1:03:21

on this show. Just now? Just

1:03:23

now. I've known about it because

1:03:25

of my students, you know, that

1:03:28

were going into penetration testing and

1:03:30

whatnot. So it's really good, like really

1:03:32

good. So when you run this tool,

1:03:34

it gives you a report on all

1:03:37

the elements of your system. It's looking

1:03:39

at user groups, it's looking at the

1:03:41

file system setups, it's looking for known

1:03:43

vulnerabilities, it's going to tell you things

1:03:46

that you need to patch, that are

1:03:48

unpatched, it's going to go through your

1:03:50

whole system and do a hardening test

1:03:52

on it. When you talk about hardening,

1:03:54

like people who are raving into IT

1:03:57

and things, a lot of times if

1:03:59

you're traveling, internationally, your company will require

1:04:01

you to get your laptop hardened.

1:04:03

And one of the things that they

1:04:06

do, of course, there's multiple steps

1:04:08

to that, one of the things is

1:04:10

making sure that all the accounts are

1:04:12

locked down, that your BIOS, for instance,

1:04:15

is locked, so people can't get

1:04:17

into that, all kinds of different things.

1:04:19

But running this type of hardening test

1:04:21

is just one of the many methods

1:04:24

you could see where there might be

1:04:26

a possible vulnerability on your system.
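
(A hypothetical follow-up sketch along the same lines: triaging the warning[] and suggestion[] entries Lynis leaves in its report into a short patch-this list, instead of rereading the full console output. The pipe-separated field layout, test ID first and then a description, is an assumption about the report format, so adjust it to whatever your version actually writes.)

    from collections import defaultdict

    findings = defaultdict(list)
    with open("/var/log/lynis-report.dat") as fh:
        for line in fh:
            key, _, value = line.rstrip("\n").partition("=")
            if key in ("warning[]", "suggestion[]"):
                # Entries look roughly like: TEST-ID|description|extra|...
                test_id, _, rest = value.partition("|")
                findings[key].append((test_id, rest.split("|")[0]))

    for kind, items in findings.items():
        print(f"{kind} count: {len(items)}")
        for test_id, desc in items:
            print(f"  {test_id}: {desc}")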

1:04:28

Really play with some of these

1:04:30

tools, because with AI being as

1:04:32

powerful as it is, you're going to

1:04:34

need them. Trust me, you're gonna

1:04:36

need tools like this because AI

1:04:39

is going to rock our world

1:04:41

when it comes to security issues

1:04:43

and vulnerabilities and everything else because,

1:04:45

you know, it just allows people

1:04:48

to quickly find and create zero

1:04:50

day vulnerabilities on top of everything

1:04:52

else. Yeah, it's probably gonna rock

1:04:55

your world. It might pebble your world

1:04:57

too. Man, Michael, no more dad

1:04:59

jokes for the rest of the

1:05:01

show, okay? Oh, good thing. We're

1:05:03

almost done with the show. We

1:05:05

are done with the show. A

1:05:07

big thank you to each and

1:05:09

every one of you for supporting

1:05:11

us by watching or listening to

1:05:13

Destination Linux. However, you do it.

1:05:15

We love your faces. You can

1:05:17

come join us on our Discord at tuxdigital.com/discord.

1:05:19

Stop. I know some of you

1:05:21

have heard this a hundred thousand

1:05:23

times. Stop playing the episode right

1:05:25

now. No, you listen. Yeah. Okay. I

1:05:27

have an update related to

1:05:29

Pebble, so you better listen.

1:05:32

And that update is that

1:05:34

apparently Fitbit didn't kill Pebble.

1:05:37

Pebble itself just collapsed

1:05:39

and went into insolvency, and

1:05:41

they filed for that, and

1:05:43

then Fitbit acquired it because

1:05:45

of that, and they got

1:05:47

other assets. Sorry for being

1:05:49

a jerk earlier, Fitbit and

1:05:51

Google. My bad. Yeah. So thanks

1:05:54

to Michael A. As in me,

1:05:56

I looked it up. And... That's

1:05:58

the information. So we now know

1:06:00

the information, and also

1:06:02

there's a comment from the

1:06:04

person who founded Pebble,

1:06:06

saying thanks to

1:06:09

Google for doing this, and I'm

1:06:11

like, well, you could have just

1:06:14

done it yourself before the

1:06:16

company collapsed anyway. But that's good,

1:06:18

so now we know Google,

1:06:20

at least in this particular,

1:06:23

this one avenue, is doing

1:06:25

a good thing. Yeah, good job Google,

1:06:27

good job Google. And if you

1:06:29

want to give a mini-clap to us,

1:06:32

you can support the show by becoming

1:06:34

a patron and going to tuxdigital.com/membership. And

1:06:36

you can get a bunch of perks

1:06:39

too. We'll give you extra, we'll give

1:06:41

you mini-claps back in the form

1:06:43

of being able to watch the show

1:06:46

live, get unedited episodes of the show,

1:06:48

get merch discounts, and you get special

1:06:50

access to the patron-only post show that

1:06:53

happens every week after the show, and

1:06:55

also the patron-only sections of our Discord

1:06:57

server and so much more. So go

1:07:00

to tuxdigital.com/membership. And

1:07:02

if you'd like to get another

1:07:04

special clap, you can go

1:07:06

to tuxdigital.com/store and get all of

1:07:09

the awesome... Here we go. I held it

1:07:11

all up early this time Michael. Yes

1:07:13

you did. And just to note, most

1:07:15

of the time when Ryan holds up

1:07:17

stuff, it's not in the store. And,

1:07:19

uh, but you can, there's a lot

1:07:21

of stuff that's in the store, like

1:07:23

the shirt that Jill is wearing and

1:07:25

the shirt that I'm wearing and nothing

1:07:28

that Ryan ever is wearing. And it's

1:07:30

tuxdigital.com/store. You get a bunch of cool

1:07:32

stuff like shirts, mugs, hoodies, and hats

1:07:34

and so much more. tuxdigital.com/store. Nips

1:07:36

for your stylus? No, not really.

1:07:39

No. They're called, they're called nibs.

1:07:41

They're called nibs. They're called nibs.

1:07:43

I never knew. I never knew. I never knew. Yeah.

1:07:45

Like I was like, what is, I don't know

1:07:47

what this is called, but okay, the thing at

1:07:50

the end of a stylus, nibs apparently. And make

1:07:52

sure to check out all the amazing

1:07:54

shows here on TuxDigital. That's right, we

1:07:56

have an entire network of shows to

1:07:58

fill your whole week with geek goodness.

1:08:00

Head to tuxdigital.com

1:08:02

to keep those Linux

1:08:05

penguins marching. Everybody

1:08:07

have a wonderful week

1:08:09

and remember that the

1:08:11

journey itself is just

1:08:13

as important as the

1:08:15

destination. Right, people. We'll see

1:08:18

you next week except for

1:08:20

Ryan. I won't see you.

1:08:22

Spazzy spazzy in chat. Spazzy

1:08:25

C says clap clap clap

1:08:27

clap clap clap clap. Did

1:08:29

he say clap clap clap

1:08:32

clap clap clap clap clap

1:08:34

clap? No. He typed it.

1:08:36

I'm sorry. Yes. No voice

1:08:38

there. Come on. So technically

1:08:41

he's saying it in chat

1:08:43

because that's a courtesy

1:08:46

of us recording the

1:08:48

show. Yes. Okay. Absolutely.

1:08:51

Yeah. It was such a

1:08:53

good show, with the fruit theme

1:08:55

that was throughout this show.

1:08:57

Whoever writes this show... There

1:08:59

we go. Please continue,

1:09:02

continue, please. Oh, please. I

1:09:04

mean, you don't have to,

1:09:06

but more, more, please. No,

1:09:09

that was a great job.

1:09:11

I love the theme. It was

1:09:13

a lot of fun. It was

1:09:15

fun. Yeah. Was fun. So yeah,

1:09:17

that's it. Go away now.

1:09:20

Bye. Bye. Bye. We'll see you.

1:09:22

We'll see you next week. Jill

1:09:25

and I will see you next

1:09:27

week.
