Episode Transcript
0:00
Is there a better way? I think
0:02
there is. Every single interface that
0:05
I interact with, every single problem
0:07
space that I'm trying to solve,
0:09
are going to be made easier
0:11
by virtue of this new technology.
0:13
If you were starting from scratch
0:16
today, you probably wouldn't build this
0:18
app-centric world. You can imagine a
0:20
post-phone world. The past 20 years
0:22
of consumer technology have been a
0:25
story of apps, of touch screens,
0:27
and of smartphones. These form factors
0:29
seemingly appeared out of nowhere. They
0:32
may be replaced just as quickly
0:34
as they were ushered in. Perhaps
0:36
by a new AI-enabled stack, a
0:39
new computing experience that is more
0:41
agentic, more adaptive, and more immersive.
0:43
Now in today's episode, a16z's growth
0:46
general partner, David George, discusses this
0:48
future with arguably one of the
0:51
most influential builders of this era.
0:53
That is, Meta CTO Andrew "Boz"
0:55
Bosworth, who has spent
0:57
nearly two decades at the company,
0:59
shaping consumer interaction from the Facebook
1:01
newsfeed all the way through to
1:03
their work on smart glasses and
1:05
AR headsets. Here, Boz explores the
1:07
art of translating emerging technologies into
1:10
real products that people use and
1:12
love. Plus, how breakthroughs in AI
1:14
and hardware could turn the existing
1:16
app model on its head. In
1:18
this world, what new interfaces and
1:20
marketplaces need to be developed? What
1:22
competitive dynamics hold strong and which fall
1:25
by the wayside? For example, will brand
1:27
still be a moat? And if we
1:29
get it right, Boz says the next
1:32
wave of consumer tech won't run on
1:34
taps and swipes, it'll run on intent.
1:36
So is the post-mobile-phone era upon
1:39
us? Listen in to find out. Oh, and
1:41
if you do like this episode, it
1:43
comes straight from our AI revolution series.
1:46
And if you missed previous episodes of
1:48
the series with guests like AMD CEO
1:50
Lisa Su, Anthropic co-founder Dario Amodei
1:53
and the founders behind companies
1:55
like Databricks, Waymo, Figma, and
1:57
more, head on over to a16z.com/AI
1:59
Revolution. As a reminder, the content
2:02
here is for informational purposes only,
2:04
should not be taken as legal,
2:06
business, tax, or investment advice, or
2:09
be used to evaluate any investment
2:11
or security, and is not directed
2:13
at any investors or potential investors
2:16
in any a16z fund. Please note
2:18
that a16z and its affiliates may
2:20
also maintain investments in the companies
2:22
discussed in this podcast. For more
2:24
details, including a link to our
2:27
investments, please see a16z.com/disclosures.
2:33
Boz, thanks for being here. Thanks
2:35
for having me. Appreciate it. Okay,
2:37
I want to jump right in. How are we
2:39
all going to be consuming content
2:41
five years from now and ten years
2:43
from now? Ten years out, I feel pretty confident
2:46
that we will have a lot more ways
2:48
to bring content into our viewshed than just
2:50
taking out our phone. I think augmented
2:52
reality glasses, obviously, are a real possibility. I'm
2:55
also hoping that we can do better for
2:57
really engaging, immersive things. Right now you
2:59
have to travel to, like, the Sphere, which
3:01
is great. Sure, we can go and pay
3:15
a lot for tickets. Is there a better way?
3:17
I think there is. So 10 years out, I feel really good about
3:19
all these alternative content delivery vehicles. Five
3:21
years is trickier. For example, I think the
3:24
glasses, the smart glasses, the AI glasses, the
3:26
display glasses that we'll have in five years
3:28
will be good. Some of them will be
3:30
super high-end and pretty exceptional. Some of them
3:33
will be like actually little and like not
3:35
even tremendously high resolution displays, but they will
3:37
be like always available and on your face.
3:40
I wouldn't be doing work there. But like
3:42
if I'm just trying to grab simple
3:44
content in moments between, it's pretty good
3:46
for that. So I think what we
3:48
are seeing is, as you'd expect, we're at
3:50
the very beginning now of a spectrum
3:52
of super high-end, but probably very expensive
3:54
experiences that will not be evenly distributed
3:56
across the population. A much more broadly
3:58
available set of experiences that are not
4:00
really rich enough to replace like
4:02
the devices that we have today.
4:04
And then hopefully a continually growing
4:07
number of people who are having
4:09
experiences that really could not be had
4:11
any other way today. You're thinking about
4:13
what you could do with mixed reality
4:15
and virtual reality. Yeah, we're going to
4:17
build up to a lot of that
4:19
stuff. So throughout your career, I would
4:21
say one of the observations I would
4:23
have is you've been uniquely good at
4:25
piecing together various big technology shifts into...
4:27
new product experiences. So in the
4:29
case of Facebook early days for
4:32
you, obviously you famously were part
4:34
of the team that created the
4:36
news feed, and that's a combination
4:38
of social media, a mobile experience,
4:40
and applying, like, old-school
4:42
AI to create it. Exactly.
4:48
But that's pretty cool. And like
4:50
a lot of times these trends,
4:52
they come in bunches. And that's
4:54
what creates the breakthrough products. So
4:56
maybe take that. If there was a
4:58
thing that, not me specifically, but I think
5:00
me and my cohorts at Meta were really
5:03
good at, was like, we really immersed ourselves in
5:05
what the problem was. What were people trying
5:07
to do? What did they want to do? And
5:09
when you do that, you are going to reach
5:11
for whatever tool is available to accomplish that
5:13
goal. That allows you to be really honest
5:16
about what tools are available and see trends.
5:18
I think the more oriented you
5:20
are towards the technology side, you get
5:22
caught in a wave of technology and
5:24
you don't want to admit when that
5:26
wave is over and you don't want
5:28
to embrace the next wave. And you're
5:31
building technology for technology's sake rather than
5:33
solving a product problem. But if you're
5:35
embracing like what are the issues that
5:37
people are really going through in their
5:39
life and they don't have to be
5:41
profound. I bring that up just because
5:43
I think we're in this interesting moment
5:45
where a lot of people wanted a
5:47
new wave to be coming because it would
5:49
have been advantageous to them. The AI
5:51
revolution that's happening right now really
5:54
feels tangible. These are real problems
5:56
that are being solved and it's
5:58
not solving every problem. It creates
6:00
new problems; that's fine. So it feels like
6:02
a substantial, real, new capability that we have.
6:04
And what's unusual about it is how broad-based
6:07
it can be applied. And while it has
6:09
these interesting downsides today on factuality and certainly
6:11
compute and cost and inference, those types of
6:13
tradeoffs feel really solvable and the domains that
6:15
it applies to are really broad. And
6:17
that's pretty unusual, certainly in my career.
6:20
You know, almost always when these technological breakthroughs
6:22
happen, they're almost always very domain specific. It's
6:24
like, cool, like, this is going to get
6:26
faster, or that's going to get cheaper, or
6:28
that's now possible. This kind of feels like,
6:31
oh, everything's going to get better. Every single
6:33
interface that I interact with, every single problem space
6:35
that I'm trying to solve, are going to
6:37
that I'm trying to solve, are going to
6:39
be made easier by virtue of this new
6:42
technology. That's pretty rare. Then there's the revolution in computing interfaces.
6:44
We really started to feel 10 years
6:46
ago, like the mobile phone form factor, as
6:48
amazing as it was, this is 2015, was
6:50
like already saturated. That was what it was
6:52
going to be. And once you get past
6:55
the mobile phone, which is again, the greatest
6:57
computing device that any of us have ever
6:59
used to this point, of course, it's like,
7:01
okay, well, it has to be more natural
7:03
in terms of how you're getting information into
7:06
your body, which obviously we usually do
7:08
through our eyes and ears, and how we're
7:10
getting our intentions expressed back to the machine.
7:12
You no longer have a touchscreen, you no
7:14
longer have a keyboard. So once you realize
7:17
those are the problems, it's like, cool, we
7:19
need to be on the face, because you
7:21
need to have access to eyes and ears
7:23
to bring information from the machine to the
7:25
person. And you need to have these neural
7:28
interfaces to try to allow the person to
7:30
manipulate the machine and express their intentions to
7:32
it when they don't have a keyboard or
7:34
mouse or a touch screen. And so that
7:36
has been an incredibly clear-eyed vision we've been
7:38
on for the last 10 years. But we
7:41
really did grow up an entire generation
7:43
of engineers for whom the system was fixed.
7:45
The application model was fixed. The like interaction
7:47
design. Sure, we went from a mouse to
7:49
a touch screen, but like it's still direct
7:52
manipulation interface, which is literally the same thing
7:54
that was pioneered in the 1960s. So like
7:56
we really haven't changed these modalities. And there's
7:58
a cost to changing those modalities because we
8:00
as a society have learned how to manipulate
8:03
these digital artifacts through these tools. So the
8:05
challenge for us was, okay, you have
8:07
to build this hardware, which has to do
8:09
all these amazing things and also be attractive
8:11
and also be light, and also be affordable.
8:13
And none of these existed before. And what
8:16
I tell my team was like, that's only
8:18
half the problem. The other half of the problem
8:20
is great, how do I use it? Like,
8:22
how do I make it feel natural to
8:24
me? I'm so good with my phone now.
8:27
It's an extension of my body, of my
8:29
intention at this point. How do we make
8:31
it even easier? Yeah. And so we were
8:33
having these challenges. And then what a wonderful
8:35
blessing. AI came in two years ago, much
8:38
sooner than we expected and is a tremendous
8:40
opportunity to make this even easier for us
8:42
because the AIs that we have today have
8:44
a much greater ability to understand what my
8:46
intentions are. I can give a vague reference and
8:49
it's able to, like, work through the corpus
8:51
of information available to make specific
8:53
outcomes happen from it. There's still a lot
8:55
of work to be done to actually adapt
8:57
it and it's still not yet a control
8:59
interface; like, I can't reliably work my machine
9:02
with it. There's a lot of things that
9:04
we have to do. But we know what
9:06
those things are. And so now we're in a
9:08
much more exciting place, actually. Whereas before we
9:10
thought, okay, we've got this big hill to
9:13
climb on the hardware, this big hill
9:15
to climb on the interaction design, but we
9:17
think we can do it. And now we've
9:19
got a wonderful tailwind where on the interaction
9:21
design side, at least, there's the potential of
9:24
having this much more intelligent agent that now
9:26
has not only the ability for you to
9:28
converse with it naturally and get results out
9:30
of it, but also to know by context
9:32
what you're seeing, what you're hearing, what's going
9:34
on around you. Yeah. And make intelligent inference
9:37
based on that information. Let's talk about like
9:39
Reality Labs and the suite of products, what
9:41
it is today. So you have Quest headsets.
9:43
You had the smart glasses, and then on
9:45
the far end of the spectrum is Orion
9:48
and some of the stuff that I demoed.
9:50
So just talk about the evolution of those
9:52
efforts and what you think the markets are
9:54
for them and how they converge versus not
9:56
over time. So when we started the Ray-Ban
9:59
Meta project, they were going to be smart
10:01
glasses, and in fact, they were entirely built,
10:03
and we were six months away from production
10:05
when Llama 3 hit. And the team was
10:07
like, no, we got to do this. And
10:10
so now they're AI glasses, right? Like they
10:12
didn't start as AI glasses, but the form
10:14
factor was already right. We could already do
10:16
the compute. We already had the ability. So
10:18
yeah, now you have these glasses that you
10:20
can ask questions to. And in December, to
10:23
the early access program, we launched what we called
10:25
Live AI. So you can start a Live
10:27
AI session with your Ray-Ban Meta glasses, and
10:29
for 30 minutes until the battery runs out,
10:31
it's seeing what you're seeing.
10:34
The Ray-Ban Meta looks
10:38
like an incremental improvement to Ray-Ban Stories. And
10:40
this is kind of the story I'm trying
10:42
to tell, which is the hardware isn't that
10:45
different between the two, but the interactions that
10:47
we enable with the person using it are
10:49
so much richer now. When you use Orion,
10:51
you use the full AR glasses, you can
10:53
imagine a post-phone world. You're like, oh, wow,
10:55
like if this was attractive enough and light
10:58
enough and had enough battery life to wear
11:00
all day, this would have all the stuff
11:02
I need. Like it would all be right
11:04
here. And when you start to combine that
11:06
with images that we have of what AI
11:09
is capable of. So you did the demo
11:11
where we showed you the breakfast. Yeah, I
11:13
did. And it's, yeah, and for what it's
11:15
worth, I'll explain it because it's very cool.
11:17
I got to walk over and there's a bunch
11:20
of breakfast ingredients laid out, and I look
11:22
at it and I say, hey Meta, what
11:24
are some recipes? Orion had an AI
11:26
component when we first thought about it. It
11:28
had this component that was very direct manipulation.
11:30
So it was very much modeled on
11:33
the app model that we're all familiar
11:35
with, of course. And I think there's a
11:37
version of that. Yeah, of course, you're going
11:39
to want to do calls and you're going
11:41
to do your email and be able to
11:44
do your texting and you want to play
11:46
games. We have to play our Star Gazer
11:48
game and do your Instagram. What we're now
11:50
excited about is, okay, take all those pieces
11:52
and layer on the ability to have an
11:55
interactive assistant that really understands not just what's
11:57
happening on your device and what email is
11:59
coming in. Yeah,
12:01
of course. But also what's happening in the
12:03
physical world around you and is able to
12:06
connect what you need in the moment with
12:08
what's happening. And so these are concepts where
12:10
you're like, wow, what if the entire app
12:12
model is upside down? The device realizes that
12:14
you have a moment between meetings, you're a
12:16
little bit bored. Hey, do you want to
12:19
catch up on the latest highlights from your
12:21
favorite basketball team? Like those things become possible.
12:23
Having said that, the hardware problems are hard
12:25
and they're real and the cost problems are
12:27
hard and they're real. And if you come for
12:30
the king, you best not miss. The phone
12:32
is an incredible centerpiece of our lives today.
12:34
It's how I operate my home. I use
12:36
my car, I use it for work. It's
12:38
everywhere, right? It's everywhere, right.
12:41
And... the world has adapted itself to the
12:43
phone. So it's weird that my ice maker
12:45
has a phone app, but it does. Like,
12:47
I don't know. I'm not sure. Seemed excessive,
12:49
but like, somebody today is like, I gotta
12:51
make an ice maker, number one job, gotta
12:54
have it out. It's like, the smart refrigerator,
12:56
you're like, I don't need this,
12:58
it's like, take it on me. I do
13:00
think the 10-year
13:02
view is harder because, man. Even
13:05
if you knock it out, is it dominant over the phone
13:07
in five years? It just seems so hard.
13:09
I'm thinking it's, like, unthinkable for us, right?
13:11
That's what I said, like, Orion was the
13:16
first time I thought it. Orion, like, put
13:18
it on my head, I was like, okay,
13:20
it could happen. Like, there does exist
13:22
a path for us into that.
13:44
So maybe you get to the
13:46
point where the hardware is capable, it is
13:48
market accessible, but do you tether to the
13:51
phone? Do you take a strong view that
13:53
you will never do that and let the
13:55
product stand? Like how do you think about
13:57
that piece? The phones have this huge advantage
13:59
and disadvantage. Huge advantage, which is like the
14:02
phone is already central to our lives. It's
14:04
already got this huge developer ecosystem. It's the
14:06
anchor device. And it's a wonderful anchor device
14:08
for that. The disadvantage is, I actually think, what
14:10
we found is the apps want to be
14:12
different when they're not controlled via touch screen.
14:15
And that's not super novel. A lot of
14:17
people failed early in mobile, including us, by
14:19
just taking our web stuff and putting on
14:21
the mobile phone, and be like, oh, the
14:23
mobile phone, we'll just put the web there.
14:26
But because it wasn't native to what the
14:28
phone was, and I mean, everything from interaction
14:30
design to the actual design, to the layout,
14:32
to how it felt, like, because we weren't
14:34
doing phone-native things, we were failing with
14:37
one of the most popular products in the
14:39
history of the web. This is like the
14:41
major design debate, like the skeuomorphic idea versus
14:43
the native idea. Yeah, and I think having
14:45
the developers is a true value and I
14:48
think having all this application functionality is a
14:50
true value, but then once you actually project
14:52
it into space and you're manipulating it with
14:54
your fingers like this as opposed to a
14:56
touch screen, you have much less precision, it
14:58
doesn't respond to voice commands because there are no
15:01
tools for that. There was no design integration
15:03
with that. So having a phone platform today
15:05
feels like, wow, I've got this huge base
15:07
to work from on the hardware side, but
15:09
I've also actually got this kind of huge
15:12
anchor to drag on the software side. And
15:14
so we're not opposed to these partnerships. And
15:16
I think it'll be interesting to see once
15:18
the hardware is a little bit more developed
15:20
how partners feel about it. And I hope
15:23
they continue to support people who buy these
15:25
phones for $1,200. The biggest
15:27
question I have is whether the entire app
15:29
model, we were imagining a very phone-like app
15:31
model for these devices, admittedly a very different
15:33
interaction design, input, and control schemes are very
15:36
different, and that demands like a little extra
15:38
developer attention. I am wondering if, like, the
15:40
progression of AI over the next several years
15:42
doesn't turn the app model on its head.
15:44
Right now it's kind of an unusual thing where
15:47
I'm like I want to play music. So
15:49
in my head I translate that to I
15:51
have to go open Spotify or open Tidal
15:53
and the first thing I think of is
15:55
who is my provider going to be? Yeah
15:58
of course. As opposed to, that's not what
16:00
I want. That's extremely limiting. What I want
16:02
is to play music. Yes. And I just
16:04
want to be like go to the AI
16:06
and like cool play this music for me.
16:09
Yeah. And it should know. Oh like you're
16:11
already using this service. We'll use this service.
16:13
Or this one has the song, or this one has lower latency, whatever the
16:15
thing is. Or it's like, hey, the song
16:17
you want isn't available on any of these
16:19
services Do you want to sign up for
16:22
this other service that does have the song
16:24
that you want? I don't want to have
16:26
to be responsible for orchestrating like what app
16:28
I'm opening to do a thing. We've had
16:30
to do that because that's how things were
16:33
done in the entire history of digital computing
16:35
you have an application-based model; that was
16:37
the system. So I do wonder how much
16:39
AI inverts things. That's a pretty hot take.
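The inversion Boz describes, where the user expresses an intent ("play music") and the assistant chooses the provider, can be sketched as a toy router. Everything here is hypothetical: the service names, the capability flags, and the selection rule are illustrative assumptions, not any real Meta design.

```python
# Toy sketch of intent-first routing: the user states a goal and the
# assistant picks a provider, instead of the user opening an app first.
# All provider names and rules below are invented for illustration.

from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    capabilities: set       # intents this provider can fulfill
    user_subscribed: bool   # does the user already have a relationship?

def route_intent(intent: str, providers: list) -> str:
    """Prefer a provider the user already uses; surface choice only when needed."""
    capable = [p for p in providers if intent in p.capabilities]
    if not capable:
        # The "dead end" case: no provider can satisfy this intent.
        return f"dead end: no provider can handle '{intent}'"
    subscribed = [p for p in capable if p.user_subscribed]
    if subscribed:
        return f"fulfilling '{intent}' via {subscribed[0].name}"
    # No existing relationship: only now does the user face a choice.
    return f"'{intent}' needs a new service: try {capable[0].name}"

providers = [
    Provider("MusicServiceA", {"play_music"}, user_subscribed=True),
    Provider("MusicServiceB", {"play_music", "podcasts"}, user_subscribed=False),
]

print(route_intent("play_music", providers))    # → fulfilling 'play_music' via MusicServiceA
print(route_intent("book_plumber", providers))  # → a dead-end message
```

Note how the brand question only surfaces when no subscribed provider can fulfill the intent, which is exactly the brand-abstraction effect discussed later in the conversation.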
16:41
Yeah, it's a hot take. And
16:44
that's not about wearables. That's not about anything.
16:46
That's just like, even at the phone level,
16:48
if you were building a phone today, would
16:50
you build an app store the way you
16:52
historically built an app store? Or would you
16:54
say like, hey, you as a consumer, express
16:57
your intention, express what you're trying to accomplish?
16:59
And let's like see what we have. you
17:01
probably wouldn't build this, like, app-centric world where
17:03
I, as a consumer, I'm trying to solve
17:05
a problem and first have to decide which
17:08
of the providers I'm going to use to
17:10
solve that problem. Yeah, of course. That's fascinating.
17:12
And again, I think it's a function of
17:14
where the capabilities are today and I think
17:16
where we have line of sight into orchestration
17:19
capabilities. And then, of course, you've got to
17:21
build the developer ecosystem to develop on the
17:23
platform. Which is incredibly hard. That's the thing
17:25
I want to see. That's the hardest piece.
17:27
Yeah. The stronger we get at agentic reasoning
17:30
and capabilities, the more I can rely on
17:32
my AI to do things in my absence.
17:34
And at first it will be knowledge work,
17:36
of course. That's fine. But once you have
17:38
a flow of consumers coming through here, what
17:40
you're going to find is that they're going
17:43
to have a bunch of dead ends. Yeah.
17:45
Where they're going to ask the AI, hey,
17:47
can you do this thing for me? And
17:49
it's going to say, no, I can't. That's
17:51
the gold mine that you take to developers.
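The "gold mine" Boz describes is essentially an aggregation problem: log every intent the assistant could not fulfill, then rank them by frequency to show developers the demand. A minimal sketch, with made-up intent strings (the function names and data are assumptions for illustration only):

```python
# Sketch of mining unfulfilled requests: each time the assistant says
# "no, I can't", record the intent; the ranked counts are what you'd
# take to developers. Intent strings are invented for illustration.

from collections import Counter

unfulfilled_log = []

def record_dead_end(intent: str) -> None:
    """Log an intent the assistant could not fulfill."""
    unfulfilled_log.append(intent)

def top_opportunities(n: int = 3):
    """Rank dead-end intents by frequency -- the developer 'gold mine'."""
    return Counter(unfulfilled_log).most_common(n)

# Simulated day of failed requests:
for intent in ["book_plumber", "book_plumber", "renew_passport",
               "book_plumber", "renew_passport", "tune_piano"]:
    record_dead_end(intent)

print(top_opportunities())
# → [('book_plumber', 3), ('renew_passport', 2), ('tune_piano', 1)]
```

The point of the sketch is that the marketplace signal comes for free from usage: the most frequent failures are the verticals worth building first, mirroring the query-volume argument made later about search.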
17:54
And you're like, hey, I've got 100,000 people
17:56
a day trying to use your app. Look, here's
17:58
what's coming through.
18:13
Right. I'd go back
18:15
and say, hey, you gotta pay for this.
18:18
There's a guy who does this for you,
18:20
but you gotta pay for it. Yeah. And
18:22
I, by the way, I'm not just talking
18:24
about apps. I'm like, it's a plumber. There's
18:26
something of a marketplace here that I think
18:29
emerges over time. So that's how I see
18:31
it playing out. I don't see it playing
18:33
out as, like, someone goes
18:35
repeatedly into certain areas because
18:37
that's a type of functionality that is currently
18:40
behind some kind of an app wall and
18:42
hasn't been built
18:44
native to whatever the consumption mechanism is. There's no bridge.
18:46
Yeah. And everyone wants to build the
18:48
bridges, like, no, it's gonna manipulate the pixels,
18:51
it's gonna manipulate it. It's like, fine, it
18:53
can do those things. I'm not saying the
18:55
AI can't cross those boundaries. But I think
18:57
over time that becomes the primary interface for
18:59
humans interacting with software as opposed to the
19:01
like pick from the garden of applications. Yeah,
19:04
that makes a ton of sense. That's a
19:06
very alluring end state, just as a consumer,
19:08
right? Yeah, it's messy. And I think it
19:10
creates these very exciting marketplaces for functionality inside
19:12
the AI. It abstracts away a lot of
19:15
companies' brand names, which I think is going
19:17
to be very hard for an entire generation
19:19
of brands. Yeah. Like the fact that I
19:21
don't care if it's being played on one
19:23
of these two music services. That's hard for
19:26
those music services who like really want me
19:28
to care. Yeah. And like they want me
19:30
to have a stronger opinion about it. And
19:32
like they want me to have an attachment.
19:34
Yeah. I don't want to have an attachment.
19:36
There are some things where you may value
19:39
the attachment. In the world where I'm like,
19:41
here's an app garden, and these two are
19:43
competing for my eyeballs, the brand that they've
19:45
built is the hugely valuable asset. In the
19:47
world where I just care if the song
19:50
gets played and the song gets played and
19:52
sounds good, a different set of priorities are
19:54
important. I think that's net positive because what
19:56
matters now is performance on the job being
19:58
asked. Well, abstracting away brand is effectively
20:01
abstracting away margin pools, which puts a lot
20:03
more pressure on us trusting the AI or
20:05
the distributor of the AI. And insofar
20:07
as I'm floating between different companies that are
20:09
each providing AIs, the degree to which I trust
20:12
them to not be bought and paid for
20:14
in the back end, they're not giving me
20:16
the best experience or the best price for
20:18
money. They're giving the one that gives them
20:20
the most money. Yeah, of course. So yeah,
20:22
it's like people's experience of search today, right?
20:25
It's a very different world. It's a very
20:27
different world. But you can actually see inklings
20:29
of it today, right? So certain companies are
20:31
willing to work with the new AI providers
20:33
in agentic task completion. Yeah. And then they're
20:36
like, well, actually, wait a minute, wait a
20:38
minute. I have this brand relationship directly
20:40
with the demand side. So that's potentially messy,
20:42
but a bright future, especially if we don't
20:44
have to pay that like brand tax. Yeah,
20:47
it'll be very messy. I don't know it's
20:49
avoidable, because I think once consumers start to
20:51
get into these tight loops where more and
20:53
more of their interactions are being moderated by
20:55
an AI, you won't have a choice. That's
20:57
like where your customers will be. Yeah. But
21:00
it's going to be a pretty different world.
21:02
Yeah, it'll be a different world and there
21:04
will probably be some groups that try to
21:06
move fast to it as a way to
21:08
compete with things that are branded. Yeah. And
21:11
just say I'm going to compete on performance
21:13
and price. Yeah, that's right. Where do you
21:15
think that could potentially happen first? It probably
21:17
will mirror query volume, like the web era when Google
21:19
became the dominant search engine. So before that,
21:22
the web era was like very index-based. It
21:24
was like Yahoo and it was like links
21:26
and getting major sources of traffic to link
21:28
to you was the game. And then once
21:30
Google came to dominance, which happened very quickly
21:33
over maybe a couple of years, I feel
21:35
like. All that mattered was like SEO. All
21:37
that mattered was like where you were in
21:39
the query stream. And the query stream dictated
21:41
what businesses came over and succeeded. Yeah. Because
21:43
like the queries that were the most frequent,
21:46
those were the ones that came first. Yeah.
21:48
And so, like, travel is
21:50
the one that came
21:52
right away. Right. Like it is a huge
21:54
disruption and travel agents went from a thing
21:57
that existed to a thing that didn't exist
21:59
in a relatively short time. And they all competed
22:01
on the basis of like execution of the
22:03
best deal. It was literally like seamless fashion
22:05
with the highest conversion. I think SEO has
22:08
gotten to a point now. Where it's kind
22:10
of a bummer, it's like made things worse.
22:12
Like everyone's got so good. This is like
22:14
game. Everyone's gotten so good at it. Especially
22:16
with AI, actually. That's right. So I actually
22:18
think it's like we had this incredible flattening
22:21
curve. Now it's like starting to kind of
22:23
rise up in terms of... Especially with paid
22:25
placement too. Yeah, that's so dominant. Yeah, that's
22:27
right. And this is like probably the cautionary
22:29
tale for how this plays out in AIs
22:32
as well. Because those are the queries that
22:34
are, that's the volume of people unsatisfied with
22:36
the existing solutions that they have. Yeah. Otherwise
22:38
they wouldn't be asking about it. And product
22:40
providers and developers will follow that and build
22:43
specifically to solve those problems. That's right. Once
22:45
it tips in each vertical, we get a
22:47
lot of progress very quickly. Yeah. Towards better
22:49
solutions for consumers. And then once it's a
22:51
steady state, it starts to be gamesmanship. Yeah,
22:54
and that's something to fight. And that's decaying
22:56
or... That'll be the true test of it.
22:58
The true test. Can it get through that?
23:00
Can it avoid falling into that? Can it
23:02
avoid that trap? Yeah, yeah, that's right. Well,
23:04
a lot of that is business model driven
23:07
and we'll see how that evolves over time
23:09
too. That's right. You guys have also been
23:11
leading from the front on this... idea of
23:13
open source. Yeah. And so talk about some
23:15
of your efforts on that side of the
23:18
business and then what is the ideal market
23:20
structure of the AI model side for you
23:22
guys? There's two parts that came together. The
23:24
first one is Llama came out of FAIR,
23:26
our Fundamental AI Research group, and that's been
23:29
an open source research group since the beginning.
23:31
You know, since Yann LeCun came in and
23:33
they established that. It allowed us to attract
23:35
incredible researchers who really believe that we're going
23:37
to make more progress as a society working
23:39
together across boundaries of individual labs than not.
23:42
And to be fair, it's not just us,
23:44
obviously. The transformer paper was published at Google.
23:46
And like, you know, self-supervised learning
23:48
was a big contribution of ours. Like, everyone's contributing to the
23:50
knowledge base. But when we open-sourced Llama,
23:53
that's how all models were open-sourced at that point.
23:55
Like, of course, everyone's open. The
23:57
only thing that was unusual is everything else just
23:59
went closed-source over time, effectively. That's right.
24:01
But before that, every time someone built a
24:04
model, they open-sourced it so that other people
24:06
could use the model and see how great
24:08
that model was. That was, like, mostly how
24:10
it was done, if it was worth anything.
24:12
There's certainly some specialized models for translations and
24:15
whatnot that were kept closed. But like, if it
24:17
was a general model, that was what was
24:19
done. Llama 2 was probably the big decision
24:21
point for us. Llama 2, and this is
24:23
where I think the second thing came
24:25
in: a belief that I've had, that I
24:28
was advancing really continuously internally, that Mark really
24:30
believes in. Which is
24:43
first of all, we're going to make way
24:45
more progress if these models are open. Yeah.
24:47
Because a lot of these contributions aren't going
24:50
to come from these big labs. Like they're
24:52
going to come from these little labs. We've
24:54
seen this already with DeepSeek in China,
24:56
which was put in a tough spot and
24:58
then innovated incredibly in the memory architecture and
25:00
a couple other places to really get amazing
25:03
results. And so we really believe we're going
25:05
to get the most progress collectively. The second
25:07
thing is, and this is a classic:
25:11
you want to commoditize your complements.
25:20
Yes. And we're in a unique position strategically
25:22
where our products are made better through AI,
25:25
which is why we've been investing in it for
25:27
so long. Whether it's recommendation systems in what
25:29
you're seeing in Feed, Reels, whether it's simple
25:31
things like what friend do I put at
25:33
the top when you type you want to
25:36
make a new message? Who do I think
25:38
you're going to message right now? Yeah. All
25:40
the way to really big expansive things like, hey, here's
25:42
an entire answer, here's an entire search interface
25:44
that we couldn't do before in
25:46
WhatsApp. Yeah, yeah, yeah. That like now is
25:49
a super popular surface. Yeah. So there's all
25:51
these things that are possible for us that
25:53
are made better by this AI, but nobody
25:55
else having this AI can then build our
25:57
product. The asymmetry works in our favor. Yeah,
26:00
of course. And so for us, like commoditizing
26:02
your complements is just good business sense and
26:04
it helps a bunch of small startups and
26:06
academic labs. It also helps us. Yeah, so
26:08
we're the application provider. So we're all super
26:11
aligned on business model and industry
26:13
alignment there. So it comes from both this
26:15
fundamental belief in how this kind of research
26:17
should be done and then aligns with the
26:19
business model and so there's no conflict. Yeah,
26:21
societal progress plus business model alignment. It's all
26:24
together. It's all going the same direction. That's
26:26
great. I want to shift gears to talking
26:28
about the impediments to progress and what
26:30
you think is kind of linear versus not.
26:32
So the risks to the vision, to the
26:35
overall vision that you articulated, obviously hardware, AI
26:37
capabilities, vision capabilities, and screens and all that,
26:39
resolutions. We talked about the ecosystem and developers
26:41
and native products. So maybe just talk about
26:43
what you see or kind of the linear
26:46
path things and the things that may be
26:48
harder or riskier. We have real invention risk.
26:50
There exists a risk that the things that we
26:52
want to build, we don't have the capacity
26:54
to build as a society, as a species,
26:57
yet. Yeah, and that's not a guarantee. I
26:59
think we have windows to it. You've seen
27:01
Orion, so like it can be done. Yeah,
27:03
it feels like it's a
27:05
cost-reduction exercise, a materials-improvement exercise, but
27:07
it can be done. There is still some
27:10
invention risk. Far bigger than the invention risk,
27:12
I think, is the adoption risk: is it
27:14
considered socially acceptable? Are people willing to learn
27:16
a new modality? Like we all learned to
27:18
type when we were kids at this point.
27:21
We were born with phones in our hands
27:23
at this point. Are people willing to learn
27:25
a new modality? Is it worth it to
27:27
them? Ecosystem risk. Even bigger than that. Like, great,
27:29
build this thing, but if it just does
27:32
like your email and Reels, that's probably not
27:34
enough, do people bring the suite of software
27:36
that we require to interact with modern human
27:38
society to bear on the device. Those are
27:40
all huge risks. I will say we feel
27:42
pretty good about where we're getting on the
27:45
hardware and on acceptability. We think we can do
27:47
those things. That was not a guarantee before.
27:49
I think with the Ray-Ban Meta glasses, we're
27:51
feeling good about that. Within that, super
27:53
interesting regulatory challenges. Here I have an always
27:56
on machine that gives me superhuman sensing. My
27:58
vision is better. My hearing is better. My
28:00
memory is better. That means when I see
28:02
you a couple years from now, and I
28:04
haven't seen you in the interim, I'm like,
28:07
oh God, I don't remember. We did a
28:09
podcast together. What's the guy's name? Can I
28:11
ask that question? Am I allowed to ask
28:13
that question? What is your right? Your face,
28:15
you showed me your face. But I don't have a
28:28
great memory, so am I allowed to use
28:31
a tool to assist me or not? So
28:33
there's really subtle regulatory, privacy, social, acceptability questions
28:35
that are like embedded here that are super
28:37
deep individually and can derail the whole thing.
28:39
Like you can easily derail the whole thing
28:42
and slow progress. Yeah, that's a thing I
28:44
think we sometimes think in our industry, it's
28:46
like Field of Dreams: if you build it,
28:48
they will come. And it's like, no, a
28:50
lot of things have to happen right. Well,
28:53
you can also get stuck, totally. That's the
28:55
whole thing, that's the risk. The right technology can
28:59
get derailed for long periods of time.
29:01
Nuclear power got derailed.
29:03
Yeah, nuclear power can get derailed
29:06
for long periods of time. And nuclear now
29:08
is looking better than it has been, but
29:25
I think there's still a lot of big
29:28
hurdles across there. I actually think the ecosystem risk
29:30
was one I would have said previously was
29:32
the biggest one, but AI is now my
29:34
potential silver bullet there. If AI becomes the
29:36
major interface, then it comes for free. And
29:39
I will also say that we've had such
29:41
a positive response, even just setting aside
29:43
Orion, even the Ray-Ban Metas, from companies that want
29:45
to work with us on building that platform.
29:47
It's not a platform yet. There's so little
29:49
compute. We just connect
29:52
an app. We literally don't have any space
29:54
yet. But we did do a partnership with
29:56
Be My Eyes, which like helps blind and
29:58
low-vision people navigate. And it's really spectacular.
30:00
And so there's a little window there where
30:03
we can start building. So yeah, the response
30:05
has been more positive than I had expected.
30:07
So everything right now, tailwinds abound. And to
30:09
be honest, after eight years
30:11
of headwinds, having a year
30:14
of tailwinds is nice. Yeah, I'll take it.
30:16
I'll take it. I'm not going to look
30:18
at it in the face. No victory laps.
30:20
Yeah, but that's good. Okay. But it's all
30:22
hard. Yeah, yeah. At every point it could
30:24
all fail. Yeah, like the list starts
30:27
with invention risk. I don't know,
30:29
there's many ways this just won't
30:31
work. But we're
30:38
true believers. It needs
30:51
to happen and it doesn't happen for free.
30:53
We can be the ones to do it.
30:55
Our chief scientist, Michael Abrash, who's one
30:57
of my favorite people I've ever gotten a
31:00
chance to work with, he talks a lot
31:02
about the myth of technological eventualism. It doesn't
31:04
eventually happen. There's a lot of people in
31:06
tech like, yeah, AR eventually happens. That's not
31:08
how it works. It would
31:10
just absolutely not
31:13
happen on its own. You
31:15
have to stop and put the money in and
31:17
do it. Somebody has to stop and like
31:19
do it. And that is the difference. The
31:21
number one thing I say is like the
31:24
difference between us and anybody else is we
31:26
believe in this stuff to our core. This
31:28
is the most important work I'll ever get
31:30
a chance to do. This is Xerox PARC-level
31:32
new stuff where we're rethinking how humans
31:35
are going to interact with computers. It's like
31:37
J.C.R. Licklider and human-computer symbiosis. We're seeing
31:39
that with AI. It's a rare moment. It's
31:41
a rare moment. It doesn't even happen once
31:43
a generation, I think. It may happen every
31:45
other generation, every third generation. You
31:48
don't get a chance
31:50
to do this all
31:52
the time. We're
31:54
not missing it. We're
31:56
going to do it
31:59
and we may fail.
32:01
Like, it's possible. But we will
32:03
not fail for lack
32:05
of effort or belief.
32:07
Great. Thanks a ton, Boz.
32:10
Nice, cheers. Yeah,
32:12
cheers.