Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
Support for the show comes from
0:02
Alex Partners. The market is evolving
0:04
at a breakneck speed, and we're
0:06
only starting to understand how disruptive
0:08
forces such as AI, cyber threats,
0:11
and tariffs will change the game.
0:13
The winners will be those who
0:15
prioritize execution and know when to
0:17
adapt. And for unparalleled insights, they
0:19
can turn to the Alex Partners
0:21
Disruption Index. Stay tuned to hear
0:24
more about it later in the
0:26
show. In the face of disruption,
0:28
Businesses trust Alex Partners to get
0:30
things done when it really matters.
0:32
Read more on the latest
0:35
trends in C-suite insights at
0:37
disruption.alixpartners.com.
0:39
disruption.alixpartners.com. Support for
0:41
Decoder comes from Indeed. When
0:43
you realize that you needed to
0:45
hire people yesterday, there's indeed. With
0:48
Indeed, you can find relevant candidates
0:50
and qualified talent quickly. There's no
0:52
need to wait any longer. Speed
0:55
up your hiring right now with
0:57
Indeed. And listeners of this show
0:59
will get a $75 sponsored job
1:02
credit to get your job's more
1:04
visibility at indeed.com/Decoder. Just go to
1:06
indeed.com/Decoder right now and support our
1:08
show by saying you heard about
1:11
Indeed on this podcast. indeed.com/Decoder.
1:13
Terms and conditions apply. Hiring? Indeed
1:15
is all you need. Support for
1:18
Decoder comes from AR. Have you
1:20
ever wondered what's powering your smartphone
1:22
and other devices we interact with
1:24
daily? Or what lies at the
1:27
heart of life-saving drug discoveries and
1:29
robotic surgeries? The answer is Arm.
1:31
Arm technology is moving the world
1:33
forward, enabling AI to create a
1:35
more meaningful, more connected life for
1:38
everyone, everywhere. Arm believes the future
1:40
isn't about technology. It's about people,
1:42
and the possibilities technology can offer
1:44
us all. The future is built
1:46
on Arm. You can discover more
1:49
at arm.com/discover. Hello
1:53
and welcome to Decoder. I'm Nilay Patel, editor-in-chief
1:55
of The Verge, and Decoder is my show
1:57
about big ideas and other problems. Today
1:59
I'm talking with Almar Latour, who is publisher
2:01
of the Wall Street Journal and CEO
2:03
of its parent company Dow Jones, which
2:05
is most easily thought of as a
2:07
huge research and data provider for other
2:09
companies. And Dow Jones itself is part
2:12
of Rupert Murdoch's News Corp, which we'll
2:14
come back to. Now, Almar is a
2:16
fascinating guy. He started as a news
2:18
assistant at the Journal in the '90s,
2:20
spent time as a tech reporter, and
2:22
eventually rose through the ranks to become
2:24
CEO in 2020, putting him in charge
2:26
of how all of it makes money.
2:28
And if you've been paying attention, you
2:30
know it's a tough time to be
2:32
making money in the news business, especially
2:34
the paid news business. There are the
2:37
usual challenges of competing with social media
2:39
platforms flooded with free content, but also
2:41
new challenges like AI copyright fights, and
2:43
now even the Trump administration, which has
2:45
been pushing hard to shut down critical
2:47
reporting and limit press freedom. Almar has
2:49
insight into all of that. He's made
2:51
deals with AI companies like OpenAI.
2:53
He's suing other AI companies like Perplexity
2:55
for training without permission, and he's pushing
2:57
to build his own AI data products
2:59
for Dow Jones customers. On top of
3:02
that, he is a fierce defender of
3:04
press freedom, who fought to have Wall
3:06
Street Journal reporter Evan Gershkovich released after
3:08
being imprisoned in Russia for over a
3:10
year, while still working within News Corp, whose
3:12
chairman Rupert Murdoch has deep ties to
3:14
Trump and who has overseen a vastly
3:16
more polarized news media. So, Almar and
3:18
I talked about all of that, and
3:20
I really pushed him on a few
3:22
of his answers, especially right at the
3:24
top of the conversation when I asked
3:27
him about the Journal cutting a huge
3:29
chunk of its tech reporting team literally
3:31
the day before we recorded. To his
3:33
credit, Almar was game, and he hung
3:35
in there for all of it, but
3:37
you will hear him literally congratulate me
3:39
for almost getting him to slip up.
3:41
I did my best. There's a lot
3:43
going on in this episode, and I
3:45
think many of you will have thoughts
3:47
on it. So let's just get right
3:49
into it. Almar Latour, CEO of Dow Jones,
3:52
and publisher of the Wall Street Journal.
3:54
Here we go. Almar
4:03
Latour, you are CEO of Dow Jones and
4:05
the publisher of the Wall Street Journal. Welcome
4:07
to Decoder. Great to be here. Thank you.
4:09
I have a lot to talk about with
4:11
you. There's an entire set of, I think,
4:13
complicated AI questions that might be existential for
4:15
the media industry, but you're heavily invested in
4:17
building some of that technology and building some
4:19
of those services, which I think is interesting. There's
4:22
the general state of the press in 2025, which
4:24
I want to talk to you about. I know you're
4:26
very interested in press freedom. Yes. But
4:28
it happens that I have you on the
4:30
day after the news, and so I
4:32
want to start with news. Yes. Just last
4:35
night, the Wall Street Journal, of which
4:37
you are the publisher, restructured how it covers
4:39
tech and media that involves cutting about
4:41
10 or 12, 15 editors and reporters. Obviously,
4:44
I'm personally very interested in how you structure
4:46
a newsroom to cover tech. Why make
4:48
that decision? Why get smaller? This is
4:50
a newsroom decision, so this
4:52
is squarely the terrain of
4:55
Emma Tucker, who is a new
4:57
editor, relatively speaking. She's in
4:59
year two, moving into year
5:01
three. Emma was hired with
5:03
the remit of helping to
5:06
increase engagement with our existing
5:08
readers and to expand our
5:10
readership and to maintain and
5:12
enhance the quality of our
5:14
coverage. She has set out
5:17
over the past two years,
5:19
really, to rethink how she
5:21
wants to offer news with
5:23
the Wall Street Journal newsroom.
5:25
Her consistent message, and this
5:27
is one that I subscribe
5:30
to, is distinctive journalism is
5:32
what makes the difference, knowing
5:34
the interesting story, the story
5:36
behind the story, and to
5:38
have exclusive journalism and exclusive
5:40
insights. That I say is
5:42
a preface because Emma has
5:44
been making changes to nearly
5:47
every part of the Wall
5:49
Street Journal and continues to do
5:51
that. And so what happened
5:53
yesterday was a continuation of
5:55
that. And generally, when you
5:57
look at, and I'm not
5:59
speaking specifically... about the San Francisco Bureau
6:01
about tech, but generally when you look
6:03
at the changes that have been brought
6:06
in, she has brought in new talent
6:08
and she has an antenna for what
6:10
she thinks works there and one of
6:12
the areas where you see that very
6:15
pronounced in recent months is in
6:17
Washington, DC, which has gone through several
6:19
cycles of changes. And so she's brought
6:22
in new people, that has had consequences,
6:24
and the marching orders are slightly different
6:26
where there is a closer connection to
6:28
the center of the newsroom where the
6:31
decisions about news can be made
6:33
in context of a broader story,
6:35
a macro story that's happening around
6:37
the world rather than in isolation
6:39
around a certain topic. That's the
6:41
context for yesterday. I'm not going
6:43
to comment on specific individuals or
6:46
specific plans from Emma. I'll leave
6:48
that to her. I've worked with
6:50
quite a few of those people.
6:53
As you know, I was a
6:55
tech reporter myself as well. But
6:57
overall, what Emma has focused on
6:59
and what the Journal and Dow
7:02
Jones are focusing on is going
7:04
deeper and having more exclusivity, more
7:06
quote-unquote proprietary content. And those moments
7:08
like yesterday are absolutely never easy. I
7:11
think I see a thesis of Dow
7:13
Jones as a company, and we'll come
7:15
to the big picture in a second
7:17
here. Yes, yes. The idea is you're
7:20
going to give a bunch of people
7:22
in the business world an edge, an
7:24
information edge, whether it's with the Dow
7:26
Jones information services or some of the AI
7:28
tools or with the Wall Street Journal,
7:31
which gets a bunch of scoops and
7:33
tells people stuff they didn't know before.
7:35
But I'm just kind of looking at
7:37
Dow Jones broadly. Revenue is at $600
7:40
million, that's up 3%, your earnings
7:42
are up 7% to $74 million,
7:44
but the cuts are in
7:46
tech which is dominating the world
7:48
Yeah. And I'm just
7:51
wondering about that resource allocation
7:53
because that's the role of
7:55
the publisher, I think. Yeah.
7:57
Well, it is, in the sense that
7:59
the newsroom has a budget and
8:02
we support quality journalism and frankly
8:04
we are so successful at this
8:06
moment as a company that all
8:08
of our investments are in enhancing
8:11
the quality of our journalism, enhancing
8:13
quality of our data or analytics
8:15
etc. And so getting better news,
8:18
getting better information. That is the
8:20
mission, and so we are investing
8:22
in that. So I just want
8:25
to correct one simplification that sometimes
8:27
comes to the surface at moments
8:29
like this, and I don't think
8:31
you intended to do that, but
8:34
I think it's important to make
8:36
a distinction. And this is hard
8:38
when you go through what is
8:41
at an individual level super, super,
8:43
super hard, you know, what happened
8:45
yesterday and many times in journalism.
8:47
This shift, like any other shift
8:50
that Emma has gone through, is
8:52
not to eke out more profit
8:54
by having fewer resources. There is
8:57
an overall climate inside our company.
8:59
You see our earnings growing and
9:01
you see our revenue grow and
9:03
our subscription base growing. We will
9:06
invest wherever there's a good business
9:08
case to be made and tech
9:10
and the cross section of tech
9:13
with policy with politics, with global
9:15
trade, with society is one of
9:17
the top priorities for Dow Jones,
9:20
top stories in the world. So
9:22
don't take a snapshot and say,
9:24
okay, we're going to stop there.
9:26
This is a top priority. It'll
9:29
permeate. Tech will permeate everything. Everything
9:31
is permeating everything right now. We
9:33
are looking at Washington or looking
9:36
at the announcements from Macron
9:38
in France. And so don't take
9:40
the snapshot, I guess, is what
9:42
I was headed at, that this
9:45
is a moment in time, talk
9:47
again in a year, and our
9:49
tech coverage should be broader, deeper,
9:52
and probably have a larger following.
9:54
I think one of the questions
9:56
I have, I run a tech
9:58
publication. Yeah. Nominally a tech publication,
10:01
and we're heavily invested
10:03
in covering policy. Yes. One of the
10:06
lines we've always used, a cliché even,
10:08
is "The Verge covers everything," because everything
10:10
is now a tech story. And that
10:12
was a way, I think, 10 years ago, of
10:15
maintaining a broad focus and now it's very
10:17
real. Elon Musk is at the state of
10:19
the Union. You can see the tech giants
10:21
fighting tooth and nail against
10:23
the Digital Services Act in the EU
10:25
and that is now a part of American
10:27
foreign policy. Do you see that as, okay,
10:29
maybe all of the Wall Street Journal
10:31
is about tech in that way. Is
10:34
that getting more expansive for
10:36
you? There is a current of tech
10:38
that runs through every story at the
10:40
Wall Street Journal runs through everything at
10:43
Dow Jones in two ways as a
10:45
story and as technology, right? There's not
10:47
a part of the Wall Street Journal.
10:49
There's not a part of Dow Jones
10:52
where tech does not feature and therefore
10:54
having people cover tech in isolation
10:56
is one way of covering this;
10:59
in addition, tech becomes a core part
11:01
of many other beats and many other
11:03
areas that we cover. Tech is a
11:05
horizontal and it's a vertical at the
11:08
same time. We'll come back to the
11:10
AI deals you've struck. I want
11:12
to talk with them extensively, but
11:14
just in this context, like so
11:16
many publishers, you've struck a deal
11:19
with OpenAI, you've struck deals
11:21
with other AI companies. Are those providing
11:23
enough revenue for you to
11:25
invest against in the newsroom?
11:27
Or are you still in wait and see
11:29
mode with those deals? I don't think
11:31
we should tether our investments in AI
11:34
to any individual AI deal. So that's
11:36
not how I look at it. And
11:38
so it's not like, oh, AI brings
11:40
in this much money and now I
11:42
can invest this much in AI. You
11:44
could, I guess, rationalize it. Or the
11:46
newsroom. Or the AI in the newsroom.
11:48
We have our investment priorities and we're
11:50
following a game plan that we've been
11:53
following for a while and you have
11:55
the meanders every once in a while
11:57
but the goal is pretty clear. We
11:59
intend to grow in three ways.
12:01
One is by going deeper, investing
12:04
in the depth of our content,
12:06
our data, our analytics; by growing
12:08
wider, and that is by adding
12:10
new areas of expertise. So last
12:12
week we announced our agreement to
12:15
acquire Oxford Analytica, so going deep
12:17
in geopolitics. That's an addition. But
12:19
in future investments, once they are
12:21
part of Dow Jones and the
12:23
Wall Street Journal, we will probably invest
12:26
in going deeper into that area
12:28
of geopolitics. And then the third
12:30
way of growing is by connecting
12:32
everything that we have, meaning that
12:35
there should be easier access to
12:37
the data that underlies everything of
12:39
Dow Jones for everything that we
12:41
do at the Wall Street Journal.
12:43
Now as to your question, investing
12:46
in the newsroom, is a goal
12:48
in itself. We are a successful
12:50
leading subscription-driven news organization. And we
12:52
grow by investing in our journalism,
12:55
not by shrinking our journalism. So
12:57
we're not, there's sometimes a temptation
12:59
in media to be in an austerity
13:01
mode and just take away in
13:03
order to eke out a profit.
13:06
That's not us, and that's not
13:08
how we're growing whatsoever. I think
13:10
it's more than a temptation for
13:12
most media businesses right now. That
13:14
is the reality of the situation,
13:17
right? It costs more to make
13:19
the information than most people can
13:21
return on it. Yeah, but if
13:23
your answer to that time and
13:26
time again is, okay, I'll
13:28
cut in order to make ends
13:30
meet, but that's not a strategy.
13:32
You're not addressing something in your
13:34
model. You might not be addressing
13:37
something correctly in the way you're
13:39
organized, maybe where you're focused. So
13:41
that to me was never an
13:43
acceptable method to grow or to
13:46
create great journalism. I understand that
13:48
sometimes as a company in any
13:50
industry you can have your back
13:52
against the wall. You may have to
13:54
cut in order to make ends
13:57
meet, but that's not a strategy.
13:59
Yeah. And so. But people get
14:01
that wrong. I think you're absolutely
14:03
right. That might be sort of
14:06
a prevailing tendency, but I don't
14:08
think it should be. The prevailing focus,
14:10
I think, should be. Where do you
14:12
make a difference in the
14:14
news and information that you offer?
14:16
How do you make that distinctive?
14:18
How do you add value? How
14:20
do you allow people to make
14:22
decisions based on that? How do
14:24
you convince people that they
14:26
should recognize the value of the information
14:28
that you offer. And people in the
14:31
past used to say, oh, you're a
14:33
subscription business, you're the Wall Street Journal, because
14:35
you're all about business, and therefore that
14:37
doesn't apply to anything else. I don't
14:40
think that is true. I think people
14:42
recognize value of a lot of different
14:44
types of information, and doesn't just have
14:46
to be about business. And so I
14:48
think there's a lot of opportunity actually
14:50
in shifting from this austerity mode to
14:53
creation mode and building mode. Easier said
14:55
than done and sometimes you have to
14:57
step away from some things that just
14:59
aren't working. And that recognition of
15:01
value is very challenging. I understand why
15:03
it happens in the business community. I
15:06
even understand what happens for us in
15:08
the tech press because it's often tradable.
15:10
So you can pay a high rate
15:12
to the Wall Street Journal if you
15:14
are a Wall Street trader or an
15:17
investor, some other kind of business professional
15:19
because the information has such clear value to
15:21
you that you can use to trade upon
15:23
in some way, to make a deal or
15:26
make an investment and buy or sell a
15:28
stock. I think for the average consumer, that
15:30
information is not tradable. They can just
15:32
open TikTok and maybe there's some
15:35
influencer reading the Wall Street Journal
15:37
to them for free. That elimination
15:39
of scarcity, I think has been the
15:41
fundamental challenge. It was the challenge we
15:44
went through with social video platforms. It feels
15:46
like the challenge again for AI. The
15:48
AI platforms are going to take
15:50
all of the world's information and now...
15:53
completely eliminate even the scarcity of having
15:55
to click. They're just going to tell you
15:57
what the models have read on the internet. Do
15:59
you perceive that as as existential a
16:01
challenge as some of your peers
16:04
in the media do? Yeah, first,
16:06
I think you're absolutely right in
16:08
that the bar on being distinctive
16:10
by content or news or information
16:13
that you create has gone way
16:15
up. And so you have to
16:17
be more discerning about where you
16:19
focus. You asked about the
16:22
existential threat around AI. That goes
16:24
back to the recognition of value.
16:26
And in first instance, I want
16:28
to make sure that the industry,
16:31
but certainly Dow Jones and the
16:33
Wall Street Journal and all of
16:35
our publications, don't fall into a
16:37
trap that is similar to two
16:40
decades ago when all information had
16:42
to be free and people took
16:44
scraps from search engines, etc. and
16:46
then found out over time that,
16:49
hey, we've effectively lost, that we've
16:51
ceded that market. And so that's
16:53
why we're investing time and resources
16:55
right now into making sure that
16:58
large players in the AI space
17:00
recognize that value and our push
17:02
in first instance is to make
17:04
sure that there is a commercial
17:07
agreement around that and I think
17:09
in many cases that is on
17:11
both sides of that fence the
17:13
preferred outcome. And
17:16
then where we can't reach a
17:18
commercial agreement where there are fundamental
17:20
differences of opinion we are prepared
17:22
in some cases to say, okay,
17:24
then we'll fight it out in
17:27
court. And so we've walked both
17:29
paths with a preference for the
17:31
first. But I'm still answering your
17:33
question as to what's existential here
17:36
and is there an existential threat?
17:38
And you spin it forward and
17:40
we can get to that what
17:42
it means for the user and
17:45
how consumers respond to that. But
17:47
I think first we've got to
17:49
get to the starting blocks, if
17:51
you will, which is, okay, these
17:54
are the companies that are providing
17:56
Gen AI UX and new interactions
17:58
for consumers, have gotten to that
18:00
point by using information. We need
18:03
an acknowledgement that information has value,
18:05
certainly our information. And if
18:07
you want to use that information
18:09
on an ongoing basis, to make
18:12
sure that some of your Gen
18:14
AI-produced content, answers to
18:16
queries is current and is reliable.
18:18
You'll have to pay us for
18:21
access if you value that. And
18:23
so that part we cannot skip
18:25
over. There's a whole other part of
18:27
this where we don't yet exactly
18:29
know how the user is going
18:31
to interact, but we see the
18:34
trend there. But that part, we
18:36
have to get right. And we're
18:38
in the middle of that, or
18:40
maybe we're at the first part
18:42
of that still, yeah. I'm
18:44
going to ask you one
18:46
more, like, very existential philosophical
18:48
question. And I need to
18:51
get back to having you
18:53
explain the company. I need to
18:55
get to the other questions. It
18:57
feels like we might be describing
19:00
a world where regular people are
19:02
awash in a sea of free lies, right, that
19:04
come to them on social media, the
19:06
social media companies are giving
19:08
up on fact checks, various
19:10
billionaires, say whatever they
19:12
want, on podcasts with no pushback.
19:15
And then what you are providing
19:17
is a very expensive source of
19:19
truth. Or hopefully what I am
19:21
providing is an affordable source of
19:24
truth. Yeah, I would say that's
19:26
a big discrepancy, right? Yeah, I
19:28
would say affordable is a
19:30
cup of coffee a day.
19:32
I mean, everyone drinks coffee. Like it's
19:34
not, it's not, I think it's
19:36
a myth that access to reliable
19:38
information is unaffordable. It is
19:40
an individual choice. I realize it's
19:42
a hard choice to make if you
19:44
don't have significant disposable income, and
19:46
that's what you're indicating. But I
19:48
do believe that there is also
19:50
a choice to be made. I
19:53
understand why people would buy the Wall
19:55
Street Journal or some indulgence products,
19:57
which we should come to. But that's the
19:59
bigger picture. It's convincing the next
20:01
consumer that they should pay
20:03
for information as opposed to
20:05
picking a filter bubble on
20:07
social media. Whether or not
20:10
that's AI or if it's
20:12
social media, if it's just
20:14
algorithms or whatever, that seems
20:16
like the challenge the media
20:18
faces. The game is, or
20:20
the challenge is, convince someone
20:22
that it's to their benefit.
20:24
to invest in having access
20:26
to reliable information, whether that's
20:28
for making decisions in the
20:31
realm of investments or technology
20:33
or policy, or whether it's...
20:35
hyper-local, and I actually want
20:37
to understand what's going on
20:39
in my community. Some of
20:41
that I might get from
20:43
AI, but some of that
20:45
I might not, and I
20:47
might want to invest a
20:50
small amount of money to
20:52
understand what's happening in my
20:54
community. And there's some examples
20:56
of that popping up, and
20:58
I think we'll see in
21:00
response to this huge question
21:02
that you're asking. We will
21:04
see innovation in journalism and
21:06
a lot of creativity
21:09
already, and then in years
21:11
to come, where, undeniably, with
21:13
the truth that you just
21:15
presented that there's this flood
21:17
of information of mixed quality, undeniably
21:19
there's also huge demand for
21:21
reliable information. People are craving
21:23
it more than ever before.
21:25
In fact, the more noise
21:27
there is, the more people
21:30
are confused and the more
21:32
they are reaching for, hey,
21:34
tell me what this means.
21:36
We see this in our
21:38
data, right? We see this
21:40
when there are moments
21:42
of friction in society or
21:44
in business or on geopolitics
21:46
or any market, we see
21:49
a spike in people coming
21:51
to us for free, but
21:53
also we see a spike
21:55
in subscriptions. Demand for reliable
21:57
information, I think, has gone
21:59
up as the pool of unreliable information
22:01
has grown or uncertain information. Some
22:04
of it might be reliable, some
22:06
of it might not be. So
22:08
I think you cast that as
22:10
an existential risk. I can also
22:12
cast that as an opportunity. I'd
22:14
like to be aware of the
22:17
existential risk, take the precautions there,
22:19
but mainly focus on the
22:21
opportunity and meeting that demand.
22:24
And I think we haven't
22:26
met that demand by any
22:29
stretch. I think there's a
22:31
huge opportunity for
22:33
us and for
22:36
other publications lying
22:39
ahead to meet that
22:41
demand. And I
22:43
think that demand
22:45
will actually only
22:47
grow. We need to take
22:50
a short break. We'll be right back.
22:52
Hours are long and initially totally unpaid.
22:54
There's no guarantee of success. And yet,
22:56
the Pet Rock guy made it happen.
22:58
But if you run a business, you
23:00
know that growth hinges on harnessing the
23:03
power of a great team. If you
23:05
want to add an essential piece to
23:07
that team, you might want to check
23:09
out Shopify. Shopify is an all-in-one digital
23:11
commerce platform that wants to help your
23:13
business sell better than ever before. It
23:15
doesn't matter if your customers spend their
23:17
time scrolling through their feeds or strolling through
23:19
your physical storefront. Shopify says they
23:22
can help you convert browsers into
23:24
buyers and sell more over time.
23:26
And their shop pay feature can
23:28
boost conversions by 50%. There's a
23:30
reason companies like Mattel and Heinz
23:33
turn to Shopify to sell more
23:35
products to more customers. Businesses that
23:37
sell more sell with Shopify. Want
23:39
to upgrade your business and get
23:42
the same checkout Mattel uses? You
23:44
could sign up for your $1
23:46
per month trial period at shopify.com/Decoder.
23:48
All lowercase. That's shopify.com/decoder
23:51
to upgrade your
23:53
selling today. shopify.com/Decoder.
23:56
Support for Decoder comes
23:58
from Kinsta. Do you ever
24:00
feel paralyzed with all the tasks that need
24:02
to get done? Especially if you're a small
24:04
business owner looking to promote your brand, you
24:06
want to be able to focus on the
24:08
content of your website, not the back-end minutiae
24:10
and security that comes with it. And let's
24:12
be honest, not everyone is tech savvy or
24:14
has the time to learn Java. But keeping
24:16
your website up to date is vital in
24:18
this day and age. So what do you
24:20
do? First, take a sip of water. Second,
24:23
look to get help from
24:25
Kinsta. Kinsta has bundled up all of the
24:27
essentials to make sites stress-free with speeds that
24:29
will wow your visitors, security that never
24:31
sleeps, and a dashboard that's incredibly
24:33
intuitive. And when you hit a snag,
24:35
you'll be able to talk to real
24:37
humans, 24/7, 365. In short, Kinsta is
24:40
perfect for those who want professional results
24:42
without needing a technical background. So if
24:44
you're tired of being your own website
24:47
support team, you could switch your hosting
24:49
to Kinsta and get your first
24:51
month free. And don't worry about
24:53
the move. They'll handle the whole transition
24:55
for you. No tech expertise required. Just
24:58
visit kinsta.com/decoder to get started. That's
25:00
k-i-n-s-t-a.com/decoder. Support for Decoder
25:02
comes from Vanta. Trust
25:04
isn't just earned. It's demanded.
25:06
Whether you're a startup founder navigating
25:08
your first audit or a
25:10
seasoned security professional scaling your
25:13
GRC program, proving your commitment
25:15
to security has never been
25:17
more crucial or more complex.
25:19
That's where Vanta comes in.
25:21
Businesses use VANTA to establish trust
25:23
by automating compliance needs across over
25:25
35 frameworks like SOC 2 and
25:28
ISO 27001. Vanta also helps centralize
25:30
security workflows, complete questionnaires up to
25:32
five times faster, and proactively manage
25:34
vendor risk. VANTA not only saves
25:36
you time, it could also save
25:39
you money. A new IDC white
25:41
paper found that VANTA customers achieve
25:43
$535,000 per year in benefits and
25:45
that the platform pays for itself
25:48
in just three months. You could
25:50
join over 9,000 global companies like
25:52
Atlassian, Quora, and Factory who use
25:54
Vanta to manage risk and prove
25:57
security in real time. For a
25:59
limited time, our audience gets
26:01
$1,000 off Vanta at
26:03
vanta.com/Decoder. That's v-a-n-t-a.com/Decoder for
26:05
$1,000 off. Welcome back.
26:07
I'm talking with Almar
26:09
Latour. Before the break, we were
26:12
talking a lot about his role as
26:14
publisher of the Wall Street Journal,
26:16
which is half of his job.
26:18
But the other half is being
26:20
CEO of Dow Jones. You're probably
26:22
thinking that has something to do
26:24
with the very famous Dow Jones
26:27
Industrial Average. But the company
26:29
actually sold that off a while
26:31
ago. Now it's much more of
26:33
a data and insights provider.
26:35
This is a good place to
26:37
back up a little. We've talked
26:39
a lot about the Wall Street
26:41
Journal. I think people know about
26:43
the Wall Street Journal. I think
26:45
Decoder listeners also probably know about
26:47
Rupert Murdoch and News Corp, which
26:49
is the parent company of
26:51
Wall Street Journal. Describe how all
26:54
of that fits together in your role
26:56
as CEO. Think of it as a
26:58
Rubik's Cube. And inside of it is
27:00
all of our premium journalism,
27:02
our exclusives, our explanation of
27:05
what's happening right now. There's
27:07
our proprietary data that we
27:10
have on many different sectors
27:12
in the global economy. There's
27:15
factiva, many sources, thousands and
27:17
thousands of sources from around
27:20
the world, sitting inside that
27:22
Rubik's Cube. Each tile on that
27:24
Rubik's Cube is a way to get
27:26
out of Dow Jones, what is
27:29
important and relevant
27:31
to you. Maybe I want a couple
27:33
of different tiles, maybe I
27:35
want the whole thing. That's fundamentally
27:38
a way of thinking about how we...
27:41
operate and how ultimately we're also organized,
27:43
and AI is actually helping a great
27:45
deal with this and is accelerating this
27:48
or generative AI is. AI and
27:50
automation have been with Dow Jones for
27:52
a long long time. Let's then make
27:54
that a little bit more complex than
27:57
that, but if one of those tiles
27:59
is about bond trading, I ought
28:01
to be able to get all
28:03
information, premium information, live information, but
28:05
also analytics and forecasting out of
28:08
Dow Jones to help me in
28:10
my job. The way that we're
28:12
organized is sort of the cards
28:14
that I was dealt when I
28:16
took over almost five years ago
28:19
as CEO was to look at
28:21
Dow Jones as a platform of
28:23
verticals. You've got business news, it's
28:26
both a horizontal and a vertical,
28:28
but that's the Wall Street Journal,
28:30
as you say, needs no huge
28:33
explanation there. There's wealth and investing,
28:35
where we have Barron's, MarketWatch,
28:37
financial news, which is a title
28:40
in the UK, but also private
28:42
equity news. We've got a compliance
28:44
arm that has data on compliance.
28:47
It helps companies discern. Should I
28:49
do business with this person
28:51
or not? Are they on
28:53
a sanctions list or not?
28:55
We've added since an energy
28:57
arm and within that commodities
29:00
and petrochemicals, we've added a
29:02
leadership arm organically, looking
29:04
at what it takes to
29:06
be a modern leader. And
29:08
each of these verticals, if you
29:10
will, are successful when they
29:12
do four things. You have to have
29:14
news. And leading news in that
29:17
particular vertical, in that particular
29:19
area of concentration. Big in
29:21
your industry, we've got to
29:23
be; unless you have that,
29:25
you're not relevant. Second, you have
29:27
to have proprietary data. Some of that can
29:29
come out of the news, some of it,
29:31
you have to build, you have to buy,
29:34
and we've done that. If you have those
29:36
two things, you can do proper analytics,
29:38
AI helps with that to some degree.
29:41
And that's where the value keeps on
29:43
going up, if you sell products that
29:45
can do forecasting and analytics. And
29:47
then the fourth factor is convening
29:49
power, that is bringing people in
29:51
that industry, in that sector, in
29:53
that vertical, bringing them together. And
29:56
if you have those four, you
29:58
get actually a mini network
30:00
effect inside that industry. We've seen
30:02
that, for example, happen by
30:04
design. We've executed against this with
30:06
our manager, Joel Lange, who
30:08
runs our risk and compliance business.
30:10
If you have all those
30:12
four parts, then you see the
30:14
leaders in that industry come
30:16
together and, let's say, take the
30:18
risk and compliance. You see
30:20
compliance officers coming together in our
30:22
compliance council at Davos and
30:24
in other places. And so now
30:26
you have thought leaders there.
30:28
Well, the procurement officers who buy
30:30
our data products and see
30:32
that, oh, yeah, the people that
30:34
are leading my department or
30:36
that are leading my company are
30:38
talking on a Dow Jones
30:40
platform about these big themes. So
30:42
it anchors Dow Jones more
30:44
deeply, whether that's just an observation
30:46
and anecdotal, or whether that
30:48
actually translates into business, often it
30:50
translates into business. That has
30:53
the same effect on other analytical
30:55
products. And then if you
30:57
add news to it in that
30:59
risk vertical, we have mentioned
31:01
internally that we will have a
31:03
risk journal. So you will
31:05
have risk industry folks, compliance officers,
31:07
and the like tap into
31:09
that risk news product to actually
31:11
start their workday and understand
31:13
what's happening. Now, end to end,
31:15
we are present in your
31:17
workflow in that industry. And by
31:19
the way, since we're at
31:21
the Wall Street Journal and that's
31:23
an extension as well, we're
31:25
probably also when you go home,
31:27
we're still with you in
31:29
another way. So
31:31
that's the view of the
31:33
company and our operating model.
31:35
Build verticals that have these
31:37
four parts at a minimum.
31:39
Go deeper and make that
31:41
exclusive. This is why what
31:43
Emma is doing in the
31:45
Wall Street Journal Newsroom really
31:47
matters. More exclusives, more distinctive
31:49
journalism helps with that first
31:51
part with the news, obviously.
31:53
We've built and bought and
31:56
created more proprietary data. We've
31:58
spent well in excess of
32:00
a billion dollars on getting
32:02
companies that are specializing, that added to
32:04
a roster. We're doing more analytics and even a little
32:06
bit of consulting. We're never going to be a consulting
32:08
company, but that's an outflow of analytics. And then
32:10
convening power, I think, for a
32:12
long time in the media, was
32:14
misread as just events, but it's
32:16
something bigger, far bigger than that.
32:18
I think it's a subscription business.
32:20
It's a recurring revenue business, should
32:22
be. It doesn't mean that you
32:24
can't have sponsorship for it, but
32:26
it is fundamental to anchoring the
32:28
decision-makers of a certain industry in
32:30
your platform. So we have that
32:32
stack. When we have that, we
32:34
call it a full stack. So
32:36
we've got a full stack in risk.
32:39
We're building a full stack in
32:41
energy and subsets of energy. And so
32:43
we're going deeper there. And then you
32:45
see us in the future. It's a
32:47
scalable model because we now understand
32:50
how do you build these verticals.
32:52
Either we can mine the
32:54
Wall Street Journal for new verticals
32:56
because we see what people gravitate
32:59
to, where do we have the
33:01
expertise, and then build on that,
33:03
or we can inorganically add as
33:05
well, or we can organically start
33:08
outside the Wall Street Journal Newsroom.
33:10
So going deeper, going wider, and
33:12
then connecting everything is where ultimately,
33:14
if you go back to the
33:16
Rubik's Cube, if you want to
33:18
buy that whole Rubik's Cube, we
33:20
will also make that possible for
33:22
you. And so we're making sure that these
33:24
magnificent data pools are
33:26
going to be available. And some
33:28
of this is of course still
33:30
in the works, but it's going to
33:33
be available to our journalists so
33:35
that they can do exclusive work
33:37
with that. You're describing Dow Jones
33:39
is something that makes really high
33:41
quality, rigorous information across its newspapers,
33:43
magazines, across its data products. That
33:46
sits within News Corp. How often do
33:48
you hang out with Rupert Murdoch? I
33:50
wouldn't put it in
33:52
the category of hanging
33:54
out whatsoever. There's a
33:56
healthy friendly interaction. He's
33:58
chairman emeritus. You know, the
34:01
caricature aside, he's
34:03
built enormous media success stories over
34:05
time. And so from a business
34:07
point of view, there was certainly
34:10
in my early days, there was
34:12
a lot to learn from like,
34:14
how do you create businesses and
34:16
such? But from News Corp
34:19
there's been nothing but support for
34:21
our growth story. Yeah. And
34:23
so I'm very thankful for that.
34:25
That's Rupert, that's Lachlan. It's also
34:28
Robert Thomson. It's a whole apparatus,
34:30
but the access to capital that we've
34:32
had as Dow Jones, too.
34:34
Five years ago, we hadn't done, I
34:36
think, an acquisition for well over a
34:39
decade. Now we've done billions in acquisitions.
34:41
That's support. I would say that's
34:43
a vote of confidence for the direction
34:46
that we're taking for the strategy that
34:48
I outlined to you, going deeper,
34:50
growing wider, connecting things. But there's also
34:52
a deep respect for the independence of
34:55
the Wall Street Journal and the
34:57
value that comes with that. And so
34:59
I've seen that consistently applied. That's my
35:01
answer to your hangout question. Rupert Murdoch
35:04
plays on both sides of the
35:06
information crisis, right? You can watch
35:08
his other properties, create whatever reality
35:10
is politically expedient for Donald Trump,
35:12
and then I can see the
35:14
Wall Street Journal rigorously cover the
35:16
impact of tariffs, all the way down
35:18
to the opinion pages, which sometimes say
35:20
the tariffs are bad. Oh, no, they
35:22
don't just say tariffs are bad. The
35:25
opinion pages, as Trudeau, the Prime Minister
35:27
of Canada, yesterday in his press conference,
35:29
said, I don't often, I'm paraphrasing here,
35:31
I don't often quote the Wall Street
35:33
Journal, but they said that this trade
35:35
war is the stupidest trade war in
35:38
history, or something along those lines. We
35:40
have run, we don't hold back, or
35:42
our opinion pages don't hold back, in
35:45
their assessment based on well-established principles of
35:47
free markets and free people. And so
35:49
that, and we both smile for obvious
35:52
reasons, I'm sure you'll ask about it,
35:54
that is core to who we
35:56
are. The independence to make that judgment
35:59
is core to who we are. And
36:01
yeah, for my part, I'm focused on
36:03
the Dow Jones part of making sure
36:06
that that sings. But do you see
36:08
that contradiction? Do you think your team
36:10
see that contradiction that you're trying to
36:12
sell really high quality information while another
36:15
part of the structure that has the
36:17
same ownership is contributing to
36:19
an information crisis? Listen, I
36:21
can't comment because of the shared ownership
36:24
structure. I'm not going to comment
36:26
on my colleagues at Fox. We
36:28
had yesterday protesters outside on Fox
36:30
Square and you know people are
36:32
protesting that while the reporters in
36:35
the Wall Street Journal and Dow
36:37
Jones are doing their job. So
36:39
there is an awareness of the
36:41
perception of Fox and what we're
36:43
focused on, but I think there's
36:46
also a great awareness amongst our
36:48
staff and has been for now
36:50
over 15 years, that those two
36:52
things are separate without specifically commenting
36:54
on how you characterize that because
36:56
I'm just not going to get
36:58
into that. I want to ask
37:00
the last Decoder question and I
37:03
want to end by talking about AI kind
37:05
of at length here. You've had to make a
37:07
lot of decisions, right? You've changed the way the company
37:09
works? A lot, a lot. You obviously have a way
37:11
of thinking about the company. It's very specific. Although I
37:14
will say, Rubik's cubes are meant to be solved. And
37:16
I'm not sure that you want anyone to solve yours. I think
37:18
you want them to remix the cube, not solve it.
37:20
But what's your framework for making decisions? How do you
37:22
make decisions? Well, I take in information a lot. And so
37:24
I'd like to be familiar with the
37:35
facts on the ground. And so
37:37
I am very consultative, take in
37:39
expertise, absorb, so I feel like
37:41
I have a level of mastery
37:43
sufficient to make a decision.
37:45
That's one part. I'm very inquisitive.
37:48
I think the strength of good
37:50
journalists is they know how to
37:52
ask questions and they have a,
37:54
they're driven by curiosity. As an
37:56
executive, I'm driven by curiosity. I
37:58
want to figure out how things
38:01
interact, to understand the nuances, and
38:03
so that's for my part. I
38:05
also, at the same time,
38:07
believe in letting managers manage or
38:09
creators create and making sure that
38:11
what I'm doing is to help
38:13
make those managers or those creators
38:15
successful. And so, depending
38:18
on where we're focused, if
38:20
we're focused on something that is
38:22
important for the whole company, I
38:24
will be informed and I will
38:26
make the decision consulting my management
38:28
team. Within the framework
38:30
of the strategy that I've outlined
38:32
and the intricacies of that, I
38:35
am a firm believer in letting
38:37
the managers make decisions. And so
38:39
I have a very flat structure
38:41
where there is a lot of
38:43
autonomy to, within that framework, make
38:45
decisions, because I think that allows
38:47
people to move faster, allows the
38:50
company to move faster, allows us
38:52
to experiment without going through
38:54
a central clearinghouse constantly. And so
38:56
I want, I guess both, I
38:58
want on the one hand, I
39:00
want to have the expertise. So I'm
39:02
informed, there's a limit to that because
39:04
you can't do that when you are thinking
39:06
about macro issues about everything. And so
39:08
I have to be judicious in how
39:10
I do that, where I focus that
39:12
thirst. And then on the other
39:14
hand, having that flat structure and empowering
39:17
people. We
39:20
have to take a break. We'll be back in just a minute. It's
39:31
been reported that one in
39:33
four people experience sensory sensitivities,
39:36
making everyday experiences like a
39:38
trip to the dentist, especially
39:40
difficult. In fact, 26% of
39:42
sensory sensitive individuals avoid dental
39:45
visits entirely. In
39:47
sensory overload, a new documentary
39:49
produced as part of Sensodyne's
39:51
Sensory Inclusion Initiative, we follow
39:53
individuals navigating a world not
39:55
built for them, where bright
39:57
lights, loud sounds, and unexpected
39:59
touch can turn routine moments into overwhelming
40:01
challenges. Burnett Grant, for example, has
40:04
spent their life masking discomfort in
40:06
workplaces that don't accommodate neurodivergence.
40:08
I've only had two full-time jobs
40:10
where I felt safe, they share.
40:12
This is why they're advocating for
40:14
change. Through deeply personal stories like
40:16
Burnett's, sensory overload highlights the urgent
40:18
need for spaces, dental offices and
40:20
beyond, that embrace sensory inclusion. because
40:22
true inclusion requires action with environments
40:24
where everyone feels safe. Watch Sensory
40:26
Overload now, streaming on Hulu. These
40:28
days I can do anything from
40:30
my phone. Book a vacation, order
40:32
a meal from a five-star restaurant,
40:34
buy and trade stocks, but maybe
40:36
the most amazing thing I can
40:38
do is make my dirty laundry
40:40
disappear, and then reappear perfectly washed
40:42
and folded. I have rinse to
40:44
thank for that. I just schedule
40:46
a pickup in the rinse app
40:48
or at rinse.com. A rinse valet
40:51
comes to get my clothes and
40:53
before I know it, they're back,
40:55
crisply folded and ready to wear.
40:57
They even do dry cleaning, which
40:59
is returned hanging in a nice
41:01
rinse garment bag. And with rinse,
41:03
my satisfaction is guaranteed. If for
41:05
any reason I'm not happy, they'll
41:07
reclaim my clothes for free. Best
41:09
of all, rinse saves me tons
41:11
of time each week. That's time
41:13
I get to do something I
41:15
love versus something I have to
41:17
do. So if you want to
41:19
save loads of time by not
41:21
doing loads of laundry, remember. There's
41:23
an app for that. Rinse. Sign
41:25
up now and get $20 off
41:27
your first order at rinse.com. That's
41:29
r-i-n-s-e.com. Where's your next vacation going
41:31
to be? Well, if you're looking
41:33
to have a vacation to remember
41:35
for your entire family, then look
41:38
no further than Virginia Beach. At
41:40
Virginia Beach, you'll be transported to
41:42
a place where you'll immediately feel
41:44
welcomed. A place all about the
41:46
good vibes, finding your tribe, and
41:48
your happiness. Enjoy their long pleasure
41:50
beach that hosts their annual festivals
41:52
and concerts. And if you're in
41:54
the mood for dinner, make sure
41:56
to check out their waterfront dining
41:58
made from their fresh local seafood
42:00
and farm-to-table ingredients. Plus, there are
42:02
an abundance of outdoor activities that
42:04
will have the whole family resting
42:06
easy when the day is done.
42:08
And the best part about it
42:10
is, you don't have to do
42:12
it alone. Virginia Beach provides different
42:14
activities for anyone of any age
42:16
to enjoy. So if you're looking
42:18
for that change of pace this
42:20
year, it might be time to
42:22
consider a nice trip to Virginia
42:25
Beach. Go to visitvirginiabeach.com
42:27
to learn more. Welcome
42:32
back. I'm talking to Dow Jones CEO
42:34
Almar Latour. Before the break, we were
42:37
talking about one of the big decoder
42:39
questions, how he makes decisions. Now we're
42:41
going to take that idea and put
42:43
it into practice, specifically around AI, which
42:45
has become a major part of Dow
42:47
Jones business over the last few years.
42:49
Let's apply that to AI, that framework.
42:51
I think it's useful to have the framework
42:53
where we talk about this next set of
42:56
big shifts, because that's a lot of decisions.
42:58
You mentioned Factiva earlier; that's a
43:01
data platform. You're promising some generative AI
43:03
tools there. You want to offer more
43:05
of those tools to your customers across
43:07
the board. You've made some deals with
43:10
OpenAI. At the same time, you're
43:12
suing Perplexity. What is the shape of the
43:14
AI opportunity to you? When you look
43:16
at it broadly? You're suing Perplexity. You
43:18
think they're taking information away from you
43:20
without compensating you. You've made a deal
43:22
with OpenAI at some rate. We
43:25
can talk about whether that rate is
43:27
enough. And then you're offering the tools
43:29
to the users, right? When
43:31
you look at that whole set of things,
43:33
the number one question to
43:35
me is, okay, how big of a business
43:38
is this really? Because I don't
43:40
know if anyone's making more
43:42
money from AI than they're
43:44
spending on building the tools right
43:46
now. So we are, I think, very,
43:49
very early stage in that. And so
43:51
it's hard to say. What I can
43:53
say is that on a product side,
43:55
we see some of our products outperform
43:58
versus what we had planned for
44:00
them. So we have a product
44:02
in our risk business called Integrity
44:05
Check. It's more of a self-serve
44:07
model where you don't have to
44:09
wait for Dow Jones to get
44:11
back to you and do its
44:14
computation using AI ourselves, but out
44:16
of view of the customer. But
44:18
instead, having the customer do some
44:21
basic research on risk and compliance
44:23
themselves, sort of assessing who they
44:25
can do business with and getting
44:27
that to sort of an 80%
44:30
reliability, and from there using that
44:32
as a springboard to say, okay,
44:34
now within this, the remaining 20%
44:37
I need help from the company.
44:39
That product is fairly young and
44:41
that's outperforming expectations. So on the
44:44
whole, I think this will be
44:46
a net positive, but we've got
44:48
to unpack what we're talking about first.
44:51
We've got to get the foundational
44:53
elements of this right. If
44:55
we don't have proprietary information
44:57
that is truly proprietary, then
44:59
we're going to lose this
45:01
game. So you see us
45:03
engage in building products, building
45:06
a marketplace with factiva, and
45:08
deploying tools internally. But all
45:10
of that has to happen
45:12
in tandem with solving that
45:14
foundational question. So let me ask you.
45:16
Last summer, the deal with OpenAI,
45:18
the reporting is that it's about $250 million.
45:21
Is that correct? Read the Wall Street Journal. That's
45:23
the Wall Street Journal's reporting, so I believe them.
45:25
I had Nick Thompson, the CEO of the
45:28
Atlantic on the show a few months ago.
45:30
He told me one of the reasons that
45:32
he made a deal with Open AI was
45:34
to set the market rate, which is
45:37
useful for fair use litigation, which is
45:39
useful for other kinds of deals. Do
45:41
you think 250 million is enough of
45:43
a rate to set the market?
45:45
I mean, you've got
45:47
to ask, I'm not going to talk
45:50
about specific amounts, but you got to
45:52
ask what that amount is for.
45:54
And then what I am personally
45:57
less interested in is a single amount,
45:59
but more in an operating model and a
46:01
business model for how you do
46:03
business going forward over a long
46:05
period of time. How does that
46:07
operate? When information is used,
46:09
is the value of that
46:11
information recognized along the way? And
46:13
is there a mechanism that helps
46:15
realize that valuation? Set the dollars
46:17
aside inside of the OpenAI deal.
46:20
What are the signals you're looking
46:22
for that indicate whether the deal
46:24
was a success or a failure?
46:26
Because of the way we've set
46:28
up that deal, I'm not going
46:30
to talk specifically about that deal. That
46:32
was a super nice try because I
46:34
almost bit. But I'm going to try
46:37
again. No, no, I'm sure you will.
46:39
But more broadly, I can say, so
46:41
how can you tell the generative AI
46:44
tools that you're deploying or models that
46:46
you're deploying within your own business? How
46:48
are they successful? That's by usage, and
46:50
is it generating revenue on a consistent
46:53
basis? Is it just a blurb? Like,
46:55
oh, it's a novelty factor and we
46:57
move on. We're pretty early in that
47:00
process, so I don't know yet what's
47:02
a head fake, what's real. Some of the
47:04
products, I can tell this is real,
47:06
right? Some of it is a
47:08
shift in user experience and
47:10
in user requirements and... this is
47:13
going to have to be
47:15
table stakes, offer a UX
47:17
that is built around generative
47:19
AI because the customer expects
47:22
that. So you're getting at
47:24
the trickiness of establishing the
47:26
value writ large. But overall,
47:28
I think of generative AI as
47:30
an accelerant for the strategy
47:32
that we have. It will
47:34
allow us to go deeper in our
47:36
verticals faster and more efficiently and
47:39
in ways that we couldn't even
47:41
imagine and we've all talked about
47:44
the wonders of generative AI but
47:46
doing research in ways that we
47:48
couldn't do before human and otherwise
47:51
within Dow Jones we talk about
47:53
authentic intelligence that's the
47:56
combination of generative AI and
47:58
human guidance, and we think
48:00
that's a sweet spot for
48:02
certain B2B products that
48:04
we're building. So it's accelerating
48:06
going deeper. It will accelerate
48:09
going wider, i.e. scaling our
48:11
vertical strategy because we can
48:13
stand up verticals much faster,
48:15
whether that's a geographic vertical, because
48:18
now we can say, all right,
48:20
we can launch in this language
48:22
and it's so reliable and it's
48:24
a lot cheaper and then connecting
48:26
everything, generative AI
48:28
is a massive accelerant because now
48:30
with a thin layer on top
48:32
we can extract data from all
48:34
these different data pools. Do the tools
48:36
work well enough for you to trust
48:38
them? On a case-by-case basis, when
48:40
it's very specific and we are
48:42
answering a question from a customer
48:44
and it's often a co-creation where
48:47
we are solving a certain problem
48:49
and we have very narrow parameters.
48:51
Then I think it works. When
48:53
you go wide, you get a
48:55
wide answer. And so our strategy
48:57
is built around being specific,
48:59
being focused on verticals and AI
49:01
fits nicely with that. And in
49:04
fact, allows us to go much
49:06
deeper, be much more specific and
49:09
be more discerning. And so under
49:11
each vertical you can create subverticals
49:13
using a much larger data pool.
49:16
You're describing something that happens within
49:18
Dow Jones, within its products.
49:20
More broadly, News Corp has been
49:23
pretty harsh about platforms and
49:25
work. News Corp CEO Robert
49:27
Thomson, I think, famously is
49:29
critical of Google. The company
49:31
was behind the laws in
49:33
Australia that require platforms to
49:36
pay publishers for linking. AI
49:38
represents that opportunity as well or that
49:40
challenge as well, right? That instead of
49:43
using your tools, someone might use a
49:45
ChatGPT or a Google Gemini
49:47
or something and just receive an answer.
49:49
Do you think that these deals you're
49:51
making, are they hedges against that
49:53
outcome? Are they investments in that
49:55
outcome? A lot of publishers, for example,
49:58
I'll just give the example of
50:00
sort of the millennial digital media startup
50:02
boom, which was predicated on we will just
50:04
be the most viral thing on Facebook
50:07
and Facebook will pay us that money
50:09
and that obviously did not pan out
50:11
and I think people are very wary
50:13
of making that same mistake with AI
50:16
but you have one of these deals
50:18
so how are you? Yeah, no, but
50:20
this is why at the very start
50:23
I said I see those deals in
50:25
a separate category; there it's foundational, it's
50:27
about principles, that what our gen AI
50:29
answer spits out is relevant
50:32
to our customer in a
50:34
way that some other provider
50:36
with maybe a more general offer
50:38
is not. And so we
50:41
have to make sure that
50:43
when we combine our proprietary
50:45
journalism and our proprietary data
50:48
and our convening power with
50:50
generative AI and with LLMs,
50:53
we have to make sure
50:55
that what the outcome is to
50:57
a query is, A, reliable and, B,
50:59
is something that you can't find
51:02
somewhere else, or not at
51:04
the scale at which you can
51:06
find it somewhere else within that
51:08
vertical. So I think there's
51:10
a distinction between establishing the principles and
51:13
getting value for that, getting value going forward
51:15
if you are being used, but
51:17
then there is a separate category of
51:20
efficiency tools that we use in the
51:22
company and yet another category where we
51:24
say, here's where we build products that
51:26
have to answer a certain question that
51:29
exists in the market in any different
51:31
industry in our case. And we're going to
51:33
give a superior answer and you're
51:35
going to need that answer in
51:37
order to be more successful than
51:39
the next person working on solving
51:41
that problem in a certain industry.
51:43
So if that's about energy pricing
51:45
and forecasting energy prices, we want
51:47
to be the most reliable on
51:49
that or what's happening in the
51:51
chemical industry. We want to be
51:53
the leading voice in that and
51:56
generative AI should be one way
51:58
in which you get that out of us, in
52:00
a proprietary sense. And so that
52:02
should be hopefully very different than
52:04
going to any chat bot and
52:06
asking that same question. And maybe
52:08
you get an approximation, but it
52:11
might not be as reliable. But
52:13
hopefully there will be sufficient proprietary
52:15
data in our answer that will
52:17
make that competition uneven, to our
52:19
advantage. And so that I think
52:21
is the task. I feel very
52:24
strongly that we cannot go in
52:26
to this new era with a
52:28
view of, well this is what
52:30
these companies have to do for
52:32
us. Like we have to agree
52:34
on the principles of the value,
52:37
but then it's really up to
52:39
us to create superb products and
52:41
answers to complex questions in a
52:43
very complex world, to realize the
52:45
value that these new tools offer.
52:47
And so both those things have
52:49
to exist. I've talked to a
52:52
lot of publishers and media CEOs
52:54
over the past several years about
52:56
where the traffic comes from, how
52:58
the payments work, where the value
53:00
is going, setting aside AI for
53:02
a minute. It feels like the
53:05
nuclear question everyone is asking is,
53:07
well, if Google is just indexing
53:09
our sites and taking the data,
53:11
eventually, we will have to block
53:13
Google, in the way that many
53:15
publishers were comfortable using their robots
53:18
file to block Open AI and
53:20
other crawlers. Have you ever considered
53:22
going that far? Oh, I'm not
53:24
going to speak specifically to Google.
53:26
We're a partner and we have
53:28
lots of things that we do
53:31
together. There's also things that we
53:33
disagree on. News Corp, I think, is
53:35
famously the most outspoken in this
53:37
area. Yeah, no, no, no, absolutely.
53:39
So this is not on my
53:41
radar in the way that you
53:44
expressed that. That's the short answer
53:46
to that. I think, I guess
53:48
in taking your question in a
53:50
different way, we have to emphasize
53:52
O&O and we have to make
53:54
sure that being in our world,
53:57
in our universe, in your individual
53:59
vertical or in one of our
54:01
broader products or in the entire
54:03
Rubik's Cube, you have an experience
54:05
that you cannot have somewhere else.
54:07
That's on us. How far do
54:10
you go in putting a wall
54:12
around that? Yeah, we'll see over time.
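(For illustration only: the robots file mentioned in the question above is a plain-text policy served at /robots.txt. Below is a minimal Python sketch, standard library only, that checks what a site currently allows a given crawler to fetch; the example domain and user agents are illustrative, not a statement about any specific publisher's policy.)

# Minimal sketch of checking a robots.txt opt-out; illustrative only.
from urllib.robotparser import RobotFileParser

def crawler_allowed(site: str, user_agent: str, path: str = "/") -> bool:
    # Fetch and parse the site's robots.txt, then ask whether this
    # user agent is permitted to fetch the given path.
    rp = RobotFileParser()
    rp.set_url(f"{site.rstrip('/')}/robots.txt")
    rp.read()
    return rp.can_fetch(user_agent, f"{site.rstrip('/')}{path}")

if __name__ == "__main__":
    # A publisher opting out of an AI crawler typically adds a stanza like
    #   User-agent: GPTBot
    #   Disallow: /
    # to its robots.txt; this script only reads what a site currently declares.
    for agent in ("GPTBot", "Googlebot"):
        print(agent, crawler_allowed("https://example.com", agent))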
54:14
You are in litigation against
54:16
Perplexity. They've taken some data. I
54:18
think you don't like that, I'm
54:20
guessing, by the fact that the lawsuit
54:23
was filed. If you win
54:25
that case, or the New York
54:27
Times Company wins its case, or
54:29
I know Sheryl Crow wins her
54:31
case, that will upend the market
54:33
as we understand it, right? There
54:36
will be some new fair use
54:38
precedent that is created. How does
54:40
that change how you think about
54:42
building and deploying your own AI
54:44
tools? Again, I put
54:46
this in a separate box. Like, we
54:49
got to build our... Well, let me
54:51
challenge you on that for one second,
54:53
just to get it into the right
54:55
framework. Right now, it feels like the
54:57
entire industry is just assuming that, win or
54:59
lose these cases, the money will be
55:02
sort of available to build at the
55:04
same rate we've been building, right? Open AI,
55:06
well, maybe they win or lose. Maybe
55:08
the rates go up and
55:10
it's just more expensive for what Open AI is
55:12
doing, because they have to pay all
55:15
the singer-songwriters in the world. Maybe.
55:17
It also seems to me that potentially
55:19
the rates are so high that the
55:21
entire structure of the industry changes. So
55:23
the industry in this case, AI? Yeah,
55:26
yeah, yes. Right, like suddenly
55:28
the compliance cost of making sure
55:30
all of our data is licensed before
55:32
we feed it into the model for
55:34
training skyrockets, because the penalties are high
55:36
under copyright law. It feels like
55:39
an under-considered risk. These lawsuits are just
55:41
going to play out, and something will
55:43
happen. The way that I think Google
55:45
was able to roll over the Viacom
55:47
lawsuit when YouTube started, or the Google
55:50
Books lawsuit, because they were sort of
55:52
the plucky upstart, and the value of
55:54
those tools was so high, that they
55:56
got to win a bunch of lawsuits.
55:58
I don't think the AI companies feel
56:00
like plucky upstarts. I don't think
56:03
that public sentiment is with a bunch
56:05
of giant tech company billionaires anymore. It
56:07
feels like those lawsuits might go the
56:09
other way. And at that point, some
56:11
of the tools you are using to
56:13
build with or some of the partners
56:16
you have, their cost structures might change
56:18
so dramatically. That is going to stop
56:20
us from... Yeah, that everyone's strategy has
56:22
to change. And I'm just wondering how
56:24
much you are considering that. I see
56:27
where you're going with it. I think
56:29
the answer to the individual cases, it's
56:31
a... a little bit the blind man
56:33
and the elephant. There are different patches
56:35
of fair use that different legal cases
56:37
are pursuing. One case is not going
56:40
to, it might reverberate, but it's not
56:42
going to be necessarily absolute. And so
56:44
I hate to say this as an
56:46
answer to any question, but there is
56:48
a... a big wait and see. At
56:50
the moment, I got to go on
56:53
the assumption that as a technology, generative
56:55
AI is present in my world, is
56:57
going to be present even more, is
56:59
going to be present, and an expectation
57:01
from consumers, whether they're corporate or consumers,
57:04
out in the wild. And so we
57:06
cannot continue to build and then think
57:08
at the same time, like, oh, it
57:10
may all just disappear, and by the
57:12
way, we might be the culprit because
57:14
we're applying that, which is an interesting
57:17
scenario. I don't think it will play
57:19
out that way, but you're one of
57:21
the litigants. That's what I mean. It's
57:23
interesting because you are... But I don't
57:25
know that it will be debilitating. I
57:27
don't think that the commercial agreement that
57:30
we have with Open AI, the value
57:32
of which I can't say, but you
57:34
just cited that, that has obviously not
57:36
stopped Open AI from developing. And so
57:38
I believe in a market mechanism, and
57:41
I think that's where we'll end up
57:43
that there will be a gravitation to
57:45
that, rather than stopping the industry in
57:47
its tracks. Open AI famously has not
57:49
made one dollar in profit. That's the
57:51
thing that I, it's right, they have
57:54
to build a business that's valuable not
57:56
to support deals. Yeah, but Amazon didn't
57:58
for a long time either. I feel like
58:00
we've brought up Jeff Bezos in a
58:03
variety of ways on this. I'm very
58:05
curious to see how your lawsuit plays
58:07
out with Perplexity and how those businesses
58:09
develop. We'll have to have you back
58:12
as that progresses, because there's something there
58:14
that feels almost invisible, like the blind men
58:16
and the elephant. It's there. It's very
58:19
big, and I think this next year
58:21
we'll see how it shapes the business.
58:23
I want to end by talking about
58:25
press freedom. It's something you care about
58:27
a lot. You've talked about it a
58:29
lot. Obviously, you're the publisher of the
58:31
Wall Street Journal. You famously had Evan
58:33
Gershkovich detained in Russia in March 2023.
58:35
You worked very hard across a number
58:38
of administrations to bring him back. This
58:40
is a very challenging time for press
58:42
freedom both abroad and it feels like
58:44
in the United States. What's your
58:46
view of the landscape right
58:48
now? ...of polarization, and that
58:50
makes covering the news trickier
58:53
than ever before, but
58:55
also I think increases
58:57
the value and the
58:59
contribution that we bring
59:01
to society as a
59:03
free press. And so on
59:05
the one hand, with
59:07
all the changes that
59:09
we're seeing, including against
59:11
media, this is a time
59:13
that any journalist should be made
59:15
for. Right, if your heart is
59:18
in explaining complexity to the world,
59:20
there's never been a time when
59:22
we've had this to grapple with.
59:25
And so I think on one
59:27
hand, we can offer enormous value.
59:29
On the other hand, it's become
59:32
a lot harder to do that.
59:34
And the statistics around the world
59:36
don't lie. There are well over
59:38
300 people who were killed last
59:41
year doing journalism or have been put
59:43
in prison, and there
59:45
is a harsh dialogue
59:47
in society that makes
59:49
it under many circumstances
59:52
less comfortable to go
59:54
after a story. Sometimes
59:56
I measure whether we
59:58
did a story very well
1:00:00
by how much I got in
1:00:02
terms of complaints from the left
1:00:04
and from the right after certain
1:00:06
stories. And so the temperature is
1:00:09
high. But let me push
1:00:11
you on that too. Yes, please.
1:00:13
That is, that's an old chestnut
1:00:15
in journalism, right? If everyone's unhappy,
1:00:17
you're doing your job right. We're
1:00:19
in a place... Well, it's also
1:00:21
like my daily existence, honestly. Right.
1:00:23
So that is, it's a very
1:00:25
young chestnut for me. I mean,
1:00:27
it's there every day. But yes,
1:00:29
I know where you want to
1:00:31
go. This is a pretty asymmetric
1:00:33
information landscape right now. One side
1:00:35
is vastly more willing to lie.
1:00:37
One side is vastly more willing
1:00:40
to even change the data. The Trump
1:00:42
administration is making noise that they'll take
1:00:44
government spending out of GDP, which would
1:00:46
dramatically change almost everything the Wall Street
1:00:48
Journal does, right? At the most fundamental
1:00:50
level, we might not be able to
1:00:52
trust the government data anymore. That's
1:00:55
a threat to press freedom. At the same time, they
1:00:57
will spin it as a good thing. Elon Musk is
1:00:59
out there trying to spin this as a good thing. You
1:01:02
don't see the left playing that kind
1:01:04
of game with the data in
1:01:06
that way to sort of metaphysically create
1:01:08
political outcomes, right?
1:01:11
There's not as much trying to
1:01:13
tweet things into reality that Elon
1:01:15
is doing. So I'm not going to
1:01:17
left-right things in this conversation. What
1:01:20
I can say is how do
1:01:22
you respond to an information
1:01:24
ecosystem... Or maybe in an asymmetrical
1:01:26
manner? I'm saying, how do
1:01:28
you respond to an information ecosystem
1:01:30
where Donald Trump has threatened to
1:01:32
sue pollsters in Iowa that he
1:01:34
didn't like the results of their
1:01:36
poll? Or where Brendan Carr, the
1:01:38
chairman of the FCC, is potentially holding
1:01:41
up the CBS
1:01:43
Skydance merger over his
1:01:45
investigation of 60 Minutes editorial content.
1:01:47
Yeah. So it's a very clear
1:01:49
answer, I think, to that.
1:01:51
It's not an easy answer, but
1:01:53
the first answer is stick to
1:01:56
your principles. In our case, we
1:01:58
believe in reporting the facts in
1:02:00
the newsroom; we believe in free markets
1:02:02
and free people on the opinion side.
1:02:04
And you stick to that, and you
1:02:06
do not let go. All right? And
1:02:08
you double down on that. And
1:02:10
that's our contribution to the information
1:02:13
ecosystem. And we're going to do
1:02:15
more of that. And by the
1:02:17
way, that's a demand-driven thing as
1:02:20
well. But we're talking about press
1:02:22
freedom. This is an answer to
1:02:24
that. You're not going to change
1:02:27
your reporting. If you start doing
1:02:29
your reporting and omitting facts that
1:02:31
you know to be true or
1:02:34
start self-censoring, then that game is
1:02:36
lost. You have. That's one. Okay.
1:02:38
Second, you got to keep
1:02:40
a cool head. We
1:02:42
live in an environment
1:02:44
where taunting and provocation
1:02:46
is the norm. And
1:02:48
so you can take that bait
1:02:51
or you cannot. And you
1:02:53
have to then, in my
1:02:55
view, not be... hysterical in
1:02:57
response to every little provocation
1:03:00
that might exist. And in
1:03:02
fact, you might get more respect
1:03:04
if you do not respond to
1:03:06
every provocation. And then you have
1:03:08
to recognize the moments when
1:03:10
principles are at stake, when
1:03:13
you have to fight or
1:03:15
you have to express your
1:03:17
disagreement. And so keep on doing
1:03:19
what you're doing. Do even more
1:03:21
of it, in our case
1:03:23
create reliable information. It's going to
1:03:25
be good
1:03:28
for society, it's going to be good
1:03:30
for you as an organization. Keep
1:03:32
a cool head and stick to
1:03:34
your principles, and that last part
1:03:36
is also non-negotiable. Non-negotiable, all these
1:03:38
three, in fact? Non-negotiable. You've got
1:03:40
to stick to your principles. If
1:03:43
you start shifting and making certain
1:03:45
concessions at the wrong moment, there
1:03:47
will be a very high price
1:03:50
to pay for that. You have
1:03:52
colleagues in similar positions across the
1:03:55
media that are making concessions,
1:03:57
right? ABC settled its case
1:03:59
with the Trump administration, CBS looks like
1:04:01
they might settle the 60 minutes case
1:04:04
because the threat of the Skydance deal
1:04:06
being blocked in some way hangs over
1:04:08
them. Are you saying you would not
1:04:11
make those concessions? Are you saying they
1:04:13
should not? I'm not saying either one
1:04:15
of those because I'm not commenting on
1:04:18
their individual situation. I just think that
1:04:20
there are moments where as an organization
1:04:22
you're going to have to evaluate and
1:04:25
the AP just went through this, is
1:04:27
this a moment where I speak out
1:04:29
and where I stick to my guns
1:04:32
or not? And I think those moments,
1:04:34
you better choose carefully, you better have
1:04:36
a clear view of your principles and
1:04:38
understand what you actually stand for and
1:04:41
understand the ramifications because people will point
1:04:43
back to certain moments and you want
1:04:45
to make sure that you're on the
1:04:48
right side of that. No, I'm going
1:04:50
to ask you this one directly because
1:04:52
I think I just need to hear
1:04:55
it therapeutically, but I think your reporters
1:04:57
probably need to hear it too. If
1:04:59
the pressure comes to you from the
1:05:02
Trump administration, are you saying that you'll
1:05:04
fight in a way that it feels
1:05:06
like a lot of other big media
1:05:09
companies are choosing to cave? I think
1:05:11
the question is just too loaded,
1:05:13
in that you make it very specific.
1:05:16
I think we have fought for our
1:05:18
principles for decades. We have stood up
1:05:20
for reporting for decades. We have a
1:05:23
legal team that is incredibly strong, that
1:05:25
has fought for press freedom and for
1:05:27
our journalism again and again. If we
1:05:30
make a mistake, we correct. We own
1:05:32
up to that and that is absolutely
1:05:34
part of the value structure. We stand
1:05:37
up for principles, period. What happens if
1:05:39
the generative AI makes a mistake? Depends
1:05:41
on what mistake it is. And so
1:05:44
we actually in building and co-creating some
1:05:46
of these products that answer very narrow
1:05:48
questions, we were sometimes surprised at mistakes
1:05:51
that snuck in and we just wanted
1:05:53
to make sure that we can't release
1:05:55
products without having screened for that. But
1:05:57
you're going to have to correct. Do
1:06:00
you think that that information environment
1:06:02
where some of the tools are less
1:06:04
reliable and some of the institutions
1:06:07
are less reliable or perhaps even
1:06:09
openly hostile to the press? Do
1:06:11
you think that's something that you will
1:06:13
be able to chart as Dow Jones
1:06:15
alone or do you think that's an
1:06:18
industry-wide effort? Because it does not feel
1:06:20
like there's a lot of coordination
1:06:22
across the industry right now. I
1:06:24
think the industry should shoulder a
1:06:27
lot of this together in a
1:06:29
loosely formed... coalition or through solidarity.
1:06:31
But I'm focused on Dow Jones'
1:06:34
success and
1:06:36
on our values, but I certainly
1:06:38
share our findings with colleagues all
1:06:40
the time and there's a very
1:06:43
active dialogue amongst media leaders that
1:06:45
A, I want to foster and,
1:06:47
B, I participate in. And so
1:06:50
if you look at very difficult
1:06:52
moments like the Evan case or
1:06:54
getting people out of Afghanistan during
1:06:56
the rapid withdrawal of US forces
1:06:58
there we worked very very closely
1:07:00
together. There are very close contacts
1:07:02
amongst a lot of those leaders.
1:07:05
I think that's a healthy thing
1:07:07
and I would like to see
1:07:09
more of it. Well, Almar, thank you
1:07:11
for giving us so much time. Tell us what's
1:07:13
next for Dow Jones. Well, there's tomorrow's news,
1:07:15
so I want you to tune
1:07:17
in, and so definitely come to
1:07:20
the Wall Street Journal every day.
1:07:22
But for us, you'll see us
1:07:24
focus on international rebalancing, rebalancing our
1:07:26
portfolio to make sure that we
1:07:28
are as strong outside the borders
1:07:30
of the US as we are
1:07:32
here, focused on video, focused on
1:07:34
deeper data products. But overall... We
1:07:37
will continue to be focused on
1:07:39
what we have been focused on
1:07:41
for the entirety of our existence
1:07:43
and that's providing reliable information. Amazing.
1:07:45
Thank you so much for being on
1:07:47
Decoder. Thank you so much for having
1:07:49
me. Thank you again, Almar, for taking the time
1:07:51
to join me on Decoder, and thank you for
1:07:54
listening. I hope you enjoyed it. If you'd like
1:07:56
to let us know what you thought about this episode
1:07:58
or really anything else, drop us a line. You can
1:08:00
email us at decoder at the
1:08:02
verge.com. We really do read all
1:08:05
the emails. You can also hit
1:08:07
me up directly on Threads or
1:08:09
Bluesky, and we have a
1:08:12
TikTok and an Instagram now. They're
1:08:14
both at Decoder pod. They're a
1:08:16
lot of fun. If you like
1:08:19
Decoder, please share it with your
1:08:21
friends and subscribe wherever you
1:08:23
get podcasts. Decoder is
1:08:25
a great master cylinder.
1:08:27
We'll see you next time.
1:08:30
Now in its sixth year,
1:08:32
the Alex Partners Disruption Index
1:08:34
explores what the best performing
1:08:37
and fastest growing companies are
1:08:39
doing differently, as they anticipate,
1:08:42
shape, and respond to disruption.
1:08:44
How are the world's top
1:08:46
leaders identifying ways to integrate
1:08:49
AI into their businesses? Learn
1:08:51
more on the latest tech
1:08:54
insights at disruption dot alixpartners.com.
1:08:56
Disruption dot alixpartners.com. In the face of disruption, businesses trust Alex Partners to get
1:08:59
straight to the point and
1:09:01
deliver results when it really
1:09:03
matters. You understand tech. You
1:09:05
already use a VPN, a
1:09:07
password manager, and an ad
1:09:09
blocker. Surely you are not
1:09:11
exposed. Right? Find out for
1:09:13
sure by going to cloaked.com/VOX.
1:09:15
That's c-l-o-a-k-e-d.com/V-O-X. There you can
1:09:18
plug in your phone number
1:09:20
to see if any of
1:09:22
your personal information is vulnerable.
1:09:24
Cloaked can help you get
1:09:26
your data removed from hundreds
1:09:28
of data brokers, generate unlimited
1:09:30
working phone numbers to protect
1:09:32
your real one, and also offers
1:09:34
$1 million insurance against identity theft.
1:09:37
That's cloaked.com/Vox, and check out with
1:09:39
Code Podcast to get 30% off
1:09:41
a Cloaked subscription. Today
1:09:46
at T-Mobile I'm joined by a
1:09:49
special co-anchor. What up everybody? It's
1:09:51
your boy, Big Snoop D-O double
1:09:53
G. Snoop! Where can people go
1:09:55
to find great deals? Head to
1:09:58
t-mobile.com to get four iPhone 16s
1:10:00
with Apple Intelligence on us, plus
1:10:02
four lines for
1:10:04
$25. That's four lines
1:10:06
for $25, with
1:10:09
Apple Intelligence.
1:10:11
Apple Intelligence
1:10:13
requires iOS 18.1
1:10:15
or later.