Episode Transcript
0:00
We all know data is valuable. We use it to tell a story, to make informed decisions for our businesses, but turning data into actionable insights can be a challenge. It's time to unlock the true potential of your business data with Domo's AI and data products platform. Domo lets you channel AI and data into innovative uses that deliver a measurable impact. Ask your data anything at any time. Anyone on your team can use Domo to easily prepare, analyze, visualize, automate, and distribute data, all amplified by AI. Domo goes beyond productivity. It's designed to transform your processes, helping you make smarter and faster decisions and drive real growth. All powered by Domo's trust, flexibility, and years of expertise in data and AI innovation. Data is hard. Domo is easy. Make smarter decisions and propel your business forward with Domo. Learn more today at Domo. That's ai.domo.com.

1:08
The Agile Brand: stay curious, stay agile, and join the top enterprise brands and martech platforms as we explore marketing technology, AI, e-commerce, and whatever's next for the omnichannel customer experience. Together we'll discover what it takes to create an agile brand built for today and tomorrow, and built for customers, employees, and continued business growth. I'm your host, Greg Kihlstrom, advising Fortune 1000 brands on martech, AI, and marketing operations. The Agile Brand podcast is brought to you by TEKsystems, an industry leader in full-stack technology services, talent services, and real-world application. For more information, go to TEKsystems.com. To make sure you always get the latest episodes, please hit... on the app you listen to podcasts on, and leave us a rating so others can find us as well. And now on to the show.
2:06
We are recording live at Qualtrics X4 in Salt Lake City, and seeing and hearing all about how to create and enable amazing customer and employee experiences. It's important to collect customer experience data, but if it's not driving change across your organization, is it really helping your business? Today we're going to talk about making meaningful cross-functional change using CX research and data as a guide. I'm joined by Adam Hagerman, director of UX research for employer products at Indeed. Adam has led transformative efforts at Indeed to turn customer experience research into cross-functional strategic change, driving real improvements in both user satisfaction and product success. Adam, welcome to the show.

Thanks for having me.

Yeah, looking forward to diving in here. Before we do, why don't you give us a little background on yourself and your role at Indeed?

Sure. I lead a team of UX researchers. We look over the employer products. We're trying to make sure that what we end up shipping for people to consume is solving relevant needs and helping them do what they need to do better, faster, cheaper, easier.

Wonderful. Great. So yeah, let's dive in here. We're going to talk about a few things, but I want to start by talking about transforming satisfaction measurement into strategic decision making. So you and your team at Indeed have transformed your approach to measuring user satisfaction. What led to this shift?

We needed to. Yeah. Satisfaction measurement is not new. We've been doing it since, like, the phone surveys from ye olden days that you would get at dinner time. And the tool we were using was the same one: the Net Promoter Score. It's evolved, it's iterated over time, it's had improvements here or there, but at the end of the day it's a brand measurement, and we have a product we need to work on. NPS is well known, and my stakeholders were very... very excited. They're not anti-user sentiment. It's just the tool they were using wasn't as helpful as it could have been. We asked the question, can we make this better? What can we do? Here are the shortcomings, here's how it's preventing us from helping people do what they need to do better, faster, cheaper, easier. It's not giving us the insight we need, so let's find a new way to do it.

Many people listening out there are using NPS, you know, you name it. What were some of the telltale signs that it wasn't giving you everything that you needed?

If it tells us to push a lever, and we push the lever, but nothing happens, it's not actually telling us what lever to push. Makes sense? I guess that's the answer.

Yeah, yeah. Hey. So, um... you mentioned that data was harnessed not just to inform, but to quantify impact and to guide strategy. How did you approach turning research into something measurable and actionable for the business?

Research is the process of collecting information. The reason we collect information is because we need to make a decision. The product stakeholders need to make a decision. Do we do it this way? Do we do it that way? They receive information from lots of sources. They get feedback from their go-to-market team. They get feedback from the engineering team. They get feedback from a random person on the street. And they have to take all of that information and make a decision. What we bring to the table is kind of the collective baseline for what our users want. Our job is to advocate for users among that entire ecosystem of information floating around. Data collection is a deliberate act. Just because something's been collected doesn't mean it's what you should be collecting. And we asked that question: are we collecting information that actually helps us advocate for users? Once we were able to do that and demonstrate, here's what we're doing and here's what it means for you, here's your return on investment, it was an easier case to make. Does that answer your question?

Yeah, I think so. I mean, because there's lots of signals, right? So I mean, again, there's some go-to measurements that a lot of people use, like NPS and others, and again, to your point, nothing wrong with that, but if it's the sole measurement, there's some questions. Right. So yeah, I guess, how do you determine it? Is it incremental, like in the advertising world it's media mix modeling or something like that? Is it a similar approach?

I actually used a very similar approach to media mix modeling. In media mix modeling, what you're really trying to do is, of all the money that's floating around in brand, advertisements, activation, whatever, you're trying to see where an incremental dollar gives me more benefit. So if I have one dollar left to spend, am I going to put it in this thing or that thing? And what my measurement was trying to do was say, of the experiences available, where we could put our investments into improving user experiences, where should you put your dollar? And we had to do a mathematical exercise, create an empirical argument for, that's the right place to put the dollar, and then we had to hope that it worked. That we weren't lying. So we created systems of accountability for ourselves: in order for us to advocate for this new type of measurement, it needs to meet these criteria. And we set out a protocol for how we were going to check ourselves. And before we were ready to really roll full scale and say, this is truth, listen to us, we wanted to make sure that we were actually representing the lived experience of our people, the customers that we have. So we took that media mix modeling approach: how can we model where return on effort into fixing user experiences would give us that outcome?
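A minimal sketch of the incremental-dollar mechanic described here, in Python. The experience names and the diminishing-returns curves are hypothetical stand-ins for illustration, not Indeed's actual model; the point is only the logic of asking where the next unit of investment buys the most improvement.

```python
import math

# Hypothetical diminishing-returns curves: estimated satisfaction lift as a
# function of dollars invested in fixing each experience. The names and
# curve shapes are illustrative assumptions, not real measurements.
RETURN_CURVES = {
    "job_posting_flow": lambda x: 5.0 * math.log1p(x / 100),
    "candidate_search": lambda x: 3.0 * math.log1p(x / 40),
    "messaging": lambda x: 2.0 * math.log1p(x / 10),
}

def allocate(budget: int, step: int = 10) -> dict:
    """Greedily give each incremental `step` dollars to the experience
    whose curve gains the most from that next increment."""
    spend = {name: 0 for name in RETURN_CURVES}
    for _ in range(budget // step):
        best = max(
            RETURN_CURVES,
            key=lambda n: RETURN_CURVES[n](spend[n] + step) - RETURN_CURVES[n](spend[n]),
        )
        spend[best] += step
    return spend

print(allocate(budget=1000))  # prints the greedy split of $1,000
```

Because the curves are concave, each greedy increment is safe: the loop is answering the same "where should the next dollar go" question at every step.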
7:50
So what was that process like then?

Lots of math.

Yeah, I would imagine, right? What's the process then of convincing people to listen, you know? Again, people get really stuck in their ways. This is a change management thing as much as it is a measurement thing, right?

You hit it on the nail, or you hit the nail on the head. It's a change management thing. People are coming in with their own set of expectations, biases, baggage. This is what it means for me, either in good or bad terms. And the approach I like to take is just being brutally honest. This is what's going on. Here's how, moving forward with this, here's what you can expect as a consequence. I'm not going to tell you what to do. But if you're going to do this, here's your consequence. I'm giving you other options, and by the way, I've got data to back it up.

Yeah, I mean, that's the key thing, right? Instead of a feeling or a hunch or "I did this at this other place," you've got the incremental improvements, right?

Yes. It's working in a theoretical and hypothetical space, but we adhere to all the rules of statistics that have come to us through, like, the current philosophy of science, and that's how we construct our argument.

Yeah, yeah. So how did that go, I guess, at first? Like, with a new metric?

Yeah. There were some people who saw what we were trying to do, and they were like, yeah, let's go. Yeah. There were other people who were more, they had been burned in the past, I guess is the best way to say it. We work with very smart, intelligent, experienced people, and Indeed was not their first job. So they have the baggage from wherever they were and whatever research team did that thing. So when they hear Adam saying, this is what we're going to do now, or don't do that, or whatever, I have to acknowledge that they're also a human being who has their own set of experiences, baggage, whatever. And the connection I try to make with them is, we're both here to do the same thing. We both want the same good outcomes. What questions do you have? If there are things that I can do to make you feel more comfortable, I'd like to know what it is. It may just be a matter of, I didn't say it on that slide. So it's having frank conversations.

Well, and this is where it comes down to, you know, there's lots of talk about data-driven decision-making, but this is the culture shift part of that, right? It's, again, I did this thing at this other place and it worked really well, so it'll, you know, work again at this new place in different circumstances.

Our context is different, and sometimes that's part of the argument. When I'm convincing people, it's like, yeah, over there it worked, and I can see why it would, or I can also see why it would fail miserably. It's that idea that no context is exactly the same, the success depends upon your context, and bringing that out, having empirical arguments.
10:56
We all know data is valuable. We use it to tell a story, to make informed decisions for our businesses, but turning data into actionable insights can be a challenge. It's time to unlock the true potential of your business data with Domo's AI and data products platform. Domo lets you channel AI and data into innovative uses that deliver a measurable impact. Ask your data anything at any time. Anyone on your team can use Domo to easily prepare, analyze, visualize, automate, and distribute data, all amplified by AI. Domo goes beyond productivity. It's designed to transform your processes, helping you make smarter and faster decisions and drive real growth. All powered by Domo's trust, flexibility, and years of expertise in data and AI innovation. Data is hard. Domo is easy. Make smarter decisions and propel your business forward with Domo. Learn more today at Domo. That's ai.domo.com.
11:54
Want to learn more and join the discussion about marketing and AI? Attend a premier conference dedicated to marketing and AI. That's MAICON, the Marketing Artificial Intelligence Conference, from October 14 through 16 in Cleveland, Ohio. MAICON brings together the brightest minds and leading voices in AI. Don't miss this opportunity to connect with a dynamic community of experts, visionaries, and enthusiasts. The Agile Brand is proud to be the lead media sponsor of this important event. Register today at marketingaiinstitute.com. That's marketingaiinstitute.com, and use the code AGILE150 for $150 off your registration fee. I can't wait to see you there.
12:30
...versus the sustainable growth thing, you know. A lot of this is, there can be quick wins when you do, when you make any change sometimes, or any good change. But how do you look at balancing between, you know, let's get some of those quick wins under our belt, versus, okay, this is going to achieve sustainable growth, which is hard to project?

I had the long-term vision in my head. I knew what I wanted to do. I knew where I needed more information, like... here's where it might fail. But I had to start with the quick wins. It's just like any other product that you push out. Your first users are going to be your best sources of feedback. They're going to say, OK, Adam, here's how you and your team maybe thought about this differently. Here's feedback I would have, because here's how my context will change how you want to approach something. So the quick wins are necessary, but you have to know where you're going. The corollary to your question is people who do a bunch of quick wins but don't know where they're going. And that's how you get cruft. Like, that's what it is. So thinking about your research, I try to think about my research programs as accretive: whatever we're doing today is building on what we did yesterday, and we'll build on it tomorrow.

So, yeah, the quick wins, and I'm a huge fan of that approach, but to your point, it's almost as much about, I mean, there's benefit to the business, ideally there's benefit to the customer as well with those, but it's also about kind of winning hearts and minds, right? Is that, that's a big part of it?

You have to show that it works. Yeah. If somebody's going to say, hey Adam, I want your stuff on my surface, because I want to use your tool to, like, make sure that our user experience is good, I want to deliver on that. I want to actually say, yes, I helped you do that.

Yeah. So what would your advice be to leaders? Let's say they're not sitting in the research or the data component of the organization, but again, they read the same things I read about data-driven decisions. They know somewhere in their head that there's a lot of value here, but they're having a hard time kind of getting past, whether it's biases, whether it's other loud voices in the organization. What's your advice to them to kind of just make the first step?

Well, that's a, that's a deep question.

I'm sure it depends, too.

Well, Kierkegaard would say take your leap of faith, stare into the abyss. And I do take kind of a similar approach. I talk about the series of experiences, the bag of lessons learned, that helps us decide how we should move forward. We may not have all of the information we need to make a decision, but we have a lot of information that can help us know whether we're making a bad decision. Check yourself before you wreck yourself. Have your internal system of accountability, and then at some point you just need to go.

I remember when I was at the media briefing the other day, you had mentioned, you know, Indeed is unique compared to some organizations in that, being a technology company, access to data and things like that is a little more prevalent than in some, maybe more legacy, companies. How much do you credit, you know, having that access to data? Like, is it always a good thing? Are there good and bad things about it?

I think it's good to have access to it. There's a barrier to entry. You have to know how to type in the right query. You have to know what are the signals that maybe you didn't do the query correctly. There's a lot of self-awareness that goes into working with that. The people who primarily work in that space, this is what they do all day long, so they have those tips and tricks to be conversant in that kind of work. I viewed, or I continue to view, democratization of information as inherently a good thing. My concern comes with, do we have the appropriate guardrails in place? Have we checked ourselves before we wreck ourselves?
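A small sketch of what one such guardrail can look like in practice: before trusting the output of a self-serve query, check for the signals that it may have been written wrong, such as an empty result or a join that fanned out. The table name and the specific checks below are hypothetical, for illustration only.

```python
import sqlite3

def run_with_guardrails(conn: sqlite3.Connection, sql: str) -> list:
    """Run a self-serve query, but flag two classic signs of a bad query:
    an empty result, and a row count that exceeds the underlying data."""
    rows = conn.execute(sql).fetchall()
    if not rows:
        raise ValueError("Query returned nothing; check filters and date ranges.")
    # Hypothetical sanity check: a per-response query can never return more
    # rows than there are survey responses; more suggests a fanned-out join.
    (n_responses,) = conn.execute("SELECT COUNT(*) FROM survey_responses").fetchone()
    if len(rows) > n_responses:
        raise ValueError("More rows than responses exist; a join likely fanned out.")
    return rows
```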
17:19
Well, a lot of, yeah, and I'm a huge fan of it as well, as well as data literacy, as well as the ethical components, of course. How do you make sure that there's kind of a unified version of the truth with all that? Because again, it's a great thing to have access, but everybody's asking their own questions, and so on and so forth.

I'm not the only person participating in this conversation. And there are times where I know my data is not the data somebody needs at that point. I should respect that, because the data I have to offer is not the data they need at that moment. I try to think instead, how do I participate in that conversation? How do I add an additional layer, let people know that this layer exists, that they can use it to ask questions of their own? One of the important aspects of the program I talked about was that it didn't just live in Qualtrics. It was actually married with all of our other data. So people could do their own investigations, and they could use our data source as evidence in, I know you don't want to use the word argument, but in whatever. You can use it. I feel like I'm not answering your question.

Well, I mean, it's kind of a broad question. You know, I think it is, and I think this is where platforms and, kind of, we do need some kind of unified, like, source of truth. At the end of the day, this is about happier customers and longtime customers. And so, you know, at the end of the day, that's the goal, right? So if there's many different answers to that, maybe there could be some challenges there.

I guess where I'm getting at is, I don't want to preclude somebody from having access to this. But I do want to make sure that when people are accessing it, they understand what they're accessing. We do that through documentation. We say, this is what you'll find in this database, in this data frame. We have lots of information about what this program is. How can you use this information? Oh, do you have questions about how we came up with it? Look over here. We try to make sure people are empowered to use the data in the correct way, or in the intended way. Correct makes it sound like there is a right and wrong. In the intended way.
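One lightweight way to do the kind of documentation described here is to publish a small data dictionary alongside the shared table, so self-serve users understand the grain and the caveats before they query it. Everything below, the table, columns, and caveats, is a hypothetical illustration, not Indeed's actual schema.

```python
# Hypothetical data dictionary shipped alongside a shared satisfaction table.
DATA_DICTIONARY = {
    "table": "employer_satisfaction_scores",
    "grain": "one row per survey response",
    "refresh": "daily",
    "columns": {
        "account_id": "Employer account; joinable to other product tables.",
        "experience": "Product surface the respondent rated.",
        "sat_score": "1-5 satisfaction rating; weight by response volume "
                     "before averaging across experiences.",
        "responded_at": "UTC timestamp of the survey response.",
    },
    "methodology": "Link to the program docs for sampling and weighting.",
}

def describe(column: str) -> str:
    """Let analysts look up what a column means before they use it."""
    return DATA_DICTIONARY["columns"].get(column, "Undocumented; ask the research team.")

print(describe("sat_score"))
```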
20:01
Well, and I would imagine, in empowering them as well, you're also getting some amazing ideas that one person with access centrally would not have.

This goes back to that idea of an accretive understanding. Yeah. We don't have to be the only source of truth. But what we're doing is we're adding to our knowledge.

Absolutely, love it. Well, as we wrap up here, just a couple questions for you. I know we're almost wrapping up the event here, but, you know, wanted to ask what's been a highlight so far for you at Qualtrics X4?

I usually enjoy the day two keynotes. Day one is very product forward, like, look at the cool stuff we're doing. Day two is more, how should you think about doing your work? And I always enjoy those reframings, because when we're just going about our day, we do what we do. And sometimes we need to stop and listen to, how do other people do it? How are they approaching this existential question? And I always enjoy those.

Yeah. So one last question for you, I like to ask everybody: how do you stay agile in your role, and how do you find a way to do it consistently?

Change is inevitable. On my team, I don't have a lot of process. I do not have a lot of, you must do this, then this, then this, then this. Because I know as soon as you build the process, something's going to change and it all just goes out the window. For myself and my team, I try to make sure that we're all oriented around what problem we're trying to solve, and we understand that the way we do that will be flexible.

Yeah. Love it.
21:50
Well, again, I'd like to thank Adam Hagerman, director of UX research for employer products at Indeed, for joining the show here at Qualtrics X4 in Salt Lake City. You can learn more about Adam, and Indeed, and Qualtrics by following the links in the show notes. Thanks again for listening to The Agile Brand, brought to you by TEKsystems. If you enjoy the show, please take a minute to subscribe and leave us a rating so that others can find the show as well. You can access more episodes of the show at theagilebrand.com. That's theagilebrand.com. And contact me if you're interested in consulting or advisory services, or are looking for a speaker for your next event. Go to www.gregkihlstrom.com. That's g-r-e-g-k-i-h-l-s-t-r-o-m.com. The Agile Brand is produced by MissingLink, a Latina-owned, strategy-driven, creatively fueled production co-op. From ideation to creation, they craft human connections through intelligent, engaging, and informative content. Until next time, stay curious and stay agile.
22:53
They're not just a company, they're your plant partners who've been perfecting their craft for 60 years. They deliver beautiful, high-quality, easy-to-care-for plants. They even offer virtual plant consultations and an insider club for rare plant access. Use the code COSTAFARMS15 for a 15% discount on your first purchase. You can also purchase this unique plant brand at Lowe's, Walmart, Amazon, and Home Depot. Go to www.costafarms.com today.
23:17
Before we continue, I wanted to share a key strategic resource that a majority of the Fortune 500 are already aware of. Finding the best technology, business, and talent solutions is not easy. With business demands and competitive pressures mounting, you need to be able to design, deploy, and optimize your technology to provide leading customer experiences while driving business growth. Those of you that have been listening to this show for a while know that this podcast is brought to you by TEKsystems, a global provider of technology, business, and talent solutions for more than 80% of the Fortune 500. TEKsystems accelerates business transformation for their customers. Whether you're looking to maximize your technology ROI, drive business growth, or elevate customer experiences, TEKsystems enables enterprises to capitalize on change. Learn more at TEKsystems.com. That's t-e-k-systems.com. Now let's get back to the show.