Episode Transcript
0:03
What does it take to hire
0:05
at scale? When scale means
0:07
hundreds of thousands of roles
0:09
globally every year across fulfilment
0:11
centres, corporate offices, tech hubs,
0:13
headquarters and warehouse facilities? And
0:15
how do you do it
0:18
efficiently, effectively, fairly and powered
0:20
by data from start to
0:22
finish? I'm David Green and
0:24
today on the Digital H.R.
0:26
Leaders podcast, I'm joined by
0:28
Ashish Paruka, Director of Data
0:30
Science and Global Head of
0:32
Talent Acquisition Analytics at Amazon to
0:35
explore exactly that. With a career
0:37
steeped in analytics and a front
0:39
row seat to one of the
0:41
most complex and sophisticated hiring machines
0:43
in the world, Ashish brings a
0:45
unique perspective on how to make
0:47
people data work at both volume
0:49
and depth. In our conversation we
0:51
dive into how Amazon approaches the
0:53
twin challenges of high volume and
0:55
high-stakes hiring, how analytics is
0:57
used to optimize cost, quality and
0:59
fairness, and what it really takes
1:01
to assess top talent without bias.
1:03
We also look beyond Amazon to
1:05
the broader people analytics profession.
1:07
What skills matter most? Where
1:09
do the biggest opportunities lie? And
1:11
how do organisations just getting
1:14
started avoid falling into the
1:16
trap of data theatre and
1:18
instead focus on driving real
1:20
impact? So if you're leading
1:22
talent strategy, building or scaling
1:24
your analytics capability, or just
1:26
wondering how to do more
1:28
with less and do it
1:30
smarter, this episode is one
1:33
for you. So without further
1:35
ado, let's get the conversation
1:37
started. Ashish, thanks for joining
1:39
me today. To start the
1:42
conversation off, could you share
1:44
a little bit about your
1:47
career journey that has led
1:49
you to where you are
1:51
today and your role at
1:54
Amazon? Thank you David first of all for
1:56
having me on your podcast and all the great
1:58
work you do for the field. I'm a big
2:00
fan and really glad to be
2:02
on your show. So a little
2:05
bit about me. So when I
2:07
did electrical engineering and data science,
2:09
I did not anticipate being in
2:12
the people analytics space. But,
2:14
you know, early part of
2:16
my career was exploration of
2:18
different roles like data science,
2:21
marketing, product. And then it
2:23
was really some key mentoring
2:25
conversations that drove me to where
2:27
I am. And I'll talk to
2:29
you about the one about people
2:31
analytics. So I had been working
2:33
in product for about 10 years,
2:35
really enjoying talking to the
2:38
customers, understanding their needs,
2:40
bringing tangible things that help
2:42
people and make a positive
2:44
impact, and also bringing in
2:46
technology and analytics to do
2:48
the good work in that
2:50
space. And I had a few options
2:52
for different roles and I was having a
2:55
conversation with my mentor on like, which one
2:57
should I go with? And she said, you
2:59
know, all these roles are great, but she's
3:01
like, when I see sparkle in your
3:03
eyes is when you're talking about analytics
3:06
and when you're talking about people.
3:08
And by the way, there is a
3:10
role for that called people analytics. And
3:12
I was like, oh, wait, HR. And
3:14
you know, she caught me by surprise.
3:16
And you know, it's funny how sometimes
3:18
others know you better than you do.
3:20
And she said, you know, you're trying
3:22
to change your company's hiring strategies and
3:24
performance management, and that's not even your day
3:27
job. And you bring so much energy
3:29
to those work streams that you should
3:31
really look into it. And you know,
3:33
I respect her a lot. So I
3:35
looked into it, and the more I learned,
3:37
the more excited I got. Well, that was some
3:39
very, very prescient advice that you
3:41
got there. So you were clearly a
3:44
natural fit for people analytics.
3:46
So when you found out more
3:48
about people analytics, what really attracted
3:50
you to the field, and maybe
3:53
an add-on question there, is
3:55
what advantages do you
3:57
feel that you had from working
3:59
outside HR and, if you prefer,
4:01
some of the skill sets that you could
4:03
bring into the field. No, that's a
4:06
great, great question. I think about this
4:08
as in three factors. One is purpose,
4:10
then the ability to make impact and
4:12
the scale of impact you can have,
4:14
and then do you have the skill
4:17
sets to actually make that impact, right?
4:19
So if you think about purpose, you
4:21
know, things like fairness go deep with
4:23
people, right? And everybody I think
4:25
has a story of when they
4:27
felt they were wronged, when things were
4:29
not really fair, and so on. And
4:31
I have that story too. So
4:33
when you're working in a space
4:35
that can help people get
4:37
opportunities that are based on
4:40
merit and things that are more fair,
4:42
I think there's a sense of
4:44
purpose to it and a sense of mission
4:46
to it that you may not find
4:49
in a lot of other spaces. And
4:51
for me, that is the
4:53
number one factor that got
4:56
me interested in people analytics.
4:58
Let's talk about the work that you're doing at Amazon.
5:00
Now obviously everyone listening to this
5:02
show will know that Amazon is
5:04
one of the largest organizations
5:06
in the world and I can
5:09
only imagine what the scope and
5:11
volume of hiring will be and
5:14
obviously in your role as global
5:16
head of talent acquisition analytics you're
5:18
looking at all the data that supports
5:21
that process. How does the scale
5:23
and scope of talent acquisition at
5:25
Amazon differ from your previous
5:27
roles? We are hiring hundreds of
5:29
thousands of hourly employees, tens of
5:32
thousands of corporate employees each year,
5:34
right? And that might be the
5:36
size of entire companies in many
5:39
places. But I do think the
5:41
biggest difference is having skin in
5:43
the game. What I mean by
5:46
that is as my team is
5:48
building models and putting them into
5:50
production that drive marketing, that drive
5:53
the hiring funnel, so
5:55
for example in hourly hiring. If
5:57
our models are wrong or
5:59
they are down because of a system
6:02
issue, that directly impacts if we
6:04
hire the exact number of people,
6:06
we need to hire at a
6:08
location, which impacts if packages get
6:10
delivered to people on time, right,
6:12
which impacts the bottom line of
6:14
Amazon. So that direct accountability
6:17
for something that you do that
6:19
impacts the company's bottom line, it's
6:21
very different, right? In many other
6:23
places, you might have people who are
6:26
only interested in doing reporting or
6:28
making interesting insights, and then the operations
6:30
team has to take those insights
6:33
and make them a reality. Here,
6:35
teams are structured for single thread
6:37
ownership and what you build you
6:39
put into production and it affects
6:41
things in market, right? Even on
6:43
the corporate side, you know, we
6:45
ran six million assessments last
6:47
year, right? And this system has
6:49
to be on globally 24-7 and
6:52
if it's down for an hour,
6:54
it can affect tens of thousands,
6:56
lots of different businesses like Prime,
6:58
AWS, you know, different devices
7:00
and you might lose out on
7:03
a really important candidate who could
7:05
build the next big thing.
7:07
I think that accountability is
7:10
a big difference, but with that
7:12
accountability also comes lots of
7:14
benefits that make it a
7:16
pleasure to work here, right?
7:18
With scale, you can actually
7:21
do advanced analytics
7:23
and science and show
7:25
what matters versus what doesn't
7:27
matter. What are the myths
7:29
versus what are the realities. With
7:31
the underlying tech that
7:33
AWS offers us, you can have sensors
7:36
everywhere and make sure no
7:38
data is lost in the
7:40
ether, right? You have
7:42
reliable, always available data, which
7:44
is massive if you're
7:46
trying to do science.
7:49
And also the company culture
7:51
being that of data-backed decision making
7:53
is really critical. That was one of
7:55
the main questions I had when I
7:57
was switching from the business side over
7:59
to people analytics, is like,
8:01
okay, even if you come up with
8:03
great insights, are our people, our executives,
8:05
our companies really gonna make a change,
8:08
or are they gonna say like, yeah,
8:10
in theory, data back decision making, all
8:12
of us should do it, but in
8:14
my unique situation, it doesn't apply,
8:17
right? My judgment is, you know,
8:19
more important and there is tremendous
8:21
room for judgment always with
8:23
people, right? But there are things where
8:27
numbers can get you more right
8:30
than wrong, and we should leverage that. How
8:30
does work really get done
8:32
in your organisation? Worklytics helps
8:35
companies measure collaboration, productivity and
8:37
AI adoption, using real data,
8:40
not guesswork. With insights into
8:42
meetings, tool usage and work
8:45
patterns, Worklytics helps enterprises optimize
8:47
team performance to get more
8:50
done faster. They're currently offering
8:52
podcast listeners a complimentary AI
8:54
adoption assessment. Gain insight into
8:57
your organization's AI usage and...
8:59
and unlock its
9:02
full potential.
9:04
Limited spots
9:07
are available.
9:10
Learn more
9:12
at worklytics.co/ai.
9:15
That's W-O-R-K-L-Y-T-I-C-S dot C-O slash A-I.
9:28
What are some of the
9:31
innovations or tools that you've
9:33
implemented to optimize Amazon's
9:35
high volume hiring processes?
9:38
I'm happy to talk about that, you
9:40
know, things like AWS Connect
9:42
help us build our own ATS system
9:45
that can scale and have
9:47
the fungibility and quick
9:49
iteration that we need, as
9:51
well as, you know, things like SageMaker
9:53
help us build models and
9:55
put them in production. But let
9:57
me digress a little bit because
10:00
you know, it's most important
10:02
to solve the right problem,
10:04
right? And then the technology
10:06
should be at the end
10:08
of the journey, right? So let
10:10
me talk about it. Okay, what is
10:12
the problem? Right. So we
10:14
have 3,000 plus sites around the
10:17
world, right? And every week, a site
10:19
might tell me,
10:21
okay, I need five hires. The week
10:23
after that, I might need 33
10:25
hires. And the week after that,
10:27
that site may need 37 hires.
10:29
And the site next to it
10:31
might say 150 hires, then 45
10:34
hires and 55 hires, right?
10:36
So you have 3,000 distinct
10:38
streams of demand coming through
10:41
and then we have three weeks
10:43
of heads up to know like
10:45
how much demand will be.
10:47
So you have a short heads-up
10:49
time, we have unique demand,
10:51
and then you have the scale
10:54
of Amazon, right? So that's
10:56
the problem on one
10:59
side. On the other side, I
11:01
have 16 different dials or levers
11:03
I can pull to get exactly
11:05
the number of hires and I
11:08
need to get 95% accuracy
11:10
in this operation. So now, with
11:12
all these moving pieces, your
11:14
head can start spinning. So
11:17
that's the problem. Now let's
11:19
think about the context, right?
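To make the shape of that problem concrete, here is a minimal, hypothetical Python sketch of the weekly demand-versus-levers decision described above. The lever names, numbers, and the greedy planning rule are invented purely for illustration; this is not Amazon's actual model.

from dataclasses import dataclass
from typing import List

@dataclass
class Lever:
    name: str
    expected_hires: int  # hires this lever is expected to add in a week (made up)

def plan_week(site_demand: int, levers: List[Lever], target: float = 0.95):
    """Greedily pull the biggest levers until expected hires cover the site's weekly demand."""
    pulled, expected = [], 0
    for lever in sorted(levers, key=lambda l: l.expected_hires, reverse=True):
        if expected >= site_demand:
            break
        pulled.append(lever.name)
        expected += lever.expected_hires
    coverage = min(expected, site_demand) / site_demand if site_demand else 1.0
    return pulled, expected, coverage >= target

# Illustrative levers; a real system would have many more (the episode mentions 16).
levers = [Lever("paid_social_ads", 20), Lever("referral_bonus", 10),
          Lever("job_fair", 15), Lever("sponsored_search", 25)]

# One site's demand over three weeks, echoing the 5 / 33 / 37 example above.
for week, demand in enumerate([5, 33, 37], start=1):
    pulled, expected, on_target = plan_week(demand, levers)
    print(f"week {week}: demand={demand}, pulled={pulled}, expected={expected}, on_target={on_target}")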
11:21
So six years ago, Amazon
11:24
used to outsource this hiring.
11:26
And we didn't have an internal
11:29
team hiring in the hourly space.
11:31
Six years ago, we decided, like,
11:33
okay, let's bring it in-house. And
11:35
in the first year, we needed to
11:37
double the hiring, right? And imagine doing
11:39
it for the first time and you
11:42
have to double. And then in the
11:44
next two years, we had to go 6X,
11:46
right? And a lot of that had to
11:48
be built based on judgment and
11:50
subject matter experts. So, you know,
11:52
about three years ago when I
11:54
joined, we had this massive operation.
11:56
For example, in marketing we had
11:59
300 experts sitting in different media
12:01
markets, and telling us, like, hey,
12:03
when to start, when to stop
12:05
marketing, what channels, how much to
12:08
spend. And we don't have all
12:10
this data captured. So we know
12:12
we are getting the highest we
12:14
need, but we don't know the
12:16
intelligence behind, like, what is the
12:19
theme? Like, how are we really
12:21
making decisions? Where are we making
12:23
good decisions versus not? So the
12:25
first step was actually capturing all
12:27
the data, right, like a lot
12:29
of the time people really underestimate
12:32
investment in data and get
12:34
distracted by shiny AI topics. If
12:36
you don't invest in data, you're
12:38
really capping the upside of what
12:40
you can achieve. So the first
12:43
year was really about capturing data
12:45
on what are the inputs people
12:47
are using to make decisions, how
12:49
are they making decisions, when do
12:51
the outcomes actually pan out versus
12:54
not. And the next step was
12:56
just to put some simple rules
12:58
around it. Like, hey, what is
13:00
the common theme across our whole
13:02
network when certain decisions work versus
13:04
others don't? So some of the
13:07
decisions started getting automated through rules,
13:09
there was still a lot of
13:11
judgment involved, right? And that alone
13:13
saved or reduced our cost by
13:15
50%, right? And then the year
13:18
later, we were able to reduce
13:20
the cost by 90% by taking
13:22
the rules and replacing them with
13:24
machine learning. Right. So there is this
13:26
journey, right, of capture the data,
13:29
simple things first, put it into
13:31
market, even if it's not perfect,
13:33
learn from it, and then go
13:35
to more advanced machine learning models.
13:37
And even with machine learning models,
13:39
it's not fully automated. We still
13:42
have judgment in the right spots.
13:44
Right. So we've, we've done that
13:46
transition over time. And when you
13:48
come to that final stage, and
13:50
that's where I can talk about
13:53
AWS and SageMaker and how
13:55
it can help us, you
13:57
know, run these models globally
13:59
for these 3,000 sites and the
14:01
thousands of hires that we are
14:04
making at that scale and be
14:06
accurate, right? But it comes at
14:08
the end of the journey, right,
14:10
of prototype and iterate over time.
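As a rough, hypothetical illustration of that "capture the data, start with simple rules, then replace them with machine learning" journey, the sketch below uses scikit-learn purely as an example library; the feature names, thresholds, and data are assumptions, not the production system.

import numpy as np
from sklearn.linear_model import LogisticRegression

def rule_based_decision(days_to_start: int, open_roles: int) -> bool:
    """Step 1: a simple, explicit rule distilled from subject-matter experts (illustrative)."""
    return days_to_start <= 14 and open_roles >= 10  # e.g. start marketing spend

# Step 2: once inputs, decisions and outcomes are captured, fit a model on that data.
# X columns: [days_to_start, open_roles]; y: whether the decision paid off (made up).
X = np.array([[7, 30], [21, 5], [10, 12], [28, 40], [3, 8], [14, 25]])
y = np.array([1, 0, 1, 1, 0, 1])
model = LogisticRegression().fit(X, y)

# Step 3: the learned model replaces the rule, with human judgment kept for edge cases.
for days, roles in [(5, 20), (25, 3)]:
    rule = rule_based_decision(days, roles)
    learned = bool(model.predict([[days, roles]])[0])
    print(f"days_to_start={days}, open_roles={roles}: rule={rule}, model={learned}")

In production, a trained model like this could then be deployed behind an endpoint (for example with SageMaker), but the modelling step itself is deliberately kept simple here.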
14:12
Let's switch a little bit to
14:14
corporate hiring. Now you'll obviously do
14:17
a lot on the corporate hiring
14:19
side at Amazon, but obviously not
14:21
anywhere near the volume that
14:23
you're doing just on
14:25
the hourly paid workers. What are
14:28
you currently doing to ensure that
14:30
you're acquiring the top talent in
14:32
the space, but also assessing them
14:34
in a fair and unbiased manner.
14:36
Let me start personally by saying,
14:39
like my mission is to minimize
14:41
guesswork and bias when it comes
14:43
to people's decisions. Like that's what
14:45
drives me, right? And the corporate hiring
14:47
space is where that is probably
14:49
most applicable. So if you think
14:52
about our hiring experience today for
14:54
most companies, from the candidate's perspective, they
14:56
feel that they're tossing their resume
14:58
on a pile of thousands of
15:00
resumes. They don't know if anybody's
15:03
even gonna take a look
15:05
at it, and probably
15:07
their baseline expectation is not to
15:09
hear back from anyone when they
15:11
apply, right? And they're
15:13
told okay, you need to network
15:16
with the hiring manager or the
15:18
recruiter da da da da da
15:20
and then there's a lot of
15:22
luck involved in whether you even
15:24
get a fair shot at showing
15:27
what you can do and how
15:29
that relates to the job.
15:31
Right. And then on the flip
15:33
side, if you're thinking about the
15:35
recruiter, you've got stacks of thousands
15:38
of resumes, right? It's not practical
15:40
for you to really evaluate all
15:42
thousand resumes, right? So you put
15:44
some filters and narrow things down
15:46
and you're able to maybe look
15:48
at a few hundred and, you
15:51
know, 15 to 30 seconds at
15:53
a pop, right? And in that
15:55
time, it's a really tough job
15:57
to pick out of that pile,
15:59
really the top candidates, right? And
16:02
that's where, again, you know, human
16:04
judgment and machines together can do
16:06
a better job. So we have
16:08
automated resume reviews, we have online
16:10
assessments that we have scaled for
16:13
like hundreds of job families. So
16:15
we have 455 job families for
16:17
which we have online assessments. We
16:19
ran 6 million plus assessments last
16:21
year. And the thing I love
16:23
about assessments is it's an opportunity
16:26
for a candidate to show what they
16:30
can do, right? Here are the skills
16:30
that are important for the job.
16:32
Show me what you can do.
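As a toy illustration of skills-based matching in that spirit, and emphatically not Amazon's assessment or resume-review technology, the sketch below scores a candidate's stated skills against the skills each job family asks for; the job families and skill lists are invented.

job_families = {
    "data_engineer": {"sql", "python", "spark", "data_modeling"},
    "io_psychologist": {"psychometrics", "statistics", "survey_design"},
    "product_manager": {"roadmapping", "stakeholder_management", "analytics"},
}

def rank_job_families(candidate_skills):
    """Rank job families by the share of required skills the candidate covers."""
    scores = []
    for family, required in job_families.items():
        overlap = len(candidate_skills & required) / len(required)
        scores.append((family, round(overlap, 2)))
    return sorted(scores, key=lambda item: item[1], reverse=True)

candidate = {"python", "sql", "analytics", "statistics"}
for family, score in rank_job_families(candidate):
    print(family, score)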
16:34
Right. And so we had, we
16:37
gave people six million plus merit-
16:39
based shots to get a job
16:41
at Amazon, right? And for some
16:43
candidates that can truly be life
16:45
changing and it could be life
16:48
changing, you know, for the company
16:50
and our customers because, you know,
16:52
if you hire the top talent
16:54
who was best suited for that
16:56
job, then we can make a
16:58
bigger difference for the customers. And
17:01
we also have machines helping candidates
17:03
navigate, right? So each company has
17:05
different titles and different ways of
17:07
organizing functions. And so when candidates
17:09
come in, they can upload a
17:12
resume and we can help them
17:14
direct like, hey, based on your
17:16
skills, here are actually the jobs
17:18
you might be best suited for
17:20
at the right levels and job families. So there's
17:23
a lot of technology that can
17:25
help improve both the candidate experience
17:27
as well as the recruiter experience
17:29
to create the best match in
17:31
the end between the job and
17:33
the person. And then that's really
17:36
what we are going for in
17:38
the corporate space. Very good. Very
17:40
good. That's 6 million assessments. Wow.
17:42
Yeah. Yeah. That's a lot of
17:44
data as well, obviously. Yes, that's
17:47
a ton of data, but it
17:49
also requires some incredible subject matter
17:51
expertise in things like I/O psychology,
17:53
as well as engineering to make
17:55
sure you can deploy these
17:58
assessments and keep them up 24-7
18:00
all through the year. So a
18:02
lot of amazing work that the
18:04
team is doing. Yeah, that's really
18:06
good. So on the topic of
18:08
corporate hiring, let's turn the lens
18:11
on people analytics itself, which in
18:13
some respects is still an emerging
18:15
field, still growing. What would you
18:17
say to someone considering a career
18:19
in this area, and then by
18:22
extension, what would you say are
18:24
the top skills that you look
18:26
for when hiring people
18:28
analytics professionals? Yeah. So my pitch
18:30
for anybody who's, you know, maybe
18:32
on the tech side, business side,
18:35
to come over to people
18:37
analytics is the same as we
18:39
talked about. So, hey, this
18:41
is a really mission driven field.
18:43
You can make 10x, 100x difference
18:46
in this field. And a lot
18:48
of the skills that you learn
18:50
on the business side and the
18:52
tech side can translate very nicely
18:54
in this field, right? So it
18:57
can be a really fulfilling endeavor
18:59
and you can be in the
19:01
frontier of lots of problems that
19:03
haven't been solved yet. So that
19:05
would be my pitch for somebody
19:07
considering. And especially at Amazon, you
19:10
have the scale of data to
19:12
actually do the similar types of
19:14
analytics and science that you could
19:16
do on the business side. In
19:18
terms of what I would advise
19:21
people in general, if they're thinking
19:23
about career in this space, it
19:25
may not be too different from
19:27
a career in other spaces. I
19:29
feel going forward big wins are
19:32
going to be at the intersection of
19:34
functions. So let's take data engineering,
19:36
right? I feel especially not just
19:38
in people analytics, but with
19:40
the advances in AI, a key
19:42
skill that everybody's gonna need is
19:45
data engineering, right? And it's quite
19:47
scarce, actually. We find that finding
19:49
great data engineers is harder than
19:51
finding great scientists, just based on
19:53
the number of professionals in the
19:56
space. But if you are a
19:58
great data engineer, with advances in
20:02
tooling from, like, AWS and what
20:02
AI can do these days, going
20:04
deep in data engineering alone may
20:07
not be enough. And we are
20:09
gonna need some people who are
20:11
like incredibly deep in data engineering,
20:13
but for most people, if you
20:15
can marry your skills of data
20:17
engineering by also understanding the process,
20:20
understanding the business, then you can
20:22
make the right decisions on where
20:24
to put the sensors, what is
20:26
it that you need to capture,
20:28
how do you prioritize work, right?
20:31
And that's going to be most
20:33
valuable to your end customer. That's
20:35
what's going to be helping you
20:37
make the most impact, right? So
20:39
intersection of those two things. Let's
20:42
talk about science, right?
20:44
I talked about like in selection
20:46
space, you need deep expertise in
20:48
I/O psychology. But if you marry
20:52
that with ML, you can scale
20:52
so much faster, right? Like one
20:55
of the biggest challenges I've heard
20:57
in the time I've been in
20:59
the people analytics space is like,
21:01
it takes too long for us
21:03
to build selection mechanisms, right? Like,
21:06
it takes 18 months, right? Oh
21:08
my God, like the world changes in
21:10
18 months, right? But how can
21:12
we marry multiple? So like, you
21:14
know, we had automated resume review
21:17
for 20 job families last year
21:19
and now we have it for
21:21
220, right? Similar scale of growth in
21:23
other areas like online assessments. So
21:25
how can we marry different sciences
21:27
together to scale? In product, I
21:30
think that's where probably three intersections
21:32
come in. With product as
21:34
a discipline, it is really hard to
21:36
find great product managers, but if
21:38
you can do product and software
21:41
and science, that's a killer combination,
21:43
right? It really is. And if
21:45
you think about where the world
21:47
is going, right? You are going
21:49
to need all these, these skills
21:51
to really harness the goodness
21:54
that AI can bring you. It's
21:56
going to be science, it's going to
21:58
be software and it's going to
22:00
be product. So in my, in
22:02
general, I would say intersection of
22:05
things is where big wins are.
22:07
So if you are early in
22:09
your career, explore versus maximize, right?
22:11
If you just stay in one
22:13
function and maximize, you may be
22:16
limiting the upside long term and
22:18
might be better off early in your
22:20
career going broader and collecting a
22:22
broader set of skills so that
22:24
you are building a good foundation
22:26
for a greater future in
22:29
the long term. I want to
22:31
take a short break from this
22:33
episode to introduce the Insight 222
22:35
People Analytics Program, designed for senior
22:37
leaders to connect, grow and lead
22:40
in the evolving world of people
22:42
analytics. The program brings together top
22:44
HR professionals with extensive experience from
22:46
global companies, offering a unique platform
22:48
to expand your influence, gaining your
22:51
influence, tackle real-world business challenges. As
22:53
a member you'll gain access to
22:55
over 40 in-person and virtual events
22:57
a year, advisory sessions with seasoned
22:59
practitioners, as well as insights, ideas
23:01
and learning to stay up to
23:04
date with best practices and new
23:06
thinking. Every connection made brings new
23:08
possibilities to elevate your impact and
23:10
drive meaningful change. To learn more
23:12
head over to insight222.com forward slash
23:15
program and join our group of
23:17
global leaders. Interestingly, maybe it's just
23:19
two parts of this question as
23:21
well, you know, what roles are
23:23
you currently hiring for on your
23:26
team at Amazon? And then the
23:28
second part of that, you know,
23:30
how are the skills and capabilities evolving
23:32
over time, but the second part
23:34
is also, you know, how's... Where
23:36
do you envisage maybe in the
23:39
next 12 months, 18 months that
23:41
you might be hiring additional skills,
23:43
maybe new skills into
23:45
your team? Yeah, so we have
23:47
25 roles open right now. Given
23:50
the size of the team, we
23:52
typically hire about 50 people each
23:54
year. We've got roles in data
23:56
engineering, as I said, like, you
23:58
know. We have a large presence
24:01
in data engineering globally. We've got
24:03
software engineers for deploying things and
24:05
automated resume reviews. We also have
24:07
I/O psychology roles as well as data
24:09
science and machine learning roles. And
24:11
as always, I always would look
24:14
for product people who can do
24:16
all three, like product, science and
24:20
software. There is always a need
24:20
for that. So you would have
24:22
a range of roles available and
24:25
I would invite both people who
24:27
are in people analytics for
24:29
a long time as well as
24:31
those who are on the business side
24:33
to come take a look because
24:35
you know one of the hesitations
24:38
people may have from moving from
24:40
the business side is the scale
24:42
of data and access to technology
24:44
that you may have on the
24:46
business side. Do you have that
24:49
on the HR side as well? And
24:51
like at a company like Amazon
24:53
you do. So it might be,
24:55
if you're considering a change, this
24:57
might be a good place to
25:00
start, right? And then, then you
25:02
can go from there. And if
25:04
we can talk a little bit
25:06
about the product management piece, because
25:08
again, it's an area that we've
25:10
noticed growing in many people analytics
25:13
functions around the world, you know,
25:15
and we've identified a gap in
25:17
the research that we've done around
25:19
democratizing data and the adoption of
25:21
those products as well. And product
25:24
management plays a really important role
25:26
in that, doesn't it? Not just
25:28
around user experience, but it's
25:30
around making the products easier. Yeah,
25:32
products. I'd love to talk a
25:35
little bit to the product management
25:37
piece. That'd be great. So on
25:39
one side, if you think about
25:41
the users, they are on this
25:43
journey about data literacy and then
25:45
comes science literacy. So what they
25:48
deeply know is the problems they
25:50
have at hand today, right? But if
25:52
you ask them for
25:54
solutions, they might just tell you,
25:56
hey, help me do what I
25:59
do faster or easier, right? But
26:01
you may not get the transformative
26:03
idea from your users, right? So
26:05
as a product professional, you have
26:07
to understand the need, but then
26:10
also understand the technology to see
26:12
what is possible, right, to come
26:14
up with a function, come up
26:16
with a solution, and then you
26:18
have to sell that solution to
26:20
the customer, right? That's also really
26:23
challenging because they might not believe
26:25
you. Right. And then in the
26:27
end, as I said, they are
26:29
on the hook to deliver, right?
26:31
So you paint a picture which
26:34
may look like a crystal ball
26:36
and they might not sign up
26:38
for that solution. So like how
26:40
do you bridge? You deeply need
26:42
to understand the technology so that
26:45
you're not promising a crystal ball.
26:47
You're promising something real. And at
26:49
the same time, you have to
26:51
sell it to the customer so
26:53
that they believe it. And then
26:55
once you have that alignment, you
26:58
have to go build it, right?
27:00
And in building it, there is
27:02
such a dramatic curve of improvement
27:04
in technology that, you
27:06
know, what used to be technical
27:09
debt over years, you can accumulate
27:11
technical debt within months now, right?
27:13
The things you're building your infrastructure
27:15
on can be obsolete within months.
27:17
So how do you also stay
27:20
on the frontier of technology, where
27:22
things are going, to make the
27:24
right investment decision and build things
27:26
right, for your functionality to be
27:28
on the most modern technology. That's
27:30
the other side of the story.
27:33
So really incredible opportunity for people
27:35
who can do that. They can
27:37
really unlock the potential of all
27:39
the individual functional experts we have,
27:41
if you have great product people. So
27:44
as someone who's been deeply involved
27:46
and embedded in talent acquisition analytics,
27:48
and I know you were doing
27:50
similar before you came to Amazon
27:52
as well, what do you see
27:54
as the biggest opportunities for people
27:57
analytics to drive even further impact
27:59
with regards to talent acquisition? Yeah,
28:01
I think in talent acquisition space,
28:03
the biggest problem is quality of
28:05
hire, right? If you can nail
28:08
that, I think businesses would be
28:10
willing to invest a lot more
28:12
in this space, right? And it's
28:14
about measuring the quality of hire
28:16
and then showing you can actually
28:19
move the needle, right? In general,
28:21
even if you are just doing,
28:23
let's say, skill-based hiring and using
28:25
best practices, you're probably in the
28:27
top half of the companies anyway,
28:29
right, because a lot of companies
28:32
who are still not doing skill-based
28:34
hiring. They're not doing structured interviews,
28:36
right? These are like proven practices
28:38
for, you know, decades now. So
28:40
if you're not doing that, do
28:43
that, right? I think that would
28:45
be the first step. But even
28:47
if it's skill-based hiring, and you
28:49
know, given the limited time we
28:51
have to evaluate talent, you know,
28:54
a few hours, yeah, getting skill-based
28:56
signals is probably a good
28:58
way right now. But that still
29:00
leaves the unknown of can this
29:02
candidate put these skills
29:04
together to deliver the tasks that
29:07
they need to complete in the environment
29:09
of this company in the culture
29:11
of this company, right? So that
29:13
remains unknown. A lot of the
29:15
misses we end up having are
29:18
because people may have the underlying
29:20
skills but cannot put it together
29:22
or maybe cannot put it together
29:24
in this environment. It's a great
29:26
experience for the candidate to see
29:29
what it's really like to work
29:31
in a particular role in a
29:33
particular company, what they would be
29:35
asked to do day to day,
29:37
and can they pull all the
29:39
skills they have together, both the
29:42
soft skills and the functional skills,
29:44
to deliver the solutions that the
29:46
company needs, right? And that's where
29:48
I believe the world is going.
29:50
And if you are able to
29:53
do that, then you will be
29:55
able to measure more accurately
29:57
a person's ability to do high
29:59
quality work in that particular job
30:01
and it would be better for
30:04
the candidates as well because it's
30:06
a life-changing decision to you know
30:08
change a job move potentially to
30:10
another place and then you have
30:12
all these unknowns that you don't
30:14
know how they're going to pan
29:19
out, right? So the closer we
29:21
can get to a real-life job preview,
30:21
and I believe with AI we
30:23
can I think that would be
30:25
the next frontier. If we look
30:28
more generally at the field of
30:30
people analytics, you know, and this
30:32
leads on quite nicely. One
30:34
of the key topics for everyone,
30:36
business leaders talk about is productivity,
30:39
we've discussed it in the past.
30:41
You posted some excellent articles on
30:43
it a few years ago as
30:45
well. But for the benefit of
30:47
listeners, what role can people analytics
30:49
play in your opinion in terms
30:52
of unlocking productivity at the individual,
30:54
the team, and maybe the organizational
30:56
level as well? Great question. And
30:58
you know these days it's pretty
31:00
easy to jump to AI as
31:03
the answer for productivity. And you
31:05
know I always say it's most
31:07
important for us to solve the
31:09
right problem first right and get
31:11
the structure right and the technology
31:13
comes at the end. Like,
31:16
don't start with technology, right? And
31:18
speaking more broadly of people analytics
31:20
a big part of people analytics
31:22
is consulting with the business right
31:24
so if you're thinking about organizational
31:27
level productivity I still think it
31:29
is more about: how do we
31:31
make sure we're solving the right
31:33
problems? How do we make sure
31:35
we have the right people to
31:38
solve those problems? How do we
31:40
make sure those people have the
31:42
resources to solve the problem and
31:44
so on and so forth, right? So
31:46
making sure we have that structure,
31:48
right? And a lot of those
31:51
things might be more about having
31:53
clear decision-making mechanisms, right? Having a
31:55
great way to do workforce planning
31:57
to make sure your top talent
32:02
with the right skills are
32:02
in the right roles. And there
32:04
are areas where AI plays a
32:06
role in that, but it's not
32:08
just about AI, right? Like, you
32:10
know, when I talked about, you
32:13
know, we cut costs
32:15
by 50% and, you
32:17
know, improved accuracy in the high
32:19
90s, that had nothing to
32:21
do with AI, right. So it's
32:23
really important not to be jumping
32:26
to that, and instead do the
32:28
right problem solving techniques that break
32:30
down the big complex productivity problem
32:32
into components. Man it, make sure
32:34
that you can solve each of
32:37
the components, make sure you understand
32:39
the ecosystem and you can do
32:41
system thinking to figure out what
32:43
you should solve first before we
32:45
solve next and how, how, what
32:48
is a knock-on effect of solving
32:50
one thing on the another. And
32:52
then be aware of what AI
32:54
can do. in those steps. So
32:56
be aware of AI and apply
32:58
it when it is the right
33:01
tool for the job, right, versus
33:03
using AI as, you know, a hammer
33:05
looking for a nail, that's probably
33:07
not the right way to
33:09
approach productivity. That's really, it's really
33:12
interesting, Ashish, because I mean, when
33:14
Jonathan and I
33:16
were writing Excellence in People Analytics
33:18
and prior to that and we
33:20
weren't the only people saying it
33:23
as well, not going to claim
33:25
ownership on this. You know, when
33:27
People Analytics was maybe in the
33:29
mid-2010s, you know, lots of people
33:31
wanting to jump to the fanciest
33:33
type of analytics. I want to
33:36
do organizational network analytics and, you
33:38
know, we'd always say, well, what's
33:40
the problem you're trying to solve?
33:42
You know, let's properly define
33:44
what the problem is. Maybe have
33:47
some... questions that you want to
33:49
answer, some hypotheses that you want
33:51
to test. And only then, once
33:53
you've gathered the data together, do
33:55
you start thinking about, oh, what's
33:58
the solution for this, what, you
34:00
know, and that's when AI comes
34:02
in there, and not, as you
34:04
said, not, not a hammer looking
34:06
for a nail, I like that,
34:08
as an analogy. I'll give you
34:11
an example: analytics for our software organization,
34:13
right. And the team was quite
34:15
burnt out. and the customers were
34:17
not very happy about the outcomes.
34:19
And the team had received 400
34:22
different requests for projects through the
34:24
year, the previous year before I
34:26
joined. And they had all great
34:28
science and they had applied the
34:30
best available tools at their disposal,
34:32
but nobody was happy, right? So
34:35
I was like, okay, in that
34:37
situation, try to go as high
34:39
up as possible in terms of
34:41
customer. So run to the CIO
34:43
and ask him, okay, you're funding
34:46
this team, right? Tell me, because
34:48
of the work we have done,
34:50
is there something you are doing
34:52
differently this year compared to last year,
34:54
right? And he couldn't. Okay, nothing.
34:57
There was nothing that the tech
34:59
organization was doing differently because of
35:01
the 400 things we did last
35:03
year and the team is burnt
35:05
out, customers, nobody's happy. Right. Now,
35:07
I was like, all right, so
35:10
what if we did two things,
35:12
right? And really made a difference
35:14
in two things and what would
35:16
be your two things? At that
35:18
time, it was about hiring and
35:21
retention. All right, so we're gonna
35:23
staff 60% of the team on
35:25
these two problems, right? And then,
35:27
you know, there's always
35:29
this executive concierge,
35:32
you know, questions from up
35:34
top that come and you want
35:36
to have like 10% for that
35:38
and then the rest is keeping the lights on,
35:40
all right. So if that is
35:42
what we did, we might get
35:45
two, that's better than zero. Now,
35:47
this is a risky endeavor, right?
35:49
You might have this conversation and
35:51
they might be like, okay, I
35:53
actually don't need this team because
35:56
I haven't done anything different. But
35:58
so it's a delicate conversation. But
36:00
yeah, at the end of it,
36:02
when we actually focused on those
36:04
two problems and now we were
36:07
able to increase the throughput of
36:09
software engineering by 50% by just
36:11
focusing on that piece, solving that
36:13
problem deeply. Then there were conversations
36:15
about, okay, how can we fund
36:17
this team for a third priority?
36:20
Right. So really like if we
36:22
had stayed in this like, you
36:24
know, solve 400 problems with top
36:26
technology, we may not have actually
36:28
made any impact. So that focus,
36:31
solving the problem, the right problem,
36:33
the right way. And maybe AI
36:35
is the answer to some of
36:37
those problems, maybe it's not, maybe
36:39
sometimes a four-week average can solve your
36:42
problem. And you don't need AI.
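As a trivial illustration of that last point, a trailing four-week average needs only a few lines of Python and no AI at all; the weekly numbers below are made up.

weekly_applications = [120, 95, 140, 110, 160, 150, 130, 170]  # invented weekly counts

def trailing_average(series, window=4):
    """Average of the last `window` values up to each week."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

print(trailing_average(weekly_applications))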
36:44
It's really important, I mean, so
36:46
many times we hear about analytics teams
36:48
that are burning out because they're
36:50
just doing too much, but they're
36:52
not having much impact. And it's
36:55
because they're doing too much and
36:57
not prioritizing the right things. And
36:59
that's such a great example. of
37:01
doing that, you know, and that's
37:03
again how you build trust with
37:06
your internal customers, isn't it? What
37:08
are the key challenges they're trying
37:10
to solve for? Help them solve
37:12
them, and they'll work with you
37:14
more and then, and actually it's
37:17
the business that will help you
37:19
get more investment in the team,
37:21
whether it's people, technology, other resources.
37:23
So yeah, really good example, yeah,
37:25
there as well. So I'm going
37:27
to go to the question of
37:30
the series, Ashish, now and
37:32
then I'm going to come back
37:34
and just ask you maybe
37:36
to give some sort of key
37:38
things for people to take away
37:41
for them. So the question of
37:43
the series for those first time
37:45
listeners, this is something we ask
37:47
everyone in a series of the
37:49
Digital H.R. Leaders podcast. And it
37:51
really is. We're going to talk
37:54
about AI a little bit here.
37:56
How can H.R. use AI to
37:58
improve employee experience and well-being? And
38:00
if you want to extend that
38:02
into candidate experience, please feel free
38:05
to do so. Yeah. You know,
38:07
the one way we are thinking
38:09
about this is where is human
38:11
touch truly needed and truly beneficial,
38:13
right? And what is taking away
38:16
from humans being able to spend
38:18
that time in that human touch?
38:20
So think about recruiting, right? On
38:22
one side, there's a candidate who
38:24
could be making a life-changing decision,
38:26
right? Could be uprooting their family
38:29
and moving somewhere or being at
38:31
a job, they're doing really well,
38:33
but they're interested in something different.
38:35
These are big decisions. And it's
38:37
really important to have somebody you
38:40
can trust on the other side,
38:42
right? And then that is really
38:44
where humans can differentiate, which is,
38:46
you know, at least personally for
38:48
me talking to a machine may
38:51
not feel the same way. But
38:53
recruiters today have to do so much
38:55
admin work, right? There are like
38:57
30, 40% of their plate can
38:59
be admin work and that takes
39:01
them away from being connected to
39:04
the candidate, right? And how can
39:06
AI take off that stuff from
39:08
that plate to make sure, not
39:10
just in, let's say, operations roles,
39:12
even in data engineering? Right. I
39:15
remember back in the day, working
39:17
on mainframe and, you know,
39:19
other technologies, it was a bear
39:21
to understand where your data is,
39:23
where it is coming from. Today,
39:26
with the data on AWS, it's
39:28
like having a conversation with your
39:30
data, right? Understand the metadata behind
39:32
like where does this field come from? What
39:34
does it mean? What is the
39:36
range? What is the source? What
39:39
is the latency? I can have
39:41
a conversation with my data. That
39:43
is just phenomenal experience for a
39:45
data engineer. And now they are
39:47
freed up to do the engineering
39:50
work they need to do versus
39:52
the admin work of just understanding
39:54
the basics, right? We've got one
39:58
last question, Ashish. I've really enjoyed this
39:58
conversation and, you know, I
40:03
think, hopefully, listeners will really
40:03
be able to take a lot
40:05
from it. But for those people
40:07
listening that are working in organizations
40:09
that are maybe just beginning their
40:11
journey to embed analytics into recruitment
40:14
or talent acquisition, or maybe
40:16
want to take it to the
40:18
next level, what advice would you
40:20
give? Where should they start? And
40:22
what are the key success factors?
40:25
And I appreciate that the
40:27
same success factors don't necessarily apply
40:29
to every organization. So I think
40:31
there are a few different patterns,
40:33
right? So one pretty well-known pattern
40:35
is types of analytics. So you
40:38
could have descriptive, like what is
40:40
happening, right, diagnostic, like why is
40:42
it happening, predictive, like what will
40:44
happen, and then prescriptive, what should
40:46
I do about it, right? So
40:49
people analytics teams evolve in that
40:51
direction as they have, you know,
40:53
more time under their belt, more
40:55
data, more size of the team,
40:57
they can move. Usually everybody starts
41:00
with descriptive, right. Now, that
41:02
is one model. Now, the thing
41:04
I would add to that is
41:06
rather than going descriptive on everything,
41:08
a lot of people analytics teams,
41:10
I see, do tons of reporting
41:13
for everything under the sun. So
41:15
they are taking the descriptive piece
41:17
and going really broad. I would
41:19
actually highly recommend people to go
41:21
deep in few problems. So don't
41:24
do reporting for everything. Pick two
41:26
problems or one problem that business
41:28
really cares about solving this year
41:30
and go deeper in that problem
41:32
and make a difference in that
41:35
problem. Because let's say, you know,
41:37
this year, it might be about
41:39
workforce planning. If you can really
41:41
make a difference there, then the
41:43
team will, your team will get
41:45
funded to solve another problem and
41:48
another problem. But it's from the
41:50
business perspective, it's really about, are
41:52
you making an impact at the
41:54
bottom line? Before we wrap up,
41:56
where can listeners find out more
41:59
about you and learn about the
42:01
work you're doing and you and
42:03
your team are doing at Amazon?
42:05
And maybe, is there some way
42:07
they can go? If someone's thinking,
42:10
okay, I'd like to work with
42:12
Ashish and his team as well.
42:14
Yeah, so best way to find
42:16
me is on LinkedIn. And best
42:18
way to work with us is
42:20
go to the Amazon website and
42:23
search ITA, intelligent talent acquisition. So
42:25
if you search ITA in quotes,
42:27
you'll get the direct match for
42:29
25 roles that we are hiring
42:31
for today. I'm sure there will
42:34
be more coming up very soon.
42:36
And we'll put that in the
42:38
show notes, but basically go to
42:40
the Amazon site, search ITA, intelligent
42:42
talent acquisition and you'll see the
42:45
roles that Ashish was talking about
42:47
earlier. Ashish, thank you very much.
42:49
I look forward to bumping into
42:51
you hopefully at a conference in
42:53
the next few months as well.
42:55
And thank you so much for
42:58
having me. Conversations like this remind
43:00
us that scale isn't just a
43:02
numbers game, it's a strategy game.
43:04
And when done right, people analytics
43:06
doesn't just support hiring, it transforms
43:09
it. Thank you to Ashish for
43:11
joining me and offering such a
43:13
clear-eyed view on how Amazon is
43:15
navigating hiring complexity with clarity and
43:17
using data to drive real impact.
43:20
And thank you to
43:22
all of you,
43:24
as always, for
43:26
listening and
43:28
tuning in each week. Here at
43:30
Insight222, we are
43:33
on a mission
43:35
to help as
43:37
many HR and
43:39
people analytics professionals
43:41
and leaders as possible
43:44
gain the skills, strategies
43:46
and confidence needed
43:48
to drive real
43:50
business value. If you enjoyed
43:52
today's episode, please
43:54
do leave a
43:57
review and
43:59
share it with
44:01
your network so we
44:03
can help people
44:05
drive meaningful business
44:08
transformation. And as always,
44:10
if you'd
44:12
like to dive
44:14
deeper and learn
44:16
more about us
44:19
here at Insight222,
44:21
follow us on LinkedIn,
44:23
explore our resources
44:25
at insight222.com, or
44:27
subscribe to our bi-weekly
44:29
newsletter at myhrfuture.com.
44:32
That's all for now.
44:34
Thank you
44:36
for tuning in and we'll be back
44:39
next week with another episode of
44:41
Digital H.R. Leaders.
44:43
Until then, take
44:45
care and stay well.