Episode Transcript
0:00
Bob Safian here with a hard
0:02
truth about business. Oftentimes, saving
0:04
money just isn't enough, which
0:07
is why the most innovative leaders
0:09
today are reimagining how financial
0:11
strategy drives growth. Enter
0:13
Brex, a modern finance platform
0:15
supporting corporate cards, banking,
0:17
expense management, and more. From
0:19
startups extending their runway to
0:22
established companies optimizing every dollar,
0:24
Brex is more than a
0:26
tool, it's a growth accelerator.
0:28
Over 30,000 companies are
0:30
already proving it. Discover your
0:32
company's peak performance at brex.com. Hey
0:35
folks, Jeff Berman here,
0:37
co-host of Masters
0:39
of Scale. In a
0:41
time when we are riding a
0:44
roller coaster of news, it really
0:46
pays to take a big step
0:48
back, see the bigger picture,
0:50
then chart the course forward. At
0:53
this year's Masters of Scale Summit,
0:55
we're doing just that. By bringing together
0:57
bold leaders who are shaping the
0:59
future, we all want to see. Join
1:02
speakers like Reid Hoffman, Kara
1:04
Swisher, Andrew Ross Sorkin, Aza Raskin,
1:06
Dara Treseder, and many, many
1:09
more this October in San Francisco.
1:11
Apply to attend at
1:14
mastersofscale.com slash response. Again,
1:17
that's mastersofscale.com
1:19
slash response. Hey,
1:22
everyone, Bob Safian here. Today,
1:24
we're sharing something special and a
1:26
little different for rapid response. It's
1:29
an episode of our sister show,
1:31
Pioneers of AI. Every week,
1:33
host Rana el Kaliouby talks to
1:35
some of the most innovative and
1:37
inspiring AI researchers, entrepreneurs, and investors.
1:39
As a pioneer herself in the
1:41
field of emotion AI and a
1:43
CEO turned VC, Rana cares
1:46
deeply about how AI is
1:48
changing life and work as we
1:50
know it. For the episode
1:52
we're sharing today, Rana talks with
1:54
a startup founder who's exploring
1:56
AI for so-called unbossing, turning
1:59
some management tasks over to
2:01
bots, effectively reshaping how we approach
2:03
middle management and human resources. It's
2:06
a bold, provocative notion. As
2:08
you listen, think about your own work
2:10
and whether at times it might be better
2:13
to seek advice from AI instead of
2:15
from your actual boss. Or as
2:17
a boss yourself, whether you let
2:19
AI give advice on your behalf. There's
2:21
lots to chew on, so let's
2:24
get to it. Here's Pioneers
2:26
of AI with Rana
2:28
el Kaliouby. Picture
2:31
the best manager you've had to
2:33
date. The one who gave
2:35
you the perfect advice, challenged you
2:37
to be better, maybe even
2:39
helped you jumpstart your career. It's
2:41
a rare thing to have a manager
2:43
who can also be a mentor and champion
2:45
of your personal and professional growth. On
2:48
the other end of the spectrum, we've
2:50
all had a bad manager. Those
2:52
that micromanage, parachute in and out,
2:55
or lack communication skills. And
2:57
it's no slight to the individual. Managing
3:00
is often a thankless, difficult
3:02
job, which makes it a role
3:04
ripe for improvement. AI could
3:06
help. I
3:09
have seen some recent study
3:11
that 41% of Gen Z
3:13
trust AI more than their
3:15
manager. So I think
3:17
one misconception is that people don't really
3:19
want to use it or don't trust it.
3:21
I think actually they do and they're
3:23
using it whether you like it or not.
3:27
Catherine Vaughnian is watching
3:29
this trend closely. She's
3:31
co-founder and CEO of
3:33
Tough Day, a company she
3:35
started after decades working in
3:37
people management at big tech
3:39
companies, including Salesforce. She's
3:41
building AI managers to help
3:43
professionals thrive. Her advice? Try
3:46
new AI tools right now
3:48
at every level. People are
3:50
in sort of analysis paralysis
3:52
over what we should do with
3:54
AI. And I would say
3:56
start using AI, thinking
3:59
through unintended consequences in advance
4:01
as best you can and preparing
4:03
for those is great. And you
4:05
do that alongside the team that
4:07
is actually trying to make the
4:09
solution work. Today,
4:12
Catherine and I are digging into
4:14
questions around the future of work
4:16
and AI. We talk
4:18
about the concept of unbossing, approaches
4:20
to DEI work at this critical
4:22
moment in time, and how AI could
4:24
be a solution to how we
4:26
work moving forward. I'm
4:30
Rana el Kaliouby, and
4:32
this is Pioneers
4:34
of AI, a
4:36
podcast taking you behind the scenes
4:39
of the AI revolution. Hi,
4:47
Catherine. Welcome to Pioneers of AI.
4:50
Hi, Rana. I'm excited to be here. Thank
4:52
you for having me. Catherine
4:54
has been in management at tech
4:56
companies for decades. And in
4:58
all those years, she gained some valuable
5:00
insight on how technology can improve
5:02
our work lives. A couple
5:04
of years ago, she saw an opportunity,
5:06
which eventually led her to co-founding
5:08
her company Tough Day. So
5:10
I wanted to know more about her background
5:12
and the aha moment that led her to
5:14
founding her company. I
5:17
started my career at Lotus back
5:19
in the 90s working on collaborative technology
5:21
at a time when the internet
5:23
was brand new for business and trying
5:25
to figure out how that technology
5:27
would empower people to work better and
5:29
smarter and create value and all
5:31
of those things. You know,
5:33
I've rode the tech waves, mobile,
5:36
social, AI. I've been working in
5:38
AI since 2012. So
5:40
I guess, you know, the last
5:42
10 years before starting Tough Day,
5:44
I was at Salesforce and I
5:46
was leading innovation projects there
5:48
internally and externally with customers and
5:50
partners and serving our ELT,
5:52
the executive leadership team,
5:54
over that time. And
5:57
along the way, one of
5:59
the projects that we did internally
6:01
was how to improve belonging
6:03
and inclusion inside using tech.
6:06
And of course, that work
6:08
starts by deeply understanding and
6:10
researching what the challenges are
6:12
for folks internally, not only
6:14
inside Salesforce, but in the
6:16
world. And that research
6:18
was fascinating. We learned a lot
6:20
about what happens when people
6:23
struggle and where they turn. So
6:25
there are two kind of
6:27
common paths in that. One is
6:29
that they go to some
6:31
kind of reporting system internally and
6:33
actually that's quite rare. Only
6:36
about half the people will ever
6:38
report something that actually needs
6:40
to be reported and there's a
6:42
lot more that happens before
6:44
then. So that solution typically means
6:47
we're going into investigation and
6:49
that is scary for everybody. And
6:51
then the other place that people
6:53
go are colleagues and often
6:56
friends and family at home. That's
6:58
what we hear the most even
7:00
today. So when you go to
7:02
friends and family at home, you're
7:04
not necessarily getting great advice. You're
7:06
not necessarily talking to someone who
7:08
knows HR or knows management best
7:10
practices or employment law or any
7:12
of those things. So the big
7:14
insight was that we need something
7:16
before you actually talk to someone
7:18
in HR about a problem, a
7:20
place where people can go. And
7:23
we created that there in something
7:25
called the warm line. A
7:28
warm line instead of, you know,
7:30
a hotline. A place for people
7:32
to call when something felt off
7:34
but hadn't reached any kind of
7:36
critical stage. So
7:39
when we built this warm
7:41
line and staffed it with
7:43
amazing humans who were really
7:45
smart on the topic of
7:48
DEI and lots of other
7:50
things, HR and management, they
7:52
were able to handle a small
7:55
population. The initial cohort got the
7:57
answers that they needed. They got
7:59
guidance. They got help. They got
8:01
empathy. They were able to go
8:03
back to work very quickly. Their
8:05
performance improved. It was a magical
8:07
story. And unfortunately,
8:10
that solution doesn't scale. It
8:13
doesn't scale because the warm line
8:15
needed to be staffed by humans with
8:17
the right skill set. Catherine and
8:19
her team simply could not find and
8:21
train enough people with the right
8:23
mix of skills and experience to serve
8:25
every person at the company. So
8:28
we know that everyone in the
8:30
organization has these moments where they need
8:32
help and the only way to
8:34
scale the solution so that everyone gets
8:36
the help they need is to
8:39
use AI. And
8:41
here was the opportunity. Imagine
8:43
an AI people manager that could
8:45
help employees express their personal
8:47
challenges and concerns and jump in
8:49
on problems so they didn't
8:51
need to escalate. That's what
8:53
Catherine set out to build. It
8:55
was just at the time when
8:57
ChatGPT was born and I knew that
9:00
that technology could be used
9:02
to do things differently and support workers.
9:05
And I did a couple experimental
9:07
projects on my own, got
9:09
a couple teams together. And
9:11
what we learned in that
9:13
is that actually we could
9:15
get great content from HR,
9:18
management, employment law, and create
9:20
this experience with AI and
9:22
be able to serve everyone
9:24
even more safely and consistently
9:26
and expertly than most humans
9:28
can. So let's
9:30
get into exactly what Tough Day does.
9:33
Walk us through what an experience would
9:35
look like for a person using Tough
9:37
Day. Yeah. So first, we
9:39
have created both generative and
9:41
agentic AI to support workers.
9:44
And it is a B2B sale.
9:46
So we're selling to organizations. And
9:48
essentially, the core product has
9:51
already been fine-tuned on
9:53
great management, HR, employment law,
9:55
and other kinds of content.
9:57
And then in addition to
9:59
that, we're ingesting the company's
10:01
information. So their employee
10:03
handbooks, their strategy, their values,
10:05
their... organizational wisdom, learning and
10:07
development, anything that the organization
10:10
would use to onboard an
10:12
employee, we would say use
10:14
that to onboard our AI. And
10:16
over the course of a week, that
10:19
AI gets really smart about that organization.
10:22
The point is that once it's really
10:24
honed to the organization, then they
10:26
invite their people to engage with our
10:28
AI, which we call Tuffy, and
10:30
that was named by users. So an
10:32
employee will get an email that
10:34
says, Hey, we'd like to
10:36
offer you the opportunity to talk
10:38
with Tuffy. Click here to go set up
10:41
an account. They sign up
10:43
though with their personal email
10:45
and their personal phone number, their
10:47
personal device and we tell
10:49
them and the organization tells them.
10:52
Please do this so that you
10:54
have complete control over all of
10:56
your own data. This is a
10:58
safe place where you can go
11:00
have a conversation about anything and
11:03
not worry about saying things the
11:05
wrong way or asking a dumb
11:07
question. It's really very much a
11:09
place where they can feel comfortable
11:11
with whatever is going on. I
11:14
think this is really important because
11:16
trust is at the center of this.
11:18
We're going to come back to
11:20
trust a little later on. But if
11:22
I have any inkling that the
11:24
company might get access to my conversation
11:26
with Tuffy, I'm just not
11:28
going to go to it. So
11:30
I think it's really smart that you
11:32
did that outside of the company,
11:34
that you could do this just kind
11:36
of on your own personal device. Yeah,
11:39
absolutely. And we're really
11:41
providing the trust to both
11:43
the employer and the
11:45
employee. First of
11:47
all, a lot of people in the
11:50
organization would never say anything. We know
11:52
that people don't necessarily tell the truth
11:54
on employee surveys, and they're afraid to
11:56
say what's really going on. And
11:58
if they did any of this on the
12:00
company's devices, they would have access to that
12:02
data, whether they were looking at it
12:04
or not, which just creates an air of
12:07
concern. Let's just call it that. And
12:10
for the organization, they want to be
12:12
able to understand what's going on, but
12:14
they don't want to have the whole
12:16
mess exposed to them either. That's a
12:18
danger to the organization. It's
12:20
a risk. So for them, having a
12:22
bit of a barrier and saying, go
12:24
here and have the conversations you need
12:26
to have is actually very helpful for
12:28
the organization. Can you share
12:31
some of your key learnings? Look,
12:33
how are people using Tuffy?
12:35
What are they going to it
12:37
for? So first, I
12:39
will say about a third
12:41
of the challenges are
12:43
really policy related, you know,
12:45
explain how our FMLA
12:47
works, or I don't understand
12:49
how to get
12:51
my corporate card, all the
12:53
kind of operational things.
12:55
Do I get a lactation
12:57
break? Things like that. It
13:00
can get more complicated. You know,
13:02
what are the gender rules now, especially
13:04
with a new administration? What
13:06
are the bathroom rules? So there's
13:08
a lot of interesting change
13:10
happening where each organization can
13:12
define how do they want
13:14
to address those issues. And
13:16
the one example I use
13:18
often is nepotism. So in
13:21
a family-owned business, nepotism
13:23
is expected and positive. And
13:25
that kind of organization would want
13:27
to potentially explain that. And
13:30
in other organizations that's frowned upon or
13:32
against policy. So there's a customization
13:34
there and I think a lot of
13:36
employees have questions about things and
13:38
what is ethical, what is allowed, all
13:41
of that. We can
13:43
answer those questions very well,
13:45
very quickly. The other
13:47
two-thirds of the issues
13:49
are the sticky and interesting
13:51
things. So you could think
13:53
about a manager relationship and
13:55
all the different types of
13:57
managers, maybe a micromanager, maybe
13:59
there's some toxic behavior, maybe
14:01
there's favoritism. Maybe someone
14:03
just has poor communication style
14:05
and choices. And I'll
14:07
just say the alternative there
14:10
often is an employee calling
14:12
a manager and saying, hey,
14:14
can I have five minutes of your
14:16
time? Can we have a quick
14:19
coffee? I'm confused. Of course that would
14:21
normally happen, but think about the
14:23
hours and hours of time that it
14:25
ultimately takes for managers to have
14:27
all of those conversations. We're just saying
14:29
Tuffy plays the role of being
14:31
the first point of contact. When you're
14:33
confused or you're struggling, come to
14:35
Tuffy first, so that when you go
14:37
talk to your manager or you
14:39
go talk to someone else in the
14:41
organization, you're fully prepared and you're
14:43
focused on the right things. How
14:46
does Tough Day deal with
14:48
issues that have legal repercussions? So
14:50
for example, a sexual assault, what
14:53
would it say?
14:56
Yeah, so we are a communication
14:58
platform and not a reporting
15:00
platform. So I think this also
15:02
provides the workforce a safe
15:04
place to go to talk about
15:06
what's going on. And
15:08
if people do come
15:10
in and there's something egregious,
15:12
someone was assaulted, the
15:15
AI will be very, very
15:17
curious and ask for more
15:19
context, collect more
15:22
information. Ultimately, diagnose
15:24
the situation as something that
15:27
should be reported, but we are
15:29
providing the agency to the
15:31
individual employees. So we want to
15:33
build their competence and their
15:35
confidence and help them understand why
15:38
it should be reported, why the company
15:40
really wants them to report it, how
15:42
they should report it, and what to
15:44
expect in the process. So
15:46
as an example, a lot of people will
15:48
worry, you know, if I report this, will
15:50
the other person be alerted? Well, the reality
15:52
is like, yeah, they're going to be part
15:55
of an investigation. But there's a
15:57
lot of ways that you can
15:59
engage with the employee and support them
16:01
in the process so that they
16:03
feel comfortable with that. And ultimately, they'll
16:05
make that decision. So
16:07
we can see the benefit for
16:09
employees. Why would a company
16:11
be incentivized to do this? Is
16:13
it mostly like cost effectiveness and just
16:15
cutting costs or is there, yeah,
16:18
like when you sell Tough Day,
16:20
what is the value proposition to an
16:22
organization? Well, there is
16:24
a lot of efficiency. So
16:26
going back to that use
16:28
case of the manager spending
16:30
a lot of time having
16:32
a lot of conversations and you
16:34
could probably spend 150% of
16:36
your time answering questions or
16:38
taking the coffees and having
16:40
some coaching sessions, mentoring sessions. A
16:44
lot of what is discussed
16:46
can probably be handled by the
16:48
AI. So if you can cut
16:50
back 80% of those questions,
16:52
that's a big savings. And
16:54
that's really the value prop. The
16:57
other value prop is retention. So
16:59
if you have these challenges sort
17:01
of festering and people are not
17:03
getting their problems resolved, even if
17:05
they're small, they tend to quiet
17:08
quit or take PTO. What's
17:10
quiet quit, by the way?
17:12
Oh, quiet quitting is when
17:14
you are doing the very
17:16
minimum in your job so
17:18
that you don't get fired,
17:20
but you're not very productive.
17:23
And it also results in regrettable
17:25
attrition, and often organizations don't
17:27
know until it's too late, often
17:29
in the exit interview that
17:31
something was going on. So
17:33
this is an opportunity to
17:36
serve people better, listen, and
17:38
learn from what's going on. We
17:40
provide anonymized data to the
17:42
organization so that they can use that
17:45
like a weather vane to
17:47
know where the problems are in the
17:49
organization that they can go
17:51
address. And as they do, they
17:53
improve the employee experience and ultimately
17:55
improve retention. The cost
17:57
of replacing someone who attrits is
17:59
two times their salary. So you can
18:01
think of losing one good employee
18:04
that might be making $100,000 a
18:06
year as costing $200,000. After
18:08
a short break, we talk
18:11
about the trend towards conscious
18:13
unbossing, popular with younger professionals.
18:16
And this new concept might just change
18:18
the workplace as we know it. Stay
18:21
with us. Stripe
18:35
helps many of the world's most
18:37
influential businesses grow their revenue and
18:39
build a more profitable business. Whether
18:42
it's Hertz making checkout a smooth
18:44
ride for their customers, OpenAI
18:46
answering unprecedented demand, or
18:48
PGA chipping away at
18:50
back office inefficiency, Stripe's
18:52
financial infrastructure platform helps
18:54
companies achieve ambitious goals.
18:57
No matter what success looks like for your
18:59
business, Stripe helps ensure the
19:01
complexity of financial systems doesn't get in
19:03
your way. Learn more
19:05
at stripe.com. No
19:08
matter your industry, everyone's looking
19:10
to save money. But the best
19:12
leaders in finance aren't just
19:15
cutting costs, they're working hard to
19:17
drive growth. And Brex's modern
19:19
finance platform helps them do just
19:21
that. Brex offers the world's
19:23
smartest corporate card, banking, expense management,
19:25
and travel all in one
19:27
place. Over 30,000 companies
19:29
from scrappy startups to large
19:31
organizations are using Brex as
19:33
a competitive advantage, and you
19:35
can join them. Get the modern
19:38
finance platform that works as hard as
19:40
you do at brex.com. So
19:44
I want to zoom out. Tough Day
19:46
is coming out at this really interesting
19:48
moment when it comes to the future
19:50
of work. So I want to kind
19:52
of dissect some of the current trends
19:54
that you're seeing around the future of
19:56
work and how I guess Tough Day
19:58
is helping address some of these concerns
20:00
or some of these opportunities that these
20:02
trends are creating. So the first is
20:04
this whole idea of conscious unbossing, yes,
20:06
which I only recently heard about. And
20:08
it's basically a trend where young professionals
20:10
like my daughter, she's about to graduate
20:12
from college, essentially don't
20:14
want traditional managerial roles and they
20:16
also don't want to have managers.
20:19
Tell us more about that. Exactly.
20:21
What is that? Yes, 72% of
20:23
Gen Z says that they are
20:25
consciously unbossing. And to your point,
20:27
they do not want to have
20:29
a manager. Okay. And they do
20:32
not want to be a manager.
20:34
And the reason is if you
20:36
dissect it, what has been modeled
20:38
for them by their managers is
20:40
burnout, stress, not
20:43
a very enjoyable job, number one. But
20:45
also, no matter how hard they
20:47
try, a lot of them still
20:49
get fired, their employee reviews are
20:51
bad. They're just like, why would
20:53
I even want to do that?
20:55
And the truth of the matter
20:57
is, most
21:00
people who become people managers didn't do it because
21:02
they wanted to become people managers. They did
21:04
it because it was the next sort of rung
21:06
on the ladder. It's like what you have
21:08
to do to get promoted or what you have
21:10
to do to make more money. So
21:12
82% of people are what
21:14
they call accidental managers, and they
21:17
just don't have the capabilities, or
21:19
frankly, many of them, not the
21:21
interest. So Gen Z is saying,
21:23
you know, I'm happy to manage
21:25
myself. I can get the information
21:27
that I need. I know how
21:30
to network. I am very
21:32
open to feedback, and I can
21:34
get feedback from anyone. Why does there
21:36
have to be this formal structure? And
21:39
I have to say, just as we make
21:41
our way in the world with Tough Day, in
21:44
one-on-one conversations and
21:46
behind closed doors, everyone
21:48
agrees. The role of
21:51
the manager is kind of outdated. And
21:53
great managers do exist. And
21:55
the problem with the great managers
21:57
is they don't scale. And
22:00
so if we could give them a
22:02
way to augment themselves and create their
22:04
sort of digital replica so that they
22:06
have got a partner, they can spend
22:08
their quality time having those coaching conversations
22:10
and great conversations that they should be
22:12
having with their team
22:14
and then offset the rest
22:17
of it with an AI. That
22:20
will help improve the experience for
22:22
everyone. The other side of
22:24
the coin is the bad
22:26
managers or the mediocre managers who
22:28
are actually costing the company
22:30
time, talent, and money. Yeah,
22:32
we're just going to use Tuffy instead. So
22:35
the second kind of interesting trend that is
22:37
also very related to what Tough Day is
22:39
doing is this idea of the great flattening.
22:41
Yes. Tell us what
22:43
that is. Yes. So
22:45
organizations are flattening out the
22:47
middle layers. And
22:49
Gartner actually has some research recently
22:51
that says 20% of companies
22:53
are getting rid of 50% of
22:55
middle management in the next two
22:57
years. And I think
23:00
part of that is because
23:02
we've spent so much money trying
23:04
to train managers to be
23:06
good managers. And we're not
23:08
showing a return on that learning.
23:10
And organizations are looking
23:12
for other answers. So
23:15
I think, you know, it might
23:17
be a little bit challenging right now,
23:19
but actually, if you think about
23:21
the real role of the manager, a
23:23
lot of those things can be
23:25
outsourced to AI. So
23:27
there's another shift happening specifically
23:29
in the HR landscape. A
23:31
LinkedIn survey found that HR had
23:33
the highest turnover rate out of
23:35
jobs that they tracked. Why
23:38
do you think that is the
23:40
case? And does that kind of
23:42
solidify the opportunity for Tough
23:44
Day? I will say in
23:46
terms of HR, we're seeing
23:48
a few different things. One is,
23:51
you know, everyone who goes into HR goes in,
23:53
I think they're people people, right? Initially, you
23:55
think like, I'm going to do a good
23:58
thing. I'm going into this field. And
24:00
as you get into it, it
24:02
may or may not be your
24:04
cup of tea. It is largely
24:06
about protecting the organization. And
24:09
that's great. We need people to
24:11
design employee experiences and deliver services
24:13
and measure all of that. It
24:15
may or may not have been
24:17
why you got into HR in
24:19
the first place. So I think
24:21
there's some of that. I
24:23
think there's so much
24:26
stress and pressure on HR.
24:28
The typical first point of contact for
24:30
an employee is their manager. And
24:33
if that piece is broken, one
24:35
of the next calls is
24:37
HR. So they're getting bombarded with
24:39
all of these challenges that are really
24:42
hard and emotionally draining, so I
24:44
think there's some burnout there. And
24:46
then in terms of all
24:48
the solutions, how do you solve for
24:50
all of this? It's just
24:52
hard work. HR business partners are
24:54
the heroes of an organization. They're
24:57
doing lots of great work, but I
24:59
think depending on the state of
25:01
the organization and what resources they have
25:03
available, what their benefits are, every
25:05
organization has to figure out what's the
25:07
right recipe for their culture and organization,
25:10
and that might be part of it.
25:12
Now, we're also kind of in
25:14
this moment of time where
25:17
We're seeing top-down pushback on
25:19
diversity, equity, and inclusion efforts. We're
25:21
seeing this at the federal
25:23
level, of course, where President Trump
25:26
is seeking to end government
25:28
support for programs promoting DEI. We've
25:30
already seen a lot of
25:32
references to DEI taken offline on
25:34
federal websites. And this is
25:36
all being contested in the courts,
25:38
of course. We'll see what
25:41
happens. We have seen
25:43
a ripple effect already at companies like
25:45
Meta and Google where they've rolled back
25:47
some of these efforts. I
25:49
would imagine that a lot of the
25:51
concerns that show up in the workplace
25:53
are related to equity at work and
25:55
inclusion. What do you think
25:57
of all of that? And again, how can
25:59
Tough Day help? On the
26:01
topic of DEI, when an
26:04
organization works with Tough
26:06
Day, we're tracking what is happening
26:08
in the federal government, and we're tracking
26:10
those trends. And this is an area
26:12
where every organization has to do a
26:14
bit of their own wayfinding.
26:17
At the federal level, there
26:19
are things going on, but
26:21
then also at the state
26:23
level. So state by state,
26:25
there's new AI law. There's
26:27
also new DEI-related regulations. Right
26:31
now we're at a point where
26:33
organizations have to navigate a lot of
26:36
that on their own and you
26:38
see companies like Costco and Apple and
26:40
Microsoft doubling down on DEI and
26:42
saying, you know, this is
26:44
super valuable for our organization, it's
26:46
just part of our values and
26:48
we have to, we have to
26:50
invite diversity into the organization to be
26:52
a strong resilient organization and to
26:54
serve our customers best. And
26:56
then on the other end of the spectrum,
26:58
you have organizations saying,
27:01
you know what, I just don't want
27:03
to fight this machine. Like some
27:05
are saying we don't have to call
27:07
it DEI. Like we love the
27:09
diversity and what have you, but maybe
27:11
we've moved beyond this language. So
27:13
I think there's some of that where
27:15
they're kind of in the middle.
27:17
And then you have organizations just saying
27:19
like, it's all out the window.
27:21
So when we work with companies, we
27:23
ask them, how do you want
27:25
to handle things like bathroom issues? And
27:28
then we're customizing the experience
27:30
and the answer for each
27:32
organization. Even
27:34
though a lot of the
27:36
inspiration for Tough Day came
27:38
from looking at underrepresented groups
27:40
and their challenges, we
27:42
continue to see that none
27:44
of these issues are
27:46
just about one population. And
27:48
it just doesn't matter who the person is.
27:50
I think it matters how
27:53
you handle the situation, and ultimately
27:55
we want to help people
27:57
have better communications and interpersonal relationships
27:59
and working relationships and that's
28:01
the point. But to
28:03
do this, Tough Day needs to
28:05
train Tuffy with each company's values
28:07
and cultural norms. So
28:09
how do they do that, and how
28:11
are they any different than, say,
28:13
Microsoft's Copilot? We get to that after
28:15
a short break. Convert
28:23
more shoppers into buyers with
28:25
Stripe, the financial infrastructure platform
28:27
to help grow your revenue.
28:30
Businesses saw an 11.9%
28:32
revenue uplift on average when
28:34
they used the Stripe Optimized
28:36
Checkout Suite. Optimizations like
28:39
intelligently surfacing the right payment
28:41
methods to each customer, thanks
28:43
to Stripe's machine learning, which is
28:45
trained on trillions of transactions
28:47
globally. This means more
28:49
revenue for you and a better experience
28:51
at checkout for your customers. See
28:54
what Stripe can do for your
28:56
business at stripe.com. If
28:59
you're building a small business, you
29:01
know the questions never stop. Join
29:03
us for the latest installment of
29:05
Masters of Scale Strategy Session, the
29:07
small business playbook. A free
29:10
live virtual event presented in alliance
29:12
with Capital One Business. On
29:14
May 15th, our session is co -hosted
29:16
by leaders who know how to
29:18
scale with impact. Amy Errett,
29:20
a four -time entrepreneur and CEO
29:23
of Madison Reed, and Sarah
29:25
Robb O'Hagan, business innovator and
29:27
best-selling author of Extreme You. They'll
29:29
be answering real questions from business
29:31
owners like you. Let's
29:46
go behind the scenes and kind
29:48
of geek out for a bit
29:50
about how you've built Tuffy. So
29:52
the first question I have is, why
29:55
can't a company just, you know,
29:57
take, I don't know, say a Microsoft
29:59
Copilot and feed it its own
30:02
policies and employment law and just
30:04
basically replicate what you guys have done.
30:06
So just talk to us a
30:08
little bit about the technology behind the
30:10
scenes and what is your competitive
30:12
moat and differentiator against, say, a Microsoft
30:15
Copilot? Yeah, so number
30:17
one, I do think coming back
30:19
to the issue of trust, it's
30:21
really hard to have any of
30:23
these conversations internally and know that
30:25
your organization can see the data.
30:28
You just can't have that
30:30
level of trust with
30:33
an employee on the stickiest,
30:35
trickiest kinds of challenges. So
30:38
I don't think Microsoft Copilot
30:40
is going to be able
30:42
to have that level of
30:44
relationship and trust. So
30:46
that's one. Number two, we
30:49
have partnerships with a lot
30:51
of content partners that have gated
30:53
content. They will not allow
30:55
the big LLMs, including OpenAI
30:57
or Perplexity or Copilot.
31:00
They just don't have the
31:02
content. What are examples of that?
31:04
So one of my favorite examples
31:07
is Charter. CharterWorks.com
31:09
is kind of an up
31:11
and coming thought leader. They're
31:13
a bunch of journalists from
31:15
Wired Magazine and Wall Street
31:17
Journal, New York Times, The
31:19
Atlantic. These are serious professionals
31:21
that all they do is research and
31:23
report on work. And they
31:25
have amazing content. And we have
31:27
a partnership with them. So we
31:30
have all of their content that
31:32
is subscription based. So that's
31:34
an example of how we
31:36
make our platform smarter. Yeah.
31:38
Does Tuffy have a
31:41
personality? Is it like culturally
31:43
sensitive or culturally specific? Yes.
31:47
Yes. So first,
31:50
Tuffy has values and Tuffy
31:52
is very curious. So
31:54
our first phase of development, we did
31:56
what's called Wizard of Oz testing. Can
31:59
you explain what that is for people
32:01
who are not familiar with it? That's
32:03
cool. So we had sitting around a
32:05
table at any given time during this
32:07
process, we would have five or six
32:10
experts, always a lawyer, an
32:12
HR person, a manager, and
32:14
a therapist. And we asked
32:16
early testers to interact with the
32:18
platform so that they would come
32:20
in and ask questions. We told
32:22
them, it's not really an AI.
32:24
You'll interact with it like AI,
32:26
but we actually have these human
32:28
experts behind the scenes. That's the
32:30
Wizard of Oz piece, right? Yeah,
32:32
the Wizard of Oz. And the
32:34
really cool thing is most people
32:36
forgot they were actually talking to
32:38
humans. What happened for all of
32:41
us sitting around the table is
32:43
that we would decide very quickly
32:45
who was the best person to
32:47
answer this question and/or
32:49
tackle it. And what that meant was usually
32:51
that person was asking five or six
32:53
different questions to really understand. So just like
32:55
if you go to a lawyer and
32:57
you ask a question, they're not giving you
32:59
advice at first. They're going to ask
33:01
you more questions. That is
33:03
really how we develop
33:06
Tuffy to be curious
33:08
and get more context.
33:11
And then diagnose what the situation
33:13
is because so many people
33:15
will ask one thing and they're
33:18
asking the wrong question or
33:20
they don't really know what the
33:22
issue is. So, so I
33:24
will say Tuffy is generically very
33:26
curious and wants to understand
33:29
a situation and bring empathy to
33:31
the dialogue. But
33:33
in terms of your question
33:35
about cultural customization, when
33:37
working with the Hawaii Employers
33:39
Council, which is
33:41
a network of 700
33:43
companies all based in
33:45
Hawaii and that represents
33:47
170,000 workers, and HEC
33:49
is using Tough Day
33:52
to join its HR consulting team
33:54
and provide advice to those employees,
33:56
but also as a go -to -market
33:58
partner. It's interesting. We started down
34:01
the path and we got some feedback
34:03
that maybe Tuffy sounded a little too
34:05
New York. And they
34:07
asked, could we make
34:09
Tuffy seem more Hawaiian?
34:12
And that wasn't just language. That was
34:14
really culture and value. So they asked
34:16
us if we could apply, quote, an
34:18
aloha spirit to our AI. How
34:21
did you do that? So
34:23
first, they did give
34:25
us redacted transcripts
34:27
of conversations between really
34:29
talented and successful
34:31
HR leaders, managers, what
34:33
have you. So
34:35
we could understand the
34:37
cultural tone of
34:39
the dialogue. And after
34:41
we did customize the experience, users
34:43
noted it right away. We got
34:46
great feedback in the platform. And
34:48
our helpfulness rating went from
34:50
88% to 99.6%. And actually
34:53
back to kind of this,
34:55
because I'm always kind of thinking,
34:57
OK, what is a company's
34:59
competitive moat? That kind of
35:01
training and that kind of data is
35:03
extremely unique. Like you can't scrape the internet
35:05
for this kind of data. And that's
35:08
very powerful. So one of
35:10
the things that we're very passionate
35:12
about on this podcast is how
35:14
do we build responsible and trustworthy
35:16
AI? And
35:18
you're in a very tricky
35:20
space, right? Like people's
35:23
careers are at stake, people's
35:25
livelihood are at stake. And
35:27
a recent Pew study found that
35:29
52% of employees say they're worried
35:32
about the future impact of AI use
35:34
in the workplace. So how
35:36
are you building trust
35:38
into Tuffy? Yeah. Well,
35:41
first, I believe, we
35:43
all at Tough Day believe
35:45
that AI is a more human
35:47
positive solution, often than a
35:50
human. And I will say,
35:52
I believe in humans, don't get
35:54
me wrong, I'm a human
35:56
-centered person. But I will say,
35:58
Humans are messy. Humans have
36:00
bad days. Humans can make
36:02
mistakes and do make mistakes.
36:04
But with the AI, we
36:07
want to make sure that the
36:09
AI is not biased, that the
36:11
AI has safety guardrails, that the
36:13
AI has a hallucination defense system
36:15
in certain circumstances, especially when you're
36:17
calling on a policy or a
36:20
law or what have you. You
36:22
want to make sure this is
36:24
not AI making it up. It
36:26
is specific. That's part of
36:28
it. The other part is
36:30
all that red teaming and knowing
36:32
where all the pitfalls can be.
36:35
So there's a lot of scenario
36:37
planning to think through what could
36:39
possibly happen. You want to
36:41
think through what are all the things
36:43
that could go wrong and build the
36:45
right knowledge and guardrails into the product?
36:48
Yeah. So final question. I
36:50
am very passionate about this idea of
36:52
human-centric AI. So AI that's
36:54
going to help unlock human potential.
36:56
It's going to augment and amplify our
36:58
abilities, not replace us. But I
37:00
do spend a fair amount of time
37:02
thinking about this following question. In
37:05
this age of AI, where AI can
37:07
be patient and curious and empathetic like
37:09
Tuffy, what does it mean to
37:11
be human? I
37:13
do think being human, if
37:16
you think about all the human needs,
37:18
like even Maslow's hierarchy of needs, the
37:22
positive part about having AI
37:25
involved is a lot of
37:27
our physiological needs and even
37:29
I think community needs in
37:31
terms of interpersonal skills and
37:33
understanding one another and being
37:35
curious and all of that,
37:37
we can accelerate or kind
37:40
of meet those needs faster.
37:42
That frees us up for
37:44
those higher level needs of
37:46
learning and innovation and problem
37:48
solving. And so I think
37:52
the communication and community is
37:54
a human thing, but I
37:56
think it's going to
37:58
be accelerated and improved by
38:00
technology, including AI. I
38:02
do think we might actually
38:04
find ourselves being more empathetic because
38:06
we have relationships with AI
38:08
that are empathetic. I also think
38:11
there's another interesting thing in
38:13
talking with neuroscientists. When humans
38:15
are under stress, our brains
38:17
actually shrink. So if we
38:19
think about what's happening to
38:21
people in the world of work
38:23
or even in our communities,
38:25
the more stressed out we are,
38:27
the less capable we are
38:29
to use the most creative parts
38:31
of our brain. And I
38:33
think what we'll see is that
38:35
this technology by solving some
38:38
of the things that cause us
38:40
stress is going to make
38:42
our brains actually grow and be
38:44
more intuitive and more empathetic
38:46
and more creative and innovative to
38:48
solve the biggest problems in
38:50
the world. Like, I'm absolutely a
38:52
human optimist. That's
38:55
the perfect way to end our
38:57
conversation. AI that is going to help
38:59
expand our brains like literally. Thank
39:02
you for joining us, Catherine. This
39:04
was awesome. Thank you. Thank you so
39:06
much, Rana. Such a pleasure. One
39:10
of my key learnings in starting
39:12
and scaling my company, Affectiva, was
39:14
that the biggest challenge was actually
39:16
not creating Emotion AI, but
39:18
it was dealing with people and all
39:20
of the human messiness that comes
39:23
with it. What I love about Tough
39:25
Day is that their AI platform
39:27
handles that human messiness in a safe
39:29
and supportive way. They
39:31
can manage these issues before they really
39:33
escalate, which allows professionals to focus
39:35
on the work. Often
39:37
the conversation about AI and the
39:39
future of work is focused on job
39:41
loss, but the conversation shouldn't stop
39:43
there. It's so important for
39:46
us to ask, how can AI augment
39:48
our abilities at work? What are
39:50
the areas of work that slow us
39:52
down, and where would we rather
39:54
be focusing our energy? One
39:56
of these areas is management, but there
39:58
are so many other opportunities. From
40:01
scheduling and automating bureaucratic tasks, to
40:03
maybe even putting in lunch orders
40:05
before you get hangry. AI
40:07
is still evolving, which means that
40:09
we have the opportunity to shape it. How
40:12
do you want to see AI work for
40:14
you on the job? Leave
40:16
us a voicemail at
40:19
601-633-2424. That's
40:22
601-633
40:25
-2424. Next
40:27
week on Pioneers of AI, we
40:29
take you live to the Abundance
40:31
Summit Stage in LA, where
40:33
I spoke with some leading investors
40:35
on the crazy valuations of
40:37
AI companies, what that means
40:40
for venture funding, and where investors
40:42
should place their bets in AI.
40:48
Pioneers of AI is a WaitWhat
40:50
original. Our
40:53
executive producer is Eve Troeh.
40:55
Our producer is Rachel Ishikawa. And
40:57
our associate producer is Jordan
40:59
Smart. Our senior
41:02
talent executive is Stephanie Stern. Mixing
41:04
and mastering by Ryan Pugh.
41:07
Original music by Ryan Holiday. And
41:10
our head of podcasts is Lital
41:12
Molad. You can join
41:14
the conversation on LinkedIn, Instagram,
41:16
TikTok, YouTube, and X. Just
41:19
search for at pioneers of
41:21
AI. Thanks so much for listening.