Episode Transcript
0:05
Welcome back to the Salesforce
0:07
Admins podcast. Today, we're catching up with
0:09
Mehmet Orun, longtime friend of the pod
0:11
and a true expert in data and AI.
0:14
I'm going to tell you, a lot
0:16
has changed in the world of artificial
0:18
intelligence since our last chat. Mehmet's here
0:20
to break it down from
0:22
hallucination risks to the role
0:25
of data cloud in creating
0:27
trustworthy AI experiences. If you've ever
0:29
been wondering how to make your
0:31
data more meaningful and your AI
0:33
outputs more reliable, well, you are
0:35
in for a treat. So make
0:37
sure to follow the podcast so
0:39
you don't miss a single episode.
0:41
And with that, let's get Mehmet
0:43
back on the podcast. So
0:46
Mehmet, welcome back to the podcast.
0:48
It's wonderful to be back, Mike.
0:50
I know. You just come
0:52
by with all this wisdom and
0:54
knowledge that you have in the
0:56
world. Last time we were on,
0:58
and I'll link to that show,
1:00
we were talking about hallucination risks.
1:02
And it's been a year, and
1:04
boy, I tell you, a year
1:06
in AI time, everything's changed. So
1:08
what's new in your world? What
1:11
are you paying attention to in
1:13
terms of AI and Agentforce?
1:15
To be honest, one of
1:17
the interesting things about having
1:20
been around for a while is that while
1:22
the technologies are new,
1:24
our overall objectives haven't really
1:27
changed. And one of the
1:29
things I've been really trying to...
1:31
look back at is what
1:33
were past challenges we overcame,
1:35
what were the parallels, and
1:38
what were some of the
1:40
best practices that people newer
1:42
to the field around data
1:44
integration, artificial
1:46
intelligence may not
1:48
know about, so we can
1:50
share this knowledge while also
1:52
picking up new ways of doing
1:55
things, because we
1:57
definitely have new tools under our
1:59
belt. An organized way to assess
2:01
what may cause hallucination risk
2:03
and mitigate it has been
2:05
a truly hot topic. I
2:07
have been visiting old friends,
2:09
making new friends as I've
2:11
been traveling across different Salesforce
2:14
events as well. And the
2:16
good news is people are
2:18
excited about the potential. People
2:20
are also excited about having
2:22
tangible methods they can take
2:24
back to their organization. I'm
2:26
looking forward to sharing some
2:28
of these with you today.
2:30
Yeah, I mean, I get
2:32
a whole lot of... It's
2:34
interesting, you look at some
2:36
of the stuff that's out
2:38
in the world, and the
2:40
spectrum for people looking at
2:42
what's going on with AI
2:44
goes all the way from
2:46
everything that it says is
2:48
right to nothing that it
2:50
says is right, and everybody
2:52
falls somewhere in between there.
2:54
But I feel like, you
2:57
know, I did podcasts in
2:59
2024, early 24, I think
3:01
in 23, even talked about
3:03
hallucination. The one thing that
3:05
it kind of came back
3:07
to, it seems to have
3:09
gone away, because I think
3:11
more of the conversation is
3:13
around the quality of the
3:15
data and what we're feeding
3:17
Agentforce and getting your
3:19
data ready. Am I right?
3:21
So given you mentioned '23, '24,
3:23
a lot of stuff was
3:25
a long time ago. Yeah,
3:27
in the AI world, right?
3:29
Uh-huh. A lot of the
3:31
hallucination risk conversations that were
3:33
happening, and this was mostly
3:35
around ChatGPT, was because
3:37
the information that was available
3:40
was up to a particular
3:42
date. It was predominantly unstructured
3:44
data available on the internet.
3:46
So if something was published
3:48
past a certain date, it
3:50
was not going to show
3:52
up in answers. One of
3:54
the big changes I think
3:56
in the ecosystem is... In
3:58
the past, we talked about...
4:00
IT solutions, data warehouses, analytics,
4:02
which were separate from marketing
4:04
segmentation and engagement. And then
4:06
we had these, you know,
4:08
really interesting LLM and generative
4:10
AI for the past several
4:12
months, though it feels
4:14
like years. The focus is
4:16
the idea of a truly
4:18
enterprise scale data platform that
4:21
can power automation, that can
4:25
power analytics, that
4:37
can look at structured and
4:39
unstructured data in order to
4:41
provide complete, compliant, and contextual
4:43
information that can also power
4:45
AI. I know we both
4:49
like storytelling. I had a
4:49
really interesting experience with my
4:51
father a couple of weeks
4:53
ago. Do you mind if
4:55
I tell that story? Oh,
4:57
please tell me. I love
4:59
a good story. He's a
5:01
90-year-old retired Brigadier General,
5:04
a military engineer. 90 years
5:06
young, you mean? Oh man,
5:08
I still barely keep up
5:10
with him. See, that's what
5:12
I'm saying. And as a
5:14
military engineer, you're always given
5:16
a mission and you have
5:18
what you have, right? That
5:20
is the typical mindset. And
5:22
in every country and every
5:24
place people are talking about
5:26
artificial intelligence, what it may
5:28
mean, and he said, okay,
5:30
look, I think this is
5:32
your field. Help me understand
5:34
what is new versus what
5:36
he was working with in
5:38
older computing days. And why
5:40
are people worried? Why are
5:42
people excited? So I sat
5:44
next to him. We brought
5:47
up ChatGPT and I asked
5:49
a series of three questions.
5:51
The first question was, I
5:53
said, tell me what you
5:55
know about, you know, my
5:57
dad's name, a
5:59
retired engineer, a retired soldier,
6:01
not even rank. And it
6:03
gave what rank he retired
6:05
at, what branch of the
6:07
military, where he went to
6:09
school. Simple question, limited context.
6:11
I asked what else do
6:13
you know about him? I
6:15
asked this without the name
6:17
and I got information about
6:19
articles he wrote for magazines
6:21
and a couple of his
6:23
books. Post-retirement he did poetry,
6:25
which is a wonderful way
6:27
to retire and he's like
6:30
oh it's interesting how does
6:32
it know that I'm like
6:34
well can people find your
6:36
books in online stores? It's
6:38
like yes though it is
6:40
available information it can leverage
6:42
all of these as it
6:44
searches. He's like, okay, that
6:46
makes sense. Then I
6:48
asked a question: What do
6:50
you know about his son?
6:52
And ChatGPT says, I
6:54
do not know who his
6:56
son is. And he's like,
6:58
so why doesn't it know
7:00
we are related? And I
7:02
said, because the fact that
7:04
you and I are related
7:06
would be in a government
7:08
database. It would not be
7:10
in public records. It's not
7:13
something that's on the internet.
7:15
And for him, this was
7:17
an obvious separation. So you
7:19
asked the question. This is
7:21
a long-winded way of getting
7:23
there, perhaps. What has changed?
7:25
A year
7:27
ago, I could dump a
7:29
bunch of knowledge articles, or
7:31
perhaps a meeting transcript and
7:33
say summarize, or I could
7:35
use knowledge articles to power
7:37
chatbots. Now I can
7:39
look at what do I
7:41
know about a person in
7:43
their transactional context based on
7:45
their order history, based on
7:47
their case history, based on
7:49
their knowledge of the product.
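The transactional-context idea described here, grounding a recommendation in order and case history rather than a bare prompt, can be sketched in a few lines. The record shapes and `CustomerContext` class are illustrative assumptions, not Data Cloud's actual API:

```python
# Illustrative sketch: assemble a unified customer's transaction history
# into grounding text that would be prepended to an LLM prompt.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CustomerContext:
    name: str
    orders: List[str] = field(default_factory=list)
    cases: List[str] = field(default_factory=list)

    def to_prompt_context(self) -> str:
        # Render history as plain lines the model can draw on.
        lines = [f"Customer: {self.name}"]
        lines += [f"Order: {o}" for o in self.orders]
        lines += [f"Support case: {c}" for c in self.cases]
        return "\n".join(lines)

ctx = CustomerContext(
    name="Sam",
    orders=["2023-04: annual membership renewal"],
    cases=["2024-01: billing question, resolved"],
)
grounding_text = ctx.to_prompt_context()
```

The point of the sketch is the direction of flow: unified CRM history is serialized into the context first, so the model answers from what the organization actually knows rather than from its training data alone.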
7:51
and I can give much
7:53
more personalized recommendations because the
7:56
AI platform idea as opposed
7:58
to an LLM technology idea
8:00
is bringing together matching technology
8:02
where we used to think
8:04
about as duplicate management and
8:06
CRM, right? That mindset has
8:08
evolved and it is there
8:10
to provide contextual interactions. Data Cloud
8:12
is not just powering the...
8:14
generative AI capabilities for
8:16
Agentforce, it is also providing
8:18
the unified insights that can
8:20
even be constrained to only
8:22
what a person is supposed
8:24
to know about where admins
8:26
and architects can control this
8:28
given the permission model and
8:30
capabilities of flow, which for
8:32
me is incredibly exciting because
8:34
that means we can deliver
8:37
more value, we can use
8:39
the technology we are already
8:41
deeply familiar with, and we
8:43
can show the true potential
8:45
of AI while minimizing risk
8:47
to our organizations and minimizing
8:49
confusion for our end users.
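The permission-constrained grounding described above, where an agent only sees what its requesting user is supposed to see, reduces to a filtering step before any prompt is built. A minimal sketch with made-up record shapes; a real org would derive the readable set from profiles, permission sets, and sharing rules rather than a hard-coded list:

```python
# Toy permission filter: drop records the requesting user cannot read
# before they ever reach the AI's grounding context.
def visible_records(records, readable_objects):
    """Keep only records whose object type is in the user's readable set."""
    return [r for r in records if r["object"] in readable_objects]

records = [
    {"object": "Case", "summary": "Refund request, open 3 days"},
    {"object": "Opportunity", "summary": "Renewal, $50k, closing Q3"},
]
# A service persona without Opportunity access:
service_perms = {"Case", "Contact"}
grounding = visible_records(records, service_perms)
# Only the Case record reaches the prompt context.
```

Filtering before grounding, rather than asking the model to withhold information afterward, is what keeps the risk to the organization bounded by the existing permission model.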
8:51
That's a fabulous story. I
8:53
feel you're spot on. Just
8:55
the level of understanding why
8:57
and what we have available
8:59
to us is huge. In
9:01
the email you sent me,
9:03
I want to pull out
9:05
a sentence because we're talking
9:07
about data and we're talking
9:09
about a lot of things,
9:11
but I think I feel
9:13
this is a good foundation.
9:15
You said, part of this
9:17
helped them realize why historical
9:20
CRM data management techniques do
9:22
not scale versus benefits of
9:24
Data Cloud. To the uninitiated,
9:26
and I'm one of them,
9:28
so I'm asking this question
9:30
for me, can you give
9:32
me what you mean by
9:34
historical CRM data management techniques
9:36
and help me understand that
9:38
versus the benefits of Data
9:42
Cloud? So if I think
9:42
about What is in the
9:44
Salesforce Admin Data Management Toolkit?
9:46
We talk about a distinct
9:48
set of areas we expect
9:50
admins to cover. They configure
9:52
objects, object fields with validation
9:54
rules and some data management
9:56
rules such as do you
9:58
want a default value or
10:00
not, if it's required or
10:03
not. We talk to them
10:05
about duplicate management rules, which
10:07
left the impression that all
10:09
duplicates are bad. And
10:11
we talk about storage optimization
10:13
more around performance because
10:15
every org had a storage
10:17
limit, you wanted to think
10:19
about when you may want
10:21
to offload storage either for
10:23
cost savings or building
10:25
skinny tables for large data
10:27
volume handling. Those were the
10:29
domains of data management we
10:31
got to, which was fairly
10:33
technical, focused mostly on data
10:35
entry operations. Let's fast forward
10:37
to even two years ago,
10:39
if you have a Salesforce
10:41
CRM org with Experience
10:43
Cloud, you need to have
10:46
intentional duplicate records, because the
10:48
records where end users maintain their
10:50
information should be separate from
10:52
how that customer's information is
10:54
maintained by employees. You may
10:56
also have records maintained by
10:58
partners using Experience Cloud that are
11:00
still about the same customer.
11:02
So already thinking that for
11:04
a customer they should have
11:06
one record is no longer
11:08
sufficient or acceptable, because partners
11:10
need to have their view
11:12
of the information, customers want
11:14
to maintain their own perspective
11:16
of what they're called, what's
11:18
their best contact information, and
11:20
companies want to be able
11:22
to have their internal view
11:24
as well, such as
11:26
a customer segment, customer risk,
11:29
and so on and so forth.
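The "intentional duplicates" described above are exactly what identity resolution reconciles. As a toy stand-in for Data Cloud's match rules (this is not its actual API, and the record shapes are made up), a single normalized-email rule already shows the shape of the idea:

```python
# Toy identity resolution: cluster intentionally duplicated records
# (customer-, partner-, and employee-maintained) into unified profiles.
from collections import defaultdict

def unify_by_email(records):
    """Group records sharing a normalized email into one profile."""
    profiles = defaultdict(list)
    for rec in records:
        key = rec["email"].strip().lower()
        profiles[key].append(rec)
    return dict(profiles)

records = [
    {"source": "customer portal", "email": "Sam@Example.com"},
    {"source": "partner org",     "email": " sam@example.com"},
    {"source": "internal CRM",    "email": "sam@example.com"},
]
profiles = unify_by_email(records)
# One unified profile; all three source views are preserved underneath it.
```

Real match rules combine many fields (name, address, phone) with fuzzy matching, but the design point is the same: the source records stay intact, and unification happens on top of them.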
11:31
But personalized engagement requires a
11:33
complete understanding of what's happening
11:35
with an organization where you
11:37
only act on information you're
11:39
allowed to see and you
11:41
act on insights that are
11:43
relevant to the outcomes you
11:45
want to achieve. So three
11:47
things I really, really like
11:49
that Data Cloud brought into
11:53
our solution kits is,
11:53
first, I can provide a
11:55
holistic understanding of the individual
11:57
or even a business contact.
11:59
So I have multiple contact
12:01
or lead records in my
12:03
CRM, even in this simplest
12:06
of architectures. Let's talk
12:08
about a non-profit example. Let's
12:10
say that, you know, we're
12:12
talking to Sam. And Sam
12:14
is a donor. Sam was a
12:16
board member. Sam worked for an
12:18
organization that gave us grants. That
12:20
is us interacting with Sam
12:23
the human in a business context
12:25
and in a donor relationship.
12:27
We are going to want to
12:29
track these through different departments,
12:32
probably through different records. But
12:34
when we want to know what do
12:36
we know about the people we engage
12:38
with, how do we send them a
12:40
personalized thank-you? This is
12:43
where Data Cloud powers that
12:45
unification perspective. Does that example
12:47
make sense before I tie to
12:49
the AI-specific examples that extend
12:52
this? Yeah, no, it does. I'm following
12:54
along. So let's say that we
12:56
are now in a data
12:58
model where we have accepted
13:00
we should maintain contextual transactions
13:03
in our business applications,
13:05
whether we have one or multiple
13:07
CRM orgs and of course other
13:09
systems. We first unify it
13:12
around individuals, business contacts,
13:14
and accounts. From there,
13:17
related transactions, related emails,
13:19
donation history from external
13:21
systems or cases regardless
13:23
of your industry can
13:26
come together in one umbrella. Now,
13:28
if I want to create a
13:30
personalized thank you message, we can
13:32
look at overall interaction history
13:34
and not just think that we
13:37
have seen someone for the first
13:39
time because they are using their
13:41
new email address in their new
13:43
corporate role, but they've been
13:45
a lifetime member. So, generative
13:47
AI solutions work better. When
13:49
interactions across a person's
13:51
contact points can be
13:54
made accessible, within compliance
13:56
rules of course, and agentic
13:58
solutions work better when
14:01
they can understand what are
14:03
all of the different types
14:05
of transactions that may be
14:07
associated to an individual or
14:09
an account, even when they
14:11
are distributed across multiple account
14:13
records, multiple contact records, even
14:15
multiple CRM orgs. You know,
14:17
you can see me, I
14:19
am pointing to things
14:21
on the whiteboard in front
14:24
of me, but this is
14:26
something that used to take
14:28
organizations months if not years
14:30
to put in place. And
14:32
having done this now, like
14:34
for real, with a few
14:36
non-profits as part of my
14:38
pro bono work, I know
14:40
we can do assessment and
14:42
planning in a few days.
14:44
We can then onboard the
14:46
data and configure data clouds,
14:48
data unification capabilities in less
14:51
than a month. And that
14:53
includes identifying bad data that
14:55
is in the system.
14:57
It is still present
14:59
whether your org is 3 years
15:01
old or 20 years old,
15:03
by filtering out irrelevant data,
15:05
by putting the standardizations directly
15:07
in place. These are all
15:09
part of a single umbrella
15:11
of capability, where as an
15:13
admin, you just worked with
15:16
the admin tools in the
15:18
past, and now many of
15:20
these transformation capabilities, configurable rules,
15:22
are accessible, all under the
15:24
Setup tree, all under the
15:26
Salesforce tabs, which allows us
15:28
to be more productive
15:30
Salesforce professionals and allows us
15:32
to decrease the total cost
15:34
of ownership as we support
15:36
our organizations. I mean, I've
15:38
always thought when I've asked
15:41
people a rhetorical question, what
15:43
is the most important thing
15:45
that your company owns? And
15:47
99% of the time when
15:49
I ask people that question,
15:51
they get it wrong because
15:53
they mention a patent or
15:55
a brand, or a
16:00
product that they produce. And I
16:00
say no, it's your data. The
16:02
data that you have is the
16:04
most important thing for you to
16:07
take care of. And ironically, it's
16:09
also the least paid attention
16:12
to because we just throw things
16:14
in and we'll sort it and
16:16
figure it out later, right? Hurry
16:19
up, move on to the next
16:21
thing. And now, as you bring
16:23
up, the unification of all these
16:26
systems, or we've put all this
16:28
data, and the management or mismanagement
16:31
of it now is of vital
16:33
importance because now we can truly
16:35
link all of this information and
16:38
have AI sort through it and
16:40
give us the relevant information that
16:42
we need by just thinking through
16:45
a few more processes. I think
16:47
what's important and what you said
16:49
is AI is additive to what
16:52
we have had because I agree
16:54
data is the most important asset
16:57
and the fact that no organization
16:59
I've ever been a part of
17:01
or helped had perfect data is
17:04
something we just need to accept
17:06
but not live with. Right. I
17:08
have a friend that has a
17:11
small marketing agency. He probably has
17:13
200 people in his little CRM.
17:15
I promise you his data isn't
17:18
good. Even that, right? I mean,
17:20
nobody's got perfect data. So what
17:23
matters is, and this is what
17:25
we talked about last year:
17:27
we can't assess data quality as
17:30
a technical concept. We can't just
17:32
look at what is in my
17:34
object. Is it good? Is it
17:37
not good? We always need to
17:39
look at data in context of
17:42
a business outcome. I think an
17:44
example I give often is how
17:46
much data you need to start
17:49
an opportunity is different than the
17:51
amount of data you need to
17:53
close an opportunity. What you want
17:56
to gather if you lost a
17:58
big opportunity is different than what
18:00
you probably would ask people to
18:03
capture if you lost a small
18:05
opportunity. So these are all proportionate
18:08
to the business benefit, where I
18:10
don't think historically we did a
18:12
great job explaining as professionals, whether
18:15
we are admins, architects,
18:17
or business analysts. But when it
18:19
comes to AI, because agentic AI
18:21
puts so much focus and emphasis
18:23
on use cases and the persona
18:26
we are empowering. If it is a
18:28
sales agent, we want to find
18:30
out what is the job a
18:32
sales agent is supposed to do,
18:34
what is the information they
18:37
need, what are the rules they
18:39
should follow, and whether you have
18:41
100 fields or 800
18:43
fields in your account
18:46
and opportunity objects, we
18:48
still need to look at what
18:50
data is reliable today.
18:52
Is that sufficient? If it is
18:54
not sufficient, we need to go
18:57
through some type of data improvement
18:59
process, or we need to look
19:01
at a different use case.
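Mehmet's point that reliability is relative to the job can be made concrete: measure fill rate over only the fields a given use case depends on, not all 800. A minimal sketch, where the field names and sample records are illustrative:

```python
# Sketch: assess data reliability in context of a use case by checking
# the fill rate of just the fields that use case needs.
def fill_rate(records, required_fields):
    """Fraction of records where every required field is populated."""
    if not records:
        return 0.0
    ok = sum(1 for r in records if all(r.get(f) for f in required_fields))
    return ok / len(records)

opportunities = [
    {"Amount": 50000, "CloseDate": "2025-06-30", "LossReason": None},
    {"Amount": None,  "CloseDate": "2025-07-15", "LossReason": None},
]
# Starting an opportunity needs less data than closing one.
start_ready = fill_rate(opportunities, ["Amount"])
close_ready = fill_rate(opportunities, ["Amount", "CloseDate", "LossReason"])
```

Here the same two records are 50% ready for the "start" job but 0% ready for the "close" job, which is exactly the "data quality in context of a business outcome" framing rather than a single org-wide score.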
19:03
If we have sufficiently
19:05
reliable data, we need to look
19:08
at how do we ensure our
19:10
prompts both use data from
19:12
those fields that have
19:14
reliable data and sufficient
19:16
metadata, and also know
19:18
when a subset of records
19:20
don't have sufficient data
19:22
quality in those very same
19:25
fields. And then third, just
19:27
because it works today, we
19:29
shouldn't assume things are going
19:31
to be the same tomorrow
19:33
because processes are changing, configurations
19:36
are changing, people's habits
19:38
are changing. So by monitoring
19:40
what's happening in our business
19:43
applications and catching deviations.
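Catching deviations can be as simple as comparing today's field fill rates against an established baseline and alerting on drift. The field names, rates, and tolerance below are illustrative, not from any real org:

```python
# Sketch of data-quality drift monitoring: flag fields whose fill rate
# has dropped well below an established baseline.
def drifted(baseline, current, tolerance=0.10):
    """True when the current fill rate falls more than `tolerance` below baseline."""
    return (baseline - current) > tolerance

baseline_fill = {"Industry": 0.92, "Email": 0.98}
current_fill  = {"Industry": 0.71, "Email": 0.97}

alerts = [f for f in baseline_fill
          if drifted(baseline_fill[f], current_fill[f])]
# A process or page-layout change likely stopped capture of any flagged field.
```

Run on a schedule, a check like this is what turns "processes are changing, people's habits are changing" from an unexpected bad surprise into a routine alert.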
19:46
We can avoid unexpected bad
19:48
surprises also in the
19:50
flows. Honestly, these are things
19:52
with a time machine we should
19:54
have thought of and incorporated
19:56
into our automation flows and into
19:59
our reports. But the attention wasn't
20:01
there as much as it
20:03
is today. People being
20:05
excited about AI, but worried
20:07
about hallucination risk,
20:09
is one of the best
20:11
things that happened to ensure
20:13
we can provide reliable data
20:15
for all types of decision-making
20:17
through Salesforce. Right. Well, I
20:19
mean, what do they say?
20:21
hindsight is 20/20. If you could
20:23
go back and know the
20:25
future, then you'd obviously plan
20:27
for it, but it also
20:29
creates opportunity for us to
20:31
be creative and corrective in
20:33
how we move forward, which
20:35
means that every solution you're
20:38
thinking of today moving forward
20:40
is going to look very
20:42
different than before Agentforce.
20:44
You know, one of the
20:46
things I'm still noodling on,
20:48
and I'll probably spend a
20:50
few more years noodling, is
20:52
how do we make sure
20:54
we can take better advantage
20:56
of unstructured data that is
20:58
the vast majority of all
21:00
interactions. I remember being excited
21:02
about Einstein Activity Capture, which
21:04
was a few years ago,
21:06
and it's still an untapped
21:08
potential, but now we are
21:10
analyzing that data, we're incorporating
21:12
that data. The more we
21:14
can streamline the end-user experience
21:17
to capture information, know when
21:19
information may be missing, incomplete,
21:21
you know, potentially out
21:23
of date. So they can
21:25
improve it in a tactical
21:27
surgical way and then be
21:29
able to explain to them
21:31
why we're making certain recommendations
21:33
in AI assisted suggestions. I
21:35
think that's also going to
21:37
increase the confidence for agentic
21:39
experiences where humans are engaged
21:41
at a secondary level. Like,
21:43
I know that I would
21:45
like AI tools to give
21:47
me results I can believe
21:49
in when I'm directly engaging
21:51
first. Before I'm willing to
21:53
expose it to users perhaps less
21:55
savvy or less aware of
21:58
my underlying processes,
22:00
I think a lot of
22:02
people are going to go
22:04
through the journey. So thinking
22:06
about process mapping, thinking about
22:08
testing strategies are also going
22:10
to be important considerations for
22:12
all of us. I mean,
22:14
that's the whole point is
22:16
to test, right? You want
22:18
something reliable, and to ask
22:20
why. I think the important
22:22
thing is people get things
22:24
wrong too. You know, we
22:26
sometimes look at some of
22:28
these technology solutions as infallible
22:30
as if they're always perfect, and
22:32
they're not. They're imperfect because
22:34
they're built by imperfect people.
22:37
But that doesn't mean that
22:39
you can't constantly iterate on
22:41
your solution. I remember long
22:43
time ago when I was
22:45
an admin, I feel like
22:47
it was Josh Birk or
22:49
it was another developer was
22:51
always like, you know, every
22:53
year I look at the
22:55
code I wrote for the
22:57
previous year and wonder, why
22:59
did I write it that
23:01
way? And it's because you're
23:03
a year smarter. 100 percent
23:05
agree. I have like 13
23:07
years of Salesforce presentations
23:09
in my Dropbox folder, and
23:11
when I look at it,
23:13
it's fascinating to see what
23:16
is still true. And it's
23:18
interesting to see when some
23:20
recommendations have completely changed over
23:22
the course of the last
23:24
15 years. Because we are
23:26
learning, and I think we
23:28
need to be honest about,
23:30
look, yes, this was the
23:32
recommendation based on what we
23:34
knew. Here's what we learned
23:36
since, and here's why we
23:38
are recommending X today, that
23:40
is different. I think on
23:42
the other side, as professionals,
23:44
we need to remember. None
23:46
of us have all the
23:48
answers and what we knew
23:50
yesterday might have changed today.
23:52
So look, I love your
23:55
podcast. I love some of
23:57
the things that come out
23:59
of the various... blogs because
24:01
people share what they've learned
24:03
at a level of frankness,
24:05
including what we stopped
24:08
doing. And that is a sign
24:10
of us being learning humans. And
24:12
it's the best way to
24:14
be. Absolutely. How do we
24:16
learn to be better every single
24:18
day? Well, I feel like that's a
24:21
really good place to end
24:23
this episode on because... I always
24:25
want to learn more and I
24:27
appreciate you coming on the podcast
24:29
and helping everybody else learn more. It's
24:31
my pleasure. I know that, you know, there
24:34
are so many thoughts we can always get
24:36
into. I hope these sessions enable
24:38
more personal connections and if
24:40
you're listening to it and we
24:42
run into each other at an
24:44
event, let's grab coffee, let's talk
24:46
about data or life because we're going
24:48
to learn from each other, we will
24:50
make each other better and thank you
24:52
Mike for the opportunity. Absolutely.
24:55
So Mehmet took us for
24:57
a ride from a
24:59
90-year-old general, his father, all
25:02
the way to data that
25:04
doesn't quite behave. And the
25:06
takeaway? Well, AI is like
25:08
a great intern. It's only
25:10
as good as the notes
25:12
you give it. So let's
25:14
feed it well and ask
25:16
better questions. But anyway, huge
25:18
thanks to Mehmet for the
25:20
wisdom and the stories. If
25:22
you learn something today or
25:24
you just enjoyed the ride,
25:26
can you do me a
25:28
favor and just share the
25:30
podcast and spread the data love? Now
25:32
until next time, we'll see you in
25:34
the cloud.