66. Hassan Jan, Senior Audit Manager, Fidelity Canada

Released Wednesday, 8th November 2023

Episode Transcript

0:00

Hello and welcome

0:00

to The Assurance Show.

0:03

We have a special guest with

0:03

us today, Hassan Jan, Senior

0:06

Audit Manager at Fidelity

0:06

Investments in Canada.

0:10

Hassan, welcome to the show.

0:12

Hi Yusuf, thanks for having me.

0:14

Why don't we start with

0:14

a brief overview of who you are,

0:18

where you've come from, career

0:18

wise, and where you are now.

0:21

Okay, sounds good. I am a CPA or an

0:23

accountant by trade.

0:27

So I went to school for accounting. And thereafter I joined one of

0:29

the big auditing firms, PwC.

0:34

And I was thrown into a

0:34

rotational program where

0:37

they said, you're going to learn how to code. And that was a big shock

0:39

and a surprise to me,

0:41

but I said, okay, and

0:41

I was open to learning.

0:45

So alongside my accounting

0:45

and auditing training, I

0:48

was learning how to code and

0:48

just quickly became really

0:52

interested in data analytics.

0:55

Over the course of my career,

0:55

I've just spent a lot of time

0:58

working in internal audit

0:58

departments, and specifically

1:02

using data analytics to

1:02

enhance those departments, do

1:05

some new things, do auditing

1:05

in different kinds of ways.

1:09

So right now I work for Fidelity

1:09

Canada, and that's basically

1:14

what I'm trying to do there. Just use data analytics, other

1:16

kinds of machine learning

1:20

to enhance our capabilities

1:20

and to enhance our practice.

1:23

Okay, do you want to tell us a little bit about Fidelity and what you do?

1:26

Sure. Fidelity, we manage people's

1:27

investments, whether you

1:30

are an institutional client

1:30

or you just want to make

1:34

investments, on behalf of

1:34

yourself or your family.

1:37

So we manage billions of

1:37

dollars worth of people's money.

1:40

more recently, we've

1:40

started to get into crypto

1:43

investments as well. So we have a new Bitcoin

1:45

offering, which is always

1:48

on people's radars or

1:48

in the news these days.

1:51

But that's the basic way I

1:51

can describe our business

1:54

is we manage people's money

1:54

and we make them money

1:58

by managing their money.

1:59

There'll obviously

1:59

be a range of use cases for

2:03

internal audit data analytics. Do you want to give us a

2:05

bit of flavor of the types

2:07

of use cases you've seen.

2:09

For sure. I kind of like to separate it

2:10

into, two buckets if you will.

2:14

So on the one hand we have

2:14

analytics which are used

2:17

for internal purposes. A big use case that often comes

2:19

up is reporting to the board.

2:23

and on the other hand we

2:23

have data analytics that

2:26

we deploy on the audits. Or sometimes we use for

2:28

continuous monitoring.

2:31

I've been in a few departments

2:31

and organizations over the

2:34

past 10 years and I've always

2:34

seen board reporting as a use

2:38

case or maybe even reporting

2:38

to senior management.

2:42

I find the use case often

2:42

looks something like this.

2:46

There's data sitting in

2:46

different systems and

2:51

management or the board

2:51

doesn't have an overall view.

2:55

Some picture or some metric. And so you can pull data

2:57

from those different systems,

3:00

aggregate it, and then

3:00

automate that process and

3:04

then present some kind of

3:04

metrics to senior management.
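
A minimal sketch of the board-reporting pattern described here, pulling extracts from two systems, joining them, and rolling them up into one metric; the file names and columns (finding_id, due_date, status, department) are hypothetical, not Fidelity's.

```python
# Sketch only: aggregate extracts from separate systems into one board metric.
import pandas as pd

# Extracts that would normally come from different systems (names assumed)
findings = pd.read_csv("audit_findings_export.csv")      # one row per audit finding
actions = pd.read_csv("remediation_actions_export.csv")  # one row per management action

# Join the two sources on a shared key
merged = findings.merge(actions, on="finding_id", how="left")

# Roll up into a simple board-level metric: open overdue actions per department
merged["overdue"] = (pd.to_datetime(merged["due_date"]) < pd.Timestamp.today()) & (
    merged["status"] != "Closed"
)
summary = (
    merged.groupby("department")["overdue"]
    .sum()
    .rename("open_overdue_actions")
    .sort_values(ascending=False)
)
print(summary)
```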

3:07

And I think that's also just

3:07

contingent on the fact that,

3:11

whether you have good systems

3:11

or not, I've worked with a

3:13

lot of organizations that have

3:13

data or audit data sitting in

3:17

different locations, and so

3:17

that's obviously, an easy use

3:21

case and probably more common

3:21

than I would like to see.

3:25

I'd like to see internal

3:25

audit shops using technology

3:28

in better ways but that

3:28

just happens to be the case.

3:31

Another use case that, we're

3:31

working on particularly

3:35

is continuous monitoring

3:35

for our AML department.

3:40

AML just refers to

3:40

anti money laundering.

3:43

Here in Canada, like in many

3:43

parts of the world, you have

3:46

to do some AML monitoring,

3:46

which basically means that you

3:51

have to monitor transactions here at Fidelity,

3:52

and if some of them are

3:56

suspicious, then we have to

3:56

report those to the regulator,

3:59

they will do a deeper dive. But our AML department

4:01

has a whole program

4:04

for AML monitoring. We audited them a couple

4:06

years ago and found a

4:09

few gaps in the program. And so we've developed a few

4:11

monitoring controls that we run

4:15

on a continuous basis, to help

4:15

them pick up some transactions

4:19

on the one hand, and then

4:19

on the other hand to ensure

4:23

that they're doing their job. Which basically means

4:25

if something's risky,

4:28

it's a transaction, it's got to

4:28

go through a series of reviews.

4:31

And so, one of the use

4:31

cases we have is, helping

4:34

to make sure that they're

4:34

doing the number of reviews

4:36

that they're required to do.
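
A sketch of the review-count check described here, assuming hypothetical extracts and a hypothetical policy of two reviews for high-risk transactions; it is not Fidelity's actual control.

```python
# Sketch only: confirm each risky transaction received the required reviews.
import pandas as pd

transactions = pd.read_csv("aml_transactions.csv")  # txn_id, risk_rating, amount (assumed)
reviews = pd.read_csv("aml_reviews.csv")            # txn_id, reviewer, review_date (assumed)

REQUIRED_REVIEWS = {"high": 2, "medium": 1, "low": 0}  # assumed policy, not Fidelity's

# Count reviews per transaction and compare against the requirement
review_counts = reviews.groupby("txn_id").size().rename("review_count")
checked = transactions.join(review_counts, on="txn_id").fillna({"review_count": 0})
checked["required"] = checked["risk_rating"].map(REQUIRED_REVIEWS)

# Exceptions feed the continuous-monitoring report back to the AML team
exceptions = checked[checked["review_count"] < checked["required"]]
print(exceptions[["txn_id", "risk_rating", "review_count", "required"]])
```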

4:38

Maybe we'll pick up on the first and then jump into the second.

4:41

You were talking about board

4:41

reporting, I suppose internal

4:44

audit are usually in a pretty

4:44

unique position that you get

4:47

to see across the organization

4:47

and the silos that might exist.

4:51

So, looking differently

4:51

at the organization than

4:55

individual departments might?

4:58

Yeah, The nice thing about internal audit is, for example,

4:59

we're looking at auditing,

5:02

data privacy next year. We've had a lot of

5:04

movement in, that area. And so, if we go and audit

5:07

the data privacy group, we get

5:10

familiar with their policies,

5:10

their procedures, some of the

5:13

work that they're doing, some of

5:13

the requirements they have for

5:16

the rest of the organization. And then, you know, later

5:18

on, we'll go audit the

5:20

IT department or finance. We have that knowledge

5:22

in the back of our heads.

5:25

And so we might be auditing

5:25

finance according to their

5:28

policies and procedures, but

5:28

we also know there's some

5:31

data privacy requirements,

5:31

do these apply to you?

5:34

And so knowledge of different

5:34

departments within your

5:38

organization just helps you

5:38

to, you know, to understand

5:42

the requirements that they

5:42

have for other departments.

5:44

another example is just

5:44

like knowing people, right?

5:48

So I know a lot of people

5:48

at Fidelity, and if someone

5:52

comes to me with a problem

5:52

and I might be able to

5:55

redirect them to someone

5:55

else within the organization.

5:58

a good example is we have an

5:58

emerging tech team at Fidelity

6:02

Canada and they're supporting

6:02

a lot of departments with

6:06

automation and they're trying

6:06

to get into more advanced

6:09

analytics capabilities. Some people within our

6:11

organization, because we're large, don't

6:13

really know about them. Especially if they're like

6:15

lower down on the rungs of

6:17

the organizational ladder. And so connecting them with

6:19

people to enhance their

6:22

processes or, their capabilities

6:22

is, is just another advantage.

6:26

But the list goes on and on.

6:28

Separately, you were

6:28

talking about expanding your

6:31

use of large language models. So you set yourself a target

6:33

of a new prompt every day.

6:36

How is that going, and what does

6:36

that mean for both analytics,

6:39

but also audit more broadly.

6:42

It's hard to say, right? I mean, I have a lot of

6:43

predictions, on the one hand.

6:46

On the other hand, I know

6:46

what I know based upon my use

6:50

and experience with ChatGPT. So right now at Fidelity, we're

6:52

vetting ChatGPT and getting

6:57

it ready, for use in Q4, maybe

6:57

Q1. We're going through the due

7:01

diligence process and making sure

7:01

we're comfortable with, with

7:05

the tool and, the algorithm and

7:05

getting our policy requirements

7:10

up to date and, you know, just

7:10

getting comfortable with it.

7:14

So I haven't been using

7:14

it for work specifically

7:17

more on personal use. When it first came out

7:18

at first we had access.

7:22

But the big concern for

7:22

Fidelity, and I'm sure a lot

7:24

of organizations, is you can't

7:24

put company data in there.

7:28

Or, or if you do, it's gotta

7:28

be under strict circumstances.

7:32

It all comes back to privacy. But in my personal use, I've

7:34

done some pretty cool, I

7:38

mean, I can't say I've done

7:38

some pretty cool things, but,

7:41

I've seen the technology do

7:41

some pretty amazing things.

7:44

Like in one case, I gave

7:44

it a fake audit program.

7:48

And I said, create a risk

7:48

and control matrix, right?

7:53

And at first I didn't

7:53

ask it to make a matrix.

7:56

At first it just kind of

7:56

went down in bullet form.

7:59

Like here are the controls

7:59

that you would test and

8:02

here are the procedures. And here are the risks.

8:06

And I was pretty impressed,

8:06

and then I said, Okay, now

8:09

translate that to a table form. And then it translated

8:11

everything in a table form,

8:14

so like, you had your, your

8:14

controls, and then you had

8:18

your risks, and then I said,

8:18

Now, now put tick boxes or check

8:21

marks to see which risks are

8:21

mitigated by which controls.

8:25

And then it proceeded to do that. I've also asked it, and I think

8:28

everyone knows this, write me

8:31

an introduction for this audit

8:31

report, which is obviously fake.

8:36

And it just, it does it

8:36

in, in a matter of seconds.
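
For readers who want to reproduce the experiment, here is an illustrative version of that prompt sequence using the OpenAI Python SDK; the model name and prompt wording are assumptions, not the exact prompts used in the episode.

```python
# Illustrative only: a multi-turn prompt sequence along the lines described above.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment
messages = [{
    "role": "user",
    "content": "Here is a fictional audit program: <paste program>. "
               "List the risks, the controls, and the test procedures.",
}]

follow_ups = [
    "Now translate that into a table.",
    "Add check marks showing which risks are mitigated by which controls.",
    "Write a short introduction for the (fictional) audit report.",
]

reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(reply.choices[0].message.content)

for prompt in follow_ups:
    messages.append({"role": "assistant", "content": reply.choices[0].message.content})
    messages.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(reply.choices[0].message.content)
```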

8:39

So I think based on my

8:39

experience with the tool so far,

8:43

like it's definitely going to

8:43

be used to write audit reports.

8:47

Specifically to get the

8:47

language and the tone right.

8:51

A lot of the times we go back

8:51

and forth, it could be a month,

8:53

two months with management

8:53

to fine tune the audit report

8:56

and I'm sure a lot of people

8:56

struggle with that as well.

9:00

But I think LLM models will be

9:00

able to kind of get the tone

9:04

right and get the language

9:04

right and get the report

9:07

ready for consumption at

9:07

the senior management level.

9:10

Sometimes reports are just too

9:10

in the weeds and they got to

9:14

kind of step it up to a higher

9:14

level to make it understandable

9:16

for senior management. So I think LLMs will

9:18

be used for that. And then just in terms of

9:21

analytics I don't even know.

9:24

That's where I have more predictions. I could tell you a bit more

9:26

about that, but I think

9:28

that is the biggest wild

9:28

card in my, imagination.

9:31

so talking about that,

9:31

and , we'll get into some of the

9:34

specifics that you brought up in

9:34

in the KNIME webinar later on.

9:37

But one of the things that you

9:37

said that was a challenge was

9:41

being able to understand some

9:41

of the data that was provided

9:44

because of the sheer volume. And I assume that, you know,

9:46

it's not just, the number of

9:49

records but also just the number

9:49

of fields that you need to be

9:53

able to navigate to properly

9:53

understand what's going on and

9:56

there'll be a range of fields

9:56

that are ambiguous and maybe

10:00

difficult to understand and do

10:00

you see a use case there for

10:04

trying to unleash some sort

10:04

of advanced model that would

10:11

enable that understanding

10:11

of, of the data so that you

10:15

don't have to go back and

10:15

forth with IT and frustrate

10:18

them a little bit, I suppose.

10:20

Yeah, in theory, I

10:20

could, visualize an LLM model,

10:24

which goes and pulls the data

10:24

for you, and, you know, the old

10:29

school way is you, you send your

10:29

data requests to the IT team,

10:33

or maybe you pull it yourself

10:33

because you have system access,

10:36

and you get a big dump, and then

10:36

you open it up, maybe in some

10:41

tool, or even Excel, and

10:41

you're like, where do I begin?

10:46

There's just too much data here. I could, you know, in theory

10:47

visualize one of these

10:50

models that, you prompt it. This is my objective.

10:53

this is my, my audit test. What fields and

10:55

records do I need?

10:58

And then it just goes into

10:58

the system and it pulls

11:01

exactly what you need. And then it does the

11:04

test for you, right? That's conceivably where

11:07

this technology is going.

11:11

So what does that

11:11

mean in terms of, you know,

11:15

and I and others would have

11:15

had the benefit of, coming

11:17

through junior ranks, et

11:17

cetera, in terms of auditing.

11:20

And what the technology does is

11:21

potentially replaces some of

11:25

that more basic work that really

11:25

is the foundation for a lot of

11:30

the other work that follows. What do you think that means for

11:32

juniors coming into the team?

11:36

Yeah, that's a tough question. I think it means you need

11:37

to learn a new skillset.

11:41

that's probably more true

11:41

for all of us to be honest.

11:45

But I think especially for

11:45

juniors, we're going to have

11:48

to learn how to prompt these

11:48

machine learning models to ask

11:54

the right questions, to define

11:54

our requirements up front.

11:58

that defining the requirements

11:58

up front is very important.

12:01

I've interacted with ChatGPT

12:01

before for like 20 or 30

12:06

minutes and just really wasn't

12:06

satisfied with the output.

12:10

I know the technology is

12:10

getting better with time,

12:13

but I also think maybe I

12:13

wasn't satisfied with the

12:16

output just because I wasn't

12:16

prompting it in the right way.

12:20

On other occasions I've

12:20

been really successful

12:23

in 2, 3 to 5 minutes. So I think...

12:26

You know, it's kind of

12:26

generally called prompt

12:28

engineering or prompting. I think junior auditors

12:30

are going to have to get good at that.

12:34

How to interact with these

12:34

machines, these technologies,

12:39

in order to get the kind

12:39

of output that you want.

12:42

and coupled with that, another

12:42

skill set that juniors are

12:45

going to have to acquire. This maybe even borders

12:48

on personality traits as well,

12:51

but I think creativity is

12:51

going to be really important

12:55

because these technologies

12:55

are so powerful and they can

12:58

do a lot of the grunt or the

12:58

legwork for you, you have to

13:01

know where you're going or

13:01

what you want it to do if you

13:06

are creative, that helps you. In some sense, because you can

13:08

do auditing in new ways, or

13:13

you could get the machine to

13:13

do some kind of test that's

13:16

never been done before. So I think that kind of

13:18

borders on personality

13:21

traits, or, or even skills.

13:23

I think it can also be

13:23

acquired to some extent.

13:26

But I'd say prompting and

13:26

creativity, or just knowing, you

13:30

know, how to get from A to B,

13:30

it's kind of like logistics, is

13:33

going to be important as well.

13:36

And, and this is the last question I'll ask on this before we move on, but do you think

13:38

that means that as managers,

13:43

senior leaders within the team,

13:43

we need to think differently

13:47

about how we manage team members

13:47

and those new juniors coming in?

13:52

So it's not the traditional,

13:52

let me teach you how to audit.

13:55

It's a bit beyond that

13:55

because there is some

13:59

built in capability that we can leverage already.

14:01

I think it's definitely

14:01

going to change the way and the

14:05

approach we have for training. The fundamentals

14:07

will always be there. Auditing will always be about

14:10

assurance to our stakeholders.

14:13

It will always be about

14:13

conducting tests and testing

14:17

policies and procedures

14:17

and some of these things.

14:20

But the way in which we do that

14:20

is going to change dramatically

14:23

with these models and these

14:23

technologies, and therefore the

14:27

way we do our training as well. Next year, one of our

14:29

goals is to really use

14:32

ChatGPT to write reports. And I think that as we start

14:35

to do that, well, I think

14:37

we'll start to get a sense

14:37

for what kind of training

14:40

we need to provide people. And just the bigger, larger

14:42

question, which is what

14:44

kind of work are people

14:44

going to be doing and how is

14:47

their work going to change? So it's, going to change our

14:49

operating models as well.

14:53

as I alluded to before

14:53

I came across you largely

14:55

because of the, presentation

14:55

you gave around the work that

14:59

you had done using KNIME. And just wanted to talk

15:00

a little bit about that. I know some people here

15:02

may have seen that.

15:05

And if you haven't seen it, we'll put a link in the show notes so you

15:07

can go and have a look.

15:09

But I wanted to explore a

15:09

little bit in terms of, some

15:13

of the challenges that you

15:13

came across, I know, obviously

15:15

what we're talking about is a

15:15

low code environment really

15:19

powerful, regardless of the

15:19

tool that you're using, there

15:22

is that need to understand the

15:22

data and iterate through that

15:25

understanding of the data. And I know that you

15:27

mentioned that there were some challenges around that.

15:30

Do you want to, explain

15:30

some of that and, what

15:32

you think some solutions

15:32

might be coming out of that?

15:35

I know you had a whole

15:35

lot of value coming out of

15:37

the work that you did, but

15:37

just what you found and

15:40

what you think you might do

15:40

differently going forward.

15:43

This harkens to a

15:43

bigger problem I've seen,

15:46

throughout my career. Again, when I started my career,

15:48

I was taught how to code.

15:52

And so I'm coming from

15:52

that vantage point.

15:55

Data literacy is the problem or

15:55

the issue that auditors need to,

16:01

grapple with and understand.

16:03

Like conceivably, if you just

16:03

look at and think about your

16:06

organizations right now, how

16:06

many systems do you have?

16:10

could be 10, it could be

16:10

100, it could be 500 plus.

16:14

And then what kind of data

16:14

exists in those systems?

16:18

And then if you were going

16:18

to take that data and

16:21

connect dots between them,

16:21

how would you do that?

16:25

That knowledge is

16:25

starting to get to what

16:28

we call data literacy. How do we take disparate

16:30

systems, datasets, join them

16:33

together, and then analyze

16:33

them in an appropriate way?

16:37

But I think more fundamentally

16:37

it's just, getting down

16:41

to brass tacks here,

16:41

just joining datasets.

16:44

If you can understand how to

16:44

join datasets together.

16:47

Then the sky's the limit

16:47

with what kind of analysis

16:50

you can start to do. So I kind of broadly describe

16:52

it as data literacy is the

16:56

issue, and in particular it's

16:56

joining datasets together.
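
The join skill being described comes down to something like this in pandas; the dataset and column names are hypothetical.

```python
# Sketch only: join two disparate datasets on a shared key so they can be
# analyzed together. Unmatched keys are often the interesting exceptions.
import pandas as pd

hr = pd.DataFrame(
    {"employee_id": [101, 102, 103], "department": ["Finance", "IT", "Ops"]}
)
access = pd.DataFrame(
    {"employee_id": [101, 103, 104], "system": ["GL", "GL", "CRM"]}
)

# Left join keeps every access record and pulls in the department where known
joined = access.merge(hr, on="employee_id", how="left")
print(joined)

# Access records with no matching employee record, e.g. a leaver or a ghost account
print(joined[joined["department"].isna()])
```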

17:00

We can talk a bit more about that,

17:00

but I think that's a

17:03

skill set I wish internal

17:03

auditors would pick up more

17:06

and some of the people on our

17:06

team, but it's hard, right?

17:09

Because there's competing

17:09

priorities, there's audit

17:11

reports to get out the door,

17:11

there's trainings to attend,

17:15

there's this, there's that. I've done training with

17:17

people on our team and, you

17:20

know, I've shown them how

17:20

to join data sets together.

17:23

And like a lot of times

17:23

people just look at me

17:26

and they're like, but why

17:26

would I ever use this?

17:28

Right? Because I come from a coding

17:29

background and I've used audit

17:32

analytics for a very long

17:32

time, I know, I know why.

17:35

But trying to communicate that,

17:35

to a business auditor or an

17:38

IT auditor, sometimes it just

17:38

doesn't stick and maybe that's

17:42

because of their interests

17:42

, or some other factor.

17:45

But that's why I'm starting

17:45

to just look for interest

17:48

in analytics and if people

17:48

have interest, then start

17:51

to double down there. I think some people just

17:52

want to be business auditors

17:55

and they're kind of just

17:55

wired a bit differently.

17:59

They're more focused on relationships, that kind of thing.

18:01

Not to say these things

18:01

are mutually exclusive, but

18:04

it's hard to be everything. And it's also hard to

18:06

specialize, and I think some

18:09

people like doing both of those,

18:09

but some people like to just

18:12

fall in the middle as well.

18:13

do you think any of that? What might be perceived as lack

18:14

of interest if not real lack

18:18

of interest, could be driven

18:18

by fear or lack of confidence.

18:23

Definitely. I know, , when I first started

18:24

coding I was thrown into an

18:28

environment at a big four with

18:28

deadlines, audit deadlines at

18:34

that for external audit clients. And it was very stressful and

18:36

I had no idea what I was doing.

18:41

And there wasn't really much help. And the training was

18:43

conducted by someone who

18:46

wasn't physically present

18:46

in our office at the time.

18:51

that just made things more difficult, and there were time zone differences.

18:54

I struggled for a very long

18:54

time and maybe I'm just a bit

18:59

more judgmental on this side

18:59

of the thing because I've,

19:01

had all this experience and

19:01

all these years under my belt.

19:05

But I could definitely

19:05

understand why someone would

19:07

be fearful or just intimidated. Just like, I am with

19:09

learning new stuff, right?

19:12

Like there's all, we're

19:12

all scared of something and

19:15

maybe that could definitely

19:15

be the case in some of

19:17

these situations as well.

19:19

Yeah, look, I completely understand what you're talking about.

19:21

I know many years ago, I

19:21

traveled around a few of

19:25

the offices of the Big

19:25

Four that I was working at,

19:27

training people to use ACL.

19:30

And you'd have, you know, 20, 25

19:30

people in a room, or 20, 25...

19:34

Generally, external auditors in the room. And if you got one sort

19:35

of bright face out of

19:39

that, it was a lot. Generally, you got a

19:40

lot of blank stares. I need to do this to tick

19:43

off some sort of requirement

19:45

as opposed to, you know, I

19:45

really need to learn this.

19:48

And I think that's changed. Not fast enough, but

19:49

slowly that's changing.

19:53

I've always wondered what the gap was. Is it not seeing

19:54

the value, or is it

19:58

This is too hard, and maybe

19:58

there's a combination of those,

20:02

but I'm always interested

20:02

in, and keen to get your

20:04

thoughts on ways in which we

20:04

can eliminate some of that.

20:08

So the value stuff, there's

20:08

been a lot spoken about that.

20:10

And, you know, a lot of

20:10

it is evident, but how to

20:13

eliminate the confidence gap. what are your thoughts

20:15

and what have you seen work over the years?

20:19

I think the ten-cent

20:19

answer is ChatGPT and LLMs

20:23

will solve that problem very

20:23

quickly because some people

20:27

just don't like coding. And

20:27

if you can ask a computer

20:33

or a piece of tech to do all the hard work

20:35

and the legwork for you, then

20:37

you can do data analytics too. It just looks a bit different.

20:41

But what can we do in the meantime? I don't know.

20:44

I've kind of just settled on a compromise. I think I've talked to a lot of

20:46

people recently about how they

20:50

structure their departments

20:50

specifically analytics

20:54

and how they work with their teams. It can take a lot of forms,

20:56

and obviously that depends on

20:59

the size of the organization. But just based on my

21:01

conversations recently,

21:03

I've kind of just settled on

21:03

something like this, right?

21:07

If you're interested,

21:07

then we can work with you.

21:10

We'll train, you and we can

21:10

kind of build you up to be

21:13

like a data analytics champion. Like a lot of people use this

21:15

term in their organizations.

21:19

And you can, you know,

21:19

support the auditors for audit

21:23

execution and things like that. But in terms of getting over the

21:25

fear something that I've seen

21:28

some success with is not like

21:28

a theoretical training but like

21:34

training that's on an actual

21:34

file, an audit file, because then it

21:38

becomes really relevant to them. And especially if they're

21:40

working on that audit.

21:42

It's like, Hey, let's

21:42

sit down and we have this

21:45

audit procedure and we have

21:45

two weeks to complete it.

21:49

And what kind of data do we need

21:49

and what are we going to do?

21:53

And then let's actually

21:53

work through the problem

21:56

in KNIME or whatever

21:56

tool that happens to be.

21:59

I've seen a bit more interest

21:59

, in there when I've done it with

22:01

some of our business auditors. And that's something

22:03

I'm actually planning. I observed that like

22:05

not too long ago.

22:08

So that's something I, it's a

22:08

hypothesis I want to test out

22:11

in the next six months and see

22:11

if there's more traction there,

22:14

but I think, I think there is.

22:16

Funny, you say that, cause I was talking to somebody not too long ago

22:18

Chief Audit Exec, who said

22:21

exactly the same thing is

22:21

that we don't have time for

22:25

protracted training sessions,

22:25

and frankly, there's not a lot

22:28

of budget for these sorts of

22:28

things, but if we can learn.

22:31

As we go on a particular audit,

22:31

that's far more useful because

22:34

we're actually delivering something as part of that training as opposed to you know,

22:36

training course that will go in

22:39

one ear and come out the other. So you've got

22:41

business auditors who we can find ways of training

22:44

up, but when you talk about

22:46

really technical people, have

22:46

you found any challenges with

22:50

retaining those people within

22:50

audit once they get a certain

22:54

level of data related skill set?

22:57

Oh, for sure. they're highly sought after.

23:00

And I think the

23:00

opportunities for them are

23:04

quite good, to be honest. Yeah, retaining these people

23:06

is always a challenge.

23:09

Just because they have, they

23:09

have a unique combination of...

23:13

Coding slash analytics and

23:13

auditing or that could even

23:18

be, analytics and finance

23:18

or analytics and whatever.

23:22

they're a rare breed because

23:22

they not only have domain

23:25

knowledge, But they have this

23:25

expertise as well, and so

23:29

they're going to be useful

23:29

wherever they go especially as

23:33

organizations focus more and

23:33

more on data and curating it

23:38

and extracting value from it. So it's, always a challenge.

23:42

The thing that's encouraging

23:42

to me is that we have a rotational program at

23:46

Fidelity, and so every four

23:46

months we're getting a co

23:49

op student, and I've noticed

23:49

with the past four or five co

23:54

op students, they come in

23:54

and they all know about data

23:58

analytics and they're all

23:58

eager to try and to learn.

24:03

Whereas I remember going

24:03

through university and it just

24:08

wasn't even a buzzword, right?

24:10

We had a few courses

24:10

on, data or IT auditing.

24:14

And that was kind of my

24:14

introduction to ACL at the

24:17

time, although I didn't know it. But that's kind of encouraging

24:20

to me because it means that

24:23

professors are talking to

24:23

their students about analytics,

24:27

or they're hearing it at

24:27

a conference, or they're

24:29

hearing it somewhere, and

24:29

they're getting interested,

24:32

and then they're moving

24:32

into the workforce, and they

24:34

want to get their feet wet. So, I'm hopeful that because

24:36

there's more interest,

24:40

it seems, at that level. That when they come into

24:42

organizations, they're going

24:45

to be more willing , to stick

24:45

with it , and to find maybe

24:48

even a career in data analytics

24:48

and that kind of starts to

24:52

solve the problem a bit because

24:52

right now we're dealing with

24:56

a very small pool of people

24:56

who have that domain knowledge

25:00

in auditing or assurance and

25:00

then the analytic skills, but

25:03

the interest and the hype at

25:03

the university level, college

25:06

level is hopefully starting

25:06

to solve some of that problem.

25:10

In terms of the work that

25:10

you've seen over the years

25:12

maybe more recently, where do you find

25:12

the biggest challenges to be?

25:16

In the upfront planning for

25:16

the work or is it in getting

25:19

the data or is it in processing

25:19

it, or where do you think the

25:23

biggest problems are?

25:25

Ooh, the biggest. I think in the planning

25:26

phase, to be honest it,

25:28

it's almost that transition

25:28

from, planning to fieldwork,

25:32

I think right there. I think during the planning

25:34

phase, it's easy to

25:37

understand what kind of

25:37

tests the team wants to do.

25:41

at that stage, you're still

25:41

kind of talking a bit more

25:43

theoretical without actually,

25:43

pointing at systems and saying,

25:48

I need data from this system. But once you start to kind of

25:50

transition to the data request,

25:54

like you go from planning, okay,

25:54

what kind of data do I need?

25:57

I think that's where, and then you start communicating that information to IT and

25:59

there's back and forth.

26:05

I think that's where probably the biggest bottleneck is. Because oftentimes, we

26:07

have a plan for what we

26:10

want to do with analytics. And then we start talking

26:11

to people on the ground,

26:14

and we realize, , Oh, it's

26:14

just not going to happen

26:16

for whatever reason, right? someone could be on

26:18

vacation, and they just

26:20

can't pull the data. Or we don't have access

26:22

to the data via some

26:24

system or some connection. Or it's going to take, time

26:27

for a developer to build a

26:30

report which meets our needs. These kinds of things I find

26:32

often get in the way, and

26:35

they're big bottlenecks. And it's kind of unfortunate

26:37

because it, you set out in

26:40

the planning phase to use

26:40

analytics and to do these great

26:43

and wonderful things, and then

26:43

it just gets killed in the

26:46

water like two weeks later. So there's other bottlenecks, of

26:48

course, but that's fundamental

26:52

because you just, you don't even

26:52

get the plane off the ground

26:55

and you don't even get started.

26:57

Do you find more

26:57

of your work being in

27:00

testing what you know to

27:00

be controls that should

27:02

be operating or is it more

27:02

exploratory and almost

27:08

substantive to an extent.

27:10

It's a bit of both. I can tell you a bit about who

27:11

I am and what I like to do and

27:16

how I like to get creative,

27:16

but usually in the planning

27:19

phase, I sit down, or

27:19

someone on our team sits

27:23

down with the business auditors

27:23

and they have, they generally

27:26

have a pretty good idea of

27:26

you know, the controls they

27:29

want to test and the areas

27:29

they want to go look into.

27:33

And so when, when we communicate

27:33

with them, it's like

27:37

we're going to use analytics

27:37

to test controls, right?

27:40

And we do that and we'll execute. Me personally, when I get

27:43

to the end of that, I like

27:46

to do a bit of exploration. I usually set aside 30

27:49

minutes to an hour, maybe

27:51

a bit more depending on my

27:51

schedule to just explore data.

27:56

And I think that's

27:56

just my creativity.

27:59

You know, I look at a data set

27:59

and I don't get overwhelmed.

28:03

I'm like, what's here? Like what insight is here?

28:07

That is just not

28:07

apparent to me right now.

28:10

And so, personally, I like

28:10

to spend time doing some

28:13

exploration, some analysis. And, I just like to visualize

28:15

stuff and see what patterns

28:19

are, what trends are there. And then that gets me

28:21

thinking about questions

28:23

to ask to the business. At that point, I'm kind of

28:25

veering away from traditional

28:28

internal auditing but they

28:28

are good questions if you

28:31

find any kind of insights

28:31

or anything that you

28:34

think would just warrant

28:34

a greater understanding or

28:38

exploration from the business. And that's, there's more

28:40

challenges around that, but

28:43

that's that's personally

28:43

what I like to do.

28:46

You mentioned that you use

28:46

a variety of visualization

28:51

techniques to identify things

28:51

like anomalies, outliers, etc.

28:54

And while they, I mean,

28:54

they're related to controls,

28:57

of course, but they also

28:57

give you just a broader view

29:00

of what's going on, right?

29:02

Oh, for sure. I used to be a consultant and

29:03

like the two favorite words

29:06

we had were effectiveness

29:06

and efficiency, right?

29:09

and at the time we were,

29:09

consulting with municipal

29:11

governments and we were

29:11

evaluating their programs.

29:14

And so conceivably, as an

29:14

auditor, you could look at

29:17

the different activities

29:17

within a business's programs,

29:20

and you can start to assess

29:20

how effective they are or

29:23

how efficient they are. And I find oftentimes with

29:25

data analytics or some of the

29:28

trends and the results that I

29:28

see, just even doing a simple

29:32

bar chart sometimes it gets

29:32

you thinking about the program

29:35

effectiveness so, when people

29:35

invest money with us the more

29:40

they invest they get a greater

29:40

discount on the management

29:44

fees that we charge them. And so for some of our

29:46

institutional clients we were

29:49

doing the audit there and

29:49

I was just looking at some

29:51

trends in, in the billing

29:51

and how we've billed them

29:53

over the past year.

29:55

pictures emerge, right,

29:58

because some of these contracts

29:58

have been entered into

30:01

over the course of decades. And so you start to see,

30:03

like, in some cases there's a

30:06

consistent pattern of billing,

30:06

and in other cases there's not.

30:11

And it just prompts

30:11

questions, like, how is

30:14

the program being run? Maybe there's an area to

30:15

tighten up or to standardize

30:18

some of our billing procedures. Because when I started

30:20

to look at the data...

30:23

I just started to see

30:23

some outliers, right?

30:25

So it kind of begs the question,

30:25

like what's going on here?

30:28

Why are we giving bad terms

30:28

or really good terms to

30:32

this individual client? And what does that mean

30:34

for our billing practices?

30:37

And so if you look at billing

30:37

as a whole, you might start

30:40

to ask, like, how can this

30:40

become more effective?

30:42

And by looking at even simple

30:42

bar charts it can start to,

30:46

give an understanding of. How we can make that more

30:48

effective or where we

30:50

should go investigate more.
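
A sketch of that kind of exploratory look at billing: chart the effective fee rate by client and flag outliers with a simple rule. The column names and the two-standard-deviation threshold are assumptions for illustration.

```python
# Sketch only: exploratory billing analysis with a simple outlier flag.
import pandas as pd
import matplotlib.pyplot as plt

billing = pd.read_csv("institutional_billing.csv")  # client, assets, fees_charged (assumed)
billing["fee_rate"] = billing["fees_charged"] / billing["assets"]

# Even a simple bar chart surfaces inconsistent billing patterns
by_client = billing.groupby("client")["fee_rate"].mean().sort_values()
by_client.plot(kind="bar", title="Average effective fee rate by client")
plt.tight_layout()
plt.show()

# Flag clients whose rate sits more than 2 standard deviations from the mean
mean, std = by_client.mean(), by_client.std()
outliers = by_client[(by_client - mean).abs() > 2 * std]
print(outliers)  # the clients worth asking the business about
```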

30:52

So, I know you

30:52

spoke about delving into data

30:55

privacy and obviously you're

30:55

going to be using LLMs as

30:58

your organization starts to get into that a bit more,

31:00

but where to next for you

31:03

in terms of, auditing and

31:03

what's the next big things

31:06

you, you looking forward to?

31:08

Yeah. So I kind of divided into

31:09

two, there's like pie

31:13

in the sky, dreams and

31:13

predictions, that's one thing.

31:17

But then there's like more

31:17

concrete actions and those,

31:21

the concrete actions are using

31:21

LLMs to write audit reports.

31:27

And like I said, get the tone right. Get the level of detail right.

31:32

that's definitely on our radar. The other thing that's on our

31:34

radar is just the creation

31:37

of risk and control matrices,

31:37

like I mentioned earlier.

31:41

So all these things just

31:41

revolve around audit execution.

31:45

So just speeding up audit

31:45

execution using LLMs is, is

31:51

a concrete step that we're

31:51

looking at deploying next year.

31:55

Obviously, it's

31:55

going to be new for us.

31:58

So there's a bit of exploration

31:59

that's going to have to happen, but that's,

32:01

that's kind of step one.

32:06

And then I imagine we'll

32:06

have some, success there, and

32:10

that's going to prompt us to,

32:10

rethink how we, do training

32:14

or how people do their work.

32:17

it's going to trigger a whole

32:17

bunch of other questions.

32:19

But then in terms of pie in the sky, ideally

32:22

where I want to go is basically

32:29

centralizing data into

32:29

warehouses or into the cloud.

32:36

And I'm talking about like the

32:36

entire organization's data in

32:40

one spot and then basically

32:40

strapping AI on top of it.

32:45

That's the ideal, I think,

32:45

from a strategic perspective,

32:50

an operational perspective, or

32:50

even an auditing perspective.

32:54

I was at an AI conference

32:54

maybe three weeks ago, hosted

32:58

by KPMG, and there was a

32:58

Chief Technology Officer of

33:02

a startup AI company here in

33:02

Toronto called Cohere, and

33:07

they're trying to do just that. For enterprises, it's like

33:09

a one stop shop for all

33:14

your data needs across

33:14

the entire organization.

33:18

So they're trying to create

33:18

that environment and then

33:20

strap AI on the top of it,

33:20

which I think is going to

33:24

be exceedingly powerful. So that's where I would

33:27

like Fidelity to go

33:31

Given, and it may just

33:31

be a maturity thing, but, you

33:35

know, it may actually be a real

33:35

thing for a long time, given

33:39

how not 100 percent reliable

33:39

these things can be, how do you

33:45

foresee providing assurance over

33:45

the results that are produced

33:49

so that you can rely on them

33:49

the same way that we would

33:53

rely on, humans undertaking a

33:53

particular, task that we can

33:57

see all the way through as

33:57

opposed to a bit black box ish.

34:02

I think the solution is

34:02

what's called human in the loop.

34:05

We're going to be able

34:05

to use LLMs or machine

34:08

learning to do testing. We're going to get some

34:10

results and then we have to

34:13

take those results, sit down

34:13

with someone and ask them, is

34:17

this a true exception or not? And just vet it with them,

34:20

validate it with them, and

34:23

I think that's how we get

34:23

around some of these problems.
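
A minimal sketch of that human-in-the-loop step: nothing the model or analytic flags becomes a finding until a person confirms it. The fields and the console-based review are hypothetical.

```python
# Sketch only: route model-flagged items through a human reviewer.
from dataclasses import dataclass

@dataclass
class FlaggedItem:
    txn_id: str
    reason: str           # why the analytic or model flagged it
    confirmed: bool = False
    reviewer_note: str = ""

def review_queue(items: list[FlaggedItem]) -> list[FlaggedItem]:
    """Walk each flagged item past a reviewer; keep only confirmed exceptions."""
    confirmed = []
    for item in items:
        answer = input(f"{item.txn_id}: {item.reason} - true exception? [y/n] ")
        if answer.strip().lower() == "y":
            item.confirmed = True
            item.reviewer_note = input("Note for the workpaper: ")
            confirmed.append(item)
    return confirmed

# Only confirmed items move on to reporting
queue = [FlaggedItem("TXN-001", "missed second review"),
         FlaggedItem("TXN-002", "amount above threshold")]
findings = review_queue(queue)
```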

34:27

When it comes to LLMs, there's this

34:27

term called hallucination.

34:31

And so, the model might be giving

34:31

you some wild, crazy results.

34:35

That can be tuned

34:35

up or tuned down.

34:39

But again, I think the solution

34:39

is just to get confirmations,

34:42

old school audit confirmations

34:42

with their clients.

34:46

So it could be that

34:46

we take some of the time

34:48

that we save and direct it

34:48

to deeper or broader levels

34:54

of quality assurance than

34:54

we need to do right now.

34:57

I think so. Yeah. I would hope that audit

34:58

departments and teams are

35:01

already vetting findings

35:01

from any analytics with,

35:05

their clients before going

35:05

to the reporting stage.

35:07

But yeah, conceivably we'll

35:07

have more time to do that.

35:11

either on the audits or if we're

35:11

doing more continuous monitoring

35:15

type of activities as well.

35:18

Fantastic. Hassan, we are out of time.

35:20

Excellent conversation today. Thank you for that

35:21

and for joining us. if anybody wants to find out

35:24

more about you and the sort of

35:26

work you do, you know, where's

35:26

the best place to find you?

35:29

Find me on LinkedIn,

35:29

just DM me or send me a message.

35:34

Excellent. We'll put a link to

35:34

your LinkedIn profile in

35:37

the show notes as well. Hassan, thank you very

35:38

much for joining us.

35:40

Yusuf, thanks again. This is great.
