#263: Analytics the Right Way

Released Tuesday, 21st January 2025

Episode Transcript


0:00

Welcome to the Analytics

0:02

Power Hour. Analytics

0:04

topics covered conversationally and

0:07

sometimes with explicit

0:09

language. Hey everyone,

0:12

and welcome to the

0:14

Analytics Power Hour. This is

0:16

episode 263, and I'm Val Kroll

0:19

from Facts and Feelings. You know,

0:21

writing can be hard. While I

0:23

am absolutely just opening the show

0:26

with some... off-the-cuff, extemporaneous remarks, it's

0:28

not hard at all for me

0:30

to imagine a world where the

0:33

intro that we do for every

0:35

episode is carefully written out ahead

0:37

of time. But that definitely wasn't done

0:39

here. Nope. I'm totally freestyling and

0:41

free associating. And that's how this

0:43

Tim-style rambling I'm doing, which just

0:46

happens to be the topic of

0:48

writing, is a nice transition to

0:50

what this episode is all about. It's

0:52

a first for the Analytics Power

0:54

Hour. And no, I don't

0:56

mean because it's the first time

0:58

I've done the show opening, it's

1:01

because we've secured an exclusive designation

1:03

as the official podcast for what

1:05

is sure to be the most talked

1:07

about Analytics book of 2025.

1:09

The book, you might ask? Analytics

1:11

the Right Way: A Business Leader's

1:14

Guide to Putting Data to Productive

1:16

Use. I'm joined today by Julie

1:18

Hoyer from Further for this

1:20

discussion. Julie, are you excited

1:22

to talk with these book

1:24

authors? Oh my gosh, absolutely.

1:26

Have been waiting for this

1:28

all, what, month? Very nice. I'm

1:31

also joined by Tim Wilson,

1:33

my colleague from Facts and

1:35

Feelings, and he's more of

1:37

a guest today than a

1:39

co-host because he's one of

1:41

the co-authors of the book.

1:43

Tim, welcome to the show,

1:45

I guess. Hopefully this is

1:47

the last time we'll use this

1:50

little gimmick maybe. We'll stop doing

1:52

cool shit. We won't have you

1:54

on as a guest. How about that?

1:56

Oh. No, never. You'd never. And we're

1:58

joined by Tim's co-author, Dr. Joe

2:00

Sutherland. In addition to working with corporate

2:03

executives as a consultant and advisor, Joe

2:05

founded the Center for AI Learning at

2:07

Emory University, where he also teaches, and

2:09

as it happens, Julie and I both

2:11

got to be his students in a

2:14

way when we worked with him together

2:16

at Search Discovery, now Further. Joe has

2:18

a list of credentials that is frankly

2:20

kind of intimidating. Let's see if I

2:22

can get through it. He has one

2:25

political science degree from Washington University in

2:27

St. Louis and three more, including a

2:29

couple of doctorates from Columbia. He's a

2:31

fellow at the Weidenbaum Center on the

2:33

Economy, Government, and Public Policy at WashU.

2:36

He worked in the Obama White House

2:38

from 2011 to 2013, casual. He published

2:40

academic papers all over the place. He's

2:42

been on this podcast three times now,

2:44

believe it or not. That's an accomplishment.

2:46

Sure is. Yeah, but intimidating, not really.

2:49

If you know Joe, he's not scary

2:51

at all. Today, we get to welcome

2:53

him as our guest. Welcome to the

2:55

show, Dr. Joe. Thank you very much.

2:57

It's good to be back. That's the

3:00

reason we wrote the book, actually, was

3:02

because Tim dangled the podcast appearances. He

3:04

said, hey, you'll actually be... all we

3:06

have to do. So let me on

3:08

as a guest. I love it. I

3:11

just need to bring somebody with some

3:13

real credentials. That's, that was the, yeah.

3:15

That's the hook. I love it. I'm

3:17

so excited for this one. So I

3:19

guess a good place to start would

3:22

be asking you guys just a little

3:24

bit about how this book came to

3:26

be. I know you guys worked together

3:28

at Search Discovery because I was there

3:30

to see it, had the privilege to

3:33

see it, but this didn't come together

3:35

till a few years later. So I'm

3:37

curious kind of how it started, a

3:39

little bit of the origin story, and

3:41

what did you guys see that was

3:44

not out there in this space that

3:46

you wanted to kind of address with

3:48

Analytics the Right Way? That is a

3:50

great question. I actually have a specific

3:52

memory of when this book like hatched

3:55

in my mind, which is I was

3:57

like on my back patio on the

3:59

phone with Tim, this is like years

4:01

ago, and I think one of us

4:03

just goes, we should write a book.

4:05

And I mean, it's true. And the

4:08

truth is, like, I do think we're

4:10

ideologically aligned in so many ways when

4:12

it comes to, like, the practice

4:14

of data and analysis and machine

4:16

learning and artificial intelligence, all these things

4:18

that you hear about today. And

4:20

I just knew that by coming

4:22

together with Tim, something wonderful would

4:24

be made. And where it went

4:26

to, right, was: I get a

4:28

lot of these customers, clients, or

4:31

folks, you know, I guess I

4:33

encounter a lot of them at

4:35

the center all the time, who go,

4:37

I'm ready for AI. Can I get into

4:39

it right now? Let me just buy it. Like,

4:41

you know, like, let's do it. And you know,

4:43

they never asked the question like,

4:45

well, what are you actually trying

4:47

to achieve? And how do we

4:49

get there first? And do you

4:52

even have the data availability? Have

4:54

you thought through where your investments

4:56

need to go? And I actually

4:58

think that the principles behind, like,

5:00

making our way towards these artificial

5:02

intelligence projects and capabilities at companies,

5:04

which are truly transformational, the principles

5:06

are universal. I mean, you can

5:08

really link them back to any

5:10

data or analytics question. And I

5:12

wanted to give, you know, the

5:15

corporate executives of the world and any sort

5:17

of business leaders, I wanted to give them a

5:19

book that would basically say, hey, look. Read this

5:21

and/or give it to your people,

5:23

have them read it, and you'll get

5:25

there. That's kind of what I was

5:27

hoping to get out of it when we

5:30

started. And that's no small like task

5:32

either. That is a lofty goal. Well, I

5:34

mean, I think part of what happened Joe

5:36

and I met like he was thinking

5:38

about like the introduction happened.

5:41

I remember sitting in Atlanta

5:43

in a conference room, me thinking,

5:45

this guy's gonna make me feel stupid.

5:47

We hit it off, and then as

5:49

we worked together I have some very

5:52

clear memories of sort of having an

5:54

expectation that when you're

5:56

bringing a data scientist and

5:58

that's kind of what Joe's sort

6:00

of the role, the branding

6:03

we were using for him at the

6:05

time was data scientist and I had

6:07

I'd gone through this journey on my

6:10

own where I was going to try

6:12

to become a data scientist like a

6:14

few years before and kind of realized

6:16

after a few years, like, no, I

6:19

can do really useful stuff, but I'm

6:21

not really going to be ever something

6:23

that I would consider a data scientist.

6:25

But I had this expectation that when

6:28

you talk to a data scientist, they're

6:30

going to start immediately talking about models

6:32

and methods and, you know, the vast

6:34

quantities of data and the number of

6:37

times that Joe would get brought in

6:39

and there would be somebody, we want

6:41

to do an X, we want to

6:43

do machine learning, we want to build

6:46

a model that, and he very consistently

6:48

would say, we first have to define

6:50

the problem. We have to frame the

6:52

problem. And so having someone who had

6:55

all the horsepower to go

6:57

super deep. And I think

6:59

Julie, you might have even lived it

7:01

more than I did. Like, yeah, he

7:04

can really go super deep on the

7:06

technical, but he was always saying the way

7:08

companies tend to fall down is they

7:11

skip that clarity on what they're trying

7:13

to do. What are their ideas? So

7:15

we had, while traveling, while just doing

7:17

catch-ups, we had many, many, there are

7:20

many memories in my mind of Joe

7:22

and I sitting across from each other

7:24

at a coffee shop, at a restaurant,

7:26

at a bar, having these discussions where

7:29

I was actually learning a lot. He

7:31

introduced me to like the fundamentals of

7:33

like causal inference, which kind of blew

7:35

my mind. And I was like, oh,

7:38

this is a very important idea, not

7:40

all of the mechanics and the details

7:42

that go into it. Just the basic

7:44

ideas behind what you're trying to do

7:47

and why you're trying to do it

7:49

is really, you know, powerful. So I'd

7:51

had an idea to write a book

7:53

eight or nine years ago. This book

7:56

has very prominent vestiges of that. It

7:58

is a much, much richer book, because

8:00

there was a lot more depth of

8:03

thought, a lot more experience, a lot

8:05

more collaboration on a much broader and

8:07

deeper set of projects going into it.

8:09

But it was, it's not a book

8:12

to say this is going to teach

8:14

you data science, but it's also not

8:16

a kind of lofty hand-waving book that

8:18

is just, you know, get all the

8:21

data and get all the data super

8:23

clean. We really wanted to write one

8:25

as Joe said for kind of the

8:27

business manager, the business leader, the business

8:30

executive so that they are positioned to

8:32

actually get value out of their data,

8:34

out of their analytics in a productive

8:36

and efficient way. So that's interesting.

8:39

You both called out the audience kind

8:41

of in your description there. And I

8:43

think that that's a really interesting choice

8:46

because you think, oh, I'm going to

8:48

write an analytics book, I'm going to

8:50

write it to my people, to my

8:53

analytics cohort and professionals. How come you

8:55

guys made that choice? Was that kind

8:57

of always there from the beginning or

9:00

did that kind of come together as

9:02

you were starting to frame out what some

9:04

of the topics you're going to dive

9:06

into were? I mean we make so many

9:08

points right and I think that they're

9:10

all like just new mental models for

9:12

thinking. That was one of the

9:14

reasons I love the collaboration with you

9:17

Tim was like we just developed a

9:19

really cool new mental models for how

9:21

to think about the world and how

9:23

to think about data and analytics and

9:25

all those exercises that we go through

9:27

in corporate America but you know a

9:29

few thoughts but one is I've realized

9:31

more over the past few years that

9:33

there is the zeitgeist in the analytics

9:35

or IT or technology industry vertical

9:37

what have you where in a lot of

9:39

ways you feel like you could just purchase

9:42

insight like you know I don't know

9:44

I feel like it comes from a

9:46

variety of forces right and we talk

9:48

about this in the book where there

9:50

it's not like there's some sort of

9:52

bad actor out there who's trying to

9:54

convince you to buy their product when

9:56

it really doesn't create any value at

9:58

all right there there is a reason

10:00

why these things happen, but

10:02

I just don't get the

10:04

sense that as a business

10:06

leader these days, you can

10:08

always trust everything that comes

10:10

from your tech or analytics

10:12

or data folks without understanding

10:14

sort of the more fundamental

10:16

concepts. I'd be curious to

10:18

know your thoughts about that

10:20

too. Yeah, I mean, we

10:22

definitely had a lot of

10:24

discussions about this and I'm in

10:26

a spot that having in many

10:29

ways, kind of facts and feelings

10:31

and kind of the drive behind

10:33

facts and feelings. The consultancy

10:36

that Val and I and

10:38

Maddie Wish now started is

10:40

it's aligned with that that there

10:42

is there are just forces

10:44

that are naturally happening in

10:46

the world in the business

10:48

world that kind of over

10:50

index towards collect more data,

10:53

run fancier models, find

10:55

more technology, hire more

10:57

data scientists, push to

10:59

do more. And it just seemed

11:01

like to us when we were

11:04

working with clients that there was

11:06

kind of they were trying to

11:08

start on step four and

11:10

they'd skipped steps one, two

11:12

and three. And even if the

11:15

analyst or the data

11:17

scientist was trying to go back

11:19

to steps one, two and three.

11:22

which is around thinking, and this

11:24

is not, there's not like a

11:26

six step thing in the book,

11:28

so that's a metaphorical steps

11:30

one, two and three, that

11:32

that's really where the most

11:34

opportunity to kind of redirect an

11:37

organization's investment is much

11:39

more about getting the business owners

11:41

who are trying to get value

11:43

out of the data. If they

11:45

get off the hook and get

11:47

told to just lob it to

11:49

the analytics team and say bring

11:51

me some value, having grown up

11:53

in that analytics world and feeling

11:55

how difficult that is and sort

11:57

of slowly realizing that, oh, it's

11:59

because we're not putting enough up

12:01

front thought into it. So

12:03

even though the audience is kind

12:06

of the, you know, business leaders,

12:08

we certainly think analysts

12:10

and data scientists who read

12:12

it, it will hopefully help them

12:15

think differently as well and give

12:17

them confidence to say, no, no,

12:19

I have to go engage more

12:22

farther upstream. We have to have

12:24

clarity of: wait a minute, is this

12:26

an analytics ask? Am I

12:29

trying to just like objectively and

12:31

concisely measure the performance of a

12:33

campaign? Or am I actually trying to,

12:36

you know, figure out something to

12:38

make a decision going forward

12:40

and giving everyone kind of

12:42

a little more clarity of language

12:44

and, you know, ways to interact, but

12:47

it really does go to a lot

12:49

of that burden falls on the business

12:51

with the layer of I think there's...

12:54

I think we agreed there

12:56

were some sort of

12:58

fundamental misconceptions that the industry

13:00

has. Analysts have it as well.

13:02

Often, business tends to have it

13:04

as well. More data is better.

13:07

If I have all the data,

13:09

you'll build me a perfect model.

13:11

You'll get to an unambiguous truth.

13:13

So I think there is a

13:15

level of like statistical fluency

13:18

that they're not super difficult

13:20

ideas. They're kind of mind-blowing.

13:22

That's the nerd in me,

13:25

like the potential outcomes

13:27

framework, like, boy, give me

13:29

that second cocktail and get

13:32

the wrong person in the corner

13:34

and they are, they are,

13:36

yeah. I talked about counterfactuals like

13:38

four miles into a seven mile

13:41

run with my trainer. Like,

13:43

where was she going to go?

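For anyone who wants the compact version of the potential outcomes framework being referenced, a minimal sketch: for a unit $i$ with a binary treatment $T_i \in \{0, 1\}$, there are two potential outcomes, $Y_i(1)$ and $Y_i(0)$, and only the one matching the treatment actually received is ever observed; the unobserved one is the counterfactual. Under randomized assignment, the average treatment effect is still identifiable from observed group means:

$$\mathrm{ATE} = \mathbb{E}[Y(1) - Y(0)] = \mathbb{E}[Y \mid T = 1] - \mathbb{E}[Y \mid T = 0].$$
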
13:45

You know, she, uh, no, no,

13:47

no. But just to jump in

13:49

on that, like, there are more

13:52

pieces to this book that I

13:54

just, I want to

13:56

communicate to the audience.

13:58

Number one. So my wife

14:00

reviewed the manuscript and she goes, Joe,

14:03

this is kind of like your philosophy

14:05

on life in a treatise, like in

14:07

a statistics like, you know, sort of

14:09

framework. And I think there are just

14:12

a lot of really cool, it's not

14:14

dispensations, like we dispel a lot of

14:16

the misconceptions that will help you, I

14:18

almost feel like, live your life like.

14:20

better. It's hard to describe. Like, you

14:23

know, I think back to, you know,

14:25

I got all these degrees, right? And

14:27

I only have one doctorate, by the

14:29

way, just just just, I don't have

14:32

multiple doctorates. Just one. Oh, shit. We'll

14:34

have to re-record that. No, no, you

14:36

don't have to re-record. It's fine. But

14:38

the, you know, it's like, I always

14:40

thought, like, why did I get a

14:43

degree in statistics, or at least,

14:45

you know, with a methodological

14:47

focus in statistics and applications of machine

14:49

learning? Like, why did I do that? And

14:51

it's like I really do think it

14:54

was to sort of, like, self-soothe and

14:56

cope with the natural like OCD impulses

14:58

and anxieties of life that I've experienced

15:00

my whole life. Like, once I understood the

15:03

world in probabilities and sort of through

15:05

the framework of a probabilistic approach like

15:07

It made my life better. I took

15:09

things less personally. Like I made better

15:11

decisions and I really do believe that

15:14

the way that we think about the

15:16

world for this book is actually going

15:18

to be really helpful to people. So

15:20

that's one point that I want to

15:23

make. I mean I just took up

15:25

like photography for self-soothing but you know

15:27

if you got to go get you

15:29

know some variable number of advanced degrees

15:31

you know to each their own. Well

15:34

I'm glad you guys called out too

15:36

that like this is still valuable for

15:38

analysts to read especially because now I

15:40

can't wait to just like buy this

15:43

and make all the analysts I know

15:45

read it I'm so excited about that

15:47

so I'm glad you touched on that

15:49

because that was going to be my

15:51

follow-up question but the other thing you

15:54

guys already mentioned that I really wanted

15:56

to touch on was the misconception so

15:58

Going through, that was one of the

16:00

opening parts that I love the most,

16:02

is that you guys broke down, like,

16:05

what are the misconceptions

16:07

of, like, how we got here? Like, why is

16:09

it the way it is? And so without giving

16:11

away too much, I didn't know if you guys

16:13

wanted to, like, dive into the ones you

16:15

mentioned. We can't. I don't think we're, I

16:17

mean, it's so eloquently put in the book,

16:20

that even if we just kind of off

16:22

the cuff try to rattle them off.

16:26

And I'll give a little credit

16:28

to Matt Gershoff on this as

16:31

well when years ago at a

16:33

Superweek and he said, you

16:35

know, these three things: that business is

16:37

about making decisions under conditions of

16:39

uncertainty, there's a cost to reducing

16:41

uncertainty and uncertainty can't be eliminated.

16:43

So I sort of had that

16:45

this, he kind of introduced me to

16:47

this idea that the goal was not

16:50

to eliminate uncertainty and there are

16:52

diminishing returns and I still think

16:54

that is like, that is a huge thing.

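The diminishing-returns point has a simple quantitative face. The standard error of a mean estimate shrinks only with the square root of the sample size,

$$\mathrm{SE} = \frac{\sigma}{\sqrt{n}},$$

so halving your uncertainty costs roughly four times the data, and eliminating it entirely would take infinite data; uncertainty can be reduced, at a cost, but never removed.
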
16:56

Like we've lived

16:59

that where people say what is

17:01

the answer and you have way

17:03

too many data professionals

17:06

walking around, you know, quoting Deming,

17:08

you know, saying, in God we trust;

17:10

all others must bring data. And

17:12

they just kind of wield a

17:14

misunderstanding of that as though, you

17:16

know, you have a, without data,

17:19

you're just another person with an

17:21

opinion, you know, F you. And

17:23

I'm like, well, that has perpetuated

17:25

this huge misconception that, that the data

17:27

gives you an objective truth. And

17:29

it just. It's just never perfect

17:31

data. So even getting truths about

17:34

the past, which aren't that useful,

17:36

it's never truly perfect. And

17:38

certainly, truths about tomorrow, you're

17:40

just not going to get. And

17:43

it's like, even though people say, yeah,

17:45

that totally makes sense, but we

17:47

just operate where when the analyst says,

17:49

I don't know, I can't, I can't give

17:51

you a definitive answer. So to

17:53

me, that's probably like one of

17:55

my favorite misconceptions is that this

17:58

gold rush for data is because

18:00

it's going to let us eliminate

18:02

or essentially eliminate uncertainty, which is

18:04

just a fool's errand, but that

18:06

is what the industry is doing.

18:08

So there's one of my favorite

18:10

misconceptions. I don't know. You want

18:12

to do one, Joe? Yeah, your

18:14

take, Dr. Joe. Let me tell

18:17

the, uh, the age-old economist's joke,

18:19

because I actually, I love, I

18:21

actually love this one. And also

18:23

it does link back to the

18:25

misconceptions. Yeah, so one of my

18:27

favorite misconceptions just comes with this

18:29

idea that data are inherently unbiased

18:31

and as a trained statistician and economist

18:33

I could tell you that's just

18:35

totally false. There's actually a great

18:37

economist joke that goes as follows.

18:40

So CEO of a major company,

18:42

you know, he's hiring for a

18:44

role. He brings in three folks

18:46

to interview for the job, a

18:48

mathematician, a statistician, and an economist.

18:50

You know, CEO calls the first

18:52

guy in, he's the mathematician, he

18:54

says, look, what does two plus

18:56

two equal? And the mathematician goes,

18:58

well, it's four, of course. CEO

19:00

goes, four, are you sure? He

19:03

goes, yes, exactly four. That's exactly

19:05

what the answer is. CEO is

19:07

not pleased, calls in the statistician.

19:09

He says, what's two plus two

19:11

equal? Four, two plus two equals four, says the

19:13

statistician. The CEO is still not

19:15

pleased. So he calls the economist

19:17

and gives him the same question

19:19

and the economist gets up from

19:21

the chair, he looks around very

19:23

sneakily, he closes the door, closes

19:26

the shade, sits right next to

19:28

the CEO and goes, what do

19:30

you want it to equal? And

19:32

I just, it's so true. Oh,

19:34

oh, there's a laugh? Who did

19:36

that? That's cool. We've leveled up

19:38

the production of this since your

19:40

last time you came on the

19:42

show. So this is like, I

19:44

feel like we just entered the

19:46

new, well, this is, well, you

19:49

know, 2K, new millennia. Like, but

19:51

I actually, I love that joke

19:53

because there's this old adage too.

19:55

It's like, with enough torturing,

19:57

the data will confess whatever you

19:59

want them to confess, right? And

20:01

that's just the truth about data.

20:03

So stop thinking about it as

20:05

something that's inherently unbiased. It's

20:07

how you deal with it and

20:09

how you build confidence in your

20:11

methodology that really lets you get

20:14

to the right answer. I love that.

20:16

It sounds like you guys have a lot

20:18

of things that you're packing into this book

20:20

that you're packaging for these business leaders. Like

20:22

how do you... How did you walk them

20:25

through this? Like was there an overarching like

20:27

framework that you leveraged? Because I think

20:29

it's an intuitive one from, you know, working with

20:31

you all over the years. But I think

20:34

it would be helpful if you guys talked

20:36

that through a little bit if that was

20:38

one of the mechanisms that was kind of

20:40

driving the narrative and how you were

20:43

packaging it up. Sure. I mean, when

20:45

it comes to the, the, the outline

20:47

that was in the book proposal, this

20:49

just kind of amuses me. I

20:51

get sort of irritated with business books

20:53

that feel like there's way too much

20:56

wind up where they're like you're into

20:58

like the fourth chapter and they're still

21:00

telling you what they're gonna tell you

21:02

in the book. We actually did have

21:04

to add like an additional introductory chapter

21:07

because we had so much to say

21:09

so I'm sure every author says well

21:11

we're not guilty of that but you

21:13

know like so there is that part of

21:15

just from the structure of the book is

21:18

there are a couple chapters up front

21:20

trying to say a lot of the common

21:22

ways of behaving are problematic

21:24

and let's help you understand,

21:26

you know, why those are

21:29

problematic. Then kind of the

21:31

core of the book, it is kind

21:33

of a framework trying to keep things

21:35

as simple as possible, which is,

21:37

and I've talked about pieces of

21:39

this on many episodes of the

21:42

Analytics Power Hour podcast in

21:44

the past, but that fundamentally

21:46

when you're trying to put

21:48

data to use for an

21:50

organization, there are kind of three

21:52

discrete things you can do.

21:54

You can be trying to

21:56

measure performance objectively and concisely,

21:59

which so many organizations

22:01

really, really struggle to do well.

22:03

They may have a lot of

22:06

reports and dashboards, but they're not

22:08

really doing a good job of

22:10

objectively saying, how are we doing

22:13

relative to our expectations in a

22:15

meaningful way? There's validating

22:17

hypotheses. That's like the analysis

22:20

or testing or that's got

22:22

multiple chapters devoted to it

22:24

because that's where we're trying

22:26

to make better decisions going

22:28

forward. Lots of ways to

22:30

validate hypotheses. I think the,

22:32

at least in marketing, there's

22:34

a lot of talking about, you

22:37

know, if you're doing AB tests

22:39

on a website, they'll say, what's

22:41

your hypothesis? Well, everything that we're

22:43

doing with lots of different techniques,

22:45

it really should be grounded in

22:48

validating a hypothesis. And then

22:50

the third is, you have data that

22:52

is just part of a process.

22:54

It's enabling some operational process. And

22:56

those sort of, sort of, they

22:59

fit together. Interestingly, and we did

23:01

do a lot of kind of thinking

23:03

and talking about how to talk about

23:05

AI. This is not a book that

23:07

is AI, AI, AI, AI, AI, AI,

23:09

AI, because we went in saying AI,

23:11

its purpose, it delivers value

23:14

because it is part of

23:16

an operational process. So it

23:18

actually fits in this one

23:20

area. And so everyone who's,

23:22

you know, if you're super

23:24

excited about AI, it's not

23:26

doing a whole lot. It

23:28

might be a code assistant

23:30

or something on validating a

23:32

hypothesis to develop some code

23:34

for a model, but it's

23:36

not like AI replaces the

23:38

analyst because those other two

23:40

measuring performance and validating hypotheses

23:42

really are much more about kind

23:45

of human thought. You know, the

23:47

2024 in retrospect was just you know

23:49

It was the year of agentic AI

23:52

right? It was all everybody was very

23:54

interested. How do we use

23:56

large language models to replace analysts and

23:59

replace people? And, you know, the truth

24:01

is, like, it really preys upon the

24:03

super lazy impulse that I think we

24:05

have as like a huge, you know,

24:08

as a human species in society, right,

24:10

which is like, man, if I can

24:12

just create a machine that could just

24:14

do my work for me and delegate

24:17

the work to the machine, like I

24:19

can go golf, you know, while it

24:21

does all the super valuable stuff that

24:23

I was doing, right? I'll just go

24:26

golf. And like the, if you read

24:28

through the book, you'll actually see, it demonstrates

24:30

why that is, like, a super, like,

24:32

it's just not true. You could never

24:35

really do that. Actually, I'll jump in

24:37

on we kind of talk a little

24:39

bit about this in the book But

24:41

there's a guy who got the Nobel

24:44

Prize back in the 70s his name

24:46

was Herbert Simon and he had this

24:48

idea that, as a society, what

24:50

we do when we're looking for the

24:53

answer to make a decision, we sort

24:55

of just look in like our local

24:57

area and space and talk to our

24:59

friends. Good examples, like if you're trying

25:02

to find your ideal soulmate, you know,

25:04

so you can marry them, like most

25:06

of us don't go and date like

25:08

the 8 billion people in the world,

25:11

right? Like, and find the best one.

25:13

What we do is we go and

25:15

we ask our friends and we kind

25:18

of go, and it's somebody who's on

25:20

the periphery of your social circle that

25:22

you know growing up. Like, we look

25:24

to find the best possible alternative that's

25:27

just in the local area where we're

25:29

looking. We're baked in with these impulses

25:31

and intuitions. But, you know, machines, like,

25:33

to find the best option and to

25:36

make the best estimate of what might

25:38

happen in the future if a decision

25:40

is made, they have to search like

25:42

the entire space of possible outcomes and

25:45

opportunities. It's kind of like we often

25:47

refer to as boiling the ocean. And

25:49

it's virtually impossible, right, to be able

25:51

to make really, really incisive decisions and

25:54

insights with an approach that boils

25:56

the ocean. It's actually just not even

25:58

really feasible within the amount of time

26:00

that we have available to make those

26:03

decisions. And so I'm not sure why

26:05

I got into that, but I thought

26:07

it was important. Oh, agentic AI, that's

26:09

what it was. The takeaway there was

26:12

just, you know, I do think that

26:14

people with this artificial intelligence revolution happening

26:16

assume that we can delegate

26:18

to the machine, but the truth is

26:20

you're still going to have to go

26:23

through the decision-making processes that we articulate

26:25

in the book. I literally saw a

26:27

post on LinkedIn that said, because there's

26:30

so much around like the generative BI

26:32

and oh, the simple questions. And it's

26:34

like, you know, for instance, if I

26:37

want to know how many leads came

26:39

from California last month, I should be

26:41

able to do a natural language query.

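For the record, the "simple question" in that example is already a one-liner once the data model is known; a hypothetical sketch, with the table and column names invented:

```python
# Hypothetical illustration: table and column names are invented.
# This is the query a natural-language layer would be generating for
# "how many leads came from California last month."
QUERY = """
SELECT COUNT(*) AS lead_count
FROM leads
WHERE state = 'CA'
  AND created_at >= DATE_TRUNC('month', CURRENT_DATE) - INTERVAL '1 month'
  AND created_at <  DATE_TRUNC('month', CURRENT_DATE)
"""
```
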
26:43

And I'm like, that's literally no

26:45

one is saying, I have very

26:47

simple, straightforward-to-define questions, and

26:49

it's taking me so long to

26:51

get to it. So that's to

26:53

me, it's kind of saying, well,

26:56

these dots, they're close enough to

26:58

connecting. Let me go ahead and make

27:00

the leap that I can just, you know, ask,

27:02

if I could just ask the AI to

27:04

give me, you know, insights, and then

27:06

it's like, for instance. I'm like, well,

27:08

well, getting told how many leads

27:10

there were from California last

27:13

month is rarely the type of

27:15

question that takes you anywhere. Let me

27:17

actually, I do want to dig

27:19

in on this, because like, what

27:21

you're describing actually, I would think

27:23

of as, it's not even an

27:25

insight generation using artificial intelligence, what

27:27

you're describing is the process of

27:30

doing the query to get the

27:32

answer is just being replaced, right,

27:34

by some sort of generative AI

27:36

technology. So, like, you know, that

27:38

actually is consistent within our book,

27:40

you know, we kind of break

27:42

out insight generation from operationalization and

27:44

operational technologies enabling automation. you know

27:46

that example you just gave I actually would

27:48

throw it in the bucket of yeah it is

27:50

sort of like an operation enablement problem right which

27:53

is just oh we need to get to the

27:55

query faster right and that to me is consistent

27:57

with the use of AI but yeah it's it's

27:59

fine to do it it's just to

28:01

paint that as saying, and

28:03

this is what's going to replace. I'm

28:05

like, no,

28:08

you're actually missing the boat

28:10

on what should be going into

28:12

actually getting real value out

28:14

of this. If you think that

28:16

it's following that path is

28:18

what's going to do it. Well,

28:20

I want to draw back on something,

28:22

too, that you guys mentioned. You called

28:25

them decision making frameworks and I'm lucky

28:27

enough to have worked with you all.

28:29

And so it's very ingrained in me,

28:31

but I run across this a lot.

28:33

So where people talk about the value of a

28:35

new product, a new technology, it's like, it's

28:37

going to give us insights. And then you

28:39

ask them more about it. And they say,

28:41

Oh, it's going to give us knowledge. We'll

28:43

know what's going on. There's value in knowing.

28:45

And it's like, to a point, there's value

28:47

to knowing, but I'm like, the real value

28:49

comes when you act on the knowledge, right?

28:51

Like, and you guys make very clear distinction

28:53

about that, especially in this framework. And I

28:55

think that's how we as a small group

28:57

here today think about it. But it's still shocking

28:59

to me that I've run into a

29:01

lot of people that I have to make

29:03

that argument to and really say like,

29:06

I think there's one step farther. And like,

29:08

so when we talk about value to

29:10

clients and of different services and things as

29:12

consultants, like them being able to go

29:14

take action on what we've, you know, help

29:16

them learn, like that is really

29:18

the end point. And I'm, I think

29:20

a lot of people's minds will

29:22

be very blown and like opened to

29:24

that by reading this book, which

29:26

I am so excited about. Well, and

29:28

I do think what, what

29:31

happens and, and we have,

29:33

well, trying to explain sort

29:35

of counterfactuals, potential outcomes framework,

29:37

which I mean, I think Joe was

29:39

like, rein it in, rein it in.

29:41

You're excited about this, but spin off book,

29:43

but, but what's the 'right way' part?

29:45

Are we doing another book now? Book number

29:47

two. We just have to do right now.

29:50

I remember when we were on the panel.

29:52

Well, I mean, to be fair, there were

29:54

lots of things where he just wants to

29:56

come back a fourth time. I

29:58

was the one who was

30:00

like, Hey, if I

30:02

come back I heard there's a gift. There's

30:04

a, you get the jacket. Yeah. There's a jacket.

30:07

Okay. Okay. But I would often take a crack

30:09

at saying, I'm going to try to describe this

30:11

because I am closer to the less deeply

30:13

immersed in the mechanics of this.

30:16

Joe would come back and basically

30:18

in there, there are footnotes in

30:20

the, but we had fun with

30:22

the footnotes, but there are lots

30:24

of times where we're like, if

30:26

a trained statistician, we are taking a

30:28

shortcut. It is not material for

30:30

what we think people need to

30:32

know. Joe does have his reputation,

30:34

you know, so it would be

30:37

in the footnote and say, look,

30:39

this is, you know, technically not

30:41

quite correct, but it's good enough.

30:43

And that, which I think goes to a

30:45

lot of what we were trying to do

30:47

with the book, but I've seen that

30:49

again and again when someone says,

30:51

oh, we're just going to make this

30:54

change and we'll just see if it

30:56

worked or not. And we sort of walked

30:58

through an example of saying, well, what if

31:00

you make a change and this is what

31:02

the data looks like? Because it usually

31:05

looks, it's not some abruptly massive step

31:07

function that says, look, we changed the

31:09

button color on this page and revenue

31:11

jumped way up. I deeply believe that's

31:13

what as human beings we think is

31:16

going to happen. We're going to do

31:18

something and it's going to have this

31:20

abrupt, sudden, immediate impact, and we'll be

31:22

able to look at the chart and

31:24

the chart will kind of go along

31:27

and it'll have a big jump and

31:29

go on after and we'll say see

31:31

that's, that's what happened. That doesn't happen,

31:33

and so helping people understand that that

31:35

it's like, no, if you're trying

31:38

to, if you're gonna have an intervention,

31:40

if you're going to do something and

31:42

you want to see whether or not

31:44

it worked or not you can't just

31:46

say let me do it and then I'll wait and

31:48

look at it afterwards and we'll have a

31:50

you know we'll just it's gonna be obvious like

31:52

it's not obvious and then it gets dumped

31:54

on the analyst to say well figure out

31:56

the answer anyway and well the easiest way to

31:59

figure out the answer would have been to

32:01

think about how you were going to

32:03

answer that question before you actually made the change?

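A minimal simulation (every number invented) makes the point concrete: a real but modest lift on a noisy daily metric produces no visible step in the chart, which is why the measurement plan has to exist before the change does.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily metric: 60 days before and after a change that
# produces a real ~3% lift, buried in ordinary day-to-day noise.
pre = rng.normal(loc=1000, scale=80, size=60)
post = rng.normal(loc=1030, scale=80, size=60)

lift = post.mean() - pre.mean()
se = np.sqrt(pre.var(ddof=1) / pre.size + post.var(ddof=1) / post.size)

print(f"observed lift: {lift:.1f} +/- {1.96 * se:.1f} (95% CI)")
# The interval is wide relative to the lift: no chart shows a clean
# step, so "did it work?" needs a design chosen before the change.
```
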
32:05

Well, it's amazing, right?

32:08

It's like, think about how you'd want

32:10

to answer the question before you

32:12

even try to answer it or get

32:14

into it, right? But, number one, if you

32:17

don't do it, and then you

32:19

force an analyst who, you know, God

32:21

forbid, hasn't had the experiences that we've

32:23

had in the wild west of data

32:26

analytics like you might end up

32:28

having somebody who looks at what happened

32:30

and you know you draw the wrong

32:32

conclusion right that's kind of the

32:34

risk like oh when we cut our

32:37

investment in sales professionals in the southeast

32:39

like our efficiency went way

32:41

up well let's just cut more the

32:43

conclusions kind of can be wrong

32:46

and if you don't think about

32:48

like the appropriate inferential framework You might

32:50

get to the point where you say,

32:52

well, we made that decision. We're

32:54

going to skip the process where we

32:57

vet it and figure out if the

32:59

inference was a solid inference. We're

33:01

just going to go right ahead to

33:03

automation. We're going to throw this into

33:06

the machines. We're going to have them

33:08

automate it to oblivion, right? And

33:10

then all of a sudden you get

33:12

somebody who's got no real-time feedback. And

33:15

I really think there's... There's a

33:17

risk, especially in this era of automation,

33:19

that we skip to this human out

33:21

of the loop stuff way too

33:23

fast, simply because we drew the wrong

33:26

inference. And part of the book is

33:28

thinking about how to slow that

33:30

process down. One of the parts too,

33:32

and we have teased it here before,

33:35

and I know we've had a

33:37

lot of conversations about it, and like

33:39

other sidebars, and so I'm really excited

33:41

to ask you guys about this,

33:43

is your ladder of evidence. Because I

33:46

know that is not easy to come

33:48

up with. I have had other conversations

33:50

with people, and it's like you

33:52

think it's so straightforward, and then you

33:55

get into, oh, but what if you

33:57

think about it this way, or

33:59

you know, that we have to walk

34:01

through Tim and Joe's, like, ideation process on

34:04

this. I need to hear this

34:06

origin story. That was intense.

34:08

It was intense. It was

34:10

the subject of many like

34:13

long Zoom call, FaceTime conversations

34:15

and it

34:17

actually I

34:19

think that

34:21

this section

34:23

alone single-handedly

34:26

reorganized the

34:28

book like

34:30

three times.

34:32

Wow. I

34:34

mean.

34:45

So the funny thing is that there

34:47

was a, I think it was like

34:49

a Shopify blog post buried somewhere that

34:52

had this idea of a ladder of

34:54

evidence that I had thought was really

34:56

useful and I'd written a little bit

34:58

on it. So that's kind of where

35:00

it started. I dug in enough to

35:02

say like, oh, wait, this is not

35:04

like some deeply established way of thinking

35:06

about things and where we landed,

35:08

we're also calling it a ladder

35:11

of evidence. And it is conceptually

35:13

consistent. It gets to that idea

35:15

of uncertainty, which also gets

35:17

to this idea of how

35:19

strong is the evidence I'm

35:21

using to make a decision.

35:23

So the ladder is very

35:25

simply: there's anecdotal evidence, which

35:27

is super super weak evidence,

35:29

but it's evidence. And

35:31

this is in the context

35:33

of validating hypotheses. If I

35:35

want to validate a hypothesis,

35:37

if it's low stakes or if I

35:40

have no time or any number of

35:42

factors, and all I have is a little bit

35:44

of evidence, you know what? Generally speaking, that

35:47

is better than no evidence, but

35:49

we need to recognize that that

35:51

is anecdotal. There is

35:54

descriptive evidence, which is, I

35:56

mean, tons and tons of techniques

35:58

across lots of different types of

36:00

data. That's where I think a

36:02

lot of analytics and a lot

36:04

of like research and insights lives.

36:06

It is stronger evidence because

36:08

we generally have more

36:10

data. I think actually it was

36:12

in the book and this was

36:15

this was credit to Joe that

36:17

descriptive evidence is when you got

36:19

a whole bunch of anecdotes kind

36:21

of gathered together. So it's kind

36:23

of a continuum. It's stronger evidence.

36:25

And then the third kind of

36:27

the strongest evidence is scientific evidence

36:29

which is generally speaking controlled experimentation

36:31

in one one form or

36:34

another and it's not like

36:36

these are good versus bad it is

36:38

a strength versus weakness of the

36:40

evidence it goes to the you

36:42

know criticality of the decision it

36:44

but it goes to understanding that

36:46

you can't just say it goes

36:48

back to those misconceptions just because

36:50

I have a billion rows of

36:52

data and I'm gonna run a

36:54

model on it that is still

36:56

almost always not going to

36:58

be as good as running a

37:01

controlled experiment if I'm trying to

37:03

actually find evidence for

37:06

a causal link between two things.

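That claim is easy to demonstrate with a toy simulation (all numbers invented): when exposure is self-selected, more rows only tighten the interval around a biased answer, while randomizing the assignment removes the bias by design.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # plenty of rows; more of them won't fix the bias below

# Toy confounder: motivated users both adopt a feature more often and
# convert better anyway, so adopters look great for the wrong reason.
motivation = rng.normal(size=n)
adopted = rng.random(n) < 1.0 / (1.0 + np.exp(-motivation))
outcome = 0.1 * adopted + 0.5 * motivation + rng.normal(size=n)
naive = outcome[adopted].mean() - outcome[~adopted].mean()

# Controlled experiment: random assignment balances motivation out.
assigned = rng.random(n) < 0.5
outcome_rct = 0.1 * assigned + 0.5 * motivation + rng.normal(size=n)
rct = outcome_rct[assigned].mean() - outcome_rct[~assigned].mean()

print(f"true effect 0.10 | observational {naive:.2f} | experiment {rct:.2f}")
```
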
37:08

So we spend

37:10

a whole chapter on descriptive

37:12

evidence and a whole chapter

37:14

on scientific evidence. So what

37:17

were some of your earlier, like, words

37:19

you tried to use because I also

37:21

feel like most of the time when

37:23

I've seen some version of this it's

37:25

using words that you see on like

37:28

data maturity curves you know it's it's

37:30

just like it does imply like

37:32

descriptive, good-better-best, or it,

37:34

yeah, predictive, it always has to

37:37

get to predictive or

37:39

prescriptive, descriptive, yeah, all those

37:41

those terms that are much more like talking

37:43

to the method itself. And I know these

37:46

kind of are, but your buckets that you

37:48

ended up on are so much nicer and

37:50

broad enough that you can't really get

37:52

down in the dirt on like nitty

37:54

gritty. Like someone can't, I feel like

37:56

come in and really be like, oh my

37:59

gosh, I completely disagree. So I thought

38:01

it was very artful how you guys

38:03

landed there. So what were some of

38:05

the previous tries? Well, number one, somebody

38:08

can totally come in and disagree and

38:10

I fully expect them to do that.

38:12

I welcome you to comment on my

38:14

LinkedIn posts as much as you care

38:17

to disagree with us. It'll get the

38:19

conversation going. If you're

38:21

in such disagreement that you want to

38:23

buy a hundred copies of the book

38:26

and burn them? Yeah, if you wanted

38:28

to go get 500 copies. No, you

38:30

can disagree. I mean, the

38:32

earlier, I think the way we had

38:35

thought about it before was actually like

38:37

in terms of like analysis rather than

38:39

like the weight of the evidence. Like,

38:42

no, it was kind of like. And

38:44

this is why I like where we

38:46

ended up because it was starting with

38:48

this, you know, what methodologies can you

38:51

use to answer your questions, right? And

38:53

it was kind of like, well, there's

38:55

easy methodologies and there's hard ones. Like

38:57

that was kind of where we had

39:00

started with it. But I think as

39:02

the... this sort of picture in my

39:04

head as we were like developing this

39:06

was actually, and I think we

39:09

ended up with a cartoon in the

39:11

book about this with the hand scale

39:13

right it was it was kind of

39:15

the scale which was like well actually

39:18

there's there's this question that has to

39:20

be answered and it has to be

39:22

weighed against some sort of weight of

39:25

evidence of you know against it and

39:27

And that's what I think started to

39:29

get us towards this idea of like,

39:31

well, if it's just a light question,

39:34

you just need a light amount. And

39:36

what are the usual forms of light

39:38

evidence? Well, it's usually just walking down

39:40

the hallway and talking to your coworkers,

39:43

to see if they're in a good

39:45

mood or a bad mood, right? It

39:47

could be very simple stuff. And that

39:49

was my memory, like the shift was

39:52

going from the methodological thought process and

39:54

mental model to thinking about it more

39:56

fundamentally. And that's what I think gave

39:58

us your... Yeah, it was like historical

40:01

data analysis, research, primary and

40:03

secondary, and controlled experimentation. And I mean,

40:05

part of that is we were going

40:07

around and around trying to kick the

40:09

tires on what we had. We had

40:12

a whole debate around secondary research.

40:14

Joe was like, that's anecdotal. And I

40:16

was like, what do you mean? It

40:18

could be like super robust and secondary

40:21

research. He was like, no, like you

40:23

got that in a study. If you

40:25

got access to the underlying data and

40:27

you knew the research question they were

40:29

trying to answer and you knew their

40:32

methodology and that lined up with what

40:34

you're trying to validate, sure, but that

40:36

never happens. Even like a scientific

40:38

journal, secondary research, it is always

40:41

one step removed. So I was

40:43

like, yep, that's, you know, totally get

40:45

that. So that's one where like

40:48

primary research would fall into descriptive

40:50

evidence. Unless, you know, what if

40:52

you do, if you do a

40:54

small usability study, that's kind of

40:56

anecdotal. So there's a, there's a

40:58

little bit of a gray area, but

41:00

I think that ramp of saying how strong,

41:02

thinking of it as the strength of

41:05

evidence, I mean, ever since we kind

41:07

of hit that point, I am using

41:09

that word, I'm using that phrase a lot.

41:11

Yeah. Even after you said it, Joe, like

41:13

that was a light bulb moment for me when

41:15

you were like, yeah, it's not so much

41:18

the methods, it's the weight of evidence. It's

41:20

like, wow. It's also just so nicely

41:22

put for your audience because it's incredibly

41:24

practical too because if you're thinking I'm

41:26

like a leader inside of an organization

41:28

I perhaps have like multiple analytics teams

41:31

that I work with, and maybe some are

41:33

embedded maybe there's a center of excellence

41:35

then they're broken out by their specific

41:37

functions there's like the digital analytics team

41:39

which suffer from performance you know and

41:41

so if that's the construct in which

41:44

you think about all of this you

41:46

might not understand how to right-size the

41:48

evidence for the question or problem at hand.

41:50

And so I think that this is going

41:52

to be one of those sections that really

41:54

connects with your audience. I think it was

41:56

very nicely done. So thank you. I'll have one

41:58

more thing, which is, like, I do think

42:00

a lot of the books, and Tim actually

42:03

deserves credit for the tone and approach

42:05

of the book as more of a fun,

42:07

entertaining, interactive, very like down to earth tone,

42:09

like, you know, I think a lot of

42:12

the scientific approaches can come

42:14

across as super heavy-handed and super

42:16

duper, like, this is so full of

42:18

firepower that you could never deal with

42:20

it. And it's meant to be very

42:23

impressive, right? And it's like, and it's

42:25

methodological weight. And the way that we've...

42:27

had a lot of fun with this

42:29

book was thinking about like little simple

42:31

example. I mean this you know you

42:34

might go and be the vice president

42:36

of analytics at Coca-Cola or you might

42:38

be the CEO of like you know

42:40

Merck Pharmaceuticals or something, you could be

42:43

any of these people but on the

42:45

day-to-day basis you're not sitting on the

42:47

top of a mountain with your hands

42:49

on your hips being like ha-ha you

42:51

know I mean what you're doing is

42:53

like you're going down the hall to

42:56

talk to Michael and Catherine there. You

42:58

know, it's a much more down to

43:00

earth experience. You know, I'm really thankful

43:02

that Tim enforced that on the book.

43:04

Well, and on that note, because you

43:06

did bring up interactive, I did

43:09

want to bring up the little,

43:11

is it quizzes that you guys

43:13

are doing at the end of

43:15

each chapter, like the performance measurement

43:17

check-in? I think that that's super

43:20

fun. I think you guys should talk

43:22

about that. Sure. This was my brainchild

43:24

and Joe's the one who actually

43:26

built it. But because we, I mean,

43:29

I think I come from pushing performance

43:31

measurement and how, and we did

43:33

an episode of the podcast around

43:35

goals and KPIs and the two

43:37

magic questions. And so part of

43:39

what we're trying to do, it's

43:41

actually useful, was we asked the

43:43

question, like, how do we measure

43:45

the performance of a book? And,

43:47

you know, there's the easy metric,

43:50

which is, well, how many did

43:52

you sell? But we're not writing

43:54

the book because we're trying

43:56

to drive sales. Like we,

43:58

so we applied it. Chapter five

44:00

is all about performance measurement. And

44:03

you guys are both super familiar with

44:05

the two magic questions of like what

44:07

are we trying to achieve with the

44:09

book and we actually said what is

44:11

our answer to that question, and

44:13

we write that in the book and

44:16

say we want to arm you with

44:18

a clear and actionable framework and set

44:20

of techniques for efficiently and effectively getting

44:22

more value from your data and analytics

44:24

and then okay well how are we

44:26

going to decide if we've done that?

44:28

We said, we should ask people. We

44:31

want people on a chapter by chapter

44:33

and then for the book overall. We're

44:35

going to ask them. So there is

44:37

an analyticstrw.com, analytics the right way dot com.

44:39

It's analyticstrw.com, but that's where the

44:41

TRW is from, and it has an evaluation

44:43

form. So at the end of

44:45

every chapter, we say, hey, help

44:48

us measure the performance. We have

44:50

a target set. We want to,

44:52

and it's published, a certain percentage

44:54

of people to say that they

44:56

somewhat agree or strongly agree with

44:58

two questions: the information and ideas

45:00

presented gave me a new and better

45:03

way to approach using data. And I

45:05

expect to apply the information presented to

45:07

the way I work with data in

45:09

the next 90 days. So we said

45:11

we will actually measure it, and when you

45:13

click submit, you will see how the

45:16

cumulative respondents to date

45:18

perform against those targets, which

45:20

is kind of terrifying, but it

45:22

also seems like, well, that's how, you

45:24

know, if we did do a second

45:27

edition of the book, we should know

45:29

which the weakest chapters or which

45:31

the least impactful chapters were. But

45:33

so we're doing it kind of,

45:35

there's a meta one to say,

45:37

yeah, if you think about it,

45:39

you really do need to think

45:41

one level beyond what metrics

45:43

will be available, to what are

45:45

we really trying to do and how could

45:48

we best measure that? So yeah I'm pretty.

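As a sketch, the scoring mechanism described there reduces to a few lines; the question wording is from the episode, and the target share here is invented:

```python
# Hypothetical sketch of the chapter check-in scoring described above.
TARGET = 0.80  # invented; the real published target may differ
AGREE = {"somewhat agree", "strongly agree"}

responses = ["strongly agree", "somewhat agree", "neutral", "somewhat agree"]
share = sum(r in AGREE for r in responses) / len(responses)

print(f"cumulative agreement {share:.0%} vs target {TARGET:.0%}")
```
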
45:50

And where on the ladder of evidence

45:52

will that fall? Well that's performance

45:55

measurement. It's not validating

45:57

a hypothesis, right? So

45:59

that's the... Oh. So that's true. So

46:01

it's just objectively measuring. Yeah. Because we're

46:03

just trying to alert. We need to

46:05

alert ourselves. The thing

46:08

that I'm actually worried about, though. Oh,

46:10

here's a great time to bring it

46:12

up. Yeah, I know. I got concerns.

46:15

I'm like, you know, and I did

46:17

do some, you know, you can go

46:19

on this website, right? And it's like,

46:21

you could, I'm worried about like a

46:24

botnet coming and, like, you know, over

46:26

just, you know, giving us a bunch

46:28

of poor scorers. So look, if you

46:31

come on the website and you see

46:33

the scores are really low, a botnet

46:35

got us. We've had some discussion about

46:37

correcting for that, but this is, yeah,

46:40

our assumption is this is not going

46:42

to, there aren't going to be foreign

46:44

actors saying, boy, if we can tank

46:47

the performance measurement of this book, that's,

46:49

that's going to give us a global

46:51

leg up. So, but who knows? I

46:53

have a hypothesis about which chapter is

46:56

going to score the highest on the

46:58

actionability over the next 90 days. So

47:00

maybe we should do our own little

47:03

back of the napkin target setting, see

47:05

if we can see how it lines

47:07

up against real data in the future.

47:09

Real, be real meta about it. You

47:12

could. Actually, the other fun thing about

47:14

the website, if you do go on

47:16

the website, is we have a merch store.

47:19

We're actually not trying to make any

47:21

money off this merch. But it is.

47:23

I think it's actually pretty funny. It's

47:25

funny stuff. Like, so if you're a

47:28

real fan of the book, you can

47:30

also get merch online, printed t-shirts, etc.

47:32

I'm getting myself a t-shirt. Yeah, book

47:35

writing process. Joe makes, Joe makes the

47:37

crack like, oh, go to this URL.

47:39

It's in a footnote is like a

47:41

wisecrack. So then we're going through

47:44

the process. I'm like, yeah, that's a

47:46

nice draft of the site, Joe, but

47:48

you did put... slash store in the

47:51

footnote so and then that actually is

47:53

what happened. We never intended to do

47:55

it and then we realized it'd be

47:57

kind of, it's actually not a bad idea,

48:00

so yeah. I love that. Well, we're

48:02

gonna have to move to wrap pretty

48:04

soon but I guess is there

48:06

any last parting thoughts, Dr. Joe

48:08

or Tim that you want to

48:11

share that you're really excited about

48:13

your future readers being able to

48:15

take away from Analytics the Right Way?

48:17

You know, I... Look, I started

48:19

programming in 1998, okay, in a

48:21

language called BASIC. I don't know

48:23

how many of the readers or

48:25

the audience or the listeners will

48:27

even know what BASIC is, but

48:29

it was a very easy to

48:31

use programming language on Microsoft Systems

48:33

back in the day. And, you

48:35

know, I have seen over the

48:38

last, what, you know, 26, 27 years,

48:40

right? Just things seem to get more

48:42

and more complicated. Every, you know,

48:44

it almost seems like it used

48:46

to be so much, maybe I'm

48:48

just being nostalgic, but, you know,

48:50

everything from the documentation to the

48:52

methodology, they've gotten more complicated.

48:55

And I think that that's, for no reason

48:57

in a lot of ways. And I think that

48:59

that really deprives people. Like, I

49:01

think it deprives them of the opportunity to

49:03

use all these great tools that we have

49:06

because they, you know, not because they don't

49:08

have access to them. But I do think

49:10

that like the self-imposed misunderstanding

49:12

or feeling like they don't understand the

49:14

complexity of these things, like almost is

49:17

like a self deprivation of all the

49:19

great tools that we have out there.

49:21

And so my hope is just that

49:24

the book kind of reopens that door,

49:26

you know, in really simple and direct

49:28

terms. And I'm going to have to go

49:30

get an advanced degree to self-soothe

49:33

from the fact that I also

49:35

started programming in BASIC on an

49:37

Apple IIc, but I just

49:39

did the math, and

49:41

it was in 1985,

49:44

so Apple got me there

49:46

too, right? Right in with

49:48

these kids these days I

49:50

tell you. It was a

49:52

little rough because of

49:54

some of the language he

49:56

was dropping. But I mean,

49:58

I hope, I would die

50:00

happy if there were people using

50:02

some of the language in the

50:05

book and finding it as a

50:07

way for them to, like,

50:09

act with more confidence

50:11

within their organizations.

50:14

I mean, fundamentally, I

50:16

deeply believe this stuff

50:18

is not so complicated that it

50:21

needs to be treated as

50:23

a mystical black box that's

50:25

so intimidating that I need

50:28

magical AI to solve it.

50:30

There is so much fun

50:32

and joy and hard

50:35

creative thinking and that

50:37

is like the core

50:40

of like using analytics

50:43

productively. We're still a few

50:45

generations away before

50:48

human creative thought

50:50

isn't kind of at the core

50:52

of that so I'm hoping that

50:54

there are readers who say I

50:57

get it. I know

50:59

it's not a hard thing or

51:01

a scary thing or a frustrating

51:03

thing to collaborate with my analyst

51:05

or to poke around in my

51:07

dashboard, because I know what

51:09

I'm trying to do why I'm

51:12

trying to do it and I

51:14

have ideas and I can treat

51:16

those ideas as hypotheses and think

51:18

about how strong is the evidence

51:21

I need to validate them. I

51:23

can feel fine making a decision

51:25

with very weak evidence. Because that's

51:27

okay. You can't, like, that's absolutely

51:29

okay. What's not okay is to

51:32

not realize that's what you're

51:34

doing. So, yeah, I guess I'm, I'm

51:36

passionate about it. Oh, that's

51:38

a little. All right, so when

51:40

does the book come out? Where can

51:42

we find it? It comes out, end

51:45

of January, was it January

51:47

20? Tomorrow. If somebody's listening

51:49

to this pod, January 22nd.

51:51

If they're listening to the

51:53

podcast the day that it

51:55

drops, then you can preorder

51:57

now, and you're effectively ordering

52:00

it because it is

52:02

available tomorrow on Amazon, on

52:05

Walmart, Target, Barnes &

52:07

Noble, wherever you get your books.

52:09

Oh, you fancy. You

52:11

can go to analyticstrw.com and

52:13

get links to it. You

52:16

can go to wiley.com and

52:18

order it there. It'll

52:20

be out as an ebook

52:22

a little bit later and

52:24

actually it's coming out as an audio

52:26

book in about another month or so.

52:29

What? And so if you want

52:31

to go and listen to the sweet,

52:33

sweet stories of data and

52:35

lull yourself to sleep or perhaps

52:37

keep yourself busy in the

52:39

car, you can do that. And

52:41

Daniel Craig is reading it.

52:45

Damn. Actually, no, it's

52:47

a professionally trained

52:49

voice actor. Luckily it

52:51

was not either of

52:53

us because that would have been

52:55

difficult. I was hoping it

52:57

would have been. I

52:59

would have listened. Well, this has been

53:02

such a fun little reunion talking about

53:04

Analytics the Right Way. Long time coming. Very

53:06

excited for this. So thank you so

53:08

much for joining us, Dr. Joe. It's

53:10

been a pleasure. My pleasure. Thank

53:12

you for having me. And Tim,

53:14

you know, thanks. Thanks for being here.

53:17

We'll throw a thank you to you too, even though you're co-host.

53:19

Somebody had to hit the record button. Yeah.

53:23

Well, one of the things that we love to

53:25

do is just to go around the horn

53:27

and share a last call, something that we think

53:29

our listeners might be interested in. So Dr.

53:32

Joe, you're our guest. Would you like to share

53:34

your last call first? So

53:36

yeah, related and also unrelated.

53:38

You know, I

53:40

went down to, so Emory University

53:42

has a campus in

53:44

Oxford, Georgia. It's down in Newton

53:47

County. And I did a quick presentation

53:49

to their local chamber and I

53:51

asked, how many of you guys feel

53:53

like you have reasonable facility with

53:55

artificial intelligence technology, such that you could

53:57

use it in your business today? And

54:00

it was a big room, not one

54:02

hand went up. And I actually realized,

54:04

like, you know, we're all talking about

54:06

it here, you know, this analytics audience.

54:08

We talk about it all the time,

54:10

right? But not everybody has access to

54:12

these tools. And so we went and

54:14

raised money and started, basically, a workforce

54:16

development outreach tour. And if you're

54:18

interested in learning more about it

54:21

and how to get involved, you

54:23

know, we also offer certifications in

54:25

artificial intelligence, etc. Just go to

54:27

AI and you, Georgia. I know this

54:30

is a national audience, but

54:32

A.I. International. Georgia. Global. Global.

54:34

Sorry, it's a global audience.

54:36

This is Georgia, not the

54:38

country. This is the state

54:40

in the U.S. Just to

54:42

clarify. Love it. That's a good

54:44

one. All right, Julie, how about you?

54:47

What's your last call? My last call

54:49

is actually a tip I got

54:51

from, actually all of our, at

54:53

least previous or current, co-worker Ricky

54:55

Messick, one of our faves. He

54:58

was telling, well because I was

55:00

sharing with him that I struggle to

55:02

make it all the way through like

55:04

listening to self-help books. I was like,

55:06

sometimes I just want them to get

55:08

to the freaking point. I'm like, they

55:10

say it so many ways to fill

55:13

up pages, we've talked about this before.

55:15

It's one of my shortcomings, I just

55:17

cannot finish them. So he told me

55:19

that he does this thing where he just

55:21

puts the playback speed like close to two,

55:23

like two X or maybe more. Because he

55:26

found that when it's going really fast,

55:28

you actually have to stay more focused

55:30

on what they're saying, and you will

55:32

retain and take in the information instead

55:34

of letting your mind wander. And he's

55:37

like, and then you get through the

55:39

book faster. So I have been trying that slowly.

55:41

I'm not up to as fast as he

55:43

listens to it. But I think it works.

55:45

For all of

55:47

our listeners who listen to us on 1.5

55:49

or 2X, we'll tell ourselves it's because they

55:51

really want to focus on it. They want

55:53

to focus on the content of the show,

55:55

not because they just want to get, get,

55:57

get, get through it. I'm going to tell...

56:00

Okay, but the fact that they would listen

56:02

to it all still says something. One of

56:04

our former coworkers actually, one time, you

56:06

know, I missed a meeting and they

56:08

recorded it. It was like a four-hour

56:10

meeting and, you know, our co-worker goes,

56:12

no, no, just go back and watch

56:14

it. I said, oh, should I bill

56:16

four hours to, you know, to watch

56:19

the four-hour meeting? No, just watch it

56:21

at double speed. And so then I

56:23

think, well, if I watch it at

56:25

half speed, do I get to bill

56:28

8? Yeah. Nice. That's awesome. That's good.

56:30

That's good. All right, Tim, you got a

56:32

last call for us? I do. It's

56:35

trivial, but because I'm a

56:37

sucker for getting a random data

56:39

set and pursuing it a little

56:41

too far. This was a while

56:43

back. I got it out of,

56:45

I think it was out of

56:48

Philip Bump's How to Read This

56:50

Chart newsletter. But it's a guy

56:52

named Colin Morris, and he did

56:54

this kind of deep dive. It's

56:56

called compound pejoratives on Reddit, from

56:59

buttface to wankpuffin.

57:01

And he basically took

57:03

compounds. So, think like dumbass

57:05

or you know scumbag where you

57:07

have the two words and he

57:10

went and kind of managed to

57:12

pull I don't know like 20

57:14

of the front halves and 20

57:16

of the back halves and then

57:19

started with just a

57:21

little heat map of like what's

57:23

the... like, dumbass is

57:25

the most common occurrence and you've got

57:27

ones that, you know... like a

57:30

lip hat, like that's not really used,

57:32

or a wink

57:34

sucker. Like there are ones that, so

57:36

you start to see like ones that

57:39

you're like, oh, you could use that,

57:41

like, but it almost never shows

57:43

up. And then you're like, well,

57:46

that's cool. But then he

57:48

wound up going deeper and deeper

57:50

as to like, well, which like

57:52

affixes have the most other affixes applied

57:55

to them. So it's quite

57:57

a bit of a dive,

57:59

and it's really just entertaining. There's

58:01

nothing you can do with it

58:03

other than come up with, like, you

58:05

know, oh, you know, you're a,

58:07

you're a, you're a but Lord. So

58:09

you wind up, you can't help

58:11

it, like coming up with pejoratives, you're

58:13

like, somebody said it.
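The gist of that cross-product counting is easy to sketch; the word lists and comments below are made-up stand-ins for Colin Morris's actual Reddit data, just to show the shape of the analysis:

    from itertools import product

    # Made-up front and back halves; Morris mined his from real comments.
    FRONTS = ["dumb", "scum", "butt", "wank"]
    BACKS = ["ass", "bag", "face", "puffin"]

    # Hypothetical comments standing in for a Reddit dump.
    comments = [
        "what a dumbass take",
        "classic scumbag move",
        "ignore the buttface",
        "another dumbass reply",
    ]

    # Count every front+back compound across the comments.
    counts = {(f, b): 0 for f, b in product(FRONTS, BACKS)}
    for text in comments:
        for f, b in product(FRONTS, BACKS):
            counts[(f, b)] += text.count(f + b)

    # Print a crude text heat map: rows are fronts, columns are backs.
    print("      " + "".join(f"{b:>8}" for b in BACKS))
    for f in FRONTS:
        print(f"{f:<6}" + "".join(f"{counts[(f, b)]:>8}" for b in BACKS))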

58:17

That's the perfect last call for the explicit

58:19

rating of this podcast. Yeah. I love

58:21

it. We had to get it there.

58:23

What about you, Val? What's your last

58:25

call? Yeah. Did we just get rated

58:27

high? Is this like an R-rated

58:29

podcast now? Oh, it's always. It always

58:31

was. Yeah. I

58:34

steered clear of some of

58:37

the specifically R-plus-rated ones,

58:39

but they're there. What

58:41

about you, Val? What's your last

58:43

call? So

58:46

mine, I wanted to keep it

58:48

in the family: Search Discovery alum

58:50

slash some current family. This is

58:52

actually a podcast episode from Experiment Nation

58:54

when Nick Murphy was a guest

58:56

in the summer of 2024.

58:58

And it was all about building

59:00

a learning library. And I've actually

59:02

been sitting on this last call

59:05

for a long time. So I'm

59:07

really excited to share this one.

59:09

If you don't know Nick, he

59:11

is, he's been a consultant for

59:13

a couple of years at further,

59:15

but he was an in-house

59:17

practitioner before that. And he's incredibly

59:19

pragmatic with his approach to consulting

59:21

and helping organizations think about the

59:23

power of experimentation. Such a

59:25

joy to work with him and

59:28

his beautiful brain. But in this,

59:30

he kind of walks through

59:32

a base model for

59:34

how you would think about a repository

59:36

of learnings, because as we all

59:38

know, that's the value, the

59:40

reason you experiment, right? It's to

59:42

get smarter and make better decisions

59:44

as we've touched upon today. So

59:46

this is a way to make

59:48

it something that everyone in your

59:51

organization can access and query and

59:53

search so it doesn't just live

59:55

in a PowerPoint presentation on someone's

59:57

drive. But yeah, he talks about

59:59

how CROs are often thought of

1:00:01

as the "numbers go up" wizards,

1:00:03

which I nearly did a spit

1:00:05

take on. He said that though, so, so funny. But it's

1:00:07

good. It's a really great discussion, and I definitely walked away with some

1:00:09

good tidbits and some sound bites that I can share with

1:00:11

my clients. So I definitely recommend that

1:00:14

one. Awesome. Whoo, go Nick. Whoo.

1:00:16

All right. So this has been an

1:00:18

awesome discussion. So I'm so thankful that

1:00:20

we were able to dive into Analytics

1:00:22

the Right Way. We got both

1:00:24

authors on our episode today for the

1:00:27

groundbreaking launch of the book, but no

1:00:29

show would be complete if we didn't

1:00:31

throw a huge shout out to Josh

1:00:33

Crowhurst, our producer, who does a lot

1:00:35

of that work behind the scenes. So

1:00:38

thank you, Josh. And as always, listeners,

1:00:40

we would love to hear from you.

1:00:42

So you can find us in a

1:00:44

couple different places. The Measure Chat Slack

1:00:46

Group, our LinkedIn page, you can

1:00:48

also shoot us an email at

1:00:50

contact at analyticshour.io, or if you've

1:00:52

been listening in the past couple episodes,

1:00:55

you will know that you can visit

1:00:57

us in the comment section of our YouTube

1:00:59

channel. So, it's another place you can grab

1:01:01

and listen to this episode. So feel free

1:01:03

to reach out. We would love to hear

1:01:05

from you. So with that, I

1:01:07

know I can speak for all of

1:01:09

my co-hosts, Julie and Tim, when I

1:01:12

say, no matter what step of the

1:01:14

ladder of evidence you are on, keep

1:01:16

analyzing. Thanks for listening. Let's

1:01:18

keep the conversation going with

1:01:21

your comments, suggestions, and questions

1:01:23

on Twitter at Analytics Hour

1:01:25

on the web at Analytics

1:01:27

Hour.io, our LinkedIn group, and

1:01:30

the Measure Chat Slack group. Music

1:01:32

for the podcast by Josh

1:01:34

Crowhurst. So, smart guys who

1:01:36

want to fit in, so

1:01:39

they made up a term

1:01:41

called analytics. Analytics don't work.

1:01:43

Do the analytics say go for

1:01:45

it no matter who's going for

1:01:47

it? So if you and I were

1:01:50

on the field, the analytics say go

1:01:52

for it. It's the stupidest,

1:01:54

laziest, lamest thing I've ever

1:01:57

heard for reasoning in

1:01:59

competition. I've been on now

1:02:01

three times. That's, uh, this

1:02:03

is my third time on

1:02:05

the show. We talked about

1:02:07

natural language processing. That's right.

1:02:09

N-L-P, attribution without cookies. And

1:02:11

this is the third one.

1:02:13

Ding-ding-ding. Helbs said to Katie

1:02:15

Bauer that if you do

1:02:17

five, you get the jacket

1:02:20

like SNL, so. Well, you

1:02:22

know what you should tell

1:02:24

the audience. You should say

1:02:26

Joe's running for the SNL

1:02:28

jacket. It

1:02:31

would be fun to design

1:02:33

a jacket though, an APH one.

1:02:36

I would wear it everywhere

1:02:38

to the detriment of my

1:02:40

children and wife. I actually

1:02:43

was going to, I was

1:02:45

going to send one to

1:02:47

Goose because Goose like sort

1:02:50

of brought me and Tim

1:02:52

together, like in life, and

1:02:54

so I just feel like

1:02:57

he, I wanted to just

1:02:59

send him one, just be

1:03:01

like, you know what, in

1:03:04

a way you. I don't

1:03:06

know. What is, yeah, what

1:03:08

is the protocol on that?

1:03:11

If you sign a book

1:03:13

and then gift it to

1:03:15

somebody, is it like kind

1:03:18

of juicy? Like that's... You

1:03:20

have to just make sure

1:03:22

you do like a little

1:03:25

red lipstick kiss by it?

1:03:27

Like, no, yeah. XO. Call

1:03:29

out Goose. I think under

1:03:32

his, then... we didn't

1:03:34

figure out we could have

1:03:36

like a joint acknowledgement section,

1:03:39

so that was a good,

1:03:41

good, good catch, good co-author

1:03:43

stuff. I mean now he

1:03:46

gushes a lot about Sarah,

1:03:48

in his acknowledgements, Julie gets

1:03:50

no mention in my acknowledgements,

1:03:53

but, uh... Did you dedicate

1:03:55

it to her? Yeah. I

1:03:59

don't know. Julie,

1:04:02

I'm kind of disappointed

1:04:04

that you don't have

1:04:07

a little bit more

1:04:09

empathy for me that

1:04:11

I just do this

1:04:14

opening. You're like, yeah,

1:04:16

I saw. What about

1:04:19

it? Yeah, Joe, this

1:04:21

is the first party.

1:04:24

This is the first

1:04:26

for Val. Now you

1:04:29

will suffer. Yeah, no, it

1:04:31

is big. It really is big.

1:04:33

I didn't give enough appreciation to

1:04:35

that, because I would not be

1:04:37

mentally prepared for it. So I

1:04:39

do greatly feel for you. My first

1:04:42

thought when I wake up, first thought

1:04:44

before I go to bed, still not

1:04:46

ready, so. I'm literally sitting here

1:04:49

looking at screens of three people

1:04:51

who all rise to the occasion

1:04:53

and come across so much more

1:04:55

poised and coherent than I do

1:04:57

in any situation. I'm feeling

1:04:59

great about you opening it.

1:05:02

Okay, well, I'll have to

1:05:04

borrow some of your confidence,

1:05:06

like I said before. But

1:05:09

are we feeling ready to

1:05:11

start? Do it. Power pose,

1:05:13

power pose. My favorite. My

1:05:16

favorite is when you said

1:05:18

that and you leaned away. It

1:05:20

was like... Yeah, like the

1:05:22

microphone. Power pose, power

1:05:24

pose. I

1:05:27

was definitely... if Tim had

1:05:29

done, like, the five-four-

1:05:31

three-two-one countdown,

1:05:33

I was just gonna

1:05:36

start like no matter who

1:05:38

was talking, I was gonna be

1:05:40

like hey everyone and welcome

1:05:42

to the Analytics Power Hour

1:05:45

and my name is Val Kroll,

1:05:47

and I definitely didn't

1:05:49

just take one but

1:05:51

two nervous dumps before

1:05:53

I got on this

1:05:55

episode tonight. Yeah.

1:06:00

Rock flag

1:06:03

and you

1:06:06

can't eliminate

1:06:09

uncertainty.
