How a $1B+ Crypto Company Really Uses AI in Marketing | ft. Kraken CMO

Released Tuesday, 25th February 2025

Episode Transcript


0:00

On today's show, we're covering all the

0:02

AI questions you've been too afraid to

0:05

ask. What we're doing is breaking down

0:07

with Mayur Gupta, the CMO at Kraken,

0:09

one of the best crypto exchanges in

0:11

the world. We're going to go through,

0:13

how do you use AI? Which model

0:15

is best for which task? And what's

0:18

the best strategy for AI adoption in

0:20

your company? You're going to leave this

0:22

episode with a whole new framing for

0:24

how to move forward with AI. Let's

0:26

get to today's show. We'll be right back to the

0:28

show. But first,

0:31

a quick word from our sponsor. Remember when

0:33

marketing was fun? When you had

0:35

time to be creative and connect

0:37

with your customers? With HubSpot,

0:39

marketing can be fun again. Turn one

0:42

piece of content into everything you

0:57

know, enjoy

1:00

marketing

1:05

again.

1:08

Visit

1:12

hubspot.com

1:16

to get

1:20

started

1:22

for free.

1:26

being here. So uh

1:28

kind of reintroduce you to our audience. I

1:30

think you were our very first guest, so

1:32

you've had a pretty incredible career. You're currently

1:34

the CMO and Chief Growth Officer over at

1:36

Kraken, one of the most impressive companies in

1:38

the crypto space, but just a very

1:40

impressive company in general. And we wanted to

1:42

get you on because you are a forward

1:44

thinker. You were heading up growth at Spotify,

1:46

you've led marketing and growth at Freshly, just

1:48

a ton of incredible roles, usually in companies

1:50

that are actually trailblazers in their industry. So

1:52

of course we wanted to get you on

1:54

to talk about AI.

1:56

But yeah, because

1:58

we want to know how smart leaders and

2:00

just smart people in general are using

2:03

AI. So maybe we could just kick

2:05

off, would you give our audience

2:07

a little bit of like context on

2:09

where you are seeing AI get used

2:11

across your marketing and growth teams. What

2:13

are some of the use cases that

2:15

you particularly find pretty interesting? Yeah, well,

2:17

let me, maybe if it's helpful, just

2:19

give context on what the growth team

2:21

at Kraken looks like. We are

2:23

obviously like many teams still experimenting. Some

2:25

areas are more mature in using AI,

2:27

some we are still learning. But the

2:29

growth team at Kraken now, in its

2:31

current stage, which I call our third

2:33

era in a way, is pretty much

2:35

end-to-end growth. So it has marketing. We

2:37

have a pretty decent-sized growth analytics and

2:40

research team, product design, product engineering, obviously

2:42

product-led growth as well. And to be

2:44

honest, the two areas where we started

2:46

using AI and experimenting was around creative,

2:48

both from a brand and storytelling standpoint

2:50

as well as performance. A lot easier

2:52

on performance of course because that's where

2:54

you need a lot of velocity and

2:56

variety and you know AI does a

2:58

great job in giving you those variations.
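To make the variations point concrete, here is a minimal sketch of generating performance-ad copy variations with a model API. It assumes the anthropic Python SDK and an ANTHROPIC_API_KEY environment variable; the model id and prompt are illustrative, not Kraken's actual workflow.

```python
# Minimal sketch: generate N performance-ad variations from one control ad.
# Assumes the `anthropic` Python SDK and an ANTHROPIC_API_KEY env var;
# the model id is illustrative.
import anthropic

client = anthropic.Anthropic()

CONTROL_AD = "Trade 200+ cryptocurrencies with industry-leading security."

def ad_variations(control: str, n: int = 10) -> list[str]:
    prompt = (
        f"Here is a control ad:\n\n{control}\n\n"
        f"Write {n} distinct variations for paid social. Vary the hook "
        "(urgency, security, simplicity, social proof) but keep the claim "
        "accurate. Return one variation per line, no numbering."
    )
    msg = client.messages.create(
        model="claude-3-5-sonnet-latest",  # illustrative model id
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    # One variation per line, stripped of blanks.
    return [line.strip() for line in msg.content[0].text.splitlines() if line.strip()]

for variant in ad_variations(CONTROL_AD):
    print(variant)
```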

3:00

We're also a global organization you know

3:02

huge footprint in Europe where there are

3:04

so many different languages so you know

3:06

translation has been a huge area for

3:08

us both in product and off product.

3:10

But I've been really inspired by our

3:12

research team which is a pretty small

3:14

unit but they've been leveraging. AI quite

3:17

a bit for both qual and quant,

3:19

you know, using platforms like Bolt AI

3:21

for some of the qual work. They've

3:23

been using Pollfish. Again, I personally

3:25

haven't dug in a lot on these

3:27

platforms yet, but I know there's a

3:29

lot of internal inertia to see how

3:31

do we bring more efficiency, more velocity

3:33

in doing stuff that was taking much

3:35

longer in the past and also increase

3:37

the level of quality that we get

3:39

in some of these functions. What are

3:41

you using it for actually on brand?

3:43

Because we do get a lot of

3:45

folks on here when Kipp and I

3:47

have guests on. It is a lot

3:49

of performance marketing. It is a lot

3:51

of like informational content. I would love

3:53

to hear because Kraken does a

3:56

ton of smart stuff on brand campaigns.

3:58

You guys have a Formula One

4:00

team, or are part of Formula One as

4:02

well. So like, what are

4:04

some smart things you do on brand?

4:06

Yeah it's harder you know what we

4:08

feel and we have this discussion at

4:10

least three or four times a week

4:12

we have a channel on slack where

4:14

we are brainstorming looking at all the

4:16

new tools that are coming in and

4:18

brand I think it always is where

4:20

AI is helping us get smarter and

4:22

come up with more ideas, idea generation,

4:24

versus actually coming up with the end

4:26

asset. Also because a lot of our

4:28

marketing right now has pivoted to being

4:30

very product-focused. We are highlighting the

4:33

value proposition and the RTBs for our

4:35

core product, the interfaces, you know, we're

4:37

trying to show how customizable and flexible

4:39

our dynamic interfaces are for our Kraken

4:41

Pro product, for instance, which is

4:43

one of the best in the category.

4:45

And purely with AI, it's very hard

4:47

to focus in on a graph, focus

4:49

in on dashboards. So what we are

4:51

learning is: it's great to give us

4:53

ideas, it's great to give us different

4:55

variations, but when we are zooming in

4:57

in certain angles, that's where you still

4:59

need your more traditional way of creating

5:01

content. And the other place where the

5:03

team is really moving fast is sometimes

5:05

you have long form content, like we

5:07

have a lot of influencers and KOLs

5:10

that we work with, but the team

5:12

is using different types of AI platforms

5:14

to create shorter versions of that content

5:16

to drive faster distribution across social so

5:18

a lot of clip-anything type of

5:20

you know platforms that you use to

5:22

create you know 100 different assets from

5:24

one long form asset that you may

5:26

have with an influencer which then you

5:28

can distribute at much higher velocity. Mayur,

5:30

I'd like to add in on that.

5:32

I think what's missed in the brand

5:34

creative product marketing side of things is

5:36

that historically there's just been a lot

5:38

of time and money spent trying to

5:40

guess: does this idea match what the

5:42

person who I'm trying to communicate with,

5:44

like, does it match what they actually

5:47

want? And I think what you're saying

5:49

and what we found at HubSpot

5:51

is like, AI is a fast and

5:53

cheap shortcut to that problem. We use

5:55

Claude internally on the HubSpot marketing

5:57

team. Everybody has a license to Claude.

5:59

We have Claude projects and so we

6:01

have a whole project just all around

6:03

our core buyer persona. And so any

6:05

time we're writing a product page or

6:07

building a brand campaign doesn't matter what

6:09

it is. We can just ask basically

6:11

a fictional version of our customer what

6:13

they think. Give us feedback. What resonates,

6:15

what doesn't. And you have to spend a

6:17

lot of time and money on focus groups

6:20

and market research and stuff. And now the

6:22

cycle times can just be so much faster, it

6:24

feels like. Are you guys seeing that?
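A minimal sketch of the "fictional customer" project pattern Kipp describes, assuming the anthropic Python SDK; the persona file, model id, and draft copy are placeholders.

```python
# Sketch: load a persona brief as the system prompt, then ask it to
# critique draft copy in character. File name and model id are placeholders.
import pathlib
import anthropic

client = anthropic.Anthropic()
persona = pathlib.Path("persona_brief.md").read_text()  # your persona research

draft = "Introducing Flows: automate your busywork in one click."

msg = client.messages.create(
    model="claude-3-5-sonnet-latest",  # illustrative
    max_tokens=800,
    system=(
        "You are the buyer persona described below. Stay in character and "
        "give blunt first-person feedback.\n\n" + persona
    ),
    messages=[{
        "role": "user",
        "content": f"React to this product page headline: '{draft}'. "
                   "What resonates, what doesn't, and what would make you click?",
    }],
)
print(msg.content[0].text)
```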

6:26

Yes, yes, you know, obviously this conversation is lighting

6:28

me up to think about what other

6:30

things we could actually do in our

6:32

world almost right away. You know, one

6:35

element of what you mentioned, Kipp, which

6:37

I think is different that AI brings,

6:39

different from a traditional model, is, see,

6:41

when you're trying to figure out how

6:43

your customer will respond, a creative team

6:45

or one of your sub-growth teams, they're

6:47

engaging with an internal research team to

6:49

understand, okay, tell me how, you know, what the

6:51

feedback could be. They're going to market, they

6:54

run the study, they run qual and quant,

6:56

and they come back. Not only speed and

6:58

velocity, but this is a direct hook

7:00

that now the growth teams have where

7:02

they don't have a dependency on another

7:04

layer. So it also helps them get

7:07

a deeper understanding of what the customers

7:09

may respond to and also bringing more

7:11

agility because you are now doing that

7:13

not once you've had an MVP of

7:15

an asset, but you're actually doing

7:17

it while you're iterating, much earlier

7:19

on in the phase. Yeah, I think research

7:22

in general, which I think you mentioned

7:24

there, is like one of the most

7:26

interesting use cases. The latest releases from

7:29

deep research, we covered o3, their

7:31

research capabilities. They actually are

7:33

displacing most research roles. But we

7:35

have covered it before that there is

7:37

a really interesting way for these AI

7:39

models to replicate your customer, which we're

7:42

talking about, like how it can be

7:44

the voice of your customer, because it's

7:46

basically trained on the data of the internet,

7:48

but some of the recent agents like the

7:51

operator agent or another one that we're going

7:53

to cover soon on this channel, called Proxy,

7:55

can actually help you laser in on parts

7:57

of the internet. So you could say like

7:59

go to G2 Crowd, go to these other

8:01

things, and use that data to replicate

8:03

my customer. And so then you're able

8:05

to actually have a back and forth.

8:07

There's a company that I'm an investor

8:09

in called Hyperbound, and they actually do

8:11

this for companies where they'll build an

8:13

agent to replicate your customer in a

8:15

bunch of different scenarios. And then ramping

8:17

sales reps can call that customer

8:19

and have a sales conversation and train

8:21

on calling that customer. So I do

8:23

think it's a pretty interesting use case

8:25

in the future, which is, you know,

8:27

we've had multiple shows where today you

8:29

have this billion dollar, I think it's

8:31

multi-billion-dollar industry, I don't know

8:33

what it's at, because it's like

8:35

three billion dollars, this market research industry.

8:37

We looked it up before, it's more

8:39

than that, it's like eight to ten

8:41

billion, it's massive, massive, the market research

8:43

business. And like can you pay a

8:45

thousand people to like come back with

8:49

your data? The ability for AI to just

8:52

do that en masse I think is

8:54

going to displace a lot of that

8:56

need as well and so I do

8:58

think there's a future where you'll be

9:00

able to test your creative test your

9:02

brand campaigns and get pretty instantaneous feedback

9:04

but you'll be able to pair that

9:06

with like internal data as well to

9:08

like be able to make sure that

9:10

a marketer can get things right more

9:12

than they get things wrong. Yes I

9:14

think, on the research side, we're absolutely

9:16

seeing that. The pace at which we're

9:18

getting feedback, you know, is dramatically different

9:20

from when we are using a traditional

9:22

model. Now, one area where, you know,

9:24

we've taken a certain path where we've

9:26

created local instances, let's say, of

9:28

ChatGPT, because there's confidential data, you know,

9:30

we obviously are, for the nature of

9:32

the business that we are in. How

9:34

much are you guys seeing that being

9:36

done versus most brands and businesses leaning

9:38

on all the data being in the

9:40

cloud? So one of the things that

9:42

we are doing at Kraken is

9:44

we've created our local instances where all

9:46

the unstructured data is actually being pushed

9:48

into that local instance and is being

9:50

trained. And then, you know, you're just

9:52

asking a lot of questions and queries,

9:54

even just to understand user behavior, because

9:56

over a period of time, historically, these

9:58

have been PDFs, decks and docs with

10:00

tons of data, but I think creating

10:02

a local instance just gives that access

10:04

to thousands of Krakenites pretty instantly.
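A minimal sketch of the local-instance idea, under the assumption that you embed internal docs on your own hardware with sentence-transformers and search them locally, so nothing confidential leaves your infrastructure. The folder layout and model name are illustrative; this is not Kraken's actual setup.

```python
# Sketch: local retrieval over internal research docs. Runs entirely on
# your own hardware via sentence-transformers; paths and model name are
# illustrative.
import pathlib
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # small local embedder

# Chunk every research doc into paragraphs.
chunks = []
for path in pathlib.Path("research_docs").glob("*.txt"):
    chunks += [p for p in path.read_text().split("\n\n") if p.strip()]

doc_vecs = model.encode(chunks, normalize_embeddings=True)

def search(query: str, k: int = 5) -> list[str]:
    """Return the k chunks most similar to the query (cosine similarity)."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

for hit in search("why do new users churn in month one?"):
    print(hit[:120], "...")
```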

10:06

Yeah, this is actually a super important

10:08

point. I think Kipp and I have

10:10

talked about this. The number one way

10:12

to make this very impactful within your

10:14

company is to be able to provide

10:16

the ability to collapse all the unstructured

10:18

data into a repository that then you

10:20

can just have the employees easily access,

10:22

like so when they build things, it

10:24

can access that unstructured data. Obviously, Kipp

10:26

and I are kind of maniacs, and

10:29

so we have... That's the nice way

10:31

to say it. We have

10:35

personal paid plans, I think, to every

10:37

AI tool, but we do have local

10:39

instances as well, very similar to the

10:41

way that you

10:43

guys sound like you're building. The unstructured

10:45

data unlock is crazy, right? And if

10:47

you look at some of the reports

10:49

with Gemini 2.0 Flash, its ability to

10:51

extract data and insights from PDFs for

10:53

essentially 40 or 50 cents is

10:55

unreal and kind of unmatched.
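A sketch of that PDF-extraction pattern using the google-generativeai SDK's File API, assuming a GOOGLE_API_KEY; the file name and model id are illustrative.

```python
# Sketch: pull structured insights out of a PDF report with Gemini Flash.
# Assumes the google-generativeai SDK; file name and model id illustrative.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

report = genai.upload_file("q4_user_research.pdf")  # any PDF deck or doc
model = genai.GenerativeModel("gemini-2.0-flash")   # illustrative model id

resp = model.generate_content([
    report,
    "Extract every quantitative finding in this report as a JSON list of "
    "{metric, value, segment, page} objects. Skip anecdotes.",
])
print(resp.text)
```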

10:57

There's two things we've decided recently to do in our

10:59

team, Kieran, right, which is one, we're

11:01

in the process of having technical writers

11:03

document all of the workflows so that

11:05

we can then be very clear on

11:07

what we're going to automate and what

11:09

we're going to automate now, what we're

11:11

then going to automate in the future.

11:13

And then the second thing is every

11:15

meeting's recorded. Yeah. Every meeting's recorded. And

11:17

we are creating like a whole nomenclature

11:19

for how those recordings and

11:21

transcripts are stored. Everything's recorded. We are

11:23

just going to unlock the creation of

11:25

unstructured data. And it was sweet, like

11:27

I had a big offsite last week

11:29

with a few people here and you

11:31

were there for a little bit and

11:33

we recorded everything. Then you dumped all

11:35

that in Claude and we could go

11:37

and basically build follow-ups and everything from

11:39

that. It's really great. So I think

11:41

those are two things that we're doing

11:43

here. And I think the other thing

11:45

we could talk about for everyone that

11:47

I don't think has been talked about

11:49

yet, that Mayur kind of brings up,

11:51

which is: when do you just buy seats of one of these frontier models? When

11:53

do you maybe build an app on

11:55

top of their APIs that's specific to

11:57

your company? When you're trying to stand

11:59

up AI use cases within your company,

12:01

do you build that custom? Do you

12:04

have some opinions on that idea? I

12:06

want to kind of hear what you

12:08

think and hear what Mayur thinks there.

12:10

Okay, that is actually a pretty good

12:12

question, right? The one that I thought

12:14

about much more deeply is, I will

12:16

make sure, is this the same thing,

12:18

which is, when you're trying to stand

12:20

up AI use cases within your company,

12:22

do you build that custom? Do you

12:24

use a vendor? Open source and non-open

12:26

source, like do you commit to a

12:28

certain model? I think that's one of

12:30

the harder things to figure out right

12:32

now, because that's like one of the

12:34

things that I'm trying to figure out

12:36

all the time across our go-to-market, which

12:38

is when you were trying to stand

12:40

up things in AI, the most important

12:42

thing is to get signal as quick

12:44

as possible, because it's really hard to

12:46

know what the capabilities are of the

12:48

model at scale. And so there's two

12:50

things there. I still think in a

12:52

lot of cases you do have to

12:54

customize it a lot to your needs

12:56

because I think AI works best when

12:58

it replicates workflows that are unique to

13:00

how you do things, right? The best

13:02

AI companies actually are shipping daily because

13:04

they're sitting with design partners looking to

13:06

see how these people work and then

13:08

ship into their needs to integrate AI

13:10

seamlessly into the workflow. And so you

13:12

actually in a lot of cases have

13:14

to build custom; off-the-shelf software is even

13:16

really hard to get signal from. I would

13:18

still always want to start with off-the-shelf

13:20

because I would want to try to

13:22

prove the use case as quick as

13:24

possible versus committing to building something internally.

13:26

The open source one and the closed

13:28

source one is actually a really interesting

13:30

question. So I've always believed open source


13:36

is going to be a big part

13:38

of how companies use AI, based upon the

13:41

use case you want to do. So

13:43

you can imagine you're trying to do

13:45

something in the future and you're using

13:47

some sort of interface and that interface

13:49

basically is able to capture the intent

13:51

like the use case you want to

13:53

do and then in behind the scenes

13:55

it can orchestrate and send to the

13:57

right model dependent upon your needs, right?

13:59

And that does not exist. Like I

14:01

actually tried to invest in some companies

14:03

doing that, but it is actually a

14:05

way harder problem to solve than I

14:07

first thought. So I don't know, Mayur, if

14:09

you have opinions on that, but that

14:11

those would be my like off-the-top opinions.
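A toy sketch of the intent-routing idea, purely to illustrate the shape of the problem: capture the request, classify the intent, dispatch to a model suited to it. The intent-to-model table and the keyword classifier are made up; as Kieran says, a production-grade router is the hard part.

```python
# Toy sketch of intent routing: one interface, a classifier, a dispatch
# table. Every model name and rule here is illustrative.
from dataclasses import dataclass

@dataclass
class Route:
    model: str   # which model to call
    reason: str  # why this model for this intent

ROUTES = {
    "creative": Route("claude-sonnet", "strongest writing voice"),
    "strategy": Route("o3", "best at multi-step planning"),
    "research": Route("deep-research", "browses and cites sources"),
    "private":  Route("local-llama", "data must stay on-prem"),
}

def classify_intent(prompt: str) -> str:
    """Naive keyword classifier; a real router would use a model here."""
    p = prompt.lower()
    if any(w in p for w in ("confidential", "internal", "pii")):
        return "private"
    if any(w in p for w in ("plan", "strategy", "roadmap")):
        return "strategy"
    if any(w in p for w in ("sources", "research", "market size")):
        return "research"
    return "creative"

def dispatch(prompt: str) -> Route:
    route = ROUTES[classify_intent(prompt)]
    print(f"-> routing to {route.model} ({route.reason})")
    return route  # a real system would now call that model's API

dispatch("Draft a strategy to double activation rate in EU markets")
```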

14:13

Yeah, this is a great point. I

14:15

think it's a multi-dimensional framework to be

14:17

honest, because there are so many different

14:19

layers to figure out what your strategy

14:21

should be. On one hand, I think

14:23

it's a crawl, walk, and run. I

14:25

think the lowest hanging fruit for any

14:27

business and any team is: leverage a

14:29

SaaS platform and experiment and prove incrementality,

14:31

and then you figure out the path

14:33

forward. I think the path forward is

14:35

determined I feel based on a couple

14:37

of things. One is that core to

14:39

your core product offering. If it is

14:41

core to your core product offering you

14:43

want to own the IP, you want

14:45

to build it. Yeah. And that is

14:47

core within the product. So that plays

14:49

a key role. Second is, you know,

14:51

the security and confidentiality of that data.

14:53

And if you're playing around with that,

14:55

no-brainer. I think most of the SaaS

14:57

platforms, most of the cloud-based solutions today

14:59

will fulfill most of your needs. And

15:01

if there is a very unique edge

15:03

case, then there's a big question, Kieran,

15:05

I do feel you have to ask

15:07

is, hey, is the cost-benefit analysis strong

15:09

enough where you go down that path

15:11

where you customize or you build an

15:13

adaptive layer on top of it? Or

15:15

is it reasonable enough that it can

15:18

solve 80% of your efficiency, and then

15:20

20% are edge cases. But the local

15:22

instance point that I made earlier, it

15:24

is because there is a certain type

15:26

of data that we just would not

15:28

want to leave our four walls, and

15:30

we felt that, okay, we can create

15:32

a local instance, continue to train it

15:34

over time with all the unstructured data.

15:36

So I think there are three or

15:38

four different layers, and I feel either

15:40

which way you got to start with

15:42

experimenting with what's out there, and absolutely

15:44

see some of the early signals. Yeah,

15:46

I agree. Look, I think to chime

15:48

in to try to help everybody watching

15:50

and listening to today's show, what's gonna

15:52

happen is a lot of companies are

15:54

gonna be like, oh, I'm behind, I

15:56

wanna get into this AI game. And

15:58

they're gonna go buy business seats of

16:00

GPT, of Claude, of Gemini,

16:02

pick a core frontier model. And that

16:04

is a great place to start. Really,

16:06

really good place to start. There's nothing

16:08

wrong with doing that. What you will find

16:10

is, if you have very specific

16:12

use cases, building small applications on

16:15

top of the API is like

16:17

orders of magnitude cheaper. Yeah. Right, like way

16:19

way way way way cheaper. And so if you are very focused

16:21

in what you're trying to do, I think you're

16:23

going to be better off to build custom. If

16:25

you are general use cases trying to figure that

16:28

out, I think probably going and buying those seats

16:30

is probably not a bad place to start. When

16:32

you buy the seats, you get a bunch of

16:34

features and those features continue to improve as they

16:37

roll out and make the product better. But if

16:39

you have like very focused use cases where there's

16:41

a couple of very specific things you want to

16:43

do and use AI for you are far better

16:45

to build off of an API like a basic

16:48

web app or agent because it's going to be

16:50

way way way cheaper and then I suspect

16:52

if you're hosting things locally and you

16:54

have specific use cases or data privacy

16:56

concerns, that's probably where open source is

16:58

going to come in. You're going to

17:00

take an open source model, you're going

17:02

to run it locally, you're probably going

17:04

to pick one that's best for the

17:06

very specific problem or small set of

17:08

problems you have, right? And that's I

17:10

think probably the three sets of choices

17:13

people are making today.
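To make the "focused use case on the API" option concrete, a sketch of a single-purpose internal tool (here, the jargon-removal example mentioned later in the episode), assuming the anthropic SDK; per-token pricing for a narrow task like this typically works out to pennies per run versus a fixed monthly seat.

```python
# Sketch of the "one focused use case on the API" pattern: a tiny internal
# tool instead of a seat for every occasional user. Model id illustrative.
import sys
import anthropic

client = anthropic.Anthropic()

def dejargon(text: str) -> str:
    """Rewrite copy to remove business jargon, one narrow job done well."""
    msg = client.messages.create(
        model="claude-3-5-haiku-latest",  # cheap model; id illustrative
        max_tokens=1000,
        messages=[{
            "role": "user",
            "content": "Rewrite this without business jargon, keeping the "
                       f"meaning and length:\n\n{text}",
        }],
    )
    return msg.content[0].text

if __name__ == "__main__":
    print(dejargon(sys.stdin.read()))
```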

17:15

Right. Yeah. I do have a question for you guys,

17:17

because you of course have been digging

17:19

so deep into AI and talking to

17:21

so many leaders. From an operational

17:23

standpoint, what's being most effective? When AI is

17:26

assessed and adopted and evaluated within

17:28

each team? Or are you saying,

17:30

okay, actually find a person across

17:32

the board, across teams, because look,

17:34

there's so much innovation happening,

17:36

it's also noise, right, because you're

17:39

also running your core business, you're

17:41

trying to hit those KPIs, you're

17:43

trying to hit outcomes, but this

17:46

is all relevant. So operationally, what

17:48

have you seen being most effective

17:50

in terms of experimenting and identifying?

17:53

What are the right tools to play around with, for what

17:55

type of use cases? This is a great question.

17:57

Kipp and I spent a bunch of time on this

17:59

last week. So this is really timely. I

18:01

kind of divided it up into three

18:03

parts in terms of AI adoption within

18:05

the company. So we have these like

18:07

top down goals from the company. And

18:09

it's basically just a stake in the

18:12

ground. It's saying like we believe these

18:14

places across the company should be transformed

18:16

by AI because we know where AI

18:18

capabilities are today. But the thing Kipp and

18:20

I kind of build for is

18:22

we build the use cases that we

18:24

believe AI is going to be transformational

18:26

for in the future, regardless of where

18:28

the model capabilities are today. And what

18:30

we mean by that is, I think

18:32

a year ago we talked about this,

18:34

which is the model capabilities is a

18:36

solved problem. It's time and money. Yeah,

18:38

the model capability, don't worry that it

18:40

cannot do what you want to do

18:42

today. Build the infrastructure and the setup

18:44

to do that thing and the model

18:46

will solve itself. That was 12 months

18:48

ago. If you look what's happened in

18:51

the last five months, that was like

18:53

pretty accurate, right? The model capabilities have

18:55

like been transformative. It's speeding up, not

18:57

slowing down. So I think putting a

18:59

stake in the ground and having large

19:01

bets at the company level and then

19:03

having a large pod that can go

19:05

after those bets. I think, and what I've seen in

19:11

other companies is it should be run

19:13

like a growth project not an IT

19:15

project you have to have a growth

19:17

methodology to how you approach those because

19:19

AI in and of itself is an

19:21

iterative technology it is not a deploy

19:23

the technology like regular software and there

19:25

you go right it is a very

19:27

like learn as you go then the

19:30

second thing is team enablement and then

19:32

the third thing is employee in general

19:34

enablement. So I'll give you my take

19:36

on employee enablement that might not go

19:38

down well with folks right I think

19:40

on the employee side of things, the

19:42

job of the team in the company

19:44

is to provide the tools and the

19:46

kind of permission to go and play

19:48

with AI, integrate AI into your workflow,

19:50

figure out what works, what doesn't work,

19:52

like play around with it, here's the

19:54

tools, here's some like courses that you

19:56

might want to go take, nothing is

19:58

mandated, and if you don't understand that

20:00

this is a paradigm shift in how

20:02

you do work, and you don't want

20:04

to integrate it into your work, and

20:07

you are not curious about it, that's

20:09

on you. It's one of the most important things that you

20:11

can do. The second one is the

20:13

hardest one, actually, and that's the one

20:15

I want to, like, pitch over to

20:17

Kipp, which is, what do you do

20:19

at the team level, right? So should

20:21

teams understand how AI can transform what

20:23

they do, or should there be a

20:25

central AI team that goes into those

20:27

teams, spends time with them, looks at

20:29

their workflows, and it is the AI

20:31

specialist team, and it understands how it

20:33

can transform what they do. And I

20:35

think actually either of those

20:37

could work. I don't know if there's

20:39

a right option. Here's my take, because

20:41

this is a really good discussion, and

20:43

a bunch of marketing leaders and VCs,

20:46

you know, listen to this on RSS,

20:48

and so I'm sure we'll kick off

20:50

some debate. I think you have to

20:52

have both of what Kieran just said. The honest

21:02

truth is, the way for AI to take off in

21:04

a team or within an organization is

21:06

to centralize some valuable unstructured data use

21:08

case. Like the second our product marketing

21:10

leader had a couple projects in Claude

21:12

that was like here's everything we know

21:14

about our persona and you can just

21:16

get feedback from that persona because I've

21:18

taken all the time to train it

21:20

upload all of our decks and docs

21:22

and everything there and I've tailored the

21:25

output and everything for you. She did

21:27

another one about like removing any like

21:29

business jargon and stuff from your copywriting

21:31

like basic things that like everybody can

21:33

use, that is how you get real

21:35

adoption. And so I think the individual

21:37

teams are required to be the experts

21:39

to find and gather the unstructured data,

21:41

put it in the frontier models, Claude,

21:43

ChatGPT, whatever you're using, and make that

21:45

accessible to their team and the broader

21:47

team. I think you then need a

21:49

team of specialists who are experts in

21:51

AI and automation to say, we can

21:53

now automate 90% of localization. This is

21:55

a solved problem. We got DeepL,

21:57

we got a bunch of different tools

21:59

where we can make localization 10 times

22:01

better and we're gonna go run and

22:04

build very specific workflows, custom software, whatever

22:06

it may be to go and solve

22:08

that problem that like your core team

22:10

isn't going to have enough knowledge and

22:12

expertise to go into. We can go

22:14

and train that team and do everything

22:16

we need, but I think both is

22:18

the only possible outcome, Kieran. Yeah. Yeah.
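A sketch of the localization workflow named above, using the DeepL Python SDK (deepl.Translator and translate_text are real SDK calls); the strings, language list, and DEEPL_AUTH_KEY env var are illustrative, and a real pipeline would add glossaries and human review.

```python
# Sketch: batch-translate marketing strings with the DeepL Python SDK.
# Assumes a DEEPL_AUTH_KEY env var; strings and targets are illustrative.
import os
import deepl

translator = deepl.Translator(os.environ["DEEPL_AUTH_KEY"])

STRINGS = {
    "cta": "Start trading in minutes.",
    "security": "Your funds are protected by industry-leading security.",
}
TARGETS = ["DE", "FR", "ES", "IT", "NL"]  # DeepL target language codes

for lang in TARGETS:
    for key, text in STRINGS.items():
        result = translator.translate_text(text, target_lang=lang)
        print(f"{lang} {key}: {result.text}")
```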

22:20

Let me tell you about a great

22:22

podcast. It's called Creators Are Brands. It's

22:24

hosted by Tom Boyd. It's brought to

22:26

you by the HubSpot Podcast Network. Creators

22:28

Are Brands explores how storytellers are building

22:30

brands online, from the mindsets to the

22:32

tactics. They break down what's working so

22:34

you can apply that to your own

22:36

goals. Tom just did a great episode

22:38

about social media growth called 3K to

22:40

45K on Instagram in one year selling

22:43

digital products and quitting his job to

22:45

go full-time creator with Gan and Mayer.

22:47

Listen to Creators Are Brands wherever you

22:49

get your podcast. I totally agree. I

22:51

think it's a hybrid model and I

22:53

was thinking through how are we running

22:55

it and what are the teams that

22:57

are running it successfully? The big outlier

22:59

or the big difference in the scenarios

23:01

is anytime the actual business team, and

23:03

that could be marketing, growth, product, anytime

23:05

they are the ones generating the inertia.

23:07

They are the ones pushing the use

23:09

case. We are seeing better results and

23:11

outcome. Now, of course, in some cases,

23:13

they'll have a dependency on, you know,

23:15

somebody in a global data team or

23:17

engineering to figure out, you know, a

23:20

local instance of something or create customization.

23:22

But anytime they are the ones who

23:24

are driving and pushing for the outcome

23:26

because they are closest to the efficiency

23:28

opportunity in what they are doing, we

23:30

are actually seeing success. Yeah, I think

23:32

that makes a ton of sense. I

23:34

think curiosity has never been more valuable

23:36

than it is today. I was talking

23:38

to a friend about this earlier on.

23:40

Your time has never been more valuable

23:42

than it is today because the opportunity

23:44

to apply it to AI is so

23:46

huge. Like I think about this all

23:48

the time, which is... I have a

23:50

really high bar for what I should

23:52

spend my time on if it's not

23:54

AI. If I'm doing something during the

23:56

day and it's not AI... I'm like,

23:59

why am I doing this? Like it

24:01

has to be a really high bar

24:03

for this to be important. I don't

24:05

think that's healthy, by the way. I'm

24:07

not saying that's what we should do.

24:09

I actually don't think it's healthy because

24:11

it forces me into a bunch of

24:13

like tailspin. Well, the best quote I've

24:15

heard on this is from Dan Shipper,

24:17

founder of Every, friend of the pod.

24:19

He was reviewing OpenAI's o1 models,

24:21

and he was like, these advanced reasoning

24:23

models are a bazooka for the curious.

24:25

You can just, like, take out a whole

24:27

big problem really quickly and learn something

24:29

and I'm somebody who's, for lack of a better word,

24:31

addicted to learning and so like that

24:33

just becomes insatiable and you're right here

24:36

you just start being like, why am

24:38

I doing anything that is like a

24:40

bottom 20% thing ever? Yeah. That

24:42

is so spot on and I almost

24:44

feel it's like social you know when

24:46

social became a thing, whatever, 10 or

24:48

14 years back, you could not

24:50

just assume that you would understand the

24:52

nuance of social and building communities if

24:54

you yourself were not in it. Exactly.

24:56

Yes. You have to live and breathe

24:58

it to then say, oh, I have

25:00

a chance at building a brand that

25:02

is distributed through social and build communities.

25:04

So, believe it or not, of course,

25:06

I followed both of you and a

25:08

lot of your content. Late last

25:10

year, as I was getting into the

25:12

break. I actually dove really deep in

25:15

Claude myself with my financial data, personal

25:17

personal financial data. Me and my wife,

25:19

we always end up overspending and you

25:21

know, like it's a cluster F when

25:23

you actually look at your statements, there

25:25

are like thousands of rows and your

25:27

Excel eventually breaks. So you can't really

25:29

do anything. And then you look at

25:31

all kinds of pie charts that typical

25:33

financial systems give you, but it doesn't

25:35

really tell you anything. So I read

25:37

a lot about Claude and I saw

25:39

one of the post from Kieran where

25:41

you had shared five or six platforms and

25:43

tools you were using. You know, I used

25:45

three or four weeks to get into

25:47

Grok and Claude and Gemini. So what

25:49

I did was I cleaned up all

25:52

my data. I removed all the PII.

25:54

Man, I dumped all the CSV files

25:56

into Claude. And I went crazy asking all

25:58

kinds of questions because why are we

26:00

overspending? Where are we spending? You know,

26:02

and what are the repeatable spends? My

26:04

wife thought that she's overspending on Amazon

26:06

and I corrected that because, you know,

26:08

this clearly said, no, this is how

26:10

much you're spending on Amazon versus, you

26:12

know, all of the restaurants. And we

26:14

were surprised that despite having our own

26:16

car, we were spending so much on

26:18

Uber in a span of three months.
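A sketch of the prep step Mayur describes: strip personally identifying columns from statement CSVs before any upload, and sanity-check spend locally with pandas. Column names are assumptions; match them to your own export.

```python
# Sketch: remove PII from bank-statement CSVs before handing them to a
# model, then answer "where is the money going?" locally with pandas.
# Column names are illustrative.
import pandas as pd

df = pd.read_csv("statements.csv")

# Drop anything personally identifying; keep only what analysis needs.
PII_COLS = ["account_number", "name", "address", "card_last4"]
df = df.drop(columns=[c for c in PII_COLS if c in df.columns])

# Quick local look at repeat spend by month and merchant category.
monthly = (
    df.assign(month=pd.to_datetime(df["date"]).dt.to_period("M"))
      .groupby(["month", "merchant_category"])["amount"]
      .sum()
      .sort_values()
)
print(monthly.tail(10))                          # biggest repeat spends
df.to_csv("statements_clean.csv", index=False)   # safer to upload now
```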

26:20

And then we, so I think the

26:22

only way you unlock AI in your

26:24

professional life is when you're actually living

26:26

and breathing it and finding incrementality and

26:28

finding those use cases in your personal

26:31

life as well. 100%. I think

26:33

personal life is a big one. What

26:35

I do now is everything I do,

26:37

I'll start with AI. And the financial

26:39

ones, that's a pretty interesting one. I

26:41

do something similar where I plugged in

26:43

my portfolio to OpenAI and then

26:45

asked it, how can I diversify this

26:47

more? I've made my first investment two

26:49

months ago in a clean energy fund.

26:51

I had no idea what this clean

26:53

energy fund was, nothing, like it was

26:55

all recommended by AI, broke down by

26:57

AI, added to my portfolio for diversification

26:59

in climate change, and I just said,

27:01

I'm going to like just invest in

27:03

this and see what happens, right? Because

27:05

I'm so committed to it, I'm like,

27:08

literally putting my money into it. What

27:12

I've been fascinated by is how

27:16

the reasoning models have become better with

27:18

strategy. While you're pulling that up, my

27:20

son has like medical stuff and like

27:22

if I get a complex medical report,

27:24

yeah, like I just upload that and

27:26

get like the full real like summary

27:28

of it. Yeah. Before like waiting for

27:30

a specialist is like magic. Exactly. The

27:32

health ones, I have all of my

27:34

health data now in a project. In

27:36

a project, right? Yeah, in one project.

27:38

That's super smart. Just to give an

27:40

example of how incredible AI is, not

27:42

that I was not listening to our

27:44

conversation or was not dialed in, but I

27:47

wanted to set this up because I

27:49

wanted to show you. Keep in mind

27:51

when I'm going through this. I did

27:53

this whilst we were talking on the

27:55

pod. Wow. Okay. And the reason I'm

27:57

saying that is because I think it

27:59

would take a growth team a week

28:01

or so to be able to do

28:03

this. Now, one caveat is this is with

28:05

synthetic data because I can't show real data

28:07

on this show. I've been trying to get

28:10

good at trying to create synthetic data to

28:12

show use cases. The synthetic data is just

28:14

not complex. So what I've given it is

28:16

like a dashboard for a growing SAS company

28:19

at 30 million in ARR that wants to

28:21

double its ARR over the coming years.
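A sketch of generating that kind of synthetic PLG dashboard so prompts can be demoed publicly; every number here is fabricated by construction.

```python
# Sketch: fabricate simple synthetic PLG metrics for a demo dashboard.
# All values are made up; nothing here is real company data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
months = pd.period_range("2024-01", periods=12, freq="M")

df = pd.DataFrame({
    "month": months.astype(str),
    "arr_musd": np.linspace(30, 38, 12) + rng.normal(0, 0.4, 12),
    "churn_rate": np.clip(rng.normal(0.031, 0.003, 12), 0.02, 0.05),
    "free_to_paid": np.clip(rng.normal(0.054, 0.005, 12), 0.03, 0.08),
    "activation_rate": np.clip(rng.normal(0.42, 0.03, 12), 0.3, 0.55),
    "referral_rate": np.clip(rng.normal(0.11, 0.01, 12), 0.05, 0.2),
})
df.to_csv("synthetic_plg_dashboard.csv", index=False)
print(df.head())
```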

28:23

And so the first prompt is, and if you're

28:25

subscribing to the podcast and you're watching this,

28:28

we'll put the sheet for the prompts in

28:30

the YouTube comments. You don't have to try

28:32

to like look at the prompts. But the

28:34

first prompt is basically just saying, okay,

28:36

well, I need to get to that number in

28:39

12 months. Of all the metrics that you

28:41

see here, what are the three to five

28:43

I should focus on? And I'm going to

28:45

skip past this a little bit because it's

28:47

not that interesting because again, it's synthetic data.

28:49

So the data was pretty obvious that it

28:51

picks out churn rate, gives me examples of

28:53

why it picked that out, tactics and tradeoffs,

28:56

picked out ARPU, picked out conversion rate

28:58

free to paid. And this is

29:00

a PLG company. So it has premium

29:02

tier, starter tier, pro tier, enterprise, activation

29:04

rate, and then referral rate.

29:06

Cool. And so then I asked it

29:08

to put together this. And this is

29:11

where it started to get much better.

29:13

Now this is o1. This is not o3.

29:15

The reason it's not o3: I don't think

29:17

that's a European thing. I think this

29:20

is just an everyone thing. I can't

29:22

upload files or even graphics.

29:24

I can't upload anything to o3. So then

29:31

it will take those metrics, right?

29:33

It does its own hypothesis. It

29:35

starts to give you pretty great experiments. When I

29:37

first looked at this, which is just like as

29:40

we were going through things in the podcast, I

29:42

think it's as good as an average growth team,

29:44

if not like a pretty competent growth team, where

29:46

it will give you the kind of experiments you

29:48

can run to improve churn rate. There's one here

29:50

that is actually pretty interesting. So it started to

29:52

do things now that I found it never used

29:54

to do, which is give me something that I

29:57

hadn't thought of. So this one here, early warning

29:59

outreach. It wants us to like basically trigger

30:01

usage signals and so when you see someone

30:03

drop in daily active users you take your

30:05

daily active user cohort and then you trigger

30:07

emails to cohorts where you see the daily

30:10

active usage drop off over time. Yeah, which

30:12

is actually a pretty smart thing to do.
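A sketch of that early-warning experiment: flag accounts whose recent daily-active-usage average has dropped against their baseline and queue a win-back email. The CSV schema, threshold, and send hook are all assumptions.

```python
# Sketch: flag accounts with a DAU drop and queue a re-engagement email.
# Table/column names and the send function are illustrative.
import pandas as pd

usage = pd.read_csv("daily_active_usage.csv")  # account_id, date, active_users

def at_risk_accounts(df: pd.DataFrame, drop_threshold: float = 0.4) -> list[str]:
    """Accounts whose 7-day average DAU fell >threshold vs the prior 28 days."""
    flagged = []
    for account_id, g in df.groupby("account_id"):
        g = g.sort_values("date")
        recent = g["active_users"].tail(7).mean()
        baseline = g["active_users"].tail(35).head(28).mean()
        if baseline > 0 and (baseline - recent) / baseline > drop_threshold:
            flagged.append(account_id)
    return flagged

def queue_reengagement_email(account_id: str) -> None:
    print(f"queueing win-back email for {account_id}")  # hook up your ESP here

for acct in at_risk_accounts(usage):
    queue_reengagement_email(acct)
```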

30:14

One question... yeah, my mind's buzzing, two questions,

30:16

actually. One is: are you also uploading your

30:18

UX and design flow for it to understand

30:21

where the opportunities could be? That's a good

30:23

question. That's one question. Let's start there because

30:25

I'm very, very curious about how to leverage

30:27

this. This was going to be my ending

30:29

point, but I'm glad you brought this

30:32

up, which is: if I actually went through

30:38

this again, to cover your point, and thought

30:43

more deeply about these problems and did another

30:47

version, you'd get better

30:49

results each time. Now, there's been a release

30:51

lately on how DeepSeek and others and these reasoning

30:54

models were managing to get better results and

30:56

one of the key things was just every

30:58

time it thought it was finished giving you

31:00

the answer they would just give it the

31:02

word "wait." Hmm. That's it, "wait." And they're

31:05

doing that to say: think more. And what

31:07

it's doing in the background is it's looking

31:09

through all of the things it's giving you

31:11

and stack ranking them and trying to give

31:13

a better one, a better one, and a better

31:16

one. So that's the reason the reasoning

31:18

models in the background are trying to reason

31:20

out what is the best answer they can

31:22

give you, not too dissimilar from search, but

31:24

like I think somewhat more sophisticated.
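A toy, chat-level approximation of the "wait" trick: when the model finishes, append "Wait" and let it keep reasoning. The published results apply this during decoding rather than across chat turns, so treat this as an illustration only; the OpenAI chat-completions call shape is real, the model id is illustrative.

```python
# Toy approximation of the "wait" trick described above: after each answer,
# send "Wait" to nudge the model to keep reasoning and re-rank its answer.
# Model id is illustrative.
from openai import OpenAI

client = OpenAI()

def answer_with_extra_thought(question: str, extra_rounds: int = 2) -> str:
    messages = [{"role": "user", "content": question}]
    reply = ""
    for _ in range(extra_rounds + 1):
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative
            messages=messages,
        )
        reply = resp.choices[0].message.content
        messages += [
            {"role": "assistant", "content": reply},
            {"role": "user", "content": "Wait"},  # keep thinking; re-check the answer
        ]
    return reply

print(answer_with_extra_thought("Which three metrics most move ARR and why?"))
```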

31:27

And then it's giving me a big swing. But your

31:29

point is really important because I'm pretty sure

31:31

the more context you give it, the better.

31:33

So the way to actually make this incredible

31:35

is you give it your actual real data,

31:38

which for Kraken, I assume, is very complex,

31:40

right? Which would be better because then the

31:42

reasoning model would be able to decipher

31:44

the first question of like really tell me

31:47

the metrics that matter in this complex model,

31:49

that is actually really important. The ones it's

31:51

picked here are like pretty self-explanatory. They're just

31:53

like best-in-class PLG metrics. It would

31:55

be interesting for a business like HubSpot

31:58

or Kraken; it would pick more interesting things.

32:00

The second point is, okay, we'll turn that

32:02

into a growth plan, but actually, here's all

32:04

of the experiments we've run. So you have

32:06

a library of all previous experiments, which again,

32:09

speaks to the fact, the most important thing

32:11

to do for AI to be impactful is

32:13

documentation. Boring thing, but you get everything from it. It's

32:15

really documentation, and you load in all your

32:17

experiments, and then to your point, which is

32:20

the next version I want to do internally,

32:22

which I think is a great idea. We

32:24

do have a ton of wire mapping and

32:26

flows from our customer journey. And so I

32:28

would actually just add all of that into

32:31

the context window as well. And I suspect

32:33

you're going to get a first version of

32:35

a growth plan that needs to be edited,

32:37

but does not in any way need to

32:39

be rebuilt. Yeah. Okay. Here's what I would

32:42

love to do. Man, I would love to

32:44

come back in four weeks because I'm not

32:46

giving you guys much other than some abstract

32:48

shit. But all I have to figure out

32:50

today and I'm going to tell you the

32:53

type of growth questions we are trying to

32:55

answer that any business that is global is

32:57

trying to answer. And I want to bring

32:59

some of those use cases back. But the

33:01

only caveat is I need to figure out

33:04

the local instance because we won't push that

33:06

into the cloud. I think that is too scary

33:08

for me to even think that I'm going

33:10

to do it. Doesn't matter what type of

33:12

enterprise. But I get it. Here are some

33:15

no freaking brainer questions that I can assure

33:17

you: this guy, or any of these platforms,

33:19

will help us get much faster. One is

33:21

in a global company I'm always looking at

33:23

growth rate differences between geographies. Yes. No-brainer, just

33:26

tell me hands down why is activation rate

33:28

stronger here and worse there. Now they're all

33:30

just looking at all the variables all the

33:32

inputs in the activation rate and telling me

33:34

what the differences are. Two we always look

33:37

at okay who's a high value cohort and

33:39

who's a lowest value cohort but trying to

33:41

figure out session analysis to understand, okay, what

33:43

is it the high value cohort doing beyond

33:45

the aha moment? You know, that is driving

33:48

maximized LTV. Now, all that is doing is

33:50

data crunching. It's just trying to figure out

33:52

patterns, which is what AI can do a

33:54

lot faster than a data scientist would, running

33:56

different queries. And then obviously we focus so

33:59

much on payback period because we challenge ourselves

34:01

to have very strong fully loaded payback periods

34:03

and then starting to look at even just

34:05

basic stuff, okay, what is the IRR on

34:07

your CAC? You know stuff where we are

34:10

spending a lot of energy which is rather

34:12

mechanical, or you are searching for nuggets. If

34:14

I can go back and figure out a

34:16

way to dump all my raw data I'll

34:18

come back with some very interesting insights that

34:21

we are generally trying to solve every single

34:23

day, but it's taking us long. Yeah, and

34:25

I'll just give one quick tip there. When

34:27

you get the internal data and you have

34:30

those internal trends, and you can anonymize it

34:32

because I'm going to say use deep research

34:34

here, and you might not be able to

34:36

use that locally, I don't know, but you

34:38

can anonymize it to make you feel better,

34:41

just give it the trends. Pair it with external

34:43

trends in those geographies. That is exactly what

34:45

I was going to say. Yeah, okay, okay.

34:47

It's deep research plus the Similarweb API.

34:49

Yes, yeah, exactly. What happens is you get

34:52

all this data internally in any kind of

34:54

company of scale. If you're a 10 person

34:56

or 100-person or 1,000-person company, you

34:58

get some data and you're trying to contextualize

35:00

it, largely on like anecdotal information. And once

35:03

AI helps you synthesize it and find the

35:05

key trends and timing on those trends, you

35:07

can just basically bring in deep research, the

35:09

Similarweb API, and a couple external sources, and

35:11

like, it will correlate very perfectly for you.
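A sketch of that internal-versus-external pairing: join an anonymized internal monthly series against an external benchmark per geography and check the correlation. The Similarweb fetch is left as a stub since the exact endpoint depends on your plan and key; file schemas are assumptions.

```python
# Sketch: correlate anonymized internal trends with external benchmarks
# per geography. The external fetch is a stub; schemas are illustrative.
import pandas as pd

internal = pd.read_csv("signups_by_geo_month.csv")  # geo, month, signups (anonymized)

def fetch_external_trend(geo: str) -> pd.DataFrame:
    """Stub: replace with a Similarweb / Google Trends pull for `geo`."""
    return pd.read_csv(f"external_trend_{geo}.csv")  # geo, month, category_visits

for geo in internal["geo"].unique():
    ours = internal[internal["geo"] == geo].set_index("month")["signups"]
    market = fetch_external_trend(geo).set_index("month")["category_visits"]
    joined = pd.concat([ours, market], axis=1).dropna()
    print(geo, "correlation:", round(joined.corr().iloc[0, 1], 2))
```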

35:14

But I promise you. You guys have totally changed

35:16

my mindset in 30 minutes. That's what we're

35:18

trying to do. I mean, look, obviously I've

35:20

always had this hurdle to think about how

35:22

am I going to apply it. It is

35:25

always in the back of

35:27

the mind. We are spending so much energy

35:29

right now, we know exactly what questions we're

35:31

trying to answer. We know where we are,

35:33

we know how we can drive growth, but

35:36

the latency in getting those insights is just

35:38

insane right now. What I want to go

35:40

back and do is pick somebody from my

35:42

team to figure out how we're going to

35:44

launch, and then maybe you're right, you know,

35:47

there are elements of anonymized data, which we

35:49

can actually upload and find patterns in. An interesting

35:51

thing is in crypto, market behavior is easily

35:53

capturable because the biggest market driver is the

35:55

global Bitcoin price and we have a lot

35:58

of price indexes. So that can eliminate that

36:00

variable and really understand what else is happening

36:02

in a particular geography different from another one.

36:04

Yeah, I would say you have actually incredible

36:06

external trends as well. Like, I would say

36:09

there's some like really interesting patterns across geographies

36:11

in your external trends because you know it's

36:13

such a vibrant and popular space. Yeah. Yeah.

36:15

I think that's a great place to leave

36:17

it. The last one I'll leave you with

36:20

that I think is like pretty interesting is

36:22

I have all these different project assistants. I

36:24

did one for growth. Now a project assistant

36:26

is basically we have an EA, she's incredible,

36:28

but like to be across every single thing

36:31

you do, that is hard to scale. Is

36:33

it possible? Yeah, I've started building project assistants

36:35

for different projects, and there is a

36:37

good one for growth, which is if you

36:39

have a Google Gem, an OpenAI custom

36:42

GPT, or Claude Projects, you can just literally

36:44

not care about the structure of anything that

36:46

people are giving you as long as it's

36:48

specific to this project. Just say, like, hey,

36:50

put all the things that are related to

36:53

this project, your updates, your experiments, whatever they

36:55

may be, in a single place, it doesn't matter,

36:57

everything in a single folder, right, named after the project,

36:59

which is usually a goal, like increase activation

37:01

rate by 50% in 2025. Everything goes in

37:04

a folder, which goes into the project assistant. It is

37:06

pretty incredible. You can get everything you need

37:08

if you're saying like tell me about the

37:10

three experiments in January that we did the

37:12

one that basically led to the biggest impact

37:15

and why the other two failed. You never

37:17

have to go on this like conundrum of

37:19

Slacks and emails to try to figure out

37:21

stuff. And again, these mundane use cases are

37:24

the ones where there's huge upside because if

37:26

you're managing hundreds of people, all of that

37:28

stuff takes hours and hours of your day.
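A sketch of the project-assistant pattern: load every file in one project folder as context and ask questions against it, assuming the anthropic SDK. Folder name and model id are illustrative, and a large project would need retrieval instead of stuffing everything into the system prompt.

```python
# Sketch: a project assistant built from a single folder of project files.
# Folder name and model id are illustrative.
import pathlib
import anthropic

client = anthropic.Anthropic()

def folder_context(folder: str) -> str:
    """Concatenate every markdown file in the project folder."""
    parts = []
    for path in sorted(pathlib.Path(folder).glob("*.md")):
        parts.append(f"## {path.name}\n{path.read_text()}")
    return "\n\n".join(parts)

context = folder_context("increase_activation_50pct_2025")

msg = client.messages.create(
    model="claude-3-5-sonnet-latest",  # illustrative
    max_tokens=800,
    system="You are the assistant for this project. Answer only from the "
           "project files below.\n\n" + context,
    messages=[{
        "role": "user",
        "content": "Of the three January experiments, which had the biggest "
                   "impact, and why did the other two fail?",
    }],
)
print(msg.content[0].text)
```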

37:30

And that one I have found to be

37:32

incredibly valuable. And the last thing I was

37:35

showing Kipp earlier is like, it can build

37:37

out interfaces dynamically for you in the actual

37:39

console. An example would be,

37:41

anytime I do a

37:43

meeting, I get the

37:46

meeting transcript and I

37:48

said, add all the

37:50

follow -ups into a table.

37:52

And so it will

37:54

keep updating the table

37:57

in the AI instance.

37:59

And I say, remove

38:01

this. And now add

38:03

these ones, I've done

38:05

those. It's just for

38:08

me, you can't share

38:10

it with people easily

38:12

because it doesn't write

38:14

back into a file.

38:17

But it's unbelievable. Like I can't describe

38:19

how much of a game changer these

38:21

assistants are. The key is to attach

38:23

them to a singular project. Yes. And

38:25

that's your repository for the data for

38:27

them. And then they are your assistant for

38:29

a specific project. And then you don't have to

38:31

care about all of this crazy structure. You just

38:34

say, please put all of your updates into this

38:36

folder. You're like, why do I need project

38:38

management software? Yeah, I don't think you do in

38:40

the future. You definitely don't in the future. It's

38:42

kind of wild. Yeah. And actually, even

38:44

if you do, even if you're using it,

38:46

it's not effective. Because as a human brain,

38:48

you're not actually mapping out all the dependencies.

38:50

Yeah, you're not thinking through. And also the

38:52

worst thing, the point you made about, okay,

38:54

what are the last three experiments you did? And

38:57

which one worked and why the one

38:59

that didn't work? The challenge we have

39:01

today in the mechanical operations is we are

39:03

not going back. We are not looking

39:05

at what we learn from those experiments because

39:07

it's too hard. It's too hard. It's

39:10

too hard. It's so hard. Exactly. Yes. So

39:12

it's not only that we are inefficient

39:14

because it's so manual. But then we

39:16

become ineffective because we are not technically applying

39:18

the learning from experiments that were done.

39:20

And imagine if you were not the one

39:22

who did it. Exactly. Then it's even

39:24

harder. Right. Literally, one of the best things

39:26

Kieran, you and I did in the

39:28

last six months was like, we were going

39:30

through planning at the end of last

39:32

year. And we just took some of the

39:34

raw decks and data from just planning.

39:36

And we just put them in Chat

39:38

GPT. And we're just like, what

39:40

are all the dependencies we're not accounting for? What's

39:43

the stuff that's going to go wrong that

39:45

we're not thinking about? And you just

39:47

like, that's historically been really, really hard. And

39:49

we got like, a great readout, we

39:51

changed a couple things, it was awesome. And you're

39:53

like, it took like 20 minutes

39:55

and you moved on. It was great. Yeah. You know,

39:57

with all the content that you guys are sharing, like,

40:00

what's the best way to get the

40:02

portfolio of the platforms or different type

40:04

of use cases? It's like how do

40:06

you feel if the team's now going

40:08

into the second gear, for example, the

40:10

gear one is okay. Hey, we've tasted

40:12

blood. We've seen it works. Now it's about

40:14

expanding the horizons and really pushing the

40:16

limits. Now, do you guys have a

40:18

bit of a gear-two playbook? Like exactly

40:20

what we just discussed. There are a lot

40:23

of names. I was trying to capture some

40:25

of them. And I'll dig in. Any thoughts

40:27

on that? Is that something that you guys

40:29

are helping guiding the industry with? Like going

40:31

from AI foundational use cases to then really scale

40:34

the AI use cases? Is that? Or like what

40:36

model to use for what thing? Is that what

40:38

you're asking? Yes. Yes. It's what model to

40:40

use for what thing? And that's what's tricky. Yeah.

40:42

I think Claude for anything writing. Right. If you're marketing,

40:44

just buy Claude seats for everybody. Yeah. Right now.

40:46

Anything writing is like, Claude; even internal stuff for

40:48

execs and stuff, I still run through Claude. Claude

40:50

is a better creative model than OpenAI and

40:53

Google. Yeah. And a very good coder still, by

40:55

the way. From what everyone tells me, because I

40:57

can't delineate between all three, they all seem pretty

40:59

great at coding, Claude is like the preferred model

41:01

for coding, and people don't know why. I don't know

41:03

if you've seen that, but people don't know why

41:07

it's so good. It's just, like, better.

41:15

We had Scott on, who leads product for Anthropic,

41:17

and he was asking like, why do you love Claude

41:20

so much? I was like, well, it's like, why do

41:22

I like one friend better than the other? I don't

41:24

know. I just like gel better with Clot, right? It

41:26

just gets me, it understands me, it just knows me.

41:28

The o3 OpenAI models are definitely better for strategy.

41:30

And so if you take that growth example, where I

41:32

showed a growth dashboard and took the

41:34

data and then asked for a strategic

41:36

doc, I still find OpenAI, in

41:38

particular the recent models and you can

41:40

see on the benchmarks, it's just better

41:42

at strategy. Google Gemini is not far

41:44

behind and I need to actually get

41:46

deeper into the Gemini 2.0 releases. The

41:48

beauty of Gemini is just the integration

41:50

with Google's platforms. It's huge. Yeah. Yeah.

41:52

That is their key advantage. And I

41:54

think the thing that could still mean

41:57

that they win because just the integration

41:59

into G Drive is you have all

42:01

the context, right? I got a

42:03

rough draft of everybody's performance review

42:05

just from all the decks in G

42:07

Drive and Gemini. It's so much faster,

42:10

so much better. Oh man, yeah. This is

42:12

so amazing, man. I mean, obviously, if you

42:14

think AI at different places, but I'll

42:16

have so much more to share in

42:18

four weeks because I'm taking a lot

42:20

of energy away from it. Sweet, let's do it.

42:23

Yeah, I think this is a great episode because we

42:25

got some real insights into how a company like Kraken,

42:27

and you, are thinking about AI. And then

42:29

just riffing on these things of like what are things

42:31

you're thinking about in the future and it'll be interesting

42:33

for you to come back on and tell us like,

42:35

do you do you find it good? Did you not

42:37

find it good? Has it been impact or not find it

42:39

good? Has it been impactful or not? Because A because AI,

42:41

because AI is such a broad thing, it's such a broad,

42:43

has it been impact or not? Because it has it has

42:45

been impact or not, has it, has it, has it, has

42:48

been, has been, has been, has been, has been, has been,

42:50

has been, has been, has been, has been, has been, has

42:52

been, has been, has been, has been, has been, has been,

42:54

has been, has been, has been, has been, has been, has

42:56

been, has been, has been, has been, has been, been, been,

42:58

been, been, been, been, been, That's the biggest part that comes

43:00

back to that operational question but I think this is

43:02

where leaders have to just take it on

43:04

upon themselves and just get the feel first-

43:06

hand, and I think the key is what

43:09

strategy you use to filter out from

43:11

the clutter, right? Exactly. It is very easy

43:13

to get distracted. Exactly. So that's why your

43:15

point at the end is important look we've

43:18

shared a lot of names a lot of

43:20

platforms but just focus yourself on two

43:22

or three. That's it. You know, and

43:24

those are the big ones and start

43:26

there. Then you can start to become

43:28

niche. Oh, here is this one thing.

43:30

But otherwise, it can get really noisy

43:33

and then you go overboard and

43:35

then you go back to your

43:37

old habits. Yeah, yeah. Yeah, I

43:39

actually think this is the most

43:41

interesting test of how quickly users

43:43

change behavior of all time. Yeah. Because

43:45

to your point, the natural inclination will

43:47

always be just, you know, doing the

43:50

thing I used to do. Well the other thing

43:52

here and you and I have talked about a lot is

43:54

that like I truly believe that like the what

43:56

model is best for what thing question is probably

43:58

not the right question. I think the right

44:00

thing to say is all of these things

44:03

are transformatively powerful and we are

44:05

underusing them. Yeah. And so if

44:07

I just took one, yeah, and

44:10

just obsessed about becoming deeper in

44:12

my adoption, fluency, expertise, and just

44:14

one, I'm probably far better than everybody

44:16

else. Yeah. Right? Because it's like there's

44:18

so much of all of these models

44:20

that we're underusing, because we are flipping

44:22

back and forth between all of them.

44:25

Yes. Yeah. Cool. All right. I think

44:27

this was a great episode. We really

44:29

appreciate you coming on. As always, you're

44:31

an incredible guest. And I think your

44:33

insights here are great for our audience.

44:35

So I appreciate it. And I think

44:37

we just were trying to go through

44:39

the process everybody has been going through.

44:42

Right. And like everybody is kind of living

44:44

in the same world over the last six

44:46

to 12 months. And I hopefully we shared

44:48

some perspective that's helpful. And then Meyer's going

44:50

to come back and give us round two,

44:52

which will be awesome.
