Episode Transcript
0:00
Welcome to the Analytics
0:02
Power Hour. Analytics
0:04
topics covered conversationally and sometimes
0:07
with explicit language. Hey everybody,
0:09
welcome. It's the Analytics Power
0:11
Hour. This is episode
0:13
265. And I think it
0:15
was Socrates who said the
0:18
unexamined life is not worth
0:20
living. And I believe he
0:22
said that right before putting
0:24
on his Oura ring, slipping
0:26
on his Whoop band, and
0:28
jumping into his eight sleep
0:31
bed. One thing for sure,
0:33
though, we've got a lot more
0:35
places to collect data about ourselves
0:37
than we did back in his
0:39
day. And I think it represents
0:42
some interesting possibilities, maybe some challenges.
0:44
So we wanted to talk about
0:46
it. I mean, we're data people.
0:48
So, you know, who better to
0:50
tackle this topic. And Julie Hoyer,
0:53
manager of Analytics at Further, do
0:55
you use any of these tools
0:57
to like measure stuff about yourself?
0:59
Funny enough, I religiously wear an
1:01
Apple Watch and it's collecting things, but
1:04
I couldn't tell you the last
1:06
time I looked at the, you
1:08
know, the dashboard summary data in
1:11
the app, if I'm honest. Nice.
1:13
No, that counts though. That counts.
1:15
So I'm Tim Wilson, head of
1:17
Solutions at facts & feelings.
1:20
Measuring my heart rate?
1:22
I've got my polar H10 heart rate
1:24
monitor on right now because I just want
1:26
to see how excited I get throughout
1:29
this show. So we should run that
1:31
in real time along with the podcast
1:33
to see how excited or unexcited Tim
1:36
is on a topic or how stressed
1:38
out it makes him. And I'm Michael
1:40
Helbling and yeah, I think I've got
1:43
stuff on my phone that measures how
1:45
many steps I take and things like
1:47
that. Okay, but we needed a guest.
1:50
Somebody who could help shed some light
1:52
on this topic and bring this discussion
1:54
to you, our listeners. So we found
1:56
one. Michael Tiffany is the CEO and
1:58
co-founder of Fulcra Dynamics. He was the
2:00
founding CEO, then president of Human,
2:03
a cyber security company. He also
2:05
serves on various boards as well
2:07
as advises startups. And today he
2:10
is our guest. Welcome to the
2:12
show, Michael. It's a pleasure to
2:14
be here. Me and all of
2:17
my connected devices. Nice. Are you
2:19
big? Do you do quite a
2:22
bit of that? Or just, I
2:24
assume because of your company. You
2:26
probably do a lot of testing
2:29
at least. Rockin' an Apple
2:31
Watch. I'm wearing an Oura ring,
2:33
I've got a connected scale, I've
2:36
got an Eight Sleep bed, and I'm
2:38
breathing into this lumen device to
2:40
instrument my metabolism by looking
2:43
at my out-breaths. Here's how weird
2:45
I am. I'm rocking a smart,
2:47
addressable breaker box. So, so among
2:50
other things. I'm measuring, I'm like
2:52
monitoring power to the stove to
2:54
just passively monitor how often I'm
2:57
cooking. Whoa. Yep. That's a whole
2:59
nother level. What's the eight sleep
3:01
bed? What's the eight sleep bed?
3:04
Yeah, I haven't heard of that.
3:06
It's magnificent. It's a bed that
3:08
circulates water through it. Interwoven through the
3:11
entire bed topper are small channels for
3:13
water that run to a refrigeration
3:15
slash heating unit, so the bed
3:18
can either cool you down or
3:20
warm you up, or in my
3:22
case, key for marital bliss, cool
3:25
me down while warming my wife's
3:27
side. Wow. Wow, that sounds nice.
3:29
But it also does a lot
3:32
of measuring of like your sleep
3:34
quality and stuff like that at
3:36
the same time, right? That's exactly
3:39
right. Yeah. And in owning, I
3:41
was an early adopter owning this
3:43
thing feels like owning a Tesla
3:46
where the same hardware has been
3:48
getting better and better with OTA
3:50
updates. So while I bought it
3:53
mostly for that temperature regulation, I've
3:55
seen its sleep monitoring, its measurement
3:57
of my heart rate in the
4:00
night. like just get better and
4:02
better and more accurate, which has
4:04
been a delight. Wow. Nice. Yeah,
4:07
I used to have an If This Then That
4:09
routine running against my scale to
4:11
dump any weigh-ins I did into
4:14
a Google sheet for a long
4:16
time, but that was a long
4:18
time ago. And I think the
4:21
company that made that scale doesn't
4:23
have an API anymore, so. That
4:25
was like my gateway drug, right?
4:28
Okay, nice. If this then that
4:30
scripts to like gather this kind
4:33
of stuff. Yeah, and look at
4:35
me now. Nice. I do diligently
4:37
when I'm traveling, I miss a
4:40
little bit that I don't have
4:42
a scale because I have it
4:44
every morning that is part of
4:47
the routine. Not to the point
4:49
of having a connected scale. I
4:51
actually was given a connected scale
4:54
for Christmas, I think a year
4:56
or so ago, and I'm like,
4:58
I don't think I need that.
5:01
I just take a measurement and
5:03
punch it into my phone while
5:05
my toothbrush is running. Who knows?
5:08
Yeah, whatever works. Okay. Hmm. All
5:10
right. So it's not a competition.
5:12
What is... Well, yeah, I hope
5:15
not, because I'm not winning. Michael
5:17
wins. Yeah. No. Yeah. All right.
5:19
So, so yeah, Michael, what, what
5:22
is the word for this? Like,
5:24
so one of the things that
5:26
gets used a lot as sort
5:29
of self-quantification or self data.
5:31
Yeah. But like... What is sort
5:33
of the holistic term for this
5:36
or what's going on in this
5:38
space? Because obviously there's, even we've
5:40
mentioned a bunch of different companies
5:43
and things like that, but there's
5:45
more, there's many, many more. And
5:47
you can go beyond that to
5:50
like, you know, DNA, like 23
5:52
and me and those kinds of
5:54
things as well. Yes. So, so
5:57
in the early days, I would
5:59
say the pioneering hackers who were
6:01
coming together and sharing tips and
6:04
tricks, were talking about the
6:06
movement as quantified self. And that
6:08
really was in its pioneering phase.
6:11
These days, like I just showed
6:13
you my Oura ring, they've surpassed
6:15
a million sales in North America.
6:18
This is now a popular device,
6:20
not just a niche device. And
6:22
while that has taken off, quantified
6:25
self as a term of art,
6:27
I would say, has actually declined.
6:29
And this is a good thing,
6:32
not a bad thing, because what
6:34
quantified self promises you by, you
6:36
know, just the meaning of the
6:39
words is a bunch of numbers.
6:41
And that's not what people want,
6:44
right? They want insights, they want
6:46
self-knowledge, and they want increasingly connected
6:48
wellness, connected health. And I think
6:51
that captures something important, which is,
6:53
you know, the intention, the goals
6:55
here. It's not really about counting
6:58
steps. It's actually about, you know,
7:00
10 more years of a good
7:02
life. Yeah, so what is, I
7:05
guess, is there a term or
7:07
is there a singular idea or
7:09
vision that everyone says this is
7:12
what we're trying to get to
7:14
is X? Right. If I had
7:16
to pick one, it would be
7:19
connected wellness. And the reason why
7:21
it's those terms in particular is
7:23
that we're in a transition right
7:26
now based on the recognition that
7:28
health care has for many, many
7:30
years really been something more akin
7:33
to sick care. It's about fixing
7:35
you after something is broken. And
7:37
that's not awesome. There are things
7:40
you should be doing right now
7:42
to improve your wellness that mean
7:44
that less things will go wrong.
7:47
So that's the, apart from just,
7:49
you know, branding and marketing, that's
7:51
the true reason why you're seeing
7:54
the word wellness more. It's to
7:56
try to differentiate the, you know,
7:58
proactive pursuit of optimal health versus
8:01
recovery from something going wrong. We're
8:03
doing that in two ways that
8:05
are new, signaled by the word
8:08
connected. One is that we're wearing
8:10
increasingly smart devices that in effect
8:12
make you like a type A
8:15
personality, like make you like a
8:17
really good diarist without you having
8:19
to do any work, right? I
8:22
just step on my scale, I
8:24
don't write anything down, which is
8:26
nice. And so it's connected in
8:29
that sense, the devices somehow probably
8:31
really sending bytes over the wire.
8:33
And then also connected in the
8:36
sense that this data by being
8:38
digitally native is more shareable, with
8:40
a doctor with a loved one,
8:43
maybe even, you know, just shared
8:45
socially because so much about staying
8:47
fit and healthy is, it's like,
8:50
it depends on social engagement and
8:52
like doing it with others. So,
8:54
so if I had to pick
8:57
two words to capture everything that
8:59
seems to be the, the ascendant
9:02
term, it would be connected wellness.
9:04
And who's, this is funny, I
9:06
would think of the early days
9:09
of internet of things where there
9:11
was, there was talk of, if
9:13
you imagine your garage door being
9:16
able to tell you that it's
9:18
got a bearing that needs to
9:20
be greased and it's going to
9:23
go out, right? Sometimes those
9:25
seem kind of forced like I
9:27
don't spend a whole lot of
9:30
time feeling like I need I
9:32
need to preemptively maintain my garage
9:34
door opener like it will break
9:37
every 10 to 15 years but
9:39
when you talk about the health
9:41
side of it, logically, early detection
9:44
slash preventative care makes sense. Is
9:46
the thinking that that is in
9:48
the hands of, I mean,
9:51
the data collection has to, has
9:53
to, is geographically tied to the
9:55
human, but is it something that
9:58
the health care provider will say
10:00
I need I need your historical
10:02
data if you have it or
10:05
like who's Yeah, where does it
10:07
come from? Who's driving that? Here's
10:09
how I'm approaching this in my
10:12
own life, and I found this
10:14
to be transformative. Actually, it goes back
10:16
to Michael's opening observation about Socrates.
10:19
Self-knowledge is incredibly hard. It's actually
10:21
incredibly difficult to achieve extraordinary self-knowledge.
10:23
And so the way it's done,
10:26
the best way to achieve extraordinary
10:28
self-knowledge and insight for the past
10:30
several thousand years, going back to
10:33
Socrates, going back to, you know,
10:35
Vedic religions in India, or even,
10:37
I was just looking at the
10:40
rule of St. Benedict, you know,
10:42
some 1500 years ago, he's writing,
10:44
everyone does the same thing worldwide,
10:47
which is dramatic simplification. You live
10:49
like a monk, like this is
10:51
the point of the monastic
10:54
life, it's to dramatically simplify your
10:56
life, so then you can focus
10:58
and achieve extraordinary self-knowledge and insight.
11:01
And, you know, sometimes you peer
11:03
into the very nature of reality
11:05
as well. I don't want to
11:08
do that. Like, I want to
11:10
have the self-awareness of a monk
11:13
while actually engaging with the world
11:15
like a bon vivant.
11:17
And so the challenge I set
11:20
before me, being a computer nerd,
11:22
is Like, can I use computers
11:24
to help me out in this
11:27
regard? Because computers are infinitely patient
11:29
and honestly they're really good at
11:31
counting stuff. Like, I am, I
11:34
believe that Kung Fu Masters centuries
11:36
ago really could cultivate, you know,
11:38
the ability to like, just be
11:41
constantly aware of their own heart
11:43
rate. And that was probably awesome.
11:45
I'm not going to do that.
11:48
I'm going to put on an
11:50
Apple Watch. So that's sort of
11:52
an empowering view of the world,
11:55
but I would say that something
11:57
must be missing because the people
11:59
donning Apple Watches or Oura rings
12:02
or other kinds of instrumentation are
12:04
augmenting their bodies, they're augmenting their
12:06
lives with breakthrough technology that was
12:09
sci-fi just a decade ago, but I
12:11
don't think we feel like the
12:13
six million dollar man, right? You
12:16
strap this in and you don't
12:18
like just feel magically empowered. So
12:20
what is it like? What's missing?
12:23
I think that siloization is a
12:25
really big limiting factor. And I'll
12:27
give you a health care example,
12:30
and then we'll go back to
12:32
like my connected breaker box. My
12:34
bed has all these awesome instruments,
12:37
right? Like it's measuring my HRV.
12:39
It'll tell me how long I
12:41
spent in deep sleep. But it
12:44
knows nothing about what I did
12:46
the day previously that contributed to
12:48
or ruined a good night's rest.
12:51
I, for instance, learned and other
12:53
Fulcra users have seen the same
12:55
thing, that by getting passive telemetry
12:58
on my eating. So I'm not
13:00
even a big food logger that's
13:02
like a little bit too much
13:05
work for me, but I will
13:07
put on a CGM. So I've
13:09
done multiple experiments wearing a connected
13:12
glucose monitor, a continuous glucose monitor,
13:14
that's just passively recording my blood
13:16
glucose. So therefore it's going to
13:19
see blood sugar spikes when I've
13:21
eaten a bunch of carbs. And
13:23
what do you know? Like a
13:26
few weeks worth of experimentation showed
13:28
that... If I want better specifically
13:31
deep sleep, I should shift my
13:33
carbs if I'm going to eat
13:35
any to the beginning of the
13:38
day. So carbs before noon, I
13:40
sleep well. Carbs after noon, you're starting
13:42
to get into a dangerous zone.
13:45
Like dessert after dinner, forget about
13:47
it, I'm going to have an
13:49
elevated heart rate and I'm going
13:52
to have shortened deep sleep. It
13:54
is impossible to know about that
13:56
causal relationship unless you're somehow tying
13:59
the data that's drawn from the
14:01
CGM with the data that the
14:03
bed knows about. So we've surrounded
14:06
ourselves with these ostensibly smart devices,
14:08
but they're not really smart. They're
14:10
just data-producing devices. The smartness comes
14:13
from a higher level of analysis.
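To make the kind of cross-device analysis Michael is describing concrete, here is a minimal sketch in a Python-notebook style. The CSV file names, column names, spike threshold, and the noon cutoff are all illustrative assumptions for this sketch, not Fulcra's actual schema or any device's real export format.

```python
# Minimal sketch: join CGM readings with per-night sleep data to see whether
# days with post-noon glucose spikes line up with shorter deep sleep.
# File names, columns, and thresholds are illustrative assumptions only.
import pandas as pd

cgm = pd.read_csv("cgm_readings.csv", parse_dates=["timestamp"])     # timestamp, glucose_mg_dl
sleep = pd.read_csv("sleep_sessions.csv", parse_dates=["night_of"])  # night_of, deep_sleep_minutes

# Call any reading 30+ mg/dL above that day's median a "spike" (arbitrary cutoff).
cgm["date"] = cgm["timestamp"].dt.normalize()
daily_median = cgm.groupby("date")["glucose_mg_dl"].transform("median")
cgm["spike"] = cgm["glucose_mg_dl"] >= daily_median + 30

# Did any spike happen after noon? (A rough proxy for afternoon/evening carbs.)
afternoon = cgm["timestamp"].dt.hour >= 12
carbs_after_noon = (cgm["spike"] & afternoon).groupby(cgm["date"]).any().rename("carbs_after_noon")

# Tie the two silos together and compare average deep sleep for the two groups.
merged = sleep.merge(carbs_after_noon, left_on="night_of", right_index=True, how="inner")
print(merged.groupby("carbs_after_noon")["deep_sleep_minutes"].agg(["count", "mean"]))
```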
14:15
And I feel like people like
14:17
me are on the leading edge.
14:20
We're geeking out on our own
14:22
data, doing data science on this
14:24
raw data in a Python notebook,
14:27
which is like too much to
14:29
ask for from maybe, you know,
14:31
the average person, but that's going
14:34
to be within the grasp of
14:36
the average person, to some extent already,
14:38
and to an increasingly large extent,
14:41
because of coding copilots. So people
14:43
who've never written a lick of
14:45
code before are sometimes, you know,
14:48
getting like one-shot outputs of
14:50
functional code from, you know, Claude
14:52
or ChatGPT, that means
14:55
that what used to be really
14:57
esoteric data science skills are becoming
14:59
increasingly within the grasp of ordinary
15:02
people, but only if you've gathered
15:04
and de-siloed the data, hence my
15:06
focus with Fulcra. I think that's
15:09
something that I've been thinking about
15:11
a lot is how, even once
15:13
you have all the data... in
15:16
one spot so that you could
15:18
use it to paint a bigger
15:20
picture, ask more helpful questions about
15:23
your health, how do we determine
15:25
what good looks like? Because what's
15:27
interesting is some of the devices
15:30
seem to be making some of
15:32
that decision and determining like what
15:34
is a good range of these
15:37
metrics. Other ones don't. They truly
15:39
do just collect the data. So
15:42
it's interesting to think when you
15:44
start to connect those things and
15:46
tie them together. Kind of back
15:49
to Tim's question, does it become
15:51
a place where that is baked
15:53
in so an individual can go
15:56
out and ask these questions and get those
15:58
types of answers or is it
16:00
more so that the value is
16:03
it's all together and you could
16:05
take it to a professional to
16:07
help tell you what does this
16:10
mean? Is this bad? And based
16:12
on that like what do I
16:14
do about it? Yeah. I'm thinking
16:17
about I'm thinking about changing the
16:19
world in this order once you
16:21
have the self-knowledge that I'm describing
16:24
then you also have new ways
16:26
of sharing how it's going in
16:28
your life with another person, which
16:31
could be a doctor, but could
16:33
just be a spouse, could be
16:35
a group of friends. So everything
16:38
starts with solving the observability problem.
16:40
Like, I think it's too hard
16:42
to get help because it takes
16:45
so much effort to just describe
16:47
to anyone else, like, this is
16:49
what's going on with me. Like,
16:52
this is how I've slept the
16:54
last week or, you know, this
16:56
is what's stressing me out. This is what you
16:59
can think of as the human
17:01
equivalent of what we call in
17:03
Devops like observability, right? So the
17:06
instrumentation, these connected devices, they're solving
17:08
the observability problem. Then there's like
17:10
this analysis problem, which we just
17:13
sketched. And then finally, there's new
17:15
forms of sharing. And I'm like
17:17
really excited about that. Like I
17:20
want to know how. Like my
17:22
friends are sleeping in general, right?
17:24
Like how is it going with
17:27
people that I love but now
17:29
live distant from me? And also
17:31
what's normal. So what I'm hoping
17:34
is by reducing the friction and
17:36
the risk of sharing personal observability
17:38
data like this, but by making
17:41
it secure and controllable, then we'll
17:43
also be able to pull this
17:45
data to find out what's normal
17:48
across larger groups, so you can
17:50
kind of compare yourself to averages.
17:52
Right now it's like really hard
17:55
to tell. Am I a weirdo?
17:57
And I think the internet is
18:00
sort of good at solving those
18:02
problems if you can build the
18:04
bridge between the data collection and
18:07
the kind of social sharing that
18:09
you want to do. I've got
18:11
anxiety now, as it is, with,
18:14
I mean with Strava or I
18:16
mean I had Fitbit before or
18:18
Apple, I mean, I mean there
18:21
does feel like a broad parallel
18:23
that is not encouraging which is
18:25
move away from us measuring ourselves
18:28
and just kind of the world
18:30
of digital where at a corporate
18:32
level there is this obsession with
18:35
let's gather everything we can. I
18:37
mean the 360 degree view of
18:39
the customer, taken to an extreme
18:42
would be a marketer knows how
18:44
often you're cooking so they can
18:46
you know, make easy-cook
18:49
meals available to you or something
18:51
yeah right I mean there's the
18:53
there's the nefarious which I feel
18:56
like insurance and government we should
18:58
get into as well right but
19:00
just the idea I mean there
19:03
there have to be people listening
19:05
and because I'm experiencing it a
19:07
little bit myself like Oh my
19:10
god, like sharing, comparing, like don't
19:12
we have, don't we have a
19:14
challenge with our youth just from
19:17
the crude form of TikTok and
19:19
Instagram comparing themselves and it's not
19:21
good for their mental health. So
19:24
it's like this, gather all this
19:26
data first, hope the analysis happens
19:28
and then we're creating community, is
19:31
there a dark side or downside
19:33
to that
19:35
we need to figure out. I
19:38
think so. I think there's there's
19:40
extraordinary benefit and an extraordinary risk
19:42
in that. That's why an
19:45
entrepreneur most known for starting cybersecurity
19:47
companies, that's why I've waded
19:49
into this. Our design with Fulcra
19:52
importantly starts with who we're working
19:54
for and how we make our
19:56
money. When you create an account
19:59
with Fulcra, your data belongs to
20:01
you. You're not sharing it with us;
20:03
you're sharing it with your future self. And our
20:06
revenue model is asking for money
20:08
for that service and we need
20:11
to re-earn our customers' trust every
20:13
day and if we lose that
20:15
trust then they will stop paying
20:18
us money and we will be
20:20
very sad. So I think that
20:22
being a force for personal data
20:25
sovereignty in this way is something
20:27
you have to choose to do
20:29
at the foundation of your company
20:32
and build into your DNA. I
20:34
think that if you are an
20:36
ad-funded company, even if you are
20:39
a multi-billion dollar or multi-trillion dollar
20:41
or ad-driven company, you cannot just
20:43
decide to, like, pivot into a
20:46
new line of business where customer
20:48
data belongs to the customers and
20:50
is, you know, encrypted in motion
20:53
and at rest and like is
20:55
just designed for whatever the customer
20:57
wants to do with it and
21:00
nobody else. The control that people
21:02
have over their own data I
21:04
think is actually going to be
21:07
of increasing importance as AI agents
21:09
become an increasingly important part of
21:11
the future. Because as we can
21:14
see over the last especially two
21:16
years of rapid improvement in generative
21:18
AI, it's going to be very
21:21
hard to control AI models by
21:23
trying to put a cap on
21:25
their capabilities. I just I don't
21:28
even see how that's going to
21:30
work. I don't think we can
21:32
say AI can only have an
21:35
IQ of 140. No higher. Like
21:37
that's just not going to work.
21:39
So how are ordinary people going
21:42
to have any control over an
21:44
agent that they're asking for help? So you
21:46
want to get help from a
21:49
helpful AI assistant. How are you
21:51
going to be able to accept
21:53
that help, share enough data with
21:56
that agent that you can get
21:58
some help, but make it like
22:00
a two-way door, make it a
22:03
revocable commitment? And I think there's
22:05
only one way to do that.
22:07
And that's to control access to
22:10
your own data. So you can
22:12
grant it to an assistant. And
22:14
you say, sure, you can read
22:17
my health data. But you can't
22:19
copy it. And if I change
22:22
my mind for any reason or
22:24
no reason at all, I get
22:26
to turn that off. If instead
22:29
all of our data is going
22:31
to live with some large tech
22:33
provider that's also running the models,
22:36
if the only way you get
22:38
the help is by like uploading
22:40
all of your data in a
22:43
one-step process, you've completely lost control.
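As a thought experiment, here is a toy sketch of the "two-way door" Michael describes: the assistant gets a grant it must present on every read, and the owner can revoke that grant at any time. This is illustrative only, not Fulcra's or any other vendor's actual API.

```python
# Toy sketch of a revocable, read-only grant: the assistant gets a handle it must
# present on every read, and the owner can switch that handle off at any time.
# Purely illustrative; not Fulcra's (or anyone's) actual API.
import uuid

class PersonalDataStore:
    def __init__(self):
        self._streams = {"heart_rate": [], "sleep": []}
        self._grants = {}  # grant_id -> set of stream names the holder may read

    def add(self, stream, sample):
        self._streams[stream].append(sample)

    def grant_read(self, streams):
        grant_id = str(uuid.uuid4())
        self._grants[grant_id] = set(streams)
        return grant_id

    def revoke(self, grant_id):
        # Change your mind for any reason, or no reason at all.
        self._grants.pop(grant_id, None)

    def read(self, grant_id, stream):
        if stream not in self._grants.get(grant_id, set()):
            raise PermissionError(f"grant no longer covers '{stream}'")
        return list(self._streams[stream])  # a read, not a hand-over of the store

# Usage: grant an assistant heart-rate access, then take it back.
store = PersonalDataStore()
store.add("heart_rate", {"t": "2025-01-01T07:00:00", "bpm": 58})
grant = store.grant_read(["heart_rate"])
print(store.read(grant, "heart_rate"))   # works while the grant stands
store.revoke(grant)
# store.read(grant, "heart_rate")        # would now raise PermissionError
```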
22:45
And that's like not the future
22:47
that I want to bring about.
22:50
So what we're trying to do
22:52
here is, you know, empower people
22:54
as I said with self knowledge
22:57
but it's even more broadly building
22:59
an important force for personal data
23:01
sovereignty so that we can have
23:04
the benefits of AI but put
23:06
people in control. It's interesting too
23:08
with being health data. I think
23:10
it brings a very different awareness
23:12
to the world of AI and
23:14
the sharing of data and your
23:16
own data that I think people
23:18
it's very different than today right
23:20
like some people think like Oh,
23:22
you care about what I clicked
23:24
on, what ads I saw, right?
23:26
Like, it's your data, but it
23:28
feels really different when you start
23:30
to talk about, like, your personal
23:32
health metrics. And so it's, I'm
23:34
really happy to hear you talk
23:36
about it that way, and it's
23:38
really helpful to hear you talk
23:40
about that way for even just
23:42
my understanding of, like, what could
23:44
this look like, what should this
23:46
look like ethically in the future,
23:48
but I really hope that it
23:50
kind of sparks that light bulb
23:52
for other people of, like, when
23:54
we're talking about your data and
23:56
privacy and the importance of it
23:58
and how it interacts with AI,
24:00
like, yeah, thinking about it the
24:02
way you think about your personal
24:04
health data for all your other
24:06
data, I don't know, it really
24:08
sparked some clarity for me. But
24:10
it also highlights the gap we
24:12
have in the United States around
24:14
data ownership and data rights as
24:16
a person because there's not laws
24:18
in the US about if you
24:20
give that data to somebody else
24:22
what they can use it for.
24:24
And so health data can be
24:26
predictive of many different things potentially
24:29
so just like you know the
24:31
car insurance companies want you to
24:33
take the little thing and plug
24:35
it in to track all your
24:37
movements to save you money but
24:39
in reality right it's helping them
24:41
create better predictive models for what
24:43
the likelihood you're going to get
24:45
in an accident is and what
24:47
the risk you have to them
24:49
as an insured person is and
24:51
so in the same way like
24:53
where that data goes and so
24:55
like even if you take your
24:57
data and I think this came
24:59
up with 23andMe because
25:01
I think they were contemplating selling
25:03
the company to somewhere else and
25:05
it's like well what what happens
25:07
to all that data if someone
25:09
else comes and buys that company
25:11
what are they allowed to do
25:13
with that data if they acquire
25:15
it, right? Yeah, what happens when Meta
25:17
buys Fulcra? Well, I mean, and
25:19
so like that's a legitimate concern
25:21
because there's no underlying regulatory structure
25:23
that says what someone who comes
25:25
along and buys a company like
25:27
that can do or not do
25:29
with what they quote unquote
25:31
own now. Yeah, right. I love,
25:33
I love this kind of thinking.
25:35
And I think that when you
25:37
dig into privacy by design at
25:39
many companies, you find that there
25:41
is this end state where people
25:43
just say, well, we never do
25:45
that. And like that is not an
25:47
adequate answer because you cannot guarantee
25:49
that you will always have your
25:51
hands on the wheel. So in
25:53
fact, I would encourage anyone listening
25:55
as they're thinking through what privacy
25:57
by design at, you
25:59
know, an Olympic level really looks
26:01
like, you have to show how
26:03
you are preserving privacy, even if,
26:05
you know, Ultra Super Mega Corp
26:07
acquires your company. You actually need
26:09
to limit the powers you have
26:11
as a business operator to mess
26:13
with people's data and inspect their
26:15
data, right? So that even under
26:17
the conditions where you're acquired by
26:19
a company that doesn't share your
26:21
values, they can't just, you know,
26:23
like, switch on the... the data
26:25
vacuum mode and undo all of
26:27
your work. And there are absolutely,
26:29
this is not just me thinking
26:31
about this happily, there are good
26:33
patterns of privacy by design that
26:35
are built to operate at that
26:37
high level. And I think that's
26:39
absolutely the level that literally every
26:41
company should aspire to. But there's
26:43
the, there's having, there's following all
26:45
the principles of privacy by design
26:47
and then putting something in place
26:49
and then there is also the
26:51
I mean you sort of said
26:53
it earlier there needs to be
26:55
a trust that somebody's going to
26:57
provide their data and explaining there
26:59
still winds up being you know
27:01
truck to the masses to those
27:03
million people with an or a
27:05
ring right if you say I
27:07
mean I would guess that most
27:09
of them are saying I don't
27:11
really care, I'm not giving it a
27:14
whole lot of thought, take my
27:16
data but if you're going to
27:18
300 million people and the truly
27:20
paranoid fringe, the, and we're in
27:22
a very weird little subset of
27:24
four people here who are happy
27:26
to spend an hour talking and
27:28
thinking about this, and we're not
27:30
remotely scratching the surface of what's
27:32
actually going on in a design
27:34
to make that happen. So actually
27:36
convincing, you know, Joe Smith, that
27:38
No, this really is okay. And
27:40
maybe this becomes just a societal
27:42
breakdown thing. They're like, says who?
27:44
My cousin Vinny said, you're going
27:46
to use this for nefarious purposes.
27:48
And no amount of rationalization will
27:50
change their mind. Right. So to
27:52
me, this is a dimension of
27:54
business design. I'm a business nerd.
27:56
And an observation that I've had
27:58
is that whatever a company says
28:00
its mission is, if the execution
28:02
on the mission is not exactly
28:04
what earns them money, that's not
28:06
the mission. The revenue is the
28:08
mission. Over time, if these two
28:10
things are not in alignment, I'll
28:12
tell you which one wins. It's
28:14
the one that increases earnings. So
28:16
you can just know that and
28:18
then you can consider that a
28:20
constraint of business design and then
28:22
construct a revenue model that is
28:24
truly consistent and in fact even
28:26
supports your mission. That's one of
28:28
the things that I'm most proud
28:30
of with the magnificent success of
28:32
HUMAN, a cybersecurity company that fights cybercrime at
28:34
scale, goes after the profit centers
28:36
of cybercrime. Importantly, it doesn't have to
28:38
sell to the CSO. It's not
28:40
just another layer of protection. If
28:42
you're in the business of fraud
28:44
detection, you actually reduce losses due
28:46
to fraud. And so the reason
28:48
why you get paid is that
28:50
you charge less than the savings,
28:52
right? So then every single customer
28:54
knows exactly why they're paying you
28:56
and the incentives of that company
28:58
are such that human makes the
29:00
most money by going after the
29:02
biggest source of cyber criminal profit,
29:04
which therefore means that it is
29:06
designed to have the biggest possible
29:08
positive effect on the world, which
29:10
is super cool. Here, with Fulcra,
29:12
here's the way I see this
29:14
playing out. Lots of people. So
29:16
consider the universe of like everywhere
29:18
there's data about you, right? Everything
29:20
you use that generates some data.
29:22
You know, Facebook knows some stuff
29:24
about you and Apple knows some
29:26
stuff about you and maybe, you
29:28
know, the Oura ring has
29:30
a little bit of data. And
29:32
I don't think you need to,
29:34
I don't think you need to
29:36
go around, like, to all of
29:38
that. But if you and only
29:40
you have the superset, if you
29:42
have all your data from every
29:44
single one of those sources, then
29:46
you're the only one who has
29:48
the complete picture. And you could
29:50
decide to then invoke some right
29:52
to be forgotten, or you ask
29:54
for all your data to be
29:57
deleted, and then you'll truly
29:59
have the only copy. But I
30:01
think it's good enough that you
30:03
are the master of the complete
30:05
set? Because that'll alter incentives going
30:07
forward where some people who just
30:09
already have some sliver of data
30:11
about you, they don't have to
30:13
ask permission. They've already got it.
30:15
But if they want to have
30:17
access to the full picture to
30:19
provide a better service or whatever,
30:21
they have to ask you. And
30:23
to a great extent, I think
30:25
that's winning. Right? Like if individuals
30:27
are just in charge and get
30:29
to say yes or no, if
30:31
they're asked at all, that would
30:33
be pretty great. Right now, in
30:35
real time bidding for like most
30:37
of the ads that are getting
30:39
served to you, even though you've
30:41
had to answer a bunch of
30:43
nonsense cookie consent pop-ups, like no
30:45
one's really asking your permission for
30:47
doing, you know, some kind of
30:49
cookie or pixel sync that is
30:51
connected to some email newsletter that
30:53
you signed up for, that they're
30:55
using to figure out how many
30:57
people are in your household and
30:59
what your income is, right? You
31:01
had, you were just not involved
31:03
in any of that. And like,
31:05
that's the little turn that I
31:07
want to just make on society.
31:09
And we could do that through
31:11
lawmaking, we try to force people
31:13
to ask for your consent. But
31:15
I think what's even better is
31:17
to reward them, to rationally motivate
31:19
them to deal with you, like
31:21
they get better data. And we'll
31:23
deliver you a better experience. So
31:25
like they'll do it if it's
31:27
in their best interest. And I
31:29
think that happens when people are
31:31
in control of like the super
31:33
corpus. You bring up a point
31:35
that I actually would love to
31:37
kind of circle back on because
31:39
it goes in two areas we've
31:41
talked about. One, I do feel
31:43
like if there was clarity, so
31:45
say you were the owner of
31:47
all your data, I feel like
31:49
the only way to get people
31:51
to share their data openly, like
31:53
on a large scale with companies,
31:55
is if those companies could tell
31:57
us as individuals like, We would
31:59
love this type of data from
32:01
you because then we could answer
32:03
these types of questions. Here's the
32:05
benefit. Like that value tradeoff they
32:07
talk about, like, if you're allowing
32:09
cookies, right? Like what does it
32:11
get you in return? Why should
32:13
you share this with this company?
32:15
But what's really interesting is we
32:17
know that one doesn't happen. I
32:19
think it would be amazing if
32:21
it could, but because we know
32:23
that people aren't starting with a
32:25
question in mind always. There is
32:27
still the obsession that we talk
32:29
about a lot on the show
32:31
that companies have of like just
32:33
collect all the data and I
32:35
do feel like it goes into
32:37
the, you know, connected health conversation
32:39
we're having of people think if
32:42
I have all the data on
32:44
myself, then I'll be able to
32:46
answer all these amazing questions. I
32:48
don't know what questions I'm exactly
32:50
going to ask, but if I
32:52
have all the data, I'll be
32:54
able to. And then you get
32:56
into the reality of a lot
32:58
of these questions. You can't answer
33:00
or you're answering them with data
33:02
that you inherently realize has biases
33:04
or errors in it. So then
33:06
it kind of takes you down
33:08
the path too of like, there's
33:10
a whole area of the industry
33:12
that spun up then to collect
33:14
more and better data, but we're
33:16
still probably going to miss the
33:18
piece of, like, what's the motivation
33:20
of collecting all this data? Like
33:22
what do companies want to ask
33:24
and use it for? What do
33:26
you yourself want to ask? What's
33:28
a helpful question to ask? What
33:30
should you be collecting data to
33:32
then get out of it? So
33:34
I know there's kind of like
33:36
a lot of branches we could
33:38
take off that, but it's just
33:40
been interesting hearing the last couple
33:42
points you've made. I'll throw this
33:44
out there is like a concrete
33:46
prediction of the future. I think
33:48
the way this plays out is
33:50
that there's like... too much data
33:52
for a human to sort through.
33:54
There are too many potential use
33:56
cases for it all. But it
33:58
really does seem to me like
34:00
we're headed to a place where
34:02
helpful AI assistants are within everyone's
34:04
grasp. So what I think will
34:06
happen is you will have a
34:08
kind of concierge agent that
34:10
only works for you, that has
34:12
trusted access to your data. And
34:14
it intermediates with other companies agents,
34:16
and essentially negotiates on your behalf.
34:18
So instead of you having to
34:20
deal with a whole bunch of
34:22
questions about consent and like individual
34:24
offers, there's just going to be
34:26
too much to sort through. But you'll
34:28
be able to delegate it to
34:30
your agent and just be like,
34:32
show me the two marketing offers
34:34
that you think are really going
34:36
to land with me. That's how
34:38
much human attention I actually have.
34:40
And so your agent might be
34:42
dealing with like countless kinds of
34:44
unsolicited offers or ideas and is
34:46
providing the curation layer based on
34:48
knowing you. And then in a
34:50
rules-based way, can share like
34:52
the little subsets of data that
34:54
are going to be able to
34:56
activate those offers or, you know,
34:58
make them work. I see. If
35:00
I'm right, that means that agent
35:02
to agent communication is going to
35:04
be the majority of internet traffic
35:06
within like 10 years. That's kind
35:08
of scary to think they could
35:10
be talking on your behalf in
35:12
the background and then that becomes
35:14
its whole own black box. Like
35:16
it's cool, but it kind of
35:18
scares me too. Just as long
35:20
as they open the pod bay
35:22
doors. Well, but I mean back
35:25
to that, I think there is
35:27
this, and I know I've run
35:29
through it when I've had a
35:31
fitbit, which I don't know, I've
35:33
probably gone through six Fitbits
35:35
over, I don't know how many
35:37
years. And when I switched to
35:39
an Apple Watch, there was, I
35:41
genuinely felt a, oh my God,
35:43
I'm like losing all this historical
35:45
data. And I draw that parallel
35:47
to the business world. The reality
35:49
is, is I bet when somebody
35:51
has a heart issue, they get
35:53
sent home with a heart
35:55
monitor and they say, let me
35:57
collect a couple of, let me
35:59
collect a couple of weeks, wear
36:01
this for a month. So as
36:03
you're talking AI agents, my brain
36:05
went off on a, I want
36:07
to lose weight, or I want
36:09
to sleep better, or I want
36:11
to do X or Y. Here's
36:13
all the data that I'm already
36:15
collecting in an aggregated way. Here's
36:17
what's already there. What can you
36:19
do with that? have the agent
36:21
tell me, you know what, you
36:23
should put a CGM on for
36:25
a while, but not turn into
36:27
this. There's nothing in my entire
36:29
history of working with Analytics that
36:31
makes me think that anyone is
36:33
going to be good at saying,
36:35
collect this data for a while
36:37
for a specific purpose because there's,
36:39
well, just in case, imagine the
36:41
next time you ask if you've
36:43
already been collecting that, then you
36:45
don't need to collect it for
36:47
another two weeks. So I, the
36:49
exchange you just, you two just
36:51
had, had me thinking like, is
36:53
there a data, because that's one
36:55
of the privacy by design principles
36:57
is around like, collect a minimal
36:59
amount of data. So where does,
37:01
where does that fit into it
37:03
that don't collect it just in
37:05
case you need it, collect it
37:07
once you know what you need,
37:09
but this nebulous, get everything, and
37:11
then we'll have the most to
37:13
work with. Some of it's not
37:15
going to ever, you know, matter,
37:17
right, or not matter enough to
37:19
make it worth it. And in
37:21
the name of prevention, it's kind
37:23
of hard to make that case.
37:25
That's right. So in the longevity
37:27
context, I think if you ever
37:29
want to train an AI on
37:31
yourself, you kind of want to
37:33
have as much data as you
37:35
can possibly afford to have. So
37:37
the things get different when you
37:39
think about data retention, when you're
37:41
thinking about it for your own
37:43
purpose versus regulating businesses for their
37:45
commercial services. Like one of the
37:47
reasons why I honestly felt
37:49
compelled to create Fulcra was
37:51
because of the data lossage
37:53
that you just talked about. Like
37:55
the fact is that like Geocities
37:57
died, right? Like it turns out
37:59
the internet isn't forever, like data
38:01
will just completely go away. And
38:03
you've got a host of options
38:05
for saving files, you know, Dropbox,
38:07
Google Drive, Apple's. There's no
38:10
streaming data store for consumers. There's
38:12
no like Kafka for people. So
38:14
for data like your location history,
38:16
your calendars, any biometric, right, like
38:18
my heart rate just keeps happening,
38:20
thank goodness. So it's not a
38:22
file, right? It'll never be a
38:24
file. It is a stream. So
38:26
I need a streaming data store
38:28
for it. And there literally were
38:30
no options. So we had to
38:32
write one ourselves. And the way
38:34
I see this being brought to
38:36
bear over time is that all
38:38
of these data streams that I
38:40
have pouring into my Fulcra data
38:42
store are capturing how I live
38:44
and what's
38:46
going on with me. Situational awareness
38:48
is one of the things you
38:50
need to give to a potential
38:52
assistant so they can actually be
38:54
helpful. Right now we're all experimenting
38:56
with chat bots where you have
38:58
to initiate every conversation and that's
39:00
really limiting. I want to live
39:02
in a world that's more like
39:04
what you just described, Tim, where
39:06
some external source of intelligence points
39:08
out what I'm missing, like tells
39:10
me about a thing I wouldn't
39:12
have thought of, and is like,
39:14
dude, you need to put on
39:16
a CGM for a couple of
39:18
weeks. It's not going to be
39:20
forever. We just need to sort
39:22
of sample this diet of yours
39:24
and see what is up. I
39:26
want to leave this data corpus
39:28
behind for my heirs. And to
39:30
make all of this data
39:32
unambiguously mine and unambiguously inheritable, I
39:34
need to collect it before I
39:36
die. Like my kids are not
39:38
going to be writing to Amazon
39:40
or whatever and being like, please
39:42
let us export the data. Like
39:44
it's over by then. You need
39:46
to have it. It needs to
39:48
unambiguously be yours before the event.
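Here is a minimal sketch of the personal, append-only streaming data store idea from a moment ago (the "Kafka for people" point): timestamped samples keep arriving and land in a local database the owner controls. The SQLite schema and stream names are assumptions for illustration, not how Fulcra actually stores anything.

```python
# Minimal sketch of a personal, append-only stream store (the "Kafka for people"
# idea): biometrics arrive as timestamped samples, not files, and land in a local
# database the owner controls. Schema and stream names are illustrative only.
import json
import sqlite3
import time

db = sqlite3.connect("my_streams.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS samples (
        stream  TEXT NOT NULL,   -- e.g. 'heart_rate', 'location', 'sleep'
        ts_unix REAL NOT NULL,   -- when the sample happened
        payload TEXT NOT NULL    -- the reading itself, stored as JSON
    )
""")

def append(stream, payload, ts=None):
    """Append one sample to a stream; the store only ever grows."""
    db.execute("INSERT INTO samples VALUES (?, ?, ?)",
               (stream, ts if ts is not None else time.time(), json.dumps(payload)))
    db.commit()

def read_range(stream, start_unix, end_unix):
    """Read a time slice of one stream back out, oldest first."""
    rows = db.execute("SELECT ts_unix, payload FROM samples "
                      "WHERE stream = ? AND ts_unix BETWEEN ? AND ? ORDER BY ts_unix",
                      (stream, start_unix, end_unix))
    return [(ts, json.loads(p)) for ts, p in rows]

# The heart rate "just keeps happening," so it is appended as it arrives.
append("heart_rate", {"bpm": 61})
append("sleep", {"stage": "deep", "minutes": 14})
print(read_range("heart_rate", 0, time.time()))
```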
39:50
And what is all this data
39:52
add up to? It adds up
39:54
to how I lived. It adds
39:56
up to who I did the
39:58
living with, you know, like you're
40:00
going to be able to in
40:02
some cases probably recreate my tone
40:04
by transcribing this podcast and, you
40:06
know, feeding it into ElevenLabs
40:08
and capturing my voice and you'll
40:10
capture some of my like vocal
40:12
intonations, but none of this tells
40:14
you about all that tacit stuff,
40:16
the person, all the procedural knowledge,
40:18
right? So an AI model that's
40:20
trained on me that lives on
40:22
after me is a model that
40:24
I hope will, you know, bake
40:26
cookies with my great-grandchildren. I'm extremely
40:28
proud of my almond flour chocolate
40:30
chip cookie recipe and it's not
40:32
just about the ingredient list, it's
40:34
about how I do it, right?
40:36
So you should be able to
40:38
like walk into the kitchen in
40:40
the future and, you know, boot
40:42
up GrandPappy Michael and... And we're
40:44
going to bake cookies together. This
40:46
is going to be great. But
40:48
only in the first part of
40:50
the day, not later. Yeah, that's
40:53
right. That's right. Yes, no compensation.
40:55
I was thinking there would also
40:57
be an agent saying, you have
40:59
not asked, hey, I'm grandpapi, Michael,
41:01
and you haven't asked me to
41:03
make cookies with you in a
41:05
while. Like, oh my God. Oh,
41:07
that's a little too on the
41:09
nose. Don't you want to connect
41:11
with your ancestry? guilt tripping beyond
41:13
the grave yeah you never call
41:15
yeah I mean there's there's something
41:17
that says, like, you could
41:19
always be doing better day-to-day. Like,
41:21
there is a bit of
41:23
a bleak, you know, hey, do
41:25
you really want that next one? I
41:27
know you made the cookies that's
41:29
good, but really, do you
41:31
need the third one? We've been
41:33
monitoring you and I don't know
41:35
I mean it's yeah it is
41:37
interesting because obviously this vision of
41:39
the world creates and it kind
41:41
of brings to life some very
41:43
interesting possibilities kind of like you've
41:45
been talking about Michael and then
41:47
some concerns as well and so
41:49
it'll be very interesting to sort
41:51
of see how this progresses and
41:53
the one thing unfortunately we can't
41:55
progress with further we do have
41:57
to start to wrap up because
41:59
we're really out of time.
42:01
but it's this is pretty fascinating
42:03
and at the same time sort
42:05
of like I think on the
42:07
downside risk part of it we
42:09
all in sort of envision that
42:11
guy Brian Johnson is his name
42:13
that sort of like measures every
42:15
possible thing and wants to live
42:17
forever and and we're sort of
42:19
like yeah I don't think that's
42:21
me but I think there's a
42:23
somewhere there's a happy medium to
42:25
catch recently he found out he
42:27
was doing there was one of
42:29
the things he was doing that
42:31
was actually working in the opposite
42:33
direction. I can't remember what it
42:35
was but well that's comforting actually
42:37
a little bit so that's fine
42:39
But it also is kind of
42:41
exciting to sort of think of
42:43
yourself like Neo in The Matrix
42:45
and you turn around and be
42:47
like, I know Kung Fu, right?
42:49
Because like, I didn't have to
42:51
study to become a Kung Fu
42:53
master, but now I have these
42:55
AI assistants and data that help
42:57
me do the things they could
42:59
do, like understand my heart rate
43:01
and those kinds of things. Making
43:03
a data-driven decision that one of
43:05
your health interventions wasn't working is
43:07
kind of where we all need
43:09
to be, right? Instead of absorbing
43:11
the recommendations that supposedly worked for
43:13
the 22 people in the double-blind
43:15
clinical trial that might not work
43:17
for you, the question is what
43:19
works for you, specifically you, and
43:21
then you want to double down
43:23
on those and stop the ones
43:25
that don't. I'm optimistic about that
43:27
kind of tuning over time. I
43:29
think lots of people are going
43:31
to live for a very very
43:33
long time from here. Yeah, until
43:35
we upload ourselves into the machine
43:38
guide. Oh, wait. Yes, that's right.
43:40
Yes. Bring on the silicon
43:42
brains. I mean, we didn't
43:44
even touch on Neuralink. So,
43:46
you know, that's a second episode,
43:48
maybe. Okay, we have to wrap
43:50
up. But one thing we like
43:52
to do is go around the
43:54
horn, share something that we think
43:56
might be of interest to our
43:58
listeners. It's been a really awesome
44:00
conversation, though. Michael, thank you so
44:02
much for joining us to do
44:04
it. But yeah, you're our guest.
44:06
Do you have a last call
44:08
you like to share? It is
44:10
outrageously cold here in coastal New
44:12
Hampshire. It's going to get down
44:14
to three degrees Fahrenheit tonight. So
44:16
the first thing that pops in
44:18
my mind is actually just like
44:20
my favorite new product. I got
44:22
innu heat gloves. So get this.
44:24
There are gloves that take a
44:26
battery pack. The battery pack doesn't
44:28
use some weirdo proprietary connector, right?
44:30
It's USBC, thank goodness, right? So
44:32
I like charge the battery
44:34
packs with a USBC outlet. I
44:36
snap them onto my gloves and
44:38
oh my god, they really do
44:40
work. That's awesome. It's so
44:42
cool. Yeah, I just got a
44:44
fleece for Christmas that does the
44:46
same thing and it's you literally
44:48
just hit a button and it
44:50
turns on and it warms you
44:52
up all over. Michael Helbling, this
44:54
is, Julie got that my son
44:56
got my wife the same vest
44:58
because we were gonna go skiing
45:00
and also Michael Helbling had
45:02
shown me his and I was
45:04
like that's weird and then realized
45:06
that actually my wife had also
45:08
gotten one for Christmas and it's
45:10
similarly like hooked and she's had
45:12
electric gloves for a while. I
45:14
know, right? Here I showed up
45:16
and I'm like talking I'm like
45:18
oh yeah I have an addressable
45:20
breaker box right like I have
45:22
an addressable breaker box right like
45:24
I'm doing all this like crazy
45:26
mad science, and heated clothing makes me feel
45:28
like I'm living in the future.
45:30
You know, like, yeah, it's a
45:32
holiday. That's awesome. That's awesome. All
45:34
right, Julie, what about you? What's
45:36
your last call? My last call,
45:38
I'm sure everyone's heard about the
45:40
congestion tax in New York. I
45:42
know it's a big thing, and
45:44
I had found the link to
45:46
the congestion pricing tracker, and it's
45:48
got some good data visualization. I'm
45:50
really interested to see as time
45:52
goes on, like, what do they
45:54
find? They even had done I
45:56
think a good job like stating
45:58
what they're hoping will happen from
46:00
it. So I love that they
46:02
actually paired it with, hey this
46:04
is how we're visualizing things, want
46:06
to know how they're going to
46:08
analyze it, what are their conclusions
46:10
going to be, but my favorite
46:12
part, Tim, is that when I
46:14
got to the bottom of this
46:16
tracker, it actually says that it
46:18
is run by, I'm guessing two
46:21
students at Brown University and supervised
46:23
by Emily Oster. So I was
46:25
like, you know, I wonder how
46:27
I love this. Oh my God,
46:29
it's great. So I've just been
46:31
peeking at it. It hasn't been
46:33
running obviously too long just for
46:35
this year so far. But I
46:37
think it's really cool and I'm
46:39
excited to see what comes out
46:41
of it, especially knowing that Emily
46:43
is involved. So I think that
46:45
might be a last call that
46:47
needs to become a future episode.
46:49
Awesome. That's so cool. I saw
46:51
that same thing. I was like,
46:53
oh my gosh. So that's so
46:55
cool. All right, Tim, what about
46:57
you? What's your last call? So
46:59
as I am, I tend to
47:01
do, I'm going to do a,
47:03
I'm going to do a threefer,
47:05
I think. So one, I want
47:07
to call out back three. Yeah,
47:09
they'll be quick. And Cassie Kozyrkov
47:11
will be included in one of
47:13
them. So, uh, one, back when
47:15
we first started talking to the Fulcra team about having Michael
47:33
on for this, I tried out the app,
47:35
kind of hooked up what I
47:37
could and also kind of interesting
47:39
even I can find things that
47:41
that I'm not that connected but
47:43
you know there's no swarm connection
47:45
like it's it's crazy how many
47:47
things we have that are tracking
47:49
and the the challenge of tracking
47:51
everything but I think there is
47:53
a, there's like a seven-day trial
47:55
if anybody wants to just download
47:57
the app, then you kind of
47:59
hook up whatever services and you
48:01
kind of get to see what
48:03
the aggregated data looks like. Is
48:05
that right? Yeah, yeah. Everyone should
48:07
give it a trial. I think
48:09
most people are surprised by the
48:11
data that they have and just
48:13
didn't know about. You quite likely
48:15
have years worth of step count
48:17
data that you didn't even know
48:19
because it was sort of silently
48:21
turned on by your iPhone. Yeah.
48:23
So it's always, even if you
48:25
just want to look and you
48:27
just want to delete the app
48:29
after seven days. And it has a
48:31
very well documented API from, you
48:33
know, playing around with it. So,
48:35
you know, we're not, this is
48:37
not a, you know, paid endorsement,
48:39
but this whole discussion, if it's
48:41
got people thinking, oh, that's kind
48:43
of worth checking out. And I
48:45
think actually hearing you talk about
48:47
sort of the vision kind of makes
48:49
it a little more exciting to
48:51
people thinking. So, thank you. So
48:53
that's one. Number two, just a
48:55
PSA for anybody who, if you're
48:57
not already following Cassie, note that
48:59
she's moved over to Substack
49:01
and has, you know, went through
49:03
her three weeks of acting training
49:06
and whatnot. So that's just in
49:08
case, decision.substack.com. And
49:10
then my core last call
49:12
is completely off. Not really analytics,
49:14
but over the holidays there was
49:16
reasons that I needed to explore
49:18
new podcasts and I did not
49:20
realize that Mike Birbiglia had a
49:22
podcast called Working It Out and
49:24
he has David Sedaris come
49:26
on as kind of for his
49:28
second appearance and oh my god
49:30
like I I don't know of
49:32
like two more delightful people than
49:34
Mike Birbiglia and David Sedaris on
49:36
the Working It Out podcast, so if
49:38
you're looking for something, that is
49:40
just if you know either of
49:42
those guys and their sensibilities and
49:44
you're into them, that was like,
49:46
oh my God, just like heaven
49:48
of listening for 40 minutes or
49:50
however long it was. It has
49:52
nothing to do with analytics, but
49:54
had to put in a plug
49:56
for that as well. What about
49:58
you, Michael? You have six? Last
50:00
call? This is just a... What's
50:02
a four-fer? Is that a thing?
50:04
No. A quad? A quadfer. I've
50:07
got an octavefer for... No. It's
50:09
just really pushing the limits. I
50:11
have an AI agent that will
50:13
talk to your AI agent about
50:15
my last calls, so... a little
50:18
negotiate. No. So no, actually... So
50:20
one of the things I ran
50:22
into recently that I thought could
50:24
be a little bit helpful to
50:26
our audience is a lot of...
50:29
folks use Google Analytics for website
50:31
tracking and one of the things
50:33
that is kind of required to
50:35
make that tool useful at all
50:37
is to export it into BigQuery.
50:40
But what's in BigQuery is
50:42
often quite different than what's in
50:44
GA4 and that's kind of hard
50:46
for business users. So Himanshu Sharma
50:48
actually did a mapping of those
50:50
metrics and the calculations to get
50:52
to those metrics pretty comprehensively from
50:54
GA4 to BigQuery. So if
50:56
you're in that place where you're
50:58
trying to navigate that, that could
51:01
actually be quite a good resource
51:03
to use. And then alternatively,
51:05
you could also switch to a better
51:07
tool, but you know, in the meantime.
51:09
That is one that I would say
51:11
is one you could bookmark and use
51:14
as a good reference. So all right.
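As one concrete example of why that mapping matters: "sessions" is not a stored field in the GA4 BigQuery export, you have to derive it from the raw events. Below is a hedged sketch using the google-cloud-bigquery client; the project and dataset names are placeholders, and the session formula is the commonly documented one, so verify it against the mapping resource mentioned above before relying on it.

```python
# Sketch: deriving "sessions" from a GA4 BigQuery export, since the export holds
# raw events rather than the pre-computed metrics shown in the GA4 UI.
# Project, dataset, and dates are placeholders; verify the formula against the
# mapping resource mentioned above before relying on it.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

query = """
SELECT
  COUNT(DISTINCT CONCAT(
    user_pseudo_id,
    (SELECT CAST(value.int_value AS STRING)
     FROM UNNEST(event_params) WHERE key = 'ga_session_id')
  )) AS sessions,
  -- Distinct pseudo IDs approximate *total* users; the GA4 UI defaults to
  -- *active* users, which is one reason the two reports rarely match exactly.
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
"""

for row in client.query(query).result():
    print(f"sessions={row.sessions}, users={row.users}")
```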
51:16
So I think this has been so
51:18
fun to kind of dive into this
51:20
conversation. Michael, so thank you so much
51:22
for joining us. It's been really cool.
51:24
I love kind of hearing your vision
51:26
for the future, the way you're thinking
51:29
about this, the way that the world
51:31
is progressing on these fronts is kind
51:33
of a very cool frontier that we're
51:35
on again, both with AI and the
51:37
connected health and the connected self. So
51:39
it's, I really appreciate you, you know,
51:42
kind of sharing some of your thoughts
51:44
and things about that. And I'm sure
51:46
as our listeners have been listening, they
51:48
might have some thoughts and feelings about
51:50
this as well. We'd love to hear
51:53
from you. So go ahead, reach out
51:55
to us. The best way to do
51:57
that, probably on LinkedIn, or you can
51:59
connect on the Measure Slack chat
52:01
group or also by email contact
52:04
at analyticshour.io. Oh, Michael,
52:06
do you, are you active on
52:08
social media at all? Could people
52:10
find you out there? Yeah, principally
52:13
find me on LinkedIn. The company
52:15
Fulcra Dynamics also on LinkedIn and
52:17
I'm on X with my old
52:19
school hacker handle of Kubla, KUBLA.
52:22
Hit me up there and find
52:24
Fulcra on X as well. Awesome,
52:26
thank you. So you can reach
52:28
out to him as well and
52:31
follow him on those channels as
52:33
well. So you can hear what
52:35
the latest and greatest is in
52:37
this crazy changing world. All right,
52:40
well, hey, listen, one of the
52:42
things we're trying to do this
52:44
year is make sure that people
52:46
get access to this show. One
52:48
of the ways that you can
52:51
help with that is putting a
52:53
rating or review where you listen
52:55
to the show, whether it's Spotify
52:57
or Apple or wherever, we would
53:00
love to have you. Rate the
53:02
show, review the show. That really
53:04
helps, apparently, algorithmically. Until AI can
53:06
take over and recommend us to
53:09
the right people. iHeartRadio,
53:11
is that really where we're yeah,
53:13
we're really targeting that one. That's
53:15
where
53:18
our listener lives. Let's package
53:20
that up. We'll get a series
53:22
A. Nope, no time. So, but
53:24
yeah if you if you're listening
53:27
and you haven't done that before
53:29
We'd love it if you could
53:31
and we always just love hearing
53:33
feedback about the show as well
53:36
And it helps us think about
53:38
the future of the show so
53:40
really appreciated if you can and
53:42
no show would be complete without
53:45
a huge shout-out and a thank
53:47
you to our producer Josh Crowhurst
53:49
for everything he's doing to make
53:51
the show possible so thank you
53:54
Josh. And I am sure I speak for both my co-hosts, Tim and Julie, when I say: no matter what you're measuring, just remember, keep analyzing. Thanks for listening.
Let's keep the conversation going with your comments, suggestions, and questions on Twitter at Analytics Hour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.
So smart guys wanted to fit in, so they made up a term called analytics. Analytics don't work.
Do the analytics say go for it, no matter who's going for it? So if you and I were on the field, the analytics say go for it. It's the stupidest, laziest, lamest thing I've ever heard for reasoning in competition.