Episode Transcript
0:00
Have you ever wished you had
0:02
more influence at work that
0:04
people would naturally be more
0:06
likely to buy in on
0:08
whatever idea you're selling them,
0:10
whether they report to you
0:13
or not? Well, you're in
0:15
luck. I teach a virtual
0:17
10-week class on internal communication
0:19
and change management through Texas
0:21
A&M University, and it's
0:23
enrolling now. Get details and
0:25
enroll at HBL.TAMU.EDU and click
0:28
on certificate program. You get
0:30
to learn directly from me,
0:32
including live virtual office
0:35
hours over Zoom, with
0:37
a cohort of interested,
0:39
brainy folks like you
0:42
from around the world.
0:44
Again, learn more and
0:46
enroll in the Internal
0:48
Communication and Change Management
0:51
Course at HBL.TAMU.EDU.
0:53
That's HBL like Human
0:55
Behavior Lab. T-A-M-U like
0:58
Texas A&M, dot E-D-U. And, when you're
1:00
ready, enjoy the show. Welcome
1:02
to episode 476 of
1:04
The Brainy Business,
1:06
Understanding the
1:09
Psychology of Why People
1:11
Buy. Today's episode is
1:13
all about setting up
1:16
experiments. Ready? Let's
1:18
get started. You
1:24
are listening to The Brainy
1:26
Business podcast where we dig
1:28
into the psychology of why
1:30
people buy and help you
1:32
incorporate behavioral economics into your
1:34
business making it more brain-friendly.
1:36
Now here's your host, Melina
1:38
Palmer. Hello, hello everyone. My
1:40
name is Melina Palmer and I want
1:43
to welcome you to the Brainy Business
1:45
Podcast. I share a lot of tests
1:47
and experiments on this show, some of
1:49
which have been done in academia, some
1:51
in industry, and hopefully they have inspired
1:53
you to want to do your own
1:55
experiments. Of course, for the
1:58
academics listening, you are potentially...
2:00
already doing lots of these, which
2:02
is awesome, but in business real
2:04
experiments are not as common as
2:06
we all might like. Experiments matter
2:08
because they can help you to
2:10
see if what you're trying is
2:12
working without having to rely on
2:14
just instincts or gut feelings. It's
2:16
one thing to say, I think
2:18
that did better than we would
2:20
have before, and it's another to
2:22
say we saw a 38% increase
2:24
in sales when we changed the
2:26
word 'them' to the number 18.
2:28
That feels totally different. To be able
2:30
to have evidence like that, to
2:33
prove that your work is working,
2:35
or to find areas where it
2:37
isn't worth the resources you're investing
2:39
in a type of content that
2:41
isn't paying off or anything else,
2:43
to be able to know that,
2:45
to have that evidence, you need
2:47
to run some tests. If you
2:49
haven't done this before, it can
2:51
feel a bit overwhelming, but it
2:53
doesn't have to be. In this
2:55
episode, which originally aired way back
2:57
in August of 2019, I share
2:59
my top tips for conducting your
3:01
own experiments at work. Get ready,
3:03
we're going to jump right in,
3:05
but before we get to it,
3:07
just one last thing, don't forget
3:09
links for my top related past
3:11
episodes and books are waiting for
3:13
you in the shownotes for this
3:16
episode, which are found within the
3:18
app you're listening to, and at
3:20
thebrainybusiness.com/476. All right, let's
3:22
talk about experiments. Experimenting is so
3:24
important for any organization, but I
3:27
know it can feel intimidating if
3:29
you've never done it before. The
3:31
truth is you probably have done
3:33
experiments and not realized it because
3:36
they don't need to be big.
3:38
In reality, good experiments are incredibly
3:40
narrow in their focus because if
3:43
you test too many things at
3:45
once, you don't know what contributed
3:47
to the result or why it
3:49
came about. I remember my first
3:52
research paper, which was published in
3:54
the Association for Consumer Research back
3:56
in 2009. Interestingly, as I was
3:59
looking back for doing this episode,
4:01
I believe this is actually the
4:03
10-year anniversary of that publication as
4:06
it came out in volume 8.
4:08
Amazing how time flies and that
4:10
it ended up being exact down
4:12
pretty much to the month. There's
4:15
of course a link for you
4:17
in the show notes if you
4:19
want to check that out. That
4:22
paper was the result of my
4:24
senior thesis as an undergrad at
4:26
the University of Washington, which was
4:28
a requirement for me to graduate
4:31
from the Global Honors Program. So
4:33
I was actually the only undergraduate
4:35
business student on the whole campus
4:38
who did a thesis like this.
4:40
It was uncharted territory for me,
4:42
and in working with my advisor,
4:45
I had to present ideas for
4:47
what I wanted to research. This
4:49
probably isn't a big surprise to
4:51
anyone that's listening, but my style
4:54
and inclination in that process was
4:56
to solve gigantic problems, and so
4:58
all my ideas were far too
5:01
grandiose for what could be achieved
5:03
in a project like this. As
5:05
I was required to write a
5:08
15-to-20-page paper on the subject, I
5:10
figured it needed to be a
5:12
big topic, because how else could
5:14
you have enough content for that
5:17
many pages? Boy was I wrong.
5:19
So every time I presented ideas,
5:21
he would send me away saying
5:24
to narrow them down, make them
5:26
smaller, and more specific. Truth be
5:28
told, it was a little bit
5:30
frustrating over time, but I eventually got
5:33
to the final paper and the
5:35
title ended up being Global Advertising
5:37
Standardization in Japan and the United
5:40
States, a closer examination of high
5:42
involvement products. Yes, it's a mouthful.
5:44
The basics of what I did
5:47
for that study was obtaining magazines
5:49
from both the US and Japan
5:51
from the same month and year
5:53
to ensure the ads would be
5:56
the same, and separating them into
5:58
categories based on emotional processing and
6:00
mental... or cognitive processing to see
6:03
when the advertisers were more likely
6:05
to standardize and use the same
6:07
ads in these two very different
6:09
countries and when they would be
6:12
more likely to use different ads.
6:14
My sister who obtained her undergraduate
6:16
degree in Japanese linguistics was living
6:19
in Japan at the time of
6:21
my research project and was kind
6:23
enough to translate a giant pile
6:26
of ads for me. Thank you,
6:28
Sis. I'm sure I said thanks
6:30
then, but I'm going to say
6:32
thanks now again. The study found
6:35
that advertisements for low cognition products,
6:37
so ones that you have to
6:39
think less about, were twice as
6:42
likely to be standardized as high
6:44
cognition products like cars. Ads using
6:46
pictures were more than twice as
6:48
likely to be standardized than those
6:51
using a lot of text. At
6:53
the time, I didn't realize how
6:55
useful this information was or how
6:58
impressive the results were. I remember
7:00
when my advisor called to let
7:02
me know that the paper had
7:05
been accepted by ACR for publication
7:07
and I said, well, they must
7:09
pretty much accept everything, right? Is
7:11
it pretty easy to get in?
7:14
He politely said, um, no. How
7:16
naive I was. In looking back
7:18
and reading my paper today, there
7:21
were so many variables and items
7:23
cross-referenced just for a study that
7:25
looked at one month of magazine
7:28
ads. In my original pitches to
7:30
my advisor, I'm sure I would
7:32
have recommended all types of advertisements
7:34
and over a longer span of
7:37
time. I also believe I recommended
7:39
including a third country to make
7:41
it extra interesting. As it was,
7:44
I had to break everything into
7:46
categories and I remember having hundreds
7:48
of ads strewn around my apartment,
7:50
looking for brand matches across the
7:53
countries and then categorizing them and
7:55
using a 100-point scoring rubric for
7:57
each ad, which I modified from
8:00
an existing model. I'm so glad
8:02
I did this and I learned
8:04
so much from the experience and
8:07
I can see why my advisor
8:09
kept telling me to think smaller.
8:11
The study I ended up with,
8:13
which felt incredibly small to me
8:16
at the time, was actually a
8:18
huge undertaking with important results and
8:20
it was a true academic experiment.
8:23
An experiment like this is probably
8:26
something you would want to bring
8:28
in outside help for, especially if
8:30
you haven't been trained in the
8:32
space. When there's a lot weighing
8:35
on the outcome of the findings,
8:37
say you're deciding whether to launch
8:39
a new product or want to
8:41
test out different names in a
8:43
rebrand, it's worth bringing in experts.
8:46
However, there are tests you can
8:48
do on your own fairly easily,
8:50
which can still have a great
8:52
impact on your business. And the
8:55
nice thing about doing small tests,
8:57
it means you can be more
8:59
agile and adapt quickly, which is
9:01
very helpful in business these days.
9:03
My general tips, which I will
9:06
break down throughout the episode, are
9:08
to keep these three things in
9:10
mind when setting up your experiments.
9:12
Be thoughtful, keep it small, and
9:15
test as often as you can.
9:17
I've already told you a little
9:19
about keeping it small. In addition
9:21
to making it so you can
9:23
actually do the test on your
9:26
own, it also allows for you
9:28
to understand what contributed to the
9:30
result you're seeing. As I talked
9:32
about in the color theory episode,
9:35
there are a lot of sites
9:37
out there claiming to know what
9:39
the best color for buttons and
9:41
calls to action are. And yet,
9:43
when you dig into them, they
9:46
all claim to have a different
9:48
perfect color. One says red buttons
9:50
are always better. Another says orange
9:52
and this one says green. In
9:55
some of those cases, they went
9:57
from a link in the text
9:59
as their call to action to
10:01
adding a big red or orange
10:03
button with the call out and
10:06
then claimed that the color is
10:08
what made the difference. Was it
10:10
the color or the fact that
10:12
there was a button at all?
10:15
Or the placement of the button
10:17
or the font or the verbiage?
10:19
Those likely all changed when the
10:21
button was added, so it isn't
10:23
just color that impacted the result.
10:26
It was a whole myriad of
10:28
things. This should instead be separated
10:30
into multiple mini tests to determine
10:32
what is really best for the
10:34
piece. A/B testing in emails or
10:37
on websites are really great for
10:39
this. And many systems are set
10:41
up to track all that for
10:43
you really, really simply. Let's say
10:46
you send a weekly email to
10:48
your list. In the first week,
10:50
you would test the link versus
10:52
the button. And that's it. All
10:54
other text and layout and colors
10:57
and everything are exactly the same.
10:59
Whatever the link says, the button
11:01
says. In the next week's email,
11:03
you can look at the color
11:06
of the button, and don't be
11:08
tempted to also integrate things like
11:10
different fonts or placement or size.
11:12
Everything, including the subject line and
11:14
imagery and verbiage, must be exactly
11:17
the same. The only difference between
11:19
the versions is the color of
11:21
the buttons themselves. If you have
11:23
a large list, you could try
11:26
multiple versions, say four different color
11:28
versions, as long as they all
11:30
send at the same time, because
11:32
time of day or day of
11:34
week could impact performance as well.
11:37
But if you have a relatively
11:39
small list, say under a thousand
11:41
people, just stick with two options.
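If it helps to see the mechanics, here is a minimal sketch of that kind of fifty-fifty split, assuming nothing fancier than a plain list of subscriber email addresses; the names, list size, and seed are illustrative only, not from the episode.

```python
import random

def split_ab(subscribers, seed=42):
    """Randomly split a subscriber list into two equal-sized groups.

    Everything else about the two emails (subject line, copy, imagery,
    layout, send time) stays identical; only the one element under test
    differs between the groups.
    """
    shuffled = subscribers[:]               # copy, so the original list is untouched
    random.Random(seed).shuffle(shuffled)   # fixed seed makes the split reproducible
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Illustrative only: week 1 tests a text link versus a button, nothing else changes.
subscribers = [f"person{i}@example.com" for i in range(1, 1001)]
group_a, group_b = split_ab(subscribers)
print(len(group_a), "recipients get the text-link version")
print(len(group_b), "recipients get the button version")
```

Both halves would then go out at the same time, since, as mentioned above, time of day or day of week can shift the results on its own.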
11:43
Now, you may spend a few
11:46
months going through color tweaks to
11:48
see what's the best performing button,
11:50
and that's okay. As long as
11:52
you're tracking what you do and
11:54
what the results were, you're learning
11:57
at every step of the way.
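One lightweight way to do that tracking is a simple running log. Here is a minimal sketch, assuming a plain CSV file; the file name, fields, and the example numbers are illustrative only.

```python
import csv
from datetime import date

LOG_FILE = "experiment_log.csv"   # illustrative file name
FIELDS = ["date", "test_name", "variant", "sends", "clicks"]

def log_result(test_name, variant, sends, clicks):
    """Append one variant's result to a running CSV log of experiments."""
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:          # write the header only when the file is new
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "test_name": test_name,
            "variant": variant,
            "sends": sends,
            "clicks": clicks,
        })

# Illustrative entries for a week-one link-versus-button test.
log_result("cta_link_vs_button", "text link", 500, 41)
log_result("cta_link_vs_button", "button", 500, 63)
```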
11:59
Use my tips from that color
12:01
theory episode, which is linked for
12:03
you in the show notes, to
12:06
narrow it down to contrasting colors
12:08
that will work in your specific
12:10
situation and branding. And if it
12:12
feels like a waste of time,
12:14
remember that Google tested out more
12:17
than 40 different shades of blue
12:19
to determine the perfect color for
12:21
their links. They've publicly said that
12:23
the shade of blue, not the
12:26
difference from red to blue or
12:28
orange to blue, but between greenish
12:30
blue, yellowish blue, or a purplish
12:32
blue, results in an extra $200
12:34
million in ad revenue for the
12:37
company every year. While your business
12:39
may not be as big as
12:41
Google, and the results may be
12:43
comparatively smaller, what if finding the
12:45
right color combination on your emails
12:48
generated an extra 25% in clicks?
12:50
Or if you could tweak one
12:52
item at a time and convert
12:54
10 more people a month on
12:57
your website? These small tweaks don't
12:59
need to take a ton of
13:01
your time, but they uncover small
13:03
changes that can make a big
13:05
difference. One of the studies I
13:08
share most often is the one
13:10
with the end cap displays for
13:12
Snickers bars. This study used anchoring
13:14
and adjustment and found when they
13:17
said buy 18 for your freezer,
13:19
there was a 38% increase in
13:21
sales over saying buy them for
13:23
your freezer. This is a huge
13:25
difference and most everyone is impressed
13:28
when they hear about the results.
13:30
The article I wrote for CU
13:32
Insight on this concept called One
13:34
Word that increased sales by 38%
13:37
resulted in tons of credit unions
13:39
and others reaching out to me
13:41
to ask questions or see about
13:43
working together. But if it wasn't
13:45
tracked... No one would have known
13:48
the real difference. We all get
13:50
hunches all the time, but behavioral
13:52
economics shows us those are often
13:54
wrong because they're based on logic,
13:57
not the true rules of the
13:59
subconscious brain. A perfectly
14:01
rational being would not be impacted
14:03
by color or font size or
14:05
framing, but we humans are. So
14:08
this all matters, and testing can
14:10
help you to figure out what
14:12
really makes a difference for your
14:14
audience. Speaking of framing, some other
14:17
things you could test would be
14:19
how your ads or emails or
14:21
direct mailers or website pages do
14:23
when you change a number frame.
14:26
Say in one you put 78%
14:28
of clients buy from me again,
14:30
and the next says four out
14:32
of five clients buy again, and
14:35
another simply says most clients buy
14:37
from me again. I know I've
14:39
mentioned before, when I worked with
14:41
a credit union on their advertising
14:44
for their checking account and we
14:46
changed from focusing on the APR
14:48
that you would get to asking
14:50
the question, 'Did your checking account
14:53
pay you $315 last year?', that
14:55
their month-over-month checking account openings went
14:57
up 60%. Again, they wouldn't know
14:59
this if they hadn't been testing
15:02
and tracking the information on that
15:04
project we worked on together. You
15:06
can also do tests on blog
15:08
post headers or copy on social
15:11
media posts, images used on ads.
15:13
Really, the options are almost limitless.
15:15
And this leads us to the
15:17
second important thing to focus on
15:20
when you're doing experiments, and that
15:22
is to be thoughtful. You know
15:24
this is one of my favorite
15:26
things since I close all my
15:29
emails and podcast episodes with this
15:31
phrase, but what does it mean
15:33
for experiments? For one
15:35
thing, being thoughtful means looking outside
15:38
of what you always do or
15:40
what you know to be true.
15:42
Often, the things we take for
15:44
granted are those that present the
15:47
biggest opportunity for learning. Like in
15:49
that previous credit union example, it's
15:51
really common to advertise on APYs
15:53
or APRs rates of any kind,
15:56
and that's what every... everyone else
15:58
does, so you think it's the
16:00
best way to go. But behavioral
16:02
economics teaches us that humans do
16:05
not always act rationally or with
16:07
much forethought. Make sure you fight
16:09
the tendency to make assumptions about
16:11
people's behavior as you look for
16:14
opportunities to learn in your organization.
16:16
It also means being thoughtful and
16:18
taking the time to plan before
16:20
you jump into a test or
16:23
start testing absolutely everything. While these
16:25
are all small items being tested,
16:27
in bulk they could add up
16:29
to a lot of time and
16:32
just a volume of data that
16:34
you don't want to deal with.
16:36
Doing multiple versions of every email,
16:38
post, website page, and mailer could
16:41
quickly become a full-time job for
16:43
a few people, and you also
16:45
have to analyze the data that
16:47
comes in. In some cases, it
16:50
could be easy, the number of
16:52
clicks on a button, but in
16:54
others, it can get more complex
16:56
and take a lot more time.
16:59
Instead of testing everything, test the
17:01
right things. Before you start building
17:03
a test, know what problem you're
17:05
trying to solve and why it
17:08
matters to solve it. What are
17:10
you trying to achieve, and why
17:12
does it matter for your business?
17:14
This is useful for a couple
17:17
of reasons. First, it can narrow
17:19
your focus so you aren't scattered.
17:21
That means you can be more
17:23
efficient with your time and dedicate
17:26
enough resources to implement what you
17:28
learn and continually get better. Anything
17:30
can be worth testing, but everything
17:32
can be a waste of time
17:35
if you don't have a clear
17:37
focus and goal. Second, it helps
17:39
communicate the why behind studies and
17:41
your organization in general. If your
17:44
company is about driving value, then
17:46
all your tests should be about
17:48
creating more value for your customers.
17:50
What allows you to spend less
17:53
on advertising so you can give
17:55
discounts to customers? How can
17:57
you showcase the new products most
17:59
effectively? If your company is instead
18:02
focused on getting additional products in
18:04
the hands of existing customers, increasing
18:06
the efficiency of your emails, and
18:08
getting people to notice them is
18:11
important, as well as getting them
18:13
to click. If you have an
18:15
application process, do you know where
18:17
people get stuck or why? What
18:20
would get them all the way
18:22
through? Does a certain type of
18:24
customer get stuck or is it
18:26
everyone? It is important, of course,
18:29
to focus on items that are
18:31
driving revenue and value to your
18:33
company. Make sure the juice is
18:35
worth the squeeze so you aren't
18:38
putting a huge amount of time
18:40
and effort into something that will
18:42
never pay for itself. Being thoughtful
18:44
allows you to pick your battles.
18:47
And if you're thoughtful up front,
18:49
you're building things out with intention
18:51
and hopefully putting together a schedule
18:53
of what you're wanting to learn
18:56
and why it matters to the
18:58
business. When you test things, there
19:00
are inevitably going to be findings
19:02
that seem as though they can
19:05
impact other areas. It's possible that
19:07
they do, but it's important to
19:09
understand generalizability. To put it
19:11
simply, the results of one test
19:14
will not necessarily hold true in
19:16
every situation or for every business.
19:18
Think back to the red buttons.
19:20
Because they have a lot of
19:22
contrast, red buttons would probably do
19:25
really well on Facebook where they're
19:27
surrounded by blue. But on Target's
19:29
website, maybe not. Being thoughtful when
19:31
you build your experiments will mean
19:34
you understand what you're looking for
19:36
and the parameters so you can
19:38
know how you might be able
19:40
to extend the findings reasonably beyond
19:43
the single test. In addition to
19:45
being generalizable, it's important to know
19:47
whether the data you're collecting is
19:49
qualitative or quantitative and how each
19:52
can be applied. Conversations with people
19:54
are qualitative. Number of clicks are
19:56
quantitative. They're both important and should
19:58
be used in tandem in businesses,
20:01
but you should have at least
20:03
a basic understanding of the benefits
20:05
and limitations of each. I've linked
20:07
to a short YouTube video that
20:10
outlines some of those differences and
20:12
there's a ton of information at
20:14
your disposal if you want to
20:16
learn more about these two categories
20:19
of research. And when you do
20:21
find something, dig a little deeper
20:23
to see what else you can
20:25
learn. To use a financial institution
20:28
example, say you sent an email
20:30
to a thousand people who are
20:32
pre-qualified for an auto loan. 500
20:34
got one version and 500 got
20:37
another version where you were testing
20:39
button color. Let's say 50 clicked
20:41
on the button in test A
20:43
and 150 clicked in test B.
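To put some quick arithmetic behind numbers like those, here is a minimal sketch that works out the two click-through rates and adds a simple two-proportion z-test as a sanity check; the z-test is an extra step for illustration and isn't something covered in the episode.

```python
from math import sqrt

def click_rate_summary(clicks_a, sends_a, clicks_b, sends_b):
    """Compare click-through rates for two email variants.

    Returns each variant's rate plus a two-proportion z-score, a rough
    check on whether a gap this size could plausibly be random noise.
    """
    rate_a = clicks_a / sends_a
    rate_b = clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_b - rate_a) / se
    return rate_a, rate_b, z

# The episode's hypothetical: 500 sends per variant, 50 vs. 150 clicks.
rate_a, rate_b, z = click_rate_summary(50, 500, 150, 500)
print(f"Variant A: {rate_a:.0%}, Variant B: {rate_b:.0%}, z = {z:.1f}")
# Roughly: Variant A: 10%, Variant B: 30%, z = 7.9 (well above 1.96,
# so a gap this large is very unlikely to be chance).
```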
20:46
Great! You know this color seemed
20:48
to work well, but what else
20:50
can you learn from the test?
20:52
200 out of a thousand people
20:55
clicked on the button. How many
20:57
ended up getting a loan? What
20:59
is similar and different about them?
21:01
How long have they been using
21:04
the financial institution? How many products
21:06
do they have already? What type
21:08
of products? Are they in certain
21:10
age or income brackets? How does
21:13
this all differ from the 800
21:15
people who did not click? If
21:17
you follow up with the 800
21:19
people who didn't click, what did
21:22
they have in common with each
21:24
other and different from the 200
21:26
clickers that you could call out
21:28
and feature in a second email
21:31
as another test? This can help
21:33
you segment better in future campaigns.
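A minimal sketch of that kind of digging, assuming you can pull one row per recipient with the attributes the episode suggests looking at; the column names and the tiny sample here are made up purely for illustration.

```python
import pandas as pd

# Hypothetical data pull: one row per recipient of the pre-qualified email.
members = pd.DataFrame({
    "member_id":     [1, 2, 3, 4, 5, 6],
    "clicked":       [True, False, True, False, False, True],
    "years_member":  [7.0, 1.0, 5.0, 0.5, 2.0, 10.0],
    "product_count": [3, 1, 2, 1, 1, 4],
})

# Average tenure and product count for clickers versus non-clickers.
profile = members.groupby("clicked")[["years_member", "product_count"]].mean()
print(profile)

# An example segment rule: two or more products and five-plus years of tenure
# get one message, and everyone else gets a different one in the next test.
loyal = (members["product_count"] >= 2) & (members["years_member"] >= 5)
print("Established-member message:", members.loc[loyal, "member_id"].tolist())
print("Alternate message:", members.loc[~loyal, "member_id"].tolist())
```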
21:35
Say all people with at least
21:37
two products and who have been
21:40
a member or customer for over
21:42
five years get this message and
21:44
those who have one product get
21:46
a different message, which is still
21:49
different from those who've only been
21:51
with you for six months or
21:53
less. Digging a little deeper and
21:55
asking questions is really useful for
21:58
ongoing experiments and becoming more efficient
22:00
and knowledgeable in your business. But
22:02
you can't dig deeper on information
22:04
you didn't set up for in
22:07
advance. If you know you'll want
22:09
to dig into demographics and other
22:11
details, you probably need to build
22:13
that into your data pull up
22:16
front. So think and talk through
22:18
what you may want to know
22:20
and what could matter before building
22:22
out your tests, unless you want
22:25
an analyst who is definitely not
22:27
your friend. Of course, it's important
22:29
to combine this with keeping it
22:31
small, so you find the right
22:34
balance for your business. Don't get
22:36
so bogged down with all the
22:38
things you could do that you
22:40
don't do anything, but don't scale
22:43
back so much that the information
22:45
isn't usable. And my final tip
22:47
is to test early and often.
22:49
Smaller experiments, like I've been talking
22:52
about here, allow for frequent nimble
22:54
testing and ongoing quick improvements. My
22:56
research study, which I get to
22:58
tell you about in a couple
23:01
of weeks, took six months of
23:03
data collection with much more time
23:05
to plan, analyze, and then write
23:07
out the results. It took two
23:10
years from pitching the idea until
23:12
publication. Actually, almost exactly another one
23:14
of these to the day. I
23:16
guess August is a really big
23:19
time for me. I don't know.
23:21
It's kind of funny how that
23:23
works out. Not every academic test
23:25
takes this long, and it was
23:28
more like a year from final
23:30
pitch to paper completion, but it
23:32
still takes a while. Small tests
23:34
let you act quickly. So the
23:37
more you test, the more you
23:39
learn. As you get into a
23:41
groove with experimenting, it does get
23:43
easier. And, as you learn a
23:46
little more each time and combine
23:48
results and ideas, the bigger, more
23:50
exciting findings start to emerge and
23:52
come together. You get a little
23:55
smarter each time, learning more about
23:57
your... and customers. Oh, and don't
23:59
get discouraged when you have a
24:01
non-finding. Say if you test a
24:04
bunch of button colors and nothing
24:06
gets people to click more, or
24:08
if you change up subject lines
24:10
and don't have a difference, no
24:13
result is still telling you something
24:15
very important. Maybe it means that
24:17
you don't have to worry so
24:19
much about what you put in
24:22
that picture in your email blast
24:24
because no matter what you do,
24:26
it doesn't seem to matter to
24:28
people. Maybe you don't need a
24:31
picture at all, and you can
24:33
reduce staff time searching for the
24:35
perfect photos. Findings tell you what
24:37
attracts attention or what matters to
24:40
your customers, but non-findings tell you
24:42
what they don't pay attention to
24:44
or care about. This can help
24:46
you further focus your attention, and
24:49
it's very valuable information as well,
24:51
so you can run leaner and
24:53
smarter than you would have without
24:55
having taken the time to test
24:58
and experiment. Now you have some
25:00
simple tips and guidelines for running
25:02
those tests and experiments that will
25:04
make your business stronger and more
25:07
efficient. Remember to keep it small,
25:09
be thoughtful, and test often, and
25:11
don't get bogged down by analysis
25:13
paralysis. It's better to just start
25:16
and figure it out than to
25:18
hold up and wait until everything
25:20
is perfect and never get going.
25:22
You'll get better as you go.
25:26
So, what got your brain buzzing
25:28
as you learned about setting up
25:30
experiments today? For me, I really
25:32
love experiments and think they're so
25:34
valuable in business. And while I
25:36
know that we all get busy
25:38
and it can be easy to
25:40
let this slip, I encourage you
25:42
to invest in doing small tests
25:44
throughout the year. You don't have
25:46
to test everything. And as you
25:48
heard from my tip to be
25:50
thoughtful, you really shouldn't test everything
25:52
because it's a waste of resources.
25:54
And it doesn't have to be
25:56
huge. And again, they typically shouldn't
25:58
be so that you can keep
26:00
it small and test often in
26:02
building a testing habit in your
26:04
organization. Learn a little bit on
26:06
a regular basis, and you can
26:08
continually be improving, which is a
26:10
way to help you always stay
26:12
ahead of your competition. Emails and
26:14
website pages are a great place
26:16
to start because they're easy to
26:18
track, but there are countless areas
26:20
where you can test, from internal
26:22
communication and opt-ins from employees, to
26:24
sales language, to the way you
26:26
present packages. There are so many
26:28
options to test. It's awesome. So,
26:30
what are you going to test
26:32
first? Or what have you tested
26:34
already? I'd love to hear about
26:37
it. Please come share it with
26:39
me on social media. You'll find
26:41
me as The Brainy Biz pretty
26:43
much everywhere and as Melina Palmer
26:45
on LinkedIn. And of course, if
26:47
it is a more sensitive test,
26:49
feel free to email me, Melina
26:51
at thebrainybusiness.com. I can't
26:53
wait to hear from you and
26:55
learn how you're using testing in
26:57
your work. And who knows? Maybe
26:59
it'll get featured on an upcoming
27:01
episode of the podcast. If you
27:03
have really cool stuff that you're
27:05
doing that would make a great
27:07
case study to share here or
27:09
on LinkedIn or other places where
27:11
I write articles, you know, let
27:13
me know. I'd love to hear
27:15
about it. As we close out
27:17
the show, don't forget about those
27:19
show notes with links to my
27:21
top related past episodes and books
27:23
and more. It's all waiting for
27:25
you in the app you're listening
27:27
to and at thebrainybusiness.com/476.
27:29
And just like that, episode 476
27:31
on setting up experiments is done.
27:33
Join me Friday for a brand
27:35
new episode with Dr. David Daniels
27:37
to discuss whether or not investors
27:39
value gender diversity. It's going to
27:41
be a lot of fun. You
27:43
don't want to miss it. Until
27:45
then, thanks again for listening and
27:48
learning with me. And remember to
27:50
be thoughtful. Thank you for listening
27:52
to the Brainy Business podcast. Melina
27:54
offers virtual strategy sessions, workshops, and
27:56
other services to help businesses be
27:58
more brain-friendly. For more
28:00
free resources, visit
28:02
thebrainybusiness.com.