Episode Transcript
0:00
Oh, it's such a clutch off-season
0:02
pickup, Dave. I was worried we'd bring
0:04
back the same team. I meant those...
0:06
I meant shades. blinds.com made it crazy affordable
0:09
to replace our old blinds. Hard to
0:11
install? No, it's easy. I installed these
0:13
and then got some for my mom.
0:15
She talked to a design consultant for
0:17
free and scheduled a professional measure and
0:19
install. Hall of Fame install. They're the number one
0:21
online retailer of custom window coverings in
0:23
the world. Go to blinds.com
0:25
right now and get up to 40%
0:27
off. Rules and restrictions
0:29
may apply. This
0:32
week on The Gray Area, we're talking
0:34
to Chris Hayes about how our
0:36
digital devices have changed us. Now
0:38
it's like traffic or air travel.
0:40
Like, it's a thing that we
0:42
all just experience as a bummer.
0:44
That you just talk about,
0:47
like, doesn't it suck that, you
0:49
know, we can't pay attention? The phones
0:51
are always going off. Listen
0:55
to The Gray Area with me,
0:57
Sean Illing. New episodes every
0:59
Monday available everywhere. What
1:04
does it mean to be a good person? Should
1:06
I take zero flights ever?
1:08
How do I balance caring for
1:11
others versus caring for myself? Hey
1:15
y'all, it's Jonquilyn Hill. You're listening
1:17
to Explain It To Me. If
1:19
you're a regular listener, you know
1:22
the drill. You call in with
1:24
questions and we try to
1:26
find you the answers. But not
1:28
all questions have capital A answers. Sometimes
1:32
they're more
1:34
squishy. Is
1:37
it unethical to stay on Twitter
1:39
or Facebook given the dangerous positions
1:41
that are expressed by their owners?
1:43
I found out from a DNA
1:45
test that my dad isn't my
1:47
biological dad. Should I tell him? That's
1:51
where my colleague, Future Perfect senior
1:53
reporter Sigal Samuel, comes in. Hi.
2:01
Hey, hey, yay, we're finally getting
2:03
to do something together. So here,
2:06
at Explain It to Me,
2:08
we're basically looking for questions
2:10
with answers that aren't all
2:12
that easy to find if you pop
2:14
them into a search engine.
2:16
Sigal answers questions, too, but
2:18
a different kind. She's all about
2:21
ethics. So I'm a big nerd
2:23
who did a degree in philosophy.
2:25
And I'm also just a person
2:27
who, in my regular daily life,
2:29
I feel like I think about...
2:31
moral questions a lot and like
2:33
what does it mean to be
2:35
a good person is that even
2:37
a thing what does it mean
2:39
to live a good life all
2:42
of that stuff and so I
2:44
launched last year this philosophical advice
2:46
column. It's called Your Mileage May
2:48
Vary, and it's not like other
2:50
advice columns out there because my
2:52
basic feeling about the advice column
2:54
as a genre is: the genre
2:56
was born in 1690 and like
2:58
at the time a lot of
3:00
people thought like there's like one
3:03
objectively right answer to moral questions
3:05
and there's like one objectively right
3:07
way to be a good person
3:09
and I'm just like no like I
3:11
don't buy that premise like I
3:13
don't think there's one objectively right
3:15
answer to the like super complex
3:17
moral questions that our lives throw
3:19
at us so I decided to
3:21
reimagine the genre and do an
3:23
advice column that's based on value
3:25
pluralism. Say more about that. What
3:27
does that mean? That's an idea
3:29
that was developed by philosophers like
3:31
Isaiah Berlin and Bernard Williams. It's
3:34
basically the idea that every single
3:36
one of us has not just
3:38
one thing we value, but like
3:40
we have multiple values. And they
3:42
can be equally valid, but sometimes
3:44
they're in conflict with each other.
3:46
And when your values clash, that's
3:48
when a dilemma is going to
3:50
arise for you. When we get a question,
3:53
the first item on the to-do list
3:55
is research. What journals are there about
3:57
the topic you asked us about? Who's
3:59
written about it? What gaps are there
4:01
in the existing reporting? I wanted to
4:03
hear how Sigal's process is different from
4:06
ours. You know, I think most advice
4:08
columns you write in, here's my question,
4:10
they're like, boom, here's the answer, here's
4:13
what you should do. And I'm like,
4:15
no, what I'm going to do is
4:17
tease apart what are the different values
4:20
that are like implicit in your question
4:22
that are in tension with each other? Then
4:24
I will curate for you from like
4:27
the past 2,000 years or so of...
4:29
wise philosophers, spiritual thinkers, psychologists,
4:31
etc. thinking about this, what do wise
4:34
people say about when those values are
4:36
in tension? Like what can you do? Okay,
4:38
here's an example of those values in tension
4:41
with one another. Someone who vehemently opposes
4:43
President Trump wrote in and asked about
4:45
how to talk to family members who
4:48
voted for him. Sigal decided to
4:50
bring in the work of social psychologist
4:52
Jonathan Haidt. If that name sounds familiar,
4:55
It's because he recently wrote a book
4:57
about smartphones and kids. He and some
4:59
others developed this thing called moral
5:02
foundations theory. The research there suggests that
5:04
people in different political camps prioritize different
5:06
moral values. So liberals are the people
5:09
who they're really sensitive to the values
5:11
of care and fairness and conservatives are
5:13
people who are also really sensitive to
5:16
values of like loyalty, and sanctity or
5:18
purity. The point is, it's not like
5:20
some of these values are like dumb
5:23
or wrong and some are like right
5:25
and smart. Like that's not it. They're
5:27
just different values. He has this
5:30
phrase that I like. He
5:32
calls them moral taste buds. Oh. And
5:34
it's like your tongue has taste
5:37
buds for like sour whatever it is
5:39
sour bitter or sweet. You know, it's
5:41
not like some of those are dumb
5:44
and some of them are right. Like
5:46
they're all just different things. How does
5:48
that relate back to this Trump question?
5:51
I tried to bring that up to
5:53
help that question asker think about
5:55
his family members because he was saying,
5:58
how can I even talk to these
6:00
people? Like, they just seem to not
6:02
care at all about human suffering. Like,
6:05
I think they just voted for Trump
6:07
to check a religious box on the
6:09
abortion issue. Don't you care that some
6:12
women will die from this? This is
6:14
gonna cause so much suffering. And I
6:16
tried to say to him, like, maybe
6:19
don't look at your relatives assuming that
6:21
they're just totally fine with human suffering.
6:23
And like, their values are diametrically opposed
6:26
to yours. It's like, they're not diametrically
6:28
opposed. They're different values that they're
6:30
putting more weight on, but like,
6:32
they are values. We're not saying,
6:34
don't try to convince anyone to
6:36
change their mind, that all views are
6:38
equally fine, that all political orientations are
6:40
equally correct. We're just saying, how do
6:42
we nudge people to actually live out
6:44
those values in a more balanced way
6:46
or in a more authentic way? So after
6:49
combing through those ideas, Sigal starts
6:51
writing the answer in her column.
6:53
What do you want to put more
6:55
weight on? Do you want to put
6:57
more weight on this value or that
6:59
value? How do you want to balance
7:01
them? What would it what could it
7:03
concretely look like to balance them? And
7:05
I'll give people some ideas about how
7:08
you might balance them. If you find
7:10
the case for one more compelling than
7:12
the case for another, you might put
7:14
more weight on that. Are there moral
7:16
questions that come up for you and
7:19
your daily life? Like are there times
7:21
where you're like... This philosophical advice
7:23
column is really born out of my
7:25
own angst, I would say, about like,
7:28
oh God, how do I deal with
7:30
these things? I think a common one,
7:32
which probably a lot of people can
7:34
relate to, is how do I balance
7:37
caring for others versus caring for myself?
7:39
Yeah. The first question I ever got
7:41
and answered for this advice column
7:43
was someone writing in... Look, I
7:45
love my mom. I really care
7:47
about her. She has like a
7:49
whole slew of health conditions. I
7:51
do a lot to try to
7:53
help her out, but like as
7:55
she ages, I know she's going
7:57
to need even more help and
7:59
like... Would I be dropping
8:01
everything to take care of
8:03
my mom? I find that
8:06
so relatable. We might value
8:08
self-sacrifice and think that's admirable
8:10
sometimes, but we might also
8:12
really value self-preservation. So that
8:14
kind of thing. I think
8:17
about that a lot. After
8:19
the break, Sigal tackles a
8:21
question with a lot of
8:23
miles on it. Birthdays,
8:33
anniversaries, weddings, whatever the occasion.
8:35
It just got a little
8:37
more personal, with meaningful photo
8:39
gifts from Shutterfly. Add a
8:42
silly photo to a gold-rimmed
8:44
mug for your bestie. Put
8:46
your sweet puppy on a
8:48
cozy fleece blanket for your
8:50
teen. Gift your husband a
8:52
desktop plaque featuring all the
8:54
kids. Enjoy 40% off at shutterfly.com
8:56
and make something that means
8:58
something. It's Today,
9:01
Explained. I'm Noel King with Miles
9:03
Bryan. Senior reporter and producer for
9:05
the program, hello. Hi, you went
9:07
to public school, right, Miles? Yes,
9:09
go South High Tigers. What do
9:12
you remember about school lunch? I
9:14
remember sad lasagna, shrink-wrapped in little
9:16
containers. I remember avoiding it. Do
9:18
you remember the nugs? The chicken
9:21
nuggets? Yeah, if I had to
9:23
eat school lunch, that was a
9:25
pretty good option. I actually liked
9:27
them, but in addition to being
9:29
very tasty, those nugs were very
9:32
processed, and at the moment, America
9:34
has got processed foods in its
9:36
crosshairs. It's true. We are collectively
9:38
very down on processed food right
9:41
now, none more so than health
9:43
and human services secretary nominee Robert
9:45
F. Kennedy Jr. I'll get processed
9:47
food out of school lunch immediately.
9:49
About half the school lunch program
9:52
goes to processed food. Can the
9:54
man who once saved a dead
9:56
bear cub for a snack fix
9:58
school lunches? Today, Explained... every weekday
10:01
wherever you get your podcasts. This
10:03
is Explain It to Me. We're
10:05
back talking with my colleague, Sigal Samuel,
10:07
about her column, Your Mileage May Vary.
10:09
One of the things that I like
10:12
about her work is that it's thoughtful
10:14
without being judgy. I'll just give you
10:16
like a personal example when I was
10:18
an undergrad and like this was many
10:20
years ago because I'm old so like
10:22
I don't think climate was as top
10:25
of mind for me then as it
10:27
is now but I remember a bunch
10:29
of us would go have lunch together
10:31
on campus at this kitchen where they
10:33
like gave free like vegetarian lunches and
10:35
they had like plastic spoons and I once
10:37
was like having lunch there with some of my
10:39
friends and I used a plastic spoon and like
10:42
I tossed it out at the end of the...
10:44
meal and my friend who
10:46
was studying environmental sciences
10:49
looked at me and my one
10:51
little plastic spoon with such horror
10:53
and disgust and I was like
10:55
oh my god I felt like
10:58
laser beams were shooting out of
11:00
her eyeballs at me sending me
11:02
into flames, like lighting me on
11:04
fire. Like, the amount
11:06
of judgment seemed to me completely
11:09
out of proportion yeah with this
11:11
one little plastic spoon. And this
11:13
is maybe silly on my part,
11:15
but at that age I found
11:17
that so off-putting and, like, repellent.
11:19
It made me like have some
11:22
like yucky feeling about environmentalist stuff
11:24
for a little bit. Okay, so Sigal has
11:26
been on the side of not so
11:28
friendly advice. So she gets it, but
11:30
I wanted to know why she decided
11:32
to start this column now. What is it
11:34
about this very specific moment we're in
11:37
that made her think this is what
11:39
the people need? So I think that
11:41
there is a lot of emphasis these
11:43
days on, you know, you've probably heard
11:45
people talk about like optimization culture. Yeah.
11:48
And I feel like usually people talk
11:50
about how we see that showing up
11:52
in like, oh, optimize your diet, optimize
11:55
your exercise routine, that kind of stuff,
11:57
you know, like have Soylent, like, what?
11:59
I think like the less analyzed
12:01
version of this is like how
12:04
optimization culture is coming for our
12:06
souls and trying to tell us
12:08
to like optimize our moral lives
12:10
and optimize how good we are
12:12
as people and I think it's
12:14
great to try to be a
12:16
better person in some ways but
12:18
I think this lens can be
12:20
really, really hard for people. It
12:23
makes us feel like nothing you
12:25
do is ever just going to
12:27
be good enough. Like you have
12:29
to be doing the most good
12:31
possible, otherwise you kind of suck.
12:33
And I just think that's a
12:35
pretty crushing way to live. It
12:37
also feels very performative, like who
12:39
are you doing this for? Are
12:42
you actually doing this for, you
12:44
know, I don't know, it feels,
12:46
maybe that's me jumping to a
12:48
judgment call, but yeah. Yeah, I
12:50
mean, I think sometimes it can
12:52
be performative. Sometimes there are people
12:54
who genuinely feel like, oh my
12:56
God, I really want to be
12:59
good, and that means I always
13:01
have to be doing the most
13:03
good possible. Otherwise, like, I'm falling
13:05
short. And I had a kind
13:07
of life-changing conversation with my best
13:09
friend once, who is a mathematician
13:11
and a very rigorous sort of analytical
13:13
thinker. But she said to me,
13:15
your goal is not to optimize
13:18
every possible outcome, which like you
13:20
cannot control. Your goal is to
13:22
live in line with your values
13:24
as best you can. And like
13:26
that was really kind of like
13:28
an aha moment for me. And
13:30
we talked about how like that's
13:32
really challenging because we have multiple
13:35
values and sometimes they conflict with
13:37
each other. And then when I
13:39
discovered value pluralism, I thought, oh
13:41
my God, this is such a
13:43
helpful lens for me. So maybe
13:45
it can be helpful for other
13:47
people too. Your latest column is
13:49
about travel, climate change, and trying
13:51
our best. Can you describe that
13:54
question to us? What was this
13:56
person asking? So someone submitted a
13:58
question to Your Mileage May Vary that I
14:00
think is really relatable. She
14:02
was saying, look, I really care
14:04
about climate change. Normally, the advice
14:07
given is like, avoid flying
14:09
because flights generate a very huge
14:11
amount of carbon emissions, like not
14:13
good for the climate, take a
14:15
train or a bus, or like
14:18
your hybrid car, whatever, instead. But
14:20
this person lives in a kind
14:22
of area that is like disconnected
14:25
from public transport. So she
14:27
is more than 12 hours' drive away
14:29
from the nearest city. Oh wow, yeah,
14:31
yeah. So like there aren't buses, there
14:33
aren't trains, like that's not an option
14:36
for her. And so she really tries
14:38
to avoid flying, but like if
14:40
she never takes a flight, that
14:42
means she's like never going to
14:44
go on vacation anywhere? Yeah. Beyond
14:46
her like immediate vicinity. So she
14:48
was like, should I take zero
14:50
flights ever or can I take
14:53
like one flight per year? But
14:55
she's starting to feel super resentful
14:57
because she sees her friends are
14:59
flying out super carefree they're flying
15:01
every month to like watch a
15:03
game and so she feels like
15:06
her like one individual lifestyle choice
15:08
is like being erased by the
15:10
actions of her friends and she's
15:12
getting really resentful about it yeah
15:14
what was your initial reaction
15:16
when you read that question
15:19
honestly like my gut initial reaction
15:21
was like Oh, honey. Like, yeah, I'm
15:23
kind of like, okay, all right, I'm
15:25
glad you said that because I was
15:27
kind of like, listen, I love me
15:30
a little vacay. I'm not gonna
15:32
not go to my friend's wedding
15:34
or Bachelorette party. Like, listen, I've
15:37
reduced my food waste. I recycle. Sorry,
15:39
Kylie Jenner has a private jet. I'm
15:41
not going to feel bad for being,
15:43
like, Southwest A-List, like, but then I
15:46
was like, wow, maybe I'm a bad
15:48
person. Well, it's one of these things
15:50
where, like, it bamboozles our brain, right? Yeah.
15:53
Like, I found it really relatable, and it honestly
15:55
got me thinking more about my own
15:57
emissions. I felt like a core thing
15:59
embedded in this question was like
16:01
this question of should we
16:03
be like moral purists or
16:05
absolutists? Like should we take
16:08
this sort of purist path
16:10
of doing extreme self-sacrifice to
16:12
never ever ever fly or
16:14
is it like look we
16:16
also value other things like
16:18
nurturing relationships with our friends
16:20
or our family who live
16:22
far away maybe or we
16:24
value learning about other cultures
16:26
or like developing your career
16:28
or whatever it is? So
16:30
my response was about helping
16:32
this person think through like
16:34
yes climate is a super
16:36
important value great given your
16:38
circumstances especially I don't think
16:40
we necessarily have to say
16:42
we're gonna never ever ever
16:45
take a single flight for
16:47
the rest of our lives
16:49
I think it's great to
16:51
minimize how much we fly
16:53
and I really I'm trying
16:55
to like cut down how
16:57
much I fly but I
16:59
wouldn't counsel people to try
17:01
to take the purist, absolutist
17:03
path unless they are one
17:05
of those like magical beings
17:07
who can do that without
17:09
becoming resentful or judgmental of
17:11
others. So what does philosophy
17:13
have to say about flying?
17:15
One more break and we'll
17:17
find out. Can you talk
17:20
through the process of answering
17:22
this question about flying? Like
17:24
how did you go about...
17:26
quote, unquote, finding the answer.
17:28
Okay, so this is like
17:30
where it helps to have
17:32
a little bit of background
17:34
in philosophy from my academic
17:36
days, I guess, being a
17:38
big nerd, that I immediately
17:40
thought of this essay I
17:42
love by this philosopher, contemporary
17:44
philosopher, her name is Susan
17:46
Wolf. And she wrote this
17:48
really awesome influential essay called
17:50
Moral Saints. And she basically
17:52
writes about like this category
17:55
of people that are trying
17:57
to basically do that thing
17:59
of optimizing their morality, like
18:01
the most good possible. And
18:03
she writes that that's like
18:05
not necessarily great all the
18:07
time because if you're trying
18:09
to spend every single moment
18:11
of your life like raising
18:13
money to donate to Oxfam,
18:15
then you're not spending any
18:17
time cultivating your musical talents or
18:20
like reading cool novels or
18:22
developing your awesome sense of
18:24
fashion or like whatever else
18:26
that makes a human life
18:28
rich and meaningful and like
18:30
makes you a cool, interesting
18:32
person that like other people
18:34
value being around. So Susan Wolf,
18:37
that is like a super great essay
18:39
that I recommend for people about like
18:41
the downsides of being a moral saint.
18:43
So I knew for sure I wanted
18:46
to draw on that. And then I also
18:48
thought about how, with the purist path,
18:50
there's something psychologically appealing to
18:52
some people about it because
18:55
it does make it really easy in
18:57
your brain when you think like what
18:59
is the okay number of flights for
19:01
me to take per year? The answer
19:03
is very clear. Zero. Mm. Like
19:05
you have certainty, you feel like
19:07
boom, it's super clear, I know
19:09
I'm doing my duty, whereas if
19:11
you're like taking the more moderate
19:13
approach, it's like how many flights
19:15
is okay? Is it like three
19:17
per year, one per year, one
19:19
per decade? Like who knows? It
19:21
feels very subjective, right, and can cause anxiety
19:24
and uncertainty. So that made me
19:26
think about this other philosopher,
19:28
Bernard Williams, and I wanted to
19:30
pull in his argument that, you
19:33
know, he says, we love the appeal
19:35
of objectivity, like we have in
19:37
the sciences, it makes us feel
19:39
invulnerable. It makes us feel like
19:41
no one is going to be
19:43
able to come and accuse us
19:45
of making the wrong decision or
19:47
choice, because, look, it's objective. The data
19:49
says, like, that's the thing to do.
19:52
Yeah. And he says it's a fantasy
19:54
to think that we can import objectivity,
19:56
like we have in the sciences, into
19:58
the domain of ethics. Not a thing.
20:00
Okay, it sounds like this has
20:03
been a conversation going on for
20:05
a long time because sometimes I
20:07
think we are more uncomfortable with
20:09
uncertainty now more than ever and
20:11
we try to optimize everything, even
20:13
our interpersonal relationships. Does this feel
20:15
like a newer trend or more
20:17
extreme or is this just like
20:19
a case of, okay, there's nothing
20:21
new under the sun? Yeah, so...
20:23
This kind of actually goes back
20:25
400 years and I won't bore
20:27
you with a whole like history
20:29
spiel but basically this goes back
20:31
to like the invention of calculus
20:33
and like when the scientific method
20:35
was getting rolling and people started
20:37
to like have this notion of
20:40
facts and objectivity and oh my
20:42
god how awesome we can be
20:44
certain about things we can have
20:46
like certain knowledge in the world
20:48
great and that works awesome for
20:50
certain things in life right like
20:52
that works great in science for
20:54
example it works great if you're
20:56
trying to like make antibiotics it
20:58
does not work great in the
21:00
moral domain because like morality is
21:02
notoriously contested we've been arguing for
21:04
thousands of years about what is
21:06
the good and there's still no
21:08
consensus. And we have so many
21:10
different competing moral values. So it's
21:12
not a good fit for that
21:14
kind of thinking. Our culture nowadays
21:17
is so saturated with like tech
21:19
bro energy and like Silicon Valley
21:21
mentality of like we can engineer
21:23
our way through any problem. Every
21:25
life problem is a math problem.
21:27
That it like lulls you into
21:29
thinking that you can bring that
21:31
same like calculator energy to all
21:33
aspects of your life, including
21:35
the moral one and I just
21:37
think that's a fantasy. The next
21:39
time a reader or a listener
21:41
finds themselves with a moral conundrum
21:43
which you know they pop up
21:45
aside from you know calling them
21:47
in or submitting them to you
21:49
what questions do you think they
21:51
should ask themselves to kind of
21:54
like figure out for themselves what
21:56
the quote-unquote right answer is? Yeah,
21:58
okay I love that. I would
22:00
say, look under the hood of your
22:02
dilemma. What are the values that
22:04
you have that are in tension here?
22:06
Because that's what's creating this
22:08
dilemma for you. It's not
22:10
about the actual nitty gritty
22:12
facts of the situation. It's
22:14
about the fact that within
22:16
yourself you have multiple different
22:18
things you value and they're in
22:20
conflict with each other. Can you
22:22
think through these values and figure
22:25
out, is there one you want to
22:27
put more weight on? Because you're not
22:29
putting 100% of your eggs in one
22:31
of the baskets. Or maybe it's like
22:34
a life situation where you're like, I'm
22:36
honest to God, like 50-50 torn between
22:38
these different values. Okay, so how could
22:41
you balance between them? What could that
22:43
look like concretely to balance between them
22:45
and not torture yourself with guilt because
22:47
you're not putting 100% of your eggs
22:50
in one of the baskets? Maybe the
22:52
morally appropriate thing to do in the
22:54
situation is to be like, I'm going
22:56
to put 50% in this basket and
22:59
50% in this one. Maybe that makes
23:01
sense for you in that scenario. Yeah.
23:03
What's a real life example of trying
23:05
to balance between two values? So I'll
23:07
go back to the one that I
23:10
personally find most, it's the one about
23:12
like caring for others, caring for yourself.
23:14
Yeah, yeah. I have really struggled
23:16
with this with my own family.
23:18
I love my family members so
23:21
much. I felt very torn between,
23:23
like, on the one hand, wanting
23:25
to do everything I possibly can
23:28
for them, for my grandmother, for
23:30
my dad, you know, wanting to
23:32
just sacrifice whatever. And on the
23:35
other side, being like, you know, I
23:37
also value my mental health. I
23:39
also like value having well-being
23:42
and, like, stability in my
23:44
life and... You know, if you're
23:46
sacrificing for others to the point
23:48
where it's really taking a massive
23:50
toll on your own mental health,
23:52
is that going to be
23:54
sustainable? First of all, like would
23:56
your family members want that? Is that
23:59
fair to you? You're also a
24:01
human person who counts as much
24:03
as the other people do. Is
24:05
that fair to your friends, your
24:07
children, your other folks you value
24:09
in life who are also counting
24:11
on you to be able to
24:13
show up for them in different
24:15
ways? So that's I think that's
24:17
like probably one that a lot
24:19
of us can relate to. Can
24:22
you give us a preview of
24:24
some questions you'll be answering soon?
24:26
Ooh. Honestly. I feel like it
24:28
might be that time when I
24:30
need to tackle the like, is
24:32
it ethical to stay on Twitter
24:34
or Facebook one? Okay, yeah, I'm
24:36
very much interested in that one.
24:38
I am still on Twitter and
24:40
Facebook. I don't like it, but
24:42
I feel like I need it
24:44
for my job, or staying connected
24:47
to certain friends. So I like
24:49
haven't tackled that question yet, but
24:51
maybe I will. But it's honestly,
24:53
like, it's good because getting
24:55
these questions pokes at me and
24:57
forces me to think through things
24:59
that otherwise I might not like
25:01
fully think through. All right, Sigal
25:03
Samuel, thank you so much for
25:05
explaining this to us. My pleasure,
25:07
thank you. That's Sigal Samuel. She's
25:09
a senior reporter here at Vox
25:12
and writes the Your Mileage May
25:14
Vary column. You can find a
25:16
link to her work in our
25:18
show notes. And just because Sigal
25:20
answers questions, doesn't mean she doesn't
25:22
have some of her own. I
25:24
personally donate a fair bit of
25:26
money to like anti-poverty groups, like
25:28
GiveDirectly, which I think is
25:30
great, but sometimes I do wonder
25:32
on like a systems level, will
25:35
we just always have poverty? Is
25:37
that a built-in feature of our
25:39
world that is impossible to eradicate?
25:41
So, if you have a question
25:43
of your own, give us a
25:45
call. We're at 1-800-618-8545,
25:47
and we would love to hear
25:49
from you. That's it for this
25:51
episode of Explain It to Me.
25:53
It was edited by Jorge Just,
25:55
and fact-checked by Caitlin PenzeyMoog,
25:57
mixing, sound design, and engineering this
26:00
week by Christian Ayala. Supervising producer
26:02
is Carla Javier. Special thanks to
26:04
Patrick Boyd and Rob Byers. If
26:06
you like the work Sigal and
26:08
I do, consider becoming a Vox
26:10
member. Not only does it make
26:12
things like Sigal's column or this
26:14
podcast happen, but it comes with
26:16
some pretty sweet perks too. Go
26:18
to vox.com/members to learn more. Thanks
26:20
for supporting our journalism. Thanks for
26:22
listening. And thanks for calling in
26:25
your questions and reflections. Talk to
26:27
you soon. Bye.