Omar Usman – Grandstanding: The Use and Abuse of Moral Talk – Interview with the Authors

Omar Usman

AI: Summary ©

The speakers discuss how moral and political talk, especially on social media, is often used to seek status and attention rather than to get at the truth. They explain what drives grandstanding, including the satisfaction of expressing outrage, the need to confirm a flattering self-image, and the social and financial rewards that come with being seen as morally special, and they describe confabulation, where people tell themselves stories that hide their status-seeking motives. They also cover how one-upmanship and extreme takes distort discourse, push moderates to check out, and give politicians incentives to endorse vivid but unworkable policies, and they recommend looking inward, ignoring attention-seeking posts, and building healthier norms of public discourse.


00:00:00 --> 00:00:00
			Mhmm.
		
00:00:01 --> 00:00:04
			Okay. So we're joined by Justin Tosi and
		
00:00:04 --> 00:00:07
			Brandon Warmke, authors of the book Grandstanding,
		
00:00:07 --> 00:00:09
			The Use and Abuse of Moral Talk.
		
00:00:10 --> 00:00:12
			Thank you guys for coming on. We wanna
		
00:00:12 --> 00:00:13
			talk about your book. So just to start
		
00:00:13 --> 00:00:15
out with, if you guys could
		
00:00:15 --> 00:00:15
maybe just give a brief
		
00:00:16 --> 00:00:17
			intro,
		
00:00:17 --> 00:00:20
			not necessarily who you are, although definitely share
		
00:00:20 --> 00:00:21
			that, but
		
00:00:21 --> 00:00:24
			how you got into this particular topic because
		
00:00:24 --> 00:00:27
			it's one of those things that you see
		
00:00:27 --> 00:00:29
			it everywhere. And then when it's finally labeled,
		
00:00:29 --> 00:00:31
the light bulb goes off and you're like,
		
00:00:31 --> 00:00:33
			oh, that's what I've been seeing all this
		
00:00:33 --> 00:00:36
			time. That's the hope. Yeah. Thanks. Thanks, Omar,
		
00:00:36 --> 00:00:39
			for having us. We're happy to be here.
		
00:00:39 --> 00:00:41
			Justin and I went to grad school together.
		
00:00:41 --> 00:00:44
			We did our PhDs at the University of
		
00:00:44 --> 00:00:46
			Arizona. And around 2014
		
00:00:47 --> 00:00:49
			or so, I think Justin and I both
		
00:00:49 --> 00:00:49
			noticed
		
00:00:50 --> 00:00:53
			what appeared to us anyway, like a trend
		
00:00:53 --> 00:00:54
			on social media
		
00:00:55 --> 00:00:59
			of people using political and moral discourse on,
		
00:00:59 --> 00:01:00
			say Facebook,
		
00:01:01 --> 00:01:03
			to draw attention to themselves. You know, ostensibly
		
00:01:03 --> 00:01:05
			they were having conversations about the poor or
		
00:01:05 --> 00:01:08
			about immigration or about health care,
		
00:01:09 --> 00:01:12
			traditional family values, you know, abortion, whatever it
		
00:01:12 --> 00:01:13
			was. But
		
00:01:14 --> 00:01:16
			the sense that we got from these conversations
		
00:01:16 --> 00:01:18
in these sorts of slogans, it's like they
		
00:01:18 --> 00:01:20
			were like PR releases, you know, they were
		
00:01:20 --> 00:01:23
like trying to draw attention to
		
00:01:23 --> 00:01:24
			themselves, trying to make sure that they could
		
00:01:24 --> 00:01:26
communicate that they had the right values, a lot
		
00:01:26 --> 00:01:28
			of self centered sort of,
		
00:01:29 --> 00:01:29
			talk. And
		
00:01:30 --> 00:01:32
so we started thinking about this,
		
00:01:32 --> 00:01:33
			you know, what's going on?
		
00:01:35 --> 00:01:37
			And, at the time, the only term for
		
00:01:37 --> 00:01:38
this sort of
		
00:01:40 --> 00:01:41
			status seeking,
		
00:01:42 --> 00:01:42
			behavior
		
00:01:43 --> 00:01:43
			was grandstanding.
		
00:01:44 --> 00:01:44
			So,
		
00:01:46 --> 00:01:47
			believe it or not, this is before the
		
00:01:47 --> 00:01:49
			term virtue signaling. If you or some of
		
00:01:49 --> 00:01:49
			your,
		
00:01:50 --> 00:01:52
			listeners are familiar with the term virtue signaling.
		
00:01:52 --> 00:01:54
			Very much so. The term that's kind of
		
00:01:54 --> 00:01:56
become cachet. I think it makes you
		
00:01:56 --> 00:01:58
sound smart when you use it.
		
00:01:58 --> 00:02:00
			But back then in 2014, this is the
		
00:02:00 --> 00:02:02
only term that was sort of in
		
00:02:02 --> 00:02:04
			the public vernacular. And so,
		
00:02:05 --> 00:02:07
			so at that point, we started writing, you
		
00:02:07 --> 00:02:09
			know, doing some research, doing some writing, and,
		
00:02:09 --> 00:02:11
			eventually it turned into a paper, and then
		
00:02:11 --> 00:02:12
			it turned into a book.
		
00:02:13 --> 00:02:15
But I think the
		
00:02:15 --> 00:02:17
the kernel of this project was Justin and
		
00:02:17 --> 00:02:19
			I looking at people
		
00:02:19 --> 00:02:22
behaving, and including ourselves, to be honest. Yeah.
		
00:02:22 --> 00:02:24
			And talking about morality and politics,
		
00:02:25 --> 00:02:27
			using it as a vanity project, using it
		
00:02:27 --> 00:02:29
			to seek status, using it to impress other
		
00:02:29 --> 00:02:29
			people.
		
00:02:30 --> 00:02:31
So, you know, when you look, you
		
00:02:31 --> 00:02:33
			know, you look online, you look even prior
		
00:02:33 --> 00:02:35
to, let's say, the social media age, that
		
00:02:35 --> 00:02:38
			type of self promotion and whatnot, it's not
		
00:02:38 --> 00:02:41
			uncommon. Right? Like, you have people that will
		
00:02:42 --> 00:02:44
			hype themselves up, get rich quick schemes, all
		
00:02:44 --> 00:02:45
			those kinds of things.
		
00:02:46 --> 00:02:48
			What did you find unique about moral talk
		
00:02:48 --> 00:02:49
			in particular?
		
00:02:54 --> 00:02:57
Well, yeah. So as you note,
		
00:02:57 --> 00:03:00
there's nothing unique either about this
		
00:03:00 --> 00:03:03
			happening just with social media, and there's nothing
		
00:03:03 --> 00:03:04
			unique really about,
		
00:03:06 --> 00:03:07
at least in some ways,
		
00:03:07 --> 00:03:08
			about,
		
00:03:09 --> 00:03:10
			about people,
		
00:03:12 --> 00:03:12
			using
		
00:03:12 --> 00:03:15
			any kind of forum where, you know, your
		
00:03:15 --> 00:03:18
			qualities are displayed to impress people.
		
00:03:18 --> 00:03:19
			So
		
00:03:19 --> 00:03:21
			because we're kind of hardwired to behave this
		
00:03:21 --> 00:03:22
			way,
		
00:03:23 --> 00:03:25
			it's just sort of to be expected that
		
00:03:25 --> 00:03:26
			that when people have an opportunity,
		
00:03:27 --> 00:03:28
			to distinguish themselves,
		
00:03:29 --> 00:03:30
			they take it.
		
00:03:31 --> 00:03:34
			I guess what makes the moral arena so
		
00:03:34 --> 00:03:35
interesting to us is not just that
		
00:03:35 --> 00:03:37
			that we're trained as moral philosophers,
		
00:03:38 --> 00:03:40
			but that I think you can see,
		
00:03:41 --> 00:03:43
			the competition between people,
		
00:03:44 --> 00:03:46
			in a way that's just so inappropriate,
		
00:03:47 --> 00:03:49
			when when people are talking about, you know,
		
00:03:49 --> 00:03:49
			important,
		
00:03:51 --> 00:03:52
matters of justice
		
00:03:52 --> 00:03:53
			or,
		
00:03:53 --> 00:03:56
			or whatever, you know, issue of moral concern,
		
00:03:57 --> 00:03:57
			it's
		
00:03:58 --> 00:04:00
			it seems to us just so transparent
		
00:04:00 --> 00:04:02
			that people are engaged in a kind of
		
00:04:02 --> 00:04:02
			competition
		
00:04:03 --> 00:04:03
			so often,
		
00:04:04 --> 00:04:06
			with their friends, in a way that they
		
00:04:06 --> 00:04:07
			would be
		
00:04:08 --> 00:04:10
			over things that don't really matter
		
00:04:12 --> 00:04:14
			so, I mean, you know, for instance, it
		
00:04:14 --> 00:04:15
			doesn't really matter
		
00:04:15 --> 00:04:17
			that much who has, you know, the best
		
00:04:17 --> 00:04:19
			or loudest singing voice. Right? So, you know,
		
00:04:19 --> 00:04:21
			you can go to, like, a a religious
		
00:04:21 --> 00:04:23
ceremony and you'll see people, like, trying to,
		
00:04:23 --> 00:04:25
you know, outdo one another or be
		
00:04:25 --> 00:04:27
			the most taken with the spirit or,
		
00:04:28 --> 00:04:29
			or or whatever,
		
00:04:30 --> 00:04:32
			depending on on your your mode of worship.
		
00:04:33 --> 00:04:35
So, I mean, that's mostly harmless. But
		
00:04:35 --> 00:04:38
			when you get people talking about some, you
		
00:04:38 --> 00:04:41
			know, contested moral issue, it's important that we'd
		
00:04:41 --> 00:04:43
			be able to see one another as
		
00:04:43 --> 00:04:45
			trying to get it right, you know, just
		
00:04:45 --> 00:04:48
			trying to figure out what justice requires of
		
00:04:48 --> 00:04:48
			us,
		
00:04:49 --> 00:04:50
			trying to figure out the right thing to
		
00:04:50 --> 00:04:52
			do, for its own sake.
		
00:04:53 --> 00:04:54
			And instead,
		
00:04:54 --> 00:04:55
			what we get is
		
00:04:55 --> 00:04:58
behavior that's more at home
		
00:04:59 --> 00:04:59
			in,
		
00:05:00 --> 00:05:02
kind of arenas where it doesn't matter
		
00:05:02 --> 00:05:05
			so much, but, you know, people using
		
00:05:05 --> 00:05:07
			something that that is bigger than themselves,
		
00:05:08 --> 00:05:10
			and turning it into something that is just
		
00:05:10 --> 00:05:11
			about themselves.
		
00:05:11 --> 00:05:13
			And I guess, you know, Brandon and I
		
00:05:13 --> 00:05:15
			wanted to write about this because we found
		
00:05:15 --> 00:05:16
			this so ugly,
		
00:05:16 --> 00:05:19
			and we think, you know, it's so important
		
00:05:19 --> 00:05:21
			that we'd be able to have these conversations,
		
00:05:22 --> 00:05:24
but, you know, in
		
00:05:24 --> 00:05:24
			the meantime,
		
00:05:25 --> 00:05:26
			we have people,
		
00:05:26 --> 00:05:28
			abusing moral talk,
		
00:05:28 --> 00:05:31
			making it, a kind of vanity project instead.
		
00:05:32 --> 00:05:33
			So what
		
00:05:33 --> 00:05:34
and I know you guys talk
		
00:05:34 --> 00:05:37
			about recognition desire in the book, but what
		
00:05:37 --> 00:05:39
			is it that you think that really drives
		
00:05:39 --> 00:05:40
			people from,
		
00:05:40 --> 00:05:42
			you know, if we discuss, let's say
		
00:05:43 --> 00:05:44
			and you gave a good example. Let's say
		
00:05:44 --> 00:05:45
			we talked about,
		
00:05:46 --> 00:05:48
			rent control. We wanna have a discussion about
		
00:05:48 --> 00:05:49
			rent control. Right?
		
00:05:50 --> 00:05:52
			What is it that's motivating people to instead
		
00:05:52 --> 00:05:55
			of, let's say, hash the issue out, maybe
		
00:05:55 --> 00:05:57
			evaluate both sides, come to some sort of
		
00:05:57 --> 00:06:00
			reasonable conclusion even if we disagree.
		
00:06:01 --> 00:06:03
			Rather than that being the objective, why has
		
00:06:03 --> 00:06:05
			the objective shifted so much to
		
00:06:06 --> 00:06:08
			me just trying to outdo you somehow or,
		
00:06:08 --> 00:06:12
like, win imaginary Internet points or
		
00:06:12 --> 00:06:12
			something?
		
00:06:13 --> 00:06:15
Yeah. I think that's a good question.
		
00:06:15 --> 00:06:16
I think
		
00:06:18 --> 00:06:20
			what's at stake is social status.
		
00:06:20 --> 00:06:23
			And when we can get into a discussion
		
00:06:23 --> 00:06:26
			where, you know, I might forward some boring
		
00:06:26 --> 00:06:26
			view,
		
00:06:27 --> 00:06:29
			some real boring, you know, moderate
		
00:06:30 --> 00:06:31
			centrist take about rent control.
		
00:06:33 --> 00:06:36
			That's not gonna impress very many people. What
		
00:06:36 --> 00:06:39
			people tend to be impressed by are vivid,
		
00:06:40 --> 00:06:41
			extreme
		
00:06:41 --> 00:06:43
			claims that reveal
		
00:06:44 --> 00:06:46
			that have a kind of expressive value
		
00:06:47 --> 00:06:50
about their moral insight, about
		
00:06:50 --> 00:06:53
			their moral commitment to these to these values.
		
00:06:53 --> 00:06:54
			And so
		
00:06:54 --> 00:06:56
			what's at stake is social status.
		
00:06:57 --> 00:06:59
And we've turned a lot of social media into a status game.
		
00:06:59 --> 00:07:00
			It's not just people on social media, you
		
00:07:00 --> 00:07:01
			know, it could be,
		
00:07:02 --> 00:07:03
			cable news hosts,
		
00:07:04 --> 00:07:05
			it could be politicians.
		
00:07:05 --> 00:07:08
There's social status to be had. And
		
00:07:08 --> 00:07:10
			the reason is because a lot of us
		
00:07:10 --> 00:07:13
			care about the way people see us.
		
00:07:14 --> 00:07:16
Now, if the three of us get into
		
00:07:16 --> 00:07:18
			a conversation about, say, rent control, and each
		
00:07:18 --> 00:07:20
			of us think that we care deeply
		
00:07:20 --> 00:07:21
			about affordable,
		
00:07:22 --> 00:07:23
			housing or the poor.
		
00:07:24 --> 00:07:26
			And and then, you know, Omar chimes in
		
00:07:26 --> 00:07:28
			and says, you know, we should, you know,
		
00:07:28 --> 00:07:30
			we should cap the rent at this. And
		
00:07:30 --> 00:07:31
then Justin says, are you kidding, Omar?
		
00:07:31 --> 00:07:33
			If you really cared about the poor,
		
00:07:34 --> 00:07:36
			you know, you'd you'd cap the rent even
		
00:07:36 --> 00:07:37
			lower. And then I come in and say,
		
00:07:37 --> 00:07:39
			I'm absolutely disgusted by all of you. If
		
00:07:39 --> 00:07:41
			you truly cared about the poor, you'd endorse
		
00:07:41 --> 00:07:43
			a universal basic income.
		
00:07:44 --> 00:07:46
And so I win. Right? Because
		
00:07:46 --> 00:07:48
it looks, at least to many,
		
00:07:49 --> 00:07:51
			who were involved in this conversation that I
		
00:07:51 --> 00:07:55
			have the most sort of severe, most impressive
		
00:07:55 --> 00:07:56
			commitment to these values.
		
00:07:57 --> 00:07:59
			And the problem is that
		
00:08:00 --> 00:08:03
			what gets status and what expresses value often
		
00:08:04 --> 00:08:04
			diverges
		
00:08:05 --> 00:08:07
			from what's true. Because what's true is often
		
00:08:07 --> 00:08:09
			what's boring. What's true is often what's uninteresting.
		
00:08:10 --> 00:08:13
			And, so a politician, for example, can get
		
00:08:13 --> 00:08:16
			up there and, you know, wax wonkily about
		
00:08:16 --> 00:08:19
the ins and outs of housing
		
00:08:19 --> 00:08:20
			policy,
		
00:08:21 --> 00:08:23
			and and people are gonna tune them out.
		
00:08:23 --> 00:08:25
			Another politician can get up and, you know,
		
00:08:25 --> 00:08:28
			give a fiery speech about punishing landlords and,
		
00:08:29 --> 00:08:31
			and making housing affordable, and we're gonna pass
		
00:08:31 --> 00:08:32
			this law and that law. And it's a
		
00:08:32 --> 00:08:34
			vivid solution to a problem.
		
00:08:34 --> 00:08:36
			And voters just like the rest of us
		
00:08:36 --> 00:08:38
			on social media are taken in by these
		
00:08:38 --> 00:08:40
			sorts of claims. And so, you know, one
		
00:08:40 --> 00:08:42
			way to think about your question is, what's
		
00:08:42 --> 00:08:44
at stake is social status, and how do people
		
00:08:44 --> 00:08:47
			get that by putting their values on display?
		
00:08:47 --> 00:08:50
Because that gets more attention than being
		
00:08:50 --> 00:08:52
boring. And what is the social
		
00:08:52 --> 00:08:53
			status
		
00:08:53 --> 00:08:55
giving people? Like, when I look
		
00:08:55 --> 00:08:57
at especially social media. Right? I'd say,
		
00:08:57 --> 00:08:59
you know, the currency of social media is attention.
		
00:08:59 --> 00:09:00
			The more
		
00:09:00 --> 00:09:02
			eyeballs, likes, views,
		
00:09:03 --> 00:09:04
			comments, whatever that you get, the more that
		
00:09:04 --> 00:09:06
			you're winning in a sense. Right?
		
00:09:08 --> 00:09:12
Is there any real driver beyond just
		
00:09:12 --> 00:09:13
			I just want the attention, and so that
		
00:09:13 --> 00:09:15
			means I'm winning at this game?
		
00:09:16 --> 00:09:18
Kinda like or maybe to put it another
		
00:09:18 --> 00:09:19
way, it's like, how are they
		
00:09:19 --> 00:09:22
			getting rewarded that makes them keep doing it?
		
00:09:22 --> 00:09:24
Yeah. I think there's a lot of
		
00:09:24 --> 00:09:24
			rewards.
		
00:09:25 --> 00:09:26
			Some of it,
		
00:09:26 --> 00:09:29
			is emotional. So one thing we know is
		
00:09:29 --> 00:09:31
			that expressing outrage feels good.
		
00:09:31 --> 00:09:34
			In various studies, if you give people the
		
00:09:34 --> 00:09:34
			option
		
00:09:35 --> 00:09:37
			of reading a story about an injustice that
		
00:09:37 --> 00:09:39
			makes them mad, and then you say, would
		
00:09:39 --> 00:09:42
			you like to read a nice story or
		
00:09:42 --> 00:09:43
			another story about injustice?
		
00:09:44 --> 00:09:46
			They tend to choose the story about injustice
		
00:09:46 --> 00:09:47
			because they like the way,
		
00:09:48 --> 00:09:51
			it feels to feel outrage. It makes people
		
00:09:51 --> 00:09:53
			feel morally superior. So, you know, one thing
		
00:09:53 --> 00:09:55
that's driving this is just the
		
00:09:55 --> 00:09:59
feeling of outrage. It's
		
00:09:59 --> 00:09:59
			satisfying.
		
00:10:00 --> 00:10:02
			Philosophers for centuries have noted this that it
		
00:10:02 --> 00:10:04
			that it feels good to be mad at
		
00:10:04 --> 00:10:06
people when they mess up,
		
00:10:06 --> 00:10:08
			because it makes us feel better about ourselves.
		
00:10:08 --> 00:10:10
			Another thing that's driving this is,
		
00:10:10 --> 00:10:11
			you know,
		
00:10:12 --> 00:10:13
			a lot of
		
00:10:14 --> 00:10:16
			us think that we're morally good people. If
		
00:10:16 --> 00:10:17
			you look at studies,
		
00:10:17 --> 00:10:20
			most people think that they're morally better than
		
00:10:20 --> 00:10:20
			average.
		
00:10:21 --> 00:10:24
			And to maintain this vision of ourselves
		
00:10:25 --> 00:10:25
			to ourselves,
		
00:10:26 --> 00:10:28
			we often have to behave in public in
		
00:10:28 --> 00:10:30
			certain ways to confirm to ourselves that we
		
00:10:30 --> 00:10:31
			are who we think we are. A lot
		
00:10:31 --> 00:10:34
			of our self conception has to do with
		
00:10:34 --> 00:10:36
			how we measure up to others. You know,
		
00:10:36 --> 00:10:38
			you might think you're really funny, and then
		
00:10:38 --> 00:10:40
			you meet some, like, friends who are, like,
		
00:10:40 --> 00:10:41
			really hilarious, and you're, like, oh, I guess
		
00:10:41 --> 00:10:43
I'm not that funny. And then
		
00:10:43 --> 00:10:45
			you go home, and you're, like, the funniest
		
00:10:45 --> 00:10:47
			guy that, you know, your family's ever seen.
		
00:10:47 --> 00:10:49
			And so a lot of the ways that
		
00:10:49 --> 00:10:51
			we think about ourselves are, calibrated to whoever's
		
00:10:51 --> 00:10:52
			around us. And so if you think of
		
00:10:52 --> 00:10:54
			yourself as caring deeply about the poor or
		
00:10:54 --> 00:10:56
			caring deeply about piety,
		
00:10:56 --> 00:10:58
			and then you get around a bunch of
		
00:10:58 --> 00:11:00
			people who seem to care about these things
		
00:11:00 --> 00:11:01
			as much or more than you do, then
		
00:11:01 --> 00:11:03
			you have to behave in certain ways to
		
00:11:03 --> 00:11:04
			as it were,
		
00:11:05 --> 00:11:07
			put on a show to yourself. So that's
		
00:11:07 --> 00:11:09
			another reason why we do these things. And
		
00:11:09 --> 00:11:11
			then I think one that you mentioned,
		
00:11:13 --> 00:11:15
is the status. We want to
		
00:11:15 --> 00:11:17
			be seen as morally good.
		
00:11:17 --> 00:11:20
			Why? Well, one thing I think is true
		
00:11:20 --> 00:11:23
			is that we just want that. Like, we
		
00:11:23 --> 00:11:25
			just want status. We merely want to be
		
00:11:25 --> 00:11:26
seen as better. I think that's also
		
00:11:26 --> 00:11:27
			true.
		
00:11:27 --> 00:11:29
			But also the kinds of things that status
		
00:11:29 --> 00:11:31
			affords us. So here are some things that
		
00:11:31 --> 00:11:33
			status affords us. People defer to me in
		
00:11:33 --> 00:11:34
			public discourse.
		
00:11:34 --> 00:11:36
			People always want to know what I hear,
		
00:11:37 --> 00:11:38
			or people always want to hear what I,
		
00:11:38 --> 00:11:39
			you know, what I what I have to
		
00:11:39 --> 00:11:41
say. So if, you know, I
		
00:11:41 --> 00:11:43
			have 10,000 followers who think that I'm
		
00:11:44 --> 00:11:47
the vanguard of the poor, or
		
00:11:47 --> 00:11:49
I'm the best feminist, or I care
		
00:11:49 --> 00:11:52
			mostly about the American flag or something. And
		
00:11:52 --> 00:11:54
			they're constantly wanting to know what I think,
		
00:11:54 --> 00:11:56
			you know, that sort of attention feels good.
		
00:11:56 --> 00:11:57
And so there's lots of goodies.
		
00:11:58 --> 00:12:01
Some of them are financial goodies that
		
00:12:01 --> 00:12:04
come with being seen as, you know,
		
00:12:04 --> 00:12:05
			having a great trade. I'm sure,
		
00:12:06 --> 00:12:08
you know, in whatever
		
00:12:08 --> 00:12:10
online communities we run in,
		
00:12:11 --> 00:12:13
there are financial goodies involved. Oh,
		
00:12:13 --> 00:12:16
			yeah. There's notoriety, speaking invites. That's right. All
		
00:12:16 --> 00:12:18
			that good stuff. Yeah. And so all that
		
00:12:18 --> 00:12:20
			stuff comes with and can be purchased just
		
00:12:20 --> 00:12:21
			by,
		
00:12:21 --> 00:12:22
			a ticket
		
00:12:22 --> 00:12:24
and what that ticket says is,
		
00:12:24 --> 00:12:27
you know, I'm really morally special.
		
00:12:27 --> 00:12:29
			You know, there's one thing that you mentioned
		
00:12:29 --> 00:12:31
			in the book that I wanted you guys
		
00:12:31 --> 00:12:32
			to elaborate on was
		
00:12:32 --> 00:12:34
			and I haven't heard this term before, so
		
00:12:34 --> 00:12:35
			it caught my eye, was confabulation,
		
00:12:37 --> 00:12:40
			that we make up stories to tell ourselves
		
00:12:40 --> 00:12:41
			to cover up our true intentions. So we
		
00:12:41 --> 00:12:45
			might have one intent, but we give ourselves
		
00:12:45 --> 00:12:47
			a narrative to somehow make ourselves feel better
		
00:12:47 --> 00:12:48
			about it.
		
00:12:49 --> 00:12:51
			Could you explain that concept in a little
		
00:12:51 --> 00:12:52
			bit more detail
		
00:12:52 --> 00:12:54
			and how that works? Yeah.
		
00:12:55 --> 00:12:57
			So, I mean, think about all the things
		
00:12:57 --> 00:12:58
that you do going throughout your
		
00:12:58 --> 00:12:59
			day.
		
00:13:00 --> 00:13:00
			Often,
		
00:13:01 --> 00:13:02
you have some sense of why you
		
00:13:02 --> 00:13:03
			do them.
		
00:13:06 --> 00:13:08
			You know, basically,
		
00:13:08 --> 00:13:10
what you're doing most of the time.
		
00:13:11 --> 00:13:11
			But
		
00:13:12 --> 00:13:14
			if your true motivations for every single thing
		
00:13:14 --> 00:13:15
			you do,
		
00:13:15 --> 00:13:17
			were revealed to yourself,
		
00:13:17 --> 00:13:19
			first, it would be overwhelming,
		
00:13:19 --> 00:13:21
			because it's just so much processing
		
00:13:22 --> 00:13:22
			going on.
		
00:13:23 --> 00:13:25
It makes more sense to have some
		
00:13:25 --> 00:13:27
			of it sort of beneath the level of
		
00:13:27 --> 00:13:29
			of your cognitive attention. So, I mean, just
		
00:13:29 --> 00:13:32
breathing is, like, the most obvious example
		
00:13:32 --> 00:13:34
			of this. Of course, you don't need to
		
00:13:34 --> 00:13:36
			think about, like, when and how deeply you
		
00:13:36 --> 00:13:38
			breathe every time you do it.
		
00:13:38 --> 00:13:42
			So something similar happens, for some of our
		
00:13:42 --> 00:13:44
			more complicated and less reflexive actions.
		
00:13:45 --> 00:13:47
			We just aren't aware of every single thing,
		
00:13:48 --> 00:13:49
			that motivates us.
		
00:13:49 --> 00:13:51
			Another reason for this is,
		
00:13:52 --> 00:13:52
			that
		
00:13:53 --> 00:13:55
			quite possibly some of our motivations are not
		
00:13:55 --> 00:13:58
			so pretty. So it's easy for us to
		
00:13:58 --> 00:14:00
think well of ourselves and continue to
		
00:14:00 --> 00:14:02
do the things that we need to
		
00:14:02 --> 00:14:04
do in order to do well and
		
00:14:04 --> 00:14:06
survive and thrive throughout our lives,
		
00:14:07 --> 00:14:08
			without,
		
00:14:08 --> 00:14:10
all of our deeper motivations
		
00:14:11 --> 00:14:11
			being,
		
00:14:12 --> 00:14:14
laid bare before our eyes and
		
00:14:14 --> 00:14:15
easily accessible,
		
00:14:16 --> 00:14:17
			in our deliberation.
		
00:14:18 --> 00:14:20
			So because we're like this, because,
		
00:14:21 --> 00:14:21
			we basically
		
00:14:22 --> 00:14:23
			are self deceived,
		
00:14:24 --> 00:14:24
			at least,
		
00:14:25 --> 00:14:26
			about a lot of the things that we
		
00:14:26 --> 00:14:27
			do,
		
00:14:29 --> 00:14:30
			we can do things like grandstand
		
00:14:31 --> 00:14:32
			without being aware
		
00:14:33 --> 00:14:33
			that we're
		
00:14:34 --> 00:14:36
engaging in public moral discourse
		
00:14:37 --> 00:14:40
			with the aim of impressing other people.
		
00:14:40 --> 00:14:43
So there are both witting and unwitting grandstanders.
		
00:14:45 --> 00:14:46
So witting grandstanders
		
00:14:47 --> 00:14:49
know what they're doing. They know,
		
00:14:50 --> 00:14:51
			that,
		
00:14:52 --> 00:14:53
			you know, I'm saying this and I want
		
00:14:53 --> 00:14:55
			people to think well of me. I wanna
		
00:14:55 --> 00:14:57
			get this guy, who's trying to make me
		
00:14:57 --> 00:14:58
			look bad, and I wanna look better than
		
00:14:58 --> 00:14:59
			him instead.
		
00:14:59 --> 00:15:00
			Unwitting grandstanders
		
00:15:00 --> 00:15:03
			come up with some other story that's, to
		
00:15:03 --> 00:15:04
refer back to your question, confabulated.
		
00:15:05 --> 00:15:07
Right? So they tell themselves,
		
00:15:07 --> 00:15:09
			I don't know. You know, I'm just so
		
00:15:09 --> 00:15:11
worked up about this. I'm so
		
00:15:11 --> 00:15:13
			upset that this guy doesn't understand how important
		
00:15:13 --> 00:15:15
			it is to take care of the poor,
		
00:15:15 --> 00:15:17
			and I'm gonna go after him and teach
		
00:15:17 --> 00:15:17
			him a lesson.
		
00:15:18 --> 00:15:19
			Whereas,
		
00:15:19 --> 00:15:21
			at least, a significant part of what motivates
		
00:15:22 --> 00:15:25
			someone who does that, just by hypothesis, we
		
00:15:25 --> 00:15:27
			can say, is at least in some cases,
		
00:15:28 --> 00:15:30
			that they wanna look good. They wanna look
		
00:15:30 --> 00:15:32
			like, as Brandon said, the vanguards of the
		
00:15:32 --> 00:15:33
			poor.
		
00:15:33 --> 00:15:35
			They wanna look like the person who cares
		
00:15:35 --> 00:15:37
			the most in this conversation or in this
		
00:15:37 --> 00:15:39
			friend group, or whatever,
		
00:15:39 --> 00:15:40
			about
		
00:15:41 --> 00:15:45
some group of disadvantaged people, say.
		
00:15:45 --> 00:15:47
			But, you know, when you put it this
		
00:15:47 --> 00:15:51
way, this sounds ugly. And, I mean,
		
00:15:51 --> 00:15:52
			Brandon and I get blowback all the time
		
00:15:52 --> 00:15:54
			from people who get really mad and say,
		
00:15:54 --> 00:15:57
			how dare you say that I or anyone,
		
00:15:58 --> 00:16:00
am engaging in moral talk with anything
		
00:16:00 --> 00:16:02
			other than the interest of the people, you
		
00:16:02 --> 00:16:05
know, who we're trying to defend or
		
00:16:05 --> 00:16:06
promote or whatever.
		
00:16:08 --> 00:16:10
You know, how dare you say
		
00:16:10 --> 00:16:12
that anything other than concern for that
		
00:16:12 --> 00:16:12
			is,
		
00:16:13 --> 00:16:16
			is what's motivating us. And we don't think,
		
00:16:16 --> 00:16:18
			that it's only, you know, it's
		
00:16:19 --> 00:16:20
			we don't think that it's generally,
		
00:16:21 --> 00:16:23
			just a matter of people trying to promote
		
00:16:23 --> 00:16:26
their own interests, right, to promote their
		
00:16:26 --> 00:16:28
			status. It can be both,
		
00:16:28 --> 00:16:31
			but it's also quite plausible that people are
		
00:16:31 --> 00:16:32
not always aware,
		
00:16:33 --> 00:16:34
			when they're motivated,
		
00:16:35 --> 00:16:37
			by by status seeking.
		
00:16:38 --> 00:16:40
So one thing and I think
		
00:16:40 --> 00:16:42
both of y'all kinda touched on it is
		
00:16:42 --> 00:16:45
when people grandstand, again, wittingly or unwittingly.
		
00:16:45 --> 00:16:48
There's that one-upsmanship, like, you have to
		
00:16:48 --> 00:16:49
			have a hotter take than the next guy
		
00:16:49 --> 00:16:51
and then a progressively hotter take, otherwise, you're not
		
00:16:51 --> 00:16:53
gonna get any attention or traction.
		
00:16:53 --> 00:16:55
			And we see the effect that that's had
		
00:16:55 --> 00:16:56
			on
		
00:16:56 --> 00:17:00
			political discourse, discourse around social issues, whether it's
		
00:17:00 --> 00:17:03
			racism, feminism, abortion, the election,
		
00:17:04 --> 00:17:04
			whatever.
		
00:17:07 --> 00:17:07
			How do
		
00:17:08 --> 00:17:11
you, let's start with the
		
00:17:11 --> 00:17:12
			mindset of just a consumer. Right? Like, I'm
		
00:17:12 --> 00:17:14
			an average dude. I take out my phone.
		
00:17:14 --> 00:17:16
			I'm scrolling through Facebook or group chats or
		
00:17:16 --> 00:17:18
			whatever. And I see all these people,
		
00:17:18 --> 00:17:21
			you know, engaging in exactly that. What's your
		
00:17:21 --> 00:17:23
			advice for navigating that? Like,
		
00:17:23 --> 00:17:25
			even if I care about some of these
		
00:17:25 --> 00:17:27
			social, let's say, causes,
		
00:17:27 --> 00:17:31
			everything I read just as the days go
		
00:17:31 --> 00:17:33
on, gets more and more polarized.
		
00:17:36 --> 00:17:38
			That's a good question. One thing to do
		
00:17:38 --> 00:17:39
			is to get off Twitter.
		
00:17:39 --> 00:17:40
			Yeah. That
		
00:17:40 --> 00:17:42
you know, we give that advice
		
00:17:42 --> 00:17:44
a lot, you know, and it's not
		
00:17:44 --> 00:17:47
			very satisfying. We recognize it's not very satisfying
		
00:17:47 --> 00:17:47
			advice.
		
00:17:48 --> 00:17:50
			So here are a couple things that we
		
00:17:50 --> 00:17:53
suggest in the book, and
		
00:17:53 --> 00:17:54
			I think these are generally good
		
00:17:55 --> 00:17:58
			things to keep in mind in navigating social
		
00:17:58 --> 00:18:00
			media or, you know, the larger political climate.
		
00:18:00 --> 00:18:02
			So one thing to do
		
00:18:06 --> 00:18:08
			is to be careful about
		
00:18:09 --> 00:18:10
			how we contribute ourselves.
		
00:18:11 --> 00:18:13
			So one question we can always ask ourselves
		
00:18:14 --> 00:18:15
			when we're about to type into Twitter or
		
00:18:15 --> 00:18:16
			Facebook or whatever,
		
00:18:18 --> 00:18:20
			am I doing this to look good? Or
		
00:18:20 --> 00:18:21
			am I doing this to do good? Like,
		
00:18:21 --> 00:18:23
is this thing that I'm about
		
00:18:23 --> 00:18:24
			to say, like, can I tell myself a
		
00:18:24 --> 00:18:26
			story about how this is actually gonna help
		
00:18:26 --> 00:18:28
			someone? Like, is this gonna provide,
		
00:18:28 --> 00:18:30
			you know, evidence for something I believe in?
		
00:18:30 --> 00:18:33
Is this gonna actually help someone stand
		
00:18:33 --> 00:18:35
up for what's right? Or am I
		
00:18:35 --> 00:18:37
doing this to look good? And would
		
00:18:37 --> 00:18:39
			I be disappointed if I didn't get, you
		
00:18:39 --> 00:18:42
know, 5 retweets or 100 likes or
		
00:18:42 --> 00:18:43
			whatever? Am I gonna be disappointed if this
		
00:18:43 --> 00:18:44
			doesn't go viral?
		
00:18:45 --> 00:18:47
			If you're gonna be disappointed, that suggests,
		
00:18:48 --> 00:18:49
			at
		
00:18:49 --> 00:18:50
			least in our view,
		
00:18:52 --> 00:18:55
			the wrong kind of priority in engaging in
		
00:18:55 --> 00:18:57
public discourse. You know, we think that
		
00:18:57 --> 00:18:59
the good kinds of priorities don't have to
		
00:18:59 --> 00:19:01
			do with status seeking and making ourselves look
		
00:19:01 --> 00:19:03
			good and promoting our brand on Twitter or
		
00:19:03 --> 00:19:04
			Facebook.
		
00:19:05 --> 00:19:07
			These are really important issues, and they call
		
00:19:07 --> 00:19:09
			for more than the promotion of our own
		
00:19:09 --> 00:19:10
			reputation. So one thing that we recommend in
		
00:19:10 --> 00:19:12
the book
		
00:19:13 --> 00:19:15
is to sort of look inward. Right? So,
		
00:19:15 --> 00:19:17
			you know, a lot of people read the
		
00:19:17 --> 00:19:18
			book or come across the work, and they
		
00:19:18 --> 00:19:20
			wanna know, like, alright, show me who the
		
00:19:20 --> 00:19:23
			grandstander is. Give me the test for who's
		
00:19:23 --> 00:19:23
			grandstanding.
		
00:19:24 --> 00:19:27
			And it's a perfectly understandable question that people
		
00:19:27 --> 00:19:28
			have, and we totally get it. And in
		
00:19:28 --> 00:19:30
			the book, we do go through some
		
00:19:31 --> 00:19:33
some ways that grandstanding tends to rear
		
00:19:33 --> 00:19:36
			its head in discourse. But we think that,
		
00:19:36 --> 00:19:37
			you know, this sort
		
00:19:38 --> 00:19:38
			of response
		
00:19:39 --> 00:19:41
that people have, alright, let
		
00:19:41 --> 00:19:42
me at the grandstanders, and let me go
		
00:19:42 --> 00:19:44
get them, is
		
00:19:44 --> 00:19:45
the wrong kind of response. What we
		
00:19:45 --> 00:19:48
			recommend is really turning our moral gaze
		
00:19:49 --> 00:19:52
			away from others and onto ourselves, and asking
		
00:19:52 --> 00:19:52
			ourselves,
		
00:19:53 --> 00:19:54
			you know, what can I do to make
		
00:19:54 --> 00:19:57
			this discourse healthier and less toxic? So that's
		
00:19:57 --> 00:20:00
			that's one thing. Another thing very briefly is
		
00:20:00 --> 00:20:00
			just to,
		
00:20:02 --> 00:20:04
			you know, when you come across stuff on
		
00:20:04 --> 00:20:05
			Twitter or,
		
00:20:05 --> 00:20:07
			Facebook or social media generally,
		
00:20:09 --> 00:20:11
			that looks self serving,
		
00:20:11 --> 00:20:13
			where it looks like someone is offering a
		
00:20:13 --> 00:20:14
			hot take
		
00:20:15 --> 00:20:17
			or offering an extreme hyperbolic
		
00:20:17 --> 00:20:18
			response.
		
00:20:19 --> 00:20:21
			If it looks like something that might be
		
00:20:21 --> 00:20:22
			attention seeking, just ignore
		
00:20:22 --> 00:20:24
			it. Just ignore it. I mean, one thing
		
00:20:24 --> 00:20:26
			that grandstanders want is your attention. That's what
		
00:20:26 --> 00:20:28
they thrive on. That's what,
		
00:20:28 --> 00:20:29
			you know, keeps them going.
		
00:20:29 --> 00:20:31
			Do you consider people
		
00:20:31 --> 00:20:33
			you know, there's always people who are
		
00:20:34 --> 00:20:36
			it's not you know, we used to say,
		
00:20:36 --> 00:20:38
			like, playing devil's advocate, but I think now
		
00:20:38 --> 00:20:40
			the more sophisticated version of that is people
		
00:20:40 --> 00:20:42
			who want to offer, like, a
		
00:20:43 --> 00:20:46
nuanced contrarian take to the discussion, whether they
		
00:20:46 --> 00:20:47
			kind of believe it or not. Is that
		
00:20:47 --> 00:20:49
			the same or is that a little bit
		
00:20:49 --> 00:20:49
			of a different
		
00:20:50 --> 00:20:53
			Yeah. I don't think it's the same. I
		
00:20:53 --> 00:20:55
think there is value and
		
00:20:55 --> 00:20:56
			virtue
		
00:20:56 --> 00:20:57
			in people
		
00:20:59 --> 00:21:00
			offering in good faith
		
00:21:01 --> 00:21:02
reasons and evidence,
		
00:21:03 --> 00:21:04
			maybe for things they don't believe,
		
00:21:05 --> 00:21:06
			in an effort to,
		
00:21:08 --> 00:21:09
			you know, figure out what what the truth
		
00:21:09 --> 00:21:12
			is. I think those are perfectly valuable things.
		
00:21:12 --> 00:21:14
			And I, you know, and one thing, you
		
00:21:14 --> 00:21:15
			know, I think you're
		
00:21:15 --> 00:21:17
			sort of hinting at here is that it's
		
00:21:17 --> 00:21:19
			very difficult to tell when someone's grandstanding. Yeah.
		
00:21:19 --> 00:21:21
			It's very difficult to tell. And the reason
		
00:21:21 --> 00:21:23
			is because grandstanding has two parts. There's the
		
00:21:23 --> 00:21:25
			thing that you say, and there's the reason
		
00:21:25 --> 00:21:26
			why you say it, your motivation.
		
00:21:27 --> 00:21:29
			And that is hidden from us. That we
		
00:21:29 --> 00:21:31
			don't get to see. I don't get to
		
00:21:31 --> 00:21:33
			poke, you know, peer into your head, Omar,
		
00:21:33 --> 00:21:34
and know why you say the things you say
		
00:21:36 --> 00:21:38
on Twitter. And so,
		
00:21:38 --> 00:21:40
it can be very difficult, but that's
		
00:21:40 --> 00:21:41
			just part of life. I mean, it's very
		
00:21:41 --> 00:21:43
			difficult to know when someone's lying. It's very
		
00:21:43 --> 00:21:45
			difficult to know when someone's bragging or,
		
00:21:46 --> 00:21:48
engaging in demagoguery or, you know, this
		
00:21:48 --> 00:21:50
			thing called humble bragging where Yeah. People, you
		
00:21:50 --> 00:21:52
			know, people say things like, oh, I can't
		
00:21:52 --> 00:21:54
my boss gives me all
		
00:21:54 --> 00:21:57
			the most important assignments. I just can't believe
		
00:21:57 --> 00:21:58
			it, you know.
		
00:21:59 --> 00:22:00
So those sorts of things are all
		
00:22:00 --> 00:22:02
			things that involve a kind of motivation that's
		
00:22:02 --> 00:22:04
			sort of hidden from us, and it's very
		
00:22:04 --> 00:22:06
			difficult to tell. And that's why we
		
00:22:06 --> 00:22:09
caution, you know, against going around
		
00:22:09 --> 00:22:10
			accusing people of grandstanding.
		
00:22:11 --> 00:22:12
But also, you know, if you
		
00:22:12 --> 00:22:14
see someone that, you know,
		
00:22:14 --> 00:22:16
			you think is grandstanding, you know, maybe you
		
00:22:16 --> 00:22:18
			just ignore it. And the hope is that
		
00:22:18 --> 00:22:20
			we can sort of change
		
00:22:20 --> 00:22:22
			the norms of public discourse so that this
		
00:22:22 --> 00:22:25
			sort of self centered, self aggrandizing moral discourse
		
00:22:25 --> 00:22:27
			becomes a little embarrassing.
		
00:22:27 --> 00:22:29
			So there's another part of the discourse
		
00:22:29 --> 00:22:31
that happens, and you see a
		
00:22:31 --> 00:22:33
lot, where people are, let's say they are
		
00:22:33 --> 00:22:35
well-intentioned, they're trying to engage and
		
00:22:35 --> 00:22:36
			learn about a topic.
		
00:22:37 --> 00:22:40
			But because all the information has become so
		
00:22:40 --> 00:22:40
			polarized,
		
00:22:41 --> 00:22:43
			they don't like, there's no guidance on how
		
00:22:43 --> 00:22:45
to navigate it. And I'll give an
		
00:22:45 --> 00:22:47
			example to highlight what I mean. And,
		
00:22:48 --> 00:22:50
			you know, when we're talking about, let's
		
00:22:50 --> 00:22:50
			say,
		
00:22:51 --> 00:22:53
			why do people vote a certain way? Right?
		
00:22:53 --> 00:22:55
And I'll try to keep this unbiased,
		
00:22:55 --> 00:22:57
			but you have one side saying, well,
		
00:22:58 --> 00:23:00
			everyone that voted a certain way is racist.
		
00:23:00 --> 00:23:02
			And the other side is saying, well, everyone
		
00:23:02 --> 00:23:03
			who voted this way is trying to destroy
		
00:23:03 --> 00:23:06
			the country or whatever. Right? And so
		
00:23:06 --> 00:23:06
			their
		
00:23:07 --> 00:23:08
			entire
		
00:23:08 --> 00:23:10
			camps of people get painted with
		
00:23:11 --> 00:23:13
			the most extreme views of the other. So
		
00:23:13 --> 00:23:14
			me coming in saying, like, okay, let me
		
00:23:14 --> 00:23:15
			try to understand,
		
00:23:16 --> 00:23:18
			is everyone that
		
00:23:18 --> 00:23:21
			says this or believes this actually racist? And
		
00:23:21 --> 00:23:24
it's hard to navigate those waters because
		
00:23:25 --> 00:23:27
			every issue gets turned into the most extreme
		
00:23:27 --> 00:23:28
			version of it.
		
00:23:29 --> 00:23:31
Yeah. I mean, what you're pointing out,
		
00:23:31 --> 00:23:34
			I think, is a huge problem. We talk
		
00:23:34 --> 00:23:36
			about it, a good bit in the book,
		
00:23:36 --> 00:23:36
			about,
		
00:23:38 --> 00:23:39
so, I mean, there are a couple
		
00:23:39 --> 00:23:40
of parts here.
		
00:23:41 --> 00:23:42
			One is that
		
00:23:43 --> 00:23:44
			when
		
00:23:45 --> 00:23:47
a discourse is rich in grandstanding,
		
00:23:48 --> 00:23:50
there's status to be gained by talking about
		
00:23:50 --> 00:23:52
			how much you hate the other side.
		
00:23:52 --> 00:23:54
			And when people do that often, you know,
		
00:23:54 --> 00:23:56
			what they do is they
		
00:23:56 --> 00:23:57
			give characterizations
		
00:23:58 --> 00:23:58
			of,
		
00:23:59 --> 00:24:01
characterizations of the other side that are
		
00:24:01 --> 00:24:03
			ridiculous, that are caricatures,
		
00:24:03 --> 00:24:04
			basically.
		
00:24:05 --> 00:24:08
			They will say, you know,
		
00:24:08 --> 00:24:09
			that,
		
00:24:09 --> 00:24:12
all Republicans are just rich white
		
00:24:12 --> 00:24:12
			men,
		
00:24:13 --> 00:24:14
			or,
		
00:24:14 --> 00:24:16
you know, everyone in the Democratic party,
		
00:24:16 --> 00:24:18
			like, half of them are gay.
		
00:24:19 --> 00:24:21
And there's social science that we cite
		
00:24:21 --> 00:24:23
in the book showing this, that if
		
00:24:23 --> 00:24:25
you ask people, you know, what
		
00:24:26 --> 00:24:28
percentage of people who vote for this
		
00:24:28 --> 00:24:30
			party have this or that trait, they're off
		
00:24:30 --> 00:24:33
by factors of, like, 10 and 20. And
		
00:24:33 --> 00:24:33
			so
		
00:24:34 --> 00:24:36
it's really harmful, in other words, that
		
00:24:37 --> 00:24:39
			our discourse is overrun
		
00:24:40 --> 00:24:40
			with,
		
00:24:41 --> 00:24:43
			people who are so eager
		
00:24:43 --> 00:24:44
			to paint the other side,
		
00:24:45 --> 00:24:47
			in the most extreme terms
		
00:24:47 --> 00:24:48
			possible,
		
00:24:48 --> 00:24:50
			because it gives everyone else a distorted
		
00:24:50 --> 00:24:51
			view,
		
00:24:52 --> 00:24:55
			of what's really going on, of what people
		
00:24:55 --> 00:24:57
are really like, who support one or another
		
00:24:57 --> 00:24:58
			party or candidate.
		
00:24:59 --> 00:25:00
So it's just as you say, the
		
00:25:00 --> 00:25:01
			information,
		
00:25:01 --> 00:25:02
			becomes
		
00:25:02 --> 00:25:05
kind of politicized or polarized, I think
		
00:25:05 --> 00:25:06
you said, or moralized,
		
00:25:07 --> 00:25:08
			in a really misleading way.
		
00:25:09 --> 00:25:11
			Another thing, the other part,
		
00:25:11 --> 00:25:14
			of this that happens is people get sick
		
00:25:14 --> 00:25:15
			of it.
		
00:25:15 --> 00:25:18
			People don't want to engage in these kinds
		
00:25:18 --> 00:25:20
			of discussions. They're not helpful. They're over the
		
00:25:21 --> 00:25:22
			top. They're very heated.
		
00:25:22 --> 00:25:25
			There's no progress made. And so what happens
		
00:25:25 --> 00:25:27
			is that the people who don't find it
		
00:25:28 --> 00:25:28
			very,
		
00:25:28 --> 00:25:30
			either, you know, emotionally fulfilling,
		
00:25:31 --> 00:25:33
			or who aren't getting status, who aren't very
		
00:25:33 --> 00:25:35
interested in getting status from engaging in
		
00:25:35 --> 00:25:38
			in these kinds of discussions, they check out.
		
00:25:38 --> 00:25:40
So, you know, we have this discussion
		
00:25:40 --> 00:25:42
in the book called moderates check out,
		
00:25:42 --> 00:25:44
			because this is what happens.
		
00:25:44 --> 00:25:46
			People who are kind of in the middle,
		
00:25:46 --> 00:25:47
			who have nuanced views,
		
00:25:48 --> 00:25:50
who are kinda tired of saying, well,
		
00:25:50 --> 00:25:53
			you know, I think abortion's a really complicated
		
00:25:53 --> 00:25:55
			issue and, you know, I'm not sure what
		
00:25:55 --> 00:25:57
			to think about this one wrinkle of it
		
00:25:57 --> 00:25:58
			and or, you know, it seems like the
		
00:25:58 --> 00:26:00
reasons are kind of in tension here,
		
00:26:00 --> 00:26:02
and then they get
		
00:26:03 --> 00:26:06
dogpiled. So for someone like that, what's the
		
00:26:06 --> 00:26:08
point in engaging in a moral discussion
		
00:26:08 --> 00:26:09
			with people who are just going to rush
		
00:26:09 --> 00:26:11
			to villainize you, to show that, you know,
		
00:26:12 --> 00:26:14
			you're supposedly not morally pure?
		
00:26:15 --> 00:26:17
			So these people check out.
		
00:26:17 --> 00:26:19
So it's awful for them, of course,
		
00:26:20 --> 00:26:21
to go through that, but it also
		
00:26:21 --> 00:26:24
			deprives the rest of us of reasonable people,
		
00:26:26 --> 00:26:28
giving reasons and arguments
		
00:26:28 --> 00:26:29
			that we're otherwise
		
00:26:30 --> 00:26:30
			quite
		
00:26:31 --> 00:26:33
plausibly not going to think of,
		
00:26:33 --> 00:26:35
			that we then won't discuss,
		
00:26:35 --> 00:26:38
and then people will further polarize,
		
00:26:39 --> 00:26:40
			I mean, even more,
		
00:26:40 --> 00:26:43
			because they're only hearing, the most extreme views,
		
00:26:43 --> 00:26:45
and they think, well, it's, I guess,
		
00:26:45 --> 00:26:45
			it's either
		
00:26:46 --> 00:26:48
this group of crazy people or this
		
00:26:48 --> 00:26:50
			other group of crazy people, and I'm closer
		
00:26:50 --> 00:26:53
			to, you know, this group of crazy people,
		
00:26:53 --> 00:26:54
			so I guess I'm with them.
		
00:26:55 --> 00:26:57
			So, you know, this is another reason to
		
00:26:57 --> 00:26:58
			to think that we need,
		
00:26:58 --> 00:27:00
to get people to calm down about
		
00:27:00 --> 00:27:01
			grandstanding.
		
00:27:02 --> 00:27:04
			How do you reengage those moderate voices? Like,
		
00:27:04 --> 00:27:05
			I feel
		
00:27:06 --> 00:27:07
			when I think of people that have those
		
00:27:07 --> 00:27:10
			moderate voices and takes, instead of talking about
		
00:27:10 --> 00:27:11
politics, they just kind of, like you
		
00:27:11 --> 00:27:13
			said, they check out and they're like, I'm
		
00:27:13 --> 00:27:16
			gonna dedicate all of my intellectual energy to
		
00:27:16 --> 00:27:19
fantasy football. Like, you know, instead of,
		
00:27:19 --> 00:27:21
			you know, something like that. But how do
		
00:27:21 --> 00:27:23
			you reengage and get those voices back? Because
		
00:27:23 --> 00:27:25
			they, you know, they often do have
		
00:27:25 --> 00:27:28
			a more informed or more nuanced or, you
		
00:27:28 --> 00:27:30
know, just a healthier view on a
		
00:27:30 --> 00:27:31
			lot of subjects?
		
00:27:31 --> 00:27:34
That's a really hard question. I think
		
00:27:34 --> 00:27:35
			that one thing that happens,
		
00:27:36 --> 00:27:36
			is
		
00:27:37 --> 00:27:39
people who are interested in morality and
		
00:27:39 --> 00:27:40
			politics,
		
00:27:40 --> 00:27:41
			will find another forum.
		
00:27:42 --> 00:27:44
So they'll find a private, you
		
00:27:44 --> 00:27:46
			know, more contained community of
		
00:27:46 --> 00:27:48
people who maybe are more reasonable,
		
00:27:49 --> 00:27:50
			who actually like
		
00:27:50 --> 00:27:52
			following arguments where they lead.
		
00:27:53 --> 00:27:55
			This used to be called the philosophy department,
		
00:27:56 --> 00:27:56
			But
		
00:27:56 --> 00:27:58
			Now it's a WhatsApp group. Not working out
		
00:27:58 --> 00:28:00
			so much for us anymore.
		
00:28:02 --> 00:28:04
But then, you know, this
		
00:28:04 --> 00:28:05
			is not a great solution because
		
00:28:07 --> 00:28:10
			then maybe those people can have worthwhile discussions,
		
00:28:10 --> 00:28:12
			but then the rest of the world doesn't
		
00:28:12 --> 00:28:14
hear about them. Or maybe we hope
		
00:28:14 --> 00:28:15
			that it trickles out somehow.
		
00:28:16 --> 00:28:17
			But,
		
00:28:17 --> 00:28:20
			you know, other than that, I mean, you're
		
00:28:20 --> 00:28:22
getting into things that are, like,
		
00:28:23 --> 00:28:24
			you know,
		
00:28:24 --> 00:28:27
			tinkering with the algorithm on social media, which
		
00:28:27 --> 00:28:29
they're not going to do. Right? Because
		
00:28:29 --> 00:28:31
there's a study showing if you wanna go
		
00:28:31 --> 00:28:32
			viral on social media,
		
00:28:33 --> 00:28:34
			a great way to do so is to,
		
00:28:35 --> 00:28:38
invoke various moral-emotional terms: hatred,
		
00:28:39 --> 00:28:40
			you know, unjust,
		
00:28:41 --> 00:28:42
			you know, eviscerated,
		
00:28:43 --> 00:28:46
you know, Ben Shapiro eviscerates this person, Jon
		
00:28:46 --> 00:28:46
			Stewart.
		
00:28:47 --> 00:28:49
			The entrails are everywhere. Right?
		
00:28:49 --> 00:28:50
			So people know
		
00:28:51 --> 00:28:51
			that
		
00:28:52 --> 00:28:53
this is a good way, you know,
		
00:28:53 --> 00:28:55
			to, you know, catastrophize,
		
00:28:55 --> 00:28:57
			to make things, you know, really blown up
		
00:28:57 --> 00:28:59
			and fiery. This is a good way to
		
00:28:59 --> 00:29:00
			to go viral,
		
00:29:00 --> 00:29:02
and of course, social media firms are
		
00:29:02 --> 00:29:04
			well aware of this. They could tinker with
		
00:29:04 --> 00:29:05
			their algorithms,
		
00:29:06 --> 00:29:08
so that, you know, that stuff is
		
00:29:08 --> 00:29:09
demoted and maybe more reasonable
		
00:29:11 --> 00:29:12
talk is promoted.
		
00:29:14 --> 00:29:16
			I think they probably have every reason not
		
00:29:16 --> 00:29:17
			to do that,
		
00:29:18 --> 00:29:18
			economically.
		
00:29:19 --> 00:29:20
			Yeah. But if they, you know, they wanna
		
00:29:20 --> 00:29:22
			get civic virtue all of a sudden,
		
00:29:23 --> 00:29:24
I think that would be a great
		
00:29:24 --> 00:29:24
			thing.
		
00:29:25 --> 00:29:27
			And then there was one thing that y'all
		
00:29:27 --> 00:29:28
			mentioned in the book was
		
00:29:29 --> 00:29:31
			one arena where it's a little bit different,
		
00:29:31 --> 00:29:33
			and that was, I think, politics as a
		
00:29:33 --> 00:29:35
			morality pageant where
		
00:29:36 --> 00:29:37
			people are almost expecting
		
00:29:37 --> 00:29:40
			or wanting grandstanding in a sense. Because it's
		
00:29:41 --> 00:29:43
			signaling to them sort of like what side
		
00:29:43 --> 00:29:44
			someone is on.
		
00:29:46 --> 00:29:47
			What have you seen, like,
		
00:29:48 --> 00:29:50
now? I know that's obviously always been
		
00:29:50 --> 00:29:52
			there, but how is it different now with
		
00:29:52 --> 00:29:54
			all the social media as compared to maybe
		
00:29:54 --> 00:29:55
			even just a few years ago?
		
00:29:57 --> 00:29:58
			One of the things we know
		
00:29:59 --> 00:30:00
			is
		
00:30:00 --> 00:30:01
			what people
		
00:30:03 --> 00:30:06
			say when they're asked, why do you vote
		
00:30:06 --> 00:30:07
			the way you vote? And one of the
		
00:30:07 --> 00:30:10
			things that's really important to voters is to
		
00:30:10 --> 00:30:11
			share the values,
		
00:30:11 --> 00:30:13
			to share the moral
		
00:30:14 --> 00:30:17
			convictions of politicians that represent them. They wanna
		
00:30:17 --> 00:30:19
			know this politician cares about them. They wanna
		
00:30:19 --> 00:30:22
			know this politician roughly cares about the things
		
00:30:22 --> 00:30:23
			that they care about.
		
00:30:24 --> 00:30:25
			This gives an incentive
		
00:30:26 --> 00:30:26
			to
		
00:30:27 --> 00:30:29
			now in the abstract, that's, you know, that's
		
00:30:29 --> 00:30:31
			perfectly fine. You might think it's perfectly fine
		
00:30:31 --> 00:30:33
			for voters to want politicians to share their
		
00:30:33 --> 00:30:36
			values and and agree with them about these
		
00:30:36 --> 00:30:38
			sort of foundational moral issues.
		
00:30:38 --> 00:30:41
			Here's the problem. The problem is this gives,
		
00:30:41 --> 00:30:44
as you know, this gives an incentive
		
00:30:44 --> 00:30:44
			to politicians
		
00:30:45 --> 00:30:47
			to put their values on display.
		
00:30:48 --> 00:30:48
			And,
		
00:30:50 --> 00:30:51
			and so we know
		
00:30:51 --> 00:30:55
that there are really simple ways for politicians to
		
00:30:55 --> 00:30:56
			put their values on display.
		
00:30:57 --> 00:30:59
			They can use a hashtag. They can use
		
00:30:59 --> 00:31:00
			a slogan. They can get on Twitter and
		
00:31:00 --> 00:31:03
			put something in their bio. And what this
		
00:31:03 --> 00:31:05
			does is that it easily encapsulates
		
00:31:05 --> 00:31:09
			this really simple value that they know voters
		
00:31:09 --> 00:31:09
			want.
		
00:31:10 --> 00:31:12
			So, you know, for the left, it might
		
00:31:12 --> 00:31:14
			be something having to do with black lives
		
00:31:14 --> 00:31:14
			matter.
		
00:31:15 --> 00:31:17
			So you'd imagine a voter on the left
		
00:31:17 --> 00:31:20
			thinking, I want a politician to care deeply
		
00:31:20 --> 00:31:21
			about this issue. And so what does a
		
00:31:21 --> 00:31:24
			politician do? Well, they put on their bio
		
00:31:24 --> 00:31:26
hashtag Black Lives Matter. On the right, it
		
00:31:26 --> 00:31:29
could be about, you know, enforcing
		
00:31:29 --> 00:31:30
			the border or whatever. And so, you know,
		
00:31:30 --> 00:31:32
a politician puts some slogan in
		
00:31:32 --> 00:31:34
their bio. Oh, I got weekly
		
00:31:34 --> 00:31:37
			mailers, like, physical flyers at least 2 or
		
00:31:37 --> 00:31:39
			3 a week for the past month.
		
00:31:39 --> 00:31:40
			All of them were
		
00:31:41 --> 00:31:44
about which candidate was anti-abortion or pro-abortion.
		
00:31:44 --> 00:31:46
Like Yeah. There was literally, like,
		
00:31:46 --> 00:31:49
			they knew that signaling that one issue would
		
00:31:49 --> 00:31:50
			sway the vote.
		
00:31:51 --> 00:31:54
Right. Yeah. So politicians collect these sorts of,
		
00:31:54 --> 00:31:56
like, moral trinkets. Right? And they
		
00:31:56 --> 00:31:58
put them in their bios. Now,
		
00:31:58 --> 00:32:00
			again, you might think, okay. So what's the
		
00:32:00 --> 00:32:02
			problem? Right? What's the problem with showing that
		
00:32:02 --> 00:32:03
you care or showing that
		
00:32:03 --> 00:32:06
			you have these values? Well, here's a problem.
		
00:32:08 --> 00:32:10
			A lot of the times, the policies
		
00:32:11 --> 00:32:11
			that
		
00:32:12 --> 00:32:12
			politicians
		
00:32:13 --> 00:32:13
			endorse
		
00:32:14 --> 00:32:15
			to express their values
		
00:32:17 --> 00:32:18
			are not workable.
		
00:32:19 --> 00:32:21
			They don't actually accomplish
		
00:32:21 --> 00:32:22
			the values
		
00:32:22 --> 00:32:25
			they purport to defend. And here's just one
		
00:32:25 --> 00:32:26
			example, and it has to do with the
		
00:32:26 --> 00:32:28
			rent control that you mentioned earlier.
		
00:32:28 --> 00:32:31
			So a politician might say, look, I really
		
00:32:31 --> 00:32:32
			care about the poor. I really care about,
		
00:32:32 --> 00:32:34
			affordable housing.
		
00:32:34 --> 00:32:38
			And so I unveil this policy that vividly
		
00:32:38 --> 00:32:40
			and clearly shows that I care about the
		
00:32:40 --> 00:32:42
			poor. I'm gonna make it illegal for landlords
		
00:32:42 --> 00:32:44
			to charge x amount of dollars for apartments
		
00:32:44 --> 00:32:45
			in San Francisco.
		
00:32:45 --> 00:32:48
			That shows vividly that I care.
		
00:32:49 --> 00:32:51
			However, you know, if you ask,
		
00:32:51 --> 00:32:53
			you know, economists about rent control to a
		
00:32:53 --> 00:32:55
			person, they will tell you that rent control
		
00:32:55 --> 00:32:57
			laws reduce the quality and quantity of housing.
		
00:32:58 --> 00:32:59
			So here's the problem.
		
00:32:59 --> 00:33:00
			We have politicians
		
00:33:01 --> 00:33:01
			endorsing
		
00:33:03 --> 00:33:03
			policies
		
00:33:04 --> 00:33:06
			for their expressive value in order to get
		
00:33:06 --> 00:33:09
			elected, in order to, you know, raise money,
		
00:33:09 --> 00:33:10
			in order to,
		
00:33:10 --> 00:33:13
			you know, get on cable news at night.
		
00:33:13 --> 00:33:15
			Those are the things they really want.
		
00:33:15 --> 00:33:17
			They endorse policies
		
00:33:18 --> 00:33:20
			to do those things, and yet were those
		
00:33:20 --> 00:33:21
			policies to actually be implemented,
		
00:33:22 --> 00:33:25
			they would undermine the values they purport to
		
00:33:25 --> 00:33:27
			care about. And this happens on the right
		
00:33:27 --> 00:33:29
			and the left. And so there's this problem
		
00:33:29 --> 00:33:31
			in democratic societies
		
00:33:32 --> 00:33:35
			where voters care deeply about having their politicians
		
00:33:35 --> 00:33:36
			express their values.
		
00:33:36 --> 00:33:39
			Politicians have a strong incentive to put these
		
00:33:39 --> 00:33:41
			values on display with these policies.
		
00:33:42 --> 00:33:43
			But the problem is a lot of times
		
00:33:43 --> 00:33:45
			these policies that express values
		
00:33:45 --> 00:33:47
			don't actually do what they're supposed to do.
		
00:33:48 --> 00:33:50
			And voters often don't know that. I
		
00:33:50 --> 00:33:51
			mean, if you ask a lot
		
00:33:51 --> 00:33:52
			of voters, you know, can you
		
00:33:52 --> 00:33:54
			explain the economics of rent control? They're gonna
		
00:33:54 --> 00:33:57
			say, what? No. I can't. But I know
		
00:33:57 --> 00:33:59
			that this guy has a really vivid proposal
		
00:33:59 --> 00:34:01
			to solve this problem, and so I'm gonna
		
00:34:01 --> 00:34:03
			vote for him. And it almost seems like
		
00:34:03 --> 00:34:06
			the grandstanding effect makes it
		
00:34:06 --> 00:34:08
			impossible to have the policy discussion.
		
00:34:09 --> 00:34:10
			Because you have to pick
		
00:34:10 --> 00:34:12
			a side, like, you either have to be
		
00:34:13 --> 00:34:15
			for the poor or you're automatically against the
		
00:34:15 --> 00:34:16
			poor. And so
		
00:34:17 --> 00:34:19
			having the policy discussion will almost never happen.
		
00:34:21 --> 00:34:22
			Yeah. It's almost like,
		
00:34:24 --> 00:34:26
			morality trumps everything else.
		
00:34:27 --> 00:34:29
			So if I say this policy
		
00:34:30 --> 00:34:32
			is the most moral or this policy
		
00:34:32 --> 00:34:32
			is the
		
00:34:33 --> 00:34:33
			most,
		
00:34:35 --> 00:34:36
			you know, just policy,
		
00:34:36 --> 00:34:38
			and someone asks, well, yeah, but, you
		
00:34:38 --> 00:34:40
			know, what are the economics of this policy?
		
00:34:40 --> 00:34:42
			Like, what would happen if
		
00:34:42 --> 00:34:44
			we actually implemented that? Would it backfire? Like,
		
00:34:44 --> 00:34:45
			whoa, whoa, whoa. Like,
		
00:34:46 --> 00:34:48
			I care about, you know, which
		
00:34:48 --> 00:34:50
			policy is moral. Why are you bringing economics
		
00:34:50 --> 00:34:52
			into this? Why are you bringing in practical questions?
		
00:34:52 --> 00:34:53
			We're trying to
		
00:34:53 --> 00:34:56
			promote justice here. I don't have time for
		
00:34:56 --> 00:34:57
			your facts and figures. There are a lot
		
00:34:57 --> 00:34:59
			of people, if you spend much time on
		
00:34:59 --> 00:35:01
			Twitter, who talk that way, who don't care
		
00:35:01 --> 00:35:03
			about, as you know, the details
		
00:35:03 --> 00:35:06
			because those details get
		
00:35:06 --> 00:35:07
			in the way of the kind of moral
		
00:35:07 --> 00:35:10
			advertisement for these policies that they really care
		
00:35:10 --> 00:35:10
			about.
		
00:35:12 --> 00:35:14
			I know we're kinda running on time here.
		
00:35:14 --> 00:35:15
			I had kind of maybe a final question
		
00:35:15 --> 00:35:18
			if you both could Sure. Add your thoughts,
		
00:35:18 --> 00:35:19
			which was,
		
00:35:20 --> 00:35:22
			you know, the more you doom scroll through
		
00:35:22 --> 00:35:24
			social media. Right? You're checking Facebook, Twitter, all
		
00:35:24 --> 00:35:26
			these things, and you keep seeing it get
		
00:35:26 --> 00:35:28
			progressively worse. And I feel especially with, like,
		
00:35:28 --> 00:35:31
			the election, it's like we've seen magnitudes
		
00:35:31 --> 00:35:34
			worse, like, day by day, you know, more
		
00:35:34 --> 00:35:35
			than in the past couple of months. It's
		
00:35:35 --> 00:35:37
			almost like every day it's a new catastrophe
		
00:35:37 --> 00:35:38
			or crisis.
		
00:35:40 --> 00:35:41
			What you know, what's
		
00:35:42 --> 00:35:45
			the way to instill some sense of optimism
		
00:35:45 --> 00:35:45
			or, like,
		
00:35:46 --> 00:35:47
			things that
		
00:35:47 --> 00:35:48
			I can do
		
00:35:49 --> 00:35:53
			to either disincentivize this behavior or, like,
		
00:35:53 --> 00:35:55
			save my own sanity or, you know, just
		
00:35:56 --> 00:35:57
			some way of saying, like, there is a
		
00:35:57 --> 00:35:59
			way out. There is a way for this
		
00:35:59 --> 00:36:00
			to get better and that the
		
00:36:01 --> 00:36:03
			algorithms aren't going to just keep adjusting and
		
00:36:03 --> 00:36:05
			incentivizing this behavior more and more and more.
		
00:36:06 --> 00:36:06
			Yeah.
		
00:36:07 --> 00:36:09
			Here's one thing that I like
		
00:36:09 --> 00:36:10
			to say.
		
00:36:10 --> 00:36:13
			So think of, like, early dinner parties.
		
00:36:14 --> 00:36:16
			Right? When people were first, you know,
		
00:36:17 --> 00:36:18
			in, like, civil society,
		
00:36:19 --> 00:36:20
			eating,
		
00:36:21 --> 00:36:23
			you know, with the upper crust, when
		
00:36:24 --> 00:36:25
			more people were
		
00:36:26 --> 00:36:29
			doing fine dining, things like this. If you
		
00:36:29 --> 00:36:31
			went to dinner parties like this or
		
00:36:31 --> 00:36:32
			to
		
00:36:33 --> 00:36:35
			restaurants where people are, you know,
		
00:36:35 --> 00:36:37
			first being brought into the practice,
		
00:36:38 --> 00:36:40
			you probably would have seen some wild stuff.
		
00:36:40 --> 00:36:40
			Right?
		
00:36:41 --> 00:36:41
			People,
		
00:36:42 --> 00:36:43
			you know,
		
00:36:43 --> 00:36:44
			taking
		
00:36:44 --> 00:36:46
			bones and meat off of the serving dish
		
00:36:46 --> 00:36:48
			and just throwing them back, you
		
00:36:48 --> 00:36:50
			know, blowing their nose in the tablecloth,
		
00:36:50 --> 00:36:51
			and,
		
00:36:51 --> 00:36:53
			and then, you know, if you read,
		
00:36:54 --> 00:36:57
			early etiquette guides about dining from the
		
00:36:57 --> 00:36:59
			Middle Ages, you'll see that the advice is
		
00:36:59 --> 00:37:02
			actually about just this sort of thing.
		
00:37:02 --> 00:37:04
			People had to be told not to do
		
00:37:04 --> 00:37:05
			this stuff.
		
00:37:06 --> 00:37:08
			So stuff that you and I,
		
00:37:08 --> 00:37:11
			if we were ever told explicitly not
		
00:37:11 --> 00:37:12
			to blow our nose in the tablecloth,
		
00:37:12 --> 00:37:15
			would have learned as very young children.
		
00:37:15 --> 00:37:17
			You know, people had to learn, and it
		
00:37:17 --> 00:37:19
			probably took a while for people
		
00:37:19 --> 00:37:21
			to figure this out and have the norm
		
00:37:21 --> 00:37:22
			catch on.
		
00:37:23 --> 00:37:24
			So one thing that you might think,
		
00:37:25 --> 00:37:26
			sounding a note of optimism
		
00:37:27 --> 00:37:29
			is that we're just very early
		
00:37:29 --> 00:37:30
			in
		
00:37:31 --> 00:37:33
			having the whole world interconnected,
		
00:37:33 --> 00:37:35
			in the way that we are, and
		
00:37:35 --> 00:37:36
			we just need a while to have good
		
00:37:36 --> 00:37:37
			norms develop.
		
00:37:37 --> 00:37:40
			So eventually it will be seen as, like,
		
00:37:40 --> 00:37:41
			really embarrassing,
		
00:37:41 --> 00:37:43
			as, you know, Brandon said earlier,
		
00:37:44 --> 00:37:45
			for people to go on social media and
		
00:37:45 --> 00:37:47
			grandstand, just as it'd be really embarrassing to
		
00:37:47 --> 00:37:49
			see someone at a nice restaurant blowing their
		
00:37:49 --> 00:37:50
			nose in a tablecloth.
		
00:37:51 --> 00:37:53
			So, you know, give it some time.
		
00:37:54 --> 00:37:57
			It's a sort of thing where
		
00:37:57 --> 00:37:59
			it's hard to see what you can do
		
00:37:59 --> 00:38:00
			day to day,
		
00:38:00 --> 00:38:02
			to make this happen, but
		
00:38:03 --> 00:38:04
			human beings are pretty smart,
		
00:38:05 --> 00:38:06
			on the whole, and we've solved
		
00:38:07 --> 00:38:09
			bigger problems than this in the past.
		
00:38:10 --> 00:38:11
			So
		
00:38:11 --> 00:38:13
			I think there's reason to be hopeful.
		
00:38:14 --> 00:38:16
			So, yeah, one thing to add is,
		
00:38:17 --> 00:38:18
			you know, I don't know how old
		
00:38:18 --> 00:38:21
			you are, Omar, but I remember, so
		
00:38:21 --> 00:38:22
			I got Facebook in, like,
		
00:38:23 --> 00:38:23
			2005
		
00:38:24 --> 00:38:25
			or so, 2006.
		
00:38:25 --> 00:38:27
			I'm old enough to have signed up
		
00:38:27 --> 00:38:28
			for Facebook when you had to have a
		
00:38:28 --> 00:38:31
			college email address. Okay. Good. Me too. Alright.
		
00:38:31 --> 00:38:33
			Me too. We may be about
		
00:38:33 --> 00:38:35
			the same age then. Alright. So I
		
00:38:35 --> 00:38:36
			don't know about you. I don't know what
		
00:38:36 --> 00:38:38
			circles you ran in 15 years ago,
		
00:38:38 --> 00:38:40
			but I don't remember, you know, when I
		
00:38:40 --> 00:38:41
			was on Facebook in 2005,
		
00:38:42 --> 00:38:44
			2006, you know, I don't remember anyone talking
		
00:38:44 --> 00:38:47
			about politics. It was like super boring stuff.
		
00:38:47 --> 00:38:48
			Like, I'm going to a party tonight or
		
00:38:48 --> 00:38:51
			I made beans and rice, or I'm watching
		
00:38:51 --> 00:38:54
			Ocean's Twelve or something. It's just, it
		
00:38:54 --> 00:38:55
			was like
		
00:38:56 --> 00:38:57
			people
		
00:38:58 --> 00:38:59
			had to figure out
		
00:39:00 --> 00:39:03
			that things like Facebook could be used
		
00:39:03 --> 00:39:05
			in the way that we now use them.
		
00:39:05 --> 00:39:07
			Basically, to talk about politics and gain status.
		
00:39:07 --> 00:39:08
			That took a while.
		
00:39:09 --> 00:39:11
			And, you know, it was
		
00:39:11 --> 00:39:11
			a new technology.
		
00:39:12 --> 00:39:13
			And sometimes,
		
00:39:13 --> 00:39:15
			you know, it takes humans a while to
		
00:39:15 --> 00:39:16
			figure out how to use a technology
		
00:39:16 --> 00:39:18
			for a new purpose. I mean, it took
		
00:39:18 --> 00:39:19
			us 50 years to figure out how to
		
00:39:19 --> 00:39:20
			put wheels on suitcases.
		
00:39:21 --> 00:39:22
			I mean, so it took us a
		
00:39:22 --> 00:39:24
			while to figure out how to use
		
00:39:25 --> 00:39:26
			this technology
		
00:39:26 --> 00:39:28
			to grandstand. I mean, grandstanding has always been
		
00:39:28 --> 00:39:29
			around, but
		
00:39:30 --> 00:39:31
			I think it took us
		
00:39:31 --> 00:39:32
			a while to catch up and use Facebook
		
00:39:32 --> 00:39:34
			and Twitter for these purposes.
		
00:39:34 --> 00:39:36
			The flip side of that is basically
		
00:39:36 --> 00:39:38
			what Justin points out, that it
		
00:39:38 --> 00:39:40
			takes a while to figure out the new
		
00:39:40 --> 00:39:41
			norms
		
00:39:41 --> 00:39:43
			and to coalesce on a new set of
		
00:39:43 --> 00:39:43
			rules
		
00:39:44 --> 00:39:47
			for social media. Now, you know, the optimistic
		
00:39:47 --> 00:39:49
			take is that we will develop
		
00:39:49 --> 00:39:50
			norms. And,
		
00:39:51 --> 00:39:52
			so social media could be cleaned up
		
00:39:52 --> 00:39:55
			a little bit. Whether that's likely or not,
		
00:39:55 --> 00:39:57
			I don't know. Here are 2 other things
		
00:39:57 --> 00:39:59
			that we often talk about, reasons to be
		
00:39:59 --> 00:39:59
			optimistic.
		
00:40:00 --> 00:40:00
			You
		
00:40:02 --> 00:40:03
			know, one of them is,
		
00:40:04 --> 00:40:07
			to build alternate institutions. And I
		
00:40:07 --> 00:40:10
			think this is really important for people,
		
00:40:11 --> 00:40:12
			to get off,
		
00:40:13 --> 00:40:15
			to spend less time on Facebook and Twitter
		
00:40:15 --> 00:40:19
			and social media and build alternate institutions where
		
00:40:19 --> 00:40:20
			the incentives to grandstand
		
00:40:23 --> 00:40:25
			are lesser. So, you know, maybe you get
		
00:40:25 --> 00:40:27
			together and you do a reading
		
00:40:27 --> 00:40:29
			group. Right? Maybe you have an online group.
		
00:40:30 --> 00:40:32
			So on Facebook, I'm a part of various
		
00:40:32 --> 00:40:34
			sort of, like, subgroups
		
00:40:35 --> 00:40:37
			where the norms are much different than the
		
00:40:37 --> 00:40:40
			Wild West norms on, like, Twitter and Facebook
		
00:40:40 --> 00:40:42
			because it's a smaller group of people
		
00:40:42 --> 00:40:44
			and we enforce the norms. And if someone's
		
00:40:44 --> 00:40:46
			not gonna behave, you kick them out. Right?
		
00:40:46 --> 00:40:48
			And so I think building alternative institutions where
		
00:40:48 --> 00:40:50
			people, even online, can get together
		
00:40:52 --> 00:40:53
			and have these discussions
		
00:40:54 --> 00:40:56
			even, you know, having strong disagreement, but enforce
		
00:40:56 --> 00:40:58
			certain kinds of norms
		
00:40:58 --> 00:41:01
			where these kinds of important conversations can
		
00:41:01 --> 00:41:03
			happen. I think that's one
		
00:41:03 --> 00:41:06
			reason for optimism. Another one, and this is
		
00:41:06 --> 00:41:07
			a more extreme one, is
		
00:41:09 --> 00:41:11
			you know, I think we would probably all
		
00:41:11 --> 00:41:13
			be better off by
		
00:41:15 --> 00:41:18
			decreasing the amount of our lives that are
		
00:41:18 --> 00:41:19
			overtaken by politics.
		
00:41:20 --> 00:41:21
			And
		
00:41:21 --> 00:41:23
			I think a lot of us, you
		
00:41:23 --> 00:41:25
			know, I think myself included, I think Justin
		
00:41:25 --> 00:41:28
			would say too. I mean, too much
		
00:41:28 --> 00:41:30
			of our lives is taken over by thinking
		
00:41:30 --> 00:41:33
			about, stressing about, worrying about, reading about,
		
00:41:33 --> 00:41:35
			tweeting about politics.
		
00:41:36 --> 00:41:38
			And, you know, my view is, as
		
00:41:38 --> 00:41:40
			important as politics is, and I think it is
		
00:41:40 --> 00:41:42
			important, politics makes us dumb and mean.
		
00:41:43 --> 00:41:43
			And,
		
00:41:44 --> 00:41:47
			I think there are many better ways
		
00:41:47 --> 00:41:50
			to spend our lives doing good for others
		
00:41:50 --> 00:41:50
			even
		
00:41:51 --> 00:41:53
			than investing in politics. I mean, teaching your
		
00:41:53 --> 00:41:55
			kids how to ride a bike, learning how
		
00:41:55 --> 00:41:57
			to play the piano, you know, visiting a
		
00:41:57 --> 00:41:59
			local nursing home or, you know, whatever
		
00:41:59 --> 00:42:01
			those things are, those things are
		
00:42:01 --> 00:42:04
			gonna have much better returns in terms of
		
00:42:05 --> 00:42:07
			helping others, making a difference in the world
		
00:42:07 --> 00:42:07
			than,
		
00:42:08 --> 00:42:10
			spending 6 hours a day on Twitter reading
		
00:42:10 --> 00:42:12
			about, you know, the election. I
		
00:42:12 --> 00:42:13
			mean, it's
		
00:42:14 --> 00:42:16
			it's almost ridiculous to think, like, oh,
		
00:42:16 --> 00:42:18
			yeah. I'm actually making a difference by reading
		
00:42:18 --> 00:42:19
			all the stuff on Twitter. You know, it's
		
00:42:19 --> 00:42:21
			like, that's not gonna make you a better
		
00:42:21 --> 00:42:23
			person. It's not gonna make you a better
		
00:42:23 --> 00:42:23
			citizen.
		
00:42:24 --> 00:42:26
			And so I do think carving out larger
		
00:42:26 --> 00:42:29
			areas of our lives that are insulated from
		
00:42:29 --> 00:42:29
			politics,
		
00:42:30 --> 00:42:32
			is better for our mental health, and it
		
00:42:32 --> 00:42:35
			also, you know, encourages civic friendship, where we
		
00:42:35 --> 00:42:36
			can be friends with people,
		
00:42:37 --> 00:42:40
			even if I disagree with someone strongly about
		
00:42:40 --> 00:42:42
			politics, because politics is not allowed to enter
		
00:42:42 --> 00:42:45
			into this relationship. This is a no-politics
		
00:42:45 --> 00:42:47
			zone. And I think that vision of
		
00:42:47 --> 00:42:48
			civic friendship
		
00:42:51 --> 00:42:52
			is really
		
00:42:53 --> 00:42:55
			something to aspire to. And I think, you
		
00:42:55 --> 00:42:56
			know, grounds for optimism.
		
00:42:57 --> 00:42:59
			Awesome. I think that's a good note to
		
00:42:59 --> 00:43:00
			close on.
		
00:43:01 --> 00:43:02
			Where can our listeners
		
00:43:03 --> 00:43:04
			find you guys or follow you? I mean,
		
00:43:04 --> 00:43:06
			we just talked about getting off social media,
		
00:43:06 --> 00:43:08
			but do you have social media that
		
00:43:08 --> 00:43:09
			you'd want them to follow on or
		
00:43:10 --> 00:43:12
			anything like that? And I know the
		
00:43:12 --> 00:43:14
			book is obviously available on Amazon, and I'm
		
00:43:14 --> 00:43:17
			assuming other booksellers, but any other places
		
00:43:17 --> 00:43:19
			that you'd wanna let them know to follow
		
00:43:19 --> 00:43:20
			you guys or stay in touch?
		
00:43:21 --> 00:43:24
			Yeah. We're both on Twitter, at brandonwarmke
		
00:43:24 --> 00:43:25
			and at justintosi.
		
00:43:26 --> 00:43:27
			You know, we wrote this book, and
		
00:43:27 --> 00:43:29
			now we have to behave on social media.
		
00:43:29 --> 00:43:30
			So,
		
00:43:30 --> 00:43:33
			I think we're both very boring
		
00:43:33 --> 00:43:35
			follows on Twitter, but you're more
		
00:43:35 --> 00:43:36
			than welcome to.
		
00:43:38 --> 00:43:40
			And, yeah. And thank you. Yeah. The book's
		
00:43:40 --> 00:43:42
			on Amazon. It's also on Kindle.
		
00:43:42 --> 00:43:44
			There's an Audible version too.
		
00:43:44 --> 00:43:45
			So
		
00:43:45 --> 00:43:47
			lots of ways to
		
00:43:47 --> 00:43:48
			find it.
		
00:43:48 --> 00:43:50
			Okay. Cool. Thank you.
		
00:43:51 --> 00:43:53
			Thank you. Thanks so much, Omar.