Omar Usman – 3 Things I Learned from Upstream by Dan Heath

Omar Usman

AI: Summary ©

The speaker shares three lessons from Upstream by Dan Heath: heroism is a sign of systems failure; every system is perfectly designed to get the results that it gets; and incentives can backfire, as the cobra effect shows. Examples include Expedia's $100,000,000 itinerary-call problem, the British bounty on cobras in colonial India, and the unintended consequences of open offices.

00:00:00 --> 00:00:02
			In this video I'm sharing 3 things I
		
00:00:02 --> 00:00:05
			learned from the book Upstream by Dan Heath.
		
00:00:05 --> 00:00:08
			We've all heard that saying: an ounce of
		
00:00:08 --> 00:00:10
prevention is worth a pound of cure.
		
00:00:10 --> 00:00:13
			Why then do we optimize to deliver pounds
		
00:00:13 --> 00:00:16
			of cure? Upstream is a book about solving
		
00:00:16 --> 00:00:17
			problems before they happen.
		
00:00:18 --> 00:00:20
			The author starts off the book with a
		
00:00:20 --> 00:00:22
public health parable. He says that there are 2
		
00:00:22 --> 00:00:24
			guys at a picnic, they're eating, and all
		
00:00:24 --> 00:00:25
			of a sudden there's a kid drowning in
		
00:00:25 --> 00:00:27
			the river. So they jump in, they're saving
		
00:00:27 --> 00:00:30
			the kid. As they're saving the kid, another
		
00:00:30 --> 00:00:31
			one comes
		
00:00:31 --> 00:00:33
			by drowning. They save that one. They keep
		
00:00:33 --> 00:00:35
			going and more and more kids are drowning.
		
00:00:36 --> 00:00:37
			Finally, one of them gets up out of
		
00:00:37 --> 00:00:39
			the river and starts walking off. The other
		
00:00:39 --> 00:00:41
			guy looks at him and says, where are
		
00:00:41 --> 00:00:42
			you going? And he goes,
		
00:00:42 --> 00:00:43
			I'm gonna go upstream
		
00:00:44 --> 00:00:46
			to find the jerk that keeps throwing these
		
00:00:46 --> 00:00:47
			kids in the river.
		
00:00:48 --> 00:00:49
			That is the metaphor
		
00:00:49 --> 00:00:52
			for solving problems before they happen. And in
		
00:00:52 --> 00:00:55
			this book, Dan Heath explores the reasons why
		
00:00:55 --> 00:00:57
we're sometimes blinded to those types of
		
00:00:57 --> 00:00:57
			problems
		
00:00:58 --> 00:00:59
			and some strategies
		
00:00:59 --> 00:01:01
			in order to tackle them. So in this
		
00:01:01 --> 00:01:03
			video, I'm gonna focus on 3 specific things.
		
00:01:03 --> 00:01:04
			The first
		
00:01:04 --> 00:01:05
			is that heroism
		
00:01:06 --> 00:01:09
			is a sign of systems failure. This is
		
00:01:09 --> 00:01:10
			one that set off a huge light bulb
		
00:01:10 --> 00:01:13
			for me. When we think about heroes, particularly
		
00:01:13 --> 00:01:15
			in a work environment or professional environment,
		
00:01:16 --> 00:01:18
			we think about someone that goes above and
		
00:01:18 --> 00:01:20
			beyond the call of duty. We think about
		
00:01:20 --> 00:01:22
			someone who comes in and saves the day.
		
00:01:22 --> 00:01:23
			They work extra hard.
		
00:01:23 --> 00:01:24
			They
		
00:01:25 --> 00:01:27
			solve the problem. They avert disaster and all
		
00:01:27 --> 00:01:29
			of these different things and then they're rewarded
		
00:01:29 --> 00:01:31
			for it. They get kudos. They get congratulations.
		
00:01:32 --> 00:01:33
			They get recognition.
		
00:01:33 --> 00:01:34
			Sometimes,
		
00:01:34 --> 00:01:36
			you know, being a hero gets someone promoted
		
00:01:36 --> 00:01:38
			and helps them to move up. We celebrate
		
00:01:39 --> 00:01:39
			the hero.
		
00:01:40 --> 00:01:43
			The problem is that the fact that we
		
00:01:43 --> 00:01:45
			rely on heroes to fix the problems that
		
00:01:45 --> 00:01:46
			we have
		
00:01:46 --> 00:01:49
			indicates that there is a systems failure at
		
00:01:49 --> 00:01:49
			play.
		
00:01:50 --> 00:01:52
			Why is it that we need a hero
		
00:01:52 --> 00:01:53
			in the first place?
		
00:01:53 --> 00:01:55
			Why is the system so broken that it
		
00:01:55 --> 00:01:59
			keeps generating results that require the intervention
		
00:01:59 --> 00:02:01
			of a hero to come in and fix
		
00:02:01 --> 00:02:03
them? Now that's a very tough question
		
00:02:03 --> 00:02:06
to deal with. And
		
00:02:06 --> 00:02:08
			in the book, he calls this tunneling. He
		
00:02:08 --> 00:02:10
			calls it a factor of problem blindness.
		
00:02:10 --> 00:02:12
			Because when we're in hero mode, when we're
		
00:02:12 --> 00:02:14
			in firefighter mode,
		
00:02:14 --> 00:02:16
			we come in and there's so much going
		
00:02:16 --> 00:02:17
			on that we have to do whatever we
		
00:02:17 --> 00:02:20
			can to keep our heads above water. And
		
00:02:20 --> 00:02:22
			so we come in and we deal with
		
00:02:22 --> 00:02:23
			crisis after crisis.
		
00:02:24 --> 00:02:27
			Major outage after major outage. And we're simply
		
00:02:27 --> 00:02:29
			in the business of trying to solve problems,
		
00:02:29 --> 00:02:31
keep ourselves afloat,
		
00:02:31 --> 00:02:33
			put one foot in front of the other,
		
00:02:33 --> 00:02:35
			you know, eat the elephant one bite at
		
00:02:35 --> 00:02:37
			a time, all of those cliches that we
		
00:02:37 --> 00:02:39
			talk about, and we celebrate the effort of
		
00:02:39 --> 00:02:42
			the person doing it, but we don't stop
		
00:02:42 --> 00:02:43
			and ask
		
00:02:43 --> 00:02:45
			why did it happen in the first place
		
00:02:45 --> 00:02:48
			and who is responsible for looking at it
		
00:02:48 --> 00:02:49
			and helping to avert
		
00:02:50 --> 00:02:52
			the need for the hero to begin with.
		
00:02:52 --> 00:02:55
			And that's tough because that is a position
		
00:02:55 --> 00:02:56
			that's not incentivized.
		
00:02:56 --> 00:02:59
			It doesn't get the same recognition that a
		
00:02:59 --> 00:03:01
			hero does. And so we need to find
		
00:03:01 --> 00:03:03
			ways to pay attention to that role,
		
00:03:03 --> 00:03:06
			incentivize that role, recognize that role, but ultimately
		
00:03:06 --> 00:03:08
			put someone in charge of looking at that
		
00:03:08 --> 00:03:09
			system
		
00:03:09 --> 00:03:10
			and assessing
		
00:03:11 --> 00:03:12
			what it is that we need to do
		
00:03:12 --> 00:03:14
			in the bigger picture. The second thing I
		
00:03:14 --> 00:03:16
			learned is that every system
		
00:03:16 --> 00:03:19
			is perfectly designed to get the results that
		
00:03:19 --> 00:03:20
			it gets.
		
00:03:20 --> 00:03:23
			And this is something that applies whether it's
		
00:03:23 --> 00:03:25
			a societal issue when we talk about things
		
00:03:25 --> 00:03:27
			like systemic racism, systemic poverty,
		
00:03:27 --> 00:03:28
			these really
		
00:03:28 --> 00:03:31
			large complex social issues that are affecting us.
		
00:03:31 --> 00:03:33
			And it's the same when we look at
		
00:03:34 --> 00:03:36
			a company or a business or something like
		
00:03:36 --> 00:03:38
			that. But every system that we have
		
00:03:38 --> 00:03:41
			is designed to get the results that it
		
00:03:41 --> 00:03:43
			gets. One very interesting example that Dan Heath
		
00:03:43 --> 00:03:45
			shared in the book was that of Expedia,
		
00:03:46 --> 00:03:48
			the online booking site. And he said that
		
00:03:48 --> 00:03:50
			they had an issue where, you know, people
		
00:03:50 --> 00:03:52
			were booking reservations and they weren't getting a
		
00:03:52 --> 00:03:54
			notification back, and so they would call into
		
00:03:54 --> 00:03:56
			the company to get a copy of their
		
00:03:56 --> 00:03:56
			itinerary.
		
00:03:57 --> 00:03:59
			Now what happened was that, you know, people
		
00:03:59 --> 00:04:01
book their vacation,
		
00:04:01 --> 00:04:03
			they don't get the itinerary for whatever reason,
		
00:04:03 --> 00:04:05
			and they call in. Well, everyone is siloed
		
00:04:05 --> 00:04:07
			out. Right? The web guys are saying, well,
		
00:04:08 --> 00:04:10
			our web system is working perfectly. The reservations
		
00:04:10 --> 00:04:12
are going through; here are the numbers that we
		
00:04:12 --> 00:04:14
			have. The customer service team is looking at
		
00:04:14 --> 00:04:16
			it saying everything that we're doing is great.
		
00:04:16 --> 00:04:17
			You know, this is our call volume. This
		
00:04:17 --> 00:04:19
			is our average handle time. This is how
		
00:04:19 --> 00:04:21
quickly we're resolving issues. So on and
		
00:04:21 --> 00:04:22
			so on and so on. And so everyone
		
00:04:22 --> 00:04:23
			in their silos
		
00:04:24 --> 00:04:26
is doing their job perfectly. And in fact,
		
00:04:26 --> 00:04:27
			they're hitting their metrics.
		
00:04:28 --> 00:04:30
			They're hitting their targets. They're doing a good
		
00:04:30 --> 00:04:30
			job.
		
00:04:30 --> 00:04:33
But it required someone, in this particular case,
		
00:04:33 --> 00:04:36
someone whose job it wasn't. And they
		
00:04:36 --> 00:04:37
looked at it and said, why are we
		
00:04:37 --> 00:04:40
			getting so many calls from people that want
		
00:04:40 --> 00:04:41
			an itinerary?
		
00:04:41 --> 00:04:44
			And when he tallied up the cost per
		
00:04:44 --> 00:04:45
			call to the company and the number of
		
00:04:45 --> 00:04:47
			people that were calling in, this ended up
		
00:04:47 --> 00:04:48
			being a $100,000,000
		
00:04:49 --> 00:04:52
			problem that no one was paying attention to
		
00:04:52 --> 00:04:54
			and there were no metrics or targets to
		
00:04:54 --> 00:04:56
			indicate that anything was amiss, that anything was
		
00:04:56 --> 00:04:58
			wrong. It took someone to just look at
		
00:04:58 --> 00:05:01
			it and think about it in that way
		
00:05:01 --> 00:05:03
and interpret the data in
		
00:05:03 --> 00:05:05
			that human way, and understand that this is
		
00:05:05 --> 00:05:07
			a problem. And he said, well look,
		
00:05:07 --> 00:05:09
			can we cut down on the number of
		
00:05:09 --> 00:05:11
			people needing an itinerary? Why aren't they getting
		
00:05:11 --> 00:05:12
			it? Are their emails going to spam? Is
		
00:05:12 --> 00:05:14
			there something else? Is there some other reason?
		
00:05:14 --> 00:05:16
And when they were able to remedy that, the
		
00:05:16 --> 00:05:20
number of calls went down. See, to actually
		
00:05:20 --> 00:05:22
			assess and say why are people calling and
		
00:05:22 --> 00:05:24
			how do we decrease that volume, that's an
		
00:05:24 --> 00:05:25
			upstream problem.
		
00:05:25 --> 00:05:27
			The downstream problem is the calls group saying,
		
00:05:27 --> 00:05:28
			well, here's the number of calls that we're
		
00:05:28 --> 00:05:30
			getting. Here's how we're optimizing.
		
00:05:31 --> 00:05:34
			Here's our scripts. Here's our customer satisfaction scores.
		
00:05:34 --> 00:05:36
			We're doing great. We're gonna optimize this further.
		
00:05:36 --> 00:05:38
			If we have an increase in volume, here's
		
00:05:38 --> 00:05:40
			how we're gonna handle it. And they're all
		
00:05:40 --> 00:05:41
			celebrating their success,
		
00:05:42 --> 00:05:44
			almost not realizing that there's actually a bigger
		
00:05:44 --> 00:05:47
problem at play. And so upstream thinking,
		
00:05:47 --> 00:05:49
if we wanna call it strategic thinking,
		
00:05:49 --> 00:05:51
I think that's a really good analogy.
		
00:05:51 --> 00:05:54
It requires that type of strategic thinking
		
00:05:54 --> 00:05:57
			in order to approach the issues, assess the
		
00:05:57 --> 00:05:59
			landscape, and think more deeply about
		
00:05:59 --> 00:06:02
			what's going on and why things are happening,
		
00:06:02 --> 00:06:05
and in many cases to identify a problem
		
00:06:05 --> 00:06:07
that many people don't even know exists,
		
00:06:08 --> 00:06:11
one that is in fact often a bigger problem
		
00:06:11 --> 00:06:13
			than the ones that we actually see and
		
00:06:13 --> 00:06:15
interact with on a day-to-day basis.
		
00:06:15 --> 00:06:16
			The third thing I learned from the book
		
00:06:16 --> 00:06:17
			Upstream
		
00:06:17 --> 00:06:20
is the idea of the cobra effect. Now
		
00:06:20 --> 00:06:21
			I've talked about this a little bit in
		
00:06:21 --> 00:06:24
a previous video about second-order consequences, but
		
00:06:24 --> 00:06:25
			this is taking that to the next level.
		
00:06:25 --> 00:06:27
The term cobra effect was coined during the
		
00:06:27 --> 00:06:29
			British colonization of India.
		
00:06:30 --> 00:06:31
And what happened was that there were, you
		
00:06:31 --> 00:06:33
			know, all these cobras and they wanted to
		
00:06:33 --> 00:06:34
			get rid of them. So essentially,
		
00:06:35 --> 00:06:36
			Britain put a bounty
		
00:06:37 --> 00:06:38
			on cobras. So if you bring in a
		
00:06:38 --> 00:06:40
			dead cobra, we'll give you a certain amount
		
00:06:40 --> 00:06:42
			of money. Well, okay, that starts happening and
		
00:06:42 --> 00:06:44
			then people start to realize, like, well, hey,
		
00:06:44 --> 00:06:45
			this is a good way of getting money.
		
00:06:46 --> 00:06:49
			So they started to breed cobras and then
		
00:06:49 --> 00:06:50
			kill them and go and get the money.
		
00:06:50 --> 00:06:52
			Well, when Britain figured out that that's what
		
00:06:52 --> 00:06:53
			was happening,
		
00:06:53 --> 00:06:55
			they said, well, this is not what we
		
00:06:55 --> 00:06:57
wanted, so they cut the bounty, right? They
		
00:06:57 --> 00:06:59
			stopped paying people for the cobras and now
		
00:06:59 --> 00:07:01
			what happens is you have all of these
		
00:07:01 --> 00:07:03
			cobras that people have bred for the purpose
		
00:07:03 --> 00:07:06
			of getting this bounty, the bounty goes away,
		
00:07:06 --> 00:07:08
			and now you actually have more cobras than
		
00:07:08 --> 00:07:10
you started out with.
		
00:07:11 --> 00:07:13
			And so when we tackle a problem,
		
00:07:13 --> 00:07:15
			we have to look at the behavior that
		
00:07:15 --> 00:07:18
			we end up incentivizing. And not just the
		
00:07:18 --> 00:07:19
			immediate consequence,
		
00:07:19 --> 00:07:21
			but the consequence of the consequence.
		
00:07:21 --> 00:07:23
One example that he shares is the idea
		
00:07:23 --> 00:07:25
			of the open office. Right? This is a
		
00:07:25 --> 00:07:28
			new trend that we see in workplaces where,
		
00:07:28 --> 00:07:30
			well, the goal that we're trying to achieve
		
00:07:31 --> 00:07:32
			is collaboration,
		
00:07:33 --> 00:07:35
			synergy, right? The accidental
		
00:07:35 --> 00:07:38
collisions and informal learning that happen when you
		
00:07:38 --> 00:07:39
			interact with coworkers
		
00:07:40 --> 00:07:43
physically in the same space. Now, with the open
		
00:07:43 --> 00:07:45
office, we get rid of the cubicle
		
00:07:45 --> 00:07:46
			walls, we, you know, tear down all those
		
00:07:46 --> 00:07:49
			barriers, we put everyone together in that open
		
00:07:49 --> 00:07:49
			format.
		
00:07:50 --> 00:07:52
And this would somehow increase that collaboration.
		
00:07:52 --> 00:07:54
			But what they ended up finding in places
		
00:07:54 --> 00:07:56
			that had the open office format
		
00:07:56 --> 00:07:57
was that
		
00:07:58 --> 00:07:59
			now that people were so close together, they
		
00:07:59 --> 00:08:00
			were actually
		
00:08:00 --> 00:08:01
			hesitant
		
00:08:01 --> 00:08:03
to have open conversations because,
		
00:08:04 --> 00:08:05
			one, they didn't want to be loud and
		
00:08:05 --> 00:08:06
			disturb others, but also
		
00:08:07 --> 00:08:09
			there was no privacy. And so they didn't
		
00:08:09 --> 00:08:11
			know who was listening to their conversation, so
		
00:08:11 --> 00:08:13
			they just didn't talk. On the flip side,
		
00:08:13 --> 00:08:15
			people that, you know, had a personality of
		
00:08:15 --> 00:08:16
			saying, like, well, I don't care who's around
		
00:08:16 --> 00:08:18
			me, I'm gonna speak loudly, take my meetings,
		
00:08:18 --> 00:08:19
			all of this,
		
00:08:19 --> 00:08:21
they would go full force and, you know,
		
00:08:21 --> 00:08:23
have their meetings and
		
00:08:23 --> 00:08:26
			talk loudly and all those things and other
		
00:08:26 --> 00:08:28
			people around them would get annoyed. And so
		
00:08:28 --> 00:08:28
			then when
		
00:08:29 --> 00:08:30
			someone else says, you know, I don't wanna
		
00:08:30 --> 00:08:32
			be like that, right, and so they then
		
00:08:32 --> 00:08:33
			hesitate to
		
00:08:34 --> 00:08:35
			speak up properly when they're on a video
		
00:08:35 --> 00:08:37
			call or they're at their desk. And so
		
00:08:37 --> 00:08:38
			now what ends up happening is instead of
		
00:08:38 --> 00:08:40
having people co-located
		
00:08:40 --> 00:08:43
			and working in that open office concept, all
		
00:08:43 --> 00:08:45
			the meeting rooms start to get booked because
		
00:08:45 --> 00:08:47
			everyone's running away to try to get some
		
00:08:47 --> 00:08:49
			sense of privacy, some sense of separation.
		
00:08:50 --> 00:08:51
			And so,
		
00:08:52 --> 00:08:54
			theoretically, yes, there should have been an increase
		
00:08:54 --> 00:08:54
			in collaboration,
		
00:08:55 --> 00:08:57
but when you stop and assess the consequence of the
		
00:08:57 --> 00:08:59
consequence, right, with that second-order thinking,
		
00:09:00 --> 00:09:02
			you start to see that what you incentivize
		
00:09:03 --> 00:09:05
			might not align with the problem that you're
		
00:09:05 --> 00:09:09
			trying to solve. That requires, again, that level
		
00:09:09 --> 00:09:11
of upstream thinking about how we're going to
		
00:09:11 --> 00:09:11
			approach
		
00:09:12 --> 00:09:14
			this problem that we're dealing with. Alright. So
		
00:09:14 --> 00:09:16
			that's 3 things I learned from the book
		
00:09:16 --> 00:09:18
			Upstream by Dan Heath. Got the link to
		
00:09:18 --> 00:09:20
			the book in the show notes below. And,
		
00:09:20 --> 00:09:22
			as always, if you enjoyed this video, if
		
00:09:22 --> 00:09:23
			you found it useful, beneficial,
		
00:09:23 --> 00:09:26
			please hit the Like and Subscribe button and
		
00:09:26 --> 00:09:27
			share it with a friend. And see you
		
00:09:27 --> 00:09:29
			guys in the next video.