Omar Usman – Upstream Dan Heath 3 Things I Learned
AI Summary: The speaker shares three lessons from the book Upstream by Dan Heath: that heroism is a sign of systems failure; that every system is perfectly designed to get the results it gets, illustrated by Expedia's call-volume problem; and the cobra effect, where a bounty on cobras backfired by incentivizing people to breed them. He closes with the open office as an example of second-order consequences.
In this video I'm sharing 3 things I
learned from the book Upstream by Dan Heath.
We've all heard that saying: an ounce of
prevention is better than a pound of cure.
Why then do we optimize to deliver pounds
of cure? Upstream is a book about solving
problems before they happen.
The author starts off the book with a
public health parable. He says that there's 2
guys at a picnic, they're eating, and all
of a sudden there's a kid drowning in
the river. So they jump in, they're saving
the kid. As they're saving the kid, another
one comes
by drowning. They save that one. They keep
going and more and more kids are drowning.
Finally, one of them gets up out of
the river and starts walking off. The other
guy looks at him and says, where are
you going? And he goes,
I'm gonna go upstream
to find the jerk that keeps throwing these
kids in the river.
That is the metaphor
for solving problems before they happen. And in
this book, Dan Heath explores the reasons why
we're sometimes blinded to those types of
problems
and some strategies
in order to tackle them. So in this
video, I'm gonna focus on 3 specific things.
The first
is that heroism
is a sign of systems failure. This is
one that set off a huge light bulb
for me. When we think about heroes, particularly
in a work environment or professional environment,
we think about someone that goes above and
beyond the call of duty. We think about
someone who comes in and saves the day.
They work extra hard.
They
solve the problem. They avert disaster and all
of these different things and then they're rewarded
for it. They get kudos. They get congratulations.
They get recognition.
Sometimes,
you know, being a hero gets someone promoted
and helps them to move up. We celebrate
the hero.
The problem is that the fact that we
rely on heroes to fix the problems that
we have
indicates that there is a systems failure at
play.
Why is it that we need a hero
in the first place?
Why is the system so broken that it
keeps generating results that require the intervention
of a hero to come in and fix
it? And now that's a very tough question
to deal with. In the book, he calls this
tunneling, a factor of problem blindness.
Because when we're in hero mode, when we're
in firefighter mode,
we come in and there's so much going
on that we have to do whatever we
can to keep our heads above water. And
so we come in and we deal with
crisis after crisis.
Major outage after major outage. And we're simply
in the business of trying to solve problems,
keep our heads afloat,
put one foot in front of the other,
you know, eat the elephant one bite at
a time, all of those cliches that we
talk about, and we celebrate the effort of
the person doing it, but we don't stop
and ask
why did it happen in the first place
and who is responsible for looking at it
and helping to avert
the need for the hero to begin with.
And that's tough because that is a position
that's not incentivized.
It doesn't get the same recognition that a
hero does. And so we need to find
ways to pay attention to that role,
incentivize that role, recognize that role, but ultimately
put someone in charge of looking at that
system
and assessing
what it is that we need to do
in the bigger picture. The second thing I
learned is that every system
is perfectly designed to get the results that
it gets.
And this is something that applies whether it's
a societal issue when we talk about things
like systemic racism, systemic poverty,
these really
large complex social issues that are affecting us.
And it's the same when we look at
a company or a business or something like
that. But every system that we have
is designed to get the results that it
gets. One very interesting example that Dan Heath
shared in the book was that of Expedia,
the online booking site. And he said that
they had an issue where, you know, people
were booking reservations and they weren't getting a
notification back, and so they would call into
the company to get a copy of their
itinerary.
Now what happened was that, you know, people
book their vacation,
they don't get the itinerary for whatever reason,
and they call in. Well, everyone is siloed
out. Right? The web guys are saying, well,
our web system is working perfectly. The reservations
are going through here, the numbers that we
have. The customer service team is looking at
it saying everything that we're doing is great.
You know, this is our call volume. This
is our average handle time. This is how
quickly we're resolving issues. So on and
so on and so on. And so everyone
in their silos
is perfectly doing their job. And in fact,
they're hitting their metrics.
They're hitting their targets. They're doing a good
job.
But in this particular case, it required someone
whose job this wasn't. And they
looked at it and said, why are we
getting so many calls from people that want
an itinerary?
And when he tallied up the cost per
call to the company and the number of
people that were calling in, this ended up
being a $100 million
problem that no one was paying attention to
and there were no metrics or targets to
indicate that anything was amiss, that anything was
wrong. It took someone to just look at
it and think about it in that way
and interpret the data in that way, in
that human way, and understand that this is
a problem. And he said, well look,
can we cut down on the number of
people needing an itinerary? Why aren't they getting
it? Are their emails going to spam? Is
there something else? Is there some other reason?
And when they were able to remedy that, the
number of calls went down. See, to actually
assess and say why are people calling and
how do we decrease that volume, that's an
upstream problem.
The downstream problem is the calls group saying,
well, here's the number of calls that we're
getting. Here's how we're optimizing.
Here's our scripts. Here's our customer satisfaction scores.
We're doing great. We're gonna optimize this further.
If we have an increase in volume, here's
how we're gonna handle it. And they're all
celebrating their success,
almost not realizing that there's actually a bigger
problem at play. And so upstream thinking, or
strategic thinking if we wanna call it that,
I think that's a really good analogy. It
requires that type of thinking
to approach the issues, assess the
landscape, and think more deeply about
what's going on and why things are happening,
and in many cases identifying a problem
that many people don't even know exists,
and in fact is often a bigger problem
than the ones that we actually see and
interact with on a day to day basis.
The third thing I learned from the book
Upstream
is the idea of the cobra effect. Now
I've talked about this a little bit in
a previous video about second order consequences, but
this is taking that to the next level.
The term cobra effect dates back to
British colonial rule in India.
And what happened was that there were, you
know, all these cobras and they wanted to
get rid of them. So essentially,
Britain put a bounty
on cobras. So if you bring in a
dead cobra, we'll give you a certain amount
of money. Well, okay, that starts happening and
then people start to realize, like, well, hey,
this is a good way of getting money.
So they started to breed cobras and then
kill them and go and get the money.
Well, when Britain figured out that that's what
was happening,
they said, well, this is not what we
wanted so they cut the bounty, right? They
stopped paying people for the cobras and now
what happens is you have all of these
cobras that people have bred for the purpose
of getting this bounty, the bounty goes away,
and now you actually have more cobras than
you started out with in the very beginning.
And so when we tackle a problem,
we have to look at the behavior that
we end up incentivizing. And not just the
immediate consequence,
but the consequence of the consequence.
One example that they share is the idea
of the open office. Right? This is a
new trend that we see in workplaces where,
well, the goal that we're trying to achieve
is collaboration,
synergy, right? The accidental
collision and informal learning that happens when you
interact with coworkers
physically in the same space. So with the
open office, we get rid of the cubicle
walls, we, you know, tear down all those
barriers, and we put everyone together in that
open format.
And this will somehow increase that collaboration.
But what they ended up finding in places
that had the open office format
was
now that people were so close together, they
were actually
hesitant
to have open conversation because,
one, they didn't want to be loud and
disturb others, but also
there was no privacy. And so they didn't
know who was listening to their conversation, so
they just didn't talk. On the flip side,
people that, you know, had a personality of
saying, well, I don't care who's around me,
I'm gonna speak loudly and take my meetings,
they would go full force, have their meetings,
talk loudly, and all those things, and other
people around them would get annoyed. And then
someone else says, you know, I don't wanna
be like that, and so they hesitate to
speak up properly when they're on a video
call or at their desk. And so
now what ends up happening is instead of
having people co-located
and working in that open office concept, all
the meeting rooms start to get booked because
everyone's running away to try to get some
sense of privacy, some sense of separation.
And so,
theoretically, yes, there should have been an increase
in collaboration,
but stopping and assessing the consequence of the
consequence, right? That second order thinking,
you start to see that what you incentivize
might not align with the problem that you're
trying to solve. That requires, again, that level
of upstream thinking and how we're going to
approach
this problem that we're dealing with. Alright. So
that's 3 things I learned from the book
Upstream by Dan Heath. Got the link to
the book in the show notes below. And,
as always, if you enjoyed this video, if
you found it useful, beneficial,
please hit the Like and Subscribe button and
share it with a friend. And see you
guys in the next video.