The Little Black Book of Decision Making

Making Complex Decisions with Confidence in a Fast-Moving World

Michael Nicholas


About This Book

The secret to making the right call in an increasingly complex world

The decisions we make every day – frequently automatic and incredibly fast – impact every area of our lives. The Little Black Book of Decision Making delves into the cognition behind decision making, guiding you through the different ways your mind approaches various scenarios. You'll come to see decision making as a balance between your rational side and your intuition – the trick is in honing your intuition to steer you down the right path.

Pure reasoning cannot provide all of the answers, and relying solely on intuition could prove catastrophic in business. There must be a balance between the two, and the proportions may change with each situation. This book helps you quickly pinpoint the right mix of logic and 'gut feeling', and use it to find the best possible solution.

  • Balance logic and intuition in your decision making approach
  • Avoid traps set by the mind's inherent bias
  • Understand the cognitive process of decision making
  • Sharpen your professional judgement in any situation

Decision making is the primary difference between organisations that lead and those that struggle. The Little Black Book of Decision Making helps you uncover errors in thinking before they become errors in judgement.


Information

Publisher: Capstone
Year: 2017
ISBN: 9780857087058

Part One
No Place for Old Dogs: New Tricks Required

1
Let's Get Real: We All Make Mistakes

At 11.38 a.m. on 28 January 1986, the NASA space shuttle Challenger took off from Kennedy Space Center at Cape Canaveral, Florida. Seventy-three seconds later it broke apart, and the liquid hydrogen and oxygen streaming from its ruptured fuel tanks caught fire explosively, enveloping the rapidly disintegrating spacecraft. The deaths of its seven crew members – including Christa McAuliffe, who would have been the first teacher in space – in such a catastrophic and shockingly visible way may well be why this disaster, despite having no real impact on the lives of the vast majority of those observing it, became the third-fastest-spreading news story ever.
Following the accident, U.S. President Reagan rapidly set up a special commission (known as the Rogers Commission, after its chairman) to investigate it. The consensus of its members was that the disintegration of the vehicle began with the failure of a seal between two segments of the right solid rocket booster (SRB). Specifically, two rubber O-rings designed to prevent hot gases from leaking through the joint during the rocket motor's propellant burn failed because of the cold temperatures on the morning of the launch. One of the commission's members, theoretical physicist Richard Feynman, even demonstrated the mechanism during a televised hearing: by immersing a sample of the O-ring material in a glass of iced water, he showed how it lost resilience and became subject to failure at the temperatures experienced on the day. There is no evidence that any other component of the space shuttle contributed to the failure.
I've found, from years of asking participants in my decision-making workshops, that most people's memory of that day aligns with the summary in the paragraphs above. Though relatively few can name the precise component involved, they consistently remember only the seal failure. This root cause appears unambiguous. So why did the Rogers Commission conclude, as it did, that the key factors contributing to the accident were NASA's organisational culture and decision-making processes, not the technical fault? We need to take a deeper look.

First Appearances Are Often Deceptive

Full details of the events leading up to the Challenger disaster are a matter of public record,1 so I won't recount them in detail here. Bear in mind as you read the string of glaring errors below that this was the same organisation that achieved the incredible feat of landing men on the moon and returning them home safely, and that refused to succumb to the enormous challenges of bringing the stricken Apollo 13 crew back alive when that mission suffered an oxygen tank explosion more than two hundred thousand miles from Earth.
Let's return to that ill-fated Tuesday morning in January 1986. Several key facts shed light on the finding of the Rogers Commission that decision-making errors were at the heart of the catastrophe:
  • The O-rings had not been designed for the unusually cold conditions on the morning of the launch, when the temperature was approximately -2°C. They had never been tested below 10°C, and there was no test data to indicate that they would be safe at such temperatures, which were around 14°C lower than those of the coldest previous launch.
  • NASA managers had known since 1977 – almost a decade – that the design of the joints between the shuttle's SRB segments contained a potentially catastrophic flaw. Engineers at the Marshall Space Flight Center had written on several occasions warning that the design was unacceptable, but the letters were never forwarded to Morton Thiokol, the contractor responsible for construction and maintenance of the SRBs.
  • Engineers raised specific warnings about the dangers posed by the low temperatures right up to the morning of the launch, recommending a postponement, but their concerns did not reach senior decision makers. The night before the launch, Bob Ebeling, one of four Morton Thiokol engineers who had tried to stop the launch, told his wife that Challenger would blow up.2
  • In 1985, the problem with the joints was finally acknowledged to be so potentially catastrophic that work began on a redesign, yet even then there was no call for a suspension of shuttle flights. Launch constraints were issued and waived for six consecutive flights, and Morton Thiokol persuaded NASA to declare the O-ring problem “closed”.
  • While the O-rings naturally attracted much attention, many other critical components on the spacecraft had also never been tested at the low temperatures that existed on the morning of the flight. Quite simply, the space shuttle was not certified to operate in temperatures that low.
  • One of the most important reasons NASA staff opposed a further delay may have been that the launch had already been postponed six times. Two of its managers have been quoted as saying, “I am appalled. I am appalled by your recommendation”, and “My God, Thiokol, when do you want me to launch?”3
With this broader awareness, it is easy to recognise that the obvious technical “cause” of the accident – the O-ring failure – was really just an outcome of the complex structural problems arising from the relationships between the parties involved. Seen in that light, I expect the Commission's conclusion will seem completely unsurprising:
Failures in communication … resulted in a decision to launch 51-L based on incomplete and sometimes misleading information, a conflict between engineering data and management judgments, and a NASA management structure that permitted the internal flight safety problems to bypass key Shuttle managers.4
A report by the U.S. House Committee on Science and Technology went further. It agreed with the Rogers Commission on the technical causes of the accident, but was more specific about the contributing causes:
The Committee feels that the underlying problem which led to the Challenger accident was not poor communication or inadequate procedures as implied by the Rogers Commission conclusion. Rather, the fundamental problem was poor technical decision-making over a period of several years by top NASA and contractor personnel, who failed to act decisively to solve the increasingly serious anomalies in the Solid Rocket Booster joints.5

The Problem with Hindsight

In examining the events leading up to the Challenger accident, it would be completely understandable to scratch your head and wonder how so many obviously intelligent people (we are talking about rocket science, after all) could have displayed such apparent ineptitude. How did NASA, an organisation that places such importance on safety, end up flagrantly violating its own rules and appearing to have so little regard for human life?
“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”
—Daniel Kahneman, Nobel Prize-winning Professor of Psychology and international best-selling author on judgment and decision making
When a decision has gone badly, the benefit of hindsight often makes the correct decision look as though it should have been blindingly obvious. But once you are aware of this bias, you'll see it everywhere – from the immediate aftermath of the horrendous terrorist atrocities in Paris in November 2015, where the press began questioning how intelligence services had failed to anticipate the attacks as soon as the “facts” leading up to them began to emerge, to football supporters who believe they have far greater expertise at picking the team than the manager, to the times when we second-guess our own decisions: “I should have known not to take that job”, “I knew the housing market would collapse/go up”, “I should have known that he was being unfaithful to me”, “I knew that if I trusted her she'd hurt me”, “I should have listened to my intuition”, and on it goes …
This “hindsight bias” is the tendency for an uncertain outcome to seem more likely once we know that it has occurred. Because of it, we are prone to view what has already happened as relatively inevitable and obvious, not realising how knowledge of the outcome has affected us.
One of the first psychologists to investigate hindsight bias was Baruch Fischhoff who, together with Ruth Beyth, used President Richard Nixon's historically important 1972 diplomatic visits to China and the Soviet Union as the focus for a study. Before the visits took place, participants were asked to assign probabilities to 15 possible outcomes, such as whether the U.S. would establish a diplomatic mission in Peking or a joint space programme with the Soviet Union. Between two weeks and six months after the visits, the same people were asked to recall their earlier predictions. The results were clear: the majority of participants inflated their estimates for the outcomes that had occurred while remembering having assigned lower probabilities to those that had not. The bias also became stronger as the time between the initial prediction and the recall task...
