Saturday, 16 February 2013

The Big Decisions Checklist


Friday 16 November, 2012
There are a multitude of sins that plague humans when making decisions. We suffer from biases, fall in love with our own recommendations and focus on the exception rather than the rule when evaluating the best course of action. We need a robust decision-making process that levels the playing field, to move ourselves from Gut-Feel to Gut-Fact.
In the movie Moneyball, Brad Pitt plays Billy Beane, the Oakland A's cash-strapped general manager. The movie is a pretty good portrayal of the book, which is about the fallibility of 'expert judgement' and how good decisions are part data and part gut-feel. Moneyball told the story of a journey in baseball that many businesses travel themselves; it was an entertaining story, but it didn't examine why the tobacco-chewing experts got it so wrong for so long.
In an article about the book in The New Republic, the authors ask, "Why do professional baseball executives, many of whom have spent their lives in the game, make so many colossal mistakes? They are paid well, and they are specialists. They have every incentive to evaluate talent correctly. So why do they blunder?"
Enter Daniel Kahneman and Amos Tversky, the scientific psychologists whose work on judgement earned Kahneman the Nobel Prize in economics. Tversky has since passed away, but Kahneman has codified a lifetime of work on judgement and decision-making in the book Thinking, Fast and Slow.
Kahneman covers the multitude of sins that plague humans when making decisions. Covering statistics and probability as well as psychology in a simple and effective writing style, he elucidates the reasons why we get it so wrong. Here is a small subset of the human errors we suffer from:
  • Availability heuristic - That is, making decisions on information that is immediately available
  • Prospect theory - Being risk averse when a decision is framed as a gain, and risk seeking when the same decision is framed as a loss; together with our overweighting of small probabilities, this helps explain why people buy both insurance and lottery tickets...
  • Confirmation bias - Looking for evidence that confirms our hypothesis and mental models of the world
  • Anchoring - Just one piece of information given up-front causes people to use that data point, however irrelevant, as a starting point

Can we reduce our biases?

McKinsey1 thinks we can: it studied organisations that worked to eliminate biases from their decision processes and found that they actually performed better than the market in terms of shareholder returns. Reducing bias makes a difference.
To illustrate how biases can be endemic in organisations, Kahneman recounts a story about facilitating a corporate meeting where each individual manager was asked whether they would take a risk with a reasonable probability of loss. Each said no, they would not take the risk.
But the CEO was shocked - having 10 or so executives all declining to take risks means the organisation as a whole was not taking on enough risk. One or more of those projects would likely have had big pay-offs.
The point is the skewed relationship with risk. The majority of the executives' projects will either fail or not deliver significant returns, and that will certainly hurt their individual careers. But the CEO needs a portfolio of risk across his managers, as the rough numbers sketched below illustrate.
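To see why the portfolio view changes the answer, here is a minimal sketch with assumed numbers - the 50/50 odds, $10M upside and $5M downside are illustrative, not figures from Kahneman's anecdote:

```python
# Rough sketch of the CEO's portfolio view. The 50/50 odds, $10M upside and
# $5M downside are illustrative assumptions, not figures from Kahneman's story.
from itertools import product

GAIN, LOSS, P_WIN, N_PROJECTS = 10.0, -5.0, 0.5, 10  # $M, win probability, count

# A single project has a positive expected value, yet each manager declines:
# the possible $5M loss looms larger for their career than the $10M gain.
single_ev = P_WIN * GAIN + (1 - P_WIN) * LOSS
print(f"Expected value per project: ${single_ev:.1f}M")

# Enumerate all 2^10 portfolio outcomes to find the chance that the
# organisation as a whole ends up losing money.
p_portfolio_loss = sum(
    P_WIN ** outcome.count(GAIN) * (1 - P_WIN) ** outcome.count(LOSS)
    for outcome in product([GAIN, LOSS], repeat=N_PROJECTS)
    if sum(outcome) < 0
)

print(f"Portfolio expected value: ${single_ev * N_PROJECTS:.1f}M")
print(f"Chance the whole portfolio loses money: {p_portfolio_loss:.1%}")
```

Under those assumed numbers each project is worth about $2.5M in expectation, and the chance that the ten-project portfolio loses money overall falls to roughly one in six - which is why the CEO wants the bets taken, even though no individual manager, judged on a single project, wants to take one.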

Processes and checklists

What are we to do if all of us as experts are probably wrong? We need good processes.
Atul Gawande's The Checklist Manifesto chronicles the medical profession and the resistance of its experts to accepting the causes of disease. He writes of how the simplest of processes - a surgeon washing his hands - drastically reduced deaths in hospitals. That was more than 150 years ago, and checklists are often still resisted by experts.
Checklists, like business processes, are built up over time with the accumulated knowledge of an industry - I can't just write a checklist for my doctor. But with analysis and insight we can determine the factors that contribute most to improved performance.
Billy Beane found it was 'getting on base' - not home runs - that made the difference in baseball. Sometimes, when a metric is confused with the goal, we get tripped up in decision-making. The executives want to avoid risk, but neither avoiding nor taking risk is the goal; a profitable company - now and in the future - is.
Instead we need to create an environment that focuses on results through the use of good processes. If the CEO's executives fail in their projects but followed good process and due diligence, why punish them? If a CEO makes egregious judgement calls but makes money, do we reward them?
The CEO needed to create an environment that rewards a 'process' of prudent risk-taking - one that neither punishes unlucky failure nor rewards sheer luck ahead of good process and due diligence.

Checklist for your next big decision

  • Check the motivations of the recommender

    Warren Buffett doesn't use investment bankers and consultants to find him deals, because they are usually paid a commission by the seller. Of course they will suggest he buy this great business.
  • Check for the affect heuristic

    Simply put, we fall in love with our recommendations.
  • Check for the availability heuristic

    The simple, immediately available story. Which is more likely - dying in a car accident or in a shark attack? The car, of course, given the numbers, but we all recall the shark attack blown up in the media. Business cases often cite examples and testimonials, which exploit this very bias.
  • Consider alternatives

    Have alternatives really been evaluated? When we fall in love with the hypothesis we don't really evaluate the competing solutions legitimately.
  • In a perfect world, what data would you ask for?

    Too often we believe the data gathered is the only data available. Dig a little deeper, and perhaps initiate a pilot and gather some empirical evidence one way or the other.
  • Evaluate the source of the data

    Is it credible? Vendors will often 'low-ball' or 'start high' to invoke the anchoring effect, whereby we reference all future negotiations against that price.
  • Can you see a halo effect?

    The halo effect is common in interviews, where a candidate who presents and speaks well may not objectively have met the benchmarks.
  • Are the experts too steeped in the past?

    Just like the baseball managers who, having been players themselves, just 'know' what a good player looks like.
  • Check for the 'base rate'

    This refers to the underlying data. Shark attacks are far rarer than car fatalities. In practical business terms we have the planning fallacy, where the base rate for how long similar projects actually took to complete is usually far longer than the 'inside view' estimate built up from the project's components.
  • Are you paranoid enough?

    Worst-case thinking or sensitivity analysis is a common part of decision-making processes. But is the worst case bad enough?
It's been said that science advances one funeral at a time. If scientists - the bastions of objectivity - fall victim to bias, then without support and impartial umpires what hope do the rest of us have?

Implementing quality control in decision-making

  • Create a checklist

    A checklist like the one above should be developed and built into the organisation's business case processes.
  • Ensure a separation of duties

    An objective third party - not the decision-maker or the recommender - should evaluate the proposals. It seems like bureaucracy, and if we are not careful it could become just that, but when millions of dollars are at stake it is an important step.
  • Track and monitor decision quality

    Monitor the forecasts made and revisit assumptions. Don't just reward the good results of business cases; also consider the bad ones. Consider these questions (a rough sketch of the kind of decision log this implies follows below):

    1. Did the bad outcomes have good decision processes? What were the causal factors of the poor outcome?
    2. Did the decisions with good outcomes have poor processes, so that you just got lucky? This is an important question and one that needs more discipline; otherwise these bad behaviours and processes are likely to contribute to significant losses eventually.
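As a rough illustration of the kind of decision log the questions above imply - the field names and the two example entries are hypothetical, invented for this sketch - each tracked decision records both the quality of the process followed and the eventual outcome, so that 'bad luck' and 'dumb luck' can be told apart at review time:

```python
# Hypothetical decision-quality log: separate the quality of the process from
# the quality of the outcome, so reviews reward process rather than luck.
from dataclasses import dataclass

@dataclass
class Decision:
    name: str
    good_process: bool  # checklist followed, alternatives weighed, data vetted?
    good_outcome: bool  # did the business case actually deliver?

def review(d: Decision) -> str:
    if d.good_process and d.good_outcome:
        return "deserved win - reward the team and the process"
    if d.good_process and not d.good_outcome:
        return "bad luck - don't punish; revisit the assumptions that failed"
    if not d.good_process and d.good_outcome:
        return "dumb luck - don't reward; the behaviour will eventually cost you"
    return "deserved loss - fix the process before the next big decision"

# Illustrative entries only.
log = [
    Decision("New warehouse system", good_process=True, good_outcome=False),
    Decision("Acquisition of SupplierCo", good_process=False, good_outcome=True),
]
for d in log:
    print(f"{d.name}: {review(d)}")
```

Judging the process and the outcome separately is what lets an organisation reward prudent risk-taking that happened to fail, and withhold reward from reckless calls that happened to pay off.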
1"The Case for Behavioral Strategy" - McKinsey Quarterly, March 2010


Source: ceo-online.com
