Making “risky” decisions


Everything in life is “risky” – life is uncertain and even a “no-brainer” can produce unexpected results.

So how do you make better decisions, when there can be no certainty about the “best” decision?

There’s a perception that improving decision-making is easy – just “be logical”.  It’s said by those who have read a little about decisions, logic or behavioural economics, but a little knowledge is a dangerous thing!

Among the problems with “logic” are:

  • We don’t actually control our minds much of the time – we do things automatically (look up “Stroop test” videos on the internet), reach decisions before we’re consciously aware of making them (Soon et al., 2008), and unconsciously let irrelevant facts into our decisions (Little et al., 2007).
  • Life includes quantum uncertainty, complexity and chaos, which mean we often can’t tell which events arise from which causes.
  • We didn’t evolve to be logical; formal logic isn’t part of our native “mental toolkit”.
  • It’s “logical” to consider the relevant information – but how do we know in advance what’s relevant?
  • We are over-confident that we can predict the future and judge what is important or relevant (Coelho, 2010), and we suffer from self-serving attribution biases (closely related to the fundamental attribution error). Basically, we assume we succeed because we’re brilliant and fail only because we were unlucky, while other people succeed only because they’re lucky and fail because they’re stupid. If you doubt it, consider whether you think you’re a better-than-average driver (or lover!). About 73% of people think they drive better than average (for lovers, the figure is over 90%!) – not a promising basis for valid prediction, or for the decision logic built on it.

That isn’t to say that we can’t use logic, but it isn’t as easy, or as useful, as people often claim.


So what do you do?

Here are three methods.

1. Make a model of the decision maker.

Meehl (1954) did this, and in 60 years nobody has found it flawed. However, very few organisations use it, and it’s often like pulling teeth to get clients even to consider it – but every time one does, their decisions improve.

You take data on the factors involved in a decision, together with the decisions of experts, and fit a simple equation to them.  It was first done with oncologists.  They had thousands of examples of tumours, with about 10 pieces of information (X-rays, symptoms, scans etc.) on each.  Dozens of experts had to decide: is this tumour benign or malignant?  Their decisions were analysed and compared to the actual findings after exploratory operations.  The first surprise was that the experts thought they used all the information, but actually they used at most three or four factors.  The second was that the model out-performed the experts.  When the experts and the model were tried on a fresh set of tumours, the best expert was right 69% of the time; the model was right 71% of the time!

Nobody quite believes it, but it works in medicine, psychology, law, finance – everywhere it has been tried, the model of the expert out-performs the expert.
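To make the idea concrete, here is a minimal sketch in Python of “modelling the decision maker”. It uses made-up synthetic data – the factor weights, noise levels and the simulated “expert” are illustrative assumptions, not figures from the original studies: fit a simple equation to the expert’s past judgments, then compare both expert and model against the actual outcomes on fresh cases.

```python
# Minimal sketch of "modelling the decision maker" (Meehl-style bootstrapping).
# All data here are synthetic and illustrative, not taken from the original studies.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_cases, n_factors = 2000, 10
X = rng.normal(size=(n_cases, n_factors))      # ~10 pieces of information per case

# "True" outcome (0 = benign, 1 = malignant) depends, noisily, on a few factors.
true_signal = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 0.6 * X[:, 2] + rng.normal(0, 1.0, n_cases)
outcome = (true_signal > 0).astype(int)

# A simulated expert: believes they use everything, actually leans on three factors,
# and applies them inconsistently (the extra noise stands for fatigue, mood, etc.).
expert_signal = 1.0 * X[:, 0] + 0.7 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 1.8, n_cases)
expert_call = (expert_signal > 0).astype(int)

# Fit a simple equation to the expert's own judgments (not to the outcomes).
train, test = slice(0, 1000), slice(1000, None)
model_of_expert = LogisticRegression().fit(X[train], expert_call[train])
model_call = model_of_expert.predict(X[test])

# Compare expert and model-of-expert against the actual outcomes on fresh cases.
print("expert accuracy:         ", (expert_call[test] == outcome[test]).mean())
print("model-of-expert accuracy:", (model_call == outcome[test]).mean())
```

The sketch reproduces the pattern described above for a simple reason: the fitted equation applies the expert’s own policy perfectly consistently, so it strips out the expert’s inconsistency while keeping their judgment.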

2. Develop intuition.

That works where intuition can genuinely be developed – with fire-fighters, soldiers and so on, the subjects of much of Klein’s research.  Expertise develops where the environment is stable (the physics of a fire doesn’t change, military strategies are reasonably constant) and where decisions produce reliable feedback.

But in dealing with changing situations (technological change, social change etc.) it can be tricky because the feedback isn’t reliable.

We think practice makes perfect, but it only makes permanent; only practice with feedback makes perfect.  Practise your tennis serve and you see whether the ball goes in or out, so you can adjust and improve.  In organisations, because of complexity, chaos and so on, you can’t work out what the cause-and-effect links are.  Making decisions in the real world is therefore like practising tennis blindfolded: you can’t be sure whether the ball you hit went in or out, or why – maybe it hit a seagull and bounced back in.  You keep making decisions (serving) and there are consequences (the ball goes somewhere), but you don’t know whether it went where it did because of the way you hit it or because of something else.  So all you do is groove a particular way of making decisions (a service action) – you don’t know whether you’ve got better, only that that is the way you serve.
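The blindfold point can be shown with a toy simulation. The sketch below (Python; the numbers, the seagull-style noise and the simple adjustment rule are all illustrative assumptions) compares a “server” who adjusts after reliable feedback with one whose feedback is unrelated to what they actually did.

```python
# Toy simulation: practice with reliable vs unreliable feedback.
# The update rule and all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
TARGET, STEP = 0.0, 0.2                     # the "ideal" serve, and how much we adjust each time

def practise(n_serves, feedback_reliable):
    aim = 3.0                               # starting habit, well off target
    for _ in range(n_serves):
        if feedback_reliable:
            signal = aim - TARGET           # you see where the ball actually landed
        else:
            signal = rng.normal(0, 3.0)     # seagulls: feedback unrelated to the serve
        aim -= STEP * np.sign(signal)       # adjust in the direction the feedback suggests
    return abs(aim - TARGET)                # how far the grooved serve ends up from ideal

def mean_error(runs, feedback_reliable):
    return np.mean([practise(200, feedback_reliable) for _ in range(runs)])

print("mean final error, reliable feedback:  ", round(mean_error(100, True), 2))
print("mean final error, unreliable feedback:", round(mean_error(100, False), 2))
```

With reliable feedback the error shrinks to almost nothing; with unreliable feedback the same 200 serves of “practice” typically leave the server about as far off as when they started – the habit has become permanent, not perfect.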

3. Use group knowledge.

At first, the idea of everyone pulling together sounds great!  However, leaders in organisations should familiarise themselves with the findings of researchers such as Solomon Asch, Irving Janis and Stanley Milgram.

In groups, people tend to obey, to conform and to gravitate towards agreement with one another’s views.  This is embodied in the classic “a little knowledge is a dangerous thing” belief that all groups form, storm, norm and only then perform.  It frequently leads to a norm that can eliminate common sense – most groups are better with more storming, as they don’t then stagnate.

But if you dare to question the group paradigm of norming, people (such as boards of directors) get upset.  The dissenter becomes a ‘whistle-blower’, a ‘rebel’ or a heretic who has the temerity to suggest the current orthodoxy is blinkered and self-serving.  The rationale is self-preservation, but the cognitive myopia involved has nothing to do with logic; it simply justifies poor decisions made to protect the ‘norm’ – the status quo.

It does not matter how clever the individual members of the team are; they cannot know everything.  In fact, if all the team members are clever in the same way (for example, they all have Harvard MBAs or the same accountancy qualification), there will be a tendency to see the world, and any problem, in a similar way.  The problem will be identified and defined in terms of the group’s norms – the group’s paradigm.  This means that the decision-making ability of the group is far less than the sum of its parts.  Three experts on a particular subject ‘X’, even if they have considerably wider vision and greater knowledge than other mortals, are not going to produce three times as much expertise.  In fact, in situations that need creativity and divergent thinking, the ‘output’ may be one-third!

If we consider the hunter-gatherer environment in which we evolved, and possible events (threats) emerging from the plains, we can model the phenomenon of cognitive narrowing using the analogy of a hunter’s 360-degree field of vision. The team has a huge blind spot.

[Figure: three similar experts looking in much the same direction – their overlapping fields of vision leave a large blind spot.]

You probably can’t reduce that blind spot by throwing more expertise at it. But that’s what we all tend to do: “Get another expert view” – which really means “Confirm what we already think” and reinforce the norm.

Even with larger groupings, decisions are usually made by a small number of very similar people who have “normed”. These ‘micro’ groupings might consist of only three individuals out of a total Board strength of 10 or 12. Everyone else conforms, obeys, fits in with this norm or gets out.

And of course humans are social animals.  We like people like us – in looks, clothes, tastes, opinions and world view.  When we select staff, we often pick people like us, because we feel comfortable with them.  In the same way, we find it much easier to agree with somebody who thinks our assessment of risk is masterly, or that our investment policy is inspired.  If we took the time and effort to examine the decision more objectively (which, being human, we can’t really do), we would probably realise that the person who disagreed had a point and might be giving more useful guidance than the one nodding in agreement.

Imagine that we deliberately go against the norm.  Maybe we don’t choose such extremely clever and expert people.  Perhaps they only have 90- or 100-degree vision, not 110 degrees.  But they have different perceptions and interpretations of the world, analyse differently, and generate different options.  What happens, using the hunter-gatherer analogy, is:

[Figure: three people with narrower but differently oriented fields of vision – together they leave a much smaller blind spot.]

It’s good to have a smaller blind spot – ask the banks (and the Chancellors of the Western economies, the Governor of the Bank of England, the Chair of the Federal Reserve, etc.!)
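The arithmetic behind the blind-spot analogy is easy to check. The sketch below (Python; the field-of-view widths come from the analogy above, while the orientations are assumptions) computes how much of the 360 degrees is covered by three similar experts facing roughly the same way, versus three narrower viewers deliberately pointed apart.

```python
# Sketch of the blind-spot arithmetic; the angles and orientations are illustrative.
import numpy as np

def coverage(centres_deg, width_deg, resolution=3600):
    """Fraction of the 360-degree circle covered by the union of the viewing arcs."""
    angles = np.linspace(0.0, 360.0, resolution, endpoint=False)
    covered = np.zeros(resolution, dtype=bool)
    for centre in centres_deg:
        # Angular distance from each direction to the arc's centre, in [0, 180].
        dist = np.abs((angles - centre + 180.0) % 360.0 - 180.0)
        covered |= dist <= width_deg / 2.0
    return covered.mean()

# Three very similar experts with ~110-degree vision, all facing much the same way.
similar = coverage(centres_deg=[0, 10, 20], width_deg=110)

# Three less expert viewers with ~100-degree vision, deliberately pointed apart.
diverse = coverage(centres_deg=[0, 120, 240], width_deg=100)

print(f"similar team: {similar:.0%} covered, blind spot {1 - similar:.0%}")
print(f"diverse team: {diverse:.0%} covered, blind spot {1 - diverse:.0%}")
```

Even though each member of the diverse team sees less individually, together they cover most of the circle; the similar team, for all its individual brilliance, leaves most of it unwatched.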

Conclusion

This is not an area where a little knowledge pays off – it’s too easy to make expensive mistakes.  What it takes is a knowledge of maths, and a very sound understanding of individual and group psychology.  This makes it possible, for example, to create decision models, elicit the tacit knowledge of domain experts and manage conflicts (keeping them on an intellectual level, not about personalities).

That way it’s possible to have a reliable decision process, take advantage of intuitions, and make the perception and interpretation of facts and the generation of solutions more democratic and effective.  And at that point you can create a prediction market (Wolfers & Zitzewitz, 2004) to pull together the information available to inform decisions (for which a mathematician is also useful, although not as useful as a good bookie!)
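The article leaves open how such a market would actually be run. Purely as an illustration, here is a sketch of one well-known mechanism, Hanson’s logarithmic market scoring rule – the choice of mechanism, the outcomes and the numbers are assumptions for the sketch, not something taken from this article or from Wolfers & Zitzewitz. The market maker’s prices can be read off as the group’s current probability estimates.

```python
# Sketch of a simple prediction-market maker (Hanson's logarithmic market scoring
# rule).  The mechanism choice, outcomes and numbers here are illustrative only.
import numpy as np

B = 100.0                                    # liquidity: higher means prices move more slowly

def cost(q):
    """Market maker's cost function over the outstanding shares q (one entry per outcome)."""
    return B * np.log(np.sum(np.exp(np.asarray(q, dtype=float) / B)))

def prices(q):
    """Current prices; these can be read as the market's probability estimates."""
    e = np.exp(np.asarray(q, dtype=float) / B)
    return e / e.sum()

def buy(q, outcome, shares):
    """Buy `shares` of `outcome`; returns the new holdings and the price charged."""
    new_q = np.asarray(q, dtype=float).copy()
    new_q[outcome] += shares
    return new_q, cost(new_q) - cost(q)

# Two outcomes, e.g. "project delivered on time" vs "late"; the market opens at 50/50.
q = np.zeros(2)
print("opening probabilities:", prices(q).round(2))

# A trader who believes "on time" is more likely buys 60 shares of that outcome.
q, paid = buy(q, outcome=0, shares=60)
print("trader pays:", round(paid, 2), "| new probabilities:", prices(q).round(2))
```

Anyone who thinks the current price is wrong can profit by trading against it, which is how the market pulls together the scattered information the conclusion refers to.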

Kim is a Chartered and registered Occupational Psychologist.  He is also qualified as an Associate of the Chartered Insurance Institute and as a financial adviser.  He provides consultancy and research in the psychology of risk, finance and decision making, as well as executive coaching, selection and development.  He is lead assessor on an MSc in the neuroscience of leadership, the author of Taming the Pound, a simple guide to handling money, and a regular contributor to Sky TV, BBC radio and TV, and to print and online media.

If you’re a CII member, you can see the podcast of Kim’s talk last October to the Insurance Institute of London, which illustrates points in this article and gives more background.

http://www.iilondon.co.uk/Lectures/IILLectures/IILLecturesOctober/tabid/3789/ModuleID/7621/ItemID/4012/mctl/EventDetails/Default.aspx?selecteddate=25/10/2013

References
Coelho, M. P. (2010). Unrealistic Optimism: Still a Neglected Trait. Journal of Business and Psychology, 25, 397–408.
Kahneman, D. & Tversky, A. (1979). Prospect Theory: An Analysis of Decision Under Risk. Econometrica, 47(2), 263–291.
Kahneman, D. & Klein, G. A. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515–526.
Kahneman, D. (2011). Thinking, Fast and Slow. Macmillan.
Klein, G. A. (1999). Sources of Power: How People Make Decisions. Cambridge, MA: MIT Press.
Klein, G. A. (2004). The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work. Currency.
Little, A. C., Burriss, R. P., Jones, B. C. & Roberts, S. C. (2007). Facial appearance affects voting decisions. Evolution and Human Behavior, 28(1), 18–27.
Meehl, P. (1954). Clinical versus Statistical Prediction: A Theoretical Analysis and a Review of the Evidence. University of Minnesota Press.
Soon, C. S., Brass, M., Heinze, H.-J. & Haynes, J. D. (2008). Unconscious determinants of free decisions in the human brain. Nature Neuroscience, 11, 543–545.
Wolfers, J. & Zitzewitz, E. (2004). Prediction Markets. Journal of Economic Perspectives, 18(2), 107–126.
