I have become much more aware of how important psychology is and how clever some of psychologists' experiments are. They really cut out "common sense" and all our vapid saloon-bar theorising to show us exactly how daft we are.
But it is much more important than that. Sutherland shows that irrationality is the reason why engineers make planes that crash and nuclear power stations that fail; irrational generals slaughter thousands needlessly, and irrational management techniques cost millions.
The basic error is the 'availability effect'. Our brains cannot hold much evidence at one time when we are trying to make a decision, so we tend to base decisions on the evidence that is most available. Often that is what we have just been told, so, for example, a salesman can manipulate our decision to buy by providing evidence in favour of a purchase while keeping quiet about the evidence against it. One really dark side of the availability error is that we tend to notice the unusual, so we associate unusual people, such as minorities, with unusual behaviours, such as criminality.
Then there is the 'halo' effect. If you're good looking, you must be smart as well, and athletic, and ...
Another example of irrationality is 'Groupthink'. People in a group tend to leave the responsibility for decisions to others. This explains why a crowd of witnesses is less likely to go to the aid of a person being attacked than a single witness is; each individual in the crowd waits for someone else to make the first move. But Groupthink also makes management decision-making very difficult, especially when there is a leader: everyone tries to think the way they think the leader wants them to think. Yes-men are dangerous. So are financial advisers: they follow the herd. Harder still, the leader tends to think he is right more often than he actually is (partly because he only remembers his successes, partly because he ignores the successes that might have happened had he led the organisation along a different route), and people find it very hard to backtrack on a public pronouncement.
Making it even harder to change direction is the 'sunk cost error'. If you buy a case of wine at £6 a bottle and it rises in price to £60 a bottle, would you drink it or sell it? Most people think the wine is still worth only £6 a bottle, so they would be happy to drink it even though they would never buy wine for more than £10 a bottle. The sunk cost error is particularly prevalent with watching bad films or bad plays: "I've paid £20 a ticket for this, so I will sit through 3 hours of boredom rather than quitting at half time", although £20 and 1.5 hours of boredom would be better than £20 and 3 hours of boredom. As Sutherland says: "the past is finished, it cannot be changed, and its only use is that it may be sometimes possible to learn from it" (p 55) and "all that matters is ... future gains and losses" (p 71). After losing 57,000 men on the first day of the Somme, Haig continued to attack... Poker really exercises the sunk cost error.
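A toy sketch of the bad-film comparison (my construction, not from the book, with the numbers taken from the example above): the £20 ticket is spent whichever option you choose, so only the future hours of boredom should affect the decision.

```python
# The ticket price is sunk: it is paid regardless of what happens next.
TICKET_PRICE = 20.0

def outcome(hours_of_boredom: float) -> dict:
    """Total money spent and boredom endured for one option."""
    return {"money": TICKET_PRICE, "boredom_hours": hours_of_boredom}

stay = outcome(3.0)    # sit through the whole film
leave = outcome(1.5)   # walk out at half time

# The money column is identical either way, so the rational choice depends
# only on the remaining boredom: leaving dominates staying.
print(stay)   # {'money': 20.0, 'boredom_hours': 3.0}
print(leave)  # {'money': 20.0, 'boredom_hours': 1.5}
```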
A counterintuitive experiment is the one that found that paying for an altruistic act (such as giving blood) makes people devalue the act and makes them less likely to repeat it. Therefore school prizes are likely to make the prize winners value their achievement less (and make the losers jealous). Sutherland points out that it is good for a teacher to give constructive feedback on a piece of work (as detailed as possible) but that grades are counter-productive.
There is the 'plausibility error'. When someone tells you something that is true you are more likely to believe his next statement. Liars use this trick. Advertisers make statements which will be believed by their target market (the example given is "dogs are just like people") before following up with less verifiable product claims.
People can't do probabilities. When judging situations they all too often think of the things that did happen and not of the benefits they missed from the things that might have happened instead (but didn't); this is another example of the availability error. People are dreadful at estimating risk and they are also over-confident that they are right: everyone from financial advisers to doctors over-estimates their success rate.
Other irrationalities include:
- "the fallacy of like causes like" (p 133)
- Boomerang effect: eg Brexit? Cognitive Dissonance? "Once someone is strongly committed to a belief ... contrary arguments may only serve to increase its strength" (pp 35 - 36)
- Anchoring effects (p 168)
- Failing to spot regression to the mean (p 210)
He is particularly interesting about cognitive dissonance, although he doesn't call it that. He notes that "People strive to maintain consistency in their beliefs, often at the expense of the truth." (p 65) (an example is the halo effect) and so "People try to prove that their current hypothesis is correct" (p 99). Thus "evidence favouring a belief strengthens it while when the same evidence disconfirms a belief, it is ignored: the belief remains intact." (p 107), which means that "anyone who has made a decision is usually extremely reluctant to change it, even in the face of overwhelming evidence that it is wrong" (p 95).
He proposes an adaptation of the PMI technique to make decisions. Write down all the pros and cons on a piece of paper. Beside each one write its 'utility value' (i.e. a score for how strongly pro or con it is) and its probability of occurring. Even if you just estimate these numbers, the result will be better than intuition based on a few pieces of evidence. Then tot up the overall scores, as in the sketch below.
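A minimal sketch of that scoring procedure, assuming (my assumption, not spelled out above) that the overall score is each consideration's utility multiplied by its probability, summed; the example considerations and numbers are invented:

```python
# Pros-and-cons scoring: each consideration gets a utility (positive for
# pros, negative for cons) and an estimated probability, and the decision
# score is the sum of utility * probability.
from dataclasses import dataclass

@dataclass
class Consideration:
    description: str
    utility: float      # how good (+) or bad (-) it would be if it happened
    probability: float  # estimated chance of it happening, 0..1

def decision_score(considerations: list[Consideration]) -> float:
    return sum(c.utility * c.probability for c in considerations)

take_the_job = [
    Consideration("higher salary", utility=+8, probability=0.9),
    Consideration("longer commute", utility=-5, probability=1.0),
    Consideration("more interesting work", utility=+6, probability=0.6),
]

print(f"Score: {decision_score(take_the_job):.1f}")  # 8*0.9 - 5*1.0 + 6*0.6 = 5.8
```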
But he also recommends 'multiple regression analysis'. This uses past records to calculate a weight (the r value) for each separate factor that might be used to judge performance. For future cases, each factor score is multiplied by its weight and the weighted scores are totalled. He claims that this will always give better results than humans working by 'intuition', even though it is often rejected by humans because it doesn't give perfect results (further evidence of irrationality).
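A rough sketch of that kind of actuarial prediction, with invented past records and numpy's ordinary least squares standing in for whatever statistical package Sutherland had in mind:

```python
# Fit weights to past records by least squares, then score new cases by the
# weighted sum of their factor scores. All data here is made up.
import numpy as np

# Past records: each row is one case, columns are the factor scores
# (e.g. interview rating, test score, years of experience).
past_factors = np.array([
    [7.0, 62.0, 3.0],
    [5.0, 81.0, 6.0],
    [8.0, 74.0, 2.0],
    [4.0, 55.0, 8.0],
    [6.0, 90.0, 5.0],
])
past_performance = np.array([6.5, 7.8, 7.1, 5.2, 8.6])  # measured outcomes

# Add an intercept column and fit the weights by ordinary least squares.
X = np.column_stack([np.ones(len(past_factors)), past_factors])
weights, *_ = np.linalg.lstsq(X, past_performance, rcond=None)

# Predict a new case by weighting its factor scores and totalling them.
new_candidate = np.array([1.0, 6.0, 70.0, 4.0])  # leading 1 is the intercept
predicted = new_candidate @ weights
print(f"Predicted performance: {predicted:.2f}")
```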
This was a fascinating book with massive implications for management.
There are also some wonderful snipes at a wide variety of targets:
- A uniform: (p 48) "may give people an inflated sense of their own importance"; "is socially divisive"; and encourages people "to behave in extreme and irrational ways"
- "games played between different countries ... far from promoting amity between them, will only foster animosity." (p 50)
- "'We can't do that: it will set a precedent' ... is wholly irrational. The proposed action against which it is directed is either sensible or not. If it is sensible, taking it will set a good precedent; if it is not sensible, the action should not be taken." (p 55)
- "America has become a nation of masochists who spend hours aimlessly jogging and deprive themselves of all but the nastiest foodstuffs." (p 90)
- "As a diagnostic tool the Rorschach ... is virtually worthless" (p 118)
- "It is impossible to learn anything about someone's personality from the way he draws a person." (p 120)
- "Graphology does little if any better than chance" (p 121)
- "A lot of knowledge is a dangerous thing - it may not increase accuracy, but it does lead to false confidence." (p 176)
- "Financial advisers on average do consistently worse than the market in which they are investing" (p 176)
It is rare to find a book so thought-provoking and at the same time so beautifully written. This was a pleasure to read.
Feb 2009; 238 pages