
Saturday, 31 August 2013

"Thinking, fast and slow" by Daniel Kahneman

This book is a sort of retrospective of the life's work of Daniel Kahneman, a psychologist who won the Nobel Prize for Economics by replacing the rational agent of classical Economics with a real person.

It seeks to divide human thinking into two systems, and he draws the contrast in three ways. In part one he compares the fast, intuitive ways in which we think with the slow, effortful, rational ways of thinking. Because rational thinking is hard work, we lazy humans tend to default to intuition, which makes us more gullible. In part two he compares the rational agent of Economics (the 'Econ') with Humans. Part three pits the present against the past and shows how what we remember about an experience is rarely the same as what we actually experienced.

Kahneman recounts the hundreds of experiments he has conducted during his long career. Many of them offer profound insights into how humans operate. It is clear that if one wishes to improve communications, improve pedagogy, or manipulate people better, there are lessons to be learnt.

One little quibble: many of these experiments were conducted with someone called Amos. Because I had skipped the introduction (I often read introductions at the end because I believe that if a book is strong enough it should stand without the introduction) I did not know who he was talking about. It was Amos Tversky, a long-time collaborator, now dead. In some ways this book is Kahneman's tribute to Mr Tversky.

One thing I loved was the evidence base. Many of these experiments prove counter-intuitive conclusions, which hammers home the fact that we are not who we think we are. Kahneman's view of humans rests on a massive body of evidence, in contrast to the scarcely visible evidence base of Roger Scruton's The Uses of Pessimism.

A lot of what Kahneman has to say I have encountered before, for example in Nudge, written by Richard Thaler and Cass Sunstein, both former colleagues of Kahneman. For example, I knew about anchoring and that algorithms give better long-range forecasts than expert opinions. Nevertheless, this is a brilliant introduction to these ideas if you are new to the topic, and possibly the most comprehensive text I have encountered (but also try Irrationality by Stuart Sutherland). My only real quibble is that Kahneman does not go into sufficient detail about Bayes' Theorem (for which you need The Signal and the Noise by Nate Silver).

Not only is this a book full of fascinating ideas but it is also extremely readable. August 2013; 418 pages.

Sunday, 8 February 2009

Irrationality by Stuart Sutherland

This book reviews over a hundred psychology experiments which show people behaving in irrational ways. It is well written, amusing and fairly easy to read; it also gave me insights, some of them scary, into how people can foul up.

I have become much more aware of how important psychology is and how clever some of its experiments are. They really cut out "common sense" and all our vapid saloon-bar theorising to show you exactly how daft we are.

But it is much more important than that. Sutherland shows that irrationality is the reason why engineers make planes that crash and nuclear power stations that fail; irrational generals slaughter thousands needlessly, and irrational management techniques cost millions.

The basic error is the 'availability effect'. Our brains cannot hold much evidence at one time when we are trying to make a decision, so we tend to base decisions on the evidence that is most available. Often that is what we have just been told, so, for example, a salesman can manipulate our decision to buy by providing evidence in favour of a purchase while keeping quiet about evidence against it. One really dark side of the availability error is that we tend to notice the unusual, so we associate unusual people, such as minorities, with unusual behaviours, such as criminality.

Then there is the 'halo' effect. If you're good looking, you must be smart as well, and athletic, and ...

Another example of irrationality is 'Groupthink'. People in a group tend to leave the responsibility for decisions to others. This explains why a crowd of witnesses is less likely to go to the aid of a person being attacked than a single witness is: each individual in the crowd waits for someone else to make the first move. Groupthink also makes management decision making very difficult, especially when there is a leader: everyone tries to think the way they think the leader wants them to think. Yes-men are dangerous. So are financial advisers: they follow the herd. The leader's position is harder still because he tends to think he is right more often than he actually is (partly because he only remembers his successes, partly because he ignores all the successful things that might have happened had he led the organisation along a different route) and because people find it very hard to backtrack on a public pronouncement.

Making it even harder to change direction is the 'sunk cost error'. If you buy a case of wine at £6 a bottle and it rises in price to £60 a bottle, would you drink it or sell it? Most people still think of the wine as worth only £6 a bottle, so they are happy to drink it even if they would never buy wine at more than £10 a bottle. The sunk cost error is particularly prevalent with bad films and bad plays: "I've paid £20 a ticket for this so I will sit through 3 hours of boredom rather than quitting at half time", although £20 and 1.5 hours of boredom would be better than £20 and 3 hours of boredom. As Sutherland says: "the past is finished, it cannot be changed, and its only use is that it may be sometimes possible to learn from it" (p 55) and "all that matters is ... future gains and losses" (p 71). After losing 57,000 men on the first day of the Somme, Haig continued to attack... Poker really exercises the sunk cost error.

A counterintuitive experiment is the one that found that paying for an altruistic act (such as giving blood) makes people devalue the act and makes them less likely to repeat it. Therefore school prizes are likely to make the prize winner value their achievement less (and make the losers jealous). Sutherland points out that it is good for a teacher to give constructive feedback on a piece of work (as detailed as possible) but that grades are counter-productive.

There is the 'plausibility error'. When someone tells you something that is true you are more likely to believe his next statement. Liars use this trick. Advertisers make statements which will be believed by their target market (the example given is "dogs are just like people") before following up with less verifiable product claims.

People can't do probabilities. When judging situations they all too often think of the things that did happen and not of the missed benefits of the things that might have happened instead (but didn't); this is an example of the availability error. People are dreadful at estimating risk and they are also over-confident that they are right: everyone from financial advisers to doctors over-estimates their success rate.

Other irrationalities include:

  • "the fallacy of like causes like" (p 133)
  • Boomerang effect: e.g. Brexit? Cognitive dissonance? "Once someone is strongly committed to a belief ... contrary arguments may only serve to increase its strength" (pp 35-36)
  • Anchoring effects (p 168)
  • Failing to spot regression to the mean (p 210)

He is particularly interesting about cognitive dissonance, although he doesn't call it that. He notes that "People strive to maintain consistency in their beliefs, often at the expense of the truth" (p 65) (an example is the halo effect (p 65)) and so "People try to prove that their current hypothesis is correct" (p 99). Thus "evidence favouring a belief strengthens it while when the same evidence disconfirms a belief, it is ignored: the belief remains intact" (p 107), which means that "anyone who has made a decision is usually extremely reluctant to change it, even in the face of overwhelming evidence that it is wrong" (p 95).

He proposes an adaptation of the PMI technique for making decisions. Write down all the pros and cons on a piece of paper. Next to each one, write its 'utility value' (i.e. a score for how strong a pro or a con it is) and its probability of occurring. Then tot up the overall scores. Even rough estimates will be better than intuition based on a few pieces of evidence.
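Out of curiosity, here is his pros-and-cons tally sketched in a few lines of Python. The factors, utility values and probabilities are invented for illustration; only the tot-up-the-weighted-scores procedure is his.

```python
# A minimal sketch of Sutherland's weighted pros-and-cons tally.
# The factors, utilities (pros positive, cons negative, on a -10..+10
# scale) and probabilities below are invented for illustration.

factors = [
    # (description,        utility, probability)
    ("higher salary",       +8,     0.9),
    ("shorter commute",     +4,     1.0),
    ("longer hours",        -5,     0.7),
    ("less job security",   -7,     0.3),
]

# Each factor contributes its utility weighted by how likely it is.
score = sum(utility * probability for _, utility, probability in factors)

print(f"Overall score: {score:+.1f}")  # positive: the pros win
```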

But he also recommends 'multiple regression analysis'. This calculates, from past records, the r score of each separate factor that might be used to judge performance. In future, each factor score is weighted by its r score and the weighted scores are totalled. He claims that this will always give better results than humans working by 'intuition', even though it is often rejected by humans because it doesn't give perfect results (further evidence of irrationality).
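Again a sketch, this time of the actuarial approach: the past records and predictor names below are made up, but the principle of fitting weights to past data and then totting up the weighted factor scores, with no intuitive adjustment, is the one he describes.

```python
# A sketch of actuarial prediction: fit weights to past records,
# then score new cases with those weights. All data is invented.
import numpy as np

# Past records: each row is one person's scores on three predictors
# (say, test score, interview rating, years of experience).
X = np.array([
    [72, 6, 4],
    [65, 8, 2],
    [90, 5, 7],
    [55, 7, 1],
    [80, 9, 5],
], dtype=float)
y = np.array([3.1, 2.8, 4.2, 2.0, 3.9])  # known performance outcomes

# Fit the regression weights (with an intercept) by least squares.
A = np.column_stack([np.ones(len(X)), X])
weights, *_ = np.linalg.lstsq(A, y, rcond=None)

# Score a new candidate by totalling the weighted factor scores.
candidate = np.array([1.0, 78, 7, 3])  # leading 1.0 is the intercept
print(f"Predicted performance: {candidate @ weights:.2f}")
```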

This was a fascinating book with massive implications for management.

There are also some wonderful snipes at a wide variety of targets:

  • A uniform: (p 48) "may give people an inflated sense of their own importance"; "is socially divisive"; and encourages people "to behave in extreme and irrational ways"
  • "games played between different countries ... far from promoting amity between them, will only foster animosity." (p 50)
  • "'We can't do that: it will set a precedent' ... is wholly irrational. The proposed action against which it is directed is either sensible or not. If it is sensible, taking it will set a good precedent; if it is not sensible, the action should not be taken." (p 55)
  • "America has become a nation of masochists who spend hours aimlessly jogging and deprive themselves of all but the nastiest foodstuffs." (p 90)
  • "As a diagnostic tool the Rorschach ... is virtually worthless" (p 118)
  • "It is impossible to learn anything about someone's personality from the way he draws a person." (p 120)
  • "Graphology does little if any better than chance" (p 121)
  • "A lot of knowledge is a dangerous thing - it may not increase accuracy, but it does lead to false confidence." (p 176)
  • "Financial advisers on average do consistently worse than the market in which they are investing" (p 176)


It is rare to find a book so thought-provoking and at the same time so beautifully written. This was a pleasure to read.


Feb 2009; 238 pages