The logic of failure

At the talk he gave last week, Cass Sunstein warmly recommended [amazon_link id=”0201479486″ target=”_blank” ]The Logic of Failure: Recognizing and Avoiding Error in Complex Situations[/amazon_link] by Dietrich Dörner. So warmly that I bought a copy and read it on my train journeys yesterday. It’s a very good account of what goes wrong with decision-making in complex situations – including any economic context – although I wouldn’t be quite as glowing in my praise as Prof Sunstein was. Still, it is definitely one to read, along with [amazon_link id=”0300144709″ target=”_blank” ]Nudge[/amazon_link], [amazon_link id=”0007256531″ target=”_blank” ]Predictably Irrational[/amazon_link], [amazon_link id=”000731731X” target=”_blank” ]The Invisible Gorilla[/amazon_link], [amazon_link id=”1846144744″ target=”_blank” ]Risk Savvy[/amazon_link], [amazon_link id=”0141015918″ target=”_blank” ]Gut Feelings[/amazon_link] and so on, if the issue of decision-making is of interest to you.

[amazon_image id=”0201479486″ link=”true” target=”_blank” size=”medium” ]The Logic of Failure: Recognizing and Avoiding Error in Complex Situations[/amazon_image]

Some of the psychological territory it covers is familiar from the now-ample behavioural economics literature: the difficulty of making calculations, the salience of recent events or of things we just happen to have noticed, the problem of limited attention. Less familiar was the diagnosis of how hard many people find it to take account not only of interactions between variables but also of dynamics – it seems almost impossible for many people not to extrapolate in straight lines, and not to be too impatient to wait for feedback.

The book uses the results of lab experiments to illustrate the point over and over, including very simple challenges such as a time delay between setting a regulator dial and reaching the target temperature. The relationship between dial and degrees C is simple and linear in this example, but only one participant is patient enough to wait for the response to her first moves of the dial before finding the right setting. This inability to wait is obviously a near-universal characteristic. Certainly, my husband has this issue with every shower he gets into, despite my calmly explaining it to him many times, and ends up with the totally predictable oscillating temperatures as he over-reacts to short-term feedback. (Of course, he does have the patience to be married to an economist.)
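
This is not Dörner’s set-up exactly, but a minimal sketch of that kind of delayed-feedback task shows why impatience produces the oscillation: assume the temperature simply equals the dial setting a few steps earlier, and compare a participant who waits for each move to feed through with one who reacts to the current reading at every step. The DELAY, TARGET and update rule below are illustrative choices of mine, not the experiment’s actual parameters.

```python
# Toy model: the temperature equals the dial setting, but only after a fixed delay.
# A "patient" participant waits for each move to feed through before adjusting again;
# an "impatient" one turns the dial by the full current error at every step.

DELAY = 3        # steps between moving the dial and seeing its effect
TARGET = 60.0    # desired temperature in degrees C
STEPS = 15

def simulate(patient: bool) -> list[float]:
    dial = 0.0
    dial_settings = []   # dial value chosen at each step
    temps = []
    for t in range(STEPS):
        # Temperature reflects the dial setting chosen DELAY steps ago.
        temp = dial_settings[t - DELAY] if t >= DELAY else 0.0
        temps.append(temp)
        error = TARGET - temp
        if not patient or t % DELAY == 0:
            dial += error          # turn the dial towards the target
        dial_settings.append(dial)
    return temps

print("impatient:", [round(x) for x in simulate(False)])
print("patient:  ", [round(x) for x in simulate(True)])
```

The impatient run overshoots and swings ever more widely – exactly the shower pattern – while the patient one settles at the target as soon as the first adjustment has fed through.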

The book argues that people can learn to be better decision-makers, but it concludes with a very long list of the traits we need to acquire to achieve good outcomes in non-linear, dynamic and complex contexts with limited information – i.e. the world. I finished reading it feeling more pessimistic. There are many examples of experimental participants who concluded that it had been efficient to inflict a famine on a simulated country, or that a bad outcome was the result of a conspiracy (by the computer!) against them. As the world becomes ever more replete with instant feedback, what are the chances of getting a more patient and psychologically sophisticated politics?

4 thoughts on “The logic of failure”

  1. I find it amusing that a BBC Technology correspondent is having problems with these new-fangled shower thingies…

    Maybe we should just outsource decision-making to computers, who lack our psychological shortcomings – something like Asimov’s Multivac.

  2. In my last year or so as a teenager, not long after Stalin died, I had an interesting time often helping to move a full-scale Armoured Division around Germany in field conditions – luckily in peacetime. What was critical was to be fully aware of what was happening out there and, if errors arose, to correct them fast – and this meant admitting errors and acting immediately. Later, in the civilian world, these principles did not seem to apply in almost all organisations. Often reactions were perverse, involving blame games, deceits, cover-ups etc. etc., or simple bull-headed stupidity in carrying on regardless. In the private sector this often did show later in accounts or failures. In the public sector things tended simply to go on and on, and any admission of error was altogether an alien concept. Perhaps it is why so often coups are run by the military.

  3. Pingback: A week of links | EVOLVING ECONOMICS

  4. Pingback: "A culture of mistakes" | Homines Economici

Comments are closed.