On my travels to and from the fabulous Kilkenomics Festival this weekend, I read Superforecasting: The Art and Science of Prediction by Philip Tetlock and Dan Gardner. It’s a very interesting book. If you liked the books on judgement and prediction by Daniel Kahneman, Nate Silver and Gerd Gigerenzer, you’ll like this one too.
[amazon_image id="184794714X" link="true" target="_blank" size="medium"]Superforecasting: The Art and Science of Prediction[/amazon_image]
Essentially the book describes the outcome of the research project that followed on from Tetlock’s famous earlier demonstration that experts are not good at predictions. One group of experts failed to do better than random guesswork – and did worse for long-term forecasts. Another group did better than this but could rarely beat a simple rule of thumb such as ‘predict no change’ or ‘extrapolate the trend’. The follow-up project was stimulated by the introspection of the US intelligence community post-9/11, looking to see whether it would be possible to make forecasts of political or economic events any better than the earlier dispiriting results suggested.
The bulk of the book explains that yes, it is possible, but it’s hard work requiring techniques and habits that help people avoid the normal cognitive shortcuts (‘fast thinking’) we humans take. For example, make sure you start with what the authors call ‘the outside view’. Looking at the drop in popularity of a prime minister after an election? Start out by seeing what has happened to the ratings in the past. Watch out for the ‘bait and switch’ habit of answering an easy question rather than the hard one. Break up complex questions into smaller questions to narrow the territory of your ignorance. Take as many different perspectives as you can. Consult others and welcome diverse views – be on the alert for groupthink. Be prepared to change your mind. Be alert to conclusions based on your strong feelings or beliefs about an issue (the Keynesians vs Austerians debate is singled out as one where the participants are captive to their prior beliefs). All of the tips are gathered in a how-to-be-a-superforecaster appendix to the book.
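Two of these habits – starting from the outside view and breaking a question into smaller sub-questions – lend themselves to simple arithmetic. Here is a toy sketch of my own (the numbers and the election question are invented for illustration, not taken from the book) showing how decomposed sub-estimates can be combined and blended with an outside-view base rate:

```python
# Toy question (invented): "Will the incumbent win the election?"

# Outside view: suppose incumbents in comparable past races won ~60%
# of the time. Use that historical base rate as the starting anchor.
base_rate = 0.60

# Inside view, broken into smaller sub-questions (all numbers invented):
p_no_recession = 0.70         # P(economy avoids recession before the vote)
p_win_no_recession = 0.75     # P(win | no recession)
p_win_recession = 0.30        # P(win | recession)

# Combine the sub-questions by the law of total probability.
inside_view = (p_no_recession * p_win_no_recession
               + (1 - p_no_recession) * p_win_recession)

# Blend the outside and inside views rather than discarding either
# (an equal-weight blend is an arbitrary illustrative choice).
forecast = 0.5 * base_rate + 0.5 * inside_view

print(round(inside_view, 3))  # ≈ 0.615
print(round(forecast, 4))     # ≈ 0.6075
```

The point is not the particular numbers but the discipline: the anchor comes from history, and the overall estimate is forced through smaller, checkable sub-estimates instead of a single gut judgement.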
Summarised like this, it sounds obvious perhaps. But the book is stuffed with examples demonstrating how hard it is to put the advice into practice. Indeed, the experiment showed that some people can do far, far better than the majority, including ‘experts’ – but there are not many of them. They have specific characteristics: for example, clever (but not Mensa geniuses), open-minded, self-critical, numerate, comfortable with probabilities, willing to change their mind – plus determination. Tetlock is still looking for volunteers to have a go (www.goodjudgment.com).
The book ends with a discussion of two critiques. One is whether Nassim Taleb’s Black Swans imply superforecasting is a chimera – if history moves in jumps because a black swan appears, that must be unforecastable. The other is Daniel Kahneman’s hypothesis that even superforecasters will lose their mojo because their very success will make them vulnerable to the same cognitive patterns as the ‘experts’ who did no better than randomness or algorithms; we are all vulnerable to complacency or ‘fast thinking’. Tetlock concludes that history does show that the possibilities for the future are radically open but nevertheless argues that his results show that: “People can, with considerable effort, make accurate forecasts about at least some developments that really do matter.”
He does, for me, make his case convincingly, although the ‘considerable effort’ bit seems a huge barrier to superforecasting habits spreading more widely. But it doesn’t matter if you end up agreeing more with the critiques; this is still a very useful guide to cultivating your own good cognitive habits and critical thinking abilities. I’ll be trying to put the lessons here into practice myself – although not to the extent of starting anything foolish like macroeconomic forecasting.