Antifragile, pro and anti

[amazon_link id=”0141038225″ target=”_blank” ]Antifragile: Things that Gain from Disorder[/amazon_link], the latest tract from Nassim Nicholas Taleb, left me with mixed feelings. It’s interesting, and I find much of the argument intuitively appealing. On the other hand, it badly needed a thorough edit – the train of argument is convoluted and there are horribly self-indulgent passages. It could have been significantly shorter, too. On balance it’s worth a go, but it lacks the punch of [amazon_link id=”0141034599″ target=”_blank” ]The Black Swan[/amazon_link] or [amazon_link id=”0141031484″ target=”_blank” ]Fooled by Randomness[/amazon_link].

[amazon_image id=”0141038225″ link=”true” target=”_blank” size=”medium” ]Antifragile: Things that Gain from Disorder[/amazon_image]

Which is a pity, because it would be good if the key ideas in [amazon_link id=”0141038225″ target=”_blank” ]Antifragile[/amazon_link] acquired the same traction in popular thought as the importance of fat-tailed distributions (hence the frequency of Black Swans) and the fact that basic probability means luck plays a much bigger role in life than we think. Taleb’s main argument is that just as small tremors release tension along a geological faultline, averting a big earthquake, small setbacks play a useful role in economic and social contexts. It is a good thing for the economy as a whole that some firms fail; the policy manipulation that gave us the Great Moderation (the Greenspan ‘put’ of cutting interest rates whenever the markets declined) built up the imbalances that led to the Great Crash. The book gives many examples of contexts in which small stresses play a healthy, error-correcting role, and over-regulating creates the conditions for big errors.

Along the way, Taleb has many swipes at economics, mainly for its insistence on linear models and normal distributions – and I’m someone who thinks that’s a fair cop. The world is self-evidently non-linear, and it’s alarming that so many policymakers cling to the illusion of control they get from linear thinking – pull this policy lever, and that desirable consequence will follow.

There is one cracking story in the book, where Taleb recounts giving a lecture to Société Générale’s top executives on risk, warning them that the bank was taking massively greater risks than they imagined. The reception was hostile, he reports. Weeks later, SocGen had to liquidate $70bn of assets in a fire sale to cover the losses caused by the trades of Jerome Kerviel. At the talk, Taleb says, he had been “heckled relentlessly by Kerviel’s boss and his colleague, the head of risk management.”

There are some general lessons from observing antifragility, Taleb concludes. There are three kinds of context: the fragile (negative feedbacks, or concavity), the robust (no feedbacks), and the antifragile (positive feedbacks, or convexity). Be aware of whether you are in a situation where the distribution of outcomes is curtailed at one end – travel times, for example, have little scope to be shorter than expected and plenty of scope to be far, far longer. Is there more upside than downside? How quickly do outcomes change – does the journey time from A to B increase by a lot more when you add a second hundred extra vehicles to the road than it did for the first hundred? Guard yourself by following what Taleb calls ‘barbell’ strategies – dual strategies that avoid the middle way. Put 90% of your money into safe assets and 10% into very risky ones, which limits your downside and creates a large upside; don’t put 100% into medium-risk assets, which could lose you everything if you miscalculated the risk. He extends the idea beyond finance: if you want to be a writer, work in a boring job and keep your free hours for writing, rather than taking a post as a creative writing academic, which will suck out your creative marrow with teaching and admin.
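The barbell arithmetic is easy to check in a few lines of Python. This is a stylised sketch of my own – the 90/10 split comes from the book, but the payoff numbers are illustrative, not Taleb’s:

```python
# Stylised one-period payoffs; the scenarios are illustrative, not from the book.
def portfolio_value(wealth, safe_share, risky_share, safe_return, risky_return):
    """Value after one period, splitting wealth between a safe and a risky asset."""
    return wealth * (safe_share * (1 + safe_return) + risky_share * (1 + risky_return))

wealth = 100.0

# Barbell: 90% in a safe asset (flat return), 10% in a very risky one.
barbell_crash = portfolio_value(wealth, 0.9, 0.1, 0.0, -1.0)   # risky stake wiped out
barbell_boom = portfolio_value(wealth, 0.9, 0.1, 0.0, 10.0)    # risky stake pays 10x

# The 'middle way': 100% in a medium-risk asset whose risk was miscalculated.
middle_crash = portfolio_value(wealth, 0.0, 1.0, 0.0, -1.0)

print(barbell_crash)  # 90.0  -> loss capped at the 10% risky stake
print(barbell_boom)   # 200.0 -> large upside retained
print(middle_crash)   # 0.0   -> ruin
```

This is the convexity point in miniature: the barbell’s loss is bounded at the risky stake while its gain is open-ended, whereas the supposedly moderate portfolio has the reverse profile once the risk estimate turns out to be wrong.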

This extension points to one of the book’s weaknesses: arguments that make complete sense at the level of probability and asset prices are stretched to other areas, from macroeconomics to the rest of life, where they have intuitive appeal but would probably not convince an unsympathetic reader. So I agree with Taleb that there is too much effort to squeeze the variability out of many contexts, with ultimately damaging consequences – but I thought that anyway, without the paraphernalia of antifragility to get me there. I enjoyed reading [amazon_link id=”0141038225″ target=”_blank” ]Antifragile[/amazon_link] despite its bagginess, and other Taleb fans will enjoy it too, but I doubt it will have the wider impact of his previous books.

From dead-end to dynamism

I’ve been reading an interesting, and non-technical, overview of complexity theory as applied to economics, [amazon_link id=”1781951969″ target=”_blank” ]The Rediscovery of Classical Economics: Adaptation, Complexity and Growth[/amazon_link] by David Simpson. As the title indicates, the book looks through the complexity lens – the economy as an evolving, self-organising system – at classical (as distinct from neoclassical) economics and at ‘Austrian’ business cycle theory. By classical he means not the specific body of 19th-century thought, but rather the general perspective on the economy as dynamic and in disequilibrium.

[amazon_image id=”1781951969″ link=”true” target=”_blank” size=”medium” ]The Rediscovery of Classical Economics: Adaptation, Complexity and Growth (New Thinking in Political Economy Series)[/amazon_image]

The introduction states:

“Equilibrium theory has focused the attention of academic economists on issues surrounding the efficient allocation of a given set of resources among a number of competing uses at a single moment in time. While such questions have engaged the best brains of at least two generations in a number of intellectual conundrums, it has diverted them from an analysis of those features of a market economy that have impressed themselves on human history.”

Or in other words, the mainstream of economics, with its focus on the moment of equilibrium, along with the assumption of a common stock of knowledge and rational choice, has bypassed the most striking characteristics of actual economies – growth, uncertainty, and human unpredictability.

There are a few other books that serve as good introductions to complexity in economics, such as Paul Ormerod’s very accessible [amazon_link id=”0571197264″ target=”_blank” ]Butterfly Economics[/amazon_link] and Alan Kirman’s [amazon_link id=”0415594243″ target=”_blank” ]Complex Economics[/amazon_link]. The contribution of David Simpson’s book is to link the tools of complexity thinking to a particular tradition in economic thought that has always emphasised growth, uncertainty and the problem of knowledge. Even the thickest-skinned of mainstream economists is probably aware of Hayek’s work on knowledge, or rather the impossibility of knowing everything, and of [amazon_link id=”0415567890″ target=”_blank” ]Schumpeter’s ‘creative destruction'[/amazon_link]. One chapter quotes Paul Krugman saying: “Is the economy a self-organising system? Of course it is!” The problem is that most economists, even if they acknowledge the general point, haven’t been doing that kind of economics – Krugman’s macroeconomics is back-to-the-sixties Keynesian analysis, and happens in a different part of his brain from his 1996 book [amazon_link id=”1557866988″ target=”_blank” ]The Self-Organizing Economy[/amazon_link].

[amazon_image id=”1557866996″ link=”true” target=”_blank” size=”medium” ]The Self Organizing Economy[/amazon_image]

Simpson brings in Austrian economics to consider the business cycle and particularly the present recession. I’m not at all familiar with the Austrian approach, so can’t really evaluate how well it fits into the complexity/uncertainty framework; but the narrative here emphasises the role of technology as the initial trigger, credit expansion in the boom, and the financial causes of the crisis. Simpson writes: “The business cycle carries many of the characteristic signatures of a complex process.” The economy self-organises then self-disorganises.

He concludes: “The marginal revolution of the last quarter of the 19th century had focused attention on the theory of value at the expense of the theory of growth. …. The end result of assuming away so many important aspects of reality is that the theory is not operational. It is impossible to relate equilibrium theory to the empirical processes of an actual market economy. …. Equilibrium theory has reached a dead end.”

My sense is that mainstream economists were already waking up to the irrelevance of much of the post-war work in the subject even before the Crisis. The collapse of the communist regimes, the dramatic impact of technology and globalisation, the steady adoption of behavioural research were all contributing to a shift in the mainstream back towards reality. Events since 2008 have accelerated the move, for all that many economists remain in denial, to the point that curriculum reform is now well under way, as I’ve noted before.

David Simpson’s book is a clear and readable introduction to complexity in the specific context of the history of economic thought, and can thus fill a gap in far too many economists’ knowledge of their own subject. It’s unfortunately a pricey Edward Elgar book, so most readers will need to order it from their library.


Simple is difficult

I’ve been mulling over the question I posed a few days ago, about how to reconcile Andy Haldane’s superb Jackson Hole paper (The Dog and the Frisbee) arguing the case for simpler financial regulation with Cesar Hidalgo’s equally persuasive arguments for using the capacity of Big Data to give us much more useful detail. It sent me back to one of my all-time favourite economics books, Thomas Schelling’s [amazon_link id=”0393090094″ target=”_blank” ]Micromotives and Macrobehaviour[/amazon_link], which is all about the aggregation of individual decisions. (Coincidentally, Sebastian Mallaby wrote about the same question in the FT yesterday.)

It hasn’t answered my question, but what struck me this time was how difficult it is to come up with compelling reasons for individuals to align their behaviour with the common interest. There is the traffic light example, and Chapter 3 gives a few other examples of effective rules and norms, but many more examples of problems – free-riding, collective action problems, lemons and so on. I conclude that simple is difficult – you have to find the right simple rule for the context, and it has to create a strong self-interest in abiding by it. Still, Schelling is optimistic. He writes:

“These problems often do have solutions. The solutions depend on some kind of social organization, whether that organization is contrived or spontaneous, permanent or ad hoc, voluntary or disciplined….. What we are dealing with is the frequent divergence between what people are individually motivated to do and what they might like to accomplish together.”

And there are many ways to make the collective bargain stick, he argues. I’m in an optimistic mood this morning, and will go with the argument that, between them, social norms, morals, institutions and even regulations can change, and make a big difference to collective outcomes.

[amazon_image id=”0393090094″ link=”true” target=”_blank” size=”medium” ]Micromotives and Macrobehaviour (Fels Lectures on Public Policy Analysis)[/amazon_image]

Keep it simple, stupid?

In the past few days I have read two brilliant and fascinating articles, pointing to opposite conclusions. One argues that the extent of complexity in the financial domain is so great that effective regulation can only be achieved by the use of heuristics or rules of thumb. The other argues, equally persuasively, that the potential for ‘big data’ is now so promising that we will not need to map the complexity of the macroeconomy using simple aggregates and averages, but rather will be able to use actual data. My instinct tells me both are correct but I’m still thinking through how they might be reconciled.

The first is a speech given by the Bank of England’s Andrew Haldane at Jackson Hole, The Dog and the Frisbee. He notes that neither humans – nor dogs, who can do it even better – actually solve an optimal control problem when catching a frisbee. They follow the rule of thumb: run at a speed so that the angle of gaze to the frisbee remains roughly constant. Modern finance theory, and consequently financial regulation, has developed models of decision making under risk, but in fact the world features uncertainty and increasing complexity. The strong assumptions about the state of knowledge made in conventional models do not hold.
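The gaze heuristic is simple enough to simulate. Here is a toy sketch of my own – the numbers and the hand-tuned controller are made up for illustration, not taken from the speech. The point is that the pursuer never computes a trajectory; it just speeds up or slows down to hold the gaze angle steady, and ends up at the landing spot anyway:

```python
import math

# Toy simulation of the gaze heuristic (illustrative numbers, not from the speech).
# The frisbee glides and descends; the dog adjusts its running speed to keep the
# elevation angle of its gaze to the frisbee roughly constant.
dt = 0.01
x_f, h = 0.0, 10.0        # frisbee: horizontal position (m) and height (m)
v_f, sink = 5.0, 1.0      # frisbee glides at 5 m/s, descends at 1 m/s
x_d, v_d = -20.0, 0.0     # dog starts 20 m behind, standing still

theta0 = math.atan2(h, x_f - x_d)   # the gaze angle the dog tries to hold
theta_prev = theta0
kp, kd = 50.0, 50.0                  # hand-tuned controller gains

while h > 0:
    x_f += v_f * dt
    h -= sink * dt
    theta = math.atan2(h, x_f - x_d)
    # Speed up when the gaze angle is below its starting value or falling.
    v_d += (kp * (theta0 - theta) + kd * (theta_prev - theta) / dt) * dt
    theta_prev = theta
    x_d += max(v_d, 0.0) * dt        # the dog only runs forward

print(abs(x_f - x_d))  # small: the dog is close to the landing spot
```

No optimal control problem is solved anywhere: a two-gain feedback rule on a single observable angle is enough to intercept the frisbee, which is Haldane’s point about heuristics in a nutshell.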

The speech concludes:

“Modern finance is complex, perhaps too complex. Regulation of modern finance is complex, almost certainly too complex. That configuration spells trouble. As you do not fight fire with fire, you do not fight complexity with complexity. Because complexity generates uncertainty, not risk, it requires a regulatory response grounded in simplicity, not complexity. Delivering that would require an about-turn from the regulatory community from the path followed for the better part of the past 50 years. If a once-in-a-lifetime crisis is not able to deliver that change, it is not clear what will. To ask today’s regulators to save us from tomorrow’s crisis using yesterday’s toolbox is to ask a border collie to catch a frisbee by first applying Newton’s Law of Gravity.”

This seems to me to be obviously true, even if it ruffles feathers in the financial regulatory community.

The second article is a conversation with MIT Media Lab’s Cesar Hidalgo on The Edge, What is Value, What is Money? This covers a lot of territory, but part of it is how we understand complexity in the aggregate. Hidalgo says:

“In the past when we looked at macro scales, at least when it comes to many social phenomena, we aggregated everything. Our idea of macro is, by an accident of history, a synonym of aggregate, a mass in which everything is added up and in which individuality is lost. What data at high spatial resolution, temporal resolution and typological resolution is allowing us to do, is to see the big picture without losing the individuality inside it. I believe that in the future, macro is going to be something that is going to be in high-definition. You’re going to be able to zoom in into these macro pictures and see that neighborhood, and see that person, and understand that individual, and to have more personalized interactions thanks to the data that is becoming available. I think that in some sense, big data can help recover the humanity of a world in which the scientific representations of people have become dehumanized, because of our need to simplify.”

Well, this is an exciting prospect and obviously potentially feasible in the Big Data world. But I’m not yet sure how it sits with the ‘Keep it Simple, Stupid’ moral of the Haldane paper. Or indeed with my strong instinct that public policy interventions are most effective when of the kind described by Thomas Schelling in his brilliant book [amazon_link id=”0393329461″ target=”_blank” ]Micromotives and Macrobehaviour[/amazon_link] – like traffic lights, a clear and simple rule which people have strong incentives to obey.

My dog retraining as a financial regulator

Woolly complexity

It has taken me a while to get round to reading Stuart Kauffman’s [amazon_link id=”0465018882″ target=”_blank” ]Reinventing the Sacred: A New View of Science, Reason and Religion[/amazon_link]. Although I was looking forward to one of the gurus of complexity expounding on the meaning of life, culture and technology, the mention of religion in the title must have rung some kind of subconscious warning bell. For I found the book an infuriating mix of interesting reflections on complexity – particularly why it should warn us against being too reductive, or locked into disciplinary silos – and woolly philosophy. It is, in fact, a book about why science is not only compatible with spirituality but should drive us to spirituality and belief in God. While I’m perfectly happy for anybody else to worry about reconciling science and the sacred, I’m just not that interested in it.

There is a chapter on the economy that skates over the application of evolutionary theory and complexity to economics. This is brief and may be a handy introduction for anyone who knows nothing about this subject. If you do, it will be very familiar. Inevitably for a book with extremely broad scope, it lacks depth.

In terms of its underlying hypothesis about the dangers of being reductive, this book suffers by comparison with another one I’m currently reading, [amazon_link id=”0300188374″ target=”_blank” ]The Master and His Emissary[/amazon_link] by Iain McGilchrist. In contrast to Reinventing the Sacred, it’s a real doorstop of a book, going in depth into the human brain as well as many aspects of human culture. It’s a shame – I enjoyed Kauffman’s other book [amazon_link id=”0195095995″ target=”_blank” ]At Home in the Universe: The Search for Laws of Self-Organization and Complexity[/amazon_link], and he has obviously been a massively influential thinker in introducing these ideas to the analysis of social as well as biological phenomena.

[amazon_image id=”0465018882″ link=”true” target=”_blank” size=”medium” ]Reinventing the Sacred[/amazon_image]