Slow demise of the economist ex machina

A small number of economists have been interested in complexity theory and related approaches – agent-based modelling, network models – for a long time, and a growing number for a shorter time. Complex models in the technical sense of non-linear dynamic systems, with many interconnections and feedbacks, can describe economic or financial data well. Their evolution over time is highly sensitive to initial conditions, and ordered states sometimes arise spontaneously from the interactions of their many parts, a property known as emergence.
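To make the 'highly sensitive to initial conditions' point concrete, here is a minimal sketch (mine, not the book's) using the logistic map, a textbook one-line non-linear dynamic system, rather than anything specifically economic:

```python
# Illustrative only: the logistic map x -> r*x*(1-x) is a very simple
# non-linear dynamic system. In its chaotic regime (r = 4), two trajectories
# that start almost identically end up on completely different paths.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from the initial value x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)          # one starting point
b = logistic_trajectory(0.2 + 1e-9)   # an almost identical starting point

# The gap between the two runs grows from one part in a billion to order one
# within a few dozen iterations.
for step in (0, 10, 20, 30, 40, 50):
    print(step, abs(a[step] - b[step]))
```

Two starting values differing by a billionth are on entirely different paths within about forty steps, which is all that 'sensitive dependence' means in practice.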

There are some very good books about complexity science, such as Mitchell Waldrop’s [amazon_link id=”0140179682″ target=”_blank” ]Complexity[/amazon_link], or Philip Ball’s [amazon_link id=”0099457865″ target=”_blank” ]Critical Mass[/amazon_link]. Albert-László Barabási’s [amazon_link id=”0738206679″ target=”_blank” ]Linked[/amazon_link] sets out the related network approach. There are also quite a few books specifically about complexity in economics. The Santa Fe Institute has long been involved in complexity modelling, and Brian Arthur and Herb Gintis have applied it to economics. Paul Ormerod’s [amazon_link id=”0571197264″ target=”_blank” ]Butterfly Economics[/amazon_link], [amazon_link id=”057127921X” target=”_blank” ]Why Most Things Fail[/amazon_link] and [amazon_link id=”057127921X” target=”_blank” ]Positive Linking[/amazon_link] are all very accessible introductions to complexity in economics. Eric Beinhocker’s [amazon_link id=”0712676619″ target=”_blank” ]The Origin of Wealth[/amazon_link] is another. Alan Kirman more recently published [amazon_link id=”0415594243″ target=”_blank” ]Complex Economics[/amazon_link].

A new book [amazon_link id=”0691152098″ target=”_blank” ]Complexity and the Art of Public Policy: Solving Society’s Problems from the Bottom Up[/amazon_link] by David Colander and Roland Kupers is therefore not introducing a new area of research. But it is nevertheless doing two very interesting things.

[amazon_image id=”0691152098″ link=”true” target=”_blank” size=”medium” ]Complexity and the Art of Public Policy: Solving Society’s Problems from the Bottom Up[/amazon_image]

First, it locates complexity models in the context of the history of economic thought, explaining why and how economics turned away from the intuitively complexity-based approach of the classical economists (and also of both Hayek and Keynes). There is a nice anti-reductionism quotation from Keynes: “Once the complexity of reality is carefully considered, the argument that applied policy concerns can be reduced to economics becomes so unreasonable that only an academic would dare consider it.” Colander and Kupers argue instead for what they describe as ‘activist laissez-faire’, an approach which still leaves room for disagreement – as between Keynes and Hayek – but disagreement about empirical judgements and tactics rather than between completely polarised, mutually exclusive approaches.

The authors link the turn in economics away from messy reality, towards sterile abstraction, to the work of Abba Lerner in the 1930s. They argue that his [amazon_link id=”0678006180″ target=”_blank” ]The Economics of Control[/amazon_link] was one of the founding texts of the viewpoint that came to dominate the discipline: the standard framework of economic policy as control by the state. This caught on because it was simple and clear, and because it cast economists as the experts who could identify, with their analytically soluble models, the policies needed to maximise social welfare. State intervention was only needed when laissez-faire markets failed – but one could argue that that was almost always the case. This is the attitude so brilliantly described in James Scott’s [amazon_link id=”0300078153″ target=”_blank” ]Seeing Like A State[/amazon_link]. Thus the stage was set for the dualism between interventionism and free market-ism, between ‘Keynesians’ and monetarists. The account in this book sets the reductionist turn in economics earlier than the conventional wisdom has it – others locate it in the 1940s and 1950s.

[amazon_image id=”0300078153″ link=”true” target=”_blank” size=”medium” ]Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed (Yale Agrarian Studies)[/amazon_image]

Aside from the discussion of the history of economic thought, the main contribution of this book is to draw out the implications of the ‘complexity framework’ for the way we should think about public policy, arguing that it gets us away from the sterility of the markets-versus-states dualism. Instead, the role of both has to be respected – government in setting the rules and conditions, markets in delivering bottom-up choices in an efficient way. Policy intervention is itself part of the system, not something applied to it from outside.

As the book points out, this is not at all consistent with the standard frame of economics, in which the economist is a deus ex machina calculating the optimal policy and implementing it – a command-and-control framework of thought that has spread far beyond economics to capture all of public policy and even business decision making. This has exacted a high price and is, I think, at the root of the present dissatisfaction with economics.

The authors argue that the problem in economics itself lies mainly with macro and with theory, as the applied micro areas of the field have been steadily moving away from assumed rationality, linearity, static equilibrium and so on for some decades now – behavioural economics being the obvious example. In 2000, this led David Colander to declare that 'neoclassical' economics was dead. The exception, however, is macroeconomics, the one bit of economics that all normal people know about because it is in the news all the time, not least because of the financial crisis and its aftermath. As the authors write: "Issues of morality, the market and the constitutional order should have been central to the policy debate about macroeconomics. They weren't. The standard frame eliminated them from discussion." They are wonderfully scathing about modern DSGE macro, a view with which I wholeheartedly agree: "While microeconomics has evolved considerably in the direction of complexity, progress in macro has been very limited." What students are currently taught in their macro courses is not useful and is in fact inconsistent with empirical reality.

Outside economics, in the wider influence the subject and its approach have had on public policy, reductionism still reigns, and probably will until future generations of people who have studied economics have experienced a different kind of curriculum. Colander and Kupers end with a curriculum reform proposal – economics education is a longstanding interest of David Colander, who contributed a chapter to [amazon_link id=”1907994041″ target=”_blank” ]What’s the Use of Economics[/amazon_link]. Roll on the roll-out of INET’s CORE curriculum!

However, I think this book is more useful for people in the policy world than for those in universities. It could start to chip away at the damaging idea that policy makers are a deus ex machina standing outside the system (something I spoke about in my Pro Bono lecture The Economist As Outsider), and focus attention once again on the importance of the institutional, cultural and ethical framework within which people make economic decisions.

7 thoughts on “Slow demise of the economist ex machina”

  1. Very interesting post. I have grim memories of George Brown’s National Plan of 1965, when my view that it was a botch job which could never work because no national plan could ever work was denounced as lack of patriotism verging on treason. When Chaos Theory etc. came along I was told that it did not apply to economics as it could not be measured. Alas, there have been too many academics with too much to lose to give up their ideas of state planning and control.

  2. Love the aside that businesses think similarly. We continually face this when bidding for consultancy requirements. ‘Explain how you would ensure outcomes x, y, z’ (the product of myriad factors beyond control) ‘furthermore, payment will be subject to these outcomes being realised’. So we have to play the game where we claim not only to have perfect knowledge of the organisation’s present condition and context, but also of the universally applicable methods to take it into a (perfectly predictable) future of its choosing. As if ignoring the uncertainty will make it go away. Approaches which fully acknowledge uncertainty are out there (e.g. Managing the Unknown, Loch, DeMeyer & Pich) and implicit in many recent business books (such as Adapt, Antifragile, and Little Bets), but the dominant discourse of control is underpinned by strong incentives on both sides – both to be reassured that the future can be predicted and controlled by certain methods, and to be in the esteemed position of holding such ‘knowledge’.

    • And customers/voters probably wouldn’t be too receptive to the announcement that ‘Our 5 yr strategy is to drop the pretence that we’re doing anything other than muddling through’, either!

  3. Just a note to clarify something that I think is ambiguous in the post.

    The authors attribute the quote (on page 67) “Once the complexity of reality is carefully considered, the argument that applied policy concerns can be reduced to economics becomes so unreasonable that only an academic would dare consider it.” to John NEVILLE Keynes, the father of John Maynard Keynes.

    This quote bugged me because I can’t seem to find where J.N. Keynes said it and they don’t provide a citation. I am guessing it is in the Scope and Method of Political Economy? But who knows where.

  4. I have not read the book yet but your description of the authors’ thinking reminds me of “How Asia Works” by Joe Studwell, particularly his description of the economic development of South Korea.
