Markets in stories

Yesterday I attended a very interesting conference on Non-Equilibrium Social Science, which stands for making economics more realistic and interesting by looking at the economy in terms of non-linear dynamic systems. As ever at a good conference, there were some great opportunities for conversation – and book recommendations. I came away with Marion Fourcade’s Economists and Societies, W.E.G. Salter’s Productivity and Technical Change, and William Hazlitt’s 1805 An Essay on the Principles of Human Action.

[amazon_image id=”0691148031″ link=”true” target=”_blank” size=”medium” ]Economists and Societies: Discipline and Profession in the United States, Britain, and France, 1890s to 1990s (Princeton Studies in Cultural Sociology)[/amazon_image]   [amazon_image id=”1144113512″ link=”true” target=”_blank” size=”medium” ]An Essay On the Principles of Human Action: Being an Argument in Favour of the Natural Disinterestedness of the Human Mind..[/amazon_image]

Also, after hearing him speak, David Tuckett’s Minding the Markets. His argument was, in a nutshell: “Financial markets are markets in stories.” He classed both the rational choice and ‘behavioural’ approaches to decision making as standard models, because both assume there is an objective reality about which an optimum can in principle be known or calculated. Instead, Prof Tuckett argued, the decision problem is one of making any sense at all of how to act now, on the basis of information available now, about an ontologically uncertain future. His answer is that we act on the basis of ‘conviction narratives’, collective interpretations of what today’s ‘facts’ (all based on our sense perceptions) mean for tomorrow. He described some empirical work looking at the stories that run in financial markets, as extracted from unstructured texts such as newswire reports, brokers’ notes and Bank of England reports.

[amazon_image id=”0230299857″ link=”true” target=”_blank” size=”medium” ]Minding the Markets: An Emotional Finance View of Financial Instability[/amazon_image]
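Purely by way of illustration of what extracting such stories from text might involve – the word lists and the scoring rule below are my own invented toy example, not Prof Tuckett’s actual method – here is a crude ‘conviction score’ that counts excitement words against anxiety words in a snippet of broker-speak:

```python
# Toy 'conviction narrative' scorer: counts excitement vs anxiety words.
# The word lists and the scoring rule are invented for illustration only;
# they are not David Tuckett's actual methodology.

EXCITEMENT = {"opportunity", "growth", "confident", "strong", "optimistic"}
ANXIETY = {"risk", "fear", "uncertain", "losses", "downgrade", "warning"}

def conviction_score(text: str) -> float:
    """Excitement-minus-anxiety word count, normalised by document length."""
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    excitement = sum(w in EXCITEMENT for w in words)
    anxiety = sum(w in ANXIETY for w in words)
    return (excitement - anxiety) / max(len(words), 1)

broker_note = ("Strong growth prospects and an optimistic outlook, "
               "although downgrade risk remains.")
print(f"conviction score: {conviction_score(broker_note):+.3f}")
```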

This is intriguing. It certainly chimes with my unease (as in my Tanner and Pro Bono lectures) about the way economists talk – especially in the context of policy – as if they were omniscient outsiders, not part of what they are analysing. My question – which Prof Tuckett agreed is still unexplored territory – is how reality interacts with our narratives, sometimes changing them. I’d like to have followed up by asking whether the idea amounts to a Kuhnian paradigm shift, but in the domain of financial markets, or economic life in general, rather than scientific exploration.

Another conference highlight was Bridget Rosewell on the inadequacy of our standard cost-benefit approach for deciding whether or not to go ahead with transport infrastructure projects – inadequate because it uses a comparative-static, equilibrium framework for something that is bound to change the dynamics of the economy. She gave many examples of projects that would never have passed a modern cost-benefit assessment, from Bazalgette’s London sewers to the Jubilee Line Extension, and – as she is scrupulous – an example of one that does not fit her account of self-fulfilling visions that deliver the benefits they describe. (This is the fantasy airport on Boris Island, which she supports.) Bridget touches on this infrastructure question in Reinventing London.

[amazon_image id=”1907994149″ link=”true” target=”_blank” size=”medium” ]Reinventing London (Perspectives)[/amazon_image]
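To make the point about comparative-static appraisal concrete, here is a back-of-the-envelope calculation – every number in it is invented for illustration, and it is not one of Bridget Rosewell’s examples. A project fails a conventional appraisal that holds the growth path fixed, but passes once it is allowed to change the growth rate itself:

```python
# Toy net-present-value comparison: a comparative-static appraisal versus
# one in which the project changes the economy's growth path.
# All figures are invented for illustration.

def npv(cashflows, rate=0.035):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

cost, years = 500.0, 30
baseline_output = 1000.0

# Static view: a fixed annual benefit of 2% of today's output.
static_benefit = [0.02 * baseline_output] * years
print("static NPV :", round(npv([-cost] + static_benefit), 1))   # negative

# Dynamic view: the project lifts trend growth from 2.0% to 2.3%, so the
# gap between the with- and without-project output paths widens over time.
with_project = [baseline_output * 1.023 ** t for t in range(years)]
without_project = [baseline_output * 1.020 ** t for t in range(years)]
dynamic_benefit = [a - b for a, b in zip(with_project, without_project)]
print("dynamic NPV:", round(npv([-cost] + dynamic_benefit), 1))  # positive
```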


Economists and humanity

Peter Smith sent me his new book The Reform of Economics. In a letter accompanying it, he said he has two motivations. One is to get economics out of the trap of over-simplifying so that models can use linear algebra and thus be made ‘tractable’. This is one of the things that makes complexity economics and agent-based modelling appealing: virtual economies run on a computer do not need to be solved algebraically.

[amazon_image id=”0957069707″ link=”true” target=”_blank” size=”medium” ]The Reform of Economics[/amazon_image]
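The tractability point can be seen in a few lines of code. In the sketch below – the equations are made up for illustration, they are not from the book – a linear model is solved exactly with one call to a linear-algebra routine, while a mildly non-linear variant is simply simulated forward until it settles down:

```python
# Illustration of the 'tractability' point (equations invented, not from the book):
# a linear model solves in one line of linear algebra; add a non-linearity and
# the natural route is to simulate rather than solve.
import numpy as np

# Linear 'economy': the equilibrium x solves A x = b exactly.
A = np.array([[1.0, -0.3], [-0.2, 1.0]])
b = np.array([10.0, 5.0])
print("closed-form equilibrium:", np.linalg.solve(A, b).round(3))

# Non-linear variant: each variable depends on the square root of the other.
# No longer a linear system, so iterate the map to a fixed point instead.
x = np.array([1.0, 1.0])
for _ in range(200):
    x = np.array([10.0 + 0.3 * np.sqrt(x[1]), 5.0 + 0.2 * np.sqrt(x[0])])
print("simulated fixed point:  ", x.round(3))
```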

The other aim is to make economic methodology more like normal scientific method. Economic method consists of choosing some basic postulates and making deductions from them; the deductions can then be tested against data. Normal science involves both induction and deduction, with careful empirical observation shaping theory.

The book dates the choice of the purely deductive path to Lionel Robbins and his 1935 essay, An Essay on the Nature and Significance of Economic Science. Robbins defined economics as the science of constrained choice, which “not only excludes uncertainty, but it also excludes from the scope of economics both institutions and the medium-term evolution of economic systems.” This isolates economics from the institutional framework of the economy, and hence from what determines the availability of resources over time – it makes economics an inherently static subject.
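Robbins’ definition boils down to the familiar constrained-optimisation template – the formulation below is the standard textbook version, not a quotation from Robbins or from Smith’s book:

```latex
\[
\max_{x_1,\dots,x_n} \; U(x_1,\dots,x_n)
\quad \text{subject to} \quad
\sum_{i=1}^{n} p_i x_i \le m
\]
```

with prices and income given and fixed: a single, timeless decision with no uncertainty, and no institutions beyond the budget constraint.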

Natural scientists do regard economics as bizarrely non-empirical – I’ve been in multi-disciplinary conferences about both macroeconomics and behavioural choice at which biologists exclaim about how rarely economists discuss data, for all that they might go away and test hypotheses. One of the joys of being on the Competition Commission for eight years was how profoundly evidence-based the process is, and hence a real insight for an economist used to generalising about how companies behave. There aren’t many business people who think about marginal cost curves and production functions.

The Reform of Economics is a game of two parts (not halves). It is mostly a critique of economic methodology but also has a useful introduction to agent-based modelling. It ends on an upbeat note I very much like:

“Economics is becoming a much more interesting area in which to work and learn; and we have every hope that a more realistic and effective reformed science of economics will also be a more humane one. For, ultimately, economics is about the well-being of humanity.”


Resources on complexity economics

Following my posts last week on complexity and economics, Professor Leigh Tesfatsion of Iowa State University sent me this very useful website with links to loads of resources on the subject – including an introductory self-study course.

Prof Tesfatsion wrote to me that the complexity approach has real momentum but added: “In macroeconomics, however, bitter resistance has been encountered, particularly from those who have devoted themselves to mastering DSGE modeling.” However, there is also some work on agent-based macroeconomics.

One general book I spotted in this list, one I’ve not read, is Mark Buchanan’s Forecast. Nate Silver’s The Signal and the Noise doesn’t feature agent-based modelling but does talk about the macroeconomy as a complex (non-linear, multivariate, dynamic) system.

[amazon_image id=”1408827379″ link=”true” target=”_blank” size=”medium” ]Forecast: What Physics, Meteorology, and the Natural Sciences Can Teach Us About Economics[/amazon_image]


Serendipity, complexity, and loneliness

No sooner (literally) had I written about the complexity economics of the new book by David Colander and Roland Kupers, Complexity and the Art of Public Policy, than – in one of the many instances of serendipity in life – another book on complexity turned up in the post, courtesy of its author, Peter Smith. The book is The Reform of Economics.

[amazon_image id=”0957069707″ link=”true” target=”_blank” size=”medium” ]The Reform of Economics[/amazon_image]

The book looks like it argues for a more realistic alternative to mainstream economics by actually developing one, using agent-based modelling. In a covering letter, Dr Smith says the intelligent agents “learn by experience how to respond to market conditions. … They can engage in price exploration, and learn to manage their inventory, plant renewal and cashflow (or go bust), all starting from far-from-equilibrium states.” He also describes his research, and his search for a post-crisis renewal of economics, as “a mite lonely.” I think it might be less lonely than he fears.
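To give a flavour of the general idea – this is my own minimal sketch of an adaptive firm, not code or parameters from Dr Smith’s book – here is an agent that starts far from any sensible price, nudges its price up when stock runs out and down when it piles up, and goes bust if its cash runs dry:

```python
import random

class Firm:
    """Toy adaptive firm: explores prices and watches inventory and cash.
    A minimal sketch of the agent-based idea, not Peter Smith's model."""

    def __init__(self, price=10.0, cash=100.0):
        self.price, self.cash, self.inventory = price, cash, 20
        self.alive = True

    def step(self, unit_cost=6.0):
        if not self.alive:
            return
        # Produce a batch, paying the production cost out of cash.
        produced = 10
        self.inventory += produced
        self.cash -= unit_cost * produced
        # Demand falls with price (plus noise); sales come out of inventory.
        demand = max(0, int(30 - 2 * self.price + random.gauss(0, 2)))
        sales = min(demand, self.inventory)
        self.inventory -= sales
        self.cash += self.price * sales
        # Price exploration: raise price when sold out, cut it when stock piles up.
        if self.inventory == 0:
            self.price *= 1.05
        elif self.inventory > 30:
            self.price *= 0.95
        # Go bust if the cash runs dry.
        if self.cash <= 0:
            self.alive = False

firm = Firm(price=4.0)   # a far-from-equilibrium starting point
for _ in range(50):
    firm.step()
print(f"alive={firm.alive}, price={firm.price:.2f}, cash={firm.cash:.1f}")
```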


Slow demise of the economist ex machina

A small number of economists have been interested in complexity theory and related approaches – agent-based modelling, network models – for a long time, and a growing number for a shorter time. Complex models in the technical sense – non-linear dynamic systems with many inter-connections and feedbacks – can describe economic or financial data well. Their evolution over time is highly sensitive to initial conditions, and they are sometimes characterised by ordered states that arise from the interactions of their components rather than being imposed from outside, a property known as emergence.
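The sensitivity to initial conditions is easy to demonstrate with even the simplest non-linear dynamic system. The logistic map below is the standard textbook toy, not a model of any actual economy; two starting points that differ by one part in a million soon end up on completely different paths:

```python
# Sensitive dependence on initial conditions in the logistic map, a standard
# toy non-linear dynamic system (not a model of any actual economy).

def logistic(x, r=3.9):
    return r * x * (1 - x)

a, b = 0.200000, 0.200001   # two almost identical starting points
for t in range(1, 51):
    a, b = logistic(a), logistic(b)
    if t in (10, 30, 50):
        print(f"t={t:2d}  a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")
```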

There are some very good books about complexity science, such as Mitchell Waldrop’s Complexity or Philip Ball’s Critical Mass, while Albert-László Barabási’s Linked sets out the related network approach. There are also quite a few books specifically about complexity in economics. The Santa Fe Institute has long been involved in complexity models, and Brian Arthur and Herb Gintis have applied them to economics. Paul Ormerod has written several very accessible introductions to complexity in economics, and Eric Beinhocker’s The Origin of Wealth is another. Alan Kirman more recently published Complex Economics.

A new book by David Colander and Roland Kupers, Complexity and the Art of Public Policy, is therefore not introducing a new area of research. But it is nevertheless doing two very interesting things.

[amazon_image id=”0691152098″ link=”true” target=”_blank” size=”medium” ]Complexity and the Art of Public Policy: Solving Society’s Problems from the Bottom Up[/amazon_image]

First, it locates complexity models in the context of the history of economic thought, explaining why and how economics turned away from the intuitively complexity-based approach of the classical economists, and of both Hayek and Keynes. There is a nice anti-reductionism quotation from Keynes: “Once the complexity of reality is carefully considered, the argument that applied policy concerns can be reduced to economics becomes so unreasonable that only an academic would dare consider it.” Colander and Kupers argue instead for what they describe as ‘activist laissez faire’, an approach which still leaves room for disagreement – as between Keynes and Hayek – but over empirical judgements and tactics rather than completely polarised, mutually exclusive approaches.

The authors link the turn in economics away from messy reality, towards sterile abstraction, to the work of Abba Lerner in the 1930s. They argue that his The Economics of Control was one of the founding texts of the viewpoint that came to dominate the discipline, the standard state-control economic policy framework. This caught on because it was simple and clear, and it cast economists as the experts who, with their analytically soluble models, could identify what policies were needed to maximise social welfare. State intervention was only needed when laissez faire markets failed – but one could argue that that was almost always. This is the attitude so brilliantly described in James Scott’s Seeing Like a State. Thus the stage was set for the dualism between interventionism and free market-ism, between ‘Keynesians’ and monetarists. The account in this book sets the reductionist turn in economics earlier than the conventional wisdom has it – others locate it in the 1940s and 1950s.

[amazon_image id=”0300078153″ link=”true” target=”_blank” size=”medium” ]Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed (Yale Agrarian Studies)[/amazon_image]

Aside from the discussion of the history of economic thought, the main contribution of this book is that it discusses the implications of the ‘complexity framework’ for the way we should think about public policy, arguing that it will get us away from the sterility of the markets versus states dualism. Instead, the role of both has to be respected – government in setting the rules and conditions, markets in delivering bottom-up choices in an efficient way. The policy intervention is  itself inside the system.

As the book points out, this is not consistent at all with the standard frame of economics, in which the economist is a deus ex machina calculating the optimal policy and implementing it – a command-and-control framework of thought that has spread far wider than economics, to capture all of public policy and even business decision making. This has exacted a high price, and I think it is certainly at the root of the present dissatisfaction with economics.

The authors argue that the problem in economics itself is mainly with macro and theory, as the applied micro areas of the field have been steadily moving away from assumed rationality, linearity, static equilibrium etc for some decades now – behavioural economics being the obvious example. In 2000, this led Paul David to declare that ‘neoclassical’ economics was dead. The exception, however, is the one bit of economics that all normal people know about because it’s in the news all the time, not least because of the financial crisis and its aftermath. As the authors write: “Issues of morality, the market and the constitutional order should have been central to the policy debate about macroeconomics. They weren’t. The standard frame eliminated them from discussion.” They are wonderfully scathing about modern DSGE macro, a view with which I wholeheartedly agree. “While microeconomics has evolved considerably in the direction of complexity, progress in macro has been very limited.” What students are currently taught in their macro courses is not useful and in fact inconsistent with empirical reality.

Outside economics, in the wider influence the subject and its approach have had on public policy, reductionism still reigns, and probably will until future generations of people who have studied economics have experienced a different kind of curriculum. Colander and Kupers end with a curriculum reform proposal – economics education is a longstanding interest of David Colander, who has contributed a chapter to a collection of essays on the subject. Roll on the roll-out of INET’s CORE curriculum!

However, I think this book is more useful for people in the policy world than in universities. It could start to chip away at the damaging idea that policy makers are a deus ex machina standing outside the system (something I spoke about in my Pro Bono lecture, The Economist as Outsider), and focus attention once again on the importance of the institutional, cultural and ethical framework within which people make economic decisions.
