When Michael Lewis came to dinner

Michael Lewis came to dinner at our house once. This was about 30 years ago, when he was dating a friend of mine for a while, before Liar’s Poker made him famous. He was charming – then working in finance – but if only I’d known who he’d turn into, I’d have quizzed him closely about his stellar writing technique.

I’ve just devoured The Undoing Project, his much-trailed book about Daniel Kahneman and Amos Tversky and their launching of the behavioural revolution in economics, in the space of two evenings. It’s a wonderful book. I heartily recommend it as a Christmas gift for the economists in your life, or a treat for yourself over the holiday.

The book weaves together the personal and intellectual biographies of its protagonists. It explains the ideas, including the paradoxes requiring one to think about probabilities, beautifully clearly. It’s also just a terrific human story about an intense creative friendship as it flowed, and ebbed, over the decades and continents. If they’re not always totally likeable, the two characters are always immensely sympathetic.

It will surely send many readers on to Kahneman’s Thinking, Fast and Slow, which requires some mental effort but which everybody who fancies themselves an intelligent, educated person ought to have read. Although I’ve read loads of behavioural economics books and papers, and so think I know a lot of the insights the literature has given us about how our decision-making processes function, there were still some new (to me) ones in The Undoing Project. Here are two. The first: Tversky had a rule that you must wait a day before replying to any invitation, even one you wanted to accept. It becomes much easier to decline the ones you don’t want. This is advice I definitely need to follow.

The other comes from thinking about reversion to the mean. An exceptionally (beyond average) good or bad performance is usually followed by one that is less good or less bad (closer to average). Yet coaches and teachers and bosses often hold that if you praise someone for doing well, they do less well next time, and if you shout at someone for doing badly, they do better next time. “Because we tend to reward others when they do well and punish them when they do badly, and because there is regression to the mean, it is part of the human condition that we are statistically punished for rewarding others and rewarded for punishing them,” wrote Kahneman. This strikes me as profound and something one ought to act on.
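
Kahneman’s point is easy to check for yourself. Here is a minimal simulation sketch (my illustration, not anything from the book): performance is just fixed skill plus luck, so feedback has no causal effect at all, and yet praise is reliably ‘punished’ and punishment ‘rewarded’.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: each performance = fixed skill + independent luck.
# Feedback has no causal effect here, by construction.
n = 100_000
first = rng.normal(size=n)    # first attempt
second = rng.normal(size=n)   # second attempt, independent of the first

praised = first > 1.0         # exceptionally good first attempts
scolded = first < -1.0        # exceptionally bad first attempts

# Pure regression to the mean: the praised get worse, the scolded improve.
print("change after praise:   %+.2f" % (second[praised] - first[praised]).mean())
print("change after scolding: %+.2f" % (second[scolded] - first[scolded]).mean())
```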

It was surprising to learn that at the height of his fame, in the years before his death, Tversky was bugged by Gerd Gigerenzer’s criticism of their work. I’ve never seen Gigerenzer’s argument – that heuristic rules of thumb are rational because they economize on brain energy – as a fundamental attack on Kahneman and Tversky, more as an extension. There’s surely loads still to be discovered about decision making (especially under uncertainty), not least about when decisions are conventionally ‘rational’ versus when ‘behavioural’ behaviour kicks in.

As economics is all about decision-making in the domain of resource use and allocation, this overlap with psychology and cognitive science is an exciting area – even though I’m deeply uneasy about the eagerness with which some economists and policy makers are leaping to adopt ‘nudges’ as another handy tool for social engineers to get the people to behave as they ought. We certainly ought to be teaching this at A level and in universities. The Undoing Project is a great book to introduce behavioural economics – and a cracking good story, told by a master.


Rescuing macroeconomics?

It’s always with great diffidence that I write about macroeconomics. Although I’m in good company in being sceptical about much of macro (see this roundup from Bruegel and this view from Noah Smith, for instance), I’m all too well aware of the limits of my knowledge. So with that warning, here’s what I made of Roger Farmer’s very interesting new book, Prosperity for All: How To Prevent Financial Crises.

The book starts with his critique of conventional DSGE real business cycle models and also New Keynesian models. Farmer rightly dismisses the idea that it’s OK to mangle the data through a Hodrick-Prescott filter and ‘calibrate’ the model, as in the real business cycle approach. But he also has a chapter criticising the (now) more conventional New Keynesian models. He writes, “Macroeconomics has taken the wrong path. The error has nothing to do with classical versus New Keynesian approaches. It is a more fundamental error that pervades both.” This is the idea that there is an ultimate full employment equilibrium. Echoing Paul Romer’s recent broadside, he describes both as phlogiston theory. Echoing a number of others, he describes New Keynesian models as being like Ptolemaic astronomy, adding more and more complexity in a desperate and degenerate attempt to keep the theory roughly aligned with the evidence.
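
For readers who haven’t met it, the Hodrick-Prescott filter mechanically splits a series into ‘trend’ and ‘cycle’ components. A minimal sketch of the step Farmer objects to, using simulated data and the statsmodels implementation (my illustration, not from the book):

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

# Simulated quarterly log-GDP: a random walk with drift (illustrative only).
rng = np.random.default_rng(1)
log_gdp = 0.005 * np.arange(120) + np.cumsum(rng.normal(scale=0.008, size=120))

# lamb=1600 is the conventional smoothing parameter for quarterly data.
cycle, trend = hpfilter(log_gdp, lamb=1600)

# RBC practice treats `cycle` as the business cycle and calibrates the model
# to its moments -- but the trend/cycle split is imposed by the filter itself.
print("cycle standard deviation: %.4f" % cycle.std())
```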

The book demolishes the idea that there is a natural rate of unemployment (and thus also the idea of the output gap). Farmer argues that there are multiple possible outcomes and unemployment in many of these will persist in equilibrium. His alternative model – also a DSGE approach – replaces the New Keynesian Phillips Curve with a ‘belief function’, assuming that beliefs about future expected (nominal) output growth equal current realized growth.
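
On my reading, the belief function amounts to a form of adaptive, self-fulfilling expectations. A toy stylization (mine, not Farmer’s actual model) shows why it delivers multiple equilibria:

```python
# Toy stylization of a 'belief function' (my sketch, not Farmer's model):
# expected nominal output growth next period equals realized growth this period.
def belief(realized):
    return realized  # E_t[growth_{t+1}] = growth_t

# If realized growth then validates the belief, any starting point persists:
for start in (0.04, 0.00, -0.02):
    g = start
    for _ in range(20):
        g = belief(g)  # beliefs reproduce themselves period after period
    print(f"start {start:+.2f} -> settles at {g:+.2f}")

# Optimistic and pessimistic economies are both self-fulfilling, so high
# unemployment can persist with no tendency back to 'full employment'.
```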

It seems obvious to me that this approach is preferable to the New Keynesian and certainly the RBC models, although this obviousness is rooted in my intuition – of course unemployment can persist and there can be multiple equilibria, duh! However, some things about it are less appealing. Above all, there is Farmer’s assumption that consumption is a function of wealth, not income: he replaces the conventional consumption function with one more purely related to the Permanent Income Hypothesis. This troubles me, although not because I disagree with his view that the conventional Keynesian consumption function and multiplier are inconsistent with the macro data. However, I thought the permanent income hypothesis sat badly with the data too, as it implies more consumption smoothing than is observed. It seems incredible that many people look very far ahead in determining their consumption level, even if they do note and respond to asset price changes. Besides, most people have low net wealth – indeed, 26% of Britons have negative net financial wealth.
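
To see why the choice of consumption function matters so much, compare the spending response to a purely transitory windfall under the two assumptions (stylized numbers, my own illustration):

```python
# Stylized comparison (my numbers): spending response to a one-off windfall.
windfall = 1000.0

# Conventional Keynesian function: consume a fixed share of current income.
mpc = 0.9
keynesian_response = mpc * windfall   # 900 -- large multiplier effects

# Permanent-income / wealth-based function: consume the annuity value of the
# wealth gain, spreading it over a lifetime at interest rate r.
r = 0.03
pih_response = r * windfall           # 30 -- near-total smoothing

print(keynesian_response, pih_response)
# Household data typically sit between these extremes: more smoothing than
# the Keynesian function, less than the pure permanent income hypothesis.
```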

As the book points out, this change of assumption, and the role of the belief function, have strong policy implications. The debate about austerity, which is in effect an argument about the size of the multiplier (which we don’t know), is a distraction: higher government deficits will not shift the economy to a lower unemployment equilibrium. Instead, Farmer advocates using the composition of the central bank balance sheet to manage asset markets, and thus the level of output and beliefs, through wealth effects (rather than including asset price inflation in an inflation target, as some have advocated). Balance sheet policies are effective because financial markets are incomplete: future generations cannot trade now, and parents cannot leave negative bequests.

This is a policy debate I’ll be leaving to those who are more knowledgeable than me – although of course those who know what they’re talking about will also have their own horses in the race. It is striking that these macro debates rage around the same small amount of data, which is insufficient to identify any of the models people are battling over. In his excellent book about forecasting, The Signal and the Noise, Nate Silver pointed out that weather forecasters reacted to weaknesses in their models and forecasts by gathering many more data points – more observatories, more frequent collection. I see macroeconomists downloading the same data over and over again, and ignoring the kinds of data issues (such as the effects of temporal aggregation) that time series econometricians like David Giles point out. So something like David Hendry’s model-free approach, as set out in his recent paper, seems the best we can do for now.

My reservations should not stop anybody from reading Prosperity for All. It is an accessible read – undergraduate and maybe even A level students could cope with it – and the model is available at www.rogerfarmer.com. I’d like to have seen some discussion of non-US data, and also of structural change/non-stationarity. It’s also a short book, and as a macro-ignoramus I’d have liked some sections to be explained more fully. But these are quibbles. This is an important contribution to the debate about macroeconomics, and it’s an important debate because this is what most citizens think economics is all about, and macro policy has a profound effect on their lives.

I’m keen to read other reviews of the book now. I’m sure Roger is more right than the conventional DSGE approach – but I also think the how-to-do-macro debate is far from settled. How open-minded are more conventional macroeconomists?


Digi-trust-busting

As someone who spent eight years on the Competition Commission, the changing shape of competition in the digital world is a question of compelling interest to me. Mainly, I blocked mergers, but the exceptions were retail inquiries where the growing competition from online retailers (especially Amazon) was, to me, a clear constraint on merging high street chains (some of my colleagues were less convinced – this was 2001-2009). Looking more recently at the literature on digital platforms, it is clear that economists have to step up and deliver new, practical analytical tools for competition authorities. As Jean Tirole and his co-authors famously established, the old tools of market definition and SSNIP tests are inadequate for assessing competitive conditions in two-sided platform markets. And when the dynamics of competing for versus in the market, and the evolution of ecosystems, are so important now, the longstanding failure of competition economics to deliver a systematic way of thinking about the static versus dynamic impacts of mergers really matters.
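
For the non-specialist: the SSNIP test asks whether a hypothetical monopolist over a candidate market could profitably impose a Small but Significant Non-transitory Increase in Price, usually 5%. The standard critical-loss arithmetic (a textbook formula, with made-up numbers) shows how it works – and hints at why it has nothing to grip on when the service is free:

```python
# Standard critical-loss arithmetic for a SSNIP test (textbook formula;
# the margin and price rise are made-up illustrative numbers).
def critical_loss(price_rise, margin):
    """Largest fraction of sales the hypothetical monopolist can afford to
    lose before a given price rise becomes unprofitable."""
    return price_rise / (price_rise + margin)

price_rise = 0.05   # the conventional 5% SSNIP
margin = 0.40       # 40% gross margin

print(f"critical loss: {critical_loss(price_rise, margin):.1%}")   # ~11.1%
# If more than ~11% of customers would switch away, the rise is unprofitable
# and the candidate market is drawn too narrowly. With a zero-price platform
# service there is no price to raise, so the test breaks down entirely.
```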

This is a long-winded preamble to mentioning Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy by Ariel Ezrachi and Maurice Stucke. The authors are clearly concerned about the failure of competition policy tools in the new context, and although the book tries to be even-handed, it paints a picture of a world of increasing market power, to the detriment of consumers and citizens.

The most interesting thread in the book, from an economist’s perspective, is the reflection on the role of information in markets. Reductions in search costs should improve consumer and economic welfare and make markets more competitive. However, the greater availability of information in the online world is illusory, because there is a staggering imbalance. Digital platforms have an extraordinary amount of extra information about us – and there are very interesting chapters covering the struggle between platforms, advertisers, app developers and others to gather and aggregate personal information. However, the information we consumers get about the goods and services we’re looking to purchase is diminishing. The book raises the question of whether the use of cookies and geo-tracking is enabling ever-better price discrimination by platforms and online sellers; there has been no systematic evidence that this is so, but then it would be hard to gather the data to test this properly.

At the start of the internet era, there was great optimism that this was a technology for empowering consumers with more nearly perfect information, allowing easy price and product comparisons. In fact, it may be returning us to the era of the bazaar, with diminishing transparency about prevailing market prices and conditions. “In a market that is in reality controlled by bots and algorithms, what power does the invisible hand possess?” Instead, maybe we have a digitalized hand, determining the specific market price in any given context. As others have done (Francis Spufford in Red Plenty – not cited – and Eden Medina in Cybernetic Revolutionaries – which is cited here), the book notes that in the limit a profit-maximizing market with perfect information and a social-welfare-maximising central planner who is similarly well-informed would reach the same prices and allocations (although with contrasting distributions).

The book does a good job of describing the changing dynamics of competition in digital markets, and why there is every reason to be concerned. It is written by two lawyers, and it is frustrating that it hardly mentions the economic research literature, which is proliferating even if not yet reaching policy-ready conclusions. The authors also overdo some of their critique of digital businesses – for example, they include a section on the use of framing and choice architecture to manipulate consumer choice, but that dates back to the pre-digital days of Mad Men. Still, I share their view that we are in an age similar to that of the giant industrial trusts, and some digi-trust-busting is going to be needed.


The social life of electricity, continued

I devoured Then There Was Light: Stories Powered by the Rural Electrification Scheme in Ireland on my train journey back from the Bristol Festival of Economics yesterday. It’s a delightful collection of reminiscences about the scheme that took electrification to the countryside (ie most of Ireland) during the 1940s and into the 1950s. In fact, coverage only reached 100% of the remote islands in 2000, and some hamlets were eventually depopulated because electricity never got to them.

The book is a reminder of what a poor and agrarian country Ireland was until, well, EU accession really. The essays by men who worked digging post holes and driving trucks often start with the relief they felt on getting not only a job, but one for a government body paying decent wages. Being hired for the work changed their lives. Most of the essays are written by people who spent at least part of their childhood in darkness lit only by candles and paraffin lamps, with mothers doing all the laundry by hand. One writer calls the scheme one of the most transformative events in Ireland’s 20th century history as a nation, and by the end of the book this doesn’t seem like hyperbole.

It’s fascinating to read about the hesitations some people had: about the cost, not only of getting hooked up to start with, but also the vulnerability of having to make ongoing payments for power (for many new customers this was the first regular bill they had experienced); about the dangers – would it harm the cattle or burn down the house?; and about whether it would change the character of life for the worse. Parish priests were often key advocates for electrification, and so were women, who quickly saw the potential benefits of electrical domestic appliances. The company also had a PR and sales team – the book has illustrations showing some of the demos and delightful adverts: “Electricity saves money in the farmyard!”

Above all, the book is a reminder that all technology is social. Not only do new technologies need complementary investments – households paying for their internal wiring and switches, or street lights, for example – but people also have to be able to see the benefits as well as the costs (faster milking, no more washing by hand), and these often need demonstrating in practice before they are believed. Electricity in particular is also a social technology, involving collective decisions to create and sustain the incentives to make it function. The investments are long-term and they change places dramatically; and although the technology is now old, it is both essential and dangerous.

So it is that many countries still cannot provide a consistent universal electricity supply, and even advanced economies experience power cuts and underinvestment. If western political systems lose their ability to create consensus and collective, long-term action, there will be many bad consequences, but one of them might well be disrupted electricity supplies. This is not alarmist: I spent chunks of my teenage years doing homework by candlelight too. We had the power stations and the wires, but not the social and political infrastructure, in the years of unrest and strikes in the 1970s.


Facts and values (statistical version)

The trouble with reading two books simultaneously is that it slows down the finishing. But I have now finished a terrific novel, You Don’t Have to Live Like This by Benjamin Markovits – a sort of state-of-the-United-States novel, except that it seems like another age in the grotesque situation of Donald Trump apparently about to become President. And also The Cost of Living in America: A Political History of Economic Statistics, 1880-2000 by Thomas Stapleford.

The title might mark it out as a bit of a niche read – yes, OK – but it is truly a very interesting book. The key underlying message is that all statistics are political, and none more so than a price index. The account is one of the recurring, and recurringly failing, attempts to turn conflicts over the allocation of resources into a technical matter to be resolved by experts. The systematic state collection of statistics is part of the 19th-20th century process of the rationalization of governance, as well as being itself “a form of rationalized knowledge making”. Theodore Porter’s well-known Trust in Numbers: The Pursuit of Objectivity in Science and Public Life documented the political appeal of developing and using impersonal, quantitative measures and rules. In my experience, statisticians themselves are far more aware than politicians (or indeed economists) of the highly judgmental nature of their work.

The Cost of Living in America presents the history of the development of price measurement in the US, with a division between the labor movement’s emphasis on the standard of living and cost of living, and the increasingly technocratic development of a price index for use in macroeconomic management. The former began with the study of ‘baskets’ of goods and a debate about what working families needed to maintain their standard of living and keep up with the norm. This was affected by context: for example, the price of certain staples, including housing, rose faster in wartime, and new goods appeared. The debate about price indices increasingly revolved around whether to try to measure the cost of a fixed level of satisfaction, or the cost of a fixed basket of goods.
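
The fixed-basket side of that debate is the familiar Laspeyres index: price the same base-period basket at old and new prices. A minimal sketch with made-up numbers:

```python
# Fixed-basket (Laspeyres) price index with made-up illustrative numbers.
base_prices = {"bread": 1.00, "rent": 500.0, "coal": 20.0}
new_prices  = {"bread": 1.10, "rent": 550.0, "coal": 18.0}
basket      = {"bread": 200,  "rent": 1,     "coal": 10}  # base-period quantities

cost_then = sum(base_prices[g] * basket[g] for g in basket)
cost_now  = sum(new_prices[g]  * basket[g] for g in basket)

print(f"Laspeyres index: {cost_now / cost_then:.3f}")   # cost of the same basket
# A constant-utility index instead asks what the cheapest bundle yielding the
# same satisfaction would now cost -- which needs assumptions about preferences.
```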

By the time of the Boskin Commission, this had been resolved decisively in favour of a constant-utility index: the minimum change in expenditure needed to keep utility unchanged. (Robert Gordon has since said the Commission understated the overstatement of inflation.) This made accounting for quality change and new goods a pressing issue. Many economists started to agree that the statisticians had not adequately accounted for these in their price indices. Economists including Robert Gordon and Zvi Griliches focused on this question, with Griliches developing the hedonics approach.

Stapleford writes: “If economists were to claim that their discipline had any claim to neutral technical knowledge, surely that claim required them to have neutral apolitical facts – namely economic statistics. … A constant-utility index was surely the proper form for a cost-of-living index, but the idea that one could compare ‘welfare’ in two different contexts [eg two separate time periods or two countries] without introducing subjective (and probably normative) judgments seemed implausible at best.” Yet applying price indices to macroeconomic analysis of growth or productivity, rather than labour disputes, helped depoliticise them. And hedonics tackled the problem by redefining goods as bundles of characteristics. As the book notes, governments became keen on their statisticians applying hedonics from the mid-90s, when they realised that it implied very rapid declines in some prices and hence higher productivity growth. (And the ‘accuracy’ of price indices is in question again now because of the ‘productivity puzzle’.)
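
In practice, hedonics means regressing observed prices on product characteristics, so that paying more for a better machine can be distinguished from paying more for the same machine. A toy sketch in the spirit of Griliches (the numbers are fabricated for illustration):

```python
import numpy as np

# Toy hedonic regression (fabricated numbers): computer prices as a function
# of characteristics, so price change and quality change can be disentangled.
# Design matrix columns: [constant, clock speed in GHz, RAM in GB]
X = np.array([
    [1.0, 1.0,  2.0],
    [1.0, 2.0,  4.0],
    [1.0, 2.5,  8.0],
    [1.0, 3.0, 16.0],
])
prices = np.array([400.0, 600.0, 800.0, 1100.0])

(const, per_ghz, per_gb), *_ = np.linalg.lstsq(X, prices, rcond=None)
print(f"implicit price per GHz: {per_ghz:.0f}, per GB of RAM: {per_gb:.0f}")
# If this year's machines cost more but deliver proportionally more GHz and
# RAM, the implicit prices imply quality change rather than inflation.
```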

But this is an uncomfortable resolution. Although this elegant solution links national income statistics to neoclassical utility theory, there seems to be a category mismatch between a set of accounts measuring total production and the idea that value depends on utility. Setting aside the fact that hedonic methods are not applied to a large proportion of consumer expenditure anyway, this piece of statistical welding is coming under huge strain now because the structure of production in the leading economies is being transformed by digital technologies.

One of the many consequences of the Brexit vote and Trumpery is that economists (and others) are again thinking about distribution. The issue is usually framed as the distribution of growth – when it is there, who gets it? I think the question raised for our economic statistics is far more fundamental: we need to recognise the normative judgements involved in the construction of the growth statistics to begin with. Actually existing macroeconomic statistics embed a tacit set of assumptions about welfare, and about a production structure which is becoming redundant. But that of course is my favourite theme.
