Facts and values (statistical version)

The trouble with reading two books simultaneously is that it slows down the finishing. But I have now finished a terrific novel, You Don’t Have to Live Like This by Benjamin Markovits – a sort of state-of-the-United-States novel, except that it seems to belong to another age in this grotesque situation of Donald Trump apparently about to become President. I have also finished The Cost of Living in America: A Political History of Economic Statistics, 1880-2000 by Thomas Stapleford.

The title might mark it out as a bit of a niche read – yes, ok – but it is truly a very interesting book. The key underlying message is that all statistics are political, and none more so than a price index. The account is one of the recurring, and recurringly failing, attempts to turn conflicts over the allocation of resources into a technical matter to be resolved by experts. The systematic state collection of statistics is part of the 19th-20th century process of the rationalization of governance as well as being itself “a form of rationalized knowledge making”. Theodore Porter’s well-known Trust in Numbers: The Pursuit of Objectivity in Science and Public Life has documented the political appeal of developing and using impersonal, quantitative measures and rules. In my experience, statisticians themselves are far more aware than politicians (or indeed economists) of the highly judgmental nature of their work.

The Cost of Living in America presents the history of the development of price measurement in the US, with a division between the labor movement’s emphasis on the standard of living and cost of living, and the increasingly technocratic development of a price index for use in macroeconomic management. The former began with the study of ‘baskets’ of goods and a debate about what working families needed to maintain their standard of living and keep up with the norm. This was affected by context. For example, the price of certain staples, including housing, rose faster in wartime. New goods appeared. The debate about price indices increasingly revolved around whether to measure the cost of a fixed level of satisfaction or the cost of a fixed basket of goods.
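As a concrete sketch of the fixed-basket side of that question, here is a minimal Laspeyres-style calculation; the goods, prices and quantities are invented purely for illustration and are not taken from the book.

```python
# Minimal sketch of a fixed-basket (Laspeyres) price index.
# All goods, prices and quantities are invented for illustration.

base_quantities = {"bread": 100, "rent": 12, "coal": 8}       # basket fixed in the base period
base_prices     = {"bread": 0.50, "rent": 20.0, "coal": 5.0}  # base-period prices
new_prices      = {"bread": 0.55, "rent": 24.0, "coal": 4.5}  # later-period prices

def laspeyres_index(q0, p0, p1):
    """Cost of the original basket at new prices, relative to its original cost (x100)."""
    cost_then = sum(q0[g] * p0[g] for g in q0)
    cost_now  = sum(q0[g] * p1[g] for g in q0)
    return 100 * cost_now / cost_then

print(round(laspeyres_index(base_quantities, base_prices, new_prices), 1))  # 114.8
```

A constant-utility index instead asks how much spending would have to change to leave the household just as well off; because households can substitute away from the goods whose prices rise fastest, it will generally come in at or below the fixed-basket figure.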

By the time of the Boskin Commission, this had been resolved decisively in favour of a constant-utility index: the minimum change in expenditure needed to keep utility unchanged. (Robert Gordon has since said the Commission understated the overstatement of inflation.) This made accounting for quality change and new goods a pressing issue. Many economists started to agree that the statisticians had not adequately accounted for these in their price indices. Economists including Robert Gordon and Zvi Griliches focused on this question, with Griliches developing the hedonics approach.

Stapleford writes: “If economists were to claim that their discipline had any claim to neutral technical knowledge, surely that claim required them to have neutral apolitical facts – namely economic statistics. … A constant-utility index was surely the proper form for a cost-of-living index, but the idea that one could compare ‘welfare’ in two different contexts [e.g. two separate time periods or two countries] without introducing subjective (and probably normative) judgments seemed implausible at best.” Yet applying price indices to macroeconomic analysis of growth or productivity rather than labour disputes helped depoliticise them. And hedonics tackled the problem by redefining goods as bundles of characteristics. As the book notes, governments became keen on their statisticians applying hedonics from the mid-90s, when they realised that it implied very rapid declines in some prices and hence higher productivity growth. (And the ‘accuracy’ of price indices is in question again now because of the ‘productivity puzzle’.)
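In essence, hedonics regresses observed prices on measurable characteristics, so that quality improvement is stripped out of the recorded price change. A toy sketch of the idea, with invented laptop data (nothing here is from the book):

```python
import numpy as np

# Toy hedonic regression: log price explained by product characteristics.
# The data are invented purely for illustration.
# Columns: constant, memory (GB), CPU speed (GHz), period dummy (1 = later period)
X = np.array([
    [1,  4, 2.0, 0],
    [1,  8, 2.5, 0],
    [1,  8, 2.0, 0],
    [1,  8, 2.5, 1],
    [1, 16, 3.0, 1],
    [1, 16, 2.5, 1],
])
log_price = np.log([400.0, 600.0, 520.0, 550.0, 760.0, 700.0])

coef, *_ = np.linalg.lstsq(X, log_price, rcond=None)
# The period-dummy coefficient is the log price change holding the listed characteristics constant.
print("quality-adjusted log price change:", coef[3])
```

If characteristics are improving quickly, the quality-adjusted change picked up by the period dummy can be sharply negative even while sticker prices barely move – which is exactly why applying hedonics to computing equipment implied rapid price declines and hence faster measured productivity growth.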

But this is an uncomfortable resolution. Although this elegant solution links national income statistics to neoclassical utility theory, there seems to be a category mismatch between a set of accounts measuring total production and the idea that value depends on utility. Setting aside the fact that hedonic methods are not applied to a large proportion of consumer expenditure anyway, this piece of statistical welding is coming under huge strain now because the structure of production in the leading economies is being transformed by digital technology.

One of the many consequences of the Brexit vote and Trumpery is that economists (and others) are again thinking about distribution. The issue is usually framed as the distribution of growth – when it is there, who gets it? I think the question raised for our economic statistics is far more fundamental: we need to recognise the normative judgements involved in the construction of the growth statistics to begin with. Actually existing macroeconomic statistics embed a tacit set of assumptions about welfare, and a production structure which is becoming redundant. But that of course is my favourite theme.


Books at Kilkenomics

The fabulous Kilkenomics festival of economics and comedy took place over the weekend just gone. As ever, there were plenty of authors on parade. Tim Harford spoke about Messy (which I reviewed here), Deirdre McCloskey about her latest, Bourgeois Equality (here). Jim Rickards’ session on his book The Road to Ruin, which I haven’t yet read, had rave reviews. Dan Ariely (of Predictably Irrational and Payoff) did a masterclass in behavioural economics. Rutger Bregman was there talking about his book on basic income, Utopia for Realists.

Best of all, however, was my discovery in the excellent local bookshop The Book Centre of Then There Was Light: Stories Powered by the Rural Electrification Scheme in Ireland, edited by PJ Cunningham and Joe Kearney. I know, I know. But electricity is a fascinating technology because (a) it is so hard to get the economic incentives right, hence frequent blackouts in so many countries, and (b) it is so closely entwined with social change. I’ll report back, but I’m thrilled by this collection.

 

Algorithms and (in)justice

It’s been one of those weeks. One of those years, actually – David Bowie *and* Leonard Cohen. Listening to ‘Democracy’ as I write this.

Still, I have managed to read Cathy O’Neil’s excellent Weapons of Math Destruction, about the devastation algorithms in the hands of the powerful are wreaking on the social fabric. “Big data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide. Sometimes that will mean putting fairness ahead of profit.”

The book’s chapters explore different contexts in which algorithms are crunching big data, sucked out of all of our recorded behaviours, to take the human judgement out of decision-taking, whether that’s employing people, insuring them or giving them a loan, sentencing them in court (America, friends), ranking universities and colleges, ranking and firing teachers… in fact, the scope of algorithmic power is increasing rapidly. The problems boil down to two very fundamental points.

One is that often the data on a particular behaviour or characteristic is not observed, or unobservable – dedication to work, say, or trustworthiness. So proxies have to be used. Past health records? Postcode? But this encodes unfairness against individuals – those who are reliable even though they live on a bad estate – and does so automatically, with no transparency and no redress.

The other is that there is a self-reinforcing dynamic in the use of algorithms. Take the example of the US News college ranking. Students will aim to get into those with a high ranking, so the colleges have to do more of whatever it takes to get a high ranking, and that will bring them more students, and more chance of improving their ranking. Too bad that the ranking depends on specific numbers: SAT scores of incoming freshmen, graduation rates and so on. These seemed perfectly sensible, but when the rankings they feed into are the only thing that potential students look at, institutions cheat and game to improve these metrics. This is the familiar adverse effect of target-setting, on crystal meth. Destructive feedback loops are inevitable, O’Neil points out, whenever numerical proxies are used for the criteria of interest, and the algorithm is a black box with no humans intervening in the feedback loops.
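A crude simulation conveys the loop; the setup and all the numbers are invented, not O'Neil's.

```python
import random

random.seed(1)

# Invented illustration of a proxy-driven feedback loop: a ranking rewards a
# noisy, gameable proxy, and a high rank brings resources (applicants, funding,
# scope for gaming) that inflate next year's proxy, so early positions tend to
# harden regardless of underlying quality.
quality = [random.uniform(0, 1) for _ in range(5)]  # true, unobserved quality (never changes)
boost = [0.0] * 5                                    # resources earned from last year's rank

for year in range(10):
    proxy = [0.3 * quality[i] + 0.6 * boost[i] + 0.1 * random.random() for i in range(5)]
    order = sorted(range(5), key=lambda i: proxy[i], reverse=True)
    for rank, i in enumerate(order):
        boost[i] = 1.0 - rank / 4                    # high rank -> more resources next year
    print("year", year, "ranking by proxy:", order)

print("ranking by true quality:", sorted(range(5), key=lambda i: quality[i], reverse=True))
```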

The book is particularly strong on the way apparently objective scoring systems are embedding social and economic disadvantage. When the police look at big data to decide which areas to police more harshly, the evidence of past arrests takes them to poor areas. A self-reinforcing feedback loop: they are there more, they arrest more people for minor misdemeanours, and the data confirm the area as more crime-ridden. “We criminalize poverty, believing all the while that our tools are not only scientific but fair.” Credit scoring algorithms, those evaluating teachers using inadequate underlying models, ad sales targeting the vulnerable – the world of big data and algos is devastating the lives of people on low incomes. Life has always been unfair. It is now unfair at lightning speed and wearing a cloak of spurious scientific accuracy.

O’Neil argues that legal restraints are needed on the use of algorithmic decision-making by both government agencies and the private sector. The market will not be able to end this arms race, or even want to, as it is profitable.

This is a question of justice, she argues. The book is vague on specifics, calling for transparency about what goes into the black boxes, and a regulatory system. I don’t know how that might work. I do know that until we get effective regulation, those using big data – especially the titans like Facebook and Google – have a special responsibility to consider the consequences.


Bank of England independence from Nigel Lawson

This is a bit late, but it has been in the back of my mind this week that Lord Lawson used to be keen on isolating central banks from political pressure. I remembered right. He made the case to a sceptical Margaret Thatcher, arguing it would give UK monetary policy the anchor needed if the country was not to commit to an external anchor such as ERM membership. Here is Nigel Lawson in 1992, in his autobiography The View from Number 11:

[scanned extract from The View from Number 11]

Nigel Lawson in 2016, about Mark Carney – a change of tune about the propriety of political pressure on an independent central bank, then:

[image]

The trade-investment-service-intellectual property nexus

I’ve managed to resist reviewing Richard Baldwin’s new book The Great Convergence: Information Technology and the New Globalization until now, and it has taken serious self-restraint as the book is so relevant to (among other things) the Brexit debate. I would for one thing force every Cabinet member to read it and not allow them to keep their jobs unless they could pass an exam based on it. Anyway, the book is published on 14th November, and now that it’s November my self-denying ordinance can end.

The Great Convergence offers a compelling framework for thinking about how trade is organized and why, and how it benefits whom. The first part is a historical overview of trade leading up to the first globalization – the Old Globalization of the 19th century. This phenomenon, driven by steam power reducing trading costs, industrialization and a context of relative global peace, led to the Great Divergence: the major economies of Asia, which had been richer than the West, fell behind, dramatically so over the course of two centuries. The New Globalization, since the 1980s, driven by the new information and communication technologies, has taken the rich countries’ share of global output back to its 1914 level in little over two decades. China is the standout story, going from uncompetitive in 1970 to the world’s second biggest economy by 2010, but other rapidly industrializing nations in the New Globalization are Korea, India, Poland, Indonesia and Thailand (i.e. a different group from the notorious BRICs).

However, as the book goes on to document, the New Globalization is of a completely different kind. Trade over distance has three costs: the costs of moving goods, ideas and people. When moving goods got cheap, the first explosion of trade occurred, but ideas were costly to move, so the innovations of the industrial revolution were not easily exported. The Old Globalization was the result of low shipping costs and high communication costs. ICTs have reduced the latter significantly, so industrial competitiveness is now defined in terms of production networks – interlinked supply chains that cross national borders. Knowledge has been offshored, and the rapid growth in a few previously poorer countries has come about because of their geographical location, close enough to G7 industrial centres that managers can travel there, sharing knowledge within the confines of the production network.

This means the New Globalization happens at the level of stages of production and occupations. This makes it harder to predict who will be affected – which jobs will be offshored, which areas hit hardest. “Nations are no longer the only natural unit of analysis”. Much of the book describes a new data set making it possible for economists to begin to explore the ‘value added’ pattern of trade created by the switch from trading finished goods toward trading components in global production chains. The picture is going to be utterly different – the famous example being the iPhone, which is conventionally recorded as a Chinese export to the US, but where the value added is concentrated in the American business; the Chinese import a lot of the components they assemble and re-export, adding not much value at that stage.
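A stylised bit of arithmetic makes the iPhone point; the numbers are invented for illustration, not taken from the book.

```python
# Invented numbers showing why gross exports overstate where value is created.
gross_export_from_china = 200.0  # assembled phone, recorded as a Chinese export to the US
imported_components = 150.0      # parts China imports before assembly and re-export

value_added_in_china = gross_export_from_china - imported_components
print("gross export recorded for China:", gross_export_from_china)  # 200.0
print("value added in China:", value_added_in_china)                # 50.0
```

On a gross basis China is credited with the full 200; on a value-added basis only the 50 added at the assembly stage, with the rest attributed to wherever the components and the design were produced.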

This is one insight the Brexiteers need to appreciate, although the Nissan letter suggests at least some members of the government realise the significance. British businesses are woven into supply chains with our near neighbours: we aren’t importing prosecco and salami so much as gearboxes. Brexit threatens to tear apart these links. If the cost appears to be too high, the multinationals at the head of the supply chains will relocate chunks of their production networks, and won’t care if they’re exporting gearboxes to the Czech Republic rather than Britain.

The book adds: “Twenty-first century supply chains involve the whole trade-investment-service-intellectual property nexus, since bringing high quality, competitively priced goods to customers in a timely manner requires international coordination of production facilities via the continuous two-way flow of goods, people, ideas and investments. Threats to any of these flows become barriers to global value chain participation…” Baldwin adds that the movement of people is still a binding constraint on globalization, and face-to-face communication – and so distance – remain important. He argues that the improving quality of telepresence is changing this, but I think that remains to be seen.

Ultimately, trade policy today is not just about trade nor about nations. It involves deploying the nation’s productive resources through overseas connections. This is why 90% of the economics profession thought, and thinks, Brexit so damaging, and the idea that the UK has more economic self-determination outside the EU a delusion. The Great Convergence is not about Brexit – it ranges far wider. I can’t imagine a better and more accessible analysis of trade and globalization in the digital era.


(20 November: minor typos corrected)