Hope amid the gloom?

Anybody in the UK who is feeling gloomy about the General Election result could deepen their gloom by reading the new edition of Danny Dorling’s [amazon_link id=”1447320751″ target=”_blank” ]Injustice: why social inequality still persists[/amazon_link]. The book has quite an upbeat conclusion: “Slowly, collectively, with one step back for every two taken forward, we inch onwards to progress; we gradually undo the mistakes of the past, and recognise new forms of injustice arising out of what we once thought were solutions. … Everything it takes to defeat injustice lies in the mind. What matters most is how we think. And how we think is metamorphosing because – everywhere – there are signs of hope.”

[amazon_image id=”1447320751″ link=”true” target=”_blank” size=”medium” ]Injustice: Why Social Inequality Still Persists[/amazon_image]

Clearly this new and greatly revised edition of the book was written before the election. What’s more, everything that is new in it ought to make a campaigner for social justice even gloomier. The indicators collected in the book have at best improved only a little since the financial crisis – household indebtedness, income inequality, for instance. Perhaps even more noteworthy is the apparent absence of any change in the zeitgeist, or public philosophy. When the first edition was published in 2010, many commentators thought the scale of the crisis would lead to a significant swing in public opinion away from Big Finance, markets and greed. With the exception of popular disapproval of unseemly bonuses in the socially destructive banking industry, that doesn’t seem to have happened.

Or has it? Dorling presents some evidence – Janet Yellen calling extreme inequality ‘un-American’, international efforts to collect unpaid taxes from rich globocrats, popular dislike of elites. It’s not much, perhaps. One could add a few more examples, like today’s call from the OECD for tech companies to stop their “aggressive tax planning”. Is this enough to justify a belief that the social and economic order will change? Dorling writes, optimistically in the circumstances: “No-one can truly know what will be sufficient to change deeply held and institutionally transmitted beliefs.”

In a terrific book, [amazon_link id=”0691151571″ target=”_blank” ]Masters of the Universe[/amazon_link], Daniel Stedman Jones described the long process of organising, campaigning and debating by many people that paved the way for the triumph, from 1980, of the individualist ‘free market’ philosophy that is still our official public philosophy. Who knows how long it will take, or how many people who dislike it, to pave the way for an alternative – but when a system change of this kind happens, it happens relatively quickly (over a decade at most) and is then dramatic.

[amazon_image id=”0691151571″ link=”true” target=”_blank” size=”medium” ]Masters of the Universe: Hayek, Friedman, and the Birth of Neoliberal Politics[/amazon_image]

[amazon_link id=”1447320751″ target=”_blank” ]Injustice[/amazon_link] replaces Beveridge’s original Five Giants with new ones: elitism, social exclusion, prejudice, greed and despair. The book links these in a vicious circle of social and economic inequality – which is exactly why, Dorling argues, single policy measures are drops in the ocean and a broader change is needed; he collects all the evidence one could want to support that conclusion. I wouldn’t want to predict whether we’ll get it or not; that will depend on what we all do, and think, next.

The El Farol problem goes digital

I saw this tweet and it immediately reminded me of Brian Arthur’s El Farol equilibrium story.

AdamRFisher
Please develop an app that simulates Waze recommendations to assess which routes will open up based on everybody else follow Waze.
13/05/2015 07:12

For those who don’t know it, El Farol is a bar in Santa Fe, where Arthur is based. It has great Irish music, but the problem is that it’s no fun if the bar gets too crowded. What is the outcome in this situation, where people are trying to choose based on beliefs about what others will choose? There is no ‘deductively rational’ solution to this choice problem. In his paper, Arthur writes: “If all believe few will go, all will go. But this would invalidate that belief. Similarly, if all believe most will go, nobody will go, invalidating that belief. Expectations will be forced to differ.” Repeatedly simulating the behaviour of 100 agents shows that mean attendance at El Farol quickly converges to 60, but with large swings from period to period and a changing cast of attendees – it isn’t always the same 60 people with a varying number of extras. As Arthur puts it: “while the population of active predictors splits into this 60/40 average ratio, it keeps changing in membership forever. This is something like a forest whose contours do not change, but whose individual trees do. These results appear throughout the experiments, robust to changes in types of predictors created and in numbers assigned.”
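For anyone who wants to play with the idea, here is a minimal sketch in Python of an Arthur-style simulation: 100 agents, each holding a handful of simple predictors of next week’s attendance, trusting whichever predictor has been most accurate so far and going only if it forecasts a crowd below the threshold. The particular predictor rules and parameters are my own illustrative choices, not the exact specification in Arthur’s paper.

```python
import random

N_AGENTS, THRESHOLD, WEEKS, MEMORY = 100, 60, 200, 10

def make_predictors():
    """A small random pool of simple rules mapping the attendance history to a forecast."""
    return [
        lambda h, k=random.randint(1, 5): h[-k],                   # same as k weeks ago
        lambda h, k=random.randint(2, 8): sum(h[-k:]) / k,         # k-week moving average
        lambda h, f=random.uniform(0.6, 1.4): min(99, h[-1] * f),  # scaled version of last week
    ]

history = [random.randint(0, 100) for _ in range(MEMORY)]   # seed history
agents = [make_predictors() for _ in range(N_AGENTS)]
errors = [[0.0] * 3 for _ in range(N_AGENTS)]                # cumulative forecast error per predictor

for week in range(WEEKS):
    # each agent uses its currently most accurate predictor and goes if it forecasts no crowd
    forecasts = [[p(history) for p in preds] for preds in agents]
    decisions = [fs[min(range(3), key=lambda j: errors[i][j])] <= THRESHOLD
                 for i, fs in enumerate(forecasts)]
    attendance = sum(decisions)
    # score every predictor against what actually happened
    for i, fs in enumerate(forecasts):
        for j, f in enumerate(fs):
            errors[i][j] += abs(f - attendance)
    history.append(attendance)

last = history[-100:]
print("mean attendance over the last 100 weeks:", sum(last) / len(last))
```

The point of running something like this is to see the pattern Arthur reports: attendance hovering around the threshold of 60 while the set of people who actually turn up keeps churning.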

The Waze problem looks similar. If you see congestion on your planned route, you’ll switch to an alternative – or perhaps not, because you also have to predict how many other people will switch, and whether the initial congestion will persist or vanish if there are enough other users of similar apps.

Later work on El Farol found one game-theoretic solution: individual agents adopt a mixed strategy, whereby each has a fixed probability of choosing either El Farol or an alternative bar. Another is that, after a period of learning, agents sort themselves into groups: those who always go and those who always stay home. But I don’t think these work for Waze-style congestion problems, which are not repeated.
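For the mixed-strategy version, the logic is an indifference condition: if every one of the N agents goes with the same probability p, then in equilibrium the chance that the other N−1 agents leave room at the bar must exactly offset the attraction of staying home. A minimal sketch, assuming illustrative payoffs of my own choosing (the bar is worth 1 if uncrowded, 0 if crowded, staying home is worth 0.5):

```python
from math import comb

N, T = 100, 60          # number of agents and the crowding threshold
U_STAY = 0.5            # assumed payoff of staying home; the bar pays 1 if uncrowded, 0 if crowded

def prob_room(p, n=N - 1, t=T - 1):
    """Probability that at most t of the other n agents turn up, if each goes with probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1))

# prob_room falls as p rises, so bisect on the indifference condition prob_room(p) == U_STAY
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if prob_room(mid) > U_STAY:   # going is too attractive, so p must be higher
        lo = mid
    else:
        hi = mid

p = (lo + hi) / 2
print(f"symmetric mixed strategy: go with probability {p:.3f}")
print(f"implied expected attendance: {N * p:.1f}")
```

With these particular payoff numbers the probability comes out close to 0.6, so expected attendance is again about 60 – consistent with the threshold result, though this kind of reasoning assumes a repeated, stationary game of exactly the sort a one-off commute is not.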

Indeed, real-time apps are surely creating more of these kinds of co-ordinated decision problems.

Brian Arthur’s book [amazon_link id=”B00SLUR9HI” target=”_blank” ]Complexity and the Economy[/amazon_link] is a nice introduction to his work.

[amazon_image id=”B00SLUR9HI” link=”true” target=”_blank” size=”medium” ]Complexity and the Economy[/amazon_image]

“Facts alone are wanted in life”

One of the great inventions of the Enlightenment and capitalism – perhaps one of the lesser-known ones – was statistics. The accumulation of facts, represented by numbers, was taken as a mark of progress, along with the presumption that an aggregate number such as a mean could be used to study something inherently variable, including the behaviour of individuals in society. This is the argument of Theodore Porter’s 1986 book [amazon_link id=”069102409X” target=”_blank” ]The Rise of Statistical Thinking, 1820-1900[/amazon_link]. Yes, I’ve been sucked into the history of statistics.

[amazon_image id=”069102409X” link=”true” target=”_blank” size=”medium” ]The Rise of Statistical Thinking, 1820-1900[/amazon_image]

Porter writes: “The pre-numerate age was not entirely deprived of statistical tables, but the great explosion of numbers that made the term ‘statistics’ indispensable occurred during the 1820s and 1830s. The demands it placed on people to classify things so that they could be counted and placed in an appropriate box in some official table, and more generally on the character of the information people need to possess before they feel they understand something, are of the greatest importance.” (p11)

The collection of social statistics was also a tool in the centralisation and bureaucratisation of government. Early ‘statists’ hoped to bypass traditional authorities such as church and monarch; but the effect of collecting orderly data on society was to consolidate state power, the book argues. (It is only now that we can think about the potential for citizen statistics.) However, the enthusiasm for statistics was championed by pragmatic reformers, who “believed that the confusion of politics could be replaced by an orderly reign of facts.” (p27) This is still the dream of technocrats, and it is still disappointed by every election campaign.

In 19th century Britain, statistical enthusiasm took shape in private societies, principally the Statistical Societies of London (forerunner of the Royal Statistical Society, and with Malthus and Babbage among its founders) and Manchester (still thriving). The book draws an interesting connection between the emerging idea of social laws, statistical regularities unaffected by individual choices, and laissez faire liberalism, which reached its apogee in the 1850s. Government was seen as a hindrance to ‘natural’ social progress, obstructing the course of history toward prosperity and freedom. Interestingly, the idea of statistical regularities in physics, such as James Clerk Maxwell’s work on gases, was borrowed from the observation of social regularities.

And the opponents of statistics (many of them French positivists such as Comte) rejected the key novelty of statistical thinking, the idea that individual unpredictability would cancel out: “Any social science that views the differences among individuals as random, they argued, is irremediably flawed. …One must analyze carefully in order to establish causes and recognize their heterogeneous effects on different parts of the population.” (p152) There were medical opponents too, who said statistical generalizations were useless because they said nothing about the individual patient – something anybody presented with a diagnosis and a population frequency will identify with.

Fascinating. [amazon_link id=”0199536279″ target=”_blank” ]Mr Gradgrind’s[/amazon_link] insistence on Facts (“Now, what I want is, Facts. Teach these boys and girls nothing but Facts. Facts alone are wanted in life. Plant nothing else, and root out everything else. You can only form the minds of reasoning animals upon Facts: nothing else will ever be of any service to them.”) is both political and performative, and not boring at all.

[amazon_image id=”150567817X” link=”true” target=”_blank” size=”medium” ]Hard Times[/amazon_image]

Recent robot round-up

I’m looking forward to reading Martin Ford’s [amazon_link id=”0465059996″ target=”_blank” ]The Rise of the Robots[/amazon_link] – it gets a good review in the FT today. Edward Luce calls it “well researched and disturbingly persuasive.”

[amazon_image id=”0465059996″ link=”true” target=”_blank” size=”medium” ]Rise of the Robots: Technology and the Threat of a Jobless Future[/amazon_image]

I’m still a robo-sceptic in the sense of thinking there is nothing inevitable about the employment and income distribution outcomes of skill-biased automation. It’s technological determinism to think otherwise, as the underlying technological waves are channelled through economic and political institutions. That’s not to say we shouldn’t be concerned. After all, there was a wave of automation in manufacturing in the late 1970s/early 1980s and the social consequences of that were devastating – the institutions handled the transition very badly.

There is an interesting recent (free) e-book collection of essays (including one of mine) from the IPPR, Technology, Globalization and the Future of Work. See also a recent paper, Robots at Work, by Georg Graetz and Guy Michaels. Using a panel of industry-level data across 17 countries, they find that robotization increased total factor productivity and wages, although with some adverse effects on the hours worked by low-skilled workers.

The largeness of small errors

I enjoyed Oskar Morgenstern’s trenchant observations about the (in)accuracy of economic statistics in [amazon_link id=”0691003513″ target=”_blank” ]On The Accuracy of Economic Observations[/amazon_link]. Here are a few more examples:

[amazon_image id=”0691003513″ link=”true” target=”_blank” size=”medium” ]On the Accuracy of Economic Observations[/amazon_image]

“The idea that as complex a phenomenon as the change in a ‘price level’, itself a heroic theoretical abstraction, could at present be measured to such a degree of accuracy [a tenth of one percent] is simply absurd.”

“It behooves us to pause in order to see what even a 5 percent difference in national income means. Taking the US and assuming a Gross National Product of about 550 billion dollars, this error equals + or – 30 billion dollars. This is more than twice the best annual sales of General Motors…. It is far more than the total annual production of the entire electronics industry in the United States.”

(Updating and relocating this exercise, a 5% error in the £1.7 trillion GDP of the UK would be around £85 billion – almost the same size as the entire UK motor industry including its supply chain, more than the total profits of the financial services sector, or about the same as households spend in total on food and drink.)

The errors don’t get the attention they deserve, Morgenstern writes: “Instead, in Great Britain as in the United States and elsewhere, national income statistics are still being taken at their face value and interpreted as if their accuracy compared favourably with that of the measurement of the speed of light.” And he points out that arithmetically, when you are looking at growth rates of figures each measured with some error, even proportionately small errors in the levels turn into large errors in the rate of change. He gives an arithmetical example, of a ‘true’ growth rate of 1.8% being measured as somewhere between -7.9% and +12.5% for measurement errors of up to 5% in the two levels.
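His arithmetic is easy to reproduce: with the true level growing by 1.8% and each of the two levels measured with a proportional error of up to ±5%, the worst cases for the measured growth rate are (1.018 × 0.95/1.05) − 1 ≈ −7.9% and (1.018 × 1.05/0.95) − 1 ≈ +12.5%. A quick check in Python (the function name is mine):

```python
def growth_bounds(true_growth, level_error):
    """Worst-case measured growth rates when both levels carry a proportional error of up to ±level_error."""
    low = (1 + true_growth) * (1 - level_error) / (1 + level_error) - 1
    high = (1 + true_growth) * (1 + level_error) / (1 - level_error) - 1
    return low, high

low, high = growth_bounds(true_growth=0.018, level_error=0.05)
print(f"true growth of 1.8% could show up as anything from {low:.1%} to {high:.1%}")
# prints roughly -7.9% to +12.5%, matching Morgenstern's example
```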

It’s interesting that every economist and statistician would acknowledge the errors problem and yet virtually all ignore it. We’ve invested so much in these figures that to admit great uncertainty would undermine their totemic value and the ritual pronouncements about them. At a talk I did at IFN in Stockholm yesterday about [amazon_link id=”0691169853″ target=”_blank” ]GDP[/amazon_link], one of the respondents, Karolina Ekholm, State Secretary at the Ministry of Finance, said it made her uneasy that key policy decisions such as cutting government spending depended so much on the output gap – the difference between two imaginary and uncertain numbers. Of course we have to try to measure, and how marvellous it would be if the official statisticians got some extra resources to improve the accuracy of the raw data collection – and yet I think she’s right to be uneasy.

Next on my reading pile: [amazon_link id=”B00SLUQ5HS” target=”_blank” ]The Politics of Large Numbers: A History of Statistical Reasoning[/amazon_link] by Alain Desrosières and [amazon_link id=”069102409X” target=”_blank” ]The Rise of Statistical Thinking 1820-1900[/amazon_link] by Theodore Porter.

[amazon_image id=”B00SLUQ5HS” link=”true” target=”_blank” size=”medium” ]The Politics of Large Numbers: A History of Statistical Reasoning[/amazon_image]  [amazon_image id=”069102409X” link=”true” target=”_blank” size=”medium” ]The Rise of Statistical Thinking, 1820-1900[/amazon_image]