Pricing progress – yes please

I pounced on Eli Cook’s new book, The Pricing of Progress: Economic Indicators and the Capitalization of American Life. The author is an historian, and I enjoyed reading the historical detail, which traces the evolution of economic measurement of the US economy from Alexander Hamilton on. The early chapters set up a contrast between the use of ‘moral statistics’ – essentially detailed social statistics – in public discourse and policy in the earlier part of the period and the forerunners of the economic statistics we are used to today. Hamilton was an outlier in his day, the book argues, in seeking to price everything. It was not until the late 19th or early 20th century that the commercial mindset predominated. And Cook – like many modern critics of capitalism from Polanyi on – regrets that shift.

This framework means the book sees more continuity than I (and others) would between pre-World War 2 statistics and modern ones. Cook’s argument about that continuing essence is this: “One of the key elements that distinguishes capitalism from previous forms of cultural and social organization is capital investment, the act through which basic elements of society and life – including natural resources, technological discoveries, cultural productions, urban spaces, educational institutions, human beings and the fiscal nation state – are transformed (or ‘capitalized’) into income-generating assets valued and allocated in accordance with their capacity to make money and yield profitable returns.”

It seems to me there are two separate arguments here. One is about the spread of money as a metric. Concerning the ‘moral statistics’ of the mid-19th century, Cook writes: “Moral statistics did not measure social welfare in units of money, as the American people’s general disdain of the pricing process held strong through the 1840s.” Indeed, he notes an ‘explosion’ in the use of the word ‘priceless’ in the 1830s and 1840s – although this was a sign, perhaps, of this approach coming under pressure. The book portrays the 1850s as “a watershed decade for the pricing of progress”. This is a well-aired debate, to which the economist’s response has always been that it is impossible to weigh up trade-offs without measuring in common units, and money is at least as good as any other. Anyway, this dislike of money as the metric of value of culture, natural resources etc will resonate with many readers.

The point about regarding all of these things as income-generating assets is a distinct one. The book starts by defining this – just like the theory of a company’s market cap – as the net present value of the stream of future earnings. The first example is the shift (in 17th and 18th century England) from seeing land as a forum governed by traditional relationships to the enclosure of land and its valuation estimated as a multiple of expected rents. The book sometimes uses ‘capitalization’ with a different meaning – often, just ‘aggregated’. It does acknowledge right at the end that modern (system of national accounts based) economic statistics are different from predecessor statistical frameworks such as those of William Petty or Thomas Jefferson: “There is one important difference between GDP and some of the previous forms of capitalization documented in this book.” That is, of course, that GDP pays no attention to asset values at all. All that matters is the current flow of resources, no matter what the inter-temporal trade-offs or depreciation.
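The capitalization logic here is just familiar present-value arithmetic: an asset is worth the discounted stream of income it is expected to yield. A minimal sketch, with my own illustrative numbers rather than anything from the book:

```python
def capitalized_value(annual_rent, discount_rate):
    """Value an asset as the net present value of a perpetual
    income stream at a constant discount rate: rent / r."""
    return annual_rent / discount_rate

# Land expected to yield 50 a year, discounted at 5%, capitalizes
# to a value of 20 times the annual rent.
print(capitalized_value(50, 0.05))
```

For a perpetual stream the sum of discounted rents collapses to rent divided by the rate, which is where valuation as a ‘multiple of expected rents’ comes from: a 5% discount rate implies a multiple of 20.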

This seems at least as important a watershed as the transition to valuation based on market prices: if only we had capitalized natural resources then we might not be in the current dangerous environmental situation! It is true that in microeconomics the ‘capital’ metaphor has persisted and spread – we have human capital, cultural capital, social capital etc. Some of these are more persuasive than others – natural capital is as real as physical capital. I find the concept of human capital and investment in one’s capabilities a useful one, though Cook disparages it. And – being an economist – I would argue we should be doing more pricing of assets we ‘value’ (in the normal everyday sense) in order to take more care of the future than has been the case for the past 70 years or more. The more we ‘capitalize’ the future benefits nature will give us, by looking at the value of tomorrow now, the better we will look after the assets.

So although there is a lot to enjoy in The Pricing of Progress, the elision of monetization and capitalization is confusing and frustrating. It lasts up to the final page, where Cook criticizes Donald Trump for comparing running America to running a business. I agree – but would not describe it remotely as a “capitalizing vision for America,” as Cook describes it. On the contrary, Mr Trump seems to have no concern at all for America’s assets, as he and his family extract as much value short term as they can. Read this book for its insights into the growth of a commercial mindset in 19th and early 20th century America, including the role of slavery, but I don’t think it adds a lot to the current debate on economic measurement.


Competition, competition, competition

Grazing along my bookshelf this morning, postponing getting to work, I found ‘Industrial Concentration’ by M.A. Utton in the Penguin Modern Economics series – 1960s/70s paperbacks for the people providing overviews of different fields in the subject. This one was published in 1970 and it’s fascinating as a window on the historical evolution of competition policy.

One distinction it draws, certainly no longer valid, is between tough American anti-trust policy with a legacy dating back to the Sherman Act and relatively weak and new British competition policy based on 1948 legislation under the Monopolies Commission, which Utton describes as always willing to accept ‘public interest’ arguments for allowing mergers of big companies. American policy was far more willing to tackle the structure of an industry, he argues.

Hence UK business had become far more concentrated in the 1950s and 60s, although with effects mitigated by greater openness to foreign competition via trade than the relatively closed US economy. At the time of writing, the newish (1966) Industrial Reorganization Corporation (IRC) in the UK was busy promoting still more mega-mergers to create ‘national champions’, with the companies involved given a nod and a wink to say they would not be referred to the Monopolies Commission.

Interestingly, a recent Yale Law Journal article by Lina Khan argues for a return from the Chicago School emphasis on consumer welfare as measured by current prices to Sherman Act-inspired interventions in market structure, in the context of the digital giants. But it isn’t just the digital sector; there’s pretty convincing evidence of increasing concentration across the US economy, as The Economist recently summarised.

The UK’s history of competition policy has been brighter recently thanks to the formation of the independent Competition Commission and now Competition and Markets Authority (with its excellent economists, including my son). There have been blips – notably the very bad decision during the financial crisis to make finance a sector exempt from the usual competition rules, in order to allow the Lloyds-HBoS merger. Still, the independence of the watchdog and the removal for the most part of vague ‘public interest’ considerations has been beneficial. However, vigilance is needed.

It isn’t only the challenge of ensuring the giant digital companies, with their giant network effects and economies of scale, continue to deliver for social rather than just private gain. The EU’s State Aid regime has been a massively important backdrop to domestic policy. If the Brexit train wreck continues, it will be essential to carry the regime over into domestic policy.

This is all the more important in the context of both the likely negative impact of Brexit on key sectors – there will be queues of badly affected businesses asking for special help or dispensations – and the aim of having a more strategic approach to economic policy, an industrial strategy. Nobody (in theory) wants a return to the ‘picking winners’ (ie losers) days of the IRC.

A really tough competition policy is the best way to avert this. It needs to include not just State Aid rules but also a rethink about the weak sector regulators in network sectors like water and telecoms. This is why we on the Industrial Strategy Commission have been putting so much emphasis on competition policy. (I note the Amazon price for this book is algorithmically weird – original cover price was 40p.)


Founder of the information age

For reasons linked to book and bag size, and journey modes and lengths, I’ve been reading three books at once – Jennifer Homans’ Apollo’s Angels (a monumental history of ballet, non-portable), a biography of Claude Shannon, and Daniel Dennett’s Intuition Pumps and Other Tools for Thinking.

I’ve finished the middle one now, A Mind At Play: How Claude Shannon Invented the Information Age by Jimmy Soni and Rob Goodman, and thoroughly enjoyed it. When I described it to some friends at the weekend, to my surprise they turned out never to have heard of Claude Shannon. Those of us with interests in digital know of him as the author of a profoundly important paper launching information theory. It seems this is the first full biography, and it starts from his childhood in a small town in the Midwest via wartime service in cryptography and then a long stint at Bell Labs to MIT. (He met Alan Turing during the war but both were doing work so secret they didn’t dare talk to each other about it.)

I’m not sure I’d seen a photo of Shannon before and he looks like a blend of Samuel Beckett and Albert Camus. He seems to have been rather reclusive, whimsical – rider of unicycles, keen juggler, creator of gadgets such as a machine to turn itself off, and so on. His most famous creation – in that it brought him to public attention – was a mechanical maze-solving mouse (named Theseus) that would learn to find a piece of (metallic) cheese. (Although, Shannon explained, the maze solved the mouse rather than the other way round. The information was in the maze, and it and the mouse formed a system.)

After 1948 when his paper was published, Shannon was a celebrity and much in demand for lectures. The book explains that he had few graduate students because most were too much in awe of him to dare ask him to supervise their work. Shannon’s paper (later a book), A Mathematical Theory of Communication, picked up the idea that information is a meaningfully quantifiable entity, and defined communication as the reproduction of messages, transmitted as a signal, subject to noise, to a receiver. Thus abstracted, all kinds of things could be interpreted as the communication of information. Importantly, Shannon introduced the role of uncertainty (information is a measure of uncertainty overcome), redundancy (uninformative but helps mitigate noise), and defined the bit, an amount of information that results from a choice between two equally likely options. A message is the elimination of all irrelevant signals from the available pool. Without Shannon’s paper, the modern era would not exist.
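Shannon’s definition of the bit can be made concrete in a few lines of code. This is the standard entropy calculation with my own toy probabilities, not anything taken from the biography:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))
    over the probabilities of the possible messages."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A choice between two equally likely options carries exactly one bit.
print(entropy_bits([0.5, 0.5]))

# A biased choice is less uncertain, so its outcome carries less
# information - the intuition behind redundancy in a message.
print(entropy_bits([0.9, 0.1]))
```

The first call prints 1.0, the bit as Shannon defined it; the second comes out below one bit, because a near-certain outcome overcomes less uncertainty.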

The book does a good job of explaining the ideas in combination with rattling good storytelling about the life of someone who was clearly an extraordinary character. Shannon settled down at MIT into an enjoyable life of making gadgets, attending conferences and playing the stockmarket. He does deserve to be far better known and this biography is a great place to start.




Late night ethics

This book is somewhat off-topic for economists, but I’ve been dipping into the marvellous Ethics at 3am by Richard Marshall over the past couple of weeks. Marshall is the philosophy editor at 3am magazine (‘Whatever it is, we’re against it.’) and conducts enlightening interviews ranging across the whole territory of modern philosophy. This subject was the bit of my PPE degree that didn’t agree with me, and although I still don’t find it easy, it does interest me more and more.

Anyway, this is Marshall’s second thematic collection of interviews (the first was ‘Philosophy at 3am’), with an introduction that guides you through the issues and schools of thought. The most interesting section is the applied ethics one, ranging from Luciano Floridi on information, privacy, AI, the role of the nation state, and more in ‘The Philosophy of the Zettabyte’ to Rebecca Gordon on torture in ‘Saying No to Jack Bauer’. (And by the way, there is an impressive representation of women philosophers here, given how male dominated the subject is – as bad as economics. There are 11 women and 3 non-white scholars interviewed, out of a total of 26.)

My favourite bits of each chapter are: answers to the first question (usually ‘What made you become a philosopher?’*); and the book recommendations each subject is asked to give. As for the actual philosophy, some of these are a heavier read than others, and I have no confidence in my ability to understand, still less summarise them. As Marshall notes in the introduction, contemporary ethics is a broad landscape – but surely a field of philosophy most deserving of public exposure and debate. This book is a great introduction to the research frontier and well worth us non-philosophers trying to get our brains around. I hope he and OUP are now going to bring us an epistemology collection – Truth at 3am??

*”I was raised in a home where philosophy was a frequent topic for dinner conversation.” “People who are too sure of themselves.” “My impression of philosophy came from reading Plato, whose arguments I had found lame and fallacious.” “I was conscripted into military service and my gut feeling was to refuse to serve. I did not want to kill other people.” “I often think I would not have become a philosopher had my older brother not been killed one night in a car accident.” You get the idea.


Inequality, revisited

There have been a few essay collections recently responding to the splash created by Thomas Piketty’s Capital in the 21st Century.

I reviewed one, After Piketty, edited by Heather Boushey and Brad DeLong, for The Chronicle Review. It was generally sympathetic. Anti-Piketty, edited by Jean-Philippe Delsol, Nicolas Lecaussin and Emmanuel Martin, was generally not.

Recently I’ve been dipping into The Contradictions of Capital in the 21st Century: The Piketty Opportunity, edited by Pat Hudson and Keith Tribe. I think it’s fair to say there is a reasonable consensus among the contributors to these various collections that Piketty’s theorising is flawed (in particular depending on an empirically invalid assumption about the substitution between capital and labour), that his application of a theory about productive capital to the data including housing wealth and financial capital is troubling, and that his call for a global wealth tax is (as Avner Offer puts it in his essay in this book) ‘utopian’. Equally, pretty much all would agree that Piketty, with co-authors, has done a terrific service in putting together the database, and in getting inequality on the agenda of both economists and policymakers.

Pat Hudson labels that here as ‘the Piketty opportunity’. Her challenge to Capital in the 21st Century is its omission of globalisation and technology as drivers of inequality – if one is thinking about policies to mitigate inequality, r>g isn’t much help. She pinpoints financial markets, and regulatory and political beliefs, as key points of intervention – in other words, the specifics Piketty ignores in his generalisations. Avner Offer focuses on housing, and limiting its tendency to make wealth more unequal through credit controls.

The main sections of the book aim to particularise the analysis of inequality by looking at the trends, institutions and politics of different countries – one section on western economies, one on major economies elsewhere. As Luis Bértola points out here, Piketty is very Eurocentric.

Having these different national perspectives is a useful contribution to what is turning out to be quite an extensive new literature on inequality. While there was a substantial body of research on inequality before Piketty’s book was published in 2014, it was based on micro datasets, and focussed on individual questions such as benefit regimes or educational drivers of income. What we have now is a growing macro literature, and this includes global perspectives such as Branko Milanovic’s Global Inequality and François Bourguignon’s The Globalization of Inequality.

What’s needed now is for these two perspectives, the big picture and the individual outcomes, to get joined up. I think the late and much-missed Tony Atkinson came closest in his Inequality, and as a result had a set of eminently practical policies to propose.