Capital, statistics and stories

Two hundred pages into Piketty’s Capital in the 21st Century, I’ve found plenty of interest but am not yet as bowled over as some reviewers have been. Still, 400 pages to go. I’m not going to live-tweet the reading experience but will pick up on some interesting points as I go along before perhaps attempting an overall review.

A mere 58 pages in, I was calling out ‘hear, hear’ when I read this:

“One conclusion stands out in this brief history of national accounting: national accounts are a social construct in perpetual evolution. They always reflect the preoccupations of the era in which they were conceived. We should be careful not to make a fetish of the published figures.”

This is, needless to say, completely in harmony with my own view in GDP: A Brief But Affectionate History.

Too many economists pay no attention to the figures they are using, simply downloading time series from handy databases and telling stories around them. This chart showing ONS revisions to 1990s UK GDP data (or something similar) should be pinned above every economist’s desk as a reminder that the figures, although they’re all we have to measure the tide of economic events, are as broad-brush as can be. The late-80s boom was even boomier than we remember, but the subsequent bust was far less severe. For context, there were a few things happening in 1992 – a general election in April, double-digit interest rates in the summer, ‘Black Wednesday’ in September.

The present national accounts – as I describe in my book – co-evolved with Keynesian macroeconomics. The accounting identity GDP = C+I+G+(X-M) segued into the theory of aggregate demand and its successors. We’re stuck with GDP until the next Keynes conceives a theory of aggregate dynamics fit for a largely intangible, service- and information-driven economy, at which point there will have to be a different statistical framework.
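For readers who want the components spelled out (my gloss, not anything from the book), the expenditure identity in its usual form is:

$$ Y \equiv C + I + G + (X - M) $$

where $Y$ is GDP, $C$ household consumption, $I$ investment, $G$ government spending, $X$ exports and $M$ imports.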

PS I should add that it was @BenChu_ who alerted me to the 1990s revisions.

Canada versus Minsky, and the politics of banking

With Eurostar journeys coming up, I anticipate making decent inroads into Piketty’s Capital, but meanwhile the enormous buzz about it makes it harder than ever to understand why the popular and intellectual anger about plutocracy has not translated (yet?) into political consequences.

This thought was underlined by browsing through Fragile By Design: The Political Origins of Banking Crises and Scarce Credit by Charles Calomiris and Stephen Haber. They write: “There is no avoiding the government-banker partnership.” The book combines history, economics and political science to analyse the nature of the state-bank relationship, within a framework of bargaining. One section compares and contrasts the US (12 systemic banking crises since 1840) and Canada (zero). Another looks at the relationship in authoritarian contexts and democratic transitions.

One important conclusion is that no general theory can explain why banking crises have not been equally likely in all countries in the recent past. For example, Hyman Minsky’s theory of endogenous excess followed by fear has enjoyed a revival – indeed there was a recent BBC Radio 4 Analysis on it (Why Minsky Matters) that is well worth a listen – but why was Canada exempt from these oscillations arising from human nature? It isn’t that Minsky is wrong, but rather that context matters for crises too. “Useful propositions about banking generally are only true contingently, depending on historical context.” And, to mangle Tolstoy, every country (except Canada?) has its own unhappy politics.

Which brings me back to the strange absence of any political consequence of the financial crisis for banking. Bankers will complain about excess regulation, but the only result has been that they employ more compliance officers, and more lawyers to game the regulations. There has been little action on leverage and capital ratios, next to none on scandalous rent-seeking bonuses, and none at all on enforcing competition and new entry. The financial sector isn’t the only locus of the modern plutocracy, but it is one of the most significant.

One possibility is that the political classes are befuddled because – as I describe in GDP: A Brief But Affectionate History – the national accounts figures overstate the contribution of the financial sector to the economy. Maybe some politicians genuinely believe they cannot risk killing the goose that’s laying the golden eggs even if it is keeping all the eggs within its own nest. Whatever the explanation, the bargain between banks and politics is working for bankers, and not for other citizens.

Here is an excellent VoxEU interview about the book with Charles Calomiris. For now, I’m off to St Pancras and on with Piketty.

(Non-) Digital thinking

It was a good way to end the week: I spent Friday talking to very clever people in London and then Cambridge, about things digital in general and specifically how the tech is affecting democracy. I might write more about both meetings in due course, but meanwhile, these are the books mentioned in the discussions.

Capital in the 21st Century by Thomas Piketty, of course – even Cabbage is interested

The Second Machine Age by Erik Brynjolfsson and Andrew McAfee

The Revolt of the Masses by José Ortega y Gasset

Liberty and the News by Walter Lippmann (which coincidentally I read fairly recently)

Thought and Change by Ernest Gellner

1984 by George Orwell and (in the same sentence) Brave New World by Aldous Huxley

The Confidence Trap by David Runciman

From Gutenberg to Zuckerberg by John Naughton

The Republic by Plato

There were several others too, given that it was an academic forum, but those were the ones I happened to note. Interesting, though, that this largely non-tech group was referring back to the politics/democracy literature (and mainly classics) far more than to recent books specifically on digital politics, such as Rebecca MacKinnon’s Consent of the Networked.

Is the US a post-innovation economy?

Production in the Innovation Economy, edited by Richard Locke and Rachel Wellhausen, is a useful short summary of an interesting interdisciplinary MIT research project, and is the companion to an earlier volume, Making in America: From Innovation to Market. Although entirely US-centric, the research into what firms need to be able to innovate, then commercialize and grow, is fascinating. All the more so in the light of the mild furore yesterday about Facebook’s purchase of Oculus, which some commentators saw as a signal of the inability of start-ups to grow organically.

The work, based on existing data sources and a new survey of US manufacturing companies – including high-tech spinouts from MIT – explores a number of potential barriers to turning research into successful US manufacturing entities. The papers in the book cover skills; what it rather coyly calls ‘complementary assets’ but is actually finance to grow past the start-up stage; and a thick, geographically specific ‘ecosystem’ of suppliers. Taking these in turn:

- most manufacturers have no specific high-skill needs and have little sustained trouble filling jobs, but a significant minority (15-20%) need people with higher mathematics and computer skills, and team-working capability, and find it hard to recruit. Small, innovative start-ups do not have the capacity to train up their own people – they need to hunt for them in the labour market.

- surprisingly, finance to get to the point of being able to manufacture at scale, including a stage of iterative, incremental innovation and prototyping, is a major barrier even in the US (and surely all the more so elsewhere). Part of the problem is that VCs have become specialists in specific stages and are usually looking for an exit from their stage – in other words, venture capital has become very short-term finance, shorter-term than is needed to grow a successful innovative manufacturing business. Many US businesses now turn to strategic overseas investors in Asia, either their customers or even state-funded entities.

- a third requirement is the local presence – needed for the exchange of tacit knowledge – of a thick enough market for components. The geographic co-location of members of a supply chain is becoming a constant theme of the literature on innovation. In the economy of the 1960s and 70s, large vertically integrated companies with their own R&D arms did not have this challenge of finding partners in the supply chain; but all new innovative manufacturers have to source components from suppliers who are prepared to work on the new products.

The specific policy conclusions are written for the US, and do not carry over elsewhere in the same form. But the general principles are very obvious. They all point to the need for government to play a co-ordination role. Whether it is ensuring the system of education, training and apprenticeships serves the needs of new start-ups, co-ordinating finance or tax breaks on the funding gap at the transitional stage of growth, or liaising between firms and with local authorities and educational establishments to achieve the geographic clustering, there is a strategic role for government.

Asian governments are superb at this, including focusing support on some specific sectors (such as energy innovation in China), and I think some European governments do a decent job too. The book is pretty clear, though, that the US government has not adapted its industrial strategy for an economy consisting of smaller innovators, one where it can’t rely on big defence contracts and the likes of Bell Labs to look after the business of growing new ideas into large-scale commercial success. In the US and UK there is still the official delusion that industrial policy is equivalent to pouring a bath of taxpayers’ money in which lame ducks can splash around in unproductive luxury. This book provides a really solid evidence base concerning the barriers to growing small, innovative manufacturers, and is very persuasive about the need for policy action to lower those barriers – otherwise, it implies, the US economy can have superb universities and early stage research but will be nevertheless a post-innovation economy.

Secret statistics

Is there a word for ‘serendipity’ that doesn’t have the overtone of being a positive thing? In a horrible coincidence, over the past day or two I’ve been reading the sections of The Theory That Would Not Die by Sharon Bertsch McGrayne on the use of Bayes’ Theorem to help trace missing objects at sea, albeit submarines and nuclear missiles gone astray rather than airplane debris. The theme of the book is the way Bayesian statistics survived two centuries of dismissive treatment by the academic statistics establishment because the techniques are just so useful in an acute situation. The description in today’s newspapers of the way Inmarsat and other investigators combined different sources of information to piece together the path of the vanished plane could slot into the book very neatly.
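The mechanics behind those sea searches are simple enough to sketch. Here is a minimal toy version of the updating step in Bayesian search theory (my own illustration with invented numbers, not anything from the book or the Inmarsat investigation): divide the area into cells, give each a prior probability of containing the object and a probability of detection if it is there, then renormalize after every unsuccessful search.

```python
# Toy Bayesian search update (illustrative numbers, not real data).
priors = {"A": 0.40, "B": 0.35, "C": 0.25}   # prior P(object in cell)
p_detect = {"A": 0.8, "B": 0.5, "C": 0.9}    # P(detect | object in cell)

def update_after_failed_search(priors, p_detect, searched):
    """Posterior over cells after a search of `searched` finds nothing."""
    unnormalized = {}
    for cell, prior in priors.items():
        if cell == searched:
            # The object may still be there; the search simply missed it.
            unnormalized[cell] = prior * (1 - p_detect[cell])
        else:
            unnormalized[cell] = prior
    total = sum(unnormalized.values())  # P(the search found nothing)
    return {cell: p / total for cell, p in unnormalized.items()}

posterior = update_after_failed_search(priors, p_detect, searched="A")
print(posterior)  # cell A falls from 0.40 to about 0.12; B and C rise
```

Each failure makes the searched cell less likely and every other cell correspondingly more likely, which is why the searchers in these stories keep revisiting and re-ranking the same patches of ocean.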

Although an econometrician by training – and trained in the early 1980s, so almost wholly in frequentist techniques – I’ve never really understood the ‘either-or’ aspect of the debate about Bayesian techniques. Why would you expect the same technique to be useful in every situation? Why would you throw away information? I was a bit disappointed in The Theory That Would Not Die for two reasons. One is that it doesn’t do a very good job of explaining what the alternative approaches actually are – the explanation of Bayes’ Rule and how to apply it is left until Appendix B. Maybe an editor said it would be too scary for readers to have any actual statistics in the main text.
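For anyone who doesn’t want to wait for Appendix B, the rule itself fits on one line:

$$ P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)} $$

The posterior probability of a hypothesis $H$ given evidence $E$ is just the prior $P(H)$, reweighted by how strongly $H$ predicts the evidence actually observed.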

The second reason is that I ended up not really clear why Bayesian methods were infra dig for so long – the book describes one thing after another conspiring against public acclaim for ‘Bayesianism’ without giving a synthesis. The closest it comes to a theory is that the code-breakers, from Bletchley Park onwards, and the military were such heavy users of Bayesian methods that statisticians did in fact use them but never talked about them.

Having said that, the book’s a very enjoyable read. There are lots of episodes and characters new to me, and it’s very well written. The problem-solving sections – finding those nuclear warheads gone astray – are gripping. Bayesian statistics seem particularly appropriate to economics, which has relatively little scope for repeated experiments and much historical, context-specific, non-repeatable evidence. This book made me think I ought to find something far more practical to read – recommendations welcome.