Political Arithmetic

“In the late 1950s and early 1960s, it was not necessary to emphasize that history was one of the principal sources of generalizations about the economy,” according to Robert Fogel (and his cast of co-authors) in [amazon_link id=”0226256618″ target=”_blank” ]Political Arithmetic: Simon Kuznets and the Empirical Tradition in Economics.[/amazon_link] That widespread understanding that economics is an historical science, like geology or maybe meteorology, was lost in the succeeding generations, and is only just returning – see, for example, the essays in [amazon_link id=”1907994041″ target=”_blank” ]What’s the Use of Economics[/amazon_link].

[amazon_image id=”0226256618″ link=”true” target=”_blank” size=”medium” ]Political Arithmetic: Simon Kuznets and the Empirical Tradition in Economics (National Bureau of Economic Research Series on Long-Term Fac)[/amazon_image]

Political Arithmetic also emphasizes Kuznets’ insistence – as you might expect from one of the pioneers of national statistics – on the importance of hands-on, detailed study of quantitative data for empirical economics. “High on his list of major dangers was the superficial acceptance of primary data without an adequate understanding of the circumstances under which the data were produced. Adequate understanding involved detailed historical knowledge of the changing institutions, conventions and practices that affected the production of the primary data.” Economists still frequently fall into the trap of not understanding the statistics they use, especially academic macroeconomists, who have fallen into the habit of downloading data from easily accessible online sources without giving any thought to how the statistics might have been collected. (Professional and applied micro-economists are typically more careful, because they are less likely to be using the standard online databases.) Young economists are not even taught basic data-handling skills – such as the simple precaution of plotting all your data series in straightforward charts to check for data-entry errors and outliers before running even the simplest regression or correlation.
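To make that precaution concrete, here is a minimal sketch of the kind of first-pass check I have in mind, in Python with pandas and matplotlib; the file name and the four-standard-deviation threshold are illustrative assumptions of mine, not anything prescribed by Kuznets or the book.

```python
# First-pass sanity check: plot every series and flag crude outliers
# *before* running any regression. "macro_series.csv" is a hypothetical
# download with a date column plus one column per series.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("macro_series.csv", parse_dates=["date"], index_col="date")

for col in df.columns:
    s = df[col].dropna()
    # Observations more than 4 standard deviations from the mean are
    # often data-entry slips (misplaced decimals, wrong units) rather
    # than genuine economic events - inspect them before proceeding.
    z = (s - s.mean()) / s.std()
    suspects = s[z.abs() > 4]
    if not suspects.empty:
        print(f"{col}: inspect these observations:\n{suspects}")
    s.plot(title=col)   # eyeball each series for breaks and spikes
    plt.show()
```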

These descriptions of Kuznets’ approach to economics certainly appeal to me, but overall I was disappointed by this book. My own book on the history of GDP is out later this year, and as Kuznets is such an important figure in national accounting, I was expecting to find all kinds of insights needing to be marked up on the proofs. However, Political Arithmetic turns out to be very short, more of an extended essay, and there are no details about Kuznets’ work on either national accounting or income and growth that cannot be found in previous books. At the same time, I think it fails in its claim to give an overview of Kuznets’ contribution to economics; there are elements of this in chapters on the way academic economics came to have a role in policy in the early 20th century, as well as on the emergence of national income accounting, but they don’t knit together into an effective synopsis. Maybe four co-authors are too many for a 118-page book.

Risk-related rambling

Not a new extreme sport, but some thoughts prompted by John Naughton’s column today on the vulnerability of important economic networks, such as just-in-time manufacturing or supermarket supply chains. It seems to me there’s an important gap between businesses making rational individual decisions about what they outsource and where, and how this aggregates up. For one manufacturer to outsource a vital component because there are lots of potential suppliers may be entirely sensible, but what if all the potential suppliers are clustered in one place, as is so often the case in global manufacturing chains? I doubt this question has made it onto many businesses’ risk matrices. There’s an issue about diversity here – just as the border between the wisdom of crowds and the madness of crowds turns out to depend on all the individual members of the crowd having distinct ideas, not influenced by each other. For a supply chain that’s both efficient and resilient, you’d want the risks facing the individual upstream suppliers not to be linked to each other.
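A toy Monte Carlo makes the aggregation problem vivid. This is my own illustrative sketch, not anything from Naughton’s column: the failure probabilities are invented, and the point is only that five apparently redundant suppliers provide almost no resilience once they share a single regional risk.

```python
# Toy simulation: probability that *all* suppliers are down at once,
# comparing geographically dispersed suppliers with suppliers clustered
# in one region that can suffer a common shock. All numbers invented.
import random

TRIALS = 100_000
N_SUPPLIERS = 5
P_IDIOSYNCRATIC = 0.05  # each supplier's own, independent failure risk
P_REGIONAL = 0.02       # chance the shared region is disrupted

def all_down(clustered: bool) -> bool:
    """True if every supplier is unavailable in one simulated period."""
    region_hit = clustered and random.random() < P_REGIONAL
    return all(region_hit or random.random() < P_IDIOSYNCRATIC
               for _ in range(N_SUPPLIERS))

for clustered in (False, True):
    failures = sum(all_down(clustered) for _ in range(TRIALS))
    label = "clustered" if clustered else "dispersed"
    print(f"{label}: P(all {N_SUPPLIERS} suppliers down) ~ {failures / TRIALS:.4%}")
```

With independent suppliers the joint-failure probability is 0.05 to the fifth power, effectively zero; clustering pushes it above 2%, because the regional shock takes out everyone at once.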

The mathematics of networks is tricky – I just looked in Barabasi’s 2002 [amazon_link id=”0738206679″ target=”_blank” ]Linked: The New Science of Networks[/amazon_link], my reference on the subject, and can’t find anything there on this issue. I’ve read [amazon_link id=”0099444968″ target=”_blank” ]Six Degrees[/amazon_link] by Duncan Watts and [amazon_link id=”0007303602″ target=”_blank” ]Connected[/amazon_link] by Nicholas Christakis, and can’t remember whether the discussions of vulnerability and resilience in those books touched on this diversity question. But somebody must have looked at it.

[amazon_image id=”0738206679″ link=”true” target=”_blank” size=”medium” ]Linked: The New Science of Networks[/amazon_image]

Information underload

In his classic book [amazon_link id=”014016734X” target=”_blank” ]Darwin’s Dangerous Idea[/amazon_link], Daniel Dennett said: “A scholar is just a library’s way of making another library.” I’ve always loved that inversion of conventional thinking about causality, and sometimes even muse that as friendly bacteria in the gut are to humans, so we humans are becoming to computers and the internet.

[amazon_image id=”014016734X” link=”true” target=”_blank” size=”medium” ]Darwin’s Dangerous Idea: Evolution and the Meanings of Life (Penguin Science)[/amazon_image]

The line from Dennett is quoted in James Gleick’s [amazon_link id=”0007225741″ target=”_blank” ]The Information[/amazon_link]. It’s been a very enjoyable read, covering some of my favourite territory in a well-written way: the long-run effects of the telegraph, Charles Babbage and Ada Lovelace, Alan Turing’s codebreaking work, Norbert Wiener and cybernetics, Claude Shannon’s information theory. The book has some angles that are new and quirky – for example, I liked the point that lots of newspapers named themselves The Telegraph, following on from The Bugle, but none chose The Telephone. I liked it that Turing and Shannon had met in 1943, one devising codes and the other breaking them, but because of wartime secrecy had been unable to discuss their work. There’s a good section on Gödel’s incompleteness theorem and why it relates to computation, although it draws quite a lot on Douglas Hofstadter’s [amazon_link id=”0140289208″ target=”_blank” ]Gödel, Escher, Bach[/amazon_link] and [amazon_link id=”0465045669″ target=”_blank” ]Metamagical Themas[/amazon_link].

Overall, though, there was little that was new here (at least if you share my obsessions and have read so many other books on this territory, from Tom Standage’s [amazon_link id=”0753807033″ target=”_blank” ]The Victorian Internet[/amazon_link] to George Dyson’s [amazon_link id=”014101590X” target=”_blank” ]Turing’s Cathedral[/amazon_link]), and I could not really find a line of argument. There’s a general theme that everything is about information, right down to the genetic code and the meaning of life; that information, rather than energy, is the fundamental idea that should shape how we think about the physical universe and all of life. Maybe. There’s a long section on entropy that tries to underpin this thought. But I think that just as our forebears saw everything via mechanical metaphors, information is the framing metaphor of our times.

So, an enjoyable, meandering read, ideal for a flight. But not, for me, living up to the praise heaped on it by other reviews such as this in The Guardian or this (rather more tempered) one in The New York Times. And of course it won the Royal Society Winton Prize, a major achievement. So maybe it’s just me. I wouldn’t discourage anybody from trying it.

[amazon_image id=”0007225741″ link=”true” target=”_blank” size=”medium” ]The Information: A History, A Theory, A Flood[/amazon_image]

Big data 2.0

As I read a long Foreign Affairs article about Big Data and its ramifications this morning, it struck a real chord with a passage I’d just read in James Gleick’s interesting (although rambling) book [amazon_link id=”0007225741″ target=”_blank” ]The Information[/amazon_link].

[amazon_image id=”0007225741″ link=”true” target=”_blank” size=”medium” ]The Information: A History, A Theory, A Flood[/amazon_image]

He describes the use of the newfangled telegraph to send for the first time almost instantaneous news about the weather in different, distant parts of the country – fine in York, raining in Manchester. “The telegraph enabled people to think of the weather as a widespread and interconnected affair, rather than an assortment of local surprises.” Gleick quotes a commentator of 1848: “The telegraph may be made a vast national barometer.” (Tom Standage’s [amazon_link id=”0753807033″ target=”_blank” ]The Victorian Internet[/amazon_link] is a brilliant book about this particular communication medium.)

The possibility of weather reports led to the Government’s establishment of the Meteorological Office in 1854, headed by Admiral FitzRoy, famously captain of [amazon_link id=”014043268X” target=”_blank” ]The Beagle[/amazon_link]. In 1861 he began issuing the first weather forecasts. The science of meteorology, and the understanding of the weather as a globally interconnected system, were built on the foundation of the local weather information conveyed by telegraph.

The new book [amazon_link id=”1848547900″ target=”_blank” ]Big Data[/amazon_link], by Cukier and Mayer-Schönberger, appreciates the transformative scope of today’s Big Data 2.0 – in the article they write: “Big data is different: it marks a transformation in how society processes information. In time, big data might change our way of thinking about the world.” What that change will be is impossible to predict.

[amazon_image id=”1848547900″ link=”true” target=”_blank” size=”medium” ]Big Data: A Revolution That Will Transform How We Live, Work and Think[/amazon_image]