Well of course he didn’t actually write about the sharing economy. But one of the essays in a new collection about Coase’s legacy – Forever Contemporary: The economics of Ronald Coase, edited by Cento Veljanovski – offers a Coaseian perspective on the phenomenon. Its author, Michael Munger, argues that the sharing economy platforms are enabling reductions in the transactions costs involved in exchanges that were always in principle possible. The three key transactions costs are: providing information about prices, characteristics and options; assuring safety and quality to create enough trust for the transaction to occur; and processing the transaction agreement and payment in a reliable and speedy manner.
The essay also notes, in a point that was bound to appeal to me, that there could be “a potentially dramatic reduction in the amount of new stuff that we need to manufacture,” be it shared drills or cars. Munger ends, “The firm of the future may operate primarily as a software platform rather than as a physical location.” While I certainly don’t think all firms will take this shape, it doesn’t seem a mad idea.
I haven’t yet read the other essays but it looks like a nice volume for all Coase fans – and aren’t we all? The pdf of the book can be downloaded here.
There’s a chapter online from a forthcoming book, Economic Psychology edited by Robert Ranyard, called How Laypeople Understand Economics, by David Leiser and Zeev Krill. Although not entirely surprising, the chapter is very interesting. Those of us who are economists long ago internalised the subject’s distinctive way of thinking and understanding of how variables are related. Most people find economics difficult, even mysterious, however. There are some excellent demystifications, from John Lanchester’s How to Speak Money to Tim Harford’s Undercover Economist, and all the rest of the popular economics literature. But as this chapter points out, laypeople – including many politicians – will use one of three strategies for trying to make sense of economic discussions: use heuristics; use metaphors; fall back on teleological or causal explanations.
One example of a common heuristic is ‘good begets good’ – if there is a change in one variable perceived to be good, it is assumed it will cause good changes in other variables. On metaphors, the authors comment: “Understanding of financial markets relies on seven metaphors: the market as a bazaar, as a machine, as gambling, as sports, as war, as a living being and as an ocean. Crucially, each metaphor highlights and hides from view certain aspects of the foreign exchange market. Some of the metaphors imply market predictability, others do not. For instance, the sports and the machine metaphors were found to be associated with fixed rules and predictability, whereas the bazaar and war metaphors with unpredictability.”
It’s Saturday, when I try to bring order to my life, and I was just sorting out the teetering pile of books when I unearthed Hamburgers in Paradise: the stories behind the food we eat by Louise Fresco. It’s a notably handsome book with lovely pictures, so already enticing. Paging through, it looks like a fascinating read as well. Although billed as a cultural history, it looks at the dominant role of supermarkets in the way we shop, at genetic modification, at agriculture, poverty and economic development, at the slow food movement and the globalization of food supply.
Now, I love food and prepare most of our meals from scratch, buying few ready-made items. Quality is important to me. Eating as a family, round a table, talking, is essential. Yet the slow food movement makes me uneasy, as it often seems to reject the productivity needed to feed everyone, and to embody an approach few can afford. Agricultural productivity needs to increase again. As it happens, I just spotted this tweet on exactly this subject:
Global middle class is booming, so is demand for food. More crop per acre is the only way! https://t.co/LM9PFbRAbi https://t.co/VL09pRxOFv
On the other hand, the scandals of industrial food production – horse meat disguised as beef, the treatment of animals including stuffing them with antibiotics, obesity, the high-salt, high-fat, high-margin products etc – are unacceptable and probably unsustainable. We will soon be publishing a terrific book by David Fell on food policy and taxation in our Perspectives series. Meanwhile, I’m going to read Hamburgers in Paradise over the Christmas holiday. Fresco concludes: “Without food there is no evolution and no civilization. We are what we eat, literally. … What it means to be human is concentrated in food and our understanding of it. Inevitably, part of that is the consciousness that many have too little to eat, or cannot choose to have the things that are associated with a decent meal.”
I’ve been dipping into the truly fascinating World Intellectual Property Report from WIPO, published last week. The overview chapter has a beautifully clear overview of economic growth, and the role of innovation and IP rights. The rest of the report falls into two sections: case studies of historical breakthrough innovations (airplanes, antibiotics, semiconductors); and case studies of newer innovations with breakthrough potential (3D printing, nanotechnology and robotics).
The lessons drawn should not be surprising but seem hard for people looking at future growth prospects to absorb. For example, big innovations can affect growth through several routes (for example with antibiotics by the impact on human capital); their economic transformations are far-reaching, unpredictable and can take a long time; all breakthrough innovations require continuous follow-on innovations, both technical and organizational; the specifics of the innovation ecosystem matter greatly, and have a geographical dimension; and the structure of the ecosystem will change as the technology matures, steadily involving more professional and formal structures. Interestingly, the historical examples suggest that the IP system made far less difference to the wide dissemination of the technologies than the absorptive capacity of each country.
The report is free to download and – unusually for such official reports – a very good read. Its case study approach is illuminating and I learned a lot about the technologies I’m less familiar with.
200 years of innovation
I was reading some of the essays in the volume on The Philosophy of Economics edited by Daniel Hausman, and was struck by the echo in a terrific comment by Herbert Simon of something Dani Rodrik says in Economics Rules. (The Simon paper was originally in the AER P&P volume for 1963, Vol 53 (1963): 229-231.)
Simon, like Rodrik, points out the logical fallacy of using an empirical observation to validate the assumptions of a theoretical model devised to explain – or at least for consistency with – that empirical observation. The assumptions must also be empirically valid, or validated, Simon argues. “The remedy for the difficulty is straightforward, although it may involve more empirical work at the level of the individual actors than most conventionally-trained economists find comfortable.”
Market theories, and macro theories, do need micro-foundations, but empirical ones – foundations based on how people or firms actually behave. Do businesses maximize profits? Of course not, or at least, economists do not test the assumption before they build models on it. As my years on the Competition Commission taught me, many businesses can be blithely unaware of which of their activities make the most profit. As I’ve complained about macro models before, they might indeed have theoretically rigorous micro-foundations but are ad hoc with respect to reality.
Simon goes on to suggest that economics adopt the scientific practice of making sure assumptions simplify but approximate sufficiently closely to the real world. He, like Deirdre McCloskey and Stephen Ziliak, bemoans the tyranny of statistical significance over meaningful significance: in hypothesis testing, “We do not primarily want to know whether there are deviations of observation from theory which are ‘significant’ in this [statistical] sense. It is far more important to know whether they are significant in the sense that the approximation of theory to reality is beyond the limits of our tolerance.” Unfortunately, it is much easier to read a t-statistic from a software package than to think (and apparently also too hard to consider the statistical power of regression results), so the tyranny of statistical significance continues.
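Simon’s point – that a ‘significant’ t-statistic tells you little about whether a deviation matters – is easy to illustrate with a toy simulation. This is my own hedged sketch, not anything from Simon or the Hausman volume: with a large enough sample, even a trivially small true effect produces a t-statistic far beyond any conventional significance threshold.

```python
import math
import random

random.seed(0)

# A tiny true effect (0.01 units) with a very large sample:
# statistically 'significant' yet practically negligible.
n = 1_000_000
effect = 0.01  # hypothetical effect size, chosen for illustration
sample = [random.gauss(effect, 1.0) for _ in range(n)]

# One-sample t-test of H0: mean == 0, computed by hand.
mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / (n - 1)
se = math.sqrt(var / n)
t_stat = mean / se

print(f"mean = {mean:.4f}, t = {t_stat:.1f}")
# The t-statistic comfortably exceeds 1.96, so the deviation is
# 'significant' in the statistical sense -- but an effect of 0.01
# may well be inside the limits of our practical tolerance.
```

Whether 0.01 units is “beyond the limits of our tolerance” is, as Simon says, a substantive question the t-statistic cannot answer for us.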