Property is theft (and allocatively inefficient too)

We launched the Bennett Institute for Public Policy in Cambridge this week, so it’s been a bit hectic. I still managed to read Radical Markets: Uprooting Capitalism and Democracy for a Just Society by Eric Posner and Glen Weyl. It’s extremely thought-provoking and clearly brilliant – yet also barking mad. This is the territory of thought experiment rather than policy proposal.

The basic idea is Proudhon (‘All property is theft!’). Any private ownership of property contains within it the seeds of market power. Worse, “Private ownership of any asset, except homogenous commodities, may hamper allocative efficiency.” A more efficient and more egalitarian arrangement is for all property to be in effect rented from the state, by current ‘owners’ stating what they think every item is worth, and paying a tax on that amount. At any time, somebody who values it more can bid it away from them; there are continuous auction markets. For homes, there might be a notice period so people can order their affairs. There might be exemptions for small items of sentimental value such as Grandma’s fountain pen. The revenues raised from the tax would be returned as a basic income to all citizens.
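Mechanically, the scheme works like a rolling self-assessment. Here is a minimal sketch of the idea as I read it; the Asset class, the 7% tax rate and the annual assessment are illustrative assumptions of mine, not figures from the book.

```python
# Toy sketch of the self-assessed tax-and-auction idea as I read it.
# The 7% rate and the Asset class are illustrative assumptions of mine,
# not details taken from Posner and Weyl.

TAX_RATE = 0.07  # assumed annual tax on the self-declared value


class Asset:
    def __init__(self, owner: str, declared_value: float):
        self.owner = owner
        self.declared_value = declared_value  # the 'owner' states its worth

    def annual_tax(self) -> float:
        # The tax falls on the owner's own valuation, so overstating is costly...
        return TAX_RATE * self.declared_value

    def bid(self, buyer: str, offer: float) -> bool:
        # ...but understating is risky, because anyone may take the asset
        # at the declared value at any time.
        if offer >= self.declared_value:
            self.owner = buyer
            self.declared_value = offer  # the new owner re-declares a value
            return True
        return False


house = Asset(owner="Alice", declared_value=300_000)
print(house.annual_tax())   # 21000.0, paid into the common pot
house.bid("Bob", 310_000)   # Bob values it more and takes it over
print(house.owner)          # Bob
```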

The authors want to Radicalise voting as well as ownership. In place of the one-person-one-vote tyranny of the majority, they apply the auction principle to politics as well as markets. Everybody gets an allowance of ‘voice’ which they can allocate according to their political preferences and strength of feeling about the issues. There is a quadratic tapering, so I’d need four voice tokens to cast two votes, and so on. They like the idea so much they’ve applied it to opinion polling to elicit more accurate views – and, they write, “We have patented the use of QV and related methods to solicit opinions digitally.”
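The arithmetic is simple enough to sketch: casting n votes on an issue costs n² voice tokens. The 100-token budget below is my own illustrative assumption, not a number from the book.

```python
# Minimal sketch of the quadratic voting cost rule: n votes cost n**2 tokens.
# The 100-token budget is my illustrative assumption.

def token_cost(votes: int) -> int:
    return votes ** 2

budget = 100
for votes in range(1, 6):
    print(f"{votes} vote(s) cost {token_cost(votes)} tokens")
# 1 vote costs 1 token, 2 votes cost 4, 3 cost 9, and spending the whole
# 100-token budget on a single issue buys only 10 votes: intensity of
# feeling is expressed, but at a steeply rising price.
```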

After two introductory chapters, the book applies the broad concepts to some specific areas, including the high-profile paper proposing ‘data as labour’: in other words, that we should be paid by digital data harvesters for providing our time and knowledge. Each of the chapters starts with a vignette of what the Radical Market future might look like. They’re all rather dystopian, especially the one in the data-as-labour chapter. Every interaction between human and digital assistant is monetised. I’m an economist – I like markets – but I don’t want every minute or keystroke to have a dollar sign attached. If a click on a Like button is worth 20 cents, would I start wondering whether it was worth telling my neighbour about the great new coffee shop, because she’s not going to pay me for it? Of course the current situation is unsatisfying, but I’m still unpersuaded by this potential solution.

The most interesting chapter is the one about the concentration of share ownership (in the US) in the hands of a small number of large institutional investors. The book argues persuasively that this diminishes competition. Antitrust concentrates on the corporations, but institutional investors dilute its effectiveness, “by knitting together the interests of the biggest firms that dominate any particular market.” Here there is a policy proposal worth thinking about: restricting institutional investors to holding a maximum of a 1% stake in companies in the same market, or as much as they want of one firm but then none of its competitors’ shares. It’s a very interesting idea, though I don’t see what it has to do with the Radical Markets conceit.

The book ends by reflecting on markets versus central planning, alluding to the socialist calculation debate. Markets “allocate resources in ways no present computer could match.” Prices are a uniquely efficient summary of information, but markets can be improved – by having them operate continuously. “The market is the appropriate computer to achieve the greatest good for the greatest number,” but its bugs need fixing so there are fairer outcomes. Even better, common ownership makes the market outcome more efficient too.

This does glide over the fact that – should the nirvana of constant online auctions be attained – the state is there in some sense as the owner of all property and the redistributor of large tax revenues, raised as a sort of rent from everyone for having temporary use of, well, everything. Nor does it touch on assets owned by foreigners, or owned overseas. In fact, the book doesn’t really discuss practicalities at all, because it isn’t a real set of proposals. Thomas Piketty’s global wealth tax has more chance of becoming a reality than the permanent revolution of ubiquitous Vickrey auctions.

However, Radical Markets certainly made me think – about property, information and power. Well worth reading.


Planning the unplannable

I just read a working paper by Joe Kane, ‘The Economic Flaws in Computerized Socialism’, which refers to Eden Medina’s excellent 2011 book about Project Cybersyn in Allende’s Chile, Cybernetic Revolutionaries. She recounts the history of this attempt to use computers for central planning, run from a futuristic control room and relying on data fed in from factories around the country. Even before the coup, the project was in some trouble. As Kane notes, there has been a revival of the idea that computers and the internet (and now the blockchain) make central planning feasible. Evgeny Morozov trailed the idea in an article that drew on Medina’s book. So did Paul Mason in his dire book Post-Capitalism. Kane cites a few other examples.

As he points out in the working paper, though, the central planning problem is not one of computation or the transfer of information – or rather, it is only that in the world of neoclassical general equilibrium theory. Frances Spufford’s brilliant book Red Plenty makes use of the formal equivalence of the centrally planned and competitive market economies in that la-la-land of no frictions, fixed preferences and complete markets. Markets are a process of discovery, as John Kay has to repeat over and over again. The data that would be needed for computerized central planning are not ‘out there’, but are created by the market. I wouldn’t go nearly as far as Kane in concluding that government policies are therefore useless, but I do agree that “the outcome of the market process is not separable from the process which generates it.”


Epidemics vs information cascades

As I was looking at publishers’ websites for my New Year round up of forthcoming books, I noticed OUP billing Paul Geroski’s The Evolution of New Markets as due out in January 2018. This is odd as it was published in 2003, and Paul died in 2005; it isn’t obvious why there’s a reprint now. He was Deputy Chairman and then Chairman of the Competition Commission during my years there, and was a brilliant economist as well as a wonderful person. I learnt an amazing amount from being a member of his inquiry team.

Anyway, the catalogue entry for the reprint sent me back to my copy of the book, along with Fast Second, which Paul co-authored with Constantinos Markides. Fast Second challenges the received wisdom of first-mover advantage: Amazon was not the first online books retailer, Charles Schwab not the first online brokerage, and so on. The opportunity lies in between being early in a radical new market and being late because a dominant design and business model have already emerged. The Fast Second business competes for dominance – and supposed first-mover advantages are usually fast-second advantages.

Paul’s book The Evolution of New Markets – inside which I found a handwritten note he’d sent me with it, which made for an emotional moment – does what it says, and explores models of innovation diffusion: in other words, models of S-curves. His view was that the epidemic model of S-curves, which seems to be the standard one, was a misleading metaphor. He argued that information cascades better fit the observed reality. The epidemic model assumes that a new technology is adopted as information about it is diffused: each user passes the information on to the next. However, as the book points out, information diffuses far faster than use. Users need to be persuaded rather than just informed.

More appropriate is a model whereby new technologies arrive in a number of variants at slightly different times, making adoption risky and costly – especially when there are network externalities or when people realize there is going to be a standards battle. Most new products fail, after all. But once one variant starts to dominate, the cost and risk decline and adoption will occur much faster.
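To make the contrast concrete, here is a rough numerical sketch of the two S-curve stories; the functional forms and parameters are my own illustrative choices rather than anything taken from Geroski’s book.

```python
# Rough sketch contrasting two S-curve stories of diffusion.
# Functional forms and parameters are my own illustrative choices.

def epidemic(periods=40, beta=0.3, a0=0.01):
    # Epidemic story: adoption spreads like infection, in proportion to
    # contact between adopters and non-adopters (a logistic curve).
    a, path = a0, []
    for _ in range(periods):
        a += beta * a * (1 - a)
        path.append(a)
    return path

def cascade(periods=40, slow=0.005, fast=0.25, shakeout=15, a0=0.01):
    # Cascade story: adoption crawls while rival variants make it risky
    # and costly, then accelerates once a dominant design emerges.
    a, path = a0, []
    for t in range(periods):
        rate = slow if t < shakeout else fast
        a += rate * (1 - a)
        path.append(a)
    return path

e, c = epidemic(), cascade()
print(round(e[10], 2), round(c[10], 2))   # early periods: epidemic already climbing
print(round(e[25], 2), round(c[25], 2))   # after the shakeout both curves saturate
```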

It’s a persuasive argument, and a very readable book. Although the list price is surprisingly high for a short paperback, one can be confident that second-hand copies are just as good.


Social limits to growth

In preparing for an event tomorrow celebrating the 40th anniversary of the publication of Fred Hirsch’s Social Limits to Growth, I’ve naturally been re-reading the book. It’s full of comments that leap out from the page, such as this: “The extent of interdependence of many forms of consumption in advanced, urbanized societies has brought increasing recognition that to give effect to public choice among the available economic alternatives represents a still unresolved intellectual and administrative problem, rather than requiring merely the sweeping away of impediments to the working of the market mechanism.” And, “To see total economic advance as individual advance writ large is to set up expectations that cannot be fulfilled, ever.”

These comments reminded me very much of Will Baumol’s long-overlooked book (his PhD thesis!), Welfare Economics and the Theory of the State, which I read quite recently. Part of his argument is that interdependence is far more extensive than in the textbook world. The changes in the character of the economy since 1977 have made this ever more true. Hirsch is of course famous for the concept of positional goods, where there are negative consumption externalities – I am worse off if you have the status symbol and I don’t. Some of this has been absorbed into modern signalling models. However, positive consumption externalities – network effects, direct and indirect – are now becoming widespread too.

The conventional matrix of goods (according to whether they are easy or hard to exclude and rivalrous in consumption or not) needs extending:

                                    Easy to exclude     Hard to exclude
Rivalrous + negative externality    Positional          Commons good
Rivalrous                           Private good        Commons good
Non-rivalrous                       Club good           Public good
Non-rival + positive externality    Network club        Network commons

In only one of these boxes – the private good – does the standard ‘free market’ presumption apply.


‘Free’ markets

I recently read The Illusion of Free Markets by Bernard Harcourt (2011), on the recommendation of an esteemed colleague. The bulk of the book is about state discipline – Bentham’s Panopticon, Foucault, the American penitentiary state. The bit that really appealed to me was the opening section on French grain markets in the 18th century, compared with Chicago commodities markets in the late 20th century.

The book opens with great detail about how intensively regulated markets were in early 18th-century France, with even trivial breaches of the rules in theory liable to punishment, imposed by the police des grains. Harcourt then draws the comparison with what we think of as a model of free market capitalism, the open outcry pit of the Chicago Board of Trade (I visited once – an amazing experience). As he convincingly establishes, there is no sharp contrast: the modern market rules are in fact just as detailed as the 18th-century version.

Why then do we contrast ‘free markets’ as today’s ideal with the over-regulated past? The book attributes the turn to the Physiocrats, and “that contested moment in the 18th century when notions of natural order were beginning to take shape.” The argument is that they drew a sharp dichotomy between “the economy as the realm of natural order” and everything else, which thereby fell into the sphere of being policed by the state. “In other words, the market is efficient, and within that space there is no need for government intervention. What is criminalized and punished is behaviour outside the sphere of the orderly market.” The government can legitimately penalize non-market behaviours.

But of course, the dichotomy is a false one. The state is present in all markets, and often in just as much detail as the C18th police des grains. The rhetoric of ‘free markets’ is misleading.

I certainly agree with this last point, as does anybody who (like me) has spent some time as an economic regulator (the UK Competition Commission in my case). Modern economies are highly regulated, and that goes for the Anglo-Saxons as much as anyone else. I don’t know nearly enough about the C18th or the literature on punishment to evaluate those parts of Harcourt’s book. But it certainly offers food for thought.
