Too big, full stop

No sooner has summer ended than it’s almost Christmas – how has this happened? In between meetings and paper-writing, I have managed to read a few things. Two thrillers on the journeys to and from a family visit last week: John le Carré’s A Legacy of Spies and one of Mick Herron’s outstanding Jackson Lamb novels, Spook Street – highly recommended if they’re new to you.

On more serious matters, I’m halfway through the handsome new Stripe Press edition of Mitchell Waldrop’s The Dream Machine. And I’ve finally read Tim Wu’s The Curse of Bigness. This is a very interesting, and commendably concise, history of US anti-trust legislation and enforcement. The argument in a nutshell is that anti-trust was born out of a power relations confrontation between the original trusts – Rockefeller, Carnegie and so on – and the US government: Theodore Roosevelt determined on trust-busting to establish the primacy of government power. To some extent this tradition continued after the second world war with landmark cases against AT&T and IBM. But, Wu continues, the Chicago school and especially Robert Bork defanged US anti-trust enforcement by so thoroughly embedding an economic test based on a consumer welfare standard, measured only by consumer prices. Today, with digital giants so often charging zero or low prices, this is less appropriate than ever. The time has come to reaffirm that the government, not rent-extracting monopolies, runs the country.

This is an interesting and persuasive account. It is also a specifically American one. Although the underlying economic analysis certainly crossed the Atlantic, there has never been such a narrow interpretation either of consumer welfare or of how to measure it in Europe. The test in UK law is a ‘substantial lessening of competition’ with reasonably wide discretion for the competition authority, and the guidance sets out other dimensions of welfare such as quality, range and innovation – although of course price is the easiest to measure. We have had prominent cases looking at monopsony power, such as the inquiry into supermarkets. Nor has there ever been on this side a routine acceptance that the benefits of vertical integration or horizontal merger can be assumed to be passed on to consumers – in my cases we always asked about the incentive as well as the scope for efficiencies to be passed on. And, as Wu notes at the end of the book, the UK’s market inquiry tool can be very powerful.

Having said all this – and cautioning against translating the accounts of Wu and other influential authors such as Lina Khan to non-US contexts – it does not mean the question of monopoly power is not a pressing one here too. The UK has an inquiry into digital competition under way (chaired by Professor Jason Furman – I’m a member). In other markets from insurance/banking to pharma there are very powerful and profitable firms sustaining their position over long periods, with scant sign that new entry is possible. As I’ve written in a forthcoming paper, the competition authorities need tools to assess dynamic, Schumpeterian competition as well as their everyday static toolkit.

Behind the technicalities, there is also the issue of political power highlighted by Wu’s book. Most economists would be hesitant to re-politicise competition policy after the dire experiences of big companies using their lobbying power to protect themselves before the present regime came into force. Two former DGs of the UK’s Office of Fair Trading, John Fingleton and John Vickers, have rightly pointed to the vast expansion of arbitrary ministerial say-so over mergers in proposed UK legislation. This is a route sure to make consumers worse off.

At the same time, there are valid political questions. Some companies in a number of sectors have become simply too powerful. Paul Tucker’s recent book Unelected Power highlights these questions, arguing for a tilt in the balance away from technical economic analysis toward political choice. I’m not persuaded that the problem stems from the use of economics in competition policy, such that dethroning economic analysis would fix the power imbalance. However, there do seem to be some unresolved tensions between the economic standards for assessing competition in a market, the legal interpretations, and the politics.

Much food for thought in a short book. The Curse of Bigness is a great stocking filler for the economists and lawyers in your life.


Marketcraft

Marketcraft: How Governments Make Markets Work by Steven Vogel is a nice overview of the inextricable links between ‘state’ and ‘market’. It would be great to put to rest the concept (still reflected in verbal usage) of government and market as opposites, and the book offers the concept of ‘marketcraft’ as a device to make the point. Markets always require a framework of government action to function at all.

It isn’t as if economists believe there is such a thing as the abstract ‘free’ market – ‘free’ from government ‘interference’. As Vogel very fairly notes (while regretting the use of a competitive equilibrium as a benchmark in any way at all), not only behavioural economics but also institutional economics, market design – and he could have added industrial organisation/competition economics, labour economics, health economics and all the other applied fields – have the market as an embedded institution at their core. Competition economists like me, for instance, know that it takes sustained attention from the institutions of the state to keep a market competitive.

The book – a short one drawing on previous work – compares and contrasts the liberal market economy of the US with the co-ordinated market economy of Japan, although Vogel is critical of the ‘varieties of capitalism’ approach, arguing that it overstates the differences. Liberal market economies are only differently co-ordinated, he believes. There is a tension in the argument, for Vogel argues that the US should become more like Japan while also arguing that Japan’s attempt to become more like the US has failed because it did not take account of the social norms, conventions and culture in which the economy was embedded. (To be fair, he acknowledges wholesale change in the Japanese direction would not be possible.)

Somewhat ironically, Vogel reflects on the same tension in Karl Polanyi’s The Great Transformation, noting that it both asserts that ‘the market’ becomes a separate sphere from society, commodifying a growing territory of life, and that the self-regulating free market is a myth because markets are always socially embedded.

Although the argument the book makes isn’t dramatically new, and I for one need no persuasion about having to think of markets as institutions which can be shaped and designed for better or worse, there are some nice insights. I liked the section on the language we use to perpetuate the ‘free market’ chimera: governments make ‘interventions’ rather than just ‘acting’; we speak of ‘redistribution’ rather than ‘distribution’. It was also a welcome surprise that the book doesn’t set out the usual straw man version of economics. The term ‘marketcraft’ (as an analogue to statecraft) is also very nice. Governments are always ‘intervening’ in markets even if unintentionally. For sure government failure is a real thing, yet there’s no way we can live collectively without collective actions. We call that government.


Property is theft (and allocatively inefficient too)

We launched the Bennett Institute for Public Policy in Cambridge this week so it’s been a bit hectic. I still managed to read Radical Markets: Uprooting Capitalism and Democracy for a Just Society by Eric Posner and Glen Weyl. It’s extremely thought provoking and clearly brilliant – yet also barking mad. This is the territory of thought-experiment rather than policy proposal.

The basic idea is Proudhon (‘All property is theft!’). Any private ownership of property contains within it the seeds of market power. Worse, “Private ownership of any asset, except homogenous commodities, may hamper allocative efficiency.” A more efficient and more egalitarian arrangement is for all property to be in effect rented from the state, by current ‘owners’ stating what they think every item is worth, and paying a tax on that amount. At any time, somebody who values it more can bid it away from them; there are continuous auction markets. For homes, there might be a notice period so people can order their affairs. There might be exemptions for small items of sentimental value such as Grandma’s fountain pen. The revenues raised from the tax would be returned as a basic income to all citizens.
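The self-assessment mechanism can be sketched in a few lines of code; the names, prices and the flat tax rate below are my own illustrative assumptions, not the book’s calibration:

```python
# Sketch of a self-assessed ("Harberger") tax on property.
# Each owner declares a value, pays tax on that declaration,
# and anyone may buy the asset at the declared price.

TAX_RATE_PCT = 7  # illustrative annual rate, not the book's figure

class Asset:
    def __init__(self, owner, declared_value):
        self.owner = owner
        self.declared_value = declared_value

    def annual_tax(self):
        # Declaring high deters buyers but raises the tax bill;
        # declaring low saves tax but invites a forced sale.
        return self.declared_value * TAX_RATE_PCT / 100

    def bid(self, bidder, offer):
        # Any bid at or above the declared value forces a sale.
        if offer >= self.declared_value:
            self.owner = bidder
            self.declared_value = offer
            return True
        return False

house = Asset("Ann", declared_value=100_000)
print(house.annual_tax())        # tax owed on the self-declared value
house.bid("Bob", 90_000)         # below the declaration: no sale
house.bid("Bob", 120_000)        # at or above: the asset changes hands
print(house.owner)
```

The point of the design is that honest declaration is the owner’s best strategy: over-declaring costs tax, under-declaring risks losing the asset.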

The authors want to Radicalise voting as well as ownership. In place of the one person one vote tyranny of the majority, they apply the auction principle to politics as well as markets. Everybody gets an allowance of ‘voice’ which they can allocate according to their political preferences and strength of feeling about the issues. There is a quadratic tapering, so I’d need four voice tokens to vote twice, and so on. They like the idea so much they’ve applied it to opinion polling to elicit more accurate views – and, they write, “We have patented the use of QV and related methods to solicit opinions digitally.”
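The quadratic rule means n votes on one issue cost n² tokens, so each marginal vote is more expensive than the last. A minimal sketch (the budget figure is my own illustration):

```python
# Quadratic voting: casting n votes on a single issue costs n**2 tokens.

def token_cost(votes: int) -> int:
    return votes ** 2

def max_votes(budget: int) -> int:
    # Largest n such that n**2 <= budget.
    n = 0
    while (n + 1) ** 2 <= budget:
        n += 1
    return n

print(token_cost(2))    # four tokens buy two votes, as in the text
print(max_votes(100))   # a 100-token allowance caps any one issue at 10 votes
```

The rising marginal cost is what lets intensity of feeling register without letting any one voter buy the outcome outright.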

After two introductory chapters, the book applies the broad concepts to some specific areas, including the high-profile paper proposing ‘data as labour’: in other words, that we should be paid by digital data harvesters for providing our time and knowledge. Each of the chapters starts with a vignette of what the Radical Market future might look like. They’re all rather dystopian, especially the one in the data as labour chapter. Every interaction between human and digital assistant is monetised. I’m an economist – I like markets – but I don’t want every minute or keystroke to have a dollar sign attached. If a click on a Like button is worth 20 cents, would I start wondering whether it was worth telling my neighbour about the great new coffee shop, because she’s not going to pay me for it? Of course the current situation is unsatisfying, but I’m still unpersuaded by this potential solution.

The most interesting chapter is the one about the concentration of share ownership (in the US) in the hands of a small number of large institutional investors. The book argues persuasively that this diminishes competition. Antitrust concentrates on the corporations, but institutional investors dilute its effectiveness, “by knitting together the interests of the biggest firms that dominate any particular market.” Here there is a policy proposal worth thinking about: restricting institutional investors to holding a maximum of a 1% stake in companies in the same market; or as much as they want of one firm, but then none of its competitors’ shares. It’s a very interesting idea, though I don’t see what it has to do with the Radical Markets conceit.

The book ends by reflecting on markets versus central planning, alluding to the socialist calculation debate. Markets “allocate resources in ways no present computer could match.” Prices are a uniquely efficient summary of information, but markets can be improved – by having them operate continuously. “The market is the appropriate computer to achieve the greatest good for the greatest number,” but its bugs need fixing so there are fairer outcomes. Even better, common ownership makes the market outcome more efficient too.

This does glide over the fact that – should the nirvana of constant online auctions be attained – the state is there in some sense as the owner of all property and redistributor of large tax revenues raised as a sort of rent from everyone for having temporary use of, well, everything. Nor does it touch on assets owned by foreigners, or owned overseas. In fact, the book doesn’t really discuss practicalities at all because it isn’t a real set of proposals. Thomas Piketty’s global wealth tax has more chance of becoming a reality than the permanent revolution of ubiquitous Vickrey auctions.

However, Radical Markets certainly made me think, about property, information, power. Well worth reading.


Planning the unplannable

I just read a working paper by Joe Kane, ‘The Economic Flaws in Computerized Socialism’, which refers to Eden Medina’s excellent 2011 book about Project Cybersyn in Allende’s Chile, Cybernetic Revolutionaries. She recounts the history of this attempt to use computers for central planning, from a futuristic control room, relying on data input in factories around the country. Even before the coup, the project was in some trouble. As Kane notes, there has been a revival of the idea that computers and the internet (and now the blockchain) make central planning feasible. Evgeny Morozov trailed the idea in an article that drew on Medina’s book. So did Paul Mason in his dire book Post-Capitalism. Kane cites a few other examples.

As he points out in the working paper though, the central planning problem is not one of computation or the transfer of information – or rather, that is only the case in the world of neoclassical general equilibrium theory. Francis Spufford’s brilliant book Red Plenty makes use of the formal equivalence of the centrally planned and competitive market economies in that la-la-land with no frictions, fixed preferences and complete markets. Markets are a process of discovery, as John Kay has to repeat over and over again. The data that would be needed for computerized central planning are not ‘out there’, but are created by the market. I wouldn’t go nearly as far as Kane in concluding government policies are therefore useless, but I do agree that “the outcome of the market process is not separable from the process which generates it.”

 


Epidemics vs information cascades

As I was looking at publishers’ websites for my New Year round up of forthcoming books, I noticed OUP billing Paul Geroski’s The Evolution of New Markets as due out in January 2018. This is odd as it was published in 2003, and Paul died in 2005; it isn’t obvious why there’s a reprint now. He was Deputy Chairman and then Chairman of the Competition Commission during my years there, and was a brilliant economist as well as a wonderful person. I learnt an amazing amount from being a member of his inquiry team.

Anyway, the catalogue entry for the reprint sent me back to my copy of the book, along with Fast Second, which Paul co-authored with Constantinos Markides. Fast Second challenges the received wisdom of first mover advantage: Amazon was not the first online books retailer, Charles Schwab not the first online brokerage, and so on. The opportunity lies in between being early in a radical new market and being late because a dominant design and business model have already emerged. The Fast Second business competes for dominance – and supposed first mover advantages are usually fast second advantages.

Paul’s book The Evolution of New Markets – in which I found a handwritten note he’d sent me with it, which made for an emotional moment – does what it says, and explores models of innovation diffusion: in other words, models of S-curves. His view was that the epidemic model of S-curves, which seems to be the standard one, was a misleading metaphor. He argued that information cascades best fit the observed reality. The epidemic model assumes that a new technology is adopted as information about it is diffused. Each user passes on the info to the next user. However, as the book points out, information diffuses far faster than use. Users need to be persuaded rather than just informed.

More appropriate is a model whereby new technologies arrive in a number of variants at slightly different times, making adoption risky and costly – especially when there are network externalities or when people realize there is going to be a standards battle. Most new products fail, after all. But once one variant starts to dominate, the cost and risk decline and adoption will occur much faster.
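The epidemic metaphor is essentially the logistic model: adoption grows in proportion to contacts between adopters and non-adopters, which mechanically produces an S-curve. A minimal sketch of that standard model, with illustrative parameters of my own:

```python
# Epidemic (logistic) diffusion: adopters "infect" non-adopters on contact.

def logistic_diffusion(pop=1000, adopters=1.0, rate=0.3, periods=60):
    path = [adopters]
    for _ in range(periods):
        a = path[-1]
        # New adoption is proportional to meetings between the a adopters
        # and the remaining (pop - a) non-adopters.
        a = a + rate * a * (1 - a / pop)
        path.append(a)
    return path

path = logistic_diffusion()
# Slow start, fast middle, saturation near the population ceiling: an S-curve.
print(path[5] - path[0])     # early growth is small
print(path[30] - path[25])   # mid-curve growth is much larger
```

Geroski’s objection is precisely that this mechanism conflates hearing about a technology with adopting it; the cascade story above – variants, standards battles, risk falling once a dominant design emerges – breaks that link.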

It’s a persuasive argument, and a very readable book. Although the list price is surprisingly high for a short paperback, one can be confident second-hand copies are just as good.
