An impulse buy

I was just passing through the Blackwell’s on campus here when I spotted Lorenz by Captain Jerry Roberts. I’m a sucker for books about the code-breaking efforts at Bletchley Park. I loved the wider and gripping account of intelligence efforts in the UK during the war, Most Secret War by R.V. Jones.

This book is about breaking the Lorenz (rather than the Enigma) code. Captain Roberts (whom my husband met) died in 2014 but the book was only published last year. War is often the crucible of innovation, but we tend to think of material technologies (canned foods for Napoleon’s armies, Teflon in the Cold War/space race). The immaterial technology of codebreaking and computing was surely by far the most significant, though?


Learning about (machine) learning

Last week I trotted off for my first Davos experience with four books in my bag and managed to read only one – no doubt old hands could have warned me what a full-on (and rather weird) experience it is. The one (and that read mainly on the journeys) was The Master Algorithm by Pedro Domingos. I was impressed when I heard him speak last year and have been meaning to read it ever since.

The book is a very useful overview of how machine learning algorithms work, and if you’ve been wondering, I highly recommend it. On the whole it stays non-technical, although with lapses – and I could have done without the lame jokes, no doubt inserted for accessibility. The book also has an argument to make: that there is an ultimate ‘master algorithm’, a sort of Grand Unified Theory of learning. This was a bit of a distraction, especially as there’s an early chapter doing the advocacy before the later chapters explaining what Domingos hopes will eventually be unified.

However, the flaws are minor. I learned a lot about both the history of the field and its recent practice, along with some insights as to how quickly it’s progressing in different domains and therefore what we might expect to be possible soon. Successive chapters set out the currently competing algorithmic approaches (the book identifies five), explain their history within the discipline and how they relate to each other, how they work and what they are used for. There is an early section on the importance of data.

As a by the by, I agree wholeheartedly with this observation: “To make progress, every field of science needs to have data commensurate with the complexity of the phenomena it studies.” This in my view is why macroeconomics is in such a weak state compared to applied microeconomics: the latter has large data sets, and ever more of them, but macro data is sparse. It doesn’t need more theories but more data. Nate Silver made a similar point in his book The Signal and the Noise – he pointed out that weather forecasts improved by gathering much more data, in contrast to macro forecasting.

Another interesting point Domingos makes en passant is how much more energy machines need than do brains: “Your brain uses only as much power as a small lightbulb.” As the bitcoin environmental disaster makes plain, energy consumption may be the Achilles heel of the next phase of the digital revolution.

I don’t know whether or not one day all the algorithmic approaches will be combined into one master algorithm – I couldn’t work out why unification was a better option than horses for courses. But never mind. This is a terrific book to learn about machine learning.


Moral sentiments and economics

Having read it some years ago, I’ve been looking through Economic Analysis, Moral Philosophy and Public Policy by Daniel Hausman and Michael McPherson while revising a paper for publication. It reminded me – if ever I absorbed it properly in the first place – that economists use the word ‘normative’ in a specific sense. It means, to us, the opposite of positive, something involving ethics or value judgements, and therefore outside the scope of what economists aim to do. We concentrate on ‘positive’ questions, following the distinction famously made by Milton Friedman, and Lionel Robbins before him.

However, as Hausman and McPherson argue, while economists claim ethical issues are a sphere apart, they/we often assume rationality – and that itself implies a moral stance. “Like morality, rationality is normative. One ought to be rational. One is wicked if not moral and foolish if not rational.” The book goes on to argue the case that the “moral principles implicit in standard views of rationality are implausible.” As a card-carrying economist, I don’t agree with all they say here, but I do wholeheartedly agree that economics and ‘moral sentiments’ cannot be separated. The impartiality economics aims for in trying to insist on being a ‘positive’ science is laudable, and economics should continue to distinguish the technical and value-laden elements of analysis. But we should not continue to shy away from the ethical debate. Doing so has not served the reputation of economics at all well.



What are top people reading?

It’s always interesting to hear what books people refer to at a conference. At the end of last week I attended one with people in the UK and French business and policy worlds. These were the titles I noted being mentioned:

Life 3.0 by Max Tegmark

WTF by Robert Peston

I, Robot by Isaac Asimov

Homo Deus by Yuval Noah Harari

Les Trente Glorieuses by Jean Fourastié

Devil’s Bargain by Joshua Green

This week I’m going (for the second time – the first was in 2001) to the World Economic Forum in Davos – feeling a bit like one of the entertainers on board a cruise liner, as I’ll be talking about measuring the economy. I’ll report back on the reading materials of the global elite. I’ll miss Donald Trump’s speech on Friday but doubt he’ll be citing many books, so that’s OK.