Economic forecasts, fortune telling and sunspots

An enticing-looking book has arrived in the post. It’s Fortune Tellers: The Story of America’s First Economic Forecasters, by Walter Friedman.

[amazon_image id=”0691159114″ link=”true” target=”_blank” size=”medium” ]Fortune Tellers: The Story of America’s First Economic Forecasters[/amazon_image]

It’s easy to make fun of economic forecasts, which are always wrong. Nate Silver’s The Signal and the Noise has a chapter explaining with care why this is so, without bothering to score the cheap shots many critics resort to. Essentially, he points out that the macroeconomy is a large complex system with many feedbacks, about which we have very little data. Economic forecasting lags well behind weather forecasting in its gathering and use of statistics. David Hendry and Mike Clements have written, for my money, the best book on how to do time series forecasting given our current data and knowledge.

There has been progress since the days when W. Stanley Jevons famously correlated economic activity with sunspots – although the theoretical basis for that link might in fact have grown stronger now that there is so much electronic communication for solar storms to disrupt. Fans of Kondratiev-cycle-type analysis sometimes stretch the insight that applying disruptive technologies can require generational change into an attempt to forecast the cycles themselves.

I’m sympathetic to macro forecasters, as I used to be one myself for a couple of years. It was an eye-opener to me, a relatively freshly minted, idealistic PhD, to realise how much fiddling it takes to make any forecast look even plausible – they all require it – and therefore how strong the herding instinct among forecasters is. It took me another giant stride on my journey from macro to micro. I was working for Data Resources Inc, founded by the eminent US macroeconomist Otto Eckstein in 1969 (now part of Global Insight).

Fortune Tellers stops before the Second World War, however – that is, before modern macro models. It looks great fun.


Undercover, bigtime

I think everybody should read Tim Harford’s new book, The Undercover Economist Strikes Back. This includes: (a) everybody who has no idea what to make of the conflicting arguments about fiscal and monetary policy, whether the austerians or the stimulards are right; (b) everybody who thinks they know exactly what fiscal and monetary policy ought to be; (c) all economics students; and (d) anybody not in the first three categories.

[amazon_image id=”1408704242″ link=”true” target=”_blank” size=”medium” ]The Undercover Economist Strikes Back: How to Run or Ruin an Economy[/amazon_image]

The subtitle is ‘How to Run – or Ruin – an Economy’. The Undercover Economist has decided to tackle macroeconomics. That the book succeeds so well – clear, balanced but not indecisive, readable – is praise indeed coming from someone like me, who thinks macroeconomics is in a pretty sorry state. There is an absolutely terrific introduction about Bill Phillips (of the machine and the curve). The first batch of chapters covers what we mean by macroeconomics, what can cause recessions, what money is, how money and inflation are related, different policy prescriptions for different types of recession, and output gaps and unemployment. Later chapters look at management and productivity and the concept of GNP, demolish the idea that ‘happiness’ can or should replace growth, discuss whether there are physical limits to growth, and look at inequality; the book ends with an agnostic view about the future of macroeconomics.

The book would be worth reading for the first half alone; it is such a public service to explain why macroeconomists are arguing about how policy should respond to the post-crisis recession, and to do it with such clarity that you can be utterly confident the author understands his subject. (I do not get this sense of confidence from a lot of people writing about the economy.)

I would have liked the Undercover Economist to tackle some areas of macroeconomics omitted from the book: financial markets and asset prices; and exchange rates and the balance of payments. Both are important for understanding recent economic history and the financial crisis, so the book is not a complete guide. It would be churlish, too, to point out that a couple of the chapters are really more microeconomics than macro (job matching and efficiency wage models of the labour market, and management). And I personally don’t like the Q & A format of the book, but that’s seemingly a minority view. I still enjoyed reading it. It is definitely one for my list of economics books for beginners.

 


Positive, normative and provocative economics

Last night it was my privilege to give the annual Pro Bono Economics lecture. I’d be delighted to hear people’s comments on it. (It would be even more pleasing if you’d look at the website and consider making a donation to their work.)

Many people in the audience have been enthusiastic, but one macroeconomist has taken great offence at my criticism of macro. I daresay I was too provocative – Dave Ramsden of the Treasury, chairing the evening, diplomatically described it as ‘challenging’ – but it does simply amaze me that so many (but not all) macroeconomists don’t think anything much needs to change in their area. Anyway, views welcome.

In the chat afterwards, somebody recommended to me Rational Economic Man by Martin Hollis and Edward Nell. The blurb says:

“Economics is probably the most subtle, precise and powerful of the social sciences and its theories have deep philosophical import. Yet the dominant alliance between economics and philosophy has long been cheerfully simple. This is the textbook alliance of neo-Classicism and Positivism, so crucial to the defence of orthodox economics against by now familiar objections. This is an unusual book and a deliberately controversial one. The authors cast doubt on assumptions which neo-Classicists often find too obvious to defend or, indeed, to mention. They set out to disturb an influential consensus and to champion an unpopular cause. Although they go deeper into both philosophy and economics than is usual in interdisciplinary works, they start from first principles and the text is provokingly clear. This will be a stimulating book for all economic theorists and philosophers interested in the philosophy of science and social science.”

[amazon_image id=”0521033888″ link=”true” target=”_blank” size=”medium” ]Rational Economic Man[/amazon_image]

I’d have liked the blurb to be a bit more specific about the authors’ doubts, but the book sounds intriguing.

Richard Davies of The Economist (@RD_Economist on Twitter) has recommended Three Methods of Ethics by Marcia Baron et al.

[amazon_image id=”0631194355″ link=”true” target=”_blank” size=”medium” ]Three Methods of Ethics: A Debate (Great Debates in Philosophy)[/amazon_image]

I can see I’m going to have to improve my philosophy to continue in the vein of the Pro Bono lecture.

UPDATE: Paul Kelleher (@kelleher_) recommends Philosophy of Economics: A Contemporary Introduction by Julian Reiss.

[amazon_image id=”041588117X” link=”true” target=”_blank” size=”medium” ]Philosophy of Economics: A Contemporary Introduction (Routledge Contemporary Introductions to Philosophy)[/amazon_image]


The Scarlet Letter for economists

An econometrics paper that can make you laugh? Yes, Ed Leamer, famously the author of a 1983 paper, Let’s Take the Con Out of Econometrics (pdf), has a superb 2010 article in the Journal of Economic Perspectives, Tantalus on the Road to Asymptopia – it’s free access, only moderately technical, and brilliant.

Leamer’s theme in the more recent paper is the same as in the earlier one – the need for a profound culture change in empirical economics:

“Can we economists agree that it is extremely hard work to squeeze truths from our data sets and what we genuinely understand will remain uncomfortably limited? We need words in our methodological vocabulary to express the limits. We need sensitivity analyses to make those limits transparent. Those who think otherwise should be required to wear a scarlet-letter O around their necks, for “overconfidence.””

The point is that the available economic data will always support a range of different theories, and Leamer advocates sensitivity analyses that illustrate the spectrum of parameter values and theories consistent with the observed data. Economists need to go back to 1921, he argues, and read Keynes’s A Treatise on Probability and Frank Knight’s Risk, Uncertainty and Profit. Both books point out that decisions are subject to three-valued logic (yes, no, don’t know), whereas economic theory assumes away the large territory of ‘don’t know’.
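
To make the idea concrete, here is a minimal sketch in Python of the kind of sensitivity analysis Leamer advocates – my own illustration, not a procedure taken from his paper: re-estimate the same regression under every defensible choice of control variables and report the range of the coefficient of interest, rather than a single point estimate. The data and variable names are simulated.

# Leamer-style sensitivity sketch: how much does the estimated effect of x on y
# move as we vary the set of controls? All data are simulated for illustration.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 200
controls = {name: rng.normal(size=n) for name in ["z1", "z2", "z3", "z4"]}
x = rng.normal(size=n) + 0.5 * controls["z1"]              # regressor of interest
y = 1.0 * x + 0.8 * controls["z1"] + rng.normal(size=n)    # true effect of x is 1.0

estimates = []
names = list(controls)
for k in range(len(names) + 1):
    for subset in itertools.combinations(names, k):
        # design matrix: constant, x, and the chosen subset of controls
        X = np.column_stack([np.ones(n), x] + [controls[c] for c in subset])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        estimates.append(beta[1])                          # coefficient on x

print(f"coefficient on x ranges from {min(estimates):.2f} "
      f"to {max(estimates):.2f} across {len(estimates)} specifications")

If the reported range is wide, the honest conclusion is ‘don’t know’ – exactly the verdict Leamer thinks economists are too reluctant to deliver.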

I strongly agree with Leamer’s conclusions:

“Ignorance is a formidable foe, and to have hope of even modest victories, we economists need to use every resource and every weapon we can muster, including thought experiments (theory), and the analysis of data from nonexperiments, accidental experiments, and designed experiments. We should be celebrating the small genuine victories of the economists who use their tools most effectively, and we should dial back our adoration of those who can carry the biggest and brightest and least-understood weapons. We would benefit from some serious humility, and from burning our “Mission Accomplished” banners. It’s never gonna happen.”

He, like me, is profoundly sceptical about macroeconomics: “Our understanding of causal effects in macroeconomics is virtually nil, and will remain so.”

I must go away and read Leamer’s 2009 book, Macroeconomic Patterns and Stories.

[amazon_image id=”364207975X” link=”true” target=”_blank” size=”medium” ]Macroeconomic Patterns and Stories[/amazon_image]

 

 


How not to do economic forecasts

Every so often I come across a book that should be read by: (a) all economists; (b) all students; (c) everybody involved in politics and policy; and (d) everybody else with an interest in the world. Nate Silver’s The Signal and the Noise – which I’ve finally read, shamefully long after publication – is one of those books. It should in fact be read by all macroeconomists who publish forecasts at least annually, as a condition of their continuing membership of the profession. If I were teaching, it would emphatically be a required item on the course reading list.

It is a wonderfully clear and engaging explanation of the challenges of making predictions in fields ranging from politics and elections to weather and earthquakes to economics and poker. Apart from a couple of sections on American sports, which might as well have been written in a foreign language, the examples illustrate how to, and how not to, make forecasts. You’ll be wiser for reading it, not to mention able to put Bayesian inference into practice. Silver makes a compelling case for adopting the Bayesian approach, rather than the standard (‘frequentist’) statistics descended from R. A. Fisher and universally taught to economists in their econometrics courses. The emerging new economics curricula should at least include Bayesian statistics in the modules covering empirical methods. As Silver writes:

“Essentially the frequentist approach toward statistics seeks to wash its hands of the reason that predictions most often go wrong: human error. It views uncertainty as something intrinsic to the experiment rather than something intrinsic to our ability to understand the real world.”

In other words, it is not true that collecting more and more data – although usually useful to a forecaster – will eliminate your uncertainty about the real world. The signal-noise problem is epistemologically unavoidable. What’s more, the frequentist approach involves assumptions about the distribution of the population; we know about the (in-)validity of the normal curve assumption, and anyway, “What ‘sample population’ was the September 11 attack drawn from?”
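
For anyone curious what ‘putting Bayesian inference into practice’ looks like at its very simplest, here is a toy sketch in Python – my own illustration, not an example from Silver’s book – updating a Beta prior about a forecaster’s hit rate as the track record comes in. The prior and the record are invented.

# Toy Bayesian updating with a Beta prior and binomial data: the posterior
# combines prior belief with the observed track record. Numbers are invented.
from scipy import stats

a, b = 2.0, 2.0          # Beta(2, 2) prior: hit rate probably somewhere near 0.5
hits, misses = 3, 7      # observed record: 3 correct calls out of 10

posterior = stats.beta(a + hits, b + misses)
print(f"posterior mean hit rate: {posterior.mean():.2f}")
print(f"90% credible interval: {posterior.ppf(0.05):.2f} to {posterior.ppf(0.95):.2f}")

The point of the exercise is that the answer is a distribution of beliefs that shifts as evidence arrives, not a single significant-or-not verdict.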

The chapter on macroeconomic forecasting is a pretty devastating critique of economists who do that kind of thing. There is a demand for macro forecasts, and I’d rather economists supply them than anybody else. But we shouldn’t pretend they’re going to be accurate. Almost all forecasters, even if they publish standard errors, will give the impression of precision – is growth going to be 0.5% or 0.6%? – but it is spurious precision. Silver calculates that over the period 1993-2010, GDP growth fell outside the 90% confidence intervals of macro forecasts for the US economy a third of the time, and half the time if you look back to 1968.
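
Silver’s calculation is easy to reproduce in spirit: take a run of published 90% forecast intervals, compare them with the outturns, and count how often the outturn actually landed inside. A minimal Python sketch with invented numbers:

# Calibration check for forecast intervals: nominal 90% intervals should
# contain the outturn about 90% of the time. All numbers here are invented.
import numpy as np

lower  = np.array([1.0, 0.5, 2.0, -1.0, 1.5])   # lower ends of 90% intervals
upper  = np.array([3.0, 2.5, 4.0,  1.0, 3.5])   # upper ends
actual = np.array([2.1, 3.2, 1.5,  0.2, 4.8])   # realised GDP growth

inside = (actual >= lower) & (actual <= upper)
print(f"empirical coverage: {inside.mean():.0%} (nominal: 90%)")

Done on the real forecast record, this is exactly the kind of discipline that makes the overconfidence visible.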

Macroeconomic data are very noisy, especially early estimates of GDP: in the US the margin of error on the initial quarterly estimate of GDP is plus or minus 4.3%. The initial estimate for the final quarter of 2008 was a decline of 3.8% – later revised to minus 9 per cent. Silver makes the comparison between economic forecasts and weather forecasts, similarly difficult problems. However, weather forecasting has improved over the decades, thanks to a better understanding of the causal links and a greater degree of disaggregation of data, made possible by more powerful computers. Economists have neither the improved understanding – on the contrary, important causal links, notably finance, were ignored until recently – nor, seemingly, the appetite for better data (as I’ve pointed out before).

The book also makes the point that others have emphasised: that the economy is a complex non-linear system, so there is a lot of unavoidable uncertainty about forecasts more than a short period ahead. It also notes that although we know about the Lucas Critique and Goodhart's Law – both pointing out that policy affects behaviour – economic forecasters typically ignore them in practice. Silver also underlines the rarely-resisted temptation to overfit the data – and microeconomists are just as guilty as macroeconomists here. The temptation is strong because an over-fitted model will seem to 'explain' more than a 'true' model when the data are noisy, so the usual tests for goodness of fit will look better. Critics have been pointing out the siren allure of 'statistical significance' for ages – it has almost nothing to do with economic meaning – and perhaps The Signal and the Noise will help broadcast the message further.
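
The overfitting point is easy to demonstrate for yourself. The Python sketch below (simulated data, nothing to do with Silver’s own examples) fits a straight line and a ninth-degree polynomial to the same noisy sample generated from a linear relationship; the flexible model fits the sample almost perfectly and predicts fresh data far worse.

# Overfitting: a flexible model 'explains' the sample better but predicts worse.
# The data are simulated purely for illustration.
import numpy as np

rng = np.random.default_rng(42)
x_train = np.linspace(-1, 1, 10)
x_test = rng.uniform(-1, 1, 200)

def true_line(x):
    return 1.0 + 2.0 * x                          # the 'true' relationship is linear

y_train = true_line(x_train) + rng.normal(0, 0.5, x_train.size)
y_test = true_line(x_test) + rng.normal(0, 0.5, x_test.size)

for degree in (1, 9):                             # degree 9 interpolates all 10 points
    coefs = np.polyfit(x_train, y_train, degree)
    mse_in = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    mse_out = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    print(f"degree {degree}: in-sample MSE {mse_in:.3g}, out-of-sample MSE {mse_out:.3g}")

The usual in-sample measures of fit reward the wiggly model; only out-of-sample testing, or a healthy suspicion of fits that look too good, catches the problem.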

Finally, I learned a lot from the book. The chapter on how to approach the question of CO2 emissions and climate change is a model of clear thinking. My favourite new fact: members of Congress – with access to lots of company information via lobbyists and the ability to influence companies’ fortunes by legislation – see a profit on their investments that beats the market averages by 5 to 10 per cent a year, “a remarkable rate that would make even Bernie Madoff blush,” as Silver observes.

Anyway, if you haven’t yet read this, go and do so now. The new UK paperback also has a wonderful cover image!

[amazon_image id=”B0097JYVAU” link=”true” target=”_blank” size=”medium” ]The Signal and the Noise: The Art and Science of Prediction[/amazon_image]

Update: Dan Davies (@dsquareddigest) has gently rebuked me for the paragraph about Bayesian versus frequentist statistics. Via Twitter, he said: “Silver has a really annoying misinterpretation of Bayesian vs frequentist which is now becoming commonplace… the paragraph you quote is really confused – NS is a good practical statistician but all over the place on theory & methodology. The (otherwise excellent) book gains nothing from taking a very strong position in someone else’s philosophical debate.” Needless to say, I know less than Dan about this debate. This doesn’t change my mind that econ students should be taught the Bayesian approach too, nor the conclusion that the book clearly explains how to do it in practice.
