In the past few days I have read two brilliant and fascinating articles, pointing to opposite conclusions. One argues that the complexity of the financial domain is so great that effective regulation can only be achieved through heuristics, or rules of thumb. The other argues, equally persuasively, that the potential of ‘big data’ is now so promising that we will no longer need to map the complexity of the macroeconomy using simple aggregates and averages, but will instead be able to work with the actual fine-grained data. My instinct tells me both are correct, but I’m still thinking through how they might be reconciled.
The first is a speech given by the Bank of England’s Andrew Haldane at Jackson Hole, The Dog and the Frisbee. He notes that neither humans nor dogs (who are even better at it) actually solve an optimal control problem when catching a frisbee. They follow a simple rule of thumb: run at a speed that keeps the angle of gaze to the frisbee roughly constant. Modern finance theory, and consequently financial regulation, has developed models of decision making under risk, but in fact the world features uncertainty and increasing complexity. The strong assumptions about the state of knowledge made in conventional models do not hold.
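To see how little machinery the rule of thumb needs, here is a toy simulation of it. Everything in it is my own illustrative assumption, not anything from Haldane’s speech: the frisbee glides down at constant horizontal and vertical speed, the chase is one-dimensional, and the runner picks, at each instant, the exact speed that freezes the gaze angle (a real dog, of course, only senses the angle and adjusts by feel).

```python
def catch_with_gaze_heuristic(dt=0.01):
    """Toy 1-D illustration of the frisbee rule of thumb: run at whatever
    speed keeps the gaze angle to the frisbee constant.

    Hypothetical setup (not from the speech): the frisbee glides with
    constant velocity, and the runner computes the angle-freezing speed
    exactly at every timestep."""
    fx, fz = 0.0, 8.0        # frisbee position (m): height 8 m
    fvx, fvz = 5.0, -4.0     # frisbee velocity (m/s): steady glide downward
    rx = 25.0                # runner position (m), frisbee flying toward them

    while fz > 0.1:          # stop just before touchdown
        d = rx - fx          # horizontal gap between runner and frisbee
        # Gaze angle theta = atan2(fz, d).  Setting d(theta)/dt = 0 and
        # solving for the runner's speed gives rv = fvx + fvz * d / fz.
        rv = fvx + fvz * d / fz
        rx += rv * dt        # runner moves at the angle-freezing speed
        fx += fvx * dt       # frisbee drifts forward...
        fz += fvz * dt       # ...and sinks at constant rate

    return abs(rx - fx)      # miss distance at (near) touchdown

miss = catch_with_gaze_heuristic()
print(f"miss distance: {miss:.2f} m")  # runner ends up almost under the frisbee
```

The point of the sketch is that the runner never forecasts the landing spot: a single observable (the gaze angle) and a single response (speed) are enough to produce an interception, which is the sense in which the heuristic substitutes for the optimal control problem.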
The speech concludes:
“Modern finance is complex, perhaps too complex. Regulation of modern finance is complex, almost certainly too complex. That configuration spells trouble. As you do not fight fire with fire, you do not fight complexity with complexity. Because complexity generates uncertainty, not risk, it requires a regulatory response grounded in simplicity, not complexity. Delivering that would require an about-turn from the regulatory community from the path followed for the better part of the past 50 years. If a once-in-a-lifetime crisis is not able to deliver that change, it is not clear what will. To ask today’s regulators to save us from tomorrow’s crisis using yesterday’s toolbox is to ask a border collie to catch a frisbee by first applying Newton’s Law of Gravity.”
This seems to me to be obviously true, even if it ruffles feathers in the financial regulatory community.
The second article is a conversation with MIT Media Lab’s Cesar Hidalgo on The Edge, What is Value, What is Money? This covers a lot of territory, but part of it concerns how we understand complexity in the aggregate. Hidalgo says:
“In the past when we looked at macro scales, at least when it comes to many social phenomena, we aggregated everything. Our idea of macro is, by an accident of history, a synonym of aggregate, a mass in which everything is added up and in which individuality is lost. What data at high spatial resolution, temporal resolution and typological resolution is allowing us to do, is to see the big picture without losing the individuality inside it. I believe that in the future, macro is going to be something that is going to be in high-definition. You’re going to be able to zoom in into these macro pictures and see that neighborhood, and see that person, and understand that individual, and to have more personalized interactions thanks to the data that is becoming available. I think that in some sense, big data can help recover the humanity of a world in which the scientific representations of people have become dehumanized, because of our need to simplify.”
Well, this is an exciting prospect, and obviously potentially feasible in the Big Data world. But I’m not yet sure how it sits with the ‘Keep it Simple, Stupid’ moral of the Haldane paper. Or indeed with my strong instinct that public policy interventions are most effective when they are of the kind described by Thomas Schelling in his brilliant book Micromotives and Macrobehaviour: like traffic lights, a clear and simple rule that people have strong incentives to obey.