I recently attended an interesting talk by Andrew Lo of the MIT Sloan School. He does research on how human behavior factors into economics and on the limitations of the classical assumption that people are purely rational actors.
He gave a good introduction to the “how” of the recent financial crisis. The key point is that the models all assumed that the risks in the debts being securitized weren’t correlated. That was true historically, but it stops being true when a housing bubble collapses. Everyone knew that we were in a housing bubble, but nobody went back to check what that fact did to the assumptions in their models. He used the classic basketball video (the selective-attention experiment) to explain why that was neglected: once finance professionals had built this complex system on top of these assumptions, they started thinking about the system at that higher level and stopped paying attention to the underlying assumptions. This is human nature.
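To make the correlation point concrete, here’s a toy Monte Carlo sketch (all numbers are invented for illustration, not from the talk). It compares a pool of loans whose defaults are independent against a pool exposed to a rare common shock, like a bubble bursting, that raises every loan’s default probability at once. The tail risk of the pool looks completely different even though each individual loan is unchanged.

```python
import random

random.seed(0)

def pool_loss_prob(n_loans=100, p_default=0.05, threshold=0.15,
                   p_systemic=0.0, p_stressed=0.4, trials=5000):
    """Estimate the chance that defaults in the pool exceed `threshold`.

    With probability `p_systemic`, a systemic event (e.g. a housing
    bubble bursting) raises every loan's default probability to
    `p_stressed` simultaneously -- the correlation the models missed.
    """
    breaches = 0
    for _ in range(trials):
        p = p_stressed if random.random() < p_systemic else p_default
        defaults = sum(random.random() < p for _ in range(n_loans))
        if defaults / n_loans > threshold:
            breaches += 1
    return breaches / trials

independent = pool_loss_prob(p_systemic=0.0)    # historical assumption
correlated = pool_loss_prob(p_systemic=0.05)    # 5% chance of a common shock
print(independent, correlated)
```

With independent defaults, losing more than 15% of a 100-loan pool (mean defaults: 5) is astronomically unlikely, so a senior tranche looks nearly riskless. Add even a small chance of a common shock and the breach probability jumps to roughly the probability of the shock itself.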
These details are interesting, but not terribly important. His argument (based on Charles Perrow’s work on “normal accidents”) is that any complex, tightly coupled system is going to have failures. In this case, the financial system was both complex and tightly coupled for a few reasons. First, banks used high leverage ratios to maximize their profits, which amplified any unexpected losses. Second, a small number of very large companies dominated the market. For example, many banks thought that they were safe from correlation risk because they had insurance on their positions. But they all had insurance from the same company: AIG.
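The leverage point is worth a quick worked example (the 30:1 ratio here is illustrative, not a figure from the talk). Leverage is assets divided by equity, so a small percentage move in the whole asset base comes straight out of a thin equity slice:

```python
def equity_after_shock(assets, leverage, asset_return):
    """Equity remaining after the asset base moves by `asset_return`.

    leverage = assets / equity, so the starting equity is a thin
    slice of the balance sheet; a loss on the full asset base is
    absorbed entirely by that slice.
    """
    equity = assets / leverage
    return equity + assets * asset_return

# At 30:1 leverage, a 4% drop in asset values more than wipes out
# equity (the result is negative); at 10:1, the same drop is survivable.
print(equity_after_shock(assets=30.0, leverage=30, asset_return=-0.04))
print(equity_after_shock(assets=30.0, leverage=10, asset_return=-0.04))
```

This is why leverage turns a surprising-but-modest asset repricing into insolvency: the same shock that dents a lightly leveraged balance sheet destroys a highly leveraged one.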
Even if someone at one of these banks realized that there was going to be a problem when the bubble burst, it’s not clear that they could have done anything about it. He gave examples of people recognizing the warning signs in 2005. Then he pointed out that if you had shorted real estate in 2005, you would have lost a fortune that year, and in 2006, and in 2007, … You would have been out of business by the time the crash actually happened. Even if you know the current behavior can’t continue forever, you can’t predict when it will stop accurately enough to act on it.
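The timing problem can be sketched with a toy solvency model (all numbers invented): a short position bleeds money every year the bubble keeps inflating, and the question is whether your capital lasts until the crash finally arrives.

```python
def survives_until_crash(capital, yearly_loss, crash_year):
    """Does a short seller's capital last until year `crash_year`?

    Each year before the crash, the bubble keeps inflating and the
    short position loses `yearly_loss`. Run out of capital and you
    are forced out before being proven right.
    """
    for year in range(1, crash_year):
        capital -= yearly_loss
        if capital <= 0:
            return False
    return True

# Right about the bubble, wrong about the timing: bust before the crash.
print(survives_until_crash(capital=100, yearly_loss=40, crash_year=4))
# With double the capital, the same bet survives to collect.
print(survives_until_crash(capital=200, yearly_loss=40, crash_year=4))
```

Being right eventually isn’t enough; you also have to stay solvent until “eventually” arrives, and nothing in the data tells you how long that will be.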
Professor Lo argues that we have to design complex systems around the assumption that failures like this will occur, and that we should simply try to localize the resulting damage. He compared this to planning for hurricanes and earthquakes. His recommendations included things like:
- Break up too-big-to-fail institutions.
- Create exchanges for exotic financial instruments to provide some transparency.
- Create a “financial NTSB” to do post-mortems on crashes.
- Impose leverage constraints on banks.
He also argued against the idea that these complex financial instruments should be banned. Instead, he thinks we need to do a better job of educating people in business and the government. He proposed having the federal government fund financial research the way that it currently funds medical and scientific research. His analogy for this was a story about his son’s 8th birthday party. He asked his son what gifts he wanted to get for the guests. His son said “chainsaws!” Giving chainsaws to a group of 8-year-olds is probably a bad idea. That doesn’t mean chainsaws should be banned, though. They’re exactly what you want if you’re clearing brush. He thinks that technologies like securitization aren’t yet idiot-proof, but that they will be at some point. We can’t just ban them in the meantime.
Professor Lo’s talk was very interesting. One thing that was kind of scary, though, was that while he thinks we will get improved financial regulations, he thinks we’ll get them only after the next crash. He compared it to Glass-Steagall, which was passed in 1933, not right after the initial crash in 1929. He thinks the recovery we’re currently in has reduced the pressure for reform, but he doesn’t think the crisis is over yet. He expects another crash to happen. Perhaps because of commercial real estate; perhaps because of Spain & Greece. So hang on…