Financial Crisis and the Role of Risk-Management Software

Greg Goth

The global financial services industry, perhaps more than any other, illustrates the overwhelming power and egalitarian nature of modern telecommunications. Anyone with an Internet connection and a cash account can buy and sell any number of financial instruments online. They can transfer assets from bank to bank in the blink of an eye. In an ad for a large US-based online trading firm, a customer extols his ability to sit in his home at midnight and trade shares on the Hang Seng exchange in Hong Kong.

"Hong Kong. That's China," he says with apparent wonder.

These stunning global connections, available to professionals and consumers alike, have arisen only in the past decade—one of relatively benign global market movement. Over that period, both sophisticated traders and less adventurous "buy and hold" investors had come to trust not only the communications technology that enabled their participation in the worldwide economy but also, implicitly, the legitimacy of the data going into the network to assist their decisions.

That assumption, of course, has turned out to be wrong—tragically wrong for some investors and workers who have lost savings and livelihoods. In the aftermath of the global market meltdown and ensuing economic crisis, leading risk-management technologists are saying it's time for the engineering community to reassess the shortcomings of old risk models based on rigid math and physical sciences assumptions.

A public history of a model that failed

Despite shortcomings, existing technologies might have helped avoid the current market disaster if they'd been more widely used. Whether the failure to use these tools was a legitimate but shortsighted business practice or malfeasance is still an open question, but also now a public one.

The US Congress held a series of public hearings on the market collapse's origins after authorizing a US$700 billion relief package for banks. At one of those hearings, held 22 October, executives from the major credit ratings agencies testified to their companies' roles in the crisis.

Frank Raiter was the managing director of Standard & Poor's Residential Mortgage Backed Securities Ratings from March 1995 to April 2005. During that period, Raiter testified, total US residential mortgage production grew from $639 billion to $3.3 trillion. Subprime production grew more than 20-fold, from $35 billion to $807 billion. Commensurate with the explosion in mortgage activity came an increase in the data supporting the ratings of these mortgages.

In 1995, Raiter said, S&P decided to switch from a rules-based risk model for these mortgages to a more sophisticated statistics-based model. The first version of the new model analyzed 500,000 loans and five years of performance data. By 1999, S&P had updated the model to include data on approximately 900,000 loans and six to eight years' performance history.
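The statistics-based approach Raiter describes typically estimates each loan's default probability from its characteristics, where a rules-based model applies fixed cutoffs. A minimal sketch of the idea, with coefficients and variables invented purely for illustration (a real model such as S&P's would be fit to millions of loan records):

```python
import math

# Hypothetical coefficients for illustration only; a production model
# would estimate these from historical loan performance data.
INTERCEPT = -4.0
COEF_LTV = 3.0        # loan-to-value ratio (0..1): higher LTV -> more risk
COEF_FICO = -0.004    # credit score: higher score -> less risk
COEF_SUBPRIME = 1.2   # indicator variable for a subprime product

def default_probability(ltv, fico, subprime):
    """Logistic (statistics-based) estimate of default probability,
    in contrast to a rules-based 'if score < X then reject' check."""
    z = INTERCEPT + COEF_LTV * ltv + COEF_FICO * fico + COEF_SUBPRIME * subprime
    return 1.0 / (1.0 + math.exp(-z))

prime = default_probability(ltv=0.70, fico=760, subprime=0)
risky = default_probability(ltv=0.95, fico=580, subprime=1)
print(f"prime: {prime:.3f}, subprime: {risky:.3f}")
```

The point of such a model is that risk varies continuously with the inputs, so its quality depends directly on how much performance history backs the fitted coefficients — which is why the growing loan datasets Raiter cites mattered.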

"Things began to change in 2001 as the housing market took off," Raiter testified. "A new version of the model was developed using approximately 2.5 million loans with significant performance information. This model was by far the best yet developed, but it was not implemented due to budgetary constraints. Extraordinarily large volumes of transactions requiring ratings put a strain on the analytical staff resources, and requests for more staffing were generally not granted."

Although the new model wasn't used, Raiter said the model-development team continued to collect data. In late 2003 or early 2004, they had a fourth version of the model based on approximately 9.5 million loans, covering the full spectrum of new mortgage products, particularly in the Alt-A (that is, between prime and subprime customers) and fixed/floating payment categories.

"To my knowledge, that model has yet to be implemented," he said.

Risk models need art as well as science

Watching the utter failure of existing risk-management tools to head off the crisis, for whatever reason, financial-sector technologists are now asking questions that might change the way they design and deploy such tools in the future. Specifically, what responsibilities do technologists have to ensure that their creations will be used legitimately and correctly? And what opportunities now exist to create new tools that could perhaps help business-side managers avert such a meltdown in the future?

"These are very vital, important questions, and they should not be asked today," says Kabir Dutta, a principal with the CRA International consulting firm and a former senior economist at the US Federal Reserve Bank's Boston office. "They should have been asked before the event. Unfortunately, the first problem is, people never ask that question when things are working fine. It's the old 'don't fix it if it's not broken' attitude, but we do not know if it isn't broken if we can't test the environment."

Burton Group executive strategist Jack Santos says the technology that enabled worldwide instantaneous trading also contained a downside just now being realized.

"We would not be where we are if it were not for the technology we put in front of these guys," Santos says, "technology like powerful desktop computers where analysts can flip a switch and pivot tables like there's no tomorrow and come to conclusions where nobody knows what they're talking about. And that has had certain effects; in some cases, management structures that see that as a kind of black art and don't ask very basic questions. Fundamentally, it all comes down to human behavior and lack of oversight."

Andrew Aziz, Toronto-based executive vice president of risk solutions for risk technology vendor Algorithmics, says it's time for the sector's computing experts to welcome other disciplines into the development process.

"Risk management is a blend of art and a science, and in the last few years the science overtook the art," Aziz says. "Those tools are powerful, but people will have to assess when they are appropriate and when they're not, and when to overlay those tools with experience, with history and psychology." Aziz sees the brilliance in the ways people have been able to apply Brownian motion and the tools of the physical sciences to finance. "But, fundamentally," he says, "finance is economics; it is a behavioral science, and not in all conditions is it well behaved."

In fact, Burton Group's Santos cites behavioral economists such as Dan Ariely in suggesting that the most basic assumption of traditional economics—on which math- and physics-based risk models have been built—is faulty.

"Traditional economics assumes everybody behaves rationally," Santos says. "There's actually very irrational behavior going on all the time, but you can count on it and predict it and design for it. Unfortunately, we don't design our software to take that into account."

CRA's Dutta warns against assuming math-based models correctly include even basic economics. Their developers might be brilliant mathematicians but typically have "very little, and in most cases, absolutely no training in economics," he says. "They do not understand the dynamics of human behavior and the dynamics of economic decision making."

The idea that new risk models will have to be more multidisciplinary is becoming conventional wisdom. However, Coral8 chief technology officer Mark Tsimelzon says engineering new tools won't be a simple matter of adding some historical precedents and behavioral probabilities into a code base.

Coral8 develops complex-event-processing (CEP) software, which capitalizes on the millions of data points flowing through any distributed enterprise network. This data can assist managers in making decisions based as closely as possible on real-time circumstances. Tsimelzon says adding the dynamics of historical market dislocations, such as the great panics of 1907, 1929, and 1987, might be useful in enriching CEP products, but no one should rely on it without reservations.
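One core CEP pattern is a sliding-window query over an event stream: keep only recent events and fire an alert when they match a condition. A minimal sketch, where the class name, window length, and threshold are illustrative assumptions rather than Coral8's actual API (real engines compile declarative queries over many such streams at once):

```python
from collections import deque

class SlidingWindowAlert:
    """Sketch of one CEP pattern: flag a burst of sell orders
    arriving within a short time window."""
    def __init__(self, window_seconds=5.0, threshold=3):
        self.window = window_seconds
        self.threshold = threshold
        self.events = deque()  # (timestamp, symbol) of recent sell orders

    def on_sell(self, timestamp, symbol):
        self.events.append((timestamp, symbol))
        # Evict events that have fallen out of the time window.
        while self.events and timestamp - self.events[0][0] > self.window:
            self.events.popleft()
        return len(self.events) >= self.threshold  # alert condition

alert = SlidingWindowAlert()
fired = [alert.on_sell(t, "XYZ") for t in (0.0, 1.0, 2.0, 9.0)]
print(fired)  # the third rapid sell trips the alert; the late one does not
```

The value for a risk manager is latency: the condition is evaluated as each event arrives, rather than in an overnight batch.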

"The danger is that if you build something simplistic, you'll be completely wrong, because markets are so different now than in 1987, forget about 1929," he says. "There are orders of magnitude more in just the sheer number of orders, propagation routes, and leverage factors. You can try analogies, but you'd better not say, 'We saw x happen in 1987, and therefore, we'll see x again.'"

Time for a big talk

Algorithmics' Aziz says it's time for an all-encompassing discussion about where risk-management models fit in the global networks.

"It's not just the technologists who have to participate," he says. "It's the academics, the industry, and the regulators. All these people's roles will need to evolve in a way so the technology provides the tools and allows decisions to be as timely as possible, and is open enough so you can apply different assumptions. The days of the black box physical science model, without questioning the underlying assumptions, without looking at other measures such as stress testing, are gone."
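The stress testing Aziz mentions is conceptually simple: revalue a portfolio under named shock scenarios instead of trusting a single model's assumed distribution. A minimal sketch, with positions, asset classes, and shock sizes invented for illustration:

```python
# Hypothetical portfolio: asset class -> current market value in dollars.
portfolio = {"equities": 1_000_000, "mortgage_bonds": 500_000}

# Fractional price shock applied to each asset class per named scenario.
scenarios = {
    "1987-style equity crash": {"equities": -0.22, "mortgage_bonds": -0.02},
    "housing collapse":        {"equities": -0.10, "mortgage_bonds": -0.40},
}

def stressed_loss(portfolio, shocks):
    """Profit/loss under one scenario: sum of value times shock."""
    return sum(portfolio[asset] * shocks.get(asset, 0.0)
               for asset in portfolio)

for name, shocks in scenarios.items():
    print(f"{name}: {stressed_loss(portfolio, shocks):,.0f}")
```

Unlike a black-box probability model, the assumptions here are explicit and open to challenge — which is exactly the openness Aziz argues for.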

Burton Group's Santos says that reengineering the risk-management model must be comprehensive. It needs to include elements that range from basic risk assumptions to user interfaces intuitive enough for a wide range of users. However, he also cautions against discarding some of the principles behind the financial instruments that led to the crisis.

"It appears that in some ways we're overreacting to this," he says. "The innovation that's taken place the last 10 years, such as collateralizing debt, has been pretty good, and it's something that we probably should continue to pursue and work on. But because it's at the center of this whole crisis, people are saying it's a bad thing to do. That's the same thing as saying because the stock market dropped 30 percent, we should outlaw mutual funds. Mutual funds with equity instruments are very similar, in that you're repackaging a lot of individual stocks into a packaged format."


Whatever disciplines get involved in discussing and developing new models, Dutta says it's time for anybody with expertise and concern to speak up.

"Now that people and the government are paying attention, those who have a critical view of what has been happening now have the opportunity to voice their concern. A couple of years ago, if they had questioned those models, they would have been told they didn't know what they were talking about."

Cite this article:

Greg Goth, "Financial Crisis and the Role of Risk-Management Software," IEEE Distributed Systems Online, vol. 9, no. 11, art. no. 0811-oy001.
