The Supercomputer Solution

Five years after the financial crisis, regulators are still struggling to keep pace with market structure changes and technological advances. The answer could reside with a group of computer scientists and mathematicians at a laboratory in Northern California.

Postmortems on the terrorist attacks of 2001 and the financial crisis of a few years later yielded parallel insights. Both disasters were connect-the-dots failures.

Data was available to detect the 9/11 plot, but it was scattered among intelligence agencies that were siloed and lacked mechanisms to share it in a timely fashion. A 2004 law put a director of national intelligence in charge of 16 agencies, essentially to coordinate information-sharing.

Likewise, evidence of impending economic doom was mounting before 2008, but no single overseer was capable of sounding an alarm. That deficiency led to the formation, mandated by the Dodd-Frank Wall Street Reform and Consumer Protection Act, of the Financial Stability Oversight Council, a panel of top regulators led by the secretary of the Treasury and charged with identifying systemic risks and preventing contagion. For data to support its analyses and judgments, the FSOC relies on another Dodd-Frank creation, the Office of Financial Research.

OFR director and former Wall Street economist Richard Berner said in a recent speech that to overcome past failures, “the analytical tool kit needs improvement to assess these fundamental sources of vulnerability and the instability in the financial system. It needs to be more forward-looking and to test the resilience of the financial system to a wide range of events and incentives.”

This is a problem of big data and predictive analytics, and not the only one in postcrisis financial regulation. Securities regulators are grappling with high-frequency markets; the specter of the May 6, 2010, flash crash; and the implications of several lesser but equally troubling operational breakdowns. One of the Securities and Exchange Commission’s pursuits, watched with interest by regulators and exchanges around the world, is the development of a consolidated audit trail (CAT) that would allow monitoring of all market activity, timely detection of anomalies and predictive analyses for supervision and enforcement.
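To make the concept concrete, here is a minimal sketch, in Python, of the kind of consolidated record-and-flag pipeline a CAT implies: events from every venue feed a single monitor that watches for abrupt price moves. The `OrderEvent` fields, the 60-second window and the 5 percent threshold are assumptions chosen for illustration, not the SEC’s actual CAT specification.

```python
# Illustrative only: field names, thresholds and structures are assumptions,
# not the SEC's actual consolidated audit trail (CAT) specification.
from dataclasses import dataclass
from collections import defaultdict, deque

@dataclass
class OrderEvent:
    timestamp: float   # seconds since midnight
    symbol: str
    venue: str         # exchange or trading-venue identifier
    price: float
    size: int

class AnomalyMonitor:
    """Flag symbols whose price moves more than `pct` within `window` seconds,
    using events consolidated across all reporting venues."""
    def __init__(self, window: float = 60.0, pct: float = 0.05):
        self.window = window
        self.pct = pct
        self.history = defaultdict(deque)  # symbol -> recent (timestamp, price)

    def ingest(self, ev: OrderEvent):
        hist = self.history[ev.symbol]
        hist.append((ev.timestamp, ev.price))
        # Drop events that have aged out of the rolling window.
        while hist and ev.timestamp - hist[0][0] > self.window:
            hist.popleft()
        prices = [p for _, p in hist]
        lo, hi = min(prices), max(prices)
        if lo > 0 and (hi - lo) / lo > self.pct:
            return f"ALERT {ev.symbol}: {100 * (hi - lo) / lo:.1f}% move in {self.window:.0f}s"
        return None

monitor = AnomalyMonitor()
for event in [OrderEvent(34200.0, "XYZ", "NYSE", 50.00, 100),
              OrderEvent(34210.5, "XYZ", "BATS", 46.75, 200)]:
    alert = monitor.ingest(event)
    if alert:
        print(alert)
```

A production system would have to normalize orders, cancellations and executions from dozens of venues with synchronized clocks; the toy version above only hints at that burden.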

Can a governmental body keep pace with, let alone get ahead of, the market structure changes and technological advances that created this bind? Skepticism abounds. No one doubts the competencies of people like Berner and Gregg Berman, the former hedge fund manager and risk management technologist who is associate director of the SEC’s Office of Analytics and Research and an influential voice on the CAT and other high-tech initiatives. Rather, it’s a question of resources: budgets for people, training and technologies that are inherently constrained for government entities, in contrast to those of the banks and hedge funds whose activities and innovations they are trying to track.

Technology experts point out that the CAT, although simple in concept, must account for extreme levels of complexity in markets’ and market participants’ operations. Years of system development and standardization will be required to achieve full compliance.

“At the institutional level everything will need to be recordable and retrievable,” says Brian Sentance, CEO of Xenomorph Software, a London-based provider of market analytics and data management systems. “Regulators, lacking budgets but expected to make sense of it all, will be very challenged.”

A group of computer scientists and mathematicians at Lawrence Berkeley National Laboratory, on the University of California campus, say they have a solution. Their supercomputer center routinely processes petabytes (quadrillions of bytes) of data for scientific research purposes. Relative to data runs in, say, high-energy physics, market volumes are “almost trivial,” says Berkeley Lab senior scientist David Bailey.
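A rough, hypothetical calculation gives a feel for the scale gap Bailey describes. The message counts and record sizes below are round numbers assumed for the sake of the arithmetic, not published market or laboratory statistics.

```python
# Back-of-envelope comparison; all figures are assumed round numbers
# chosen for illustration, not published market statistics.
messages_per_day = 20_000_000_000   # hypothetical: tens of billions of quote/trade messages
bytes_per_message = 100             # hypothetical: ~100 bytes per normalized record

market_data_per_day = messages_per_day * bytes_per_message   # bytes
petabyte = 10**15

print(f"Hypothetical daily market data: {market_data_per_day / 1e12:.0f} TB "
      f"({market_data_per_day / petabyte:.3f} PB)")
# => a few terabytes per day, a small fraction of the petabyte-scale
#    data sets a national-lab supercomputer center handles routinely.
```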

What’s more, the Californians have done some work on the financial markets problem. In 2010, David Leinweber, who formerly ran $6 billion of quantitative equity investments at boutique manager First Quadrant, co-founded the Center for Innovative Financial Technology with Berkeley Lab deputy director Horst Simon. Its profile has been low because, alas, the lab is part of the U.S. Department of Energy and dependent on research grants. Leinweber has brought some attention to CIFT through his congressional testimony and blogging. And, on what Bailey, a computational mathematician who is CIFT’s director, terms “a shoestring,” the unit has produced several papers. One is a manifesto on its proposed “bridge between the computational science and financial markets communities,” with some intriguing findings on the predictability of flash crashes.
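For a sense of what such an early-warning analysis can look like, the sketch below computes a toy order-flow imbalance measure over equal-volume buckets, broadly in the spirit of the volume-synchronized indicators discussed in the flash-crash literature. The bucket size, alert threshold and buy/sell labels are illustrative assumptions; the code is not drawn from CIFT’s papers.

```python
# Toy order-flow imbalance indicator over equal-volume buckets.
# Bucket size, threshold and the buy/sell labels are illustrative assumptions;
# this is not the method published by CIFT or any specific research group.
from collections import deque

BUCKET_VOLUME = 1_000      # shares per volume bucket (assumed)
N_BUCKETS = 50             # rolling window of buckets (assumed)
ALERT_LEVEL = 0.6          # average imbalance that triggers a warning (assumed)

def imbalance_signal(trades):
    """trades: iterable of (size, side) with side in {'B', 'S'}.
    Yields the rolling average imbalance each time a volume bucket fills."""
    buckets = deque(maxlen=N_BUCKETS)
    buy = sell = 0
    for size, side in trades:
        if side == 'B':
            buy += size
        else:
            sell += size
        if buy + sell >= BUCKET_VOLUME:
            buckets.append(abs(buy - sell) / (buy + sell))
            buy = sell = 0
            avg = sum(buckets) / len(buckets)
            yield avg, avg > ALERT_LEVEL

# Example: a one-sided burst of selling pushes the indicator toward 1.0.
sample = [(200, 'S')] * 30 + [(200, 'B'), (200, 'S')] * 10
for level, warning in imbalance_signal(sample):
    print(f"imbalance={level:.2f} warning={warning}")
```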

“We can help ultimately to see what is happening in markets in real time,” asserts Bailey.

It will take some connecting of dots — and budgetary generosity — to build that bridge.

Jeffrey Kutler is editor-in-chief of Risk Professional magazine, published by the Global Association of Risk Professionals.
