Past financial crises have been characterized by panics, runs, and restrictions on the availability of credit, and our crisis prevention measures (including capital regulation, deposit insurance, and the lender of last resort) have been adopted in light of this historical experience. These forms of financial stability regulation pay limited attention to operational risks, however, and as the technologies by which financial services are delivered become increasingly complex, operational risks may become a transmission mechanism as well as a source of financial distress. In the future, financial crises may look more like a rolling blackout than a bank run.
Take the processing of retail payments, for example. Increased attention is being devoted to the possibility that a cyberattack could incapacitate payment processing, but not enough attention is being paid to the fact that the same outcomes could arise from interacting technological glitches, even in the absence of nefarious actors. Imagine, for example, that a payments provider succumbs to a software bug that prevents it from transmitting payment instructions from payer to payee. That provider might try to route its customers’ payment orders to a second payments provider while the bug is being fixed, but the second provider may be suffering from the same problem at the same time (financial services providers increasingly obtain their technology from the same third-party vendors, making such correlated failures more and more likely). Alternatively, the second provider might have been functioning well initially, but its systems could buckle under the increased load of payment instructions it receives as a result of the first provider’s problem (experience with power grids tells us that stressed infrastructure is more vulnerable to failure). If providers keep being compromised by loads of instructions rerouted from other struggling providers, the failures could cascade, even across national borders, until the payments system as a whole is so overloaded that people are unable to transact. If such a situation were to persist for several days, the economic fallout could be significant, and there would be nothing a lender of last resort could do to remedy it.
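The cascade mechanism described above can be sketched as a toy simulation. Everything here is invented for illustration (the capacities, loads, and the even-rerouting assumption are not drawn from the paper); the point is only to show how rerouting from one failed provider can push otherwise-healthy providers past capacity, one after another.

```python
# Toy model of cascading overload in a retail payments network.
# All numbers and the rerouting rule are illustrative assumptions.

def simulate_cascade(capacities, loads, failed_provider):
    """Reroute a failed provider's load to survivors and propagate failures.

    Returns the set of provider indices that end up failed.
    """
    failed = {failed_provider}
    while True:
        survivors = [i for i in range(len(capacities)) if i not in failed]
        if not survivors:
            return failed  # system-wide outage: no one left to transact through
        # Load stranded by failed providers, rerouted evenly to survivors
        rerouted = sum(loads[i] for i in failed) / len(survivors)
        newly_failed = {i for i in survivors
                        if loads[i] + rerouted > capacities[i]}
        if not newly_failed:
            return failed  # cascade has stopped
        failed |= newly_failed

# Three providers, each running near capacity: one software bug at
# provider 0 forces the others to absorb 40 extra units each (80 / 2),
# pushing both past capacity and taking down the whole system.
print(sorted(simulate_cascade([100, 100, 100], [80, 80, 80], 0)))
# [0, 1, 2]
```

Note that with ample spare capacity (say, capacities of 200 each), the same shock stops at the first provider; it is the combination of tight capacity and correlated rerouting that turns a local glitch into a systemic outage.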
In my forthcoming paper, Payments Failure, I argue that our current regulatory regime largely treats the management of operational risks as an internal matter for financial institutions and assumes that problems arising from operational risks can only be transmitted to other institutions through credit-related channels like runs. Existing regulations generally fail to anticipate that operational risks at different institutions can interact with one another to cause problems even in the absence of any run (although of course it is quite possible – even likely – that runs will accompany a cascade of operational failures). The inadequacy of the current approach to regulating operational risks can be illustrated by way of analogy to the inadequacies of microprudential regulation. Before the last crisis, it was assumed that, so long as individual banks were safe and sound, the financial system as a whole would also be robust. However, steps that individual banks took to preserve their own solvency – most notably, selling assets at fire sale prices – weakened the financial system as a whole. It is similarly possible that leaving operational risk management to individual firms will make the retail payments system as a whole more fragile, if the steps taken internally to manage operational risk have consequences for other payments providers. The influx of complex new payments processing technologies (ranging from blockchain to cloud computing) could increase the likelihood of compounding operational complications. Payments Failure therefore makes the case for “macro-operational” regulation, designed to deal with a potential new breed of financial crises that could arise from systemic interactions of technological operational risks.
Macro-operational regulation should be rooted in complexity theory, rather than in our historical experience of financial crises (most of which occurred when finance was not so deeply dependent on complex technologies). Complexity theory tells us that, to make a system more robust, policymakers should prioritize the system’s ability to scale and evolve with changed use and to reorganize and reconfigure when needed (potentially at the expense of the system’s efficiency and, perhaps somewhat counterintuitively, at the expense of the reliability of the individual components of that system). The practical implementation of a macro-operational approach to regulating the retail payments system might include measures to promote some redundancies in that system (perhaps even mandating the acceptance of cash). Policymakers should also work towards developing sensors that can detect when the system is likely to be compromised by changes in the type and scale of use (perhaps by simulating scenarios of failures of different components of the retail payments system). Measures along the lines of circuit breakers or mandated outages also need to be considered, so that policymakers are prepared to contain and mitigate the fallout if cascading operational failures begin.
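The circuit-breaker idea mentioned above can be illustrated with a minimal sketch, assuming a provider that queues instructions once a safe processing threshold is reached rather than attempting to process everything and failing outright. The class name, threshold, and queueing rule are all hypothetical, invented for this example.

```python
# Minimal sketch of a load-shedding "circuit breaker" for a payments
# provider: beyond a safe capacity, instructions are queued rather than
# processed, so overload does not propagate as outright failure.
# Capacity and behavior are illustrative assumptions, not a real design.

class CircuitBreaker:
    def __init__(self, capacity):
        self.capacity = capacity  # max instructions per interval
        self.processed = 0
        self.queued = []

    def submit(self, instruction):
        if self.processed < self.capacity:
            self.processed += 1
            return "processed"
        self.queued.append(instruction)  # shed load instead of failing
        return "queued"

breaker = CircuitBreaker(capacity=2)
results = [breaker.submit(f"order-{i}") for i in range(4)]
print(results)  # ['processed', 'processed', 'queued', 'queued']
```

The design trade-off mirrors the paper's point: deliberately degrading service (queued payments) sacrifices efficiency at an individual node in exchange for keeping the wider system from cascading into total failure.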
To be clear, complexity science is somewhat pessimistic about our ability to entirely contain cascade failures within complex systems, and we should not expect macro-operational regulation to be a silver bullet. However, well-designed regulation can make the retail payments system more robust to such failures. Significant work needs to be devoted to fleshing out the contents of a macro-operational approach to regulation, but even at this early stage, it is clear that the expertise of complexity and data scientists will be critical to macro-operational regulation. Exploring the best way to integrate these types of expertise into the current financial regulatory apparatus is an important subject for future research.
As an example of this credit-channel perspective on financial crises, see Ben Bernanke, The Real Effects of Disrupted Credit: Evidence from the Global Financial Crisis, BROOKINGS PAPERS ON ECONOMIC ACTIVITY (Sept. 13, 2018).
For an accessible discussion of the complexity science framework, see J.B. Ruhl, Managing Systemic Risk in Legal Systems, 89 IND. L. J. 559 (2014).
This post comes to us from Professor Hilary J. Allen at American University’s Washington College of Law. It is based on her recent article, “Payments Failure,” available here.