On September 27th and 28th, 2016, the Center for Financial Professionals held its US conference on Liquidity and Funding Risk. Practitioners from across the industry discussed issues and strategies in a financial landscape transformed since the crisis. The following presents some highlights from the presentations and discussions.
From the outset, it was apparent that the core concern across liquidity and funding remains regulatory interpretation. Economics and risk modeling took a back seat to discussions of regulator intentions during the two days' proceedings. Plenty of slides focused on rule-text analysis rather than concepts of liquidity risk and modeling. Of course, this focus is natural to anyone who works in a treasury or risk management division, but it is still remarkable how dominant the new regulations are: compliance is the binding constraint on operations. Most participants agreed that internal controls and stress models should eventually become the constraining factor on decision-making, but that point is still far off. For now, efforts focus on building an infrastructure that will support the governance and reporting regulators require.
Risk aggregation and data documentation
The regulatory push on the liability side of the balance sheet currently sits between the LCR and NSFR rollouts. Business lines like repo desks and FICC have, by and large, already been forced to adapt to LCR constraints, while the industry is only in the early phases of thinking about NSFR.
In this intermezzo, regulators are shifting from rule-based to risk-based supervision: “Regulators don’t want to check boxes anymore. They want banks to show them that they’re in control of their risk,” said one presenter. This points to ever more model validation and parameter-sensitivity testing. Further, one presenter emphasized that you also need to “prove your data”: there must be a qualified answer to questions about the origin and importance of every data point reported in LCR, 2052a, CLAR, etc. That requires recording metadata and writing down commentary whenever decisions are made about data selection and about the data-collection infrastructure. “Documentation is key,” one regulator’s representative repeated several times during his presentation and Q&A responses. That neatly summarizes much of the conference.
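To make the “prove your data” point concrete, a lineage record of this kind can attach provenance and rationale to every reported figure. This is an illustrative sketch only; the field names and the example values are invented, not taken from any reporting standard:

```python
# Hypothetical sketch of the lineage metadata that supports "proving your
# data": every reported figure carries its source system, the transformation
# applied, and free-text commentary on why that data selection was made.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class LineageRecord:
    report: str          # e.g. "LCR", "2052a", "CLAR"
    line_item: str       # the reported field this record documents
    source_system: str   # originating system of record
    transformation: str  # how raw data became the reported figure
    rationale: str       # commentary on why this source/definition was chosen
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())


# Example entry (all values invented for illustration):
rec = LineageRecord(
    report="2052a",
    line_item="Inflows-Secured",
    source_system="repo_ledger",
    transformation="aggregated by counterparty class, converted to USD",
    rationale="repo_ledger is the system of record after the 2015 migration",
)
```

Whether this lives in a database, a data catalog, or version-controlled files matters less than that the commentary is written down at the time the decision is made.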
Risk aggregation was the second main topic in discussions on regulatory focus. Reporting a consolidated liquidity risk view across different operations requires conceptual work on how to consolidate liquidity risks from different lines of business. A single institution might have a broker-dealer business, a retail bank, and a commercial lending institution; liquidity risks are qualitatively different across these businesses, but they have to be aggregated and reported on a consolidated basis for US operations. This applies to both FBOs and US-based institutions, most of which are still trying to develop concepts and models to handle this task. For now, LCR methodology is the most common toolkit, but it is crucial to develop internal models, a point several presenters made repeatedly. Without a customized analysis built on an institution’s idiosyncratic operations, risk managers relying on the LCR framework’s coarse, general specifications may be blindsided.
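The LCR ratio itself is simple; the difficulty lies in classifying assets and flows, which is exactly where its coarse specifications can mislead. A minimal sketch of the mechanics, using the Basel haircut, cap, and inflow-cap rules but with invented example figures:

```python
# Illustrative LCR sketch: stock of HQLA divided by net cash outflows over
# a 30-day stress horizon must be at least 100%. The structure (15% Level 2
# haircut, 40% Level 2 cap, 75% inflow cap) follows the Basel framework;
# all amounts and run-off rates passed in are examples, not rule values.

def lcr(hqla, outflows, inflows):
    """hqla: dict of market values by level;
    outflows/inflows: lists of (amount, run-off or inflow rate) pairs."""
    level1 = hqla.get("level1", 0.0)
    # 15% haircut on Level 2; after haircut, Level 2 is capped at 40% of
    # total HQLA, i.e. at two-thirds of the Level 1 stock.
    level2 = min(hqla.get("level2", 0.0) * 0.85, level1 * 2.0 / 3.0)
    total_hqla = level1 + level2

    total_outflows = sum(amount * rate for amount, rate in outflows)
    # Inflows may offset at most 75% of outflows.
    total_inflows = min(sum(amount * rate for amount, rate in inflows),
                        0.75 * total_outflows)
    return total_hqla / (total_outflows - total_inflows)


# Invented example: 100 of Level 1 HQLA, 200 of funding at a 25% run-off
# rate, 40 of contractual inflows at a 50% inflow rate.
ratio = lcr({"level1": 100.0}, [(200.0, 0.25)], [(40.0, 0.50)])
```

Every business line's exposures get forced into these few run-off buckets, which is precisely why presenters argued that internal models calibrated to an institution's own funding behavior are indispensable.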
Building capacity
For now, most Treasurers focus on the quantitative challenges by building IT infrastructure and consolidating models. Gathering data and setting up structures across merged entities and legacy systems is a daunting task, but the sooner one starts the better. Regulators will likely require more data and more reporting going forward, so getting one’s data in order is a valuable preemptive exercise. In any case, it is sensible to streamline data structures so information can be leveraged in internal decision-making.
Compliance poses more than technical challenges. On one side, Treasuries and Liquidity Risk Management have to build capacity to satisfy regulators. On the other, business lines across the organization have a hard time understanding increased liquidity charges in Funds Transfer Pricing (FTP). Beyond the technical development, it is important to consider organizational politics when implementing new rules and systems. Steadily rising funding charges are hard to accept when fed funds effective floats around 40 basis points; many conference participants discussed this challenge. The best advice was to educate preemptively and organize discussions so that there is some understanding when liquidity charges rise. Some did ‘roadshows’ for the parts of the organization that depend on funding and FTP.
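The gap between the policy rate and the internal charge is easy to illustrate: FTP typically layers a term liquidity premium and, for committed facilities, a contingent liquidity charge on top of the base rate. The schedule below is entirely made up for illustration; actual premia are set by each institution's Treasury:

```python
# Hypothetical FTP illustration: total funding charge = base rate
# + term liquidity premium (growing with tenor) + contingent liquidity
# charge for committed facilities. All numbers are invented.

TERM_PREMIUM_BPS = {1: 15, 3: 35, 5: 60, 10: 90}  # tenor in years -> bps


def ftp_charge_bps(base_bps, tenor_years, contingent_bps=0):
    return base_bps + TERM_PREMIUM_BPS[tenor_years] + contingent_bps


# With fed funds effective near 40 bps, a 5-year asset under this invented
# schedule is still charged 100 bps, which can surprise the originating desk.
charge = ftp_charge_bps(base_bps=40, tenor_years=5)
```

Walking business lines through a breakdown like this is essentially what the ‘roadshows’ mentioned above set out to do.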
Horizontal comparisons in perspective
Regulators repeatedly invoke “best practices”. The general idea behind CLAR and 2052a is to form a dynamic view of liquidity risk management across the industry. Regulators aim to spot outlier activities given current conditions in markets and the economy. This is undoubtedly a useful exercise for systemic safety: it gathers information and can help identify stressors. However, both regulators and individual institutions should resist steering all outlier practices toward the mean. If everyone moves in concert, correlated funding strategies amplify systemic swings. Further, if you base decisions on models and limits sanctioned by regulators, you are bound to miss aspects of the liquidity risk that matter to your specific institution. From the individual institution’s perspective, this means internal models should be the binding constraints. For the regulators, it means outliers shouldn’t necessarily be pushed toward the mean.