Byline: Dihan Rosenburg, Director, Product Marketing at ASG Technologies
With the coming December 2021 retirement of LIBOR, Chief Data Officers in financial institutions are preparing to face their Y2K moment. An estimated $350 trillion of financial instruments are linked to this globally accepted benchmark rate for short-term loans, including adjustable rate mortgages, credit cards, student loans, bonds, securities and more.
The data management challenge is colossal. Over 100 million transactions reference LIBOR, and those references must be replaced with alternative risk-free rates (“RFRs”), such as SOFR for the U.S. dollar and SONIA for the British pound. Making the switch won’t be a simple search-and-replace exercise. Here’s why:
- RFRs are overnight rates, whereas LIBOR is published for multiple terms (one-week, three-month and so on).
- Credit risks are embedded in LIBOR, while RFRs are “risk-free,” making a simple conversion impossible.
- RFRs have different behavioral characteristics than LIBOR, resulting in different historical spreads, so fallback rates (typically a compounded RFR plus a fixed spread adjustment) must be applied, as sketched below.
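Under the standard fallback methodology, the replacement rate is generally the RFR compounded in arrears over the interest period plus a fixed spread adjustment (ISDA set the adjustment for three-month USD LIBOR at 26.161 basis points in March 2021). The Python sketch below illustrates the arithmetic with made-up fixings; it is a minimal illustration under an ACT/360 convention, not production-ready rate logic.

```python
def compounded_rfr(daily_rates, day_counts, year_basis=360):
    """Compound daily overnight RFR fixings 'in arrears' (ACT/360)."""
    growth = 1.0
    total_days = 0
    for rate, days in zip(daily_rates, day_counts):
        growth *= 1 + rate * days / year_basis
        total_days += days
    return (growth - 1) * year_basis / total_days

# Illustrative fixings only: five business days of overnight SOFR,
# with the final fixing spanning a weekend.
rates = [0.0005, 0.0005, 0.0006, 0.0005, 0.0004]
days = [1, 1, 1, 1, 3]

# ISDA's fixed spread adjustment for 3-month USD LIBOR fallbacks: 26.161 bps.
SPREAD_ADJUSTMENT = 0.0026161

fallback_rate = compounded_rfr(rates, days) + SPREAD_ADJUSTMENT
print(f"Fallback rate: {fallback_rate:.5%}")
```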
To make the transformation, contracts referencing LIBOR will all need to be rewritten. Organizations must locate every LIBOR reference in the contracts they hold, update those contracts with fallback provisions reflecting the new RFR terms, and communicate the new terms to clients.
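As a minimal illustration of that discovery step, the sketch below scans a folder of plain-text contracts for common LIBOR phrasings. The patterns and folder name are hypothetical; a real remediation program would pair this kind of matching with NLP, document management metadata and legal review.

```python
import re
from pathlib import Path

# Phrasings that commonly signal a LIBOR dependency in contract text.
# Illustrative, not exhaustive.
LIBOR_PATTERN = re.compile(
    r"\bLIBOR\b|\bLondon\s+Inter-?bank\s+Offered\s+Rate\b",
    re.IGNORECASE,
)

def scan_contracts(folder):
    """Yield (file name, line number, line) for each LIBOR mention."""
    for path in Path(folder).glob("**/*.txt"):
        text = path.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if LIBOR_PATTERN.search(line):
                yield path.name, lineno, line.strip()

# Hypothetical folder of extracted contract text.
for name, lineno, line in scan_contracts("contracts/"):
    print(f"{name}:{lineno}: {line}")
```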
While some banks have made a start on contract remediation, the data impact is even broader. Firms will need, for example, to build new pricing models around the new benchmarks, derive new credit risk spreads and evaluate how those changes will affect margins and profitability; far fewer firms have reached that point. Firms will also need to consider the impact of the new benchmarks on their FRTB and BCBS 239 programs.
To Get It Right, Impact Analysis Is Vital
Navigating this complex data journey requires robust and comprehensive inventory and data lineage capabilities. This upfront impact analysis is necessary to find the rates and analyze the changes before replacing them.
While firms may try to manage the necessary analysis manually with spreadsheets and subject matter expert (SME) interviews, this approach is particularly impractical for LIBOR migration programs. LIBOR has been in use for over 40 years, and the rates are primarily sourced and calculated inside legacy systems; many of the SMEs who built those systems have long since retired.
Even locating all the references to LIBOR rates will be difficult. The data is spread across unrelated systems and technologies, it is both structured and unstructured, contracts may be unlinked from their amendments, and a host of other data disparities may present themselves. Tracing LIBOR data is also complex, due to inconsistencies in how data is organized at the logical and physical levels and transformed across thousands of applications.
Fortunately, Metadata Management Applications Can Automate These Tasks
Automated metadata harvesting and management applications are ideal for the unique data management challenges LIBOR migration presents. Here’s how:
- Automated inventory will accelerate the discovery of where LIBOR rates are referenced, whether in applications, mainframes, ERP systems or contracts.
- Automated data lineage graphically displays the flow of LIBOR data from its origination in legacy platforms to all final points of consumption, identifying transformation points and systems where derived values are calculated along the way. Intelligent lineage delivers both vertical and horizontal views, offering a tangible connection between the business and technical metadata. It provides a visual, easily digestible artifact that informs the analyses of technology, business and risk leaders (a simplified traversal of such a lineage graph is sketched after this list).
- Business glossaries document business processes and terminology and connect technical and business assets. An artificial intelligence (AI)-based recommendation engine (“virtual data steward”) accelerates the speed and accuracy of matching business and technical data and helps find and track new rates going forward.
- Impact analysis begins with an understanding of how LIBOR rates are being calculated today. Analysts will find these calculations embedded in SQL code surfaced by the data lineage. The Snapshot feature displays both the current rate and what it will look like post-conversion, so teams can test and validate the new piping. By monitoring the lineage as it shrinks, users can also track their progress toward meeting the deadline.
- Data quality information can augment the data lineage analyses to ensure key areas get the attention they require. Other data governance features—including data stewardship, issue management and dashboards—will keep disparate teams informed and coordinated across various concurrent activity streams.
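To make the lineage-driven impact analysis concrete, here is a minimal sketch that models harvested lineage metadata as a directed graph and walks it breadth-first to find every asset downstream of a LIBOR feed. The asset names are hypothetical placeholders; a metadata management platform performs this over harvested metadata at enterprise scale, and the sketch only shows the core idea.

```python
from collections import defaultdict, deque

# A toy lineage graph: each edge points from a data asset to an asset
# derived from it. All names are hypothetical.
lineage = defaultdict(list)
edges = [
    ("mainframe.LIBOR_3M_FEED", "warehouse.RATE_CURVE"),
    ("warehouse.RATE_CURVE", "pricing.LOAN_ACCRUALS"),
    ("warehouse.RATE_CURVE", "risk.VAR_INPUTS"),
    ("pricing.LOAN_ACCRUALS", "reports.CLIENT_STATEMENTS"),
]
for src, dst in edges:
    lineage[src].append(dst)

def impacted_assets(graph, source):
    """Breadth-first traversal: every asset downstream of `source`."""
    seen, queue = set(), deque([source])
    while queue:
        node = queue.popleft()
        for child in graph[node]:
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

for asset in sorted(impacted_assets(lineage, "mainframe.LIBOR_3M_FEED")):
    print("impacted:", asset)
```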
From LIBOR Transition to Digital Transformation
Typically, financial institutions have treated similar data initiatives, such as MiFID II, Dodd-Frank and Margin Rules, as one-off projects. But disruptions in the global business and regulatory environment are occurring with increasing frequency, mandating a more holistic approach. For instance, LIBOR is only one of the reference rates being retired; other interbank rates (“IBORs”) around the globe are also on the chopping block.
Instead of viewing the LIBOR transition as just another costly, resource-consuming exercise, organizations should see this change as a catalyst to uplevel and automate their data management processes to better understand and trust their data. Here are some benefits that one large financial institution told ASG it received from implementing an automated data intelligence system in just four months:
- Discovered LIBOR rates across 150 applications and 20,000 COBOL programs.
- Resolved a nine-month-old attestation request from their largest client that had been passed from department to department in search of a root cause.
- Shrunk their footprint of redundant applications, including consolidating ten UDTs into one!
With these successes, this data governance organization was able to easily justify additional investment for other use cases.
By looking at this transition as an opportunity to transform enterprise data intelligence and put in place the processes needed to clean, understand and trust data, financial organizations can future-proof their data and adeptly address emerging changes. And with the greater data usage that trusted data enables, they can adroitly exploit new opportunities to enhance operations, deliver greater value to customers and gain competitive advantages.