
FRTB AND OPTIMAL DATA MANAGEMENT

05/09/2019

by Martijn Groot, VP, Asset Control 

 

FRTB comes into effect at the beginning of 2022. It exacerbates banks' market data management challenges at a time when firms should already be planning for compliant, optimal data management.

The challenges banks have to cope with are ramping up all the time. They face increased regulatory scrutiny of data quality. They must achieve consistency between risk and the front office to meet modellability and attribution tests. They have to navigate legacy systems that do not scale to the required data volumes and lack crucial capabilities such as data lineage and bi-temporality. At the same time, they are transitioning to the cloud to drive efficiencies and to enable business users by delivering better access to data.

FRTB raises the stakes further by introducing three key changes to market data management: how risk is measured, how risk factors are assessed, and how risk factors are classified.

 

  • Measuring risk

The introduction of Expected Shortfall (ES) as a replacement for VaR means that outliers in the tail of the loss distribution feed directly into regulatory capital. Most banks currently keep only one or two years of price history, so the requirement to calibrate expected shortfall against ten years of data is not easily retrofitted, particularly where teams work with legacy systems. Beyond that, banks need to be able to determine the most stressful 12 months within those ten years. This means banks require reliable and flexible storage for time series data, with the ability to consolidate data sources and fill any gaps.
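The tail sensitivity of ES and the search for the most stressful 12-month window can be sketched as follows. This is a minimal illustration assuming a 97.5% confidence level and roughly 250 trading days per year; the function names are my own, and a production system would work from full revaluation P&L rather than a single return series:

```python
import numpy as np

def expected_shortfall(returns, alpha=0.975):
    """Historical ES: the average loss beyond the alpha-quantile of losses.
    Unlike VaR, every tail outlier moves this number directly."""
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def most_stressed_window(returns, window=250):
    """Start index of the rolling ~12-month window with the highest ES,
    a naive stand-in for the stress-period search over a 10-year history."""
    es = [expected_shortfall(returns[i:i + window])
          for i in range(len(returns) - window + 1)]
    return int(np.argmax(es))
```

Because ES averages everything beyond the quantile, it is never smaller than the corresponding historical VaR on the same data, which is why deep history and clean tails matter so much more than under a pure VaR regime.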

 

  • Assessment of risk factors: focus on real prices

The second major change FRTB introduces is a refreshed assessment of risk factors based on their ‘modellability’. Real prices are required to prove modellability and to identify ‘non-modellable’ risk factors (NMRFs). Banks can use data vendors to supply this real price data, provided the data can be audited by regulators.
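An observation-count check in the spirit of the Basel risk factor eligibility test can be sketched as below. The thresholds used here (at least 24 real-price observations over the trailing year, with no 90-day period containing fewer than four) follow the commonly cited formulation of the final standard, but they are hard-coded assumptions in this sketch; the binding criteria, including the alternative 100-observation route, should be taken from the regulation itself:

```python
from datetime import date, timedelta

def is_modellable(obs_dates, asof, min_total=24,
                  window_days=90, min_per_window=4):
    """Illustrative modellability check from real-price observation dates.
    Thresholds are assumptions; consult the final FRTB text for the rules."""
    start = asof - timedelta(days=365)
    dates = sorted(d for d in set(obs_dates) if start < d <= asof)
    if len(dates) < min_total:
        return False
    # Every rolling 90-day window in the trailing year must hold
    # enough observations; sparse clusters fail even if the total passes.
    for offset in range(366 - window_days):
        w_start = start + timedelta(days=offset)
        w_end = w_start + timedelta(days=window_days)
        if sum(w_start < d <= w_end for d in dates) < min_per_window:
            return False
    return True
```

The window check is the part that bites in practice: a factor with many observations bunched into one active month still fails, which is why consolidating committed quotes and trades across sources matters for keeping factors out of the NMRF bucket.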

 

  • Classification of risk factors

Risk factors are also subject to a new method of categorisation. This includes a need for model support behind risk factor classifications, the ability to extend reference data mappings and the integration of the liquidity horizon categorisation.
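Integrating the liquidity horizon categorisation amounts to an extensible mapping from classified risk factors to horizon buckets. The sketch below uses a small illustrative subset of the FRTB horizon values (10 to 120 days); the full table and the exact sub-categories live in the Basel standard, and the fallback behaviour here is my own assumption:

```python
# Illustrative subset of FRTB liquidity horizon buckets, in days.
# The authoritative mapping is in the Basel market risk standard.
LIQUIDITY_HORIZONS = {
    ("equity", "large_cap_price"): 10,
    ("equity", "small_cap_price"): 20,
    ("fx", "liquid_pair"): 10,
    ("credit", "ig_corporate_spread"): 40,
    ("credit", "hy_corporate_spread"): 60,
}

def liquidity_horizon(asset_class, subtype, default=120):
    """Look up the horizon for a classified risk factor. Unmapped factors
    fall back to the longest bucket -- a conservative choice for this
    sketch, not a regulatory rule."""
    return LIQUIDITY_HORIZONS.get((asset_class, subtype), default)
```

Keeping this as reference data rather than hard-coded logic is what makes the mappings extensible when classifications change or new sub-categories are added.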

These changes will place extra pressure on banks. Essentially, however, FRTB rests on the same notion of a solid data foundation as many other regulations, and should be seen in that context.

Ultimately, the best way to manage regulatory change is through the accurate collection, controlled sourcing, cross-referencing and integration of data. This addresses common regulatory “asks” around taxonomies, classifications, unambiguous identification, additional data context, links between related elements, and audit and lineage requirements. These capabilities pave the way to insight-driven data management, early warnings on market data issues and their implications, and business user enablement that supports increasingly data-intensive jobs in risk, valuation, finance, operations and the front office.

 

Finance Derivative is a global financial and business analysis magazine, published by FM.Publishing. It is a yearly print and online magazine providing broad coverage and analysis of the financial industry, international business and the global economy.
