How Big Data is Transforming Bilateral Trading

By Stuart Smith, Co-Head Business Development – Data & Risk at Acadia


Since its inception, big data has been an important part of how firms identify and construct quantitative trading strategies, with hedge funds depending ever more on quant strategies that rely heavily on big-data-driven analytics.

As big data technology continues to move from being a specialised technical capability to a commoditised one, available on a range of easily consumed technology platforms, its use within financial derivatives markets will continue to grow beyond the initial quantitatively driven applications.

At the same time, the number and range of available data sources is growing rapidly. Whether through the rise of alternative data sets or through new technology that lets firms simply keep more of the data they were already creating, the volume of data available is increasing dramatically.


Big Data in Risk Management

Risk management has always demanded close collaboration between business and technology to deliver risk analytics that help the business make better decisions. As technology advances, the available metrics continue to improve, largely because many risk metrics require large numbers of scenarios and valuations to identify risks correctly. Maintaining that flexibility has led to an explosion of data to manage. Firms are increasingly keeping all of this data available, which can run into many terabytes (TB), much of it needing to be held in memory to be accessible to analysts.


To achieve this, big data technology is critical: it allows firms to move large volumes of data quickly and easily from affordable long-term storage into high-performance in-memory analytics. Big data technology is well suited to this type of problem, enabling large volumes of data to be recalled from multiple stores and appropriately aggregated or filtered based on the analysis users request. Whereas in the past analysts had to accept that data older than the last three to five days was only available in summarised form, they can now expect data to be re-hydrated quickly and easily from cloud data stores and made available to them in an easy-to-consume web interface.
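As a minimal sketch of this pattern, the example below simulates re-hydrating date-partitioned historical scenario data into memory and aggregating it on demand. The in-memory "cold store", column names and VaR-style aggregation are all hypothetical stand-ins; a real implementation would recall Parquet partitions from cloud object storage rather than a Python dictionary:

```python
import numpy as np
import pandas as pd

# Hypothetical cold store keyed by business date, standing in for
# partitioned long-term storage (e.g. one Parquet file per date).
rng = np.random.default_rng(0)
cold_store = {
    f"2023-01-{d:02d}": pd.DataFrame({
        "book": ["rates", "credit"] * 50,
        "scenario_pnl": rng.normal(0, 1_000_000, 100),
    })
    for d in range(1, 11)
}

def rehydrate(dates):
    """Recall the requested partitions from long-term storage into memory."""
    return pd.concat(
        [cold_store[d].assign(date=d) for d in dates], ignore_index=True
    )

# Pull a ten-day history back into memory, then aggregate on demand:
hist = rehydrate(sorted(cold_store))
var_99 = hist.groupby("book")["scenario_pnl"].quantile(0.01).abs()
print(var_99)
```

The point of the sketch is the separation of concerns: cheap storage holds everything, and only the slices an analyst requests are pulled into memory and aggregated.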

This enables much more dynamic types of analysis. For example, where a new risk is identified through analysis of a recent data set, it is now possible to recover a long history of that risk, whereas previously that history would have been lost through summarisation and fixed reporting processes.


Collaborative Data Sets

More big data stores are being created as the industry becomes more collaborative and uses increasing numbers of fintech solutions and platforms. With this change come new ways to analyse data and provide new insights.

For instance, the automation of collateral exchange has created a historical store of margin calls, payments and disputes. This history gives banks a resource for understanding how accurately they issue and meet margin calls on derivatives, and for comparing their performance to that of the industry as a whole. The example below shows how a firm can be benchmarked while keeping other institutions' data private.
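A hedged sketch of the privacy-preserving idea: each institution contributes only an aggregate metric (here, an illustrative margin-call dispute rate), and a firm sees its percentile against the anonymised distribution without ever seeing a peer's underlying data. All figures below are made up for illustration:

```python
import pandas as pd

# Anonymised aggregate metrics contributed by peer institutions -
# no institution-level detail is ever shared, only the distribution.
industry_dispute_rates = pd.Series(
    [0.021, 0.035, 0.012, 0.048, 0.027, 0.019, 0.041]
)
our_rate = 0.025  # this firm's own margin-call dispute rate

# Benchmark: what share of peers dispute less often than we do?
percentile = (industry_dispute_rates < our_rate).mean() * 100
print(f"{percentile:.1f}% of peers have a lower dispute rate")  # 42.9%
```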

These types of analysis are new and could not be delivered without a centralised, collaborative data model. They can prove instrumental in improving firms' overall operational efficiency and client service.

It also provides an opportunity for Machine Learning techniques, based on big data sets, to analyse and predict payments requests which are likely to be disputed and potentially identify causes before an actual dispute is even raised. This type of ‘self-healing’ process can only be enabled by a large history of data through which algorithms can be trained.
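One minimal sketch of such a predictor scores an incoming margin call by the empirical dispute rate seen for similar calls in the shared history. The column names, segmentation and fallback prior are all illustrative assumptions; a production system would train a proper classifier over far richer features:

```python
import pandas as pd

# Illustrative history of margin calls with a dispute flag.
history = pd.DataFrame({
    "counterparty": ["A", "A", "B", "B", "B", "C"],
    "size_bucket":  ["large", "large", "small", "small", "large", "small"],
    "disputed":     [1, 1, 0, 0, 1, 0],
})

# "Training": empirical dispute rate per (counterparty, size bucket) segment.
dispute_rate = history.groupby(["counterparty", "size_bucket"])["disputed"].mean()

def dispute_score(counterparty, size_bucket, prior=0.1):
    """Predicted probability that a new call will be disputed;
    unseen segments fall back to an assumed prior."""
    return dispute_rate.get((counterparty, size_bucket), prior)

print(dispute_score("A", "large"))  # 1.0 - every such call was disputed
print(dispute_score("D", "small"))  # 0.1 - unseen segment, prior applies
```

Calls scoring above a chosen threshold could then be reviewed before issuance, which is the 'self-healing' step the large shared history makes possible.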

In the case of Initial Margin (IM) calculated by ISDA SIMM*, a new set of challenges has been introduced by having a two-sided risk calculation as part of the process of deriving payment information. This adds another level of complexity to resolving disputes; however, the availability of large volumes of data opens up new options for how this challenge could be solved. The long history of Common Risk Interchange Format (CRIF)** data provides a long-term view of the sensitivities for most OTC derivatives, which can help firms identify basic issues such as stale market data day over day. As with most detailed analysis, differences in models can also be identified by examining discrepancies over long periods of time. Identifying these types of model discrepancy can help firms be more proactive about reviewing their models, so that differences don't escalate into disputes.
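The stale-market-data check mentioned above can be sketched very simply: a risk factor whose sensitivity is identical day over day across a window is a candidate stale input. The CRIF-style column names and values below are illustrative, not the actual template:

```python
import pandas as pd

# Illustrative CRIF-style sensitivity records over three business dates.
crif = pd.DataFrame({
    "date":        ["2023-01-02"] * 3 + ["2023-01-03"] * 3 + ["2023-01-04"] * 3,
    "risk_factor": ["USD.2Y", "USD.10Y", "EUR.5Y"] * 3,
    "amount":      [120.0, 350.0, 80.0,
                    121.5, 350.0, 82.0,
                    119.8, 350.0, 79.5],
})

# One row per date, one column per risk factor.
panel = crif.pivot(index="date", columns="risk_factor", values="amount")

# A sensitivity that never moves across the window suggests a stale input.
stale = (panel.diff().iloc[1:] == 0).all()
print(list(stale[stale].index))  # ['USD.10Y']
```

In practice the same day-over-day comparison, run over a long CRIF history, is also where persistent model differences between counterparties would show up.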


Looking ahead

The sheer volume of data is an industry-wide challenge, with firms having to manage disparate, needlessly duplicated and ultimately overwhelming information. Creating an industry standard for reporting and analytics is therefore crucial to enable firms to draw clarity and valuable insights from the mass of data and to centralise the information as a single data layer. Acadia has designed its Data Exploration (DX) suite as a one-of-a-kind big data analytics platform to help sell-side firms, buy-side firms and fund administrators see their market positioning, trends and analysis of industry-wide metrics.

The impact of big data will only grow, and the industry has no choice but to evolve its use of technology, whether to drive quant strategies for hedge funds, more dynamic forms of risk management, or larger shared industry data sets. All of these applications rely on underlying big data technology platforms to provide distributed analysis capabilities, and as those capabilities continue to develop, so will the types of analysis available to firms.

*The ISDA Standard Initial Margin Model (ISDA SIMM™) is a common methodology for calculating initial margin for non-centrally cleared derivatives, developed as part of ISDA’s Working Group on Margin Requirements (WGMR) to help market participants meet the BCBS-IOSCO margin framework for non-cleared derivatives.

** The CRIF (Common Risk Interchange Format) file is the industry template used to hold and exchange sensitivity data. ISDA's calculation specifications are used to produce Delta, Vega and Curvature sensitivity numbers at risk-factor level.
