Is there such a thing as a single, reliable source of data for investment reporting, asks Abbey Shasore, CEO, Factbook.
We hear much about ‘golden copy’ data within the investment management world. The term refers to a master version of security and reference data acting as a single authoritative source of truth for all the applications in the asset management IT landscape.
A master source of data has an obvious appeal – it is always accurate, can be used for multiple purposes and can be piped through an organisation, and even to external parties, without rehandling, avoiding the risk of errors creeping in. Yet the term has often been misapplied. I have heard of firms referring to ‘multiple golden sources’ existing across their organisation, usually generated by different departments for differing needs. And firms are often quite comfortable with the apparent contradiction in this terminology.
With golden copies of data in circulation since the early noughties, and large enterprise data management (EDM) projects having been undertaken by asset management firms to solve the problem, it is perhaps hard to understand why a master database is still an issue. Is it just a back-office challenge, or does it extend into the front and middle offices too?
The creation of multiple sub-databases goes back to the days of mainframe computers, when teams found it difficult to extract the data they wanted for specific tasks. The staff maintaining these databases were often administrative or clerical, and did not necessarily understand the context of the data in question. Data governance had not really taken hold within many firms at that point.
Fast forward to today and the practice of every system having its own set of static data is much improved, though it certainly hasn’t been eliminated. The persistent problem lies with individuals, who maintain their own spreadsheets in order to validate certain data. Effective data governance should prevent this from occurring.
In the area of investment reporting, heavily productionised items such as KIIDs and fund factsheets should follow a very strict protocol covering where the static data comes from, how it is validated, how the audit process works, and what the fund names, share classes, investment objectives and currency are, and so on.
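To make the idea of such a protocol concrete, here is a minimal sketch in Python of a static-data check that a factsheet production process might run before publication; the field names, allowed currencies and rules are illustrative assumptions rather than any standard:

# Hypothetical sketch of a static-data validation step for factsheet production.
# Field names, allowed values and rules are illustrative assumptions only.

REQUIRED_STATIC_FIELDS = {
    "fund_name": str,
    "share_class": str,
    "investment_objective": str,
    "base_currency": str,
}

ALLOWED_CURRENCIES = {"GBP", "USD", "EUR", "JPY", "CHF"}  # example list only


def validate_static_record(record: dict) -> list[str]:
    """Return a list of validation errors for one fund/share-class record."""
    errors = []
    for field_name, expected_type in REQUIRED_STATIC_FIELDS.items():
        value = record.get(field_name)
        if value is None or value == "":
            errors.append(f"missing field: {field_name}")
        elif not isinstance(value, expected_type):
            errors.append(f"wrong type for {field_name}: {type(value).__name__}")
    currency = record.get("base_currency")
    if currency and currency not in ALLOWED_CURRENCIES:
        errors.append(f"unrecognised currency: {currency}")
    return errors


# Example: a record drawn from the master (golden) source should pass cleanly.
record = {
    "fund_name": "Example Global Equity Fund",
    "share_class": "A Acc",
    "investment_objective": "Long-term capital growth",
    "base_currency": "GBP",
}
assert validate_static_record(record) == []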
With dynamic data, however, it is a different story, since the way the data can be interpreted may vary. Multiple data sources (such as pricing information) ingested over a time series are subject to potential variations when reports are generated. A classic example is an ad hoc pitchbook or client-specific report produced for a presentation, which may contain subtly different data from the last fund factsheet that was published.
Strict data governance can help in these situations. For example, fund accounting will have a process to make sure that the prices that go into valuing each of the assets in a fund are audited and correct. It’s that final, definitive price that then goes into the master data for that NAV and feeds the ongoing reports.
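As a simple illustration of that principle, the sketch below (in Python, with hypothetical feed names and figures) shows several candidate prices arriving from different sources, of which only the single audited price is written to the master record that downstream reports consume:

from dataclasses import dataclass
from datetime import date


@dataclass
class PricePoint:
    source: str
    price: float
    audited: bool  # flag set by the fund-accounting audit process (assumed)


def definitive_price(candidates: list[PricePoint]) -> float:
    """Pick the single audited price; anything else is rejected."""
    audited = [p for p in candidates if p.audited]
    if len(audited) != 1:
        raise ValueError(f"expected exactly one audited price, got {len(audited)}")
    return audited[0].price


# Example: two vendor feeds plus the audited fund-accounting price.
candidates = [
    PricePoint(source="vendor_a", price=101.32, audited=False),
    PricePoint(source="vendor_b", price=101.35, audited=False),
    PricePoint(source="fund_accounting", price=101.34, audited=True),
]

# Only the definitive price feeds the master NAV data used by ongoing reports.
master_nav_input = {("EXAMPLE_FUND", date(2024, 3, 28)): definitive_price(candidates)}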
Where the new data-normalisation problems lie is in categories like ESG, where the taxonomy and the regulation are constantly changing. Many different ESG providers supply specialist pieces of the ESG puzzle, and each asset manager often ends up creating its own methodology.
Trying to normalise data across different asset classes is also quite complicated. Whilst there are generally accepted definitions for commonly traded instruments like equities and bonds, for private assets such as private equity, infrastructure, private debt and real estate, the kind of data that is useful for investment reporting often involves different fields. There are certain commonalities within a multi-asset portfolio, such as geographical region, but other features are difficult to compare cross-market.
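One way to picture this, sketched below with purely illustrative field names, is a common core of fields that every asset class can populate, with class-specific attributes carried separately rather than forced into a comparison they cannot support:

from dataclasses import dataclass, field


@dataclass
class CoreHolding:
    """Fields that broadly make sense across a multi-asset portfolio."""
    name: str
    asset_class: str          # e.g. "equity", "bond", "private_equity", "real_estate"
    geographical_region: str
    market_value: float
    currency: str
    # Class-specific attributes that do not compare cleanly across markets.
    extra: dict = field(default_factory=dict)


listed = CoreHolding("Example plc", "equity", "UK", 1_250_000.0, "GBP",
                     extra={"ticker": "EXM", "sector": "Industrials"})

private = CoreHolding("Wind Farm Project", "infrastructure", "Europe ex-UK",
                      3_400_000.0, "EUR",
                      extra={"vintage_year": 2019, "valuation_basis": "DCF"})

# Reports can aggregate on the shared fields (such as region) while leaving the
# class-specific detail in 'extra' for asset-class-level commentary.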
Conclusion
Many firms may still be some way off establishing a single version of the truth for their investment reporting, but if it is clear what data goes into which field of a report being compiled, a press-button methodology can be applied. If firms avoid a plethora of different systems holding their own sets of data, then investment reporting can be productionised to a large extent – and costs reduced.
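As a closing illustration of that press-button idea, the hypothetical sketch below maps each report field to exactly one key in the master data set, so the report compiler never has to choose between competing sources; the field and key names are assumptions for illustration only:

# Hypothetical "press-button" report build: every report field resolves to a
# single key in the master data set, so there is never a choice of source.

FACTSHEET_FIELD_MAP = {
    "Fund name": "static.fund_name",
    "Share class": "static.share_class",
    "Investment objective": "static.investment_objective",
    "Base currency": "static.base_currency",
    "NAV per share": "nav.latest_audited",
}


def build_report(master_data: dict) -> dict:
    """Resolve every report field from the single master data set."""
    report = {}
    for label, key in FACTSHEET_FIELD_MAP.items():
        if key not in master_data:
            raise KeyError(f"master data is missing '{key}' needed for '{label}'")
        report[label] = master_data[key]
    return report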