EMPOWERING THE DATA ANALYST WITHIN FINANCIAL FIRMS

by Martijn Groot, VP Strategy, Alveo

 

Integrating data management with analytics can empower data scientists and analysts within financial institutions and materially boost their productivity and value. Rather than sitting at the end points of data flows and spending a large portion of their time gathering and verifying data before any analysis can start, they can draw on data aggregation and verification capabilities integrated with a foundation for advanced analytics for their risk assessment, pricing, research, scenario analysis and reporting.

Financial institutions are keen to enable their analysts, but what makes provisioning them with verified information more urgent is growth both in data volumes and in the diversity of data sources to analyse. This spans client interaction data as well as external sources that provide further insight for valuation and investment analysis. There is simply far more content to work with, and much of it is captured close to the point of creation as content owners increasingly try to monetize their data directly. On the external reporting side, too, regulators demand increasing granularity.

All this puts more pressure on the middle of the data management process: the central hub, if you like, where information collection, processing and distillation happen. That is where change needs to happen in order to scale and keep on top of rising data demands.

Traditionally, analysts would interact with data sources by accessing multiple internal stores and supplementing them externally as needed. Very often they would create yet another custom database as input for their own libraries in C++ or Matlab; spreadsheets were used as well. The lineage and freshness of the data would often be unclear, leading to ambiguous or faulty model outcomes. Essentially, data scientists or quants operated as small departments or one-person shops covering the entire process of data collection, aggregation, storage and analysis.
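The lineage and freshness problem can be made concrete. Below is a minimal sketch (all names and values hypothetical, not from any particular platform) of tagging each collected value with its source and retrieval time, so that downstream models can check provenance and staleness instead of consuming anonymous numbers:

```python
from dataclasses import dataclass
from datetime import datetime, timezone, timedelta

@dataclass
class DataPoint:
    """A value tagged with its lineage: where it came from and when."""
    value: float
    source: str             # e.g. an internal store or an external vendor feed
    retrieved_at: datetime  # when the value was collected

    def is_fresh(self, max_age: timedelta) -> bool:
        """Freshness check: was this value retrieved recently enough?"""
        return datetime.now(timezone.utc) - self.retrieved_at <= max_age

# Hypothetical example: a price pulled from an internal store two hours ago
price = DataPoint(
    value=101.25,
    source="internal_store_eq",
    retrieved_at=datetime.now(timezone.utc) - timedelta(hours=2),
)
print(price.is_fresh(timedelta(hours=24)))   # acceptable for end-of-day work
print(price.is_fresh(timedelta(minutes=5)))  # too stale for intraday use
```

Attaching this metadata at the point of collection, rather than reconstructing it later, is what removes the ambiguity described above.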

 

A Change in Approach 

Decision-making has become increasingly automated. That is true of corporate functions such as risk and financial reporting and the production of annual reports, financial statements and filings. Increasingly, analytics and automation are also entering real-time decision-making: whether to accept a client during onboarding, or whether to intervene to stop a transaction.

The new process is also about collaboration. In this context, the new approach gives data analysts a robust and scalable data meeting place, leveraging open source technologies (including R and Python), and lets them share analytics across the entire data supply chain.

The use of open source programming languages means adoption levels are high. Many organisations are likely to be tackling the same or similar problems, which means there are plenty of libraries available to kickstart development. Firms shouldn't have to worry about solving common industry problems themselves: open source libraries provide an excellent foundation for that.
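As an illustration of an open source library handling a common industry problem, consolidating quotes from multiple vendor feeds into a "latest wins" golden copy takes only a few lines of pandas. The data, instrument and feed names below are made up for the sketch:

```python
import pandas as pd

# Hypothetical quotes for the same instruments from two vendor feeds
feeds = pd.DataFrame({
    "instrument": ["BOND_A", "BOND_A", "BOND_B", "BOND_B"],
    "source":     ["vendor1", "vendor2", "vendor1", "vendor2"],
    "price":      [99.5, 99.7, 101.2, 101.1],
    "as_of":      pd.to_datetime(["2024-01-02 09:00", "2024-01-02 10:00",
                                  "2024-01-02 09:30", "2024-01-02 09:15"]),
})

# Golden copy: keep the most recent quote per instrument
golden = (feeds.sort_values("as_of")
               .groupby("instrument", as_index=False)
               .last())
print(golden)
```

Writing this consolidation logic from scratch in a custom C++ library is exactly the kind of undifferentiated work the article argues analysts should no longer be doing.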

Open source database technologies, such as Cassandra and PostgreSQL, can be helpful in this context too. The benefit of open source is not just avoiding license fees but also the skills available in the market. Firms prefer technologies that are broadly adopted and are wary of becoming dependent on proprietary languages or databases that require rare and pricey expert skills. In addition, many open source technologies were developed for the cloud and are state-of-the-art.
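A minimal sketch of what such a shared, queryable store looks like, using Python's built-in SQLite as a stand-in (the same standard SQL would run against PostgreSQL through a driver such as psycopg2); the schema and data are hypothetical:

```python
import sqlite3

# In-memory SQLite as a stand-in for a shared open source SQL store
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE prices (
        instrument TEXT NOT NULL,
        as_of      TEXT NOT NULL,
        price      REAL NOT NULL,
        source     TEXT NOT NULL,
        PRIMARY KEY (instrument, as_of)
    )
""")
rows = [("BOND_A", "2024-01-02", 99.7, "vendor2"),
        ("BOND_A", "2024-01-03", 99.9, "vendor2"),
        ("BOND_B", "2024-01-02", 101.2, "vendor1")]
conn.executemany("INSERT INTO prices VALUES (?, ?, ?, ?)", rows)

# Latest price per instrument: a query any analyst's tooling can share
latest = conn.execute("""
    SELECT p.instrument, p.as_of, p.price
    FROM prices p
    JOIN (SELECT instrument, MAX(as_of) AS max_as_of
          FROM prices GROUP BY instrument) m
      ON p.instrument = m.instrument AND p.as_of = m.max_as_of
    ORDER BY p.instrument
""").fetchall()
print(latest)
```

Because the store speaks standard SQL, analysts in R, Python or spreadsheets query one verified copy of the data instead of each maintaining their own.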

With analytics now closely integrated into the middle of business workflows, financial firms can also put in place a platform for proprietary analytics and raise the bar on data analysis still further. To do that, they first have to raise the bar in systems terms: with the latest financial data management and analytics tools coupled with open source technologies in place, they can quickly source data, quality-proof it and make it more easily available to their data scientists.
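Quality-proofing data before it reaches analysts can be as simple as running each record through a set of validation rules and routing failures to review rather than into models. A sketch with hypothetical rules and data:

```python
def quality_check(record):
    """Return a list of rule violations for one price record (illustrative rules)."""
    issues = []
    if record.get("price") is None:
        issues.append("missing price")
    elif record["price"] <= 0:
        issues.append("non-positive price")
    if not record.get("source"):
        issues.append("missing source")
    return issues

records = [
    {"instrument": "BOND_A", "price": 99.7, "source": "vendor2"},
    {"instrument": "BOND_B", "price": -1.0, "source": "vendor1"},
    {"instrument": "BOND_C", "price": None, "source": ""},
]
# Only clean records flow to analysts; the rest are flagged for review
verified = [r for r in records if not quality_check(r)]
flagged  = [r for r in records if quality_check(r)]
print(len(verified), "verified,", len(flagged), "flagged")
```

In practice such rules would be richer (tolerance bands, cross-source comparison), but the principle is the same: verification happens once, centrally, instead of in every analyst's spreadsheet.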

All of that capability together allows financial organisations to hit the ground running, and data scientists can spend more time on proprietary analytics (the secret sauce of any firm) specific to their client base, their business and their markets. They can accelerate the onboarding and workflow integration of new data sets and avoid being snowed under by mounting data volumes. They can begin to incorporate innovative data science techniques such as AI and machine learning into market analysis and investment processes, and allow their firms to use the results to drive business advantage. As a result, they start from a higher point and move on to proprietary analytics that deliver real added value for their firm. It is a compelling example of how this new approach to data management and analytics can elevate the role of data scientists and enable the firms for which they work to really reap the rewards.

 
