Data Mesh: The key ingredient to a digital future in financial services

Charles Southwood, Regional VP at Denodo

Thanks to the digital transformation taking place across the sector, recent years have seen an exponential boom in the amount of data and personal information that financial services (FS) organisations are able to access. Used correctly, this data could unlock multiple opportunities, in terms of both revenue and customer experience. Its insights could be the differentiator that enables FS organisations to stand out in one of the most competitive markets in the world.

However, as the landscape grows in both size and complexity, it’s becoming increasingly difficult to make sense of this never-ending stream of information. So much so that many FS organisations are failing to realise their digital dreams and step into the next data-driven chapter. In order to reach their true potential, these organisations must be able to effectively access, analyse and understand data. It’s only then that they can derive true value from it and use it to their advantage.



Making sense of all that data

It is not only the volume of data that has grown over the years; it is also the variety, with an ever-increasing number of data lakes, applications and other sources, all with different formats and protocols. Multiple data stores and applications create multiple analytical workloads running on multiple analytical systems. This results in silos, which ultimately lead to bottlenecks and delays. For FS organisations – especially those looking to innovate fast in order to compete with the fresh wave of challenger banks and fintech startups – these delays could spell disaster.

This is where data mesh comes in. First proposed by Zhamak Dehghani of Thoughtworks in 2019, this decentralised paradigm for data analytics aims to remove bottlenecks and move data decisions closer to those who understand the data. It proposes a unified infrastructure enabling domains to create and share data products, while enforcing standards for interoperability, quality, governance and security.

Data mesh is designed to be a distributed model. In other words, each organisational unit has its own data product owners, who own domain-specific data. They handle the modelling and aggregation of this data, aiding data democratisation and self-service data for the business as a whole.

This system enables a company to achieve greater analytical velocity and scale because each unit will have a better understanding of how their data should be used. It also means that data becomes a decentralised, self-contained product that can be consumed by anyone in the organisation. This removes the bottleneck of centralised infrastructure and gives each team the freedom to manage their own data pipelines. However, whilst data mesh can bring many benefits when it comes to driving digital decision making and innovation, FS organisations need to make sure they’re getting it right.


Innovation without compromise

Whilst innovation has never been more important, FS organisations need to be able to achieve it without compromising on governance and compliance. Regulatory compliance in the financial services sector is a complex field to navigate. Whether it's potential financial fraud or money laundering, risk comes in many forms and, due to their very nature, FS businesses are under heavy scrutiny when it comes to achieving the highest levels of data governance.

When it comes to implementing a data mesh strategy, in order to comply with legal requirements and industry standards – as well as avoid fragmentation, duplication and inconsistencies – FS organisations need to focus on the interoperability between domains. This is where data virtualisation can play a vital part.

By creating a logical layer between siloed data sources and the domain-specific data consumers, data virtualisation can help to facilitate an effective data mesh. Unlike a traditional ETL (extract, transform, load) or data warehousing model, it avoids the need to ‘move and copy’ the data. Instead, semantic models are defined in the virtual layer between the disparate data sources and the different data consumers. This allows users to abstract the data they need, as and when they need it, in near to real-time. Thanks to the simplicity of use and the minimisation of data replication enabled by data virtualisation, the creation of data products is much faster and much safer than using traditional alternatives.
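The 'move and copy' distinction above can be illustrated in miniature. The sketch below (purely illustrative Python, with made-up source names and fields, not any vendor's implementation) shows a virtual layer that joins two disparate sources on demand, rather than replicating them into a warehouse first:

```python
# Minimal sketch of a virtual (logical) data layer: the semantic model is a
# function that delegates to the underlying sources at query time, so no data
# is moved or copied. Source names and fields are hypothetical.

CORE_BANKING = [  # one "source": a core banking system
    {"customer_id": 1, "balance": 12_500},
    {"customer_id": 2, "balance": 430},
]

CRM = [  # another "source": a CRM application
    {"customer_id": 1, "name": "Alice"},
    {"customer_id": 2, "name": "Bob"},
]

def customer_view(customer_id):
    """Semantic model defined in the virtual layer: joins the two
    sources when a consumer asks, instead of replicating them."""
    account = next(r for r in CORE_BANKING if r["customer_id"] == customer_id)
    profile = next(r for r in CRM if r["customer_id"] == customer_id)
    return {**profile, "balance": account["balance"]}

print(customer_view(1))  # {'customer_id': 1, 'name': 'Alice', 'balance': 12500}
```

Because the join happens at request time, a change in either source is visible to consumers on the next query, which is what gives the approach its near-real-time character.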

From a compliance perspective, data virtualisation’s logical layer also enables organisations to automate the enforcement of global security policies. For example, leadership teams can mask the salary data in all data products unless the user has a certain HR role or is at a certain level within the business.
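The salary-masking policy described above can be sketched as follows. This is a simplified illustration of the idea, not a real product's policy engine; the role names and record fields are assumptions:

```python
# Hypothetical sketch of a global security policy enforced in the logical
# layer: salary fields are masked for every data product unless the
# requesting user holds an HR role. Role names are illustrative.

HR_ROLES = {"hr_manager", "hr_analyst"}

def apply_salary_policy(record, user_roles):
    """Return the record with the salary masked for non-HR users."""
    if HR_ROLES & set(user_roles):
        return record  # authorised: full view
    return {**record, "salary": "****"}  # masked for everyone else

employee = {"name": "J. Smith", "salary": 54000}
print(apply_salary_policy(employee, {"analyst"}))     # salary masked
print(apply_salary_policy(employee, {"hr_manager"}))  # full record
```

Defining the rule once in the logical layer, rather than in each consuming application, is what makes the enforcement global and automatic.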

For FS organisations, data-driven innovation and digital advancement have never been more important. As such, there's no doubt that modern architectures, such as data mesh, are set to play a big role moving forward. In order to truly capitalise on these architectures and achieve success tomorrow, organisations should start investing in the right technologies today. By providing a 360-degree view of all information stored across an organisation, data virtualisation is emerging as a key element of a successful, data-driven future for those operating in financial services.
