Navigating the challenges of Generative AI in financial services

By Ashmita Gupta, Global Head of Analytics at Linedata

When ChatGPT launched in late 2022, it fundamentally changed the way people perceived the possibilities of artificial intelligence (AI). By January 2023, it had become the fastest-growing consumer application in history, with over 100 million users. From a business perspective, every industry is now exploring how to integrate Generative AI (GenAI) into its business processes, and financial services is no exception.

77% of leaders across Europe’s financial services sector expect GenAI to significantly affect productivity. A further 68% predict that up to a quarter of all roles will require upskilling over the next year. However, more than a third (35%) lack action plans.

Until now, the financial services industry has used GenAI primarily in three ways: to tap its own internal knowledge base, quickly analysing structured or unstructured data sets through a Q&A engine; to summarise multiple items of information, such as investment portfolio data; and to assist with coding, especially via tools such as GitHub Copilot.

Firms are now keen to widen the scope of how they use GenAI, although they still have concerns about the associated risks. However, there are ways to take advantage of its benefits, while mitigating its potential downsides.


Adopt a problem/solution mindset

70% of organisations across sectors, including financial services companies, are currently in exploration mode with GenAI. But many have limited experience of using it, if any, and are starting with open-ended experimentation rather than focused application. They will get the most out of the technology if they identify a specific problem first, then work out how AI can provide the solution.

For example, GenAI has proved valuable for retrieving information faster and in a more structured way. This type of use case extends to others such as investment compliance, research, and Due Diligence Questionnaire (DDQ) analysis.
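As an illustration of the retrieval step behind such a Q&A engine (a sketch, not the author's system), relevant passages must first be found in an internal document store before an LLM can answer from them. The simple keyword-overlap ranking below stands in for the embedding search a production system would use; all document text and function names are hypothetical:

```python
# Minimal retrieval sketch for a Q&A engine over internal documents.
# A production system would use vector embeddings and an LLM; this shows
# only the "find the relevant passage first" step. All data is made up.

def score(query: str, document: str) -> float:
    """Fraction of query words that also appear in the document."""
    q_words = set(query.lower().split())
    d_words = set(document.lower().split())
    return len(q_words & d_words) / len(q_words) if q_words else 0.0

def retrieve(query: str, documents: list[str], top_k: int = 1) -> list[str]:
    """Return the top_k documents ranked by keyword overlap with the query."""
    ranked = sorted(documents, key=lambda d: score(query, d), reverse=True)
    return ranked[:top_k]

docs = [
    "Portfolio A breached its single-issuer concentration limit in March.",
    "The DDQ covers operational risk, business continuity and vendor oversight.",
    "Quarterly research note on European equity market liquidity.",
]

best = retrieve("Which portfolio breached a concentration limit?", docs)
```

The retrieved passage is then handed to the model as context, which is what makes the answers both faster and grounded in the firm's own data.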

Tackle privacy and data security

One of the key concerns around GenAI is keeping data private and secure, which is especially critical for the data of financial services companies and that of their clients. The use of GenAI has already led to reported leaks of sensitive internal information at companies such as Samsung, but businesses can address security while still achieving a positive return on investment:

One way is to use an open-source large language model (LLM), which can be deployed securely in a private environment so that no data is shared externally. Another, if using a public cloud environment, is to apply a private link to connect resources and disable all public access to the data.

Another area of exposure is when data is manually handled for model inference. Building an Extract, Transform, Load (ETL) pipeline and automating data ingestion jobs can mitigate the risk of access by unauthorised individuals.
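A minimal sketch of such a pipeline is below, assuming a hypothetical in-memory source and store; real pipelines would read from databases or object storage and enforce access controls at each stage. The field names and the redaction rule are illustrative assumptions:

```python
# Sketch of an ETL pipeline that automates ingestion and removes the need
# for manual handling of sensitive records before model inference.
# Field names and the redaction rule are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Record:
    client_id: str
    account_number: str
    note: str

def extract(source: list[dict]) -> list[Record]:
    """Pull raw rows from the source system into typed records."""
    return [Record(**row) for row in source]

def transform(records: list[Record]) -> list[dict]:
    """Redact direct identifiers so the inference layer never sees them."""
    return [
        {"client_id": r.client_id, "account_number": "REDACTED", "note": r.note}
        for r in records
    ]

def load(rows: list[dict], store: list[dict]) -> None:
    """Write the cleaned rows to the store consumed by the model."""
    store.extend(rows)

inference_store: list[dict] = []
raw = [{"client_id": "C-001", "account_number": "4412-9983", "note": "Q3 review"}]
load(transform(extract(raw)), inference_store)
```

Because the redaction happens inside the automated transform step, no individual ever has to touch the raw account numbers on their way to the model.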

Operationalise to maximise

The launch of ChatGPT motivated many financial businesses to increase their spend on GenAI. The real ROI will come, however, when the technology is operationalised, enabling firms to realise the benefits of cost savings and productivity growth. Here, firms need to think about:

  1. Building pipelines for deploying and maintaining GenAI solutions to gain the benefits of CI/CD (Continuous Integration and Continuous Delivery)
  2. The scalability and extensibility of the application to other internal or external groups
  3. Ensuring proper AI governance and data privacy considerations are put into practice
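To make the first point concrete, a CI/CD pipeline for a GenAI solution typically gates each release on automated checks. The sketch below shows one such pre-deployment check that validates a prompt template before it is promoted; the required placeholders, length budget and all names are assumptions, not a specific vendor's API:

```python
# Illustrative pre-deployment check for a GenAI solution, of the kind a
# CI/CD pipeline would run before promoting a release. The required
# placeholders and the length limit are assumptions for the sketch.

REQUIRED_PLACEHOLDERS = ("{question}", "{context}")
MAX_TEMPLATE_CHARS = 2000

def validate_prompt_template(template: str) -> list[str]:
    """Return a list of problems; an empty list means the check passes."""
    problems = []
    for placeholder in REQUIRED_PLACEHOLDERS:
        if placeholder not in template:
            problems.append(f"missing placeholder: {placeholder}")
    if len(template) > MAX_TEMPLATE_CHARS:
        problems.append("template exceeds length budget")
    return problems

candidate = (
    "Answer the compliance question below.\n"
    "Context: {context}\n"
    "Question: {question}"
)
issues = validate_prompt_template(candidate)
```

Running checks like this on every change, rather than reviewing prompts by hand, is what turns an experiment into an operationalised system.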

This is just the beginning

It is understandable that financial services businesses, like so many others, have ‘FOMO’ and are keen to leverage GenAI. They will benefit most from its possibilities with the right planning and precautionary measures. But it is also important to remember that the technology is still advancing. As the AI landscape develops, views on its application will quickly evolve too. And the possibilities may be endless.
