Financial services turn to observability to aid breach investigations and threat diagnosis

Nick Heudecker, Senior Director of Market Strategy for Cribl

 

Organisations across all industries have turned to digital channels to maintain operations during the Covid-19 pandemic. Banking and financial services are no exception, creating new channels to accommodate both the huge shift to online transactions and the switch to remote working.

These changes played out against a surge in cybercrime, with the pandemic creating a perfect storm for new methods of attack. The financial industry is disproportionately affected by cyberattacks today, with 23 percent of all attacks directed at financial institutions[1]. In fact, banks experienced a 1,318 percent year-on-year increase in ransomware attacks alone in the first half of 2021[2].

Even before these monumental changes, data volumes were growing sharply year on year. Now, with even more digital channels through which to engage customers, banking and financial organisations need a way to manage ever-increasing volumes and varieties of data. Customers demand access to banking and financial services 24 hours a day, seven days a week, which means organisations need a better understanding of how their applications and data behave – while keeping that data secure against breaches and attacks and meeting privacy and regulatory requirements.


The challenges of data monitoring

It is important to acknowledge that financial organisations recognise the value of data, and of leveraging it to make better business decisions – whether that is improving customer service, matching prospective customers with new offers, bringing services to market faster or reducing operational overheads.

However, collecting and storing vast amounts of data comes at considerable cost. At the same time, organisations must manage highly dynamic and complex distributed environments, with application developers, SREs and security teams often working in silos and no single view across all applications.

So, while financial organisations want to leverage the power of data for better business insight, this often means pulling data from multiple sources – the network, servers, endpoints and applications – before loading it into a log analytics or security platform. That data arrives in a variety of formats, which do not always match the formats the tools require. These platforms also provide little control over the data and make it difficult to share it between different tools. This lack of choice limits the effectiveness of security programmes.

Here, the CIO of a bank or financial institution might traditionally have looked to data monitoring solutions. The problem with monitoring is that you have to know what you want to ask ahead of time. Exacerbating that problem, a single application today might be deployed across multiple clouds, on premises, or at the edge in containers, and might rely on dozens of different technologies.

The answer lies in observability pipelines. These can drive operational efficiencies by getting the right data to the right destinations, in the right formats, at the right time. Put simply, observability is the practice of interrogating your environment without knowing in advance the questions you need to ask.

Observability is more operational than analytics-based. Instead of relying on point-to-point solutions, an observability pipeline takes all of the event data – logs, metrics and traces – and runs it through a centralised point, putting decision making back in the hands of the organisation. Users can decide how to route that data, filter it and redact sensitive fields such as PII, governing all of their data in one place. This also reduces the amount of data sent to downstream systems.
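To make that concrete, here is a minimal Python sketch of a single pipeline stage that drops noisy events, hashes assumed PII fields and routes each event to a destination. The event shape, field names and destinations are illustrative assumptions, not any vendor's actual API.

import hashlib
import json

PII_FIELDS = {"account_number", "email", "phone"}   # assumed PII field names
NOISY_EVENTS = {"heartbeat", "debug"}               # event types to drop

def redact(event):
    """Replace PII values with a one-way hash so events remain correlatable."""
    return {
        k: hashlib.sha256(str(v).encode()).hexdigest() if k in PII_FIELDS else v
        for k, v in event.items()
    }

def route(event):
    """Decide which downstream system receives the event."""
    if event.get("severity") in ("error", "critical"):
        return "siem"        # security analytics
    return "archive"         # low-cost object storage

def process(raw_events):
    """Filter, redact and route events at a single central point."""
    for event in raw_events:
        if event.get("type") in NOISY_EVENTS:
            continue         # reduce the volume sent downstream
        yield route(event), redact(event)

if __name__ == "__main__":
    sample = [
        {"type": "login", "severity": "error", "email": "user@example.com"},
        {"type": "heartbeat"},
    ]
    for destination, event in process(sample):
        print(destination, json.dumps(event))

Filtering and redacting at this single point is what allows teams to cut downstream volume and enforce governance without losing the ability to correlate events later.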

Aiding breach investigations and threat diagnosis

As mentioned, financial services place a particular focus on security, and on using observability to aid breach investigations and the diagnosis of potential threats.

Leading tools enable organisations to encrypt sensitive data in real time before it is forwarded to and stored at a destination, ensuring anonymity for every customer. This helps financial institutions keep customers’ personally identifiable information (PII) safe – mitigating the risk of a breach and ensuring continued customer loyalty.
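As a hedged sketch of what field-level protection can look like, the snippet below encrypts assumed sensitive fields before an event is forwarded, using the third-party Python cryptography package; the field names are illustrative, and in practice the key would come from a key management service rather than being generated inline.

from cryptography.fernet import Fernet

SENSITIVE_FIELDS = {"account_number", "national_id"}   # assumed sensitive fields

key = Fernet.generate_key()   # assumption: in production this comes from a KMS
cipher = Fernet(key)

def encrypt_fields(event):
    """Encrypt sensitive values so the forwarded copy never holds plaintext PII."""
    return {
        k: cipher.encrypt(str(v).encode()).decode() if k in SENSITIVE_FIELDS else v
        for k, v in event.items()
    }

event = {"account_number": "12345678", "amount": 250.0}
print(encrypt_fields(event))   # PII is ciphertext; authorised teams can decrypt with the key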

Observability pipelines can also allow financial institutions to place full-fidelity data in low-cost object storage for as long as they need. As and when organisations discover a security breach, they can collect data from that storage and replay it to any SIEM or UEBA system. This also means they can put customers at ease by quickly diagnosing and resolving existing breaches and potential cyber threats.
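The sketch below illustrates that replay idea under stated assumptions: archived events sit as newline-delimited JSON in an S3-compatible bucket (read here with boto3), and the SIEM exposes a generic HTTP ingest endpoint. The bucket name, key prefix and endpoint URL are hypothetical.

import json
import boto3
import requests

s3 = boto3.client("s3")                               # assumes credentials are configured
BUCKET = "observability-archive"                      # hypothetical bucket
SIEM_URL = "https://siem.example.com/ingest"          # hypothetical HTTP ingest endpoint

def replay(prefix):
    """Stream archived newline-delimited JSON events for a time window to the SIEM."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=prefix):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
            for line in body.splitlines():
                requests.post(SIEM_URL, json=json.loads(line), timeout=10)

replay("events/2021/07/")   # replay only the window relevant to the investigation

Because only the relevant window is replayed, an investigation does not require keeping every event in the SIEM's expensive hot tier.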

Observability provides the banking and financial services industry with an affordable way to retain more data for longer periods of time while still making that data easily accessible for breach investigations, whenever they happen.

It empowers financial organisations to make choices that best serve their needs without undermining their digital transformation efforts or the customer experience – and without sacrificing organisational security.

 

[1] https://www.institutionalassetmanager.co.uk/2021/08/19/305127/financial-institutions-are-prime-targets-cybercriminals-and-future-attacks-are

[2] https://www.securitymagazine.com/articles/96128-banking-industry-sees-1318-increase-in-ransomware-attacks-in-2021
