Cathal McGloin, CEO of ServisBOT, writes: “The COVID-19 pandemic has created major spikes in calls to financial sector helplines dealing with customers who are concerned about temporary business closures, or seeking information on mortgage holidays and insurance cover.”
Easing the pressure
With call volumes surging at many contact centres, moving customers from a voice to a text-based channel and encouraging some of them to self-serve via your website or mobile app helps to reduce pressure on contact centre agents. A call-deflection solution doesn’t have to be complex, costly or time-intensive, but it can be extremely effective in managing additional call volumes more cost-effectively, while still providing your customers the information that they need to allay their concerns.
If customers can interact with a chatbot first and it resolves their immediate queries, call volumes fall significantly, and the business can still enable the bot to hand over to a customer service agent for customers who require further assistance.
Setting up a Chatbot in 48 hours
Whether your interactive voice response (IVR) is based on legacy technology or is a modern cloud-based solution, it’s possible to deflect customers from an inbound voice channel to a messaging channel. We know, because we have done this for a client who considered it impossible with their legacy on-premise IVR system. Spinning up a solution took just two days and allowed them to successfully deflect calls, automate the response, and still offer customers a path to live chat.
Employing a Chatbot as a Call Deflection Solution
Financial services businesses can launch a very simple bot. It can be as basic as pointing a customer to the COVID-19 FAQ page, or it can be an extension of an existing customer service bot that offers multiple capabilities. On day one it may simply triage queries and hand over to a live agent. By gathering training phrases from customer chats, however, the bot can be made progressively smarter and gain capabilities, so that over the course of a week it can start automating your customer service.
After that first week, the bot becomes more self-sufficient and takes more of the burden from your customer service agents, freeing them to handle more complex customer issues.
Using a chatbot opens up a whole new path to automation. Once customers start to engage with your intelligent virtual agent, the bot can handle simple requests, direct them to the relevant information on your website, or help them transact in a self-service manner. All of this can happen without the need for them to engage with an agent unless they specifically request this, or the bot escalates the request to an agent. It can even be integrated with your live chat systems so that the bot works in parallel with live agents when needed.
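The triage-then-escalate flow described above can be sketched in a few lines. This is a minimal illustration, not ServisBOT's product: the intents, training phrases, and responses below are invented, and a real bot would use a trained NLU model rather than simple string similarity.

```python
# Minimal sketch of a call-deflection bot: match a customer message
# against training phrases per intent, answer simple queries directly,
# and hand over to a live agent when confidence is low.
from difflib import SequenceMatcher

# Illustrative intents and phrases, not a real training set.
TRAINING_PHRASES = {
    "mortgage_holiday": ["mortgage holiday", "pause my mortgage payments",
                         "defer mortgage"],
    "opening_hours": ["are you open", "branch opening hours",
                      "temporary business closures"],
}

RESPONSES = {
    "mortgage_holiday": "You can apply for a payment holiday via our website.",
    "opening_hours": "Branch hours are listed on our COVID-19 FAQ page.",
}

def best_intent(message, threshold=0.6):
    """Return (intent, score) for the closest training phrase."""
    best = (None, 0.0)
    for intent, phrases in TRAINING_PHRASES.items():
        for phrase in phrases:
            score = SequenceMatcher(None, message.lower(), phrase).ratio()
            if score > best[1]:
                best = (intent, score)
    return best if best[1] >= threshold else (None, best[1])

def reply(message):
    intent, _ = best_intent(message)
    if intent is None:
        # No confident match: escalate to live chat.
        return "Let me connect you to an agent."
    return RESPONSES[intent]
```

Because escalation is the fallback rather than the default, every query the bot answers confidently is a call deflected, while uncertain queries still reach a human.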
During crisis periods, when interactions with concerned customers need to be handled well, call deflection using a chatbot or virtual agent takes the pressure off contact centre agents. It also introduces an automation path that can help customers around the clock.
Once your chatbot has been trained to respond to common customer queries round the clock and reduce the pressure on your contact centre staff, your employees can focus on providing the best care for your customers who urgently need to speak to them. Introducing virtual assistants sends a clear message to your customers that they are your priority and increases the resilience of your business against future emergencies.
Defining Fraud in 2023
Scott Buchanan, Chief Marketing Officer at Forter
Fraudsters are fluid — they constantly experiment with new tactics to find cracks in a merchant’s defenses. In 2023, there are five trends that merchants need to be aware of — we saw each in 2022 and expect to see them with even more frequency in the year ahead.
Human ‘Bot’ Farms
First, let us acknowledge that while “human bots” is an oxymoron, it is also highly insensitive. At present, our industry lacks a better way of describing the practice. It used to be that human ‘bot’ farms referred to sweatshop-style arrangements in which poorly paid workers, often in developing countries, spent their days on brute force attacks, solving things like CAPTCHAs.
Now, though, a new twist on this old theme has arisen. In short, human bot farms use trafficked humans to scale their fraud operations. Often, they behave as bots, conducting brute force (and similar) attacks.
Human bots were widely recognised in fraud manager communities as a driving force behind recent repeated attacks, especially during the holiday rush. For example, human bot farms bombarded merchants that offer limited-edition merchandise, decreasing the chances that prized products find their way to (and ultimately frustrating) good customers. These same operations also applied several of the tactics that follow, at a scale that overwhelmed some fraud solution providers and their merchant customers.
Low-tech Address Manipulation
In the past year, fraudsters reverted to old tricks to circumvent rule-based fraud prevention, and we saw an uptick in low-tech address manipulation. Consider a merchant with a rule set that checks a shipping or billing address against a negative list, and say a noted fraudster has an address of 123 Main Street that is on that list. Any transaction with a shipping or billing address of 123 Main Street will therefore be blocked by rules.
Fraudsters found an easy workaround. They simply write a variation of the address during checkout that evades the rules but can be easily understood by FedEx, UPS, or any other delivery company. For example, 123 Main Street becomes One-two-three Main Street or 123 Maain Street.
This should be simple to identify and block in theory. Still, fraud managers were frustrated that rules-based solutions — even those that applied artificial intelligence to speed rules application — struggled to spot this manipulation. During the Black Friday rush, more than one vendor threw up their hands and admitted they had no way to stop this tactic effectively. And as a result, fraud teams with these solutions had to manually review a growing queue of transactions.
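A simple defence is to normalise the address before it is checked, rather than exact-matching the raw string. The sketch below is an illustrative assumption, not any vendor's implementation: it folds spelled-out digits back to numerals, collapses repeated letters, and fuzzy-matches against a normalised negative list.

```python
# Illustrative only: catch low-tech address variants by normalising
# the address before the negative-list check.
import re
from difflib import SequenceMatcher

WORD_DIGITS = {"zero": "0", "one": "1", "two": "2", "three": "3",
               "four": "4", "five": "5", "six": "6", "seven": "7",
               "eight": "8", "nine": "9"}

def normalise(address):
    """Lowercase, fold digit words to numerals, collapse letter repeats."""
    addr = address.lower()
    for word, digit in WORD_DIGITS.items():
        addr = re.sub(rf"\b{word}\b", digit, addr)   # "one" -> "1"
    addr = re.sub(r"(?<=\d)[\s-]+(?=\d)", "", addr)  # "1-2-3" -> "123"
    addr = re.sub(r"([a-z])\1+", r"\1", addr)        # "maain" -> "main"
    return re.sub(r"\s+", " ", addr).strip()

# Store the negative list in normalised form so both sides match.
NEGATIVE_LIST = {normalise("123 Main Street")}

def is_blocked(address, threshold=0.9):
    """Block exact or near matches against the normalised negative list."""
    norm = normalise(address)
    return any(SequenceMatcher(None, norm, bad).ratio() >= threshold
               for bad in NEGATIVE_LIST)
```

Under this sketch, "One-two-three Main Street" and "123 Maain Street" both normalise back to the listed address and are blocked, while unrelated addresses pass.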
Triangulation
With the growing presence of marketplaces for exchanging goods, fraudsters are using triangulation more often. Think of it as ‘stolen to order’ (instead of made to order). A fraudster posts a sought-after item for sale on a marketplace; in 2022, some of the most popular items for triangulation were high-end ‘cozy’ blankets, sneakers, gaming systems, and other electronics.
When a consumer buys an item from a fraudster on the marketplace, the fraudster then steals the item from a merchant. They input a shipping address for the marketplace buyer at checkout, which typically evades address verification checks. The marketplace buyer gets their item; the fraudster gets their money; the merchant gets penalised, and the marketplace is entirely unaware.
Fraudsters prefer triangulation because they don’t make any effort until they have a buyer — they never have to worry about stealing something they can’t sell, and they never have to touch the merchandise (further reducing their operating costs).
Double-Dipping
Emboldened cheaters are attempting more brazen tactics. A prime example is double-dipping; while this is not new, we did see more attempts (especially from amateurs and previously good consumers) to double-dip in 2022.
Double dipping can take any form where a bad actor wins twice. For example, the bad actor makes a purchase and has the product shipped. They tell the merchant the item was not received and simultaneously file a chargeback with their issuer. Since it may take hours or days for the issuer to inform the merchant of the refund request, the communication gap can mean the bad actor receives money back from both entities and keeps the product.
We’ve also heard examples of bad actors buying and receiving an item, then filing a return, yet failing to return the item. Instead, they send the merchant back a package with rocks (or something else weighted). In one particularly devious example, a bad actor filled a bag with dry ice, which passed the delivery company’s weight check and then sublimated, so the package arrived at the merchant empty.
Friendly Fraud
The best-known form of friendly fraud is chargeback fraud, in which a customer makes a purchase and receives it, but files a fraud chargeback claiming that the purchase was made by a fraudster. This form of friendly fraud has been growing dramatically in recent years. Less recognised is that other forms of friendly fraud — which can also be labeled policy abuse — are increasingly serious.
For example, a consumer buys a sweater as a final sale. When it arrives at their doorstep, they realise it doesn’t fit as they’d hoped. Disappointed, the (previously good) consumer contacts the merchant to claim the sweater never arrived (code = Item Not Received) and demands a refund. The consumer now has the item they can wear (hey, at least the fit is close) or resell on a marketplace for profit.
Friendly fraud can also surface as returns abuse (returning items worn or outside of store policies), promotions abuse (re-using new customer discounts or other voucher codes), and more.
Friendly fraud is difficult to stop since it is often perpetrated by good consumers — they don’t appear on negative lists or fail basic rules. But professional fraudsters get in on the same acts, industrialising the consumer problem by increasing its scale and professionalism significantly. To increase their odds of success, they have gotten pretty systematic about this form of fraud. For example, on the dark web, fraudsters have shared the exact language to use when calling specific large merchants or issuers to nearly guarantee a refund or chargeback.
Parting Thought: The Power of Identity
The above tactics that fraudsters used with some success in the past year generally exploit gaps in rules-based systems (deployed by the merchant and/or offered by a fraud solution provider). These tactics don’t work when you can pinpoint the identity behind an interaction.
When you can be statistically confident that the identity entering an address of “One-two-three main street” is associated with fraud, it doesn’t matter what they enter in the address field; their transaction attempt is blocked. When a known fraudster is attempting to put an item up for sale on a marketplace or purchase an item with a net new shipping address, you stop them. And when they try to re-use promotional codes repeatedly, you reject the attempt.
You cannot pinpoint an identity with rules — instead, you need a massive graph of online identities and as much data as possible on each. While fraudsters always manipulate aspects of their identities, they cannot mask thousands of data points. Next-generation fraud solutions that use machine learning to augment human expertise can pattern match and pinpoint identity.
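As a toy illustration of that point (the signal names and the 0.8 threshold are invented for the example, not Forter's model): even when a fraudster rewrites the address and email, the many signals they cannot easily change still tie the attempt back to a known identity.

```python
# Toy identity-matching sketch: a fraudster can alter a few attributes,
# but cannot mask the bulk of the signals tied to their identity.
# All field names and values are illustrative.

KNOWN_FRAUD_IDENTITY = {
    "device_id": "d-4821", "card_bin": "411111", "browser": "firefox-108",
    "timezone": "UTC-5", "screen": "1440x900", "os": "macos-13",
    "typing_cadence": "fast-burst", "ip_asn": "AS7922",
    "email": "fraud@example.com", "address": "123 main street",
}

def identity_overlap(interaction, known):
    """Fraction of the known identity's signals that match the interaction."""
    shared = [k for k in known if k in interaction]
    if not shared:
        return 0.0
    return sum(interaction[k] == known[k] for k in shared) / len(shared)

def is_same_identity(interaction, known, threshold=0.8):
    return identity_overlap(interaction, known) >= threshold

# New attempt: address and email changed, everything else stable.
attempt = dict(KNOWN_FRAUD_IDENTITY,
               address="one-two-three main street",
               email="new@example.com")
```

In this sketch the attempt still matches 8 of 10 signals, so it is linked to the known fraudster regardless of what was typed in the address field; a production system would weight thousands of signals via machine learning rather than a flat average.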
And to build the largest identity graph, you need a consortium of the largest merchants — collectively, they will ‘know’ the vast majority of online identities. And in this model, an identity — a bad actor or a good customer — known to one merchant is immediately known to all merchants.
And that is why the final trend for 2023 will be merchants abandoning rules-based systems at an increasing rate. That includes the rules-based fraud solution providers that masquerade as machine learning but really just speed up the application of rules. To combat more sophisticated fraudsters, merchants will make decisions based on identity, and they will seek out the largest identity graph in order to achieve superior results.
Mizuho Bank Luxembourg upgrades anti-financial crime compliance risk management with Napier
Mizuho Trust and Banking (Luxembourg) S.A., the Luxembourg subsidiary of Japan’s Mizuho Trust & Banking division (part of Mizuho Financial Group), is upgrading its transaction monitoring framework through a partnership with Napier, the financial crime compliance technology specialist.
Napier Continuum, an intelligent compliance platform that includes Transaction Monitoring, Client Screening, Perpetual Client Risk Assessment and Client Activity Review, will provide Mizuho Bank Luxembourg with a holistic overview of compliance, enabling it to connect data, control compliance operations, and manage risk.
The bank wanted to upgrade its framework to make it more robust, given the importance of financial crime compliance for credit institutions.
Naim Tliba, Chief Compliance Officer and Vice President at Mizuho Trust and Banking (Luxembourg) S.A., said: “We chose to work with Napier as it has the flexibility to meet our needs, at the same time offering the most advanced technology supported with powerful AI. With improved transaction and client monitoring capabilities, our organisation will be able to stay ahead of the curve and provide our clients with the most secure and regulated asset servicing experience.”
As part of Mizuho Financial Group, Mizuho Trust & Banking (Luxembourg) S.A. was formed in 2000 and provides securities and fund services to its institutional clients.
Napier’s compliance technology helps businesses and financial institutions comply with local and international anti-money laundering (AML) regulations, monitor transactions, and screen customers and business partners, thereby contributing to wider efforts to combat financial crime.
Greg Watson, CEO at Napier, said: “Our range of new-breed compliance solutions help organisations like Mizuho Luxembourg to gain control over their risk management so that it can become a competitive advantage. The technology is one side of this, but it’s the capability to adapt a system in adherence with local regulations that offers the most effective solution, and that’s what we have been able to provide Mizuho Luxembourg. Approaching a system upgrade in this proactive way means that they will be equipped with a futureproofed anti-money laundering strategy that will take care of their AML compliance needs.”