Entering the era of intelligent payments


Automation and artificial intelligence are transforming the payments industry into one of the most dynamic sectors of transaction banking. But there are still many teething problems in an industry that has been catapulted onto centre stage.

Doing business in Sudan isn’t easy, but back in 2011, it should have been possible. 

“We were working with major telecoms company Zain based out of southern Sudan at the time – before the country split in two,” explains Charlie Tryon, chief executive of Maris, an investment holding company that operates across east and southern Africa.

“Sudan was on the US’s Office of Foreign Assets Control (OFAC) sanctions list, but the south was exempt, so it should have been OK for us to work here. We took all the precautions we needed – did all the necessary paperwork, spoke to all the right people, told the banks, made sure we had permission from OFAC and notified all other regulatory bodies about our cross-border transaction – everything.

“But the money we transferred from our business bank account to Zain was still stopped,” he says. 

Tryon’s transaction failed to pass compliance tests. In a bind, he was forced to physically move thousands of dollars from a bank account in Uganda to Sudan to pay his suppliers – at great personal risk. 

“We had suitcases full of cash that we took via plane and car into Sudan,” says Tryon. “This isn’t the ideal way to run a business, but at the time we had very little choice.”



Behavioural biometrics provides an added layer of visibility to distinguish between legitimate applicants and fraudsters - Frances Zelazny, BioCatch


A lot has changed since 2011. Know-your-customer and anti-money laundering (AML) screening is increasingly automated, helping to remove some of the delays caused by strict compliance measures. 

On top of this, banks, retailers, payment service providers (PSPs) and other businesses involved in the money transfer process are using artificial intelligence (AI) to make much more accurate decisions about payments. 

“At the time, we spent a huge amount of money and time trying to set this right,” says Tryon. “It included international travel, lobbying, meetings with banks and regulators. But I think what it came down to was the fact that the international banks just didn’t care.” 

With all the automation and AI now used in payments, can the issues that Tryon faced back in 2011 be avoided today?

AI adoption

Matt Mills, Featurespace

“I challenge anyone to find a fraud prevention company that doesn’t have the words ‘AI’ or ‘machine learning’ somewhere in its description,” says Matt Mills, chief commercial officer at Featurespace, a company that uses adaptive behavioural analytics to detect anomalies in real time to prevent fraud. 

AI has been adopted by payment platforms and banks to tackle the key problem in the sector – the need to protect the payments system from fraudulent activity, while at the same time coping with the exponential growth in digital payments.

The World Payments Report, published by Capgemini and BNP Paribas, predicts there will be over 876 billion non-cash transactions made in 2021, nearly double the volume recorded in 2016. Braintree, a division of PayPal that specializes in mobile and web payment systems for e-commerce companies, says that by the end of 2019, 2.1 billion consumers will have used a digital wallet – an increase of 30% since 2017. 

In November last year, the European Central Bank launched Target Instant Payment Settlement (Tips), a system that uses central bank money to settle payments individually in less than 10 seconds. In the US, the Federal Reserve plans to go live with its Real Time Gross Settlement system in 2020. At the end of 2018, financial services company FIS found that there were 40 active real-time payment programmes around the world, up from 25 in 2017. 

It is becoming a lucrative industry. Global payments revenue grew 11% year on year to $1.9 trillion in 2017, according to McKinsey. The firm predicts the sector will grow to $2 trillion by 2020 and $3 trillion within five years.

Payments is the new battleground for transaction bankers, who are now scrambling to deploy the latest technology to keep up with the competition.

“If you want to do well in this space, you need to have state-of-the-art, usually expensive, payments technology, which has been difficult for incumbents to develop alone – hence the rise in M&A activity in recent years,” says Jerry Norton, vice-president, financial services, at IT and business process services company CGI. 

Jerry Norton, CGI

“Banks are much more likely to acquire this technology or outsource their payments business than they are to develop it themselves,” he says. 

In September this year, HSBC became the latest bank to acquire technology from Featurespace to support efforts to strengthen AML and fraud prevention in the insurance and retail sectors. 

Moreover, convenience at the individual level – where instant online payments are made with a click of a button – is also driving the change. 

“Corporates and individuals are not dissimilar; they both want seamless transactions,” says Luca Corsini, head of global transaction banking at UniCredit.

“This is happening for the first time,” he says. “Just a few years ago, innovation in banking stemmed from large corporate clients, filtered down to small corporates and then into retail banking, but we’re seeing a shift in this paradigm.

“Think about what is happening in payments,” he says. “In China, WeChat and Alipay have become ubiquitous. This is the type of service and efficiency that corporates want as well.”

Lines blurring

Banks haven't been the only players in payments for some time, and there has been a rise in the number of payment apps, PSPs and forms of infrastructure that payments run on. As the industry grows, the lines are blurring.

“There is a lot of overlap in payments,” says an industry expert. “We are in a sort of mix-and-match situation, where banks and PSPs will work together or compete. Who will be doing what in the future hasn’t really been decided as yet.” 

One thing that is inevitable is consolidation. In July, FIS closed its $43 billion acquisition of Worldpay – the biggest M&A deal in the payments sector to date.

In the same month, Fiserv and US financial services company First Data finalized their merger for $22 billion, while in June, PayPal acquired iZettle for $2.2 billion. In August, Mastercard announced the $3.2 billion purchase of the real-time payments unit of Nets Group, which will complement the company’s acquisition of VocaLink in 2016. 

VocaLink operates key payments technology such as automated clearing services.

“Just look what’s happening in the industry,” says one head of banking, who wished to remain anonymous. “Who would have thought that a payments company would have their own automated clearing house infrastructure 10 years ago? The entire payments landscape is in flux and new players are coming on board.”



Citi's Stuart Riley



Stuart Riley, global head of institutional client group technology and operations at Citi, says: “The only way to quickly process huge volumes of payments is to automate the process. We wouldn’t be able to physically keep up with the volumes if we didn’t.

“But if we want to comply with regulation and make legitimate transactions quickly, banks will have to start using AI.” 

Machine learning

Machine learning – the use of algorithms and statistical models that allow computer systems to perform a specific task without explicit instructions – allows banks and payment platforms to make accurate decisions around compliance issues without human intervention. And machine learning in payments can support financial inclusion. 

“The stat is [that] on average, for every fraudulent transaction that is stopped, 20 legitimate transactions are declined either by the bank, retailer or payment company,” says Michael Reitblat, chief executive of fraud prevention company Forter.

“Basically, someone along the line will raise a red flag and say that this looks suspicious without having the full context, perhaps trying to maximize for a specific financial outcome rather than punishing people for actual fraudulent activity.”
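
To put that ratio in perspective, here is a minimal back-of-the-envelope calculation in Python, assuming the illustrative one-in-20 figure that Reitblat cites:

```python
# Back-of-the-envelope sketch of the one-in-20 ratio Reitblat describes.
# Assumption (illustrative only): for every fraudulent payment that is
# stopped, 20 legitimate payments are declined alongside it.
fraud_stopped = 1
legitimate_declined = 20

total_declined = fraud_stopped + legitimate_declined
precision = fraud_stopped / total_declined              # share of declines that were actually fraud
false_decline_share = legitimate_declined / total_declined

print(f"Share of declines that were fraud: {precision:.1%}")            # ~4.8%
print(f"Share that were legitimate:        {false_decline_share:.1%}")  # ~95.2%
```

On those assumptions, fewer than one in 20 declined payments is actually fraudulent; the rest are legitimate customers being turned away.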

Mills at Featurespace says: “If you start from a position where you consider everyone to be a good consumer and then place anomaly detection on top of this, machine learning will be more accurate in identifying the suspicious behaviour and, subsequently, more fraudulent transactions.

“AI turns the whole process on its head.”
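
The sketch below illustrates the general approach Mills describes, using a generic off-the-shelf anomaly detector trained on historical activity. It is a simplified stand-in, not Featurespace's adaptive behavioural analytics, and the transaction features are invented for the example.

```python
# Minimal anomaly-detection sketch: treat past behaviour as the baseline of
# "good" activity and flag only transactions that deviate from it.
# Illustrative stand-in only; the feature columns are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Historical, presumed-legitimate transactions: [amount, hour_of_day, merchant_risk_score]
history = np.column_stack([
    rng.normal(80, 25, 5_000),      # typical spend around 80
    rng.normal(14, 4, 5_000),       # mostly daytime activity
    rng.uniform(0, 0.3, 5_000),     # low-risk merchants
])

# Start from the assumption that everyone is a good customer, then layer
# anomaly detection on top: only ~1% of traffic is expected to be flagged.
detector = IsolationForest(contamination=0.01, random_state=0).fit(history)

new_transactions = np.array([
    [75.0, 15.0, 0.1],     # looks like normal behaviour
    [4_900.0, 3.0, 0.9],   # large amount, 3am, risky merchant
])

flags = detector.predict(new_transactions)   # +1 = normal, -1 = anomalous
for tx, flag in zip(new_transactions, flags):
    print(tx, "review" if flag == -1 else "approve")
```

The design point is the one Mills makes: the model learns what normal behaviour looks like and intervenes only on the small fraction of traffic that deviates from it, rather than screening every payment against rigid rules.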

Machine learning can learn from human behaviour, refine user profiles and use those profiles to validate sessions or transactions. Behavioural biometrics are particularly robust at measuring individual human behaviour and are very difficult to hack.

Whereas physical biometrics uses identifiers such as fingerprints or iris patterns, behavioural biometrics uses voice identification, gait analysis or the way someone uses a keyboard or mouse at their computer to verify transactions. A distinct change in the way you use your electronic devices or type in sensitive data online is enough to flag potential fraud. 
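
As a rough illustration of the keystroke-dynamics idea (not BioCatch's technology; the timings and threshold are made up), a session's typing rhythm can be compared against a profile learned from the account holder's past behaviour:

```python
# Illustrative keystroke-dynamics check: compare the inter-key timings of a
# new session against a stored profile for the account. Purely a sketch of
# the behavioural-biometrics idea; the data and tolerance are invented.
from statistics import mean

def timing_profile(keystroke_times: list[float]) -> list[float]:
    """Gaps (in seconds) between successive key presses."""
    return [b - a for a, b in zip(keystroke_times, keystroke_times[1:])]

def looks_like_account_holder(stored_gaps: list[float],
                              session_gaps: list[float],
                              tolerance: float = 0.08) -> bool:
    """Flag the session if its average typing rhythm drifts too far
    from the profile learned for this account (tolerance is arbitrary)."""
    return abs(mean(stored_gaps) - mean(session_gaps)) <= tolerance

# Profile built from the account holder's past sessions (hypothetical data).
profile = timing_profile([0.00, 0.18, 0.35, 0.55, 0.71, 0.90])

genuine = timing_profile([0.00, 0.17, 0.36, 0.52, 0.70])   # similar rhythm
suspect = timing_profile([0.00, 0.05, 0.09, 0.14, 0.18])   # bot-like, far too fast

print(looks_like_account_holder(profile, genuine))   # True  -> let it through
print(looks_like_account_holder(profile, suspect))   # False -> step-up checks
```

A production system would combine many more signals and statistical models, but the principle is the same: the behaviour itself becomes part of the credential.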

The technology is incredibly accurate, say its advocates.

Frances Zelazny, BioCatch

“And this technology has applicability across the board,” says Frances Zelazny, chief strategy and marketing officer at BioCatch, a behavioural biometrics and threat detection company.

“Traditionally, most of the deployments of AI-driven behavioural biometrics have focused on preventing account takeover. But with the data criminals have available from the dark web, they can easily apply for credit cards, insurance policies, open bank accounts, set up social security numbers, and more, under assumed and made up identities. 

“In this case,” Zelazny says, “behavioural biometrics provides an added layer of visibility to distinguish between legitimate applicants and fraudsters.”

Stop fraud at the payment stage and there is the potential to stop a whole host of other criminal activity from taking place.

Fines

Banks care about compliance and fraud. When they don’t, it can result in billions of dollars in fines and severe reputational damage. 

According to Boston Consulting Group, between the financial crisis and 2017, banks were fined a combined $321 billion due to regulatory failings relating to market manipulation, money laundering and terrorist financing. During the same time period, fines relating to OFAC – the same sanctions programme that affected Tryon at Maris – hit $16 billion.

In September, the UK tax authority, HMRC, handed out its biggest fine to date for AML violations. Touma Foreign Exchange was hit with a £7.8 million bill for a breach of regulations between June 2017 and September 2018. 

In April, UniCredit pleaded guilty to charges that it allowed Iranian customers to conduct transactions in violation of sanctions. The bank will pay $1.3 billion to the US authorities as part of the settlement. A few days earlier, Standard Chartered was fined $1.1 billion for violations of sanctions relating to Burma, Cuba, Iran, Sudan and Syria. 

De-risking by exiting relationships and closing client accounts may be the only viable option for financial institutions that have either been burnt in the past or are wary of committing offences. 

No wonder companies doing legitimate business in riskier countries feel that the banks don't care, as Tryon puts it.

“It’s not as simple as saying we don’t care – that’s not the case at all – but because we have to comply with regulations we take necessary precautions,” says Katharine Steger, executive director, public sector and development organizations at Standard Chartered. 



We live in a world where if you are waiting longer than 24 hours for a payment to come through, you are waiting too long - Trevor LaFleche, Fiserv


Inevitably, that requires stopping a payment to carry out the necessary due diligence. 

“If we can develop relationships with some of these corporate clients, or others that face difficulties when it comes to cross-border transactions, then perhaps we can work on arrangements to ensure that their transactions are completed in a timely manner, removing the delays and issues such as delayed or cancelled payments. But we have to tread carefully,” she says.

Transparency helps. Products such as Swift GPI, which cuts transaction times, tracks payments from start to finish and records all accompanying costs, will strengthen banking relationships with corporates. 

“It creates a paper trail that wasn’t possible before,” says Tony Wicks, head of financial crime compliance at Swift.

But while messages cross the Swift network in real time, funds can still take hours or days to settle in the beneficiary account. And for some, that's just not good enough.

“We live in a world where if you are waiting longer than 24 hours for a payment to come through, you are waiting too long,” says Trevor LaFleche, director of product management and marketing at financial services technology company Fiserv.

“Swift GPI was lauded as one of the greatest developments in cross-border transactions and payments,” he says, “but we are already transitioning to the next stage, where instant payments are now the norm. Individuals and corporates expect more.”

Cross-border

Can AI help with cross-border transactions and reverse the trend towards de-risking among banks and PSPs? 

“Real-time on-boarding is becoming a reality because corporate clients demand it,” says Rachel Woolley, global AML manager at the compliance and data management company Fenergo, which works with Santander, UBS, Bank of China and a number of other large financial institutions. “We use robotics and AI to streamline part of the process.” 

But it’s not as simple as just applying AI. 

“We still have to weigh this up with the risk,” says Woolley. “Onboarding cannot be automated in every instance because by their nature, some issues – geography, type of business, people involved – create an individual risk profile that needs to be assessed in depth and in person.” 

Take the example of charity ActionAid. 

“When there has been a major humanitarian disaster, we need money on the ground urgently – to buy from our suppliers or to pay health workers for example,” says Rachid Boumnijel, ActionAid’s humanitarian deputy director, who has worked in disaster zones such as the Philippines, Mozambique and Indonesia.

“But we still find that even today our transactions are stopped, despite having gone through the proper due diligence with our bank partners. This can severely delay the relief effort, putting people in danger.”

It’s not just geography but sector that impacts the process. 

“Charities, money service bureaus, the defence industry – all of these are areas where additional checks will be applied as part of the compliance process, because traditionally these are the types of sectors that are flagged the most,” says Wicks at Swift. 

“Unfortunately, specific cases within these sectors raise the bar for all companies within them.”



Organizations that aren’t obviously monetizing your data are doing it in a more subtle way; unfortunately, that tends to mean that they are working with the highest bidder - Matt Mills, Featurespace


Even when compliance issues are met, there are many other factors still at play. 

Harry Newman, head of banking at Swift, says: "While cross-border payments are being sped up, there are still challenges to overcome such as domestic payment systems' operating hours, which can cause delays when you transfer cash cross-border – but this is changing as we integrate instant cross-border payments with real-time domestic systems."

The world may be moving towards instant payments, but we are not there yet.

Those manual, time-consuming processes that banks hope to automate through AI will remain in place because they cannot be standardized across the board. And in certain cases the potential financial, reputational and regulatory risk if something goes wrong is just too high.

“We haven’t seen significant job losses,” says Riley at Citi. “The reason this is true is because, although we automated certain things in the past, as we automate work and scale our business, we redeploy those people into other areas where they are typically adding more direct client service.

“In our operations area across the markets and payments business we have eliminated 200 jobs through AI and robotics in the last year,” he says. “But at the same time, just in our payments services area alone, we have added more than 200 roles.”

“Our clients want a human touch,” says Swapna Malekar, senior product manager at the Royal Bank of Canada. “We automate some processes, employ AI in others, but we know that sometimes our clients want to walk into a branch and speak to a human. That aspect won’t go away.” 

Perhaps relationship managers in transaction banking have a longer shelf life than originally thought. After all, there’s nothing more annoying than a bad chatbot. 

Data

But there are many other issues to take into consideration beyond job losses. 

“We are at the stage where the technology out there is available to everyone, so it all boils down to the data you have,” says Riley. 

Data is a valuable commodity, and it can end up in the wrong hands. The public have been burnt before. One of the most obvious examples of data misuse was when it was discovered that Cambridge Analytica had harvested the personal data of millions of Facebook users without their consent.

“Organizations that aren’t obviously monetizing your data are doing it in a more subtle way; unfortunately, that tends to mean that they are working with the highest bidder, which doesn’t always mean that there are good intentions involved,” says Mills at Featurespace.

“The reality is that self-regulation is difficult for organizations, but our core market – the banks and other financial institutions – generally do quite well at this.” 

The General Data Protection Regulation (GDPR) in Europe should help protect individuals’ data. But there is a growing tension with the arrival of the second Payments Services Directive (PSD2), which in essence encourages data sharing to create competition in banking and payments. If banks limit access to customer payment data to third parties for fear of breaching their customers’ privacy rights under GDPR, authorities may consider this a breach of competition law. 

GDPR fines have already been imposed on a number of institutions in Europe, including hospitals, airlines and schools. In banking, the Romanian National Supervisory Authority issued its first fine against UniCredit for €130,000 in July after an investigation found that the bank failed to implement appropriate technical and organizational measures in compliance with GDPR requirements to protect customers’ data.

“People aren’t really that aware of their data protection rights at the moment, so I think in the initial stages PSD2 will win over GDPR,” says an industry expert. “It will take some time for people to work out the tension.” 

Issues around accountability and regulation will creep into other areas as the technology is increasingly adopted.

“When a decision is made by AI, how are you then able to find the exact reason behind why a transaction is not stopped when it should have been – other than to blame it on the algorithm?” says Benoît Desserre, head of global transaction banking at Société Générale.

“Using AI makes it very difficult to audit payments. At the early stages, if you have one algorithm, this might be easier, but down the line when you are dealing with millions of payments and multiple algorithms, how easy will it be then?” 
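
One commonly suggested mitigation is to record, for every automated decision, which model version ran, what inputs it saw, what score it produced and what threshold applied, so that a specific outcome can be reconstructed later. The sketch below is purely illustrative; the field names and values are hypothetical rather than any bank's actual audit scheme.

```python
# Sketch of per-decision audit logging so an automated payment decision can be
# reconstructed later. Field names and values are hypothetical; this shows the
# general idea Desserre raises, not any particular bank's system.
import json
import hashlib
from datetime import datetime, timezone

def audit_record(payment_id: str, features: dict, score: float,
                 threshold: float, model_version: str) -> dict:
    record = {
        "payment_id": payment_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,     # which algorithm made the call
        "features": features,               # exact inputs the model saw
        "score": score,
        "threshold": threshold,
        "decision": "block" if score >= threshold else "release",
    }
    # A hash of the inputs lets auditors verify the record wasn't altered.
    record["input_hash"] = hashlib.sha256(
        json.dumps(features, sort_keys=True).encode()
    ).hexdigest()
    return record

print(json.dumps(
    audit_record("PAY-001", {"amount": 12_500, "country": "SS"}, 0.42, 0.80, "risk-model-7.3"),
    indent=2,
))
```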

Then what happens if the data you collect is flawed? 

“Often we get excited about algorithms for data analytics, but there is a lot of data that is not ready to use,” says Lukasz Szpruch, programme director for finance and economics at the Alan Turing Institute. 

“There have been cases where algorithms extract features in the data that we would not like to see. For instance, there was a case in the US where banks did not give loans to people of a certain race or gender. When this was investigated, it became apparent that the algorithm used was inherently biased because of flaws in the data. 

“And even if you remove race from the data set, there are often cross-correlations, so the same type of people may be discriminated against,” he says. 
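
Szpruch's point about cross-correlations can be illustrated with a simple check: even after a protected attribute has been removed from the training data, a remaining feature can still encode it. The sketch below uses entirely made-up data and column names.

```python
# Minimal sketch of the proxy problem Szpruch describes: even after a protected
# attribute is dropped from the data, other features can still encode it.
# Data and column names here are invented purely for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

protected = rng.integers(0, 2, n)                           # attribute removed before training
postcode_score = protected * 0.8 + rng.normal(0, 0.3, n)    # strongly tied to it (a proxy)
income = rng.normal(40_000, 8_000, n)                       # unrelated feature

for name, feature in [("postcode_score", postcode_score), ("income", income)]:
    corr = np.corrcoef(protected, feature)[0, 1]
    flag = "possible proxy - investigate" if abs(corr) > 0.3 else "ok"
    print(f"{name:15s} correlation with removed attribute: {corr:+.2f}  ({flag})")
```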

In payments, the worst that can generally happen is that a transaction is declined or delayed, but the fact remains that there are still flaws in using AI – often the same flaws that removing the human element was supposed to avoid in the first place.

Technology aimed at making processes simpler brings with it a number of unforeseen consequences that payments providers, financial institutions, fintechs and regulators will have to work out. 

Until then, people such as Boumnijel at ActionAid and Tryon at Maris will have to carry on doing the manual work with the banks to make sure their payments go through. 


