Banks – which need to know the details of their clients for KYC and their payment flows for intraday liquidity reports – now have access to detailed data sets, which opens up the potential for profitable new services.
Meanwhile, corporate treasurers and data-intelligence providers are sniffing out new opportunities.
|Data only truly becomes information when it has been cleansed, consolidated and validated|
“Data is the currency of the 21st century,” says Max Speur, COO of SunTec.
The quality of the data sets is as important as the quantity. Assessing patterns in client behaviour can highlight when fraud is taking place, allowing banks to take measures to root it out.
By combining historical data with the information available in real time, banks can gain greater insight into their customer base and tailor their offerings accordingly.
Paul Clapis, director of product management, IntelliMatch, SunGard, says: “One of the most important uses is to be able to analyse the historical data to detect trends and anomalies.”
Speur adds: “By marrying the past with the present, banks gain a better understanding of what a customer actually wants on an individual-by-individual basis.”
The proliferation of data sources has seen a rise in aggregator services to make some sense of the findings, creating a new business for data intelligence providers.
Clapis says: “The number of sources and number of disparate formats is growing exponentially, so we are seeing an increase in the number of data aggregation services being offered to help pull together and mine the aggregated data.”
And banks that can capitalize on the findings can give themselves the competitive edge.
Speur says: “Banks which can aggregate customer data from different systems quickly have an advantage over their rivals as they can achieve a faster time-to-market for new products. More importantly, the data has to be used in the right way to understand the customer better.”
|The ‘rip and replace’ method will only leave banks further in the dark as the history of customer transactions will be erased|
He adds that competition in the market has also pushed forward the development of better systems. Non-bank technology providers entering the mix come with access to their clients' past behaviours, which already gives them an advantage over the banks' offerings.
Technology companies such as Apple, Google and Facebook, which are looking to provide financial services, can log details of their customers' preferences.
“The game has not been changed yet, but the longer the banks ignore how non-traditional banks are using data to their customers’ advantage, the faster they will lose market share,” Speur says.
Data has highly practical applications for clients, as it can be used for benchmarking. The information is being pulled together not just from internal systems but also from external sources, including those that are publicly available.
This can provide a wider view of what is taking place and the ability to benchmark companies effectively against what others in their peer groups are doing.
To capitalize on this capability, Citi launched its Working Capital Analytics tool, which can provide real-time analysis, and highlight where efficiencies can be made.
|Citi's Mark Tweedie|
Mark Tweedie, managing director, EMEA trade and treasury solutions, head of sales at Citi, says: “With Citi's Treasury Diagnostic survey, there is the ability to benchmark across core facets of treasury such as subsidiary funding, FX, systems, payments, working capital and risk management.
“Citi's established ecosystem flow insights and buyer-seller banking relationships enable us to tell corporates where they sit relative to their sub-sectoral peers across key areas.”
He notes that access to data is becoming ever more important in the relationship between banks and corporates, adding: “The key is how the data is sourced and presented, with the emphasis placed on usability and customization. Much like the banks, corporate treasuries are being asked to do more with less and increase automation, visibility and control.”
The desire for these systems is being pushed from the corporate side, as they give treasurers the opportunity for the first time to understand how to improve their operations in line with global industry best practice.
Tweedie says: “Relative to more general best-practice evaluations or local market finance function comparisons, increasingly corporate treasuries want to compare and contrast with companies who share common industry dynamics and thus treasury demands.”
Realizing this dream poses practical challenges.
There is a need to be cautious about the volume of data that is available, as not all of it will be of a high enough standard to be used effectively. Data sets need to be thoroughly checked and assessed before they can be used in any meaningful way.
“Data only truly becomes information when it has been cleansed, consolidated and validated,” SunGard's Clapis says. “There is definitely a need for filtering and prioritizing because there is much more data available than the markets want or can use effectively.”
The push towards importing new software systems could have a detrimental impact if the new systems cannot integrate with those already in place. The proliferation of technology that has been designed to be easily removed can also take with it potentially valuable data.
SunTec's Speur says: “There is high value attached to 30 to 40-year-old IT systems because of the customer information contained in them and the ‘rip and replace’ method, propagated by some technology providers, will only leave banks further in the dark as the history of customer transactions will be erased.”
With more players offering such services, it is becoming important to get data checked and ready for consumption as quickly as possible: the faster it can be made available, the more willing markets will be to invest in the products offering it.
Clapis concludes: “Everyone knows that year-old data has little value other than for contributing to trends analysis. In the near future, we will find that a premium will be paid for data that is generated and consumed within hours or minutes.”