Big data transforms the treasury role
Companies are beginning to recognise that the insights garnered from big data can help them move towards the concept of real-time financial analysis, especially when combined with internal data from traditional data-management systems.
As treasurers move towards rolling forecasting and budgetary approaches that prioritise risk, the traditional annual budget process is being marginalized in favour of predictive analytics, connectivity and big-data computing power.
That is the observation of Stephen Malinak, global head of content analytics at Thomson Reuters. He says the drivers for a connected, big data-enabled treasury are numerous, and evident in the emergence of the head of financial planning and analysis role at firms such as Alliance Boots, easyJet and Legal & General.
According to Igor Panivko, finance director at Konica Minolta Business Solutions Russia, a provider of office hardware, companies with big-data repositories are introducing new roles such as chief data officer, chief data architect or insight manager. These roles are taking over the analytics responsibilities of the traditional chief financial officer, leaving CFOs with the compliance and treasury functions.
He says the growing potential of analytics requires either redefining the CFO’s role for a data-driven business world, or transforming the traditional financial-analytics function into a new role leading data crunchers and enterprise data-analytics architects.
The reason for the data-led transformation of the traditional treasury role is clear: new data intelligence is transforming capital raising, business planning, customer-retention strategies and much more.
In the capital markets, the trend towards smaller, more numerous trades is one example of corporate treasurers using advanced analytical models to minimize costs and improve asset/liability matching, says John Myers, research director for business intelligence at Enterprise Management Associates.
“The results of these trades and the analysis of this larger dataset allow firms to capitalize on the granular trades of their competitors and/or the marketplace,” he says.
Active trading participants need a way to digest data in a timely manner and act on it, and solutions are available in the market to achieve that, adds Ralph Achkar, capital markets product director at Colt, a provider of data centre services.
“The cost of building such a scalable solution that can hold large volumes and still generate responses fast enough is not trivial,” he says.
“Offering such a solution as a managed solution is probably where the next push will occur. Providers who can help with the infrastructure, content and software packages are the players that need to get involved in making this a reality.”
Vincent Kilcoyne, capital markets specialist at SAS, says management and IT departments need to start thinking about the problem they are trying to solve and see what elements of data are relevant to the needs of the customer.
“There is a role for structured data, but incorporating more variety adds significant value,” he says.
David Turner, a strategy and analytics expert at IBM Global Business Services, says his company is seeing many examples of big-data analysis successfully delivering specific business outcomes.
“Many institutions are beginning with the simplest executions, such as capturing call-centre data and looking for the presence of specific business issues,” he says.
“For example, we are helping companies analyse call-centre interactions and interpret why customers are calling. With these new insights, we can help the institution create treatments to eliminate the need for the call or to enable more timely handling.”
He adds: “This creates a better customer experience with higher customer-satisfaction scores and a lower call volume, thereby reducing the cost to serve.”
This fits directly into existing data-management systems and outcomes, says Turner, adding: “Traditional structured data remains essential to institutions because it provides the baseline level of insight that is enhanced by big data.”
Against this backdrop, the observation of Jon Cooke, head of UK development and big-data specialist at GFT, is worrying: many organizations are still struggling with traditional data challenges, such as getting consistent, accurate and group-wide views of their data estate.
“Some of the challenges include the standardization of data across front, middle and back office, implementing data policies and standards across the group and enforcing group-wide data governance practices,” he says.
“By way of example, one area of great challenge in the traditional data space is exception and breaks management. Vast amounts of adjustments that occur in the back office come from data-quality exceptions, which are now being exacerbated by the explosion in volume, more stringent and detailed reporting regulatory requirements and the pressure to reduce costs in the number of people performing reporting and data analysis.”
Companies are also increasingly aware that they don’t have all the data they need within their own organization. Relying exclusively on transactional data to target customers with offers, for example, fails to take account of wider industry or demographic trends.
Retailers have been successful by combining internal and external data, and companies in other industry sectors are starting to follow suit.
One way of thinking about the transformative impact of big data in the near term comes from Harvey Lewis, research director for Deloitte’s analytics team. He says he has yet to see any evidence that big data and its analysis are influencing strategy.
“It is the other way around – businesses have a strategy and they are looking to see how big data can help them achieve it,” he adds. “This means that new institutions have a tremendous opportunity to use insight extracted from data to build a business.”