Network performance analysis tools are designed to give banks and brokers increased visibility into the performance of their hardware architectures and software applications.
Users can drill down to granular data to troubleshoot network, application and connectivity issues without disrupting business-critical flows.
This type of analysis is crucial at a time when the buy side is demanding greater transparency on trading costs and processes, and must undertake more due diligence on providers under the regulatory requirements coming into force with Mifid II. The sell side, meanwhile, remains under extreme margin pressure and needs to demonstrate that it provides clients with value for money.
Analytics solutions, once so expensive that only the biggest institutions could afford them, have become accessible to smaller dealers thanks to the increasing adoption of cloud computing and software-as-a-service; these solutions often work across multiple asset classes.
With FX trading systems that make algorithmic decisions in microseconds, a vital feature of any monitoring system is the ability to analyse what is happening in real time.
This means that intelligent analytics must be performed at high speed as soon as the data is captured, so that real-time alerting can flag operational problems that could result in catastrophic losses if left unaddressed, explains Paul Spencer, chief operating officer at analytics firm Velocimetrics.
“Systems that capture data to a database for later analysis are of limited use as losses will have already been incurred before the trading institution can take action,” he says.
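Spencer's point can be illustrated with a minimal sketch (class name, window size and threshold are hypothetical, not from any vendor's product): latency is computed on each message as it is captured, and an alert fires the moment a sliding-window percentile breaches a limit, rather than the raw data being written to a database for later analysis.

```python
from collections import deque

# Illustrative assumptions: keep the last 1,000 latency samples and
# alert when the p99 latency exceeds 500 microseconds.
WINDOW_SIZE = 1000
ALERT_THRESHOLD_US = 500.0

class LatencyMonitor:
    """Alerts in-stream, as each message is captured."""

    def __init__(self):
        self.window = deque(maxlen=WINDOW_SIZE)

    def on_message(self, sent_us: float, captured_us: float) -> bool:
        """Process one captured message; return True if an alert fires."""
        self.window.append(captured_us - sent_us)
        samples = sorted(self.window)
        # p99 over whatever is in the window so far.
        idx = min(len(samples) - 1, int(0.99 * len(samples)))
        p99 = samples[idx]
        if p99 > ALERT_THRESHOLD_US:
            print(f"ALERT: p99 latency {p99:.1f}us exceeds threshold")
            return True
        return False

monitor = LatencyMonitor()
monitor.on_message(0.0, 100.0)   # normal traffic, no alert
monitor.on_message(0.0, 900.0)   # spike: alert fires immediately
```

The key design point is that the alert is raised on the message that breaches the limit, not after a batch job has run over a database of historical captures.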
Having access to tools that monitor and analyse network data also enables brokers and banks to pre-empt potential problems on their networks, suggests FXecosystem CEO James Banister.
“The visualisation aspect is vital to any performance analysis service given that the aim is to present data in a clear, graphical format,” he says.
The use of data visualisation to create personalised data analysis that supports specific job functions is just one example of how performance analysis systems have become more user-friendly, thanks to improvements in technology and to firms' ability to create their own versions – either on a proprietary basis, working directly with a software provider, or by integrating products from several different vendors.
Indeed, Alex Viall, head of regulatory intelligence at compliance software firm Behavox, says the do-it-yourself approach to analytics is very much in vogue.
“Banks and brokers of all sizes are starting to recruit their own teams rather than outsource and the demand for data scientists has increased significantly as a result,” he adds. “Use of the cloud to house and back-test data is also much more feasible.”
The value of this type of analysis should not be underestimated, given the capacity for operational issues – such as delays to the dissemination of pricing movements caused by latency on the network – to impact trading efficiency.
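As an illustration of the kind of operational issue described above, the sketch below flags price updates whose network dissemination delay would leave them stale by the time they reach a trading system. The field names and the staleness limit are assumptions for illustration only.

```python
# Hypothetical staleness limit: treat quotes delayed by more than 5ms
# on the network as stale.
STALE_LIMIT_MS = 5.0

def is_stale(publish_ts_ms: float, receive_ts_ms: float,
             limit_ms: float = STALE_LIMIT_MS) -> bool:
    """Return True if the quote's network delay exceeds the limit."""
    return (receive_ts_ms - publish_ts_ms) > limit_ms

quotes = [
    {"pair": "EUR/USD", "publish": 0.0, "receive": 1.2},
    {"pair": "GBP/USD", "publish": 0.0, "receive": 7.8},  # delayed in transit
]
stale = [q["pair"] for q in quotes if is_stale(q["publish"], q["receive"])]
print(stale)  # ['GBP/USD']
```

A desk trading on the delayed GBP/USD quote would be pricing off stale data, which is precisely the efficiency impact the monitoring tools are designed to surface.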
Happily for smaller banks and brokers, this type of insight no longer comes with an eye-watering price tag.
Earlier systems were based on high-cost appliances, typically priced between $100,000 and $200,000, that were out of reach of all but the wealthiest tier-one banks, explains Velocimetrics’ Spencer.
“Now, more forward-thinking vendors are offering software-only solutions at much lower cost so that clients can source their own hardware, allowing cost-effective, high-performance solutions to be deployed in the smallest appliances suitable for co-location,” he says.
Additionally, clients do not have to monitor their full network from day one. Analytics solutions can be implemented quickly on a phased basis, which might make deployment easier for banks and brokers with limited resources or complex networks.
Traditional analytics and business intelligence tools from mainstream stack vendors are expensive – and not just for smaller banks. Beyond the high licence costs, implementation can take anywhere from 12 to 18 months.
“A contemporary approach to visual analytics enables smaller banks to deliver a solution at speed that does not put undue cost pressure on the business and can be delivered without overstretching the resource of a small IT team,” concludes Duncan Ash, Qlik senior director and sector group lead for financial services.
“This reduces the barriers to adoption when deployed across the business and enables banks to deliver an agile solution to the users that need it.”