Clean data demand holds back Mifid II

The implementation of Mifid II has been postponed to allow more time to improve data quality. Meeting that standard, though, requires a substantial amount of work.

The implementation of the Markets in Financial Instruments Directive II (Mifid II) will spark changes to reporting across the European banking industry. In transaction reporting, the level of detail required has increased substantially, and so has the standard of quality demanded of that information.

 

Alan Samuels, Alacra

Reliable, clean data is a paramount need. Alan Samuels, vice-president and head of product strategy for reference data services at workflow applications supplier Alacra, says: “Having quality, clean data has become a hot issue. The use of big data now covers so many different processes that making sure it is reliable is becoming ever more important.”

Poor-quality information can compromise the results, and the reputation of the institution submitting it. Samuels says: “If more emphasis is being placed on using big data and algorithms then the data has to be good. If there are discrepancies it will not produce good results.”

Assessing the quality of data is set to be a continuing process with a global reach. Three years after the implementation of the Dodd-Frank Act, a consultation is being carried out to review its reporting processes. It will take a sharper look at the data being used, from its quality to its format, and at the clarity around reporting.

Andy Green, global head of business development at compliance tools provider Risk Focus, says those that have grappled with new regulations before have learned that improving data quality is a key requirement: “From previous implementations of new regulations such as Dodd-Frank and Emir, firms have learnt that data quality is paramount and if this is not right the cost and reputational impact of rectifying it after the event can be horrific.”

The weight of these increased requirements has already been felt: the European Securities and Markets Authority (Esma) has called for implementation to be pushed back to January 3 2018. Although this gives more breathing space, it means that when the deadline arrives the regulator will expect data quality to be extremely high.

Green says that the announcement of the delay provided an opportunity for institutions to rethink their processes from all angles: “The first thing we noticed was financial institutions taking a breath, reviewing their options and making sure their strategy was the right one. Putting in place software that allows firms to have greater opportunities to test their output as well as to create robust and continuous internal testing processes is one area that we see firms engaging in now, when maybe they would have had less capacity to do so without the delay.”

To be able to report best execution when the rules are finally implemented, banks will now have to store far more data, since it is not yet known how granular the final requirements will be.

 

Simon Garland, Kx

Simon Garland, chief customer officer at database software company Kx, says: “In the past client trades, client orders and market data were typically kept in different databases. With the new reporting requirements, that data has to be able to be merged and reported together, which means financial institutions will have to re-examine their database architectures.”
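As an illustration of the kind of merge Garland describes, the minimal Python sketch below joins separate order, trade and market-data sets into a single reporting record. The column names (order_id, isin, exec_time) are illustrative assumptions, not a real Mifid II schema.

```python
# Minimal sketch: client orders, executed trades and market data held
# separately, joined into one record per execution for reporting.
# Column names are assumed for illustration only.
import pandas as pd

orders = pd.DataFrame({
    "order_id": ["O1", "O2"],
    "client_id": ["C100", "C200"],
    "isin": ["DE0001102309", "FR0000120271"],
})

trades = pd.DataFrame({
    "trade_id": ["T1", "T2"],
    "order_id": ["O1", "O2"],
    "exec_time": pd.to_datetime(["2018-01-03 09:30:01", "2018-01-03 09:31:12"]),
    "price": [101.25, 46.80],
    "quantity": [1_000, 500],
})

market = pd.DataFrame({
    "isin": ["DE0001102309", "FR0000120271"],
    "mid_price": [101.20, 46.75],
})

# Join each trade to its originating order, then to the prevailing market data.
report = trades.merge(orders, on="order_id").merge(market, on="isin")
print(report)
```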

Alacra's Samuels adds: “With more regulations coming into play there needs to be both clean data and the right systems in use. There is a clear regulatory need for meeting high standards. This is creating more and more challenges for operational managers to build flexible, scalable processes and systems to be able to address use cases that have not yet even been articulated.”

Collating and condensing the necessary data presents a challenge, given that enriched counterparty data, collateral and valuation data, and confirmation information will all need to be included. Simply pulling all of this information together is a considerable task.

Risk Focus's Green says: “It’s not uncommon for only 10% of the reportable data to come from the actual trade-capture system. Therefore there could be multiple different systems, in different locations, feeding into a firm’s reporting hub. The risk of one of these routes failing for a multitude of different reasons is a reality and therefore firms need to have very strong regression and end-to-end testing processes and controls to capture any failings as soon as they happen.”
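A simple version of the kind of pre-submission control Green describes might check that every contributing system has actually populated its fields before a report leaves the hub. The sketch below is a hypothetical example; the field-to-source mapping is assumed, not a real Mifid II field list.

```python
# Minimal sketch of a completeness check on an assembled report record.
# The mapping of fields to source systems is an illustrative assumption.
REQUIRED_FIELDS = {
    "trade_capture":  ["trade_id", "price", "quantity"],
    "reference_data": ["isin", "counterparty_lei"],
    "collateral":     ["collateral_type"],
    "valuation":      ["valuation_amount"],
}

def missing_fields(report: dict) -> dict:
    """Return, per source system, any fields that are absent or empty."""
    gaps = {}
    for source, fields in REQUIRED_FIELDS.items():
        absent = [f for f in fields if report.get(f) in (None, "")]
        if absent:
            gaps[source] = absent
    return gaps

record = {"trade_id": "T1", "price": 101.25, "quantity": 1000,
          "isin": "DE0001102309", "counterparty_lei": None,
          "collateral_type": "cash", "valuation_amount": 101250.0}

print(missing_fields(record))   # {'reference_data': ['counterparty_lei']}
```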

Integrating existing data into the new systems is creating a further layer of complexity. The process of manually adding data to the digital database opens up the margin for error. Garland says: “In the past, static data would often have been kept on paper. Now it must be digitized and its correctness becomes a critical part of the reporting process, placing much higher constraints on its quality. When running sophisticated analyses under Mifid II, databases will need to be able to join the right static data at any point in time with corresponding trade, order and market data. For many banks this will force them to re-design and re-architect their systems.” 
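The point-in-time join Garland refers to can be illustrated with a short, hedged sketch: each trade picks up the version of the static data that was valid at the moment it executed. The instrument attributes, venues and timestamps below are assumptions for illustration.

```python
# Minimal sketch of an "as of" join between versioned static data and trades.
# Attribute names and dates are illustrative assumptions.
import pandas as pd

static = pd.DataFrame({
    "isin": ["DE0001102309", "DE0001102309"],
    "valid_from": pd.to_datetime(["2017-06-01", "2018-01-01"]),
    "venue": ["XETR", "XFRA"],
}).sort_values("valid_from")

trades = pd.DataFrame({
    "trade_id": ["T1", "T2"],
    "isin": ["DE0001102309", "DE0001102309"],
    "exec_time": pd.to_datetime(["2017-12-15 10:00", "2018-01-03 09:30"]),
}).sort_values("exec_time")

# For each trade, take the most recent static record as of its execution time.
enriched = pd.merge_asof(trades, static,
                         left_on="exec_time", right_on="valid_from",
                         by="isin", direction="backward")
print(enriched[["trade_id", "exec_time", "venue"]])
# T1 reports under the 2017 record (XETR); T2 under the 2018 record (XFRA).
```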

Green says data is already a focus for regulators, even before the arrival of Mifid II, and will remain so for some time, so the industry will need to keep investing. “The regulators have highlighted that they continue to see poor data,” he says. “The Commodity Futures Trading Commission (CFTC) specifically and publicly highlighted that they received one trade with a notional of six quadrillion US dollars. 

“They are making it clear to the firms that the reporting parties own the quality of the data and must have the right controls in place to ensure the information is right before it is sent [to the regulators] and have the right tools in place to identify and fix issues when they do happen.”
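A control of the kind Green describes could be as simple as a pre-submission sanity check that flags implausible values, such as a six-quadrillion-dollar notional, before the report is sent. The sketch below is illustrative; the threshold is an assumption, and real controls would be calibrated per asset class.

```python
# Minimal sketch of a pre-submission plausibility check on trade notionals.
# The threshold is an assumed value for illustration only.
MAX_PLAUSIBLE_NOTIONAL_USD = 1e11   # assumed upper bound for a single trade

def validate_notional(trade: dict) -> list:
    errors = []
    notional = trade.get("notional_usd")
    if notional is None or notional <= 0:
        errors.append("notional missing or non-positive")
    elif notional > MAX_PLAUSIBLE_NOTIONAL_USD:
        errors.append(f"notional {notional:.3e} exceeds plausibility threshold")
    return errors

print(validate_notional({"trade_id": "T1", "notional_usd": 6e15}))
# ['notional 6.000e+15 exceeds plausibility threshold']
```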
