Model Validation – Stating the obvious
We’ve all heard the stories of how certain so-called market professionals were able to book what proved to be fictitious profits because the bank they worked for had models that priced FX options significantly differently from the market. While it is true that opinions make markets, it is astonishing that this nefarious practice is still going on after the NAB debacle.
Some may choose to defend it as model arbitrage, but in some cases what is actually happening would be more accurately labelled as thieving. So I read with interest a recent report from consultancy firm Celent called Model Validation Best Practices: Achieving Value Added, authored by Umit Kaya and Il-Dong Kwon.
The report could have been sub-titled Stating the Bleeding Obvious, but that in no way detracts from it, for it highlights some pertinent truths about today’s financial markets. “The last decade has seen a maturing process for the management of financial risks... One of the drivers of this process has been the increasingly complex dynamics of the global financial markets, which have brought about the necessity for commensurately complex models. The upsurge in usage of models, sophistication of underlying theories, and complexity of the IT environment have firmly placed model risk (the risk that models do not perform as intended) on regulator and senior executive agendas... Accounting troubles and tight regulatory controls around risk measurement models led the first model validation wave, primarily as a compliance exercise. The next wave will be characterized by repositioning beyond compliance towards a streamlined, value-adding model validation process.”
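The kind of mis-marking described above is typically caught by independent price verification: comparing a desk’s model marks against observable market quotes and flagging any deviation beyond a tolerance. The sketch below is purely illustrative (the function name, trade identifiers, and the 5% threshold are all assumptions, not anything prescribed by the Celent report):

```python
def flag_suspect_marks(model_marks, market_quotes, tolerance=0.05):
    """Flag trades whose model mark deviates from an independent
    market quote by more than `tolerance` (relative). Illustrative only."""
    exceptions = []
    for trade_id, model_price in model_marks.items():
        market_price = market_quotes.get(trade_id)
        if market_price is None:
            # No independent quote: escalate rather than trust the model
            exceptions.append((trade_id, "no independent quote"))
            continue
        deviation = abs(model_price - market_price) / abs(market_price)
        if deviation > tolerance:
            exceptions.append((trade_id, f"deviation {deviation:.1%}"))
    return exceptions

# Hypothetical example: one mark well off the market, one within tolerance
marks = {"FXO-1": 1.20, "FXO-2": 0.98}
quotes = {"FXO-1": 1.00, "FXO-2": 1.00}
print(flag_suspect_marks(marks, quotes))
```

In practice such checks are run daily by an independent product control function, precisely so that a trader’s own model cannot be the sole source of the marks on which profits are booked.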