In our previous blog, It's time to start thinking like a regulator, Len talked about regulators' increasing focus on the quality of your transaction reporting data and the need for you to start looking at and evaluating your data the way a regulator would. However, this isn't as easy as it sounds: the regulatory data sets required under MiFIR, SFTR, EMIR and other G20 regulations are complex, with over 150 fields in a single regulation in some cases and hundreds of millions of records reported daily. To fulfil these requirements, data needs to be drawn from multiple upstream systems within a firm and transformed to meet the specific regulatory need. To compound the problem further, change is constant within organisations, and a small alteration to any upstream system can create big problems once the data has fed through into the regulatory report. Effectively analysing and monitoring your data requires advanced techniques, a full reporting data set, and smart technology. Unfortunately, individual firms often lack these capabilities, leaving them vulnerable to fines and reputational damage.
The traditional approaches to ensuring regulatory data accuracy are to perform end-to-end reconciliations (a requirement under RTS 22) or periodic sample-based control reviews. On the surface this control framework appears satisfactory, but when you dig a little deeper there are three key inherent weaknesses.
The first is context. Without access to the same industry-wide data set as the regulators, these controls become too internally focused and omit key market context. For example, how is everyone else reporting this type of trade? Or are all my counterparties reporting the same trade timestamp as me?
The second is coverage. By definition, these controls don't cover the whole population of reported trades, meaning that, at best, it can take three to six months to identify an error and, at worst, it's missed altogether. A relatively small issue can therefore escalate into one that requires a significant amount of back reporting to resolve.
The final one is cost. Traditional data quality reviews are often resource intensive, requiring significant manual effort to run or the cost of third-party experts to undertake on your behalf. Cost is also often the enemy of coverage.
Considering the backdrop of increased scrutiny from regulators, the techniques they are employing to monitor the market, and deficiencies in traditional models, it’s important to ask how you can improve your data and proactively highlight potential errors. In essence, how can you start employing the same techniques your regulator uses? The answer is simple: with MarketAxess Post-Trade’s latest solution, Sonar.
Having listened carefully to how regulators are using their data sets to monitor for accuracy, we realised that we could use our unique industry-wide dataset, powered by our network of 900+ clients, to apply innovative data techniques that help firms undertake the same sorts of checks and address the limitations of existing models. Sonar's automated approach to monitoring allows it to assess every field of every transaction you report, providing feedback on anomalies before they become issues and giving you the coverage you need.
Machines are only as good as the data feeding them, and that is where the real value of Sonar lies: in its ability to provide you with context. Our regulatory reporting data set gives us a view similar to that of your regulator, which means we can monitor market trends and behaviours to spot outliers in your data that wouldn't otherwise be visible. If the whole industry is reporting a certain transaction type in a particular way, but you are doing it differently, Sonar will spot that.
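To make that idea concrete, here is a minimal sketch of how consensus-based outlier detection can work on a pooled, industry-wide data set. Everything here is illustrative: the field names, the 80% majority threshold, and the `flag_outliers` helper are assumptions for the sake of the example, not Sonar's actual logic.

```python
from collections import Counter

def flag_outliers(reports, field):
    """Flag reports whose value for `field` diverges from the
    industry-wide majority for the same transaction type.
    (Illustrative only -- not Sonar's actual implementation.)"""
    # Group the pooled reports by transaction type
    by_type = {}
    for r in reports:
        by_type.setdefault(r["txn_type"], []).append(r)

    # Only treat a value as "consensus" if a clear majority (80%+) uses it
    consensus = {}
    for txn_type, group in by_type.items():
        value, n = Counter(r[field] for r in group).most_common(1)[0]
        if n / len(group) >= 0.8:
            consensus[txn_type] = value

    # A report is an outlier if it deviates from an established consensus
    return [
        r for r in reports
        if r["txn_type"] in consensus and r[field] != consensus[r["txn_type"]]
    ]

reports = [
    {"firm": "A", "txn_type": "IRS", "price_notation": "PERC"},
    {"firm": "B", "txn_type": "IRS", "price_notation": "PERC"},
    {"firm": "C", "txn_type": "IRS", "price_notation": "PERC"},
    {"firm": "D", "txn_type": "IRS", "price_notation": "PERC"},
    {"firm": "E", "txn_type": "IRS", "price_notation": "MONE"},  # the odd one out
]
print([r["firm"] for r in flag_outliers(reports, "price_notation")])  # → ['E']
```

The point of the sketch is the one that matters for Sonar: firm E's report is only visible as an anomaly because the check runs against everyone else's reports, which is exactly the context an internal-only control can never have.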
Finally, Sonar's highly automated approach to analysis means cost is no longer a prohibitive factor. Sonar is designed to streamline your internal regulatory controls and checks, making them more efficient, more accurate, and more consistent. Not to mention helping you avoid the cost of regulatory fines!
Sonar is here to help you safely navigate the deep and murky waters of regulatory reporting. Stay tuned for our next blog to learn more about how Sonar works and the innovative techniques it employs to spot outliers.