Optimizing Transaction Monitoring
The costly burden of false alerts - How to capture the significant potential for optimization
Few missed the frustration expressed by a Swedish bank CEO in the Financial Times in 2021: “banks are spending billions of dollars a year on anti-money laundering rules and compliance but are doing little to stop criminals moving money around the financial system.”
According to experience and existing data, 80-95% of transaction monitoring alerts may be false positives, while real risks may not be adequately mitigated. The potential to increase the conversion rate from alerts to reports is therefore substantial. Moreover, costs can be reduced and time reallocated to, for example, scenario development and better risk analysis.
In this paper, we discuss how transaction monitoring systems can be improved to deliver better results. Two approaches to optimization are presented, both drawing on data that already exists in your company.
Getting to the core issue
To begin the process of improvement, an organization needs to establish two things:
- “What should the volume of alerts in our transaction monitoring be in relation to the number of suspicious activity reports (SARs)?”
- “How do we establish that our scenarios are sufficient to capture the risks that we have identified in our business operations?”
With credible data and adequate information about the conditions of the business, most companies have a significant potential to improve performance and reduce costs.
There are quantitative and qualitative approaches to addressing TM performance through scenario optimization. We will discuss them both separately and combined. The output gives a direction that can inform where the thresholds of a given scenario should lie. Field experience suggests that responsible functions tend to keep suboptimal setups out of resistance to change or uncertainty about the alternatives. With validated data on false alerts at hand, demonstrating in absolute terms that the volume should be reduced, the opportunity for improvement is at its greatest.
In 2022, FCG carried out a benchmark across banks and other financial institutions to establish a view on the number of alerts and the handling time per alert, considering the size and nature of the business. Based on our experience from the industry and the estimated distribution of measures taken for different types of alerts, we could establish the cost of a single false alert at approximately 360 SEK. The calculation is based on the average hourly rate for the resources deployed and the time needed for investigation per TM alert.
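To make the arithmetic concrete, the sketch below reproduces the order of magnitude of this cost model. The hourly rate, handling time and annual alert volume are invented assumptions chosen for illustration only; FCG's actual benchmark inputs are not published in this paper.

```python
# Illustrative back-of-the-envelope cost model for false alerts.
# HOURLY_RATE_SEK and HOURS_PER_ALERT are assumptions picked to
# reproduce the ~360 SEK figure cited above, not FCG's actual inputs.

HOURLY_RATE_SEK = 900       # assumed blended hourly rate for an analyst
HOURS_PER_ALERT = 0.4       # assumed average investigation time (24 min)
FALSE_ALERT_SHARE = 0.90    # midpoint of the 80-95% range cited above

cost_per_alert = HOURLY_RATE_SEK * HOURS_PER_ALERT
print(f"Cost per alert: {cost_per_alert:.0f} SEK")  # -> 360 SEK

# Annual cost of false alerts for a hypothetical volume of 50,000 alerts
annual_alerts = 50_000
annual_false_cost = annual_alerts * FALSE_ALERT_SHARE * cost_per_alert
print(f"Annual false-alert cost: {annual_false_cost:,.0f} SEK")  # -> 16,200,000 SEK
```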
Swedish authorities, in particular the Swedish Financial Supervisory Authority and the Gambling Inspectorate, are paying increasing attention to the effectiveness of transaction monitoring in companies.
First of all, key functions need to take a step back and assess the bigger picture. Work to mitigate money laundering risk is to a large extent conducted by investigating alerts generated in transaction monitoring. Experience at FCG confirms that the need for resources in the monitoring department is a direct function of the lack of efficiency. Within the remit of transaction monitoring, this translates into inefficient scenario settings and long investigation lead times, causing bottlenecks to accumulate and hampering the handling of alerts. This is costly from several perspectives.
The correlation between the number of reports submitted to the Financial Intelligence Unit (FIU) and the number of alerts tells us that the average transaction monitoring system today is highly inefficient. Typically, the ratio is around 1%. With a large volume of irrelevant alerts, a benchmark is needed to establish how many alerts should be generated given the specific conditions of a given company. The factors considered when developing scenarios can include the availability of credible data, the potential for optimization, system prerequisites and lead times. This can be complex. The easy way out is to settle for a more arbitrary answer to the question of how effective the process of setting scenarios for your company should be. Based on our experience advising numerous companies on transaction monitoring, we recommend two approaches to arrive at a more well-founded and methodical answer.
The quantitative approach
Using available open-source quantitative data from e.g. the Swedish Bankers' Association, the Swedish Financial Supervisory Authority and the Financial Police, a mathematical approach can be taken to determine a reasonable interval for the number of alerts for a company of a given size. Data on the number of employees can be obtained from the Swedish Bankers' Association, and data on the categorization of each financial institution is available from the SFSA. Information on the number of reports submitted to the FIU is available in its annual report. The latter can be translated into the number of alerts that all reporting companies have had, based on the assumption of a certain efficiency, i.e. the number of reports versus the number of alerts. In the sector “Banks, financing companies including credit institutions”, a total of 27 801 reports were sent to the FIU in 2021. Assuming that 1% of all generated alerts lead to an FIU report, this suggests that in total 2 780 100 alerts were generated by all companies in that sector. It then follows that the number of alerts per company can be estimated based on data on the size and nature of each company, available from the Swedish Bankers' Association and the FSA.
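A minimal sketch of this top-down estimate follows. Only the FIU report count (27 801 for 2021) and the assumed 1% report-to-alert ratio come from the figures above; the employee counts used to apportion the sector total to a single company are hypothetical.

```python
# Top-down estimate of alert volume, as described above.
fiu_reports_2021 = 27_801    # sector "Banks, financing companies incl. credit institutions"
assumed_efficiency = 0.01    # assumed share of alerts that lead to an FIU report

sector_alerts = fiu_reports_2021 / assumed_efficiency
print(f"Estimated sector-wide alerts: {sector_alerts:,.0f}")  # -> 2,780,100

# Apportion the sector total to one company using a size proxy, e.g. its
# share of sector employees (both figures below are invented for illustration).
company_employees = 1_200
sector_employees = 80_000
company_alerts = sector_alerts * company_employees / sector_employees
print(f"Estimated alerts for the company: {company_alerts:,.0f}")
```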
Further, a linear regression model can be fitted with the number of alerts per company as the dependent variable and explanatory variables adopted from the collected data, for example the classification that the FSA has set for the company within its category and/or the number of employees. The result derived from this model is a theoretical estimate of the number of alerts your company should have, assuming a certain monitoring efficiency and the assumed distribution of alerts across companies of comparable size.
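As an illustration, the sketch below fits such a model with scikit-learn on synthetic rows. In practice the employee counts, category codes and per-company alert estimates would come from the sources named above, and a categorical encoding of the FSA classification would likely be preferable to the plain integer code used here.

```python
# Minimal sketch of the regression step, on synthetic placeholder data.
import numpy as np
from sklearn.linear_model import LinearRegression

# Explanatory variables per company: [number of employees, FSA category code]
X = np.array([
    [120,    1],
    [450,    1],
    [2_300,  2],
    [8_000,  2],
    [15_000, 3],
])
# Dependent variable: estimated alerts per company (derived via the 1% assumption)
y = np.array([4_000, 15_000, 90_000, 310_000, 600_000])

model = LinearRegression().fit(X, y)

# Theoretical alert volume for a company with 1,200 employees in category 2
estimate = model.predict(np.array([[1_200, 2]]))
print(f"Theoretical alert volume: {estimate[0]:,.0f}")
```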
The qualitative approach
A more qualitative approach can be based on the AML risk assessment of the company, documented experience, and insights derived from testing of the scenarios in the transaction monitoring. The General Risk Assessment should include several data points of varying granularity which can be used to gauge the risk exposure of the company in relation to various products and services, users and customers, including individual transaction volumes. Combined with carefully formulated test documentation for the scenarios the company deploys, where different alert thresholds are weighed against efficiency, and with data on scenario performance in production environments, it is possible to estimate and justify what the total alert volume of the company should be. This may be more or less difficult depending on the ability of the company to generate new and historical data in relation to documented monitoring outcomes.
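One way to operationalize this is a bottom-up estimate: for each deployed scenario, multiply the monitored population by a trigger rate observed in testing or production, and sum across scenarios. The sketch below illustrates the idea; the scenario names, populations and trigger rates are all invented for illustration.

```python
# Hypothetical bottom-up estimate of total alert volume from the
# general risk assessment and scenario test documentation.

scenarios = {
    # name: (monitored population, trigger rate from testing/production)
    "cash_deposit_threshold":  (40_000, 0.015),
    "rapid_movement_of_funds": (40_000, 0.004),
    "high_risk_jurisdiction":  (40_000, 0.002),
}

total = 0.0
for name, (population, trigger_rate) in scenarios.items():
    expected = population * trigger_rate
    total += expected
    print(f"{name}: {expected:,.0f} expected alerts per period")

print(f"Total expected alert volume: {total:,.0f}")
```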
Conclusions
The two approaches differ in methodology and can generate very different results. The results should be treated as guidance and as a point of comparison against the actual performance of the transaction monitoring system. As a general recommendation, the quantitative approach may be preferred since it is less dependent on conditions within the company. It can be complemented with the qualitative approach where conditions allow for such an analysis. A thorough and well-founded analysis is decisive in arriving at a better and more adequate view of performance, beyond general or relative assumptions.
According to André Christiansen, Senior Associate at FCG and a specialist in developing TM optimization models, it is important to understand that the proposed quantitative and qualitative statistical analyses of alerts and reporting establish the actual state, not the aspired state. However, part of the core issue is precisely the lack of validation of the actual state. Large variances between alert volumes and effectiveness are a real problem, and to improve something you first need to validate its current performance. This allows you to build more realistic models. The result is a benchmark that allows you to drive optimization and performance improvement.
We will host a webinar on this subject; read more and register here.
For more information, please contact: