Nov 20, 2023

Event Monitoring for Modern Financial Crime Prevention

Transaction monitoring isn’t just a regulatory mandate. It’s a crucial line of defense that safeguards financial institutions and their clients from illicit activities. Yet traditional transaction monitoring systems, once the bedrock of compliance efforts, have struggled to keep pace with the sophisticated tactics of bad actors and ever-changing regulations.

Transaction monitoring, or more appropriately “event monitoring,” examines not only customer transactions like deposits and withdrawals but also actions like account linking, signership changes, and digital signatures. This process enables organizations to broadly detect suspicious behavior, adhere to anti-money laundering (AML) and counter-terrorist financing (CTF) guidelines, and ultimately fulfill reporting mandates like suspicious activity reports (SARs). More than just tracking irregularities, monitoring also provides insight into customer risk levels and predicted future activities.
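As a rough illustration, an event record in such a system spans both monetary and non-monetary activity. The sketch below is hypothetical; the class and field names are assumptions for illustration, not any particular vendor’s schema:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class MonitoredEvent:
    """Hypothetical record covering both transactions and account events."""
    event_id: str
    customer_id: str
    event_type: str                  # e.g. "deposit", "withdrawal",
                                     # "account_link", "signer_change"
    occurred_at: datetime
    amount: Optional[float] = None   # None for non-monetary events

evt = MonitoredEvent("evt-1", "cust-42", "signer_change",
                     datetime(2023, 11, 20))
assert evt.amount is None  # a signership change carries no dollar amount
```

Treating non-monetary events as first-class records is what lets a monitoring engine reason about behavior beyond money movement.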

The increase in the use of newer technologies and products like digital currencies, peer-to-peer payment platforms, and cross-border digital transactions has started to show the limitations of traditional transaction monitoring systems. These modern financial innovations, coupled with criminals’ increasingly savvy tactics, allow for faster, more complex transactions across borders and platforms, making it even more challenging for outdated monitoring systems to keep pace and identify suspicious activities.

To combat these challenges, institutions must strengthen their event monitoring systems, aligning safeguards to distinct business factors like size, geographical scope, and inherent operational risks. Globally, regulators like the Financial Action Task Force (FATF) and Joint Money Laundering Steering Group (JMLSG) champion a ‘risk-based approach,’ emphasizing the importance of customizing strategies to fit individual customer risk profiles. But in practice, these approaches often end up either overly broad or too narrow in scope.

To understand why new approaches are needed, we must first examine the drawbacks of these traditional calculation methods.

The Drawbacks of Traditional Event Monitoring Calculation Methods

While financial crime becomes more complex, many institutions still rely on outdated and simplistic detection methods, often incapable of analyzing vast datasets or complex transaction chains.

The core drawbacks of these traditional approaches to event monitoring and detection highlight the pressing need for intelligent solutions ready for the future.

One-Size-Fits-All Detection

Traditional monitoring relies heavily on basic rule-based detection algorithms designed to look for predetermined patterns and thresholds. While somewhat effective, these calculations often overlook important nuances, taking a one-size-fits-all approach rather than tailoring detection to the expected behaviors of a specific industry or activity.

The nuance of a business’s specific industry category, as defined by the North American Industry Classification System (NAICS) and Standard Industrial Classification (SIC) codes, is a simple example of a piece of data that is often overlooked when setting up monitoring strategies.

For example, a software company like Sandbar shouldn’t regularly deposit tens of thousands of dollars in cash. If the system misses this anomaly, a human analyst might notice the mismatch in expectations later during other investigations or diligence processes.
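A minimal sketch of how industry-aware thresholds could refine a basic cash-deposit rule. The NAICS prefixes shown are real (5112 covers software publishers, 7225 covers restaurants), but the threshold values, class, and function names are illustrative assumptions:

```python
from dataclasses import dataclass

# Hypothetical cash-deposit thresholds keyed by NAICS sector prefix.
# A software publisher is not expected to handle much cash, while a
# restaurant routinely is.
CASH_THRESHOLDS_BY_NAICS = {
    "5112": 1_000,    # software publishers
    "7225": 25_000,   # full-service restaurants
}
DEFAULT_CASH_THRESHOLD = 10_000  # generic one-size-fits-all limit

@dataclass
class Customer:
    name: str
    naics_code: str

def flag_cash_deposit(customer: Customer, amount: float) -> bool:
    """Flag a cash deposit that exceeds the expected level for the
    customer's industry, falling back to a generic limit."""
    threshold = CASH_THRESHOLDS_BY_NAICS.get(
        customer.naics_code[:4], DEFAULT_CASH_THRESHOLD
    )
    return amount > threshold

software_co = Customer("Example Software Co", "511210")
restaurant = Customer("Example Diner", "722511")
# The same $15,000 cash deposit is anomalous for one and routine
# for the other, even though a generic $10,000 rule flags both.
assert flag_cash_deposit(software_co, 15_000)
assert not flag_cash_deposit(restaurant, 15_000)
```

A one-size-fits-all rule would treat both customers identically; keying thresholds on industry data is one small step toward the expected-behavior modeling described above.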

Messy Data Mapping

Data mapping is the process of matching fields and data elements from one or more source systems or databases to a destination system or database.

For example, the source system may have a field labeled “Transaction Amount” while the target has one called “Currency Amount.” Both represent the same information — the amount of money transferred in a transaction. Data mapping helps ensure these fields are aligned so “Transaction Amount” is correctly understood as “Currency Amount” in the new system.

Mismatches in mapped data fields can lead to transactions being flagged as suspicious due to misinterpreted values. Take product type, for example. If the system can’t tell the difference between cash and checks in certain scenarios, it might apply a structuring rule to check deposits. Careful validation is required to confirm all data conveys the same meaning after mapping. While seemingly trivial, these mismatches can make integration challenging, often requiring weeks or months of work to achieve compatibility. More importantly, inaccurate mapping can lead to several operational issues, including overlooked risks and incorrect flagging of normal activities.
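To make the risk concrete, the sketch below maps a hypothetical source export to a canonical schema and normalizes product codes so that a cash-keyed structuring rule never fires on checks. All field names, codes, and function names here are illustrative assumptions:

```python
# Hypothetical field map from a source core-banking export to the
# monitoring engine's canonical schema ("Transaction Amount" in the
# source is "currency_amount" in the target, and so on).
FIELD_MAP = {
    "Transaction Amount": "currency_amount",
    "Tran Type": "product_type",
    "Acct No": "account_id",
}

# Normalizing raw product codes matters too: if "CHK" and "CASH" were
# collapsed to one value, a cash-structuring rule could fire on checks.
PRODUCT_CODES = {"CASH": "cash", "CHK": "check", "WIRE": "wire"}

def map_record(raw: dict) -> dict:
    """Translate one raw record; fail loudly on any unmapped field
    rather than silently passing misinterpreted data downstream."""
    mapped = {}
    for source_field, target_field in FIELD_MAP.items():
        if source_field not in raw:
            raise KeyError(f"unmapped source field: {source_field}")
        mapped[target_field] = raw[source_field]
    mapped["product_type"] = PRODUCT_CODES[mapped["product_type"]]
    return mapped

record = map_record({"Transaction Amount": 9_500.0,
                     "Tran Type": "CHK",
                     "Acct No": "12-345"})
assert record["product_type"] == "check"  # not mistaken for cash
```

Failing loudly on unmapped fields is a deliberate choice: a raised error during integration is far cheaper than a misclassified transaction in production.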

Duplicative Alerts

Event monitoring software often flags the same issues repeatedly, such as creating ten separate alerts in a week when, ideally, these would be combined into one alert that adjusts the risk score with each new activity.

Not only does this confuse analysts, who must deduplicate alerting activity, but it also bloats operational overhead. Rather than work a single aggregated alert, analysts must review each of the ten alerts to understand the specific transactions, accounts, and subjects that were flagged, further diverting their attention from what matters most: the analysis.
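One hypothetical way to collapse duplicates is to key alerts by subject and rule, folding each new event into the existing alert and adjusting its risk score. The +10%-per-event scoring below is purely illustrative, as are the class and field names:

```python
from dataclasses import dataclass, field

@dataclass
class Alert:
    subject_id: str
    rule_id: str
    risk_score: float
    event_ids: list = field(default_factory=list)

def aggregate(alerts):
    """Merge alerts sharing a subject and rule into one alert,
    bumping the risk score 10% per additional event, capped at 100
    (hypothetical scoring, for illustration only)."""
    merged = {}
    for a in alerts:
        key = (a.subject_id, a.rule_id)
        if key not in merged:
            merged[key] = Alert(a.subject_id, a.rule_id,
                                a.risk_score, list(a.event_ids))
        else:
            m = merged[key]
            m.event_ids.extend(a.event_ids)
            m.risk_score = min(100.0, m.risk_score * 1.10)
    return list(merged.values())

# Ten near-identical weekly alerts collapse into one unit of work.
raw = [Alert("cust-1", "structuring", 50.0, [f"evt-{i}"])
       for i in range(10)]
result = aggregate(raw)
assert len(result) == 1
assert len(result[0].event_ids) == 10
```

The analyst sees a single alert whose score reflects the accumulating activity, instead of ten separate items to triage and deduplicate by hand.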

How to Build More Dynamic Event Monitoring Calculation Engines

Institutions must shift toward more dynamic event monitoring software designed to address these drawbacks and increase confidence among internal and external stakeholders.

By implementing solutions like nuanced detection, accurate data mapping, and comprehensive and holistic alerts, institutions can evolve outdated transaction monitoring practices into robust, advanced systems ready to tackle today’s sophisticated financial crime.

Nuanced Detection

Risk teams should leverage the troves of available data to craft more refined, precise calculations, incorporating industry specifics to assess if activities align with expected behaviors for a business type.

Accurate Data Mapping

Advanced monitoring systems must map every field accurately so the engine knows exactly what to look for, ensuring accurate data migration and avoiding discrepancies.

By correctly tackling the task of data mapping, these systems do more than just preserve data accuracy and consistency. They offer a tangible benefit to clients: significantly reducing operational burdens. It’s a clear-cut case of technology taking on complexity to deliver simplicity and reliability.

Aggregated Alerts

Repetitive alerts drain resources and cause “alert fatigue,” leading critical warnings to be overlooked amid excessive notifications. Whereas traditional monitoring systems are often plagued with duplication, advanced systems can aggregate similar alerts into a unified case and update existing alerts based on new events.

This allows institutions to take action swiftly and more effectively, empowering compliance teams with accurate risk assessment without distraction or duplicative work.

Adopting Better Transaction Monitoring Engines

Given the financial and reputational risks of non-compliance, organizations need robust and reliable monitoring systems capable of adapting to new threats. The stakes, both operationally and optically, are too high to rely on outdated technology.

To keep customers, employees, and the broader communities safe, organizations must invest in advanced event monitoring engines, allowing analysts to accurately detect and efficiently manage risks without unnecessary operational strain.

These systems should be agile, adapting swiftly to emerging threats and shifting regulatory landscapes while offering everything institutions need to tailor their compliance strategies to individual customer risk profiles.

To see how tools like Sandbar can elevate your compliance programs and optimize your operations, book a demo today.
