
Financial Crime

Meta’s focus on ‘high-severity’ violations will not stop organised crime


January 15, 2025

Meta Platforms, the parent company of Facebook, Instagram, Threads, and WhatsApp, said it will continue to focus its automated systems on “tackling illegal and high-severity violations, like terrorism, child sexual exploitation, drugs, fraud and scams”. Meta platforms are often cited by law enforcement, civil society groups and financial services firms as originating points for scams, fraud, sextortion, and other crimes.

The US social media giant’s pledge to focus on so-called high-severity violations is welcome, but its actual scope is unclear. Many of these crimes occur in private areas such as direct messaging (DM), encrypted chats and private groups, which are not surveilled by automated moderation systems.

Meta has been fined $11 billion globally since 2013, according to Violation Tracker and the GDPR Enforcement Tracker. Most of those fines and settlements related to data privacy violations, such as breaches that exposed millions of users’ data, including children’s, to criminals, Compliance Corylated found. Thirty-three US states are suing Meta, alleging it harms children and illegally collects data on children under 13 without parental consent.

DMs, encrypted chats

“There is certainly room for improvement, but there doesn’t need to be a trade-off [between fact checking and reducing harm]. If it helps reduce some of these scams and sextortion, I hope it works and is successful,” said Alison Jimenez, a financial crime investigator and expert witness in Tampa, Florida, and president of Dynamic Securities Analytics.

There are dozens of public Facebook groups dedicated to drug dealing and to selling sextortion instruction manuals and know-your-customer (KYC) information, as well as Instagram interest groups for child sexual abuse material (CSAM), which Meta has not closed. Those are areas requiring immediate attention, she added.

“My understanding is [Meta’s statement] applies to public posts as opposed to direct messaging. The actual exchange of CSAM, the actual grooming of kids is in direct messaging, which doesn’t appear to be affected by this policy. The messages are encrypted anyway and that’s something Meta is not able to look at,” Jimenez said.

Jimenez has written extensively about financially motivated sextortion gangs and international money laundering challenges. Many criminals take payment from victims through apps such as Zelle, Apple Pay, Cash App and others, then use cryptocurrency to move the proceeds of crime across borders, her investigations show.

Less moderation

Last week, Meta announced it would end third-party fact-checking in the US and move to a user-initiated community notes system for reporting content that violates its standards. It will also change its policies governing political speech, for example in posts about immigration and LGBTQ issues, a change Meta said should reduce censorship on its platforms.

Chief executive Mark Zuckerberg said in a statement: “We used to have filters that scanned for any policy violation. Now we’re going to focus those filters on tackling illegal and high-severity violations, and for lower-severity violations we’re going to rely on someone reporting an issue before we take action. We’re also going to tune our filters to require a much higher confidence before taking down content.

“The reality is this is a trade-off. It means that we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”
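Read as pseudocode, the trade-off Zuckerberg describes could look something like the minimal sketch below: proactive scanning for high-severity categories, report-driven review for everything else, and a higher confidence bar before takedown. The category names and threshold here are hypothetical illustrations, not Meta’s actual values.

    # A minimal sketch of the described policy; all names and values are hypothetical.
    HIGH_SEVERITY = {"terrorism", "child_sexual_exploitation", "drugs", "fraud", "scams"}

    # Raising this threshold trades recall ("catch less bad stuff") for
    # precision (fewer innocent posts removed). 0.95 is an assumed value.
    TAKEDOWN_CONFIDENCE = 0.95

    def moderate(category: str, confidence: float, user_reported: bool) -> str:
        """Return an action for a public post under the described policy."""
        if category in HIGH_SEVERITY:
            # Automated filters still scan these categories proactively.
            return "remove" if confidence >= TAKEDOWN_CONFIDENCE else "review"
        # Lower-severity content is acted on only after a user report.
        if user_reported and confidence >= TAKEDOWN_CONFIDENCE:
            return "remove"
        return "leave_up"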

It is unclear whether Zuckerberg’s comments about mistakes in Meta’s content moderation filters apply to high-severity violations, and whether catching “less bad stuff” extends to criminal activity. Meta’s press office did not respond to a request for clarification, and did not provide details or a timescale for its efforts to tackle high-severity violations on its platforms.

Over half of UK scams start on Meta

Recent data from the UK’s Payment Systems Regulator (PSR) showed that in 2023, Meta platforms were linked to 54% of scam incidents (119,338 cases) and 18% of total losses (£62.7 million), or roughly £1 in every £5 lost to scams in the UK.
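A quick back-of-envelope check shows how the “roughly £1 in every £5” figure follows from the PSR percentages. The implied total below is derived from the article’s own numbers, not a separately published PSR statistic.

    # Derive the implied totals from the figures quoted above.
    meta_losses = 62.7e6                  # GBP; 18% of total UK scam losses in 2023
    share = 0.18
    implied_total = meta_losses / share   # ~£348m, a derived estimate
    print(f"Implied total UK scam losses: about £{implied_total / 1e6:.0f}m")
    print(f"An 18% share is roughly £1 in every £{1 / share:.1f} lost")  # ~£5.6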

Cybertip.ca, Canada’s national tipline for reporting the online sexual exploitation of children, reported that between September 2023 and August 2024, 74% of sextortion incidents occurred on Instagram or Snapchat, which is owned by Snap. Sextortion demands often come from international organised crime gangs, Cybertip.ca said.

In 2023, the UK passed the Online Safety Act, which is designed to protect children and to require social media companies to ensure their platforms are not used for illegal activity. Non-compliant companies can be fined up to £18 million or 10% of their qualifying worldwide revenue, whichever is greater. Regulation implementing the act has not been finalised.
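For illustration, the “whichever is greater” rule means the 10% cap overtakes the £18 million floor once qualifying worldwide revenue exceeds £180 million. The revenue figures in this sketch are hypothetical.

    # Penalty cap under the act: £18m or 10% of qualifying worldwide revenue,
    # whichever is greater. Revenue inputs below are hypothetical examples.
    def max_fine(qualifying_revenue: float) -> float:
        return max(18e6, 0.10 * qualifying_revenue)

    assert max_fine(100e6) == 18e6   # below £180m revenue, the £18m floor applies
    assert max_fine(1e9) == 100e6    # above £180m, the 10% cap dominates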