Regulatory Reform
Fincrime prevention is central to UK's 'go and innovate' AI strategy
June 18, 2025

Financial crime prevention is central to the UK’s “go and innovate” artificial intelligence (AI) strategy because institutions need AI to fight criminals who exploit the technology, said Nikhil Rathi, chief executive of the Financial Conduct Authority (FCA), during an oral evidence session at the Treasury Committee last week.
“Some of the AI roll-out helps to tackle financial crime, as well as potentially being a vector of financial crime for criminals, so there is a balance to be struck here,” Rathi said.
“Some of the payments products are very convenient for consumers. We are not in the business of trying to inhibit beneficial innovation, or some of the scale and cost benefit that these products can bring, but we do need good, active co-operation with the regulatory and law enforcement community to tackle the downsides,” he added.
The FCA is also partnering with the Alan Turing Institute and financial crime advisory firm Plenitude Consulting on an anti-money laundering (AML) project that uses synthetic data. The aim is to create a dataset that mimics real-world financial transactions and incorporates money laundering typologies, according to the FCA's written evidence submitted to the Treasury Committee inquiry.
Improving customer protection
The regulator's approach aligns with examples provided by the City of London Corporation (CoLC) in its written evidence submitted to the Treasury Committee on the use of AI in UK financial services.
According to the CoLC, financial services are investing in innovation and collaborating with tech firms to improve customer protection. For example, HSBC has partnered with Google to co-develop an AI system — known internally as Dynamic Risk Assessment — to enable the faster detection of financial crime while minimising the impact on customers.
Similarly, Revolut uses AI scam detection to safeguard customers against authorised push payment (APP) scams, while Mastercard has reduced false fraud alerts by 85% through its real-time Decision Intelligence (DI) tool.
No new rules
“We are not standing in the way of innovation,” said the FCA's Rathi. “The position we have taken on AI is, I think, consistent with the overall government position on the regulation of AI nationally. We are not looking at new and detailed rules.”
Under its outcomes-based framework, the FCA maintains that the Consumer Duty is sufficient to address potential risks. If problems occur, Rathi said, the regulator expects firms to measure consumer outcomes and for senior manager governance to correct course.
On June 9, the FCA announced a “supercharged sandbox” using NVIDIA technology, launching in October 2025, to help financial firms experiment safely with AI innovations. It also announced a live AI testing service and sought industry feedback through an engagement paper, which closed on June 13.
While the FCA is not looking at new rules for AI, the government said it will set out its approach on AI regulation and “will act to ensure that we have a competitive copyright regime that supports both our AI sector and the creative industries”.
Budget allocation
The FCA's approach supports plans set out in the UK government's Spending Review 2025, published on June 11, which sees HM Treasury accelerating the UK's “go and innovate” AI strategy with a £2 billion commitment to its AI opportunities action plan.
The government stated that the investment will fund at least a “twentyfold expansion” of the AI research resource and support AI firms through the new sovereign AI unit — promising 1.5% yearly productivity growth and 13,250 new jobs, as well as positioning the UK as a “global AI superpower”. It will seek to introduce AI in government services, and potentially in its internal fraud prevention work, to drive efficiencies.
Janine Hirt, chief executive of the trade body Innovate Finance, said: “The significant long-term funding commitments announced today will ensure AI adoption and skills development across society and the fintech industry will no doubt play the leading role in diffusing this technology across financial services.”
However, despite this support for AI solutions, Hirt called for additional resources to tackle fraud and economic crime in the UK. “The Home Office must have sufficient funding to build a cross-industry data-sharing platform to block and disrupt fraud online as part of its new fraud strategy,” she added.