Regulatory
AI guidance needed to spur financial services adoption
May 27, 2025

Submissions to the UK Treasury Committee’s call for evidence on artificial intelligence (AI) in financial services overwhelmingly urged regulators to provide guidance to enable its safe adoption. Most submissions argued there is no need for AI-specific regulation, however — a view also held by the Financial Conduct Authority (FCA).
“Whereas high regulatory burden might be considered the main type of regulatory constraint limiting AI development in this area — at least according to respondents to the 2024 BoE/FCA survey on the use of AI in finance — the second regulatory constraint most often flagged by market players is ‘lack of clarity of current regulation’,” wrote Clara Martins Pereira, associate professor of financial law at Durham Law School, Durham University, in her submission.
This uncertainty is discouraging financial firms from fully integrating AI-driven solutions, while the legal and compliance effort required slows innovation and limits AI adoption, according to Amplified Global, a UK regtech.
Meanwhile, trade association UK Finance said in its submission: “Overall, we consider that the UK’s pro-innovation sectoral approach to AI regulation is correct. Proportionate, tech-neutral rules should be relied on as far as possible, with existing regulators issuing guidance in their domains to resolve any emerging uncertainties if and when these become apparent.”
The Investing and Saving Alliance (TISA) said its diverse membership is already actively using AI, or is planning to adopt it soon once additional guidance or regulation is in place.
Areas requiring clarity
Firms, their lobbying groups and academics identified a need for guidance in areas such as robo-advice; algorithmic credit scoring; ethical considerations; data privacy concerns; senior management accountability; bias detection and mitigation; and governance structures.
UK Finance said: “We note that the dynamics between AI providers and AI deployers may need further consideration from regulators. Development of commonly understood best practice, or a light-touch code of practice, might help set or standardise technical disclosures that would provide firms with greater confidence in their AI deployment.
“The merits of such options could be considered collaboratively by industry and regulators in a public-private partnership arrangement. This is an issue we continue to discuss with members.”
The Bank of England has already indicated where it believes guidance might be necessary: model risk management; training data quality; expectations on senior managers; “human in the loop” expectations; and governance.
Existing rules
The FCA’s submission to the committee said its Consumer Duty and Senior Managers and Certification Regime (SM&CR) are relevant to the “safe and responsible use of AI”.
“We continue an open dialogue with the financial services industry, consumer groups, academics and others to understand what they need and considering where we can further elaborate on our framework. This includes supporting firms by enabling safe testing and experimenting through the AI Lab,” the FCA said.