The interplay between rapidly developing AI tools and the as-yet largely unclear regulatory position can be a challenge for financial institutions looking to harness the new technology to their advantage.
How can AI help firms comply with the Consumer Duty?
In our view, AI has the potential to improve firms’ compliance with the requirements of the Consumer Duty. For example, firms could use it to improve their operational efficiency. AI can help firms categorise and organise customer data and feedback (and do so quickly). This data can be used to improve customer-facing processes, saving time and money and improving the customer experience. It may also help firms comply with the Consumer Duty by feeding into customer modelling – allowing a better understanding of specific consumer characteristics and needs – meaning that consumers benefit from tailored products and offerings.
Similarly, AI tools are already being used for the quicker and more effective detection of fraud and other financial crime, specifically through the increasing use (and complexity) of systems designed to detect suspicious activity.
"AI tools are already being used for the quicker and more effective detection of fraud and other financial crime"
However, many of these apparent benefits bring risks. The FCA has openly spoken about the “responsible adoption” of AI – but what does this really mean?
Bad outcomes for consumers – where does risk lie?
We already know that the use of AI within firms is increasing and is predicted to grow exponentially over the next two years. Some of the key areas where firms with retail customers will need to be attentive, or risk bad consumer outcomes, are:
- Data Quality. The output of any AI system is only as good as the data which feeds it. Discrimination and bias caused by low-quality data input may result in bad outcomes for consumers.
- Outsourcing and model risk management. Firms must have a plan to keep track of their outsourced functions and the use of externally provided algorithmic systems. We predict that oversight of outsourced systems (and AI use by third parties) is an area that the regulators will be focused on – and, in particular, how responsibility is apportioned in the context of the Consumer Duty.
- Bad actors. The use of AI to cause harm in the financial services sector means we can expect an increase in consumer scams causing individual loss, such as those involving audio deepfake technology, scam calls and biometric theft. Firms themselves may also be at risk from AI-powered cyber-attacks. The sophistication of the systems used by firms to detect fraud and protect consumers will need to develop at pace (and at expense).
- Operational resilience. Increased reliance on AI for key business functions will become a key risk. Firms that fail to update their business continuity plans may expose their customers to disruption if their AI-dependent systems fail.
So, what is on the legislative horizon?
Well, probably nothing legislative – at least for the time being. The FCA describes its current approach to consumer protection and AI as based on “a combination of the FCA’s Principles for Businesses, other high-level detailed rules, and guidance, including the Consumer Duty” [2]. The UK government has been clear that it does not intend to introduce any AI-specific statute. Instead, it will focus on principles-based guidance which the UK financial regulators can adapt and implement.
Most recently, a steer on the proposed approach to AI regulation has been provided by the publication of the collective response paper of the FCA, PRA and Bank of England on AI and Machine Learning in late October 2023 [3]. This response paper contains a summary of feedback on the proposals that have been put forward, and although it does not set out any concrete views or policy proposals from the regulators themselves, we expect it signals their likely future approach.