29 November 2024
We reported in last month's round-up the launch of the FCA's AI Lab. One component of this is the AI Input Zone, an online feedback platform through which the FCA is asking stakeholders for their views on the future of AI in UK financial services.
The FCA is keen to get the views of different market participants on the following five topics:
The FCA has asked for responses by 31 January 2025 and has said it may publish information about the comments it receives (no doubt after its own AI tools have analysed them first!).
The FCA is also holding an AI Sprint on 29 and 30 January 2025. This will bring together industry, academics, regulators, technologists and customer representatives, and will help guide the FCA's regulatory approach to AI and how it can promote a pro-growth and innovation-friendly environment. The application deadline is 9 December 2024.
One issue that central banks and regulators are grappling with is the potential financial stability risks that arise from AI, particularly as AI models become more powerful and their adoption increases.
A report from the Financial Stability Board (FSB), published on 14 November 2024, on the financial stability implications of AI is therefore particularly welcome.
The FSB sets out an overview of developments in AI since its November 2017 report on AI and machine learning in financial services, along with an assessment of their potential financial stability implications. The report discusses selected AI use cases by industry participants and official sector authorities and the implications for financial stability, focusing on how AI could amplify specific types of financial sector vulnerabilities.
It highlights potential vulnerabilities relating to third-party dependencies and service provider concentration, market correlations, cyber risks, and model risk, data quality and governance.
The FSB suggests that it, standard-setting bodies, and national authorities should:
Consider ways to address data and information gaps in monitoring developments in AI use in the financial system and assessing their financial stability implications. The FSB suggests that national authorities could consider leveraging periodic and ad-hoc surveys on AI adoption and use cases, reporting from regulated entities and public disclosures.
Assess whether current regulatory and supervisory frameworks adequately address the vulnerabilities identified in the report, both domestically and internationally. The FSB could consider the implications of sector-specific regulatory and supervisory frameworks on the level playing field across sectors, as well as between established firms and new entrants.
Consider ways to enhance regulatory and supervisory capabilities for overseeing policy frameworks related to the application of AI in finance. The FSB could consider facilitating international and cross-sectoral co-operation by enhancing the sharing of information and good practices across member jurisdictions.
The Bank of England and FCA have published the results of their third survey of AI and machine learning in UK financial services. The last survey was published in 2022.
What did they find?
75% of respondents are already using AI, with an additional 10% planning to adopt it within the next three years.
The insurance and international banking sectors have the highest AI usage rates at 95% and 94%, respectively, while financial market infrastructures report the lowest at 57%.
AI is primarily used for optimising internal processes (41%), cyber security (37%), and fraud detection (33%). Future plans include increased AI use for customer support (36%), regulatory compliance (32%), fraud detection (31%), and the optimisation of internal processes (31%).
Nearly half of respondents (46%) report only a "partial understanding" of the AI technologies they use, largely attributed to reliance on third-party models, while 34% claim a "complete understanding".
Regulatory burden is seen as the main constraint on firms' use of AI; within that, the three most commonly cited areas are data protection and privacy (33%), the Consumer Duty (23%), and other FCA regulation (20%).
Respondents also identified a lack of regulatory clarity around intellectual property rights (18%), the Consumer Duty (13%), and operational resilience and cyber security rules (11%).
Our team has significant experience in advising on the interface between technology and financial services and can advise you on the regulatory considerations around the use of AI and machine learning in the financial sector.
By several authors