Authors

Sherin Sayed

Associate


Dr. Stefanie Greifeneder

Partner


18 September 2023

Medical devices in the context of the European Commission's AI Regulation draft

  • Briefing

Artificial intelligence (AI) has been part of everyday life for several years. A growing number of tools and programs are now available to the general public and can be used for a wide variety of purposes, from work to entertainment. Industry, too, is using AI-based production aids or integrating AI into its products, and medical device manufacturers are no exception: AI is used in many forms across the medical device industry. The European Commission is taking this development into account and presented a draft "Regulation laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts" as early as April 2021. The fact that the Commission opted for a regulation rather than a directive shows the importance it attaches to uniform rules throughout the EU.

But how, in concrete terms, would this regulation change things for manufacturers, distributors and importers of medical devices and in vitro diagnostic medical devices?

The concept of artificial intelligence

The draft defines the term artificial intelligence in a uniform European context for the first time. Art. 3 No. 1 of the draft contains a legal definition: "'artificial intelligence system' (AI system) means software that is developed with one or more of the techniques and approaches listed in Annex I and can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing the environments they interact with".

The Commission chose a deliberately broad definition that does not refer to specific types of technology. The intention is to address the general problem that legislation is regularly overtaken by new technologies, while lengthy legislative procedures prevent it from keeping pace with technical progress. The definition is, however, concretised by the techniques and approaches listed in Annex I of the draft, such as machine learning (including deep learning), logic- and knowledge-based approaches and statistical approaches. It may nevertheless be considered problematic that this very broad material scope could bring even conventional programming procedures and methods with low risk potential within Art. 3 No. 1 of the draft, with the result that medical software would be comprehensively covered by the regulation. This could create legal uncertainty as to which set of rules applies to which software.

Medical devices in view of the AI Regulation draft

Medical devices are also affected by the draft. Art. 6 (1) in conjunction with Annex II of the draft categorises devices meeting the conditions listed there as "high-risk AI systems". Both Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices (MDR) and Regulation (EU) 2017/746 of the European Parliament and of the Council of 5 April 2017 on in vitro diagnostic medical devices (IVDR) are listed as Union harmonisation legislation in Annex II, Section A, Nos. 11 and 12. If a medical device that is operated by means of AI or contains AI components falls within risk class IIa or higher, so that a Notified Body has to be involved in the conformity assessment procedure, that device is a high-risk AI system within the meaning of the draft. Given the changes introduced by the MDR and the IVDR, software in particular must regularly be classified higher than under the previous legal situation, with the consequence that the existence of a high-risk AI system within the meaning of the draft regulation must then also be assumed.

Requirements under the AI Regulation draft

For high-risk AI systems, the draft provides for far-reaching requirements: risk management systems including testing procedures, technical documentation and record-keeping obligations, transparency and the provision of information to users, establishment of human oversight, and implementation of the principles of accuracy, robustness and cybersecurity. In addition, the draft contains not only obligations for providers but also specific obligations for importers, distributors and even users of such systems. These requirements go beyond those of the MDR and IVDR and will result in a considerable additional burden for app developers in the conformity assessment procedure.

Conformity assessment of AI medical devices

Economic operators within the meaning of the MDR and IVDR already have to keep numerous requirements and obligations from those regulations in mind, including requirements for risk management systems and documentation. Where the requirements of the MDR, the IVDR and the new AI Regulation overlap, the overlap is to be resolved by subjecting the safety risks specific to AI systems to the requirements of the draft, while the safety of the product as a whole is assessed under the MDR.

Art. 43 of the draft regulates various conformity assessment procedures for high-risk AI systems. Providers of high-risk AI systems must demonstrate that they meet the requirements of the regulation. For the conformity assessment of medical devices, paragraph 3 of the provision is relevant: according to it, the conformity assessment of high-risk AI systems subject to the legal acts listed in Annex II, Section A is governed by those legal acts. Both the MDR and the IVDR are listed there under Nos. 11 and 12. The provider must therefore follow the relevant conformity assessment procedure under the MDR or the IVDR respectively. Under the same paragraph, the Notified Bodies designated under the MDR are also entitled to check the conformity of high-risk AI systems with the requirements of the draft regulation and can thus act as a "Notified Body" within the meaning of the draft. As a result, only a single conformity assessment procedure still needs to be carried out for AI medical devices in accordance with the requirements of the MDR, which must additionally ensure compliance with the requirements of the draft regulation. However, Notified Bodies must have sufficient internal competence to effectively assess the tasks performed by external bodies on their behalf. In exceptional cases, this may result in the dual involvement of a Notified Body under the MDR/IVDR and a Notified Body under the draft, if the Notified Body under the MDR/IVDR cannot provide the necessary capacity and expertise.

All parties involved must therefore ensure at an early stage that, in addition to the requirements of the MDR and IVDR for medical devices - and in particular for the software falling under them - the further-reaching requirements of the future regulation are complied with. In particular, it must be checked whether the AI used meets the safety requirements, and internal processes and products must be adapted accordingly.

Additional costs at the expense of medical device companies

As a result of these measures, all parties involved should be prepared for not insignificant additional costs. According to the Commission's calculations, the cost of complying with these requirements for the provision of an average high-risk AI system worth around EUR 170,000 is expected to be around EUR 6,000 to EUR 7,000. In addition, there are human oversight costs of around EUR 5,000 to EUR 8,000 per year. Suppliers of high-risk AI could also incur verification costs of between EUR 3,000 and EUR 7,500. Whether these calculations will prove accurate, or whether even higher costs will be incurred, remains to be seen.

Conclusion

On the one hand, embedding AI in medical devices leads to greater and continuously expandable functionality of medical devices. On the other hand, the requirements for functionality and, in particular, safety increase correspondingly, as do the liability risks and the cost burden for all economic operators involved.

