Artificial intelligence (AI) is on everyone's lips, especially since OpenAI's software "ChatGPT" became available to the public. The software can automatically produce texts, hold conversations with users, and answer whatever questions users put to it.
To do this, the software draws on data available on the internet, which it assembles, with the help of artificial intelligence, into the texts and answers the user has requested. The software's answers sound almost human, which explains the current euphoria about ChatGPT.
If, for example, users are undecided about what to cook with the ingredients left in the refrigerator, they can tell the software the ingredients and receive not only suggestions for dishes but also instructions on how to prepare them. Beyond such banal questions, where an incorrect answer at worst results in an inedible dinner, ChatGPT also answers medical questions. Asked "I have a headache. What should I do?", it replies: "I'm sorry to hear you have a headache. Here are some tips that might help you: 1. Take a painkiller: If you have no medical concerns, you can take a painkiller such as paracetamol or ibuprofen. However, be careful not to exceed the recommended dosage. (...)". The software thus suggests a concrete treatment for the reported headache, which raises the question of whether ChatGPT could be classified as a medical device.
Consequences of classifying ChatGPT as a medical device
The consequence of classifying ChatGPT as a medical device would be that it would have to undergo a conformity assessment procedure under Regulation (EU) 2017/745 (in short: "MDR"). In principle, this must be carried out with the assistance of a Notified Body. The only exception applies to medical devices in MDR risk class I, for which manufacturers may carry out the conformity assessment procedure themselves. Software that qualifies as a medical device, however, usually falls into risk class IIa, since MDR Annex VIII (Rule 11), which governs the classification of medical devices, stipulates that "software intended to provide information used for decision-making for diagnostic or therapeutic purposes" belongs to class IIa, and this will usually be the case for such software.
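The classification and conformity assessment logic described above can be pictured as a simple decision function. The following Python sketch is a deliberately simplified, illustrative model only (real classification under MDR Annex VIII involves many more rules and higher classes, which are omitted here); the function names are hypothetical:

```python
# Simplified, illustrative model of the classification step described above.
# Only Rule 11's default for decision-support software and its consequence
# for the conformity assessment route are captured; everything else is omitted.

def classify_software(provides_decision_support: bool) -> str:
    """Return a (simplified) MDR risk class for standalone software."""
    # Rule 11: software intended to provide information used for
    # decision-making for diagnostic or therapeutic purposes -> class IIa
    # (higher classes apply in special cases, which this sketch ignores).
    return "IIa" if provides_decision_support else "I"

def conformity_route(risk_class: str) -> str:
    """Who carries out the conformity assessment procedure?"""
    # Only class I devices may be self-assessed by the manufacturer;
    # all higher classes require the involvement of a Notified Body.
    return "manufacturer self-assessment" if risk_class == "I" else "Notified Body"

print(conformity_route(classify_software(True)))   # -> Notified Body
```

The point of the sketch is that once software crosses the Rule 11 threshold, the manufacturer loses the option of self-assessment.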
The question that remains to be answered is whether ChatGPT falls within the definition of a medical device under the MDR. The definition of a medical device is found in Art. 2 No. 1 MDR. Here it is explicitly clarified that software can also be a medical device. According to Art. 2 No. 1 MDR, a medical device is, among other things, "(...) software (...) [which] according to the manufacturer is intended for human beings and is intended to fulfill, alone or in combination, one or more of the following specific medical purposes: Diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of disease (...)." This definition covers so-called "standalone software", which – like ChatGPT – is offered on the market without any associated hardware.
If ChatGPT's answer to the headache question is measured against the definition in Art. 2 No. 1 MDR, the information it provides is at least suitable for the "treatment or alleviation of disease" within the meaning of that provision. At first sight, the definition of a medical device therefore appears to be fulfilled. However, Art. 2 No. 1 MDR contains a further element: the so-called "intended purpose". A product (in this case the software ChatGPT) is a medical device within the meaning of Art. 2 No. 1 MDR only if the manufacturer (in this case OpenAI) intends it to fulfill a specific medical purpose. In contrast to pharmaceutical law, which generally looks to the objective intended purpose as perceived by the market, the subjective intended purpose specified by the manufacturer is decisive for medical devices. Put simply, products are medical devices only if the manufacturer ascribes this characteristic to them.
Intended purpose determines classification as a medical device
The framework within which the manufacturer can ascribe an intended purpose to its device is in turn set by the MDR, which defines the term in Art. 2 No. 12 MDR as "the use for which a device is intended in accordance with the information provided by the manufacturer on the label, in the instructions for use or in the advertising or sales material or statements and his statements in the clinical evaluation." The subjective intended purpose is limited, however, in cases where the purpose specified by the manufacturer is not scientifically tenable. After all, a manufacturer should not be able to circumvent classification as a medical device, and the mandatory conformity assessment procedure, by formulating an intended purpose that is obviously scientifically incompatible with the function of the device.
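The two-step test described above can be summarized in a short sketch: the manufacturer's subjective intended purpose is decisive, but a stated non-medical purpose only shields the product if it is scientifically tenable. This is a simplified, illustrative model and not legal advice; all field and function names are hypothetical:

```python
# Illustrative sketch of the intended-purpose test described above.
# NOT legal advice; a simplified model with hypothetical names.

from dataclasses import dataclass

# Medical purposes listed in Art. 2 No. 1 MDR (abbreviated).
MEDICAL_PURPOSES = {
    "diagnosis", "prevention", "monitoring", "prediction",
    "prognosis", "treatment", "alleviation",
}

@dataclass
class Product:
    stated_purposes: set     # subjective purpose per label, instructions, advertising
    objective_purposes: set  # what the product actually does in practice

def is_medical_device(product: Product, stated_purpose_tenable: bool) -> bool:
    # Step 1: the subjective intended purpose stated by the manufacturer
    # is decisive (unlike in pharmaceutical law).
    if product.stated_purposes & MEDICAL_PURPOSES:
        return True
    # Step 2: a non-medical stated purpose only counts if it is
    # scientifically tenable; otherwise the objective function decides.
    if not stated_purpose_tenable:
        return bool(product.objective_purposes & MEDICAL_PURPOSES)
    return False

# ChatGPT as described by OpenAI: text generation, no medical purpose stated.
chatgpt = Product(stated_purposes={"text generation"},
                  objective_purposes={"text generation", "treatment"})
print(is_medical_device(chatgpt, stated_purpose_tenable=True))   # -> False
```

On this simplified model, the open legal question in the article is precisely whether `stated_purpose_tenable` still holds for a product that demonstrably gives treatment advice.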
Conclusion
In the present case, at first glance there is no evidence that OpenAI, for example in its promotional material, has dedicated ChatGPT to the diagnosis, prevention, monitoring, prediction, prognosis, treatment, or alleviation of disease, as required by Art. 2 No. 1 MDR. No such statements by the manufacturer can be found on OpenAI's website or elsewhere; it states only that ChatGPT was developed to generate new texts from various text modules available on the internet. The medical intended purpose required for classification as a medical device is therefore, in principle, lacking, which speaks against classifying ChatGPT as a medical device. In view of its response to the headache question, however, there are indications that ChatGPT can also provide medical advice beyond the mere text generation intended by the manufacturer. Whether this already exceeds the limit of the "scientific tenability" of the intended purpose will, in case of doubt, have to be clarified by the courts.
By the way: If you ask ChatGPT itself whether ChatGPT is a medical device, you get the following answer: "No, ChatGPT is not a medical device. ChatGPT is an artificial intelligence-based language model service developed by OpenAI. It is not a medical device or a product used for the diagnosis, treatment, or prevention of diseases or medical conditions. It is merely a text generation platform."