Clifford Chance

Healthtech

Talking Tech

Is ChatGPT a medical device?

Artificial Intelligence | Healthtech | Healthcare & Life Sciences | 7 November 2023

ChatGPT can provide medical information, but should it be classified as a medical device? It's a question that Germany's Federal Institute for Drugs and Medical Devices (BfArM) was confronted with in an open letter from a Hamburg-based law firm, which argues that ChatGPT – OpenAI's human-trained chatbot, which draws its knowledge from freely accessible online sources – falls under the rules applicable to medical devices in Germany and the European Union.

In the open letter, the firm states that ChatGPT provides precise answers and concrete medication recommendations even on individual diagnostic questions. The firm took these capabilities as an opportunity to ask the BfArM to limit "the use of the software in the field of medicine by way of supervision and to ensure the necessary quality assurance measures." The BfArM, however, has pointed out that ensuring product conformity in accordance with the Medical Device Regulation (MDR) is not its responsibility but that of the manufacturer. It added that, under Germany's federalist system, monitoring compliance with the MDR is a matter for the individual federal states.

Yannick Frost, a Clifford Chance Senior Associate based in Düsseldorf and a member of the firm's Tech and Healthcare Group who focuses on the application of AI systems in the healthcare sector, agrees with the opinion expressed in the open letter to the extent that the functions of the ChatGPT software fall within the scope of diagnosis, monitoring, treatment or prevention of disease. "If, for example, one enters questions about the treatment of flu-like symptoms into the system, one receives information about which medicines are suitable for the treatment of these symptoms and which dosage to take based on the user's weight."

Whether software qualifies as a medical device is assessed, amongst other things, against the guiding questions of the European Commission's Medical Device Coordination Group. The guidance essentially asks whether the software goes beyond merely storing and reproducing information (a search function). In addition, the software must serve the benefit of the individual patient in order to qualify as a medical device. In Frost's view, the functions ChatGPT displays would probably have to be classified as diagnostic, providing decision support for the individual user, and would go beyond the mere reproduction of stored information in the manner of an online library.

However, Frost emphasises that the classification of software as a medical device depends first and foremost on the manufacturer's intended purpose: the manufacturer must intend the software to be used for the purposes specified in the MDR. "As far as can be seen, ChatGPT has not yet been expressly designated as a medical device by its manufacturer, OpenAI LLC. This stands in the way of a legally conclusive classification of the software as a medical device, despite the functions described."

Finally, Frost agrees with the open letter to the extent that Germany's federalist system – which assigns responsibility for monitoring compliance with the MDR to the individual federal states – harbours the danger that diagnostic software applications such as ChatGPT could be placed on the market without meeting the safety and quality requirements of medical device regulation. Ultimately, however, the member states and the European Commission themselves also bear responsibility: Article 4 of the MDR provides a procedure for determining the regulatory status of products as medical devices. "Against the background of ChatGPT's extensive functionality in the field of health, a rapid clarification of its qualification as a medical device is desirable," Frost concludes.