
1 July 2019

AI – 1 of 6 Insights

AI and big data in the healthcare sector

Artificial intelligence algorithms and big data are already being used in hospitals and other areas of healthcare and diagnostics. What exactly does this mean for patients and for those seeking to overcome the legal challenges?

Author

Thanos Rammos, LL.M.

Partner


How is AI used in hospitals?

Artificial intelligence, machine learning (an application of AI) and big data are used in a wide range of hospital applications. For example, cognitive computing systems are used to facilitate repetitive processes and to create sample analyses in the context of radiological diagnostics. Where a learning or predictive function is added, we are into the realms of AI.

Other examples are the deep learning models developed by Stanford University and Google Brain. These tools are used to predict the most probable time of death of seriously ill patients and, on that basis, the best treatments, including enabling patients to be transferred to palliative care in good time so that they can have a dignified farewell.

Surgical robots or other robotic-based systems are another field of application. Some believe that medical robots will be indispensable for high-precision surgical treatments in the future. They have the potential to improve post-operative rehabilitation and to provide highly efficient logistical support in clinics. There is no doubt that as machine learning and AI become more sophisticated, the use of robots in the context of treatment will only increase.

Data protection law issues

The use of AI (which, for ease, we will take to include machine learning) in the healthcare sector is the cause of some alarm among patients who are concerned about privacy and accuracy. Software-based treatment options, in which the underlying assessment is only partially made by a doctor, are a particular focus of scepticism. This is hardly surprising: on the one hand, very sensitive data is processed and, on the other, machines are entering an area that was previously occupied only by humans.

At least on the privacy side, patients should find comfort in the fact that data protection law not only places specific requirements on software-based systems in hospitals but also lays down general guidelines which must be observed in the context of big data and AI. Particularly relevant (but not exhaustive) aspects include:

Consent

Health data may only be processed if permitted by law or with the consent of the data subject. For the time being, specific legislation is unlikely to apply, so we are looking at consent. In order to obtain effective GDPR consent, transparency is essential. The individual must be able to understand not only how their personal data is processed, but also who the recipients of the data are. For big data applications, this is a huge challenge because the purposes of the processing are not always determinable in advance. AI solutions may learn differently based on different data sources. This can be problematic for consent purposes, as it may be difficult to explain to the patient what exactly happens to their data.
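The transparency requirement above can be illustrated with a minimal sketch of a consent record that makes purposes and recipients explicit, so that any processing can be checked against what the patient actually agreed to. All class and field names here are illustrative assumptions, not taken from any real hospital system or the GDPR itself.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    """Hypothetical record of one patient's GDPR consent: which purposes
    and which recipients were consented to, and whether it still stands."""
    patient_id: str
    purposes: set            # e.g. {"diagnostics"}
    recipients: set          # who may receive the data
    given_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    withdrawn: bool = False

    def permits(self, purpose: str, recipient: str) -> bool:
        # Processing is allowed only for a consented purpose and recipient,
        # and only while consent has not been withdrawn.
        return (not self.withdrawn
                and purpose in self.purposes
                and recipient in self.recipients)


consent = ConsentRecord("patient-42",
                        purposes={"diagnostics"},
                        recipients={"hospital-radiology"})
assert consent.permits("diagnostics", "hospital-radiology")
# A new purpose the patient never consented to is not permitted:
assert not consent.permits("research", "hospital-radiology")
```

The difficulty the article identifies is precisely that, for self-learning systems, the `purposes` set may not be fully known when consent is collected.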

Right to erasure

Another principle of data protection law is that personal data must be deleted once it is no longer required for the purposes for which it was collected, unless it is subject to special retention obligations. The GDPR places a focus on the retention issue by giving individuals the right to have their data erased in a variety of circumstances (including where consent is withdrawn – see below). With big data applications, this right can lead to complications, since the very purpose of data analysis is to access a large amount of information in order to derive results on that basis. An optimal solution would be to anonymise the data, but this is not always possible. Whatever decision is taken, a data controller of AI-generated health data must be able to give effect to any lawful erasure request.
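One way the erasure-versus-analysis tension above might be handled in practice is sketched below: on an erasure request, a record that is still needed for aggregate analysis is stripped of its direct identifiers rather than retained in identifiable form; otherwise it is deleted outright. This is a simplified assumption-laden illustration (field names and the identifier list are hypothetical), and true anonymisation in real systems requires far more than dropping a few fields.

```python
# Hypothetical handler for a GDPR erasure request.
# `records` maps patient IDs to their data; `anonymised` collects
# de-identified copies kept only for aggregate analysis.
DIRECT_IDENTIFIERS = {"name", "date_of_birth", "address"}


def handle_erasure_request(records: dict, anonymised: list,
                           patient_id: str, needed_for_analysis: bool) -> None:
    record = records.pop(patient_id, None)  # remove the identifiable record
    if record is None:
        return  # nothing held for this patient
    if needed_for_analysis:
        # Keep only non-identifying measurements; the patient ID key
        # itself is already gone, so nothing links back to the individual.
        anonymised.append({k: v for k, v in record.items()
                           if k not in DIRECT_IDENTIFIERS})


records = {"patient-1": {"name": "A. N. Other",
                         "date_of_birth": "1960-01-01",
                         "blood_pressure": "120/80"}}
anonymised = []
handle_erasure_request(records, anonymised, "patient-1",
                       needed_for_analysis=True)
assert "patient-1" not in records
```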

Purpose limitation

Purpose limitation is another important aspect that poses challenges for the applications in question here. Data collected for a specific purpose can generally only be used for that purpose. However, if a software solution is supposed to make predictions based on self-learning functions and derive further steps from them, data could conceivably end up being used for a different purpose to the one for which it was originally collected. Steps must be taken to ensure this does not happen unless the individual has given consent, there are other grounds for processing the data, or the new purpose is compatible with the original purpose.

Right of access

The right of access by individuals to their personal data under the GDPR may be asserted at any time. For hospitals using AI solutions, this means that they must be able to provide information about the patient's personal data at any time upon request. A data management system to enable prompt response to subject access requests will be essential.
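A data management system of the kind described above could, at its simplest, gather everything each hospital store holds on one patient, including AI-generated outputs, into a single portable export. The sketch below assumes a layout of named in-memory stores keyed by patient ID; the store names and fields are invented for illustration.

```python
import json


def export_patient_data(stores: dict, patient_id: str) -> str:
    """Answer a subject access request: collect the patient's records
    from every named data store and return them as one JSON document
    suitable for handing to the data subject."""
    export = {name: store[patient_id]
              for name, store in stores.items()
              if patient_id in store}
    return json.dumps(export, indent=2, default=str)


stores = {
    "admissions": {"p1": {"admitted": "2019-06-01"}},
    "ai_outputs": {"p1": {"risk_score": 0.7}},   # AI-generated data is included too
    "radiology":  {"p2": {"scan": "ct"}},        # other patients are excluded
}
print(export_patient_data(stores, "p1"))
```

In a real hospital the stores would be databases and the export would also need to cover recipients, retention periods and processing purposes, but the principle of being able to answer "what do we hold on this patient, right now?" is the same.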

Withdrawal of consent

For consent to be valid under the GDPR, the data subject must be able to withdraw it at any time without detriment. The Article 29 Working Party (now the European Data Protection Board) noted specifically in its consent guidelines that this makes consent unsuitable as a lawful basis in circumstances where, for example, data is being used for medical research.

Other applicable laws in Germany

It is worth remembering that it's not just the GDPR which will be important; other laws may apply. In Germany, hospitals subject to public law are often required to comply with more onerous requirements than private ones. This is because data protection law only partially applies to public institutions and there may be specific requirements for hospitals in accordance with local law. Church data protection regulations must also be observed if the organisation is church-based.

Another aspect to be considered when using third-party applications in hospitals is the handling of confidentiality, in particular patient secrecy for medical professionals. In the past, breaches of professional secrecy were subject to criminal sanctions, which made it difficult to use software solutions that could transmit patient data to providers. A new regulation allows doctors and hospitals to fall back on external service providers to an unprecedented extent. However, it must be borne in mind that the new provisions of criminal law have no effect on the permissibility of the use of service providers under data protection law.

Innovation v regulation

AI, machine learning and big data can improve the quality of healthcare by processing existing information, records, symptoms and treatments and analysing outcomes, enabling faster and more effective prevention, diagnosis and treatment, with the added benefit of a flawless memory.

Taking advantage of the opportunities offered now and in future by AI in the field of healthcare will further the development of digitised medicine. At the same time, progress can only be made in accordance with the law as it adapts to disruptive technology. Some may see the law as restraining progress in AI, but others will welcome the caution which the law imposes on this rapidly developing and sensitive application of AI, machine learning and the use of big data.

If you have any questions on this article please contact us.
