
29 October 2021

Femtech and issues around digital health products – 4 of 5 Insights

'Femtech' – getting data protection right in health apps

Debbie Heywood looks at privacy issues with femtech apps and at how getting compliance right can help address some of the issues of bias in digital health products.

Debbie Heywood

Senior Counsel – Knowledge

mHealth apps targeted largely at people assigned female at birth tend to focus on reproductive health, from period tracking, contraception and fertility to the management of particular types of cancer and the menopause.

This data is highly personal, and people are unlikely to take advantage of new products if they can't trust them to keep their data safe or don't understand exactly what it's being used for. Yet the data these apps collect can be a vital tool in resolving some of the issues around bias in the development of health products for women. Leveraging it in a way which aids research while preserving privacy and trust for individuals requires a careful balancing act and excellent communication with those supplying the data.

Privacy concerns

At the end of 2020, period trackers hit the news after Privacy International revealed its concerns about the amount of personal data they collect. The privacy campaign group had made subject access requests to the companies behind five apps in an attempt to discover what personal data they held and to whom it was disclosed. The exercise raised a number of concerns, including around transparency (especially in relation to disclosure to third and fourth parties), the amount of data collected, and the fact that it was largely stored on servers rather than on user devices.

More recently, period tracker Flo, which has more than 100 million users, was the subject of a US Federal Trade Commission complaint alleging the company behind it had misled users about the extent of its data sharing. The FTC said Flo had shared data extensively with third-party companies. Flo settled with the FTC although it did not admit to wrongdoing.

Of course, for US users, Flo is not bound by the GDPR, but the concerns raised by the FTC are typical of privacy issues around femtech and other health apps. The majority of health apps targeted at women will process personal data which is protected by the UK GDPR and the EU GDPR in the UK and EU respectively (referred to here as the GDPR for ease). Because this type of data is particularly sensitive, it is given additional protection as 'special category data'.

Meeting GDPR compliance requirements can be complex, but getting it wrong can be fatal in reputational terms as well as attracting financial penalties. Paying particular attention to the following areas can help femtech and other health apps build user trust and achieve regulatory compliance.

Top tips

Privacy by design and default – get it right at the outset

Data protection has to be considered at the earliest stages of app development and built in from the start. Privacy settings should be set to their highest levels by default.
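
By way of illustration only, here is a minimal sketch in Python of what 'highest privacy settings by default' might look like in practice – all setting names are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical privacy settings for a femtech app. Every field defaults to
# the most privacy-protective value; anything less protective requires an
# active, informed opt-in from the user.
@dataclass
class PrivacySettings:
    share_with_researchers: bool = False  # opt-in only, never pre-ticked
    personalised_ads: bool = False
    cloud_backup: bool = False            # data stays on the device by default
    usage_analytics: bool = False

# A new user starts with everything switched off.
settings = PrivacySettings()
```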

Carry out data protection impact assessments (DPIAs)

DPIAs will be required for many healthcare apps, but even where they aren't mandatory, they are a useful exercise to help focus on the potential risks of planned processing to data subjects, and on ways to reduce those risks. They are also vital to help meet the accountability principle, which requires you to be able to demonstrate GDPR compliance.

Identify your lawful basis for processing and the condition for processing special category data

Each processing operation must be carried out under one of the Article 6 lawful bases for processing personal data. In addition, where special category data (which includes health data) is being processed, an Article 9(2) condition must be met. This can be a tricky process, particularly because valid GDPR consent can be difficult to obtain in a health context, so it needs to be carefully thought through. The personal data cannot then be processed for a purpose which is incompatible with the original purpose for which it was collected.
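
To illustrate the kind of mapping this exercise produces, here is a hypothetical record (in Python) linking each processing operation to an Article 6 basis and an Article 9(2) condition – example entries only, not legal advice:

```python
# Hypothetical records-of-processing entries for a femtech app, mapping each
# operation to an Article 6 lawful basis and an Article 9(2) condition.
PROCESSING_REGISTER = [
    {
        "operation": "store cycle and symptom entries",
        "article_6_basis": "consent (Art 6(1)(a))",
        "article_9_condition": "explicit consent (Art 9(2)(a))",
    },
    {
        "operation": "share aggregate, anonymised statistics with researchers",
        "article_6_basis": "n/a - truly anonymised data falls outside the GDPR",
        "article_9_condition": "n/a",
    },
]
```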

Minimise processing of personal data

How much of the data you process really needs to link to an identifiable individual? This will depend on the nature of your app. If it is used to help deliver tailored medical treatment, then the data is more likely to need to be personal. If your data relates to more generalised issues – for example, tracking menstrual cycles and symptoms – does it really need to be personal? Consider whether it can be anonymised or, at least, pseudonymised. And do you really need the individual to register?
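
As a simple sketch of pseudonymisation (all names hypothetical; note that pseudonymised data is still personal data under the GDPR, so this reduces risk rather than removing the data from scope):

```python
import hashlib
import hmac
import os

# The pseudonymisation key is held separately from the dataset; without it,
# stored records cannot easily be linked back to an account.
PSEUDONYMISATION_KEY = os.environ["PSEUDO_KEY"].encode()

def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash before storage."""
    return hmac.new(PSEUDONYMISATION_KEY, user_id.encode(), hashlib.sha256).hexdigest()

record = {
    "user": pseudonymise("account-12345"),  # no direct identifier stored
    "cycle_length_days": 29,                # keep only the fields you need
}
```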

Can you leave the data on the individual's device?

Many of the COVID contact tracing apps have taken a decentralised approach, which means the personal data they collect stays on the user's device. While we can debate the overall usefulness of contact tracing apps, a decentralised approach has helped build public confidence that personal data won't be misused. If you need to store data on a server rather than on a user device, can you anonymise or pseudonymise it?
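
A minimal sketch of the decentralised approach – entries are written to a local database in the app's private storage and, on these assumptions, never leave the device unless the user opts in:

```python
import sqlite3

# Cycle entries are stored in a local SQLite file in app-private storage.
# Nothing here is uploaded to a server.
conn = sqlite3.connect("local_app_data.db")
conn.execute("CREATE TABLE IF NOT EXISTS cycle_log (entry_date TEXT, symptoms TEXT)")
conn.execute("INSERT INTO cycle_log VALUES (?, ?)", ("2021-10-29", "cramps"))
conn.commit()
conn.close()
```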

Data security should be a top priority

The more sensitive the nature of the data being processed, the tighter the security should be. Any data breach can attract regulator scrutiny and cause reputational damage, but failure to protect health data is particularly likely to set off alarms.
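
One illustrative measure is encrypting sensitive fields at rest. The sketch below uses the widely available Python cryptography library; key management is deliberately simplified, and in production the key would sit in a key management service rather than alongside the data:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Authenticated symmetric encryption of a sensitive field at rest.
key = Fernet.generate_key()  # in practice: fetched from a KMS or secure enclave
cipher = Fernet(key)

token = cipher.encrypt(b"symptom: heavy bleeding")  # ciphertext is what gets stored
plain = cipher.decrypt(token)                       # decrypted only when needed
```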

Be transparent about what you are doing

Transparency is essential to avoid 'data shock'. Beyond this being a central GDPR requirement, it is common sense. If it is clear to the user what is happening to their data, they are less likely to have a problem with it or to discover at a later stage that their data is being used in unexpected ways. For femtech apps, the challenge can be conveying this information in a digestible form so be creative – ICO guidance may help.

Sort out your processes for giving effect to data subject rights

Users should have control over their data. They should, for the most part, be able to access it, correct it and delete it. You need to have the processes in place to enable them to do that.
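
A toy, in-memory sketch of what access and erasure routines might look like – real systems would also need to reach data held by processors and in backups:

```python
# Illustrative in-memory 'databases'; all names are hypothetical.
accounts = {"u1": {"email": "user@example.com"}}
health_entries = {"u1": [{"date": "2021-10-29", "symptom": "cramps"}]}

def export_user_data(user_id: str) -> dict:
    """Subject access: return everything held about the user."""
    return {
        "account": accounts.get(user_id),
        "entries": health_entries.get(user_id, []),
    }

def erase_user(user_id: str) -> None:
    """Erasure: remove the user's records (processors must be notified too)."""
    accounts.pop(user_id, None)
    health_entries.pop(user_id, None)

print(export_user_data("u1"))  # access request
erase_user("u1")               # deletion request
```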

Be accountable

It is not enough to comply with the GDPR; you must also be able to demonstrate compliance, which requires clear policies, procedures and an audit trail.
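
A tiny illustrative audit-trail function showing the sort of evidence an accountability framework relies on – field names are hypothetical:

```python
import json
from datetime import datetime, timezone

def log_access(actor: str, user_id: str, action: str, purpose: str,
               path: str = "audit.log") -> None:
    """Record who did what to whose data, when and why."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "user": user_id,
        "action": action,
        "purpose": purpose,
    }
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")

log_access("support-agent-7", "u1", "read", "responding to a support ticket")
```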

Take particular care over data transfers

This isn't just an issue where the data is leaving the EU or UK (as the case may be), although that certainly brings in a range of additional considerations. It's also important to consider which third parties are getting access to the data and why, as well as who they might pass it on to. Again, anonymisation can help you leverage the data without creating additional risks to the rights and freedoms of data subjects.

Use it or lose it

Personal data should not be held for longer than necessary for the purpose for which it was originally collected. If you genuinely anonymise it, it is no longer personal data and the restriction no longer applies.
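
A minimal sketch of a scheduled retention job – the retention period shown is illustrative and should be set per processing purpose:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # illustrative; define per processing purpose

records = [
    {"user": "u1", "created": datetime(2020, 1, 5, tzinfo=timezone.utc)},
    {"user": "u2", "created": datetime(2021, 9, 1, tzinfo=timezone.utc)},
]

# Keep only records still within their retention period; expired records
# are deleted (or could instead be anonymised for research use).
cutoff = datetime.now(timezone.utc) - RETENTION
records = [r for r in records if r["created"] >= cutoff]
```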

Think ahead

It's tempting, given the value of data (financial or otherwise), to collect as much of it as possible for as wide a range of purposes as you can. While the GDPR aims to prevent this, one of the interesting points highlighted by Privacy International's investigation into period tracker apps was that some of them were doing everything in accordance with their privacy policies; users had agreed to all of it. Yet Privacy International still felt that users would be shocked if they really understood how much intimate personal data was being processed and transferred to third parties. 

App providers need to think about whether they really need all the data they collect in order to fulfil their purpose. If not, on what basis are they processing it? Apps may want to collect additional data to provide their users with a more tailored experience, but they have to be able to justify this as a real and desired benefit to users. The principles of data minimisation and purpose limitation must be observed.

Data for the greater good?

As the COVID pandemic has highlighted, enormous research value can be found in the data submitted to health apps. The ZOE COVID Study app, run with researchers at King's College London, has been a vital tool in monitoring the pandemic, but it is made very clear to users that the data they submit will be used for research purposes.

Research into women's health and into how women respond to medicines has often been shamefully lacking. In her book Invisible Women, which explores this issue in detail, Caroline Criado Perez noted, for example, that Viagra was found to alleviate period pain in early trials, but that avenue of research was dropped once the drug was also found to treat erectile dysfunction.

Femtech presents an opportunity to be part of correcting the imbalance and many apps – including menopause app Balance and period tracker Clue – are upfront about working with scientific researchers, providing opportunities to opt out and being clear about steps taken to protect and/or anonymise data.

Many people would be willing for their data to be used to help further research, provided the principles outlined above are observed. But when an app purports to offer users a benefit – for example, tracking menopausal symptoms – and does not make clear that the data may also be used for research purposes, it will breach data protection law.

The UK is currently consulting on changes to its data protection regime now that it is no longer required to follow the EU's GDPR. One of the areas under consideration is defining "scientific research" and clarifying the lawful bases which can be used to justify processing data for research purposes. The government is also considering widening the scope of consent for re-use of research data for further processing, and clarifying when further processing is compatible with the original purposes for which the data was collected – for example, where it serves an important public interest. There are also proposals around the processing of health data by public bodies.

Meanwhile, the EU is also looking to increase the availability of data for research through its Data Governance Act, which is going through the legislative process.

If these changes go ahead, will they make users more or less willing to give their data to femtech products? Much will depend on how much emphasis is placed on protecting and anonymising that data.

Find out more

If you would like to discuss any of the issues raised in this article in more detail, please contact a member of our Life Sciences & Healthcare or Data Protection & Cyber teams.
