As digital products, monitoring aids and an increasing use of AI become prevalent in the healthcare system, product developers and regulators must ask tough questions about diversity and bias.
To avoid a healthcare system operating using machines with baked-in biases, product developers, regulators and healthcare providers must focus on the system's raison d'être: improving the health and treatment options for the entire patient population.
Biases in the healthcare system and the dangers of not listening meaningfully to patient experiences were highlighted by Baroness Cumberlege in her 2020 report, 'First Do No Harm'. The report raised the issue that certain parts of the patient population have no real voice in the development of medicines and medical devices.
The report analysed how the English healthcare system responded to patient complaints about three products, all of which caused harm to women or their unborn children. The report was highly critical, finding that concerns and symptoms were dismissed as being "women's problems". The lack of action taken by the Medicines and Healthcare products Regulatory Agency (the "MHRA") and healthcare providers in response to complaints was criticised by Cumberlege and listed as one of several areas for future action.
The digitalisation of the medical sector will only exacerbate these problems unless product developers, regulators and healthcare providers take into account the voices of the wider patient population. Artificial intelligence and digital products can only be as impartial as the data used to train their algorithms. If that data is imbalanced in terms of gender, race, ethnicity, age and level of education attained, these biases will effectively be built into new technologies.
By way of example, researchers are concerned that algorithms designed to detect skin cancer are worse at detecting cancers in darker skin tones, due to the algorithms being trained primarily on individuals with lighter skin. Product developers, regulators and healthcare providers must be mindful to address these biases. This change must permeate medicines and medical devices at all stages of their life cycles, from involving a diverse range of clinical trial participants, to meaningfully listening and acting on patient experiences post market.
The lack of diversity in product development results from several sources, the first of which is non-diverse clinical trials. Characteristics such as race, age, gender and even level of education attained may affect how somebody responds to a medicine or medical device. Despite this, clinical trials predominantly involve white, male participants. Consequently, most patients using medicines and medical devices will not have been represented in the investigative trial. To spot issues that affect particular demographic groups, clinical trial participants must become more representative of the eventual patient population.
Secondly, once a product is on the market, post-market surveillance requires healthcare providers to take careful note of adverse events and to report and act on them. This action might include a change to instructions for prescribers or for patients, or a change to the product itself. Researchers have found, as noted by the Cumberlege Review, that the health concerns of women are not taken as seriously as those of men. For example, one US study found that women hospitalised for the most serious type of heart attack are more than twice as likely to die as men. This was attributed to women receiving fewer interventions than men, partly because their symptoms are often less obvious. Failure to listen to women and/or minority groups about their experiences with drugs and devices can lead to biases affecting the safety of products for particular groups.
Biases and lack of representation in both clinical data and post-market data are a known issue, and when "baked into" a digital health product via its programming, they can have an ongoing and potentially detrimental effect on the safety and efficacy of products for certain populations.
It is critical to avoid our healthcare system relying on products that are programmed to further marginalise the marginalised. Product developers, the MHRA and healthcare providers need to engage and collaborate more with both patients and data scientists to ensure algorithmic fairness. This should help to facilitate a more equitable healthcare system which caters not just for commercial interests, but which also prioritises patient safety for all.
Digital products might themselves be the facilitators of this goal. Clinical trial populations are currently drawn from areas near centres of academic clinical medicine. These populations tend to be less diverse both ethnically and socially. The fixed hours of a clinical site also mean that women juggling work and childcare responsibilities can find participation impossible. But a mobile healthcare team paired with digital communication and monitoring tools can reach a geographically wider and more diverse population. During the pandemic, the MHRA showed flexibility in allowing clinical trials to be undertaken away from clinical trial sites. If the MHRA accepts going forward that not all clinical trials require a fixed trial site, but that some can be safely carried out via more flexible and mobile means, a more diverse population can be engaged and represented in the trial data used to develop products.
Equally, the use of digital technology for reporting and analysing complaints about drugs and medical devices will both facilitate and speed up the observation of trends which need addressing. To this end, the Yellow Card scheme, which is the means for reporting complaints and adverse events to the MHRA, is now available as an app. However, an app cannot remove biases in the judgements of healthcare providers as to whether a complaint is 'real' or not.
To remove these biases, we advocate a change in the law so that all adverse events seen by healthcare providers must be reported. A doctor determining whether a product complaint from a female patient is merely 'women's problems' or a reportable adverse event might feel obliged to set aside their biases when faced with a fine or criminal action for failure to report. This legal change would lead to the MHRA and product manufacturers holding much richer datasets from which to extrapolate and determine the actions to take to ensure the effectiveness and safety of their products for a more diverse population.
Building health technologies – including AI-based products – without absorbing biases presents a challenge for everyone involved in the health system. Digital products should be leveraged to reach more diverse populations for clinical trials and post-market data. Changes in the regulatory system for both clinical trials and post-market reporting will go a long way to supporting a digital revolution that has the potential to reduce the biases highlighted by the Cumberlege report.
To discuss the issues raised in this article in more detail, please reach out to a member of our Life Sciences & Healthcare team.