by Lizzie Remfry
Whitehead Full Report: Equity in Medical Devices: An Independent Review
The Whitehead Review, released in March 2024, was an independent review into equity in medical devices. It was prompted by concerns about biases in medical devices used during the COVID-19 pandemic, in particular the pulse oximeter, a device that measures blood oxygen levels, which was found to perform less accurately in patients with darker skin tones. The report focuses predominantly on equity in optical medical devices, which include pulse oximeters, but also contains important sections on AI-enabled medical devices and polygenic risk scores in genomics, both highlighted as critical areas with growing evidence of bias.
AI-enabled medical devices have taken the world by storm and are being released at a rapid pace. Although the use of AI is relatively new in the NHS, there are already examples of it being used to predict hospital readmissions, diagnose cancer and recommend treatments, and its use in healthcare is only expected to grow. AI can be hugely beneficial, but how do we ensure that these benefits are distributed equitably?
The Whitehead Review highlights key areas that the Data Science for Health Equity (DSxHE) community cares about and is actively engaged with. At DSxHE our vision is to build a world where data improves everyone’s health. We aim to do this by bringing together people working at the intersection of data science and health inequalities, to ensure that the latest research, innovations and practice improve health equity.
The review mentions Data Science for Health Equity (DSxHE) as one of many collaborative efforts towards the equitable development and use of AI in healthcare, alongside research institutions such as the Alan Turing Institute and the Ada Lovelace Institute, and organisations such as the World Health Organisation and the NHS AI Lab, all of which have published research, recommendations or frameworks on equity in healthcare AI.
The report lays out seven recommendations for preventing bias in AI-assisted medical devices. Below we outline some of these recommendations, highlight where we think DSxHE could help, and set out what we can do moving forward.
AI developers and stakeholders should engage with diverse groups of patients and the public, and ensure they are supported to contribute to the co-design of AI-enabled devices. At DSxHE we have a Participatory Research Theme which offers guidance on putting participatory research into action and placing communities with lived experience at the heart of equitable design processes. The theme is also collating resources, tips and real-life examples that can help elevate practice in participatory research.
To improve the understanding among stakeholders of equity in AI-assisted medical devices through the development of materials and training for lay and professional stakeholders (students, health professionals, patients, computer scientists, clinical guideline bodies, etc.). This is a critical and often overlooked area that rings true for many students across the UK: a recent survey of technical machine learning courses offered at universities found that only 12% included ethics-related content. There is also a lack of materials for use with patients and members of the public, another gap that our Participatory Research Theme is focused on. DSxHE has also developed materials and workshop content around fairness and ethics in healthcare AI; if you’re interested in using these or want to know more, please get in touch!
To work together to ensure that best practice guidance, assurance and governance processes support the reduction of bias, particularly around the evaluation and auditing of AI-assisted devices, design checklists and the development of methodologies to identify and eliminate bias. This moves beyond data and focuses on other parts of the development and deployment process that can address bias, particularly where methodological advances are needed. Our DSxHE Statistical Methods Theme focuses on investigating and developing statistical methods that can help us use data to better understand health inequalities and promote health equity, and runs regular bi-weekly and monthly events that bring together experts in the field to discuss and share methods.
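As a purely illustrative example of what auditing an AI-assisted device for bias can look like in practice, the minimal Python sketch below compares a model’s sensitivity (true positive rate) across patient subgroups and flags any group that falls well behind the best-performing one. The data, the group labels and the 0.10 gap threshold are all hypothetical assumptions made for illustration; they are not taken from the Whitehead Review or from DSxHE’s methods.

```python
# A minimal, hypothetical sketch of a subgroup performance audit.
# The data and the 0.10 gap threshold are illustrative only.
import pandas as pd

# Hypothetical evaluation set: true labels, model predictions, and a
# protected attribute (e.g. a skin tone or ethnicity category).
df = pd.DataFrame({
    "y_true": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
    "y_pred": [1, 0, 1, 1, 0, 1, 0, 0, 0, 0],
    "group":  ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
})

# Sensitivity (true positive rate) within each subgroup.
sensitivity = {}
for name, g in df.groupby("group"):
    positives = g[g["y_true"] == 1]
    sensitivity[name] = (positives["y_pred"] == 1).mean()

print("Sensitivity by group:", sensitivity)

# Flag subgroups whose sensitivity falls well below the best-performing group.
best = max(sensitivity.values())
flagged = [name for name, s in sensitivity.items() if best - s > 0.10]
print("Subgroups needing further investigation:", flagged)
```

In a real evaluation this kind of check would be run on a properly powered, representative dataset, with confidence intervals and clinically meaningful thresholds agreed in advance rather than an arbitrary gap.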
To prepare and plan for the disruption that generative AI and foundation models will bring to medical devices, and the potential impact on equity. With the boom taking place in generative AI and foundation models, this recommendation is critical. Many guidelines and standards in place for the use of AI in healthcare, such as TRIPOD-AI, are already out of date because they do not cover generative AI. At DSxHE we would like to set up a Theme focused on generative AI, and we call for anyone interested in volunteering or collaborating on this theme to get in touch.
DSxHE is ready to act on the Whitehead Review’s recommendations, and as a community we are eager to work with individuals and organisations to make meaningful progress towards reducing bias in AI-assisted medical devices. If you’d like to find out more about DSxHE and how you or your organisation can get involved, you can:
- Peruse our website
- Subscribe to our monthly newsletter (scroll down to the bottom)
- Join our Slack workspace
- Get involved with a Theme: http://www.dsxhe.com/themes