
Real world effect of bias in medical devices

The Health Secretary, Sajid Javid, has ordered a review into whether oximeters (which have been used to measure oxygen levels in COVID-19 patients) overstate the level of oxygen in the blood of people from ethnic minorities, meaning they are less likely to receive the treatment they need.

Oximeters work by transmitting light through a patient's finger. It is thought that skin pigmentation affects how that light is absorbed, and therefore also the readings the device gives.
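
For readers curious about the mechanism, pulse oximeters are commonly described as estimating oxygen saturation (SpO2) from the ratio of red to infrared light absorbed by pulsing blood, mapped to a percentage through an empirically calibrated curve. The short Python sketch below illustrates that "ratio of ratios" calculation; the signal values and the linear calibration constants (110 and 25) are illustrative assumptions, not the parameters of any particular device.

# A minimal sketch of the "ratio of ratios" SpO2 estimate used in
# pulse oximetry. All numbers here are illustrative assumptions;
# real devices use their own empirically calibrated curves.

def estimate_spo2(ac_red, dc_red, ac_ir, dc_ir):
    # Pulsatile (AC) component normalised by the steady (DC) component
    # for each wavelength, then compared as a ratio of ratios.
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    # A commonly quoted linear approximation of the calibration curve;
    # the constants 110 and 25 are assumptions for illustration only.
    return 110.0 - 25.0 * r

# Hypothetical readings: a change in how much red light is absorbed
# (e.g. due to skin pigmentation) shifts r, and with it the reported
# saturation, even though the patient's true oxygen level is unchanged.
print(estimate_spo2(ac_red=0.02, dc_red=1.0, ac_ir=0.03, dc_ir=1.0))   # ~93.3
print(estimate_spo2(ac_red=0.025, dc_red=1.0, ac_ir=0.03, dc_ir=1.0))  # ~89.2

Because the calibration curve is derived empirically from study populations, any systematic difference in light absorption across skin tones can shift the ratio, and with it the reported saturation.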

The concern in this case reflects the worries many parties have expressed about AI medical devices, which are often trained on data that does not reflect the communities in which the device will be used. In an article we published earlier this year, we explained how AI systems can produce biased results, leading to varying quality of outcomes for different sexes, ethnic communities, age ranges or other groups. This is particularly dangerous given the general public's perception that AI systems are inanimate and therefore cannot be biased in the same way that humans can be.
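
To make the point concrete, the sketch below (again in Python, with entirely hypothetical data) shows the kind of per-group check that surfaces this sort of bias: a model can look acceptable on aggregate figures while having a markedly higher error rate for one group.

# A minimal sketch, with hypothetical data, of checking a model's
# error rate per demographic group. The group names, labels and
# predictions are all invented for illustration.
from collections import defaultdict

records = [
    # (group, true_label, predicted_label) -- all values hypothetical
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 0, 0), ("group_b", 1, 1), ("group_b", 1, 0),
]

errors = defaultdict(lambda: [0, 0])  # group -> [mistakes, total]
for group, truth, predicted in records:
    errors[group][0] += int(truth != predicted)
    errors[group][1] += 1

for group, (mistakes, total) in errors.items():
    print(f"{group}: error rate {mistakes / total:.0%}")
# group_a: error rate 0%
# group_b: error rate 50%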

This topic was also addressed at our recent Life Sciences Summit on AI in Healthcare. The accompanying White Paper for the event can be found here.

Javid said there was racial bias in some medical instruments, adding: "It's unintentional but it exists."

Tags

health tech, artificial intelligence, data protection and privacy, life sciences regulatory