The UK health secretary, Sajid Javid, has announced a review into systemic racism and gender bias in medical devices, in response to concerns that such bias could contribute to poorer outcomes for women and people of colour.
Javid said it was easy to look at a machine and assume everyone is getting the same experience, but that bias can be an issue in technology too, because technologies are designed and developed by people.
Concerns about racial bias have been raised over several of the devices used in healthcare.
Pulse oximeters
Pulse oximeters, which measure the amount of oxygen in a person's blood, have been a vital tool in determining which Covid patients need hospital care.
Concerns have been raised that the devices do not work as well for patients with darker skin: according to the MHRA, pulse oximeters can overstate the amount of oxygen in their blood.
The devices were designed for Caucasian patients, according to Javid, who said that if you were black or brown you were less likely to end up on oxygen as a result.
Experts believe these inaccuracies, alongside other factors, could have contributed to the higher death rates seen among ethnic minorities.
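Where paired readings from a pulse oximeter and an arterial blood gas are available, the scale of the discrepancy can be checked directly. The sketch below is illustrative only: the file and column names are assumptions, and the definition of "occult hypoxaemia" used here (true arterial saturation below 88% despite an oximeter reading of 92–96%) follows a threshold commonly used in published comparisons rather than any official standard.

```python
# Sketch: how often does a pulse oximeter overstate blood oxygen, per group?
# Assumes a CSV of paired readings with hypothetical columns:
#   spo2  - pulse-oximeter saturation (%)
#   sao2  - arterial blood-gas saturation (%)
#   group - self-reported ethnicity
import pandas as pd

readings = pd.read_csv("paired_oximetry_readings.csv")  # hypothetical file

# Keep readings that look reassuring on the oximeter (92-96%).
plausible = readings[readings["spo2"].between(92, 96)]

# "Occult hypoxaemia": the true arterial saturation is actually below 88%.
occult = plausible["sao2"] < 88

summary = (
    plausible.assign(occult=occult)
    .groupby("group")["occult"]
    .agg(rate="mean", n="size")
)
print(summary)  # proportion of 'normal-looking' readings hiding low true saturation
```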
Respirator masks
Medical-grade respirators offer protection to the wearer against both large and small particles that others exhale, so they are crucial to keep healthcare workers safe from Covid.
Respirators can provide adequate viral protection only if they fit the wearer's face properly, but research has shown that they do not fit as well on people from ethnic minority groups: initial fit pass rates vary between 40% and 90%, and are especially low among female and Asian healthcare workers.
Studies of PPE fit have mostly focused on Caucasian or single-ethnicity populations, with BAME people under-represented, which limits comparisons between ethnic groups.
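One place such gaps show up in practice is routine fit-testing records. The following sketch assumes hypothetical records of quantitative fit tests with self-reported demographics; the pass threshold of 100 is typical for tight-fitting respirators, but the exact value depends on the respirator class and local fit-testing protocol.

```python
# Sketch: compare initial fit-test pass rates across demographic groups.
# Assumes fit-test records with hypothetical columns:
#   fit_factor        - result of a quantitative fit test
#   sex, ethnicity    - self-reported demographics
import pandas as pd

PASS_THRESHOLD = 100  # assumption; check local guidance for the respirator in use

tests = pd.read_csv("fit_test_records.csv")  # hypothetical file
tests["passed"] = tests["fit_factor"] >= PASS_THRESHOLD

by_group = (
    tests.groupby(["ethnicity", "sex"])["passed"]
    .agg(pass_rate="mean", n="size")
)
print(by_group.sort_values("pass_rate"))  # lowest pass rates first
```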
Spirometers
Experts have raised concerns about racial bias in the way data gathered from spirometers, devices that measure lung function, is interpreted.
[Photograph: a woman using a spirometer. Credit: Justin Tallis]
Writing in the journal Science, Dr Achuta Kadambi, an electrical engineer and computer scientist at the University of California, Los Angeles, noted that black and Asian people are assumed to have lower lung capacity than white people, so "correction" factors are applied when interpreting spirometer data, and these can affect the order in which patients are treated.
If a Black patient's lung capacity is measured as lower than a white patient's, the correction means the result can be read as expected for that patient rather than as a warning sign, so a treatment plan could end up prioritising the white patient.
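To make the mechanism concrete, the toy calculation below shows how a race-based "correction" factor shifts the interpretation of an identical measurement. The numbers are illustrative assumptions, not clinical reference values: older practice reduced predicted lung volumes for Black patients by roughly 10–15%, and it is that kind of adjustment being sketched here.

```python
# Sketch: how a race-based correction factor changes spirometry interpretation.
# All numbers below are illustrative, not reference values.

def percent_predicted(measured_fev1, predicted_fev1, race_correction=1.0):
    """FEV1 as a percentage of the (possibly race-'corrected') predicted value."""
    return 100 * measured_fev1 / (predicted_fev1 * race_correction)

measured = 2.4    # litres: the same measured FEV1 for two patients
predicted = 3.2   # litres: predicted from age, height and sex alone

uncorrected = percent_predicted(measured, predicted)                        # ~75%
corrected = percent_predicted(measured, predicted, race_correction=0.88)    # ~85%

print(f"Without race correction: {uncorrected:.0f}% of predicted")
print(f"With a 12% race 'correction': {corrected:.0f}% of predicted")
# The identical measurement looks closer to 'normal' once the correction is
# applied, which can push that patient further down a treatment priority list.
```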
Kadambi also said that remote plethysmography, a technology in which pulse rates are measured from changes in skin colour captured on video, may be affected by racial bias, because the visual signals it relies on can be biased by skin tone.
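The underlying idea can be sketched in a few lines: average the colour of a patch of skin in each video frame, filter the resulting signal to plausible pulse frequencies, and read off the dominant frequency. The function below is a minimal illustration under assumed inputs (a frame array and a pre-located skin region, standing in for a real camera pipeline and face detector); in practice the colour variation is fainter against darker skin, so the extracted signal is noisier, which is where the bias Kadambi describes can enter.

```python
# Sketch of remote plethysmography: pulse rate from colour changes in video.
# `frames` is assumed to be an array of shape (T, H, W, 3); `roi` is a pair of
# slices over a cheek/forehead region found by some face detector (not shown).
import numpy as np
from scipy import signal

def estimate_heart_rate(frames, roi, fps=30.0):
    # Mean green-channel intensity over the skin region, one value per frame.
    green = frames[:, roi[0], roi[1], 1].mean(axis=(1, 2))

    # Remove the slow trend, then keep only frequencies plausible for a pulse
    # (roughly 0.7-4 Hz, i.e. about 42-240 beats per minute).
    green = signal.detrend(green)
    b, a = signal.butter(3, [0.7, 4.0], btype="bandpass", fs=fps)
    pulse = signal.filtfilt(b, a, green)

    # The dominant frequency of the filtered signal is the heart-rate estimate.
    spectrum = np.abs(np.fft.rfft(pulse))
    freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
    return 60.0 * freqs[np.argmax(spectrum)]  # beats per minute
```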
Artificial intelligence systems
Artificial intelligence is increasingly being developed for healthcare applications, and there are concerns that biases in the data used to build such systems could make them less accurate for people of colour.
Concerns have been raised, for example, about the use of artificial intelligence in diagnosing skin cancer: few of the freely available image databases that could be used to develop such systems are labelled with skin type, and among those images that are labelled, only a small number were recorded as showing dark brown or black skin.
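Auditing a dataset for skin-type coverage is straightforward when the labels exist, and missing labels are precisely the problem described above. The sketch below assumes a hypothetical metadata file with Fitzpatrick skin-type, ground-truth and prediction columns; the point is the breakdown by group, not the specific column names.

```python
# Sketch: audit a dermatology image dataset for skin-type representation and,
# where predictions exist, check model accuracy per group.
# Column names ("fitzpatrick_type", "label", "prediction") are assumptions about
# how such metadata might be recorded.
import pandas as pd

records = pd.read_csv("skin_lesion_metadata.csv")  # hypothetical metadata file

# How many images exist for each Fitzpatrick skin type (I lightest .. VI darkest)?
print(records["fitzpatrick_type"].value_counts(dropna=False))

# If model predictions are available, accuracy broken down by skin type shows
# whether performance degrades for darker skin.
if {"label", "prediction"} <= set(records.columns):
    per_group = (
        records.assign(correct=records["label"] == records["prediction"])
        .groupby("fitzpatrick_type")["correct"]
        .agg(accuracy="mean", n="size")
    )
    print(per_group)
```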
Javid has acknowledged the issue: last month he announced new funding for projects to tackle racial inequalities in healthcare, such as the detection of diabetes, and noted that one area of focus would be the development of standards.
"If we only train our artificial intelligence using data from white patients it will not help the population as a whole," he said. "We need to make sure the data we collect is representative of our nation."