Bad Robots

Bad Robot Outcome: AI determines a heart attack for men = panic attack for women

The Story

In September 2019, the Times reported a complaint against the HealthTech start-up Babylon Health (endorsed by the NHS), noting that the app told a 60-year-old female smoker who reported sudden-onset chest pain and nausea that she was probably having a panic attack. However, the Times noted, a 60-year-old male smoker with the same symptoms was told that he might be having a heart attack. The woman was advised to contact her GP within 6 hours if the symptoms persisted; the man was told to go to the Emergency Department.

The app was heavily criticised for being biased against women.

The company published a blog post responding to these accusations of gender bias, noting:

“We know that it looked bad – ‘There she goes, being all hysterical again… Have a cup of tea, dear’, as she fades away silently in the corner”…


The Babylon blog goes on to state, “For many years, women have been prejudiced against when it comes to cardiac care. They’re less likely to be offered the right treatment after a heart attack and more likely to die as a result of that. Women’s complaints of symptoms such as pain are frequently put down to emotional rather than physical causes and, for many health conditions, men are investigated and treated more extensively than women by doctors. One particular study showed that if a woman were to collapse in public due to a heart attack, she’d be less likely to receive CPR from passersby than a male. Oh yes and…older women are less likely to be admitted to intensive care for life-saving treatment than older men with the same severity of illness.”
The post continues:

“It [AI] also has no opinions on sex and gender. It doesn’t look at the data it receives through a veil of prejudice. In many ways, you could argue that an ‘AI doctor’ is the only type of doctor that isn’t clouded by the unintentional force that is gender bias…or is it? Of course, AI is not perfect. Both it and we are constantly learning. AI is as great as the data that goes into it and despite the fact our data scientists, researchers, epidemiologists and doctors are all specially trained for this role – sometimes the research that provides medical data can exhibit gender bias.”

Our view

Our view at Ethical AI Advisory is that a statement such as AI is only… “as great as the data that goes into it” is completely inappropriate. It is absolutely the responsibility of the researcher, the data scientist, the head engineer, the product manager, the CEO and everyone else in the company to ensure that the data, the algorithms, and the decisions and actions of the AI are fair, unbiased and non-discriminatory. Sort the data out first: remove the bias and skews from the data used to train the algorithm(s). This responsibility belongs to the organisation and to each individual within it.
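One concrete form this responsibility can take is a routine bias audit before deployment: compare the advice a triage model gives for identical symptom presentations across sexes. The sketch below is purely illustrative — the records, field names and advice labels are hypothetical, not Babylon's actual data or schema.

```python
from collections import Counter

# Hypothetical triage records: (sex, symptoms, advice given by the model).
# All data here is illustrative, invented for this example.
records = [
    ("female", "chest pain + nausea", "panic attack"),
    ("male",   "chest pain + nausea", "heart attack"),
    ("female", "chest pain + nausea", "panic attack"),
    ("male",   "chest pain + nausea", "heart attack"),
]

def advice_rates_by_sex(records, symptom):
    """Count how often each advice label is given per sex,
    restricted to one fixed symptom presentation."""
    rates = {}
    for sex, symptoms, advice in records:
        if symptoms == symptom:
            rates.setdefault(sex, Counter())[advice] += 1
    return rates

rates = advice_rates_by_sex(records, "chest pain + nausea")
# If identical symptoms yield systematically different advice by sex,
# that is a red flag to investigate before the model ships.
for sex, counts in sorted(rates.items()):
    print(sex, dict(counts))
```

A real audit would of course use far larger samples and statistical tests for significance, but even a tabulation this simple would have surfaced the disparity described above.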