Bad Robots
Bad Robot Outcome: AI calls it a heart attack for men, a panic attack for women
The Story
In September 2019, the Times reported a complaint against the health-tech start-up Babylon Health (endorsed by the NHS), noting that the app told a 60-year-old female smoker who reported sudden-onset chest pain and nausea that she was probably having a panic attack. However, the Times noted, a 60-year-old male smoker with the same symptoms was told that he might be having a heart attack. The woman was advised to contact her GP within six hours if the symptoms persisted, while the man was told to go to the Emergency Department.
The app was heavily criticised for being biased against women.
The company published a blog post in response to these accusations of gender bias, noting:
“We know that it looked bad – ‘There she goes, being all hysterical again… Have a cup of tea, dear’, as she fades away silently in the corner”…
(Article)
“It [AI] also has no opinions on sex and gender. It doesn’t look at the data it receives through a veil of prejudice. In many ways, you could argue that an ‘AI doctor’ is the only type of doctor that isn’t clouded by the unintentional force that is gender bias…or is it? Of course, AI is not perfect. Both it and we are constantly learning. AI is as great as the data that goes into it and despite the fact our data scientists, researchers, epidemiologists and doctors are all specially trained for this role – sometimes the research that provides medical data can exhibit gender bias.”
Our view
Our view at Ethical AI Advisory is that a statement such as AI is only "as great as the data that goes into it" is completely inappropriate as a defence. It is absolutely the responsibility of the researcher, the data scientist, the head engineer, the product manager, the CEO and everyone else in the company to ensure that the data, the algorithms, and the decisions and actions of the AI are fair, unbiased and non-discriminatory. Sort the data out first: remove the bias and skews from the data used to train the algorithm(s). That responsibility belongs to the organisation and to each person within it.