Article • Preprogrammed bias?

AI and the gender gap: Data holds a legacy of discrimination

Technologies based on artificial intelligence (AI) are considered the epitome of progress. However, the data AI algorithms use to draw their conclusions is outdated.

Report: Karoline Laarmann

This data ignores the existence of biological sex and socio-cultural gender and their effects on individual health and disease. Thus, the ‘thinking machines’ not only reproduce discriminatory biases and prejudices but also produce suboptimal or even outright wrong results. German experts discussed the gender problem in healthcare AI at the Econ Forum by TK at virtual.MEDICA 2020.

Portrait: Dr. Christiane Groß (Image source: Ärztekammer Nordrhein / © Jochen Rolfes)

This was a women-only panel, underlining that it is mostly women who take an interest in the gender issue. The audience, too, reflected the panel’s composition, as Dr Christiane Groß, Chair of the Medical Committee on Telematics North Rhine-Westphalia and President of the female physicians’ association Deutscher Ärztinnenbund, pointed out: ‘In training sessions on gender issues I often mainly see women. But the moment you call the session “men’s healthcare”, the audience is split evenly between women and men.’ The word ‘gender’ is still widely considered a women’s niche topic, although gender-specific differences concern all humans. Men, however, have fewer reasons to feel excluded in medical science, because they are over-represented at all levels: the majority of participants in medical research projects are men, so the training data fed into AI systems is either focused on men or neglects the gender dimension altogether.

When machines think like men

‘The gender bias in medicine has been known for years, but it took the research community quite a while to catch on,’ said Brigitte Strahwald of the Chair for Public Health and Health Services Research at Ludwig Maximilian University, Munich. AI technologies were developed on legacy data that carries gender-related distortions from the analogue into the digital world. ‘This is due, among other reasons, to the gender gap in AI research and development,’ Strahwald pointed out. Algorithms are not only the product of the data sets on which they are based; they are also the product of the humans who program them, the majority of whom are men. Since most healthcare AI applications do not differentiate between sexes and genders, it can be assumed that the IT experts in charge either consider this aspect irrelevant or are not even aware of it.
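To make the mechanism concrete, here is a minimal sketch with entirely hypothetical numbers: a simple model (a single decision threshold on a symptom score) is fitted to a training set in which women are heavily under-represented, and ends up performing noticeably worse for women than for men. All scores, records and the threshold rule are invented for illustration only.

```python
# Illustrative sketch (hypothetical data): a single decision threshold
# fitted on a male-dominated training set performs worse for women.

def fit_threshold(samples):
    """Pick the score threshold that maximises accuracy on `samples`,
    a list of (score, has_condition) pairs, by brute force."""
    candidates = sorted({s for s, _ in samples})
    best_t, best_acc = None, -1.0
    for t in candidates:
        acc = sum((s >= t) == y for s, y in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(samples, t):
    """Fraction of samples the threshold rule classifies correctly."""
    return sum((s >= t) == y for s, y in samples) / len(samples)

# Hypothetical symptom scores: affected men tend to score high,
# while affected women often present with lower scores.
men   = [(8, True), (9, True), (7, True), (3, False), (2, False), (4, False)] * 3
women = [(5, True), (4, True), (6, True), (2, False), (1, False), (3, False)]

# 'Legacy' training set: 18 male records, only 2 female records.
train = men + women[:2]
t = fit_threshold(train)

print(f"threshold={t}, "
      f"accuracy for men={accuracy(men, t):.2f}, "
      f"accuracy for women={accuracy(women, t):.2f}")
```

The fitted threshold sits where it best separates the male records, so some affected women fall below it and are missed. The fix is not a cleverer model but representative training data, which is exactly the point the experts make.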

Gendering as a success factor

Portrait: Barbara Steffens (Photo: Techniker Krankenkasse)

Neglecting the gender dimension is not only a matter of discrimination against women; it also undermines the usefulness of AI. Take the digital health applications that are reimbursed by health insurers under the German Digital Care Act (DVG), which came into effect last October. Barbara Steffens, former health secretary of North Rhine-Westphalia and current head of regional policy support at the large health insurer Techniker Krankenkasse, points out that such ‘non-differentiating apps’ help neither men nor women and only generate costs: ‘Among the first certified apps, we are aware of five health areas in which we know men and women are affected differently, not to mention different perceptions and courses of disease.’ More men than women, for example, suffer from chronic tinnitus, but women experience the condition more intensely. The success of a tinnitus app can therefore be expected to hinge, in part, on whether its contents are prepared and communicated in a gender-sensitive manner. Currently, this is not the case.


Gender was, and is, the big unknown, not only in digital healthcare but also in teaching, training and research. There appears to be an easy way to strengthen awareness of gendered healthcare, said Dr Christiane Groß: ‘If we manage to get men on board we will have reached a major milestone.’

02.07.2021
