Understanding the causes of gender bias in AI can lead us to effective solutions.
Artificial intelligence (AI) is quickly integrating into our everyday lives, from creating ads and music to informing critical industries such as healthcare and finance. But along with innovation and efficiency comes an inescapable truth: gender bias in AI.
Gender bias in AI has been a subject of study for several years. It occurs when algorithms perpetuate and even amplify societal biases and stereotypes related to gender. This phenomenon poses significant challenges, not only in terms of fairness and equity but also in terms of the broader impact on society.
Understanding how gender bias manifests in AI and how we can combat it is crucial for creating AI systems that are more equitable and inclusive.
Where does gender bias in AI come from?
Data is at the heart of the issue. AI algorithms use vast amounts of existing data to learn and make decisions. If the data lacks diversity or contains biases, the AI system may replicate and amplify these biases in its decision-making processes.
For instance, natural language processing systems like Amazon’s Alexa and Apple’s Siri can associate “man” with “doctor” and “woman” with “nurse,” reflecting outdated views. In 2017, Amazon abandoned its automated recruitment system—trained on the company’s internal hiring data—after the software systematically discriminated against women applying for technical jobs. In healthcare, where data is collected mainly from men, an online app could tell a woman with pain in her left arm that she has depression, while a male user with the same symptom is more likely to be warned of a heart attack.
In short, it’s a case of bias in, bias out. If gender biases and stereotypes are built into the data—implicitly or explicitly—the resulting AI systems and output will perpetuate these biases.
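This “bias in, bias out” dynamic can be sketched in a few lines of Python. The corpus below is hypothetical and deliberately skewed; it illustrates how any model that learns word associations from its training data will reproduce whatever skew that data contains:

```python
from collections import Counter

# A toy corpus with a built-in gender skew (hypothetical data for
# illustration): "doctor" co-occurs mostly with "he", "nurse" with "she".
corpus = [
    "he is a doctor", "he is a doctor", "he is a doctor",
    "she is a doctor",
    "she is a nurse", "she is a nurse", "she is a nurse",
    "he is a nurse",
]

def gender_association(profession):
    """Count how often a profession co-occurs with 'he' vs. 'she'."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        if profession in words:
            for pronoun in ("he", "she"):
                if pronoun in words:
                    counts[pronoun] += 1
    return counts

# The learned associations mirror the skew in the data exactly:
print(gender_association("doctor"))  # Counter({'he': 3, 'she': 1})
print(gender_association("nurse"))   # Counter({'she': 3, 'he': 1})
```

Real language models use far more sophisticated statistics than simple co-occurrence counts, but the principle is the same: the associations come from the data, so skewed data yields skewed associations.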
Is there gendered ageism in AI?
Where gender and age intersect—known as gendered ageism—AI also shows biased output.
In August 2023, the U.S. Equal Employment Opportunity Commission (EEOC) settled its first lawsuit concerning AI bias in hiring. Software used by an English-language tutoring service was programmed to automatically reject female applicants over the age of 55 and male applicants over the age of 60.
A study using Midjourney, a generative AI platform, revealed gender and age biases in visual representations of media professions. In an analysis of over 100 generated images, non-specialized roles (such as “journalist,” “reporter,” “correspondent,” and “the press”) were depicted only as younger individuals. Specialized roles (such as “news analyst,” “news commentator,” and “fact-checker”) included both younger and older individuals, but the older individuals were exclusively men. Women were consistently depicted as younger and wrinkle-free, while men were shown with wrinkles.
How can we combat gender bias in AI?
Researchers out of Adelphi University note that “UNESCO data shows that only 12 percent of AI researchers are women, and they represent only 6 percent of software developers. To combat bias in AI, there must be a conscious, deliberate effort to ensure not only that more women enter the field, but that data represents the diversity of our population.”
According to Girls Who Code, women hold only 24% of jobs in the tech industry and found only 7% of tech startups. Because tech is a male-dominated industry, AI outputs will inevitably mirror its gender biases, underscoring the need to increase women’s participation in STEM education and careers. Companies can also build a more inclusive talent pipeline by investing in STEM education and training for women.
As much promise as it shows, AI can’t run the show alone. Human input and oversight are still the most effective way to reduce gender bias in AI. And while reducing gender bias in AI is a complex and lengthy endeavor, getting more women into technical fields will go a long way.
The good news
“The very first advantage of having AI is that it has brought forward this problem of gender bias in the data. Because of the sheer amount of data, we would not have been able to see the patterns that clearly the way that these AI algorithms have shown to us.”
This, according to Dr. Muneera Bano, is the good news about gender bias in AI. Now that we see it, we can do something about it.
Interested in this topic?
We recently posted another blog post related to this topic: Harnessing AI for Age-Inclusive Hiring Practices
You can also view our Equity Summit 2023 discussion The Implications of AI for Older Workers