How to End Gender Bias in Internet Algorithms

Figure: Articles indexed by Scopus for various gender-related terms. Credit: Algorithms (2022). DOI: 10.3390/a15090303

Endless screeds have been written about whether the internet algorithms we constantly interact with suffer from gender bias, and all you have to do is perform a simple search to see it for yourself.

However, according to the researchers behind a new study that seeks to draw a conclusion on this, "so far, the debate has not included any scientific analysis". This new article, written by an interdisciplinary team, proposes a new way of approaching the question and suggests solutions to prevent these biases in the data and the discrimination they entail.

Algorithms are increasingly being used to decide whether to grant a loan or accept applications. As the range of uses of artificial intelligence (AI) expands, and its capabilities and importance increase, it becomes increasingly critical to assess the potential biases associated with these operations.

"Although this is not a new concept, there are many cases in which this problem has not been examined, thus ignoring the potential consequences," said the researchers, whose study, published open access in the journal Algorithms, focuses primarily on gender bias in the various fields of AI.

Such prejudices can have a huge impact on society: "Biases affect everything that is discriminated against, excluded or associated with a stereotype. For example, a sex or race may be excluded from a decision-making process or, quite simply, certain behaviors may be assumed because of one's sex or skin color," explained the study's principal researcher, Juliana Castañeda Jiménez, an industrial doctorate student at the Universitat Oberta de Catalunya (UOC) under the supervision of Ángel A. Juan, of the Universitat Politècnica de València, and Javier Panadero, of the Universitat Politècnica de Catalunya.

According to Castañeda, “it is possible for algorithmic processes to discriminate because of gender, even when programmed to be ‘blind’ to this variable”.
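The study itself does not include code, but a minimal sketch can illustrate how this happens: even when the gender column is removed from the training data, a model can still pick up the bias through correlated proxy features. The feature names below (such as career gap and skill) and the use of scikit-learn are purely illustrative assumptions, not the authors' setup.

```python
# Hypothetical sketch: a "gender-blind" model that still discriminates
# through a proxy feature correlated with gender. Not the study's code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic applicants: a gender flag and a proxy feature correlated with it.
gender = rng.integers(0, 2, n)                       # 0 or 1, never shown to the model
career_gap = rng.normal(loc=gender * 1.5, scale=1.0)  # proxy correlated with gender
skill = rng.normal(loc=5.0, scale=1.0, size=n)

# Historical (biased) hiring labels: past decisions penalized group 1.
hired = (skill - 0.8 * gender + rng.normal(0, 0.5, n)) > 4.6

# Train the model without the gender column.
X = np.column_stack([skill, career_gap])
model = LogisticRegression().fit(X, hired)
pred = model.predict(X)

# The model still selects the two groups at different rates,
# because the proxy feature carries the historical bias.
for g in (0, 1):
    rate = pred[gender == g].mean()
    print(f"group {g}: selection rate = {rate:.2f}")
```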

The research team, which also includes researchers Milagros Sáinz and Sergi Yanes, both from the Gender and ICT (GenTIC) Research Group of the Internet Interdisciplinary Institute (IN3), Laura Calvet, from the Salesian University School of Sarrià, Assumpta Jover, from the Universitat de València, and Ángel A. Juan, illustrates this with several examples: the case of a well-known recruitment tool that preferred male applicants over female ones, or that of certain credit services that offered women less favorable terms than men.

“If old and unbalanced data is used, you’re likely to see negative conditioning when it comes to black, gay, and even female demographics, depending on when and where the data comes from,” Castañeda explained.

Science is for boys and the arts are for girls

To understand how these patterns affect the different algorithms we deal with, the researchers analyzed previous work that identified gender biases in data processing in four types of AI: natural language processing and generation, decision management, speech recognition, and facial recognition.

In general, they found that all the algorithms identified and classified white men more accurately. They also found that the algorithms reproduced false beliefs about the physical attributes that should define a person based on their biological sex, ethnic or cultural background, or sexual orientation, and made stereotypical associations linking men to the sciences and women to the arts.

Many procedures used in image and voice recognition are also based on these stereotypes: cameras recognize white faces more easily and audio analysis has problems with higher-pitched voices, mainly affecting women.

The cases most likely to suffer from these problems are those whose algorithms are built on the analysis of real-world data associated with a specific social context. "Some of the main causes are the under-representation of women in the design and development of AI products and services, and the use of datasets with gender biases," noted the researcher, who argued that the problem stems from the cultural environment in which they are developed.

“An algorithm, when trained with biased data, can detect hidden patterns in society and, when it works, reproduce them. So if in society men and women have unequal representation, the design and development of AI products and services will show gender bias.”

How can we put an end to this?

The many sources of gender bias, as well as the particularities of each type of algorithm and dataset, mean that closing this gap is a very difficult, but not impossible, challenge.

"Designers and everyone else involved in their design should be aware of the possibility of biases associated with an algorithm's logic. In addition, they should understand the measures available for minimizing potential biases as much as possible, and implement them so that they do not occur, because if they are aware of the types of discrimination that occur in society, they will be able to identify when the solutions they develop reproduce them," suggested Castañeda.
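One example of such a measure found in the fairness literature, though not necessarily the procedure the authors propose, is "reweighing" the training data so that each combination of protected group and label carries its expected overall weight before fitting a model. The sketch below is a rough illustration under that assumption; the variable names and the use of scikit-learn are hypothetical.

```python
# Hypothetical sketch of one bias-mitigation measure (data reweighing),
# shown for illustration only; not the authors' method.
import numpy as np
from sklearn.linear_model import LogisticRegression

def reweighing_weights(group: np.ndarray, label: np.ndarray) -> np.ndarray:
    """Per-sample weight = expected frequency / observed frequency
    of each (group, label) cell, so no cell is over-represented."""
    weights = np.ones(len(label), dtype=float)
    for g in np.unique(group):
        for y in np.unique(label):
            mask = (group == g) & (label == y)
            observed = mask.mean()
            expected = (group == g).mean() * (label == y).mean()
            if observed > 0:
                weights[mask] = expected / observed
    return weights

# Usage with synthetic placeholder data (features X, labels y, protected group).
rng = np.random.default_rng(1)
group = rng.integers(0, 2, 1000)
X = rng.normal(size=(1000, 3))
y = rng.integers(0, 2, 1000)

w = reweighing_weights(group, y)
model = LogisticRegression().fit(X, y, sample_weight=w)
```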

This work is innovative because it was carried out by specialists from different fields, including a sociologist, an anthropologist and experts in gender and statistics. “Team members provided a perspective that went beyond the stand-alone mathematics associated with algorithms, helping us to think of them as complex socio-technical systems,” said the study’s lead researcher.

"If you compare this work with others, I think it is one of the few that presents the problem of biases in algorithms from a neutral point of view, highlighting both the social and technical aspects in order to identify why an algorithm might make a biased decision," she concluded.

More information:
Juliana Castaneda et al, Addressing Gender Bias Issues in Algorithmic Data Processes: A Socio-Statistical Perspective, Algorithms (2022). DOI: 10.3390/a15090303

Provided by Universitat Oberta de Catalunya (UOC)

Citation: How to End Gender Biases in Internet Algorithms (2022, November 23) Retrieved November 24, 2022 from https://techxplore.com/news/2022-11-gender-biases-internet-algorithms.html

This document is subject to copyright. Except for fair use for purposes of private study or research, no part may be reproduced without written permission. The content is provided for information only.

