Proceedings chapter
Open access
English

Evaluating the Impact of Stereotypes and Language Combinations on Gender Bias Occurrence in NMT Generic Systems

Presented at Varna, 07.09.2023
Publisher: Stroudsburg : Association for Computational Linguistics (ACL)
Publication date: 2023-09-07
Abstract

Machine translation, and more specifically neural machine translation (NMT), has been shown in recent years to be subject to gender bias. Following the methodology of previous studies, we rely on a test suite built from occupational nouns to investigate, through human evaluation, the influence of two potential factors on the occurrence of gender bias in generic NMT: stereotypes and language combinations. In line with previous findings, we confirm that stereotypes are a major source of gender bias, especially in female contexts, while also observing bias in language combinations that have traditionally received less examination.

Citation (ISO format)
TRIBOULET, Bertille, BOUILLON, Pierrette. Evaluating the Impact of Stereotypes and Language Combinations on Gender Bias Occurrence in NMT Generic Systems. In: Proceedings of the Third Workshop on Language Technology for Equality, Diversity and Inclusion. Varna. Stroudsburg : Association for Computational Linguistics (ACL), 2023. p. 62–70.
Main files (1)
Proceedings chapter (Published version)
Access level: Public
Identifiers
  • PID : unige:171410
ISBN: 978-954-452-084-7
256 views
108 downloads

Technical information

Creation: 09/15/2023 12:21:43 PM
First validation: 09/18/2023 9:57:30 AM
Update time: 04/22/2024 12:28:06 PM
Status update: 04/22/2024 12:28:06 PM
Last indexation: 05/06/2024 4:59:22 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva