Proceedings chapter
Language: English

Evaluating the Impact of Stereotypes and Language Combinations on Gender Bias Occurrence in NMT Generic Systems

Presented at: Varna, 07.09.2023
Publisher: Stroudsburg: Association for Computational Linguistics (ACL)
Publication date: 2023-09-07
Abstract

Machine translation, and more specifically neural machine translation (NMT), has been shown in recent years to be subject to gender bias. Following the methodology of previous studies, we rely on a test suite built from occupational nouns to investigate, through human evaluation, the influence of two potential factors on the occurrence of gender bias in generic NMT: stereotypes and language combinations. In line with previous findings, we confirm that stereotypes are a major source of gender bias, especially in female contexts, and we observe bias even in language combinations that have traditionally received less attention.

Citation (ISO format)
TRIBOULET, Bertille, BOUILLON, Pierrette. Evaluating the Impact of Stereotypes and Language Combinations on Gender Bias Occurrence in NMT Generic Systems. In: Proceedings of the Third Workshop on Language Technology for Equality, Diversity and Inclusion. Varna. Stroudsburg : Association for Computational Linguistics (ACL), 2023. p. 62–70.
Main files (1)
Proceedings chapter (Published version)
Access level: Public
Identifiers
  • PID : unige:171410
Additional URL for this publication: https://aclanthology.org/2023.ltedi-1.9/
ISBN: 978-954-452-084-7
302 views
134 downloads

Technical information

Creation: 15/09/2023 14:21:43
First validation: 18/09/2023 11:57:30
Update time: 22/04/2024 14:28:06
Status update: 22/04/2024 14:28:06
Last indexation: 17/12/2024 16:38:29
All rights reserved by Archive ouverte UNIGE and the University of Geneva