Proceedings chapter
OA Policy
English

Jargon: A Suite of Language Models and Evaluation Tasks for French Specialized Domains

Presented at Torino, Italy, 20-25 May 2024
Published in Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue (Eds.), Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), p. 9463-9476
Publisher: ELRA and ICCL
Publication date: 2024
Abstract

Pretrained Language Models (PLMs) are the de facto backbone of most state-of-the-art NLP systems. In this paper, we introduce a family of domain-specific PLMs for French, focusing on three important domains: transcribed speech, medicine, and law. We use a transformer architecture based on an efficient attention method (LinFormer) to maximise their utility, since these domains often involve processing long documents. We evaluate and compare our models to state-of-the-art models on a diverse set of tasks and datasets, some of which are introduced in this paper. We gather the datasets into a new French-language evaluation benchmark for these three domains. We also compare various training configurations: continued pretraining, pretraining from scratch, as well as single- and multi-domain pretraining. Extensive domain-specific experiments show that it is possible to attain competitive downstream performance even when pretraining with the approximate LinFormer attention mechanism. For full reproducibility, we release the models and pretraining data, as well as contributed datasets.
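The LinFormer attention mentioned in the abstract reduces the quadratic cost of self-attention by projecting the keys and values from sequence length n down to a fixed rank k before the softmax. The sketch below is a minimal single-head illustration of that idea in NumPy, not the paper's implementation; the projection matrices E and F and all dimensions are illustrative assumptions.

```python
import numpy as np

def linformer_attention(Q, K, V, E, F):
    """Single-head Linformer-style attention.

    Q, K, V: (n, d) query/key/value matrices.
    E, F:    (k, n) learned projections that compress the sequence
             axis of K and V from n to k (k << n), so the score
             matrix is (n, k) instead of (n, n).
    """
    K_proj = E @ K                                    # (k, d)
    V_proj = F @ V                                    # (k, d)
    scores = (Q @ K_proj.T) / np.sqrt(Q.shape[-1])    # (n, k)
    # Numerically stable row-wise softmax over the compressed axis.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V_proj                           # (n, d)

# Illustrative sizes: sequence length 128 compressed to rank 32.
rng = np.random.default_rng(0)
n, d, k = 128, 16, 32
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E = rng.standard_normal((k, n)) / np.sqrt(n)
F = rng.standard_normal((k, n)) / np.sqrt(n)
out = linformer_attention(Q, K, V, E, F)
print(out.shape)  # (128, 16): one d-dimensional output per position
```

With full attention the score matrix would be n x n; here it is n x k, which is what makes long-document pretraining tractable in the domains the paper targets.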

Keywords
  • Self-supervised learning
  • Pretrained language models
  • Evaluation benchmark
  • Biomedical document processing
  • Legal document processing
  • Speech transcription
Citation (ISO format)
SEGONNE, Vincent et al. Jargon: A Suite of Language Models and Evaluation Tasks for French Specialized Domains. In: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024). Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue (Eds.). Torino, Italy. [s.l.] : ELRA and ICCL, 2024. p. 9463–9476.
Main files (1)
Proceedings chapter (Published version)
Identifiers
  • PID: unige:177151
81 views
15 downloads

Technical information

Creation: 05/18/2024 9:02:56 AM
First validation: 05/21/2024 8:22:30 AM
Update time: 05/21/2024 8:22:30 AM
Status update: 05/21/2024 8:22:30 AM
Last indexation: 11/01/2024 9:35:09 AM
All rights reserved by Archive ouverte UNIGE and the University of Geneva