Making monolingual sentence embeddings multilingual using knowledge distillation

624 citations · 509 Mendeley readers

Abstract

We present an easy and efficient method to extend existing sentence embedding models to new languages, making it possible to create multilingual versions of previously monolingual models. The training is based on the idea that a translated sentence should be mapped to the same location in the vector space as the original sentence. We use the original (monolingual) model to generate sentence embeddings for the source language and then train a new system on translated sentences to mimic the original model. Compared to other methods for training multilingual sentence embeddings, this approach has several advantages: it is easy to extend existing models to new languages with relatively few samples, it is easier to ensure desired properties of the vector space, and the hardware requirements for training are lower. We demonstrate the effectiveness of our approach for 50+ languages from various language families. Code to extend sentence embedding models to more than 400 languages is publicly available.
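The distillation idea in the abstract can be sketched in a few lines. This is a toy illustration, not the paper's implementation: a fixed lookup stands in for the frozen monolingual teacher (the paper uses a trained English SBERT model), hashed character counts stand in for the student encoder's inputs, and the student is a linear map fitted in closed form rather than a transformer trained with gradient descent. The names `teacher`, `featurize`, and the example sentences are invented for illustration.

```python
import numpy as np

# Toy stand-in for the frozen monolingual teacher: a fixed lookup from
# English sentences to 4-d embeddings. (Invented data; the paper uses a
# trained English sentence encoder as the teacher.)
teacher = {
    "hello world": np.array([1.0, 0.0, 0.0, 0.0]),
    "good morning": np.array([0.0, 1.0, 0.0, 0.0]),
}

# Parallel data: (source sentence, its translation).
parallel = [
    ("hello world", "hallo welt"),
    ("good morning", "guten morgen"),
]

def featurize(sentence, dim=8):
    """Hypothetical character-count features standing in for the
    student encoder's input representation."""
    v = np.zeros(dim)
    for ch in sentence:
        v[ord(ch) % dim] += 1.0
    return v

# Distillation objective: for every pair (s, t), both student(s) and
# student(t) should match teacher(s). With a linear student W this is a
# least-squares problem min ||X W - Y||^2, solved here in closed form
# (the paper minimizes the same MSE loss with gradient descent).
X = np.stack([featurize(s) for src, tgt in parallel for s in (src, tgt)])
Y = np.stack([teacher[src] for src, tgt in parallel for _ in range(2)])
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def student(sentence):
    return featurize(sentence) @ W

# A sentence and its translation now land at (nearly) the same point
# in the teacher's vector space.
src_emb = student("hello world")
tgt_emb = student("hallo welt")
print(np.allclose(src_emb, tgt_emb, atol=1e-6))  # → True
```

Because the student is trained on pairs from many languages at once, one model ends up embedding all of them into the teacher's vector space, which is how a monolingual model becomes multilingual.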


Citation (APA)

Reimers, N., & Gurevych, I. (2020). Making monolingual sentence embeddings multilingual using knowledge distillation. In EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference (pp. 4512–4525). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.emnlp-main.365

Readers' Seniority

PhD / Postgrad / Masters / Doc: 155 (69%)
Researcher: 41 (18%)
Lecturer / Postdoc: 15 (7%)
Professor / Associate Prof.: 13 (6%)

Readers' Discipline

Computer Science: 191 (84%)
Engineering: 18 (8%)
Linguistics: 9 (4%)
Business, Management and Accounting: 9 (4%)
