AliBERT: the first pretrained language model for French biomedical text
The paper “AliBERT : A pretrained language model for French biomedical text” was written in collaboration with Aman Berhe, Guillaume Draznieks, Vincent Martenot, Valentin Masdeu, Lucas Davy and Jean-Daniel Zucker. The BERT architecture, which allows for contextual learning on text documents, is mostly trained on common English text resources. Performance in other languages, especially on specialized topics that require deep domain knowledge and vocabulary, is […]