BERT
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (2019)
Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks (2019)
RoBERTa: A Robustly Optimized BERT Pretraining Approach (2019)
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2018)