RoBERTa is an update of BERT with better performance and is preferred in many applications. Similarly, DistilBERT closely matches BERT's performance with significantly faster inference. There is little reason to use BERT anymore, and it would be great if MATLAB updated its transformer models to reflect the progress of the last 6-7 years.
DistilBERT retains 97% of BERT's language understanding capabilities while being 40% smaller and 60% faster:
https://zilliz.com/learn/distilbert-distilled-version-of-bert
https://huggingface.co/docs/transformers/model_doc/distilbert
https://huggingface.co/docs/transformers/en/model_doc/roberta
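For reference, here is a minimal sketch of what using these models looks like in the Hugging Face `transformers` Python library linked above; the checkpoint names (`distilbert-base-uncased`, `roberta-base`) are the standard Hub IDs, used purely for illustration of the API surface a MATLAB equivalent might offer:

```python
# Masked-language-model inference with DistilBERT and RoBERTa via
# the Hugging Face `transformers` pipeline API.
from transformers import pipeline

# DistilBERT exposes the same [MASK] fill-in interface as BERT,
# with a smaller and faster model behind it.
distilbert = pipeline("fill-mask", model="distilbert-base-uncased")
print(distilbert("MATLAB is a [MASK] language."))

# RoBERTa is a drop-in replacement too, but note its mask token
# is <mask> rather than BERT's [MASK].
roberta = pipeline("fill-mask", model="roberta-base")
print(roberta("MATLAB is a <mask> language."))
```

Both calls return ranked candidate tokens with scores, so swapping DistilBERT or RoBERTa in for BERT is a one-line change on the Python side.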