outdated transformer models

JH 2025-5-6 (edited 2025-5-14)

Why is RoBERTa not available as a pretrained model? It outperforms BERT on many tasks and has become more popular in the literature. For faster inference, you should also offer DistilBERT, which is more recent than BERT but smaller and faster. The repository hasn't been updated in two years, which is a lifetime in deep learning.
https://github.com/matlab-deep-learning/transformer-models
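To quantify the "smaller and faster" point above, here's a back-of-the-envelope sketch (Python; the `transformer_params` helper is hypothetical, plugging in the published BERT-base and DistilBERT hyperparameters: hidden size 768, FFN size 3072, vocab ~30522):

```python
def transformer_params(vocab=30522, hidden=768, layers=12,
                       ffn=3072, max_pos=512, type_vocab=2):
    # Embedding tables: token, position, and (for BERT) segment embeddings.
    emb = (vocab + max_pos + type_vocab) * hidden
    # Self-attention: Q, K, V, and output projections, each with a bias.
    attn = 4 * (hidden * hidden + hidden)
    # Feed-forward block: two weight matrices plus their biases.
    ffn_p = 2 * hidden * ffn + ffn + hidden
    # Two LayerNorms (scale + shift vectors) per layer.
    ln = 2 * 2 * hidden
    return emb + layers * (attn + ffn_p + ln)

bert = transformer_params(layers=12)                 # ~109M parameters
distil = transformer_params(layers=6, type_vocab=0)  # ~66M (half the layers, no segment embeddings)
print(f"DistilBERT is about {100 * (1 - distil / bert):.0f}% smaller")
# → DistilBERT is about 39% smaller
```

This rough count lines up with the figures usually quoted (BERT-base ~110M parameters, DistilBERT ~66M, roughly 40% smaller), which is where the inference-speed advantage comes from.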
Mike Croucher 2025-5-12
Thanks for your suggestions. I have passed them on to development.