Text Analytics Toolbox Model for BERT-Large Network

Pretrained BERT-Large Network for MATLAB
94.0 Downloads
Updated 11 Sep 2024
BERT-Large is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of Natural Language Processing (NLP) tasks. The model has 24 self-attention layers and a hidden size of 1024.
To load the BERT-Large model and its tokenizer, run:
[net, tokenizer] = bert(Model="large");
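
As a sketch of a typical downstream step, the returned tokenizer and network can be used to compute contextual token embeddings for a sentence. This assumes the Text Analytics Toolbox bertTokenizer encode workflow and the dlnetwork predict interface; the exact input ordering and dimension labels may vary by release, so treat this as an illustrative sketch rather than a definitive recipe.

[net, tokenizer] = bert(Model="large");

% Tokenize a sentence into numeric token codes and segment indices.
str = "MATLAB supports transformer models.";
[tokenCodes, segments] = encode(tokenizer, str);

% Wrap the codes as dlarray inputs with channel (C) and time (T) dimensions.
% (Input names and argument order are assumptions; check the network's
% InputNames property for your release.)
X = dlarray(tokenCodes{1}, "CT");
seg = dlarray(segments{1}, "CT");
mask = dlarray(ones(size(tokenCodes{1})), "CT");

% Forward pass; for BERT-Large the output has hidden size 1024 per token.
embeddings = predict(net, X, mask, seg);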
MATLAB Release Compatibility
Created with R2023b
Compatible with R2023b to R2024b
Platform Compatibility
Windows macOS (Apple silicon) macOS (Intel) Linux