
Commit 1a05269: Updated XLMR docs (#1497)

1 parent 776a15d

1 file changed


torchtext/models/roberta/bundler.py

Lines changed: 18 additions & 0 deletions
@@ -168,6 +168,15 @@ def encoderConf(self) -> RobertaEncoderConf:
     '''
     XLM-R Encoder with Base configuration
 
+    The XLM-RoBERTa model was proposed in `Unsupervised Cross-lingual Representation Learning
+    at Scale <https://arxiv.org/abs/1911.02116>`__. It is a large multi-lingual language model,
+    trained on 2.5TB of filtered CommonCrawl data and based on the RoBERTa model architecture.
+
+    Originally published by the authors of XLM-RoBERTa under MIT License
+    and redistributed with the same license.
+    [`License <https://github.com/pytorch/fairseq/blob/main/LICENSE>`__,
+    `Source <https://github.com/pytorch/fairseq/tree/main/examples/xlmr#pre-trained-models>`__]
+
     Please refer to :func:`torchtext.models.RobertaModelBundle` for the usage.
     '''
 )
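The docstring defers to :func:`torchtext.models.RobertaModelBundle` for usage. For context, here is a minimal sketch of that usage, assuming the bundle API torchtext documents elsewhere; `get_model()`, `transform()`, and `torchtext.functional.to_tensor` come from that API, not from this diff:

```python
import torch
import torchtext
from torchtext.functional import to_tensor

# The bundle described by the docstring above: pretrained weights plus
# the matching SentencePiece-based text transform.
xlmr_base = torchtext.models.XLMR_BASE_ENCODER
model = xlmr_base.get_model()
transform = xlmr_base.transform()

model.eval()
input_batch = ["Hello world", "How are you!"]
# transform() maps raw strings to token ids; XLM-R uses 1 as the pad id.
model_input = to_tensor(transform(input_batch), padding_value=1)
with torch.no_grad():
    output = model(model_input)
print(output.shape)  # e.g. torch.Size([2, 6, 768]) for the Base encoder
```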
@@ -189,6 +198,15 @@ def encoderConf(self) -> RobertaEncoderConf:
     '''
     XLM-R Encoder with Large configuration
 
+    The XLM-RoBERTa model was proposed in `Unsupervised Cross-lingual Representation Learning
+    at Scale <https://arxiv.org/abs/1911.02116>`__. It is a large multi-lingual language model,
+    trained on 2.5TB of filtered CommonCrawl data and based on the RoBERTa model architecture.
+
+    Originally published by the authors of XLM-RoBERTa under MIT License
+    and redistributed with the same license.
+    [`License <https://github.com/pytorch/fairseq/blob/main/LICENSE>`__,
+    `Source <https://github.com/pytorch/fairseq/tree/main/examples/xlmr#pre-trained-models>`__]
+
     Please refer to :func:`torchtext.models.RobertaModelBundle` for the usage.
     '''
 )
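The Large hunk is identical apart from the configuration name; the corresponding bundle would be used the same way. Again a sketch under the same API assumptions, with the Large encoder's larger hidden size:

```python
import torchtext
from torchtext.functional import to_tensor

# Same pattern as the Base example; only the bundle name changes.
xlmr_large = torchtext.models.XLMR_LARGE_ENCODER
model = xlmr_large.get_model().eval()
transform = xlmr_large.transform()

model_input = to_tensor(transform(["Hello world"]), padding_value=1)
print(model(model_input).shape)  # last dimension is 1024 for the Large encoder
```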
