RoBERTa now supported on TPU and TensorFlow via transformers library
Summary: Pull Request resolved: facebookresearch#1197

Differential Revision: D17651374

Pulled By: myleott

fbshipit-source-id: 5feb986de1e682eb83c4479f419ad51325718572
Myle Ott authored and facebook-github-bot committed on Sep 28, 2019
1 parent 1cb267e commit ea1a410
Showing 1 changed file with 1 addition and 0 deletions.
examples/roberta/README.md (1 addition, 0 deletions):

@@ -8,6 +8,7 @@ RoBERTa iterates on BERT's pretraining procedure, including training the model l
 
 ### What's New:
 
+- September 2019: TensorFlow and TPU support via the [transformers library](https://github.com/huggingface/transformers).
 - August 2019: RoBERTa is now supported in the [pytorch-transformers library](https://github.com/huggingface/pytorch-transformers).
 - August 2019: Added [tutorial for finetuning on WinoGrande](https://github.com/pytorch/fairseq/tree/master/examples/roberta/wsc#roberta-training-on-winogrande-dataset).
 - August 2019: Added [tutorial for pretraining RoBERTa using your own data](README.pretraining.md).
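For reference, a minimal sketch of the TensorFlow usage the new changelog entry points to, assuming the transformers library's TF 2.0 classes (`TFRobertaModel`, `RobertaTokenizer`) and the published `roberta-base` checkpoint; this is illustrative only and not part of the commit itself:

```python
# Minimal sketch: loading RoBERTa in TensorFlow via the transformers library.
# Assumes `pip install transformers tensorflow` and the public 'roberta-base'
# checkpoint; not part of the fairseq commit above.
import tensorflow as tf
from transformers import RobertaTokenizer, TFRobertaModel

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = TFRobertaModel.from_pretrained("roberta-base")

# Encode a sentence and run it through the TF model.
input_ids = tf.constant([tokenizer.encode("Hello world!")])
outputs = model(input_ids)
last_hidden_state = outputs[0]  # shape: (batch, sequence_length, hidden_size)
```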
