From ea1a410d590e63e6fd24942ab8376600c12e2194 Mon Sep 17 00:00:00 2001
From: Myle Ott
Date: Sat, 28 Sep 2019 08:56:03 -0700
Subject: [PATCH] RoBERTa now supported on TPU and TensorFlow via transformers
 library

Summary:
Pull Request resolved: https://github.com/pytorch/fairseq/pull/1197

Differential Revision: D17651374

Pulled By: myleott

fbshipit-source-id: 5feb986de1e682eb83c4479f419ad51325718572
---
 examples/roberta/README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/examples/roberta/README.md b/examples/roberta/README.md
index 68dc6701ea..1661b604f7 100644
--- a/examples/roberta/README.md
+++ b/examples/roberta/README.md
@@ -8,6 +8,7 @@ RoBERTa iterates on BERT's pretraining procedure, including training the model l
 
 ### What's New:
 
+- September 2019: TensorFlow and TPU support via the [transformers library](https://github.com/huggingface/transformers).
 - August 2019: RoBERTa is now supported in the [pytorch-transformers library](https://github.com/huggingface/pytorch-transformers).
 - August 2019: Added [tutorial for finetuning on WinoGrande](https://github.com/pytorch/fairseq/tree/master/examples/roberta/wsc#roberta-training-on-winogrande-dataset).
 - August 2019: Added [tutorial for pretraining RoBERTa using your own data](README.pretraining.md).
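
The entry added above announces TensorFlow and TPU support for RoBERTa through the transformers library. As a rough illustration (not part of the patch itself), loading the pretrained model in TensorFlow might look like the minimal sketch below, assuming a transformers release with TF 2.x support (v2.0+), the `TFRobertaModel` and `RobertaTokenizer` classes, and the published `roberta-base` checkpoint name.

```python
# Minimal sketch: running RoBERTa in TensorFlow via the transformers
# library. Assumes transformers>=2.0 and TensorFlow 2.x are installed,
# and that 'roberta-base' is the published pretrained checkpoint name.
import tensorflow as tf
from transformers import RobertaTokenizer, TFRobertaModel

tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
model = TFRobertaModel.from_pretrained('roberta-base')

# Encode a sentence and run a forward pass; the model returns a tuple
# whose first element is the final-layer hidden states.
input_ids = tf.constant([tokenizer.encode("Hello world!")])
outputs = model(input_ids)
last_hidden_states = outputs[0]  # shape: (batch, seq_len, hidden_size)
print(last_hidden_states.shape)
```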