Fix dummy batch logic (facebookresearch#1744)
Summary: Pull Request resolved: facebookresearch#1744

Differential Revision: D20097876

Pulled By: myleott

fbshipit-source-id: 420cbfa53bd8fa3aa4710d16d0e2d1280977dfec
Myle Ott authored and facebook-github-bot committed Feb 25, 2020
1 parent b152183 commit f1a9ce8
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions fairseq/trainer.py
@@ -271,7 +271,7 @@ def get_train_iterator(
     @metrics.aggregate("train")
     def train_step(self, samples, raise_oom=False):
         """Do forward, backward and parameter update."""
-        if self._dummy_batch is None:
+        if self._dummy_batch == "DUMMY":
             self._dummy_batch = samples[0]
 
         self._set_seed()
@@ -420,7 +420,7 @@ def maybe_no_sync():
     @metrics.aggregate("valid")
     def valid_step(self, sample, raise_oom=False):
         """Do forward pass in evaluation mode."""
-        if self._dummy_batch is None:
+        if self._dummy_batch == "DUMMY":
             self._dummy_batch = sample
 
         with torch.no_grad():
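The diff replaces a `None` check with a comparison against the string `"DUMMY"`, which suggests the trainer uses that string as a sentinel for "no dummy batch captured yet" rather than `None`. Below is a minimal, hypothetical sketch of that sentinel pattern; the class and method names mirror the diff, but the body is illustrative and not fairseq's actual implementation:

```python
class Trainer:
    """Minimal sketch of the sentinel pattern implied by this commit."""

    def __init__(self):
        # Sentinel meaning "no dummy batch captured yet". A distinct
        # string avoids ambiguity with a legitimately-None batch, which
        # a `is None` check would silently treat as "not yet captured".
        self._dummy_batch = "DUMMY"

    def train_step(self, samples):
        if self._dummy_batch == "DUMMY":
            # Capture the first real batch; in fairseq this is later
            # replayed (e.g. after an OOM) to keep workers in sync.
            self._dummy_batch = samples[0]
        return self._dummy_batch


trainer = Trainer()
first = trainer.train_step([{"id": 0}])   # captures the first batch
second = trainer.train_step([{"id": 1}])  # dummy batch is unchanged
```

After the first call the sentinel is replaced by the captured batch, so later calls compare a dict against `"DUMMY"` and leave it untouched.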
