Commit 984700d

Fix grad in-place operation (#779)
p.add_(-lr, p.grad) throws "RuntimeError: a leaf Variable that requires grad is being used in an in-place operation"; using p.data.add_(-lr, p.grad) fixes this issue.
1 parent d91adc9 commit 984700d

File tree

1 file changed (+1 addition, -1 deletion)


word_language_model/main.py

Lines changed: 1 addition & 1 deletion
@@ -178,7 +178,7 @@ def train():
         # `clip_grad_norm` helps prevent the exploding gradient problem in RNNs / LSTMs.
         torch.nn.utils.clip_grad_norm_(model.parameters(), args.clip)
         for p in model.parameters():
-            p.add_(-lr, p.grad)
+            p.data.add_(-lr, p.grad)

         total_loss += loss.item()
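The error in the commit message is easy to reproduce. Below is a minimal sketch (assuming a recent PyTorch; the tensor shape and learning rate are illustrative, not from the script) showing why the in-place update on a leaf parameter fails, what the committed `.data` workaround does, and the `torch.no_grad()` idiom that current PyTorch recommends for manual parameter updates. Note the two-argument `add_(-lr, p.grad)` form in the diff is deprecated in newer PyTorch in favor of `add_(p.grad, alpha=-lr)`.

```python
import torch

# Illustrative stand-ins for a model parameter and the script's learning rate.
lr = 0.1
p = torch.ones(3, requires_grad=True)   # a leaf tensor, like a model parameter
p.sum().backward()                      # populate p.grad (all ones here)

# Old line: an in-place op on a leaf that requires grad raises a RuntimeError.
try:
    p.add_(p.grad, alpha=-lr)
except RuntimeError as err:
    print(err)  # the "leaf Variable ... in-place operation" error from the commit message

# Committed fix: mutate the underlying .data tensor, which autograd does not track.
p.data.add_(p.grad, alpha=-lr)          # p becomes 0.9 everywhere

# Modern idiom: suspend autograd for the update instead of reaching into .data.
with torch.no_grad():
    p.add_(p.grad, alpha=-lr)           # p becomes 0.8 everywhere
```

Both forms apply the same SGD step; `torch.no_grad()` is preferred because writes through `.data` bypass autograd's correctness checks entirely.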