1 parent f990291 commit 9259e7b
intermediate_source/pipeline_tutorial.py
@@ -199,7 +199,7 @@ def batchify(data, bsz):
 # the transformer model. It subdivides the source data into chunks of
 # length ``bptt``. For the language modeling task, the model needs the
 # following words as ``Target``. For example, with a ``bptt`` value of 2,
-# we’d get the following two Variables for ``i`` = 0:
+# we'd get the following two Variables for ``i`` = 0:
 #
 # .. image:: ../_static/img/transformer_input_target.png
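
For context, the comment touched by this commit describes how the tutorial slices the ``batchify`` output into input/target pairs of length ``bptt``. The sketch below is a minimal illustration of that chunking (it is not part of this commit), assuming a 2-D ``source`` tensor of shape (sequence length, batch size) and a global ``bptt``:

    import torch

    bptt = 2  # chunk length used in the example above

    def get_batch(source, i):
        # ``source`` is a 2-D tensor, e.g. the output of ``batchify``.
        # The input chunk starts at row ``i``; the target is the same chunk
        # shifted forward by one position and flattened.
        seq_len = min(bptt, len(source) - 1 - i)
        data = source[i:i + seq_len]
        target = source[i + 1:i + 1 + seq_len].reshape(-1)
        return data, target

    # Toy usage: for ``i`` = 0 and ``bptt`` = 2, ``data`` holds rows 0-1 of
    # ``source`` and ``target`` holds rows 1-2, flattened.
    source = torch.arange(20).reshape(10, 2)
    data, target = get_batch(source, 0)
    print(data.shape, target.shape)  # torch.Size([2, 2]) torch.Size([4])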