Commit 88bda68 ("address comments", 1 parent: e0f26a8)

2 files changed: +12 −9 lines

beginner_source/deploy_seq2seq_hybrid_frontend_tutorial.py

Lines changed: 12 additions & 9 deletions
@@ -18,7 +18,7 @@
 # regarding data preprocessing, model theory and definition, and model
 # training.
 #
-# What is the TorchScript?
+# What is TorchScript?
 # ----------------------------
 #
 # During the research and development phase of a deep learning-based
@@ -53,19 +53,22 @@
 # will be recorded. In other words, the control flow itself is not
 # captured. To convert modules and functions containing data-dependent
 # control flow, a **scripting** mechanism is provided. The
-# ``torch.jit.script`` function takes module or function and does not
-# requires example inputs. Scripting then explicitly converts the module
-# or function code to TorchScript, including all possible control flow
-# routes. The one caveat with using scripting is that it only supports
-# a subset of Python, so you might need to rewrite the code to make it
-# compatible with TorchScript syntax.
+# ``torch.jit.script`` function/decorator takes a module or function and
+# does not require example inputs. Scripting then explicitly converts
+# the module or function code to TorchScript, including all control flows.
+# One caveat with using scripting is that it only supports a subset of
+# Python, so you might need to rewrite the code to make it compatible
+# with the TorchScript syntax.
 #
 # For all details relating to the supported features, see the TorchScript
 # `language reference <https://pytorch.org/docs/master/jit.html>`__. To
 # provide the maximum flexibility, you can also mix tracing and scripting
 # modes together to represent your whole program, and these techniques can
 # be applied incrementally.
 #
+# .. figure:: /_static/img/chatbot/pytorch_workflow.png
+#    :align: center
+#    :alt: workflow
 #


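For context on the scripting mechanism this hunk describes, here is a minimal standalone sketch of ``torch.jit.script`` applied to a function with data-dependent control flow (the function name and inputs are illustrative, not from the tutorial):

```python
import torch

@torch.jit.script
def clamp_if_negative(x: torch.Tensor) -> torch.Tensor:
    # Data-dependent control flow: tracing would record only the branch
    # taken for the example input, but scripting preserves both branches.
    if bool(x.sum() < 0):
        return torch.zeros_like(x)
    return x

print(clamp_if_negative(torch.tensor([-1.0, -2.0])).sum().item())  # 0.0
print(clamp_if_negative(torch.tensor([1.0, 2.0])).sum().item())    # 3.0
```

Note that no example inputs are passed at conversion time; the decorator compiles the function body directly, which is why both branches survive in the TorchScript graph.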
@@ -385,7 +388,7 @@ def forward(self, hidden, encoder_outputs):
 # TorchScript Notes:
 # ~~~~~~~~~~~~~~~~~~~~~~
 #
-# Similarly to the ``EncoderRNN```, this module does not contain any
+# Similarly to the ``EncoderRNN``, this module does not contain any
 # data-dependent control flow. Therefore, we can once again use
 # **tracing** to convert this model to TorchScript after it
 # is initialized and its parameters are loaded.
@@ -692,7 +695,7 @@ def evaluateExample(sentence, searcher, voc):
 # for some part of your models, you must call .to(device) to set the device
 # options of the models and .eval() to set the dropout layers to test mode
 # **before** tracing the models. `TracedModule` objects do not inherit the
-# ``to``` or ``eval``` methods. Since in this tutorial we are only using
+# ``to`` or ``eval`` methods. Since in this tutorial we are only using
 # scripting instead of tracing, we only need to do this before we do
 # evaluation (which is the same as we normally do in eager mode).
 #
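A minimal sketch of the ordering this hunk insists on, using a toy model rather than the tutorial's seq2seq modules (the model, device choice, and shapes here are illustrative):

```python
import torch
import torch.nn as nn

device = torch.device("cpu")  # illustrative; the tutorial may target a GPU

model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))
# Set the device and switch dropout to test mode *before* tracing,
# since the traced module cannot be reconfigured this way afterwards.
model.to(device)
model.eval()

traced = torch.jit.trace(model, torch.randn(1, 4, device=device))

# With dropout baked in as a no-op, the traced model is deterministic:
x = torch.ones(1, 4, device=device)
print(torch.equal(traced(x), traced(x)))  # True
```

Had ``model.eval()`` been called after ``torch.jit.trace``, the active dropout would have been captured in the graph and the two forward passes would generally disagree.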
