# A function that we apply to tensors to construct a computational graph is
# in fact an object of class ``Function``. This object knows how to
# compute the function in the *forward* direction, and also how to compute
-# it's derivative during the *backward propagation* step. A reference to
+# its derivative during the *backward propagation* step. A reference to
# the backward propagation function is stored in the ``grad_fn`` property of a
# tensor. You can find more information about ``Function`` `in the
# documentation <https://pytorch.org/docs/stable/autograd.html#function>`__.
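Not part of the diff itself, but a minimal sketch of the behavior this hunk describes: any tensor produced by an operation on tensors with ``requires_grad=True`` carries a ``grad_fn`` referencing its backward function (the tensor names here are illustrative, not from the tutorial):

```python
import torch

# Leaf tensors that track gradients.
x = torch.ones(3, requires_grad=True)
y = torch.zeros(3, requires_grad=True)

# z is built by the addition Function; its backward function
# is stored in z.grad_fn for use during backpropagation.
z = x + y
print(z.grad_fn)  # an AddBackward0 object
```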
@@ -67,7 +67,7 @@ def forward(self, x):

##############################################
# We create an instance of ``NeuralNetwork``, move it to the ``device``, and print
-# it's structure.
+# its structure.

model = NeuralNetwork().to(device)
print(model)
@@ -119,7 +119,7 @@ def forward(self, x):
# nn.Linear
# ^^^^^^^^^^^^^^^^^^^^^^
# The `linear layer <https://pytorch.org/docs/stable/generated/torch.nn.Linear.html>`_
-# is a module that applies a linear transformation on the input using it's stored weights and biases.
+# is a module that applies a linear transformation on the input using its stored weights and biases.
#
layer1 = nn.Linear(in_features=28*28, out_features=20)
hidden1 = layer1(flat_image)
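As an aside to this hunk: a self-contained sketch of what the ``nn.Linear`` call above computes. A layer with ``in_features=28*28`` and ``out_features=20`` maps each 784-dimensional row of its input to a 20-dimensional row via its stored weights and biases (the batch size of 3 below is an arbitrary choice for illustration):

```python
import torch
from torch import nn

layer1 = nn.Linear(in_features=28*28, out_features=20)

# A batch of 3 flattened 28x28 "images".
flat_image = torch.rand(3, 28*28)

# The linear transformation: output = input @ weight.T + bias
hidden1 = layer1(flat_image)
print(hidden1.shape)  # torch.Size([3, 20])
```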