1 parent febdbe7 commit cd32ed9
beginner_source/basics/autogradqs_tutorial.py
@@ -47,7 +47,7 @@
 #
 # In this network, ``w`` and ``b`` are **parameters**, which we need to
 # optimize. Thus, we need to be able to compute the gradients of loss
-# function with respect to those variables. In orded to do that, we set
+# function with respect to those variables. In order to do that, we set
 # the ``requires_grad`` property of those tensors.

 #######################################################################
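For context, the tutorial passage being patched describes setting ``requires_grad`` on the parameter tensors ``w`` and ``b`` so that autograd can compute gradients of the loss with respect to them. A minimal sketch of that pattern (the specific shapes and loss function here are illustrative, not taken from the commit):

```python
import torch

x = torch.ones(5)   # input tensor (illustrative shape)
y = torch.zeros(3)  # expected output

# Mark w and b as parameters to optimize: requires_grad=True tells
# autograd to track operations on them and accumulate gradients.
w = torch.randn(5, 3, requires_grad=True)
b = torch.randn(3, requires_grad=True)

z = torch.matmul(x, w) + b
loss = torch.nn.functional.binary_cross_entropy_with_logits(z, y)

# Backpropagate: populates w.grad and b.grad with d(loss)/dw, d(loss)/db.
loss.backward()
```

``requires_grad`` can also be set after creation with ``w.requires_grad_(True)``; only leaf tensors with this flag receive a ``.grad`` attribute after ``backward()``.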