automatic_optimization
~~~~~~~~~~~~~~~~~~~~~~

When set to ``False``, Lightning does not automate the optimization process. This means you are responsible for handling
your optimizers. However, we do take care of precision and any accelerators used.

See :ref:`manual optimization<common/optimizers:Manual optimization>` for details.

.. code-block:: python

    def training_step(self, batch, batch_idx):
        # fetch the optimizer(s) returned by configure_optimizers
        opt = self.optimizers()
        opt.zero_grad()
        loss = ...  # compute your loss
        self.manual_backward(loss)
        opt.step()
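To make the control flow concrete, here is a self-contained toy sketch of the manual pattern above. The classes are hypothetical stand-ins, not Lightning code; only the method names (``optimizers``, ``manual_backward``, ``training_step``) mirror the real API, and the "loss" is a placeholder computation.

```python
class ToyOptimizer:
    """Stand-in for a torch.optim optimizer (illustration only)."""

    def __init__(self):
        self.steps = 0

    def zero_grad(self):
        pass  # a real optimizer would clear parameter gradients here

    def step(self):
        self.steps += 1  # a real optimizer would update parameters here


class ManualModule:
    """Toy module shaped like a LightningModule with
    automatic_optimization = False: training_step itself drives
    backward and the optimizer step. Not Lightning code."""

    automatic_optimization = False

    def __init__(self):
        self.opt = ToyOptimizer()
        self.backward_calls = 0

    def optimizers(self):
        return self.opt

    def manual_backward(self, loss):
        # real Lightning would run loss.backward() with precision handling
        self.backward_calls += 1

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        opt.zero_grad()
        loss = sum(batch)  # placeholder "loss"
        self.manual_backward(loss)
        opt.step()  # the user, not the framework, steps the optimizer
        return loss
```

Running one step confirms that the user code, not the framework, triggered both the backward pass and the optimizer step.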

This is recommended only if using 2+ optimizers AND if you know how to perform the optimization procedure properly. Note
that automatic optimization can still be used with multiple optimizers by relying on the ``optimizer_idx`` parameter.
Manual optimization is most useful for research topics like reinforcement learning, sparse coding, and GAN research.
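For contrast, the following sketch shows the shape of automatic optimization with multiple optimizers: the framework calls ``training_step`` once per optimizer, passing ``optimizer_idx``. The dispatch loop and module here are hypothetical illustrations, not Lightning internals.

```python
class ToyOptimizer:
    """Stand-in for a torch.optim optimizer (illustration only)."""

    def __init__(self):
        self.steps = 0

    def zero_grad(self):
        pass

    def step(self):
        self.steps += 1


class TwoOptModule:
    """Toy module (not Lightning): in automatic mode, training_step
    receives optimizer_idx and returns the loss for that optimizer."""

    def training_step(self, batch, batch_idx, optimizer_idx):
        # e.g. optimizer_idx 0 -> generator loss, 1 -> discriminator loss
        return sum(batch) * (optimizer_idx + 1)


def automatic_dispatch(module, batch, batch_idx, optimizers):
    """Sketch of the framework-driven loop: one training_step call per
    optimizer, with backward/step handled outside user code."""
    losses = []
    for optimizer_idx, opt in enumerate(optimizers):
        loss = module.training_step(batch, batch_idx, optimizer_idx)
        opt.zero_grad()
        # real Lightning would run loss.backward() here
        opt.step()
        losses.append(loss)
    return losses
```

Each optimizer gets exactly one step per batch, and ``training_step`` can branch on ``optimizer_idx`` to compute the matching loss, which is how a GAN's generator and discriminator share one training loop in automatic mode.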