1 parent 98ae65b commit f2a2e2e
examples/optimizers_lazyadam.ipynb
@@ -90,7 +90,7 @@
  "source": [
  "# LazyAdam\n",
  "\n",
- "> LazyAdam is a variant of the Adam optimizer that handles sparse updates moreefficiently.\n",
+ "> LazyAdam is a variant of the Adam optimizer that handles sparse updates more efficiently.\n",
  " The original Adam algorithm maintains two moving-average accumulators for\n",
  " each trainable variable; the accumulators are updated at every step.\n",
  " This class provides lazier handling of gradient updates for sparse\n",