4 changes: 2 additions & 2 deletions docs/tutorials/layers_normalizations.ipynb
@@ -67,7 +67,7 @@
"* **Instance Normalization** (TensorFlow Addons)\n",
"* **Layer Normalization** (TensorFlow Core)\n",
"\n",
"The basic idea behind these layers is to normalize the output of an activation layer to improve the convergence during training. In contrast to [batch normalization](https://keras.io/layers/normalization/) these normalizations do not work on batches, instead they normalize the activations of a single sample, making them suitable for recurrent neual networks as well. \n",
"The basic idea behind these layers is to normalize the output of an activation layer to improve the convergence during training. In contrast to [batch normalization](https://keras.io/layers/normalization/) these normalizations do not work on batches, instead they normalize the activations of a single sample, making them suitable for recurrent neural networks as well. \n",
"\n",
"Typically the normalization is performed by calculating the mean and the standard deviation of a subgroup in your input tensor. It is also possible to apply a scale and an offset factor to this as well.\n",
"\n",
@@ -260,7 +260,7 @@
"### Introduction\n",
"Layer Normalization is special case of group normalization where the group size is 1. The mean and standard deviation is calculated from all activations of a single sample.\n",
"\n",
"Experimental results show that Layer normalization is well suited for Recurrent Neural Networks, since it works batchsize independt.\n",
"Experimental results show that Layer normalization is well suited for Recurrent Neural Networks, since it works batchsize independently.\n",
"\n",
"### Example\n",
"\n",