From 4b8df7b2cb3d43e97ee4f49a529a31714fc8f64c Mon Sep 17 00:00:00 2001
From: Inyong Hwang
Date: Mon, 15 Nov 2021 11:29:09 +0900
Subject: [PATCH 1/2] Update layers_normalizations.ipynb

- typo "neual" -> "neural"
---
 docs/tutorials/layers_normalizations.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/tutorials/layers_normalizations.ipynb b/docs/tutorials/layers_normalizations.ipynb
index 01fffc7aa7..bdf6848d9c 100644
--- a/docs/tutorials/layers_normalizations.ipynb
+++ b/docs/tutorials/layers_normalizations.ipynb
@@ -67,7 +67,7 @@
 "* **Instance Normalization** (TensorFlow Addons)\n",
 "* **Layer Normalization** (TensorFlow Core)\n",
 "\n",
-"The basic idea behind these layers is to normalize the output of an activation layer to improve the convergence during training. In contrast to [batch normalization](https://keras.io/layers/normalization/) these normalizations do not work on batches, instead they normalize the activations of a single sample, making them suitable for recurrent neual networks as well. \n",
+"The basic idea behind these layers is to normalize the output of an activation layer to improve the convergence during training. In contrast to [batch normalization](https://keras.io/layers/normalization/) these normalizations do not work on batches, instead they normalize the activations of a single sample, making them suitable for recurrent neural networks as well. \n",
 "\n",
 "Typically the normalization is performed by calculating the mean and the standard deviation of a subgroup in your input tensor. It is also possible to apply a scale and an offset factor to this as well.\n",
 "\n",

From 82abea5b89359fc2fc924d0fd0d80befa787d438 Mon Sep 17 00:00:00 2001
From: Inyong Hwang
Date: Mon, 15 Nov 2021 14:27:50 +0900
Subject: [PATCH 2/2] Update layers_normalizations.ipynb

- typo "independt" -> "independently"
---
 docs/tutorials/layers_normalizations.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/tutorials/layers_normalizations.ipynb b/docs/tutorials/layers_normalizations.ipynb
index bdf6848d9c..0788784667 100644
--- a/docs/tutorials/layers_normalizations.ipynb
+++ b/docs/tutorials/layers_normalizations.ipynb
@@ -260,7 +260,7 @@
 "### Introduction\n",
 "Layer Normalization is special case of group normalization where the group size is 1. The mean and standard deviation is calculated from all activations of a single sample.\n",
 "\n",
-"Experimental results show that Layer normalization is well suited for Recurrent Neural Networks, since it works batchsize independt.\n",
+"Experimental results show that Layer normalization is well suited for Recurrent Neural Networks, since it works batchsize independently.\n",
 "\n",
 "### Example\n",
 "\n",
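
For context on the notebook text these patches touch, a minimal sketch of the normalization layers it names, assuming standard tensorflow and tensorflow_addons installs; the model shape and layer sizes below are illustrative and not taken from the tutorial itself:

# Illustrative only: shows the layers named in the patched notebook text.
# LayerNormalization lives in TensorFlow Core; GroupNormalization and
# InstanceNormalization live in TensorFlow Addons (tfa).
import tensorflow as tf
import tensorflow_addons as tfa

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(784,)),
    # Layer normalization: statistics are computed per sample over its
    # features, so the result does not depend on the batch size.
    tf.keras.layers.LayerNormalization(),
    tf.keras.layers.Dense(64, activation="relu"),
    # Group normalization with groups=1 reduces to layer normalization;
    # groups equal to the number of channels gives instance normalization.
    tfa.layers.GroupNormalization(groups=1),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.summary()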