@@ -34,7 +34,7 @@ to retain the 10 first elements of the array ``X`` and ``y``::
 >>> np.all(y_res == y[:10])
 True
 
-In addition, the parameter ``validate`` control input checking. For instance,
+In addition, the parameter ``validate`` controls input checking. For instance,
 turning ``validate=False`` allows to pass any type of target ``y`` and do some
 sampling for regression targets::
 
@@ -51,7 +51,7 @@ sampling for regression targets::
    75.46571114, -67.49177372, 159.72700509, -169.80498923,
    211.95889757, 211.95889757])
 
-We illustrate the use of such sampler to implement an outlier rejection
+We illustrated the use of such sampler to implement an outlier rejection
 estimator which can be easily used within a
 :class:`~imblearn.pipeline.Pipeline`:
 :ref:`sphx_glr_auto_examples_applications_plot_outlier_rejections.py`
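The hunk above documents a sampler that rejects outliers before fitting. As a minimal sketch of that contract (not imblearn's actual implementation), the following plain-NumPy function mimics what a ``FunctionSampler``-style callable does: it receives ``(X, y)`` and returns a resampled ``(X, y)``. The helper name and the z-score rejection rule are illustrative assumptions, not part of the library:

```python
import numpy as np

def outlier_rejection_sampler(X, y, threshold=3.0):
    # Hypothetical helper mirroring the sampler contract described in the
    # docs: take (X, y), return a filtered (X, y). Here we drop rows whose
    # features deviate more than `threshold` standard deviations from the
    # column mean (a simple z-score rule, chosen only for illustration).
    z = np.abs((X - X.mean(axis=0)) / X.std(axis=0))
    keep = (z < threshold).all(axis=1)
    return X[keep], y[keep]

rng = np.random.RandomState(42)
X = rng.normal(size=(100, 2))
y = rng.randint(0, 2, size=100)
X[0] = [10.0, 10.0]  # inject an obvious outlier

X_res, y_res = outlier_rejection_sampler(X, y)
```

Because such a function only touches the training data, wrapping it in a sampler step keeps the rejection inside the pipeline's ``fit`` while leaving ``predict`` untouched.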
@@ -69,7 +69,7 @@ will generate balanced mini-batches.
 TensorFlow generator
 ~~~~~~~~~~~~~~~~~~~~
 
-The :func:`~imblearn.tensorflow.balanced_batch_generator` allow to generate
+The :func:`~imblearn.tensorflow.balanced_batch_generator` allows to generate
 balanced mini-batches using an imbalanced-learn sampler which returns indices.
 
 Let's first generate some data::
@@ -96,7 +96,7 @@ balanced::
 ...     random_state=42,
 ... )
 
-The ``generator`` and ``steps_per_epoch`` is used during the training of the
+The ``generator`` and ``steps_per_epoch`` are used during the training of the
 Tensorflow model. We will illustrate how to use this generator. First, we can
 define a logistic regression model which will be optimized by a gradient
 descent::
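The ``(generator, steps_per_epoch)`` pair discussed in this hunk can be sketched in plain NumPy. This is not imblearn's implementation; it is an illustrative assumption of how a balanced mini-batch generator works: undersample every class to the minority-class size, shuffle the balanced indices, and yield batches indefinitely so a training loop can consume ``steps_per_epoch`` batches per epoch:

```python
import numpy as np

def balanced_batch_sketch(X, y, batch_size, rng):
    # Undersample each class to the minority class size so one "epoch"
    # of batches is exactly class-balanced.
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    rng.shuffle(idx)
    steps_per_epoch = int(np.ceil(len(idx) / batch_size))

    def generator():
        # Loop forever, like a Keras-style generator; the training loop
        # decides when an epoch ends via steps_per_epoch.
        while True:
            for step in range(steps_per_epoch):
                batch = idx[step * batch_size:(step + 1) * batch_size]
                yield X[batch], y[batch]

    return generator(), steps_per_epoch

# Imbalanced toy data: 70 samples of class 0, 30 of class 1.
X = np.arange(100, dtype=float)[:, None]
y = np.array([0] * 70 + [1] * 30)
gen, steps = balanced_batch_sketch(X, y, batch_size=10,
                                   rng=np.random.RandomState(0))
```

A training loop would then call ``next(gen)`` ``steps`` times per epoch, which is exactly the role ``generator`` and ``steps_per_epoch`` play when fitting the TensorFlow model below.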