@@ -34,7 +34,7 @@ to retain the 10 first elements of the array ``X`` and ``y``::
   >>> np.all(y_res == y[:10])
   True

-In addition, the parameter ``validate`` control input checking. For instance,
+In addition, the parameter ``validate`` controls input checking. For instance,
 turning ``validate=False`` allows to pass any type of target ``y`` and do some
 sampling for regression targets::

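Because the doctest that follows is only partially visible in this diff, here is a rough, self-contained sketch of the same idea; the helper name ``dummy_sampler`` and the exact data are illustrative, not necessarily the guide's own::

  import numpy as np
  from sklearn.datasets import make_regression
  from imblearn import FunctionSampler

  X, y = make_regression(n_samples=100, n_features=5, random_state=42)

  # A resampler that simply keeps the ten first samples.  Because ``y`` is
  # continuous, input checking has to be relaxed with ``validate=False``.
  def dummy_sampler(X, y):
      indices = np.arange(10)
      return X[indices], y[indices]

  sampler = FunctionSampler(func=dummy_sampler, validate=False)
  X_res, y_res = sampler.fit_resample(X, y)

With the default ``validate=True``, such a continuous ``y`` would be rejected by the input checks.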
@@ -51,7 +51,7 @@ sampling for regression targets::
           75.46571114, -67.49177372, 159.72700509, -169.80498923,
           211.95889757, 211.95889757])

-We illustrate the use of such sampler to implement an outlier rejection
+We illustrated the use of such a sampler to implement an outlier rejection
 estimator which can be easily used within a
 :class:`~imblearn.pipeline.Pipeline`:
 :ref:`sphx_glr_auto_examples_applications_plot_outlier_rejections.py`
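For readers who do not open the linked example, the pattern is roughly the following; this is a sketch assuming an ``outlier_rejection`` helper built on :class:`~sklearn.ensemble.IsolationForest`, and the example's actual code differs in its details::

  from sklearn.datasets import make_classification
  from sklearn.ensemble import IsolationForest
  from sklearn.linear_model import LogisticRegression
  from imblearn import FunctionSampler
  from imblearn.pipeline import make_pipeline

  def outlier_rejection(X, y):
      # Fit an outlier detector and keep only the inliers (prediction == 1).
      detector = IsolationForest(random_state=42)
      is_inlier = detector.fit_predict(X) == 1
      return X[is_inlier], y[is_inlier]

  X, y = make_classification(n_samples=500, random_state=42)
  model = make_pipeline(
      FunctionSampler(func=outlier_rejection),
      LogisticRegression(),
  )
  model.fit(X, y)

Because the sampler is applied only at ``fit`` time, outliers are removed from the training data while ``predict`` still sees every test sample.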
@@ -69,10 +69,11 @@ will generate balanced mini-batches.
 TensorFlow generator
 ~~~~~~~~~~~~~~~~~~~~

-The :func:`~imblearn.tensorflow.balanced_batch_generator` allow to generate
+The :func:`~imblearn.tensorflow.balanced_batch_generator` allows generating
 balanced mini-batches using an imbalanced-learn sampler which returns indices.

 Let's first generate some data::
+
   >>> n_features, n_classes = 10, 2
   >>> X, y = make_classification(
   ...     n_samples=10_000, n_features=n_features, n_informative=2,
@@ -96,7 +97,7 @@ balanced::
   ...     random_state=42,
   ... )

-The ``generator`` and ``steps_per_epoch`` is used during the training of the
+The ``generator`` and ``steps_per_epoch`` are used during the training of a
 Tensorflow model. We will illustrate how to use this generator. First, we can
 define a logistic regression model which will be optimized by a gradient
 descent::
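The model-definition code that follows in the guide is not part of this diff. As a rough sketch only, assuming the generator yields ``(X_batch, y_batch)`` pairs and that the returned objects are named ``training_generator`` and ``steps_per_epoch``, the pair is consumed during training like this::

  n_epochs = 5
  for epoch in range(n_epochs):
      for _ in range(steps_per_epoch):
          # Each yielded batch is balanced by the underlying sampler.
          X_batch, y_batch = next(training_generator)
          # ... run one gradient-descent update on (X_batch, y_batch) ...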