@@ -38,7 +38,7 @@ data set, this classifier will favor the majority classes::
     >>> bc.fit(X_train, y_train) #doctest:
     BaggingClassifier(...)
     >>> y_pred = bc.predict(X_test)
-    >>> balanced_accuracy_score(y_test, y_pred) # doctest:
+    >>> balanced_accuracy_score(y_test, y_pred)
     0.77...
 
 In :class:`BalancedBaggingClassifier`, each bootstrap sample will be further
@@ -54,10 +54,10 @@ sampling is controlled by the parameter `sampler` or the two parameters
     ... sampling_strategy='auto',
     ... replacement=False,
     ... random_state=0)
-    >>> bbc.fit(X_train, y_train) # doctest:
+    >>> bbc.fit(X_train, y_train)
     BalancedBaggingClassifier(...)
     >>> y_pred = bbc.predict(X_test)
-    >>> balanced_accuracy_score(y_test, y_pred) # doctest:
+    >>> balanced_accuracy_score(y_test, y_pred)
     0.8...
 
 Changing the `sampler` will give rise to different known implementation
@@ -78,10 +78,10 @@ each tree of the forest will be provided a balanced bootstrap sample
 
     >>> from imblearn.ensemble import BalancedRandomForestClassifier
     >>> brf = BalancedRandomForestClassifier(n_estimators=100, random_state=0)
-    >>> brf.fit(X_train, y_train) # doctest:
+    >>> brf.fit(X_train, y_train)
     BalancedRandomForestClassifier(...)
     >>> y_pred = brf.predict(X_test)
-    >>> balanced_accuracy_score(y_test, y_pred) # doctest:
+    >>> balanced_accuracy_score(y_test, y_pred)
     0.8...
 
 .. _boosting:
@@ -97,10 +97,10 @@ a boosting iteration :cite:`seiffert2009rusboost`::
     >>> from imblearn.ensemble import RUSBoostClassifier
     >>> rusboost = RUSBoostClassifier(n_estimators=200, algorithm='SAMME.R',
     ...                               random_state=0)
-    >>> rusboost.fit(X_train, y_train) # doctest:
+    >>> rusboost.fit(X_train, y_train)
     RUSBoostClassifier(...)
     >>> y_pred = rusboost.predict(X_test)
-    >>> balanced_accuracy_score(y_test, y_pred) # doctest:
+    >>> balanced_accuracy_score(y_test, y_pred)
     0...
 
 A specific method which uses :class:`~sklearn.ensemble.AdaBoostClassifier` as
@@ -111,10 +111,10 @@ the :class:`BalancedBaggingClassifier` API, one can construct the ensemble as::
 
     >>> from imblearn.ensemble import EasyEnsembleClassifier
     >>> eec = EasyEnsembleClassifier(random_state=0)
-    >>> eec.fit(X_train, y_train) # doctest:
+    >>> eec.fit(X_train, y_train)
     EasyEnsembleClassifier(...)
     >>> y_pred = eec.predict(X_test)
-    >>> balanced_accuracy_score(y_test, y_pred) # doctest:
+    >>> balanced_accuracy_score(y_test, y_pred)
     0.6...
 
 .. topic:: Examples
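All of the ensemble classes touched by this diff share one idea: each base estimator is fit on a bootstrap that has been randomly undersampled so the classes are balanced. A minimal standalone sketch of that undersampling step, in pure Python with no imblearn dependency (the function `random_undersample` and the toy data are ours for illustration; the library's real API for this is `RandomUnderSampler`):

```python
import random
from collections import Counter

def random_undersample(X, y, seed=0):
    """Randomly drop samples so every class is as frequent as the
    rarest one (the 'auto' strategy). Illustrative sketch only --
    not imblearn's implementation.
    """
    rng = random.Random(seed)
    counts = Counter(y)
    target = min(counts.values())           # size of the rarest class
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    X_res, y_res = [], []
    for label, rows in by_class.items():
        for xi in rng.sample(rows, target):  # keep `target` rows per class
            X_res.append(xi)
            y_res.append(label)
    return X_res, y_res

# A toy 9:1 imbalanced data set
X = [[i] for i in range(100)]
y = [0] * 90 + [1] * 10
X_res, y_res = random_undersample(X, y)
print(Counter(y_res))  # each class now has 10 samples
```

`BalancedBaggingClassifier`, `BalancedRandomForestClassifier`, and `EasyEnsembleClassifier` apply a step like this to every bootstrap sample, while `RUSBoostClassifier` applies it before each boosting iteration.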