.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "auto_examples/plot_sklearn_transformed_target.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code. .. rst-class:: sphx-glr-example-title .. _sphx_glr_auto_examples_plot_sklearn_transformed_target.py: .. _l-sklearn-transformed-target: Transformed Target ================== `TransformedTargetRegressor `_ proposes a way to modify the target before training. The notebook extends the concept to classifiers. TransformedTargetRegressor -------------------------- Let's reuse the example from `Effect of transforming the targets in regression model `_. .. GENERATED FROM PYTHON SOURCE LINES 17-42 .. code-block:: Python import pickle from pickle import PicklingError import numpy from numpy.random import randn, random from pandas import DataFrame import matplotlib.pyplot as plt from sklearn.compose import TransformedTargetRegressor from sklearn.metrics import accuracy_score, r2_score from sklearn.linear_model import LinearRegression, LogisticRegression from sklearn.datasets import load_iris from sklearn.model_selection import train_test_split from sklearn.exceptions import ConvergenceWarning from sklearn.utils._testing import ignore_warnings from mlinsights.mlmodel import TransformedTargetRegressor2 from mlinsights.mlmodel import TransformedTargetClassifier2 rnd = random((1000, 1)) rndn = randn(1000) X = rnd[:, :1] * 10 y = rnd[:, 0] * 5 + rndn / 2 y = numpy.exp((y + abs(y.min())) / 2) y_trans = numpy.log1p(y) .. GENERATED FROM PYTHON SOURCE LINES 44-51 .. code-block:: Python fig, ax = plt.subplots(1, 2, figsize=(14, 4)) ax[0].plot(X[:, 0], y, ".") ax[0].set_title("Exponential target") ax[1].plot(X[:, 0], y_trans, ".") ax[1].set_title("Exponential target transform with log1p") .. 
.. image-sg:: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_001.png
   :alt: Exponential target, Exponential target transform with log1p
   :srcset: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Text(0.5, 1.0, 'Exponential target transform with log1p')

.. GENERATED FROM PYTHON SOURCE LINES 53-57

.. code-block:: Python

    reg = LinearRegression()
    reg.fit(X, y)

.. raw:: html
LinearRegression()


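Before adding the target transformation, it helps to quantify how a plain linear fit fares on such a target. The following is a standalone sketch: it regenerates similar data with a fixed seed (so the numbers will differ from the plots above) and reports the training R2:

```python
import numpy
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# regenerate data similar to the example above, with a fixed seed
rng = numpy.random.RandomState(0)
rnd = rng.random((1000, 1))
X = rnd[:, :1] * 10
y = rnd[:, 0] * 5 + rng.randn(1000) / 2
y = numpy.exp((y + abs(y.min())) / 2)

reg = LinearRegression().fit(X, y)
# training R2 of an OLS fit with intercept is always >= 0,
# but the exponential shape of the target limits how high it can get
r2 = r2_score(y, reg.predict(X))
print("plain R2:", r2)
```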
.. GENERATED FROM PYTHON SOURCE LINES 59-65 .. code-block:: Python regr_trans = TransformedTargetRegressor( regressor=LinearRegression(), func=numpy.log1p, inverse_func=numpy.expm1 ) regr_trans.fit(X, y) .. raw:: html
TransformedTargetRegressor(func=<ufunc 'log1p'>, inverse_func=<ufunc 'expm1'>,
                               regressor=LinearRegression())


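What ``TransformedTargetRegressor`` does can be reproduced by hand: fit the regressor on ``func(y)`` and apply ``inverse_func`` to the predictions. A minimal sketch on synthetic data (the data here is illustrative, not the example's arrays):

```python
import numpy
from sklearn.compose import TransformedTargetRegressor
from sklearn.linear_model import LinearRegression

rng = numpy.random.RandomState(0)
X = rng.random((200, 1)) * 10
y = numpy.exp(X[:, 0] / 5 + rng.randn(200) / 10)

wrapped = TransformedTargetRegressor(
    regressor=LinearRegression(), func=numpy.log1p, inverse_func=numpy.expm1
).fit(X, y)

# manual equivalent: train on the transformed target,
# then invert the transformation at prediction time
manual = LinearRegression().fit(X, numpy.log1p(y))
pred_manual = numpy.expm1(manual.predict(X))

assert numpy.allclose(wrapped.predict(X), pred_manual)
```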
.. GENERATED FROM PYTHON SOURCE LINES 67-80 .. code-block:: Python fig, ax = plt.subplots(1, 2, figsize=(14, 4)) ax[0].plot(X[:, 0], y, ".") ax[0].plot(X[:, 0], reg.predict(X), ".", label="Regular Linear Regression") ax[0].set_title("LinearRegression") ax[1].plot(X[:, 0], y, ".") ax[1].plot( X[:, 0], regr_trans.predict(X), ".", label="Linear Regression with modified target" ) ax[1].set_title("TransformedTargetRegressor") .. image-sg:: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_002.png :alt: LinearRegression, TransformedTargetRegressor :srcset: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_002.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none Text(0.5, 1.0, 'TransformedTargetRegressor') .. GENERATED FROM PYTHON SOURCE LINES 81-83 TransformedTargetRegressor2 --------------------------- .. GENERATED FROM PYTHON SOURCE LINES 83-91 .. code-block:: Python # Same thing with *mlinsights*. regr_trans2 = TransformedTargetRegressor2( regressor=LinearRegression(), transformer="log1p" ) regr_trans2.fit(X, y) .. raw:: html
TransformedTargetRegressor2(regressor=LinearRegression(), transformer='log1p')


.. GENERATED FROM PYTHON SOURCE LINES 93-111

.. code-block:: Python

    fig, ax = plt.subplots(1, 3, figsize=(14, 4))
    ax[0].plot(X[:, 0], y, ".")
    ax[0].plot(X[:, 0], reg.predict(X), ".", label="Regular Linear Regression")
    ax[0].set_title("LinearRegression")
    ax[1].plot(X[:, 0], y, ".")
    ax[1].plot(
        X[:, 0], regr_trans.predict(X), ".", label="Linear Regression with modified target"
    )
    ax[1].set_title("TransformedTargetRegressor")
    ax[2].plot(X[:, 0], y, ".")
    ax[2].plot(
        X[:, 0], regr_trans2.predict(X), ".", label="Linear Regression with modified target"
    )
    ax[2].set_title("TransformedTargetRegressor2")

.. image-sg:: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_003.png
   :alt: LinearRegression, TransformedTargetRegressor, TransformedTargetRegressor2
   :srcset: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_003.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Text(0.5, 1.0, 'TransformedTargetRegressor2')

.. GENERATED FROM PYTHON SOURCE LINES 112-117

It works the same way, except the user does not have to specify the inverse function.

Why another?
------------

.. GENERATED FROM PYTHON SOURCE LINES 117-121

.. code-block:: Python

    by1 = pickle.dumps(regr_trans)
    by2 = pickle.dumps(regr_trans2)

.. GENERATED FROM PYTHON SOURCE LINES 123-127

.. code-block:: Python

    tr1 = pickle.loads(by1)
    tr2 = pickle.loads(by2)

.. GENERATED FROM PYTHON SOURCE LINES 129-134

.. code-block:: Python

    numpy.max(numpy.abs(tr1.predict(X) - tr2.predict(X)))

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    np.float64(0.0)

.. GENERATED FROM PYTHON SOURCE LINES 135-137

Well, to be honest, I did not expect numpy functions to be picklable. Lambda functions are not.

.. GENERATED FROM PYTHON SOURCE LINES 137-145

.. code-block:: Python

    regr_trans3 = TransformedTargetRegressor(
        regressor=LinearRegression(),
        func=lambda x: numpy.log1p(x),
        inverse_func=numpy.expm1,
    )
    regr_trans3.fit(X, y)

.. raw:: html
TransformedTargetRegressor(func=<function <lambda> at 0x7f84539b65f0>,
                               inverse_func=<ufunc 'expm1'>,
                               regressor=LinearRegression())


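The wrapper just built uses a lambda as ``func``, and that lambda, not the transformation itself, is what breaks pickling, as the next cell shows. A small standalone check:

```python
import pickle
import numpy

# numpy ufuncs are pickled by name, so they round-trip intact
roundtrip = pickle.loads(pickle.dumps(numpy.log1p))
print(roundtrip is numpy.log1p)

# a lambda has no importable qualified name, so pickle refuses it
try:
    pickle.dumps(lambda x: numpy.log1p(x))
    failed = False
except pickle.PicklingError as e:
    failed = True
    print("as expected:", e)
```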
.. GENERATED FROM PYTHON SOURCE LINES 147-155

.. code-block:: Python

    try:
        pickle.dumps(regr_trans3)
    except PicklingError as e:
        print(e)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Can't pickle <function <lambda> at 0x7f84539b65f0>: attribute lookup <lambda> on __main__ failed

.. GENERATED FROM PYTHON SOURCE LINES 156-161

Classifier and classes permutation
----------------------------------

One question I sometimes get from my students is: regression or classification?

.. GENERATED FROM PYTHON SOURCE LINES 161-166

.. code-block:: Python

    data = load_iris()
    X, y = data.data, data.target
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=7)

.. GENERATED FROM PYTHON SOURCE LINES 168-174

.. code-block:: Python

    reg = LinearRegression()
    reg.fit(X_train, y_train)

    log = LogisticRegression()
    log.fit(X_train, y_train)

.. raw:: html
LogisticRegression()


.. GENERATED FROM PYTHON SOURCE LINES 176-181

.. code-block:: Python

    r2_score(y_test, reg.predict(X_test)), r2_score(y_test, log.predict(X_test))

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (0.8752883470101485, 0.8325991189427313)

.. GENERATED FROM PYTHON SOURCE LINES 182-184

Accuracy cannot be computed on the regression output, as it produces floats rather than class labels.

.. GENERATED FROM PYTHON SOURCE LINES 184-194

.. code-block:: Python

    try:
        accuracy_score(y_test, reg.predict(X_test)), accuracy_score(
            y_test, log.predict(X_test)
        )
    except ValueError as e:
        print(e)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Classification metrics can't handle a mix of multiclass and continuous targets

.. GENERATED FROM PYTHON SOURCE LINES 195-198

Judging by these numbers alone, a regression model would appear to beat a classification model on a problem which is known to be a classification problem. Let's experiment a little.

.. GENERATED FROM PYTHON SOURCE LINES 198-218

.. code-block:: Python

    @ignore_warnings(category=(ConvergenceWarning,))
    def evaluation():
        rnd = []
        perf_reg = []
        perf_clr = []
        for rs in range(200):
            rnd.append(rs)
            X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=rs)
            reg = LinearRegression()
            reg.fit(X_train, y_train)
            log = LogisticRegression()
            log.fit(X_train, y_train)
            perf_reg.append(r2_score(y_test, reg.predict(X_test)))
            perf_clr.append(r2_score(y_test, log.predict(X_test)))
        return rnd, perf_reg, perf_clr


    rnd, perf_reg, perf_clr = evaluation()

.. GENERATED FROM PYTHON SOURCE LINES 220-228

.. code-block:: Python

    fig, ax = plt.subplots(1, 1, figsize=(12, 4))
    ax.plot(rnd, perf_reg, label="regression")
    ax.plot(rnd, perf_clr, label="classification")
    ax.set_title("Comparison between regression and classification\non the same problem")

..
.. image-sg:: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_004.png
   :alt: Comparison between regression and classification on the same problem
   :srcset: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_004.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Text(0.5, 1.0, 'Comparison between regression and classification\non the same problem')

.. GENERATED FROM PYTHON SOURCE LINES 229-231

Difficult to say. However, the expected value is known to be an integer, so let's round the predictions made by the regression.

.. GENERATED FROM PYTHON SOURCE LINES 231-239

.. code-block:: Python

    def float2int(y):
        return numpy.int32(y + 0.5)


    fct2float2int = numpy.vectorize(float2int)

.. GENERATED FROM PYTHON SOURCE LINES 241-271

.. code-block:: Python

    @ignore_warnings(category=(ConvergenceWarning,))
    def evaluation2():
        rnd = []
        perf_reg = []
        perf_clr = []
        acc_reg = []
        acc_clr = []
        for rs in range(50):
            rnd.append(rs)
            X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=rs)
            reg = LinearRegression()
            reg.fit(X_train, y_train)
            log = LogisticRegression()
            log.fit(X_train, y_train)
            perf_reg.append(r2_score(y_test, float2int(reg.predict(X_test))))
            perf_clr.append(r2_score(y_test, log.predict(X_test)))
            acc_reg.append(accuracy_score(y_test, float2int(reg.predict(X_test))))
            acc_clr.append(accuracy_score(y_test, log.predict(X_test)))
        return (
            numpy.array(rnd),
            numpy.array(perf_reg),
            numpy.array(perf_clr),
            numpy.array(acc_reg),
            numpy.array(acc_clr),
        )


    rnd2, perf_reg2, perf_clr2, acc_reg2, acc_clr2 = evaluation2()

.. GENERATED FROM PYTHON SOURCE LINES 273-289

..
.. code-block:: Python

    fig, ax = plt.subplots(1, 2, figsize=(14, 4))
    ax[0].plot(rnd2, perf_reg2, label="regression")
    ax[0].plot(rnd2, perf_clr2, label="classification")
    ax[0].set_title(
        "Comparison between regression and classification\non the same problem with r2_score"
    )
    ax[1].plot(rnd2, acc_reg2, label="regression")
    ax[1].plot(rnd2, acc_clr2, label="classification")
    ax[1].set_title(
        "Comparison between regression and classification\n"
        "on the same problem with accuracy_score"
    )

.. image-sg:: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_005.png
   :alt: Comparison between regression and classification on the same problem with r2_score, Comparison between regression and classification on the same problem with accuracy_score
   :srcset: /auto_examples/images/sphx_glr_plot_sklearn_transformed_target_005.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Text(0.5, 1.0, 'Comparison between regression and classification\non the same problem with accuracy_score')

.. GENERATED FROM PYTHON SOURCE LINES 290-291

Visually, it is still hard to tell the two apart.

.. GENERATED FROM PYTHON SOURCE LINES 291-294

.. code-block:: Python

    numpy.sign(perf_reg2 - perf_clr2).sum()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    np.float64(6.0)

.. GENERATED FROM PYTHON SOURCE LINES 296-301

.. code-block:: Python

    numpy.sign(acc_reg2 - acc_clr2).sum()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    np.float64(6.0)

.. GENERATED FROM PYTHON SOURCE LINES 302-311

As strange as it seems, the regression wins on the Iris data. But... there is always a but.

The but...
----------

There is one tiny difference between regression and classification: classification is immune to a permutation of the labels.

.. GENERATED FROM PYTHON SOURCE LINES 311-316

.. code-block:: Python

    data = load_iris()
    X, y = data.data, data.target
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=12)

.. GENERATED FROM PYTHON SOURCE LINES 318-324

..
.. code-block:: Python

    reg = LinearRegression()
    reg.fit(X_train, y_train)

    log = LogisticRegression()
    log.fit(X_train, y_train)

.. raw:: html
LogisticRegression()


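The ``float2int`` helper defined earlier turns a real-valued prediction into the nearest integer label by shifting by 0.5 and truncating. A quick standalone check of its behavior (note it is only correct for non-negative values, which holds here since the labels are 0, 1 and 2):

```python
import numpy

def float2int(y):
    # add 0.5 then truncate: rounds non-negative values to the nearest integer
    return numpy.int32(y + 0.5)

preds = numpy.array([0.2, 0.9, 1.4, 1.6, 2.3])
rounded = float2int(preds)
print(rounded)
```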
.. GENERATED FROM PYTHON SOURCE LINES 326-334 .. code-block:: Python ( r2_score(y_test, fct2float2int(reg.predict(X_test))), r2_score(y_test, log.predict(X_test)), ) .. rst-class:: sphx-glr-script-out .. code-block:: none (1.0, 0.9609053497942387) .. GENERATED FROM PYTHON SOURCE LINES 335-336 Let's permute between 1 and 2. .. GENERATED FROM PYTHON SOURCE LINES 336-347 .. code-block:: Python def permute(y): y2 = y.copy() y2[y == 1] = 2 y2[y == 2] = 1 return y2 y_train_permuted = permute(y_train) y_test_permuted = permute(y_test) .. GENERATED FROM PYTHON SOURCE LINES 349-355 .. code-block:: Python regp = LinearRegression() regp.fit(X_train, y_train_permuted) logp = LogisticRegression() logp.fit(X_train, y_train_permuted) .. raw:: html
LogisticRegression()


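Why the classifier shrugs off the permutation can be shown directly: train on permuted labels, map the predictions back through the inverse permutation, and (up to solver noise) the same classifier comes back. A standalone sketch on Iris, independent of the variables above:

```python
import numpy
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=12)

perm = {0: 0, 1: 2, 2: 1}                  # swap labels 1 and 2
inv = {v: k for k, v in perm.items()}
y_train_perm = numpy.array([perm[v] for v in y_train])

log = LogisticRegression(max_iter=500).fit(X_train, y_train)
logp = LogisticRegression(max_iter=500).fit(X_train, y_train_perm)

acc = accuracy_score(y_test, log.predict(X_test))
# undo the permutation on the predictions before scoring
acc_perm = accuracy_score(
    y_test, numpy.array([inv[v] for v in logp.predict(X_test)])
)
print(acc, acc_perm)
```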
.. GENERATED FROM PYTHON SOURCE LINES 357-365

.. code-block:: Python

    (
        r2_score(y_test_permuted, fct2float2int(regp.predict(X_test))),
        r2_score(y_test_permuted, logp.predict(X_test)),
    )

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (0.43952802359882015, 0.9626352015732547)

.. GENERATED FROM PYTHON SOURCE LINES 366-368

The classifier produces almost the same performance; the regressor seems off. Let's check whether that was just luck.

.. GENERATED FROM PYTHON SOURCE LINES 368-391

.. code-block:: Python

    rows = []
    for _i in range(10):
        regpt = TransformedTargetRegressor2(LinearRegression(), transformer="permute")
        regpt.fit(X_train, y_train)
        logpt = TransformedTargetClassifier2(
            LogisticRegression(max_iter=200), transformer="permute"
        )
        logpt.fit(X_train, y_train)
        rows.append(
            {
                "reg_perm": regpt.transformer_.permutation_,
                "reg_score": r2_score(y_test, fct2float2int(regpt.predict(X_test))),
                "log_perm": logpt.transformer_.permutation_,
                "log_score": r2_score(y_test, logpt.predict(X_test)),
            }
        )

    df = DataFrame(rows)
    df

.. raw:: html
reg_perm reg_score log_perm log_score
0 {0: 0, 1: 1, 2: 2} 1.000000 {0: 0, 1: 1, 2: 2} 0.960905
1 {0: 0, 1: 2, 2: 1} 0.061728 {0: 2, 1: 1, 2: 0} 0.960905
2 {0: 1, 1: 2, 2: 0} -0.759259 {0: 1, 1: 2, 2: 0} 0.960905
3 {0: 2, 1: 0, 2: 1} 0.061728 {0: 2, 1: 1, 2: 0} 0.960905
4 {0: 0, 1: 2, 2: 1} 0.061728 {0: 1, 1: 2, 2: 0} 0.960905
5 {0: 0, 1: 2, 2: 1} 0.061728 {0: 0, 1: 2, 2: 1} 0.960905
6 {0: 2, 1: 0, 2: 1} 0.061728 {0: 0, 1: 1, 2: 2} 0.960905
7 {0: 1, 1: 2, 2: 0} -0.759259 {0: 1, 1: 0, 2: 2} 0.960905
8 {0: 0, 1: 2, 2: 1} 0.061728 {0: 2, 1: 0, 2: 1} 0.960905
9 {0: 1, 1: 2, 2: 0} -0.759259 {0: 1, 1: 2, 2: 0} 0.960905


.. GENERATED FROM PYTHON SOURCE LINES 392-393

The classifier's performance is constant whatever the permutation; the regressor's is not.

.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 4.313 seconds)

.. _sphx_glr_download_auto_examples_plot_sklearn_transformed_target.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_sklearn_transformed_target.ipynb `

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_sklearn_transformed_target.py `

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_sklearn_transformed_target.zip `

.. only:: html

  .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_