
.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/ensemble/plot_random_forest_regression_multioutput.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_ensemble_plot_random_forest_regression_multioutput.py>`
        to download the full example code or to run this example in your browser via JupyterLite or Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_ensemble_plot_random_forest_regression_multioutput.py:


============================================================
Comparing random forests and the multi-output meta estimator
============================================================

An example comparing multi-output regression with a random forest and
the :ref:`multioutput.MultiOutputRegressor <multiclass>` meta-estimator.

This example illustrates the use of the
:ref:`multioutput.MultiOutputRegressor <multiclass>` meta-estimator
to perform multi-output regression. A random forest regressor, which
supports multi-output regression natively, is used so the results can
be compared.

The random forest regressor can only ever predict values within the
range of the training observations, and its averaging pulls the
predictions towards zero for each target. As a result, the predictions
are biased towards the centre of the circle.
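This interpolation-only behaviour can be checked directly: a forest prediction is an average of training-target values stored in the leaves, so it can never exceed the largest target seen during fitting. A minimal sketch, using synthetic data unrelated to the example above:

```python
# Minimal sketch: a random forest cannot extrapolate beyond the range
# of the targets it was trained on, because each prediction is an
# average of training-target values.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.RandomState(0)
X_train = np.linspace(0, 10, 100).reshape(-1, 1)
y_train = 2.0 * X_train.ravel()  # targets lie in [0, 20]

forest = RandomForestRegressor(n_estimators=50, random_state=0)
forest.fit(X_train, y_train)

# Query a point far outside the training range of X; the prediction
# is capped at the maximum training target rather than following the
# linear trend.
pred = forest.predict([[100.0]])
print(pred[0] <= y_train.max())
```

The same mechanism explains the bias in the figure: targets near the edge of the circle are averaged with targets closer to the centre.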

Using a single underlying feature, each model learns both the
x and y coordinates as outputs.
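The two strategies differ in how they handle the targets: ``MultiOutputRegressor`` fits one independent clone of the base estimator per target column, while ``RandomForestRegressor`` models all targets jointly in a single forest. A minimal sketch, using small toy data chosen here for illustration:

```python
# Minimal sketch of the two multi-output strategies compared in this
# example: a per-target meta-estimator vs. a natively multi-output forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.RandomState(0)
X = rng.rand(50, 1)
y = np.column_stack([np.sin(4 * X).ravel(), np.cos(4 * X).ravel()])

# MultiOutputRegressor fits one independent forest per target column.
wrapped = MultiOutputRegressor(
    RandomForestRegressor(n_estimators=10, random_state=0)
)
wrapped.fit(X, y)
print(len(wrapped.estimators_))  # one fitted estimator per target: 2

# The native forest predicts both targets from a single model.
native = RandomForestRegressor(n_estimators=10, random_state=0)
native.fit(X, y)
print(native.predict(X[:3]).shape)  # (3, 2)
```

Because the wrapped version trains the forests independently, it cannot exploit correlations between the two targets; the native forest can, since each tree splits on a criterion aggregated over both outputs.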

.. GENERATED FROM PYTHON SOURCE LINES 23-97



.. image-sg:: /auto_examples/ensemble/images/sphx_glr_plot_random_forest_regression_multioutput_001.png
   :alt: Comparing random forests and the multi-output meta estimator
   :srcset: /auto_examples/ensemble/images/sphx_glr_plot_random_forest_regression_multioutput_001.png
   :class: sphx-glr-single-img





.. code-block:: Python


    # Authors: The scikit-learn developers
    # SPDX-License-Identifier: BSD-3-Clause

    import matplotlib.pyplot as plt
    import numpy as np

    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.multioutput import MultiOutputRegressor

    # Create a random dataset
    rng = np.random.RandomState(1)
    X = np.sort(200 * rng.rand(600, 1) - 100, axis=0)
    y = np.array([np.pi * np.sin(X).ravel(), np.pi * np.cos(X).ravel()]).T
    y += 0.5 - rng.rand(*y.shape)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, train_size=400, test_size=200, random_state=4
    )

    max_depth = 30
    regr_multirf = MultiOutputRegressor(
        RandomForestRegressor(n_estimators=100, max_depth=max_depth, random_state=0)
    )
    regr_multirf.fit(X_train, y_train)

    regr_rf = RandomForestRegressor(n_estimators=100, max_depth=max_depth, random_state=2)
    regr_rf.fit(X_train, y_train)

    # Predict on new data
    y_multirf = regr_multirf.predict(X_test)
    y_rf = regr_rf.predict(X_test)

    # Plot the results
    plt.figure()
    s = 50
    a = 0.4
    plt.scatter(
        y_test[:, 0],
        y_test[:, 1],
        edgecolor="k",
        c="navy",
        s=s,
        marker="s",
        alpha=a,
        label="Data",
    )
    plt.scatter(
        y_multirf[:, 0],
        y_multirf[:, 1],
        edgecolor="k",
        c="cornflowerblue",
        s=s,
        alpha=a,
        label="Multi RF score=%.2f" % regr_multirf.score(X_test, y_test),
    )
    plt.scatter(
        y_rf[:, 0],
        y_rf[:, 1],
        edgecolor="k",
        c="c",
        s=s,
        marker="^",
        alpha=a,
        label="RF score=%.2f" % regr_rf.score(X_test, y_test),
    )
    plt.xlim([-6, 6])
    plt.ylim([-6, 6])
    plt.xlabel("target 1")
    plt.ylabel("target 2")
    plt.title("Comparing random forests and the multi-output meta estimator")
    plt.legend()
    plt.show()


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 0.534 seconds)


.. _sphx_glr_download_auto_examples_ensemble_plot_random_forest_regression_multioutput.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/scikit-learn/scikit-learn/1.8.X?urlpath=lab/tree/notebooks/auto_examples/ensemble/plot_random_forest_regression_multioutput.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: lite-badge

      .. image:: images/jupyterlite_badge_logo.svg
        :target: ../../lite/lab/index.html?path=auto_examples/ensemble/plot_random_forest_regression_multioutput.ipynb
        :alt: Launch JupyterLite
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_random_forest_regression_multioutput.ipynb <plot_random_forest_regression_multioutput.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_random_forest_regression_multioutput.py <plot_random_forest_regression_multioutput.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_random_forest_regression_multioutput.zip <plot_random_forest_regression_multioutput.zip>`


.. include:: plot_random_forest_regression_multioutput.recommendations


.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_
