=================
Linear Regression
=================

:doc:`/WorkProcessClassifiers/GlobalAlgorithm/index` - :doc:`/WorkProcessClassifiers/OneDimensionalAlgorithm/index`

The *Linear Regression* algorithm aims to find the parameters :math:`p_0` and :math:`p_1` of a line, :math:`y = p_0 + p_1 \cdot t`\ , that best fits :math:`N` data points. The task is equivalent to solving the system of linear equations

.. math::

   Ap =
   \begin{bmatrix}
   1 & t_1 \\
   1 & t_2 \\
   \vdots & \vdots \\
   1 & t_N
   \end{bmatrix}
   \begin{bmatrix}
   p_0 \\
   p_1
   \end{bmatrix}
   =
   \begin{bmatrix}
   y_1 \\
   y_2 \\
   \vdots \\
   y_N
   \end{bmatrix}
   = Y.

The method of least squares is the most common way to determine the fitted parameters. If :math:`A` has full column rank, the least squares solution is

.. math::

   p = (A^T A)^{-1} A^T Y.
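The solution can be computed with any linear least squares routine. Below is a minimal sketch in Python with NumPy; the helper ``fit_line`` and the sample data are illustrative and not part of the tools listed further down. Instead of forming :math:`(A^T A)^{-1} A^T Y` explicitly, it calls ``numpy.linalg.lstsq``, which is numerically more robust.

.. code-block:: python

   import numpy as np

   def fit_line(t, y):
       """Fit y = p0 + p1 * t to N data points by linear least squares.

       Returns (p, y_hat) with p = [p0, p1] and the fitted values y_hat = A @ p.
       """
       t = np.asarray(t, dtype=float)
       y = np.asarray(y, dtype=float)
       # Design matrix A with rows [1, t_i], as in the system A p = Y above.
       A = np.column_stack((np.ones_like(t), t))
       # Solves min ||A p - Y||_2 without explicitly inverting A^T A.
       p, *_ = np.linalg.lstsq(A, y, rcond=None)
       return p, A @ p

   # Illustrative data: noisy samples of y = 1 + 2 t.
   t = np.linspace(0.0, 5.0, 20)
   y = 1.0 + 2.0 * t + 0.1 * np.random.default_rng(0).standard_normal(t.size)
   p, y_hat = fit_line(t, y)
   print(p)  # approximately [1.0, 2.0]

The same fit is produced by MATLAB's ``polyfit(t, y, 1)``, which returns the coefficients in descending order, i.e. :math:`[p_1, p_0]`.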
.. rubric:: Input Parameters

+-----------------+---------------------------------+--------------------------+-----------------------------------------------+---------+
| Parameter       | Type                            | Constraint               | Description                                   | Remarks |
+=================+=================================+==========================+===============================================+=========+
| :math:`[t_i]`   | :math:`[t_i] \in \mathbb R^N`   | :math:`N \in \mathbb{N}` | Vector of sampling points of length :math:`N` |         |
+-----------------+---------------------------------+--------------------------+-----------------------------------------------+---------+
| :math:`Y`       | :math:`Y \in \mathbb R^N`       | :math:`N \in \mathbb{N}` | Input data vector of length :math:`N`         |         |
+-----------------+---------------------------------+--------------------------+-----------------------------------------------+---------+

.. rubric:: Output Parameters

+-----------------+---------------------------------+--------------------------+-----------------------------------------------+---------+
| Parameter       | Type                            | Constraint               | Description                                   | Remarks |
+=================+=================================+==========================+===============================================+=========+
| :math:`p`       | :math:`p \in \mathbb R^2`       |                          | Fitted parameter vector :math:`(p_0, p_1)`    |         |
+-----------------+---------------------------------+--------------------------+-----------------------------------------------+---------+
| :math:`\hat{Y}` | :math:`\hat{Y} \in \mathbb R^N` | :math:`N \in \mathbb{N}` | Output data vector of length :math:`N`        |         |
+-----------------+---------------------------------+--------------------------+-----------------------------------------------+---------+

.. rubric:: Tool Support

* :doc:`/Tools/MapleTool/index`

  For details refer to the online documentation of the function ``LinearFit``.

* :doc:`/Tools/MatlabTool/index`

  For details refer to the online documentation of the function ``polyfit``.

.. rubric:: Single Steps using the Algorithm

* :doc:`/DataPreprocessing/DataReduction/NumerosityReduction/DataReductionWithLinearRegression/index`
* :doc:`/DataPreprocessing/DataCleaning/HandlingImproperValues/ReconstructingImproperValues/ReconstructingImproperValuesWithLinearRegression/index`

.. rubric:: References

- C. R. Rao, H. Toutenburg, A. Fieger, C. Heumann, T. Nittner and S. Scheid, Linear Models: Least Squares and Alternatives, Springer Series in Statistics, pp. 23-33, 1999.
- R. C. Aster, B. Borchers and C. H. Thurber, Parameter Estimation and Inverse Problems, Academic Press, 2005.