SSJ API Documentation
Stochastic Simulation in Java
umontreal.ssj.functionfit.LeastSquares Class Reference

This class implements different linear regression models, using the least squares method to estimate the regression coefficients. More...

Static Public Member Functions

static double[] calcCoefficients (double[] X, double[] Y)
 Computes the coefficients of a simple linear regression using the least squares method.
static double[] calcCoefficients (double[] X, double[] Y, int deg)
 Computes the coefficients of a polynomial regression of degree deg using the least squares method.
static double[] calcCoefficients0 (double[][] X, double[] Y)
 Computes the coefficients of a multiple linear regression (with an intercept term) using the least squares method.
static double[] calcCoefficients (double[][] X, double[] Y)
 Computes the coefficients of a multiple linear regression (without an intercept term) using the least squares method.

Detailed Description

This class implements different linear regression models, using the least squares method to estimate the regression coefficients.

Given input data \(x_{ij}\) and responses \(y_i\), one wants to find the coefficients \(\beta_j\) that minimize the residual norm (in matrix notation)

\[ r = \min_{\beta}\| Y - X\beta\|_2, \]

where \(\|\cdot\|_2\) denotes the Euclidean (\(L_2\)) norm. A particular case is

\[ r = \min_{\beta}\sum_i \left(y_i - \beta_0 - \sum_{j=1}^k \beta_j x_{ij}\right)^2 \]

for \(k\) regressor variables \(x_j\). The well-known case of a single variable \(x\) is

\[ r = \min_{\alpha,\beta} \sum_i \left(y_i - \alpha- \beta x_i\right)^2. \]

Sometimes, one wants to use a basis of general functions \(\psi_j(t)\) with a minimization of the form

\[ r = \min_{\beta}\sum_i \left(y_i - \sum_{j=1}^k \beta_j\psi_j(t_i)\right)^2. \]

For example, one could take \(\psi_j(t) = e^{-\lambda_j t}\) or some other functions. In that case, one has to choose the points \(t_i\) at which to evaluate the basis functions, and call one of the methods below with \(x_{ij} = \psi_j(t_i)\).
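As a sketch of that setup, the following plain-Java snippet builds the design matrix \(x_{ij} = \psi_j(t_i)\) for the exponential basis \(\psi_j(t) = e^{-\lambda_j t}\). The class name BasisDesign and the \(\lambda_j\) values are illustrative, not part of SSJ; the resulting matrix is what one would pass as X to the matrix overloads documented below.

```java
// Illustrative construction of the design matrix x_ij = psi_j(t_i)
// for the exponential basis psi_j(t) = exp(-lambda_j * t).
public class BasisDesign {
    /** Returns the n-by-k matrix with entries exp(-lambda[j] * t[i]). */
    public static double[][] designMatrix(double[] t, double[] lambda) {
        double[][] x = new double[t.length][lambda.length];
        for (int i = 0; i < t.length; i++)
            for (int j = 0; j < lambda.length; j++)
                x[i][j] = Math.exp(-lambda[j] * t[i]);
        return x;
    }

    public static void main(String[] args) {
        double[] t = { 0.0, 1.0, 2.0 };       // evaluation points t_i (made up)
        double[] lambda = { 0.5, 1.0 };       // decay rates lambda_j (made up)
        double[][] x = designMatrix(t, lambda);
        // Row i holds psi_0(t_i), psi_1(t_i); e.g. row 1 is e^{-0.5}, e^{-1}.
        System.out.println(x[1][0] + " " + x[1][1]);
    }
}
```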


Definition at line 56 of file LeastSquares.java.

Member Function Documentation

◆ calcCoefficients() [1/3]

static double[] umontreal.ssj.functionfit.LeastSquares.calcCoefficients (double[] X, double[] Y)

Computes the regression coefficients using the least squares method.

This is a simple linear regression with two regression coefficients, \(\alpha\) and \(\beta\). The model is

\[ y = \alpha+ \beta x. \]

Given the \(n\) data points \((X_i, Y_i)\), \(i=0,1,…,(n-1)\), the method computes and returns the array \([\alpha, \beta]\).

Parameters
X: the regressor variables
Y: the response
Returns
the regression coefficients

Definition at line 104 of file LeastSquares.java.
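For reference, the closed-form solution of this two-coefficient problem can be sketched in plain Java. The class SimpleOLS below is illustrative and independent of SSJ, but it computes the same array \([\alpha, \beta]\) that this method is documented to return.

```java
// Illustrative closed-form simple linear regression:
// beta = Sxy / Sxx and alpha = ybar - beta * xbar.
public class SimpleOLS {
    /** Returns {alpha, beta} minimizing sum_i (y_i - alpha - beta*x_i)^2. */
    public static double[] fit(double[] x, double[] y) {
        int n = x.length;
        double xbar = 0, ybar = 0;
        for (int i = 0; i < n; i++) { xbar += x[i]; ybar += y[i]; }
        xbar /= n; ybar /= n;
        double sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sxx += (x[i] - xbar) * (x[i] - xbar);
            sxy += (x[i] - xbar) * (y[i] - ybar);
        }
        double beta = sxy / sxx;
        double alpha = ybar - beta * xbar;
        return new double[] { alpha, beta };
    }

    public static void main(String[] args) {
        // Points on the exact line y = 2 + 3x should be recovered exactly.
        double[] x = { 0, 1, 2, 3 };
        double[] y = { 2, 5, 8, 11 };
        double[] ab = fit(x, y);
        System.out.println("alpha=" + ab[0] + " beta=" + ab[1]);
    }
}
```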

◆ calcCoefficients() [2/3]

static double[] umontreal.ssj.functionfit.LeastSquares.calcCoefficients (double[] X, double[] Y, int deg)

Computes the regression coefficients using the least squares method.

This is a linear regression with a polynomial of degree deg \(= k\) and \(k+1\) regression coefficients \(\beta_j\). The model is

\[ y = \beta_0 + \sum_{j=1}^k \beta_j x^j. \]

Given the \(n\) data points \((X_i, Y_i)\), \(i=0,1,…,(n-1)\), the method computes and returns the array \([\beta_0, \beta_1, …, \beta_k]\). Restriction: \(n > k\).

Parameters
X: the regressor variables
Y: the response
deg: degree of the polynomial
Returns
the regression coefficients

Definition at line 128 of file LeastSquares.java.
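The underlying computation can be sketched via the normal equations \((X^{\mathsf{T}}X)\beta = X^{\mathsf{T}}y\), where \(X\) is the Vandermonde matrix with entries \(X_{ij} = x_i^j\). The PolyOLS class below is an illustrative stand-alone implementation of this approach, not SSJ's actual code.

```java
// Illustrative polynomial least squares via the normal equations
// (X^T X) beta = X^T y on the Vandermonde matrix X[i][j] = x_i^j.
public class PolyOLS {
    /** Returns {beta_0, ..., beta_deg}; requires x.length > deg. */
    public static double[] fit(double[] x, double[] y, int deg) {
        int n = x.length, m = deg + 1;
        double[][] a = new double[m][m + 1];   // augmented system [X^T X | X^T y]
        for (int i = 0; i < n; i++) {
            double[] pow = new double[m];      // powers 1, x_i, x_i^2, ...
            pow[0] = 1;
            for (int j = 1; j < m; j++) pow[j] = pow[j - 1] * x[i];
            for (int r = 0; r < m; r++) {
                for (int c = 0; c < m; c++) a[r][c] += pow[r] * pow[c];
                a[r][m] += pow[r] * y[i];
            }
        }
        // Gaussian elimination with partial pivoting.
        for (int p = 0; p < m; p++) {
            int best = p;
            for (int r = p + 1; r < m; r++)
                if (Math.abs(a[r][p]) > Math.abs(a[best][p])) best = r;
            double[] tmp = a[p]; a[p] = a[best]; a[best] = tmp;
            for (int r = p + 1; r < m; r++) {
                double f = a[r][p] / a[p][p];
                for (int c = p; c <= m; c++) a[r][c] -= f * a[p][c];
            }
        }
        // Back-substitution.
        double[] beta = new double[m];
        for (int r = m - 1; r >= 0; r--) {
            double s = a[r][m];
            for (int c = r + 1; c < m; c++) s -= a[r][c] * beta[c];
            beta[r] = s / a[r][r];
        }
        return beta;
    }

    public static void main(String[] args) {
        // Points on y = 1 + 2x + 3x^2; a degree-2 fit should recover [1, 2, 3].
        double[] x = { -1, 0, 1, 2 };
        double[] y = { 2, 1, 6, 17 };
        double[] b = fit(x, y, 2);
        System.out.println(b[0] + " " + b[1] + " " + b[2]);
    }
}
```

Note the restriction \(n > k\) from the documentation above: with fewer points than coefficients, the normal-equations matrix is singular.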

◆ calcCoefficients() [3/3]

static double[] umontreal.ssj.functionfit.LeastSquares.calcCoefficients (double[][] X, double[] Y)

Computes the regression coefficients using the least squares method.

This is a model for multiple linear regression. There are \(k\) regression coefficients \(\beta_j\), \(j=0,1,…,(k-1)\), and \(k\) regressor variables \(x_j\). The model is

\[ y = \sum_{j=0}^{k-1} \beta_j x_j. \]

There are \(n\) data points \(Y_i\), \(X_{ij}\), \(i=0,1,…,(n-1)\), and each \(X_i\) is a \(k\)-dimensional point. Given the response Y[i] and the regressor variables X[i][j], \(\mathtt{i} =0,1,…,(n-1)\), \(\mathtt{j} =0,1,…,(k-1)\), the method computes and returns the array \([\beta_0, \beta_1, …, \beta_{k-1}]\). Restriction: \(n > k\).

Parameters
X: the regressor variables
Y: the response
Returns
the regression coefficients

Definition at line 220 of file LeastSquares.java.
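For the special case of \(k=2\) regressors, this no-intercept model can be sketched stand-alone by solving the \(2\times2\) normal equations \((X^{\mathsf{T}}X)\beta = X^{\mathsf{T}}y\) directly; MultiOLS is a made-up name for illustration, not part of SSJ.

```java
// Illustrative two-regressor, no-intercept least squares: solves the
// 2x2 normal equations (X^T X) beta = X^T y by Cramer's rule.
public class MultiOLS {
    /** Returns {beta_0, beta_1} for the model y = beta_0*x_0 + beta_1*x_1. */
    public static double[] fit(double[][] x, double[] y) {
        double s00 = 0, s01 = 0, s11 = 0, t0 = 0, t1 = 0;
        for (int i = 0; i < x.length; i++) {
            s00 += x[i][0] * x[i][0];          // entries of X^T X
            s01 += x[i][0] * x[i][1];
            s11 += x[i][1] * x[i][1];
            t0  += x[i][0] * y[i];             // entries of X^T y
            t1  += x[i][1] * y[i];
        }
        double det = s00 * s11 - s01 * s01;
        return new double[] { (t0 * s11 - t1 * s01) / det,
                              (t1 * s00 - t0 * s01) / det };
    }

    public static void main(String[] args) {
        // Responses generated from y = 1.5*x_0 + 2.5*x_1 (no intercept).
        double[][] x = { { 1, 0 }, { 0, 1 }, { 1, 1 }, { 2, 1 } };
        double[] y = { 1.5, 2.5, 4.0, 5.5 };
        double[] b = fit(x, y);
        System.out.println(b[0] + " " + b[1]);
    }
}
```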

◆ calcCoefficients0()

static double[] umontreal.ssj.functionfit.LeastSquares.calcCoefficients0 (double[][] X, double[] Y)

Computes the regression coefficients using the least squares method.

This is a model for multiple linear regression. There are \(k+1\) regression coefficients \(\beta_j\) and \(k\) regressor variables \(x_j\). The model is

\[ y = \beta_0 + \sum_{j=1}^k \beta_j x_j. \]

There are \(n\) data points \(Y_i\), \(X_{ij}\), \(i=0,1,…,(n-1)\), and each \(X_i\) is a \(k\)-dimensional point. Given the response Y[i] and the regressor variables X[i][j], \(\mathtt{i} =0,1,…,(n-1)\), \(\mathtt{j} =0,1,…,(k-1)\), the method computes and returns the array \([\beta_0, \beta_1, …, \beta_k]\). Restriction: \(n > k+1\).

Parameters
X: the regressor variables
Y: the response
Returns
the regression coefficients

Definition at line 178 of file LeastSquares.java.
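A remark on how this intercept model relates to the no-intercept overload above: mathematically, fitting \(y = \beta_0 + \sum_{j=1}^k \beta_j x_j\) is the same as fitting the no-intercept model on a design matrix augmented with a leading column of ones (the constant regressor carries \(\beta_0\)). The sketch below shows that augmentation; Augment is an illustrative name, not SSJ API.

```java
// Illustrative: prepend a column of ones to a design matrix, turning
// the intercept model into a no-intercept model with one extra regressor.
public class Augment {
    /** Returns the n-by-(k+1) matrix [1 | X]. */
    public static double[][] withOnesColumn(double[][] x) {
        double[][] z = new double[x.length][x[0].length + 1];
        for (int i = 0; i < x.length; i++) {
            z[i][0] = 1.0;                     // constant regressor for beta_0
            for (int j = 0; j < x[0].length; j++) z[i][j + 1] = x[i][j];
        }
        return z;
    }

    public static void main(String[] args) {
        double[][] x = { { 2.0, 3.0 }, { 4.0, 5.0 } };
        double[][] z = withOnesColumn(x);
        // Each row gains a leading 1.0: row 0 becomes 1.0, 2.0, 3.0.
        System.out.println(z[0][0] + " " + z[0][1] + " " + z[0][2]);
    }
}
```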


The documentation for this class was generated from the following file: LeastSquares.java