g02ch performs a multiple linear regression with no constant on a set of variables whose sums of squares and cross-products about zero and correlation-like coefficients are given.
Syntax
C# 

public static void g02ch(
int n,
int k1,
int k,
double[,] sspz,
double[,] rz,
double[] result,
double[,] coef,
double[,] rznv,
double[,] cz,
out int ifail
) 
Visual Basic 

Public Shared Sub g02ch ( _
n As Integer, _
k1 As Integer, _
k As Integer, _
sspz As Double(,), _
rz As Double(,), _
result As Double(), _
coef As Double(,), _
rznv As Double(,), _
cz As Double(,), _
<OutAttribute> ByRef ifail As Integer _
) 
Visual C++ 

public:
static void g02ch(
int n,
int k1,
int k,
array<double,2>^ sspz,
array<double,2>^ rz,
array<double>^ result,
array<double,2>^ coef,
array<double,2>^ rznv,
array<double,2>^ cz,
[OutAttribute] int% ifail
) 
Parameters
 n
 Type: System.Int32
On entry: $n$, the number of cases used in calculating the sums of squares and cross-products and correlation-like coefficients.
 k1
 Type: System.Int32
On entry: the total number of variables, independent and dependent $\left(k+1\right)$, in the regression.
Constraint:
$2\le {\mathbf{k1}}\le {\mathbf{n}}$.
 k
 Type: System.Int32
On entry: the number of independent variables $k$ in the regression.
Constraint:
${\mathbf{k}}={\mathbf{k1}}-1$.
 sspz
 Type: System.Double[,]
An array of size [dim1, k1]
Note: dim1 must satisfy the constraint: $\mathrm{dim1}\ge {\mathbf{k1}}$
On entry: ${\mathbf{sspz}}[\mathit{i}-1,\mathit{j}-1]$ must be set to ${\stackrel{~}{S}}_{\mathit{i}\mathit{j}}$, the sum of cross-products about zero for the $\mathit{i}$th and $\mathit{j}$th variables, for $\mathit{i}=1,2,\dots ,k+1$ and $\mathit{j}=1,2,\dots ,k+1$; terms involving the dependent variable appear in row $k+1$ and column $k+1$.
 rz
 Type: System.Double[,]
An array of size [dim1, k1]
Note: dim1 must satisfy the constraint: $\mathrm{dim1}\ge {\mathbf{k1}}$
On entry: ${\mathbf{rz}}[\mathit{i}-1,\mathit{j}-1]$ must be set to ${\stackrel{~}{R}}_{\mathit{i}\mathit{j}}$, the correlation-like coefficient for the $\mathit{i}$th and $\mathit{j}$th variables, for $\mathit{i}=1,2,\dots ,k+1$ and $\mathit{j}=1,2,\dots ,k+1$; coefficients involving the dependent variable appear in row $k+1$ and column $k+1$.
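As an illustration of how these two input matrices relate to raw data, the following sketch builds them from an $n$ by $\left(k+1\right)$ data array with Python/NumPy. The array names (`data`, `sspz`, `rz`) and the example values are assumptions for illustration, not part of the NAG interface; the column order follows the documentation, with the dependent variable last.

```python
import numpy as np

# Hypothetical data: n = 4 cases, k = 2 independent variables, y in the last column.
data = np.array([[1.0, 0.5, 2.1],
                 [2.0, 1.5, 4.2],
                 [3.0, 2.0, 6.1],
                 [4.0, 3.5, 8.3]])

# S~_ij: sums of squares and cross-products about zero (no mean subtraction).
sspz = data.T @ data

# R~_ij: correlation-like coefficients, S~_ij / sqrt(S~_ii * S~_jj).
d = np.sqrt(np.diag(sspz))
rz = sspz / np.outer(d, d)
```

Because no means are subtracted, the diagonal of `rz` is exactly 1 and the matrix is symmetric, like an ordinary correlation matrix.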
 result
 Type: System.Double[]
An array of size [$13$]
On exit: the following information:
${\mathbf{result}}\left[0\right]$  $SSR$, the sum of squares attributable to the regression; 
${\mathbf{result}}\left[1\right]$  $DFR$, the degrees of freedom attributable to the regression; 
${\mathbf{result}}\left[2\right]$  $MSR$, the mean square attributable to the regression; 
${\mathbf{result}}\left[3\right]$  $F$, the $F$ value for the analysis of variance; 
${\mathbf{result}}\left[4\right]$  $SSD$, the sum of squares of deviations about the regression; 
${\mathbf{result}}\left[5\right]$  $DFD$, the degrees of freedom of deviations about the regression; 
${\mathbf{result}}\left[6\right]$  $MSD$, the mean square of deviations about the regression; 
${\mathbf{result}}\left[7\right]$  $SST$, the total sum of squares; 
${\mathbf{result}}\left[8\right]$  $DFT$, the total degrees of freedom; 
${\mathbf{result}}\left[9\right]$  $s$, the standard error estimate; 
${\mathbf{result}}\left[10\right]$  $R$, the coefficient of multiple correlation; 
${\mathbf{result}}\left[11\right]$  ${R}^{2}$, the coefficient of multiple determination; 
${\mathbf{result}}\left[12\right]$  ${\bar{R}}^{2}$, the coefficient of multiple determination corrected for the degrees of freedom. 
 coef
 Type: System.Double[,]
An array of size [dim1, $3$]
Note: dim1 must satisfy the constraint:
$\mathrm{dim1}\ge {\mathbf{k}}$
On exit: for
$i=1,2,\dots ,k$, the following information:
 ${\mathbf{coef}}[i-1,0]$
 ${b}_{i}$, the regression coefficient for the $i$th variable.
 ${\mathbf{coef}}[i-1,1]$
 $se\left({b}_{i}\right)$, the standard error of the regression coefficient for the $i$th variable.
 ${\mathbf{coef}}[i-1,2]$
 $t\left({b}_{i}\right)$, the $t$ value of the regression coefficient for the $i$th variable.
 rznv
 Type: System.Double[,]
An array of size [dim1, k]
Note: dim1 must satisfy the constraint: $\mathrm{dim1}\ge {\mathbf{k}}$
On exit: the inverse of the matrix of correlation-like coefficients for the independent variables; that is, the inverse of the matrix consisting of the first $k$ rows and columns of rz.
 cz
 Type: System.Double[,]
An array of size [dim1, k]
Note: dim1 must satisfy the constraint: $\mathrm{dim1}\ge {\mathbf{k}}$
On exit: the modified inverse matrix, $C$, where
${c}_{ij}={\stackrel{~}{R}}_{ij}{\stackrel{~}{r}}^{ij}/{\stackrel{~}{S}}_{ij}\text{,}\quad i,j=1,2,\dots ,k\text{.}$
 ifail
 Type: System.Int32%
On exit:
${\mathbf{ifail}}={0}$ unless the method detects an error or a warning has been flagged (see
[Error Indicators and Warnings]).
Description
g02ch fits a curve of the form
$y={b}_{1}{x}_{1}+{b}_{2}{x}_{2}+\cdots +{b}_{k}{x}_{k}$
to the data points
$\left({x}_{11},{x}_{21},\dots ,{x}_{k1},{y}_{1}\right),\left({x}_{12},{x}_{22},\dots ,{x}_{k2},{y}_{2}\right),\dots ,\left({x}_{1n},{x}_{2n},\dots ,{x}_{kn},{y}_{n}\right)$
such that
${y}_{i}={b}_{1}{x}_{1i}+{b}_{2}{x}_{2i}+\cdots +{b}_{k}{x}_{ki}+{e}_{i}\text{,}\quad i=1,2,\dots ,n\text{.}$
The method calculates the regression coefficients, ${b}_{1},{b}_{2},\dots ,{b}_{k}$, (and various other statistical quantities) by minimizing
$\sum _{i=1}^{n}{e}_{i}^{2}\text{.}$
The actual data values
$\left({x}_{1i},{x}_{2i},\dots ,{x}_{ki},{y}_{i}\right)$ are not provided as input to the method. Instead, input to the method consists of:
(i) 
The number of cases, $n$, on which the regression is based. 
(ii) 
The total number of variables, dependent and independent, in the regression, $\left(k+1\right)$. 
(iii) 
The number of independent variables in the regression, $k$. 
(iv) 
The $\left(k+1\right)$ by $\left(k+1\right)$ matrix $\left[{\stackrel{~}{S}}_{ij}\right]$ of sums of squares and cross-products about zero of all the variables in the regression; the terms involving the dependent variable, $y$, appear in the $\left(k+1\right)$th row and column. 
(v) 
The $\left(k+1\right)$ by $\left(k+1\right)$ matrix $\left[{\stackrel{~}{R}}_{ij}\right]$ of correlation-like coefficients for all the variables in the regression; the correlations involving the dependent variable, $y$, appear in the $\left(k+1\right)$th row and column. 
The quantities calculated are:
(a) 
The inverse of the $k$ by $k$ partition of the matrix of correlation-like coefficients, $\left[{\stackrel{~}{R}}_{ij}\right]$, involving only the independent variables. The inverse is obtained using an accurate method which assumes that this submatrix is positive definite (see [Further Comments]). 
(b) 
The modified matrix, $C=\left[{c}_{ij}\right]$, where
${c}_{ij}=\frac{{\stackrel{~}{R}}_{ij}{\stackrel{~}{r}}^{ij}}{{\stackrel{~}{S}}_{ij}}\text{,}\quad i,j=1,2,\dots ,k\text{,}$
where ${\stackrel{~}{r}}^{ij}$ is the $\left(i,j\right)$th element of the inverse matrix of $\left[{\stackrel{~}{R}}_{ij}\right]$ as described in (a) above. Each element of $C$ is thus the corresponding element of the matrix of correlation-like coefficients multiplied by the corresponding element of the inverse of this matrix, divided by the corresponding element of the matrix of sums of squares and cross-products about zero. 
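Steps (a) and (b) can be sketched numerically as follows. The variable names (`sspz`, `rz`, `rznv`, `cz`) and the example matrix are illustrative assumptions, not the NAG call itself; `np.linalg.inv` stands in for the inversion routine the library actually uses.

```python
import numpy as np

k = 2
# Hypothetical S~ matrix; terms involving y sit in the last row/column.
sspz = np.array([[30.0, 22.0, 60.0],
                 [22.0, 17.5, 45.0],
                 [60.0, 45.0, 122.0]])
d = np.sqrt(np.diag(sspz))
rz = sspz / np.outer(d, d)            # correlation-like coefficients

# (a) inverse of the k-by-k partition involving only the independent variables
rznv = np.linalg.inv(rz[:k, :k])

# (b) c_ij = R~_ij * r~^ij / S~_ij, elementwise
cz = rz[:k, :k] * rznv / sspz[:k, :k]
```

Note that, elementwise, ${\stackrel{~}{R}}_{ij}{\stackrel{~}{r}}^{ij}/{\stackrel{~}{S}}_{ij}={\stackrel{~}{r}}^{ij}/\sqrt{{\stackrel{~}{S}}_{ii}{\stackrel{~}{S}}_{jj}}$, so $C$ is exactly the inverse of the $k$ by $k$ partition of $\left[{\stackrel{~}{S}}_{ij}\right]$.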
(c) 
The regression coefficients:
${b}_{i}=\sum _{j=1}^{k}{c}_{ij}{\stackrel{~}{S}}_{j\left(k+1\right)}\text{,}\quad i=1,2,\dots ,k\text{,}$
where ${\stackrel{~}{S}}_{j\left(k+1\right)}$ is the sum of cross-products about zero for the independent variable ${x}_{j}$ and the dependent variable $y$. 
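Since $C$ equals the inverse of the $k$ by $k$ partition of $\left[{\stackrel{~}{S}}_{ij}\right]$, the coefficients solve the normal equations for regression through the origin. A sketch with a hypothetical example matrix (the names `sspz` and `b` are assumptions):

```python
import numpy as np

k = 2
sspz = np.array([[30.0, 22.0, 60.0],
                 [22.0, 17.5, 45.0],
                 [60.0, 45.0, 122.0]])

cz = np.linalg.inv(sspz[:k, :k])      # the modified matrix C
b = cz @ sspz[:k, k]                  # b_i = sum_j c_ij * S~_{j,k+1}
```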
(d) 
The sum of squares attributable to the regression, $SSR$, the sum of squares of deviations about the regression, $SSD$, and the total sum of squares, $SST$:
 $SST={\stackrel{~}{S}}_{\left(k+1\right)\left(k+1\right)}$, the sum of squares about zero for the dependent variable, $y$;
 $SSR={\displaystyle \sum _{j=1}^{k}}{b}_{j}{\stackrel{~}{S}}_{j\left(k+1\right)}\text{;}\quad SSD=SST-SSR\text{.}$
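A numerical sketch of these sums of squares, continuing the hypothetical example matrix used above (the names are illustrative, not the NAG output array):

```python
import numpy as np

k = 2
sspz = np.array([[30.0, 22.0, 60.0],
                 [22.0, 17.5, 45.0],
                 [60.0, 45.0, 122.0]])
b = np.linalg.solve(sspz[:k, :k], sspz[:k, k])

sst = sspz[k, k]            # SST: sum of squares about zero of y
ssr = b @ sspz[:k, k]       # SSR: sum_j b_j * S~_{j,k+1}
ssd = sst - ssr             # SSD: deviations about the regression
```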

(e) 
The degrees of freedom attributable to the regression, $DFR$, the degrees of freedom of deviations about the regression, $DFD$, and the total degrees of freedom, $DFT$:
$DFT=n\text{;}\quad DFR=k\text{;}\quad DFD=n-k\text{.}$

(f) 
The mean square attributable to the regression, $MSR$, and the mean square of deviations about the regression, $MSD$:
$MSR=SSR/DFR\text{;}\quad MSD=SSD/DFD\text{.}$

(g) 
The $F$ value for the analysis of variance:
$F=MSR/MSD\text{.}$

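Steps (e) to (g) can be sketched directly; note that with no constant term the total degrees of freedom equal $n$. The values of `n`, `k`, `ssr` and `ssd` below are illustrative placeholders:

```python
n, k = 4, 2
ssr, ssd = 120.73, 1.27      # hypothetical sums of squares

dfr, dfd, dft = k, n - k, n  # degrees of freedom (no constant term)
msr = ssr / dfr              # mean square for the regression
msd = ssd / dfd              # mean square of deviations
f = msr / msd                # F value for the analysis of variance
```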
(h) 
The standard error estimate:
$s=\sqrt{MSD}\text{.}$

(i) 
The coefficient of multiple correlation, $R$, the coefficient of multiple determination, ${R}^{2}$, and the coefficient of multiple determination corrected for the degrees of freedom, ${\bar{R}}^{2}$:
${R}^{2}=\frac{SSR}{SST}\text{;}\quad R=\sqrt{{R}^{2}}\text{;}\quad {\bar{R}}^{2}=1-\left(1-{R}^{2}\right)\frac{DFT}{DFD}\text{.}$

(j) 
The standard error of the regression coefficients:
$se\left({b}_{j}\right)=\sqrt{{c}_{jj}MSD}\text{,}\quad j=1,2,\dots ,k\text{.}$

(k) 
The $t$ values for the regression coefficients:
$t\left({b}_{j}\right)=\frac{{b}_{j}}{se\left({b}_{j}\right)}\text{,}\quad j=1,2,\dots ,k\text{.}$

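Steps (h) to (k) can be sketched together using the illustrative quantities from the running example; ${c}_{jj}$ is a diagonal element of the modified matrix $C$. All names and values here are assumptions for illustration:

```python
import math

n, k = 4, 2
sst, ssr = 122.0, 120.73              # hypothetical sums of squares
ssd = sst - ssr
dft, dfd = n, n - k

msd = ssd / dfd
s = math.sqrt(msd)                     # (h) standard error estimate
r2 = ssr / sst                         # (i) coefficient of multiple determination
r = math.sqrt(r2)                      #     coefficient of multiple correlation
r2_adj = 1.0 - (1.0 - r2) * dft / dfd  #     corrected for degrees of freedom

b1, c11 = 1.4634, 17.5 / 41.0          # example coefficient and C diagonal
se_b1 = math.sqrt(c11 * msd)           # (j) standard error of b_1
t_b1 = b1 / se_b1                      # (k) t value for b_1
```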
References
Draper N R and Smith H (1985) Applied Regression Analysis (2nd Edition) Wiley
Error Indicators and Warnings
Accuracy
The accuracy of any regression method is almost entirely dependent on the accuracy of the matrix inversion method used. In g02ch, it is the matrix of correlation-like coefficients, rather than that of the sums of squares and cross-products about zero, that is inverted; this means that all terms in the matrix for inversion are of a similar order, which reduces the scope for computational error. For details on absolute accuracy, the relevant section of the document describing the inversion method used, (F04ABF not in this release), should be consulted.
g02da uses a different method, based on (F04AMF not in this release), which may well prove more reliable numerically. However, it does not handle missing values, nor does it provide the same output as this method.
If, in calculating
$F$ or any of the
$t\left({b}_{i}\right)$
(see
[Description]), the numbers involved are such that the result would be outside the range of numbers which can be stored by the machine, then the answer is set to the largest quantity which can be stored as a real variable, by means of a call to
x02al.
Parallelism and Performance
Further Comments
Example
See Also