Derivation of Expressions for Multiple Regression Coefficients

Suppose we suspect that an observable
\[Y\]
and random variables
\[X_i , \: i=1,2,3,...,n\]
are linearly related. We can try to find a relationship of the form
\[y =\beta_0+\beta_1(x_{1}- \bar{x}_1)+ \beta_2(x_{2}-\bar{x}_2)+...+\beta_n(x_{n}- \bar{x}_n).\]
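For concreteness, here is a minimal numerical sketch of this centred model; the data and coefficient values below are invented purely for illustration.

```python
import numpy as np

# Minimal sketch of the centred model above; all numbers are invented for illustration.
# Rows of X are data points, columns are the variables X_1, ..., X_n.
X = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 9.0]])          # k = 3 data points, n = 2 variables
beta0 = 0.5                          # constant term beta_0
beta = np.array([1.2, -0.7])         # coefficients beta_1, beta_2

x_bar = X.mean(axis=0)               # the mean of each variable
y_hat = beta0 + (X - x_bar) @ beta   # model value at each data point
print(y_hat)
```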
The statistically most desirable way to find the coefficients
\[\beta_j, \: j=0,1,2,...,n\]
is to minimise the sum of the squares of the errors, so that if the observable
\[Y\]
and the random variables
\[X_i, \: i=1,2,3,...,n\]
have  
\[k\]
  data points
\[(x_{11},x_{12},x_{13},..., x_{1n}, y_1)\]

\[(x_{21},x_{22},x_{23},..., x_{2n}, y_2)\]

\[\vdots \: \: \: \: \: \: \: \: \: \: \: \: \: \: \: \vdots \: \: \: \: \: \: \: \: \: \: \: \: \: \: \: \vdots\]

\[(x_{k1},x_{k2},x_{k3},..., x_{kn}, y_k)\]

where the first subscript labels the data point and the second labels the variable.

We want to minimise
\[E=\sum^k_{i=1} e^2_i= \sum^k_{i=1} \left(y_i-(\beta_0+\beta_1(x_{i1}- \bar{x}_1)+ \beta_2(x_{i2}-\bar{x}_2)+...+\beta_n(x_{in}- \bar{x}_n))\right)^2 .\]
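As a small sketch, E can be written as a short function; the example below assumes the same array layout as before (rows are data points) and uses invented numbers.

```python
import numpy as np

def sum_squared_errors(beta0, beta, X, y):
    """E: sum of squared residuals of the centred model over the k data points."""
    x_bar = X.mean(axis=0)
    residuals = y - (beta0 + (X - x_bar) @ beta)
    return float(np.sum(residuals ** 2))

# Invented numbers, purely for illustration.
X = np.array([[1.0, 4.0], [2.0, 5.0], [3.0, 9.0]])
y = np.array([1.0, 2.0, 4.0])
print(sum_squared_errors(0.5, np.array([1.2, -0.7]), X, y))
```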
To find the minimum, take the partial derivative of E with respect to each coefficient and set it equal to zero. Because each variable is centred on its own mean, the centred terms sum to zero over the data points, so the derivative with respect to the constant term reduces to
\[\frac{\partial E}{\partial \beta_0}=\sum^k_{i=1} 2 (\beta_0 - y_i) =0 \rightarrow k \beta_0 = \sum^k_{i=1} y_i \rightarrow \beta_0 = \frac{\sum^k_{i=1} y_i}{k} = \bar{y} \]

For each of the remaining coefficients, substituting the value of the constant term just found and dropping the cross terms between different variables (those sums vanish when the variables are uncorrelated in the sample) gives
\[\frac{\partial E}{\partial \beta_j}=\sum^k_{i=1} \left(-2(x_{ij}-\bar{x}_j) (y_i- \bar{y})+ 2\beta_j(x_{ij}- \bar{x}_j)^2\right) =0\]

\[\beta_j = \frac{\sum^k_{i=1} (x_{ij}-\bar{x}_j) (y_i- \bar{y})}{\sum^k_{i=1} (x_{ij}- \bar{x}_j)^2}\]
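As a check on the derived formulas, the sketch below computes the coefficients directly and compares them with numpy's general least-squares solver. The data are invented and constructed so that the centred predictor columns are uncorrelated, which is the situation in which the simple per-coefficient formula reproduces the full least-squares solution.

```python
import numpy as np

# Check of the derived formulas against numpy's general least-squares solver.
# The per-coefficient formula reproduces the full solution when the centred
# predictor columns are uncorrelated, so the invented data are built that way.
X = np.array([[1.0,  1.0],
              [2.0, -1.0],
              [3.0, -1.0],
              [4.0,  1.0]])
y = np.array([2.0, 1.0, 4.0, 3.0])

x_bar = X.mean(axis=0)
Xc = X - x_bar                                   # centred predictors

beta0 = y.mean()                                 # beta_0 = mean of the y_i
beta = (Xc * (y - y.mean())[:, None]).sum(axis=0) / (Xc ** 2).sum(axis=0)

# Reference: full normal equations with an intercept column.
A = np.column_stack([np.ones(len(y)), Xc])
ref, *_ = np.linalg.lstsq(A, y, rcond=None)

print(beta0, beta)   # from the formulas derived above
print(ref)           # [beta_0, beta_1, beta_2] from lstsq
```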
