Multiple regression occurs when more than one predictor variable is present in the regression equation. For example:

y = a_{0} + a_{1}x_{1} + a_{2}x_{2}

Here a_{0}, a_{1} and a_{2} are the regression coefficients, and we take (x_{1i}, x_{2i}, y_{i}), i = 1, 2, …, n as the observed data. Note that the values of x are assumed to be free of error, while the values of y are random.

The sum of squares of the errors is then

S = ∑_{i=1}^{n} [y_{i} – (a_{0} + a_{1}x_{1i} + a_{2}x_{2i})]^{2}

To minimize S, we require

∂S/∂a_{0} = 0, ∂S/∂a_{1} = 0 and ∂S/∂a_{2} = 0

From these conditions we obtain the three normal equations:

∑y_{i} = n a_{0} + a_{1}∑x_{1i} + a_{2}∑x_{2i}

∑x_{1i}y_{i} = a_{0}∑x_{1i} + a_{1}∑x_{1i}^{2} + a_{2}∑x_{1i}x_{2i}

∑x_{2i}y_{i} = a_{0}∑x_{2i} + a_{1}∑x_{1i}x_{2i} + a_{2}∑x_{2i}^{2}

Solving these equations gives the least squares estimates of a_{0}, a_{1} and a_{2}. A few things to remember here: the fitted equation is specifically a regression plane, and if the value of a_{2} vanishes, the equation reduces to the regression line of y upon x_{1}.
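The steps above can be sketched in code. The following is a minimal numpy-based sketch (the function name `fit_plane` is my own, not from the text) that builds exactly the coefficient matrix of the three normal equations and solves it:

```python
import numpy as np

def fit_plane(x1, x2, y):
    """Least squares fit of y = a0 + a1*x1 + a2*x2 via the normal equations.

    The 3x3 system below mirrors the three normal equations derived above.
    """
    x1, x2, y = map(np.asarray, (x1, x2, y))
    n = len(y)
    # Coefficient matrix of the normal equations.
    A = np.array([
        [n,         x1.sum(),         x2.sum()],
        [x1.sum(),  (x1 ** 2).sum(),  (x1 * x2).sum()],
        [x2.sum(),  (x1 * x2).sum(),  (x2 ** 2).sum()],
    ], dtype=float)
    # Right-hand sides: sum y, sum x1*y, sum x2*y.
    b = np.array([y.sum(), (x1 * y).sum(), (x2 * y).sum()], dtype=float)
    return np.linalg.solve(A, b)  # returns (a0, a1, a2)
```

This is numerically equivalent to calling `np.linalg.lstsq` on the design matrix with columns [1, x_{1}, x_{2}]; the explicit normal equations are shown here only to match the derivation.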

Let us consider the following data chart:

x_{1} | 2 | 4 | 5 | 6 | 3 | 1 |

x_{2} | 1 | 2 | 1 | 3 | 5 | 2 |

y | 14 | 16 | 17 | 20 | 18 | 12 |

The task is to fit a regression plane to these data, with n = 6.

Then, y = a_{0} + a_{1}x_{1} + a_{2}x_{2}.

x_{1} | x_{2} | x_{1}^{2} | x_{2}^{2} | x_{1}x_{2} | x_{1}y | x_{2}y | y |
2 | 1 | 4 | 1 | 2 | 28 | 14 | 14 |
4 | 2 | 16 | 4 | 8 | 64 | 32 | 16 |
5 | 1 | 25 | 1 | 5 | 85 | 17 | 17 |
6 | 3 | 36 | 9 | 18 | 120 | 60 | 20 |
3 | 5 | 9 | 25 | 15 | 54 | 90 | 18 |
1 | 2 | 1 | 4 | 2 | 12 | 24 | 12 |
∑ = 21 | 14 | 91 | 44 | 50 | 363 | 237 | 97 |
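The column sums in the table can be checked numerically. A short sketch using numpy on the data chart above:

```python
import numpy as np

# Data chart from above.
x1 = np.array([2, 4, 5, 6, 3, 1])
x2 = np.array([1, 2, 1, 3, 5, 2])
y  = np.array([14, 16, 17, 20, 18, 12])

# Each entry reproduces one column sum from the table.
cols = {
    "sum x1":    int(x1.sum()),         # 21
    "sum x2":    int(x2.sum()),         # 14
    "sum x1^2":  int((x1 ** 2).sum()),  # 91
    "sum x2^2":  int((x2 ** 2).sum()),  # 44
    "sum x1*x2": int((x1 * x2).sum()),  # 50
    "sum x1*y":  int((x1 * y).sum()),   # 363
    "sum x2*y":  int((x2 * y).sum()),   # 237
    "sum y":     int(y.sum()),          # 97
}
for name, value in cols.items():
    print(name, "=", value)
```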

Substituting these sums into the normal equations above, we get:

97 = 6a_{0} + 21a_{1} + 14a_{2}

363 = 21a_{0} + 91a_{1} + 50a_{2}

237 = 14a_{0} + 50a_{1} + 44a_{2}

Solving this system of equations, we finally arrive at:

a_{0} = 9.7, a_{1} = 1.3, a_{2} = 0.83

so the fitted regression plane is y = 9.7 + 1.3x_{1} + 0.83x_{2}.
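As a check on the arithmetic, the 3×3 system above can be solved directly; a minimal numpy sketch:

```python
import numpy as np

# The three normal equations with the tabulated sums substituted in.
A = np.array([[ 6, 21, 14],
              [21, 91, 50],
              [14, 50, 44]], dtype=float)
b = np.array([97, 363, 237], dtype=float)

a0, a1, a2 = np.linalg.solve(A, b)
print(round(a0, 1), round(a1, 1), round(a2, 2))  # 9.7 1.3 0.83
```

The exact solutions are not round numbers (a_{0} ≈ 9.703, a_{1} ≈ 1.296, a_{2} ≈ 0.827); the values quoted in the text are these rounded to the stated precision.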