
Multiple regression arises when more than one predictor variable is present in the regression equation. For example:

y = a0 + a1x1 + a2x2

Here a0, a1 and a2 are the regression coefficients, and (x1i, x2i, yi), i = 1, 2, …, n are the observed data. Note that the values of x are taken to be error-free, while the values of y are random.

Summing the squared errors over all n observations,

S = ∑ᵢ₌₁ⁿ [yi – (a0 + a1x1i + a2x2i)]²

This is the sum of squares of the errors. To minimize S, we set

∂S/∂a0 = 0, ∂S/∂a1 = 0 and ∂S/∂a2 = 0

These conditions yield the three normal equations:

∑yi = na0 + a1∑x1i + a2∑x2i

∑x1iyi = a0∑x1i + a1∑x1i² + a2∑x1ix2i

∑x2iyi = a0∑x2i + a1∑x1ix2i + a2∑x2i²

Solving these equations gives the least-squares estimates of a0, a1 and a2. A few things to remember here: the fitted equation is specifically a regression plane, and if the value of a2 vanishes, it reduces to the regression line of y on x1.
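To see where these equations come from, here is a minimal symbolic sketch using sympy; the three-point dataset in it is hypothetical, chosen only to keep the printed equations short:

import sympy as sp

a0, a1, a2 = sp.symbols('a0 a1 a2')

# Hypothetical tiny dataset of (x1i, x2i, yi) triples, for illustration only
data = [(2, 1, 14), (4, 2, 16), (5, 1, 17)]

# Sum of squared errors S as a function of the coefficients
S = sum((y - (a0 + a1*x1 + a2*x2))**2 for x1, x2, y in data)

# Setting each partial derivative of S to zero gives one normal equation
for a in (a0, a1, a2):
    print(sp.Eq(sp.expand(sp.diff(S, a)), 0))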

Let us consider the following data:

x1:   2   4   5   6   3   1
x2:   1   2   1   3   5   2
y:   14  16  17  20  18  12

The first step is to fit the regression plane y = a0 + a1x1 + a2x2 to these data; here n = 6. The required products and their column totals are tabulated below, with the last row giving the totals:

x1   x2   x1²  x2²  x1x2   x1y   x2y    y
 2    1    4    1     2    28    14    14
 4    2   16    4     8    64    32    16
 5    1   25    1     5    85    17    17
 6    3   36    9    18   120    60    20
 3    5    9   25    15    54    90    18
 1    2    1    4     2    12    24    12
21   14   91   44    50   363   237    97
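The column totals in the last row can be reproduced programmatically; here is a small numpy sketch using the six observations from the data chart:

import numpy as np

x1 = np.array([2, 4, 5, 6, 3, 1])
x2 = np.array([1, 2, 1, 3, 5, 2])
y = np.array([14, 16, 17, 20, 18, 12])

# Column totals that feed the normal equations
cols = {'x1': x1, 'x2': x2, 'x1^2': x1**2, 'x2^2': x2**2,
        'x1x2': x1*x2, 'x1y': x1*y, 'x2y': x2*y, 'y': y}
for name, col in cols.items():
    print(name, col.sum())   # 21, 14, 91, 44, 50, 363, 237, 97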

 

Substituting these totals into the normal equations gives:

97 = 6a0 + 21a1 + 14a2

363 = 21a0 + 91a1 + 50a2

237 = 14a0 + 50a1 + 44a2
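Rather than eliminating variables by hand, this 3×3 system can be solved numerically; a minimal sketch with numpy.linalg.solve:

import numpy as np

# Coefficient matrix and right-hand side of the normal equations above
A = np.array([[ 6, 21, 14],
              [21, 91, 50],
              [14, 50, 44]], dtype=float)
b = np.array([97, 363, 237], dtype=float)

a0, a1, a2 = np.linalg.solve(A, b)
print(a0, a1, a2)   # roughly 9.70, 1.30, 0.83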

Solving this system, we arrive at:

a0 = 9.7, a1 = 1.3, a2 = 0.83
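As a cross-check, fitting the plane directly to the raw data with numpy.linalg.lstsq recovers the same coefficients, since ordinary least squares on the design matrix solves exactly these normal equations:

import numpy as np

x1 = np.array([2, 4, 5, 6, 3, 1])
x2 = np.array([1, 2, 1, 3, 5, 2])
y = np.array([14, 16, 17, 20, 18, 12])

# Design matrix with an intercept column: each row is (1, x1i, x2i)
X = np.column_stack([np.ones_like(x1, dtype=float), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)   # approximately [9.70, 1.30, 0.83]

So the fitted regression plane is y = 9.7 + 1.3x1 + 0.83x2.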
