Least Squares Regression, Chapter 7 (최소자승법: Least Squares Method)


Page 1: Least Squares Regression 7장(최소자승법)


Linear Regression / Least Squares Regression

Chapter 7 (최소자승법: Least Squares Method)

[Figure: measured data points and the error (residual) of each observation relative to the fitted line]

Fitting a straight line to a set of paired observations: (x1, y1), (x2, y2),…,(xn, yn).

y = a0 + a1x + e

a1: slope
a0: intercept
e: error, or residual, between the model and the observations

곡선맞춤 (Curve fitting)

Page 2: Least Squares Regression 7장(최소자승법)


3rd method: the best strategy is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model:

• Yields a unique line for a given set of data.

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_{i,\mathrm{measured}} - y_{i,\mathrm{model}} \right)^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2
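To make the criterion concrete, the following MATLAB sketch evaluates S_r for a candidate line (the data vectors and the trial coefficients a0, a1 are illustrative placeholders, not values from the slides):

% Sum of squared residuals for a trial line y = a0 + a1*x
x  = [1 2 3 4 5];             % illustrative measured x values
y  = [0.5 2.5 2.0 4.0 3.5];   % illustrative measured y values
a0 = 0.0;  a1 = 0.8;          % trial intercept and slope
e  = y - (a0 + a1*x);         % residuals: measured minus model
Sr = sum(e.^2);               % quantity the least-squares fit minimizes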

Page 3: Least Squares Regression 7장(최소자승법)

by Lale Yurttas, Texas A&M University


Least-Squares Fit of a Straight Line

S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2

\frac{\partial S_r}{\partial a_0} = -2 \sum \left( y_i - a_0 - a_1 x_i \right) = 0

\frac{\partial S_r}{\partial a_1} = -2 \sum \left[ \left( y_i - a_0 - a_1 x_i \right) x_i \right] = 0

Normal equations (정규방정식), which can be solved simultaneously:

n a_0 + \left( \sum x_i \right) a_1 = \sum y_i

\left( \sum x_i \right) a_0 + \left( \sum x_i^2 \right) a_1 = \sum x_i y_i

a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2}

a_0 = \bar{y} - a_1 \bar{x}   (where \bar{x}, \bar{y} are the mean values of x and y)
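A minimal MATLAB sketch of these formulas, assuming x and y are paired data vectors of equal length (the variable names are mine):

n   = length(x);
Sx  = sum(x);      Sy  = sum(y);
Sxx = sum(x.^2);   Sxy = sum(x.*y);
a1  = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2);   % slope from the normal equations
a0  = mean(y) - a1*mean(x);               % intercept from the mean values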

Page 4: Least Squares Regression 7장(최소자승법)


[Figure: measured points y_i plotted against x_i (x from 0 to 8), the residuals y_i - a_0 - a_1 x_i, and the linear trendline y = 0.8393x + 0.0714, R² = 0.8683]

Page 5: Least Squares Regression 7장(최소자승법)


Polynomial Regression (7.2 최소자승다항식: Least-Squares Polynomials)

• Some engineering data is poorly represented by a straight line.
• For these cases, a curve is better suited to fit the data.
• The least squares method can readily be extended to fit the data to higher-order polynomials.

Page 6: Least Squares Regression 7장(최소자승법)

For a second-order polynomial, y = a_0 + a_1 x + a_2 x^2 + e, setting \partial S_r / \partial a_j = 0 for j = 0, 1, 2 gives the normal equations:

j = 0:\quad n a_0 + \left( \sum x_i \right) a_1 + \left( \sum x_i^2 \right) a_2 = \sum y_i

j = 1:\quad \left( \sum x_i \right) a_0 + \left( \sum x_i^2 \right) a_1 + \left( \sum x_i^3 \right) a_2 = \sum x_i y_i

j = 2:\quad \left( \sum x_i^2 \right) a_0 + \left( \sum x_i^3 \right) a_1 + \left( \sum x_i^4 \right) a_2 = \sum x_i^2 y_i

Page 7: Least Squares Regression 7장(최소자승법)

xi     yi      xi^2    xi^3    xi^4    xi*yi    xi^2*yi
0      2.1     0       0       0       0        0
1      7.7     1       1       1       7.7      7.7
2      13.6    4       8       16      27.2     54.4
3      27.2    9       27      81      81.6     244.8
4      40.9    16      64      256     163.6    654.4
5      61.1    25      125     625     305.5    1527.5
sum    15      152.6   55      225     979      585.6    2488.8
mean   2.5     25.4333

The corresponding normal equations are:

\begin{bmatrix} 6 & 15 & 55 \\ 15 & 55 & 225 \\ 55 & 225 & 979 \end{bmatrix} \begin{Bmatrix} a_0 \\ a_1 \\ a_2 \end{Bmatrix} = \begin{Bmatrix} 152.6 \\ 585.6 \\ 2488.8 \end{Bmatrix}

Fitted curve: y = 1.8607x^2 + 2.3593x + 2.4786, R² = 0.9985

[Figure: data points and the fitted parabola, y from 0 to 70 over x from 0 to 6]

Ex 16.5:

s_{y/x} = \sqrt{\frac{S_r}{n - (m+1)}} = \sqrt{\frac{3.74657}{6 - 3}} = 1.12

r^2 = \frac{S_t - S_r}{S_t} = \frac{2513.39 - 3.74657}{2513.39} = 0.99851
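The same numbers can be reproduced with a short MATLAB sketch that builds and solves the normal-equation system from the table above (the backslash operator solves the 3×3 system; to rounding, the coefficients should match y = 1.8607x^2 + 2.3593x + 2.4786):

x = (0:5)';  y = [2.1 7.7 13.6 27.2 40.9 61.1]';
n = length(x);
A = [ n          sum(x)     sum(x.^2);
      sum(x)     sum(x.^2)  sum(x.^3);
      sum(x.^2)  sum(x.^3)  sum(x.^4) ];         % 6 15 55; 15 55 225; 55 225 979
b = [ sum(y); sum(x.*y); sum(x.^2.*y) ];         % 152.6; 585.6; 2488.8
a = A\b;                                         % a(1) = a0, a(2) = a1, a(3) = a2
Sr  = sum((y - (a(1) + a(2)*x + a(3)*x.^2)).^2); % residual sum of squares
St  = sum((y - mean(y)).^2);                     % total sum of squares about the mean
r2  = (St - Sr)/St;                              % coefficient of determination
syx = sqrt(Sr/(n - 3));                          % standard error of the estimate, n - (m+1) = 3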

Page 8: Least Squares Regression 7장(최소자승법)

Multiple Linear Regression (다중선형회귀분석)


y = a_0 + a_1 x_1 + a_2 x_2 + e

Normal equations:

n a_0 + \left( \sum x_{1i} \right) a_1 + \left( \sum x_{2i} \right) a_2 = \sum y_i

\left( \sum x_{1i} \right) a_0 + \left( \sum x_{1i}^2 \right) a_1 + \left( \sum x_{1i} x_{2i} \right) a_2 = \sum x_{1i} y_i

\left( \sum x_{2i} \right) a_0 + \left( \sum x_{1i} x_{2i} \right) a_1 + \left( \sum x_{2i}^2 \right) a_2 = \sum x_{2i} y_i

      y     x1     x2    x1^2    x2^2    x1*x2    x1*y    x2*y
1     5     0      0     0       0       0        0       0
2     10    2      1     4       1       2        20      10
3     9     2.5    2     6.25    4       5        22.5    18
4     0     1      3     1       9       3        0       0
5     3     4      6     16      36      24       12      18
6     27    7      2     49      4       14       189     54
sum   54    16.5   14    76.25   54      48       243.5   100

\begin{bmatrix} 6 & 16.5 & 14 \\ 16.5 & 76.25 & 48 \\ 14 & 48 & 54 \end{bmatrix} \begin{Bmatrix} a_0 \\ a_1 \\ a_2 \end{Bmatrix} = \begin{Bmatrix} 54 \\ 243.5 \\ 100 \end{Bmatrix}
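A MATLAB sketch that assembles and solves this system directly from the tabulated data (since the plane y = 5 + 4x1 - 3x2 passes exactly through all six points, the solution should come out as a0 = 5, a1 = 4, a2 = -3):

x1 = [0 2 2.5 1 4 7]';   x2 = [0 1 2 3 6 2]';
y  = [5 10 9 0 3 27]';
n  = length(y);
A = [ n         sum(x1)       sum(x2);
      sum(x1)   sum(x1.^2)    sum(x1.*x2);
      sum(x2)   sum(x1.*x2)   sum(x2.^2) ];   % 6 16.5 14; 16.5 76.25 48; 14 48 54
b = [ sum(y); sum(x1.*y); sum(x2.*y) ];       % 54; 243.5; 100
a = A\b;                                      % a(1) = a0, a(2) = a1, a(3) = a2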

Page 9: Least Squares Regression 7장(최소자승법)


Page 10: Least Squares Regression 7장(최소자승법)

MATLAB Approach (MATLAB 방법)

>> p = polyfit(x,y,n)
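polyfit returns the coefficients with the highest power first; for instance, the quadratic fit of Ex 16.5 above can be reproduced as:

x = 0:5;  y = [2.1 7.7 13.6 27.2 40.9 61.1];
p = polyfit(x, y, 2);      % p = [a2 a1 a0], highest power first
yhat = polyval(p, x);      % evaluate the fitted polynomial at the data points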

Page 11: Least Squares Regression 7장(최소자승법)
Page 12: Least Squares Regression 7장(최소자승법)

Homework: p. 229, Problem 2

Page 13: Least Squares Regression 7장(최소자승법)

7.3 Linearization of Nonlinear Relationships (비선형 관계식의 선형화)

Exponential model:
y = \alpha_1 e^{\beta_1 x} \;\Rightarrow\; \ln y = \ln \alpha_1 + \beta_1 x

Power model:
y = \alpha_2 x^{\beta_2} \;\Rightarrow\; \log y = \log \alpha_2 + \beta_2 \log x

Saturation-growth-rate model:
y = \alpha_3 \frac{x}{\beta_3 + x} \;\Rightarrow\; \frac{1}{y} = \frac{1}{\alpha_3} + \frac{\beta_3}{\alpha_3} \cdot \frac{1}{x}
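As a sketch of the first transformation, the exponential model can be fitted in MATLAB by a straight-line regression on ln y (the data here are illustrative values, chosen positive so the logarithm is defined):

% Fit y = alpha1*exp(beta1*x) via the linearized form ln y = ln(alpha1) + beta1*x
x = [1 2 3 4 5];                  % illustrative data
y = [2.7 7.4 20.1 54.6 148.4];    % roughly exp(x); y > 0 required
p = polyfit(x, log(y), 1);        % least-squares line through (x, ln y)
beta1  = p(1);                    % slope of the linearized fit
alpha1 = exp(p(2));               % back-transform the intercept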

Page 14: Least Squares Regression 7장(최소자승법)


Page 15: Least Squares Regression 7장(최소자승법)

7.3.2 Hyperbolic Form (쌍곡선 형태)

Page 16: Least Squares Regression 7장(최소자승법)


General Linear Least Squares

y = a_0 z_0 + a_1 z_1 + a_2 z_2 + \cdots + a_m z_m + e

where z_0, z_1, \ldots, z_m are m + 1 basis functions.

In matrix form: \{Y\} = [Z]\{A\} + \{E\}

[Z]: matrix of the calculated values of the basis functions at the measured values of the independent variable
\{Y\}: observed values of the dependent variable
\{A\}: unknown coefficients
\{E\}: residuals

S_r = \sum_{i=1}^{n} \left( y_i - \sum_{j=0}^{m} a_j z_{ji} \right)^2

S_r is minimized by taking its partial derivative with respect to each of the coefficients and setting the resulting equation equal to zero.
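Setting those derivatives to zero gives the normal equations [Z]^T [Z] \{A\} = [Z]^T \{Y\}. A minimal MATLAB sketch for the quadratic case (basis functions z0 = 1, z1 = x, z2 = x^2, reusing the Ex 16.5 data as an example):

x = (0:5)';  y = [2.1 7.7 13.6 27.2 40.9 61.1]';
Z = [ones(size(x))  x  x.^2];   % columns: basis functions evaluated at the data
A = (Z'*Z) \ (Z'*y);            % solve the normal equations [Z]'[Z]{A} = [Z]'{Y}
E = y - Z*A;                    % residual vector {E}
Sr = sum(E.^2);                 % minimized sum of squared residuals

In practice, A = Z\y returns the same least-squares solution without forming [Z]'[Z] explicitly.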

Page 17: Least Squares Regression 7장(최소자승법)


Page 18: Least Squares Regression 7장(최소자승법)