Ch04 Lecture

Upload: spread-love

Post on 03-Apr-2018


TRANSCRIPT

  • 7/29/2019 Ch04 Lecture — Slide 1/25

    Eco 205: Econometrics

    Any questions?

  • Slide 2/25

    [Figure: population scatter of (X, Y) data points. For X_3, the observed value of Y lies off the population regression line; the vertical gap is the error u. The line has intercept \beta_0 and population slope \beta_1.]

    Population Linear Regression Model:

    Y_i = \beta_0 + \beta_1 X_i + u_i
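The population model on this slide can be sketched numerically. This is a minimal illustration with hypothetical parameter values (beta0 = 5, beta1 = 2 are assumptions, not from the slides):

```python
import numpy as np

# Sketch of the population linear regression model Y_i = beta0 + beta1*X_i + u_i,
# with hypothetical parameter values chosen purely for illustration.
rng = np.random.default_rng(0)

beta0, beta1 = 5.0, 2.0       # hypothetical population intercept and slope
n = 1000
X = rng.normal(10.0, 3.0, n)  # regressor
u = rng.normal(0.0, 1.0, n)   # population error; E(u|X) = 0 by construction
Y = beta0 + beta1 * X + u     # observed value of Y for each X

# Each (X_i, Y_i) is one population data point; the deviation of Y_i from the
# population regression line beta0 + beta1*X_i is exactly u_i.
print(np.allclose(Y - (beta0 + beta1 * X), u))
```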

  • Slide 3/25

    [Figure: sample scatter distinguishing points chosen in the sample from points not chosen. For X_3, the vertical gap between Y_3 and the fitted line is the estimated error \hat{u}_3 (the residual). The population line, with intercept \beta_0 and population slope \beta_1, is drawn for comparison.]

    Sample Regression Equation:

    \hat{Y}_i = \hat{\beta}_0 + \hat{\beta}_1 X_i

    where \hat{\beta}_0 is the estimated intercept and \hat{\beta}_1 is the estimated slope.

  • Slide 4/25

    The OLS estimator solves

    \min_{b_0, b_1} \sum_{i=1}^{n} [Y_i - (b_0 + b_1 X_i)]^2
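The minimization above has the familiar closed-form solution. A sketch, on synthetic data (the numbers below are assumptions for illustration, not the California data):

```python
import numpy as np

# The OLS estimator minimizes sum_i [Y_i - (b0 + b1*X_i)]^2.
# The first-order conditions give the closed form computed below.
rng = np.random.default_rng(1)
X = rng.normal(20.0, 2.0, 200)                  # synthetic regressor
Y = 700.0 - 2.0 * X + rng.normal(0.0, 15.0, 200)

# Closed-form OLS slope and intercept.
b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()

# Cross-check against np.polyfit (degree-1 least squares), which
# solves the same minimization problem numerically.
slope, intercept = np.polyfit(X, Y, 1)
print(b1, b0)
```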

  • Slide 5/25

  • Slide 6/25

    California Test Score / Class Size data

    Interpretations

  • Slide 7/25

    Predicted values & residuals:

  • Slide 8/25

    OLS regression: STATA output

    regress testscr str, robust

    Regression with robust standard errors Number of obs = 420

    F( 1, 418) = 19.26

    Prob > F = 0.0000

    R-squared = 0.0512

    Root MSE = 18.581

    -------------------------------------------------------------------------
            |               Robust
    testscr |      Coef.   Std. Err.      t    P>|t|    [95% Conf. Interval]
    --------+----------------------------------------------------------------
        str |  -2.279808   .5194892    -4.39   0.000   -3.300945   -1.258671
      _cons |    698.933   10.36436    67.44   0.000    678.5602    719.3057
    -------------------------------------------------------------------------
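The quantities in this output can be recomputed by hand. A sketch of the slope and its heteroskedasticity-robust (HC1) standard error, which is what Stata's `robust` option reports; the data here are synthetic (the California dataset is not reproduced), so the printed numbers only roughly mimic the output above:

```python
import numpy as np

# Recompute the `regress testscr str, robust` quantities on synthetic data.
rng = np.random.default_rng(2)
n = 420
str_ = rng.normal(19.6, 1.9, n)                         # hypothetical class sizes
testscr = 698.9 - 2.28 * str_ + rng.normal(0, 18.6, n)  # hypothetical test scores

# OLS coefficients and residuals.
sxx = np.sum((str_ - str_.mean()) ** 2)
b1 = np.sum((str_ - str_.mean()) * (testscr - testscr.mean())) / sxx
b0 = testscr.mean() - b1 * str_.mean()
resid = testscr - (b0 + b1 * str_)

# Heteroskedasticity-robust (HC1) variance: sandwich formula with the
# small-sample factor n/(n-2) that Stata's `robust` option applies.
var_b1 = (n / (n - 2)) * np.sum(((str_ - str_.mean()) ** 2) * resid ** 2) / sxx ** 2
se_b1 = np.sqrt(var_b1)

t_stat = b1 / se_b1
print(f"slope={b1:.3f}, robust SE={se_b1:.3f}, t={t_stat:.2f}")
```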

  • Slide 9/25

    Measures of Fit

  • Slide 10/25

    The Standard Error of the Regression (SER)

  • Slide 11/25

    Root Mean Squared Error (RMSE)

  • Slide 12/25

    R2 and SER Example
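The three measures of fit named on the preceding slides can be computed together. A sketch on synthetic data (parameter values are assumptions, not the textbook numbers); note that SER divides the sum of squared residuals by n−2 while RMSE divides by n, so SER is always slightly larger:

```python
import numpy as np

# Measures of fit for a fitted simple regression: R^2, SER, and RMSE.
rng = np.random.default_rng(3)
n = 420
X = rng.normal(20.0, 2.0, n)
Y = 699.0 - 2.28 * X + rng.normal(0.0, 18.6, n)  # synthetic data

b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()
resid = Y - (b0 + b1 * X)

ssr = np.sum(resid ** 2)            # sum of squared residuals
tss = np.sum((Y - Y.mean()) ** 2)   # total sum of squares
r2 = 1.0 - ssr / tss                # R-squared
ser = np.sqrt(ssr / (n - 2))        # standard error of the regression (d.o.f.-corrected)
rmse = np.sqrt(ssr / n)             # root mean squared error
print(f"R^2={r2:.4f}, SER={ser:.2f}, RMSE={rmse:.2f}")
```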

  • Slide 13/25

    The Least Squares Assumptions

  • Slide 14/25

    LSA #1: E(u|X = x) = 0

  • Slide 15/25

    LSA #2: (X_i, Y_i), i = 1, …, n are i.i.d.

    LSA #3: E(X^4) < ∞ and E(Y^4) < ∞

  • Slide 16/25

    OLS can be sensitive to an outlier
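This sensitivity is easy to demonstrate. A sketch on synthetic data: fifty well-behaved points give a slope near the true value, and adding one corrupted observation drags the fitted slope far away:

```python
import numpy as np

# Demonstrate OLS sensitivity to a single outlier (synthetic data).
rng = np.random.default_rng(4)
X = rng.uniform(0, 10, 50)
Y = 1.0 + 2.0 * X + rng.normal(0, 0.5, 50)  # true slope is 2

def ols_slope(x, y):
    """Closed-form OLS slope estimate."""
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

slope_clean = ols_slope(X, Y)

# Corrupt the sample with one high-leverage point: large X, wildly wrong Y.
X_out = np.append(X, 30.0)
Y_out = np.append(Y, -100.0)
slope_out = ols_slope(X_out, Y_out)

print(slope_clean, slope_out)  # the single outlier pulls the slope far below 2
```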

  • Slide 17/25

    Sampling Distribution of \hat{\beta}_1

  • Slide 18/25

    Some Preliminary Algebra

  • Slide 19/25

    \hat{\beta}_1 - \beta_1 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(u_i - \bar{u})}{\sum_{i=1}^{n} (X_i - \bar{X})^2} = \frac{\sum_{i=1}^{n} (X_i - \bar{X})\, u_i}{\sum_{i=1}^{n} (X_i - \bar{X})^2}
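The second equality holds because \sum_i (X_i - \bar{X}) = 0, so the \bar{u} term drops out. A quick numerical check of that step:

```python
import numpy as np

# Verify the algebra step: since sum_i (X_i - Xbar) = 0, the ubar term vanishes,
#   sum (X_i - Xbar)(u_i - ubar) == sum (X_i - Xbar) * u_i.
rng = np.random.default_rng(5)
X = rng.normal(0, 1, 100)
u = rng.normal(0, 1, 100)

lhs = np.sum((X - X.mean()) * (u - u.mean()))
rhs = np.sum((X - X.mean()) * u)
print(np.isclose(lhs, rhs))
```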

  • Slide 20/25

    Now we can calculate E(\hat{\beta}_1) and var(\hat{\beta}_1).

    E[\hat{\beta}_1] = E[\beta_1] + E\left[ \frac{\sum_{i=1}^{n} (X_i - \bar{X})\, u_i}{\sum_{i=1}^{n} (X_i - \bar{X})^2} \right]

    E[\hat{\beta}_1] = \beta_1 + E\left[ \frac{\sum_{i=1}^{n} (X_i - \bar{X})\, E[u_i \mid X_i]}{\sum_{i=1}^{n} (X_i - \bar{X})^2} \right] = ?
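Under LSA #1, E(u_i | X_i) = 0, so the second term vanishes and \hat{\beta}_1 is unbiased. A Monte Carlo sketch of this answer, using a hypothetical data-generating process with true slope 2:

```python
import numpy as np

# Monte Carlo check that E[beta1_hat] = beta1 when E(u|X) = 0.
rng = np.random.default_rng(6)
beta1 = 2.0
estimates = []
for _ in range(2000):
    X = rng.normal(0, 1, 50)
    u = rng.normal(0, 1, 50)  # E(u|X) = 0 holds by construction
    Y = 1.0 + beta1 * X + u
    b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
    estimates.append(b1)

print(np.mean(estimates))  # close to the true beta1 = 2
```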

  • Slide 21/25

    Next, calculate var(\hat{\beta}_1)

  • Slide 22/25

  • Slide 23/25

    The larger the variance of X, the smaller the variance of \hat{\beta}_1.

    There are the same number of black and blue dots. Using which would you get a more accurate regression line?
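The claim can be checked by simulation. A sketch comparing two hypothetical designs that differ only in the spread of X:

```python
import numpy as np

# Simulate: a wider spread of X yields a smaller variance of the OLS slope.
rng = np.random.default_rng(7)

def slope_draws(x_sd, reps=2000, n=50):
    """Repeatedly draw samples and return the OLS slope estimates."""
    draws = np.empty(reps)
    for r in range(reps):
        X = rng.normal(0.0, x_sd, n)
        Y = 1.0 + 2.0 * X + rng.normal(0.0, 1.0, n)
        draws[r] = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
    return draws

var_narrow = np.var(slope_draws(x_sd=0.5))  # X tightly clustered
var_wide = np.var(slope_draws(x_sd=2.0))    # X widely spread
print(var_narrow, var_wide)  # the widely spread design estimates the slope more precisely
```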

  • Slide 24/25

    What is the sampling distribution of \hat{\beta}_1?

  • Slide 25/25

    We are now ready to turn to hypothesis tests & confidence intervals.