Regression Analysis Methods and Tests Overview


Regression analysis involves various methods and tests, such as OLS estimation, heteroscedasticity detection, and the Goldfeld-Quandt and Breusch-Pagan-Godfrey tests. Understanding these techniques is crucial for interpreting regression results accurately.


Uploaded on Sep 27, 2024



Presentation Transcript


1. Fundamentals of Regression Analysis 2. Obid A. Khakimov

2. OLS Estimation: Heteroscedasticity

In the two-variable model $Y_i = \beta_1 + \beta_2 X_i + u_i$ with $\operatorname{Var}(u_i) = E(u_i^2) = \sigma_i^2$, the OLS slope can be written as $\hat\beta_2 = \beta_2 + \sum_i W_i u_i$ with $W_i = x_i / \sum_j x_j^2$ and $x_i = X_i - \bar X$, so

$$\operatorname{Var}(\hat\beta_2) = \frac{\sum_{i=1}^{n} x_i^2 \sigma_i^2}{\left(\sum_{j=1}^{n} x_j^2\right)^2}.$$

If the variance of the residuals is constant ($\sigma_i^2 = \sigma^2$), our equation collapses to the original variance formula:

$$\operatorname{Var}(\hat\beta_2) = \frac{\sigma^2}{\sum_{j=1}^{n} x_j^2}.$$
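The two formulas above can be compared numerically. Below is a minimal NumPy sketch (not part of the slides; all variable names and the simulated pattern $\sigma_i = 0.5 X_i$ are illustrative assumptions) that draws many heteroscedastic samples and checks the Monte Carlo variance of $\hat\beta_2$ against the correct heteroscedastic formula and the naive homoscedastic one:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 2000
x = np.linspace(1.0, 10.0, n)
xc = x - x.mean()                        # deviations x_i = X_i - mean(X)
sigma_i = 0.5 * x                        # assumed pattern: error s.d. grows with X_i

b2_draws = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma_i)         # heteroscedastic errors
    y = 1.0 + 2.0 * x + u
    b2_draws[r] = (xc @ (y - y.mean())) / (xc @ xc)   # OLS slope

var_mc = b2_draws.var()
# Correct variance under heteroscedasticity (slide 2's formula):
var_correct = (xc**2 @ sigma_i**2) / (xc @ xc) ** 2
# Naive homoscedastic formula with a pooled sigma^2:
var_naive = np.mean(sigma_i**2) / (xc @ xc)
```

With this increasing-variance pattern the naive formula understates the true sampling variance, which is one way the usual t and F statistics go wrong.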

3. Consequences: The regression coefficients are still unbiased, but the usual formula for the coefficient variances is wrong. OLS remains linear and unbiased, yet it is no longer BLUE because it is not efficient, so the t-tests and the F-test are not valid.

4. Method of Generalized Least Squares

Start from $Y_i = \beta_1 X_{0i} + \beta_2 X_i + u_i$ (with $X_{0i} = 1$) and divide through by $\sigma_i$:

$$\frac{Y_i}{\sigma_i} = \beta_1 \frac{X_{0i}}{\sigma_i} + \beta_2 \frac{X_i}{\sigma_i} + \frac{u_i}{\sigma_i}, \quad \text{or} \quad Y_i^* = \beta_1^* X_{0i}^* + \beta_2^* X_i^* + u_i^*.$$

The transformed error term is homoscedastic:

$$\operatorname{Var}(u_i^*) = E\!\left[(u_i^*)^2\right] = E\!\left(\frac{u_i^2}{\sigma_i^2}\right) = \frac{1}{\sigma_i^2} E(u_i^2) = \frac{\sigma_i^2}{\sigma_i^2} = 1.$$

5. Method of Generalized Least Squares

Equivalently, with weights $w_i = 1/\sigma_i^2$, minimize the weighted sum of squares

$$\min_{\beta_1^*, \beta_2^*} \sum_i w_i \left(Y_i - \beta_1^* - \beta_2^* X_i\right)^2.$$

The resulting WLS slope estimator and its variance are

$$\hat\beta_2^* = \frac{\left(\sum w_i\right)\left(\sum w_i X_i Y_i\right) - \left(\sum w_i X_i\right)\left(\sum w_i Y_i\right)}{\left(\sum w_i\right)\left(\sum w_i X_i^2\right) - \left(\sum w_i X_i\right)^2},$$

$$\operatorname{Var}(\hat\beta_2^*) = \frac{\sum w_i}{\left(\sum w_i\right)\left(\sum w_i X_i^2\right) - \left(\sum w_i X_i\right)^2}.$$
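Slide 5's closed-form slope can be cross-checked against the matrix form of WLS, $(X'WX)^{-1}X'Wy$. A minimal NumPy sketch on synthetic data (the names and the assumed known pattern $\sigma_i = 0.3 X_i$ are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(1.0, 10.0, n)
sigma = 0.3 * x                          # assumed known heteroscedasticity pattern
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma)

w = 1.0 / sigma**2                       # weights w_i = 1 / sigma_i^2
sw, swx, swy = w.sum(), (w * x).sum(), (w * y).sum()
swxy, swxx = (w * x * y).sum(), (w * x * x).sum()

# Closed-form WLS slope and its variance (slide 5)
denom = sw * swxx - swx**2
b2 = (sw * swxy - swx * swy) / denom
var_b2 = sw / denom

# Cross-check against matrix WLS: beta = (X'WX)^{-1} X'Wy
X = np.column_stack([np.ones(n), x])
beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
```

The two routes are algebraically identical, so `b2` and `beta[1]` agree to machine precision.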

6. Heteroscedasticity: Detection. Graphical method; Park test; White's general heteroscedasticity test; Breusch-Pagan-Godfrey test.

7. Park test

For the model $Y_i = \beta_1 + \beta_2 X_i + u_i$, assume $\sigma_i^2 = \sigma^2 X_i^{\beta} e^{v_i}$, or in logs

$$\ln \sigma_i^2 = \ln \sigma^2 + \beta \ln X_i + v_i.$$

Since $\sigma_i^2$ is not known, we use the squared OLS residuals $\hat u_i^2$ in its place:

$$\ln \hat u_i^2 = \ln \sigma^2 + \beta \ln X_i + v_i.$$

If the coefficient $\beta$ is statistically different from zero, heteroscedasticity is present.
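A minimal NumPy sketch of the Park test on synthetic data (everything here — the names, the simulated variance pattern, the plain t-statistic on the auxiliary slope — is an illustrative assumption, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
x = rng.uniform(1.0, 10.0, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5 * x)   # Var(u_i) proportional to X_i^2

# Step 1: OLS residuals from the original regression
X = np.column_stack([np.ones(n), x])
u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# Step 2: auxiliary regression ln(u_i^2) = ln(sigma^2) + beta * ln(X_i) + v_i
Z = np.column_stack([np.ones(n), np.log(x)])
lu2 = np.log(u**2)
coef, *_ = np.linalg.lstsq(Z, lu2, rcond=None)
beta_hat = coef[1]

# t-statistic for beta; a value far from zero signals heteroscedasticity
resid = lu2 - Z @ coef
s2 = (resid @ resid) / (n - 2)
se = np.sqrt(s2 * np.linalg.inv(Z.T @ Z)[1, 1])
t_stat = beta_hat / se
```

Here the simulated variance is $\sigma_i^2 = 0.25 X_i^2$, so the auxiliary slope should come out near 2 and the t-statistic well above any conventional critical value.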

8. Goldfeld-Quandt test

For $Y_i = \beta_1 + \beta_2 X_i + u_i$:

1. Order your sample by $X$ from lowest to highest. Omit the $c$ central observations and divide the remainder into two samples of $(n-c)/2$ observations each.
2. Run two regressions on the two samples and obtain $RSS_1$ and $RSS_2$, where $RSS_1$ is the RSS from the small-$X$ sample.
3. Each RSS has $\frac{n-c}{2} - k$ degrees of freedom. Calculate

$$\lambda = \frac{RSS_2 / df}{RSS_1 / df},$$

which follows the $F$ distribution with numerator and denominator degrees of freedom each equal to $\frac{n - c - 2k}{2}$.
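The three steps above can be sketched in NumPy (synthetic data; the choices n = 200, c = 40 and the variance pattern are illustrative assumptions, and SciPy is used only for the F tail probability):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, c, k = 200, 40, 2                    # omit c central observations; k parameters
x = np.sort(rng.uniform(1.0, 10.0, n))  # step 1: sample ordered by X
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.4 * x)

def rss(xs, ys):
    """Residual sum of squares from a straight-line OLS fit."""
    X = np.column_stack([np.ones(len(xs)), xs])
    r = ys - X @ np.linalg.lstsq(X, ys, rcond=None)[0]
    return r @ r

m = (n - c) // 2                        # size of each subsample
rss1 = rss(x[:m], y[:m])                # step 2: small-X subsample
rss2 = rss(x[-m:], y[-m:])              # step 2: large-X subsample

df = m - k                              # (n - c)/2 - k degrees of freedom each
lam = (rss2 / df) / (rss1 / df)         # step 3
p_value = stats.f.sf(lam, df, df)       # small p-value -> heteroscedasticity
```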

9. Breusch-Pagan-Godfrey test

For the model $Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + \dots + \beta_k X_{ki} + u_i$, assume the error variance is a function of nonstochastic variables $Z$:

$$\sigma_i^2 = f(\alpha_0 + \alpha_1 Z_{1i} + \alpha_2 Z_{2i} + \dots + \alpha_m Z_{mi}),$$

taken linearly as $\sigma_i^2 = \alpha_0 + \alpha_1 Z_{1i} + \dots + \alpha_m Z_{mi} + v_i$.

$H_0$: the residuals are homoscedastic, i.e. $\alpha_1 = \alpha_2 = \dots = \alpha_m = 0$. If you reject the null hypothesis, there is heteroscedasticity.

10. Breusch-Pagan-Godfrey test

Step 1. Estimate the original regression model and get the residuals $\hat u_1, \hat u_2, \dots, \hat u_n$.
Step 2. Obtain $\tilde\sigma^2 = \sum_{i=1}^{n} \hat u_i^2 / n$.
Step 3. Construct $p_i = \hat u_i^2 / \tilde\sigma^2$.
Step 4. Estimate the regression $p_i = \alpha_0 + \alpha_1 Z_{1i} + \alpha_2 Z_{2i} + \dots + \alpha_m Z_{mi} + v_i$.
Step 5. Obtain $\Theta = \tfrac{1}{2} ESS \sim \chi^2_{m-1}$, where $m$ is the number of parameters of the Step 4 regression.
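The five steps can be sketched as follows in NumPy (synthetic data; the variable names, the choice Z = original regressors, and the variance pattern driven by x1 are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 300
x1 = rng.uniform(1.0, 10.0, n)
x2 = rng.uniform(0.0, 5.0, n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(0.0, 0.3 * x1)  # variance driven by x1

# Step 1: residuals from the original regression
X = np.column_stack([np.ones(n), x1, x2])
u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# Step 2: sigma~^2 = sum(u_i^2) / n
sig2 = (u @ u) / n
# Step 3: p_i = u_i^2 / sigma~^2
p = u**2 / sig2
# Step 4: regress p on the Z variables (here Z = the original regressors)
coef, *_ = np.linalg.lstsq(X, p, rcond=None)
ess = ((X @ coef - p.mean())**2).sum()      # explained sum of squares
# Step 5: Theta = ESS/2 ~ chi2(m-1), m = parameters in the Step 4 regression
theta = ess / 2.0
p_value = stats.chi2.sf(theta, X.shape[1] - 1)
```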

11. White's general heteroscedasticity test

Step 1. Estimate the original regression model $Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$ and get the residuals $\hat u_i$.
Step 2. Estimate the auxiliary regression

$$\hat u_i^2 = \alpha_0 + \alpha_1 X_{1i} + \alpha_2 X_{2i} + \alpha_3 X_{1i}^2 + \alpha_4 X_{2i}^2 + \alpha_5 X_{1i} X_{2i} + v_i$$

and compute $nR^2 \sim \chi^2_m$, where $m$ is the number of regressors in the auxiliary regression (excluding the constant). If you reject the null hypothesis, there is heteroscedasticity.
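White's test differs from BPG mainly in its auxiliary regressors (levels, squares, cross product). A minimal NumPy sketch on the same kind of synthetic data (all names and the simulated pattern are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 300
x1 = rng.uniform(1.0, 10.0, n)
x2 = rng.uniform(0.0, 5.0, n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(0.0, 0.3 * x1)

# Step 1: residuals from the original regression
X = np.column_stack([np.ones(n), x1, x2])
u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# Step 2: auxiliary regression of u^2 on levels, squares and the cross product
Z = np.column_stack([np.ones(n), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(Z, u**2, rcond=None)
u2 = u**2
r2 = 1.0 - ((u2 - Z @ coef)**2).sum() / ((u2 - u2.mean())**2).sum()

lm = n * r2                             # nR^2 ~ chi2(m), m regressors excl. constant
p_value = stats.chi2.sf(lm, Z.shape[1] - 1)
```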

12. Remedial Measures: Weighted least squares; White's heteroscedasticity-consistent variances and standard errors; transformations according to the heteroscedasticity pattern.

13. LM score test

For $Y = X_1 \beta_1 + X_2 \beta_2 + u$, to test $H_0: \beta_2 = 0$:

1. Regress each element of $X_2$ onto all elements of $X_1$ and collect the residuals in the matrix $\tilde r$.
2. Form the products $\tilde u \tilde r$, where $\tilde u$ are the residuals from the restricted regression of $Y$ on $X_1$ alone.
3. Run the regression of 1 on $\tilde u \tilde r$.
4. Under $H_0: \beta_2 = 0$, the statistic $N - SSR_0 \sim \chi^2_{k_2}$, where $SSR_0$ is the sum of squared residuals from step 3 and $k_2$ is the number of restrictions.

The associated heteroscedasticity-robust variance estimator has the sandwich form

$$\widehat{\operatorname{Var}}(\hat\beta) = (X'X)^{-1} \Big( \sum_i x_i x_i' \hat u_i^2 \Big) (X'X)^{-1}.$$
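The four steps above can be sketched in NumPy (synthetic data; the names and the data-generating process are illustrative assumptions, with $\beta_2$ set nonzero so the test should reject):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
n = 400
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])      # included regressors
x2 = 0.5 * X1[:, 1] + rng.normal(size=n)                    # candidate regressor
X2 = x2[:, None]
y = X1 @ np.array([1.0, 2.0]) + 1.0 * x2 + rng.normal(size=n)   # beta2 = 1, H0 false

def resid(A, b):
    """Residuals from an OLS regression of b on the columns of A."""
    return b - A @ np.linalg.lstsq(A, b, rcond=None)[0]

u = resid(X1, y)                              # restricted residuals (X2 omitted)
r = np.column_stack([resid(X1, X2[:, j]) for j in range(X2.shape[1])])  # step 1
ur = u[:, None] * r                           # step 2: element-wise products
s = resid(ur, np.ones(n))                     # step 3: regress 1 on u*r
lm = n - s @ s                                # step 4: N - SSR_0 ~ chi2(k2) under H0
p_value = stats.chi2.sf(lm, X2.shape[1])
```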

14. Autocorrelation reasons: inertia; specification bias from omitted relevant variables; specification bias from an incorrect functional form; the cobweb phenomenon; lags; data manipulation; data transformation; non-stationarity.

15. Consequences: The regression coefficients are still unbiased, but the usual formula for the coefficient variances is wrong. OLS remains linear and unbiased, yet it is no longer BLUE because it is not efficient, so the t-tests and the F-test are not valid.

16. Detection: Breusch-Godfrey

For $Y_t = \beta_0 + \beta_1 X_{1,t} + \beta_2 X_{2,t} + \dots + \beta_k X_{k,t} + u_t$, run the auxiliary regression of the OLS residuals on the regressors and $p$ of their own lags:

$$\hat u_t = \alpha_0 + \alpha_1 X_{1t} + \alpha_2 X_{2t} + \dots + \alpha_k X_{kt} + \rho_1 \hat u_{t-1} + \rho_2 \hat u_{t-2} + \dots + \rho_p \hat u_{t-p} + e_t.$$

$H_0$: there is no $p$-th order serial correlation. Test statistic: $(n - p) R^2 \sim \chi^2_p$, where $n$ is the number of observations and $p$ is the number of residual lag variables.
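A minimal NumPy sketch of the Breusch-Godfrey test (synthetic data with AR(1) errors; the names, the true $\rho = 0.6$, and the choice p = 2 are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, p = 300, 2                               # test for 2nd-order serial correlation
x = rng.uniform(0.0, 10.0, n)
e = rng.normal(0.0, 1.0, n)
u = np.zeros(n)
for t in range(1, n):                       # AR(1) errors: u_t = 0.6 u_{t-1} + e_t
    u[t] = 0.6 * u[t - 1] + e[t]
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(n), x])
uhat = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# Auxiliary regression: uhat_t on X_t and p lags of uhat (first p periods dropped)
lags = np.column_stack([uhat[p - 1 - j : n - 1 - j] for j in range(p)])
Z = np.column_stack([X[p:], lags])
target = uhat[p:]
coef, *_ = np.linalg.lstsq(Z, target, rcond=None)
r2 = 1.0 - ((target - Z @ coef)**2).sum() / ((target - target.mean())**2).sum()

lm = (n - p) * r2                           # (n - p) R^2 ~ chi2(p)
p_value = stats.chi2.sf(lm, p)
```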

17. Generalized Least Squares

If the value of $\rho$ is known: starting from $Y_t = \beta_1 + \beta_2 X_t + u_t$ with $u_t = \rho u_{t-1} + v_t$, lag the equation one period and multiply by $\rho$:

$$\rho Y_{t-1} = \rho \beta_1 + \rho \beta_2 X_{t-1} + \rho u_{t-1}.$$

Subtracting gives the quasi-differenced equation

$$Y_t - \rho Y_{t-1} = \beta_1 (1 - \rho) + \beta_2 (X_t - \rho X_{t-1}) + (u_t - \rho u_{t-1}),$$

i.e. $Y_t^* = \beta_1^* + \beta_2 X_t^* + v_t$, whose error $v_t$ is serially uncorrelated. If the value of $\rho$ is not known, it must first be estimated.

18. Cochrane-Orcutt procedure

First estimate the original regression $Y_t = \beta_1 + \beta_2 X_t + u_t$ and obtain the residuals $\hat u_t$. After running the AR(1) regression $\hat u_t = \hat\rho_1 \hat u_{t-1} + v_t$, obtain the value $\hat\rho_1$ and run the GLS regression

$$Y_t - \hat\rho_1 Y_{t-1} = \beta_1^* (1 - \hat\rho_1) + \beta_2^* (X_t - \hat\rho_1 X_{t-1}) + (u_t - \hat\rho_1 u_{t-1}).$$

Using the GLS coefficients, obtain new residuals $\hat u_t^{**} = Y_t - \hat\beta_1^* - \hat\beta_2^* X_t$, estimate a new value $\hat\rho_2$ from $\hat u_t^{**} = \hat\rho_2 \hat u_{t-1}^{**} + v_t$, and continue the process until the coefficients converge.
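The iteration can be sketched in NumPy (synthetic AR(1) data with true $\rho = 0.7$; names, seed, and convergence tolerance are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
x = rng.uniform(0.0, 10.0, n)
e = rng.normal(0.0, 1.0, n)
u = np.zeros(n)
for t in range(1, n):                        # true rho = 0.7
    u[t] = 0.7 * u[t - 1] + e[t]
y = 1.0 + 2.0 * x + u

def ols(A, b):
    return np.linalg.lstsq(A, b, rcond=None)[0]

X = np.column_stack([np.ones(n), x])
beta = ols(X, y)                             # step 1: original OLS estimates
rho = 0.0
for _ in range(50):
    res = y - X @ beta                       # residuals from the original equation
    rho_new = (res[1:] @ res[:-1]) / (res[:-1] @ res[:-1])   # AR(1) coefficient
    # GLS regression on quasi-differenced data; first column is (1 - rho)
    ys = y[1:] - rho_new * y[:-1]
    Xs = np.column_stack([np.full(n - 1, 1.0 - rho_new), x[1:] - rho_new * x[:-1]])
    beta = ols(Xs, ys)                       # beta[0] = beta1, beta[1] = beta2
    if abs(rho_new - rho) < 1e-8:            # stop once rho has converged
        break
    rho = rho_new
```

Because the first transformed column is $(1 - \hat\rho)$, the intercept coefficient is $\beta_1$ itself rather than $\beta_1(1 - \hat\rho)$.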

19. Endogeneity

1. Omission of relevant variables. The true model is $Y = X_1 \beta_1 + X_2 \beta_2 + e_t$, but the model which you estimate is $Y = X_1 \beta_1 + u$. Then

$$\hat\beta_1 = (X_1' X_1)^{-1} X_1' Y = (X_1' X_1)^{-1} X_1' (X_1 \beta_1 + X_2 \beta_2 + e) = \beta_1 + (X_1' X_1)^{-1} X_1' X_2 \beta_2 + (X_1' X_1)^{-1} X_1' e,$$

or $\hat\beta_1 = \beta_1 + \delta \beta_2 + v$ with $\delta = (X_1' X_1)^{-1} X_1' X_2$. Unless $\delta = 0$ (or $\beta_2 = 0$), $E(\hat\beta_1) = \beta_1 + \delta \beta_2 \neq \beta_1$, so the estimator is biased.
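The bias decomposition can be verified directly: the short-regression slope lands on $\beta_1 + \delta \beta_2$, not on $\beta_1$. A minimal NumPy sketch (synthetic data; all names and coefficient values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 2000
x1 = rng.normal(0.0, 1.0, n)
x2 = 0.8 * x1 + rng.normal(0.0, 1.0, n)      # omitted variable, correlated with x1
beta1, beta2 = 2.0, 1.5
y = beta1 * x1 + beta2 * x2 + rng.normal(0.0, 1.0, n)   # true model includes x2

# Short regression omits x2: b1 = (X1'X1)^{-1} X1'Y
b1 = (x1 @ y) / (x1 @ x1)

# Slide 19's bias term: delta = (X1'X1)^{-1} X1'X2, so b1 centres on beta1 + delta*beta2
delta = (x1 @ x2) / (x1 @ x1)
predicted = beta1 + delta * beta2
```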

20. What to do? If the omitted variable is unrelated to the other included independent variables, the OLS estimator is still BLUE. Otherwise, use proxy variables, or use estimation methods other than OLS.

21. Measurement error: dependent variable

Suppose $Y^* = Y + v_y$, so the model which you estimate is $Y^* = X\beta + (u + v_y)$. Then

$$\hat\beta = (X'X)^{-1} X' Y^* = \beta + (X'X)^{-1} X' u + (X'X)^{-1} X' v_y,$$

$$E(\hat\beta) = \beta + (X'X)^{-1} E(X' u) + (X'X)^{-1} E(X' v_y).$$

If $E(X'u) = 0$ and, as in most cases, $E(X' v_y) = 0$, then $E(\hat\beta) = \beta$: measurement error in the dependent variable does not bias the estimator.

22. Measurement error: independent variable

Suppose $X^* = X + v_x$, so $Y = (X^* - v_x)\beta + u = X^* \beta + (u - \beta v_x) = X^* \beta + e$. Then

$$\hat\beta = (X^{*\prime} X^*)^{-1} X^{*\prime} Y = \beta + (X^{*\prime} X^*)^{-1} X^{*\prime} (u - \beta v_x),$$

$$E(\hat\beta) = \beta + (X^{*\prime} X^*)^{-1} E(X^{*\prime} u) - (X^{*\prime} X^*)^{-1} E(X^{*\prime} v_x)\, \beta.$$

Even if $E(X^{*\prime} u) = 0$, we have $E(X^{*\prime} v_x) \neq 0$ because $X^*$ contains $v_x$, so the estimator is biased.
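Both cases (slides 21 and 22) can be seen in one simulation: a minimal NumPy sketch (synthetic data; the names and the unit error variances are illustrative assumptions) showing that noise in $Y$ leaves the slope centred on $\beta$ while noise in $X$ attenuates it toward zero:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 5000
x = rng.normal(0.0, 1.0, n)                  # true regressor, Var(X) = 1
beta = 2.0
y = beta * x + rng.normal(0.0, 1.0, n)

y_star = y + rng.normal(0.0, 1.0, n)         # error in the dependent variable
x_star = x + rng.normal(0.0, 1.0, n)         # error in the regressor, Var(v_x) = 1

b_clean = (x @ y) / (x @ x)                  # OLS on correctly measured data
b_yerr = (x @ y_star) / (x @ x)              # slide 21: still centred on beta
b_xerr = (x_star @ y) / (x_star @ x_star)    # slide 22: attenuated toward zero
# plim b_xerr = beta * Var(X) / (Var(X) + Var(v_x)) = 2 * (1 / 2) = 1
```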

23. Endogenous regressors and bias. Bias: single-equation (OLS) estimators will be biased if one or more regressors are endogenous (jointly dependent). Consistency: use indirect least squares, instrumental variables, or two-stage least squares.
