Applied Econometrics, Lecture 2: Simple Regression Model

"It does require maturity to realize that models are to be used but not to be believed." (Henri Theil, Principles of Econometrics)

Written by Nguyen Hoang Bao, May 20, 2004

1 Assumptions of the two-variable linear regression model

The estimation process begins by assuming, or hypothesizing, that the least squares linear regression model (drawn from a sample) is valid. The formal two-variable linear regression model is based on the following assumptions:

(1) The population regression is adequately represented by a straight line: E(Yi) = μ(Xi) = β0 + β1Xi
(2) The error terms have zero mean: E(εi) = 0
(3) The error terms have constant variance (homoscedasticity): V(εi) = σ²
(4) The error terms have zero covariance (no correlation): E(εiεj) = 0 for all i ≠ j
(5) X is non-stochastic, implying that E(Xiεi) = 0

2 Least squares estimation

The sample regression model can be written as follows:

Yi = b0 + b1Xi + ei

Its least squares estimators b0 and b1 are obtained by minimizing the sum of squared residuals with respect to b0 and b1:

Σei² = Σ(Yi − b0 − b1Xi)² → min

The resulting estimators of b1 and b0 are then given by:

b1 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)²
b0 = Ȳ − b1X̄

3 Analysis of variance

The least squares regression splits the variation in the Y variable into two components: the explained variation, due to the variation in Xi, and the residual variation:

TSS = RSS + ESS
Σ(Yi − Ȳ)² = Σ(Yi − Ŷi)² + Σ(Ŷi − Ȳ)²
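The least squares formulas above can be checked numerically. The following is a minimal sketch, with hypothetical data and variable names, that computes b0 and b1 directly from the estimator formulas:

```python
# Minimal sketch of the least squares estimators (hypothetical data).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

# b1 = sum((Xi - X_bar)(Yi - Y_bar)) / sum((Xi - X_bar)^2)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
     sum((xi - x_bar) ** 2 for xi in x)

# b0 = Y_bar - b1 * X_bar
b0 = y_bar - b1 * x_bar

print(b0, b1)
```

Note that the estimated line always passes through the point of means (X̄, Ȳ), which is exactly what the formula b0 = Ȳ − b1X̄ enforces.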
where TSS is the total sum of squares, the observed variation in the dependent variable Y; RSS is the residual sum of squares; and ESS is the explained sum of squares, the variation of the predicted values Ŷi = b0 + b1Xi.

The coefficient of determination, which measures the goodness of fit of the estimated sample regression, is:

R² = ESS / TSS, with 0 ≤ R² ≤ 1

Because the explained variation cannot exceed the total variation, the maximum value of R² is one, that is, 100 percent of the variation is explained or accounted for. Conversely, its minimum value is zero, which implies that the regression explains none of the variation in Y.
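The variance decomposition and R² can likewise be verified numerically. This is a sketch with hypothetical data: it fits the line, forms the predicted values, and confirms that TSS = RSS + ESS and that R² lies between zero and one:

```python
# Sketch of the TSS = RSS + ESS decomposition and R^2 (hypothetical data).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n

# Least squares estimates, as in section 2.
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
     sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]  # predicted values

tss = sum((yi - y_bar) ** 2 for yi in y)               # total variation
rss = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # residual variation
ess = sum((yh - y_bar) ** 2 for yh in y_hat)           # explained variation

r2 = ess / tss
print(tss, rss + ess)  # identical up to rounding
print(r2)              # between 0 and 1
```

An R² near one, as in this artificial example, means almost all of the variation in Y is accounted for by the variation in X.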
