Class Notes in Statistics and Econometrics, Part 19
CHAPTER 37

OLS With Random Constraint

A Bayesian considers the posterior density the full representation of the information provided by sample and prior information. Frequentists have discovered that one can interpret the parameters of this density as estimators of the key unknown parameters, and that these estimators have good sampling properties. Therefore they have tried to re-derive the Bayesian formulas from frequentist principles.

If $\beta$ satisfies the constraint $R\beta = u$ only approximately or with uncertainty, it has therefore become customary to specify
$$R\beta = u + \eta, \qquad \eta \sim (o, \tau^2 I), \qquad \eta \text{ and } \varepsilon \text{ uncorrelated.}$$
Here it is assumed $\tau^2 > 0$, i.e., the covariance matrix of $\eta$ is positive definite.

Both interpretations are possible here: either $u$ is a constant, which means necessarily that $\beta$ is random, or $\beta$ is, as usual, a constant and $u$ is random, coming from whoever happened to do the research (this is why it is called "mixed estimation").

It is the correct procedure in this situation to do GLS on the model
$$\begin{bmatrix} y \\ u \end{bmatrix} = \begin{bmatrix} X \\ R \end{bmatrix}\beta + \begin{bmatrix} \varepsilon \\ -\eta \end{bmatrix}, \qquad \begin{bmatrix} \varepsilon \\ -\eta \end{bmatrix} \sim \left( \begin{bmatrix} o \\ o \end{bmatrix}, \begin{bmatrix} \sigma^2 I & O \\ O & \tau^2 I \end{bmatrix} \right).$$
Therefore
$$\hat{\beta} = (X^\top X + \kappa^2 R^\top R)^{-1}(X^\top y + \kappa^2 R^\top u), \qquad \text{where } \kappa^2 = \sigma^2/\tau^2.$$

This $\hat{\beta}$ is the BLUE if in repeated samples $\beta$ and $u$ are drawn from such distributions that $R\beta - u$ has mean $o$ and variance $\tau^2 I$, but the expected value of $\beta$ can be anything. If one considers both $\beta$ and $u$ fixed, then $\hat{\beta}$ is a biased estimator whose properties depend on how close the true value of $R\beta$ is to $u$.

Under the assumption of constant $\beta$ and $u$, the MSE matrix of $\hat{\beta}$ is smaller than that of the OLS estimator if and only if the true parameter values $\beta$, $u$, and $\sigma^2$ satisfy
$$(R\beta - u)^\top \Bigl( \tfrac{2}{\kappa^2} I + R(X^\top X)^{-1} R^\top \Bigr)^{-1} (R\beta - u) \le \sigma^2.$$
This condition is a simple extension of the corresponding condition for imposing an exact constraint which does not hold.

An estimator of the form
$$\hat{\beta} = (X^\top X + \kappa^2 I)^{-1} X^\top y,$$
where $\kappa^2$ is a constant, is called ordinary ridge regression. Ridge regression can be considered the imposition of a random constraint, even though it does not hold, again in an effort to trade bias for variance. This is similar to the imposition of an exact constraint which does not hold.
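The following numpy sketch is not part of the original notes; it is a minimal numerical illustration of the formulas above under assumed data (the design matrix, the restriction matrix R, the variances, and all variable names such as beta_true are hypothetical). It checks that the closed-form mixed estimator agrees with GLS on the stacked model, evaluates the MSE-dominance condition, and computes ordinary ridge regression as the special case R = I, u = o.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 50, 3
sigma2, tau2 = 1.0, 0.25          # error variance and constraint variance (arbitrary choices)
kappa2 = sigma2 / tau2            # kappa^2 = sigma^2 / tau^2 as in the text

# Simulated regression y = X beta + eps, eps ~ (o, sigma^2 I)
X = rng.normal(size=(n, k))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Stochastic constraint R beta = u + eta, eta ~ (o, tau^2 I); R is a hypothetical restriction
R = np.array([[1.0, 1.0, 0.0]])
u = R @ beta_true + rng.normal(scale=np.sqrt(tau2), size=1)

# Closed form from the text: (X'X + kappa^2 R'R)^{-1}(X'y + kappa^2 R'u)
b_mixed = np.linalg.solve(X.T @ X + kappa2 * R.T @ R,
                          X.T @ y + kappa2 * R.T @ u)

# Same estimator obtained as GLS on the stacked model [y; u] = [X; R] beta + [eps; -eta],
# weighting the two blocks by their precisions 1/sigma^2 and 1/tau^2
Xs = np.vstack([X, R])
ys = np.concatenate([y, u])
w = np.concatenate([np.full(n, 1 / sigma2), np.full(len(u), 1 / tau2)])
b_gls = np.linalg.solve(Xs.T @ (w[:, None] * Xs), Xs.T @ (w * ys))

# Evaluate the MSE-dominance condition for the (here known) true beta and u:
# (R beta - u)' [ (2/kappa^2) I + R (X'X)^{-1} R' ]^{-1} (R beta - u) <= sigma^2
delta = R @ beta_true - u
M = (2 / kappa2) * np.eye(len(u)) + R @ np.linalg.solve(X.T @ X, R.T)
lhs = float(delta @ np.linalg.solve(M, delta))

# Ordinary ridge regression: the random constraint beta = o + eta, i.e. R = I, u = o
b_ridge = np.linalg.solve(X.T @ X + kappa2 * np.eye(k), X.T @ y)

print(np.allclose(b_mixed, b_gls))   # True: the two derivations of the mixed estimator agree
print(lhs <= sigma2)                  # mixed estimator beats OLS in the MSE-matrix sense iff True
print(b_mixed, b_ridge)
```

As a design note, the equivalence checked by np.allclose is exactly why the estimator is called "mixed": the prior information enters the GLS computation as additional observation rows, weighted by how precise that information is relative to the data.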