Monte Carlo simulation of a linear regression model with a lagged dependent variable
I have a linear regression model with a lagged dependent variable: y_t = beta_0 + beta_1 * y_{t-1} + u_t. The starting value is y_0 = 2, and I know the true coefficients, beta_0 = 2 and beta_1 = 1. How can I perform a Monte Carlo simulation that estimates the bias of the OLS coefficient estimates?
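A common approach is to repeatedly simulate the model from its known parameters, re-estimate it by OLS in each replication, and compare the average estimates with the true values. Below is a minimal MATLAB sketch along those lines; the sample size T = 100, the number of replications R = 10000, and the standard normal errors u_t are illustrative assumptions, not part of the question.

% Monte Carlo study of OLS bias in y_t = beta0 + beta1*y_{t-1} + u_t
% Assumed settings (not from the question): T = 100, R = 10000, u_t ~ N(0,1)
rng(1);                          % for reproducibility
T     = 100;                     % observations per simulated sample
R     = 10000;                   % Monte Carlo replications
beta0 = 2;  beta1 = 1;  y0 = 2;  % true parameters and starting value

betaHat = zeros(R, 2);           % rows: [beta0hat, beta1hat] per replication
for r = 1:R
    u = randn(T, 1);             % simulated errors
    y = zeros(T, 1);
    ylag = y0;
    for t = 1:T
        y(t) = beta0 + beta1*ylag + u(t);
        ylag = y(t);
    end
    % OLS regression of y_t on a constant and y_{t-1} (y_0 used as the first lag)
    X = [ones(T, 1), [y0; y(1:end-1)]];
    betaHat(r, :) = (X \ y)';
end

bias = mean(betaHat) - [beta0, beta1];   % Monte Carlo estimate of the bias
fprintf('Bias of beta0hat: %.4f\nBias of beta1hat: %.4f\n', bias(1), bias(2));

The sketch conditions on the fixed starting value y_0 = 2 by using it as the first regressor value, so all T observations enter the regression. Note that with beta_1 = 1 the process has a unit root; the same loop can be rerun with other values such as beta_1 = 0.5 to compare the bias across parameter settings.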