Mean Square Error with ML Estimate
Hello,
I've been struggling with some code in my Detection and Estimation Theory course.
I have a signal of known frequency w0 given by:
y(t) = A*sin(w0*t) + Sigma(t), where Sigma(t) is a noise sequence of i.i.d. random variables with zero mean and variance a^2.
We need to estimate the parameter A when the noise is Gaussian.
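If I understand correctly, when the noise is Gaussian, maximizing the likelihood is the same as minimizing the sum of squared errors, so the ML estimate should reduce to the least-squares solution (please correct me if this is wrong; here N is just my notation for the number of observed samples):
AEst = sum(y(n)*sin(w0*n)) / sum(sin(w0*n)^2), with both sums taken over n = 1..N.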
The task is to compute the mean square error (MSE), defined as MSE(AEst) = E[(AEst - A)^2], by averaging simulation results over 5,000 independent runs (assuming A = 1 and w0 = 2*pi*0.2).
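Here is a rough sketch of what I have so far in MATLAB. The record length N = 100 and the noise standard deviation a = 0.5 are just placeholder values I picked (they are not given in the assignment), and the estimator is the least-squares form written above, so please point out anything that looks wrong:

% Monte Carlo estimate of the MSE of the ML (least-squares) amplitude estimate
A     = 1;            % true amplitude
w0    = 2*pi*0.2;     % known frequency
N     = 100;          % samples per run (my choice)
a     = 0.5;          % noise standard deviation (my choice)
nRuns = 5000;         % independent Monte Carlo runs

n = (0:N-1).';        % sample indices
s = sin(w0*n);        % known signal shape

AEst = zeros(nRuns, 1);
for k = 1:nRuns
    y       = A*s + a*randn(N, 1);   % one noisy realization
    AEst(k) = (s.'*y) / (s.'*s);     % ML / least-squares estimate of A
end

MSE = mean((AEst - A).^2);           % Monte Carlo approximation of E[(AEst - A)^2]
fprintf('Estimated MSE = %g\n', MSE);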
Can anyone help me with approaching this problem?
0 Comments
Answers (0)