Minimizing the error ||Ax-b||^2 of the linear equation Ax=b using gradient descent
I want to find the error in the solution to Ax = b using gradient descent, where
E = ||Ax - b||^2
and x = [x1; x2], with x1 and x2 each ranging from -5 to 5 in steps of 0.2.
How do I use gradient descent to search for a local minimum with a known step size of 0.2 and a learning rate of 0.1? The search should stop when the difference between the previous and current value is less than 0.002. I need to find the solution for x using gradient descent, as well as the error E.
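For reference, here is a minimal MATLAB sketch of the kind of loop being asked about. The A, b, and starting point below are placeholders chosen only for illustration (they are not from the question); the learning rate and stopping tolerance follow the numbers stated above, and the -5:0.2:5 grid is used only to visualize the error surface.

A   = [2 1; 1 2];      % example matrix (placeholder, substitute your own)
b   = [7; 8];          % example right-hand side (placeholder)
eta = 0.1;             % learning rate
tol = 0.002;           % stop when |E_prev - E_new| < tol

x = [-5; -5];          % starting point on the [-5, 5] grid (placeholder)
E = norm(A*x - b)^2;   % initial error
maxIter = 1000;

for k = 1:maxIter
    grad  = 2*A'*(A*x - b);     % gradient of ||A*x - b||^2
    x     = x - eta*grad;       % gradient-descent update
    E_new = norm(A*x - b)^2;
    if abs(E - E_new) < tol     % stopping criterion from the question
        E = E_new;
        break
    end
    E = E_new;
end
fprintf('x = [%.4f; %.4f], E = %.6f after %d iterations\n', x(1), x(2), E, k);

% Optional: visualize E over the stated grid (step 0.2 in each direction)
[X1, X2] = meshgrid(-5:0.2:5, -5:0.2:5);
Egrid = arrayfun(@(u, v) norm(A*[u; v] - b)^2, X1, X2);
surf(X1, X2, Egrid); xlabel('x_1'); ylabel('x_2'); zlabel('E');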
4 Comments
Hiro Yoshino
2022-12-20
You need to derive the derivative of the error function: gradient descent requires it to move the point of interest to the next position.
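For the quadratic error above, that derivative has a standard closed form (written here in general terms, not tied to any particular A or b):

E(x) = ||A*x - b||^2 = (A*x - b)'*(A*x - b)
grad E(x) = 2*A'*(A*x - b)

so the gradient-descent update with learning rate eta is

x_new = x - eta * grad E(x) = x - 2*eta*A'*(A*x - b)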