Minimizing the error of the linear system Ax=b using gradient descent

I want to find the error in the solution to Ax=b, using gradient descent.
E=||Ax-b||^2
x = [x1;x2], where x1 and x2 range between -5 and 5, with a step size of 0.2 in each direction.
How do I use gradient descent to search for a local minimum with a known grid step size of 0.2 and a learning rate of 0.1? The search should stop when the difference between the previous and current values is less than 0.002. I need to find the solution for x using gradient descent, as well as the error E.
  4 Comments
Hiro Yoshino 2022-12-20
You need to derive the derivative of the error function. Gradient descent requires it to move the current point to the next one.
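For reference, a minimal version of that derivative, assuming A and b are already defined and x is a 2-by-1 column vector: for E(x) = ||A*x - b||^2 the gradient is 2*A'*(A*x - b), e.g.
g = 2*A'*(A*x - b);   % gradient of ||A*x - b||^2 with respect to x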
Tevin 2022-12-20
Thank you. The function that I wrote already does that. My problem is that I struggle to calculate the error for all the grid values (X, Y). The array sizes are incompatible, but I am not sure how to fix that.


Accepted Answer

Matt J 2022-12-20
Edited: Matt J 2022-12-20
[X1,X2] = meshgrid(-5:0.2:5);      % grid of x1 and x2 values from -5 to 5 in steps of 0.2
x = [X1(:)'; X2(:)'];              % each column of x is one grid point [x1; x2]
E = vecnorm(A*x - b, 2, 1).^2;     % E = ||A*x - b||^2 for every grid point at once
E = reshape(E, size(X1));          % reshape back to the grid, if desired
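The accepted answer evaluates E over the whole grid; the gradient-descent search itself is not shown. A minimal sketch of that part, assuming A and b are already defined, using the learning rate (0.1) and stopping tolerance (0.002) from the question, an arbitrary starting point, and a safety cap on the number of iterations:
x = [0; 0];                    % starting guess (arbitrary choice)
lr = 0.1;                      % learning rate from the question
tol = 0.002;                   % stop when E changes by less than this
E_prev = Inf;
for k = 1:1000                 % safety cap on the number of iterations
    r = A*x - b;               % residual
    E_cur = norm(r)^2;         % E = ||A*x - b||^2
    if abs(E_prev - E_cur) < tol
        break
    end
    x = x - lr*(2*A'*r);       % step along the negative gradient 2*A'*(A*x - b)
    E_prev = E_cur;
end
With E reshaped to the grid as above, surf(X1, X2, E) gives a quick picture of the error surface and a visual check on where the descent converges.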
