How to use the gradient with respect to a given vector?

I am trying to implement an algorithm that takes two matrices, R (m x m) and K (n x n), as the inputs of a function f(R,K). The starting point is z = [r; n], where r is the vectorization of R and n is the vectorization of K. The starting values of R and K are given, so z is also known. I need to calculate r(z) = [grad_r f(R,K); grad_n f(R,K)].
Once this is calculated I will do further calculations and use the result as the next starting point, so the iteration drives z toward r(z) = 0, which is the Karush-Kuhn-Tucker (KKT) condition, thereby optimizing R and K in the process.
I do not know whether I should use a numerical or a symbolic gradient, or even whether what I am trying to do is possible. The paper I am referencing appears to use this method.
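For the numerical option, the best I can picture is a plain finite-difference sketch like the one below (f_RK here is a placeholder handle for my objective evaluated at R and K, and Dr, Dn and z0 are the mapping matrices and the known starting point; none of this is taken from the paper):
f_z = @(zz) f_RK(reshape(Dr*zz(1:16), [4,4]), ...
                 reshape(Dn*zz(17:48), [8,8]) + eye(8));
g = zeros(48,1);
h = 1e-6;                                        % finite-difference step size
for i = 1:48
    e = zeros(48,1);
    e(i) = h;
    g(i) = (f_z(z0 + e) - f_z(z0 - e)) / (2*h);  % central difference
end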
On the symbolic side, I have been trying something like this:
syms z [48 1]                          % symbolic stacked vector z = [r; n]
r = z(1:16);                           % vectorized R (4x4)
n = z(17:48);                          % vectorized K
R_hat = Dr*r;                          % Dr maps r to vec(R)
R = reshape(R_hat, [4,4]);
K_hat = Dn*n;                          % Dn expands n to the 64 entries of K - eye(8)
K = reshape(K_hat, [8,8]) + eye(8);
ft_RK = f(R, K);                       % objective expressed in z (f defined elsewhere)
g = solve(gradient(ft_RK, z) == 0, z); % stationary point of the symbolic gradient
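If the symbolic gradient goes through, my guess (not something the paper states) is that the next step would be to convert it into a numeric function and drive it to zero, for example with matlabFunction and fsolve:
g_sym = gradient(ft_RK, z);                   % 48x1 symbolic gradient
g_fun = matlabFunction(g_sym, 'Vars', {z});   % numeric function of a 48x1 vector
z_opt = fsolve(g_fun, z0);                    % z0 is the known starting point
But I am not sure whether solve, matlabFunction with fsolve, or a purely numerical gradient is the right way to go here.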
Any guidance would be greatly appreciated.

Answers (0)
