gradient descent for custom function
5 views (last 30 days)
I have four equations:
1) e = m - y
2) y = W_3 * h
3) h = z + W_2 * z + f
4) f = W_1 * x
I want to update W_1, W_2, and W_3 to minimize the cost function J = e^T * e using gradient descent.
Here x is an input, y is the output, and m is the desired value for each sample in the dataset.
I would like to perform the updates
W_1 = W_1 - eta * grad(J)_W_1
W_2 = W_2 - eta * grad(J)_W_2
W_3 = W_3 - eta * grad(J)_W_3
Going through the documentation, I found that you can train standard neural networks. But note that I have some custom functions, so I guess I would need to use a built-in optimization function instead.
Any ideas?
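For concreteness, this is the kind of manual update I have in mind. The following is just a sketch with made-up dimensions and dummy data (nx, nz, ny, the random initialization, and eta are placeholders), with the gradients worked out by the chain rule:

nx = 4; nz = 3; ny = 2;                 % placeholder dimensions
x  = randn(nx,1); z = randn(nz,1);      % dummy inputs (column vectors)
m  = randn(ny,1);                       % dummy target
W1 = randn(nz,nx);                      % f = W1*x must match the size of z
W2 = randn(nz,nz);
W3 = randn(ny,nz);
eta = 0.01;                             % learning rate (may need tuning)

for iter = 1:1000
    % forward pass
    f = W1*x;
    h = z + W2*z + f;
    y = W3*h;
    e = m - y;                          % J = e'*e

    % backward pass (chain rule):
    % dJ/dy  = -2*e
    % dJ/dW3 = (dJ/dy)*h',  dJ/dh = W3'*(dJ/dy)
    % dJ/dW2 = (dJ/dh)*z',  dJ/dW1 = (dJ/dh)*x'   (since dh/df = identity)
    dy  = -2*e;
    gW3 = dy*h';
    dh  = W3'*dy;
    gW2 = dh*z';
    gW1 = dh*x';

    % gradient descent updates
    W1 = W1 - eta*gW1;
    W2 = W2 - eta*gW2;
    W3 = W3 - eta*gW3;
end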
2 Comments
Matt J
2024-4-24
Here x is an input, y is the output, and m is the desired value for each sample in the dataset.
It looks like z is also an input, since it is not given by any of the other equations.
Answers (2)
Matt J
2024-4-24
Edited: Matt J, 2024-4-24
so I guess I would need to use a built-in optimization function instead.
No, not necessarily. Your equations can be implemented with fullyConnectedLayers and additionLayers.
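For example, something along these lines (a rough, untested sketch assuming the Deep Learning Toolbox; nx, nz, and ny stand for your input and output sizes, and the biases are frozen at zero so each layer computes a pure matrix product):

lgraph = layerGraph();
lgraph = addLayers(lgraph, featureInputLayer(nx,'Name','x'));
lgraph = addLayers(lgraph, featureInputLayer(nz,'Name','z'));
lgraph = addLayers(lgraph, fullyConnectedLayer(nz,'Name','W1', ...
    'BiasLearnRateFactor',0));                           % f = W1*x
lgraph = addLayers(lgraph, fullyConnectedLayer(nz,'Name','W2', ...
    'BiasLearnRateFactor',0));                           % W2*z
lgraph = addLayers(lgraph, additionLayer(3,'Name','h')); % h = z + W2*z + f
lgraph = addLayers(lgraph, fullyConnectedLayer(ny,'Name','W3', ...
    'BiasLearnRateFactor',0));                           % y = W3*h

lgraph = connectLayers(lgraph,'x','W1');
lgraph = connectLayers(lgraph,'z','W2');
lgraph = connectLayers(lgraph,'z','h/in1');              % the bare z term
lgraph = connectLayers(lgraph,'W2','h/in2');
lgraph = connectLayers(lgraph,'W1','h/in3');
lgraph = connectLayers(lgraph,'h','W3');

net = dlnetwork(lgraph);

You can then minimize e'*e in a custom training loop (dlfeval/dlgradient, with manual updates or sgdmupdate with zero momentum), which reproduces exactly the W = W - eta*grad updates you wrote down.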
3 Comments