Could anyone help me include sine, cosine, and tanh activation functions when training a neural network?
In my code I have defined the layers as:
layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(numHiddenUnits1)
    reluLayer
    fullyConnectedLayer(numHiddenUnits2)
    reluLayer
    fullyConnectedLayer(numClasses)
    reluLayer
    regressionLayer]
Now I want to train the network using sine, cosine, and tanh activations instead of ReLU. Could anyone please help me with this?
Answer (1)
Akshat
2024-8-27
Hi Jaah,
I see you want to use different activation functions instead of ReLU.
In the case of "tanh", you can use the tanhLayer from the Deep Learning Toolbox; see the tanhLayer documentation for details.
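As a minimal sketch, your layer array with tanhLayer substituted for each reluLayer (reusing the inputSize, numHiddenUnits1, numHiddenUnits2, and numClasses variables from your code) could look like this:
layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(numHiddenUnits1)
    tanhLayer                          % tanh activation instead of ReLU
    fullyConnectedLayer(numHiddenUnits2)
    tanhLayer
    fullyConnectedLayer(numClasses)
    tanhLayer                          % note: a tanh here bounds the network output to [-1, 1]
    regressionLayer]
Everything else (your training options and the call to trainNetwork) stays the same.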
Using "sine" and "cosine" as activation functions is not a viable choice, as "sine" and "cosine" are periodic functions and they have many local extrema. Thus, we lose the uniqueness of values. Due to this reason, it is not a popular choice to use these functions as the activation functions.
Hope this helps!
Akshat