Why does MATLAB automatically set the activation functions for a neural network like this?

20 views (last 30 days)
I am asking myself why MATLAB always automatically chooses tan-sigmoid (tansig) for the hidden layer and purelin for the output layer as activation functions.
If this default is based on a study showing that these activation functions are more efficient than others, please let me know.
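For reference, a minimal sketch (using feedforwardnet from the Deep Learning Toolbox, formerly the Neural Network Toolbox) that shows the defaults in question and how they can be inspected or overridden; the hidden-layer size is an illustrative choice:

    net = feedforwardnet(10);               % one hidden layer with 10 neurons
    disp(net.layers{1}.transferFcn)         % 'tansig'  -- hidden-layer default
    disp(net.layers{2}.transferFcn)         % 'purelin' -- output-layer default
    net.layers{1}.transferFcn = 'logsig';   % the default can be overridden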

3 Answers

Greg Heath 2019-6-29
That is a standard configuration for a neural net. Its operation is explained in every elementary text.
Thank you for formally accepting my answer
Greg
3 Comments
Greg Heath 2019-7-30
Edited: Greg Heath 2019-7-30
Sorry, I lost all of my several hundred books via a moving van error.
See your library.
Greg



Greg Heath 2019-7-30
The simplest useful approximation is a series of blocks with different heights and widths.
The simplest useful DIFFERENTIABLE approximation is a series of ROUNDED blocks with different heights and widths.
Combining sigmoids fits the bill!
GREG
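A quick illustration of the point above: the difference of two shifted tan-sigmoids forms one such rounded block (a smooth bump), and a sum of bumps can approximate a 1-D function. The shift and steepness values below are illustrative choices, not from the answer:

    x = linspace(-5, 5, 500);
    bump = tansig(2*(x + 1)) - tansig(2*(x - 1));   % one rounded block, width ~2
    plot(x, bump), grid on
    title('A rounded block built from two tansig units')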

Sai Bhargav Avula 2019-8-16
As mentioned by others, that's the default setup in MATLAB.
Coming to the comparison between different activation functions:
It is generally recommended to use ReLU as the activation function. If your model suffers from dead neurons during training, you should use a leaky ReLU or Maxout function.
Sigmoid and tanh are generally not preferred because they suffer from the vanishing gradient problem, which makes training difficult and degrades the accuracy and performance of a deep neural network model.
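As a sketch of that suggestion (layer objects from the Deep Learning Toolbox, all available in R2019a; the sizes and the 0.01 negative slope are illustrative assumptions), a ReLU-style regression network could be set up like this instead of the classic tansig/purelin pair:

    layers = [
        imageInputLayer([1 1 4])     % four scalar features as a 1x1x4 input
        fullyConnectedLayer(10)
        leakyReluLayer(0.01)         % small slope for x < 0 helps avoid dead neurons
        fullyConnectedLayer(1)
        regressionLayer];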
