Changing the activation function to ReLU in nntool?

Hi!
I want to build a neural network with 3 layers (2 ReLU layers and an output layer), with 10 neurons in each of the nonlinear layers.
I am currently using "nntool".
However, I couldn't figure out how to change the activation function to ReLU.
Thanks
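
For reference, this is a rough command-line sketch of the network I have in mind, assuming feedforwardnet and the 'poslin' (positive linear, i.e. ReLU) transfer function can be used instead of the nntool GUI, which does not seem to list ReLU in its dropdown:

    % Two hidden layers of 10 neurons each, plus an output layer
    net = feedforwardnet([10 10]);
    net.layers{1}.transferFcn = 'poslin';   % ReLU on hidden layer 1
    net.layers{2}.transferFcn = 'poslin';   % ReLU on hidden layer 2
    % Output layer (layer 3) keeps the default 'purelin' transfer function
    % net = train(net, x, t);               % train as usual once x and t exist

Is there an equivalent way to get this ReLU setting from within nntool itself?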

Answers (0)
