Changing the activation function to ReLU with nntool?
Hi!
I want to build a neural network consisting of 3 layers (2 ReLU layers and an output layer), with 10 neurons in each of the nonlinear layers.
I am currently using "nntool".
However, I couldn't figure out how to change the activation function to ReLU.
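For reference, this is roughly the network I am after, written at the command line instead of the GUI. I am assuming that feedforwardnet together with the 'poslin' (positive linear) transfer function is the right way to get ReLU hidden layers:

net = feedforwardnet([10 10]);        % two hidden layers, 10 neurons each
net.layers{1}.transferFcn = 'poslin'; % ReLU (positive linear) in hidden layer 1
net.layers{2}.transferFcn = 'poslin'; % ReLU (positive linear) in hidden layer 2
% the output layer keeps its default linear transfer function ('purelin')

Is there an equivalent setting inside nntool itself?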
Thanks