How can I apply the ReLU activation function in the Levenberg-Marquardt algorithm for training a neural network?

30 views (last 30 days)
x = input';                       % inputs, transposed to features-by-samples
t = target';                      % targets, transposed the same way
trainFcn = 'trainlm';             % Levenberg-Marquardt backpropagation
hiddenLayerSize = 10;
net = feedforwardnet(hiddenLayerSize, trainFcn);
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio   = 15/100;
net.divideParam.testRatio  = 15/100;
[net, tr] = train(net, x, t);     % tr holds the training record

Answers (1)

Varun Sai Alaparthi, 2022-11-22
Hello Ashish,
You can use 'poslin' as the transfer function for the hidden layer, which is the same as applying the ReLU activation function.
You can set 'poslin' with this line:
net.layers{1}.transferFcn = 'poslin';
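Putting it together with the setup from the question, a minimal sketch might look like this (assuming `x` and `t` are already defined as features-by-samples matrices, as in the original code):

```matlab
% Feedforward network trained with Levenberg-Marquardt ('trainlm')
net = feedforwardnet(10, 'trainlm');

% Use ReLU ('poslin') in the hidden layer; the output layer stays linear
net.layers{1}.transferFcn = 'poslin';

% Same data split as in the question
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio   = 15/100;
net.divideParam.testRatio  = 15/100;

% Train, then evaluate on the inputs
[net, tr] = train(net, x, t);
y = net(x);
perf = perform(net, t, y);   % mean squared error by default
```

Note that the transfer function must be set before calling `train`, since training optimizes the weights under the chosen activation.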
I hope this information helps and please reach out for any further issues.

