Is it possible (yet) to implement a maxout activation "layer" in 2017b Deep Learning network?
Maxout is an activation function that includes ReLU and leaky ReLU as special cases; it allows for learned piecewise linear (planar/hyperplanar) activation functions, and maxout units have been reported to outperform both in a number of cases. Here's a reference: Goodfellow, I. J., Warde-Farley, D., Mirza, M., Courville, A., & Bengio, Y. (2013). Maxout networks. arXiv preprint arXiv:1302.4389. https://arxiv.org/abs/1302.4389
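For concreteness, a maxout unit computes h(x) = max_j (W_j*x + b_j) over k affine pieces; with k = 2, W_1 = I, W_2 = 0, and zero biases it reduces to ReLU, and W_2 = a*I instead gives a leaky ReLU. Here is a plain MATLAB sketch of what I would like as a layer (the function and argument names are just illustrative, not toolbox code):

function h = maxoutUnit(x, W, b)
% Maxout over k affine pieces (sketch, not a toolbox API).
% x : d-by-1 input vector
% W : m-by-d-by-k weight array (k pieces, m outputs each)
% b : m-by-k bias matrix
k = size(W, 3);
z = zeros(size(W, 1), k);
for j = 1:k
    z(:, j) = W(:, :, j) * x + b(:, j);   % j-th affine piece
end
h = max(z, [], 2);                        % elementwise max over the pieces
end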
Ultimately I’m interested in playing with architectures like this one, which use maxouts extensively:
Zhang, Y., Pezeshki, M., Brakel, P., Zhang, S., Laurent, C., Bengio, Y., & Courville, A. (2017). Towards end-to-end speech recognition with deep convolutional neural networks. arXiv preprint arXiv:1701.02720. https://arxiv.org/abs/1701.02720 (Speech recognition using convolutional nets with maxout activations.)
But I simply can't see any way to fake a maxout activation in the R2017b convolutional network framework. While I'm a MATLAB vet (since Version 4, I think), I'm a total newbie to MATLAB deep learning networks, so maybe I'm missing something. Any suggestions greatly appreciated.
-Terry Nearey
Answer (1)
Pankaj Wasnik
2 Jan 2018
Hi, you can try https://github.com/yechengxi/LightNet, a simpler CNN toolbox that is easier to understand and to debug. There you can try implementing the maxout layer yourself. I am trying the same thing; if I finish before you, I will share the code.
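As a starting point, the channel-wise maxout used in conv nets (the form in the Goodfellow et al. paper) only needs a reshape and a max in the forward pass: the convolution below it produces k times as many channels as you want maxout units, and the activation takes the max within each group of k consecutive channels. A rough sketch (the function name and layout here are illustrative, not LightNet's API):

function y = maxoutForward(x, k)
% Channel-wise maxout on an H-by-W-by-C feature map (sketch only).
% Requires mod(C, k) == 0; every k consecutive channels form one unit.
[h, w, c] = size(x);
x = reshape(x, h, w, k, c / k);            % group channels into k pieces per unit
y = reshape(max(x, [], 3), h, w, c / k);   % max over the pieces
end

For the backward pass you would route the gradient only to the winning channel in each group, the same bookkeeping as in max pooling.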
Regards, Pankaj Wasnik