Adam stochastic gradient descent optimization

Version 1.0.0.0 (100.7 KB) by Dylan Muir
Matlab implementation of the Adam stochastic gradient descent optimisation algorithm
1.5K downloads
Updated 16 Aug 2017

`fmin_adam` is an implementation of the Adam optimisation algorithm (gradient descent with adaptive learning rates for each parameter, plus momentum) from Kingma and Ba [1]. Adam is designed for stochastic gradient descent problems, i.e. when only small batches of data are used to estimate the gradient on each iteration, or when stochastic dropout regularisation is used [2].
See the GitHub repository for examples:
https://github.com/DylanMuir/fmin_adam
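
For intuition, here is a minimal hand-rolled sketch of the update rule from [1] on a toy quadratic objective. This is illustrative only, not the code inside `fmin_adam`; the hyperparameter names match the arguments listed under Usage below, while the objective, variable names and iteration count are assumptions.

```matlab
% Sketch of the Adam update from [1] on a toy quadratic (illustrative only)
stepSize = 0.01; beta1 = 0.9; beta2 = 0.999; epsilon = 1e-8;

fhGrad = @(x) 2 .* (x - [1; -2]);    % gradient of the quadratic ||x - [1; -2]||^2
x = zeros(2, 1);                     % initial parameters
m = zeros(2, 1); v = zeros(2, 1);    % first- and second-moment estimates

for t = 1:2000
   g = fhGrad(x);                                        % (mini-batch) gradient
   m = beta1 .* m + (1 - beta1) .* g;                    % momentum term
   v = beta2 .* v + (1 - beta2) .* g.^2;                 % per-parameter squared-gradient average
   mHat = m ./ (1 - beta1.^t);                           % bias-corrected estimates
   vHat = v ./ (1 - beta2.^t);
   x = x - stepSize .* mHat ./ (sqrt(vHat) + epsilon);   % adaptive per-parameter step
end
% x now approximates the minimiser [1; -2]
```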

Usage:
```
[x, fval, exitflag, output] = fmin_adam(fun, x0 <, stepSize, beta1, beta2, epsilon, nEpochSize, options>)
```

See the function help for a detailed reference. The GitHub repository has a couple of examples.
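
As a sketch of the calling convention above (assuming `fmin_adam` is on the MATLAB path; the objective, data, and whether gradients must be supplied explicitly via `options` are assumptions to check against the function help):

```matlab
% Hypothetical least-squares fit of a single slope parameter
vfX = linspace(0, 1, 100);                  % input samples
vfY = 3 .* vfX + 0.1 .* randn(1, 100);      % noisy targets; true slope is 3

% Objective: mean squared error for a candidate slope w
fhObjective = @(w) mean((w .* vfX - vfY).^2);

% Call with only the required arguments; the remaining hyperparameters
% (stepSize, beta1, beta2, epsilon, nEpochSize, options) take their defaults
[wFit, fval, exitflag, output] = fmin_adam(fhObjective, 0);
```

For the stochastic mini-batch case that Adam is designed for, see the examples in the repository.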

References:
[1] Diederik P. Kingma, Jimmy Ba. "Adam: A Method for Stochastic Optimization", ICLR 2015. [https://arxiv.org/abs/1412.6980](https://arxiv.org/abs/1412.6980)

[2] Geoffrey E. Hinton, Nitish Srivastava, Alex Krizhevsky, Ilya Sutskever, and Ruslan R. Salakhutdinov. "Improving neural networks by preventing co-adaptation of feature detectors." arXiv preprint arXiv:1207.0580, 2012. [https://arxiv.org/abs/1207.0580](https://arxiv.org/abs/1207.0580)

Cite As

Dylan Muir (2025). Adam stochastic gradient descent optimization (https://github.com/DylanMuir/fmin_adam), GitHub. Retrieved .

MATLAB Release Compatibility
Created with R2016b
Compatible with any release
Platform Compatibility
Windows, macOS, Linux
Categories
Statistics and Machine Learning Toolbox



Version history
1.0.0.0 (16 Aug 2017): Updated title; updated description.

To view or report issues with this GitHub add-on, visit its GitHub repository.