When the network output is compared with the desired output and an error is found, the weight vector w(k) associated with the ith processing unit at time instant k is corrected (adjusted) as
w(k+1) = w(k) + D[w(k)]
where D[w(k)] is the change in the weight vector; it is given explicitly for each learning rule.
The Perceptron learning rule is given by:
w(k+1) = w(k) + eta*[ y(k) - sgn(w'(k)*x(k)) ]*x(k)
where eta is the learning rate, y(k) is the desired output, x(k) is the input pattern, and sgn(.) is the signum function.
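As an illustration of how this update can be applied in MATLAB, here is a minimal sketch (not the submitted code itself); the data matrix X, the label vector y, the choice of eta, and the epoch count are assumptions made for the example.

% Minimal perceptron learning sketch (illustrative values, not from the submission)
X   = [ 0.5  1.0;  -1.0  0.3;  0.8 -0.7;  -0.4 -1.2 ];  % one input pattern per row
y   = [ 1; -1; 1; -1 ];                                  % desired outputs in {-1, +1}
eta = 0.1;                                               % learning rate

w = zeros(size(X, 2), 1);            % initial weight vector w(0)
for epoch = 1:100
    for k = 1:size(X, 1)
        x_k = X(k, :)';              % current input pattern x(k)
        % y(k) - sgn(w'(k)*x(k));  note MATLAB's sign(0) = 0, so a zero
        % activation is treated here as a misclassification to be corrected
        err = y(k) - sign(w' * x_k);
        w   = w + eta * err * x_k;   % w(k+1) = w(k) + eta*err*x(k)
    end
end
disp(w)                              % learned weight vector

For this small, linearly separable data set the loop converges quickly; in general the update stops changing w once every pattern is classified correctly.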
Cite As
Bhartendu (2025). Perceptron Learning (https://www.mathworks.com/matlabcentral/fileexchange/63046-perceptron-learning), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Created with R2016a
Compatible with any release
Platform Compatibility
Windows, macOS, Linux
Categories
- AI and Statistics > Deep Learning Toolbox > Function Approximation, Clustering, and Control > Function Approximation and Clustering > Define Shallow Neural Network Architectures
Version | Published | Release Notes
---|---|---
1.0.0.0 | |