Using tapped delays in the process of training an artificial neural network for the purpose of dynamic system modeling.
Hello, I'm trying to use an artificial neural network to create a model for this system: ![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1390249/image.png)
From what I understand so far, I should first record the system's response to an arbitrary input and use that data to train my network. After discretization, the following second-order difference equation describes the dynamic behaviour of the system:
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1390254/image.png)
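Generating the training data by simulating the difference equation might look like the NumPy sketch below. The coefficients `a1, a2, b1, b2` and the general form `y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]` are placeholder assumptions for illustration, since the actual equation only appears in the image above:

```python
import numpy as np

# Assumed second-order difference equation (placeholder coefficients,
# not taken from the question's image):
#   y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
a1, a2 = 1.5, -0.7
b1, b2 = 0.1, 0.05

rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, size=500)   # arbitrary excitation input
y = np.zeros_like(u)                    # zero initial conditions

# Simulate the plant step by step to collect input/output training data
for k in range(2, len(u)):
    y[k] = a1 * y[k - 1] + a2 * y[k - 2] + b1 * u[k - 1] + b2 * u[k - 2]
```

A richly varying excitation signal (random steps, chirps, or uniform noise as here) is generally preferred so the training data covers the dynamics the network must learn.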
Now, what is the right way to implement these delays in my neural network? Should I simply feed the delayed outputs and inputs of the plant as inputs to my neural network, or is there an easier way to achieve this?
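Feeding the delayed plant outputs and inputs directly as network inputs is in fact the standard tapped-delay-line (NARX-style) formulation: each training sample is the stacked vector of past outputs and inputs, and the target is the current output. A minimal sketch of building such a regressor matrix, assuming two delays on both the input and the output (function name and toy sequences are my own, for illustration):

```python
import numpy as np

def tapped_delay_regressors(u, y, nu=2, ny=2):
    """Build one feature row per time step from a tapped delay line:
    X[k] = [y[k-1], ..., y[k-ny], u[k-1], ..., u[k-nu]], target T[k] = y[k]."""
    start = max(nu, ny)  # skip steps with incomplete history
    X = []
    for k in range(start, len(y)):
        row = [y[k - i] for i in range(1, ny + 1)] + \
              [u[k - i] for i in range(1, nu + 1)]
        X.append(row)
    return np.array(X), np.asarray(y)[start:]

# Toy sequences just to show the shape of the result
u = np.arange(6, dtype=float)       # u = [0, 1, 2, 3, 4, 5]
y = np.arange(10, 16, dtype=float)  # y = [10, 11, 12, 13, 14, 15]
X, T = tapped_delay_regressors(u, y)
# X[0] corresponds to k = 2: [y[1], y[0], u[1], u[0]], with target y[2]
```

Any static feedforward network can then be trained on `(X, T)`; the delays live entirely in the data preparation, not in the network architecture itself.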