Using tapped delays when training an artificial neural network for dynamic system modeling

2 views (last 30 days)
Hello, I'm trying to use an artificial neural network to create a model of the system.
From what I have gathered so far, I should first record the response of the system to an arbitrary input and use that data to train my network. After discretisation, this second-order difference equation () would describe the dynamic behaviour of the system.
What is the method for implementing these delays in my neural network? Should I simply feed the delayed outputs and inputs of the plant as inputs to my neural network, or is there an easier way to achieve this?
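For reference, one common way to handle such delays in MATLAB is a NARX network (narxnet in the Deep Learning Toolbox), which builds the tapped delay lines for you rather than requiring the delayed signals to be assembled by hand. A minimal sketch, assuming the recorded plant input and output are stored in 1-by-N cell arrays u and y (names assumed here) and that two delays on each signal match the second-order difference equation:

    % Sketch: NARX network with tapped delay lines (requires Deep Learning Toolbox).
    % u, y are assumed names for the recorded plant input and output,
    % stored as 1-by-N cell arrays of samples, e.g. u = num2cell(uVec).
    inputDelays    = 1:2;   % delayed plant inputs  u(k-1), u(k-2)
    feedbackDelays = 1:2;   % delayed plant outputs y(k-1), y(k-2)
    hiddenSize     = 10;    % number of hidden neurons (a tuning choice)

    net = narxnet(inputDelays, feedbackDelays, hiddenSize);

    % preparets shifts the series so the tapped delay lines are filled correctly
    [Xs, Xi, Ai, Ts] = preparets(net, u, {}, y);

    net  = train(net, Xs, Ts, Xi, Ai);   % open-loop (series-parallel) training
    yhat = net(Xs, Xi, Ai);              % one-step-ahead predictions

    % For use as a standalone simulation model, close the feedback loop
    netc = closeloop(net);

The manual alternative is to build the regressor yourself, e.g. rows [u(k-1), u(k-2), y(k-1), y(k-2)] as inputs and y(k) as the target, and train a static feedforward network on that matrix; narxnet automates exactly those tapped delays and additionally lets you close the loop afterwards.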

Answers (0)
