A Suitable Machine Learning Technique to Learn Y=f(X,t)
I have a black-box model that accepts an input vector X (variables) and returns three outputs as functions of time: Y1(t), Y2(t) and Y3(t). In the outputs, t is discrete time with a known number of time steps. The model is a simulator that predicts the output quantities over time, so Y1(t1), Y1(t2), ... are not independent. Y1(t), Y2(t) and Y3(t) may also be related to each other, but for now we can ignore that.
I have several instances (samples) of X with their corresponding outputs. Which machine learning technique can handle this learning task and relate X to Y(t)?
I am a bit confused because I have always seen machine learning algorithms relate X to a single output Y that is not time dependent. On the other hand, time series prediction methods only look at Y = f(t) and ignore X (i.e. the input is t and the output is Y).
Any suggestion of a specific method is highly appreciated.
Accepted Answer
Ameer Hamza
2020-5-6
Yes, common feedforward neural networks are not well suited to time-series data. You could use them by defining multiple inputs (say n), each corresponding to a time-step value t(n), t(n-1), t(n-2), ..., t(1), but that is not a commonly used approach.
For time-dependent series, LSTM networks are the usual choice.
See the MATLAB examples here:
You can also look at recurrent neural networks, which are a more general form of LSTM.
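As a minimal sketch of how this could look in MATLAB (assuming the Deep Learning Toolbox), one simple way to feed a static input vector X to a sequence-to-sequence LSTM is to repeat X at every time step so the network sees a sequence of constant inputs and learns to produce the three time-varying outputs. The variable names (X, Y, xNew, numHidden) and the training options are illustrative assumptions, not a definitive recipe:

% Assumed data layout: X is numFeatures-by-numSamples (one static input
% vector per column); Y is a cell array where Y{i} is a 3-by-numTimeSteps
% matrix [Y1(t); Y2(t); Y3(t)] for sample i.
numFeatures  = size(X,1);      % length of the input vector X
numResponses = 3;              % Y1, Y2, Y3
numTimeSteps = size(Y{1},2);   % known number of time steps
numHidden    = 100;            % hypothetical hidden size, tune as needed

% Build sequence inputs by replicating each static X over the time steps
XSeq = cell(numel(Y),1);
for i = 1:numel(Y)
    XSeq{i} = repmat(X(:,i), 1, numTimeSteps);
end

% Sequence-to-sequence LSTM regression network
layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(numHidden, 'OutputMode', 'sequence')
    fullyConnectedLayer(numResponses)
    regressionLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 200, ...
    'Plots', 'training-progress');

net = trainNetwork(XSeq, Y, layers, options);

% Prediction for a new input vector xNew (numFeatures-by-1):
YPred = predict(net, repmat(xNew, 1, numTimeSteps));  % 3-by-numTimeSteps

Replicating X over time is just one way to handle a static input; an alternative would be to learn a separate mapping from X to the whole output sequence (e.g. one regression model per time step), but the LSTM keeps the temporal dependence between Y(t1), Y(t2), ... explicit.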