Why is the derivative of a function or signal one sample shorter than the original?

14 views (last 30 days)
Hi All
Taking the derivative of y = x.^2 between 0 and 100 using diff(y) gives a signal with one less sample: instead of a length of 101, it is 100 samples. So with every derivative the signal gets one sample shorter, and if I have to integrate again, won't it need those samples back? What should I do?

Accepted Answer

Star Strider 2020-4-11
Use the gradient function to calculate the numerical derivatives. The output is the same length as the input, and it is also more accurate in that respect. The function assumes regularly sampled data; however, if the sampling intervals are not constant, a workaround is:
dydx = gradient(y) ./ gradient(x);
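For illustration, a minimal sketch (assuming y = x.^2 sampled at unit intervals from 0 to 100, as in the question) comparing the output lengths of diff and gradient:

```matlab
x = 0:100;            % 101 samples
y = x.^2;
d1 = diff(y);         % forward differences: 100 samples, one shorter than y
d2 = gradient(y, x);  % central differences: 101 samples, same length as y
% gradient(y, x) also accepts a non-uniformly spaced x vector,
% which is equivalent to the gradient(y) ./ gradient(x) workaround above
```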

More Answers (1)

Cris LaPierre 2020-4-11
diff takes the difference between adjacent data points. Because each difference involves two points, the result is one data point shorter than the input.
For a simple array x = [3 4 5], diff(x) = [4-3 5-4] = [1 1].
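The same example in runnable form, showing the length drop directly:

```matlab
x = [3 4 5];
dx = diff(x);      % [4-3, 5-4] = [1 1]
% numel(x) is 3, numel(dx) is 2: each call to diff drops one sample
```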
  2 Comments
farzad 2020-4-11
I see. So how would you handle it? Should the derivative of a signal always be shorter?
farzad 2020-4-11
If the signal is over time, is it even meaningful to have a shorter signal for the velocity or acceleration of that signal?
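One way to address the integration concern from the original question (a sketch, not from the thread itself): if the derivative is computed with gradient, the result keeps the original length, so a cumulative integral such as cumtrapz returns a signal of the same length as well:

```matlab
t = linspace(0, 1, 101);           % time vector, 101 samples
y = sin(2*pi*t);                   % example position signal
v = gradient(y, t);                % velocity estimate, still 101 samples
s = cumtrapz(t, v) + y(1);         % re-integrate; also 101 samples,
                                   % approximately recovers y up to
                                   % numerical error
```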

