Mean of selected range of a matrix based on a range of values from another matrix
Hello everyone,
I have a mat file (attached) containing 4 variables: month, sa, ta, and sig. For each month, I want to compute the mean and standard deviation of sa and ta over a specific range of sig (say, sig between 27.4 and 27.5).
So, the intended output should be like this:
Thank you!
2 comments
Shivam Gothi
2024-10-10
What I understand is: you want to find the mean and the standard deviation of only those values of "ta" and "sa" for which "sig" lies in the range 27.4 - 27.5, and the "sig" range can differ between months.
Is my understanding of the question correct?
Accepted Answer
Voss
2024-10-10
load('data_my.mat')            % loads month, sa, ta, sig
T = table(month,sa,ta,sig);    % gather the vectors into one table
% keep only rows where sig falls in the range 27.4 to 27.5 (inclusive)
idx = sig >= 27.4 & sig <= 27.5;
% per-month mean and std of sa and ta within that sig range
G = groupsummary(T(idx,:),'month',{'mean','std'},{'sa','ta'})
% for reference, the same statistics over all sig values
G = groupsummary(T,'month',{'mean','std'},{'sa','ta'})
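For readers on releases without groupsummary (it was introduced in R2018a), the same statistics can be computed with logical indexing and a loop. This is only a sketch; it assumes month, sa, ta, and sig are equal-length column vectors as loaded from data_my.mat:

```matlab
load('data_my.mat')                   % assumed to provide month, sa, ta, sig
idx = sig >= 27.4 & sig <= 27.5;      % rows with sig in the target range
months = unique(month(idx));          % months that have data in that range
for k = 1:numel(months)
    m = idx & month == months(k);     % rows for this month within the sig range
    fprintf('month %d: mean(sa)=%.4f std(sa)=%.4f mean(ta)=%.4f std(ta)=%.4f\n', ...
        months(k), mean(sa(m)), std(sa(m)), mean(ta(m)), std(ta(m)));
end
```

groupsummary does the same grouping in one call and returns a table, which is more convenient for further processing.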
4 comments
More Answers (0)