Relatively easy optimization problem in Excel that is hard to implement in MATLAB
rng('default')
% creating fake data
data = randi([-1000 +1000],30,500);
yt = randi([-1000 1000],30,1);
% creating fake missing values
row = randi([1 15],1,500);
col = rand(1,500) < .5;
% inserting the fake missing values at the top of randomly chosen columns
for i = 1:500
    if col(i) == 1
        data(1:row(i),i) = nan;
    end
end
%% here starts my problem
wgts = ones(1,500); % optimal weights need to be binary (only zero or one)
% this would be easy with matrix formulas, but I have missing values at the
% beginning of the series
xt = zeros(30,1);
for j = 1:30
    xt(j) = sum(data(j,:) .* wgts, 2, 'omitnan');
end
X = [xt(3:end) xt(2:end-1) xt(1:end-2)];
y = yt(3:end);
% from here I basically need to:
% maximize the Adjusted R squared of the regression fitlm(X,y)
% by changing wgts
% subject to wgts = 1 or wgts = 0
% and optionally to impose sum(wgts,'all') = some number;
% basically I need to select the data cols with the highest explanatory
% power, omitting missing data
This is easy to implement with the Excel Solver, but it can only handle 200 decision variables and it takes a lot of time. Thank you in advance.
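For reference, the objective described in the comments above can be collected into a single function (a minimal sketch, assuming the data and yt created in the snippet; the name objfun is just a placeholder):
function negAdjR2 = objfun(wgts, data, yt)
    if ~any(wgts)
        negAdjR2 = Inf;   % guard: no columns selected, nothing to regress
        return
    end
    % weighted sum across columns, ignoring the leading NaNs
    % (implicit expansion of the 1x500 weight row, R2016b or newer)
    xt = sum(data .* wgts, 2, 'omitnan');
    % regressors: current value of xt and its first two lags
    X = [xt(3:end) xt(2:end-1) xt(1:end-2)];
    y = yt(3:end);
    mdl = fitlm(X, y);
    negAdjR2 = -mdl.Rsquared.Adjusted;   % negate so that a minimizer maximizes it
end
With wgts = ones(1,500) this reproduces the same regression that the loop above builds row by row.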
2 Comments
Sam Chak
2022-7-6
Unsure what went wrong. Can you show your results in Excel? It is probably better to compare the two approaches on the same data.
Answers (1)
Torsten
2022-7-6
Edited: Torsten
2022-7-6
If you accept a MATLAB solution for this problem:
min   norm(X*p - y)
s.t.  xt = data*w
      0 <= w <= 1
      w integer
where p is (3x1) and "norm" is either the 1-norm or the max-norm, you can use intlinprog.
If you insist on maximizing the adjusted R-squared, I think you will have to use ga.
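A minimal sketch of the ga route (assuming the objfun helper sketched in the question and the data and yt variables in the workspace; the option values are placeholders to tune):
nvars  = 500;                    % one binary weight per column of data
lb     = zeros(1, nvars);
ub     = ones(1, nvars);
intcon = 1:nvars;                % integer variables with [0,1] bounds => binary weights
% optional cap on the number of selected columns, e.g. sum(wgts) <= k:
% k = 100;  A = ones(1, nvars);  b = k;
A = [];  b = [];
opts = optimoptions('ga', 'Display', 'iter', 'MaxGenerations', 200, 'PopulationSize', 200);
fun  = @(w) objfun(w, data, yt); % negative adjusted R-squared
[wBest, fBest] = ga(fun, nvars, A, b, [], [], lb, ub, [], intcon, opts);
fprintf('Best adjusted R-squared: %.4f using %d columns\n', -fBest, sum(wBest));
Note that, at least in older releases, ga with integer variables does not accept linear equality constraints, so an exact sum(wgts) == k would have to be imposed with a pair of inequalities or a penalty in the objective.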