Inconsistencies in functions called by incremental regression kernel
Regarding classreg.learning.rkeutils.featureMapper, which is called by the incremental kernel regression learner: there are inconsistencies I am unable to resolve.
1. In the help text of the file, you state:
NU = [nu_1 nu_2 ... nu_(n/d)]
nu_i = diag(S(:,i))*H*diag(G(:,i))*PM*H*diag(B(:,i)) ./ (sigma*sqrt(d))
This implies that the maximum index i is n/d, which cannot be right: S, G, and B each have dimensions d-by-(n/(2*d)), so i cannot exceed n/(2*d). A small numeric illustration follows.
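Here is a concrete illustration of the dimension count (the numbers are hypothetical, chosen only to make the mismatch visible):

d = 4;  n = 32;                  % predictor count and expanded dimension
sizeSGB = [d, n/(2*d)];          % S, G, B are each 4-by-4 in this example
iMaxFromHelp = n/d;              % the help text implies i runs up to 8
iMaxFromSize = n/(2*d);          % but S(:,i), G(:,i), B(:,i) only exist for i <= 4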
2. You state that Z = [cos(X*NU) sin(X*NU)].
This means that if Z has dimensions 1-by-ncols, for example, then
Z(1,1:ncols/2).^2 + Z(1,ncols/2+1:ncols).^2 = ones(1,ncols/2)
(the sum of cosine squared and sine squared).
If there were some constant scaling, the ones would simply be replaced by the squared scaling factor. However, this is not the case when you use the map function to get Z. mapfwht gives a different result that does satisfy the sin^2 + cos^2 = 1 rule, but it is not the one used by default. A sketch of the check I am running is below.
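This is a minimal sketch of the identity I expect to hold, using a randomly generated stand-in for NU rather than the actual featureMapper output (the mapper is undocumented, so the construction of NU here is only illustrative):

rng(0);
d = 4;                           % number of predictors
ncols = 8;                       % expanded dimension (even)
X = randn(5, d);                 % sample observations
NU = randn(d, ncols/2);          % stand-in frequency matrix, not the real NU
Z = [cos(X*NU) sin(X*NU)];       % the mapping claimed by the help text

check = Z(:, 1:ncols/2).^2 + Z(:, ncols/2+1:ncols).^2;
disp(max(abs(check(:) - 1)));    % ~0, i.e. every entry equals 1 as expected

Running the same check on the Z produced by the default map function does not give a constant matrix, which is the inconsistency described above.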
Finally, a special request: please, in future MATLAB versions, return the mapping, beta, and bias of the SVM regression plane as explicit public properties of the model returned by updateMetricsAndFit. This is very important for us as researchers.