Negative D2 score on training data after lassoglm fit
How can the deviance from a null model (i.e. betas all equal zero) be lower than the deviance from the full model? Surely lassoglm should choose betas all zero in this case?
From the code below, my d2Train is -0.0808.
% Fit the elastic-net Poisson model at the previously selected lambda/alpha
[B, FitInfo] = lassoglm(table2array(indat.params.trainDataX), indat.params.trainDataY(:, minInd), ...
    'poisson', 'Lambda', indat.combTable.bestLambdas(minInd), 'Alpha', indat.combTable.bestAlphas(minInd));
% Deviance of the fitted model on the training data
predCountsTrain = calculateRates(table2array(indat.params.trainDataX), B, FitInfo.Intercept) + eps;
predDevianceTrain = calculateDeviance(indat.params.trainDataY(:, minInd), predCountsTrain);
% Deviance of the null model (all betas zero, same intercept)
nullCountsTrain = calculateRates(table2array(indat.params.trainDataX), zeros(size(B)), FitInfo.Intercept) + eps;
nullDevianceTrain = calculateDeviance(indat.params.trainDataY(:, minInd), nullCountsTrain);
% Fraction of null deviance explained: D2 = 1 - devFull / devNull
d2Train = 1 - (predDevianceTrain ./ nullDevianceTrain);
function rates = calculateRates(x, y, int)
% Predicted counts from a log-linear (Poisson) model
rates = exp((x * y) + int);
end
function dev = calculateDeviance(observed, predicted)
% Poisson deviance: 2 * sum(obs .* log(obs ./ pred) - (obs - pred))
scaledLogRatio = log(observed ./ predicted) .* observed;
rawDifference = observed - predicted;
diffOfTerms = scaledLogRatio - rawDifference;
dev = nansum(diffOfTerms) * 2;
end
Answers (1)
Jaimin
2024-11-25, 6:44
Hi @T0m07
A negative D2 value indicates that the full model's deviance is unexpectedly higher than the null model's. Please work through the quick checks and fixes below:
Verify Calculations: Ensure "calculateRates" and "calculateDeviance" are implemented correctly, and add a small constant (eps) to avoid dividing by zero or taking the log of zero.
function rates = calculateRates(x, y, int)
rates = exp((x * y) + int);
end
function dev = calculateDeviance(observed, predicted)
% Avoid division by zero or log of zero by adding a small constant
observed = observed + eps;
predicted = predicted + eps;
% Calculate scaled log ratio and raw difference
scaledLogRatio = log(observed ./ predicted) .* observed;
rawDifference = observed - predicted;
% Deviance calculation
diffOfTerms = scaledLogRatio - rawDifference;
dev = nansum(diffOfTerms) * 2;
end
Model Overfitting: Check if the model is overfitting. Adjust lambda and alpha values in “lassoglm” (https://www.mathworks.com/help/stats/lassoglm.html).
Data Issues: Inspect the data for anomalies or outliers that might affect predictions.
Poisson Assumptions: Ensure the Poisson model assumptions hold (mean ≈ variance). If not, consider alternatives such as the negative binomial model; a quick dispersion check is sketched after this list.
Cross-validation: Use cross-validation to validate model performance and prevent overfitting; see the lassoglm sketch below.
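For the Poisson-assumption check, a rough diagnostic is to compare the mean and variance of the training response; a variance well above the mean points to overdispersion. A minimal sketch, reusing the response vector from your code (the 1.5 threshold is only an illustrative cut-off, not a formal test):
% Quick dispersion check on the training response
yTrain = indat.params.trainDataY(:, minInd);
dispersionRatio = var(yTrain) / mean(yTrain);   % close to 1 for Poisson-like data
fprintf('mean = %.3f, variance = %.3f, ratio = %.3f\n', mean(yTrain), var(yTrain), dispersionRatio);
if dispersionRatio > 1.5   % illustrative threshold
    warning('Response looks overdispersed; a negative binomial model may fit better.');
end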
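For the overfitting and cross-validation points, lassoglm can cross-validate the lambda sequence directly through its 'CV' name-value argument. A minimal sketch, reusing your predictors and response and assuming an arbitrary Alpha of 0.5 (tune it for your data); the FitInfo field names follow the lassoglm documentation:
% Cross-validated elastic-net Poisson fit (10-fold)
Xtrain = table2array(indat.params.trainDataX);
yTrain = indat.params.trainDataY(:, minInd);
[Bcv, FitInfoCV] = lassoglm(Xtrain, yTrain, 'poisson', 'CV', 10, 'Alpha', 0.5);
% Pick the lambda with minimum cross-validated deviance
idxMin = FitInfoCV.IndexMinDeviance;
bestLambda = FitInfoCV.Lambda(idxMin);
coefMin = Bcv(:, idxMin);
interceptMin = FitInfoCV.Intercept(idxMin);
lassoPlot(Bcv, FitInfoCV, 'PlotType', 'CV');   % visualise deviance vs lambda
If the lambda you selected earlier sits far from FitInfoCV.LambdaMinDeviance, that suggests the earlier selection is over- or under-penalising the fit.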
By addressing these areas, you should improve model performance and resolve the deviance issue.
I hope this will be helpful.