How can I apply Explainable AI to user-defined classification models (without using built-in classifiers)?
I am working on a classification model where I have used LIBSVM for classification. I want to investigate the key features responsible for the classification using Explainable AI. Kindly suggest a solution.
0 Comments
Answers (2)
Drew
2023-8-23
Edited: Walter Roberson
2023-8-30
You can use the MATLAB explainable AI functions shapley, lime, and partialDependence while specifying the model as a function handle that wraps the prediction function of your LIBSVM model. In more detail:
- The MATLAB shapley function can take a function handle for model prediction. See https://www.mathworks.com/help/stats/shapley.html and the example section "Specify Blackbox Model Using Function Handle", openExample('stats/SpecifyBlackboxModelUsingFunctionHandleExample'). So, shapley can be used with a prediction function handle for your LIBSVM model (see the sketch after this list). However, the shapley calculations are much faster when they can take advantage of the linear structure of the SVM model ( https://www.mathworks.com/help/stats/shapley-values-for-machine-learning-model.html ), as they can with a built-in SVM classifier trained with MATLAB's fitcsvm.
- The MATLAB lime (Local Interpretable Model-agnostic Explanations) function can take a function handle for model prediction, so lime can also be used with a prediction function handle for your LIBSVM model. See https://www.mathworks.com/help/stats/lime.html and openExample('stats/SpecifyBlackboxModelAsFunctionHandleExample').
- The MATLAB partialDependence function can take a function handle for model prediction. See https://www.mathworks.com/help/stats/regressiontree.partialdependence.html and openExample('stats/SpecifyModelUsingFunctionHandlePartialDependenceExample').
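For LIBSVM specifically, the prediction function handle would wrap LIBSVM's svmpredict so that it returns a column of classification scores (probability estimates or decision values) rather than predicted labels, since shapley needs scores. Here is a minimal sketch, assuming the LIBSVM MATLAB interface (svmtrain/svmpredict) is on the path, that X and Y hold the training predictors and labels, and that probability estimates are enabled with '-b 1'; the wrapper name libsvmScore is a placeholder, not a shipping function:
% Train a LIBSVM model with probability estimates enabled ('-b 1').
svmModel = svmtrain(Y, X, '-s 0 -t 0 -b 1'); % C-SVC with a linear kernel
% Function handle that returns the score of the first class in
% svmModel.Label for each row of x (scores, not labels, are required).
f = @(x) libsvmScore(svmModel, x);
explainer = shapley(f, X); % function-handle blackbox
explainer = fit(explainer, X(1,:)); % Shapley values for one query point
plot(explainer)
function score = libsvmScore(model, x)
% Dummy labels are required by svmpredict's calling syntax but do not
% affect the returned probability estimates.
dummy = zeros(size(x,1), 1);
[~, ~, prob] = svmpredict(dummy, x, model, '-b 1');
score = prob(:, 1); % probability of the first class in model.Label
end
The same handle could also be passed to partialDependence via its function-handle syntax partialDependence(f,Vars,Data), and to lime, subject to the requirements on the function handle's output described on the documentation pages linked above.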
If this answer helps you, please consider accepting it.
5 Comments
Walter Roberson
2023-8-30
Note that libsvm is a third-party product not supported by MathWorks. There was a period during which MathWorks used a modified version of libsvm, but that was completely replaced about five years ago.
Drew
2023-8-30
Edited: Drew
2023-8-31
You will need classifier scores from the predict function of libsvm in order to do shapley or lime analysis. Explainability analysis cannot be done with only the predicted labels.
Here is an example that uses fitcsvm but accesses the model through a function handle. Note that the classifier scores from prediction (not the labels) are used, and that the definition of the function used in the function handle appears at the very bottom of the code.
% Load Fisher's iris data set. Remove all observed setosa irises.
% Leave versicolor and virginica, which are not linearly separable.
load fisheriris.mat
inds = ~strcmp(species,'setosa');
X = meas(inds,:);
Ylabels = species(inds);
% For classification problems, for each query point, there are
% Shapley values for each class and for each predictor.
% When using a function handle, you need to get the Shapley values
% for one class at a time. Start with the first class, which is versicolor.
rng(1);
mdl = fitcsvm(X,Ylabels);
f = @(x) getFirstClassScore(mdl,x); % See definition at bottom of this file
explainer = shapley(f,X);
explainer = fit(explainer,X(1,:));
plot(explainer)
% Look at the explainer and ShapleyValues for one class, when using function handle
explainer
% The first class is versicolor, so these are the versicolor shapley values
explainer.ShapleyValues
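The same pattern gives the Shapley values for the second class (virginica) by using a handle that returns the second score column; for a binary SVM the two score columns are negatives of each other, so these come out as the negated versicolor values. A short sketch, using a hypothetical helper getSecondClassScore analogous to the one defined at the bottom of this file:
f2 = @(x) getSecondClassScore(mdl,x);
explainer2 = shapley(f2,X);
explainer2 = fit(explainer2,X(1,:));
explainer2.ShapleyValues
function score = getSecondClassScore(mdl, X)
[~, score] = predict(mdl, X);
score = score(:, 2); % Extract scores of the second class
end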
Now, here is an example of how this looks when passing the fitcsvm ClassificationSVM object directly to shapley:
% It is much more convenient to load the Fisher iris data as a table with
% the predictor names, and to use the built-in shapley support for the
% ClassificationSVM object to get Shapley values for both classes at once.
t = readtable("fisheriris.csv");
inds = ~strcmp(t{:,5},'setosa');
t_selected = t(inds,:);
rng(1);
mdl = fitcsvm(t_selected,"Species");
explainer = shapley(mdl,t_selected);
explainer = fit(explainer,t_selected(1,:));
plot(explainer)
% Or you can plot shapley values for multiple classes at once
plot(explainer,ClassNames=mdl.ClassNames);
% Look at the explainer and ShapleyValues for both classes when using
% ClassificationSVM
% explainer also reports that it is using the 'interventional-linear'
% method, which is faster than 'interventional-kernel' (especially when
% the number of predictors is large).
explainer
explainer.ShapleyValues
% Create function to return the score for the first class.
% This is used in the function handle example above.
function score = getFirstClassScore(mdl, X)
[~, score] = predict(mdl, X);
score = score(:, 1); % Extract scores of the first class
end
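The lime and partialDependence functions mentioned in the answer can be applied to this same trained model. Here is a brief sketch using the mdl and t_selected from the second example; the predictor "PetalLength", the class "versicolor", and the choice of two important predictors are just illustrative values, and the exact options are described on the documentation pages linked in the answer:
% lime: fit a simple interpretable model around the first query point
results = lime(mdl);
queryPoint = t_selected(1, 1:4); % predictor columns only
results = fit(results, queryPoint, 2); % explain using 2 important predictors
figure; plot(results)
% Partial dependence of the versicolor score on one predictor
figure; plotPartialDependence(mdl, "PetalLength", "versicolor")
pd = partialDependence(mdl, "PetalLength", "versicolor");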
BHARTI RANA
2023-12-29
Can you help me resolve the following error:
"Error using shapley (line 226)
The value of 'X' is invalid. Expected input to be two-dimensional."
I am trying to run shapley as:
shapley(f,data3D);
where f is a function handle, and
data3D is a 3D matrix whose 3rd dimension indicates the features (say n x n x d, where n is the number of samples and d is the number of features).
My intent is to determine the importance of the features.
Kindly suggest.
4 Comments
Walter Roberson
2024-1-5
When you specify a blackbox model using a function handle, the data matrix you supply must have rows corresponding to observations and columns corresponding to variables. 3D data is not supported.
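If each observation can be flattened into a single row of features, one way to satisfy this requirement might be to reshape the 3D array into a 2D observations-by-variables matrix before calling shapley. A minimal sketch, assuming data3D is m-by-d-by-n with the third dimension indexing the observations, and that the function handle f accepts the same flattened n-by-(m*d) layout; whether the flattened features are meaningful depends on how the composite-kernel model consumes them:
% Flatten each observation's m-by-d feature slice into one row.
[m, d, n] = size(data3D);
X2D = reshape(permute(data3D, [3 1 2]), n, m*d); % n observations, m*d variables
explainer = shapley(f, X2D); % f must also accept n-by-(m*d) input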
BHARTI RANA
2024-1-5
No, I am modelling using a composite kernel, where I have 3D data (n x n x d) and the 3rd dimension corresponds to the observations.