Regarding Lagrange Multiplier in SVM

Hello all, in my work I am using an SVM for classification. I have trained the SVM classifier and obtained the Lagrange multipliers (α) as a column vector.
My query is: how can these Lagrange multipliers (α) be used to predict the labels of the data points?
Any help in this regard will be highly appreciated.
  4 Comments
charu shree 2023-3-24
Thank you sir for your response.
I am using the paper "Supervised Learning-Based Semi-Blind Detection for Generalized Space Shift Keying MIMO Systems" for my work.
I am using the SVM method for classification. The SVM classifier is trained using the training data vectors together with their corresponding label vectors.
So, as per equation (5) of this paper, I have obtained 16 different α vectors.
My query in mathematical form is as follows: how can I predict the labels of the test data, given as a matrix of dimension 404 x 4?
charu shree 2023-3-24
Any hint will help me move ahead in tackling this problem.
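For reference, the standard dual-form SVM prediction rule that uses α (stated here as the textbook formula, not copied from the paper, whose equation (5) may use slightly different notation) is: for a test point x, compute f(x) = Σ_i α_i y_i K(x_i, x) + b over the training points x_i with labels y_i ∈ {+1, −1}; a binary classifier predicts sign(f(x)), and in a one-vs-all setup the class whose classifier gives the largest f(x) is chosen. The answer below implements exactly this.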


Answers (1)

Parag 2025-4-10
To predict the labels for your test data (e.g., a 404 × 4 matrix) using an SVM trained in a one-vs-all setting with 16 classifiers, you can follow this approach:
  • Each SVM classifier corresponds to one class and provides a decision score using the learned parameters (α, support vectors, labels, and bias).
  • For each test vector, compute the decision score from all 16 classifiers using the kernel function (e.g., linear).
  • Stack all scores in a matrix of size 404 × 16.
  • Assign each test vector the class label of the classifier with the highest score (i.e., maximum margin decision function output).
This approach implements the standard prediction step in multi-class SVM, aligned with the formulation in the paper.
% Inputs (assumed precomputed and available):
% X_test         : 404 x 4 test data matrix
% supportVectors : N x 4 training vectors
% alpha          : 1 x 16 cell array, each cell holding the N x 1 alpha vector of one class
% labels         : N x 16 label matrix in one-vs-all format (+1/-1 per class)
% b              : 16 x 1 bias vector, one entry per classifier
numTest    = size(X_test, 1);
numClasses = 16;
F = zeros(numTest, numClasses);   % decision scores, one column per classifier
% Linear kernel: dot products between test points and training vectors
K = X_test * supportVectors';     % 404 x N (independent of the class index)
for k = 1:numClasses
    % Decision function of classifier k: f_k(x) = sum_i alpha_i*y_i*<x_i, x> + b_k
    F(:, k) = K * (alpha{k} .* labels(:, k)) + b(k);
end
% Predict the class whose classifier gives the highest score
[~, predicted_labels] = max(F, [], 2);   % 404 x 1 vector of predicted class indices
Hope it helps!
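If the per-class parameters (alpha, supportVectors, labels, b) are not already available as plain arrays, one possible way to produce equivalent quantities in MATLAB is to train 16 binary classifiers with fitcsvm from the Statistics and Machine Learning Toolbox in a one-vs-all fashion. The following is only a sketch under the assumption that the training data sits in X_train with integer class indices 1..16 in Y_train; these variable names are placeholders, not part of the original answer.
% One-vs-all training with fitcsvm (sketch; X_train, Y_train are assumed to exist)
numClasses = 16;
F = zeros(size(X_test, 1), numClasses);
for k = 1:numClasses
    yk  = 2*(Y_train == k) - 1;                          % +1 for class k, -1 otherwise
    mdl = fitcsvm(X_train, yk, 'KernelFunction', 'linear');
    % Dual-form score of classifier k for every test point:
    % f_k(x) = sum_j alpha_j * y_j * <x_j, x> + b_k
    F(:, k) = (X_test * mdl.SupportVectors') * ...
              (mdl.Alpha .* mdl.SupportVectorLabels) + mdl.Bias;
    % Equivalently: [~, s] = predict(mdl, X_test); F(:, k) = s(:, 2);
end
[~, predicted_labels] = max(F, [], 2);                   % 404 x 1 predicted class indices
Note that predicted_labels contains class indices 1..16; map them back to your own label values (e.g., the transmit-vector labels used in the paper) if they are encoded differently.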
