Using SVD for Dimensionality Reduction

Hello everyone.
I have a matrix with 300 rows (samples) and 5000 columns (features), and I need to reduce the number of columns for classification.
As far as I know, to use the pca() function the number of samples should be greater than the number of features, so I tried singular value decomposition with the code below.
% Singular value decomposition of X
[U, Sig, V] = svd(X);
sv = diag(Sig);

% Plot the cumulative distribution of the normalized singular values
figure;
sv = sv/sum(sv);
stairs(cumsum(sv));
xlabel('singular value index');
ylabel('cumulative sum');
I have two questions.
1) As I understand from the above figure, I have to take approximately the first 250 singular values to account for 95% of my data. So should I take the first 250 singular values to create a new dataset for classification? And how can I see the variance explained by each principal component, like the explained output of pca(), to decide how many of them I should use? (See the sketch after the two questions.)
2) After deciding on the number of principal components, I need to create a new matrix for classification. Can I do this with the code below (for example, with the first two principal components)?
new_matrix_for_classification = X*V(:,1:2);
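For reference, here is a minimal sketch of how both points can be handled directly from the SVD output. It assumes X is mean-centered first (which pca() also does internally); the 95% cutoff and the variable names are only illustrative:
% Center the data so that the SVD corresponds to PCA
Xc = X - mean(X, 1);
[U, Sig, V] = svd(Xc, 'econ');              % 'econ' avoids computing the full 5000x5000 V
sv = diag(Sig);

% Percentage of variance explained by each component
% (analogue of pca()'s "explained" output: squared singular values, normalized)
explained_svd = 100 * sv.^2 / sum(sv.^2);

% Smallest number of components that reaches, e.g., 95% of the variance
k = find(cumsum(explained_svd) >= 95, 1);

% Project the centered data onto the first k right singular vectors
new_matrix_for_classification = Xc * V(:, 1:k);   % equivalently U(:,1:k)*Sig(1:k,1:k)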
Thanks in advance.

Accepted Answer

Mahesh Taparia 2021-4-2
Hi,
The first part is already answered here.
For the 2nd part, you can use the pca function to directly obtain the representation of the input in terms of the principal components. For example, in your case, if you want the first 2 components:
[coeff, score, latent] = pca(X);
new_matrix_for_classification = score(:,1:2); % score is the representation of X in the principal component space
Hope it will help!
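As a follow-up usage sketch, pca() can also return an explained output (the percentage of variance per component), which can be used to choose the number of components instead of fixing it at 2; the 95% cutoff below is only an example:
[coeff, score, ~, ~, explained] = pca(X);

% Keep the smallest number of components explaining at least 95% of the variance
k = find(cumsum(explained) >= 95, 1);
new_matrix_for_classification = score(:, 1:k);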

More Answers (0)
