LDA classifier problem in age classification

I'm doing a project on age classification using Gabor features. First, I calculated Gabor features at 4 scales and 5 orientations for each image of a child, adolescent, adult, and senior adult, which gives a matrix of size 64x80. The paper then says to reduce its dimension using PCA. I don't know PCA, so instead I deleted every 2nd and 3rd row and column of the matrix, reshaped the result column-wise, and got a 594x1 vector. I did this for 114 images (computing the Gabor features 10 times per image, on different orientations of the image), so I get a 594x1140 input matrix. I made a 4x1140 target matrix whose columns are [1;0;0;0] (child), [0;1;0;0] (adolescent), and so on.

So now I have a training matrix of 594x1140 and a target matrix of 4x1140. The next step in the paper is LDA classification. A sample would be a 594x1 vector of Gabor features of a test image, and the classifier should assign it to one of the 4 groups: child, adolescent, adult, or senior adult. Please help me with the LDA classifier! I'm attaching the code where I store the Gabor features in a matrix:
function IMVECTOR = im2vec(W16x16)
load gabor;
W16x16 = adapthisteq(W16x16,'NumTiles',[8 8]);
Features80x128 = cell(4,5);
for s = 1:4
    for j = 1:5
        Features80x128{s,j} = mminmax(abs(ifft2(G{s,j}.*fft2(double(W16x16),32,32),16,16)));
    end
end
Features27x43 = cell2mat(Features80x128);
Features27x43(3:3:end,:) = [];
Features27x43(2:2:end,:) = [];
Features27x43(:,3:3:end) = [];
Features27x43(:,2:2:end) = [];
IMVECTOR = reshape(Features27x43,[594 1]);

Accepted Answer

Learn PCA. QUICKLY. Use the help and doc commands:
>> lookfor pca
processpca - Processes rows of matrix with principal component analysis.
prepca - Principal component analysis.
trapca - Principal component transformation.
pcacov - Principal Components Analysis (PCA) using a covariance matrix.
pcares - Residuals from a Principal Components Analysis (PCA).
princomp - Principal Components Analysis (PCA) from raw data.
Concentrate on the above functions. I don't think you need the ones below.
rotatefactors - Rotation of FA or PCA loadings.
wmspca - Multiscale Principal Component Analysis.
wmspcatool - Multisignal Principal Component Analysis GUI.
wmspcatoolmoab - MATLAB file for wmspcatoolmoab.fig
wmspcatoolmopc - MATLAB file for wmspcatoolmopc.fig
wpca - Principal Component Analysis.
dguiwmspca - Demonstrates Multivariate Wavelet PCA tool in the Wavelet Toolbox.
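For the questioner's data layout (a 594x1140 matrix with one column per image), a minimal PCA reduction with pcacov might look like the sketch below. The variable names and the 95% variance threshold are assumptions, not from the thread; the random matrix is just a stand-in for the real feature data.

```matlab
% Stand-in for the real 594x1140 Gabor-feature matrix (one column per image).
X = randn(594, 1140);

% pcacov and friends expect observations in rows, so work with the transpose.
Xt = X';                                          % 1140 observations x 594 features
mu = mean(Xt, 1);                                 % 1x594 feature means
Xc = Xt - repmat(mu, size(Xt, 1), 1);             % center each feature
[coeff, latent] = pcacov(cov(Xc));                % principal directions and variances
k = find(cumsum(latent)/sum(latent) >= 0.95, 1);  % keep 95% of the variance (assumed)
Xred = Xc * coeff(:, 1:k);                        % 1140 x k reduced training data

% A 594x1 test vector must be reduced with the SAME mean and directions:
sample = randn(594, 1);
sampleRed = (sample' - mu) * coeff(:, 1:k);       % 1 x k reduced test vector
```

The key point is that the test sample is projected with the mean and eigenvectors learned from the training set, not with its own statistics.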

3 Comments

Thanks a lot, Sir! But my main problem is with LDA. The paper says that this dimensionally reduced data (say a 120x1140 input matrix after reduction) and the 4x1140 target matrix must be supplied to LDA (Linear Discriminant Analysis) to classify into the 4 age groups. I'm quoting what is written in the paper:
Linear discriminant analysis (LDA), as a classical dimension reduction method, aims to find optimal projection directions that maximize the ratio of the between-class scatter to the within-class scatter. After finding the projection directions, the data can be mapped to a low-dimensional subspace, and the nearest-class-center criterion can be used for classification. In the LDA method, each training sample is assigned exactly one class label; this is easy in many classification problems because the classes have been clearly defined.
Please help me out with LDA!
If you have a relatively recent Statistics Toolbox license (R2011b or later), you can try using ClassificationDiscriminant.fit. See the documentation for the discriminant analysis classifier; it does both linear and quadratic discriminant analysis.
For earlier toolbox versions, use the classify function.
Alan Weiss
MATLAB mathematical toolbox documentation
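For the questioner's setup, a hedged sketch of the classify route is below. The one-hot 4x1140 target matrix is converted to a grouping vector first; the matrix sizes follow the question, but the variable names and the random stand-in data are assumptions.

```matlab
% Stand-ins for the questioner's data: Xred = reduced training data with one
% row per image, T = 4x1140 one-of-four target matrix (names assumed).
Xred = randn(1140, 120);                          % e.g. 120 PCA features per image
T = full(sparse(randi(4, 1, 1140), 1:1140, 1));   % random 4x1140 one-hot targets

[~, labels] = max(T, [], 1);                      % one-hot columns -> labels 1..4
labels = labels';                                 % 1140x1 grouping variable

sample = randn(1, 120);                           % reduced test vector (1 x 120)
% Linear discriminant classification (Statistics Toolbox):
predicted = classify(sample, Xred, labels, 'linear');
```

Here predicted is one of 1..4, which maps back to child, adolescent, adult, or senior adult according to the order used in the target matrix.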
Alan gave you good advice. Keep in mind though that ClassificationDiscriminant finds K*(K-1)/2 hyperplanes for K classes (each hyperplane separates a pair of classes). The excerpt from the paper implies that you may be looking for a different version of LDA, in which you find K-1 hyperplanes with most informative projections. This can be easily done using ClassificationDiscriminant as well:
load fisheriris
L = ClassificationDiscriminant.fit(meas,species);
[LTrans,Lambda] = eig(L.BetweenSigma,L.Sigma,'chol');
[Lambda,sorted] = sort(diag(Lambda),'descend') % sort by eigenvalues
LTrans = LTrans(:,sorted);
LTrans(:,[3 4]) = [] % get rid of zero eigenvalues
Xtransformed = L.XCentered*LTrans;
Xtransformed represents data mapped onto the low-dimensional space.
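The paper's nearest-class-center step on top of this projection could be sketched as follows. This is a hypothetical continuation, not from the answer: it projects the raw meas onto LTrans (rather than using XCentered, which cannot be formed for a new sample whose class is unknown) and then picks the closest class mean in the projected space.

```matlab
load fisheriris
L = ClassificationDiscriminant.fit(meas, species);
[LTrans, Lambda] = eig(L.BetweenSigma, L.Sigma, 'chol');
[~, sorted] = sort(diag(Lambda), 'descend');
LTrans = LTrans(:, sorted(1:2));            % keep the 2 informative directions

Xproj = meas * LTrans;                      % raw data in the LDA subspace
classes = unique(species);
centers = zeros(numel(classes), 2);
for c = 1:numel(classes)
    centers(c,:) = mean(Xproj(strcmp(species, classes{c}), :), 1);
end

xnew = meas(1,:) * LTrans;                  % project one sample (here a training row)
d = sum((centers - repmat(xnew, numel(classes), 1)).^2, 2);
[~, idx] = min(d);                          % nearest class center
predictedClass = classes{idx};
```

The same recipe applies to the 4-class age problem: compute one center per age group in the projected space and assign a test vector to the group whose center is closest.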
