Given a matrix A, comparing A^n by normal multiplication versus diagonalization. I expected diagonalization to be faster, but it isn't in my case

I have been learning about diagonalization and the SVD. I tried to show by code that computing A^n directly as A*A*...*A (n factors) is slower than using diagonalization, where A^n = P*D^n*P^-1.
n = 2;
% given a matrix
A = rand(1000,1000);
% CASE 1
tic
product_1 = A^n;
T1=toc;
fprintf('%f is the time matrix multiplication \n',T1 );
% CASE 2
tic
[P, D] = eig(A);
product_2 = P * (D^n)/(P);
T2=toc;
fprintf('%f is the time for diagonalization \n',T2 );
0.014834 is the time matrix multiplication
1.222272 is the time for diagonalization

Accepted Answer

Christine Tobler 2020-3-6
A^2 is just one matrix multiplication, A*A, which is much faster to do directly than the call to EIG.
For larger n, A^n isn't just doing the multiplication A*A*...*A. You can see the algorithm used by reading the mpower function (type edit mpower).
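A common scheme of this kind is binary (repeated-squaring) powering, which needs roughly log2(n) multiplications instead of n-1. The sketch below only illustrates that general idea; it is an assumption about the flavour of algorithm, not a copy of what mpower actually does:
function B = pow_by_squaring(A, n)
% Illustrative binary (repeated-squaring) powering of a square matrix A
% for a positive integer n: roughly log2(n) squarings instead of n-1
% multiplications. A sketch of the idea only, not MATLAB's mpower.
B = eye(size(A));
while n > 0
    if mod(n, 2) == 1
        B = B * A;      % include the current power when this bit of n is set
    end
    A = A * A;          % square for the next bit
    n = floor(n / 2);
end
end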
To show a case where EIG is faster, you could increase n significantly, explicitly compute A*A*...*A instead of calling A^n, and have EIG return a vector of eigenvalues instead of a dense diagonal matrix D. I think that should work to show an example where EIG is faster.
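A rough sketch of such an experiment, assuming a larger power n and the 'vector' option of eig (exact timings will depend on the machine; the matrix is scaled so that its powers stay finite):
n = 500;                              % a much larger power
A = rand(1000, 1000);
A = A / norm(A);                      % scale so that A^n does not overflow
% CASE 1: explicit repeated multiplication A*A*...*A (n-1 multiplies)
tic
product_1 = A;
for k = 2:n
    product_1 = product_1 * A;
end
T1 = toc;
fprintf('%f s for explicit repeated multiplication\n', T1);
% CASE 2: one eigendecomposition, eigenvalues kept as a vector
tic
[P, d] = eig(A, 'vector');            % d is a column vector of eigenvalues
product_2 = (P .* (d.^n).') / P;      % P*diag(d.^n)/P via column scaling (implicit expansion, R2016b+)
T2 = toc;
fprintf('%f s for the eigenvalue approach\n', T2);
% product_2 typically carries tiny imaginary round-off parts, since a real
% nonsymmetric A has complex eigenvalues.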
Note that there are numerical issues with using EIG, in cases where the matrix of eigenvectors P is close to singular. For example, try the EIG method on the matrix A = [3 1; 0 3]:
>> A*A
ans =
9 6
0 9
>> [P, D]=eig(A); product_2 = P * (D^2)/(P)
product_2 =
9 0
0 9
>> cond(P)
ans =
3.0024e+15

More Answers (1)

James Tursa 2020-3-6 (edited 2020-3-6)
Well, the timings didn't meet your expectations. Why would you expect A*A to be slower than a full eig calculation followed by two matrix multiplies and a matrix divide?
Also, did you mean for D^n to be D.^n for the comparison?
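For a diagonal D and a positive integer n the two give the same result (the off-diagonal zeros stay zero under an elementwise power), but the elementwise version avoids a general matrix power. A small sketch of the difference, and of working with the eigenvalues as a plain vector:
% For diagonal D and positive integer n, D^n and D.^n give the same matrix,
% but the elementwise power is much cheaper than a general matrix power.
D = diag([2 3 5]);
isequal(D^4, D.^4)        % true: off-diagonal zeros stay zero for n > 0
% Cheaper still: keep the eigenvalues as a vector and power those.
d = diag(D);              % extract the diagonal as a vector
Dn = diag(d.^4);          % equals D^4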
