Nonorthogonal eigenvectors for general eigenvalue problem with eig() and eigs()

I am working with a finite element model and want to decompose my system using its eigenvectors. The eigenvalue problem is of the form K*V = M*V*D, where V is the eigenvector matrix and D is the eigenvalue matrix. Both M and K are block-diagonal symmetric matrices. For the generalized eigenvalue problem, V'*M*V should be a matrix whose only nonzero terms are along the diagonal, but I am getting nonzero terms on either side of the diagonal. Using the code included below with something like N = 5, I get (after rounding to the third decimal place to remove near-zero terms)
abs(Orthogonal_Check_eig) =
   78.8050         0         0         0         0
         0   43.4540         0         0         0
         0         0         0   34.9310         0
         0         0   34.9310         0         0
         0         0         0         0   29.6430
I am taking the absolute value because this case has complex eigenvectors (though I believe I have seen the same behavior with real eigenvector matrices as well). It does not happen every time I rerun the code, but I am trying to understand why it can occur with both eig() and eigs(), so I can determine whether it comes from the conditioning of the matrices or is inherent to the approximations inside eig() and eigs(). Is one of the two better suited to avoiding this issue?
N = 5;
M = zeros(N,N);
K = zeros(N,N);
for ii = 1:N
    % Diagonal terms
    M(ii,ii) = rand*50;
    K(ii,ii) = rand*500;
    if ii ~= N
        % Off-diagonal terms
        off_diagonal = rand*50;
        M(ii,ii+1) = off_diagonal;
        K(ii,ii+1) = off_diagonal*10;
        M(ii+1,ii) = off_diagonal;
        K(ii+1,ii) = off_diagonal*10;
    end
end
M = round(M);
K = round(K);
[Veig,~]  = eig(K,M);
[Veigs,~] = eigs(K,M,3);
Orthogonal_Check_eig  = Veig'*M*Veig;
Orthogonal_Check_eigs = Veigs'*M*Veigs;

Accepted Answer

Christine Tobler 2024-6-5
For the standard eigenvalue problem A*x = lambda*x, the eigenvalues are guaranteed to be real and the eigenvectors to form an orthogonal basis only if A is symmetric. In the generalized case, this condition becomes that A must be symmetric and B must be symmetric positive definite.
So what you are expecting is true if K is symmetric (Hermitian if K is complex) and M is symmetric positive definite (Hermitian positive definite if M is complex). However, in this example M is not positive definite:
>> eig(M)
ans =
-32.1828
30.7556
37.5342
50.4526
110.4404
When M is not positive definite, there is no guarantee that the eigenvalues are all real, which you can see by computing them:
eig(K, M)
ans =
11.0502 + 0.0000i
11.1566 + 5.5626i
11.1566 - 5.5626i
6.8143 + 0.0000i
6.4945 + 0.0000i
This is also why the proof in your comment above doesn't work out: it assumes that the eigenvectors and eigenvalues of (K, M) are real, so that a plain transpose can be used instead of a conjugate transpose. If you use the conjugate transpose there, the terms no longer cancel.
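As a minimal sketch of the contrast (the diagonally dominant construction below is my own assumption, not the original model): when M is symmetric positive definite, the generalized eigenvalues come out real and the off-diagonal entries of V'*M*V drop to roundoff level.
N = 5;
% Symmetric tridiagonal M with a dominant diagonal, so it is positive definite
% (this construction is for illustration only, not the original model)
M = diag(100 + 50*rand(N,1)) + diag(10*rand(N-1,1),1);
M = M + triu(M,1)';
% Symmetric tridiagonal K (only symmetry is needed for K)
K = diag(500*rand(N,1)) + diag(100*rand(N-1,1),1);
K = K + triu(K,1)';
min(eig(M))                     % strictly positive: M is positive definite
[V,D] = eig(K,M);
diag(D)                         % all eigenvalues are real
G = V'*M*V;
max(abs(G - diag(diag(G))), [], 'all')   % off-diagonal terms at roundoff level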

More Answers (1)

Torsten 2024-6-5 (edited 2024-6-5)
From the documentation:
[V,D] = eig(A,B) returns diagonal matrix D of generalized eigenvalues and full matrix V whose columns are the corresponding right eigenvectors, so that A*V = B*V*D.
I don't see where it is guaranteed that V'*B*V is a diagonal matrix. I also don't see why V should be a unitary matrix (V' = inv(V)); maybe it is if B^(-1)*A is Hermitian.
If the inverse exists,
inv(B*V)*A*V
should be diagonal and equal D.
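As a quick numerical check of that identity (variable names are mine; K and M are the random matrices from the question's script):
[Veig, Deig] = eig(K, M);
% Same as inv(M*Veig)*K*Veig, but solved with backslash instead of forming the inverse
Check = (M*Veig) \ (K*Veig);
norm(Check - Deig, 'fro')   % at roundoff level when M*Veig is invertible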
2 Comments
William 2024-6-5 (edited 2024-6-5)
I have noticed that sometimes the eig()/eigs() functions will output matrices such that V'*B*V = eye(N), at least when M is a diagonal matrix, but I haven't seen this happen with these block-diagonal cases. The proof I found for V'*B*V being diagonal is that if x_i = V(:,i) and lambda_i = D(i,i), then
eqn1 : A*x_i = lambda_i*B*x_i
eqn2 : A*x_j = lambda_j*B*x_j
Subtracting x_j'*(eqn1) from x_i'*(eqn2),
x_i'*A*x_j - x_j'*A*x_i = lambda_j*x_i'*B*x_j - lambda_i*x_j'*B*x_i
And since A' = A we have x_i'*A*x_j = x_j'*A*x_i (likewise for B), so this equation becomes
(lambda_j - lambda_i)*x_j'*B*x_i = 0
So the off-diagonal terms x_j'*B*x_i have to be zero when lambda_i ~= lambda_j for this condition to hold, and thus V'*B*V is a diagonal matrix. I am not sure if this must be true for all generalized eigenvalue problems, but I know it is true for the model I am using.
Edit: I realize now that this condition is still satisfied if there is a case where lambda_i = lambda_j but x_j'*B*x_i ~= 0. But I am not sure what this means about A and B if they have repeated real eigenvalues.
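A small numerical check of this derivation (reusing K and M from the question's script, and assuming the eigenvalues of (K, M) are distinct): the cancellation above uses plain transposes, so with complex eigenvectors it is V.'*M*V rather than V'*M*V that comes out diagonal, which is the distinction raised in the accepted answer.
[V, D] = eig(K, M);
G_transpose  = V.' * M * V;   % plain transpose, as in the derivation above
G_ctranspose = V'  * M * V;   % conjugate transpose, as in Orthogonal_Check_eig
max(abs(G_transpose  - diag(diag(G_transpose))),  [], 'all')   % ~ roundoff for distinct eigenvalues
max(abs(G_ctranspose - diag(diag(G_ctranspose))), [], 'all')   % can be large when eigenvalues are complex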
Torsten 2024-6-5
I already wrote that V' does not need to be inv(V).
From the equation
A*V = B*V*D
which MATLAB says is true, it follows that
inv(B*V)*A*V = D
if B*V is invertible - nothing less and nothing more.
