What is the best way to constrain a matrix to be positive definite?
I understand a couple of ways to test whether a matrix, call it p, is positive definite: check its eigenvalues, or use [~,tmp] = chol(p) and check that tmp is zero.
In my situation, p is part of the solution to an optimization problem in which one of the nonlinear inequality constraints is that p be positive definite. Ideally, I would like to specify the constraint gradient as well, since the other parts of the constraints for this problem benefit significantly from specifying their gradients. Does anyone have an idea how to do this?
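For reference, a quick sketch of the two tests mentioned above (the matrix and variable names are only illustrative):
p = gallery('lehmer', 4);             % example symmetric positive definite matrix
isPDbyEig  = all(eig((p + p')/2) > 0);  % eigenvalue test
[~, tmp]   = chol(p);                   % tmp == 0 means chol succeeded
isPDbyChol = (tmp == 0);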
3 Comments
John D'Errico
2025-3-10
Yes. It is true that a good way to constrain a matrix to be numerically SPD is to work with a triangular factor instead. However, that does not ensure the resulting product is always numerically positive definite. For example...
n = 50;
T = tril(rand(n),-1) + diag(repmat(1e-8,n,1));
Clearly, T is a valid lower triangular matrix with a strictly positive diagonal. Surely that is sufficient to make T*T' SPD, right?
A = T*T';
all(A == A','all')
It is symmetric. But numerically it is, at best, only positive semidefinite. And some of the time, I'd bet that if you computed the eigenvalues of A, you would see some tiny negative ones appear.
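For instance, checking the smallest eigenvalue directly makes the point:
% Smallest eigenvalue of the symmetrized product; in floating point this
% can come out as a tiny negative number, even though A = T*T' is SPD in
% exact arithmetic.
min(eig((A + A')/2))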
chol(A)
The chol test failed here, and in MATLAB that is what most tools use to decide whether a matrix is positive definite.
So, while this is a good way to do the job, it still has its limits.
Accepted Answer
John D'Errico
2017-1-6
Edited: John D'Errico
2017-1-6
There is NO perfect way to do this numerically. However...
Brendan is correct. You simply cannot constrain a matrix to be explicitly positive definite, because those explicit constraints will not be differentiable. And worse, you also have to enforce symmetry, so you really have lots of unknowns, with lots of equality constraints.
Far better is to work with a matrix factorization - Cholesky is a logical choice. This reduces the number of unknowns. Note that you do not even need to force the diagonal elements to be positive. In fact, any choice of upper triangular matrix L will suffice, because L'*L will always be positive semidefinite, even with negative elements on the diagonal. Negative elements on the diagonal do not happen in a true Cholesky factorization, but that does not make the product any less or more definite.
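A minimal sketch of that parameterization, assuming the optimizer works directly on the n*(n+1)/2 free entries of L (the variable names here are just for illustration):
n = 4;
m = n*(n+1)/2;                % number of free parameters
x = randn(m,1);               % example parameter vector (the optimizer's unknowns)
L = zeros(n);
L(triu(true(n))) = x;         % scatter x into the upper triangle of L
P = L'*L;                     % symmetric and positive semidefinite by construction
issymmetric(P)
min(eig(P))                   % nonnegative, up to roundoff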
One problem is that you cannot enforce a truly positive definite matrix this way. At best you can force the matrix to be positive semidefinite. And even then, be careful, because it is easy enough to choose a matrix L that apparently should be fine, but...
L = [1e-8 1;0 1e-8]
L =
1e-08 1
0 1e-08
svd(L'*L)
ans =
1
2.2204e-24
rank(L'*L)
ans =
1
So clearly the product is not numerically positive definite, even though we know it is so in exact arithmetic. chol backs that up.
chol(L'*L)
Error using chol
Matrix must be positive definite.
Finally, you can always use the tool I've posted on the FEX: nearestSPD. It will take a matrix and find the matrix that is as close as possible to the original, yet is also SPD. You could not put it inside an optimization, though.
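If you do grab nearestSPD from the File Exchange, usage is along these lines (a sketch; check the function's own help for details):
Ahat = nearestSPD(A);   % nearest symmetric positive definite matrix to A
chol(Ahat);             % should now succeed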
1 Comment
Walter Roberson
2017-1-6
By the way, see some of the numeric problems that can arise in practice, more so since R2016a: http://www.mathworks.com/matlabcentral/answers/319225-hac-results-vary-bewteen-matlab-r2016a-and-matlab-r2016b
More Answers (1)
Nick
2025-9-5,16:32
I agree with the other posts. There are multiple ways you can parameterize your positive definite matrix, though, with some offering potentially more benefits than others. For instance, you can use an eigendecomposition to parameterize your matrix in terms of a diagonal matrix of eigenvalues, Λ, and an orthogonal matrix of eigenvectors, Q (so that Q'*Q = I), i.e., A = Q*Λ*Q'. Then, to ensure A is positive definite you just need to ensure the diagonal elements of Λ are positive.
The diagonal eigenvalue matrix is trivial to parameterize, but there are many ways to parameterize the orthogonal eigenvector matrix. See this journal paper regarding different parameterizations of orthogonal matrices. It offers some good insight into the differences between them, for example in terms of operation counts. Additionally, see this arxiv paper (soon to be published... hopefully) for this eigendecomposition parameterization applied to an optimization problem. However, as far as I am aware, the benefits of one parameterization over another for optimizing a positive definite matrix are unexplored.
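As one concrete, purely illustrative instance of this idea, the eigenvalues can be kept positive with an exponential, and Q can be generated from a skew-symmetric matrix via the matrix exponential, which is one of the orthogonal-matrix parameterizations alluded to above:
n = 4;
d = randn(n,1);             % unconstrained eigenvalue parameters
S = randn(n);               % unconstrained generator for the eigenvectors
Lambda = diag(exp(d));      % strictly positive eigenvalues
Q = expm(S - S');           % expm of a skew-symmetric matrix is orthogonal
A = Q*Lambda*Q';            % symmetric positive definite by construction
min(eig((A + A')/2))        % strictly positive, up to roundoff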
As for ensuring your positive definite matrix is actually (numerically) positive definite, that will depend on the bounds you set for your eigenvalues. These bounds might take some trial and error to set, depending on your problem. I know the original question is quite old at this point, but are you able to share details of the optimization problem you were trying to solve? I am curious what kinds of applications this problem might arise in, other than the one in the arxiv paper I linked.
0 Comments