How does the svd function determine or fix the phase of singular vectors?
Hello everyone,
I have been experimenting with MATLAB's `svd()` and `eig()` functions to compute the singular value decomposition of complex matrices. I noticed that:
- The results from `svd()` and `eig()` are numerically very close, but not exactly identical.
- Sometimes the signs of the singular vectors match, while other times they differ by -1 (or a phase factor in the complex case).
I would like to understand:
1. How does MATLAB internally compute SVD?
2. When there is phase or sign ambiguity in the singular vectors, how does `svd()` standardize or normalize them?
Any insights or references on this would be greatly appreciated, as I want to use this knowledge in my research on tensor analysis and array signal processing.
Thank you very much!
Best regards,
Lv-lin Zhong
Answers (3)
Steve Eddins
on 17 Oct 2025 at 13:14
For a discussion of how MATLAB computes the SVD, see Cleve Moler's blog post, "Two Flavors of SVD," 23-Feb-2025.
The svd doc page implies that singular vectors are not "standardized." See the descriptions of the output arguments. For example, the description of U says: "Different machines and releases of MATLAB® can produce different singular vectors that are still numerically accurate. Corresponding columns in U and V can flip their signs, since this does not affect the value of the expression A = U*S*V'."
John D'Errico
on 17 Oct 2025 at 14:54
First, signs are to some extent arbitrary. That is, for EIG, you can multiply an eigenvector by -1, and it is still an eigenvector. Nothing changes. For SVD it is not quite so free: you can still flip the sign of a singular vector, but you then need to flip the sign of the corresponding singular vector on the other side as well.
That is, if we have
A = U*S*V'
then trivially
A = (-U)*S*(-V')
Can you flip signs of a singular vector? Well, yes, if you are careful. Consider this example:
A = rand(5)
[U,S,V] = svd(A)
As you can see, U,S,V can be used to reconstruct A.
U*S*V' - A
Now flip the signs of the second left and right singular vectors.
U2 = U; U2(:,2) = -U2(:,2);
V2 = V; V2(:,2) = -V2(:,2);
U2*S*V2' - A
And the result is the same, all floating point trash. In fact, the trash is identical in both cases.
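A quick way to confirm that, continuing with the same U, S, V:
isequal(U*S*V' - A, U2*S*V2' - A)   % true here: negating a column is exact, so the terms in the two products are bitwise identical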
Next, you tell us the results are close, but not exact. SURPRISE! They are computed by two different algorithms, and anytime you have a set of numbers computed using two distinct sequences of operations, or by completely different algorithms, you should NEVER expect to see identical results. Must I repeat that? NEVER. In fact, it is more often a surprise if you do see the same results down to the least significant bit.
A trivial example of this is...
x = 0.3 - 0.2 - 0.1
y = -0.2 - 0.1 + 0.3
x and y should be identical, no? Even for such a simple thing, they are not, because these computations were done in double precision, not in infinite precision arithmetic.
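You can check the difference explicitly:
x == y    % false
x - y     % a tiny nonzero difference, comparable to eps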
How is SVD computed? It's been a long time since I wrote code for an SVD, but I recall the old LINPACK codes first bidiagonalized the matrix using Householder transformations, then killed off the superdiagonal elements using a sequence of Givens rotations. I'd guess that is not exactly how the current code works, since it has been roughly 40 years since I wrote code for an SVD, but that description is probably not terribly far off.
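If you want to see the flavor of that first stage, here is a bare-bones sketch of Golub-Kahan bidiagonalization with Householder reflectors, written for real matrices with at least as many rows as columns. It is purely illustrative (the helper names are mine); the LAPACK code MATLAB actually calls is blocked and far more careful about scaling and edge cases.
% Illustrative usage: B is upper bidiagonal (up to roundoff) and has the same singular values as M
M = rand(6,4);
B = bidiag_sketch(M);
norm(sort(svd(B)) - sort(svd(M)))    % small, on the order of eps*norm(M)

function B = bidiag_sketch(A)
% Reduce a real m-by-n matrix (m >= n) to upper bidiagonal form using
% Householder reflectors applied alternately from the left and the right.
[m, n] = size(A);
B = A;
for k = 1:n
    % Left reflector: zero out B(k+1:m, k)
    v = house(B(k:m, k));
    B(k:m, k:n) = B(k:m, k:n) - 2*v*(v'*B(k:m, k:n));
    if k <= n-2
        % Right reflector: zero out B(k, k+2:n)
        v = house(B(k, k+1:n)');
        B(k:m, k+1:n) = B(k:m, k+1:n) - 2*(B(k:m, k+1:n)*v)*v';
    end
end
end

function v = house(x)
% Unit vector v such that (I - 2*v*v')*x is a multiple of the first unit vector
v = x;
if x(1) >= 0
    v(1) = v(1) + norm(x);
else
    v(1) = v(1) - norm(x);
end
if norm(v) > 0
    v = v / norm(v);
else
    v(1) = 1;   % x was already the zero vector
end
end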
Finally, how does SVD standardize/normalize the singular vectors? It does nothing of the sort. There is no need to artificially normalize them, since they are already unit vectors. That is, every operation done to create them is a product with a "rotation" (orthogonal or unitary) matrix, and that does not change the norm of a vector.
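You can confirm this with the U and V from the example above:
vecnorm(U)              % every column already has unit norm
norm(U'*U - eye(5))     % and the columns are mutually orthogonal
norm(V'*V - eye(5))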
If you want to understand more than that, my guess is MATLAB uses LAPACK codes to compute both EIG and SVD. So start reading the LAPACK documentation for those tools. The docs for EIG and SVD say a little, but they don't really state the methods used in any depth, since those algorithms might change in a future release.
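As one concrete starting point, MATLAB will tell you which LAPACK build it ships with:
version('-lapack')      % returns the LAPACK library version string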
Christine Tobler
on 17 Oct 2025 at 18:06
Hi Lv-lin,
1. MATLAB computes the SVD internally through a completely separate algorithm from EIG, with both algorithms provided by the LAPACK library.
While in exact arithmetic the SVD can be computed through eig on A'*A, doing so in floating point results in larger numerical errors. The algorithms for EIG and SVD are related, but not identical.
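Here is a quick illustration of that accuracy difference (gallery('randsvd', ...) is just a convenient way to build an ill-conditioned test matrix; the exact numbers will vary):
A = gallery('randsvd', 50, 1e10);                % condition number about 1e10
s_svd = svd(A);                                  % smallest singular value is about 1e-10
s_eig = sqrt(sort(abs(eig(A'*A)), 'descend'));   % forming A'*A squares the condition number
[s_svd(end), s_eig(end)]                         % the eig-based route is visibly less accurate here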
2. MATLAB applies no standardization to the singular vectors and eigenvectors in terms of phase or sign. In the svd, and for simple Hermitian eigenvalue problems, the eigenvectors are normalized in Euclidean norm and form an orthogonal basis.
That said, for the complex case, the LAPACK method applies some standardization to the phase. We don't document this, because in the future there might be new, faster methods that choose a different standardization. My recommendation is that if you require some standardization of the phase, you should apply it to the outputs yourself, to guarantee it stays the same over time.
In complex non-Hermitian eig, the phase is chosen so that the element with largest magnitude for each eigenvector is real positive:
[U, ~] = eig(randn(4, 'like', 1i))
max(abs(U))
I find this quite a reasonable standard, which you can apply to U for all "complexities" and "symmetricities" as follows:
% Complex Hermitian has a different standardization
A = randn(4, 'like', 1i); A = A+A';
[U, ~] = eig(A)
% Find element with maximum absolute value
maxAbs = max(U, [], ComparisonMethod="abs")
% Multiply with the conjugate of the sign of this element
U = U .* conj(sign(maxAbs))
% Verify that now, the real positive element in each column has maximal
% magnitude
max(abs(U))
For SVD, you could base the same kind of standardization on either U or V, as sketched below. What standardization you apply (if any) depends on your use case. For many applications, no standardization is needed at all.
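For example, one possible convention (not something svd itself applies) is to fix the phase based on the columns of U and multiply the corresponding columns of V by the same factor, so that U*S*V' is unchanged:
A = randn(4, 'like', 1i);
[U, S, V] = svd(A);
maxAbs = max(U, [], ComparisonMethod="abs");   % largest-magnitude entry in each column of U
phase = conj(sign(maxAbs));                    % unit-modulus correction factor per column
U = U .* phase;
V = V .* phase;                                % same factor for V, so the phases cancel in U*S*V'
norm(U*S*V' - A)                               % A is still reconstructed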