## balance

Diagonal scaling to improve eigenvalue accuracy

```
[T,B] = balance(A)
[S,P,B] = balance(A)
B = balance(A)
B = balance(A,'noperm')
```

`[T,B] = balance(A)` returns
a similarity transformation `T` such that `B
= T\A*T`, and `B` has,
as nearly as possible, equal row and column norms. `T` is
a permutation of a diagonal matrix whose elements are integer powers
of two, so that the scaling introduces no roundoff error. If `A` is
symmetric, then `B == A` and `T` is the identity matrix.
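As a cross-check of the defining relation, here is an illustrative NumPy/SciPy sketch (an assumption on my part — this page documents MATLAB's `balance`, and `scipy.linalg.matrix_balance` is used only as a stand-in that returns an analogous pair `B`, `T`):

```python
import numpy as np
from scipy.linalg import matrix_balance

A = np.array([[1.0,    100.0, 10000.0],
              [0.01,   1.0,   100.0],
              [0.0001, 0.01,  1.0]])

# matrix_balance returns B and the transformation T with
# B = inv(T) @ A @ T, mirroring MATLAB's B = T\A*T
B, T = matrix_balance(A)

assert np.allclose(B, np.linalg.solve(T, A) @ T)
```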

`[S,P,B] = balance(A)` returns
the scaling vector `S` and the permutation vector `P` separately.
The transformation `T` and balanced matrix `B` are
obtained from `A`, `S`, and `P` by `T(:,P) = diag(S)` and `B(P,P) = diag(1./S)*A*diag(S)`.
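A minimal NumPy sketch of these two index relations (the vectors `S` and `P` below are made-up values chosen only to exercise the identity, and 0-based indices replace MATLAB's 1-based ones):

```python
import numpy as np

A = np.array([[1.0,    100.0, 10000.0],
              [0.01,   1.0,   100.0],
              [0.0001, 0.01,  1.0]])
S = np.array([2048.0, 32.0, 0.25])  # hypothetical scaling vector
P = np.array([2, 0, 1])             # hypothetical permutation vector (0-based)

# T(:,P) = diag(S)
T = np.zeros((3, 3))
T[:, P] = np.diag(S)

# B via the full similarity transformation: B = T\A*T
B_full = np.linalg.solve(T, A) @ T

# B(P,P) = diag(1./S)*A*diag(S)
B_sep = np.zeros((3, 3))
B_sep[np.ix_(P, P)] = np.diag(1.0 / S) @ A @ np.diag(S)

# The two constructions agree
assert np.allclose(B_full, B_sep)
```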

`B = balance(A)` returns
just the balanced matrix `B`.

`B = balance(A,'noperm')` scales `A` without
permuting its rows and columns.
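In SciPy terms (again an illustrative translation, not part of the MATLAB documentation), the `permute=False` flag of `scipy.linalg.matrix_balance` plays the role of `'noperm'`: the transformation then stays purely diagonal.

```python
import numpy as np
from scipy.linalg import matrix_balance

A = np.array([[1.0,    100.0, 10000.0],
              [0.01,   1.0,   100.0],
              [0.0001, 0.01,  1.0]])

# Scale only -- no row/column permutation, analogous to balance(A,'noperm')
B, T = matrix_balance(A, permute=False)

# T is diagonal, and B is still a similarity transformation of A
assert np.count_nonzero(T - np.diag(np.diag(T))) == 0
assert np.allclose(B, np.linalg.solve(T, A) @ T)
```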

This example shows the basic idea. The matrix `A` has
large elements in the upper right and small elements in the lower
left. It is far from being symmetric.

```
A = [1 100 10000; .01 1 100; .0001 .01 1]

A =

   1.0e+04 *

    0.0001    0.0100    1.0000
    0.0000    0.0001    0.0100
    0.0000    0.0000    0.0001
```
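The imbalance is easy to quantify: the row and column norms of `A` each span roughly four orders of magnitude. A quick NumPy check (included purely as an illustration):

```python
import numpy as np

A = np.array([[1.0,    100.0, 10000.0],
              [0.01,   1.0,   100.0],
              [0.0001, 0.01,  1.0]])

row_norms = np.linalg.norm(A, axis=1)
col_norms = np.linalg.norm(A, axis=0)

# Largest and smallest norms differ by roughly 1e4
assert row_norms.max() / row_norms.min() > 1e3
assert col_norms.max() / col_norms.min() > 1e3
```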

Balancing produces a diagonal matrix `T` with
elements that are powers of two and a balanced matrix `B` that
is closer to symmetric than `A`.

```
[T,B] = balance(A)

T =

   1.0e+03 *

    2.0480         0         0
         0    0.0320         0
         0         0    0.0003

B =

    1.0000    1.5625    1.2207
    0.6400    1.0000    0.7813
    0.8192    1.2800    1.0000
```
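Both claimed properties can be confirmed with the illustrative SciPy stand-in: the nonzero entries of the transformation are exact integer powers of two, and the row and column norms of the balanced matrix are far closer together than those of `A`.

```python
import numpy as np
from scipy.linalg import matrix_balance

A = np.array([[1.0,    100.0, 10000.0],
              [0.01,   1.0,   100.0],
              [0.0001, 0.01,  1.0]])
B, T = matrix_balance(A)

# Nonzero entries of T are exact integer powers of two
d = np.abs(T[T != 0])
assert np.allclose(np.log2(d), np.round(np.log2(d)))

# Row and column norms of B are far more nearly equal than those of A
def norm_spread(M, axis):
    n = np.linalg.norm(M, axis=axis)
    return n.max() / n.min()

assert norm_spread(B, 1) < norm_spread(A, 1) / 100
assert norm_spread(B, 0) < norm_spread(A, 0) / 100
```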

To see the effect on eigenvectors, first compute the eigenvectors
of `A`, shown here as the columns of `V`.

```
[V,E] = eig(A); V

V =

   0.9999            -0.9999            -0.9999
   0.0100             0.0059 + 0.0085i   0.0059 - 0.0085i
   0.0001             0.0000 - 0.0001i   0.0000 + 0.0001i
```

Note that all three eigenvectors have their largest component first.
This indicates that `V` is badly conditioned; in fact, `cond(V)` is `8.7766e+003`.
Next, look at the eigenvectors of `B`.

```
[V,E] = eig(B); V

V =

   0.6933            -0.6993            -0.6993
   0.4437             0.2619 + 0.3825i   0.2619 - 0.3825i
   0.5679             0.2376 - 0.4896i   0.2376 + 0.4896i
```

Now the eigenvectors are well behaved and `cond(V)` is `1.4421`.
The ill-conditioning is concentrated in the scaling matrix; `cond(T)` is `8192`.
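The conditioning comparison can be reproduced (illustratively, with NumPy/SciPy rather than MATLAB):

```python
import numpy as np
from scipy.linalg import matrix_balance

A = np.array([[1.0,    100.0, 10000.0],
              [0.01,   1.0,   100.0],
              [0.0001, 0.01,  1.0]])
B, T = matrix_balance(A)

_, VA = np.linalg.eig(A)   # eigenvectors of the unbalanced matrix
_, VB = np.linalg.eig(B)   # eigenvectors of the balanced matrix

# The eigenvector matrix of A is badly conditioned (~1e4),
# while that of B is nearly orthogonal (condition number close to 1)
assert np.linalg.cond(VA) > 100 * np.linalg.cond(VB)
```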

This example is small and not really badly scaled, so the computed
eigenvalues of `A` and `B` agree
within roundoff error; balancing has little effect on the computed
results.
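A quick check that the eigenvalues themselves agree to roundoff (again an illustrative NumPy/SciPy sketch):

```python
import numpy as np
from scipy.linalg import matrix_balance

A = np.array([[1.0,    100.0, 10000.0],
              [0.01,   1.0,   100.0],
              [0.0001, 0.01,  1.0]])
B, _ = matrix_balance(A)

# Similar matrices share eigenvalues; sort both spectra and compare
wA = np.sort_complex(np.linalg.eigvals(A))
wB = np.sort_complex(np.linalg.eigvals(B))

assert np.allclose(wA, wB, atol=1e-6)
```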

Balancing can destroy the properties of certain matrices; use it with some care. If a matrix contains small elements that are due to roundoff error, balancing might scale them up to make them as significant as the other elements of the original matrix.
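An illustrative sketch of that failure mode (the matrix is contrived): a tiny subdiagonal entry standing in for roundoff noise gets scaled up until it is as significant as the genuine entry.

```python
import numpy as np
from scipy.linalg import matrix_balance

eps_noise = 1e-12                  # stand-in for a roundoff-level entry
A = np.array([[0.0,       1.0],
              [eps_noise, 0.0]])

# Scaling-only balancing equalizes the row and column norms,
# promoting the noise entry to the same magnitude as the real one
B, T = matrix_balance(A, permute=False)

# The two off-diagonal entries of B are now comparable in size
assert 1/32 < abs(B[1, 0] / B[0, 1]) < 32
```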
