Published: 2016/2/18 6:10:54

Something special about: Matrix A*A'

Multiplying a matrix by its transpose is a very interesting and important concept in applied linear algebra.

The key observation is that A*A' is always a real symmetric matrix: (A*A')' = (A')'*A' = A*A'. For symmetric matrices, some very useful matrix decomposition theorems come into play, e.g., the spectral theorem and the singular value decomposition (SVD).
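The symmetry claim is easy to check numerically. A minimal NumPy sketch (the matrix shape and seed are arbitrary choices for illustration):

```python
import numpy as np

# Any real matrix A: the product A @ A.T is symmetric.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
C = A @ A.T

# Symmetry: C equals its own transpose.
assert np.allclose(C, C.T)

# Spectral theorem: a real symmetric matrix has real eigenvalues and an
# orthonormal eigenvector basis; eigh exploits the symmetry of C.
w, V = np.linalg.eigh(C)
assert np.allclose(V @ np.diag(w) @ V.T, C)
```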

Moreover, if A is regular (i.e., square and invertible), the matrix A*A' is positive definite, since x'*(A*A')*x = (A'*x)'*(A'*x) = ||A'*x||² > 0 for any x ≠ 0. This property opens up many possibilities for solving linear systems.
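One way to see the payoff is that positive definiteness makes the Cholesky factorization applicable, which underlies fast solvers for the normal equations. A small sketch, assuming a random square A (invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
# A square and invertible ("regular"), so A @ A.T is positive definite.
A = rng.standard_normal((4, 4))
C = A @ A.T

# The quadratic form x' C x = ||A' x||^2 is positive for nonzero x:
# spot-check on random vectors.
for _ in range(100):
    x = rng.standard_normal(4)
    assert x @ C @ x > 0

# Positive definiteness guarantees the Cholesky factorization succeeds.
L = np.linalg.cholesky(C)
assert np.allclose(L @ L.T, C)
```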

Another take-home message is that if A*A' = O (or A'*A = O), then A must be the zero matrix. A simple proof: suppose A is m × n, so A' is n × m. Let A = [a_{i, j}], and call the product

AA' = C, where C = [c_{i, j}].

By the definition of matrix multiplication,

c_{i, j} = Σ a_{i, k} a_{j, k}, with the sum k = 1 to n.

Taking the special case i = j, the diagonal elements of the product are (using the assumption that AA' = O)

c_{i, i} = Σ_{k=1}^{n} (a_{i, k})² = 0.

Being a sum of squares, this is zero only if each term is zero. Hence for every pair (i, k), i = 1, ..., m and k = 1, ..., n,

a_{i, k} = 0.

That is, A = O.

To put it briefly, the diagonal entries of the product are the sums of squares of the entries of each row of A. Because each of these sums is 0, every term must be 0, and thus the original matrix A is the zero matrix.
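The diagonal-equals-row-sums-of-squares identity from the proof can be verified directly (the concrete matrix below is just an example):

```python
import numpy as np

# The diagonal of A @ A.T holds the sum of squares of each row of A,
# which is why A @ A.T == O forces A == O.
A = np.array([[1.0, -2.0, 3.0],
              [0.5,  0.0, 4.0]])
C = A @ A.T

# Diagonal entries equal the squared row norms.
assert np.allclose(np.diag(C), (A ** 2).sum(axis=1))

# Sanity check of the converse direction: the zero matrix gives AA' = O.
Z = np.zeros((2, 3))
assert np.allclose(Z @ Z.T, 0)
```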
