Multiplying a matrix by its transpose is a very interesting and important concept in applied linear algebra.
The main point is that, for real A, the product A*A' is a real symmetric matrix: (A*A')' = (A')'*A' = A*A'. For symmetric matrices, some very useful matrix decomposition theories come into play, e.g., the spectral theorem and the singular value decomposition (SVD).
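As a quick numerical illustration (a minimal sketch assuming NumPy; the matrix and variable names are only for illustration), one can check the symmetry of A*A' and apply the spectral theorem via an eigendecomposition:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # an arbitrary real 4 x 3 matrix
C = A @ A.T                       # C = A*A', a 4 x 4 matrix

# C is symmetric: C' = C
print(np.allclose(C, C.T))        # True

# Spectral theorem: a real symmetric matrix has real eigenvalues and an
# orthonormal basis of eigenvectors, i.e., C = Q * diag(w) * Q'.
w, Q = np.linalg.eigh(C)
print(np.allclose(C, Q @ np.diag(w) @ Q.T))   # True
print(np.allclose(Q.T @ Q, np.eye(4)))        # Q is orthogonal
```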
Moreover, if A is nonsingular (more generally, if A has full row rank), the matrix A*A' is positive definite, since x'*(A*A')*x = (A'*x)'*(A'*x) = ||A'*x||² > 0 for any x ≠ 0. This property opens up many possibilities for solving linear systems.
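Along the same lines, here is a small check (again just a sketch with NumPy, using a random matrix that has full row rank with probability one) that the quadratic form x'*(A*A')*x equals ||A'*x||² and that A*A' is indeed positive definite:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))   # 3 x 5, full row rank almost surely
C = A @ A.T

# The quadratic form x' C x equals ||A' x||^2, hence > 0 for x != 0.
x = rng.standard_normal(3)
print(np.isclose(x @ C @ x, np.linalg.norm(A.T @ x) ** 2))   # True

# Positive definiteness: all eigenvalues are positive, and the Cholesky
# factorization (which requires positive definiteness) succeeds.
print(np.all(np.linalg.eigvalsh(C) > 0))   # True
L = np.linalg.cholesky(C)
print(np.allclose(L @ L.T, C))             # True
```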
Another take-home message is that if A*A' = O or A'*A = O, then A must be the zero matrix. Simple proof (for A*A' = O; the other case is analogous): assume A is m x n, so A' is n x m. Let A = [a_{i,j}], and call the product A*A' = C, where C = [c_{i,j}].
Then,
c_{i,j} = Σ_{k=1}^{n} a_{i,k} a_{j,k}.
Taking the special case i = j, and using the fact that A*A' = O, the diagonal elements satisfy
c_{i,i} = Σ_{k=1}^{n} [a_{i,k}]² = 0.
As a sum of squares, this can only be zero if each term is zero. Hence, for every pair (i, k) with i = 1, ..., m and k = 1, ..., n,
a_{i,k} = 0.
That is, A = O.
To put it briefly, the diagonal entries of the product are the sums of squares of the entries of each row of A. Because each of these sums is 0, each term must be 0, and thus the original matrix A is all zeros.
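The key fact in this argument, that the diagonal of A*A' collects the row-wise sums of squares of A, is easy to verify numerically (a minimal sketch assuming NumPy; the matrix here is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
C = A @ A.T

# diag(A*A')_i = sum_k a_{i,k}^2, the sum of squares of row i of A.
print(np.allclose(np.diag(C), (A ** 2).sum(axis=1)))   # True

# Consequently, if A*A' = O, every row sum of squares is 0,
# which forces every entry of A to be 0.
```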