2.1 Matrix Operations
These are reading notes for *Linear Algebra and Its Applications*.
If $A$ is an $m \times n$ matrix, each column of $A$ is a list of $m$ real numbers, which identifies a vector in $\mathbb{R}^m$.
The diagonal entries in an $m \times n$ matrix $A = [a_{ij}]$ are $a_{11}, a_{22}, a_{33}, \ldots$, and they form the main diagonal of $A$. A diagonal matrix is a square matrix whose nondiagonal entries are zero. An $m \times n$ matrix whose entries are all zero is a zero matrix and is written as $0$. The size of a zero matrix is usually clear from the context.
Sums and Scalar Multiples
The arithmetic for vectors described earlier has a natural extension to matrices.
The sum $A + B$ is the $m \times n$ matrix whose columns are the sums of the corresponding columns in $A$ and $B$. The sum $A + B$ is defined only when $A$ and $B$ are the same size.
If $r$ is a scalar and $A$ is a matrix, then the scalar multiple $rA$ is the matrix whose columns are $r$ times the corresponding columns in $A$.
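As a concrete illustration (the matrices here are made up for the example, not taken from the text), sums and scalar multiples work entrywise in NumPy:

```python
import numpy as np

# Illustrative matrices of the same size (required for the sum to be defined).
A = np.array([[4, 0, 5],
              [-1, 3, 2]])
B = np.array([[1, 1, 1],
              [3, 5, 7]])

# The sum A + B adds corresponding columns (equivalently, entries).
S = A + B

# The scalar multiple rA multiplies every column of A by r.
r = 2
rA = r * A

print(S)   # [[5 1 6] [2 8 9]]
print(rA)  # [[8 0 10] [-2 6 4]]
```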
Matrix Multiplication
When a matrix $B$ multiplies a vector $\mathbf{x}$, it transforms $\mathbf{x}$ into the vector $B\mathbf{x}$. If this vector is then multiplied in turn by a matrix $A$, the resulting vector is $A(B\mathbf{x})$. See Figure 2.
Thus $A(B\mathbf{x})$ is produced from $\mathbf{x}$ by a composition of mappings. Our goal is to represent this composite mapping as multiplication by a single matrix, denoted by $AB$, so that

$$A(B\mathbf{x}) = (AB)\mathbf{x}$$
If $A$ is $m \times n$, $B$ is $n \times p$ with columns $\mathbf{b}_1, \ldots, \mathbf{b}_p$, and $\mathbf{x}$ is in $\mathbb{R}^p$, then

$$B\mathbf{x} = x_1\mathbf{b}_1 + x_2\mathbf{b}_2 + \cdots + x_p\mathbf{b}_p$$
By the linearity of multiplication by $A$,

$$A(B\mathbf{x}) = A(x_1\mathbf{b}_1) + \cdots + A(x_p\mathbf{b}_p) = x_1A\mathbf{b}_1 + \cdots + x_pA\mathbf{b}_p$$
The vector $A(B\mathbf{x})$ is a linear combination of the vectors $A\mathbf{b}_1, \ldots, A\mathbf{b}_p$, using the entries in $\mathbf{x}$ as weights. In matrix notation, this linear combination is written as

$$A(B\mathbf{x}) = [A\mathbf{b}_1 \quad A\mathbf{b}_2 \quad \cdots \quad A\mathbf{b}_p]\,\mathbf{x}$$
Thus multiplication by $[A\mathbf{b}_1 \quad A\mathbf{b}_2 \quad \cdots \quad A\mathbf{b}_p]$ transforms $\mathbf{x}$ into $A(B\mathbf{x})$. We have found the matrix we sought! If $A$ is an $m \times n$ matrix and $B$ is an $n \times p$ matrix with columns $\mathbf{b}_1, \ldots, \mathbf{b}_p$, then the product $AB$ is the $m \times p$ matrix whose columns are $A\mathbf{b}_1, \ldots, A\mathbf{b}_p$:

$$AB = A[\mathbf{b}_1 \quad \mathbf{b}_2 \quad \cdots \quad \mathbf{b}_p] = [A\mathbf{b}_1 \quad A\mathbf{b}_2 \quad \cdots \quad A\mathbf{b}_p]$$
Multiplication of matrices corresponds to composition of linear transformations.
The definition of $AB$ lends itself well to parallel processing on a computer. The columns of $B$ are assigned individually or in groups to different processors, which independently, and hence simultaneously, compute the corresponding columns of $AB$.
EXAMPLE 3
Compute $AB$ for given matrices $A$ and $B$.
SOLUTION
Each column of $AB$ is a linear combination of the columns of $A$, using the entries in the corresponding column of $B$ as weights.
Obviously, the number of columns of $A$ must match the number of rows in $B$ in order for a linear combination such as $A\mathbf{b}_1$ to be defined. Also, the definition of $AB$ shows that $AB$ has the same number of rows as $A$ and the same number of columns as $B$.
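A quick sketch of the column definition in NumPy, using illustrative matrices of compatible sizes (a $2 \times 2$ times a $2 \times 3$, so the product is $2 \times 3$):

```python
import numpy as np

# Illustrative matrices: A is 2x2 and B is 2x3, so AB is 2x3.
A = np.array([[2, 3],
              [1, -5]])
B = np.array([[4, 3, 6],
              [1, -2, 3]])

# Column definition: the j-th column of AB is A times the j-th column of B,
# i.e. a linear combination of the columns of A weighted by column j of B.
AB_by_columns = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

print(AB_by_columns)
print(np.array_equal(AB_by_columns, A @ B))  # True
```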
The definition of $AB$ is important for theoretical work and applications, but the following rule provides a more efficient method for calculating the individual entries in $AB$ when working small problems by hand.
Row–Column Rule for Computing $AB$: if the product $AB$ is defined, then the entry in row $i$ and column $j$ of $AB$ is the sum of the products of corresponding entries from row $i$ of $A$ and column $j$ of $B$. If $A$ is $m \times n$, then

$$(AB)_{ij} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{in}b_{nj}$$

Let $\mathrm{row}_i(A)$ denote the $i$th row of a matrix $A$. Then

$$\mathrm{row}_i(AB) = \mathrm{row}_i(A) \cdot B$$
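The row–column rule translates directly into a double loop; a minimal sketch with illustrative matrices, checked against NumPy's built-in product:

```python
import numpy as np

# Illustrative matrices: A is 2x3, B is 3x2, so AB is 2x2.
A = np.array([[1, 2, 0],
              [-3, 1, 4]])
B = np.array([[2, 1],
              [0, -1],
              [5, 2]])

m, n = A.shape
nB, p = B.shape
assert n == nB  # columns of A must match rows of B

AB = np.zeros((m, p), dtype=A.dtype)
for i in range(m):
    for j in range(p):
        # (AB)_ij = a_i1*b_1j + a_i2*b_2j + ... + a_in*b_nj
        AB[i, j] = sum(A[i, k] * B[k, j] for k in range(n))

print(np.array_equal(AB, A @ B))       # True
# Row view: row i of AB equals (row i of A) times B.
print(np.array_equal(AB[0], A[0] @ B))  # True
```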
View vectors in $\mathbb{R}^n$ as $n \times 1$ matrices. For $\mathbf{u}$ and $\mathbf{v}$ in $\mathbb{R}^n$, the matrix product $\mathbf{u}^T\mathbf{v}$ is a $1 \times 1$ matrix, called the scalar product, or inner product, of $\mathbf{u}$ and $\mathbf{v}$. It is usually written as a single real number without brackets. The matrix product $\mathbf{u}\mathbf{v}^T$ is an $n \times n$ matrix, called the outer product of $\mathbf{u}$ and $\mathbf{v}$.
Inner products ($\mathbf{u}^T\mathbf{v}$ and $\mathbf{v}^T\mathbf{u}$) have the transpose symbol in the middle; outer products ($\mathbf{u}\mathbf{v}^T$ and $\mathbf{v}\mathbf{u}^T$) have the transpose symbol on the outside.
The inner product is symmetric in its arguments: $\mathbf{u}^T\mathbf{v} = \mathbf{v}^T\mathbf{u}$.
The outer products $\mathbf{u}\mathbf{v}^T$ and $\mathbf{v}\mathbf{u}^T$ are transposes of each other: $(\mathbf{u}\mathbf{v}^T)^T = \mathbf{v}\mathbf{u}^T$.
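A small sketch of both products, with illustrative vectors in $\mathbb{R}^3$ written as $3 \times 1$ matrices:

```python
import numpy as np

# Illustrative column vectors in R^3, viewed as 3x1 matrices.
u = np.array([[2], [-3], [4]])
v = np.array([[1], [0], [5]])

inner = u.T @ v  # 1x1 matrix: the inner (scalar) product
outer = u @ v.T  # 3x3 matrix: the outer product

print(inner)                             # [[22]]
print(np.array_equal(outer.T, v @ u.T))  # True: (u v^T)^T = v u^T
```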
EXAMPLE 4
Suppose the last column of $AB$ is entirely zero but $B$ itself has no column of zeros. What can you say about the columns of $A$?
SOLUTION
Let $\mathbf{b}_p$ be the last column of $B$. By hypothesis, the last column of $AB$ is zero. Thus, $A\mathbf{b}_p = \mathbf{0}$. However, $\mathbf{b}_p$ is not the zero vector. Thus, the equation $A\mathbf{b}_p = \mathbf{0}$ is a linear dependence relation among the columns of $A$, and so the columns of $A$ are linearly dependent.
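A numerical instance of this situation, with matrices chosen for illustration: $B$ has no zero column, yet the last column of $AB$ is zero, which forces the columns of $A$ to be linearly dependent.

```python
import numpy as np

# A has linearly dependent columns (the second equals the first).
A = np.array([[1, 1],
              [2, 2]])
# B has no column of zeros...
B = np.array([[1, 1],
              [1, -1]])

AB = A @ B
print(AB)
# [[2 0]
#  [4 0]]   ...yet the last column of AB is entirely zero.

# The last column of B is a nontrivial solution of Ax = 0,
# i.e. a linear dependence relation among the columns of A.
print(np.array_equal(A @ B[:, -1], np.zeros(2)))  # True
```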
Properties of Matrix Multiplication
Recall that $I_m$ represents the $m \times m$ identity matrix and $I_m\mathbf{x} = \mathbf{x}$ for all $\mathbf{x}$ in $\mathbb{R}^m$.

THEOREM 2
Let $A$ be an $m \times n$ matrix, and let $B$ and $C$ have sizes for which the indicated sums and products are defined.
a. $A(BC) = (AB)C$ (associative law of multiplication)
b. $A(B + C) = AB + AC$ (left distributive law)
c. $(B + C)A = BA + CA$ (right distributive law)
d. $r(AB) = (rA)B = A(rB)$ for any scalar $r$
e. $I_mA = A = AI_n$ (identity for matrix multiplication)
PROOF (only property (a) is proved here; the proofs of the other properties are similar)
Property (a)
Property (a) follows from the fact that matrix multiplication corresponds to composition of linear transformations (which are functions), and it is known that the composition of functions is associative.
In discrete mathematics, composition of functions is defined as a product of relations; the product of relations can be shown to be associative, and therefore composition of functions is associative as well.
Here is another proof of (a) that rests on the "column definition" of the product of two matrices. Let

$$C = [\mathbf{c}_1 \quad \cdots \quad \mathbf{c}_p]$$

Recall that the definition of $AB$ makes $A(B\mathbf{x}) = (AB)\mathbf{x}$ for all $\mathbf{x}$, so

$$A(BC) = A[B\mathbf{c}_1 \quad \cdots \quad B\mathbf{c}_p] = [A(B\mathbf{c}_1) \quad \cdots \quad A(B\mathbf{c}_p)] = [(AB)\mathbf{c}_1 \quad \cdots \quad (AB)\mathbf{c}_p] = (AB)C$$
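The associative law can also be spot-checked numerically on random integer matrices (an illustration, not a proof; the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(2, 3))
B = rng.integers(-5, 5, size=(3, 4))
C = rng.integers(-5, 5, size=(4, 2))

# Associativity: the grouping does not change the product.
print(np.array_equal(A @ (B @ C), (A @ B) @ C))  # True

# Column-by-column view used in the proof: the j-th column of A(BC)
# is A(B c_j), which equals (AB) c_j.
j = 0
print(np.array_equal(A @ (B @ C[:, j]), (A @ B) @ C[:, j]))  # True
```

Integer entries are used so the equality checks are exact rather than subject to floating-point rounding.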
If $AB = BA$, we say that $A$ and $B$ commute with one another.
WARNINGS:
- In general, $AB \neq BA$.
- The cancellation laws do not hold for matrix multiplication. That is, if $AB = AC$, then it is not true in general that $B = C$.
- If a product $AB$ is the zero matrix, you cannot conclude in general that either $A = 0$ or $B = 0$.
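Each warning can be demonstrated with small illustrative matrices (all chosen for the example):

```python
import numpy as np

# 1. AB != BA in general.
A = np.array([[5, 1],
              [3, -2]])
B = np.array([[2, 0],
              [4, 3]])
print(np.array_equal(A @ B, B @ A))  # False

# 2. Cancellation fails: AB == AC does not force B == C.
A2 = np.array([[1, 0],
               [0, 0]])
B2 = np.array([[0, 0],
               [1, 0]])
C2 = np.array([[0, 0],
               [2, 0]])
print(np.array_equal(A2 @ B2, A2 @ C2))  # True, even though B2 != C2

# 3. AB can be the zero matrix with A != 0 and B != 0.
A3 = np.array([[0, 1],
               [0, 0]])
B3 = np.array([[1, 0],
               [0, 0]])
print((A3 @ B3 == 0).all())  # True
```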
Tip: When $A$ is square and $B$ has fewer columns than $A$ has rows, it is more efficient to compute $A(AB)$ than $(AA)B$: forming $A^2$ costs about $n^3$ multiplications, while two products with an $n \times p$ matrix $B$ (where $p < n$) cost only about $2n^2p$.
EXERCISE: Show that if $\mathbf{y}$ is a linear combination of the columns of $B$, then $A\mathbf{y}$ is a linear combination of the columns of $AB$.

SOLUTION: If $\mathbf{y}$ is a linear combination of the columns of $B$, then there is a vector $\mathbf{c}$ such that $\mathbf{y} = B\mathbf{c}$. By the definition of matrix multiplication, $A\mathbf{y} = A(B\mathbf{c}) = (AB)\mathbf{c}$. This expresses $A\mathbf{y}$ as a linear combination of the columns of $AB$, using the entries in the vector $\mathbf{c}$ as weights.
Powers of a Matrix
If $A$ is an $n \times n$ matrix and if $k$ is a positive integer, then $A^k$ denotes the product of $k$ copies of $A$:

$$A^k = \underbrace{A \cdots A}_{k}$$
If $k = 0$, then $A^0\mathbf{x}$ should be $\mathbf{x}$ itself. Thus $A^0$ is interpreted as the identity matrix.
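A minimal sketch of matrix powers by repeated multiplication (the matrix and the helper `power` are made up for the example), with the $k = 0$ convention built in:

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])

def power(M, k):
    """Compute M^k by repeated multiplication, matching A^k = A...A (k factors)."""
    n = M.shape[0]
    result = np.eye(n, dtype=M.dtype)  # A^0 is the identity matrix
    for _ in range(k):
        result = result @ M
    return result

print(power(A, 3))                             # [[1 3] [0 1]]
print(np.array_equal(power(A, 0), np.eye(2)))  # True
print(np.array_equal(power(A, 4),
                     np.linalg.matrix_power(A, 4)))  # True
```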
The Transpose of a Matrix
Given an $m \times n$ matrix $A$, the transpose of $A$ is the $n \times m$ matrix, denoted by $A^T$, whose columns are formed from the corresponding rows of $A$.

THEOREM 3
Let $A$ and $B$ denote matrices whose sizes are appropriate for the following sums and products.
a. $(A^T)^T = A$
b. $(A + B)^T = A^T + B^T$
c. For any scalar $r$, $(rA)^T = rA^T$
d. $(AB)^T = B^TA^T$

PROOF
Property (d)
The $(i, j)$-entry of $(AB)^T$ is the $(j, i)$-entry of $AB$, which is

$$a_{j1}b_{1i} + a_{j2}b_{2i} + \cdots + a_{jn}b_{ni}$$

The entries in row $i$ of $B^T$ are $b_{1i}, \ldots, b_{ni}$, because they come from column $i$ of $B$. Likewise, the entries in column $j$ of $A^T$ are $a_{j1}, \ldots, a_{jn}$, because they come from row $j$ of $A$. Thus the $(i, j)$-entry in $B^TA^T$ is $b_{1i}a_{j1} + b_{2i}a_{j2} + \cdots + b_{ni}a_{jn}$, the same sum as above.
The generalization of Theorem 3(d) to products of more than two factors can be stated in words as follows: the transpose of a product of matrices equals the product of their transposes in the reverse order.
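Both Theorem 3(d) and its generalization can be spot-checked numerically on random integer matrices (an illustration with arbitrary sizes):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-3, 3, size=(2, 3))
B = rng.integers(-3, 3, size=(3, 4))
C = rng.integers(-3, 3, size=(4, 2))

# Theorem 3(d): (AB)^T = B^T A^T -- note the reversed order of the factors.
print(np.array_equal((A @ B).T, B.T @ A.T))  # True

# Generalization to three factors: (ABC)^T = C^T B^T A^T.
print(np.array_equal((A @ B @ C).T, C.T @ B.T @ A.T))  # True
```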