$A^{-1}$ may not exist. In particular, we say that $A$ is invertible or non-singular if $A^{-1}$ exists, and non-invertible or singular otherwise.¹ In order for a square matrix $A$ to have an inverse $A^{-1}$, $A$ must be full rank. We will soon see that there are many alternative sufficient and necessary conditions, in addition to full rank, for invertibility.

¹It's easy to get confused and think that non-singular means non-invertible. But in fact, it means the opposite! Watch out!

The following are properties of the inverse; all assume that $A, B \in \mathbb{R}^{n \times n}$ are non-singular:

• $(A^{-1})^{-1} = A$
• $(AB)^{-1} = B^{-1}A^{-1}$
• $(A^{-1})^T = (A^T)^{-1}$. For this reason this matrix is often denoted $A^{-T}$.

As an example of how the inverse is used, consider the linear system of equations $Ax = b$, where $A \in \mathbb{R}^{n \times n}$ and $x, b \in \mathbb{R}^n$. If $A$ is nonsingular (i.e., invertible), then $x = A^{-1}b$. (What if $A \in \mathbb{R}^{m \times n}$ is not a square matrix? Does this work?)

3.8 Orthogonal Matrices

Two vectors $x, y \in \mathbb{R}^n$ are orthogonal if $x^T y = 0$. A vector $x \in \mathbb{R}^n$ is normalized if $\|x\|_2 = 1$. A square matrix $U \in \mathbb{R}^{n \times n}$ is orthogonal (note the different meanings when talking about vectors versus matrices) if all its columns are orthogonal to each other and are normalized (the columns are then referred to as being orthonormal).

It follows immediately from the definition of orthogonality and normality that
$$U^T U = I = U U^T.$$
In other words, the inverse of an orthogonal matrix is its transpose. Note that if $U$ is not square (i.e., $U \in \mathbb{R}^{m \times n}$ with $n < m$) but its columns are still orthonormal, then $U^T U = I$, but $U U^T \neq I$. We generally only use the term orthogonal to describe the previous case, where $U$ is square.
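These identities are easy to check numerically. The following is a minimal sketch, assuming NumPy; the random test matrices and the use of a QR factorization to obtain orthonormal columns are illustrative choices, not part of the text. It verifies $(AB)^{-1} = B^{-1}A^{-1}$ and $(A^{-1})^T = (A^T)^{-1}$, solves $Ax = b$, and checks that a square matrix with orthonormal columns satisfies $U^TU = UU^T = I$ while a tall one only satisfies $U^TU = I$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random square matrices; Gaussian matrices are almost surely full rank (invertible).
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# (AB)^{-1} = B^{-1} A^{-1}
assert np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A))

# (A^{-1})^T = (A^T)^{-1}, i.e. the matrix denoted A^{-T}
assert np.allclose(np.linalg.inv(A).T, np.linalg.inv(A.T))

# Solving Ax = b: x = A^{-1} b (solve() avoids forming the inverse explicitly).
b = rng.standard_normal(4)
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)

# Square orthogonal matrix: the Q factor of a QR decomposition has orthonormal columns,
# so its inverse is its transpose.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
assert np.allclose(Q.T @ Q, np.eye(4))
assert np.allclose(Q @ Q.T, np.eye(4))

# Tall matrix with orthonormal columns (m > n): U^T U = I, but U U^T != I.
U, _ = np.linalg.qr(rng.standard_normal((5, 3)))
assert np.allclose(U.T @ U, np.eye(3))
assert not np.allclose(U @ U.T, np.eye(5))
```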
Another nice property of orthogonal matrices is that operating on a vector with an orthogonal matrix will not change its Euclidean norm, i.e.,
$$\|Ux\|_2 = \|x\|_2$$
for any $x \in \mathbb{R}^n$ and orthogonal $U \in \mathbb{R}^{n \times n}$.

3.9 Range and Nullspace of a Matrix

The span of a set of vectors $\{x_1, x_2, \ldots, x_n\}$ is the set of all vectors that can be expressed as a linear combination of $\{x_1, \ldots, x_n\}$. That is,
$$\mathrm{span}(\{x_1, \ldots, x_n\}) = \left\{ v : v = \sum_{i=1}^n \alpha_i x_i, \ \alpha_i \in \mathbb{R} \right\}.$$
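To make these two ideas concrete, here is another small sketch, again assuming NumPy; the particular matrices and the use of a least-squares solve to test span membership are illustrative assumptions rather than anything prescribed by the text. It checks that an orthogonal $U$ preserves $\|x\|_2$, and that a vector constructed as $\sum_i \alpha_i x_i$ lies in the span of the $x_i$ while a generic vector in $\mathbb{R}^5$ (almost surely) does not.

```python
import numpy as np

rng = np.random.default_rng(1)

# An orthogonal matrix (Q factor of a QR decomposition) preserves the Euclidean norm.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
x = rng.standard_normal(4)
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))

# Span: v is in span({x1, ..., xn}) exactly when some coefficients alpha satisfy
# X alpha = v, where the xi are the columns of X.  Build v as an explicit linear
# combination, then recover coefficients with a least-squares solve.
X = rng.standard_normal((5, 3))          # columns x1, x2, x3 in R^5
alpha_true = np.array([2.0, -1.0, 0.5])
v = X @ alpha_true                        # in the span by construction
alpha, residuals, rank, _ = np.linalg.lstsq(X, v, rcond=None)
assert np.allclose(X @ alpha, v)          # reproduced exactly (up to round-off)

# A generic vector in R^5 is almost surely not in the span of 3 vectors:
# the least-squares fit leaves a nonzero residual.
w = rng.standard_normal(5)
alpha_w, *_ = np.linalg.lstsq(X, w, rcond=None)
print("residual for v:", np.linalg.norm(X @ alpha - v))
print("residual for w:", np.linalg.norm(X @ alpha_w - w))
```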