Eigenvalues and Eigenvectors (고유값과 고유벡터)

이현봉 2016. 3. 28. 20:31

When A is an n×n matrix, does there exist a nonzero vector x ∈ Rⁿ and a scalar λ satisfying Ax = λx?  If such x and λ exist:

- The linear combination of the columns of A, with the entries of x as weights, equals x scaled by λ.

- Linearly transforming the vector x by the matrix A is the same as scaling x by λ.

- The inner products of the rows of A with x give λx (a small numerical check of these three readings follows below).
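As an illustration, here is a minimal numpy sketch of the three readings above; the 2×2 matrix A and the eigenpair (λ, x) are made up for this check and are not from the post.

```python
# Minimal check of the three readings of Ax = λx, using a made-up 2x2 matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0
x = np.array([1.0, 1.0])                     # (lam, x) is an eigenpair of this A

col_comb = x[0] * A[:, 0] + x[1] * A[:, 1]   # columns of A weighted by entries of x
Ax = A @ x                                   # x linearly transformed by A
row_dots = np.array([row @ x for row in A])  # inner products of A's rows with x

print(col_comb, Ax, row_dots, lam * x)       # all four equal [3. 3.]
```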

 

Introduction


Definition of eigenvalue and eigenvector

Let A be an n×n square matrix. The scalar λ is called an eigenvalue (고유값) of A if there is a nonzero vector x such that

Ax = λx

Then such a vector x is called an eigenvector (고유벡터) of A corresponding to λ.


- From Ax = λx we get (A − λI)x = 0 (∵ x = Ix, so λx = λIx and Ax − λx = Ax − λIx = (A − λI)x). Therefore, if a non-trivial solution x satisfying (A − λI)x = 0 exists, λ is an eigenvalue. In other words, if the null space of the matrix A − λI is nontrivial, then λ is an eigenvalue.

- Obviously, if x is an eigenvector of A corresponding to the eigenvalue λ, then so is any nonzero multiple of x.

- For the homogeneous equation Ax = 0, if A is invertible (i.e., det(A) ≠ 0) then it has only the trivial solution x = 0. Ax = 0 can never be inconsistent (x = 0 always works), so it has either exactly one solution (x = 0) or infinitely many. Thus if A is non-invertible (i.e., det(A) = 0) it must have infinitely many solutions (the unique-solution case is taken by invertible A), and these include non-trivial, i.e., nonzero, vectors. Therefore the following holds:

The homogeneous equation (A − λI)x = 0 has nonzero solutions if and only if A − λI is noninvertible, which is true if and only if det(A − λI) = 0. (Likewise, as seen before, if the augmented matrix of (A − λI)x = 0 has a free variable, the homogeneous equation has a non-trivial solution. See the Invertible Matrix Theorem.)

In other words, for a square matrix A, the eigenvectors corresponding to an eigenvalue λ are:

1) the nonzero vectors in the null space of the matrix A − λI; that is, the nonzero vectors x satisfying (A − λI)x = 0.

2) Put differently, if there is a λ making the square matrix A − λI non-invertible, i.e., satisfying det(A − λI) = 0, then A has the eigenvalue λ and a corresponding eigenvector x. The equation det(A − λI) = 0 is called the characteristic equation of A.

The eigenvalues of an n×n square matrix A are the solutions λ of the equation

det(A − λI) = 0

The eigenvectors x ∈ Rⁿ of A corresponding to λ are the nonzero solutions of

(A − λI)x = 0
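A quick numerical sanity check of these two boxed statements, using an arbitrary 3×3 matrix chosen only for illustration:

```python
# For each eigenvalue λ returned by numpy, det(A - λI) ≈ 0 and the
# corresponding eigenvector x is a nonzero solution of (A - λI)x = 0.
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)    # eigenvectors come back as columns

for lam, x in zip(eigenvalues, eigenvectors.T):
    print(np.linalg.det(A - lam * np.eye(3)))   # ≈ 0  (A - λI is noninvertible)
    print((A - lam * np.eye(3)) @ x)            # ≈ zero vector, with x nonzero
```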

 

Let A be an n×n matrix and let λ be an eigenvalue of A. The collection of all eigenvectors corresponding to λ, together with the zero vector, is called the eigenspace of λ and is denoted by Eλ.

 

- The eigenspace corresponding to an eigenvalue λ can be represented as the span of a basis; the basis must span the collection of all eigenvectors corresponding to λ together with the zero vector, i.e., the null space of A − λI.

Example) Find the eigenvalues and corresponding eigenvectors of a 2×2 matrix A (whose characteristic polynomial works out to λ² − 4λ − 5).

Sol) det(A − λI) = 0  →  λ² − 4λ − 5 = (λ − 5)(λ + 1) = 0  →  λ = 5 or −1. So the two eigenvalues are 5 and −1.

1) To find the eigenvectors corresponding to the eigenvalue 5, solve (A − 5I)x = 0. (Note: in this homogeneous form, if the matrix A − λI has independent columns there is only the trivial solution; if its columns are dependent, there are non-trivial solutions.)

2) The eigenvectors corresponding to the eigenvalue −1 can be found in the same way (the sympy sketch below works through the same steps).
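Below is a sympy sketch of the same steps. The matrix A of the original example is not reproduced here, so the sketch assumes, purely for illustration, the 2×2 matrix [[1, 2], [4, 3]], which happens to have the same characteristic polynomial λ² − 4λ − 5.

```python
# Reproducing the Sol) steps symbolically with sympy, assuming the
# illustrative matrix A = [[1, 2], [4, 3]] (char. poly. λ² − 4λ − 5).
import sympy as sp

lam = sp.symbols('lam')
A = sp.Matrix([[1, 2],
               [4, 3]])
I = sp.eye(2)

char_poly = sp.factor(sp.det(A - lam * I))
print(char_poly)                       # (lam - 5)*(lam + 1)  ->  λ = 5 or −1

# 1) eigenvectors for λ = 5: nonzero vectors in Null(A − 5I)
print((A - 5 * I).nullspace())         # basis: [Matrix([[1/2], [1]])]

# 2) eigenvectors for λ = −1: nonzero vectors in Null(A + I)
print((A + I).nullspace())             # basis: [Matrix([[-1], [1]])]
```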

 

■ Theorems on eigenvalues (eigenvectors)

The eigenvalues of a (square) triangular/diagonal matrix are the entries on its main diagonal
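A quick numpy check with an arbitrary upper-triangular matrix:

```python
# The eigenvalues of a triangular matrix are its main-diagonal entries.
import numpy as np

T = np.array([[2.0, 7.0, 1.0],
              [0.0, 5.0, 3.0],
              [0.0, 0.0, -4.0]])

print(np.linalg.eigvals(T))   # 2, 5, -4 (possibly in a different order)
print(np.diag(T))             # [ 2.  5. -4.]
```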

 

A square matrix A is invertible if and only if 0 is not an eigenvalue of A (equivalently, when det A ≠ 0)
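For example (with a made-up rank-deficient matrix):

```python
# A noninvertible (det = 0) matrix has 0 among its eigenvalues.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])     # second row is 2x the first, so det(A) = 0

print(np.linalg.det(A))        # 0.0
print(np.linalg.eigvals(A))    # 0 and 5 (up to ordering and rounding)
```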

 

Let A be a square matrix with eigenvalue λ and corresponding eigenvector x.

a) For any positive integer n, λⁿ is an eigenvalue of Aⁿ with corresponding eigenvector x.

b) If A is invertible, then 1/λ is an eigenvalue of A⁻¹ with corresponding eigenvector x.

c) If A is invertible, then for any integer n, λⁿ is an eigenvalue of Aⁿ with corresponding eigenvector x (a numerical sketch of a)–c) follows below).
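A numerical sketch of a)–c), using a made-up invertible 2×2 matrix with a known eigenpair (none of these numbers come from the post):

```python
# Powers and the inverse of A keep the eigenvector x, with eigenvalues
# λ^n, 1/λ, and λ^n (for negative integer n) respectively.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0
x = np.array([1.0, 1.0])                                 # eigenpair (λ, x) of A

print(np.linalg.matrix_power(A, 3) @ x, lam**3 * x)      # a) both [27. 27.]
print(np.linalg.inv(A) @ x, (1 / lam) * x)               # b) both ≈ [0.333 0.333]
print(np.linalg.matrix_power(A, -2) @ x, lam**(-2) * x)  # c) both ≈ [0.111 0.111]
```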

 

Let A be a square matrix and let λ₁, λ₂, ..., λₘ be distinct eigenvalues of A with corresponding eigenvectors v₁, v₂, ..., vₘ. Then v₁, v₂, ..., vₘ are linearly independent.

 

For any square matrix A, A and Aᵀ have the same characteristic polynomial (∵ det(M) = det(Mᵀ) for any square matrix M, so det(A − λI) = det((A − λI)ᵀ); and since (A + B)ᵀ = Aᵀ + Bᵀ and Iᵀ = I, we have (A − λI)ᵀ = Aᵀ − λI, thus det(A − λI) = det(Aᵀ − λI)) and hence the same eigenvalues (but generally not the same corresponding eigenvectors).
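A small numpy check with an arbitrary matrix:

```python
# A and A^T share eigenvalues, but generally not eigenvectors.
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

print(np.sort(np.linalg.eigvals(A)))     # [1. 3.]
print(np.sort(np.linalg.eigvals(A.T)))   # [1. 3.]  -- same eigenvalues

_, VA = np.linalg.eig(A)
_, VT = np.linalg.eig(A.T)
print(VA)                                # eigenvectors of A ...
print(VT)                                # ... are not the same as those of A^T
```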

 

■ Diagonalization 

Question: For a square matrix A, is there an invertible matrix P such that P⁻¹AP is diagonal?

Let A and B be n×n matrices. Then A is similar to B (denoted by A ~ B) if there is an invertible matrix P such that

P⁻¹AP = B.  Or equivalently, A = PBP⁻¹, or AP = PB.

If A is similar to B, then B is similar to A, and we say A and B are similar. Transforming A into P⁻¹AP is called a similarity transformation.

 

If A, B, C are n×n matrices,

a. A ~ A

b. If A ~ B, then B ~ A

c. If A ~ B and B ~ C, then A ~ C

 

If A and B are similar n×n matrices, then

a. A and B have the same eigenvalues

b. A is invertible if and only if B is invertible

c. det A = det B

d. A and B have the same rank (a small numpy check of a–d follows below)
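A small numpy sketch of a–d: build B = P⁻¹AP from an arbitrary A and an arbitrary invertible P (both made up here), then compare the listed quantities.

```python
# Similar matrices share eigenvalues, invertibility, determinant, and rank.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])                    # invertible (det = 1)
B = np.linalg.inv(P) @ A @ P                  # B ~ A by construction

print(np.sort(np.linalg.eigvals(A)), np.sort(np.linalg.eigvals(B)))   # a) same
print(np.linalg.det(A), np.linalg.det(B))                             # b), c) both ≈ 3, nonzero
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))             # d) both 2
```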

 

Definition

An n×n matrix A is diagonalizable if A is similar to a diagonal matrix; that is, A is diagonalizable if there exists an invertible matrix P such that P⁻¹AP is a diagonal matrix. Or equivalently, if A = PDP⁻¹ for some invertible matrix P and some diagonal matrix D.

 

Conditions for diagonalization

The Diagonalization Theorem

An n×n matrix A is diagonalizable if and only if it has n linearly independent eigenvectors.

(Corollary: A is diagonalizable as A = PDP⁻¹, with D a diagonal matrix, iff there is an invertible matrix P whose columns are n linearly independent eigenvectors of A, and the diagonal entries of D are the eigenvalues of A corresponding to the columns of P (the eigenvectors of A), in the same order.)

Example) Is a given 3×3 matrix A diagonalizable? If so, what does the diagonalization look like? That is, find an invertible matrix P and a diagonal matrix D such that A = PDP⁻¹.

When verifying that PDP⁻¹ = A, as long as P is known to be invertible, it suffices to check AP = PD without computing P⁻¹.

Note that for an n×n matrix with n distinct eigenvalues λ₁, …, λₙ and corresponding eigenvectors v₁, v₂, …, vₙ, the vectors v₁, v₂, …, vₙ are linearly independent.

Thus, an n×n matrix with n distinct eigenvalues is diagonalizable.

However, an n×n matrix does not need n distinct eigenvalues to be diagonalizable. As the example above shows, a 3×3 matrix with only two distinct eigenvalues, 1 and −2, can still be diagonalized.
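Here is a numpy sketch of the example and of the AP = PD check described above. The 3×3 matrix used below, A = [[1, 3, 3], [−3, −5, −3], [3, 3, 1]], is only an assumed stand-in that has exactly the two distinct eigenvalues 1 and −2; the matrix in the original example may differ.

```python
# Diagonalizing a 3x3 matrix with only two distinct eigenvalues (1 and -2)
# and checking AP = PD without forming P^-1.
import numpy as np

A = np.array([[ 1.0,  3.0,  3.0],
              [-3.0, -5.0, -3.0],
              [ 3.0,  3.0,  1.0]])

eigenvalues, P = np.linalg.eig(A)          # columns of P are eigenvectors of A
D = np.diag(eigenvalues)

print(np.round(eigenvalues, 6))            # 1 and -2 (with -2 repeated)
print(np.linalg.matrix_rank(P))            # 3 -> columns independent, P invertible

print(np.allclose(A @ P, P @ D))           # True: AP = PD
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True: A = P D P^-1
```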

 
