NumPy - Eigenvalues



What are Eigenvalues?

Eigenvalues are scalars associated with a square matrix that describe how the matrix stretches or shrinks certain directions, and they provide important information about the matrix's properties.

In the context of linear algebra, if A is a square matrix, an eigenvalue is a scalar λ such that there exists a non-zero vector v (called an eigenvector) satisfying the equation −

Av = λv

This means that when the matrix A multiplies the vector v, the result is the same as multiplying the vector v by the scalar λ.

Computing Eigenvalues in NumPy

NumPy provides the numpy.linalg.eig() function to compute the eigenvalues and eigenvectors of a square matrix. Let us see how this function works with an example.

Example

In this example, the eigenvalues of the matrix A are 3 and 2. The corresponding eigenvectors are shown in the output −

import numpy as np

# Define a 2x2 matrix
A = np.array([[4, -2], 
              [1,  1]])

# Compute the eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

print("Eigenvalues:", eigenvalues)
print("Eigenvectors:\n", eigenvectors)

The numpy.linalg.eig() function returns two arrays: one for the eigenvalues and one for the eigenvectors.

The eigenvalues array contains the eigenvalues of the matrix, and each column of the eigenvectors array represents an eigenvector corresponding to the respective eigenvalue −

Eigenvalues: [3. 2.]
Eigenvectors:
 [[0.89442719 0.70710678]
 [0.4472136  0.70710678]]
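
To double-check the result, each returned pair can be tested against the defining equation Av = λv. The following sketch recomputes the decomposition of A and verifies every column of the eigenvector array −

import numpy as np

A = np.array([[4, -2],
              [1,  1]])
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check A v = lambda v for every eigenpair (columns of the eigenvector array)
for i in range(len(eigenvalues)):
   v = eigenvectors[:, i]
   print(np.allclose(A @ v, eigenvalues[i] * v))

Both checks print True, confirming that the returned eigenpairs satisfy the eigenvalue equation.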

Properties of Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors have several important properties. They are −

  • Linear Independence: Eigenvectors corresponding to different eigenvalues are linearly independent.
  • Determinant Relation: The product of the eigenvalues of a matrix is equal to its determinant.
  • Trace Relation: The sum of the eigenvalues of a matrix is equal to its trace (the sum of its diagonal elements). Both the determinant and trace relations are checked numerically in the sketch after this list.
  • Similarity Transformation: If a matrix A is similar to a matrix B (i.e., B = P⁻¹AP for some invertible matrix P), then A and B have the same eigenvalues.
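
The determinant and trace relations are easy to confirm numerically. Here is a minimal check using the 2x2 matrix A from the earlier example −

import numpy as np

A = np.array([[4, -2],
              [1,  1]])

# eigvals() computes the eigenvalues without the eigenvectors
eigenvalues = np.linalg.eigvals(A)

# Product of eigenvalues equals the determinant: 3 * 2 = 6
print(np.isclose(np.prod(eigenvalues), np.linalg.det(A)))

# Sum of eigenvalues equals the trace: 3 + 2 = 5
print(np.isclose(np.sum(eigenvalues), np.trace(A)))

Both statements print True.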

Applications of Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors have numerous applications, such as −

  • Principal Component Analysis (PCA): Used in data analysis and machine learning for dimensionality reduction (see the sketch after this list).
  • Stability Analysis: Used in control theory to analyze the stability of systems.
  • Quantum Mechanics: Used to solve the Schrödinger equation and find the energy levels of a system.
  • Vibration Analysis: Used in engineering to analyze the natural frequencies of structures.
  • Graph Theory: Used to analyze the properties of graphs and networks.
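
As an illustration of the PCA application mentioned in the list, here is a minimal sketch that builds a covariance matrix from randomly generated data (the data set and seed are hypothetical, chosen only for this example) and takes the eigenvector with the largest eigenvalue as the first principal direction −

import numpy as np

# Hypothetical 2D data set, generated only for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0],
                                          [1.0, 2.0]])

# Covariance matrix of the centered data
cov = np.cov(X - X.mean(axis=0), rowvar=False)

# The covariance matrix is symmetric, so eigh() is a natural choice
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# The eigenvector with the largest eigenvalue is the first principal direction
principal_direction = eigenvectors[:, np.argmax(eigenvalues)]
print("Variance along each direction:", eigenvalues)
print("First principal direction:", principal_direction)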

Example: Eigenvalues of a 3x3 Matrix

In the following example, we are computing the eigenvalues and eigenvectors of a 3x3 matrix using NumPy −

import numpy as np

# Define a 3x3 matrix
B = np.array([[1, 2, 3],
              [0, 1, 4],
              [5, 6, 0]])

# Compute the eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(B)

print("Eigenvalues:", eigenvalues)
print("Eigenvectors:\n", eigenvectors)

This will produce the following result −

Eigenvalues: [-5.2296696  -0.02635282  7.25602242]
Eigenvectors:
[[ 0.22578016 -0.75769839 -0.49927017]
 [ 0.52634845  0.63212771 -0.46674201]
 [-0.81974424 -0.16219652 -0.72998712]]
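
When only the eigenvalues are required, NumPy also provides numpy.linalg.eigvals(), which skips the eigenvector computation −

import numpy as np

B = np.array([[1, 2, 3],
              [0, 1, 4],
              [5, 6, 0]])

# eigvals() returns only the eigenvalues of B
print(np.linalg.eigvals(B))

This prints the same three eigenvalues as shown above.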

Symmetric Matrices and Real Eigenvalues

A symmetric matrix is a matrix that is equal to its transpose (i.e., A = Aᵀ). Symmetric matrices have some special properties regarding their eigenvalues −

  • Real Eigenvalues: The eigenvalues of a symmetric matrix are always real numbers.
  • Orthogonal Eigenvectors: The eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal.

Example

Let us compute the eigenvalues of a symmetric matrix −

import numpy as np

# Define a symmetric matrix
C = np.array([[4, 1, 1],
              [1, 4, 1],
              [1, 1, 4]])

# Compute the eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(C)

print("Eigenvalues:", eigenvalues)
print("Eigenvectors:\n", eigenvectors)

Following is the output of the above code −

Eigenvalues: [6. 3. 3.]
Eigenvectors:
[[-0.57735027 -0.81649658 -0.15430335]
 [-0.57735027  0.40824829 -0.6172134 ]
 [-0.57735027  0.40824829  0.77151675]]
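
Because C is symmetric, the specialized routine numpy.linalg.eigh() can also be used. It returns the eigenvalues in ascending order and always produces an orthonormal set of eigenvectors, even for the repeated eigenvalue 3 −

import numpy as np

C = np.array([[4, 1, 1],
              [1, 4, 1],
              [1, 1, 4]])

# eigh() is intended for symmetric (Hermitian) matrices
eigenvalues, eigenvectors = np.linalg.eigh(C)
print("Eigenvalues (ascending):", eigenvalues)

# For a symmetric matrix the eigenvector matrix is orthogonal: Q^T Q = I
print(np.allclose(eigenvectors.T @ eigenvectors, np.eye(3)))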

Eigenvalues and Diagonalization

A square matrix A is said to be diagonalizable if it can be written as −

A = PDP⁻¹

where D is a diagonal matrix containing the eigenvalues of A, and P is a matrix whose columns are the corresponding eigenvectors of A.

Example

Let us see how to diagonalize a matrix using NumPy −

import numpy as np

# Define a matrix
D = np.array([[2, 0, 0],
              [1, 3, 0],
              [4, 5, 6]])

# Compute the eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(D)

# Diagonal matrix of eigenvalues
D_diag = np.diag(eigenvalues)

# Reconstruct the original matrix
reconstructed_D = eigenvectors @ D_diag @ np.linalg.inv(eigenvectors)

print("Original matrix:\n", D)
print("Reconstructed matrix:\n", reconstructed_D)

The original matrix is successfully reconstructed using its eigenvalues and eigenvectors, demonstrating the process of diagonalization −

Original matrix:
 [[2 0 0]
 [1 3 0]
 [4 5 6]]
Reconstructed matrix:
 [[2. 0. 0.]
 [1. 3. 0.]
 [4. 5. 6.]]
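
The match between the original and reconstructed matrices can also be confirmed programmatically with numpy.allclose(), which compares two arrays element-wise within a small tolerance −

import numpy as np

D = np.array([[2, 0, 0],
              [1, 3, 0],
              [4, 5, 6]])

eigenvalues, eigenvectors = np.linalg.eig(D)
reconstructed_D = eigenvectors @ np.diag(eigenvalues) @ np.linalg.inv(eigenvectors)

# allclose() confirms A = P D P^-1 holds up to floating-point error
print(np.allclose(D, reconstructed_D))

This prints True, since D has three distinct eigenvalues (2, 3, and 6) and is therefore diagonalizable.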