Let \(A\) be an \(n \times n\) square matrix. A non-zero vector \(\mathbf{v}\) is called an eigenvector of \(A\) if there exists a scalar \(\lambda\) such that:

\[ A\mathbf{v} = \lambda\mathbf{v} \]

The scalar \(\lambda\) is called the eigenvalue corresponding to the eigenvector \(\mathbf{v}\). Geometrically, an eigenvector of a matrix \(A\) is a vector whose direction is unchanged (or exactly reversed) when \(A\) is applied to it; it is only scaled by the factor \(\lambda\).
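To make this concrete, take the \(2 \times 2\) matrix used in the code examples below; both of its eigenpairs can be verified by hand:

\[
A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}, \qquad
A \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3 \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad
A \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} -1 \\ 1 \end{pmatrix} = -\begin{pmatrix} 1 \\ -1 \end{pmatrix},
\]

so \((1, 1)^T\) is an eigenvector with eigenvalue \(\lambda = 3\), and \((1, -1)^T\) is an eigenvector with eigenvalue \(\lambda = -1\).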
NumPy provides the `numpy.linalg.eig` function to compute the eigenvalues and right eigenvectors of a square array. Here is a simple code example:
```python
import numpy as np

# Define a square matrix
A = np.array([[1, 2], [2, 1]])

# Compute eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

print("Eigenvalues:", eigenvalues)
print("Eigenvectors:", eigenvectors)
```
In this code, we first define a \(2 \times 2\) square matrix `A`. Then we use the `np.linalg.eig` function to compute the eigenvalues and eigenvectors of `A`. The function returns two arrays: the first contains the eigenvalues, and the second contains the corresponding eigenvectors, with each column of the `eigenvectors` array holding one eigenvector.
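As a quick sanity check, not part of the original listing, we can verify the defining relation \(A\mathbf{v} = \lambda\mathbf{v}\) for each returned column:

```python
import numpy as np

A = np.array([[1, 2], [2, 1]])
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column i of `eigenvectors` pairs with eigenvalues[i]
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, eigenvalues[i] * v)
    print(f"lambda = {eigenvalues[i]:+.1f}, v = {v}")
```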
We can visualize the eigenvectors of a matrix to gain a better understanding of their geometric properties. Here is an example of visualizing the eigenvectors of a \(2 \times 2\) matrix:
```python
import numpy as np
import matplotlib.pyplot as plt

# Define a square matrix
A = np.array([[1, 2], [2, 1]])

# Compute eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Plot the eigenvectors as arrows from the origin
plt.figure(figsize=(8, 6))
plt.quiver(0, 0, eigenvectors[0, 0], eigenvectors[1, 0], angles='xy',
           scale_units='xy', scale=1, color='r', label='Eigenvector 1')
plt.quiver(0, 0, eigenvectors[0, 1], eigenvectors[1, 1], angles='xy',
           scale_units='xy', scale=1, color='b', label='Eigenvector 2')
plt.xlim(-3, 3)
plt.ylim(-3, 3)
plt.xlabel('x')
plt.ylabel('y')
plt.title('Eigenvectors of Matrix A')
plt.legend()
plt.grid(True)
plt.show()
```
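To make the "only scaled by \(\lambda\)" property visible, the following variation, a sketch rather than part of the original example, also draws the transformed vectors \(A\mathbf{v}\). Each pale arrow lies on the same line as its eigenvector, stretched by \(\lambda = 3\) or flipped by \(\lambda = -1\):

```python
import numpy as np
import matplotlib.pyplot as plt

A = np.array([[1, 2], [2, 1]])
eigenvalues, eigenvectors = np.linalg.eig(A)

plt.figure(figsize=(8, 6))
for i, color in enumerate(['r', 'b']):
    v = eigenvectors[:, i]
    Av = A @ v  # equals eigenvalues[i] * v
    plt.quiver(0, 0, Av[0], Av[1], angles='xy', scale_units='xy', scale=1,
               color=color, alpha=0.3, label=f'A @ v{i + 1}')
    plt.quiver(0, 0, v[0], v[1], angles='xy', scale_units='xy', scale=1,
               color=color, label=f'Eigenvector {i + 1}')
plt.xlim(-3, 3)
plt.ylim(-3, 3)
plt.grid(True)
plt.legend()
plt.show()
```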
Eigenvectors can be used for dimensionality reduction techniques such as Principal Component Analysis (PCA). Here is a simple example of using eigenvectors for PCA:
```python
import numpy as np

# Generate some sample data
X = np.array([[1, 2], [2, 3], [3, 4], [4, 5]])

# Center the data; PCA projects mean-centered data
X_centered = X - X.mean(axis=0)

# Compute the covariance matrix (np.cov expects features as rows)
cov_matrix = np.cov(X.T)

# Compute eigenvalues and eigenvectors of the covariance matrix
eigenvalues, eigenvectors = np.linalg.eig(cov_matrix)

# Sort the eigenvalues and eigenvectors in descending order
idx = eigenvalues.argsort()[::-1]
eigenvalues = eigenvalues[idx]
eigenvectors = eigenvectors[:, idx]

# Project the centered data onto the top k eigenvectors
k = 1
reduced_data = X_centered @ eigenvectors[:, :k]
print("Reduced data:", reduced_data)
```
Before using `np.linalg.eig`, make sure that the input matrix is square. If the matrix is not square, a `LinAlgError` will be raised:
```python
import numpy as np

# Non-square matrix
A = np.array([[1, 2, 3], [4, 5, 6]])

try:
    eigenvalues, eigenvectors = np.linalg.eig(A)
except np.linalg.LinAlgError as e:
    print("Error:", e)
```
For large or ill-conditioned matrices, numerical errors may occur. In such cases, it is recommended to use more specialized numerical routines or to precondition the matrix.
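One concrete option, sketched below for the symmetric matrices used throughout this post, is `np.linalg.eigh`, which is specialized for symmetric (or Hermitian) matrices and is generally more accurate and faster on them than the general-purpose `np.linalg.eig`:

```python
import numpy as np

# Symmetric matrix, like the earlier examples
A = np.array([[1, 2], [2, 1]])

# eigh exploits symmetry: eigenvalues come back real and in ascending order
eigenvalues, eigenvectors = np.linalg.eigh(A)

print("Eigenvalues:", eigenvalues)
print("Eigenvectors:\n", eigenvectors)
```

For very large sparse matrices, routines such as `scipy.sparse.linalg.eigs`, which compute only a few eigenpairs, are another common choice.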
In this blog post, we have explored the fundamental concepts of eigenvectors and eigenvalues and learned how to use NumPy to compute them. We have also looked at some common practices such as visualizing eigenvectors and using them for dimensionality reduction. Additionally, we have discussed some best practices for using NumPy’s eigenvector computation functions. By understanding these concepts and techniques, you can effectively use eigenvectors in various applications.