Matrix Diagonalization Calculator

Diagonalize square matrices by finding eigenvalues and eigenvectors. Compute similarity transformations P⁻¹AP = D, analyze Jordan canonical form, and visualize eigenspaces. Get complete step-by-step solutions with geometric interpretations.

Matrix Diagonalization Theory

What is Matrix Diagonalization?

Matrix diagonalization is the process of finding a diagonal matrix D that is similar to a given square matrix A. A matrix A is diagonalizable if there exists an invertible matrix P such that P⁻¹AP = D, where D is diagonal. The columns of P are eigenvectors of A, and the diagonal entries of D are the corresponding eigenvalues.

Diagonalization Process

Step 1: Find Characteristic Polynomial

Compute det(A - λI) to get the characteristic polynomial

Step 2: Solve for Eigenvalues

Find roots of characteristic polynomial: det(A - λI) = 0

Step 3: Find Eigenvectors

For each λᵢ, solve (A - λᵢI)v = 0 to find eigenvectors

Step 4: Check Linear Independence

Verify that the eigenvectors are linearly independent; A is diagonalizable only if there are n of them

Step 5: Form Matrices P and D

P = [v₁ v₂ ... vₙ], D = diag(λ₁, λ₂, ..., λₙ)

Step 6: Verify Diagonalization

Check that AP = PD or A = PDP⁻¹
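
The six steps above can be sketched in a few lines of NumPy (an illustrative sketch, not the calculator's internal code — NumPy is assumed here for convenience):

```python
# Sketch of the diagonalization process for a sample 2x2 matrix
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Steps 1-3: eigenvalues (roots of det(A - lambda*I) = 0) and
# eigenvectors (returned as the columns of P)
eigenvalues, P = np.linalg.eig(A)

# Step 4: the eigenvectors are independent iff P has full rank
assert np.linalg.matrix_rank(P) == A.shape[0], "A is not diagonalizable"

# Step 5: D = diag(lambda_1, ..., lambda_n)
D = np.diag(eigenvalues)

# Step 6: verify A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```

Note that `np.linalg.eig` does not guarantee any particular ordering of the eigenvalues, so sort them before comparing against hand-computed results.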

Types of Matrices

Diagonalizable Matrix

Has n linearly independent eigenvectors

Can be written as A = PDP⁻¹

Powers: Aᵏ = PDᵏP⁻¹
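
The power formula Aᵏ = PDᵏP⁻¹ is one of the main practical payoffs: powering a diagonal matrix just means powering each diagonal entry. A minimal NumPy sketch (assuming a well-conditioned P):

```python
# Computing A^k via diagonalization: A^k = P D^k P^{-1}
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
w, P = np.linalg.eig(A)

k = 10
Dk = np.diag(w ** k)               # power each eigenvalue individually
Ak = P @ Dk @ np.linalg.inv(P)     # reassemble A^k

# agrees with repeated matrix multiplication
assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```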

Non-Diagonalizable Matrix

Has fewer than n linearly independent eigenvectors (defective)

Requires Jordan canonical form

Has Jordan blocks for repeated eigenvalues

Symmetric Matrix

Always diagonalizable by an orthogonal matrix (spectral theorem)

All eigenvalues are real

Eigenvectors can be chosen orthonormal

Normal Matrix

AA* = A*A (commutes with conjugate transpose)

Always diagonalizable

Includes symmetric, skew-symmetric, unitary matrices
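
A rotation matrix is a handy example of a normal matrix that is not symmetric: it commutes with its transpose and is diagonalizable over the complex numbers. A quick NumPy check (illustrative; the angle π/3 is an arbitrary choice):

```python
# A plane rotation is normal (A A^T = A^T A) but not symmetric
import numpy as np

theta = np.pi / 3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(A @ A.T, A.T @ A)   # normality: commutes with transpose

w, P = np.linalg.eig(A)                # complex eigenvalues cos(theta) +/- i sin(theta)
assert np.allclose(np.abs(w), 1.0)     # they lie on the unit circle
assert np.allclose(A, P @ np.diag(w) @ np.linalg.inv(P))
```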

Applications

Linear Systems

Solving differential equations, system dynamics, stability analysis

Principal Component Analysis

Data reduction, feature extraction, dimensionality reduction

Quantum Mechanics

Observable operators, state evolution, measurement theory

Vibration Analysis

Modal analysis, natural frequencies, mode shapes

Graph Theory

Spectral graph theory, network analysis, clustering

Machine Learning

Spectral clustering, kernel methods, manifold learning

Worked Examples

Example 1: Simple 2×2 Diagonalization

Problem:

Diagonalize the matrix:

A = [3 1]
    [0 2]

Solution:

Step 1: Characteristic polynomial: det(A - λI) = (3-λ)(2-λ) = 0

Step 2: Eigenvalues: λ₁ = 3, λ₂ = 2

Step 3: Eigenvectors: v₁ = [1; 0], v₂ = [1; -1]

Step 4: P = [1 1; 0 -1], D = [3 0; 0 2]

Result: A = PDP⁻¹ with P⁻¹ = [1 1; 0 -1]
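
The result of Example 1 can be checked numerically (a verification sketch using NumPy):

```python
# Verify Example 1: A = [3 1; 0 2], P = [1 1; 0 -1], D = diag(3, 2)
import numpy as np

A = np.array([[3, 1], [0, 2]], dtype=float)
P = np.array([[1, 1], [0, -1]], dtype=float)
D = np.diag([3.0, 2.0])

P_inv = np.linalg.inv(P)
assert np.allclose(P_inv, P)          # here P happens to be its own inverse
assert np.allclose(A, P @ D @ P_inv)  # A = P D P^{-1}
```

The coincidence P⁻¹ = P holds because P² = I for this particular choice of eigenvectors; it is not a general feature of diagonalization.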

Example 2: Symmetric Matrix

Problem:

Diagonalize the symmetric matrix:

A = [2 1]
    [1 2]

Solution:

Step 1: Characteristic polynomial: det(A - λI) = λ² - 4λ + 3 = 0

Step 2: Eigenvalues: λ₁ = 3, λ₂ = 1

Step 3: Eigenvectors: v₁ = [1; 1]/√2, v₂ = [1; -1]/√2

Step 4: P = [1/√2 1/√2; 1/√2 -1/√2] (orthogonal)

Result: A = PDP^T with D = [3 0; 0 1]
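
Example 2 can likewise be verified, including the orthogonality of P that lets us write P⁻¹ = Pᵀ (sketch using NumPy):

```python
# Verify Example 2: symmetric A diagonalized by an orthogonal P
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
s = 1.0 / np.sqrt(2.0)
P = np.array([[s, s], [s, -s]])       # normalized eigenvectors as columns
D = np.diag([3.0, 1.0])

assert np.allclose(P @ P.T, np.eye(2))  # P is orthogonal: P^{-1} = P^T
assert np.allclose(A, P @ D @ P.T)      # A = P D P^T
```

For symmetric (or Hermitian) matrices, `np.linalg.eigh` is the preferred routine: it exploits symmetry and returns real eigenvalues in ascending order.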

Example 3: Non-Diagonalizable Matrix

Problem:

Analyze the matrix:

A = [2 1]
    [0 2]

Solution:

Step 1: Characteristic polynomial: det(A - λI) = (2-λ)² = 0

Step 2: Eigenvalue: λ = 2 (algebraic multiplicity 2)

Step 3: Eigenspace: dim(E₂) = 1 (geometric multiplicity 1)

Step 4: Geometric multiplicity (1) is less than algebraic multiplicity (2), so there are too few independent eigenvectors

Result: Matrix is NOT diagonalizable, requires Jordan form
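
The deficiency in Example 3 can be confirmed by computing the geometric multiplicity as the dimension of the null space of A − λI (a NumPy sketch):

```python
# Check Example 3: geometric multiplicity of lambda = 2 for A = [2 1; 0 2]
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 2.0]])
lam = 2.0

# geometric multiplicity = dim null(A - lam*I) = n - rank(A - lam*I)
n = A.shape[0]
geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))

assert geo_mult == 1   # one eigenvector, but algebraic multiplicity is 2:
                       # A is not diagonalizable
```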

Frequently Asked Questions

When is a matrix diagonalizable?

A matrix is diagonalizable if and only if it has n linearly independent eigenvectors, where n is the size of the matrix. This occurs when the geometric multiplicity equals the algebraic multiplicity for each eigenvalue.

What are eigenvalues and eigenvectors?

Eigenvalues are scalars λ that satisfy Av = λv for some non-zero vector v. Eigenvectors are the non-zero vectors v that satisfy this equation. Each eigenvalue has a corresponding eigenspace of eigenvectors.

What is the Jordan canonical form?

The Jordan canonical form is a block diagonal matrix that represents any square matrix, even when diagonalization is not possible. It consists of Jordan blocks, each corresponding to an eigenvalue, with 1's on the superdiagonal within blocks for defective eigenvalues.

Why are symmetric matrices special?

Symmetric matrices always have real eigenvalues and admit an orthonormal basis of eigenvectors. The spectral theorem guarantees that any symmetric matrix can be diagonalized by an orthogonal matrix.

How do I find the characteristic polynomial?

The characteristic polynomial is found by computing det(A - λI), where A is your matrix, λ is the variable, and I is the identity matrix. The roots of this polynomial are the eigenvalues of the matrix.
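
Numerically, the characteristic polynomial's coefficients can be obtained directly from the matrix, and its roots recovered as the eigenvalues (a NumPy sketch using the symmetric matrix from Example 2):

```python
# Characteristic polynomial coefficients, highest degree first
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
coeffs = np.poly(A)        # lambda^2 - 4*lambda + 3  ->  [1, -4, 3]
roots = np.roots(coeffs)   # the eigenvalues are the roots

assert np.allclose(coeffs, [1.0, -4.0, 3.0])
assert np.allclose(np.sort(roots), [1.0, 3.0])
```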

What is diagonalization used for?

Diagonalization is used in solving systems of differential equations, computing matrix powers efficiently, principal component analysis, quantum mechanics, vibration analysis, and many areas of engineering and physics.