A Lagrange Multiplier Approach to PCA
Learn how Principal Component Analysis uses Lagrange multipliers to maximize data variance.

Principal Component Analysis (PCA) aims to find the directions (principal components) that maximize the variance of the projected data while ensuring that these directions are orthogonal and of unit length. This optimization problem can be elegantly solved using the method of Lagrange multipliers.
In this derivation, we'll:
- Set up the optimization problem: Maximize the variance of the projected data.
- Introduce the Lagrangian: Incorporate the constraints using Lagrange multipliers.
- Derive the eigenvalue equation: Show that maximizing variance leads to solving an eigenvalue problem.
1. Setting Up the Optimization Problem
Objective: Find a vector $\mathbf{w}$ (principal component) that maximizes the variance of the projected data $X\mathbf{w}$.
Given:
- Data matrix $X$ of size $n \times p$ (centered so that each variable has zero mean).
PCA is sensitive to the scale of the variables. Standardize the dataset so that each variable has a mean of zero and a standard deviation of one:
$$X_{\text{std}} = \frac{X - \mu}{\sigma}$$
where:
- $X_{\text{std}}$ is the standardized data matrix.
- $\mu$ is the mean of each variable.
- $\sigma$ is the standard deviation of each variable.
- Covariance matrix $C$. Covariance measures how much two variables change together. If two variables increase or decrease together, they have positive covariance; if one increases while the other decreases, they have negative covariance. By calculating the covariance matrix, you quantify the degree to which variables are linearly related to each other across the dataset. For zero-centered data, the covariance matrix can be computed as:
$$C = \frac{1}{n} X^\top X$$
Variance of Projected Data: projecting the data onto a vector $\mathbf{w}$ gives $X\mathbf{w}$, whose variance is
$$\mathrm{Var}(X\mathbf{w}) = \mathbf{w}^\top C \mathbf{w}$$
Constraint:
- The principal component should be a unit vector: $\mathbf{w}^\top \mathbf{w} = 1$
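The setup above can be sketched numerically. This is a minimal illustration (the variable names are ours): standardize a toy data matrix, form the covariance matrix, and compute the variance of a projection onto a unit vector.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))         # toy data matrix: n=100 samples, p=3 variables

# Standardize: zero mean, unit standard deviation per variable
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# Covariance matrix of the zero-centered data: C = (1/n) X^T X
n = X_std.shape[0]
C = X_std.T @ X_std / n

# Variance of the data projected onto a unit vector w is w^T C w
w = np.array([1.0, 0.0, 0.0])         # example unit vector, ||w|| = 1
projected_variance = w @ C @ w

# Sanity check: matches the empirical variance of the projected samples
assert np.isclose(projected_variance, np.var(X_std @ w))
```

Because each standardized column has unit variance, the diagonal of $C$ is all ones; the off-diagonal entries are the correlations between variables.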
2. Formulating the Lagrangian
In general, suppose we have an objective function $f(\mathbf{x})$ to maximize, subject to the following equality and inequality constraints:
- Inequality constraints: $g(\mathbf{x}) \le 0$
- Equality constraints: $h(\mathbf{x}) = 0$
The Lagrangian incorporating both constraints is defined as:
$$\mathcal{L}(\mathbf{x}, \mu, \lambda) = f(\mathbf{x}) - \mu\, g(\mathbf{x}) - \lambda\, h(\mathbf{x})$$
where:
- $\mu$ is the Lagrange multiplier for the inequality constraint $g(\mathbf{x}) \le 0$.
- $\lambda$ is the Lagrange multiplier for the equality constraint $h(\mathbf{x}) = 0$.
To maximize $\mathbf{w}^\top C \mathbf{w}$ subject to $\mathbf{w}^\top \mathbf{w} = 1$, we introduce a Lagrange multiplier $\lambda$ and construct the Lagrangian function:
$$\mathcal{L}(\mathbf{w}, \lambda) = \mathbf{w}^\top C \mathbf{w} - \lambda\left(\mathbf{w}^\top \mathbf{w} - 1\right)$$
- Objective Function: $\mathbf{w}^\top C \mathbf{w}$
- Constraint: $\mathbf{w}^\top \mathbf{w} = 1$
3. Deriving the Eigenvalue Equation
Compute the Gradient of the Lagrangian with respect to $\mathbf{w}$ and Set it to Zero:
Calculate the Partial Derivative:
$$\frac{\partial \mathcal{L}}{\partial \mathbf{w}} = \frac{\partial}{\partial \mathbf{w}}\left(\mathbf{w}^\top C \mathbf{w}\right) - \lambda\, \frac{\partial}{\partial \mathbf{w}}\left(\mathbf{w}^\top \mathbf{w} - 1\right)$$
Derivative of the Objective Function (using the symmetry of $C$):
$$\frac{\partial}{\partial \mathbf{w}}\left(\mathbf{w}^\top C \mathbf{w}\right) = 2C\mathbf{w}$$
Derivative of the Constraint Term:
$$\frac{\partial}{\partial \mathbf{w}}\left(\mathbf{w}^\top \mathbf{w} - 1\right) = 2\mathbf{w}$$
Set the Gradient to Zero:
$$2C\mathbf{w} - 2\lambda\mathbf{w} = 0$$
Simplify by dividing both sides by 2:
$$C\mathbf{w} = \lambda\mathbf{w}$$
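The analytic gradient $2C\mathbf{w} - 2\lambda\mathbf{w}$ can be sanity-checked with central finite differences. This is just a sketch on an arbitrary symmetric matrix standing in for the covariance matrix, with an arbitrary value of the multiplier:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3))
C = (A + A.T) / 2                      # arbitrary symmetric matrix standing in for C
w = rng.normal(size=3)
lam = 0.7                              # arbitrary fixed multiplier value

def lagrangian(w):
    # L(w, lambda) = w^T C w - lambda (w^T w - 1)
    return w @ C @ w - lam * (w @ w - 1)

# Analytic gradient: 2 C w - 2 lambda w
grad_analytic = 2 * C @ w - 2 * lam * w

# Central finite differences along each coordinate direction
eps = 1e-6
grad_fd = np.array([
    (lagrangian(w + eps * e) - lagrangian(w - eps * e)) / (2 * eps)
    for e in np.eye(3)
])
assert np.allclose(grad_analytic, grad_fd, atol=1e-6)
```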
Interpretation:
- This is the eigenvalue equation.
- $\mathbf{w}$ is an eigenvector of the covariance matrix $C$.
- $\lambda$ is the corresponding eigenvalue.
- Left-multiplying by $\mathbf{w}^\top$ and using $\mathbf{w}^\top \mathbf{w} = 1$ gives $\mathbf{w}^\top C \mathbf{w} = \lambda$: the variance along $\mathbf{w}$ equals its eigenvalue.
By decomposing the covariance matrix, PCA transforms the correlated variables into a new set of uncorrelated variables (principal components) ordered by the amount of variance they capture.
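A quick numerical check on toy data confirms that the eigenvectors of the covariance matrix satisfy $C\mathbf{w} = \lambda\mathbf{w}$, are unit length, and capture variance equal to their eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 3))   # correlated toy data
X = X - X.mean(axis=0)                                    # center each variable
C = X.T @ X / X.shape[0]

# eigh is the right routine for symmetric matrices; eigenvalues come back ascending
eigvals, eigvecs = np.linalg.eigh(C)

for lam, w in zip(eigvals, eigvecs.T):
    assert np.allclose(C @ w, lam * w)       # eigenvalue equation C w = lambda w
    assert np.isclose(w @ w, 1.0)            # unit-length constraint w^T w = 1
    assert np.isclose(w @ C @ w, lam)        # variance along w equals lambda
```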
4. Finding Principal Components
PCA seeks directions (principal components) along which the variance of the data is maximized. The covariance matrix's eigenvalues and eigenvectors reveal these directions.
Eigenvalues and Eigenvectors:
- Eigenvalues ($\lambda_i$): Represent the amount of variance captured by each principal component.
- Eigenvectors ($\mathbf{w}_i$): Directions in feature space along which variance is maximized.
Ordering Principal Components:
- Sort eigenvalues in descending order: $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p$
- The corresponding eigenvectors are the principal components.
By choosing the $k$ eigenvectors corresponding to the $k$ largest eigenvalues, you reduce the dimensionality from $p$ to $k$ while preserving as much variance as possible.
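Putting the steps together, here is a minimal PCA sketch that sorts the eigenpairs by descending eigenvalue and projects onto the top $k$ components. This is illustrative only; a production implementation would typically use an SVD or a library such as scikit-learn, and the function name `pca` is our own:

```python
import numpy as np

def pca(X, k):
    """Project data X onto its top-k principal components."""
    X = X - X.mean(axis=0)                     # center each variable
    C = X.T @ X / X.shape[0]                   # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)       # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]          # descending: lambda_1 >= lambda_2 >= ...
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    components = eigvecs[:, :k]                # top-k eigenvectors as columns
    explained = eigvals[:k] / eigvals.sum()    # fraction of variance each one captures
    return X @ components, components, explained

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))   # correlated toy data
Z, W, explained = pca(X, k=2)
print(Z.shape)          # (500, 2)
print(explained.sum())  # total fraction of variance kept by 2 components
```

The projected data `Z` is the low-dimensional representation; `explained` quantifies how much information (variance) survives the reduction.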