Regularization
We wish to choose $x$ to balance two competing objective values, $f_1(x)$ and $f_2(x)$.
Generally, we can make $f_1(x)$ or $f_2(x)$ small, but not both.
$b$ is a noisy observation of the true signal $x^\natural$: $b = x^\natural + \eta$
Naive least-squares formulation:
$$\min_x \; \|x - b\|_2^2$$
true signal "recovered" via
$$x^* = b$$
⇒ we reproduce the noise along with the signal
Need to incorporate "side" information
"side" ↔ "prior" ⇒ signal is smooth
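To see why the naive formulation fails, here is a minimal NumPy sketch (the sine signal, noise level, and sizes are illustrative assumptions, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
t = np.linspace(0, 1, n)
x_true = np.sin(2 * np.pi * t)                 # assumed smooth "true" signal
b = x_true + 0.3 * rng.standard_normal(n)      # noisy observation

# Naive least squares: minimize ||x - b||^2 over x.
# The minimizer is x = b, so the "recovered" signal is just the noisy data.
x_naive = b.copy()

noise_recovered = np.allclose(x_naive, b)
print(noise_recovered)  # True: the noise is reproduced exactly
```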
Combining two models
Balance competing objectives $f_1$ and $f_2$:
$$\min_x \; f_1(x) + \lambda f_2(x), \qquad \lambda \ge 0$$
$\lambda$: "regularization" parameter
$f_2$: regularization function
$\lambda = 0$: only $f_1$ matters,
$\lambda \to \infty$: only $f_2$ matters
A particularly common example of regularized least squares is Tikhonov regularization, which has the form
$$f_2(x) = \|Dx\|_2^2,$$
for which the objective can be expressed as
$$\min_x \; \|Ax - b\|_2^2 + \lambda \|Dx\|_2^2.$$
💡 Example.
Matrix notation:
$$\hat{A} = \begin{bmatrix} A \\ \sqrt{\lambda}\, D \end{bmatrix}, \qquad \hat{b} = \begin{bmatrix} b \\ 0 \end{bmatrix}$$
This allows for a reformulation of the regularized least-squares objective into a familiar least-squares objective:
$$\min_x \; \|\hat{A}x - \hat{b}\|_2^2 = \min_x \; \|Ax - b\|_2^2 + \lambda \|Dx\|_2^2$$
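A sketch of the stacked reformulation in NumPy, checking that the stacked residual splits back into the two original terms (the sizes, $\lambda$, and the choice $D = I$ are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 30, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 0.5
D = np.eye(n)  # simplest choice of regularization matrix

# Stack into a single least-squares problem: minimize ||A_hat x - b_hat||^2.
A_hat = np.vstack([A, np.sqrt(lam) * D])
b_hat = np.concatenate([b, np.zeros(n)])

x = np.linalg.lstsq(A_hat, b_hat, rcond=None)[0]

# The stacked objective equals the regularized objective at any x.
obj_stacked = np.linalg.norm(A_hat @ x - b_hat) ** 2
obj_split = np.linalg.norm(A @ x - b) ** 2 + lam * np.linalg.norm(D @ x) ** 2
print(np.isclose(obj_stacked, obj_split))  # True
```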
If $D$ has full column rank, then the stacked matrix
$$\hat{A} = \begin{bmatrix} A \\ \sqrt{\lambda}\, D \end{bmatrix}$$
necessarily also has full column rank for any positive $\lambda$, which implies that the regularized problem always has a well-defined unique solution.
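A quick numerical check of this claim: take a rank-deficient $A$ (so plain least squares has no unique solution) and stack it with $D = I$, which has full column rank (the sizes and $\lambda$ here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 20, 8
A = rng.standard_normal((m, n))
A[:, -1] = A[:, 0]       # duplicate a column: A is rank-deficient
D = np.eye(n)            # D = I has full column rank
lam = 1e-3               # any positive lambda works

A_hat = np.vstack([A, np.sqrt(lam) * D])
print(np.linalg.matrix_rank(A))      # n - 1: deficient
print(np.linalg.matrix_rank(A_hat))  # n: full column rank
```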
Choose $D$ that promotes "smoothness":
signal $x \in \mathbb{R}^n$
Measure smoothness by measuring the change between $x_i$ and $x_{i+1}$:
$$\|Dx\|_2^2 = \sum_{i=1}^{n-1} (x_{i+1} - x_i)^2, \qquad D = \begin{bmatrix} -1 & 1 & & \\ & \ddots & \ddots & \\ & & -1 & 1 \end{bmatrix} \in \mathbb{R}^{(n-1)\times n}$$
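The forward-difference matrix can be built explicitly; a sketch (the sample vector is an illustrative assumption) confirming that $\|Dx\|_2^2$ equals the sum of squared successive differences:

```python
import numpy as np

n = 6
# Forward-difference matrix: (D x)_i = x_{i+1} - x_i
D = np.zeros((n - 1, n))
for i in range(n - 1):
    D[i, i] = -1.0
    D[i, i + 1] = 1.0

x = np.array([0.0, 1.0, 1.5, 1.4, 2.0, 2.2])
smoothness = np.linalg.norm(D @ x) ** 2
expected = sum((x[i + 1] - x[i]) ** 2 for i in range(n - 1))
print(np.isclose(smoothness, expected))  # True
```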
So the solution to the regularized least-squares minimization program satisfies the normal equations $\hat{A}^T \hat{A}\, x = \hat{A}^T \hat{b}$, which simplify to
$$(A^T A + \lambda D^T D)\, x = A^T b.$$
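A sketch verifying that solving these normal equations agrees with the stacked least-squares solution (random data; the sizes and $\lambda$ are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 25, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 0.7
D = np.diff(np.eye(n), axis=0)   # (n-1) x n forward-difference matrix

# Solve the normal equations (A^T A + lam D^T D) x = A^T b directly.
x_normal = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ b)

# Solve the equivalent stacked least-squares problem.
A_hat = np.vstack([A, np.sqrt(lam) * D])
b_hat = np.concatenate([b, np.zeros(n - 1)])
x_stacked = np.linalg.lstsq(A_hat, b_hat, rcond=None)[0]

print(np.allclose(x_normal, x_stacked))  # True: same minimizer
```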