\[
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to } x \in C \subseteq \mathbb{R}^n,
\]
where $f : \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable and convex, and $C \subseteq \mathbb{R}^n$ is convex.
Unconstrained case ($C = \mathbb{R}^n$):
\[
x^* \in \operatorname*{argmin}_{x \in \mathbb{R}^n} f(x)
\;\Leftrightarrow\;
0 \le f'(x^*; x - x^*) = \nabla f(x^*)^T (x - x^*), \;\forall x \in \mathbb{R}^n
\;\Leftrightarrow\;
\nabla f(x^*) = 0.
\]
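A minimal numerical sketch of the unconstrained first-order condition, using a convex quadratic chosen purely for illustration (it is not from the notes): $f(x) = x_1^2 + x_2^2 - 2x_1 - 4x_2$, whose gradient is $(2x_1 - 2,\; 2x_2 - 4)$ and whose minimizer is $x^* = (1, 2)$.

```python
# First-order condition for an unconstrained convex problem:
# x* is a minimizer iff grad f(x*) = 0.
# Hypothetical example: f(x) = x1^2 + x2^2 - 2*x1 - 4*x2,
# convex, with gradient (2*x1 - 2, 2*x2 - 4) and minimizer x* = (1, 2).

def grad_f(x):
    x1, x2 = x
    return (2 * x1 - 2, 2 * x2 - 4)

x_star = (1.0, 2.0)
print(grad_f(x_star))      # → (0.0, 0.0): gradient vanishes at the minimizer
print(grad_f((0.0, 0.0)))  # → (-2.0, -4.0): nonzero at a non-minimizer
```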
Constrained case ($C \subsetneq \mathbb{R}^n$):
\[
x^* \in \operatorname*{argmin}_{x \in C} f(x)
\;\Leftrightarrow\;
0 \le f'(x^*; x - x^*) = \nabla f(x^*)^T (x - x^*), \;\forall x \in C,
\]
i.e., the directional derivative is non-negative in all feasible directions.
Equivalently,
\[
-\nabla f(x^*) \in N_C(x^*)
\;\Leftrightarrow\;
-\nabla f(x^*)^T (x - x^*) \le 0, \;\forall x \in C.
\]
Normal cone of a set $C \subseteq \mathbb{R}^n$ at a point $x \in C$:
\[
N_C(x) = \{ g \in \mathbb{R}^n \mid g^T (z - x) \le 0, \;\forall z \in C \}.
\]
Example. $C = \{ x \in \mathbb{R}^n \mid Ax = b \}$.
\[
N_C(x)
= \{ g \in \mathbb{R}^n \mid g^T (z - x) \le 0, \;\forall z \in C \}
= \{ g \in \mathbb{R}^n \mid g^T d \le 0, \;\forall d \in \operatorname{null}(A) \}
= \{ g \in \mathbb{R}^n \mid g^T d = 0, \;\forall d \in \operatorname{null}(A) \}
= \operatorname{null}(A)^\perp
= \operatorname{range}(A^T).
\]
The second equality holds because if $z \in C$ and $x \in C$, then $Ax = b$ and $Az = b$, so $A(z - x) = Az - Ax = b - b = 0$, i.e., $z - x \in \operatorname{null}(A)$. The third holds because $\operatorname{null}(A)$ is a subspace: $d \in \operatorname{null}(A)$ implies $-d \in \operatorname{null}(A)$, so $g^T d \le 0$ and $g^T(-d) \le 0$ together force $g^T d = 0$.
Recall the fundamental subspace decompositions:
\[
\operatorname{range}(A) \oplus \operatorname{null}(A^T) = \mathbb{R}^m,
\qquad
\operatorname{range}(A^T) \oplus \operatorname{null}(A) = \mathbb{R}^n.
\]
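The identity $N_C(x) = \operatorname{range}(A^T)$ can be sanity-checked on a toy instance chosen for illustration (not from the notes): $A = [1, 1]$, so $\operatorname{null}(A) = \operatorname{span}\{(1, -1)\}$ and $\operatorname{range}(A^T) = \operatorname{span}\{(1, 1)\}$.

```python
# Normal cone of C = {x : Ax = b} at any x in C equals range(A^T),
# i.e. null(A)^perp. Toy 1x2 instance (an assumption for illustration):
# A = [1, 1], so null(A) = span{(1, -1)} and range(A^T) = span{(1, 1)}.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

d = (1.0, -1.0)       # basis vector of null(A)

g_in = (3.0, 3.0)     # in range(A^T): g = A^T y with y = 3
g_out = (1.0, 0.0)    # not in range(A^T)

# Membership test: g^T d <= 0 must hold for both d and -d,
# which forces g^T d = 0 on all of null(A).
print(dot(g_in, d))   # → 0.0: g_in lies in the normal cone
print(dot(g_out, d))  # → 1.0: g_out violates g^T d <= 0
```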
Linearly Constrained Optimization
\[
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to } Ax = b \;(\Leftrightarrow x \in C).
\]
$x^*$ is a minimizer if and only if
\[
\begin{cases}
\nabla f(x^*) = A^T y & \text{for some } y \in \mathbb{R}^m, \\
Ax^* = b,
\end{cases}
\]
where $A$ is an $m \times n$ matrix.
Lagrange Multipliers via Normal Cones
This optimality condition follows directly from the normal-cone characterization:
\[
-\nabla f(x^*) \in N_C(x^*),\; x^* \in C
\;\Leftrightarrow\;
-\nabla f(x^*) \in \operatorname{range}(A^T),\; Ax^* = b
\;\Leftrightarrow\;
\exists\, y^* \text{ s.t. } -\nabla f(x^*) = A^T y^*.
\]
The vector $y^*$ plays the role of the Lagrange multiplier (its sign differs from the $y$ above only by the convention $y^* = -y$, which is immaterial since $\operatorname{range}(A^T)$ is a subspace).
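The full optimality condition can be verified end-to-end on a hand-computable instance chosen for illustration (not from the notes): minimize $\tfrac{1}{2}\|x - c\|^2$ subject to $x_1 + x_2 = 1$, with $c = (1, 1)$. The minimizer is the projection of $c$ onto the constraint line, $x^* = (0.5, 0.5)$.

```python
# Optimality check for min (1/2)||x - c||^2 s.t. Ax = b, with the
# hypothetical data A = [1, 1], b = 1, c = (1, 1).
# The minimizer is the projection of c onto the line x1 + x2 = 1,
# namely x* = (0.5, 0.5). We verify both conditions:
#   (1) Ax* = b                  (feasibility)
#   (2) -grad f(x*) = A^T y*     (gradient lies in range(A^T))

c = (1.0, 1.0)
x_star = (0.5, 0.5)

ax_star = x_star[0] + x_star[1]              # Ax*; should equal b = 1
grad = (x_star[0] - c[0], x_star[1] - c[1])  # grad f(x) = x - c

# Solve -grad f(x*) = A^T y* componentwise: (0.5, 0.5) = (y*, y*).
y_star = -grad[0]
print(ax_star)  # → 1.0 (feasible)
print(grad)     # → (-0.5, -0.5)
print(y_star)   # → 0.5 (the Lagrange multiplier)
assert (-grad[0], -grad[1]) == (y_star, y_star)
```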