
The Descent Property of Gradient-Based Optimization Methods

Students often assume descent occurs for any step size \alpha > 0. However, if \alpha \geq 2/L, the objective value can increase or oscillate wildly. The descent property depends strictly on the relationship between the step size and the local smoothness of the function, i.e. the Lipschitz constant L of its gradient: for an L-smooth function, the descent lemma gives f(x - \alpha \nabla f(x)) \leq f(x) - \alpha (1 - \alpha L / 2) \| \nabla f(x) \|^2, which guarantees decrease only when \alpha < 2/L.
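The threshold can be seen numerically. Below is a minimal sketch (an illustrative example, not from the source) running gradient descent on the quadratic f(x) = (L/2) x^2, whose gradient x -> L x is L-Lipschitz, once with \alpha slightly below 2/L and once slightly above:

```python
L = 4.0  # Lipschitz constant of the gradient of f(x) = 0.5 * L * x**2

def f(x):
    return 0.5 * L * x * x

def grad(x):
    return L * x

def run(alpha, x0=1.0, steps=20):
    """Run gradient descent and record the objective value at each iterate."""
    x = x0
    vals = [f(x)]
    for _ in range(steps):
        x -= alpha * grad(x)
        vals.append(f(x))
    return vals

safe = run(alpha=0.9 * (2 / L))    # alpha < 2/L: objective decreases monotonically
unsafe = run(alpha=1.1 * (2 / L))  # alpha > 2/L: iterates oscillate and diverge

print(safe[-1] < safe[0])      # descent achieved
print(unsafe[-1] > unsafe[0])  # objective blew up
```

With \alpha = 0.9 \cdot (2/L) each update multiplies x by 1 - \alpha L = -0.8, so the iterates oscillate in sign but shrink in magnitude; with \alpha = 1.1 \cdot (2/L) the factor is -1.2 and the objective grows without bound.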
Institutional Reference: Fundamentals of Optimization