Optimization Online

The structure of conservative gradient fields

Adrian Lewis (adrian.lewis***at***cornell.edu)
Tonghua Tian (tt543***at***cornell.edu)

Abstract: The classical Clarke subdifferential alone is inadequate for understanding automatic differentiation in nonsmooth contexts. Instead, we can sometimes rely on enlarged generalized gradients called "conservative fields", defined through the natural path-wise chain rule: one application is the convergence analysis of gradient-based deep learning algorithms. In the semi-algebraic case, we show that all conservative fields are in fact just Clarke subdifferentials plus normals of manifolds in underlying Whitney stratifications.
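
For readers outside the area, the "path-wise chain rule" mentioned in the abstract is usually formalized as follows; this is a minimal sketch of the standard definition of a conservative field from the recent literature on nonsmooth automatic differentiation, and the paper's precise formulation may differ in detail. A set-valued map $D:\mathbb{R}^n \rightrightarrows \mathbb{R}^n$ with closed graph and nonempty compact values is a conservative field for a locally Lipschitz function $f:\mathbb{R}^n \to \mathbb{R}$ when, for every absolutely continuous curve $\gamma:[0,1]\to\mathbb{R}^n$,
\[
\frac{d}{dt}\, f\bigl(\gamma(t)\bigr) \;=\; \bigl\langle v,\, \dot{\gamma}(t) \bigr\rangle
\quad \text{for all } v \in D\bigl(\gamma(t)\bigr), \ \text{for almost every } t \in [0,1].
\]
Under this definition the Clarke subdifferential can be enlarged while still satisfying the chain rule along curves, which is the sense in which conservative fields generalize it.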

Keywords: variational analysis, Clarke subdifferential, automatic differentiation, deep learning, subgradient descent, conservative field, stratification, semi-algebraic

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Citation: Cornell ORIE - January 2021

Download: [PDF]

Entry Submitted: 01/03/2021
Entry Accepted: 01/03/2021
Entry Last Modified: 01/03/2021
