Optimization Online


Deep Neural Network Structures Solving Variational Inequalities

Patrick L. Combettes(plc***at***math.ncsu.edu)
Jean-Christophe Pesquet(jean-christophe***at***pesquet.eu)

Abstract: We propose a novel theoretical framework to investigate deep neural networks using the formalism of proximal fixed point methods for solving variational inequalities. We first show that almost all activation functions used in neural networks are in fact proximity operators. This leads to an algorithmic model that alternates firmly nonexpansive operators with linear operators. We derive new results on averaged operator iterations to establish the convergence of this model, and we show that the limit of the resulting algorithm is a solution to a variational inequality. In general, this limit does not solve any minimization problem.
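To make the connection concrete, the following minimal sketch (an illustration, not code from the paper) shows the standard fact that the ReLU activation is the proximity operator of the indicator function of the nonnegative orthant, and views one feed-forward layer as an affine map followed by that proximity operator. The function names, toy weights, and the contraction scaling are hypothetical choices made only for this example.

```python
# Illustrative sketch (toy weights, NumPy only; not the authors' code).
import numpy as np

def prox_nonneg_indicator(x):
    # Proximity operator of the indicator of the nonnegative orthant:
    # argmin_y { i_{[0,inf)}(y) + 0.5 * ||y - x||^2 } = max(x, 0), i.e. ReLU.
    return np.maximum(x, 0.0)

def layer(x, W, b):
    # One "layer" of the algorithmic model: an affine (linear) map
    # followed by a firmly nonexpansive activation (a proximity operator).
    return prox_nonneg_indicator(W @ x + b)

# Toy example: iterate a single layer map and observe convergence to a
# fixed point (the weight matrix is scaled so its spectral norm is < 1,
# making the composition a contraction since ReLU is 1-Lipschitz).
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 5))
W = 0.9 * W / np.linalg.norm(W, 2)
b = rng.standard_normal(5)

x = rng.standard_normal(5)
for _ in range(200):
    x = layer(x, W, b)
print("approximate fixed point:", x)
```

In this toy setting, a fixed point x = prox(Wx + b) is characterized by a variational inequality over the nonnegative orthant; it corresponds to a minimization problem only under additional structure (e.g., a symmetric I - W), which is consistent with the abstract's remark that the limiting output does not, in general, solve any minimization problem.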

Keywords: neural networks, fixed point algorithms, proximity operator, averaged operators, nonlinear systems

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Category 2: Complementarity and Variational Inequalities

Citation: Internal report, NCSU/CentraleSupelec

Download: [PDF]

Entry Submitted: 09/05/2018
Entry Accepted: 09/05/2018
Entry Last Modified: 09/05/2018
