Optimization Online

An Extension of the Proximal Point Method for Quasiconvex Minimization

Erik Papa Quiroz (erikpapa***at***gmail.com)
Paulo Roberto Oliveira (poliveir***at***cos.ufrj.br)

Abstract: In this paper we propose an extension of the proximal point method to solve minimization problems with quasiconvex objective functions on the Euclidean space and on the nonnegative orthant. For the unconstrained minimization problem, assuming that the function is bounded from below and lower semicontinuous, we prove that the iterates {x^k} given by 0 ∈ ∂̂(f(·) + (λ_k/2)‖· − x^{k−1}‖²)(x^k) are well defined, and that if, in addition, f is quasiconvex, then {f(x^k)} is decreasing and {x^k} converges to a point of U := {x ∈ ℝⁿ : f(x) ≤ inf_{j≥0} f(x^j)}, assumed nonempty. Under the assumption that the sequence of parameters {λ_k} is bounded and f is continuous, it is proved that {x^k} converges to a generalized critical point of f. Furthermore, if {λ_k} converges to zero and the iterates {x^k} are global minimizers of the regularized subproblems f(·) + (λ_k/2)‖· − x^{k−1}‖², the sequence converges to an optimal solution. For quasiconvex minimization on the nonnegative orthant, under the same premises as in the unconstrained case and using a general proximal distance (which includes as particular cases Bregman distances, φ-divergence distances, and second-order homogeneous distances), we find that the iterates {x^k} given by 0 ∈ ∂̂(f(·) + λ_k d(·, x^{k−1}))(x^k) are well defined and remain in the positive orthant, and that {f(x^k)} is decreasing and convergent. If U⁺ := {x ∈ ℝⁿ₊ ∩ dom f : f(x) ≤ inf_{j≥0} f(x^j)} is nonempty, {λ_k} is bounded, and the distance is separable, we obtain convergence to a generalized KKT point of the problem. Furthermore, in the smooth case we introduce a sufficient condition on the proximal distance under which the sequence converges to an optimal solution of the problem. When the parameters converge to zero we obtain, for self-proximal distances, a result analogous to that of the unconstrained case.
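The unconstrained iteration described in the abstract can be illustrated with a minimal one-dimensional sketch: each iterate x^k minimizes the regularized subproblem f(x) + (λ_k/2)(x − x^{k−1})². This is not the paper's algorithm verbatim; the objective f(x) = √|x − 3| (quasiconvex, nonconvex, nonsmooth), the grid-search subproblem solver, the interval bounds, and the constant parameter λ_k = 1 are all hypothetical choices for demonstration only.

```python
import math

def prox_step(f, x_prev, lam, lo=-10.0, hi=10.0, n=20001):
    """Global minimizer of f(x) + (lam/2)(x - x_prev)^2 over a uniform grid.

    Brute-force grid search stands in for an exact subproblem solver,
    which is needed because f is nonconvex and nonsmooth.
    """
    h = (hi - lo) / (n - 1)
    best_x, best_v = lo, f(lo) + 0.5 * lam * (lo - x_prev) ** 2
    for i in range(1, n):
        x = lo + i * h
        v = f(x) + 0.5 * lam * (x - x_prev) ** 2
        if v < best_v:
            best_x, best_v = x, v
    return best_x

# Quasiconvex but nonconvex objective, minimized at x = 3.
f = lambda x: math.sqrt(abs(x - 3.0))

x = -8.0                          # starting point (a grid point)
values = [f(x)]
for k in range(80):
    x = prox_step(f, x, lam=1.0)  # bounded parameter sequence, lam_k = 1
    values.append(f(x))
```

Because each x^k globally minimizes its subproblem and x^{k−1} is a feasible candidate, f(x^k) + (λ_k/2)(x^k − x^{k−1})² ≤ f(x^{k−1}), so {f(x^k)} is nonincreasing, mirroring the descent property stated in the abstract.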

Keywords: Proximal point algorithm, Quasiconvex functions, Limiting subdifferential, Fréchet subdifferential

Category 1: Nonlinear Optimization (Unconstrained Optimization )

Category 2: Convex and Nonsmooth Optimization (Generalized Convexity/Monotonicity)

Citation:

Download: [PDF]

Entry Submitted: 06/21/2006
Entry Accepted: 06/21/2006
Entry Last Modified: 01/21/2009
