  


An algorithmic framework for MINLP with separable nonconvexity
Claudia D'Ambrosio (c.dambrosio@unibo.it)

Abstract: Global optimization algorithms, e.g., spatial branch-and-bound approaches like those implemented in codes such as BARON and COUENNE, have had substantial success in tackling complicated, but generally small-scale, nonconvex MINLPs (i.e., mixed-integer nonlinear programs having nonconvex continuous relaxations). Because they are aimed at a rather general class of problems, the possibility remains that larger instances from a simpler class may be amenable to a simpler approach. We focus on MINLPs for which the nonconvexity in the objective and constraint functions is manifested as the sum of nonconvex univariate functions. Many problems are already in such a form, or can be brought into it via simple substitutions. In fact, the first step in spatial branch-and-bound is to bring problems into nearly such a form; for our purposes, we shift that burden back to the modeler. We have developed a simple algorithm, implemented at the level of a modeling language (in our case AMPL), to attack such separable problems. First, we identify subintervals of convexity and concavity for the univariate functions using external calls to MATLAB. With such an identification at hand, we develop a convex MINLP relaxation of the problem (i.e., a mixed-integer nonlinear program having a convex continuous relaxation). Our convex MINLP relaxation differs from those typically employed in spatial branch-and-bound: rather than relaxing the graph of a univariate function on an interval to an enclosing polygon, we work on each subinterval of convexity and concavity separately, using linear relaxation on only the "concave side" of each function on the subintervals. The subintervals are glued together using binary variables. Next, we employ ideas of spatial branch-and-bound, but rather than branching, we repeatedly refine our convex MINLP relaxation by modifying it at the modeling level.
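The first step, identifying subintervals of convexity and concavity, can be sketched numerically. The abstract delegates this to external MATLAB calls; the snippet below is only an illustrative Python stand-in (the helper `convexity_subintervals` is hypothetical, not from the paper) that samples the sign of the second derivative on a grid and records where it changes:

```python
import math

def convexity_subintervals(d2f, lo, hi, n=1000):
    """Split [lo, hi] at sign changes of d2f (the second derivative).

    Returns a list of (a, b, 'convex' | 'concave') triples. Purely a
    numerical sketch: sign changes are located only up to the grid
    resolution, and points where d2f == 0 are classified as concave.
    """
    step = (hi - lo) / n
    xs = [lo + i * step for i in range(n + 1)]
    pieces = []
    start = xs[0]
    cur = d2f(xs[0]) > 0  # True means locally convex
    for x in xs[1:]:
        now = d2f(x) > 0
        if now != cur:
            pieces.append((start, x, 'convex' if cur else 'concave'))
            start, cur = x, now
    pieces.append((start, xs[-1], 'convex' if cur else 'concave'))
    return pieces

# Example: f(x) = sin(x) on [0, 2*pi] has f''(x) = -sin(x), so f is
# concave on [0, pi] and convex on [pi, 2*pi].
pieces = convexity_subintervals(lambda x: -math.sin(x), 0.0, 2 * math.pi)
```

A symbolic or interval-arithmetic tool would locate the breakpoints exactly; the grid approach above only approximates them, which is why a robust implementation (as in the paper) leans on an external numerical package.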
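The relaxation-and-refinement idea can likewise be illustrated on a toy univariate instance. The code below is a sketch, not the paper's method: it assumes a specific toy objective f(x) = exp(x) - 3x^2 on [0, 4], replaces the concave part by its secant on each subinterval (a secant underestimates a concave function, so each piece yields a convex one-dimensional subproblem), enumerates the pieces explicitly in place of the binary variables that would glue them together in the MINLP, and refines by splitting a piece at the relaxation's minimizer rather than branching:

```python
import math

c = math.exp                  # convex part of the objective
g = lambda x: -3.0 * x * x    # concave part of the objective
f = lambda x: c(x) + g(x)     # minimize f on [0, 4]

def argmin_convex(h, a, b, iters=200):
    """Ternary search for the minimizer of a convex function on [a, b]."""
    for _ in range(iters):
        m1, m2 = a + (b - a) / 3.0, b - (b - a) / 3.0
        if h(m1) <= h(m2):
            b = m2
        else:
            a = m1
    return 0.5 * (a + b)

def piece_bound(a, b):
    """Lower bound of f on [a, b]: replace g by its secant (a valid
    underestimator of a concave function) and solve the resulting
    convex subproblem."""
    slope = (g(b) - g(a)) / (b - a)
    relaxed = lambda x: c(x) + g(a) + slope * (x - a)
    x = argmin_convex(relaxed, a, b)
    return relaxed(x), x

pieces = [(0.0, 4.0)]
ub, x_best = min((f(x), x) for x in (0.0, 4.0))
for _ in range(100):
    # Lower bound: best bound over all pieces (stand-in for the
    # convex MINLP solve with binary piece-selection variables).
    bounds = [piece_bound(a, b) + (a, b) for (a, b) in pieces]
    lb, x_lb, a, b = min(bounds)
    cand = f(x_lb)            # upper bound: evaluate f at the candidate
    if cand < ub:
        ub, x_best = cand, x_lb
    if ub - lb <= 1e-3:
        break
    # Refine instead of branching: split the best-bound piece.
    split = x_lb if a < x_lb < b else 0.5 * (a + b)
    pieces.remove((a, b))
    pieces += [(a, split), (split, b)]
```

In the paper's setting the lower bound comes from a convex MINLP solved by BONMIN and the upper bound from an NLP restriction solved by IPOPT; here both roles are played by elementary one-dimensional searches, purely to make the bounding-and-refinement loop concrete.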
We attack our convex MINLP relaxation, to get lower bounds on the global minimum, using the code BONMIN as a black-box convex MINLP solver. Next, by fixing the integer variables in the original nonconvex MINLP and then locally solving the associated nonconvex NLP restriction, we get an upper bound on the global minimum, using the code IPOPT. We use the solutions found by BONMIN and IPOPT to guide our choice of further refinements in a way that guarantees convergence overall. Note that our proposed procedure is an exact algorithm, not just a heuristic. We have had substantial success in our preliminary computational experiments; in particular, we see very few major iterations occurring, so most of the time is spent in the solution of a small number of convex MINLPs. An advantage of our approach is that it can be implemented easily using existing software components, and further advances in technology for convex MINLP will immediately benefit our approach.

Keywords: mixed-integer nonlinear programming, global optimization, spatial branch-and-bound, separable, nonconvex

Category 1: Global Optimization

Category 2: Integer Programming ((Mixed) Integer Nonlinear Programming)

Citation: IBM Research Report RC24810, June 2009

Download: [PDF]

Entry Submitted: 06/26/2009