  


Some notes on applying computational divided differencing in optimization

Stephen Vavasis (vavasis@uwaterloo.ca)

Abstract: We consider the problem of accurately computing the finite difference $f(\mathbf{x}+\mathbf{s}) - f(\mathbf{x})$ when $\Vert\mathbf{s}\Vert$ is very small. Direct evaluation of this difference in floating-point arithmetic succumbs to cancellation error and yields 0 when $\mathbf{s}$ is sufficiently small. Nonetheless, accurate computation of this finite difference is required by many optimization algorithms for a ``sufficient decrease'' test. Reps and Rall proposed a program transformation called ``computational divided differencing,'' reminiscent of automatic differentiation, that computes these differences with high accuracy; the running time to compute the difference is a small constant multiple of the running time to compute $f$. Unlike automatic differentiation, however, the technique is not fully general because of a difficulty with branching code (i.e., `if' statements). We make several remarks about the application of computational divided differencing to optimization. One point is that the technique can be used effectively as a stagnation test.

Keywords: finite difference, termination test, divided difference, automatic differentiation

Category 1: Optimization Software and Modeling Systems (Optimization Software Design Principles)

Entry Submitted: 07/15/2013
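To make the cancellation-avoidance idea concrete, here is a minimal first-order sketch in Python. It is not the paper's implementation and covers far less than Reps and Rall's full scheme; the class name `DD` and the restriction to `+`, `-`, and `*` are our own simplifications. Each object carries a subexpression's value at $x+s$ together with its exactly propagated difference, so the final result $f(x+s)-f(x)$ is never formed by subtracting two nearly equal numbers.

```python
class DD:
    """First-order computational divided differencing (simplified sketch).

    val  : subexpression evaluated at x + s
    diff : subexpression(x+s) - subexpression(x), propagated exactly
           through arithmetic rules instead of computed by subtraction.
    """

    def __init__(self, val, diff=0.0):
        self.val = val
        self.diff = diff

    @staticmethod
    def _wrap(other):
        # Promote plain constants, whose difference over [x, x+s] is 0.
        return other if isinstance(other, DD) else DD(other, 0.0)

    def __add__(self, other):
        other = DD._wrap(other)
        return DD(self.val + other.val, self.diff + other.diff)
    __radd__ = __add__

    def __sub__(self, other):
        other = DD._wrap(other)
        return DD(self.val - other.val, self.diff - other.diff)

    def __mul__(self, other):
        other = DD._wrap(other)
        # (uv)(x+s) - (uv)(x) = u(x+s)*[v(x+s) - v(x)] + [u(x+s) - u(x)]*v(x);
        # no subtraction of nearly equal quantities appears in this rule.
        return DD(self.val * other.val,
                  self.val * other.diff + self.diff * (other.val - other.diff))
    __rmul__ = __mul__


x, s = 1.0, 1e-12
xs = DD(x + s, s)            # the independent variable: its difference is exactly s
f = xs * xs * xs             # f(x) = x^3, built from the overloaded arithmetic
naive = (x + s)**3 - x**3    # direct evaluation: catastrophic cancellation
# f.diff matches the exact difference 3*s + 3*s**2 + s**3 to near machine
# precision, while `naive` retains only a few significant digits.
print(f.diff, naive)
```

Note that any `if` statement branching on a propagated value would have to pick one branch for both $x$ and $x+s$, which is the branching difficulty the abstract mentions as the obstacle to full generality.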