## Recent Advances in Nonsmooth Optimization

Ding-Zhu Du, Frank Kwang-Ming Hwang, Li-Qun Qi, Robert S. Womersley

Nonsmooth optimization covers the minimization or maximization of functions that do not have the differentiability properties required by classical methods. The field is significant not only because nondifferentiable functions arise directly in applications, but also because several important methods for solving difficult smooth problems lead directly to the need to solve nonsmooth problems, which are either smaller in dimension or simpler in structure. This book contains twenty-five papers written by forty-six authors from twenty countries on five continents. It includes papers on theory, algorithms, and applications for problems with first-order nondifferentiability (the usual sense of nonsmooth optimization), second-order nondifferentiability, nonsmooth equations, nonsmooth variational inequalities, and other problems related to nonsmooth optimization.

### From inside the book

Results 1-5 of 5

Page 3

It is this latter feature that has probably contributed to the relatively little interest that has been shown in such methods. ... unconstrained minimization problem, for which rapid convergence can be obtained by, for example, the **BFGS method**.

Page 11

The **BFGS method** also requires the Hessian approximation to be initialized. Where necessary we do this using a unit matrix. Some care has to be taken when choosing the initial value of the matrix X; in particular, the rank of X must be r.
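The snippet above mentions initializing the BFGS Hessian approximation with a unit matrix. As a minimal illustration (not the book's own code), the sketch below implements the standard BFGS inverse-Hessian update with identity initialization and a simple backtracking line search; the function names and the Rosenbrock test problem are illustrative choices.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimal BFGS iteration; the inverse-Hessian approximation H
    is initialized to the unit matrix, as the text describes."""
    n = x0.size
    H = np.eye(n)                  # unit-matrix initialization
    x = x0.astype(float)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                 # quasi-Newton search direction
        # backtracking (Armijo) line search
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:             # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = bfgs(f, grad, np.array([-1.2, 1.0]))
```

The rank-r matrix X of the snippet is specific to the book's application; the update formula itself is problem-independent.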

Page 12

We therefore consider hybrid methods in which the projection algorithm is used sparingly as a way of establishing the correct rank, whilst the **BFGS method** is used to provide rapid convergence. In order to ensure that each component method ...

Page 13

If the **BFGS method** is using the correct rank r = r* and has found the global solution of d, then D^(r) is the solution D* of (2.2). ... Even if the rank r ≠ r* in the **BFGS method**, (5.3) enables some information to be extracted from D^(r) that is hopefully ...

Page 15

Once the projection iteration has settled down, the **BFGS method** finds the solution rapidly and no further projection steps are needed. Algorithm 2 requires a relatively large number of line searches (see Table 2) in the first call of the BFGS ...
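The snippets describe a hybrid scheme: a projection phase establishes the correct rank, after which BFGS on a rank-r parametrization delivers fast local convergence. The sketch below illustrates that pattern on a toy low-rank fitting problem; the problem choice, the eigenvalue-truncation "projection", and all names are illustrative assumptions, not the book's actual algorithm.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, r = 6, 2
V_true = rng.standard_normal((n, r))
E = 0.01 * rng.standard_normal((n, n))
A = V_true @ V_true.T + 0.5 * (E + E.T)   # noisy target, nearly rank r

def loss(v):                              # fit X = V V^T to A in Frobenius norm
    V = v.reshape(n, r)
    R = V @ V.T - A
    return 0.5 * np.sum(R * R)

def grad(v):
    V = v.reshape(n, r)
    return (2 * (V @ V.T - A) @ V).ravel()

# Phase 1 ("projection"): a truncated eigendecomposition fixes the rank at r
w, U = np.linalg.eigh(A)
idx = np.argsort(w)[::-1][:r]             # keep the r largest eigenvalues
V0 = U[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Phase 2: BFGS from the projected starting point for rapid local convergence
res = minimize(loss, V0.ravel(), jac=grad, method="BFGS")
```

The point of the alternation is that the projection step alone converges slowly but identifies the right rank, while BFGS alone can stall at the wrong rank; warm-starting BFGS from the projected iterate combines the two strengths.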


### Contents

| Title | Page |
| --- | --- |
| Projected Gradient Methods for Nonlinear Complementarity Problems via Normal Maps | 57 |
| An NCP-Function and Its Use for the Solution of Complementarity Problems | 88 |
| An Elementary Rate of Convergence Proof for the Deep Cut Ellipsoid Algorithm | 106 |
| Solving Nonsmooth Equations by Means of Quasi-Newton Methods with Globalization | 121 |
| Superlinear Convergence of Approximate Newton Methods for LC¹ Optimization Problems without Strict Complementarity | 141 |
| On Second-Order Directional Derivatives in Nonsmooth Optimization | 159 |
| On the Solution of Optimum Design Problems with Variational Inequalities | 172 |
| Monotonicity and Quasimonotonicity in Nonsmooth Analysis | 193 |
| Sensitivity of Solutions in Nonlinear Programming Problems with Nonunique Multipliers | 215 |
| Generalized Convexity and Higher Order Duality of the Nonlinear Programming Problem with Nonnegative Variables | 224 |
| Prederivatives and Second Order Conditions for Infinite Optimization Problems | 244 |
| Applications in Nonsmooth Analysis | 289 |
| Necessary and Sufficient Conditions for Solution Stability of Parametric Nonsmooth Equations | 261 |
| Second-Order Nonsmooth Analysis in Nonlinear Programming | 322 |
| Characterizations of Optimality for Homogeneous Programming Problems with Applications | 351 |
| On Regularized Duality in Convex Optimization | 381 |
| An Interior Point Method for Solving a Class of Linear-Quadratic Stochastic Programming Problems | 392 |
| A Globally Convergent Newton Method for Solving Variational Inequality Problems with Inequality Constraints | 405 |
| Upper Bounds on a Parabolic Second Order Directional Derivative of the Marginal Function | 418 |
| A SLP Method with a Quadratic Correction Step for Nonsmooth Optimization | 438 |
| A Successive Approximation Quasi-Newton Process for Nonlinear Complementarity Problem | 459 |

### Other editions - View all

Recent Advances in Nonsmooth Optimization, by Dingzhu Du, Liqun Qi, Robert S. Womersley. Limited preview, 1995.
