## Recent Advances in Nonsmooth Optimization

Nonsmooth optimization covers the minimization or maximization of functions that lack the differentiability properties required by classical methods. The field is significant not only because nondifferentiable functions arise directly in applications, but also because several important methods for solving difficult smooth problems lead directly to nonsmooth problems, which are either smaller in dimension or simpler in structure.

This book contains twenty-five papers written by forty-six authors from twenty countries on five continents. It includes papers on theory, algorithms, and applications for problems with first-order nondifferentiability (the usual sense of nonsmooth optimization), second-order nondifferentiability, nonsmooth equations, nonsmooth variational inequalities, and other problems related to nonsmooth optimization.
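To illustrate the first-order nondifferentiability the book deals with: the absolute value function f(x) = |x| has no derivative at its minimizer x = 0, yet subgradient methods of the kind surveyed in this literature still apply, since sign(x) is a subgradient away from the kink and any value in [-1, 1] is one at x = 0. The following minimal sketch (not taken from the book; the function and step-size rule are illustrative choices) runs the subgradient method with diminishing steps t_k = 1/k:

```python
# Subgradient method on f(x) = |x|, nondifferentiable at x = 0.


def subgradient_abs(x: float) -> float:
    """Return one subgradient of f(x) = |x|."""
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0  # any g in [-1, 1] is valid at the kink; 0 is a convenient choice


def subgradient_method(x0: float, iters: int = 1000) -> float:
    """Minimize |x| with diminishing steps t_k = 1/k (sum diverges, t_k -> 0)."""
    x = x0
    for k in range(1, iters + 1):
        g = subgradient_abs(x)
        x -= g / k
    return x


x_star = subgradient_method(5.0)
print(abs(x_star))  # small: the iterates approach the minimizer x* = 0
```

Unlike gradient descent, the subgradient method is not a descent method step by step; convergence relies on the diminishing step-size rule, which is why a fixed step would leave the iterate oscillating around the kink.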


### Contents

S. Komlósi | 211 |

Nonunique Multipliers | 215 |

Prederivatives and Second Order Conditions for Infinite | 244 |

Necessary and Sufficient Conditions for Solution Stability | 261 |

Miscellaneous Incidences of Convergence Theories in Optimization | 289 |

Second-Order Nonsmooth Analysis in Nonlinear Programming | 322 |

Characterizations of Optimality for Homogeneous Programming | 351 |

On Regularized Duality in Convex Optimization | 381 |

A Globally Convergent Newton Method for Solving Variational | 405 |

Upper Bounds on a Parabolic Second Order Directional Derivative | 418 |

An SLP Method with a Quadratic Correction Step for Nonsmooth | 438 |

A Successive Approximation Quasi-Newton Process for Nonlinear | 459 |

### Other editions - View all

Recent Advances in Nonsmooth Optimization, Ding-Zhu Du, Liqun Qi, Robert S. Womersley, 1995
