Iterative Methods for Optimization

We saw that the three-element set above had 8 = 2³ subsets. In general, a set with n elements has 2ⁿ subsets, as can be seen in the following manner. We form subsets P of U by considering each of the elements of U in turn and deciding whether or not to include it in the subset P. If we decide to put every element of U into P, we get the universal set, and if we decide to put no element of U into P, we get the empty set. In most cases we will put some but not all of the elements into P and thus obtain a subset that is neither U itself nor the empty set. (A short sketch that enumerates subsets in exactly this way follows the contents listing below.)

Iterative Methods for Optimization
C. T. Kelley, North Carolina State University, Raleigh, North Carolina
Society for Industrial and Applied Mathematics, Philadelphia
Copyright 1999 by the Society for Industrial and Applied Mathematics. This electronic version is for personal use and may not be duplicated or distributed.

Contents
Preface  xiii
How to Get the Software  xv

I  Optimization of Smooth Functions  1

1  Basic Concepts  3
   The Problem  3
   Notation  4
   Necessary Conditions  5
   Sufficient Conditions  6
   Quadratic Objective Functions  6
      Positive Definite Hessian  7
      Indefinite Hessian  9
   Examples  9
      Discrete Optimal Control  9
      Parameter Identification  11
      Convex Quadratics  12
   Exercises on Basic Concepts  12

2  Local Convergence of Newton's Method  13
   Types of Convergence  13
   The Standard Assumptions  14
   Newton's Method  14
      Errors in Functions, Gradients, and Hessians  17
      Termination of the Iteration  21
   Nonlinear Least Squares  22
      Gauss-Newton Iteration  23
      Overdetermined Problems  24
      Underdetermined Problems  25
   Inexact Newton Methods  28
      Convergence Rates  29
      Implementation of Newton-CG  30
   Examples  33
      Parameter Identification  33
      Discrete Control Problem  34
   Exercises on Local Convergence  35

3  Global Convergence  39
   The Method of Steepest Descent  39
   Line Search Methods and the Armijo Rule  40
      Stepsize Control with Polynomial Models  43
      Slow Convergence of Steepest Descent  45
      Damped Gauss-Newton Iteration  47
      Nonlinear Conjugate Gradient Methods  48
   Trust Region Methods  50
      Changing the Trust Region and the Step  51
      Global Convergence of Trust ...
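To make the counting argument above concrete, here is a minimal Python sketch (the set U and all names in it are illustrative, not taken from the text): processing the elements one at a time, each element doubles the number of subsets built so far, so a three-element set yields 2³ = 8 subsets.

def all_subsets(universe):
    # Start with the single empty subset; for each element of the universe,
    # extend every subset built so far both without and with that element.
    subsets = [()]
    for element in universe:
        subsets += [s + (element,) for s in subsets]
    return subsets

U = ("a", "b", "c")        # a three-element set, as in the text
P = all_subsets(U)
print(len(P))              # 8 == 2**3
print(P[0])                # ()              : the empty set
print(P[-1])               # ('a', 'b', 'c') : the universal set U itself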
