Introduction to Stochastic Search and Optimization : Estimation, Simulation, and Control


James C. Spall
Book · English · 2005 · Electronic books.
Other title
Published
Hoboken : Wiley, 2005.
Extent
1 online resource (620 p.)
Notes
Description based upon print version of record.

Contents: INTRODUCTION TO STOCHASTIC SEARCH AND OPTIMIZATION
Preface
1. Stochastic Search and Optimization: Motivation and Supporting Results
1.1 Introduction
1.1.1 General Background
1.1.2 Formal Problem Statement; General Types of Problems and Solutions; Global versus Local Search
1.1.3 Meaning of "Stochastic" in Stochastic Search and Optimization
1.2 Some Principles of Stochastic Search and Optimization
1.2.1 Some Key Points
1.2.2 Limits of Performance: No Free Lunch Theorems
1.3 Gradients, Hessians, and Their Connection to Optimization of "Smooth" Functions
1.3.1 Definition of Gradient and Hessian in the Context of Loss Functions
1.3.2 First- and Second-Order Conditions for Optimization
1.4 Deterministic Search and Optimization: Steepest Descent and Newton-Raphson Search
1.4.1 Steepest Descent Method
1.4.2 Newton-Raphson Method and Deterministic Convergence Rates
1.5 Concluding Remarks
Exercises
2. Direct Methods for Stochastic Search
2.1 Introduction
2.2 Random Search with Noise-Free Loss Measurements
2.2.1 Some Attributes of Direct Random Search
2.2.2 Three Algorithms for Random Search
2.2.3 Example Implementations
2.3 Random Search with Noisy Loss Measurements
2.4 Nonlinear Simplex (Nelder-Mead) Algorithm
2.4.1 Basic Method
2.4.2 Adaptation for Noisy Loss Measurements
2.5 Concluding Remarks
Exercises
3. Recursive Estimation for Linear Models
3.1 Formulation for Estimation with Linear Models
3.1.1 Linear Model
3.1.2 Mean-Squared and Least-Squares Estimation
3.2 Least-Mean-Squares and Recursive-Least-Squares for Static θ
3.2.1 Introduction
3.2.2 Basic LMS Algorithm
3.2.3 LMS Algorithm in Adaptive Signal Processing and Control
3.2.4 Basic RLS Algorithm
3.2.5 Connection of RLS to the Newton-Raphson Method
3.2.6 Extensions to Multivariate RLS and Weighted Summands in Least-Squares Criterion
3.3 LMS, RLS, and Kalman Filter for Time-Varying θ
3.3.1 Introduction
3.3.2 LMS
3.3.3 RLS
3.3.4 Kalman Filter
3.4 Case Study: Analysis of Oboe Reed Data
3.5 Concluding Remarks
Exercises
4. Stochastic Approximation for Nonlinear Root-Finding
4.1 Introduction
4.2 Potpourri of Stochastic Approximation Examples
4.3 Convergence of Stochastic Approximation
4.3.1 Background
4.3.2 Convergence Conditions
4.3.3 On the Gain Sequence and Connection to ODEs
4.4 Asymptotic Normality and Choice of Gain Sequence
4.5 Extensions to Basic Stochastic Approximation
4.5.1 Joint Parameter and State Evolution
4.5.2 Adaptive Estimation and Higher-Order Algorithms
4.5.3 Iterate Averaging
4.5.4 Time-Varying Functions
4.6 Concluding Remarks
Exercises
5. Stochastic Gradient Form of Stochastic Approximation
5.1 Root-Finding Stochastic Approximation as a Stochastic Gradient Method
5.1.1 Basic Principles
5.1.2 Stochastic Gradient Algorithm

A unique interdisciplinary foundation for real-world problem solving. Stochastic search and optimization techniques are used in a vast number of areas, including aerospace, medicine, transportation, and finance, to name but a few. Whether the goal is refining the design of a missile or aircraft, determining the effectiveness of a new drug, developing the most efficient timing strategies for traffic signals, or making investment decisions in order to increase profits, stochastic algorithms can help researchers and practitioners devise optimal solutions to countless real-world problems.
Subjects
Genre
Dewey
ISBN
0471330523

Libraries holding this item