# Review Sheet I

by RickiePBibey


```
I.   Maxima and Minima of Functions of Several Variables
     A. Taylor's Theorem
     B. Critical Points
     C. Hessian
        1. Criteria for local maximizer and minimizer
        2. Criteria for global maximizer and minimizer
        3. Saddle points
     D. Matrices
        1. Criteria for definiteness
           a. Principal minors
        2. Matrix for a quadratic form
     E. Coercive functions
        1. Finding a global minimum from a local minimum

II.  Convex Functions
     A. Convex Sets
     B. Definition of Convex Function
        1. Examples
     C. Criteria for Convex Function
        1. Gradient
        2. Hessian
        3. Graph
     D. Critical Points
        1. Global Minimum

III. Arithmetic-Geometric Inequality
     A. Inequality
     B. Strategy for solving problems using the AG inequality
        a. Max for the geometric part
        b. Min for the algebraic part
     C. Problems with constraints
        a. Choosing the λ_i's to force the constraint into the problem as a constant
        b. Using the λ_i's to remove variables from the geometric side by use of linear equations
        c. Solving for the original variables
     D. Problems without constraints
        a. Choosing the λ_i's to force the constraint into the problem as a constant
        b. Using the λ_i's to remove variables from the geometric side
        c. Solving for the original variables

IV.  Newton's Method
     A. Newton's sequence to produce a minimizer
        1. Necessary conditions for convergence
        2. Relation to Taylor series
     B. Quadratic approximator
        1. Finding the quadratic approximator at a point
     C. Relation to finding a zero of ∇f at a point
     D. Inverting the Hessian
        1. Positive definite Hessian
     E. Factorization scheme for inverting (Cholesky)
     F. Steepest Descent

V.   Convex Functions
     A. Telling if a function is convex
        1. Combinations of convex functions
        2. Hessian
        3. Equation involving ∇f
     B. Minimizers
        1. A critical point is a global minimizer

VI.  Search Procedures
     A. Fibonacci Search
     B. Dichotomous Search
     C. Wolfe Search
     D. One iteration of Newton's Method
     E. Search direction
        1. -H⁻¹∇f

VII. Secant Method
     A. Functions g: R^n → R^n
        1. To find a zero
        2. Algorithm
           a. D_k (x_{k+1} - x_k) = -g(x_k)
           b. D_{k+1} (x_{k+1} - x_k) = g(x_{k+1}) - g(x_k)
     B. Functions f: R^n → R
        1. To find a minimizer
        2. Algorithm for a zero of g = ∇f
           a. D_k (x_{k+1} - x_k) = -g(x_k)
           b. D_{k+1} (x_{k+1} - x_k) = g(x_{k+1}) - g(x_k)

VIII. Broyden Rank 1 Update for the Secant Method, for finding a zero of g: R^n → R^n
     A. Update: D_{k+1} = D_k + g(x_{k+1}) ⊗ d_k / ||d_k||², where d_k = x_{k+1} - x_k
     B. Calculation schemes -- relevance of:
        1. Starting x_0, D_0
        2. Sherman-Morrison formula and the meaning of u ⊗ v (= u vᵀ):
           (D + u ⊗ v)⁻¹ = D⁻¹ - (D⁻¹ u vᵀ D⁻¹) / (1 + vᵀ D⁻¹ u)
        3. Cholesky factorization
        4. Matrix norm
     C. General idea of what convergence means

IX.  Positive Definite Updates
     A. Broyden-Fletcher-Goldfarb-Shanno (BFGS) Method
        1. Update: D_{k+1} = D_k + (y_k ⊗ y_k)/(d_kᵀ y_k) - (D_k d_k ⊗ D_k d_k)/(d_kᵀ D_k d_k),
           with y_k = ∇f(x_{k+1}) - ∇f(x_k) and d_k = x_{k+1} - x_k
        2. Algorithm: x_{k+1} = x_k - t_k D_k⁻¹ ∇f(x_k)
           a. t_k = minimizer of φ(t) = f(x_k - t D_k⁻¹ ∇f(x_k)) for t ≥ 0
           b. t_k from the Wolfe Search Procedure applied to φ(t) = f(x_k - t D_k⁻¹ ∇f(x_k)) for t ≥ 0
        3. Starting D_0
     B. Davidon-Fletcher-Powell (DFP) Method
        1. Update: D_{k+1} = D_k + (d_k ⊗ d_k)/(d_kᵀ y_k) - (D_k y_k ⊗ D_k y_k)/(y_kᵀ D_k y_k),
           with y_k = ∇f(x_{k+1}) - ∇f(x_k) and d_k = x_{k+1} - x_k
        2. Algorithm: x_{k+1} = x_k - t_k D_k ∇f(x_k)
           a. t_k = minimizer of φ(t) = f(x_k - t D_k ∇f(x_k)) for t ≥ 0
           b. t_k from the Wolfe Search Procedure applied to φ(t) = f(x_k - t D_k ∇f(x_k)) for t ≥ 0
        3. Starting D_0
```
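The weighted arithmetic-geometric inequality from the outline (λ_1 x_1 + … + λ_n x_n ≥ x_1^{λ_1}···x_n^{λ_n} for positive x_i and positive weights λ_i summing to 1) can be checked numerically. The function f(x) = x + 1/x and the weights below are illustrative choices made here, not problems taken from the sheet:

```python
import math

def weighted_am_gm(xs, lams):
    """Return (arithmetic side, geometric side) of the weighted AM-GM
    inequality: sum(l*x) >= prod(x**l) for x_i > 0, l_i > 0, sum(l_i) = 1."""
    am = sum(l * x for l, x in zip(lams, xs))
    gm = math.prod(x ** l for l, x in zip(lams, xs))
    return am, gm

# Illustrative unconstrained problem (not from the sheet): minimize
# f(x) = x + 1/x for x > 0.  With weights (1/2, 1/2),
#   x + 1/x = 2*((1/2)*x + (1/2)*(1/x)) >= 2*sqrt(x * (1/x)) = 2,
# with equality exactly when x = 1/x, i.e. at the minimizer x = 1.
def f(x):
    return x + 1.0 / x
```

This is the "min for the algebraic part" pattern: the geometric side collapses to a constant, which then bounds the algebraic side from below.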
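The Newton sequence from the outline, x_{k+1} = x_k - f'(x_k)/f''(x_k), can be sketched in one dimension. The test function f(x) = x·ln x, with minimizer at x = 1/e, is an assumption chosen here for illustration; convergence needs a starting point close enough to the minimizer with f'' > 0 there:

```python
import math

def newton_minimize(fprime, fsecond, x0, tol=1e-12, max_iter=50):
    """Newton's sequence x_{k+1} = x_k - f'(x_k)/f''(x_k), stopping when
    the step is below tol.  Near a minimizer with f'' > 0 the convergence
    is quadratic."""
    x = x0
    for _ in range(max_iter):
        step = fprime(x) / fsecond(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Hypothetical example (not from the sheet): f(x) = x*ln(x) on x > 0,
# with f'(x) = ln(x) + 1 and f''(x) = 1/x, minimized at x = 1/e.
xmin = newton_minimize(lambda x: math.log(x) + 1, lambda x: 1.0 / x, x0=0.5)
```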
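The outline lists Cholesky factorization as the scheme for inverting a positive definite Hessian (and again among the Broyden calculation schemes). A minimal pure-Python sketch, assuming a symmetric positive definite input; the factorization fails with a negative number under the square root exactly when the matrix is not positive definite, which is one practical way to detect an indefinite Hessian:

```python
import math

def cholesky(A):
    """Return the lower-triangular L with A = L L^T, for symmetric
    positive definite A."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # raises on indefinite A
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

def solve_chol(L, b):
    """Solve (L L^T) x = b by forward then back substitution,
    i.e. apply A^{-1} without ever forming it."""
    n = len(L)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x
```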
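The principal-minors criterion for definiteness in the outline is Sylvester's criterion: a symmetric matrix is positive definite iff every leading principal minor is positive. The same test applied to the Hessian is one of the outline's convexity checks. A sketch, with determinants computed by Gaussian elimination:

```python
def det(M):
    """Determinant via Gaussian elimination with partial pivoting
    (product of pivots, sign-adjusted for row swaps)."""
    A = [row[:] for row in M]
    n = len(A)
    d = 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        if abs(A[p][i]) < 1e-15:
            return 0.0
        if p != i:
            A[i], A[p] = A[p], A[i]
            d = -d
        d *= A[i][i]
        for r in range(i + 1, n):
            m = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= m * A[i][c]
    return d

def is_positive_definite(A):
    """Sylvester's criterion: symmetric A is positive definite iff the
    determinants of all leading principal (upper-left k x k) blocks are > 0."""
    n = len(A)
    return all(det([row[:k] for row in A[:k]]) > 0 for k in range(1, n + 1))
```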
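Of the search procedures listed (Fibonacci, Dichotomous, Wolfe), dichotomous search is the simplest to sketch: it shrinks a bracket around the minimizer of a unimodal φ by comparing two evaluations just either side of the midpoint. The quadratic φ in the test is an illustrative choice:

```python
def dichotomous_search(phi, a, b, tol=1e-8, eps=1e-10):
    """Minimize a unimodal phi on [a, b].  Each pass evaluates phi at
    mid - eps and mid + eps and discards the half that cannot contain
    the minimizer, roughly halving the bracket per iteration."""
    while b - a > tol:
        mid = 0.5 * (a + b)
        x1, x2 = mid - eps, mid + eps
        if phi(x1) < phi(x2):
            b = x2
        else:
            a = x1
    return 0.5 * (a + b)
```

Fibonacci search improves on this by reusing one evaluation per iteration; the Wolfe procedure instead accepts any step satisfying sufficient-decrease and curvature conditions rather than bracketing the exact minimizer.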
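The two secant relations in the outline, D_k(x_{k+1} - x_k) = -g(x_k) and D_{k+1}(x_{k+1} - x_k) = g(x_{k+1}) - g(x_k), reduce in one dimension to the classical secant method, where D_k is just a slope estimate. The function g(x) = x² - 2 in the test is an illustrative choice:

```python
def secant_zero(g, x0, x1, tol=1e-12, max_iter=100):
    """1-D secant method for g(x) = 0.  The scalar D plays the role of
    the matrix D_k in the n-dimensional version."""
    D = (g(x1) - g(x0)) / (x1 - x0)       # initial slope estimate
    x_prev, x = x0, x1
    for _ in range(max_iter):
        step = -g(x) / D                  # solve D * step = -g(x_k)
        x_prev, x = x, x + step
        if abs(step) < tol:
            break
        # update condition: D_{k+1} (x_{k+1} - x_k) = g(x_{k+1}) - g(x_k)
        D = (g(x) - g(x_prev)) / (x - x_prev)
    return x
```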
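Broyden's rank-1 update D_{k+1} = D_k + g(x_{k+1}) ⊗ d_k / ||d_k||², combined with the Sherman-Morrison formula, lets each step reuse the previous inverse instead of refactoring a matrix, which is the point of the "calculation schemes" items. A 2×2 sketch; the test system (root at (1, 1)) and the starting Jacobian estimate D_0 are assumptions chosen for illustration:

```python
def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def vec_mat(v, M):
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M[0]))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sherman_morrison(Dinv, u, v):
    """(D + u v^T)^{-1} = D^{-1} - (D^{-1} u)(v^T D^{-1}) / (1 + v^T D^{-1} u)."""
    Du, vD = mat_vec(Dinv, u), vec_mat(v, Dinv)
    denom = 1.0 + dot(v, Du)
    n = len(Dinv)
    return [[Dinv[i][j] - Du[i] * vD[j] / denom for j in range(n)]
            for i in range(n)]

def broyden_zero(g, x0, D0, tol=1e-10, max_iter=100):
    """Find a zero of g: R^2 -> R^2 using Broyden's rank-1 update
    D_{k+1} = D_k + g(x_{k+1}) d_k^T / ||d_k||^2 with d_k = x_{k+1} - x_k,
    carrying D_k^{-1} along via Sherman-Morrison (rank-1 u = g(x_{k+1}),
    v = d_k / ||d_k||^2)."""
    (a, b), (c, d) = D0                    # invert the 2x2 start directly
    det = a * d - b * c
    Dinv = [[d / det, -b / det], [-c / det, a / det]]
    x = x0[:]
    for _ in range(max_iter):
        step = [-s for s in mat_vec(Dinv, g(x))]   # solve D_k step = -g(x_k)
        x = [xi + si for xi, si in zip(x, step)]
        if max(abs(gi) for gi in g(x)) < tol:
            break
        n2 = dot(step, step)
        Dinv = sherman_morrison(Dinv, g(x), [s / n2 for s in step])
    return x
```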
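The BFGS update can be sanity-checked against the secant equation it is built to satisfy: after the update, D_{k+1} d_k = y_k, and symmetry is preserved. The vectors in the test are arbitrary illustrative data, chosen so that d_kᵀ y_k > 0 (the curvature condition that keeps the update positive definite):

```python
def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def bfgs_update(D, d, y):
    """BFGS update of the Hessian approximation:
    D_{k+1} = D_k + (y y^T)/(d^T y) - (D d)(D d)^T/(d^T D d),
    where d = x_{k+1} - x_k and y = grad f(x_{k+1}) - grad f(x_k)."""
    Dd = mat_vec(D, d)
    dy = dot(d, y)
    dDd = dot(d, Dd)
    n = len(D)
    return [[D[i][j] + y[i] * y[j] / dy - Dd[i] * Dd[j] / dDd
             for j in range(n)] for i in range(n)]
```

Checking D_{k+1} d = y by hand: the first correction contributes y exactly, the second subtracts D_k d, and what remains is y.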
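The DFP update differs from BFGS in that it maintains an approximation to the INVERSE Hessian, which is why its step x_{k+1} = x_k - t_k D_k ∇f(x_k) needs no matrix inversion. Its secant equation is the inverse form D_{k+1} y_k = d_k, checked below with the same illustrative vectors:

```python
def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def dfp_update(D, d, y):
    """DFP update of the inverse-Hessian approximation:
    D_{k+1} = D_k + (d d^T)/(d^T y) - (D y)(D y)^T/(y^T D y),
    where d = x_{k+1} - x_k and y = grad f(x_{k+1}) - grad f(x_k).
    Built so that D_{k+1} y = d."""
    Dy = mat_vec(D, y)
    dy = dot(d, y)
    yDy = dot(y, Dy)
    n = len(D)
    return [[D[i][j] + d[i] * d[j] / dy - Dy[i] * Dy[j] / yDy
             for j in range(n)] for i in range(n)]
```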