Exercise sheet

Université Paris Descartes, BME-Paris Master - UE Math II Optimization – Exercises

Exercise 1 Let f be the function defined on R^2 by: f(x_1, x_2) = x_1^3 + x_2^3 − 6(x_1^2 − x_2^2).
1. Show that f admits four critical points.
2. Compute f(t, 0) and f(0, t), and deduce directly that f has no extremum at (0, 0).
3. Write the second-order Taylor expansion of f at (4, 0). Deduce that f admits a local minimum at (4, 0).
4. Compute the eigenvalues of the Hessian matrix at the two other critical points and deduce directly their type (local minimum, local maximum, or saddle point).
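A small Python/NumPy sketch (not part of the sheet; the function names are mine) of the gradient and Hessian of f, which can be used to check the hand computations of questions 1, 3 and 4:

```python
import numpy as np

def grad_f(x1, x2):
    # Gradient of f(x1, x2) = x1^3 + x2^3 - 6*(x1^2 - x2^2)
    return np.array([3.0 * x1**2 - 12.0 * x1,
                     3.0 * x2**2 + 12.0 * x2])

def hessian_f(x1, x2):
    # The Hessian is diagonal because f has no mixed term x1*x2
    return np.array([[6.0 * x1 - 12.0, 0.0],
                     [0.0, 6.0 * x2 + 12.0]])
```

A critical point is one where `grad_f` vanishes; its type follows from the signs of the eigenvalues of `hessian_f` there.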

Exercise 2 Let ε > 0. We define, for all x ∈ R^n,

J_ε(x) = Σ_{i=2}^{n−1} N_ε(x_{i+1} + x_{i−1} − 2x_i),   where N_ε(t) = √(ε + t^2).

a) Calculate the derivative DJ_ε(x).η of J_ε in three different ways:
- by first calculating ∂J_ε/∂x_i for all i;
- as a directional derivative;
- by computing directly the differential form DJ_ε(x).
b) Write a Matlab function G = gradientJ(x,eps) which takes as input arguments a column vector x and a positive real eps, and outputs the gradient G of J_ε at x as a column vector.
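The sheet asks for Matlab in part b); as a non-authoritative sketch, here is an equivalent Python/NumPy version (the 0-based indexing is my adaptation), using N_ε'(t) = t/√(ε + t^2):

```python
import numpy as np

def gradientJ(x, eps):
    # Gradient of J_eps(x) = sum_{i=2}^{n-1} N_eps(x[i+1] + x[i-1] - 2*x[i]),
    # with N_eps(t) = sqrt(eps + t^2), so N_eps'(t) = t / sqrt(eps + t^2).
    x = np.asarray(x, dtype=float).ravel()
    n = x.size
    G = np.zeros(n)
    for i in range(1, n - 1):              # 0-based: covers i = 2, ..., n-1 (1-based)
        d = x[i + 1] + x[i - 1] - 2.0 * x[i]
        w = d / np.sqrt(eps + d * d)       # N_eps'(d)
        G[i - 1] += w                      # d(d_i)/dx_{i-1} = 1
        G[i]     -= 2.0 * w                # d(d_i)/dx_i     = -2
        G[i + 1] += w                      # d(d_i)/dx_{i+1} = 1
    return G.reshape(-1, 1)                # column vector, as requested
```

Each inner term d_i depends on three coordinates, so each i contributes to three entries of the gradient; in particular the gradient vanishes on any affine sequence x, where all second differences d_i are zero.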

Exercise 3 Let f : R^2 → R be defined by: f(x, y) = x^2 + y^3. Does this function have a global extremum on R^2? Does it have a local extremum?

Exercise 4 Study the local and global extrema of the following functions from R^2 to R:
1. f_1(x, y) = xy(x + y − 1)
2. f_2(x, y) = 4xy + 1/x^3 + 1/y
3. f_3(x, y) = x^3 + y^3 − 9xy + 27
4. f_4(x, y) = x^2 − 3x^2 y + 2x^4
5. f_5(x, y) = (y^2 − x^2)(y^2 − 2x^2)
6. f_6(x, y) = (x + y)^2 − (x^4 + y^4)
7. f_7(x, y) = x ln y + y ln x

Exercise 5 Let p_1 = (p_11, p_12), p_2 = (p_21, p_22), . . ., p_n = (p_n1, p_n2) be n points in R^2. We want to find the point that minimizes the sum of squared distances to all points p_i: hence, for any point x = (x_1, x_2), we define the following functional:

J(x) = Σ_{i=1}^{n} ‖x − p_i‖^2.

1. Calculate ∂J/∂x_1 (x) and ∂J/∂x_2 (x), and deduce that J has a unique critical point x* in R^2. What is this point called in geometrical terms?
2. Show that x* is a local minimum point of J.
3. Explain why x* is also the unique global minimum point of J.
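Once you have derived the critical point by hand, a quick Python/NumPy check (illustrative only; the helper names are mine) confirms that it is indeed a minimizer:

```python
import numpy as np

def J(x, points):
    # J(x) = sum_i ||x - p_i||^2
    P = np.asarray(points, dtype=float)
    return float(np.sum((np.asarray(x, dtype=float) - P) ** 2))

def critical_point(points):
    # grad J(x) = 2 * sum_i (x - p_i) = 0 has a unique solution:
    # the mean of the points p_i
    return np.asarray(points, dtype=float).mean(axis=0)
```

Perturbing `critical_point(points)` in any direction should strictly increase `J`, consistent with questions 2 and 3.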

Exercise 6 Simple linear regression Let (x_i, y_i)_{i=1}^n be a set of points in R^2, with n > 2. We assume that at least two of the x_i are distinct. The simple linear regression problem consists in finding the linear relationship y = αx + β that best fits the observations, which is obtained by minimizing the following functional on R^2:

J(α, β) = Σ_{i=1}^{n} (y_i − αx_i − β)^2.

1. Show that this functional admits a unique global minimum and compute it.
2. Write a Matlab function function [alpha,beta] = LinearRegression(x,y) which computes this solution from input vectors of coordinates x, y. Test this function for x=rand(1,10); y=-5+12*x+randn(1,10); and plot on the same graph the points and the regression line.
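As a sketch of the closed-form solution of question 1 (Python/NumPy stand-in for the requested Matlab function; the name is my own):

```python
import numpy as np

def linear_regression(x, y):
    # Closed-form minimizer of J(alpha, beta) = sum_i (y_i - alpha*x_i - beta)^2:
    # alpha = cov(x, y) / var(x),  beta = mean(y) - alpha * mean(x).
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    xbar, ybar = x.mean(), y.mean()
    alpha = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
    beta = ybar - alpha * xbar
    return alpha, beta
```

The assumption that at least two x_i are distinct guarantees the denominator Σ(x_i − x̄)^2 is nonzero. On noiseless data of the form y = −5 + 12x, the function should recover α = 12 and β = −5 exactly.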

Exercise 7 Linear model Let (x_i, y_i)_{i=1}^n be a set of points in R^2. We want to find a relationship between the variables x_i and y_i. We consider the model

f(x) = Σ_{j=0}^{k} β_j w_j(x),

where the w_j are functions from R to R and the β_j are coefficients. We want to find the coefficients β_j which best fit the model y_i = f(x_i) for i = 1, · · · , n by minimizing

J(β_0, · · · , β_k) = Σ_{i=1}^{n} ( Σ_{j=0}^{k} β_j w_j(x_i) − y_i )^2.

1. Show that one can write J(β) = ‖Mβ − y‖^2, where β = (β_0, · · · , β_k)^T, y = (y_1, · · · , y_n)^T, and M is a matrix to be defined.
2. Prove that the vector β of optimal coefficients satisfies M^T M β = M^T y.
3. Show that if M has full rank then the solution is unique.
4. We consider the case of polynomial regression: w_j(x) = x^j. Write a Matlab function function beta = PolynomialRegression(x,y,k) which computes this solution from input vectors of coordinates x, y and order k. Test this function for x=rand(1,10); y=5+12*x+randn(1,10); k=3; and plot on the same graph the points and the regression curve.
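A Python/NumPy sketch of question 4 (stand-in for the requested Matlab function; the name is mine), solving the normal equations M^T M β = M^T y in the least-squares sense:

```python
import numpy as np

def polynomial_regression(x, y, k):
    # For w_j(x) = x^j, the matrix of question 1 is the Vandermonde
    # matrix M[i, j] = x_i^j; lstsq solves M^T M beta = M^T y.
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    M = np.vander(x, k + 1, increasing=True)
    beta, *_ = np.linalg.lstsq(M, y, rcond=None)
    return beta
```

With at least k+1 distinct x_i, M has full rank and the returned β is the unique solution of question 3.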

Exercise 8 Obstacle avoidance We consider the following constrained optimization problem in R^n (with n a multiple of 3):

(P)  Minimize J(x) = x_1^2 + (x_n − 1)^2 + Σ_{i=2}^{n−1} (x_{i+1} − 2x_i + x_{i−1})^2,
     subject to x_1 = 0,  x_{n/3} ≥ 1/2,  x_{2n/3} ≤ 1/2,  x_n = 1.

This is a discrete model for the optimal trajectory of a vehicle which must go from position 0 at time 0 to position 1 at time 1, avoiding obstacles at times 1/3 and 2/3, and which must follow a smooth trajectory (J penalizes trajectories with sharp changes of direction).

1. Show that the set C of constraints is convex and that J is a positive definite quadratic form. Deduce that problem (P) has a unique solution. Explain why we added the extra term x_1^2 + (x_n − 1)^2 to the functional J even though it vanishes on C.
2. Compute the differential form dJ(x), and write a Matlab function function G = gradientJ(x) which computes this gradient from input vector x.
3. Write a Matlab function p = projectionC(x) which computes the projection p of input vector x onto the convex set C.
4. Write a Matlab function xm = minimise(x,N,lambda) which computes N steps of the projected gradient algorithm, with gradient stepsize λ. Test this program with n = 30, N = 5000, λ = 0.05. Plot the optimal trajectory at the end.
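A Python/NumPy sketch of questions 2–4 (stand-ins for the requested Matlab functions). It assumes the reading of the constraints given above, x_1 = 0, x_{n/3} ≥ 1/2, x_{2n/3} ≤ 1/2, x_n = 1; since these act on distinct coordinates, the projection on C reduces to componentwise clamping:

```python
import numpy as np

def gradientJ(x):
    # Gradient of J(x) = x_1^2 + (x_n - 1)^2
    #                  + sum_{i=2}^{n-1} (x_{i+1} - 2*x_i + x_{i-1})^2
    x = np.asarray(x, dtype=float).ravel()
    n = x.size
    G = np.zeros(n)
    G[0] += 2.0 * x[0]
    G[-1] += 2.0 * (x[-1] - 1.0)
    for i in range(1, n - 1):
        d = x[i + 1] - 2.0 * x[i] + x[i - 1]
        G[i - 1] += 2.0 * d
        G[i]     -= 4.0 * d
        G[i + 1] += 2.0 * d
    return G

def projectionC(x):
    # Componentwise projection: each constraint touches one coordinate
    p = np.asarray(x, dtype=float).ravel().copy()
    n = p.size
    p[0] = 0.0
    p[-1] = 1.0
    p[n // 3 - 1] = max(p[n // 3 - 1], 0.5)          # x_{n/3} >= 1/2
    p[2 * n // 3 - 1] = min(p[2 * n // 3 - 1], 0.5)  # x_{2n/3} <= 1/2
    return p

def minimise(x, N, lam):
    # N steps of projected gradient: x <- proj_C(x - lam * grad J(x))
    x = projectionC(x)
    for _ in range(N):
        x = projectionC(x - lam * gradientJ(x))
    return x
```

With n = 30 the obstacle constraints fall on coordinates 10 and 20 (1-based); the stepsize λ = 0.05 from the sheet is small enough for the iteration to converge, since the Hessian of J is bounded.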
