CS439 – Labs Solved
Optimization for Machine Learning
EPFL
Martin Jaggi & Nicolas Flammarion (github.com/epfml/OptML)

(Recap on convexity and gradient descent algorithms.)
The goal of this exercise session is to consolidate your understanding of general convex theory and of the gradient descent algorithms seen in class so far.
Gradient descent on a quadratic function. Consider the quadratic function f(x) = (1/2) xᵀAx + bᵀx + c, where A is a d × d symmetric matrix, b ∈ R^d and c ∈ R.
1. What are the minimal conditions on A, b and c that ensure that f is strictly convex? For the rest of the exercise we assume that these conditions are fulfilled.
2. Is f strongly convex?
3. Prove that f has a unique minimizer x∗ and give its closed-form expression.
4. Show that f can be rewritten as f(x) = f(x∗) + (1/2)(x − x∗)ᵀA(x − x∗).
5. From an initial point x_0 ∈ R^d, assume we run gradient descent with step-size γ > 0 on the function f. Show that the n-th iterate x_n satisfies x_n = x∗ + (I_d − γA)^n (x_0 − x∗), where I_d is the d × d identity matrix.
6. For which range of step-sizes γ do the iterates converge to x∗?
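The claims in questions 3, 5 and 6 can be checked numerically. Below is a minimal NumPy sketch; the specific matrix, seed, and step-size choice are illustrative assumptions, not part of the exercise statement.

```python
import numpy as np

# Illustrative setup (assumed): a random symmetric positive definite A,
# so f(x) = 1/2 x^T A x + b^T x + c is strictly (indeed strongly) convex.
rng = np.random.default_rng(0)
d = 3
M = rng.standard_normal((d, d))
A = M @ M.T + d * np.eye(d)      # symmetric, eigenvalues >= d > 0
b = rng.standard_normal(d)

grad = lambda x: A @ x + b       # gradient of f: Ax + b

x_star = -np.linalg.solve(A, b)  # closed-form minimizer (question 3)

lam_max = np.linalg.eigvalsh(A)[-1]
gamma = 1.0 / lam_max            # inside the convergent range (0, 2/lam_max)

x0 = rng.standard_normal(d)
x, n = x0.copy(), 200
for _ in range(n):
    x = x - gamma * grad(x)      # plain gradient descent

# Closed-form iterate of question 5: x_n = x* + (I - gamma*A)^n (x_0 - x*)
x_formula = x_star + np.linalg.matrix_power(np.eye(d) - gamma * A, n) @ (x0 - x_star)

print(np.allclose(x, x_formula))          # the two computations agree
print(np.linalg.norm(x - x_star) < 1e-8)  # iterates have converged to x*
```

Taking γ = 1/λ_max keeps every eigenvalue of I − γA in (0, 1), so the error (I − γA)^n (x_0 − x∗) contracts geometrically; any γ in (0, 2/λ_max) would work, which is the range asked for in question 6.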
