• Call a function with the name GradDescent that implements the gradient descent algorithm for minimizing the aforementioned function. The calling syntax of this function should be:
“[MinPoints, xVals] = GradDescent(fun, x, lr, maxIterNum, tol)”
fun is the function handle “@funToMin”, and x is a row vector with the user’s guess (starting value) about the minimum of the function. The input argument “lr” is the learning parameter α, which should be set to 0.1 if not specified by the user. “maxIterNum” is, as before, the maximum number of iterations, which should be set to 100 if not specified, and “tol” is the tolerance on the norm of the updating component of the point x (stopping criterion), which should be set to 1e-6 if not specified. The function returns “MinPoints”, which, if the algorithm has converged, is the point where the function is minimized, and “xVals”, which is an m-by-2 array with the values of x at each iteration of the algorithm (m is either equal to “maxIterNum” or a smaller number if the algorithm converges).
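A minimal sketch of what GradDescent could look like, assuming the standard update x ← x − lr·∇f(x) and using nargin to supply the defaults (the internal structure is not prescribed above, and leaving MinPoints empty on non-convergence is an assumption):

function [MinPoints, xVals] = GradDescent(fun, x, lr, maxIterNum, tol)
% Sketch: gradient descent with the defaults specified above.
if nargin < 3, lr = 0.1; end
if nargin < 4, maxIterNum = 100; end
if nargin < 5, tol = 1e-6; end
xVals = zeros(maxIterNum, 2);        % one row of x per iteration (m-by-2)
MinPoints = [];                      % left empty if the algorithm does not converge (assumed behavior)
for k = 1:maxIterNum
    xVals(k, :) = x;
    step = lr * NumerDers(fun, x);   % updating component of the point x
    x = x - step;                    % move against the gradient
    if norm(step) < tol              % stopping criterion on the update norm
        MinPoints = x;
        xVals = xVals(1:k, :);       % trim to the iterations actually run
        break
    end
end
end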
This function should call another function with the name NumerDers, with the following calling syntax:
“GradsVals = NumerDers(fun, x)”
NumerDers takes as input a function and a single coordinate point and returns the gradient vector at that point. Use the two-sided finite difference method explained before. For the above, use as initial point: x0 = [0 0].
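A sketch of NumerDers using the two-sided (central) finite difference ∂f/∂xi ≈ (f(x + h·ei) − f(x − h·ei)) / (2h); the step size h = 1e-6 is an assumed value, not one given above:

function GradsVals = NumerDers(fun, x)
% Sketch: numerical gradient of fun at the row-vector point x.
h = 1e-6;                 % finite-difference step (assumed value)
n = numel(x);
GradsVals = zeros(1, n);
for i = 1:n
    e = zeros(1, n);
    e(i) = h;             % perturb only the i-th coordinate
    GradsVals(i) = (fun(x + e) - fun(x - e)) / (2 * h);   % central difference
end
end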
After you have called the function, plot on the contour figure the algorithm’s trajectory to the minimum found in “xVals”. Call the function once more with x0 = [-1 -2] and add the new trajectory to the contour figure. Use “lr” = 0.25 and “maxIterNum” = 200. Use gtext to enter a text that indicates each trajectory.
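For instance, the two runs and the trajectory plots could be driven as follows (this assumes the contour figure of funToMin from the earlier part is already open; the line styles and label texts are illustrative choices):

[MinPoints, xVals] = GradDescent(@funToMin, [0 0]);     % defaults: lr = 0.1, maxIterNum = 100
hold on
plot(xVals(:, 1), xVals(:, 2), 'r.-')                   % first trajectory on the contour figure
[MinPoints2, xVals2] = GradDescent(@funToMin, [-1 -2], 0.25, 200);
plot(xVals2(:, 1), xVals2(:, 2), 'b.-')                 % second trajectory
gtext('trajectory from x0 = [0 0]')                     % click on the figure to place each label
gtext('trajectory from x0 = [-1 -2]')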