SyTen

◆ gradient_descent_linesearch()

template<typename X , typename FunctionEval , typename GradientEval , typename LinesearchEval , typename RandomStep , typename Terminator >
auto syten::GradientOpt::gradient_descent_linesearch ( X  x,
FunctionEval  func,
GradientEval  grad,
LinesearchEval  lin,
RandomStep  rand,
Terminator  terminate,
Index  num_steps = 100,
Index  ls_num_steps = 50,
SRDef  ls_lower = 0,
SRDef  ls_upper = 1 
)

Attempt at a gradient descent with linesearch.

Works very badly.

Template Parameters
X  domain type
FunctionEval  type of the function evaluation function, taking an X
GradientEval  type of the gradient evaluation function, taking the result of the function and its X
LinesearchEval  type of the linesearch evaluation function, taking a step size a, the current iteration point x and the gradient g, and returning the value of f(x-a*g) and f'(x-a*g) (w.r.t. a)
RandomStep  type of the random stepper to be enacted if nothing else works; takes the step size (either 0 or NaN), the current point and the gradient
Terminator  type of the termination condition, returning true if we should break out; takes the step size, the current function value and the current gradient
Parameters
x  initial value
func  object returning f(x) when given x
grad  object returning f'(x) when given f(x) and x
lin  object returning f(x-a*g) and f'(x-a*g) (w.r.t. a) when given a, x and g
rand  object returning an updated random x when given x and g
terminate  object returning true if, given a step size alpha, the current function value and the current gradient, we should terminate early
num_steps  maximal number of outer steps to take
ls_num_steps  number of steps to be done during the linesearch
ls_lower  lower end of the linesearch interval
ls_upper  upper end of the linesearch interval
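A minimal usage sketch, not taken from the library documentation: it assumes X = double, that SRDef is a floating-point type and Index an integer type, that the functors can be supplied as lambdas, and that the linesearch evaluator returns its two values as a tuple (consistent with the std::make_tuple reference below). The quadratic objective f(x) = (x - 3)^2 and all lambdas are illustrative placeholders, and the return value of the call is not specified on this page.

#include <cmath>
#include <random>
#include <tuple>
// plus the SyTen header that declares syten::GradientOpt::gradient_descent_linesearch

int main() {
    // Illustrative objective: f(x) = (x - 3)^2 with X = double.
    double x0 = 10.0;

    auto func = [](double x) { return (x - 3.0) * (x - 3.0); };          // f(x)
    auto grad = [](double /*fx*/, double x) { return 2.0 * (x - 3.0); }; // f'(x), given f(x) and x
    auto lin = [](double a, double x, double g) {
        // f(x - a*g) and its derivative with respect to a
        double y = x - a * g;
        return std::make_tuple((y - 3.0) * (y - 3.0), -g * 2.0 * (y - 3.0));
    };
    auto rnd = [](double a, double x, double /*g*/) {
        // fallback random perturbation when the linesearch fails (a is 0 or NaN)
        static std::mt19937 rng{42};
        std::uniform_real_distribution<double> dist(-1.0, 1.0);
        return x + (std::isnan(a) ? 1.0 : a + 1.0) * dist(rng);
    };
    auto terminate = [](double /*a*/, double /*fval*/, double g) {
        // stop early once the gradient is essentially zero
        return std::abs(g) < 1e-8;
    };

    auto result = syten::GradientOpt::gradient_descent_linesearch(
        x0, func, grad, lin, rnd, terminate,
        /*num_steps=*/100, /*ls_num_steps=*/50,
        /*ls_lower=*/0.0, /*ls_upper=*/1.0);
    (void)result; // the precise return type/contents are not documented on this page
}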

References syten::isnan(), linesearch(), std::make_tuple(), and std::move().
