I have a school exercise where I am supposed to implement Newton's method with and without a backtracking line search. The first one converges quite rapidly but, when I add the backtracking line search, it does not... Here's what I did; could you help me find out what I am doing wrong? (I have been working on it for days now...) Thanks. (PS: the def part did not format well here... sorry for that.)

def Newton(x0, step0):
    nbiter = 0
    eps = 1e-3
    x = x0
    step = step0
    while nbiter < 10000 and npl.norm(gf(x)) >= eps:
        d = -step * npl.solve(Hf(x), gf(x))
        x = x + d
        nbiter += 1
    return x, nbiter

def NewtonB(x0, step0):
    nbiter = 0
    eps = 1e-3
    x = x0
    step = step0
    while nbiter < 10000 and npl.norm(gf(x)) >= eps:
        step = step * 3
        while f(x - step * gf(x)) > f(x) - step / 2 * npl.norm(gf(x)) ** 2:
            step = step / 2
        d = step * npl.solve(Hf(x), -gf(x))
        x = x + d
        nbiter += 1
    return x, nbiter

Here are my results:

Newton algorithm

Nb of iterations: 157, Minimum: f([1.00007221 1.0001425 ])=5.584711920189459e-09

Newton algorithm with the Backtracking line search

Nb of iterations: 10000, Minimum: f([1.02398165 1.04787271])=0.0006194371664257272

Elle

1 Answer


Could you specify which line-search technique you are trying to implement? Here you check the condition $$f(x-\gamma\nabla f(x))\leq f(x) - \frac{\gamma}{2} \Vert\nabla f(x)\Vert^2,$$ which means that you want the step size $\gamma$ to ensure a sufficient decrease after one iteration of gradient descent, whereas you then use this step size for Newton's method. I would instead test the decrease along the Newton direction, i.e. $$f\left(x-\gamma\nabla^2f(x)^{-1}\nabla f(x)\right) \leq f(x) - \frac{\gamma}{2} \Vert\nabla f(x)\Vert^2,$$ where $\nabla^2f(x)$ is the Hessian matrix of $f$.
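
For concreteness, here is a minimal sketch of NewtonB with only that change, assuming (as in your code) that f, gf and Hf return the objective, its gradient and its Hessian, and that npl stands for numpy.linalg:

import numpy.linalg as npl

def NewtonB(x0, step0):
    nbiter = 0
    eps = 1e-3
    x = x0
    step = step0
    while nbiter < 10000 and npl.norm(gf(x)) >= eps:
        d = npl.solve(Hf(x), -gf(x))  # Newton direction
        step = step * 3               # let the step grow back before backtracking
        # backtrack until the sufficient-decrease test holds along the Newton direction
        while f(x + step * d) > f(x) - step / 2 * npl.norm(gf(x)) ** 2:
            step = step / 2
        x = x + step * d
        nbiter += 1
    return x, nbiter

With this change the sufficient-decrease test is evaluated along the same direction that is actually taken, so the backtracking loop should no longer fight the fast local convergence of Newton's method.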

Kas
    I did not know that there were two different conditions (one for gradient descent and one for Newton's method). I am going to try yours – Elle Feb 13 '23 at 15:54