UPDATED: I was forgetting this basic fact (that a sum of convex functions is itself convex), and therefore I thought I needed a global optimizer. I now see that this is not the case and my problem does indeed reduce to the classical lasso problem.
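For concreteness, here is a minimal sketch of that reduction, under the assumption that the $L_1$ term is used as a plain (unsquared) penalty with a tunable weight $\lambda$, as in the classical lasso; the weight, the function name `fista_lasso`, and the step-size/iteration choices are illustrative additions, not part of the original formula. The idea is that the two quadratic terms fold into a single least-squares term by stacking $\tilde A = [A; I]$ and $\tilde y = [y; x_{ref}]$, after which FISTA with soft-thresholding applies directly.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, y, x_ref, lam=1.0, n_iter=500):
    """Minimize ||Ax - y||^2 + ||x - x_ref||^2 + lam * ||x||_1 via FISTA.

    The two quadratic terms are folded into one least-squares term by
    stacking A with the identity and y with x_ref.
    """
    m = A.shape[1]
    A_tilde = np.vstack([A, np.eye(m)])
    y_tilde = np.concatenate([y, x_ref])

    # Lipschitz constant of the gradient of ||A_tilde x - y_tilde||^2
    L = 2.0 * np.linalg.norm(A_tilde, 2) ** 2

    x = np.zeros(m)
    z = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = 2.0 * A_tilde.T @ (A_tilde @ z - y_tilde)
        x_next = soft_threshold(z - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        z = x_next + (t - 1.0) / t_next * (x_next - x)
        x, t = x_next, t_next
    return x
```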
Original:
I have an objective function that looks like the following:
$L(x) = \|Ax-y\|_2^2 + \|x-x_{ref}\|_2^2 + \|x\|_1^2$
where $A$ is a matrix and $y, x_{ref}$ are vectors. I would ideally like to find the global minimum of this function, but I am unsure which methods are suitable for this kind of problem. Having searched this site and done some reading, I am aware of ISTA methods and in particular FISTA.
As far as I can see, FISTA and similar algorithms are not really applicable in my case, since they assume a smooth convex function in combination with a nonsmooth convex function, which is not exactly what I have given the extra convex term. Besides, ISTA methods are local optimization algorithms, and I believe I need a global optimization algorithm (though perhaps I could do it with an ensemble of local optimizations).
I am unsure which algorithms are suitable for solving such an optimization problem; the $L_1$ norm in particular makes me uncertain, since quite a few algorithms assume differentiability. On the other hand, I know there are also global optimization algorithms, such as evolution-based ones, that make no such assumptions, but I believe those should not be necessary since my function is still relatively well behaved.
If size is a consideration, the problem I am trying to solve is relatively small:
- $n \sim 200$, $m \sim 100$
- $A$: shape $[n, m]$
- $x, x_{ref}$: shape $[m]$
- $y$: shape $[n]$
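As a rough usage illustration at exactly this scale (relying on the hypothetical `fista_lasso` sketch from the update above, with random data and an arbitrary $\lambda$), the solve should run essentially instantly:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 100
A = rng.standard_normal((n, m))
y = rng.standard_normal(n)
x_ref = rng.standard_normal(m)

x_hat = fista_lasso(A, y, x_ref, lam=0.1)  # lam=0.1 is an illustrative choice
print(x_hat.shape)  # (100,)
```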
Edit: updated the formula to reflect the intended $L_2$ norm