I need help solving the following convex optimization problem: \begin{equation} \mathbf{X} = \operatorname*{arg\,min}_{\mathbf{X}} \left\{ \|\mathbf{X} - \mathbf{L}\|_F^2 \,+\, \lambda \,\|\mathbf{X} - \mathbf{F}\|_1 \right\} ~~~~~\color{red}{(1)} \end{equation}
where $\mathbf{X} \in\mathbb{R}^{n \times p}$, $\mathbf{L} \in \mathbb{R}^{n \times p}$, and $\mathbf{F} \in \mathbb{R}^{n \times p}$, $\|\cdot\|_F$ denotes the Frobenius norm, and $\|\cdot\|_1$ is the entrywise $L_1$-norm.
$\bf{Before~you~answer}:~$ Since the objective is fully separable across entries, my first thought is to recast problem (1) as $n \times p$ independent scalar problems: \begin{equation} X_{i,j} = \operatorname*{arg\,min}_{X_{i,j}} \left\{ (X_{i,j} - L_{i,j})^2 + \lambda |X_{i,j} - F_{i,j}|\right\} ~~~\color{red} {(2)} \end{equation} where $i = 1, \cdots, n$ and $j = 1, \cdots, p$.
In this case, differentiating (2) w.r.t. $X_{i,j}$ (using the subgradient $\pm\lambda$ of the absolute value where $X_{i,j} \neq F_{i,j}$) gives the optimality condition
$2 (X_{i,j} - L_{i,j})~\pm~\lambda = 0$, with the remaining case $X_{i,j} = F_{i,j}$ handled by the subdifferential at zero. Hence we should obtain a soft-thresholding solution for $X_{i,j}$, shifted by $F_{i,j}$: $X_{i,j} = F_{i,j} + \operatorname{soft}(L_{i,j} - F_{i,j},\, \lambda/2)$, where $\operatorname{soft}(d, t) = \operatorname{sign}(d)\max(|d| - t, 0)$.
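To sanity-check this candidate solution numerically, here is a small NumPy sketch (the function name and test matrices are my own, chosen for illustration): it applies the shifted soft threshold $X = F + \operatorname{soft}(L - F, \lambda/2)$ entrywise and compares against a brute-force 1-D grid search on each entry of (2).

```python
import numpy as np

def solve_shifted_soft_threshold(L, F, lam):
    """Candidate closed-form solution of (1): X = F + soft(L - F, lam/2),
    where soft(d, t) = sign(d) * max(|d| - t, 0), applied entrywise."""
    D = L - F
    return F + np.sign(D) * np.maximum(np.abs(D) - lam / 2.0, 0.0)

def objective(X, L, F, lam):
    """Objective of problem (1)."""
    return np.sum((X - L) ** 2) + lam * np.sum(np.abs(X - F))

# Small example data (arbitrary, just for the check)
L = np.array([[1.0, -0.5], [2.0, 0.3]])
F = np.array([[0.2, 0.1], [1.5, 0.3]])
lam = 0.8

X = solve_shifted_soft_threshold(L, F, lam)

# Brute-force check: minimize each scalar problem (2) on a fine grid
grid = np.linspace(-3.0, 3.0, 60001)  # step 1e-4
X_grid = np.empty_like(L)
for i in range(L.shape[0]):
    for j in range(L.shape[1]):
        vals = (grid - L[i, j]) ** 2 + lam * np.abs(grid - F[i, j])
        X_grid[i, j] = grid[np.argmin(vals)]

print(np.max(np.abs(X - X_grid)))  # agreement up to grid resolution
```

If the soft-thresholding formula is right, the two solutions should match up to the grid spacing, which is what I observe on this toy example.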
Is this the correct way to solve problem (1)?