The idea is to bring the problem into the form:
$$\begin{aligned}
\arg \min_{ \boldsymbol{s} } \quad & \frac{1}{2} {\left\| K \boldsymbol{s} - \boldsymbol{m} \right\|}_{2}^{2} + \frac{\lambda}{2} {\left\| \boldsymbol{s} \right\|}_{2}^{2} \\
\text{subject to} \quad & A \boldsymbol{s} = \boldsymbol{u} \\
\quad & B \boldsymbol{s} = \boldsymbol{v}
\end{aligned}$$
Using the Kronecker Product we can see that:
- $ K = {K}_{1} \otimes {K}_{2} $.
- $ \boldsymbol{s} = \operatorname{vec} \left( S \right) $ where $ \operatorname{vec} \left( \cdot \right) $ is the vectorization operator.
- $ \boldsymbol{m} = \operatorname{vec} \left( M \right) $.
The matrices $ A $ and $ B $ are just Selector matrices which pick the corresponding elements of $ \boldsymbol{s} $.
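To make the construction concrete, here is a minimal NumPy sketch. The sizes, the random data, and the choice of $ A $ / $ B $ as row-sum and column-sum selectors are purely illustrative assumptions on my side:

```python
import numpy as np

# Illustrative sizes and random data (assumptions, not from the question).
m, n = 4, 5
rng = np.random.default_rng(0)
K1 = rng.standard_normal((n, n))  # acts along the columns of S
K2 = rng.standard_normal((m, m))  # acts along the rows of S
S  = rng.standard_normal((m, n))
M  = K2 @ S @ K1.T                # so that vec(M) = (K1 ⊗ K2) vec(S)

K  = np.kron(K1, K2)              # K = K1 ⊗ K2
s  = S.reshape(-1, order="F")     # vec(S): stack the columns
mm = M.reshape(-1, order="F")     # vec(M)
assert np.allclose(K @ s, mm)     # Kronecker identity (A ⊗ B) vec(X) = vec(B X Aᵀ)

# Assumed selectors: A extracts the row sums of S, B the column sums.
A = np.kron(np.ones((1, n)), np.eye(m))  # A vec(S) = S 1   (row sums)
B = np.kron(np.eye(n), np.ones((1, m)))  # B vec(S) = Sᵀ 1  (column sums)
assert np.allclose(A @ s, S.sum(axis=1))
assert np.allclose(B @ s, S.sum(axis=0))
```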
**Remark**
Pay attention that if $ A $ and $ B $ each select every element of $ \boldsymbol{s} $ exactly once, then $ \sum_{i} {u}_{i} = \sum_{i} {v}_{i} $ must hold, since both sides represent the sum of $ \boldsymbol{s} $. Namely $ \boldsymbol{1}^{T} A \boldsymbol{s} = \boldsymbol{1}^{T} B \boldsymbol{s} = \sum_{i} {s}_{i} $. This is the case for your constraints, so the condition must hold in order for a feasible solution to exist.
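Continuing the sketch above (still under the assumed row-sum / column-sum selectors), this consistency condition is easy to verify numerically:

```python
u = A @ s  # hypothetical right-hand sides, generated from the same S
v = B @ s
assert np.allclose(np.ones(m) @ A, np.ones(m * n))  # every element selected exactly once
assert np.allclose(np.ones(n) @ B, np.ones(m * n))
assert np.isclose(u.sum(), v.sum())                 # hence the two sums must agree
```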
Now the above is a basic Convex problem which can be solved by Projected Gradient Descent, where we project onto the intersection of the two equality constraints.
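For reference, the gradient of the smooth objective is
$$ \nabla f \left( \boldsymbol{s} \right) = {K}^{T} \left( K \boldsymbol{s} - \boldsymbol{m} \right) + \lambda \boldsymbol{s}, $$
so each iteration takes a gradient step and then projects the result back onto the feasible affine set.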
You could even do something simpler by concatenating the matrices and vectors:
$$ C \boldsymbol{s} = \begin{bmatrix} A \\ B \end{bmatrix} \boldsymbol{s} = \boldsymbol{w} = \begin{bmatrix} \boldsymbol{u} \\ \boldsymbol{v} \end{bmatrix} $$
Then it is very similar to Linear Least Squares with Equality Constraints.
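Since the feasible set $ \left\{ \boldsymbol{s} \mid C \boldsymbol{s} = \boldsymbol{w} \right\} $ is affine, the projection has the closed form $ \boldsymbol{x} - {C}^{T} {\left( C {C}^{T} \right)}^{\dagger} \left( C \boldsymbol{x} - \boldsymbol{w} \right) $, where the pseudo inverse is needed because, by the remark above, the rows of $ C $ are linearly dependent. A minimal Projected Gradient Descent sketch, reusing the hypothetical `K`, `A`, `B`, `u`, `v` from the snippets above:

```python
def project_affine(x, C, w, CCt_pinv):
    # Orthogonal projection onto the affine set { s : C s = w }.
    return x - C.T @ (CCt_pinv @ (C @ x - w))

def pgd(K, m_vec, C, w, lam, num_iter=500):
    # Projected Gradient Descent for
    # min_s 0.5 ||K s - m||_2^2 + 0.5 lam ||s||_2^2  subject to  C s = w.
    CCt_pinv = np.linalg.pinv(C @ C.T)              # pinv, since C Cᵀ is singular here
    step = 1.0 / (np.linalg.norm(K, 2) ** 2 + lam)  # 1 / Lipschitz constant of the gradient
    x = project_affine(np.zeros(K.shape[1]), C, w, CCt_pinv)  # feasible starting point
    for _ in range(num_iter):
        grad = K.T @ (K @ x - m_vec) + lam * x
        x = project_affine(x - step * grad, C, w, CCt_pinv)
    return x

C = np.vstack([A, B])
w = np.concatenate([u, v])
s_hat = pgd(K, mm, C, w, lam=0.1)
assert np.allclose(C @ s_hat, w)  # the iterates stay feasible after each projection
```

Since every iterate is projected back onto the affine set, the returned point satisfies both constraints up to numerical precision.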
An interesting resource in that regard is Robert M. Freund - Projection Methods for Linear Equality Constrained Problems.