The problem is given by:
$$\begin{aligned}
\arg \min_{x} \quad & \frac{1}{2} {\left\| A x - b \right\|}_{2}^{2} + \lambda {\left\| C x \right\|}_{1} \\
\text{subject to} \quad & D x = 0 \\
& x \geq 0
\end{aligned}$$
Probably the fastest solver for this kind of problem is some adaptation of the ADMM algorithm.
Yet a simple solver can be built with the Projected Sub Gradient Method.
In this case we need a projection onto the convex set $ \mathcal{D} = \left\{ x \mid D x = 0, \; x \geq 0 \right\} $. Since the set $ \left\{ x \mid x \geq 0 \right\} $ isn't a subspace, we can't use the alternating projection method (see Projections onto Convex Sets), so we'll use Dykstra's Projection Algorithm from Orthogonal Projection onto the Intersection of Convex Sets.
The projections onto each set are given by (you may have a look at Projection of $ z $ onto the Affine Half Space $ \left\{ x \mid A x = b, \; x \geq 0 \right\} $):
- $ \operatorname{Proj}_{ \left\{ x \mid D x = 0 \right\} } \left( y \right) = y - {D}^{T} {\left( D {D}^{T} \right)}^{-1} \left( D y \right) $.
- $ \operatorname{Proj}_{ \left\{ x \mid x \geq 0 \right\} } \left( y \right) = \max \left( y, 0 \right) $.
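The two projections above can be combined by Dykstra's algorithm. A minimal NumPy sketch (Python rather than the MATLAB of the linked repository; the fixed `num_iter` is an illustrative stopping rule, a residual based test would be better in practice):

```python
import numpy as np

def proj_affine(D, y):
    # Projection onto {x | D x = 0}: y - D^T (D D^T)^{-1} D y
    return y - D.T @ np.linalg.solve(D @ D.T, D @ y)

def proj_nonneg(y):
    # Projection onto the non negative orthant {x | x >= 0}
    return np.maximum(y, 0.0)

def proj_dykstra(D, y, num_iter=100):
    # Dykstra's algorithm: alternate the two projections while carrying
    # correction terms, so the iterates converge to the projection onto the
    # intersection (plain alternating projections would only yield some
    # point in the intersection, not the nearest one).
    x = y.copy()
    p = np.zeros_like(y)  # correction for the affine set
    q = np.zeros_like(y)  # correction for the non negative orthant
    for _ in range(num_iter):
        z = proj_affine(D, x + p)
        p = x + p - z
        x = proj_nonneg(z + q)
        q = z + q - x
    return x
```

For instance, with $ D = \begin{bmatrix} 1 & -1 \end{bmatrix} $ the set is $ \left\{ x \mid {x}_{1} = {x}_{2}, \; x \geq 0 \right\} $, and projecting $ y = \left( 3, 1 \right) $ gives $ \left( 2, 2 \right) $.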
A sub gradient of the objective function is given by $ {A}^{T} \left( A x - b \right) + \lambda {C}^{T} \operatorname{sign} \left( C x \right) $ (the $ \operatorname{sign} $ acts on $ C x $, since the ${L}_{1}$ term is $ {\left\| C x \right\|}_{1} $).
Plug all of this into a simple projected sub gradient iteration and you have a solver.
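The full iteration can be sketched as follows (a NumPy sketch, not the linked MATLAB code; the step size rule $ 1 / \sqrt{k} $ and the iteration counts are illustrative assumptions):

```python
import numpy as np

def proj_dykstra(D, y, num_iter=100):
    # Dykstra's algorithm for the projection onto {x | D x = 0, x >= 0}
    x, p, q = y.copy(), np.zeros_like(y), np.zeros_like(y)
    for _ in range(num_iter):
        z = (x + p) - D.T @ np.linalg.solve(D @ D.T, D @ (x + p))  # onto D x = 0
        p = x + p - z
        x = np.maximum(z + q, 0.0)                                 # onto x >= 0
        q = z + q - x
    return x

def solve_psg(A, b, C, D, lam, num_iter=500):
    # Projected sub gradient: take a sub gradient step on the objective,
    # then project back onto the feasible set with Dykstra's algorithm.
    f = lambda v: 0.5 * np.sum((A @ v - b) ** 2) + lam * np.sum(np.abs(C @ v))
    x = np.zeros(A.shape[1])            # feasible start: D 0 = 0 and 0 >= 0
    best_x, best_f = x.copy(), f(x)
    for k in range(1, num_iter + 1):
        g = A.T @ (A @ x - b) + lam * C.T @ np.sign(C @ x)  # sub gradient
        x = proj_dykstra(D, x - g / np.sqrt(k))             # diminishing step
        if f(x) < best_f:                   # the method is not monotone,
            best_x, best_f = x.copy(), f(x) # so track the best iterate seen
    return best_x
```

Tracking the best iterate matters because the sub gradient method does not decrease the objective at every step.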
The MATLAB code is available at my StackExchange Mathematics Q3892375 GitHub Repository.
P. S.
When all matrices are generated randomly, the solution usually turns out to be the zero vector.