Consider a linear operator $A: \mathbb{R}^{m} \to S^{n \times n}$, where $S^{n\times n}$ is the set of symmetric $n \times n$ matrices.
Can we turn the problem of determining whether there exists $x \in \mathbb{R}^{m}$ such that $A(x) \succ 0$ into a least-squares problem?
A least-squares problem is of the form
$$ \min_{y \in K} \|C(y) - b\|^2 $$
where the existence question is decided by checking whether the optimal value is zero. Here $K$ is a set, $C$ is another linear operator, and $b$ is a column vector. Either convention is acceptable: an optimal value of $0$ may certify either existence or non-existence.
It would be even better if $K$ could be taken to be a set of positive semidefinite matrices; I don't care about the dimension.
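As a concrete instance of a least-squares problem over a PSD set (the special case where $C$ is the identity, $K$ is the PSD cone, and $b$ is a given symmetric matrix), the nearest-PSD projection has a closed-form solution by eigenvalue clipping. A minimal sketch; the matrix `B` is made-up illustrative data, not from the question:

```python
import numpy as np

def nearest_psd(B):
    """Solve min_{Y PSD} ||Y - B||_F^2 for symmetric B by clipping
    the negative eigenvalues of B to zero."""
    w, V = np.linalg.eigh(B)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T

# Illustrative data: eigenvalues of B are +sqrt(5) and -sqrt(5).
B = np.array([[1.0, 2.0], [2.0, -1.0]])
Y = nearest_psd(B)
# The optimal value ||Y - B||_F^2 equals the sum of squared
# negative eigenvalues of B, here (-sqrt(5))^2 = 5.
```

The point is that a quadratic objective minimized over the PSD cone is exactly the shape of problem I am asking for, just with a trivial $C$.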
Below are my thoughts, which are not necessary for the formulation of the problem.
If we replace $\mathbb{R}^{m}$ by the cone of positive semidefinite matrices $\{X \in S^{n \times n} \mid X \succeq 0\}$, I have come up with a solution by solving the dual problem of separating the sets $\{ S \mid S \succ 0 \}$ and $\{ A(x) \mid x \succeq 0\}$. In this case, the two sets are separable iff no such $x$ exists. I am able to turn this into a least-squares problem, but not in the general case. This existence problem is also known as an LMI problem (LMIP).
Also, as I mentioned in the comments, I think this is just an SDP feasibility problem. I am able to show that if it is feasible, then the least-squares optimum is zero, but not the converse, due to the constraint qualification required for strong duality.
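For what it's worth, strict feasibility of $A(x) \succ 0$ can at least be checked numerically: since $A$ is linear, $A(tx) = tA(x)$, so it suffices to ask whether $\lambda_{\min}(A(x)) > 0$ for some $x$ on the unit sphere. A brute-force sketch for a made-up example with $m = 2$, $n = 2$ (the matrices `A1`, `A2` are assumptions, not from the question):

```python
import numpy as np

# Hypothetical data: A(x) = x[0]*A1 + x[1]*A2. With A1 = I the LMI
# A(x) > 0 is clearly strictly feasible (take x = (1, 0)).
A1 = np.eye(2)
A2 = np.array([[0.0, 1.0], [1.0, 0.0]])

def min_eig(x):
    """Smallest eigenvalue of A(x) = x[0]*A1 + x[1]*A2."""
    return np.linalg.eigvalsh(x[0] * A1 + x[1] * A2)[0]

# A is homogeneous in x, so scan directions on the unit circle:
# the LMI is strictly feasible iff max lambda_min(A(x)) > 0.
thetas = np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False)
best = max(min_eig(np.array([np.cos(t), np.sin(t)])) for t in thetas)
feasible = best > 1e-8  # True here, witnessed by x = (1, 0)
```

This grid scan is only viable for tiny $m$; it is meant to illustrate the decision problem, not to replace the least-squares reformulation I am asking about.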