I would like to solve the support vector regression problem.
The formula for the optimization is the following:
$$a_1^*, a_2^* = \arg\max_{a_1, a_2}\sum_{i=1}^{n} (a_{1i}-a_{2i})y_{i} - \varepsilon\sum_{i=1}^{n}(a_{1i}+a_{2i}) - \frac{1}{2}\sum_{j=1}^{n}\sum_{i=1}^{n}(a_{1j}-a_{2j})(a_{1i}-a_{2i})\langle x_i,x_j\rangle$$
with $\langle x_i,x_j\rangle$ being the dot product of $x_i$ and $x_j$,
subject to the constraints $0\leq a_{1i}, a_{2i}\leq C$ and $\sum_{i=1}^{n}(a_{1i}-a_{2i})=0$.
How can I solve this problem using a quadratic programming solver? I would use cvxopt.solvers.qp for this, but the solver expects its input in the following form:
$$x^* = \arg\min_x \frac{1}{2}x^TPx+q^Tx \quad \text{subject to} \quad Gx \leq h \quad \text{and} \quad Ax=b$$
Is there a way to reformulate the problem, which depends on two variable vectors instead of one, into that form?
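
For what it's worth, here is how I would attempt the reformulation. The standard trick is to stack the two vectors into one variable $z = [a_1; a_2] \in \mathbb{R}^{2n}$: since $a_1 - a_2 = [I, -I]\,z$, the quadratic term becomes $z^T P z$ with $P$ built from the Gram matrix in blocks, and negating the objective turns the max into a min. A sketch (the toy data, `eps`, `C`, and all variable names are my own choices, and I assume a linear kernel):

```python
import numpy as np

# Hypothetical toy data: 1-D regression with n = 5 samples (illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 1))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=5)
n = len(y)

eps = 0.1   # the epsilon from the objective
C = 1.0     # box-constraint bound

# Gram matrix K[i, j] = <x_i, x_j> (linear kernel here).
K = X @ X.T

# Quadratic term: with z = [a1; a2], a1 - a2 = [I, -I] z, so
# (a1 - a2)^T K (a1 - a2) = z^T [[K, -K], [-K, K]] z.
P = np.block([[K, -K], [-K, K]])

# Linear term of the *minimization*: eps*1^T(a1 + a2) - y^T(a1 - a2)
# = (eps - y)^T a1 + (eps + y)^T a2.
q = np.concatenate([eps - y, eps + y])

# Box constraints 0 <= z <= C rewritten as G z <= h.
G = np.vstack([-np.eye(2 * n), np.eye(2 * n)])
h = np.concatenate([np.zeros(2 * n), C * np.ones(2 * n)])

# Equality constraint sum_i (a1_i - a2_i) = 0 as A z = b.
A = np.concatenate([np.ones(n), -np.ones(n)]).reshape(1, 2 * n)
b = np.zeros(1)

# To solve, convert to cvxopt matrices and call the solver:
# from cvxopt import matrix, solvers
# sol = solvers.qp(matrix(P), matrix(q), matrix(G), matrix(h),
#                  matrix(A), matrix(b))
# z = np.array(sol['x']).ravel()
# a1, a2 = z[:n], z[n:]
```

Note that $P$ is positive semidefinite but singular (rank $n$ at most), which cvxopt's QP solver accepts; after solving, the two original vectors are recovered by splitting $z$ back into its first and second halves.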
Thank you in advance!