Morgan math

Constrained Linear Regression

If you want to do linear regression with linear constraints on the coefficients ("w[0] > w[1]", etc.), you can phrase it as a quadratic program:
minimize $$x^T Q x + c^T x$$
subject to $$Ax \leq b$$
To see this, expand the loss for least squares linear regression (using the fact that $w^T X^T y = y^T X w$, since both are scalars): $$(X w - y)^T (X w - y)$$ $$= (w^T X^T - y^T) (X w - y)$$ $$= w^T X^T X w - 2 y^T X w + y^T y$$

Since $y^T y$ doesn't change the optimum we can drop it.

$$w^T X^T X w - 2 y^T X w$$

Now let:

$$x = w$$ $$c = -2 X^T y$$ $$Q = X^T X$$
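As a quick sanity check on this substitution, we can verify numerically that $w^T Q w + c^T w + y^T y$ equals the original least-squares loss (a minimal sketch; the random data here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 50, 3
X = rng.normal(size=(N, d))
y = rng.normal(size=N)
w = rng.normal(size=d)  # an arbitrary candidate coefficient vector

# The QP ingredients from the substitution above.
Q = X.T @ X
c = -2 * X.T @ y

# QP objective plus the dropped constant y^T y ...
qp_loss = w @ Q @ w + c @ w + y @ y
# ... should equal the original squared-error loss.
ls_loss = np.sum((X @ w - y) ** 2)
assert np.isclose(qp_loss, ls_loss)
```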

And this is simply the quadratic programming objective, so we can use any off-the-shelf QP solver. Moreover, the matrices we feed to the solver are small: $Q$ is $d \times d$ and $c$ is $d$-dimensional, so their sizes are independent of $N$. For problems where the dimensionality is small, this should be no slower than ordinary linear regression.
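Putting it all together, here is a minimal sketch of a constrained fit. It uses `scipy.optimize.minimize` (SLSQP) as a stand-in for a dedicated QP solver, and the example constraint is $w_0 \geq w_1$; the data and true coefficients are made up for illustration:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, d = 200, 3
X = rng.normal(size=(N, d))
# True coefficients chosen so the unconstrained fit violates w[0] >= w[1].
w_true = np.array([0.5, 1.5, -0.2])
y = X @ w_true + 0.01 * rng.normal(size=N)

# QP ingredients: Q = X^T X, c = -2 X^T y (both independent of N in size).
Q = X.T @ X
c = -2 * X.T @ y

def objective(w):
    return w @ Q @ w + c @ w

# Linear constraint w[0] - w[1] >= 0, i.e. w[0] >= w[1].
constraints = [{"type": "ineq", "fun": lambda w: w[0] - w[1]}]

res = minimize(objective, np.zeros(d), constraints=constraints)
w_hat = res.x  # constrained solution; expect w_hat[0] ~= w_hat[1] here
```

Because the unconstrained optimum has $w_0 < w_1$, the solver pushes the two coefficients together until the constraint is active.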