
I am at a point $X^0$ in $n$-dimensional space, and it costs me to move to a new point $X$, with the cost varying by direction: $\sum_i c_i |X_i - X^0_i|$.

But it's good for me to move closer to a target point $X^*$, because I'm penalized quadratically for distance from the target: $d \sum_i |X_i - X^*_i|^2$.

I know the absolute values make this problematic (the objective is non-smooth), but is there a way to globally minimize the total cost as a function of $X$, ideally algorithmically?

$$\mathrm{Cost} = \sum_i c_i |X_i - X^0_i| + d \sum_i |X_i - X^*_i|^2$$
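(One observation, in case it helps frame the question: the cost is convex and separable across coordinates, so each coordinate can be minimized independently. Minimizing $c\,|x - a| + d\,(x - b)^2$ in one variable has a closed-form solution via soft-thresholding, the proximal operator of the absolute value. A minimal sketch, assuming $c_i \ge 0$ and $d > 0$; the function names are mine, not from any library:)

```python
def soft_threshold(t, tau):
    """Shrink t toward zero by tau: sign(t) * max(|t| - tau, 0)."""
    if t > tau:
        return t - tau
    if t < -tau:
        return t + tau
    return 0.0

def minimize_cost(x0, xstar, c, d):
    """Per-coordinate global minimizer of c_i|x - x0_i| + d(x - x*_i)^2.

    Setting the subgradient to zero gives
    x_i = x0_i + soft_threshold(x*_i - x0_i, c_i / (2d)).
    """
    return [a + soft_threshold(b - a, ci / (2.0 * d))
            for a, b, ci in zip(x0, xstar, c)]

# Example: with c = [1, 1], d = 1, the threshold is 0.5, so the first
# coordinate moves partway to the target and the second stays put.
print(minimize_cost([0.0, 0.0], [1.0, 0.1], [1.0, 1.0], 1.0))  # [0.5, 0.0]
```

Intuitively, a coordinate only moves when the quadratic pull toward $X^*_i$ exceeds the per-unit movement cost $c_i$; otherwise it stays at $X^0_i$.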

rhaskett
