Is it possible to minimize least absolute deviations (LAD) analytically?
Say given a sample $\{x_i\}_{i=1..n}$ find
$$\arg\min_\lambda{\sum_{i=1}^{n}{|x_i-\lambda|}}$$
The solution is any median: for odd $n$ it is the middle order statistic, and for even $n$ any value in the closed interval between the two middle order statistics minimizes the sum. As far as I am aware, there is no closed-form ("analytical") expression for the median in general.
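To see this concretely, here is a small sketch (sample values are made up for illustration) that evaluates the LAD objective on a grid and checks that the median attains the minimum:

```python
def lad(xs, lam):
    """Sum of absolute deviations of the sample xs from lambda."""
    return sum(abs(x - lam) for x in xs)

xs = [1, 2, 7, 9, 10]          # odd n, so the median is unique
med = sorted(xs)[len(xs) // 2]  # middle order statistic = 7

# Scan lambda over a grid and pick the minimizer.
candidates = [k / 10 for k in range(0, 121)]
best = min(candidates, key=lambda lam: lad(xs, lam))
# best coincides with the sample median
```

For even $n$, repeating this scan shows a flat stretch of minimizers between the two middle values, which is why "any median" is the right statement.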
A simple way to compute it is to sort the data, which takes $O(n \log n)$ time. There are more elaborate approaches using data structures that maintain order implicitly. Somewhat surprisingly, there is also a way to compute a median of an unordered list in linear time, i.e., time proportional to the length of the list (the median-of-medians selection algorithm; randomized quickselect achieves linear expected time).
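A minimal sketch of the selection approach, using randomized quickselect (linear expected time; the deterministic median-of-medians variant differs only in how the pivot is chosen):

```python
import random

def quickselect(a, k):
    """Return the k-th smallest element (0-indexed) of sequence a.
    Expected linear time via random pivoting."""
    a = list(a)
    while True:
        if len(a) == 1:
            return a[0]
        pivot = random.choice(a)
        lows = [x for x in a if x < pivot]
        pivots = [x for x in a if x == pivot]
        if k < len(lows):
            a = lows                      # k-th smallest is below the pivot
        elif k < len(lows) + len(pivots):
            return pivot                  # pivot itself is the answer
        else:
            k -= len(lows) + len(pivots)  # recurse into the upper part
            a = [x for x in a if x > pivot]

def median(a):
    """LAD minimizer: middle order statistic for odd n,
    midpoint of the two middle order statistics for even n."""
    n = len(a)
    if n % 2 == 1:
        return quickselect(a, n // 2)
    return 0.5 * (quickselect(a, n // 2 - 1) + quickselect(a, n // 2))
```

Note that for even $n$ the midpoint returned here is just one convenient choice; any point between the two middle order statistics minimizes the LAD objective.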