Grocery stores often offer buy-one-get-one-free sales. For boxed items, like cereal, there is not much mathematical content: you buy one box of cereal, and the next one (which is identical to the first) is free. However, for items that vary in weight and price, like watermelons, the situation is more interesting. In this case, buy-one-get-one-free really means that if you buy two watermelons, you pay for the more expensive one and get the cheaper one for free. So, to optimize the benefit of the sale, a frugal shopper will go through the entire bin of watermelons and find the two that are closest in price. (Confession: if there's not a crowd, I do this!)
In other words, let $X_1,X_2,\dots,X_n$ be a sequence of i.i.d. random variables with common mean $\mu$, variance $\sigma^2$, and distribution $F(\cdot)$. Let $Y_1\le Y_2\le \dots\le Y_n$ be the order statistics of $X_1,X_2,\dots,X_n$. Define the optimal benefit to be $$O_n=\min_{1\le m<n}\left(Y_{m+1}-Y_m\right).$$ My question is:
What is the distribution of $O_n$ in terms of the common distribution of the samples ($\mu$, $\sigma^2$, $F(\cdot)$) and the sample size $n$?
My suspicion is that this type of problem is well-known. Could someone provide a quick derivation or a reference?
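For concreteness, here is a minimal Monte Carlo sketch (Python/NumPy) of the quantity $O_n$ as defined above. The exponential price model, the scale of 5, and the choices of $n$ and the number of trials are arbitrary placeholders for illustration, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n "watermelon prices" drawn i.i.d. from an
# exponential distribution (an arbitrary choice, just for illustration).
n, trials = 20, 100_000
samples = rng.exponential(scale=5.0, size=(trials, n))

# O_n = smallest gap between consecutive order statistics of each sample.
spacings = np.diff(np.sort(samples, axis=1), axis=1)
o_n = spacings.min(axis=1)

print(f"Monte Carlo estimate of E[O_n] for n={n}: {o_n.mean():.4f}")
print(f"Empirical quartiles of O_n: {np.quantile(o_n, [0.25, 0.5, 0.75])}")
```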