
Let $B_t$, $t\ge0$, be a standard Brownian motion and suppose $0<x_1<x_2<\cdots<x_n<1$. Then the conditional expectation $$ \mathbb E\left(\int_0^1 B_t\,dt \,\middle\vert\, B_0, B_{x_1},B_{x_2},\ldots,B_{x_n},B_1 \right) $$ is just the trapezoidal-rule approximation to the integral. Since conditional expectations minimize mean squared error, it follows that for every measurable function $g$, $$ \mathbb E\left(\left(\int_0^1 B_t\,dt - \mathbb E\left(\int_0^1 B_t\,dt \,\middle\vert\, B_0, B_{x_1},B_{x_2},\ldots,B_{x_n},B_1 \right) \right)^2\right) \le \mathbb E\left(\left( \int_0^1 B_t \, dt - g(B_0, B_{x_1},B_{x_2},\ldots,B_{x_n},B_1) \right)^2\right). $$
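One way to sanity-check this claim numerically: everything here is jointly Gaussian with mean zero, so the conditional expectation is a linear function of the observed values, and a least-squares regression of $\int_0^1 B_t\,dt$ on the observations should recover the trapezoid weights. A minimal Monte Carlo sketch (the single interior point $x_1=1/2$, the grid size, and the sample count are my arbitrary choices, not from the question; with $B_0=0$ known, the trapezoid weights on $B_{1/2},B_1$ are $1/2$ and $1/4$):

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 200_000, 1_000
dt = 1.0 / n_steps

# Brownian paths on a fine grid, with B_0 = 0.
steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)])

# "True" integral of each path, via a fine trapezoidal sum on the grid.
I = np.sum(0.5 * (B[:, :-1] + B[:, 1:]) * dt, axis=1)

# Observe the path only at t = 1/2 and t = 1 (B_0 = 0 is known).
X = B[:, [n_steps // 2, -1]]

# Least-squares coefficients estimate the conditional expectation,
# which should match the trapezoid weights (1/2, 1/4).
coef, *_ = np.linalg.lstsq(X, I, rcond=None)
print(coef)  # approximately [0.5, 0.25]
```

With $2\times10^5$ paths the fitted coefficients agree with the trapezoid weights to about three decimal places.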

A theorem I have seen attributed to Persi Diaconis (but is that correct?) says that Simpson's rule is not admissible in the decision-theoretic sense. That means that for every convex function $L$ that we could put in place of the squaring function above, there is some measurable function $g$ such that for every probability distribution on functions $C_t$ on $[0,1]$, $$ \mathbb E\left(L\left(\int_0^1 C_t\,dt - g(C_0,C_{x_1},C_{x_2},\ldots,C_{x_n},C_1)\right)\right) \le \mathbb E\left(L\left(\int_0^1 C_t\,dt - \operatorname{Simpson}(C_0, C_{x_1},C_{x_2},\ldots,C_{x_n},C_1)\right)\right), $$ and for at least one probability distribution the inequality is strict.
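The comparison in the first display can be made concrete for one prior by taking $g$ to be Simpson's rule itself. With the Brownian prior, one interior point $x_1=1/2$, and $h=1/2$, a covariance calculation gives MSE $1/48$ for the trapezoidal (conditional-expectation) estimate versus $1/36$ for Simpson's rule. This illustrates the decision-theoretic comparison for a single prior only; it does not by itself prove inadmissibility. A Monte Carlo sketch (grid size and sample count are again arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 200_000, 1_000
dt = 1.0 / n_steps

# Brownian paths on a fine grid, with B_0 = 0.
steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)])

# "True" integral of each path, via a fine trapezoidal sum on the grid.
I = np.sum(0.5 * (B[:, :-1] + B[:, 1:]) * dt, axis=1)

# Both rules observe the path only at t = 0, 1/2, 1 (step h = 1/2).
B0, Bh, B1 = B[:, 0], B[:, n_steps // 2], B[:, -1]
trap = 0.25 * B0 + 0.5 * Bh + 0.25 * B1  # h*(B0/2 + Bh + B1/2)
simpson = (B0 + 4.0 * Bh + B1) / 6.0     # (h/3)*(B0 + 4*Bh + B1)

mse_trap = np.mean((I - trap) ** 2)        # theory: 1/48 ≈ 0.0208
mse_simpson = np.mean((I - simpson) ** 2)  # theory: 1/36 ≈ 0.0278
print(mse_trap, mse_simpson)
```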

Usually when Simpson's rule is applied, one is not thinking of probability at all. Does this result have any practical implications for numerical analysis in situations where no probabilities are mentioned? A Bayesian view that says all uncertainties can be expressed by probabilities would seem to imply that it does.
