I am reading the book *Elements of Modern Asymptotic Theory with Statistical Applications* by McCabe and Tremayne, and in Chapter 8, on Brownian motion, I ran into this inequality:
$$2\sum_{i=1}^{n}(t_{i,n} - t_{i-1,n})^2 \ \leq\ 2\max_{1\le i\le n}(t_{i,n} - t_{i-1,n}) \sum_{i=1}^{n}(t_{i,n} - t_{i-1,n}).$$
In a statistical context, $(t_{i,n} - t_{i-1,n})$ is the variance of the increment $W(t_{i,n})-W(t_{i-1,n})$ of a Brownian motion. But in a purely mathematical sense it can, IMHO, be seen as a distance (the length of the subinterval $[t_{i-1,n}, t_{i,n}]$). In short, the inequality says that the sum of squared distances is less than or equal to the maximum distance times the sum of those distances.
My question is: what is this inequality called, and why does it hold? I tried searching Google for it but did not find anything (probably because I don't know its name). I also tried plugging in numbers, and it does indeed hold, but I struggle to understand why.