Without using $f'(x) = A''(x)$ and the fact that $A''(x) > 0$ implies convexity, show that $A(x) = \int_{0}^{x} f(t)\,dt$ is convex if $f(x)$ is increasing.
This is from Apostol, Calculus Vol. 1, Theorem 2.9, p. 122. My attempt is given below.
For a function $f$ to be convex on $[a,b]$ we need, for every $\alpha \in (0,1)$,
$$f(\alpha b + (1-\alpha) a) < \alpha f(b) + (1-\alpha) f(a).$$
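For instance (an illustration I added for myself, not from Apostol), with $f(x) = x^2$ on $[0,2]$ and $\alpha = \tfrac{1}{2}$, this says
$$f(1) = 1 < \tfrac{1}{2}f(2) + \tfrac{1}{2}f(0) = 2.$$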
Using this with $a = 0$ and $b = x$ (so that $A(0) = 0$), we need to show $A(\alpha x) < \alpha A(x)$.
Now, substituting $t = \alpha u$,
$$A(\alpha x) = \int_0^{\alpha x} f(t)\,dt = \alpha \int_{0}^{x} f(\alpha u)\,du < \alpha \int_{0}^{x} f(u)\,du = \alpha A(x),$$
because $u > \alpha u$ for $u > 0$ implies $f(u) > f(\alpha u)$ when $f$ is (strictly) increasing.
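As a quick sanity check (my own example, not part of the exercise): taking $f(t) = t$, which is increasing, gives $A(x) = x^2/2$, and indeed
$$A(\alpha x) = \frac{\alpha^2 x^2}{2} < \frac{\alpha x^2}{2} = \alpha A(x) \quad \text{for } x > 0,\ \alpha \in (0,1).$$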
Is this proof correct, and how can I write it up more rigorously? I am self-learning calculus and will then proceed to real analysis.