Okay, let's actually do this the way a high schooler would, or might be expected to.
First, use the fact for positive $a_i$ that
$$\lim_{n \to \infty} (a_1^n + \cdots + a_k^n)^{1/n} = \max_{1\leq i \leq k} a_i.$$
How to show this? "Pull out" the largest $a_i$, say $a_m$. Each ratio satisfies $(a_i/a_m)^n \leq 1$, so $a_m \leq (a_1^n + \cdots + a_k^n)^{1/n} \leq a_m k^{1/n}$, and $k^{1/n} \to 1$, which squeezes the limit to $a_m$.
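A quick numerical sketch of this fact (the values of $a_i$ here are arbitrary examples, not from the argument):

```python
def root_power_sum(a, n):
    """Compute (a_1^n + ... + a_k^n)^(1/n) for a list of positive a_i."""
    m = max(a)
    # Factor out the largest term for numerical stability:
    # (sum a_i^n)^(1/n) = m * (sum (a_i/m)^n)^(1/n).
    return m * sum((x / m) ** n for x in a) ** (1.0 / n)

a = [0.5, 1.3, 2.0, 1.9]
for n in (1, 10, 100, 1000):
    print(n, root_power_sum(a, n))
# The printed values decrease toward max(a) = 2.0.
```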
So then what's the idea? For large $N$, the integral $\int_0^1 f(x)^n dx$ is basically a Riemann sum plus an error term $\epsilon$:
$$\int_0^1 f(x)^n dx = \epsilon + \frac{1}{N} \sum_{i=1}^N f(i/N)^n.$$
Ignore the epsilon for now, take the $n$th root, and use the fact above to get that the limit is $\max_i f(i/N)$. If $N$ is large enough, this gets as close as we like to the max of $f$ over $[0,1]$ (for continuous $f$). So this tells us what the answer "should" be.
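Here is a sketch of this heuristic in action, with an arbitrary example function (not one from the original question):

```python
import math

def f(x):
    return 1.0 + math.sin(math.pi * x)   # max over [0,1] is 2, at x = 1/2

def riemann_root(f, n, N):
    """n-th root of the Riemann sum (1/N) * sum_i f(i/N)^n,
    factoring out the largest sample for numerical stability."""
    vals = [f(i / N) for i in range(1, N + 1)]
    m = max(vals)
    s = sum((v / m) ** n for v in vals) / N
    return m * s ** (1.0 / n)

for n in (1, 10, 100, 1000):
    print(n, riemann_root(f, n, 10_000))
# As n grows, the values approach max f = 2.
```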
Now really, the error term is $\epsilon = \epsilon(n,N)$. However, we can choose $N$ large enough that $|f(i/N)-f((i+1)/N)|<1$, and hence $\epsilon(n,N) < \epsilon(1, N)$. This gives a bound on $\epsilon$ independent of $n$, so if we include that term as we should and then take the limit, we can still show that the limit is as close as we like to the max of $f$ on $[0,1]$, provided $\max_i f(i/N)>1$. To make sure that happens, we can just ignore all functions for which it fails, and do some scaling to handle those cases.
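One way to make the scaling step concrete: for any constant $c>0$,
$$\left(\int_0^1 \big(c\,f(x)\big)^n\,dx\right)^{1/n} = c\left(\int_0^1 f(x)^n\,dx\right)^{1/n},$$
so we may replace $f$ by $cf$ with $c$ chosen large enough that $c\max f > 1$, run the argument, and divide the resulting limit by $c$ at the end.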
Is this a fully rigorous argument? No. But I would hope a typical high schooler could follow it.