Gaussian processes are usually introduced as collections of random variables, every finite subcollection of which is jointly multivariate normal. However, they are also sometimes described as "distributions over functions." I'd like to formalize this second description in the language of measure theory on Banach spaces of functions such as $L^p$.
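To fix notation for the rest of the question (the symbols $T$, $f$, and $t_i$ here are my own choices, not taken from any particular source): by the first description I mean that for an index set $T$, a mean function $\mu : T \to \mathbb{R}$, and a symmetric positive semidefinite covariance function $K : T \times T \to \mathbb{R}$, writing $f \sim GP(\mu, K)$ means
$$
\bigl(f(t_1), \dots, f(t_n)\bigr) \sim \mathcal{N}\!\Bigl( \bigl(\mu(t_i)\bigr)_{i=1}^{n},\ \bigl(K(t_i, t_j)\bigr)_{i,j=1}^{n} \Bigr)
\qquad \text{for all } n \in \mathbb{N} \text{ and } t_1, \dots, t_n \in T.
$$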
I know that Brownian motion (a specific GP) can be viewed as a "$C[0,1]$-valued random variable," that is, a measurable map from a probability space into $C[0,1]$, so that each outcome is sent to a continuous function. I also know that we cannot write PDFs for arbitrary distributions over infinite-dimensional Banach spaces like $L^p$, because there is no Lebesgue measure on such spaces. My questions are:

1) Exactly which space of functions $F$ is a GP with given mean and covariance functions $\mu, K$ a distribution (probability measure) over?

2) Can we write down a CDF for such a GP, since every random variable must have a CDF?

3) How would I find the probability that $f \sim GP(\mu, K)$ lies in some subset of $F$? For instance, if $F$ were $C[0,1]$, I would want to know which subsets of $C[0,1]$ are more or less likely to be drawn, judging from $\mu, K$. A concrete instance of what I mean is sketched below.
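For concreteness, here is the kind of computation I have in mind for question 3, using the one example mentioned above: standard Brownian motion on $[0,1]$, i.e. $\mu \equiv 0$ and $K(s,t) = \min(s,t)$. There, the reflection principle assigns a probability to the subset $\{g \in C[0,1] : \sup_{0 \le t \le 1} g(t) \le a\}$ for $a \ge 0$:
$$
P\Bigl( \sup_{0 \le t \le 1} f(t) \le a \Bigr) = 2\Phi(a) - 1,
$$
where $\Phi$ is the standard normal CDF. What I am asking is whether, and how, this kind of probability assignment to subsets of $F$ can be read off from a general $\mu, K$.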
I have taken a course in measure-theoretic probability, so measure theory, where illuminating, is welcome. Also note that my question concerns GPs as distributions over function spaces, not GP regression. I have read this, this and this, which address some of what I ask and outline the Kolmogorov existence theorem, but my questions are specifically about PDFs/CDFs, and about writing down the space of functions that the distribution of a GP is over, which those answers do not address.