I originally asked this question on stats.SE but it got almost no views, so I figured this is probably a more appropriate site. I am trying to figure out how to use polynomial chaos expansion (PCE) to quantify uncertainty in solutions of ODEs.

So, if I understand correctly, the main idea of PCE is that we have a random variable $X$ whose distribution we do not know, and we represent it as a function of a random variable $\Xi$ whose distribution we do know.

Then we expand this function in a basis of polynomials $\psi_j$, orthogonal with respect to the distribution of $\Xi$, to get $$ X = \sum_{j\ge0}x_j \psi_j(\Xi). $$
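To make this concrete for myself: for a standard normal $\Xi$ the usual basis seems to be the probabilists' Hermite polynomials $He_n$, which satisfy $\langle He_i, He_j\rangle = i!\,\delta_{ij}$ under the $N(0,1)$ weight. A quick numerical sanity check of that orthogonality (a sketch using NumPy's `hermite_e` module and Gauss–Hermite quadrature):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Gauss-Hermite(e) quadrature for the weight exp(-x^2/2);
# dividing by sqrt(2*pi) turns the sum into an expectation under N(0,1).
nodes, weights = hermegauss(30)

def inner(i, j):
    """<He_i, He_j> = E[He_i(Xi) He_j(Xi)] for Xi ~ N(0,1)."""
    ci = np.zeros(i + 1); ci[i] = 1.0
    cj = np.zeros(j + 1); cj[j] = 1.0
    vals = hermeval(nodes, ci) * hermeval(nodes, cj)
    return weights @ vals / sqrt(2 * pi)

# Orthogonality: <He_i, He_j> = i! * delta_ij
for i in range(5):
    for j in range(5):
        expected = factorial(i) if i == j else 0.0
        assert abs(inner(i, j) - expected) < 1e-8
```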

There are some things regarding this expansion that people do not really explain. For example, when does the series converge, and in what sense? I couldn't find anything explicit on that. But my bigger problem is what happens when we multiply such series; I will come to this in a moment.

So I am trying to understand how to use this in the setting of dynamics, on an example that is (I think) non-trivial but hopefully still relatively simple.

Let's consider the dynamical system: $$\begin{align} \dot x &= x + y - x(x^2+y^2) \\ \dot y &= -x + y - y(x^2+y^2). \end{align}$$ This is the standard example of a limit cycle. The origin is an unstable equilibrium, and the circle centered at the origin with radius $1$ is a stable periodic orbit, which attracts every trajectory except the equilibrium at the origin.
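(Purely as a deterministic sanity check, not part of the PCE machinery: integrating from $(10,0)$ does pull the trajectory onto the unit circle. This sketch assumes SciPy's `solve_ivp`.)

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, s):
    """Right-hand side of the limit-cycle system."""
    x, y = s
    r2 = x * x + y * y
    return [x + y - x * r2, -x + y - y * r2]

# Start on the x-axis at radius 10 and integrate long enough for the
# trajectory to settle onto the stable periodic orbit of radius 1.
sol = solve_ivp(rhs, (0.0, 10.0), [10.0, 0.0], rtol=1e-9, atol=1e-12)
r_final = np.hypot(sol.y[0, -1], sol.y[1, -1])
assert abs(r_final - 1.0) < 1e-6  # attracted to the unit circle
```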

So my goal is to describe the evolution of this system when the initial point is $(10,0)+Z$, where $Z$ is a random variable drawn from a bivariate standard normal distribution (identity covariance matrix).

My expectation was the following:

We can write the (now) random variables $X(t)$ and $Y(t)$ as $$\begin{align} X(t) &= \sum_{j\ge0}x_j(t) \psi_j(\Xi) \\ Y(t) &= \sum_{j\ge0}y_j(t) \psi_j(\Xi). \end{align}$$ Then we can substitute into the ODE, collect terms, and obtain ODEs for the coefficients $x_j(t)$ and $y_j(t)$. Truncate the series, plug the equations into a solver, and get numerical solutions.
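At least the initial conditions of these coefficient ODEs are concrete. If I take $\Xi = Z \in \mathbb{R}^2$ and a tensorized Hermite basis $\psi_{(a,b)}(\xi) = He_a(\xi_1)He_b(\xi_2)$ (an assumption on my part), then $x_{(a,b)}(0) = \langle X(0), \psi_{(a,b)}\rangle / \langle \psi_{(a,b)}^2\rangle$ with $X(0) = 10 + \xi_1$, and only two coefficients are nonzero. A small sketch verifying this by tensor quadrature:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, pi

nodes, weights = hermegauss(20)

def He(n, x):
    """Probabilists' Hermite polynomial He_n evaluated at x."""
    c = np.zeros(n + 1); c[n] = 1.0
    return hermeval(x, c)

# Tensor grid for the 2D standard normal input (xi1, xi2);
# dividing by 2*pi normalises the weights to a probability measure.
X1, X2 = np.meshgrid(nodes, nodes, indexing="ij")
W = np.outer(weights, weights) / (2 * pi)

def coeff(a, b, f_vals):
    """x_{(a,b)}(0) = <f, psi_{(a,b)}> / <psi_{(a,b)}^2>, psi = He_a(xi1) He_b(xi2)."""
    psi = He(a, X1) * He(b, X2)
    return np.sum(W * f_vals * psi) / (factorial(a) * factorial(b))

f = 10.0 + X1  # X(0) = 10 + xi_1
assert abs(coeff(0, 0, f) - 10.0) < 1e-10  # mean component
assert abs(coeff(1, 0, f) - 1.0) < 1e-10   # first-order term in xi_1
assert abs(coeff(0, 1, f)) < 1e-10         # no dependence on xi_2
```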

However, I got stuck almost immediately because I do not know how to deal with products of PCEs. For example, $$ X^2 = \left(\sum_{j\ge0}x_j(t) \psi_j(\Xi)\right)^2 $$ has no obvious way of grouping elements. One could try to circumvent this by adding a dummy variable $\varepsilon$ and collecting terms with respect to it: $$ X^2 = \left(\sum_{j\ge0}x_j(t) \psi_j(\Xi) \varepsilon^j \right)^2 = \sum_{j\ge0}\sum_{i=0}^j x_i(t)\,x_{j-i}(t)\,\psi_i(\Xi) \,\psi_{j-i}(\Xi)\, \varepsilon^j . $$ However, this does not really solve my problem, because to get the $k$-th coefficient we have to project onto $\psi_k$, so we have to compute terms like $$\langle \psi_i \,\psi_{j-i},\psi_k \rangle,$$ which are not $0$ in general. In this case the equations are of degree at most three, so brute force could potentially work, but it is definitely something that cannot work in general.
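If it helps, here is how far I got numerically: in one dimension the triple products $\langle \psi_i \psi_j, \psi_k\rangle$ can at least be tabulated by quadrature, and for Hermite polynomials there appears to be a known closed form, $C_{ijk} = \frac{i!\,j!\,k!}{(s-i)!\,(s-j)!\,(s-k)!}$ when $i+j+k=2s$ is even and $s \ge \max(i,j,k)$, and $0$ otherwise. A sketch checking the quadrature against that formula:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

nodes, weights = hermegauss(40)

def He(n, x):
    """Probabilists' Hermite polynomial He_n evaluated at x."""
    c = np.zeros(n + 1); c[n] = 1.0
    return hermeval(x, c)

def triple(i, j, k):
    """C_ijk = E[He_i(Xi) He_j(Xi) He_k(Xi)], Xi ~ N(0,1), by quadrature."""
    vals = He(i, nodes) * He(j, nodes) * He(k, nodes)
    return weights @ vals / sqrt(2 * pi)

def triple_closed(i, j, k):
    """Closed form: nonzero iff i+j+k is even and (i,j,k) satisfy the triangle condition."""
    if (i + j + k) % 2 == 1:
        return 0.0
    s = (i + j + k) // 2
    if s < max(i, j, k):
        return 0.0
    return factorial(i) * factorial(j) * factorial(k) / (
        factorial(s - i) * factorial(s - j) * factorial(s - k))

for i in range(5):
    for j in range(5):
        for k in range(5):
            c = triple_closed(i, j, k)
            assert abs(triple(i, j, k) - c) < 1e-8 * (1 + abs(c))
```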

So, is there a general way to get the ODEs for the coefficients? How are these problems dealt with in practice? And is there any book on these topics aimed at mathematicians?
