
I've stumbled upon a method (the method of Frobenius) for finding an infinite power series solution of an ODE about a regular singular point.

Say we know that the two roots of the indicial equation (denoted $r_1$, $r_2$, with $r_1 \geq r_2$) satisfy:

$$r_1 - r_2 = m$$ where $m$ is a non-negative integer.

Then the solutions of the ODE ($y_1$, $y_2$) can be expressed such that:

$$y_1 = x^{r_1}\sum_{n=0}^\infty a_nx^n$$ $$y_2 = Cy_1 \ln{x} + x^{r_2}\sum_{n=0}^\infty b_nx^n$$
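For a concrete instance of this theorem (my own illustrative example, not taken from the linked notes), consider $xy'' + 2y' + xy = 0$: the indicial equation at $x = 0$ is $r(r+1) = 0$, so $r_1 = 0$, $r_2 = -1$, and $m = 1$. The two Frobenius solutions turn out to be $y_1 = \sin x / x$ and $y_2 = \cos x / x$ (here the log coefficient $C$ happens to be $0$). A quick numerical sanity check in Python:

```python
import math

# Hedged illustration (my own example, not from the post): the ODE
# x*y'' + 2*y' + x*y = 0 has indicial equation r(r+1) = 0 at the regular
# singular point x = 0, so r1 = 0, r2 = -1 and r1 - r2 = 1 (an integer).
# The log term happens to drop out (C = 0): the two Frobenius solutions
# are y1 = sin(x)/x and y2 = cos(x)/x.

def residual(y, dy, d2y, x):
    """Left-hand side x*y'' + 2*y' + x*y, given callables for y, y', y''."""
    return x * d2y(x) + 2 * dy(x) + x * y(x)

# y1 = sin(x)/x = x**0 * (power series), matching the r1 = 0 form
y1   = lambda x: math.sin(x) / x
dy1  = lambda x: math.cos(x) / x - math.sin(x) / x**2
d2y1 = lambda x: -math.sin(x) / x - 2 * math.cos(x) / x**2 + 2 * math.sin(x) / x**3

# y2 = cos(x)/x = x**(-1) * (power series), matching the r2 = -1 form
y2   = lambda x: math.cos(x) / x
dy2  = lambda x: -math.sin(x) / x - math.cos(x) / x**2
d2y2 = lambda x: -math.cos(x) / x + 2 * math.sin(x) / x**2 + 2 * math.cos(x) / x**3

for x in (0.1, 0.5, 1.3, 2.7):
    assert abs(residual(y1, dy1, d2y1, x)) < 1e-12
    assert abs(residual(y2, dy2, d2y2, x)) < 1e-12
```

The derivatives are written out by hand so the check uses nothing beyond the standard library.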

The result is derived in the link here (starting at the end of page 28).

However, it uses big-O notation in a way that doesn't make much sense to me. I'll quote the relevant part of the proof from the link:

$$ v'(x) = \frac{1}{x^{2r_1}\tilde{y}_1^2}e^{-\int{\left(p_0/x + p_1 + O(x)\right)dx}} = \frac{1}{x^{2r_1 + p_0}\tilde{y}_1^2}e^{-p_1x + O(x^2)} = \frac{1}{x^{2r_1 + p_0}}O(1) $$

What's going on here? I didn't know big-O notation could be used this way. Why does it work in the context of the proof, and how is the $O(1)$ even derived?

Later, this is integrated to somehow obtain an infinite series (which completes the proof, using the fact that $y_2(x) = v(x)y_1(x)$, as stated earlier in the proof). How is integration even defined for $O(1)$?
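To make the boundedness claim tangible, here is a small numerical sketch (my own choice of $p_1$ and of a concrete $O(x^2)$ remainder, not from the linked proof): the exponent $-p_1 x + O(x^2)$ stays bounded as $x \to 0^+$, so its exponential is $O(1)$.

```python
import math

# Hedged numerical sketch (my own functions, not from the proof): the claim
# exp(-p1*x + O(x^2)) = O(1) as x -> 0+ just says the exponent stays bounded
# near 0, so the exponential does too.  Take p1 = 2 and a concrete remainder
# R(x) = 3*x**2*cos(1/x), which satisfies |R(x)| <= 3*x**2.

p1 = 2.0

def R(x):
    """Some function satisfying |R(x)| <= 3*x**2 near 0, i.e. R(x) = O(x^2)."""
    return 3 * x**2 * math.cos(1 / x)

def f(x):
    """f(x) = exp(-p1*x + O(x^2)); should stay bounded as x -> 0+."""
    return math.exp(-p1 * x + R(x))

xs = [10**(-k) for k in range(1, 9)]   # x -> 0+
vals = [f(x) for x in xs]
# the exponent -> 0, so f(x) -> 1 and is certainly bounded: f(x) = O(1)
assert all(abs(v - 1) < 1 for v in vals)
assert max(vals) < math.e              # a crude uniform bound near 0
```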

Sam
  • Thank you for bringing this to my attention - I've slightly changed the focal point of my question (while keeping the underlying meaning the same) and tried to make it clearer which part of the proof I was confused about. Thank you for the proof, though - it helped a lot! – Sam Apr 06 '23 at 03:00
  • For a sufficiently small neighborhood around $0$, $|p_1 x| + |O(x^2)| \leq |p_1 x| + Mx^2 < N$, so the exponent is $O(1)$. For the same reason $\exp(O(x)) \leq \exp(M) \leq N \implies \exp(O(x)) = O(1)$. Also note that near $0$, $\tilde{y}_1$ will approach whatever its constant term is (which is necessarily nonzero by how we construct $y_1$), so the denominator is bounded below by the square of the constant term. Consequently the ratio will always be smaller than $O(1)/a_0^2 \leq M/a_0^2 \leq N$, so $\frac{O(1)}{\tilde{y}_1^2} = O(1)$. That's how you derive the $O(1)$ on the RHS – Sam Apr 06 '23 at 03:51
  • As for integration: since we're in a sufficiently small neighbourhood around $0$ where all our functions are analytic, the $O(1)$ term actually just represents a bound on an underlying analytic function. So we can remove the $O$ by writing arbitrary coefficients for that analytic function and integrating term by term. – Sam Apr 06 '23 at 03:59
  • @Sam I have a couple of points about your reasoning:

    a) With your logic, surely you could make the claim that any analytic function could be written as $O(1)$

    b) Is your logic here that since the LHS (including $\tilde{y}_1^2$) has a valid power series expansion (due to it being analytic), we can just call it a day and integrate term by term?

    – Sam Apr 06 '23 at 12:17
  • Almost. For a): you can say any analytic function is $O(1)$ only as $x\to a$, where $a$ is the center of the disc of convergence, and if the analytic function has a non-zero constant term. For b): having a valid power series expansion is important, but the more important fact is that after dividing, you still get a convergent power series, since the denominator is bounded away from zero. I.e. $\frac{\sum a_n (x-a)^n}{\sum b_n (x-a)^n} = \sum c_n (x-a)^n$ with the RHS convergent only if $\sum b_n (x-a)^n$ is bounded away from zero in a sufficiently small neighborhood about $a$. Then yes, at that point we can just integrate and call it a day – Sam Apr 06 '23 at 19:45
  • "if the analytic function has a non-zero constant term" - I'm trying to understand this. My current idea, via the $\limsup$ definition of big-O notation, is that $\limsup_{x \to a}\frac{|f(x)|}{|g(x)|}<\infty$. For $f(x) = O(1)$ you'd get $\limsup_{x \to a}|f(x)|<\infty$, so $\limsup_{x \to a}\left|\sum_{n=0}^\infty c_n(x-a)^n\right|<\infty$, which should just be $|c_0|<\infty$; but surely this would be true even if $c_0 = 0$ – Sam Apr 06 '23 at 21:00
  • Ah actually you're right, non-zero constant term isn't necessary. I was just thinking that at that point might as well consider the function $O(x)$ or whatever the leading power of $x$ is. – Sam Apr 06 '23 at 22:24
  • Ah, that makes sense. Thanks for the help! I finally understand this now – Sam Apr 06 '23 at 22:34
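The series-division fact used in the last few comments can be checked by hand (my own example, not from the thread): dividing the $\sin$ series by the $\cos$ series via the standard quotient recurrence reproduces the $\tan$ series, precisely because $\cos$ has a nonzero constant term.

```python
from fractions import Fraction as F

# Hedged sketch (my own example): dividing one power series by another whose
# constant term is nonzero yields another power series, via the recurrence
#   c_n = (a_n - sum_{k=1}^{n} b_k * c_{n-k}) / b_0.
# Here a = series of sin(x), b = series of cos(x), so c should be tan(x).

def divide_series(a, b, n_terms):
    """Coefficients of (sum a_n x^n) / (sum b_n x^n); requires b[0] != 0."""
    c = []
    for n in range(n_terms):
        s = a[n] - sum(b[k] * c[n - k] for k in range(1, n + 1))
        c.append(s / b[0])
    return c

sin_c = [F(0), F(1), F(0), F(-1, 6), F(0), F(1, 120)]
cos_c = [F(1), F(0), F(-1, 2), F(0), F(1, 24), F(0)]

tan_c = divide_series(sin_c, cos_c, 6)
assert tan_c == [F(0), F(1), F(0), F(1, 3), F(0), F(2, 15)]
```

The recurrence would fail at the very first step if $b_0 = 0$, which is the series-level counterpart of the denominator being bounded away from zero near the center.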

0 Answers