
Consider two non-constant real polynomials $f(x)$ and $g(x)$:
$$f=f_0 + f_1 (x-x_0) +\dots+f_N(x-x_0)^N $$ $$g=g_0 + g_1(x-x_1) +\dots+g_M(x-x_1)^M $$ where $f_0,\dots,f_N,g_0,\dots,g_M,x,x_0,x_1 \in \mathbb{R}$ and $M,N\in \mathbb{N}^+$.

I would like to know if there is any function $H(f,g)$ such that:

  • the Taylor expansion of $h(x)=H(f(x),g(x))$ up to order $N$ around $x_0$ coincides with $f$.

  • the Taylor expansion of $h(x)$ up to order $M$ around $x_1$ coincides with $g$.

I was thinking about using some sort of harmonic mean of $f$ and $g$, or transition functions built out of $f$ and $g$, but I am stuck. Moreover, I am not even sure that such a function exists if we restrict $H$ to be a rational function of two variables. Different strategies are certainly possible; if you have any idea, please tell me. Thank you in advance for any help/hint/idea!

Note: I am asking for the function $h(x)$ to stem from $H(f,g)$, not from $H(f,g,x)$. Therefore, a "trivial" construction like $\tilde{h}(x)= \phi_0(x)f(x)+\phi_1(x)g(x)$ for some appropriate $\phi_0(x)$ and $\phi_1(x)$ is not admissible. In other words, since it is required that $h(x)=H(f(x),g(x))$, we also have that $$ h'(x) = f'(x) \partial_f H + g'(x) \partial_g H \, . $$ The above property is not respected by the "trivial" $\tilde{h}(x)$, since $\tilde{h}'(x)$ is not linear in the derivatives $f'$ and $g'$.
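To make the constraint concrete, here is a minimal numerical sanity check of the chain rule above. The particular $f$, $g$, and rational $H$ below are hypothetical example choices (not a solution to the question), used only to illustrate the identity $h'(x) = f'(x)\,\partial_f H + g'(x)\,\partial_g H$:

```python
# Numerical check of the chain rule h'(x) = f'(x) dH/df + g'(x) dH/dg
# for an example (hypothetical) choice of f, g and a rational H(f, g).

def f(x):  return 1 + x + x**2
def fp(x): return 1 + 2*x           # f'
def g(x):  return 2 - x**3
def gp(x): return -3*x**2           # g'

def H(u, v):   return u*v / (u + v)
def H_u(u, v): return v**2 / (u + v)**2   # partial derivative dH/du
def H_v(u, v): return u**2 / (u + v)**2   # partial derivative dH/dv

def h(x): return H(f(x), g(x))

x0, eps = 0.3, 1e-6
numeric = (h(x0 + eps) - h(x0 - eps)) / (2*eps)            # central difference
chain   = fp(x0)*H_u(f(x0), g(x0)) + gp(x0)*H_v(f(x0), g(x0))
print(abs(numeric - chain) < 1e-6)  # True
```

Any admissible $h$ must satisfy this identity; the "trivial" $\tilde{h}$ does not.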

Quillo
    Not sure if this is what you are looking for, but you can always multiply the functions by a smooth cutoff and add them together. – cct Jul 08 '24 at 22:04
  • Hello! Yes, this would be a "trivial" solution but this would not stem from a function $H(f,g)$ but rather from something that depends explicitly on $x$, i.e. $H(f,g,x)$... if I am not mistaken, it's quite late now here:) If it is not clear I will add soon a clarification. @cct – Quillo Jul 08 '24 at 22:18
  • I might be missing something but defining $H(f,g)=f\phi_0+g\phi_1$ such that $\phi_i$ are locally $1$ at $x_i$, and locally $0$ at the other point seems to work in the way you describe it. This construction even matches all the derivative at both points. – cct Jul 08 '24 at 22:45
  • @cct since the two $\phi(x)$ functions do depend on $x$ explicitly (not via composition with $f$ and $g$), this is not a solution arising from a function $H(f,g)$ as indicated in the body of the question. – Quillo Jul 08 '24 at 22:58
  • If I understand your description correctly now, essentially $H$ is a function defined on a (subset of) $\mathbb{R}^2$? Then one immediate problem is that this cannot work for all $f,g$. From your description, you want $H(f(x_0),g(x_0))=f(x_0)$ and $H(f(x_1),g(x_1))=g(x_1)$. However, if you have $f\equiv c_1,g\equiv c_2$, this is a contradiction. And a similar condition is probably required for higher derivatives. – cct Jul 09 '24 at 01:02
  • Yes exactly, I didn't write it explicitly but all the coefficients of the expansion are in general non-zero @cct – Quillo Jul 09 '24 at 01:13

2 Answers


Here is a partial solution when $|f(x_0)-g(x_0)| > \epsilon$, $|f(x_1)-g(x_1)| > \epsilon$, $|f(x_0)-f(x_1)| > \epsilon$, and $|g(x_0)-g(x_1)| > \epsilon$ all hold. Or to put it differently, pick

$\epsilon = \tfrac{1}{2} \min(|f(x_0)-g(x_0)|, |f(x_1)-g(x_1)|, |f(x_0)-f(x_1)|, |g(x_0)-g(x_1)|)$

and the assumption is that this number isn't 0. Under the stated assumption you can pick

$H(f,g) = \Theta(\epsilon - |f-f(x_0)|) f + \Theta(\epsilon - |g-g(x_1)|)g$

where $\Theta$ is the Heaviside step function, or a smooth approximation of it that is flat enough around $0$ to preserve the Taylor series to a given finite order.
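A minimal numeric sketch of this construction, using the exact (non-smooth) Heaviside step and example polynomials chosen (hypothetically) so that the four separation assumptions hold. Note that $H$ depends only on the values $u=f(x)$, $v=g(x)$ and the constants $f(x_0)$, $g(x_1)$, $\epsilon$, not on $x$:

```python
# Sketch of the answer's construction with an exact Heaviside step,
# for example (hypothetical) polynomials satisfying the separation assumptions.

def f(x): return x**2 + x + 1      # f(x0=0) = 1, f(x1=1) = 3
def g(x): return 6 - x             # g(x0=0) = 6, g(x1=1) = 5

x0, x1 = 0.0, 1.0
eps = 0.5 * min(abs(f(x0) - g(x0)), abs(f(x1) - g(x1)),
                abs(f(x0) - f(x1)), abs(g(x0) - g(x1)))    # = 0.5 here

def theta(t):                      # exact Heaviside step: 1 for t > 0, else 0
    return 1.0 if t > 0 else 0.0

def H(u, v):                       # depends only on the values u = f(x), v = g(x)
    return theta(eps - abs(u - f(x0))) * u + theta(eps - abs(v - g(x1))) * v

def h(x): return H(f(x), g(x))

print(h(0.1) == f(0.1), h(0.9) == g(0.9))  # True True: h = f near x0, h = g near x1
```

With the exact step, $h$ agrees with $f$ on a neighborhood of $x_0$ and with $g$ on a neighborhood of $x_1$; replacing `theta` by a sufficiently flat smooth approximation preserves the Taylor coefficients up to the desired order.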

Here is a counterexample showing that you need to assume something more: When $x_0 = x_1$, and $f_i \neq g_i$ for some $i \leq \min(M,N)$, the Taylor series of $h(x)$ at $x_0=x_1$ can obviously agree with at most one of the series for $f(x)$ and $g(x)$.

Quillo
  • +1 very interesting, thank you! Do you think it is possible to upgrade this idea to something more "smooth"? i.e. maybe with some smoother versions of "min" and the "Heaviside theta", e.g. https://math.stackexchange.com/q/30843/532409 – Quillo Jul 10 '24 at 13:46
  • I think it should be no problem to pick a smooth approximation to the Heaviside function as long as it is flat enough (e.g. the first max(M,N) Taylor coefficients are zero). Another idea is to try to simplify the problem as follows: make the Ansatz $f = g + f'$ and change variables to $f' = f - g$ and $g' = g - g = 0$. Now we've reduced the problem to the case where one of the functions vanishes identically. It might be easier to identify sharp necessary conditions and/or counterexamples in this setting. – Erik Tellgren Jul 10 '24 at 14:15
  • The softplus (see e.g. https://math.stackexchange.com/q/2733605/532409) can help! https://en.wikipedia.org/wiki/LogSumExp and https://en.wikipedia.org/wiki/Softplus – Quillo Dec 01 '24 at 15:52
  • Actually (and this comment is mostly as a reminder for me and the interested reader), there is a whole class of smooth maximum functions! https://en.wikipedia.org/wiki/Smooth_maximum https://math.stackexchange.com/q/534/532409 – Quillo Dec 01 '24 at 16:00

I assume there are some restrictions on where the agreement needs to arise since $h$ can't agree with $f$ and $g$ everywhere. So I propose a similar question: how can we construct a function $h$ such that

  • on $x<a$, it agrees with $f$
  • on $x>b$, it agrees with $g$
  • $h$ is smooth (provided $f$ and $g$ are)

This idea came up in a video by EpsilonDelta on YouTube a while back, which I later implemented in a Desmos demo out of curiosity. Naturally, then, it works just fine in the polynomial case, but it also works for smooth functions in general.

The process of interpolation works as follows:

  • We define a sufficiently smooth "base interpolating" function. The video works with $$ \psi(x) = e^{-1/x} \cdot \mathbf{1}_{[0,1]}(x) $$ The video notes that we require $\psi$ to be monotone increasing and smooth if we want the interpolation to be smooth. We also force $\psi(0) = 0$. These conditions give us the useful properties of the next function, $\phi$.

  • We then modify $\psi$ into a new function $\phi$, as below. The goal is to construct a function that is "like" $\psi$ (and in particular, inherits its starting value, monotonicity, and smoothness), but also has $\phi(1)=1$. Hence, $\phi$ will function as a sort of sigmoid-like "weight" between $f$ and $g$, telling how much of a value is contributed based on its distance from the boundaries of the domain (and based on the definition of the smooth "base" function). $$ \phi(x) = \frac{\psi(x)}{\psi(x)+\psi(1-x)} $$ This has some useful properties:

    • $\phi$ is smooth
    • $\phi(0) = 0$
    • $\phi(1) = 1$
    • $\phi$ is monotone increasing
    • $\phi$ is symmetric about $(\frac 1 2, \frac 1 2)$
    • If $\psi^{(k)}(0) = 0$ for $k = 1,2,\cdots,n$, the same is true of $\phi$
  • To handle interpolation on $[a,b]$, we modify $\phi$ with the rule $$ \phi_{a,b}(x) = \phi \left( \frac{x-a}{b-a} \right) $$

  • The interpolating function $h$ is then given by $$ h(x) = \left( 1 - \phi_{a,b}(x) \right) f(x) + \phi_{a,b}(x) g(x) $$
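The steps above can be sketched numerically. One caveat: I use $\psi(x)=e^{-1/x}$ for $x>0$ (and $0$ otherwise), without the cutoff above $1$; this agrees with the video's $\psi$ on $[0,1]$ and keeps the denominator $\psi(x)+\psi(1-x)$ nonzero for all $x$:

```python
import math

def psi(x):
    # smooth base function: e^(-1/x) for x > 0, identically 0 for x <= 0
    return math.exp(-1.0 / x) if x > 0 else 0.0

def phi(x):
    # smooth transition: phi(x) = 0 for x <= 0, 1 for x >= 1, monotone in between
    return psi(x) / (psi(x) + psi(1.0 - x))

def phi_ab(x, a, b):
    # rescaled transition on [a, b]
    return phi((x - a) / (b - a))

def h(x, f, g, a, b):
    # interpolant: equals f(x) for x <= a, equals g(x) for x >= b
    t = phi_ab(x, a, b)
    return (1 - t) * f(x) + t * g(x)

# The example from the text below (assumed interpolation window [-1, 3]):
f = lambda x: math.sin(5*x) - x
g = lambda x: math.cos(7*x) + 2
a, b = -1.0, 3.0
print(h(-2, f, g, a, b) == f(-2), h(4, f, g, a, b) == g(4))  # True True
```

Outside $[a,b]$ the weight $t$ is exactly $0$ or $1$, so $h$ agrees with $f$ and $g$ identically there, not just to finite order.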

For instance, interpolating $\sin(5x)-x$ on $(-\infty,-1)$ with $\cos(7x)+2$ on $(3,\infty)$, the interpolation looks like this:

(figure: graph of the smooth interpolation between the two functions across $[-1,3]$)

PrincessEev
  • +1 This is very cool and useful (I liked the video) but note that your $h(x)$ is not as described in the question because it can not be written as $h(x)=H(f(x),g(x))$. I am requiring that the $x$ dependence comes entirely from composition of $H$ with $f$ and $g$ (or, if needed, translations of these functions). – Quillo Jul 08 '24 at 23:25
  • related: https://math.stackexchange.com/q/5000269/532409 – Quillo Nov 18 '24 at 17:25