
Let $f \in C^k([a,b] \times [c,d])$, where $[a,b]$ and $[c,d]$ are bounded closed intervals in $\mathbb{R}$.

Can we approximate $f$ by a sequence of polynomials of the form

$$P^n(x,y) = P_1^n(x) P_2^n(y),$$ where $P_1^n$ and $P_2^n$ are polynomials in $x$ and $y$, respectively?

More precisely, does there exist a sequence of polynomials $P^n(x,y) = P_1^n(x) P_2^n(y)$ that converges to $f(x,y)$ in the $C^k$-norm as $n \to \infty$?

P.S.: Does the same hold for $L^1(\mathbb{R}^2)$? That is, can every $F \in L^1(\mathbb{R}^2)$ be approximated by a sequence of functions of the form $f_1^n(x) f_2^n(y)$, where $f_1^n, f_2^n \in L^1(\mathbb{R})$, such that $$ f_1^n(x) f_2^n(y) \rightarrow F(x, y) \quad \text{in } L^1(\mathbb{R}^2)? $$

Any references or insights would be helpful.

Celestina

1 Answer


I prove that what you're trying to show is not true, first in a simple but non-general way and then in a more abstract and general way.

Simple proof: Take distinct $x_1,x_2 \in [a,b]$ and distinct $y_1,y_2 \in [c,d]$, then consider the operator $T \; : \; C^k([a,b] \times [c,d]) \to \mathbb{R}^{2 \times 2}$ that maps $f$ to the matrix $$T(f) = \begin{pmatrix} f(x_1, y_1) & f(x_1, y_2) \\ f(x_2, y_1) & f(x_2, y_2) \end{pmatrix}$$

Consider the polynomial $$g(x,y) = \frac{(x - x_2)(y-y_2)}{(x_1-x_2)(y_1-y_2)} + \frac{(x - x_1)(y-y_1)}{(x_2-x_1)(y_2-y_1)};$$ then $$T(g) = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$

and so $\det(T(g)) = 1 \neq 0$. On the other hand, the map $f \mapsto \det(T(f))$ is continuous with respect to the $C^k$ topology, but for any function of the form $f(x,y) = f_1(x)\cdot f_2(y)$ we have

$$\det(T(f)) = \det\bigg(\begin{pmatrix} f_1(x_1) \\ f_1(x_2) \end{pmatrix} \cdot \begin{pmatrix} f_2(y_1) & f_2(y_2) \end{pmatrix} \bigg) = 0 $$

Therefore, if $S := \{ f_1(x)\cdot f_2(y) \; : \; f_1 \in C^k([a,b]),\, f_2 \in C^k([c,d]) \}$, we have $\det(T(f)) = 0$ for all $f \in S$, and so by continuity $\det(T(f)) = 0$ for all $f \in \overline{S}$. But this means $g \not\in \overline{S}$, since $\det(T(g)) \neq 0$.
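(As a sanity check, not part of the argument: both facts above can be verified symbolically. The sketch below uses SymPy, keeping the four points symbolic and using unspecified functions `f1`, `f2` as stand-ins for arbitrary one-variable factors.)

```python
import sympy as sp

# Symbolic evaluation points (with x1 != x2 and y1 != y2 understood)
x, y, x1, x2, y1, y2 = sp.symbols('x y x1 x2 y1 y2')

# The interpolating polynomial g from the answer
g = ((x - x2)*(y - y2))/((x1 - x2)*(y1 - y2)) \
  + ((x - x1)*(y - y1))/((x2 - x1)*(y2 - y1))

def T(f):
    """Matrix of values of f at the four grid points (x_i, y_j)."""
    return sp.Matrix(2, 2, lambda i, j: f.subs({x: [x1, x2][i], y: [y1, y2][j]}))

print(T(g).applyfunc(sp.simplify))   # Matrix([[1, 0], [0, 1]])
print(sp.simplify(T(g).det()))       # 1

# Any separable f1(x)*f2(y) gives a rank-one matrix, so its determinant is 0
f1, f2 = sp.Function('f1'), sp.Function('f2')
print(sp.expand(T(f1(x) * f2(y)).det()))   # 0
```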

For $L^1(\mathbb{R}^2)$ a similar argument works defining $$T(f) = \begin{pmatrix} \iint_{X_1 \times Y_1}f & \iint_{X_1 \times Y_2}f \\ \iint_{X_2 \times Y_1}f & \iint_{X_2 \times Y_2}f \end{pmatrix}$$ for some disjoint sets of finite positive measure $X_1,X_2,Y_1,Y_2 \subset \mathbb{R}$.
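(Again just an illustrative numerical check, with the hypothetical concrete choices $X_1 = Y_1 = [0,1]$ and $X_2 = Y_2 = [2,3]$, which are not part of the argument; the integrals are approximated with SciPy's `dblquad`.)

```python
import numpy as np
from scipy.integrate import dblquad

# Hypothetical choices of the disjoint sets (intervals here)
X = [(0.0, 1.0), (2.0, 3.0)]   # X1, X2
Y = [(0.0, 1.0), (2.0, 3.0)]   # Y1, Y2

def T(F):
    """Matrix of integrals of F over the rectangles X_i x Y_j.

    dblquad expects F(y, x); the first pair of limits is for x,
    the second pair for y."""
    return np.array([[dblquad(F, *X[i], *Y[j])[0] for j in range(2)]
                     for i in range(2)])

# Separable integrand f1(x)*f2(y): T(F) has rank one, so det ~ 0
sep = lambda y, x: np.exp(-x**2) * np.cos(y)
print(np.linalg.det(T(sep)))   # ~ 0 (up to quadrature error)

# Indicator of (X1 x Y1) u (X2 x Y2): det(T(F)) = 1,
# so this F is not an L^1 limit of single products
ind = lambda y, x: 1.0 if (x <= 1 and y <= 1) or (x >= 2 and y >= 2) else 0.0
print(np.linalg.det(T(ind)))   # ~ 1
```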

Abstract proof:

Let $X$, $Y$ be Banach spaces and consider the Banach space tensor product $X \otimes Y$ (e.g. the projective tensor product). Consider the set $S := \{ x \otimes y \; : \; (x,y) \in X \times Y\}$; then $\overline{S} \neq X \otimes Y$ whenever $\min(\dim(X),\dim(Y)) \geq 2$. To prove this, take $\ell_1,\ell_2 \in X'$ and $x_1,x_2 \in X$ such that $$ \begin{pmatrix} \ell_1(x_1) & \ell_2(x_1) \\ \ell_1(x_2) & \ell_2(x_2) \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} $$

Define $h_1,h_2 \in Y'$ and $y_1,y_2 \in Y$ in the same way. Now define on elementary tensors the map $(\ell_i \otimes h_j)(x \otimes y) := \ell_i(x) \cdot h_j(y)$; extending it by linearity and continuity to all of $X \otimes Y$ defines the functional $\ell_i \otimes h_j$, so I can define the operator

$$T(z) = \begin{pmatrix} (\ell_1 \otimes h_1)(z) & (\ell_1 \otimes h_2)(z) \\ (\ell_2 \otimes h_1)(z) & (\ell_2 \otimes h_2)(z) \end{pmatrix}$$ Then $\det(T(x_1 \otimes y_1 + x_2 \otimes y_2)) = 1$, but for a pure tensor $x \otimes y$ we have $$\det(T(x \otimes y)) = \det\bigg(\begin{pmatrix} \ell_1(x) \\ \ell_2(x) \end{pmatrix} \cdot \begin{pmatrix} h_1(y) & h_2(y) \end{pmatrix} \bigg) = 0, $$ and so by continuity $\det(T(z)) = 0$ for all $z \in \overline{S}$; but then $x_1 \otimes y_1 + x_2 \otimes y_2 \not\in \overline{S}$.
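(A concrete finite-dimensional illustration, not part of the proof: take $X = Y = \mathbb{R}^2$ and identify $X \otimes Y$ with the $2 \times 2$ matrices, so that $x \otimes y$ becomes the outer product $x y^{\mathsf T}$ and $\ell_i \otimes h_j$ reads off the $(i,j)$ entry.)

```python
import numpy as np

# Finite-dimensional model: X = Y = R^2, X tensor Y ~ 2x2 matrices,
# x tensor y ~ np.outer(x, y), and (l_i tensor h_j)(z) = z[i, j].
rng = np.random.default_rng(0)

# Pure tensors are rank-one matrices, so det(T(x tensor y)) = 0
for _ in range(3):
    x, y = rng.normal(size=2), rng.normal(size=2)
    print(np.linalg.det(np.outer(x, y)))   # ~ 0 every time

# x1 tensor y1 + x2 tensor y2 with the dual bases above is the identity
e1, e2 = np.eye(2)
z = np.outer(e1, e1) + np.outer(e2, e2)
print(np.linalg.det(z))                    # 1.0, so z is not in the closure of S
```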

Paul
  • This (https://math.stackexchange.com/questions/4737770/tensor-product-of-lp-spaces-is-dense-in-the-product-lp-space) answer says such a decomposition is possible, could you throw some light on it as well? – Celestina Feb 17 '25 at 19:13
  • 2
    The key difference between the answer you're referring to and your question is the difference between the sets $\mathcal{E}0$ and $\mathcal{E}$. What is true is that, defining $S := { f(x)g(y) , : , f,g \in L^1(\mathbb{R})}$ then $span(S)$ is dense in $L^1(\mathbb{R}^2)$. Similarly every function $f \in C^k([a,b] \times [c,d])$ can be uniformly approximated by a sequence of polynomials of the form $\sum{k=1}^{m_n}P^n_k(x)Q^n_k(y)$, if you impose $m_n = 1$ as you did the statement is false. – Paul Feb 17 '25 at 19:52
  • Thanks so much! I'd also love to hear your thoughts on (https://math.stackexchange.com/questions/5036354/tensor-product-structure-of-sobolev-spaces-and-generalization-to-lp-spaces). And if you happen to know of any good references for density results like this, I'd appreciate your suggestions. – Celestina Feb 17 '25 at 20:12