
I need to prove the following:

Let $P(x_1,\cdots,x_r)$ be a polynomial in $r$ real variables. Then $P(\alpha_1,\cdots,\alpha_r) = 0$ for all $(\alpha_1,\cdots,\alpha_r) \in \mathbb{R}^r_{+}$ such that $\sum\limits^r_{j=1} \alpha_j = 1$ if and only if $P$ is the zero polynomial.

I think I have also managed to prove it, using induction, with $A.\Gamma$'s help on this question.

My question is: is there an existing result (perhaps some fundamental theorem for multivariate polynomials, for example) from which this follows? I tried looking for one and found Bézout's theorem, but I am not sure whether and how the statement can be derived from it. Any help is appreciated. Thank you.

Edit: The wording of the above statement is not correct, as pointed out by others. I have posted a new question with the corrected wording, as suggested in the question comments, here.

Canine360
  • I rolled back the question to the first version that was already answered. If you want to ask a modified question then post a new question. – miracle173 Jun 04 '18 at 05:51

2 Answers


This is not correct. Consider the polynomial $$ P(x_1, x_2, \dots, x_r) = (1 - \sum_j x_j)^2 \, . $$
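For concreteness, a quick check of why this defeats the statement as worded: the polynomial vanishes at every admissible point, yet it is not the zero polynomial, as evaluation at the origin shows:

$$\sum_{j=1}^{r} \alpha_j = 1 \;\Longrightarrow\; P(\alpha_1, \dots, \alpha_r) = (1 - 1)^2 = 0, \qquad\text{while}\qquad P(0, \dots, 0) = (1 - 0)^2 = 1 \neq 0 \, .$$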

Hans Engler
    Thank you for pointing this out. Do you have any suggestion as to how we should modify this so that it holds? It holds for $r=2$ as we have seen here. https://math.stackexchange.com/questions/2805121/how-to-know-if-a-given-infinite-set-of-vectors-lie-in-the-same-hyperplane/2805152?noredirect=1#comment5784202_2805152 – Canine360 Jun 04 '18 at 02:45
  • Sorry, I should've mentioned that the polynomial has no constant term. Would it hold then? I have modified the question accordingly. – Canine360 Jun 04 '18 at 02:46
  • @Canine360: Multiply the given polynomial by $(x_1^2+x_2^2+\dots+x_r^2)$; then it does not have a constant term. – Lutz Lehmann Jun 04 '18 at 09:20
  • It would not be homogeneous then. I had also mentioned homogeneity, but the edit has been reverted (see the comment on the question). So the correct version is "a homogeneous polynomial"... (it will have no constant term by definition). Let me start a new question as suggested. :( – Canine360 Jun 04 '18 at 23:48
  • Here is the new question: https://math.stackexchange.com/questions/2808331/prove-that-homogeneous-multivariate-polynomial-reduces-to-the-zero-polynomial-gi – Canine360 Jun 05 '18 at 00:12

The if part is obvious. For the only if part, first notice that, given any $(x_1, \dots, x_r) \in \mathbb{R}^r_{+}$, we can use the homogeneity of $P$ to determine the value of $P$ at this point: $$P(x_1, \dots, x_r)=\lambda^d P(\alpha_1,\cdots,\alpha_r)=0,$$ where $\lambda = \displaystyle \sum_{i=1}^r x_i>0$, $\alpha_i = \dfrac {x_i} {\lambda} \; \forall i$, and $d$ is the degree of $P$ (assuming $P$ is nonzero). The hypothesis applies to $(\alpha_1,\dots,\alpha_r)$ because each $\alpha_i > 0$ and $\sum_{i=1}^r \alpha_i = 1$.
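As a concrete illustration of this scaling step (with an example polynomial chosen only for illustration), take $r = 2$, $d = 2$, $P(x_1, x_2) = x_1^{2} + 3x_1x_2$, and $\lambda = x_1 + x_2$:

$$P(x_1, x_2) = \lambda^{2}\, P\!\left(\frac{x_1}{\lambda}, \frac{x_2}{\lambda}\right) = \lambda^{2}\left(\frac{x_1^{2}}{\lambda^{2}} + 3\,\frac{x_1 x_2}{\lambda^{2}}\right) = x_1^{2} + 3x_1x_2 \, .$$

Homogeneity transfers the vanishing of $P$ on the simplex to all of $\mathbb{R}^r_{+}$ in exactly this way.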

Now, knowing that $P=0$ on the open set $\mathbb{R}^r_{+}$, we can choose some point $(u_1, \dots, u_r) \in \mathbb{R}^r_{+}$ and take any partial derivative of $P$ at this point (including mixed derivatives of any order) to recover the coefficients of $P$. For example, if $P$ had the term $k x_{i_1}^{a_1} \dots x_{i_m}^{a_m}$, then $$k = \frac 1 {a_1! \dots a_m!} \frac{\partial^{a_1 + \dots + a_m}P(x_1, \dots, x_r)}{\partial x_{i_1}^{a_1} \dots \partial x_{i_m}^{a_m}} (u_1, \dots, u_r) = 0$$ because any other term, if it exists, would become zero after such differentiation (here we use the homogeneity again), and the value of the derivative is completely determined by the values of $P$ in a small open neighbourhood of $(u_1, \dots, u_r)$.
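For illustration, with $r = 2$, $d = 3$, and a hypothetical $P$ containing the terms $k\,x_1^{2}x_2 + c\,x_1x_2^{2}$ (plus possibly other degree-3 monomials):

$$\frac{1}{2!\,1!}\,\frac{\partial^{3}}{\partial x_1^{2}\,\partial x_2}\left(k\,x_1^{2}x_2 + c\,x_1x_2^{2}\right) = \frac{1}{2}\,(2k + 0) = k \, ,$$

since $c\,x_1x_2^{2}$ contains $x_1$ only to the first power and is annihilated by $\partial^{2}/\partial x_1^{2}$; every other degree-3 monomial is killed similarly.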

John McClane
  • As for the second part of the answer, one could also consult https://math.stackexchange.com/questions/2789699/zeros-of-polynomial-over-an-infinite-field/2789760#2789760 – Jens Schwaiger Jun 04 '18 at 05:37