I need to prove the following:
Let $P(x_1,\cdots,x_r)$ be a polynomial. Then $P(\alpha_1,\cdots,\alpha_r) = 0$ for all $(\alpha_1,\cdots,\alpha_r) \in \mathbb{R}^r_{+}$ such that $\sum_{j=1}^{r} \alpha_j = 1$ if and only if $P$ is the zero polynomial.
I believe I have also proved it, by induction, with $A.\Gamma$'s help on this question.
My question is: is there an existing result (perhaps a fundamental-theorem-type statement for multivariate polynomials) from which this follows? I searched and found Bézout's theorem, but I am not sure whether, or how, the statement can be derived from it. Any help is appreciated. Thank you.
Edit: As pointed out by others, the wording of the statement above is not correct. Following the suggestions in the comments, I have posted a new question with the corrected wording here.
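For instance (this may not be the exact objection raised in the comments), the polynomial
$$P(x_1,\cdots,x_r) = x_1 + \cdots + x_r - 1$$
vanishes at every point of the simplex but is not the zero polynomial, so the "only if" direction fails as originally stated.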