
Suppose $z_1,...,z_n\in \Bbb C$ are complex numbers, not necessarily distinct, satisfying $$ z_1^k+\cdots +z_n^k=0$$

$ \color{red}{\forall\; k>0}$. Then is it true that $z_1=\cdots=z_n=0$?

This would be obvious if the $z_i$ were all real (take $k=2$: a sum of real squares vanishes only if every term does), but I'm wondering whether it is also true in the complex case.

nonuser
user302934
  • Yes, it is true. Do you know how to convert those equations into showing that $\sum_{i \text{ terms}} x_a x_b \cdots = 0$, i.e. that each elementary symmetric polynomial vanishes? Hence, by Vieta, the $z_i$ are the roots of $z^n = 0$, so $z_i = 0$. – Calvin Lin Apr 10 '20 at 19:46
  • Also: https://math.stackexchange.com/q/811913/42969, https://math.stackexchange.com/q/1203786/42969, https://math.stackexchange.com/q/2963882/42969. – Martin R Apr 10 '20 at 19:47
  • @MartinR I doubt that the first link is related to this question. Well, maybe they are vaguely related via Newton's formulae. – user26857 Apr 10 '20 at 21:54

2 Answers


Hint: From the case $k=1$ we have $z_1+\cdots+z_n=0$, hence $$(z_1+\cdots+z_n)^2=0,$$

and combined with $z_1^2+\cdots+z_n^2=0$ (the case $k=2$) this gives $$z_1z_2+\cdots+z_{n-1}z_n=0,$$

and so on... So by Newton's identities and Vieta's formulas you get that the coefficients (apart from the leading one) of the monic polynomial with zeroes $z_1,z_2,\dots,z_n$ are zero, which means that all $z_i=0$.
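As a quick sanity check of this hint (a sketch, not part of the proof): Newton's identities $k\,e_k = \sum_{i=1}^{k} (-1)^{i-1} e_{k-i}\, p_i$ can be run mechanically, and if every power sum $p_k$ is zero, then every elementary symmetric function $e_k$ comes out zero, so the polynomial is $z^n$.

```python
from fractions import Fraction

def elementary_from_power_sums(p):
    """Newton's identities: k*e_k = sum_{i=1}^{k} (-1)^(i-1) * e_{k-i} * p_i.
    Input p = [p_1, ..., p_n]; returns [e_1, ..., e_n] exactly, via Fractions."""
    n = len(p)
    e = [Fraction(1)]  # e_0 = 1
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * Fraction(p[i - 1]) for i in range(1, k + 1))
        e.append(s / k)
    return e[1:]

# Sanity check with real roots 1 and 2: p_1 = 3, p_2 = 5 gives e_1 = 3, e_2 = 2,
# i.e. the polynomial z^2 - 3z + 2 = (z - 1)(z - 2).
print(elementary_from_power_sums([3, 5]))

# If all power sums vanish, every e_k is 0, so the polynomial is z^n.
print(elementary_from_power_sums([0, 0, 0, 0]))
```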

nonuser

Another point of view, with induction and linear algebra. Consider the following induction hypothesis $H(n)$.

For all $z_1, \dots, z_n \in \mathbb{C}$, if for some $a_1, \dots, a_n > 0$, and all $k \in \{1, \dots, n\}$, we have $$ a_1 z_1^k + \cdots + a_n z_n^k = 0, $$ then $z_1 = \cdots = z_n = 0$.

This is a generalization of the result, necessary for the induction below.

So let us show this. Clearly, $H(1)$ is true, so assume $H(n-1)$, and take $z_1, \dots, z_n$ and $a_1, \dots, a_n$ as above. If one of the $z_i$ is $0$, then we apply $H(n-1)$ to the remaining $n-1$ numbers and we are done. Otherwise, consider the $n \times n$ matrix $$ A = \begin{pmatrix} 1 & z_1 & z_1^2 & \cdots & z_1^{n-1} \\ 1 & z_2 & z_2^2 & \cdots & z_2^{n-1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & z_n & z_n^2 & \cdots & z_n^{n-1} \end{pmatrix}. $$ Then if $u$ is the row vector $(a_1 z_1, a_2 z_2, \dots, a_n z_n)$ (which is non-zero by assumption), the hypothesis for $k = 1, \dots, n$ says exactly that $uA = 0$. In particular, $A$ is not invertible. But $A$ is a Vandermonde matrix, so two of the $z_i$ must be equal, say for instance $z_1 = z_2$. Then we can use the induction hypothesis with $z_1, z_3, \dots, z_n$ and $a_1+a_2, a_3, \dots, a_n$ (note that $a_1+a_2 > 0$), to find that all the $z_i$ are zero.
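A small numerical illustration of the key identity in this step (a hypothetical sketch, not part of the argument): the row vector $u$ times the Vandermonde matrix $A$ recovers exactly the weighted power sums $p_k = a_1 z_1^k + \cdots + a_n z_n^k$, and for distinct $z_i$ the Vandermonde determinant $\prod_{i<j}(z_j - z_i)$ is non-zero.

```python
import random

random.seed(0)
n = 4
# Random complex z_i (distinct and non-zero with overwhelming probability)
z = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
a = [random.uniform(1.0, 2.0) for _ in range(n)]  # positive weights a_i

# A[i][j] = z_i^j for j = 0, ..., n-1: the Vandermonde matrix from the answer
A = [[z[i] ** j for j in range(n)] for i in range(n)]
u = [a[i] * z[i] for i in range(n)]  # row vector (a_1 z_1, ..., a_n z_n)

# The product u A equals the weighted power sums (p_1, ..., p_n)
uA = [sum(u[i] * A[i][j] for i in range(n)) for j in range(n)]
p = [sum(a[i] * z[i] ** k for i in range(n)) for k in range(1, n + 1)]
assert all(abs(x - y) < 1e-9 for x, y in zip(uA, p))

# Vandermonde determinant: product of (z_j - z_i) over i < j,
# non-zero exactly when the z_i are pairwise distinct
det = 1
for i in range(n):
    for j in range(i + 1, n):
        det *= z[j] - z[i]
print(abs(det) > 1e-9)  # True for distinct z_i
```

So when all the weighted power sums vanish, $uA = 0$ with $u \neq 0$ forces the determinant to vanish, i.e. two of the $z_i$ coincide, which is exactly how the induction proceeds.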

Note. The assumption that the $a_i$ are strictly positive could be relaxed to make this more general and work in any field of characteristic $0$, for instance by requiring that every non-empty sum of the $a_i$ is non-zero.

Raoul
  • I just realized that the comments above already mention a similar solution... – Raoul Apr 10 '20 at 20:20
  • Is there an argument like this for proving $x_1^k+\dots+x_n^k = y_1^k+\dots+y_n^k$ for $0 \le k \le n-1$ implies $\{x_1,\dots,x_n\} = \{y_1,\dots,y_n\}$? – mathworker21 Apr 22 '20 at 22:30
  • This seems more delicate. Just take $n=2$. Define $A = \begin{pmatrix} x_1 & x_1^2 \\ x_2 & x_2^2 \end{pmatrix}$, $B = \begin{pmatrix} y_1 & y_1^2 \\ y_2 & y_2^2 \end{pmatrix}$, and let $u = (1,1)$. Then the assumption can be summarized as $u A = u B$. It tells us that $A-B$ is not invertible, so its determinant $(x_1-y_1)(x_2-y_2)(x_2+y_2-x_1-y_1)$ is $0$. However, you cannot deduce from this that $\{x_1,x_2\} = \{y_1,y_2\}$. So you would need to get more from the assumption $uA = uB$, but I am not sure what. – Raoul Apr 22 '20 at 23:29
  • lol, I asked that question because I know the exact same argument you used in your answer (which you tried to do in your comment above for $n=2$) will not work. I'm asking if there's any argument using linear algebra. – mathworker21 Apr 23 '20 at 02:30
  • Not that I can think of. – Raoul Apr 23 '20 at 12:14