
I'm struggling with this question and would be happy for a hint or a direction.

Let $G=(V_1\cup V_2,E)$ be a bipartite graph with no isolated vertices such that the average degree on $V_1$ is larger than the average degree on $V_2$. Prove that there exists an edge $\{v_1,v_2\}\in E$ with $v_i\in V_i$ and $\deg(v_1)>\deg(v_2)$.

Thank you.

Things I tried: Since the sum of degrees is the same on both sides, the assumption implies that $|V_1|<|V_2|$.
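Spelling out this step: each edge has exactly one endpoint in each part, so $$\sum_{v\in V_1}\deg(v)=|E|=\sum_{v\in V_2}\deg(v),$$ and the assumption on the averages, $|E|/|V_1|>|E|/|V_2|$, forces $|V_1|<|V_2|$.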

I tried defining a function $f:V_1\rightarrow \mathbb{N}$ by letting $f(v_1)$ be the minimal degree among the neighbors of $v_1$. Assuming the contrary, every neighbor of $v\in V_1$ has degree at least $\deg(v)$, so $$\sum_{v\in V_1} f(v) \geq \sum_{v\in V_1} \deg(v)=|E|.$$ This could lead somewhere, since the left-hand side counts degrees of vertices in $V_2$, but the same vertex of $V_2$ may be the minimum-degree neighbor of several vertices of $V_1$, so I could be double counting.

Another thing I noticed is that if I assume the contrary, I get $$\sum_{v\in V_1} \deg^2(v)\leq \sum_{v\in V_2} \deg^2(v),$$ but I'm not sure whether this helps in any way.
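To spell out why: assuming the contrary means $\deg(v_1)\le\deg(v_2)$ for every edge $\{v_1,v_2\}$ with $v_i\in V_i$, and summing this over all edges (each vertex $v$ lies on exactly $\deg(v)$ edges) gives $$\sum_{v\in V_1}\deg^2(v)=\sum_{\{v_1,v_2\}\in E}\deg(v_1)\le\sum_{\{v_1,v_2\}\in E}\deg(v_2)=\sum_{v\in V_2}\deg^2(v).$$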

Yotam D
  • Where it says "isolated edges", do you mean "isolated vertices"? It seems that this premise is required, since otherwise you could just add isolated vertices to $V_2$ in any graph until the average degree on $V_2$ is lower. (This also implies that you'll need to use this premise in the proof.) – joriki Mar 31 '16 at 15:44
  • Of course, I edited, thank you. :) – Yotam D Mar 31 '16 at 16:05

1 Answer


Define $f : E \to \mathbb{R}$ as follows: for an edge $\{v_1,v_2\}$ with $v_i\in V_i$, $$f\big(\{v_1,v_2\}\big) = \frac{1}{\deg(v_1)} - \frac{1}{\deg(v_2)},$$ and observe that it suffices to find an edge $e$ such that $f(e) < 0$. Since there are no isolated vertices, every degree is positive and each vertex $v$ lies on exactly $\deg(v)$ edges, so $$ \sum_{e \in E}f(e) = \sum_{v_1 \in V_1}\deg(v_1)\cdot\frac{1}{\deg(v_1)} - \sum_{v_2 \in V_2}\deg(v_2)\cdot\frac{1}{\deg(v_2)} = |V_1|-|V_2| < 0 $$

because $|V_2| > |V_1|$, so there has to be some edge $e$ with $f(e) < 0$.
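As a numerical sanity check (a minimal sketch, not part of the proof; the small example graph below is just an assumed illustration), one can verify the identity $\sum_{e\in E} f(e) = |V_1|-|V_2|$ directly:

```python
from fractions import Fraction

# Hypothetical example bipartite graph with no isolated vertices:
# V1 = {0, 1}, V2 = {'a', 'b', 'c'}.
# Average degree on V1 is 5/2, on V2 it is 5/3, so the premise holds.
V1 = [0, 1]
V2 = ['a', 'b', 'c']
E = [(0, 'a'), (0, 'b'), (0, 'c'), (1, 'a'), (1, 'b')]

# Compute degrees from the edge list.
deg = {}
for v1, v2 in E:
    deg[v1] = deg.get(v1, 0) + 1
    deg[v2] = deg.get(v2, 0) + 1

# f(e) = 1/deg(v1) - 1/deg(v2), in exact arithmetic via Fraction.
total = sum(Fraction(1, deg[v1]) - Fraction(1, deg[v2]) for v1, v2 in E)
print(total == len(V1) - len(V2))              # True: the sum equals |V1| - |V2| = -1
print(any(deg[v1] > deg[v2] for v1, v2 in E))  # True: e.g. the edge (0, 'c')
```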

I hope this helps $\ddot\smile$

dtldarek
  • Beautiful, thanks! – Yotam D Mar 31 '16 at 17:55
  • And another question: how did you come up with this? – Yotam D Mar 31 '16 at 18:06
  • @YotamD First I've tried $f = \deg - \deg$ and Cauchy-Schwarz, and from the way it failed, I've observed that perhaps I should try $f = \deg^\alpha - \deg^\alpha$ for $\alpha < 1$. I've checked $\alpha = 1/2$, but that didn't work, and $\alpha = 0$ was meaningless, yet it gave me an idea that $\deg\cdot\frac{1}{\deg}$ could cancel out, so I've tested $\alpha = -1$, and the proof succeeded. There's nothing more to it, just the trial and error approach $\ddot\smile$ – dtldarek Mar 31 '16 at 18:39
  • I didn't think in that direction at all, actually. Anyway, thanks again :) – Yotam D Mar 31 '16 at 19:56