The friendship paradox is the phenomenon that most people have fewer friends than their friends have, on average. It can be explained as a form of sampling bias in which people with many friends are more likely to be counted among one's own friends. In other words, one is less likely to be friends with someone who has very few friends.
Mathematically speaking, the degrees of the neighbours of a node in any network will, averaged over the whole network, be at least as large as the degree of the node itself.
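As a toy illustration of this statement, consider a star graph, where the effect is extreme: every leaf has degree 1, while its only neighbour, the centre, has degree 3. A few lines of Python make this concrete:

```python
# Star graph on 4 nodes: node 0 is the centre, nodes 1-3 are leaves.
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}

for v, nbrs in adj.items():
    deg = len(nbrs)
    mean_nbr_deg = sum(len(adj[u]) for u in nbrs) / deg
    print(f"node {v}: degree {deg}, mean neighbour degree {mean_nbr_deg}")

# Three of the four nodes (the leaves) have degree 1 but a mean
# neighbour degree of 3, so the paradox holds for 75% of the nodes.
```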
I am considering an undirected graph with $n$ vertices. At the beginning of this experiment, each vertex $i$ is randomly assigned an "extraversion value" $EV_i \in [0, 0.5]$.
Whether two particular vertices are connected by an edge is determined by their extraversion values: independently for each pair, the probability that vertices $i$ and $j$ are connected is the sum of their extraversion values $EV_i$ and $EV_j$:
$$P(\text{edge between } i \text{ and } j) = EV_i + EV_j \in [0, 1],$$

which is a valid probability because each extraversion value lies in $[0, 0.5]$.
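A minimal numpy sketch of the sampling step (with the extraversion values taken uniform on $[0, 0.5]$) looks like this:

```python
import numpy as np

def sample_graph(n, rng):
    """Sample one graph: vertex i gets EV_i uniform on [0, 0.5],
    and edge {i, j} is present with probability EV_i + EV_j."""
    ev = rng.uniform(0.0, 0.5, size=n)            # extraversion values
    p = ev[:, None] + ev[None, :]                 # p[i, j] = EV_i + EV_j
    upper = np.triu(rng.random((n, n)) < p, k=1)  # one coin flip per pair
    return upper | upper.T                        # symmetrise: undirected graph

rng = np.random.default_rng(0)
adj = sample_graph(1000, rng)                     # boolean adjacency matrix
```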
I simulated this network on my computer with $n = 5000$ and ran the experiment 100 times. After analyzing the data, I found that 57.7% of the nodes have a smaller degree than the average degree of their neighbours.
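The statistic itself can be computed along the following lines (a self-contained sketch of the same measurement; I use a smaller $n$ here so it runs quickly):

```python
import numpy as np

def paradox_fraction(n, rng):
    """Fraction of nodes whose degree is below the mean degree
    of their neighbours, for one graph from the model above."""
    ev = rng.uniform(0.0, 0.5, size=n)
    p = ev[:, None] + ev[None, :]
    upper = np.triu(rng.random((n, n)) < p, k=1)
    adj = (upper | upper.T).astype(float)
    deg = adj.sum(axis=1)
    nz = deg > 0                              # skip isolated nodes (rare here)
    mean_nbr_deg = (adj[nz] @ deg) / deg[nz]  # mean degree of each node's neighbours
    return np.mean(deg[nz] < mean_nbr_deg)

rng = np.random.default_rng(0)
fracs = [paradox_fraction(2000, rng) for _ in range(10)]
print(np.mean(fracs))  # compare with the 57.7% reported above
```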
When I repeated the experiment for larger values of $n$, this fraction of nodes tended towards the Euler-Mascheroni constant ($\gamma = 0.57721566\ldots$).
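One observation that may help frame the problem: conditioning on a node's own extraversion value gives its expected degree in closed form (again taking the values i.i.d. uniform on $[0, 0.5]$, so that $\mathbb{E}[EV_j] = 1/4$):

$$\mathbb{E}\big[\deg(i) \mid EV_i = x\big] = \sum_{j \neq i} \big(x + \mathbb{E}[EV_j]\big) = (n-1)\left(x + \tfrac{1}{4}\right),$$

so a node's degree is, up to fluctuations, an increasing affine function of its own extraversion value.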
How do I prove that this fraction of nodes tends to $\gamma$ as $n \to \infty$?