
I am currently doing research on symmetric qudit Dicke states, which are states symmetric under the permutation group. In the article "Entanglement entropy in the Lipkin-Meshkov-Glick model," it is claimed in Eq. 5 that $$p_l=\frac{{L \choose l} {N-L \choose n-l}}{ {N \choose n}}$$ can be approximated for $N, L\gg 1$, where the $p_l$ represent the Clebsch-Gordan coefficients used in a Schmidt decomposition of the Dicke state. Specifically, they write

...the hypergeometric distribution of the $p_l$ can be recast into a Gaussian distribution $p_l\approx \frac{1}{\sqrt{2\pi}\sigma}\exp\left[- \frac{(l-\bar l)^2}{2\sigma^2}\right]$, of mean value $\bar l=nL/N$ and variance $\sigma^2=n(N-n)(N-L)L/N^3$, where we have retained the subleading term in $(N-L)$ to explicitly preserve the symmetry $L\to N-L$.
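For what it's worth, the claim is easy to check numerically. Below is a minimal Python sketch comparing the exact hypergeometric $p_l$ with the quoted Gaussian; the values of $N$, $L$, $n$ are arbitrary, chosen only so that $N, L \gg 1$, and are not taken from the paper.

```python
# Sanity check of the quoted approximation: compare the exact
#   p_l = C(L, l) C(N-L, n-l) / C(N, n)
# with the Gaussian of mean lbar = nL/N and variance n(N-n)(N-L)L/N^3.
# The sizes below are arbitrary illustrative values, not taken from the paper.
from math import comb, exp, sqrt, pi

N, L, n = 1000, 400, 100

lbar = n * L / N
var = n * (N - n) * (N - L) * L / N**3
sigma = sqrt(var)

def p_exact(l):
    return comb(L, l) * comb(N - L, n - l) / comb(N, n)

def p_gauss(l):
    return exp(-(l - lbar) ** 2 / (2 * var)) / (sqrt(2 * pi) * sigma)

for l in range(int(lbar) - 10, int(lbar) + 11, 5):
    print(f"l = {l:3d}   exact = {p_exact(l):.6f}   Gaussian = {p_gauss(l):.6f}")
```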

I am looking for a reference or an explanation of how this is achieved; I have used asymptotic formulas before, but never for something of a combinatorial nature. I am also relatively unfamiliar with hypergeometric functions. Understanding this is important to me, because I am looking to extend this argument from $SU(2)$ to $SU(2)_q$, where the binomials in $p_l$ are replaced with $q$-binomials.

Edit: My research professor just sent me a Wikipedia link that appears to make the same claim. I still don't understand how this approximation is found. Perhaps this link is relevant?

Edit: Reached $1k$ reputation :)

David Raveh

2 Answers


You know that (under appropriate conditions) $\binom{n}{x} \approx 2^n \phi(x; \frac{n}{2}, \frac{n}{4})$, where $\phi(x; \mu,\sigma^2)$ is the Gaussian density with mean $\mu$ and variance $\sigma^2$. You can apply that directly (a quick numerical check of this fact is sketched further down). Or, combinatorially:

The hypergeometric distribution gives the probability of $\ell$ successes in $n$ draws without replacement, from a finite population of size $N$ that contains $L$ "good" objects.

If $N\gg n$, then the experiment is approximately equivalent to draws with replacement. In that case, each draw succeeds with probability $L/N$, and the number of successes follows a Binomial distribution, with mean $\mu=n p = n L/N$ and variance $\sigma^2 = n p (1-p) = n \frac{L}{N}\frac{N-L}{N}$.

Further, the Binomial can be approximated by a Gaussian with those parameters.

This already gives the approximation linked in the Wikipedia article.
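Both statements above (the Gaussian form of the symmetric binomial coefficient and the hypergeometric $\to$ binomial $\to$ Gaussian chain) are easy to verify numerically. A minimal Python sketch, with arbitrary illustrative sizes:

```python
# Two quick numerical checks (all sizes below are arbitrary illustrative values):
# 1) C(n, x) ≈ 2^n * phi(x; n/2, n/4) for the symmetric binomial,
# 2) hypergeometric ≈ binomial (with replacement) ≈ Gaussian when N >> n.
from math import comb, exp, sqrt, pi

def gauss_pdf(x, mu, var):
    return exp(-(x - mu) ** 2 / (2 * var)) / sqrt(2 * pi * var)

# Check 1: symmetric binomial coefficient vs 2^n * Gaussian density
n = 200
for x in (90, 100, 110):
    approx = 2**n * gauss_pdf(x, n / 2, n / 4)
    print(f"x = {x}   C(n,x)/approx = {comb(n, x) / approx:.4f}")

# Check 2: hypergeometric vs binomial vs Gaussian, population >> sample
N, L, n = 10_000, 3_000, 50
p = L / N
mu, var = n * p, n * p * (1 - p)

def hyper(l): return comb(L, l) * comb(N - L, n - l) / comb(N, n)
def binom(l): return comb(n, l) * p**l * (1 - p) ** (n - l)

for l in range(10, 21, 2):
    print(f"l = {l:2d}   hypergeom = {hyper(l):.5f}   "
          f"binomial = {binom(l):.5f}   Gaussian = {gauss_pdf(l, mu, var):.5f}")
```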

In your case, they have an additional correction factor. Instead of

$$\sigma^2 = n \frac{L(N-L)}{N^2} $$

they use

$$\sigma^2 = n \frac{L(N-L)}{N^2} \frac{N-n}{N}$$

which is closer to the true variance of the hypergeometric:

$$\sigma_H^2 = n \frac{L(N-L)}{N^2} \frac{N-n}{N-1}$$
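Numerically, the corrected variance sits between the binomial one and the exact hypergeometric one, and all three agree to leading order. A small Python comparison, with arbitrary $(N, L, n)$ triples chosen only for illustration:

```python
# Compare the three variance expressions for a few arbitrary (N, L, n) triples.
def var_binom(N, L, n):      # n p (1-p), no finite-population correction
    return n * L * (N - L) / N**2

def var_paper(N, L, n):      # correction factor (N - n)/N, as used in the paper
    return var_binom(N, L, n) * (N - n) / N

def var_hyper(N, L, n):      # exact hypergeometric variance, factor (N - n)/(N - 1)
    return var_binom(N, L, n) * (N - n) / (N - 1)

for N, L, n in [(100, 40, 20), (1000, 400, 100), (10_000, 4_000, 1_000)]:
    print(f"N = {N:6d}   binomial = {var_binom(N, L, n):9.4f}   "
          f"paper = {var_paper(N, L, n):9.4f}   hypergeom = {var_hyper(N, L, n):9.4f}")
```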

leonbloy

I will assume that $n\to\infty$, and that $n\ll N$, $\ell\ll L$, and $n-\ell\ll N-L$. I believe these assumptions are necessary for the result: you only get a normal approximation when the samples are approximately independent, which means we need the sample pool to be large compared to the sample.

Start by rearranging
$$ p_\ell = \binom{n}{\ell} (L/N)^\ell(1-L/N)^{n-\ell}\cdot \color{blue}{\frac{\frac{L!}{L^\ell(L-\ell)!}\cdot \frac{(N-L)!}{(N-L)^{n-\ell}(N-L-(n-\ell))!}}{\frac{N!}{N^n(N-n)!}}} $$
From the de Moivre-Laplace theorem, we know when $n$ is large that
$$ \begin{align} \binom{n}{\ell} (L/N)^\ell(1-L/N)^{n-\ell} &\approx \frac1{\sqrt{2\pi nL(N-L)/N^2}} \exp\left(-\,\frac{(\ell-nL/N)^2}{2nL(N-L)/N^2}\right)\\ &= \frac1{\sigma \sqrt{2\pi}\cdot \sqrt{\frac{N}{N-n}}} \exp\left(-\,\frac{(\ell-\hat \ell)^2}{2\sigma^2 \cdot\frac{N}{N-n}}\right) \end{align} $$
where $\hat\ell = nL/N$ and $\sigma^2 = n(N-n)(N-L)L/N^3$ are the mean and variance from the question. From this answer of mine, we have the approximation
$$ \frac{L!}{L^\ell (L-\ell)!}\approx \exp\left(-\,\frac{\ell^2}{2L}\right), $$
valid when $\ell\ll L$ (a numerical check of this step is included at the end of this answer). This pattern appears three times in the blue fraction, leading to the approximation

$$ \color{blue}{\frac{\exp(-\frac{\ell^2}{2L})\exp(-\frac{(n-\ell)^2}{2(N-L)})}{\exp(-\frac{n^2}{2N})}} = \color{blue}{ \exp\left(-\,\frac{(\ell-nL/N)^2}{2L(N-L)/N}\right) } = \color{blue}{ \exp\left(-\,\frac{(\ell-\hat\ell)^2}{2\sigma^2\cdot \frac{N^2}{n(N-n)}}\right) } $$
Putting this all together,
$$ \begin{align} p_\ell &\approx \frac1{\sigma \sqrt{2\pi}\cdot \sqrt{\frac{N}{N-n}}} \exp\left(-\,\frac{(\ell-\hat \ell)^2(N-n)}{2\sigma^2 N}\right) \cdot \color{blue}{ \exp\left(-\,\frac{(\ell-\hat\ell)^2 n(N-n)}{2\sigma^2 N^2}\right) } \\&=\frac1{\sigma \sqrt{2\pi}} \exp\left(-\,\frac{(\ell-\hat \ell)^2}{2\sigma^2}\cdot \left(1-\frac{n^2}{N^2}\right)\right)\cdot \sqrt{1-\frac{n}{N}} \end{align} $$
Since $n/N\approx 0$, both leftover factors are close to $1$ and the result follows.
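Both approximations used along the way are easy to test numerically. Here is a minimal Python sketch (all sizes are arbitrary illustrative values): it checks the $L!/(L^\ell(L-\ell)!)\approx e^{-\ell^2/(2L)}$ step referenced above, and then shows how small the leftover factors $\sqrt{1-n/N}$ and $1-n^2/N^2$ are for a few sample fractions $n/N$.

```python
# Numerical checks for the derivation above (all sizes are arbitrary):
# 1) L!/(L^l (L-l)!) ≈ exp(-l^2/(2L)) for l << L,
# 2) size of the leftover factors sqrt(1 - n/N) and 1 - (n/N)^2 when n << N.
from math import exp, sqrt

def falling_ratio(L, l):
    # L!/(L^l (L-l)!) = prod_{j=0}^{l-1} (1 - j/L), computed without huge factorials
    out = 1.0
    for j in range(l):
        out *= 1 - j / L
    return out

# Check 1: factorial-ratio approximation
for L, l in [(1_000, 30), (10_000, 100), (100_000, 300)]:
    print(f"L = {L:7d}, l = {l:3d}   exact = {falling_ratio(L, l):.6f}   "
          f"approx = {exp(-l**2 / (2 * L)):.6f}")

# Check 2: leftover correction factors
for frac in (0.001, 0.01, 0.1):
    print(f"n/N = {frac:5.3f}   sqrt(1 - n/N) = {sqrt(1 - frac):.5f}   "
          f"1 - (n/N)^2 = {1 - frac**2:.6f}")
```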

Mike Earnest