This question may seem quite strange, but I am wondering whether the above is true?
In several Russian books I have seen the notion of a "somehow" normally distributed discrete random variable. Can this even be possible?
Thank you in advance!
A discrete probability distribution is a probability distribution that can take on a countable number of values.
A normal distribution is a type of continuous probability distribution for a real-valued random variable.
A continuous probability distribution is a probability distribution whose support is an uncountable set, such as an interval in the real line.
In mathematics, the support of a real-valued function $f$ is the subset of the domain containing the elements which are not mapped to zero.
Comparing the definitions of "continuous probability distribution" and "support", we see that the support of a normal distribution must be uncountable (indeed, it is all of $\mathbb{R}$), and therefore cannot be countable, which is exactly what the first definition requires of a discrete probability distribution.
The central limit theorem applies to discrete and continuous variables alike. In particular, the binomial distribution, suitably standardized, tends to a normal one as $n\to\infty$ (the de Moivre–Laplace theorem), and so may be called "almost normal".
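A quick numerical sketch of this claim (the parameters $n=1000$, $p=0.5$ are my own choice): by the de Moivre–Laplace theorem, the binomial pmf at $k$ is approximated by the normal density with mean $np$ and variance $np(1-p)$ evaluated at $k$.

```python
import math

def binom_pmf(k, n, p):
    # exact binomial probability P(X = k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def normal_pdf(x, mu, sigma):
    # normal density with mean mu and standard deviation sigma
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

n, p = 1000, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

# de Moivre-Laplace: P(X = k) is close to the normal density at k for large n
max_err = max(abs(binom_pmf(k, n, p) - normal_pdf(k, mu, sigma))
              for k in range(n + 1))
print(max_err)
```

The maximum pointwise discrepancy is tiny compared with the peak probability of about $0.025$, even though the binomial variable itself remains discrete.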
The short and strict answer to your question is no. By definition, all normally distributed random variables are continuous random variables. A random variable cannot be continuous and discrete at the same time, so the definition excludes the existence of normally distributed discrete random variables.
The longer answer is "It depends on what 'somehow' means". There is a very famous theorem called the central limit theorem which tells you that random variables that can be written as the average of a large number of independent random variables will be "close" to a normal distribution.
In particular, if the $X_i$ are independent, identically distributed random variables with mean $\mu$ and variance $\sigma^2$, then we can define $\overline{X_n}=\frac{1}{n}(X_1+\cdots+X_n)$, and the theorem tells us that $\sqrt{n}(\overline{X_n} - \mu)$ converges in distribution to a normally distributed random variable with mean $0$ and variance $\sigma^2$.
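This can be seen in a simulation sketch with a fair die (a discrete variable with $\mu = 3.5$ and $\sigma^2 = 35/12$; the sample sizes and seed below are my own choices): the statistic $\sqrt{n}(\overline{X_n}-\mu)$ should have mean near $0$ and variance near $\sigma^2$.

```python
import math
import random
import statistics

random.seed(0)          # fixed seed for reproducibility
mu = 3.5                # mean of a fair die
sigma2 = 35 / 12        # variance of a fair die

n, trials = 500, 2000
# one value of Z = sqrt(n) * (Xbar_n - mu) per trial
zs = [math.sqrt(n) * (sum(random.randint(1, 6) for _ in range(n)) / n - mu)
      for _ in range(trials)]

print(statistics.mean(zs), statistics.pvariance(zs))
```

The empirical mean and variance of the standardized averages land close to $0$ and $35/12 \approx 2.92$, as the theorem predicts, even though each summand takes only six values.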