The standard deviation and the entropy are not the same, but a transformation of the standard deviation, the coefficient of variation ($CV_Y := \frac{\sigma_Y}{\mu_Y}$), is part of the single-parameter generalized entropy family of measures of inequality (or, technically, a transformation of the CV is part of the entropy family). There was a vibrant literature on this in British econometrics in the 80s; the standard papers are from Shorrocks (1980, 1982, 1983), but Cowell's textbook (various editions), Measuring Inequality, is a good source.
Generally, if $\theta$ is our parameter, we can write the generalized entropy formula as follows (Cowell has a really good discussion of the meaning of $\theta$ that I will skip here):
$$
\begin{align*}
E_\theta = \frac{1}{\theta(\theta-1)} \left[ \frac{1}{N} \sum_{i=1}^N \left\{ \frac{y_i}{\mu_Y}\right\}^\theta - 1 \right], \qquad \theta \notin \{0, 1\}
\end{align*}
$$
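To make this concrete, here is a minimal Stata sketch (a toy example of my own; the data and variable names are not from any of the sources above) that evaluates $E_\theta$ at $\theta = 2$ for a five-person income vector:

clear
set obs 5
gen y = 10*_n                        // toy incomes: 10, 20, 30, 40, 50
local theta = 2                      // any theta outside {0, 1} works here
qui sum y
gen ratio = (y/r(mean))^`theta'      // {y_i / mu_Y}^theta
qui sum ratio
di "E_`theta' = " (1/(`theta'*(`theta'-1)))*(r(mean) - 1)

For these incomes the answer is $1/9$, which you can confirm equals half the (population) squared $CV$: the variance is $200$, the mean is $30$, and $\frac{1}{2} \cdot \frac{200}{900} = \frac{1}{9}$.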
For $\theta=2$, the formula works out to half the square of the $CV$ (some people call this square the relvariance, although I don't see the term often; Kish's famous book on sampling methods uses it).
$$
\begin{align*}
E_2 &= \frac{1}{2} \left[ \frac{1}{N} \sum_{i=1}^N
\left\{ \frac{y_i}{\mu_Y} \right\}^2 - 1 \right] \cr
&= \frac{1}{2} \frac{1}{N} \sum_{i=1}^N
\left\{ \frac{y_i-\mu_Y}{\mu_Y} \right\}^2 \cr
&= \frac{1}{2} [CV]^2
\end{align*}
$$
To get from line one to line two, note that completing the square requires subtracting $\frac{1}{2}\frac{1}{N} \sum_i 2\frac{y_i \mu_Y}{\mu_Y^2} = \frac{1}{2}\frac{1}{N\mu_Y} 2N \mu_Y = 1$ and adding back $\frac{1}{2}\frac{1}{N} \sum_i \left(\frac{\mu_Y}{\mu_Y}\right)^2 = \frac{1}{2} \frac{1}{N} N = \frac{1}{2}$; that net $-\frac{1}{2}$ exactly offsets dropping the bracketed $-1$, which contributes $-\frac{1}{2}$ after multiplication by the leading $\frac{1}{2}$.[^1]
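As a quick sanity check (my own arithmetic, not from the sources above), take a two-person economy with incomes $y = (1, 3)$, so $\mu_Y = 2$:
$$
\begin{align*}
E_2 &= \frac{1}{2} \left[ \frac{1}{2} \left\{ \left(\frac{1}{2}\right)^2 + \left(\frac{3}{2}\right)^2 \right\} - 1 \right] = \frac{1}{2}\left[\frac{5}{4} - 1\right] = \frac{1}{8} \cr
\frac{1}{2}[CV]^2 &= \frac{1}{2} \cdot \frac{\frac{1}{2}\left[(1-2)^2 + (3-2)^2\right]}{2^2} = \frac{1}{2} \cdot \frac{1}{4} = \frac{1}{8}
\end{align*}
$$
The two routes agree.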
To get the more familiar entropy formula, we need L'Hôpital's rule at $\theta = 1$, since both the bracketed term and $\theta(\theta-1)$ go to zero there.
$$
\begin{align*}
E_1 := \lim_{\theta \rightarrow 1} E_\theta &=
\left. \frac{\frac{\text{d}}{\text{d}\theta}\left[ \frac{1}{N} \sum_{i=1}^N \left\{ \frac{y_i}{\mu_Y}\right\}^\theta - 1\right]}{\frac{\text{d}}{\text{d}\theta}\, \theta(\theta-1)} \right|_{\theta = 1} \cr
&= \left. \frac{1}{N(2\theta-1)} \sum_{i=1}^N
\left\{ \frac{y_i}{\mu_Y} \right\}^\theta \ln\frac{y_i}{\mu_Y} \right|_{\theta=1} \cr
&= \frac{1}{N} \sum_{i=1}^N \frac{y_i}{\mu_Y} \ln\frac{y_i}{\mu_Y} \cr
\end{align*}
$$
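To see the limit numerically (a sketch of my own, reusing the auto data from the footnote's code), evaluate $E_\theta$ at a $\theta$ just above one and compare it against the closed form:

sysuse auto, clear
local theta = 1.001                  // theta close to, but not equal to, 1
qui sum price
gen limit_term = (price/r(mean))*ln(price/r(mean))   // summand of the closed form
gen ge_term = (price/r(mean))^`theta'                // summand of E_theta
qui sum limit_term
local e1_limit = r(mean)
qui sum ge_term
local e1_approx = (1/(`theta'*(`theta'-1)))*(r(mean) - 1)
di "Closed-form E_1:            `e1_limit'"
di "E_theta at theta = `theta': `e1_approx'"

The two numbers agree to roughly three decimal places, and the gap shrinks as $\theta$ approaches one.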
Finally, if we just move the $N$ inside the summation and consider it to be a coefficient on the mean, then the first factor becomes $\pi_i := \frac{y_i}{N\mu_Y} = \frac{y_i}{\sum_j y_j}$, the share of person $i$ in total income.
$$
\begin{align*}
E_1 &= \sum_{i=1}^N \frac{y_i}{N\mu_Y} \ln\frac{y_i}{\mu_Y} \cr
&= \sum_{i=1}^N \pi_i \ln(N\pi_i) \cr
\end{align*}
$$
If you call $\pi_i$ the "probability of a dollar of national income belonging to $i$", then expanding $\ln(N\pi_i) = \ln N + \ln \pi_i$ and using $\sum_i \pi_i = 1$, we have, finally, the standard entropy formula, up to the constant $\ln N$:
$$
\begin{align*}
E_1 &= \ln N + \sum_{i=1}^N \pi_i \ln \pi_i = \ln N - H(\pi) \cr
\end{align*}
$$
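And a short check of the last identity (again my own sketch on the auto data, treating price as "income"):

sysuse auto, clear
qui sum price
local N = r(N)
gen share = price/(`N'*r(mean))      // pi_i: person i's share of total income
gen direct = share*ln(`N'*share)     // summand of E_1 = sum_i pi_i ln(N pi_i)
gen plogp = share*ln(share)          // summand of -H(pi)
qui sum direct
local e1 = `N'*r(mean)
qui sum plogp
local H = -`N'*r(mean)
di "E_1 computed directly: `e1'"
di "ln(N) - H(pi):         " ln(`N') - `H'

Both lines should print the same number.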
So, TL;DR, the generalized entropy formula allows you to recover a simple transformation of the standard deviation (half the squared $CV$) as well as the "regular" entropy formula (with a slightly modified interpretation: $E_1$ measures how far the entropy of the income shares falls below its maximum, $\ln N$, which obtains under perfect equality).
[^1]: I have seen this fact reported as "the entropy of degree two is half the relvariance", and the algebra above and simulations (see below for Stata code) bear it out, provided the $-1$ stays inside the $\frac{1}{\theta(\theta-1)}$ factor (in the Stata code, I rescale the variance to the uncorrected, $1/N$, denominator so that it agrees with the entropy formula).
sysuse auto, clear
qui sum price
local cv = r(sd)/r(mean)
* rescale to the uncorrected (1/N) variance so it matches the entropy formula
local halfrelvar = 0.5*(`cv'^2*(r(N)-1)/r(N))
gen meansqprice = (price/r(mean))^2
qui sum meansqprice
* E_2 = (1/2)*[(1/N) sum (y_i/mu)^2 - 1]
local e2 = 0.5*(r(mean) - 1)
di "The entropy of degree two is `e2'"
di "Half the relvariance is `halfrelvar'"