
I understand that the log of the determinant of the covariance matrix bounds the entropy for Gaussian-distributed data. Is this also the case for non-Gaussian data, and if so, why?

The question "What does the determinant of a covariance matrix give?" and http://web.ntpu.edu.tw/~phwang/teaching/2012s/IT/slides/chap08.pdf show the connection between differential entropy and the log of the determinant of the covariance matrix in the Gaussian case. Eqn. 26 of https://arxiv.org/pdf/1604.03924.pdf?fbclid=IwAR1tDOzgZ2iXSo3lDbXnr8TUkxawCA8NikHFlfY4E5OWmbmJ3_WHeVPotFE has some related results (I guess for the non-Gaussian case, but I have yet to check that).

hearse
  • 141

1 Answer

I understand that the log of the determinant of the covariance matrix bounds the entropy for Gaussian-distributed data

Wrong. For a Gaussian, that is not a bound but the exact value of the entropy. And it is not exactly the log of the determinant of the covariance matrix, but rather

$$H(Z)=\frac{k}{2} \ln(2 \pi e)+\frac12 \ln (| \Sigma|)$$
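This formula can be checked numerically against scipy's built-in entropy for a multivariate normal (a sketch; the $3\times 3$ covariance matrix below is an arbitrary positive-definite example, not from the question):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Arbitrary positive-definite covariance matrix for illustration
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
k = Sigma.shape[0]

# Closed form: H(Z) = (k/2) ln(2*pi*e) + (1/2) ln|Sigma|   (in nats)
closed_form = 0.5 * k * np.log(2 * np.pi * np.e) \
            + 0.5 * np.log(np.linalg.det(Sigma))

# scipy's differential entropy of the same Gaussian
scipy_value = multivariate_normal(mean=np.zeros(k), cov=Sigma).entropy()

print(closed_form, scipy_value)  # the two values agree
```

Note that only the second term depends on $\Sigma$; the $\frac{k}{2}\ln(2\pi e)$ term is a constant offset depending on the dimension $k$ alone.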

Is this also the case for non-Gaussian data, and if so, why?

It's a well-known result that, among all distributions with a given covariance, the Gaussian maximizes the differential entropy. See e.g. here.

Hence, yes, for non-Gaussian data that value is an upper bound on the entropy.
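As a concrete one-dimensional illustration of the bound (a sketch; the half-width `a` is an arbitrary choice): a uniform distribution on $[-a, a]$ has variance $a^2/3$ and differential entropy $\ln(2a)$, which must not exceed the entropy of the Gaussian with the same variance, $\tfrac12\ln(2\pi e\,\sigma^2)$.

```python
import numpy as np

a = 1.7                  # arbitrary half-width of the uniform distribution
sigma2 = a**2 / 3.0      # its variance

h_uniform = np.log(2 * a)                              # entropy of U(-a, a)
h_gaussian = 0.5 * np.log(2 * np.pi * np.e * sigma2)   # matching-variance Gaussian

print(h_uniform <= h_gaussian)  # True: the Gaussian value bounds it
```

The gap $\tfrac12\ln(2\pi e/3) - \ln 2 \approx 0.18$ nats is independent of $a$, since rescaling shifts both entropies by the same $\ln$ factor.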

leonbloy
  • 66,202