I understand that the log of the determinant of the covariance matrix determines the differential entropy of Gaussian-distributed data, via $h(X) = \frac{1}{2}\log\left((2\pi e)^n \det\Sigma\right)$, and hence bounds the entropy. Does this hold for non-Gaussian data as well, and if so, why?
The question "What does Determinant of Covariance Matrix give?" and the slides at http://web.ntpu.edu.tw/~phwang/teaching/2012s/IT/slides/chap08.pdf show the connection between differential entropy and the log-determinant of the covariance matrix in the Gaussian case. Eqn. 26 of https://arxiv.org/pdf/1604.03924.pdf?fbclid=IwAR1tDOzgZ2iXSo3lDbXnr8TUkxawCA8NikHFlfY4E5OWmbmJ3_WHeVPotFE has some related relations (I guess for the non-Gaussian case, but I have yet to check that).
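To make the question concrete, here is a small numerical sketch (assuming NumPy and SciPy are available) of the Gaussian identity I mean, plus a 1-D non-Gaussian example where the same log-det expression appears to upper-bound the true entropy, which is the behavior I am asking about:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Gaussian case: h(X) = 0.5 * log((2*pi*e)^n * det(Sigma))
n = 3
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)  # a well-conditioned covariance matrix

sign, logdet = np.linalg.slogdet(Sigma)       # stable log-determinant
h_formula = 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)

# SciPy computes the same differential entropy directly from Sigma
h_scipy = multivariate_normal(cov=Sigma).entropy()
print(h_formula, h_scipy)  # the two should agree up to floating point

# Non-Gaussian sanity check in 1-D: a uniform on [-a, a] has variance
# a^2/3 and entropy log(2a); the Gaussian with the *same* variance has
# entropy 0.5*log(2*pi*e*a^2/3), which is strictly larger.
a = 2.0
var = a**2 / 3
h_uniform = np.log(2 * a)
h_gauss_same_var = 0.5 * np.log(2 * np.pi * np.e * var)
print(h_uniform, h_gauss_same_var)
```

So at least in this toy example the Gaussian log-det expression sits above the non-Gaussian entropy; my question is whether that is guaranteed in general and what the underlying reason is.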