When I first got into information theory, information was measured in terms of Shannon entropy; in other words, most of the books I had read talked about Shannon entropy. Today someone told me there is another measure of information called Fisher information, which confused me a lot. I tried to google both. Here are the links, Fisher information: https://en.wikipedia.org/wiki/Fisher_information and Shannon entropy: https://en.wikipedia.org/wiki/Entropy_(information_theory).
What are the differences and the relationship between Shannon entropy and Fisher information? Why do two kinds of information exist?
Currently, my impression is that Fisher information takes a statistical view (it is tied to a parametric model and its parameter) while Shannon entropy takes a purely probabilistic view (it is a property of the distribution itself).
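For reference, these are the definitions as I understand them from the two Wikipedia pages, so you can see what I am comparing (the notation below is just my reading of those pages):

$$H(X) = -\sum_{x} p(x)\,\log p(x)$$

for the Shannon entropy of a discrete random variable $X$ with pmf $p(x)$, and

$$I(\theta) = \operatorname{E}\!\left[\left(\frac{\partial}{\partial\theta}\,\log f(X;\theta)\right)^{2}\,\middle|\,\theta\right]$$

for the Fisher information that an observation $X \sim f(x;\theta)$ carries about the parameter $\theta$.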
Any comments or answers are welcome. Thanks.