
Over these holidays (two weeks) I need to learn probability as applied to machine learning.

While reading a paper, I found this: "The training objective is thus to minimize the pixel-wise multi-class crossentropy loss ...". I have no idea what it is, so I realized that I need to learn probability.

NOTE: I know what a loss function is and how to minimize it. My problem is with cross-entropy. I've been searching about it, and it is related to information theory and probability.

Do you know of any good probability crash course?

  • This is quite an endeavor to complete in two weeks; this is friendly enough if you want to take a look. That being said, I don't think you need to take a whole course in probability to translate that sentence. In a nutshell, it just means that you need to minimize a function; that function happens to have a long name, but other than that it is just an optimization problem – caverac Dec 24 '20 at 15:27
  • You can get an intuitive understanding of the cross-entropy loss function directly, quite quickly, without taking an entire course in probability and without knowing anything about information theory. Just read this question and the answer I posted: https://math.stackexchange.com/questions/3389976/what-is-the-motivation-for-using-cross-entropy-to-compare-two-probability-vector – littleO Dec 25 '20 at 06:51
  • There are a lot of book suggestions for probability and machine learning: https://www.google.com/search?client=firefox-b-d&q=%22machine+learning%22+%22probability%22+%22book%22 – VansFannel Dec 25 '20 at 06:55
  • @caverac Thanks for your comment, but I know what a loss function is and how to minimize it. My problem is with cross-entropy. I've been searching about it, and it is related to information theory and probability. Thanks a lot. – VansFannel Dec 25 '20 at 07:07
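
For what it's worth, here is a minimal NumPy sketch of the quoted training objective (the 2-pixel / 3-class numbers and the names `probs` and `targets` are made up for illustration, not taken from the paper): the pixel-wise multi-class cross-entropy is just the negative log of the probability the model assigns to the true class of each pixel, averaged over pixels, and training minimizes that average.

```python
import numpy as np

# Hypothetical example: 2 pixels, 3 classes.
# `probs` holds the model's predicted class probabilities per pixel (rows sum to 1);
# `targets` holds the true class index for each pixel.
probs = np.array([[0.7, 0.2, 0.1],   # pixel 0: model favours class 0
                  [0.1, 0.3, 0.6]])  # pixel 1: model favours class 2
targets = np.array([0, 2])           # ground-truth class per pixel

# Cross-entropy per pixel: -log of the probability assigned to the true class.
pixel_losses = -np.log(probs[np.arange(len(targets)), targets])

# The training objective is the average over all pixels.
loss = pixel_losses.mean()
print(loss)
```

Running this prints roughly 0.43; the value shrinks toward 0 as the model puts more probability on the correct class of each pixel, which is exactly what "minimize the pixel-wise multi-class cross-entropy loss" asks for.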
