Definition
Entropy is the foundation of Mutual Information, which quantifies the relationship between two random variables. It is also the basis of Relative Entropy (KL divergence) and Cross Entropy.
Surprise
Surprise has an inverse relationship with probability: when a high-probability event happens, we feel little surprise; when a very low-probability event happens, we feel very surprised.
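The standard way to formalize this (the notation here is an assumption, since the notes don't spell it out) is to define surprise as the log of the inverse probability:

$$\text{Surprise}(x) = \log\frac{1}{p(x)} = -\log p(x)$$

With this definition, an event with $p(x) = 1$ gives zero surprise, and the surprise grows without bound as $p(x) \to 0$.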
Entropy
Entropy is the expected value of surprise, averaged over all possible outcomes.
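Writing that expectation out with the surprise definition above gives the usual formula (again, standard notation assumed rather than quoted from the original):

$$H(X) = \mathbb{E}[\text{Surprise}] = \sum_x p(x)\log\frac{1}{p(x)} = -\sum_x p(x)\log p(x)$$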
Example
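The body of this example appears to be missing, so here is a minimal sketch: a hypothetical two-class (coin-flip) distribution with entropy computed from the formula above. The probabilities are made up for illustration.

```python
import math

def entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: both classes equally likely -> maximum expected surprise.
print(entropy([0.5, 0.5]))  # 1.0 bit

# A loaded coin: one class dominates -> outcomes are predictable, entropy drops.
print(entropy([0.9, 0.1]))  # ~0.469 bits
```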
Interpretation
The larger the difference between the probabilities of the two classes, the lower the entropy, as the example above shows: a heavily skewed distribution is predictable, so its expected surprise is small.
Definition
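The definition itself seems to be missing from this section. Reading it as the definition of Cross Entropy (an assumption, based on the section that follows), the standard formula for a true distribution $p$ and a model's predicted distribution $q$ is:

$$H(p, q) = -\sum_x p(x)\log q(x)$$

That is, the expected surprise under the model $q$ when outcomes actually follow $p$.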
Why we want Cross Entropy
Cross Entropy imposes a large penalty when the model makes a confident wrong prediction. During backpropagation, the correspondingly steep derivative (slope) of the loss produces a large weight update, so the model corrects bad predictions in big steps.
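A small sketch of that penalty behavior (the numbers, and the comparison to squared error, are my own illustration, not from the original):

```python
import math

def cross_entropy(p_true: float) -> float:
    """Cross-entropy loss for one example whose true class gets probability p_true."""
    return -math.log(p_true)

def squared_error(p_true: float) -> float:
    """Squared-error loss on the same prediction, for comparison."""
    return (1.0 - p_true) ** 2

for p in [0.9, 0.5, 0.1, 0.01]:
    # d/dp of -log(p) is -1/p: the gradient blows up as p -> 0,
    # so a confident wrong prediction triggers a big backpropagation step.
    grad = -1.0 / p
    print(f"p(true class)={p:<5} cross-entropy={cross_entropy(p):6.3f} "
          f"squared error={squared_error(p):5.3f} dCE/dp={grad:8.1f}")
```

Note how the squared error saturates at 1.0 while the cross-entropy loss and its gradient keep growing as the predicted probability of the true class approaches zero.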