## Rough Notes
- cross entropy measures the total number of bits (or nats) needed to represent events from one probability distribution when using a code optimized for another distribution
- related to [[KL-Distribution divergence|KL-Divergence]] but not the same. KL divergence is the relative entropy between 2 distributions (the *extra* bits needed), while cross-entropy is the total entropy: H(P, Q) = H(P) + D_KL(P ‖ Q). See the sketch after this list.
- not to be confused with [[log-loss]], a.k.a. logistic loss. Even though cross entropy and log loss give the same value when used as a loss function, on their own they arise from different underlying concepts (information theory vs. logistic regression / maximum likelihood).
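
A minimal numpy sketch of the two relationships above: cross-entropy splits into entropy plus KL divergence, and with a one-hot target it collapses to the log-loss term. The distributions `p` and `q` are made-up values for illustration.

```python
import numpy as np

p = np.array([0.10, 0.40, 0.50])   # "true" distribution P (illustrative values)
q = np.array([0.80, 0.15, 0.05])   # "predicted" distribution Q (illustrative values)

entropy_p     = -np.sum(p * np.log(p))       # H(P)
cross_entropy = -np.sum(p * np.log(q))       # H(P, Q)
kl_divergence =  np.sum(p * np.log(p / q))   # D_KL(P ‖ Q), the relative entropy

# Cross-entropy is the total: H(P, Q) = H(P) + D_KL(P ‖ Q)
assert np.isclose(cross_entropy, entropy_p + kl_divergence)

# With a one-hot target (class 1 is the true class), cross-entropy reduces to
# -log(q[true_class]), which is exactly the log-loss / logistic-loss term.
one_hot = np.array([0.0, 1.0, 0.0])
assert np.isclose(-np.sum(one_hot * np.log(q)), -np.log(q[1]))
```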
## Resources
- [A Gentle Introduction to Cross-Entropy for Machine Learning - MachineLearningMastery.com](https://machinelearningmastery.com/cross-entropy-for-machine-learning/)
---
- Links:
- Created at: 2023-05-10