What is the difference between cross-entropy and log loss error?

Submitted by 南笙酒味 on 2019-12-04 19:31:17

Question


What is the difference between cross-entropy and log loss error? The formulae for both seem to be very similar.
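For reference, here are the two formulas as they are commonly written (a sketch under standard conventions, where y_i is the true label, p_i the predicted probability, N the number of samples, and K the number of classes):

```latex
% Binary log loss (y_i in {0,1}, p_i = predicted P(y_i = 1)):
\[
L_{\text{log}} = -\frac{1}{N}\sum_{i=1}^{N}\Bigl[\,y_i \log p_i + (1 - y_i)\log(1 - p_i)\Bigr]
\]

% Multi-class cross-entropy (y_{i,k} one-hot true labels, p_{i,k} = predicted P(class k)):
\[
L_{\text{CE}} = -\frac{1}{N}\sum_{i=1}^{N}\sum_{k=1}^{K} y_{i,k}\,\log p_{i,k}
\]
```

With K = 2, the inner sum of the cross-entropy expands to exactly the binary expression above.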


Answer 1:


They are essentially the same. Usually, the term log loss is used for binary classification problems, and the more general cross-entropy (loss) for multi-class classification, but even this distinction is not applied consistently, and you'll often find the two terms used interchangeably as synonyms.
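A minimal numerical sketch of this equivalence, assuming NumPy and scikit-learn are available (the variable names are illustrative, not from any particular library):

```python
# Show that sklearn's log_loss, a hand-rolled binary log loss, and a
# hand-rolled two-class cross-entropy all produce the same number.
import numpy as np
from sklearn.metrics import log_loss

y_true = np.array([0, 1, 1, 0])           # binary labels
y_prob = np.array([0.1, 0.8, 0.7, 0.3])   # predicted P(y = 1)

# Binary log loss: -mean( y*log(p) + (1-y)*log(1-p) )
manual_log_loss = -np.mean(
    y_true * np.log(y_prob) + (1 - y_true) * np.log(1 - y_prob)
)

# Multi-class cross-entropy specialized to two classes:
# -mean( sum_k y_k * log(p_k) ) with one-hot labels and per-class probabilities
y_onehot = np.stack([1 - y_true, y_true], axis=1)    # shape (n, 2)
p_classes = np.stack([1 - y_prob, y_prob], axis=1)   # shape (n, 2)
manual_cross_entropy = -np.mean(np.sum(y_onehot * np.log(p_classes), axis=1))

print(manual_log_loss)           # same value
print(manual_cross_entropy)      # same value
print(log_loss(y_true, y_prob))  # same value (sklearn's implementation)
```

All three printed values agree, which is the sense in which "log loss" and "cross-entropy" coincide on binary problems.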

From the Wikipedia entry for cross-entropy:

The logistic loss is sometimes called cross-entropy loss. It is also known as log loss.

From the fast.ai wiki entry on log loss:

Log loss and cross-entropy are slightly different depending on the context, but in machine learning when calculating error rates between 0 and 1 they resolve to the same thing.

From the ML Cheatsheet:

Cross-entropy loss, or log loss, measures the performance of a classification model whose output is a probability value between 0 and 1.



Source: https://stackoverflow.com/questions/50913508/what-is-the-difference-between-cross-entropy-and-log-loss-error
