
    Log Loss

    Also known as:
    Logarithmic Loss
    Logistic Loss
    Binary Cross-Entropy Loss
    Updated: 2/12/2026

    A loss function that evaluates the quality of predicted probabilities, heavily penalizing predictions that are both confident and wrong.

    Quick Summary

    Log Loss evaluates probabilistic predictions: the penalty for a confident wrong prediction grows without bound as the predicted probability of the true class approaches zero.

    Explanation

    Log Loss = -(1/n) × Σᵢ [yᵢ·log(pᵢ) + (1-yᵢ)·log(1-pᵢ)], where yᵢ ∈ {0, 1} is the true label of example i and pᵢ is the predicted probability that yᵢ = 1. Perfect predictions (pᵢ = yᵢ) give loss = 0; a confident prediction on the wrong side drives the loss toward infinity.
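    The formula above can be sketched in a few lines of plain Python (the function name `log_loss` is my own, not from any particular library):

```python
import math

def log_loss(y_true, p_pred):
    """Mean negative log-likelihood of binary labels under predicted probabilities."""
    total = sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(y_true, p_pred))
    return -total / len(y_true)

# Confident and correct predictions cost little:
print(log_loss([1, 0], [0.9, 0.1]))   # ≈ 0.105
# The same confidence on the wrong side costs over 20x more:
print(log_loss([1, 0], [0.1, 0.9]))   # ≈ 2.303
```

    Note the asymmetry in the printout: both probability vectors are equally far from 0.5, but only the miscalibrated one is punished hard.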

    Marketing Relevance

    Log Loss is a standard metric for probabilistic classification: unlike accuracy, it rewards well-calibrated probabilities rather than merely correct class labels, which matters whenever the predicted probability itself drives a decision (bidding, targeting, budget allocation).

    Common Pitfalls

    Highly sensitive to outlier predictions: a single predicted probability of exactly 0 or 1 on a misclassified example makes the loss infinite, so predictions must be clipped away from the extremes (e.g., into [ε, 1-ε]). The raw value is also hard to interpret without a baseline; compare it against the loss of always predicting the base rate.
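    A minimal sketch of the clipping fix, assuming ε = 1e-15 (a common default; the helper name `safe_log_loss` is illustrative):

```python
import math

def safe_log_loss(y_true, p_pred, eps=1e-15):
    """Log loss with probabilities clipped into [eps, 1-eps] to avoid log(0)."""
    total = 0.0
    for y, p in zip(y_true, p_pred):
        p = min(max(p, eps), 1 - eps)   # clip away from 0 and 1
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return -total / len(y_true)

# Without clipping, p = 0.0 on a positive example would be -log(0) = infinity.
print(safe_log_loss([1], [0.0]))   # large but finite, ≈ 34.54
```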

    Origin & History

    Log Loss is rooted in maximum likelihood estimation: minimizing it is equivalent to maximizing the likelihood of the observed labels under a Bernoulli model, and it is mathematically identical to binary cross-entropy.

    Comparisons & Differences

    Log Loss vs. AUC

    AUC measures ranking (how well positives are ordered above negatives) and is unchanged by any monotone rescaling of the scores; Log Loss measures the quality of the probabilities themselves. Used together they give a more complete picture.
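    The distinction can be seen with two score sets that rank the examples identically but differ in sharpness; a sketch using a pairwise AUC (all names and the toy data are illustrative):

```python
import math

def log_loss(y_true, p_pred):
    """Mean negative log-likelihood of binary labels under predicted probabilities."""
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(y_true, p_pred)) / len(y_true)

def auc(y_true, p_pred):
    """Fraction of (positive, negative) pairs ranked correctly (ties count half)."""
    pos = [p for y, p in zip(y_true, p_pred) if y == 1]
    neg = [p for y, p in zip(y_true, p_pred) if y == 0]
    wins = sum((pp > pn) + 0.5 * (pp == pn) for pp in pos for pn in neg)
    return wins / (len(pos) * len(neg))

y      = [1, 1, 0, 0]
sharp  = [0.9, 0.8, 0.2, 0.1]    # confident, well-placed probabilities
hedged = [0.6, 0.55, 0.45, 0.4]  # same ordering, pushed toward 0.5

print(auc(y, sharp), auc(y, hedged))            # both 1.0: identical ranking
print(log_loss(y, sharp), log_loss(y, hedged))  # ≈ 0.164 vs ≈ 0.554
```

    AUC cannot tell the two models apart; Log Loss prefers the one whose probabilities are closer to the truth.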
