Visualization

Scoring rules alone are not enough for a thorough evaluation of probabilistic forecasts. Visualizations can be used as a complement.

scoringrules.visualization.reliability_diagram

reliability_diagram(
    observations: np.ndarray,
    forecasts: np.ndarray,
    /,
    uncertainty_band: (
        tp.Literal["confidence", "consistency"] | None
    ) = "consistency",
    n_bootstrap: int = 100,
    alpha: float = 0.05,
    ax: plt.Axes = None,
) -> plt.Axes

Plot the reliability diagram of a set of predictions.

CORP: Consistent, Optimally binned, Reproducible, PAV-algorithm based reliability diagram from Dimitriadis et al. (2021).
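Under the hood, the CORP approach replaces fixed-width binning with the pool-adjacent-violators (PAV) algorithm: the binary outcomes are isotonically regressed on the forecast probabilities, and the recalibrated values are plotted against the original forecasts. Below is a minimal sketch of that step, assuming scikit-learn's `IsotonicRegression` as the PAV implementation; it illustrates the idea, not the library's internal code.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.isotonic import IsotonicRegression  # PAV implementation

rng = np.random.default_rng(42)
forecasts = rng.uniform(0, 1, 1024)                 # forecast probabilities
observations = rng.binomial(1, np.sqrt(forecasts))  # miscalibrated outcomes

# Sort by forecast value so the fitted curve can be drawn as a step function.
order = np.argsort(forecasts)
x = forecasts[order]

# PAV / isotonic regression of outcomes on forecasts yields the
# (optimally binned) conditional event probabilities.
cep = IsotonicRegression(y_min=0.0, y_max=1.0).fit_transform(x, observations[order])

fig, ax = plt.subplots(figsize=(4, 4))
ax.plot([0, 1], [0, 1], "k--", label="perfect calibration")
ax.step(x, cep, where="post", label="PAV recalibration")
ax.set_xlabel("forecast probability")
ax.set_ylabel("conditional event probability")
ax.legend()
plt.show()
```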

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `observations` | `ndarray` | The observed outcomes, either 0 or 1. | *required* |
| `forecasts` | `ndarray` | The forecast probabilities, between 0 and 1. | *required* |
| `uncertainty_band` | `Literal['confidence', 'consistency'] \| None` | The type of uncertainty band to plot, either a 'confidence' or a 'consistency' band. If `None`, no uncertainty band is plotted. | `'consistency'` |
| `n_bootstrap` | `int` | The number of bootstrap samples used to compute the uncertainty band. | `100` |
| `alpha` | `float` | The confidence level for the uncertainty band. | `0.05` |
| `ax` | `Axes` | The matplotlib `Axes` on which to draw the diagram. | `None` |

Returns:

| Name | Type | Description |
| ---- | ---- | ----------- |
| `ax` | `Axes` | The `Axes` object with the CORP reliability diagram plot. |

Examples:

>>> import numpy as np
>>> from scoringrules.visualization import reliability_diagram
>>> x = np.random.uniform(0, 1, 1024)
>>> y = np.random.binomial(1, np.sqrt(x), 1024)
>>> ax = reliability_diagram(y, x)
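
The uncertainty band can also be configured through the keyword arguments documented above. Continuing the example, here is a usage sketch (inferred from the signature above, not taken from the library's docs) that draws a bootstrap confidence band on a pre-created `Axes`:

>>> import matplotlib.pyplot as plt
>>> fig, ax = plt.subplots(figsize=(4, 4))
>>> ax = reliability_diagram(
...     y, x, uncertainty_band="confidence", n_bootstrap=500, alpha=0.05, ax=ax
... )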