Visualization
Scoring rules alone are not enough for a thorough evaluation of probabilistic forecasts; visualizations provide a useful complement.
scoringrules.visualization.reliability_diagram
reliability_diagram(
    observations: np.ndarray,
    forecasts: np.ndarray,
    /,
    uncertainty_band: tp.Literal["confidence", "consistency"] | None = "consistency",
    n_bootstrap: int = 100,
    alpha: float = 0.05,
    ax: plt.Axes | None = None,
) -> plt.Axes
Plot the reliability diagram of a set of predictions.
CORP: Consistent, Optimally binned, Reproducible, PAV-algorithm based reliability diagram from Dimitriadis et al. (2021).
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| observations | ndarray | The observed outcomes, either 0 or 1. | required |
| forecasts | ndarray | Forecasted probabilities between 0 and 1. | required |
| uncertainty_band | Literal['confidence', 'consistency'] \| None | The type of uncertainty band to plot, either 'confidence' or 'consistency'. | 'consistency' |
| n_bootstrap | int | The number of bootstrap samples to use for the uncertainty band. | 100 |
| alpha | float | The significance level for the uncertainty band. | 0.05 |
Returns:

| Name | Type | Description |
|---|---|---|
| ax | Axes | The CORP reliability diagram plot. |
Examples:
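A minimal usage sketch. The synthetic data and seed are illustrative assumptions, not taken from the library documentation; only the import path `scoringrules.visualization.reliability_diagram` and the signature above are given by the source.

```python
import numpy as np
import matplotlib.pyplot as plt
from scoringrules.visualization import reliability_diagram

rng = np.random.default_rng(42)  # illustrative seed

# Simulate well-calibrated probability forecasts and matching binary outcomes:
# each outcome is drawn with probability equal to its forecast.
forecasts = rng.uniform(0.0, 1.0, size=1000)
observations = rng.binomial(1, forecasts)

# Plot the CORP reliability diagram with a 95% consistency band
ax = reliability_diagram(
    observations,
    forecasts,
    uncertainty_band="consistency",
    n_bootstrap=100,
    alpha=0.05,
)
plt.show()
```

Because the simulated forecasts are calibrated by construction, the diagram should track the diagonal, with the consistency band indicating the deviation expected under perfect calibration.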