# Interpreting Results
This guide explains how to interpret the output files generated by PyEEPAS workflows.
## Output File Overview

PyEEPAS generates two types of output files:

- **Parameter files** (`.csv`): fitted model parameters
- **Forecast files** (`.mat`): spatiotemporal earthquake rate forecasts
## Directory Structure

After running a complete workflow, your results directory will contain:

```
results_italy_reproduce/
├── Fitted_par_PPE_1990_2012.csv          # PPE parameters (a, d, s)
├── Fitted_par_aftershock_1990_2012.csv   # Aftershock parameters (v, k)
├── Fitted_par_EEPAS_1990_2012.csv        # EEPAS parameters (8 params)
├── PREVISIONI_3m_PPE_2012_2022.mat       # PPE forecast matrix
└── PREVISIONI_3m_EEPAS_2012_2022.mat     # EEPAS forecast matrix
```
## Parameter Files

### PPE Parameters

**File:** `Fitted_par_PPE_YYYY1_YYYY2.csv`

**Format:**

```
a,d,s,ln_likelihood
0.616,29.64,1e-15,-514.10
```
**Parameters:**

| Parameter | Description |
|---|---|
| `a` | Normalization factor for the PPE spatial kernel |
| `d` | Characteristic distance (km) |
| `s` | Uniform background rate |
| `ln_likelihood` | Log-likelihood (higher is better) |
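Since the file is a one-row CSV, it can be read with the standard library alone. A minimal sketch, using the example values from this guide as inline file contents (in practice, open the CSV from your results directory instead):

```python
import csv
import io

# Example contents of a fitted PPE parameter file (values from this guide);
# replace the StringIO with open(...) on your actual results file.
csv_text = "a,d,s,ln_likelihood\n0.616,29.64,1e-15,-514.10\n"

row = next(csv.DictReader(io.StringIO(csv_text)))
params = {name: float(value) for name, value in row.items()}
print(params)  # {'a': 0.616, 'd': 29.64, 's': 1e-15, 'ln_likelihood': -514.1}
```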
### Aftershock Parameters

**File:** `Fitted_par_aftershock_YYYY1_YYYY2.csv`

**Format:**

```
v,k,ln_likelihood
0.577,0.205,-429.26
```
**Mathematical definition** (from the model equation λ′ = ν·λ₀ + κ·Σᵢλᵢ′):

- **ν** (`v`): coefficient for the PPE baseline rate λ₀
- **κ** (`k`): coefficient for the precursor rate Σᵢλᵢ′ (after short-term exclusion)
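As a worked illustration of this equation, a sketch in plain Python; only ν and κ come from the example file above, while the baseline and precursor rates are hypothetical values chosen for the example:

```python
# Fitted coefficients from the example aftershock parameter file above
nu, kappa = 0.577, 0.205

def combined_rate(lambda0, precursor_rates):
    """Combined rate: lambda' = nu * lambda0 + kappa * sum(lambda_i')."""
    return nu * lambda0 + kappa * sum(precursor_rates)

# Hypothetical baseline rate and precursor contributions, for illustration only
rate = combined_rate(0.02, [0.5, 0.3, 0.1])
print(round(rate, 5))  # 0.19604
```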
### EEPAS Parameters

**File:** `Fitted_par_EEPAS_YYYY1_YYYY2.csv`

**Format:**

```
am,bm,Sm,at,bt,St,ba,Sa,u,ln_likelihood
1.234,1.000,0.242,2.588,0.349,0.150,0.504,1.000,0.167,-495.39
```
**Parameters:**

| Parameter | Description |
|---|---|
| `am` | Magnitude scaling intercept (M_mainshock = am + bm·M_precursor) |
| `bm` | Magnitude scaling slope (fixed = 1.0) |
| `Sm` | Magnitude distribution standard deviation |
| `at` | Time scaling intercept (log₁₀(T_precursor) = at + bt·M_precursor) |
| `bt` | Time scaling slope |
| `St` | Time distribution standard deviation |
| `ba` | Spatial scaling exponent (σ² = σ_A²·10^(ba·m)) |
| `Sa` | Spatial distribution base standard deviation |
| `u` | Failure-to-predict rate |
| `ln_likelihood` | Log-likelihood (higher is better) |
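To illustrate the magnitude and time scaling relations from the table, a sketch using the example fitted values; the precursor magnitude is a hypothetical input, and the unit of the resulting time scale depends on your configuration:

```python
# Example fitted EEPAS values from the parameter file above
am, bm = 1.234, 1.000   # magnitude scaling intercept and slope
at, bt = 2.588, 0.349   # time scaling intercept and slope

m_precursor = 5.0                       # hypothetical precursor magnitude
m_mainshock = am + bm * m_precursor     # expected mainshock magnitude
log10_T = at + bt * m_precursor         # log10 of the precursor time scale
print(m_mainshock, round(log10_T, 3))  # 6.234 4.333
```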
## Forecast Files

### File Structure

**Files:** `PREVISIONI_3m_PPE_YYYY1_YYYY2.mat` and `PREVISIONI_3m_EEPAS_YYYY1_YYYY2.mat`

**Format:** MATLAB `.mat` files containing forecast matrices

**Loading forecast files:**
```python
import scipy.io as sio
import numpy as np

# Load PPE forecast (adjust filename to your results)
mat_ppe = sio.loadmat('results_italy_reproduce/PREVISIONI_3m_PPE_2012_2022.mat')
ppe_forecast = mat_ppe['PREVISIONI_3m']

# Load EEPAS forecast
mat_eepas = sio.loadmat('results_italy_reproduce/PREVISIONI_3m_EEPAS_2012_2022.mat')
eepas_forecast = mat_eepas['PREVISIONI_3m_less']

print(f'PPE forecast shape: {ppe_forecast.shape}')
# Actual output: (1000, 178) -> 1000 rows (40 time windows x 25 magnitude bins),
# 178 columns (1 index column + 177 spatial cells)
```
### Matrix Structure

**Dimensions:** [N_rows × N_cols], where:

- N_rows = N_time_windows × N_mag_bins
- N_mag_bins = 25 (magnitude range from mT to mT+5.0 in 0.1 steps)
- N_cols = 1 + N_spatial_cells (the first column is a time window index)
**Important:** the first column (column 0) is a time window index, not forecast data!

```python
# WRONG - includes the index column
total_rate_wrong = np.sum(ppe_forecast)

# CORRECT - exclude the index column
total_rate_correct = np.sum(ppe_forecast[:, 1:])
```
**Column layout:**

```
Column 0: Time window index (1, 2, 3, ..., T)
Column 1: Rate for spatial cell 1
Column 2: Rate for spatial cell 2
...
Column N: Rate for spatial cell N
```
**Row organization** (rows grouped by time window, with 25 magnitude bins per window):

```
Rows 0-24:               Time window 1, magnitude bins [mT, mT+5.0) in 0.1 steps
Rows 25-49:              Time window 2, magnitude bins [mT, mT+5.0) in 0.1 steps
Rows 50-74:              Time window 3, magnitude bins [mT, mT+5.0) in 0.1 steps
...
Rows (T-1)×25 to T×25-1: Time window T, 25 magnitude bins
```
**Example:** for 177 spatial cells and 40 time windows:

```
Matrix shape: (1000, 178)
Rows 0-24:    Time window 1 (25 magnitude bins × 177 spatial cells)
Rows 25-49:   Time window 2 (25 magnitude bins × 177 spatial cells)
...
Rows 975-999: Time window 40 (25 magnitude bins × 177 spatial cells)
```
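Under this layout, the spatial rate map for a given time window and magnitude bin is a single row minus the index column. A sketch of the indexing; the helper function and the zero-filled dummy matrix are illustrative, not part of PyEEPAS:

```python
import numpy as np

# Dummy forecast matrix with the documented shape:
# 40 windows x 25 magnitude bins of rows, 1 index column + 177 spatial cells
N_windows, N_bins, N_cells = 40, 25, 177
forecast = np.zeros((N_windows * N_bins, 1 + N_cells))
forecast[:, 0] = np.repeat(np.arange(1, N_windows + 1), N_bins)  # index column

def rate_map(forecast, window, mag_bin, n_bins=25):
    """Spatial rates for time window `window` (1-based) and magnitude
    bin `mag_bin` (0-based), excluding the index column."""
    row = (window - 1) * n_bins + mag_bin
    return forecast[row, 1:]

cells = rate_map(forecast, window=3, mag_bin=0)
print(cells.shape, forecast[50, 0])  # (177,) 3.0
```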
## Validation and Quality Checks

### Lambda Sum Validation

The sum of forecast rates should approximately equal the observed number of target events.

**Theory:** for a Poisson process, E[N] = Λ, where Λ = ∫∫∫∫ λ(t, x, y, m) dt dx dy dm.

Use the provided analysis tool to verify the Lambda sum:

```shell
python3 analysis/analyze_forecast_lambda.py
```
This tool will:

- Load forecast matrices from your results directory
- Calculate Lambda sums (correctly excluding index columns)
- Compare them with observed event counts
- Report whether the forecasts are well-calibrated
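The core of that check can also be done by hand. A sketch on a dummy matrix; real files are loaded with `scipy.io.loadmat` as shown earlier, and the observed count comes from your target catalog:

```python
import numpy as np

# Dummy forecast matrix standing in for a loaded PREVISIONI_* matrix
rng = np.random.default_rng(0)
forecast = rng.random((1000, 178)) * 0.001
forecast[:, 0] = np.repeat(np.arange(1, 41), 25)   # index column

Lambda = forecast[:, 1:].sum()   # total expected events (index excluded)
n_observed = 120                 # hypothetical observed target-event count
ratio = Lambda / n_observed      # a well-calibrated forecast gives ~1
```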
### Log-Likelihood Progression

When running EEPAS learning with auto-boundary adjustment, check that the log-likelihood improves across rounds.

**Good:** the log-likelihood increases (becomes less negative) across optimization rounds.
**Warning:** the log-likelihood decreases → possible optimization issue.

The actual log-likelihood values depend on your dataset (number of events, spatial extent, etc.).
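A quick monotonicity check over the per-round values; the list here is hypothetical:

```python
# Hypothetical per-round log-likelihoods from an EEPAS learning run
ln_likelihoods = [-512.4, -501.7, -496.2, -495.39]

# Each round should be at least as good (less negative) as the previous one
improved = all(later >= earlier
               for earlier, later in zip(ln_likelihoods, ln_likelihoods[1:]))
print(improved)  # True
```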
## See Also

- **Complete Workflows**: how to generate these results
- **Configuration Reference**: configuration options
- **Numerical Integration**: understanding the calculations