Utility Modules#

This page documents utility modules that provide data loading, processing, and numerical integration functions.

Data Loader#

DataLoader - EEPAS System Data Loading Utility Class

Provides unified configuration file loading, model parameter management, and seismic catalog reading functionality.

class utils.data_loader.DataLoader[source]#

Bases: object

Unified data loading utility class

static load_config(config_file='config.json')[source]#

Load system configuration from JSON config file. Automatically fills in missing default values to ensure backward compatibility.

Parameters:

config_file – Configuration file path

Returns:

Configuration dictionary

Return type:

dict
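The default-filling behavior can be illustrated with a minimal sketch; the helper name and the default keys used below are hypothetical, not the actual DataLoader internals:

```python
import json
import tempfile

def load_config_sketch(config_file, defaults):
    """Load a JSON config and fill in missing keys with defaults (illustrative)."""
    with open(config_file) as f:
        cfg = json.load(f)
    for key, value in defaults.items():
        cfg.setdefault(key, value)  # keep user-provided values, add missing defaults
    return cfg

# Demo with a temporary config file and hypothetical default keys
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"dataPath": "./data"}, f)
    path = f.name

cfg = load_config_sketch(path, {"dataPath": "./", "verbose": False})
print(cfg["dataPath"], cfg["verbose"])  # user value kept, missing default filled
```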

static load_catalogs(input_arg)[source]#

Load seismic catalog files.

Parameters:

input_arg – Configuration file path (.json) or data directory path

Returns:

Tuple of three data matrices: (HORUS, CPTI11, CELLE)

Return type:

tuple

static load_model_params(config_file='config.json')[source]#

Load model parameters from configuration.

Parameters:

config_file – Configuration file path

Returns:

Model parameters dictionary

Return type:

dict

static load_spatial_regions(config_file='config.json')[source]#

Load spatial region definitions (supports single or dual region configurations).

Single region configuration uses the same grid for both testing and neighborhood regions. Dual region configuration uses a grid for testing and a polygon for neighborhood region.

Parameters:

config_file – Configuration file path

Returns:

Dictionary containing the following keys:

  • ’testing_region’: Testing region data (numpy array)

  • ’testing_type’: ‘grid’ or ‘polygon’

  • ’neighborhood_region’: Neighborhood region data (numpy array)

  • ’neighborhood_type’: ‘grid’ or ‘polygon’

  • ’config’: Region configuration (if exists)

Return type:

dict

static load_custom_stages(config_file='config.json')[source]#

Load custom optimization stages configuration.

Parameters:

config_file – Configuration file path

Returns:

Custom stages configuration, or None if not enabled:

{
    'enable': bool,
    'stages': [
        {
            'name': str,
            'parameters': list[str],
            'initialValues': list[float] or None,
            'lowerBounds': list[float],
            'upperBounds': list[float],
            'fixedValues': dict
        }
    ]
}

Return type:

dict or None

static validate_custom_stages(stages)[source]#

Validate custom stages configuration.

Checks:

  1. All parameter names are valid EEPAS parameters

  2. Bounds and initial values have correct lengths

  3. Fixed parameters + optimized parameters = 9 parameters

  4. No duplicate optimization of the same parameter in the same stage

Parameters:

stages – List of stage configurations

Raises:

ValueError – If validation fails

static detect_stage1_config_type(config_file='config.json')[source]#

Detect whether stage1 configuration is for single-stage or three-stage optimization.

Detection logic:

  • If stage1 optimizes all 8 parameters (am, Sm, at, bt, St, ba, Sa, u) → Single-stage

  • Otherwise → Three-stage (stage1 is part of three-stage optimization)

Parameters:

config_file – Configuration file path

Returns:

‘single’ if stage1 is single-stage config, ‘three’ if it’s three-stage config

Return type:

str


Catalog Processor#

CatalogProcessor - Seismic Catalog Preprocessing Utility Class

Provides unified earthquake catalog quality control, time conversion, and filtering functions.

Format Support:
This module uses HORUS format (10 columns) as the internal standard:

year, month, day, hour, minute, second, lat, lon, depth, mag

Extended format conversion support (via catalog_processor_extensions):
  • Input formats: ZMAP, CSEP, QuakeML, DataFrame, SeismoStats, pyCSEP, INGV HORUS

  • Output formats: MATLAB .mat, SeismoStats Catalog, pyCSEP CSEPCatalog

  • Methods: from_zmap(), from_csep(), from_quakeml(), from_dataframe(), from_seismostats(), from_pycsep(), from_ingv_horus(), to_mat(), to_seismostats(), to_pycsep()

All conversion methods are available as static methods on CatalogProcessor class.

utils.catalog_processor.decyear(date_array)[source]#

Convert date to decimal year.

Parameters:

date_array – Nx6 matrix [year, month, day, hour, minute, second]

Returns:

Decimal year

Return type:

np.array
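A scalar version of the conversion can be sketched as follows; this is an illustrative re-implementation, not the vectorized library function:

```python
from datetime import datetime

def decyear_scalar(year, month, day, hour=0, minute=0, second=0.0):
    """Decimal year: integer year plus the elapsed fraction of that year."""
    t = datetime(year, month, day, hour, minute)
    start = datetime(year, 1, 1)
    end = datetime(year + 1, 1, 1)          # year length handles leap years
    elapsed = (t - start).total_seconds() + second
    return year + elapsed / (end - start).total_seconds()

print(decyear_scalar(2001, 7, 2, 12))  # exact mid-point of a non-leap year -> 2001.5
```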

class utils.catalog_processor.CatalogProcessor[source]#

Bases: object

Seismic catalog preprocessing utility class

static preprocess_catalog(HORUS, catalog_start_year, learn_start_year, learn_end_year, completeness_threshold=2.0, max_depth=40, target_magnitude=None, filter_target_mag=False, verbose=False)[source]#

General preprocessing pipeline for earthquake catalogs. Standardizes magnitudes, filters quality, converts time coordinates.

Parameters:
  • HORUS – Raw seismic catalog matrix

  • catalog_start_year – Catalog start year

  • learn_start_year – Learning start year

  • learn_end_year – Learning end year

  • completeness_threshold – Completeness magnitude threshold

  • max_depth – Maximum depth (km)

  • target_magnitude – PPE target magnitude

  • filter_target_mag – Whether to enable target magnitude filtering

  • verbose – Verbose output mode

Returns:

(HORUS_processed, T1, T2)

Return type:

tuple

static create_catalogs(HORUS, params, learn_start_year, learn_end_year, catalog_start_year)[source]#

Create sub-catalogs from preprocessed earthquake catalog (without spatial filtering).

This function creates three sub-catalogs for EEPAS learning. When no RegionManager is provided, all events are treated as being within both the Neighborhood Region and Testing Region (no spatial filtering applied).

Naming conventions (from main_gji.tex paper):
  • Index i: Historical source events (CatI), from Neighborhood Region N

  • Index j: Target events (CatJ), from Testing Region R

  • NLL = -Σ_j log λ(t_j, m_j, x_j, y_j | CatI) + ∫∫∫∫_R λ(…) dt dm dx dy

Parameters:
  • HORUS – Preprocessed earthquake catalog (numpy array, N×11)

  • params – Model parameters dictionary, must contain ‘mT’ (target magnitude threshold)

  • learn_start_year – Learning start year

  • learn_end_year – Learning end year

  • catalog_start_year – Catalog start year (for time conversion)

Returns:

(CatE, CatI, CatJ)

  • CatE: EEPAS precursor catalog (M≥m₀, all time)

  • CatI: Historical source events (M≥mT, all time, Neighborhood Region) - paper’s index i

  • CatJ: Target events (M≥mT, learning period, Testing Region) - paper’s index j

Return type:

tuple

Note

Return order is (CatE, CatI, CatJ), corresponding to (precursor, i, j) in paper. This differs from versions before v1.1.0 (old order was CatE, CatJ, CatI).

static filter_by_region(catalog, region_manager, region_type='testing', x_col=7, y_col=6)[source]#

Filter earthquake catalog by spatial region.

Purpose:

  • Integrates with RegionManager for spatial filtering

  • Automatically handles both grid and polygon region formats

Parameters:
  • catalog – numpy array (N×M) earthquake catalog HORUS format: […, y(col6), x(col7), depth(col8), mag(col9), …]

  • region_manager – RegionManager instance

  • region_type

    ‘testing’ or ‘neighborhood’

    • ’testing’: filter events within testing region (for target events)

    • ’neighborhood’: filter events within neighborhood region (for source events)

  • x_col – X coordinate column (default 7)

  • y_col – Y coordinate column (default 6)

Returns:

Filtered earthquake catalog

Return type:

numpy.ndarray

static create_catalogs_with_regions(HORUS, params, learn_start_year, learn_end_year, catalog_start_year, region_manager=None)[source]#

Create sub-catalogs from preprocessed earthquake catalog (with optional spatial filtering).

This function supports both spatial filtering modes depending on the region_manager parameter. When region_manager is None, all events are used without spatial filtering. When region_manager is provided, events are filtered by Neighborhood and Testing Regions.

Naming conventions (from main_gji.tex paper):

  • Index i: Historical source events (CatI), from Neighborhood Region N

  • Index j: Target events (CatJ), from Testing Region R

  • NLL = -Σ_j log λ(t_j, m_j, x_j, y_j | CatI) + ∫∫∫∫_R λ(…) dt dm dx dy

Difference from create_catalogs():

  • Supports RegionManager spatial filtering

  • Without region_manager (None): Behaves identically to create_catalogs()

  • With region_manager provided:

    • CatE: All events within Neighborhood Region N (for edge compensation)

    • CatI: M≥mT events within Neighborhood Region N (historical sources, paper’s index i)

    • CatJ: M≥mT events within Testing Region R during learning period (targets, paper’s index j)

Parameters:
  • HORUS – Preprocessed earthquake catalog (numpy array, N×11)

  • params – Model parameters dictionary, must contain ‘mT’ (target magnitude threshold)

  • learn_start_year – Learning start year

  • learn_end_year – Learning end year

  • catalog_start_year – Catalog start year (for time conversion)

  • region_manager

    RegionManager instance (optional)

    • None: No spatial filtering applied

    • RegionManager: Apply spatial filtering using Neighborhood and Testing Regions

Returns:

(CatE, CatI, CatJ) containing:

  • CatE: EEPAS precursor catalog (M≥m₀)

  • CatI: Historical source events (M≥mT, Neighborhood Region) - paper’s index i

  • CatJ: Target events (M≥mT, learning period, Testing Region) - paper’s index j

Return type:

tuple

Note

Return order is (CatE, CatI, CatJ), corresponding to (precursor, i, j) in paper. This differs from versions before v1.1.0 (old order was CatE, CatJ, CatI).

static filter_by_magnitude(catalog, min_magnitude, mag_col=9)[source]#

Filter earthquake catalog by magnitude.

Parameters:
  • catalog – numpy array earthquake catalog

  • min_magnitude – Minimum magnitude threshold

  • mag_col – Magnitude column (default 9)

Returns:

Filtered catalog

Return type:

numpy array

static filter_by_time_range(catalog, start_year, end_year, year_col=0)[source]#

Filter earthquake catalog by time range.

Parameters:
  • catalog – numpy array earthquake catalog

  • start_year – Start year

  • end_year – End year

  • year_col – Year column (default 0)

Returns:

Filtered catalog

Return type:

numpy array
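Both filters amount to boolean masking on a catalog column; a minimal numpy sketch (toy data, with the half-open time interval being a choice of this sketch rather than documented library behavior):

```python
import numpy as np

def filter_rows(catalog, col, lo=-np.inf, hi=np.inf):
    """Keep rows whose value in column `col` lies in [lo, hi) (illustrative)."""
    mask = (catalog[:, col] >= lo) & (catalog[:, col] < hi)
    return catalog[mask]

# Toy 10-column HORUS-like catalog: year in column 0, magnitude in column 9
cat = np.zeros((3, 10))
cat[:, 0] = [1995, 2005, 2015]   # years
cat[:, 9] = [2.5, 4.1, 5.0]      # magnitudes

n_mag = filter_rows(cat, col=9, lo=4.0).shape[0]            # events with M >= 4.0
n_time = filter_rows(cat, col=0, lo=2000, hi=2010).shape[0]  # events in [2000, 2010)
print(n_mag, n_time)  # 2 1
```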

static from_zmap(file_path, delimiter=None, skiprows=0)[source]#

Read ZMAP format and convert to HORUS format. See catalog_processor_extensions.from_zmap()

static from_csep(file_path)[source]#

Read CSEP ASCII format and convert to HORUS format. See catalog_processor_extensions.from_csep()

static from_quakeml(file_path)[source]#

Read QuakeML format and convert to HORUS format. See catalog_processor_extensions.from_quakeml()

static from_dataframe(df, column_mapping=None)[source]#

Convert Pandas DataFrame to HORUS format. See catalog_processor_extensions.from_dataframe()

static from_horus_text(file_path, delimiter=None, skiprows=0)[source]#

Read HORUS text format. See catalog_processor_extensions.from_horus_text()

static from_ingv_horus(file_path, delimiter='\t', skiprows=1, filter_events=True)[source]#

Read INGV HORUS catalog format. See catalog_processor_extensions.from_ingv_horus()

static from_seismostats(catalog)[source]#

Convert SeismoStats Catalog object to HORUS format. See catalog_processor_extensions.from_seismostats()

static from_pycsep(catalog)[source]#

Convert pyCSEP CSEPCatalog object to HORUS format. See catalog_processor_extensions.from_pycsep()

static from_mat(file_path, variable_name='HORUS')[source]#

Read HORUS catalog from MATLAB .mat file. See catalog_processor_extensions.from_mat()

static to_seismostats(horus_catalog, mc=None, delta_m=0.1, b_value=None, a_value=None)[source]#

Convert HORUS format to SeismoStats Catalog object. See catalog_processor_extensions.to_seismostats()

static to_pycsep(horus_catalog, name='EEPAS_catalog', region=None)[source]#

Convert HORUS format to pyCSEP CSEPCatalog object. See catalog_processor_extensions.to_pycsep()

static to_mat(horus_catalog, output_file, variable_name='HORUS', matlab_compatible=True)[source]#

Write HORUS format catalog to MATLAB .mat file. See catalog_processor_extensions.to_mat()

static detect_format(file_path)[source]#

Auto-detect earthquake catalog file format based on extension and content.

Parameters:

file_path – Path to catalog file

Returns:

Detected format (‘zmap’, ‘csep’, ‘quakeml’, ‘horus’, ‘mat’, ‘unknown’)

Return type:

str
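The extension-based part of the detection can be sketched as a suffix lookup; the mapping below is hypothetical, and the actual method additionally inspects file content:

```python
from pathlib import Path

# Hypothetical suffix -> format mapping (the real detector also reads content)
SUFFIX_MAP = {".mat": "mat", ".xml": "quakeml"}

def detect_format_sketch(file_path):
    """Return a format label from the file extension only (illustrative)."""
    return SUFFIX_MAP.get(Path(file_path).suffix.lower(), "unknown")

print(detect_format_sketch("catalog.MAT"))   # mat
print(detect_format_sketch("events.xml"))    # quakeml
print(detect_format_sketch("data.bin"))      # unknown
```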


Region Manager#

Region Manager - Spatial region handling for EEPAS.

This module provides unified management of testing region and neighborhood region, supporting both grid and polygon region types.

Region Types:
  • ‘grid’: Rectangular grid (e.g., CELLE_ter format)

  • ‘polygon’: Polygon boundary (e.g., CPTI15 format)

Regional Functions:
  • Testing Region (R): Used for likelihood calculation and forecast output

  • Neighborhood Region (N): Used for edge effect compensation

Examples

Single grid region (both regions use same grid):

mgr = RegionManager(celle_data, celle_data, 'grid', 'grid')

Testing grid with neighborhood polygon:

mgr = RegionManager(celle_data, cpti15_polygon, 'grid', 'polygon')

class utils.region_manager.RegionManager(testing_region_data, neighborhood_region_data=None, testing_type='grid', neighborhood_type='grid')[source]#

Bases: object

Region Manager - Handles spatial determination for testing and neighborhood regions

testing_region#

Testing region data (grid or polygon)

neighborhood_region#

Neighborhood region data (grid or polygon)

testing_type#

‘grid’ or ‘polygon’

neighborhood_type#

‘grid’ or ‘polygon’

__init__(testing_region_data, neighborhood_region_data=None, testing_type='grid', neighborhood_type='grid')[source]#

Initialize region manager.

Parameters:
  • testing_region_data

    Testing region data

    • If type=’grid’: numpy array (N×10), column format is [x_min, x_max, y_min, y_max, …, cell_id, …]

    • If type=’polygon’: numpy array (M×2 or M×4), vertex coordinates

  • neighborhood_region_data

    Neighborhood region data (optional)

    • If None, then neighborhood = testing (same region for both)

    • Same format as testing_region_data

  • testing_type – ‘grid’ or ‘polygon’

  • neighborhood_type – ‘grid’ or ‘polygon’

is_in_testing_region(x, y)[source]#

Determine if point (x, y) is within testing region.

Parameters:
  • x – X coordinate (can be scalar or array)

  • y – Y coordinate (can be scalar or array)

Returns:

bool or numpy array of bool

is_in_neighborhood_region(x, y)[source]#

Determine if point (x, y) is within neighborhood region.

Parameters:
  • x – X coordinate (can be scalar or array)

  • y – Y coordinate (can be scalar or array)

Returns:

bool or numpy array of bool
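For a grid-type region, both membership checks reduce to testing whether the point falls inside any active cell; a vectorized sketch using only the first four cell columns [x_min, x_max, y_min, y_max] (the function name is illustrative, not the internal implementation):

```python
import numpy as np

def in_any_cell(x, y, cells):
    """True if point (x, y) lies in any [x_min, x_max, y_min, y_max] cell."""
    inside = ((cells[:, 0] <= x) & (x <= cells[:, 1]) &
              (cells[:, 2] <= y) & (y <= cells[:, 3]))
    return bool(inside.any())

# Two adjacent 10 km x 10 km cells
cells = np.array([[0.0, 10.0, 0.0, 10.0],
                  [10.0, 20.0, 0.0, 10.0]])
print(in_any_cell(5.0, 5.0, cells))    # True
print(in_any_cell(25.0, 5.0, cells))   # False
```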

filter_events_for_learning(catalog, x_col=7, y_col=6)[source]#

Filter events for learning: events within neighborhood region.

During PPE/EEPAS learning, use all events within neighborhood as source events to avoid edge effects.

Parameters:
  • catalog – numpy array (N×M) earthquake catalog

  • x_col – X coordinate column (default 7, HORUS format)

  • y_col – Y coordinate column (default 6, HORUS format)

Returns:

Filtered catalog

Return type:

numpy array

filter_events_for_targets(catalog, x_col=7, y_col=6)[source]#

Filter target events: events within testing region.

During likelihood calculation and forecasting, only use/output events within the testing region.

Parameters:
  • catalog – numpy array (N×M) earthquake catalog

  • x_col – X coordinate column (default 7, HORUS format)

  • y_col – Y coordinate column (default 6, HORUS format)

Returns:

Filtered catalog

Return type:

numpy array

get_active_cells()[source]#

Get active cell list (only when testing region is grid type).

Returns:

[x_min, x_max, y_min, y_max] for each cell

Return type:

numpy array (N×4)

Raises:

ValueError – If testing region is not grid type

get_n_active_cells()[source]#

Get number of active cells.

Returns:

Number of cells (returns number of vertices if polygon)

Return type:

int

get_region_summary()[source]#

Get region manager summary information (for debugging).

Returns:

Dictionary containing region types, cell counts, and other metadata

Return type:

dict


Numerical Integration#

Numerical Integration Functions for EEPAS.

This module provides low-level numerical integration functions with Numba acceleration.

Functions:
PPE Spatial Integration:
  • fast_kernel_sum_2d: Cauchy kernel summation (Numba parallel)

  • trapezoidal_2d: 2D trapezoidal integration (Numba parallel)

EEPAS Magnitude Integration:
  • fast_magnitude_integral: Midpoint rectangular rule (Numba JIT)

  • magnitude_integral_accurate: quad_vec wrapper

utils.numerical_integration.fast_kernel_sum_2d(x_grid, y_grid, d, xj, yj, weights)[source]#

Calculate PPE spatial kernel summation on 2D grid (vectorized version).

Seismological Significance:

For each grid point (x,y), calculate the Cauchy kernel summation from all historical earthquakes: h₀(x,y) = Σⱼ [wⱼ/(π(d²+rⱼ²))]

Mathematical Principle:

The Cauchy kernel κ(r) = 1/(π(d²+r²)) is the spatial distribution function in the PPE model. Summing the contributions from all historical earthquakes yields the spatial background rate.

Optimization Strategy:

  1. Numba JIT compilation: boost pure Python loop performance

  2. Parallelization (prange): multi-core computation for grid points

  3. fastmath: allow floating-point optimizations (tiny precision loss for speed gain)

Parameters:
  • x_grid – 2D grid coordinate matrices (meshgrid format) [nx × ny]

  • y_grid – 2D grid coordinate matrices (meshgrid format) [nx × ny]

  • d – Characteristic distance (km)

  • xj – Historical earthquake location arrays [n_events]

  • yj – Historical earthquake location arrays [n_events]

  • weights – Magnitude weight array wⱼ = a·(mⱼ-mT)·t_integral [n_events]

Returns:

Kernel function summation on 2D grid [nx × ny]

Return type:

numpy.ndarray

Usage Scenarios:

  • ppe_optimization.py: PPE Learning spatial integration

  • neg_log_like_aftershock.py: Fit Aftershock spatial integration

  • ppe_make_forecast.py: PPE Forecast grid integration
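The kernel summation can be written as a broadcast over grid points and events; a plain-numpy equivalent of the accelerated routine (without Numba, for illustration only):

```python
import numpy as np

def kernel_sum_2d(x_grid, y_grid, d, xj, yj, weights):
    """h0(x, y) = sum_j w_j / (pi * (d^2 + r_j^2)), evaluated on the whole grid."""
    # Broadcast (nx, ny, 1) grid coordinates against (n_events,) event arrays
    r2 = (x_grid[..., None] - xj) ** 2 + (y_grid[..., None] - yj) ** 2
    return np.sum(weights / (np.pi * (d * d + r2)), axis=-1)

# Single unit-weight event at the origin on a 5x5 grid
x, y = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5), indexing="ij")
h = kernel_sum_2d(x, y, d=1.0, xj=np.array([0.0]), yj=np.array([0.0]),
                  weights=np.array([1.0]))
print(h[2, 2])  # at r = 0 the kernel equals 1 / (pi * d^2)
```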

utils.numerical_integration.trapezoidal_2d(f_values, dx, dy)[source]#

2D trapezoidal integration method.

Mathematical Principle:

Divide the integration region into a regular grid and apply the trapezoidal rule on each cell: ∫∫ f(x,y) dx dy ≈ Σᵢⱼ [(f₀₀+f₁₀+f₀₁+f₁₁)/4] · Δx·Δy, where f₀₀, f₁₀, f₀₁, f₁₁ are the function values at the four corners of the cell.

Error Analysis:

For smooth functions the error is O(Δx² + Δy²). The PPE Cauchy kernel becomes relatively smooth once contributions from many earthquakes are summed, so the trapezoidal method achieves good accuracy.

Optimization Strategy:

  1. Numba parallelization: multi-core computation for grid cell summation

  2. Vectorization-friendly: can be used together with fast_kernel_sum_2d

Parameters:
  • f_values – 2D grid of function values (shape: nx × ny)

  • dx – Grid spacing (unit: km)

  • dy – Grid spacing (unit: km)

Returns:

Integral value (scalar)

Return type:

float

Accuracy Test Results:

Trapezoidal 50×50 vs Gauss-Legendre 10×10: relative error 0.0000%, speed improvement 2.9x
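The rule can be sketched compactly in numpy; the Numba-parallel implementation is equivalent in result. The trapezoidal rule is exact for linear integrands, which the demo below exploits:

```python
import numpy as np

def trapezoidal_2d_sketch(f, dx, dy):
    """Average the four corner values of each cell, then sum over cells."""
    cell_avg = (f[:-1, :-1] + f[1:, :-1] + f[:-1, 1:] + f[1:, 1:]) / 4.0
    return float(cell_avg.sum() * dx * dy)

# f(x, y) = x + y on the unit square; exact integral = 0.5 + 0.5 = 1.0
x, y = np.meshgrid(np.linspace(0, 1, 11), np.linspace(0, 1, 11), indexing="ij")
val = trapezoidal_2d_sketch(x + y, dx=0.1, dy=0.1)
print(val)  # 1.0 up to floating-point rounding
```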

utils.numerical_integration.ppe_spatial_integral_accurate(x_bounds, y_bounds, event_data, d, s, s_coeff, a_coeff=1.0)[source]#

PPE spatial integration: accurate mode (dblquad adaptive integration).

Seismological Significance:

Calculate spatial integral of PPE background rate: ∫∫ h(x,y) dx dy where h(x,y) = a_coeff · [Σⱼ wⱼ/(π(d²+rⱼ²))] + s·s_coeff

Mathematical Principle:

Uses scipy.dblquad for adaptive double integration. Higher accuracy (relative error <0.001%) but slower (about 10-20x).

Applicable Scenarios:

  • PPE Learning accurate integration mode

  • Fit Aftershock accurate integration mode

  • PPE Forecast accurate integration mode

  • Paper validation or scenarios requiring extremely high precision

Parameters:
  • x_bounds – (x_min, x_max) integration range

  • y_bounds – (y_min, y_max) integration range

  • event_data – Event list, each event is (weight, x, y) tuple where weight already includes all coefficients (e.g., (m-mT)·t_integral)

  • d – PPE characteristic distance (km)

  • s – PPE background rate

  • s_coeff – Coefficient for s (e.g., mag_integral * t_sum)

  • a_coeff

    Coefficient for kernel summation term (default 1.0)

    • PPE Learning/Fit Aftershock: a_coeff = a (PPE parameter)

    • PPE Forecast: a_coeff = 1.0 (weight already includes a)

Returns:

Integral value (scalar)

Return type:

float

Note

This function is suitable for calling separately for each grid cell in a loop. Not suitable for overall vectorization (dblquad doesn’t support it).

utils.numerical_integration.fast_magnitude_integral(m1, m2, mee, am, bm, Sm, m0, B, magnitude_samples=20)[source]#

EEPAS magnitude integration: fast mode (midpoint rectangular rule).

Calculate magnitude integral: ∫[m1,m2] fGme(m, me) dm, for all me values

Where:

  • gₑ(m|mₑ): Normal distribution N(am + bm·mₑ, Sm²)

  • Δ(m): Normalization factor (calculated using erf function)

Fast Mode Characteristics:

  1. Midpoint rectangular rule: divide integral interval equally, approximate using midpoint values

  2. Numba acceleration: JIT compilation

  3. Fixed sampling: predictable computational cost

Error Analysis:

20 sample points: relative error ~3%, suitable for Forecast (speed priority)

Parameters:
  • m1 – Integration range (scalar)

  • m2 – Integration range (scalar)

  • mee – Precursor magnitude array (must be array)

  • am – Magnitude distribution parameters

  • bm – Magnitude distribution parameters

  • Sm – Magnitude distribution parameters

  • m0 – Completeness magnitude

  • B – GR b-value

  • magnitude_samples – Number of sample points (default 20)

Returns:

Integration result array of length len(mee)

Return type:

numpy.ndarray
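The midpoint rule underlying the fast mode can be shown on a generic integrand; the actual fGme integrand involves the normal density and an erf normalization, so a simple polynomial is used here purely for illustration:

```python
import numpy as np

def midpoint_integral(f, a, b, n=20):
    """Midpoint rectangular rule: equally spaced midpoints, fixed cost O(n)."""
    h = (b - a) / n
    mids = a + h * (np.arange(n) + 0.5)   # midpoints of the n subintervals
    return float(h * f(mids).sum())

# Integrate m^2 on [0, 1]; exact value is 1/3, midpoint error is O(h^2)
val = midpoint_integral(lambda m: m ** 2, 0.0, 1.0, n=20)
print(val)
```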

utils.numerical_integration.magnitude_integral_accurate(m1, m2, mee, am, bm, Sm, m0, B)[source]#

EEPAS magnitude integration: accurate mode (quad_vec vectorized adaptive integration).

Uses scipy.quad_vec for adaptive integration, supports vectorization.

Accurate Mode Characteristics:

  1. Adaptive sampling: adjust integration points based on function characteristics

  2. Vectorization: can calculate multiple integrals simultaneously

  3. Extremely high precision: <0.001%

Applicable Scenarios:

  • EEPAS Learning (parameter learning requires high precision)

  • Paper validation

Parameters:
  • m1 – Integration range (scalar)

  • m2 – Integration range (scalar)

  • mee – Precursor magnitude array

  • am – Magnitude distribution parameters

  • bm – Magnitude distribution parameters

  • Sm – Magnitude distribution parameters

  • m0 – Completeness magnitude

  • B – GR b-value

Returns:

Integral value (can be scalar or array, depending on input)

Return type:

float or numpy.ndarray


Path Management#

Path Management Tool - Unified File Path Generation

Generates standardized file paths based on the configuration file and time parameters.

utils.get_paths.get_paths(cfg, learn_start=None, learn_end=None, fore_start=None, fore_end=None, config_file=None)[source]#

Generate standardized file paths.

Parameters:
  • cfg – Configuration dictionary

  • learn_start – Learning start year

  • learn_end – Learning end year

  • fore_start – Forecast start year

  • fore_end – Forecast end year

  • config_file – Configuration file path (for resolving relative paths)

Returns:

Dictionary containing various file paths (dataPath, resultsPath, eepasParam, ppeParam, etc.)

Return type:

dict


Optimization Helpers#

fminsearchcon - Constrained Nelder-Mead optimization, ported from MATLAB’s fminsearchcon.m (John D’Errico).

Boundary constraints are handled via variable transformation; linear and nonlinear inequality constraints are handled via penalty functions.

utils.fminsearchcon.fminsearchcon(fun, x0, lb=None, ub=None, A=None, b=None, nonlcon=None, options=None, args=())[source]#

Constrained Nelder-Mead optimization.

Parameters:
  • fun – Objective function f(x, *args)

  • x0 – Initial guess (array-like)

  • lb – Lower bounds, can include -inf (optional)

  • ub – Upper bounds, can include +inf (optional)

  • A – Linear inequality constraint matrix A @ x <= b (optional)

  • b – Linear inequality constraint RHS vector (optional)

  • nonlcon – Nonlinear inequality constraint function, return <= 0 (optional)

  • options

    Optimization options dict (optional), containing:

    • maxiter: Maximum iterations (default: 200*n)

    • maxfun: Maximum function evaluations (default: 200*n)

    • xtol: Tolerance for x (default: 1e-4)

    • ftol: Tolerance for function value (default: 1e-4)

    • disp: Display progress (default: False)

  • args – Additional arguments for objective and constraint functions (optional)

Returns:

(x, fval, exitflag, output) where:

  • x: Optimal solution (ndarray)

  • fval: Optimal function value (float)

  • exitflag: Exit flag (0=success, -1=failure)

  • output: Optimization information (dict)

Return type:

tuple

utils.fminsearchcon.transform_forward(x, params)[source]#

Transform constrained variables to unconstrained variables

utils.fminsearchcon.transform_backward(xu, params)[source]#

Transform unconstrained variables back to constrained variables

utils.fminsearchcon.intrafun(xu, params)[source]#

Internal objective function: transform variables and check constraints
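For doubly bounded variables, D’Errico’s approach maps the box [lb, ub] onto the whole real line with a sine transform; a sketch of the round trip, simplified to the fully bounded case (the general library code also handles one-sided and unbounded variables):

```python
import numpy as np

def forward(x, lb, ub):
    """Bounded x in [lb, ub] -> unconstrained xu (fully bounded case only)."""
    return np.arcsin(2.0 * (x - lb) / (ub - lb) - 1.0)

def backward(xu, lb, ub):
    """Unconstrained xu -> x in [lb, ub]; sin keeps the result inside the box."""
    return lb + (ub - lb) * (np.sin(xu) + 1.0) / 2.0

x = np.array([0.25, 0.5, 0.9])
lb = np.zeros(3)
ub = np.ones(3)
x_back = backward(forward(x, lb, ub), lb, ub)
print(x_back)  # round trip recovers x, so the optimizer can work unconstrained
```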


Coordinate Transformation#

Coordinate Transformation Utility for Earthquake Catalogs

Converts HORUS earthquake catalog and CELLE grid coordinates between different coordinate reference systems (CRS), e.g. from WGS84 to projected systems such as RDN2008 or TWD97. Supports multiple regional projections.

Supported CRS:
  • WGS84 (EPSG:4326): Global lat/lon coordinate system

  • RDN2008 (EPSG:7794): Italy zone projected system (default)

  • TWD97 (EPSG:3826): Taiwan TM2 121°E projected system

  • Custom: Any EPSG code supported by pyproj

Transformed coordinates are in kilometers and replace original lat/lon columns in the output MATLAB .mat files.

Usage:

# Default: WGS84 to RDN2008 (Italy)
python coordinate_transform.py --horus-in HORUS_Italy.mat --celle-in CELLE_Italy.mat --horus-out HORUS_Italy_RDN2008.mat --celle-out CELLE_Italy_RDN2008.mat

# Taiwan: WGS84 to TWD97
python coordinate_transform.py --horus-in HORUS_Taiwan.mat --celle-in CELLE_Taiwan.mat --horus-out HORUS_Taiwan_TWD97.mat --celle-out CELLE_Taiwan_TWD97.mat --target-crs epsg:3826 --region Taiwan

# Custom EPSG code
python coordinate_transform.py --horus-in input.mat --celle-in grids.mat --horus-out output.mat --celle-out grids_out.mat --target-crs epsg:32633 --region "Custom Region"
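The degree-to-kilometre scale involved can be illustrated with a local equirectangular approximation; this is for orientation only, since the module itself uses pyproj projections rather than this formula:

```python
import math

def deg_to_km_local(dlon_deg, dlat_deg, ref_lat_deg):
    """Approximate east/north offsets in km near a reference latitude."""
    km_per_deg = 111.32  # approximate length of one degree of latitude, km
    dx = dlon_deg * km_per_deg * math.cos(math.radians(ref_lat_deg))  # easting
    dy = dlat_deg * km_per_deg                                        # northing
    return dx, dy

# One degree of longitude shrinks with latitude; one degree of latitude does not
dx, dy = deg_to_km_local(1.0, 1.0, ref_lat_deg=42.0)  # roughly central Italy
print(round(dx, 1), round(dy, 1))
```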

utils.coordinate_transform.get_crs_info(crs_key)[source]#

Get coordinate system information from preset or custom EPSG code.

Parameters:

crs_key (str) – CRS preset key (‘rdn2008’, ‘twd97’) or EPSG code (‘epsg:3826’)

Returns:

CRS information including EPSG code, name, and bounds

Return type:

dict

utils.coordinate_transform.validate_coordinates(lon, lat, bounds, region_name='region')[source]#

Validate coordinates are within expected regional bounds.

Parameters:
  • lon (ndarray) – Longitude array (degrees East)

  • lat (ndarray) – Latitude array (degrees North)

  • bounds (dict) – Dictionary with ‘lat’ and ‘lon’ tuples (min, max)

  • region_name (str) – Name of region for warning messages

Returns:

(all_valid, invalid_count)

Return type:

Tuple[bool, int]

utils.coordinate_transform.convert_coordinates(array, lon_idx, lat_idx, transformer, bounds, region_name='region', meters_per_km=1000.0)[source]#

Convert lat/lon coordinates in array to projected coordinates.

Parameters:
  • array (ndarray) – Array containing lat/lon data

  • lon_idx (int) – Longitude column index

  • lat_idx (int) – Latitude column index

  • transformer (Transformer) – pyproj Transformer object

  • bounds (dict) – Coordinate validation bounds

  • region_name (str) – Region name for validation messages

  • meters_per_km (float) – Meters to kilometers conversion factor (default: 1000.0)

Returns:

Transformed array with coordinates in km

Return type:

np.ndarray

Note

Validates coordinates are within regional bounds and prints warnings for out-of-range coordinates.

utils.coordinate_transform.convert_celle_coordinates(celle, transformer, bounds, region_name='region', meters_per_km=1000.0)[source]#

Convert CELLE grid boundary coordinates.

Parameters:
  • celle (ndarray) – CELLE array (each row: lon_min, lon_max, lat_min, lat_max)

  • transformer (Transformer) – pyproj Transformer object

  • bounds (dict) – Coordinate validation bounds

  • region_name (str) – Region name for validation messages

  • meters_per_km (float) – Meters to kilometers conversion factor (default: 1000.0)

Returns:

Transformed CELLE array (x_min, x_max, y_min, y_max in km)

Return type:

np.ndarray

Note

CELLE format columns:
  • Column 0: Minimum longitude (°E)

  • Column 1: Maximum longitude (°E)

  • Column 2: Minimum latitude (°N)

  • Column 3: Maximum latitude (°N)

utils.coordinate_transform.main(horus_infile, celle_infile, horus_outfile, celle_outfile, target_crs='rdn2008', region_name=None)[source]#

Main function to convert HORUS and CELLE coordinates between CRS.

Parameters:
  • horus_infile (Path | None) – Input HORUS catalog .mat file path (optional)

  • celle_infile (Path | None) – Input CELLE grid .mat file path (optional)

  • horus_outfile (Path | None) – Output HORUS .mat file path (optional)

  • celle_outfile (Path | None) – Output CELLE .mat file path (optional)

  • target_crs (str) – Target coordinate system (‘rdn2008’, ‘twd97’, or ‘epsg:XXXX’)

  • region_name (str | None) – Custom region name for messages (optional)

Returns:

Writes transformed data to output files and prints status

Return type:

None

Note

HORUS format columns:
  • Column 7 (index 6): Latitude (°N)

  • Column 8 (index 7): Longitude (°E)

After transformation, these columns contain:
  • Column 7 (index 6): Northing Y (km)

  • Column 8 (index 7): Easting X (km)


See Also#