Part 11 — EBP (Elastic Batch Platform)¶
11.1 Overview¶
EBP (Elastic Batch Platform) is a Job Entry Subsystem (JES) that enables execution of batch workloads compiled from PL/I programs. It provides JCL (Job Control Language) parsing, job scheduling, resource management, and execution orchestration for modernized mainframe batch applications.
Purpose¶
EBP bridges the gap between mainframe batch processing and modern Java environments:
- JCL Compatibility — Parse and execute JCL (Job Control Language) jobs
- Job Scheduling — Queue and schedule multi-step batch jobs
- Resource Management — Manage datasets, GDGs, and file allocations
- Monitoring — Web UI and REST API for job tracking
- Integration — Works with PLI-compiled batch programs
Architecture¶
┌──────────────────────────────────────────────────────────────┐
│ EBP Architecture │
└──────────────────────────────────────────────────────────────┘
JCL Job Definition (.jcl)
│
▼
┌─────────────────┐
│ JCL Parser │ ANTLR4-based parser
│ (ANTLR4) │ Validates syntax, builds job tree
└────────┬────────┘
▼
Job Definition (internal model)
│
▼
┌─────────────────┐
│ Job Scheduler │ Queue management
│ │ Priority scheduling
│ │ Dependency resolution
└────────┬────────┘
▼
┌─────────────────┐
│ Resource │ Dataset allocation (DD statements)
│ Manager │ GDG management
│ │ File mapping
└────────┬────────┘
▼
┌─────────────────┐
│ Job Executor │ Executes Java programs (compiled from PL/I)
│ │ Captures output (SYSOUT, SYSPRINT)
│ │ Return code handling
└────────┬────────┘
▼
Job Results + Output Datasets
│
▼
┌─────────────────┐
│ Web UI / API │ Job submission interface
│ │ Monitoring dashboard
│ │ Output viewing (SYSOUT)
│ │ REST API
└─────────────────┘
Deployment Options¶
EBP can be deployed in two modes:
- WAR Deployment (Recommended)
  - Deploy to a Jakarta EE server (Payara, WildFly)
  - Web UI at http://localhost:8080/ebp
  - REST API enabled
  - Suitable for multi-user environments
- Standalone JAR (CLI mode)
  - Command-line job submission
  - No web UI
  - Suitable for CI/CD pipelines
11.2 JCL Syntax Support¶
EBP parses JCL (Job Control Language) using an ANTLR4-based grammar that supports mainframe JCL constructs.
Supported JCL Statements¶
JOB Statement¶
Defines the job and execution parameters.
Parameters:
- CLASS — Job class (priority)
- MSGCLASS — Output message class
- NOTIFY — User to notify on completion
- REGION — Memory allocation
- TIME — Maximum execution time
EXEC Statement¶
Executes a program or procedure.
Parameters:
- PGM=programName — Java class to execute (fully qualified name)
- PARM='parameters' — Command-line parameters passed to the program
- COND — Conditional execution based on previous step return codes
- REGION — Step-level memory allocation
- TIME — Step-level time limit
Example (Compiled PL/I program):
//STEP01   EXEC PGM=com.customer.batch.CustomerUpdate,
//         PARM='RUNDATE=20260301'
DD Statement¶
Defines datasets (Data Definition).
//INPUTDD DD DSN=CUST.INPUT.FILE,DISP=SHR
//OUTPUTDD DD DSN=CUST.OUTPUT.FILE,DISP=(NEW,CATLG,DELETE),
// SPACE=(CYL,(5,1)),UNIT=SYSDA
//SYSOUT DD SYSOUT=*
Parameters:
- DSN=datasetName — Dataset name (maps to a file path)
- DISP=(status,normal,abnormal) — Disposition:
  - Status: NEW, OLD, SHR, MOD
  - Normal termination: KEEP, CATLG, DELETE, PASS
  - Abnormal termination: KEEP, CATLG, DELETE
- SPACE=(unit,(primary,secondary)) — Space allocation
- UNIT=device — Device type
- SYSOUT=class — Output dataset (print, display)
- DUMMY — Dummy dataset (no I/O)
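As an illustration of the three-part disposition value, here is a minimal sketch of a DISP parser. This is a hypothetical helper, not EBP's actual parser; JCL's defaulting rules and validation are omitted.

```java
import java.util.Arrays;
import java.util.List;

// Minimal sketch: split a JCL DISP value such as "(NEW,CATLG,DELETE)" into
// [status, normal-termination, abnormal-termination]. Hypothetical helper,
// not part of the EBP API.
public class DispParser {
    static List<String> parseDisp(String disp) {
        String body = disp.startsWith("(") && disp.endsWith(")")
                ? disp.substring(1, disp.length() - 1)
                : disp;
        return Arrays.asList(body.split(","));
    }

    public static void main(String[] args) {
        System.out.println(parseDisp("(NEW,CATLG,DELETE)")); // [NEW, CATLG, DELETE]
        System.out.println(parseDisp("SHR"));                // [SHR]
    }
}
```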
Special DD Names:
- STEPLIB — Program library
- SYSOUT — Standard output
- SYSPRINT — Print output
- SYSIN — Standard input
- SYSUDUMP — Dump dataset
PROC/PEND Statements¶
Define reusable procedures (not yet fully supported in EBP v1.0).
11.3 Job Submission¶
Web UI Submission¶
- Access the EBP Web UI at http://localhost:8080/ebp
- Navigate to "Submit Job"
- Upload a JCL file or paste JCL text
- Click "Submit"
- View the job ID and status
REST API Submission¶
Endpoint: POST /ebp/api/jobs
Request (JCL as text), for example using curl (the file name is illustrative):
curl -X POST http://localhost:8080/ebp/api/jobs \
     -H "Content-Type: text/plain" \
     --data-binary @custupd.jcl
Response:
{
"jobId": "JOB00123",
"jobName": "CUSTUPD",
"status": "SUBMITTED",
"submitTime": "2026-03-02T10:30:00Z"
}
Endpoint: GET /ebp/api/jobs/{jobId}
Response:
{
"jobId": "JOB00123",
"jobName": "CUSTUPD",
"status": "COMPLETED",
"returnCode": 0,
"steps": [
{
"stepName": "STEP01",
"program": "com.customer.batch.CustomerUpdate",
"status": "COMPLETED",
"returnCode": 0,
"startTime": "2026-03-02T10:30:05Z",
"endTime": "2026-03-02T10:32:10Z"
}
]
}
CLI Submission (Standalone JAR)¶
Output:
11.4 Job Scheduling and Execution¶
Job Lifecycle¶
States:
- SUBMITTED — Job accepted, awaiting scheduling
- QUEUED — Job queued, waiting for resources
- RUNNING — Job executing
- COMPLETED — Job finished successfully (RC=0)
- FAILED — Job finished with error (RC≠0)
- ABENDED — Job abnormally terminated
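The states above can be modeled as a simple enum with a terminal-state check. This is an illustrative sketch only; EBP's internal state model is not documented here.

```java
// The job lifecycle states listed above, as an enum with a helper that
// distinguishes terminal from in-flight states. Illustrative sketch only,
// not EBP's internal representation.
public enum JobState {
    SUBMITTED, QUEUED, RUNNING, COMPLETED, FAILED, ABENDED;

    public boolean isTerminal() {
        return this == COMPLETED || this == FAILED || this == ABENDED;
    }

    public static void main(String[] args) {
        System.out.println(RUNNING.isTerminal());   // false
        System.out.println(COMPLETED.isTerminal()); // true
    }
}
```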
Job Queue Management¶
Jobs are queued based on:
- Job class (priority)
- Resource availability (memory, datasets)
- Dependency resolution (future: job dependencies)
Step Execution¶
Each EXEC statement in JCL becomes a job step:
- Resource Allocation — Allocate DD datasets
- Program Invocation — Execute Java main() method
- Parameter Passing — Pass PARM string as args[]
- Output Capture — Redirect stdout/stderr to SYSOUT
- Return Code Handling — Check program exit code
- Conditional Execution — Evaluate COND for next step
Example:
//STEP01 EXEC PGM=com.myapp.Step1
//STEP02 EXEC PGM=com.myapp.Step2,COND=(0,NE,STEP01)
//STEP03 EXEC PGM=com.myapp.Step3
With COND=(0,NE,STEP01), STEP02 is bypassed when the test "0 NE return code of STEP01" is true; that is, STEP02 runs only if STEP01 ends with return code 0. STEP03 has no COND parameter and runs unconditionally.
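The bypass semantics of COND=(code,operator,stepName) can be sketched as a small evaluator: the step is skipped when the test "code operator RC(stepName)" is true. This is a hypothetical helper for illustration, not EBP's actual COND handling.

```java
// Sketch of JCL COND bypass semantics: COND=(code,op,stepName) bypasses
// the current step when "code <op> RC(stepName)" evaluates to TRUE.
// Hypothetical helper, not part of the EBP API.
public class CondEval {
    static boolean bypassed(int condCode, String op, int priorRc) {
        switch (op) {
            case "EQ": return condCode == priorRc;
            case "NE": return condCode != priorRc;
            case "GT": return condCode >  priorRc;
            case "GE": return condCode >= priorRc;
            case "LT": return condCode <  priorRc;
            case "LE": return condCode <= priorRc;
            default: throw new IllegalArgumentException("Unknown operator: " + op);
        }
    }

    public static void main(String[] args) {
        // COND=(0,NE,STEP01): bypass unless STEP01 ended with RC=0
        System.out.println(bypassed(0, "NE", 0)); // false -> step runs
        System.out.println(bypassed(0, "NE", 8)); // true  -> step bypassed
    }
}
```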
Parallel Execution¶
EBP supports concurrent job execution:
- Multiple jobs can run simultaneously
- Resource locking prevents dataset conflicts
- Configurable job concurrency limit
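One way to picture the dataset locking is a map from dataset name to lock, so that two steps touching the same dataset serialize while unrelated jobs proceed in parallel. A minimal sketch under that assumption (hypothetical; EBP's resource manager implementation is not documented here):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.locks.ReentrantLock;

// Sketch of per-dataset locking: each DSN maps to exactly one lock.
// Hypothetical helper, not EBP's actual resource manager.
public class DatasetLocks {
    private final ConcurrentHashMap<String, ReentrantLock> locks = new ConcurrentHashMap<>();

    public ReentrantLock lockFor(String dsn) {
        // computeIfAbsent guarantees one shared lock instance per dataset name
        return locks.computeIfAbsent(dsn, k -> new ReentrantLock());
    }

    public static void main(String[] args) {
        DatasetLocks mgr = new DatasetLocks();
        ReentrantLock a = mgr.lockFor("CUST.INPUT.FILE");
        ReentrantLock b = mgr.lockFor("CUST.INPUT.FILE");
        System.out.println(a == b); // true: same dataset, same lock
    }
}
```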
11.5 Dataset Management¶
Dataset Name Mapping¶
JCL dataset names (DSN) map to file system paths:
| JCL DSN | File Path | Description |
|---|---|---|
| CUST.INPUT.FILE | $EBP_DATA/CUST/INPUT.FILE | Regular dataset |
| CUST.OUTPUT(+1) | $EBP_DATA/CUST/OUTPUT.G0001V00 | GDG dataset (next generation) |
| CUST.OUTPUT(0) | $EBP_DATA/CUST/OUTPUT.G0000V00 | GDG dataset (current generation) |
| &&TEMP | $EBP_TEMP/JOB00123/TEMP | Temporary dataset |
Configuration:
# ebp.properties
ebp.data.root=/var/ebp/data
ebp.temp.root=/var/ebp/temp
ebp.catalog.file=/var/ebp/catalog.db
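The mapping rule in the table above (the first qualifier becomes a directory under the configured data root, and the remaining qualifiers form the file name) can be sketched as follows. This is a hypothetical helper, not EBP's actual resolver; GDG and temporary (&&) names need extra handling that is not shown.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Sketch of the DSN-to-path rule from the table above: first qualifier
// becomes a directory under the data root; the rest is the file name.
// Hypothetical helper only.
public class DsnMapper {
    static Path mapDsn(String dataRoot, String dsn) {
        int dot = dsn.indexOf('.');
        if (dot < 0) {
            return Paths.get(dataRoot, dsn);
        }
        return Paths.get(dataRoot, dsn.substring(0, dot), dsn.substring(dot + 1));
    }

    public static void main(String[] args) {
        System.out.println(mapDsn("/var/ebp/data", "CUST.INPUT.FILE"));
        // /var/ebp/data/CUST/INPUT.FILE
    }
}
```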
GDG (Generation Data Group) Support¶
GDGs are versioned datasets commonly used in mainframe batch processing.
Creating a GDG:
Using GDG Generations:
//OUTPUT DD DSN=CUST.BACKUP(+1),DISP=(NEW,CATLG) // New generation
//INPUT DD DSN=CUST.BACKUP(0),DISP=SHR // Current generation
//PREV DD DSN=CUST.BACKUP(-1),DISP=SHR // Previous generation
GDG Version Naming:
- CUST.BACKUP.G0001V00 — Generation 1, version 0
- CUST.BACKUP.G0002V00 — Generation 2, version 0
- The oldest generation is rolled off when the generation limit is exceeded
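The GnnnnVnn naming convention above (four-digit generation, two-digit version) can be sketched as a small formatter. Hypothetical helper, not EBP's catalog code.

```java
// Sketch of GDG generation naming: base name plus ".GnnnnVnn" suffix,
// with a four-digit generation and two-digit version. Illustrative only.
public class GdgNames {
    static String generationName(String base, int generation, int version) {
        return String.format("%s.G%04dV%02d", base, generation, version);
    }

    public static void main(String[] args) {
        System.out.println(generationName("CUST.BACKUP", 1, 0)); // CUST.BACKUP.G0001V00
        System.out.println(generationName("CUST.BACKUP", 2, 0)); // CUST.BACKUP.G0002V00
    }
}
```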
11.6 Monitoring and Output¶
Web UI Monitoring¶
Job List View:
- Job ID, name, status, submit time
- Filter by status (ALL, RUNNING, COMPLETED, FAILED)
- Search by job name

Job Detail View:
- Step-by-step execution progress
- Return codes
- Elapsed time
- Resource usage

Output Viewing:
- SYSOUT (standard output)
- SYSPRINT (print output)
- SYSUDUMP (abend dumps)
- Download output as a text file
REST API Monitoring¶
List Jobs: GET /ebp/api/jobs lists all jobs with their current status.
View Job Output: a job's captured output (SYSOUT, SYSPRINT) is also available through the REST API and can be downloaded from the job detail view.
11.7 Configuration¶
EBP Configuration File (ebp.properties)¶
# Data directories
ebp.data.root=/var/ebp/data
ebp.temp.root=/var/ebp/temp
ebp.catalog.file=/var/ebp/catalog.db
# Job execution
ebp.job.max.concurrent=5
ebp.job.default.class=A
ebp.job.output.retention.days=7
# JCL parsing
ebp.jcl.column.start=1
ebp.jcl.column.end=72
# Database (for job catalog)
ebp.db.url=jdbc:h2:file:/var/ebp/ebpdb
ebp.db.username=sa
ebp.db.password=
# Web UI
ebp.web.port=8080
ebp.web.context=/ebp
Logging Configuration¶
EBP uses SLF4J for logging. Configure in logback.xml:
<configuration>
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
<file>/var/log/ebp/ebp.log</file>
<encoder>
<pattern>%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<logger name="com.heirloom.ebp" level="INFO"/>
<root level="WARN">
<appender-ref ref="FILE"/>
</root>
</configuration>
11.8 Integration with Batch Applications¶
Project Structure¶
Enterprise batch applications use a multi-wave project structure with EBP:
batch-processing-system/
├── wave1/
│ ├── src/main/pli/ ← PL/I source programs
│ ├── src/main/jcl/ ← JCL job definitions
│ └── build.gradle ← Compile PL/I → Java
├── wave2/
├── ...
└── wave10/
Build Process¶
Gradle Configuration:
plugins {
id 'com.heirloom.pli' version '26.2.27.RC1'
}
pli {
source = file('src/main/pli')
target = file('build/generated/java')
strategy = 'static' // Batch strategy
sql = true
}
dependencies {
implementation 'com.heirloom:pli_runtime2:26.2.27.RC1'
}
Execution Flow¶
- Development:
  - Write PL/I batch programs (P4, P5 types)
  - Write JCL job definitions
- Build:
  - ./gradlew build
- Deployment:
  - Deploy the compiled JAR to the EBP classpath
  - Copy JCL files to the job library
- Execution:
  - Submit JCL via the EBP web UI or API
  - EBP parses the JCL
  - EBP executes the Java programs (compiled from PL/I)
  - EBP captures output and return codes
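As the flow above indicates, a compiled batch program is invoked as an ordinary Java class: the PARM string arrives through main()'s argument vector, stdout is captured as SYSOUT, and the exit code becomes the step return code. A minimal sketch of such an entry point (class and parameter names are illustrative, not from a real program):

```java
// Minimal sketch of a batch entry point as EBP invokes it. Illustrative
// names only; the PARM-to-args splitting convention is an assumption.
public class CustomerUpdate {
    static String describeParm(String[] args) {
        return args.length == 0 ? "(no PARM)" : String.join(" ", args);
    }

    public static void main(String[] args) {
        System.out.println("PARM: " + describeParm(args)); // captured as SYSOUT
        // Returning normally yields exit code 0, i.e. step RC=0 (COMPLETED);
        // System.exit(8) would surface as RC=8 (FAILED).
    }
}
```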
11.9 Example JCL Job¶
Complete JCL Example¶
//CUSTUPD JOB (ACCT001),'Customer Update Job',
// CLASS=A,MSGCLASS=X,NOTIFY=&SYSUID
//*
//* Step 1: Extract customer data
//*
//EXTRACT EXEC PGM=com.customer.batch.ExtractCustomers,
// PARM='RUNDATE=20260301'
//STEPLIB DD DSN=CUSTOMER.LOAD.LIB,DISP=SHR
//INPUT DD DSN=CUSTOMER.MASTER.FILE,DISP=SHR
//OUTPUT DD DSN=CUSTOMER.EXTRACT.FILE,
// DISP=(NEW,CATLG,DELETE),
// SPACE=(CYL,(10,5)),UNIT=SYSDA
//SYSOUT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//*
//* Step 2: Transform extracted data (runs only if EXTRACT RC=0)
//*
//TRANSFRM EXEC PGM=com.customer.batch.TransformData,
// COND=(0,NE,EXTRACT)
//INPUT DD DSN=CUSTOMER.EXTRACT.FILE,DISP=SHR
//OUTPUT DD DSN=CUSTOMER.TRANSFORM.FILE,
// DISP=(NEW,CATLG,DELETE),
// SPACE=(CYL,(10,5))
//REFDATA DD DSN=CUSTOMER.REFERENCE.DATA,DISP=SHR
//SYSOUT DD SYSOUT=*
//*
//* Step 3: Load transformed data into database
//*
//LOAD EXEC PGM=com.customer.batch.LoadCustomers,
// COND=(0,NE,TRANSFRM)
//INPUT DD DSN=CUSTOMER.TRANSFORM.FILE,DISP=SHR
//SYSOUT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//*
//* Step 4: Backup to GDG
//*
//BACKUP EXEC PGM=com.customer.batch.BackupData
//INPUT DD DSN=CUSTOMER.TRANSFORM.FILE,DISP=SHR
//OUTPUT DD DSN=CUSTOMER.BACKUP(+1),
// DISP=(NEW,CATLG,DELETE)
//SYSOUT DD SYSOUT=*
Execution Result¶
Job: CUSTUPD (JOB00123)
Status: COMPLETED
Return Code: 0
Elapsed Time: 00:05:23
Steps:
EXTRACT - COMPLETED (RC=0) - 00:02:10
TRANSFRM - COMPLETED (RC=0) - 00:01:45
LOAD - COMPLETED (RC=0) - 00:01:20
BACKUP - COMPLETED (RC=0) - 00:00:08
Datasets Created:
CUSTOMER.EXTRACT.FILE
CUSTOMER.TRANSFORM.FILE
CUSTOMER.BACKUP.G0005V00 (GDG)
11.10 Troubleshooting¶
Common Issues¶
"JCL syntax error at line X"¶
Cause: Invalid JCL syntax.
Solution:
- Verify the JCL statement format (columns 1-72)
- Check for missing commas or continuation lines
- Use a JCL syntax validator
"Program not found: com.myapp.MyProgram"¶
Cause: Java class not in classpath.
Solution:
- Verify the JAR is deployed to the EBP library directory
- Check the STEPLIB DD statement
- Verify the fully qualified class name
"Dataset not found: MY.INPUT.FILE"¶
Cause: Dataset doesn't exist or wrong disposition.
Solution:
- Create the dataset first, or use DISP=(NEW,...)
- Check the dataset name mapping in ebp.properties
- Verify file permissions
"Job stuck in QUEUED state"¶
Cause: Resource contention or configuration issue.
Solution:
- Check the ebp.job.max.concurrent setting
- Verify no dataset locks are held
- Review the job queue in the web UI
"GDG generation limit exceeded"¶
Cause: Too many GDG generations.
Solution:
- Increase the GDG limit in DEFINE GDG
- Delete old generations manually
- Enable the SCRATCH option for automatic cleanup