# legstar.avro

A COBOL to Apache Avro translator based on LegStar.

Mainframe data can be useful in environments where Apache Avro records are commonplace. One such environment is Hadoop, where MapReduce jobs read and write Avro records. legstar.avro helps deliver mainframe data as Avro records.
## Objectives

- Provide a generator that produces the following artifacts from a COBOL copybook:
  - an Avro schema matching the COBOL copybook
  - conversion logic to turn mainframe records into Avro generic records
  - Avro specific records compiled from the Avro schema
- Provide readers, for pure Java or for Hadoop, that read mainframe files and deliver one Avro record per mainframe record
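To illustrate what the generator does, here is a small, hypothetical example: a COBOL copybook fragment and the kind of Avro schema that could be derived from it. The field names, the record name, and the exact type mapping (for instance, representing a `COMP-3` packed decimal with Avro's `decimal` logical type) are illustrative assumptions, not the tool's guaranteed output.

```cobol
       01  CUSTOMER-RECORD.
           05  CUSTOMER-ID        PIC 9(6).
           05  CUSTOMER-NAME      PIC X(20).
           05  ACCOUNT-BALANCE    PIC S9(7)V99 COMP-3.
```

A plausible matching Avro schema (sketch):

```json
{
  "type": "record",
  "name": "CustomerRecord",
  "fields": [
    {"name": "customerId", "type": "long"},
    {"name": "customerName", "type": "string"},
    {"name": "accountBalance",
     "type": {"type": "bytes", "logicalType": "decimal",
              "precision": 9, "scale": 2}}
  ]
}
```

The conversion logic then takes each raw mainframe record (EBCDIC text, zoned and packed decimals) and populates an Avro generic record conforming to a schema like this one.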
## Requirements

- Java JDK 6 or above (it must be a full JDK, not just a JRE)
- Maven 3 (if you build from sources)
- Ant 1.9.x (to run the samples)
- Hadoop 2.4.1 (to run the Hadoop sample)
## Build from sources

- Clone the Git repository.
- From a command window, in the folder where you cloned the repository, type:

```
mvn clean install
```
## Run the samples

If you built the project from sources, you will find the distribution zip file under legstar.avro/legstar.avro.distrib/target. Otherwise, you can get the latest released zip here.

Unzip the file to a location of your choice.
### 1. Run the CustdatReader sample

Go to the samples folder and type:

```
ant
```

More information on the CustdatReader sample is available on the wiki.
### 2. Run the CustdatHadoopReader sample

Go to the samples folder and type:

```
ant -f build-hadoop.xml
```

More information on the CustdatHadoopReader sample is available on the wiki.