Genetic Programming in Java

Genetic Programming in Java, including packages on Linear Genetic Programming

License

MIT

Categories

Java Languages Net

GroupId

com.github.chen0040

ArtifactId

java-genetic-programming

Last Version

1.0.14

Release Date

Type

jar

Description

Genetic Programming in Java, including packages on Linear Genetic Programming

Project URL

https://github.com/chen0040/java-genetic-programming

Source Code Management

https://github.com/chen0040/java-genetic-programming

How to add to project

Maven:

<!-- https://jarcasting.com/artifacts/com.github.chen0040/java-genetic-programming/ -->
<dependency>
    <groupId>com.github.chen0040</groupId>
    <artifactId>java-genetic-programming</artifactId>
    <version>1.0.14</version>
</dependency>

Gradle (Groovy DSL):

// https://jarcasting.com/artifacts/com.github.chen0040/java-genetic-programming/
implementation 'com.github.chen0040:java-genetic-programming:1.0.14'

Gradle (Kotlin DSL):

// https://jarcasting.com/artifacts/com.github.chen0040/java-genetic-programming/
implementation("com.github.chen0040:java-genetic-programming:1.0.14")

Buildr:

'com.github.chen0040:java-genetic-programming:jar:1.0.14'

Ivy:

<dependency org="com.github.chen0040" name="java-genetic-programming" rev="1.0.14">
  <artifact name="java-genetic-programming" type="jar" />
</dependency>

Groovy Grape:

@Grapes(
  @Grab(group='com.github.chen0040', module='java-genetic-programming', version='1.0.14')
)

SBT:

libraryDependencies += "com.github.chen0040" % "java-genetic-programming" % "1.0.14"

Leiningen:

[com.github.chen0040/java-genetic-programming "1.0.14"]

Dependencies

compile (3)

Group / Artifact Type Version
org.slf4j : slf4j-api jar 1.7.20
org.apache.commons : commons-math3 jar 3.2
com.github.chen0040 : java-data-frame jar 1.0.2

provided (1)

Group / Artifact Type Version
org.projectlombok : lombok jar 1.16.6

test (10)

Group / Artifact Type Version
org.testng : testng jar 6.9.10
org.hamcrest : hamcrest-core jar 1.3
org.hamcrest : hamcrest-library jar 1.3
org.assertj : assertj-core jar 3.5.2
org.powermock : powermock-core jar 1.6.5
org.powermock : powermock-api-mockito jar 1.6.5
org.powermock : powermock-module-junit4 jar 1.6.5
org.powermock : powermock-module-testng jar 1.6.5
org.mockito : mockito-core jar 2.0.2-beta
org.mockito : mockito-all jar 2.0.2-beta

Project Modules

There are no modules declared in this project.

java-genetic-programming

This package provides a Java implementation of various genetic programming paradigms, such as linear genetic programming, tree genetic programming, and gene expression programming.


More details on the implementation, its complexities, and further information are provided in the docs.

Install

Add the following dependency to your POM file:

<dependency>
  <groupId>com.github.chen0040</groupId>
  <artifactId>java-genetic-programming</artifactId>
  <version>1.0.14</version>
</dependency>

Features

  • Linear Genetic Programming

    • Initialization

      • Full Register Array
      • Fixed-length Register Array
    • Crossover

      • Linear
      • One-Point
      • One-Segment
    • Mutation

      • Micro-Mutation
      • Effective-Macro-Mutation
      • Macro-Mutation
    • Replacement

      • Tournament
      • Direct-Compete
    • Default-Operators

      • Most of the math operators
      • if-less, if-greater
      • Support operator extension
  • Tree Genetic Programming

    • Initialization

      • Full
      • Grow
      • PTC 1
      • Random Branch
      • Ramped Full
      • Ramped Grow
      • Ramped Half-Half
    • Crossover

      • Subtree Bias
      • Subtree No Bias
    • Mutation

      • Subtree
      • Subtree Kinnear
      • Hoist
      • Shrink
    • Evolution Strategy

      • (mu + lambda)
      • TinyGP

Future Works

  • Grammar-based Genetic Programming
  • Gene Expression Programming

Usage of Linear Genetic Programming

Create training data

The sample code below shows how to generate data for the "Mexican Hat" regression problem (please refer to Tutorials.mexican_hat() in the source code for how the data is created):

List<Observation> data = Tutorials.mexican_hat();

We can split the data generated into training and testing data:

List<Observation> data = Tutorials.mexican_hat();
CollectionUtils.shuffle(data);
TupleTwo<List<Observation>, List<Observation>> split_data = CollectionUtils.split(data, 0.9);
List<Observation> trainingData = split_data._1();
List<Observation> testingData = split_data._2();
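For readers unfamiliar with the benchmark, the sketch below shows one common form of the "Mexican Hat" target function from the LGP literature. This is a self-contained illustration only; the exact formula and sampling used by the library's Tutorials.mexican_hat() may differ.

```java
// Self-contained sketch of a "Mexican Hat"-style dataset generator.
// NOTE: the formula and grid here are assumptions for illustration;
// the library's Tutorials.mexican_hat() may use different values.
public class MexicanHatSketch {

    // f(x, y) = (1 - x^2/4 - y^2/4) * exp(-(x^2 + y^2) / 8)
    public static double mexicanHat(double x, double y) {
        double r = x * x + y * y;
        return (1.0 - x * x / 4.0 - y * y / 4.0) * Math.exp(-r / 8.0);
    }

    public static void main(String[] args) {
        // Sample a small grid of (x, y, f(x, y)) training triples.
        for (double x = -4.0; x <= 4.0; x += 2.0) {
            for (double y = -4.0; y <= 4.0; y += 2.0) {
                System.out.printf("%.1f\t%.1f\t%.4f%n", x, y, mexicanHat(x, y));
            }
        }
    }
}
```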

Create and train the LGP

The sample code below shows how the LGP can be created and trained:

import com.github.chen0040.gp.lgp.LGP;
import com.github.chen0040.gp.commons.BasicObservation;
import com.github.chen0040.gp.commons.Observation;
import com.github.chen0040.gp.lgp.gp.Population;
import com.github.chen0040.gp.lgp.program.operators.*;

LGP lgp = LGP.defaultConfig();
lgp.setRegisterCount(6); // the number of registers here is the input dimension of the training data times 3
lgp.setCostEvaluator((program, observations)->{
 double error = 0;
 for(Observation observation : observations){
    program.execute(observation);
    error += Math.pow(observation.getOutput(0) - observation.getPredictedOutput(0), 2.0);
 }

 return error;
});

lgp.setDisplayEvery(10); // display iteration results every 10 generations

Program program = lgp.fit(trainingData);
System.out.println(program);

The number of registers of a linear program is set by calling LGP.setRegisterCount(...); it is usually a multiple of the input dimension of a training data instance. For example, if the training data has a 2-dimensional input (x, y), then the number of registers may be set to 6 = 2 * 3.

The cost evaluator computes the training cost of a 'program' on the 'observations'.

The last line prints the linear program found by the LGP evolution, a sample of which is shown below (by calling program.toString()):

instruction[1]: <If<	r[4]	c[0]	r[4]>
instruction[2]: <If<	r[3]	c[3]	r[0]>
instruction[3]: <-	r[2]	r[3]	r[2]>
instruction[4]: <*	c[7]	r[2]	r[2]>
instruction[5]: <If<	c[2]	r[3]	r[1]>
instruction[6]: </	r[1]	c[4]	r[2]>
instruction[7]: <If<	r[3]	c[7]	r[1]>
instruction[8]: <-	c[0]	r[0]	r[0]>
...
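To give an intuition for how such a listing executes (this is a simplified sketch, not the library's actual interpreter): each instruction applies an operator to two operands, registers r[...] or constants c[...], and writes the result to a register. The operand/destination ordering assumed below is illustrative; for example, instruction[3] is read here as r[2] = r[2] - r[3].

```java
// Minimal register-machine sketch illustrating how one LGP instruction
// such as "<- r[2] r[3] r[2]>" might execute. The operand order, the
// destination position, and the protected-division convention are all
// assumptions for illustration, not the library's documented semantics.
public class LinearProgramSketch {

    public static double[] step(double[] registers, char op, double a, double b, int dest) {
        double[] out = registers.clone();
        switch (op) {
            case '+': out[dest] = a + b; break;
            case '-': out[dest] = a - b; break;
            case '*': out[dest] = a * b; break;
            // protected division: a common GP convention to avoid divide-by-zero
            case '/': out[dest] = (b == 0.0) ? 1.0 : a / b; break;
            default: throw new IllegalArgumentException("unknown op: " + op);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] r = {0.0, 0.0, 5.0, 7.0};
        // interpret "<- r[2] r[3] r[2]>" as r[2] = r[2] - r[3] (assumed order)
        r = step(r, '-', r[2], r[3], 2);
        System.out.println(r[2]); // prints -2.0
    }
}
```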

Test the program obtained from the LGP evolution

The best program in the LGP population obtained from the training in the above step can then be used for prediction, as shown by the sample code below:

for(Observation observation : testingData) {
 program.execute(observation);
 double predicted = observation.getPredictedOutput(0);
 double actual = observation.getOutput(0);

 logger.info("predicted: {}\tactual: {}", predicted, actual);
}
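To summarize prediction quality over the whole test set rather than logging pair by pair, one common choice is the mean squared error over the collected (predicted, actual) pairs. The helper below is a plain-Java sketch, independent of the library's API:

```java
// Plain-Java mean squared error over collected (predicted, actual) pairs.
// Illustrative helper only; not part of the java-genetic-programming API.
public class MseSketch {

    public static double mse(double[] predicted, double[] actual) {
        double sum = 0.0;
        for (int i = 0; i < predicted.length; i++) {
            double d = predicted[i] - actual[i];
            sum += d * d; // accumulate squared residuals
        }
        return sum / predicted.length;
    }

    public static void main(String[] args) {
        double[] predicted = {1.0, 2.0};
        double[] actual = {1.0, 4.0};
        System.out.println(mse(predicted, actual)); // prints 2.0
    }
}
```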

Usage of Tree Genetic Programming

Here we will use the "Mexican Hat" symbolic regression problem introduced earlier to illustrate tree genetic programming.

Create and train the TreeGP

The sample code below shows how the TreeGP can be created and trained:

import com.github.chen0040.gp.treegp.TreeGP;
import com.github.chen0040.gp.commons.BasicObservation;
import com.github.chen0040.gp.commons.Observation;
import com.github.chen0040.gp.treegp.gp.Population;
import com.github.chen0040.gp.treegp.program.operators.*;

TreeGP tgp = TreeGP.defaultConfig();
tgp.setVariableCount(2); // equal to the number of input dimension of the training data
tgp.setCostEvaluator((program, observations)->{
 double error = 0;
 for(Observation observation : observations){
    program.execute(observation);
    error += Math.pow(observation.getOutput(0) - observation.getPredictedOutput(0), 2.0);
 }

 return error;
});
tgp.setDisplayEvery(10); // display iteration results every 10 generations

Solution program = tgp.fit(trainingData);
System.out.println(program.mathExpression());

The last line prints the TreeGP program found by the TreeGP evolution, a sample of which is shown below (by calling program.mathExpression()):

Trees[0]: 1.0 - (if(1.0 < if(1.0 < 1.0, if(1.0 < v0, 1.0, 1.0), if(1.0 < (v1 * v0) + (1.0 / 1.0), 1.0 + 1.0, 1.0)), 1.0, v0 ^ 1.0))
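The if(a < b, t, f) form in the printed expression is the functional if-less primitive. Assuming conventional semantics (the helper below is an illustration, not the library's API), it evaluates the condition and returns one of the two branches:

```java
// Illustrative evaluation of the functional if-less primitive that appears
// in printed TreeGP expressions, e.g. if(1.0 < v0, t, f). This helper is an
// assumption about the semantics, not part of the library's API.
public class IfLessSketch {

    public static double ifLess(double a, double b, double thenValue, double elseValue) {
        return a < b ? thenValue : elseValue;
    }

    public static void main(String[] args) {
        double v0 = 2.5;
        // if(1.0 < v0, v0 * v0, 0.0) -> v0 * v0, because 1.0 < 2.5 holds
        System.out.println(ifLess(1.0, v0, v0 * v0, 0.0)); // prints 6.25
    }
}
```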

Test the program obtained from the TreeGP evolution

The best program in the TreeGP population obtained from the training in the above step can then be used for prediction, as shown by the sample code below:

for(Observation observation : testingData) {
 program.execute(observation);
 double predicted = observation.getPredictedOutput(0);
 double actual = observation.getOutput(0);

 logger.info("predicted: {}\tactual: {}", predicted, actual);
}

Versions

Version
1.0.14
1.0.13
1.0.12
1.0.11
1.0.10
1.0.9
1.0.8
1.0.7
1.0.6
1.0.5
1.0.4
1.0.3
1.0.2
1.0.1