Last Version

multiLayerPerceptrons 1.0.10

This package currently contains classes for training multilayer perceptrons with one hidden layer, where the number of hidden units is user specified. MLPClassifier can be used for classification problems, and MLPRegressor is the corresponding class for numeric prediction tasks. The former has as many output units as there are classes; the latter has a single output unit. Both minimise a penalised squared error with a quadratic penalty on the (non-bias) weights, i.e., they implement "weight decay", and this penalised error is averaged over all training instances. The size of the penalty can be set by the user via the "ridge" parameter to control overfitting: the sum of squared weights is multiplied by this parameter before being added to the squared error. Both classes use BFGS optimisation by default to find parameters that correspond to a local minimum of the error function, but conjugate gradient descent is optionally available and can be faster for problems with many parameters. Logistic functions are used as the activation functions for all units apart from the output unit in MLPRegressor, which employs the identity function. Input attributes are standardised to zero mean and unit variance. MLPRegressor also rescales the target attribute (i.e., the "class") using standardisation. All network parameters are initialised with small normally distributed random values.
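
For a concrete starting point, the description above translates into the usual Weka classifier workflow. The following is a minimal sketch, not part of the package documentation: the dataset path is a placeholder, and the -N (hidden units), -R (ridge) and -G (conjugate gradient descent) option names should be verified against the installed version's option listing.

import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.functions.MLPClassifier;
import weka.core.Instances;
import weka.core.Utils;
import weka.core.converters.ConverterUtils.DataSource;

public class MLPClassifierDemo {
  public static void main(String[] args) throws Exception {
    // Load a dataset with a nominal class attribute ("iris.arff" is a placeholder path)
    // and treat the last attribute as the class.
    Instances data = DataSource.read("iris.arff");
    data.setClassIndex(data.numAttributes() - 1);

    MLPClassifier mlp = new MLPClassifier();
    // Assumed option names (check your version's documentation): -N = number of hidden
    // units, -R = ridge penalty on the non-bias weights ("weight decay"), -G = use
    // conjugate gradient descent instead of the default BFGS optimisation.
    mlp.setOptions(Utils.splitOptions("-N 10 -R 0.01 -G"));

    // Estimate performance with 10-fold cross-validation.
    Evaluation eval = new Evaluation(data);
    eval.crossValidateModel(mlp, data, 10, new Random(1));
    System.out.println(eval.toSummaryString());
  }
}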

License

Categories

Weka, Business Logic Libraries, Machine Learning
GroupId

nz.ac.waikato.cms.weka
ArtifactId

multiLayerPerceptrons
Version

1.0.10
Type

jar
Description

multiLayerPerceptrons
Project URL

http://weka.sourceforge.net/doc.packages/multiLayerPerceptrons
Project Organization

University of Waikato, Hamilton, NZ
Source Code Management

https://svn.cms.waikato.ac.nz/svn/weka/tags/multiLayerPerceptrons-1.0.10

Download multiLayerPerceptrons 1.0.10


Maven

<!-- https://jarcasting.com/artifacts/nz.ac.waikato.cms.weka/multiLayerPerceptrons/ -->
<dependency>
    <groupId>nz.ac.waikato.cms.weka</groupId>
    <artifactId>multiLayerPerceptrons</artifactId>
    <version>1.0.10</version>
</dependency>

Gradle (Groovy DSL)

// https://jarcasting.com/artifacts/nz.ac.waikato.cms.weka/multiLayerPerceptrons/
implementation 'nz.ac.waikato.cms.weka:multiLayerPerceptrons:1.0.10'

Gradle (Kotlin DSL)

// https://jarcasting.com/artifacts/nz.ac.waikato.cms.weka/multiLayerPerceptrons/
implementation("nz.ac.waikato.cms.weka:multiLayerPerceptrons:1.0.10")

Buildr

'nz.ac.waikato.cms.weka:multiLayerPerceptrons:jar:1.0.10'

Ivy

<dependency org="nz.ac.waikato.cms.weka" name="multiLayerPerceptrons" rev="1.0.10">
  <artifact name="multiLayerPerceptrons" type="jar" />
</dependency>

Grape

@Grapes(
    @Grab(group='nz.ac.waikato.cms.weka', module='multiLayerPerceptrons', version='1.0.10')
)

SBT

libraryDependencies += "nz.ac.waikato.cms.weka" % "multiLayerPerceptrons" % "1.0.10"

Leiningen

[nz.ac.waikato.cms.weka/multiLayerPerceptrons "1.0.10"]
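
Once the artifact is on the classpath via one of the snippets above, MLPRegressor can be applied to numeric prediction in the same way. The following is a minimal sketch, assuming an ARFF file whose last attribute is the numeric target ("cpu.arff" is a placeholder path).

import weka.classifiers.functions.MLPRegressor;
import weka.core.Instance;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class MLPRegressorDemo {
  public static void main(String[] args) throws Exception {
    // Load a dataset with a numeric class attribute and mark the last attribute as the target.
    Instances data = DataSource.read("cpu.arff");
    data.setClassIndex(data.numAttributes() - 1);

    // Defaults as described above: BFGS optimisation, logistic hidden units,
    // identity output unit, inputs and target standardised internally.
    MLPRegressor reg = new MLPRegressor();
    reg.buildClassifier(data);

    // Predict the numeric target for the first training instance.
    Instance first = data.instance(0);
    System.out.println("predicted = " + reg.classifyInstance(first)
        + ", actual = " + first.classValue());
  }
}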

Dependencies

compile (1)

Group / Artifact                     Type       Version
nz.ac.waikato.cms.weka : weka-dev    jar        [3.7.6,)

test (2)

Group / Artifact                     Type       Version
nz.ac.waikato.cms.weka : weka-dev    test-jar   [3.7.6,)
junit : junit                        jar        3.8.2

Project Modules

There are no modules declared in this project.