DJL Engine Adapter for PyTorch

Deep Java Library (DJL) Engine Adapter for PyTorch

License

Apache-2.0

Categories

DJL, Business Logic Libraries, Machine Learning

GroupId

ai.djl.pytorch

ArtifactId

pytorch-engine-precxx11

Last Version

0.7.0

Type

jar

Description

DJL Engine Adapter for PyTorch
Deep Java Library (DJL) Engine Adapter for PyTorch

Source Code Management

https://github.com/awslabs/djl

How to add to project

Maven:

<!-- https://jarcasting.com/artifacts/ai.djl.pytorch/pytorch-engine-precxx11/ -->
<dependency>
    <groupId>ai.djl.pytorch</groupId>
    <artifactId>pytorch-engine-precxx11</artifactId>
    <version>0.7.0</version>
</dependency>

Gradle (Groovy DSL):

implementation 'ai.djl.pytorch:pytorch-engine-precxx11:0.7.0'

Gradle (Kotlin DSL):

implementation("ai.djl.pytorch:pytorch-engine-precxx11:0.7.0")

Buildr:

'ai.djl.pytorch:pytorch-engine-precxx11:jar:0.7.0'

Ivy:

<dependency org="ai.djl.pytorch" name="pytorch-engine-precxx11" rev="0.7.0">
  <artifact name="pytorch-engine-precxx11" type="jar" />
</dependency>

Groovy Grape:

@Grapes(
    @Grab(group='ai.djl.pytorch', module='pytorch-engine-precxx11', version='0.7.0')
)

SBT:

libraryDependencies += "ai.djl.pytorch" % "pytorch-engine-precxx11" % "0.7.0"

Leiningen:

[ai.djl.pytorch/pytorch-engine-precxx11 "0.7.0"]

Dependencies

compile (1)

Group / Artifact Type Version
ai.djl : api jar 0.7.0

Project Modules

There are no modules declared in this project.


Deep Java Library (DJL)

Overview

Deep Java Library (DJL) is an open-source, high-level, engine-agnostic Java framework for deep learning. DJL is designed to be easy to get started with and simple to use for Java developers. DJL provides a native Java development experience and functions like any other regular Java library.

You don't have to be a machine learning/deep learning expert to get started. You can use your existing Java expertise as an on-ramp to learn and use machine learning and deep learning. You can use your favorite IDE to build, train, and deploy your models, and DJL makes it easy to integrate these models with your Java applications.
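To illustrate that plain-Java feel, here is a minimal sketch of DJL's NDArray API. It assumes ai.djl:api plus an engine artifact such as pytorch-engine-precxx11 are on the classpath; the class name and values are illustrative, not from this page:

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;

public class NDArrayExample {
    public static void main(String[] args) {
        // NDManager owns the native memory behind each NDArray;
        // try-with-resources releases it deterministically
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray a = manager.create(new float[] {1f, 2f, 3f, 4f}, new Shape(2, 2));
            NDArray b = a.add(1);   // elementwise add, broadcast over the 2x2 array
            NDArray sum = b.sum();  // scalar NDArray holding 2+3+4+5
            System.out.println(sum.getFloat());
        }
    }
}
```

Because NDArrays are backed by native engine memory, scoping them to an NDManager (as above) is the idiomatic way to avoid leaks.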

Because DJL is deep learning engine agnostic, you don't have to make a choice between engines when creating your projects. You can switch engines at any point. To ensure the best performance, DJL also provides automatic CPU/GPU choice based on hardware configuration.
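One way to inspect which engine and device DJL has selected is sketched below. This is a hypothetical example using ai.djl:api's Engine and Device classes; treat the exact method names as assumptions to verify against your DJL version:

```java
import ai.djl.Device;
import ai.djl.engine.Engine;

public class EngineCheck {
    public static void main(String[] args) {
        // The default engine is discovered from whichever engine jar is on the classpath
        Engine engine = Engine.getInstance();
        System.out.println("Engine: " + engine.getEngineName());

        // Prefer a GPU when the engine reports one, otherwise fall back to CPU
        Device device = engine.getGpuCount() > 0 ? Device.gpu() : Device.cpu();
        System.out.println("Device: " + device);
    }
}
```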

DJL's ergonomic API is designed to guide you with best practices to accomplish deep learning tasks. The following pseudocode demonstrates running inference:

    // Assume the user wants a pre-trained model from the model zoo; they just need to load it
    Criteria<Image, Classifications> criteria =
            Criteria.builder()
                    .optApplication(Application.CV.OBJECT_DETECTION) // find object detection model
                    .setTypes(Image.class, Classifications.class) // define input and output
                    .optFilter("backbone", "resnet50") // choose network architecture
                    .build();

    try (ZooModel<Image, Classifications> model = ModelZoo.loadModel(criteria)) {
        try (Predictor<Image, Classifications> predictor = model.newPredictor()) {
            Image img = ImageFactory.getInstance().fromUrl("http://..."); // read image
            Classifications result = predictor.predict(img);

            // get the classification and probability
            ...
        }
    }

The following pseudocode demonstrates running training:

    // Construct your neural network with built-in blocks
    Block block = new Mlp(28, 28);

    try (Model model = Model.newInstance("mlp")) { // Create an empty model
        model.setBlock(block); // set neural network to model

        // Get training and validation dataset (MNIST dataset)
        Dataset trainingSet = new Mnist.Builder().setUsage(Usage.TRAIN) ... .build();
        Dataset validateSet = new Mnist.Builder().setUsage(Usage.TEST) ... .build();

        // Setup training configurations, such as Initializer, Optimizer, Loss ...
        TrainingConfig config = setupTrainingConfig();
        try (Trainer trainer = model.newTrainer(config)) {
            /*
             * Configure the input shape based on the dataset to initialize the trainer.
             * The first axis is the batch axis; we can use 1 for initialization.
             * Each MNIST image is 28x28 grayscale, pre-processed into a 28 * 28 NDArray.
             */
            Shape inputShape = new Shape(1, 28 * 28);
            trainer.initialize(new Shape[] {inputShape});

            EasyTrain.fit(trainer, epoch, trainingSet, validateSet);
        }

        // Save the model
        model.save(modelDir, "mlp");
    }

Getting Started

Resources

Release Notes

Building From Source

To build from source, first check out the code. Once you have the code locally, build it with Gradle:

# for Linux/macOS:
./gradlew build

# for Windows:
gradlew build

To increase build speed, you can use the following command to skip unit tests:

# for Linux/macOS:
./gradlew build -x test

# for Windows:
gradlew build -x test

Note: SpotBugs is not compatible with JDK 11+. SpotBugs will not be executed if you are using JDK 11+.

Community

Join our Slack channel to get in touch with the development team for questions and discussions.

Follow us on Twitter to see updates about new content, features, and releases.

Follow our Zhihu column for the latest DJL content!

License

This project is licensed under the Apache-2.0 License.

ai.djl.pytorch

Amazon Web Services - Labs (AWS Labs)

Versions

0.7.0
0.6.0