Project Group: com.microsoft.onnxruntime

onnx-runtime

com.microsoft.onnxruntime : onnxruntime

ONNX Runtime is a performance-focused inference engine for ONNX (Open Neural Network Exchange) models.

Last Version: 1.11.0

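The coordinates above map to a standard Maven dependency entry; a minimal sketch, using the 1.11.0 version shown in this listing (newer releases may exist):

```xml
<dependency>
  <groupId>com.microsoft.onnxruntime</groupId>
  <artifactId>onnxruntime</artifactId>
  <version>1.11.0</version>
</dependency>
```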

onnx-runtime

com.microsoft.onnxruntime : onnxruntime_gpu

ONNX Runtime is a performance-focused inference engine for ONNX (Open Neural Network Exchange) models. This artifact is the GPU-enabled build of ONNX Runtime.

Last Version: 1.11.0

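With the GPU artifact on the classpath, CUDA execution can be requested through the Java API's session options. A sketch, assuming a CUDA-capable machine and a placeholder model file name (`model.onnx`):

```java
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtSession;

public class GpuSessionSketch {
    public static void main(String[] args) throws Exception {
        OrtEnvironment env = OrtEnvironment.getEnvironment();
        OrtSession.SessionOptions opts = new OrtSession.SessionOptions();
        // Request the CUDA execution provider on GPU device 0
        opts.addCUDA(0);
        // "model.onnx" is a placeholder path, not a file shipped with the artifact
        try (OrtSession session = env.createSession("model.onnx", opts)) {
            System.out.println(session.getInputNames());
        }
    }
}
```

Note that the Maven group is `com.microsoft.onnxruntime`, but the Java package itself is `ai.onnxruntime`.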

onnx-runtime

com.microsoft.onnxruntime : onnxruntime-mobile

The ONNX Runtime Mobile package is a size optimized inference library for executing ONNX (Open Neural Network Exchange) models on Android. This package is built from the open source inference engine but with reduced disk footprint targeting mobile platforms. To minimize binary size this library supports a reduced set of operators and types aligned to typical mobile applications. The ONNX model must be converted to ORT format in order to use it with this package. See https://onnxruntime.ai/docs/reference/ort-model-format.html for more details.

Last Version: 1.11.0

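As the description notes, models must first be converted to ORT format before this package can load them. A sketch of the conversion using the tool bundled with the Python `onnxruntime` package (assuming it is installed via pip; `model.onnx` is a placeholder file name):

```shell
# Install the full onnxruntime Python package, which bundles the converter
pip install onnxruntime

# Convert an ONNX model to ORT format; writes model.ort next to the input
python -m onnxruntime.tools.convert_onnx_models_to_ort model.onnx
```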

onnx-runtime

com.microsoft.onnxruntime : onnxruntime-android

ONNX Runtime is a performance-focused inference engine for ONNX (Open Neural Network Exchange) models. This package contains the Android (aar) build of ONNX Runtime. It includes support for all types and operators, for ONNX format models. All standard ONNX models can be executed with this package. As such the binary size and memory usage will be larger than the onnxruntime-mobile package.

Last Version: 1.11.0

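Because this artifact is an Android aar, it is typically consumed from Gradle rather than a Maven POM; a minimal dependency declaration, using the version shown in this listing:

```groovy
dependencies {
    // Android (aar) build of ONNX Runtime with full operator/type coverage
    implementation 'com.microsoft.onnxruntime:onnxruntime-android:1.11.0'
}
```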
