All downloads are free. Search and download functionality uses the official Maven repository.

Download all versions of onnxruntime-mobile JAR files with all dependencies

Search JAR files by class name

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.17.3)

The ONNX Runtime Mobile package is a size-optimized inference library for executing ONNX (Open Neural Network Exchange) models on Android. It is built from the same open-source inference engine as the full package, but with a reduced disk footprint targeting mobile platforms. To minimize binary size, the library supports a reduced set of operators and types, aligned with typical mobile applications. ONNX models must be converted to ORT format before they can be used with this package. See https://onnxruntime.ai/docs/reference/ort-format-models.html for details.
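The conversion step mentioned above is typically performed with the converter that ships in the onnxruntime Python package (a sketch, assuming the package is installed via pip; the `./models` directory is a placeholder):

```shell
# Install the full onnxruntime Python package, which bundles the converter
pip install onnxruntime

# Convert each .onnx model under ./models, writing .ort files alongside them
python -m onnxruntime.tools.convert_onnx_models_to_ort ./models
```

The converter also emits a required-operators configuration file that can be used to build an even smaller custom runtime; see the ORT-format-models page linked above.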

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
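For a published release, the coordinates above translate into a standard Maven dependency declaration (the version shown is one of the releases listed on this page; the artifact is packaged as an Android AAR, so the `type` element below reflects that assumption):

```xml
<!-- Maven coordinates for the ONNX Runtime Mobile package -->
<dependency>
    <groupId>com.microsoft.onnxruntime</groupId>
    <artifactId>onnxruntime-mobile</artifactId>
    <version>1.17.3</version>
    <type>aar</type>
</dependency>
```

In an Android Gradle build the equivalent one-liner is `implementation("com.microsoft.onnxruntime:onnxruntime-mobile:1.17.3")`.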
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
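Once a model is in ORT format, it can be loaded through the library's Java API (`ai.onnxruntime`). A minimal sketch, assuming the dependency is on the classpath; the model path, input name `"input"`, and 1x3 shape are hypothetical and must match your actual model:

```java
import ai.onnxruntime.OnnxTensor;
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtSession;

import java.util.Collections;

public class OrtMobileExample {
    public static void main(String[] args) throws Exception {
        // The environment is a process-wide singleton
        OrtEnvironment env = OrtEnvironment.getEnvironment();

        // Load a model that was converted to ORT format beforehand
        try (OrtSession session = env.createSession("model.ort", new OrtSession.SessionOptions())) {
            // Placeholder input; use your model's real input name and shape
            float[][] data = {{1.0f, 2.0f, 3.0f}};
            try (OnnxTensor input = OnnxTensor.createTensor(env, data);
                 OrtSession.Result result = session.run(Collections.singletonMap("input", input))) {
                System.out.println(result.get(0).getValue());
            }
        }
    }
}
```

Sessions, tensors, and results are `AutoCloseable`, so try-with-resources releases native memory deterministically.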

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.17.1)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.17.0)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.3)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.2)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.1)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.0)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.0-rc1)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.15.1)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.15.0)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads



Page 1 of 2 (20 items total)


© 2015 - 2024 Weber Informatics LLC | Privacy Policy