All downloads are free. Search and download functionality uses the official Maven repository.

Download all versions of the onnxruntime-mobile JAR, with all dependencies.

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.15.0)

The ONNX Runtime Mobile package is a size-optimized inference library for executing ONNX (Open Neural Network Exchange) models on Android. It is built from the open-source inference engine, but with a reduced disk footprint targeting mobile platforms. To minimize binary size, the library supports a reduced set of operators and types, aligned with typical mobile applications. An ONNX model must be converted to ORT format before it can be used with this package. See https://onnxruntime.ai/docs/reference/ort-format-models.html for more details.
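
Since only ORT-format models load with this package, the usual flow is to convert the .onnx file offline and ship the resulting .ort file with the app; the linked docs describe a converter along the lines of `python -m onnxruntime.tools.convert_onnx_models_to_ort model.onnx`. The sketch below shows what loading and running such a model through the package's `ai.onnxruntime` API might look like in Kotlin; the model path, the input name "input", and the 1x3x224x224 shape are illustrative assumptions, not values taken from this listing.

```kotlin
import ai.onnxruntime.OnnxTensor
import ai.onnxruntime.OrtEnvironment
import java.nio.FloatBuffer

fun main() {
    // ONNX Runtime Mobile loads ORT-format models (.ort), not plain .onnx files.
    val env = OrtEnvironment.getEnvironment()
    env.createSession("model.ort").use { session ->
        // Placeholder input name and shape; substitute your model's actual values.
        val shape = longArrayOf(1, 3, 224, 224)
        val data = FloatBuffer.wrap(FloatArray(1 * 3 * 224 * 224))
        OnnxTensor.createTensor(env, data, shape).use { input ->
            session.run(mapOf("input" to input)).use { results ->
                // Print the first output's type and shape information.
                println(results.get(0).info)
            }
        }
    }
}
```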

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.14.0)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.13.1)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.12.1)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.12.0)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.11.0)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.10.0)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.10.0
Last update: 07 December 2021
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none declared (transitive dependencies may still apply)
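
For completeness, a minimal sketch of how one of these versions might be declared in a Gradle (Kotlin DSL) module, assuming the artifact is resolvable from a repository such as Maven Central; the coordinates come straight from the listing above, everything else is standard Gradle convention:

```kotlin
// build.gradle.kts (module level) — hedged sketch, not part of this listing.
dependencies {
    // group : artifact : version, as listed above.
    implementation("com.microsoft.onnxruntime:onnxruntime-mobile:1.10.0")
}
```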

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.9.0)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.9.0
Last update: 22 September 2021
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none declared (transitive dependencies may still apply)

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.8.2)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.8.2
Last update: 06 August 2021
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none declared (transitive dependencies may still apply)

onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.8.1)

Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version; download is not possible. Please choose another version.
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.8.1
Last update: 06 July 2021
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none declared (transitive dependencies may still apply)



Page 2 of 3 (21 items in total)

