Download all versions of onnxruntime-mobile JAR files with all dependencies
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.18.0)
The ONNX Runtime Mobile package is a size-optimized inference library for executing ONNX (Open Neural Network Exchange) models on Android. It is built from the open-source inference engine, but with a reduced disk footprint targeting mobile platforms. To minimize binary size, the library supports a reduced set of operators and types aligned with typical mobile applications. ONNX models must be converted to the ORT format before they can be used with this package. See https://onnxruntime.ai/docs/reference/ort-format-models.html for more details.
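For reference, the artifact can be declared in an Android project's Gradle build script. The snippet below is a sketch assuming Maven Central is configured as a repository; 1.18.0 is simply the latest version listed here. The artifact is published as an Android archive (AAR) rather than a plain JAR, which is likely why no JAR download is available.

```groovy
dependencies {
    // ONNX Runtime Mobile: size-optimized Android build, runs ORT-format models only
    implementation 'com.microsoft.onnxruntime:onnxruntime-mobile:1.18.0'
}
```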
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.18.0
Last update: 20 May 2024
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
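As the description notes, a model must be converted to the ORT format before this package can load it. A minimal sketch of the conversion step, assuming the onnxruntime Python package is installed and `model.onnx` is a placeholder path:

```shell
# Converts model.onnx to an ORT-format model alongside it (path is a placeholder)
python -m onnxruntime.tools.convert_onnx_models_to_ort model.onnx
```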
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.17.3)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.17.3
Last update: 10 April 2024
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
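A sketch of loading a converted model with the ONNX Runtime Java API, assuming this artifact is on the classpath; the model path, input shape, and output type below are hypothetical and depend on the specific model:

```java
import ai.onnxruntime.OnnxTensor;
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtException;
import ai.onnxruntime.OrtSession;

import java.nio.FloatBuffer;
import java.util.Collections;

public class OrtExample {
    public static void main(String[] args) throws OrtException {
        // Shared environment that owns the native runtime resources.
        OrtEnvironment env = OrtEnvironment.getEnvironment();
        // Load an ORT-format model (path is a placeholder).
        try (OrtSession session = env.createSession("model.ort",
                new OrtSession.SessionOptions())) {
            // Hypothetical single float input of shape [1, 4].
            float[] data = {0.1f, 0.2f, 0.3f, 0.4f};
            OnnxTensor input = OnnxTensor.createTensor(
                    env, FloatBuffer.wrap(data), new long[]{1, 4});
            String inputName = session.getInputNames().iterator().next();
            try (OrtSession.Result result =
                         session.run(Collections.singletonMap(inputName, input))) {
                // Output type depends on the model; a 2-D float tensor is assumed here.
                float[][] output = (float[][]) result.get(0).getValue();
                System.out.println("Output length: " + output[0].length);
            }
        }
    }
}
```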
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.17.1)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.17.1
Last update: 26 February 2024
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.17.0)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.17.0
Last update: 1 February 2024
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.3)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.16.3
Last update: 21 November 2023
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.2)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.16.2
Last update: 9 November 2023
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.1)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.16.1
Last update: 11 October 2023
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.0)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.16.0
Last update: 19 September 2023
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.16.0-rc1)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.16.0-rc1
Last update: 9 September 2023
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.15.1)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a download is not possible. Please choose another version.
0 downloads
Artifact: onnxruntime-mobile
Group: com.microsoft.onnxruntime
Version: 1.15.1
Last update: 16 June 2023
Organization: Microsoft
URL: https://microsoft.github.io/onnxruntime/
License: MIT License
Dependencies: none
There may be transitive dependencies.
Page 1 of 3 (21 items in total)