Download all versions of onnxruntime-mobile JAR files with all dependencies
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.15.0)
The ONNX Runtime Mobile package is a size-optimized inference library for executing ONNX (Open Neural Network Exchange) models on Android. It is built from the open-source inference engine but with a reduced disk footprint, targeting mobile platforms. To minimize binary size, the library supports a reduced set of operators and types aligned with typical mobile applications. ONNX models must be converted to ORT format before they can be used with this package. See https://onnxruntime.ai/docs/reference/ort-format-models.html for details.
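As a sketch of how a consumer would declare this artifact, a module-level Gradle build file might contain the following (the coordinates are taken from this listing; the version shown is the latest on this page, and the package is published as an Android AAR rather than a plain JAR, which is why no JAR download is offered):

```groovy
dependencies {
    // onnxruntime-mobile from this listing; distributed as an AAR for Android,
    // so it is consumed via Gradle rather than downloaded as a JAR.
    implementation 'com.microsoft.onnxruntime:onnxruntime-mobile:1.15.0'
}
```

The required ORT-format conversion is done ahead of time with the tool bundled in the onnxruntime Python package, e.g. `python -m onnxruntime.tools.convert_onnx_models_to_ort model.onnx` (the filename here is a placeholder); see the documentation linked above for details.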
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.15.0
Last update 24 May 2023
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.14.0)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.14.0
Last update 10 February 2023
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.13.1)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.13.1
Last update 24 October 2022
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.12.1)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.12.1
Last update 4 August 2022
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.12.0)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.12.0
Last update 22 July 2022
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.11.0)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.11.0
Last update 25 March 2022
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.10.0)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.10.0
Last update 7 December 2021
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.9.0)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.9.0
Last update 22 September 2021
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.8.2)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.8.2
Last update 6 August 2021
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
onnxruntime-mobile from group com.microsoft.onnxruntime (version 1.8.1)
Group: com.microsoft.onnxruntime Artifact: onnxruntime-mobile
No JAR file has been uploaded for this version, so a direct download is not possible. Please choose another version.
0 downloads
Artifact onnxruntime-mobile
Group com.microsoft.onnxruntime
Version 1.8.1
Last update 6 July 2021
Organization Microsoft
URL https://microsoft.github.io/onnxruntime/
License MIT License
Dependencies none
There may be transitive dependencies.
Page 2 of 3 (21 items total)