
ai.onnxruntime.package-info


ONNX Runtime is a performance-focused inference engine for ONNX (Open Neural Network Exchange) models.

/*
 * Copyright (c) 2019, Oracle and/or its affiliates. All rights reserved.
 * Licensed under the MIT License.
 */

/**
 * A Java interface to the ONNX Runtime.
 *
 * <p>Provides access to the same execution backends as the C library. Non-representable types in
 * Java (such as fp16) are converted into the nearest Java primitive type when accessed through
 * this API.
 *
 * <p>There are two shared libraries required: <code>onnxruntime</code> and
 * <code>onnxruntime4j_jni</code>. The loader is in {@link ai.onnxruntime.OnnxRuntime} and the
 * logic is in this order:
 *
 * <ol>
 *   <li>The user may signal to skip loading of a shared library using a property in the form
 *       <code>onnxruntime.native.LIB_NAME.skip</code> with a value of <code>true</code>. This
 *       means the user has decided to load the library by some other means.
 *   <li>The user may specify an explicit location of the shared library file using a property in
 *       the form <code>onnxruntime.native.LIB_NAME.path</code>. This uses {@link
 *       java.lang.System#load}.
 *   <li>The shared library is autodiscovered:
 *       <ol>
 *         <li>If the shared library is present in the classpath resources, load using {@link
 *             java.lang.System#load} via a temporary file. Ideally, this should be the default
 *             use case when adding JARs/dependencies containing the shared libraries to your
 *             classpath.
 *         <li>If the shared library is not present in the classpath resources, then load using
 *             {@link java.lang.System#loadLibrary}, which usually looks elsewhere on the
 *             filesystem for the library. The semantics and behavior of that method are
 *             system/JVM dependent. Typically, the <code>java.library.path</code> property is
 *             used to specify the location of native libraries.
 *       </ol>
 * </ol>
 *
 * <p>For troubleshooting, all shared library loading events are reported to Java logging at the
 * level FINE.
 */
package ai.onnxruntime;
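
The loading order above can be steered from application code (or with equivalent -D command-line flags) before the first ai.onnxruntime class is initialised. Below is a minimal sketch, assuming the property names documented in the Javadoc (onnxruntime.native.LIB_NAME.skip and onnxruntime.native.LIB_NAME.path); the class name and file paths are placeholders chosen for illustration, not real install locations.

import java.util.logging.Level;
import java.util.logging.Logger;

import ai.onnxruntime.OrtEnvironment;

/** Sketch: steering the native-library loader via the documented system properties. */
public final class LoaderConfigSketch {
  public static void main(String[] args) throws Exception {
    // Step 2 of the list: point the loader at explicit shared-library files.
    // Equivalent to -Donnxruntime.native.onnxruntime.path=... on the command line.
    // The paths below are placeholders for wherever the libraries live on your system.
    System.setProperty("onnxruntime.native.onnxruntime.path",
        "/opt/onnxruntime/lib/libonnxruntime.so");
    System.setProperty("onnxruntime.native.onnxruntime4j_jni.path",
        "/opt/onnxruntime/lib/libonnxruntime4j_jni.so");

    // Step 1 (alternative): skip the loader for a library loaded by some other means.
    // System.setProperty("onnxruntime.native.onnxruntime.skip", "true");
    // System.load("/some/other/location/libonnxruntime.so");

    // Raise the logger level so the FINE-level loading events mentioned above are visible
    // (a logging handler configured to pass FINE records is also required).
    Logger.getLogger("ai.onnxruntime").setLevel(Level.FINE);

    // The properties must be set before the first ONNX Runtime class triggers loading,
    // i.e. before this call.
    OrtEnvironment env = OrtEnvironment.getEnvironment();
    System.out.println("ONNX Runtime loaded: " + env);
  }
}

If neither property is set, the loader falls back to the autodiscovery path described in step 3: it first tries the shared libraries bundled in the JAR on the classpath, and only then System.loadLibrary with java.library.path.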



