File OnnxRuntimeBase.hpp

namespace Acts

Typedefs

using NetworkBatchInput = Eigen::Array<float, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>
class OnnxRuntimeBase
#include <Acts/Plugins/Onnx/OnnxRuntimeBase.hpp>

Subclassed by Acts::MLTrackClassifier

Public Functions

OnnxRuntimeBase() = default

Default constructor.

OnnxRuntimeBase(Ort::Env &env, const char *modelPath)

Parametrized constructor.

Parameters
  • env – the ONNX runtime environment

  • modelPath – the path to the ML model in *.onnx format

~OnnxRuntimeBase() = default

Default destructor.

std::vector<float> runONNXInference(std::vector<float> &inputTensorValues) const

Run the ONNX inference function.

Parameters

inputTensorValues – The input feature values used for prediction

Returns

The output (predicted) values
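A minimal usage sketch for the constructor and single-sample overload above. The model path, feature count, and feature values are placeholders; a real call requires a trained *.onnx model whose input layout matches the feature vector.

```cpp
#include <Acts/Plugins/Onnx/OnnxRuntimeBase.hpp>
#include <onnxruntime_cxx_api.h>
#include <vector>

int main() {
  // ONNX runtime environment; logging level and tag are arbitrary.
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "acts-onnx");

  // "my_model.onnx" is a placeholder path, not a model shipped with Acts.
  Acts::OnnxRuntimeBase model(env, "my_model.onnx");

  // One sample with four input features (hypothetical feature count).
  std::vector<float> features = {0.5f, 1.2f, -0.3f, 2.0f};

  // Returns the predicted values for this single sample.
  std::vector<float> prediction = model.runONNXInference(features);
}
```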

std::vector<std::vector<float>> runONNXInference(NetworkBatchInput &inputTensorValues) const

Run the ONNX inference function for a batch of input.

Parameters

inputTensorValues – Matrix of input feature values, one row per sample in the batch

Returns

The output (predicted) values, one vector per sample
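Building on the sketch above, the batched overload takes a NetworkBatchInput, i.e. a row-major Eigen array with one row per sample. The sample count, feature count, and values below are illustrative only.

```cpp
// Assumes `model` is an already-constructed Acts::OnnxRuntimeBase
// (see the constructor example above).

// 3 samples, 4 features each; setRandom() stands in for real features.
Acts::NetworkBatchInput batch(3, 4);
batch.setRandom();

// One inner vector of predictions per sample (row) of the batch.
std::vector<std::vector<float>> predictions = model.runONNXInference(batch);
```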

std::vector<std::vector<std::vector<float>>> runONNXInferenceMultiOutput(NetworkBatchInput &inputTensorValues) const

Run the multi-output ONNX inference function for a batch of input.

Parameters

inputTensorValues – Matrix of input feature values, one row per sample in the batch

Returns

The predicted values: one vector of per-sample outputs for each model output
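For models with several output nodes, the multi-output overload adds one more nesting level. Per the documentation above, the outer index runs over the model outputs; `batch` is the same hypothetical NetworkBatchInput as in the batched example.

```cpp
// Assumes `model` and `batch` as in the previous sketches.
std::vector<std::vector<std::vector<float>>> outputs =
    model.runONNXInferenceMultiOutput(batch);

// outputs[i][j] holds the j-th sample's predicted values for the
// i-th model output (one entry per output, per the docs above).
```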

Private Members

std::vector<int64_t> m_inputNodeDims
std::vector<const char*> m_inputNodeNames
std::vector<Ort::AllocatedStringPtr> m_inputNodeNamesAllocated
std::vector<std::vector<int64_t>> m_outputNodeDims
std::vector<const char*> m_outputNodeNames
std::vector<Ort::AllocatedStringPtr> m_outputNodeNamesAllocated
std::unique_ptr<Ort::Session> m_session

ONNX runtime session / model properties.