onnx_diagnostic.helpers.rt_helper
- onnx_diagnostic.helpers.rt_helper.make_feeds(proto: ModelProto | List[str], inputs: Any, use_numpy: bool = False, copy: bool = False, check_flatten: bool = True) → Dict[str, Tensor | ndarray] [source]
Serializes the inputs to produce the feeds expected by
onnxruntime.InferenceSession.
- Parameters:
proto – ONNX model or list of input names
inputs – any kind of inputs
use_numpy – if True, converts torch tensors into numpy arrays
copy – if True, a copy of the inputs is made; this should be the case if the inputs are ingested by
OrtValue
check_flatten – if True, checks that
torch.utils._pytree.tree_flatten
returns the same number of outputs
- Returns:
feeds dictionary
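
A minimal usage sketch, assuming an exported model file named model.onnx whose graph expects two float tensor inputs; the file name, shapes, and provider are illustrative, not part of the documented API.

```python
# Sketch: build feeds from torch inputs and run them through onnxruntime.
# "model.onnx" and the input shapes below are placeholder assumptions.
import onnx
import torch
import onnxruntime
from onnx_diagnostic.helpers.rt_helper import make_feeds

proto = onnx.load("model.onnx")
inputs = (torch.randn(2, 3), torch.randn(2, 3))

# use_numpy=True converts the torch tensors into numpy arrays,
# which InferenceSession.run accepts directly as feed values.
feeds = make_feeds(proto, inputs, use_numpy=True)

sess = onnxruntime.InferenceSession(
    proto.SerializeToString(), providers=["CPUExecutionProvider"]
)
outputs = sess.run(None, feeds)
```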