onnx_diagnostic.export.api
- onnx_diagnostic.export.api.to_onnx(mod: Module | GraphModule, args: Sequence[Tensor] | None = None, kwargs: Dict[str, Tensor] | None = None, input_names: Sequence[str] | None = None, target_opset: int | Dict[str, int] | None = None, verbose: int = 0, dynamic_shapes: Dict[str, Any] | Tuple[Any] | None = None, filename: str | None = None, output_names: List[str] | None = None, output_dynamic_shapes: Dict[str, Any] | Tuple[Any] | None = None, exporter: str = 'onnx-dynamo', exporter_kwargs: Dict[str, Any] | None = None, save_ep: str | None = None, optimize: bool = True, use_control_flow_dispatcher: bool = False) → Any
Common API for exporters. By default, the exported model is optimized to use the most efficient kernels implemented in onnxruntime.
- Parameters:
mod – torch model
args – unnamed arguments
kwargs – named arguments
input_names – input names for the onnx model (optional)
target_opset – opset to target, if not specified, each converter keeps its default value
verbose – verbosity level
dynamic_shapes – dynamic shapes, usually a nested structure including a dictionary for each tensor
filename – output filename
output_names – to rename the outputs of the onnx model
output_dynamic_shapes – to override the dynamic shape names on the outputs
exporter – exporter to use (onnx-dynamo, modelbuilder, custom)
exporter_kwargs – additional parameters sent to the exporter
save_ep – if specified, location used to save the exported program
optimize – optimizes the model
use_control_flow_dispatcher – use the dispatcher created to support custom loops (see onnx_diagnostic.export.control_flow.loop_for())
- Returns:
the output of the selected exporter, usually a structure including an onnx model
A simple example:
```python
to_onnx(
    model,
    kwargs=inputs,
    dynamic_shapes=ds,
    exporter=exporter,
    filename=filename,
)
```