onnx_diagnostic.torch_models.test_helper¶
- onnx_diagnostic.torch_models.test_helper.call_exporter(data: Dict[str, Any], exporter: str, quiet: bool = False, verbose: int = 0, optimization: str | None = None, do_run: bool = False) Tuple[Dict[str, int | float | str], Dict[str, Any]] [source]¶
Calls an exporter on a model. If a patch must be applied, it should be applied before calling this function.
- Parameters:
data – dictionary with all the necessary inputs
exporter – exporter to call
quiet – if True, catches exceptions instead of raising them
verbose – verbosity
optimization – optimization to apply
do_run – runs the model and computes discrepancies
- Returns:
two dictionaries, one with some metrics, another one with whatever the function produces
- onnx_diagnostic.torch_models.test_helper.call_torch_export_custom(data: Dict[str, Any], exporter: str, quiet: bool = False, verbose: int = 0, optimization: str | None = None) Tuple[Dict[str, Any], Dict[str, Any]] [source]¶
Exports a model into ONNX. If a patch must be applied, it should be applied before calling this function.
- Parameters:
data – dictionary with all the necessary inputs; it must contain the keys model and inputs_export
exporter – exporter to call
quiet – if True, catches exceptions instead of raising them
verbose – verbosity
optimization – optimization to apply
- Returns:
two dictionaries, one with some metrics, another one with whatever the function produces
- onnx_diagnostic.torch_models.test_helper.call_torch_export_export(data: Dict[str, Any], exporter: str, quiet: bool = False, verbose: int = 0, optimization: str | None = None, do_run: bool = False)[source]¶
Exports a model with torch.export.export(). If a patch must be applied, it should be applied before calling this function.
- Parameters:
data – dictionary with all the necessary inputs; it must contain the keys model and inputs_export
exporter – exporter to call
quiet – if True, catches exceptions instead of raising them
verbose – verbosity
optimization – optimization to apply
do_run – runs the model and computes discrepancies
- Returns:
two dictionaries, one with some metrics, another one with whatever the function produces
- onnx_diagnostic.torch_models.test_helper.call_torch_export_onnx(data: Dict[str, Any], exporter: str, quiet: bool = False, verbose: int = 0, optimization: str | None = None) Tuple[Dict[str, Any], Dict[str, Any]] [source]¶
Exports a model into ONNX. If a patch must be applied, it should be applied before calling this function.
- Parameters:
data – dictionary with all the necessary inputs; it must contain the keys model and inputs_export
exporter – exporter to call
quiet – if True, catches exceptions instead of raising them
verbose – verbosity
optimization – optimization to apply
- Returns:
two dictionaries, one with some metrics, another one with whatever the function produces
- onnx_diagnostic.torch_models.test_helper.empty(value: Any) bool [source]¶
Tells whether the value is empty.
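The docstring does not spell out what counts as empty; the following is a plausible pure-Python sketch of such a predicate (the name and the exact rules are assumptions, not the library's actual implementation):

```python
def empty_sketch(value):
    """Hypothetical illustration of ``empty``: a value is considered empty
    when it is None or a string/container with no elements."""
    if value is None:
        return True
    if isinstance(value, (str, bytes, list, tuple, dict, set)):
        return len(value) == 0
    return False


print(empty_sketch(None), empty_sketch(()), empty_sketch(0))  # True True False
```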
- onnx_diagnostic.torch_models.test_helper.filter_inputs(inputs: Any, drop_names: List[str], model: Module | List[str] | None = None, dynamic_shapes: Any | None = None)[source]¶
Drops some inputs from the given inputs. It updates the dynamic shapes as well.
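The behavior can be illustrated with a simplified sketch for keyword inputs only (the real function also handles positional inputs and a model argument; names and logic here are assumptions):

```python
def filter_inputs_sketch(kwargs, drop_names, dynamic_shapes=None):
    """Hypothetical simplified version of ``filter_inputs``: removes the
    named inputs and the corresponding entries in the dynamic shapes."""
    new_kwargs = {k: v for k, v in kwargs.items() if k not in drop_names}
    new_shapes = None
    if dynamic_shapes is not None:
        new_shapes = {k: v for k, v in dynamic_shapes.items() if k not in drop_names}
    return new_kwargs, new_shapes


inputs = {"input_ids": [[1, 2]], "position_ids": [[0, 1]]}
shapes = {"input_ids": {0: "batch"}, "position_ids": {0: "batch"}}
new_inputs, new_shapes = filter_inputs_sketch(inputs, ["position_ids"], shapes)
print(new_inputs)  # {'input_ids': [[1, 2]]}
print(new_shapes)  # {'input_ids': {0: 'batch'}}
```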
- onnx_diagnostic.torch_models.test_helper.get_inputs_for_task(task: str, config: Any | None = None) Dict[str, Any] [source]¶
Returns dummy inputs for a specific task.
- Parameters:
task – requested task
config – if available, dummy inputs are generated for this specific configuration
- Returns:
dummy inputs and dynamic shapes
- onnx_diagnostic.torch_models.test_helper.make_inputs(args: Tuple[Any, ...] | None, kwargs: Dict[str, Any] | None = None) Any [source]¶
Returns either args, kwargs or both depending on which ones are empty.
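The documented rule ("either args, kwargs or both depending on which ones are empty") can be sketched as follows; this is an illustration, not the library's actual code:

```python
def make_inputs_sketch(args, kwargs=None):
    """Hypothetical illustration of ``make_inputs``: returns args alone,
    kwargs alone, or the pair, depending on which ones are empty."""
    if not kwargs:
        return args
    if not args:
        return kwargs
    return args, kwargs


print(make_inputs_sketch((1, 2)))          # (1, 2)
print(make_inputs_sketch(None, {"a": 1}))  # {'a': 1}
```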
- onnx_diagnostic.torch_models.test_helper.run_ort_fusion(model_or_path: str | ModelProto, output_path: str, num_attention_heads: int, hidden_size: int, model_type: str = 'bert', verbose: int = 0) Tuple[Dict[str, Any], Dict[str, Any]] [source]¶
Runs the onnxruntime fusion optimizer.
- Parameters:
model_or_path – path to the ModelProto or the ModelProto itself
output_path – path where the optimized model is saved
num_attention_heads – number of attention heads, usually config.num_attention_heads
hidden_size – hidden size, usually config.hidden_size
model_type – type of optimization, see below
verbose – verbosity
- Returns:
two dictionaries, summary and data
Supported values for model_type:
<<<
import pprint
from onnxruntime.transformers.optimizer import MODEL_TYPES

pprint.pprint(sorted(MODEL_TYPES))
>>>
['bart', 'bert', 'bert_keras', 'bert_tf', 'clip', 'conformer', 'gpt2', 'gpt2_tf', 'gpt_neox', 'mmdit', 'phi', 'sam2', 'swin', 't5', 'tnlr', 'unet', 'vae', 'vit']
- onnx_diagnostic.torch_models.test_helper.split_args_kwargs(inputs: Any) Tuple[Tuple[Any, ...], Dict[str, Any]] [source]¶
Splits the inputs into args and kwargs.
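A plausible sketch of such a normalization (the actual dispatch rules of the library may differ; everything below is illustrative):

```python
def split_args_kwargs_sketch(inputs):
    """Hypothetical illustration of ``split_args_kwargs``: normalizes
    inputs into an (args, kwargs) pair."""
    if isinstance(inputs, dict):
        return (), inputs
    if (
        isinstance(inputs, tuple)
        and len(inputs) == 2
        and isinstance(inputs[0], tuple)
        and isinstance(inputs[1], dict)
    ):
        return inputs
    if isinstance(inputs, tuple):
        return inputs, {}
    return (inputs,), {}


print(split_args_kwargs_sketch({"a": 1}))   # ((), {'a': 1})
print(split_args_kwargs_sketch((1, 2, 3)))  # ((1, 2, 3), {})
```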
- onnx_diagnostic.torch_models.test_helper.validate_model(model_id: str, task: str | None = None, do_run: bool = False, exporter: str | None = None, do_same: bool = False, verbose: int = 0, dtype: str | dtype | None = None, device: str | device | None = None, trained: bool = False, optimization: str | None = None, quiet: bool = False, patch: bool = False, stop_if_static: int = 1, dump_folder: str | None = None, drop_inputs: List[str] | None = None, ortfusiontype: str | None = None, input_options: Dict[str, Any] | None = None) Tuple[Dict[str, int | float | str], Dict[str, Any]] [source]¶
Validates a model.
- Parameters:
model_id – model id to validate
task – task used to generate the necessary inputs, can be left empty to use the default task for this model if it can be determined
do_run – checks that the model works with the defined inputs
exporter – exports the model using this exporter; available values: export-strict, export-nostrict, onnx
do_same – checks the discrepancies of the exported model
verbose – verbosity level
dtype – uses this dtype to check the model
device – do the verification on this device
trained – use the trained model, not the untrained one
optimization – optimization to apply to the exported model, depends on the exporter
quiet – if True, catches exceptions instead of raising them
patch – applies patches (patch_transformers=True) before exporting, see onnx_diagnostic.torch_export_patches.bypass_export_some_errors()
stop_if_static – stops if a dynamic dimension becomes static, see
onnx_diagnostic.torch_export_patches.bypass_export_some_errors()
dump_folder – dumps everything in a subfolder of this one
drop_inputs – drops this list of inputs (given their names)
ortfusiontype – runs onnxruntime fusion; this parameter defines the fusion type and accepts multiple values separated by |, see onnx_diagnostic.torch_models.test_helper.run_ort_fusion()
input_options – additional options to define the dummy inputs used to export
- Returns:
two dictionaries, one with some metrics, another one with whatever the function produces
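As an illustration of the ortfusiontype parameter above, several fusion types can be combined with |; each value maps to a model_type accepted by run_ort_fusion (the concrete string below is just an example):

```python
# Hypothetical value combining two fusion types from the supported
# MODEL_TYPES list; validate_model presumably splits it on "|".
ortfusiontype = "bert|gpt2"
fusion_types = ortfusiontype.split("|")
print(fusion_types)  # ['bert', 'gpt2']
```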
- onnx_diagnostic.torch_models.test_helper.validate_onnx_model(data: Dict[str, Any], quiet: bool = False, verbose: int = 0, flavour: str | None = None) Tuple[Dict[str, Any], Dict[str, Any]] [source]¶
Verifies that an ONNX model produces the expected outputs. It uses data["onnx_filename"] as the input ONNX filename, or data["onnx_filename_{flavour}"] if flavour is specified.
- Parameters:
data – dictionary with all the necessary inputs; it must contain the keys model and inputs_export
quiet – if True, catches exceptions instead of raising them
verbose – verbosity
flavour – use a different version of the inputs
- Returns:
two dictionaries, one with some metrics, another one with whatever the function produces
- onnx_diagnostic.torch_models.test_helper.version_summary() Dict[str, int | float | str] [source]¶
Example:
<<<
import pprint
from onnx_diagnostic.torch_models.test_helper import version_summary

pprint.pprint(version_summary())
>>>
{'version_date': '2025-04-22T09:33:44',
 'version_numpy': '2.2.4',
 'version_onnx': '1.19.0',
 'version_onnx_diagnostic': '0.4.0',
 'version_onnxruntime': '1.22.0+cu126',
 'version_onnxscript': '0.3.0.dev20250301',
 'version_torch': '2.8.0.dev20250416+cu126',
 'version_transformers': '4.52.0.dev0'}