onnx_diagnostic.helpers.optim_helper¶
- onnx_diagnostic.helpers.optim_helper.optimize_model(algorithm: str, model: ModelProto | str, output: str | None = None, processor: str | None = None, infer_shapes: bool = True, remove_shape_info: bool = False, verbose: int = 1)[source]¶
Optimizes an onnx model by fusing nodes. It looks for patterns in the graph and replaces them with the corresponding nodes. It also performs basic optimizations such as removing identity nodes or unused nodes.
- Parameters:
algorithm – optimization algorithm to use
model – model to optimize as a proto or a filename
output – if not empty, the optimized model is saved
processor – optimizations are tuned for this processor
infer_shapes – infer shapes before optimizing, this might not be available for all algorithms
remove_shape_info – remove shape information before saving the model
verbose – verbosity level
- Returns:
optimized model
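A minimal usage sketch based on the signature above; the input and output filenames are placeholders, and "default" is one of the algorithms listed below:

    from onnx_diagnostic.helpers.optim_helper import optimize_model

    # Optimize a model stored on disk and save the result.
    optimized = optimize_model(
        algorithm="default",            # optimization algorithm to use
        model="model.onnx",             # a ModelProto or a filename (placeholder)
        output="model.optimized.onnx",  # if not empty, the optimized model is saved here
        infer_shapes=True,              # run shape inference before optimizing
        verbose=1,                      # verbosity level
    )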
The goal is to make the model faster. The patterns argument defines the patterns to apply or the set of patterns. It is possible to show statistics or to disable a particular pattern. The following environment variables can be used to trigger these displays.
Available algorithms are default and default+runtime. Supported environment variables:
- DROPPATTERN=<pattern1,pattern2,...>: do not apply these patterns when optimizing a model
- DUMPPATTERNS=<folder>: dumps all matched and applied nodes when a pattern is applied
- PATTERN=<pattern1,pattern2,...>: increases verbosity for the listed patterns to understand why a pattern was not applied; this shows which condition rejected a pattern when one seems to have been missed
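A hedged sketch of setting these variables from Python before running the optimization; the pattern names and the dump folder below are placeholders, and the actual pattern names depend on the installed version:

    import os

    # Placeholder pattern names; real names depend on the installed library version.
    os.environ["DROPPATTERN"] = "Pattern1,Pattern2"  # do not apply these patterns
    os.environ["DUMPPATTERNS"] = "dump_patterns"     # dump matched and applied nodes here
    os.environ["PATTERN"] = "Pattern3"               # verbose output for this pattern

    from onnx_diagnostic.helpers.optim_helper import optimize_model

    optimize_model("default+runtime", "model.onnx", output="model.opt.onnx", verbose=1)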