.xoptim.patterns_ort
- experimental_experiment.xoptim.patterns_ort.get_onnxruntime_patterns(verbose: int = 0) → List[PatternOptimization]
Returns the default list of optimization patterns for onnxruntime. It is equal to the list below, produced by the following snippet.
<<<
from experimental_experiment.xoptim.patterns_api import pattern_table_doc
from experimental_experiment.xoptim.patterns_ort import get_onnxruntime_patterns

print(pattern_table_doc(get_onnxruntime_patterns(), as_rst=True))
>>>
    name                                  short_name                     priority  doc
0   BiasGeluPattern                       BiasGelu                       1         Replaces by y = BiasGelu(x, B)
1   BiasSoftmaxPattern                    BiasSoftmax                    1         Replaces Softmax(Add(x, y), axis=-1) by BiasSoftmax(x, y, axis=-1)
2   GeluOrtPattern                        GeluOrt                        0         Detects the decomposed version of Gelu with Tanh.
3   GeluErfPattern                        GeluErf                        0         Detects the decomposed version of Gelu with Erf.
4   FusedConvPattern                      FusedConv                      2         Replaces Conv + Relu by FusedConv.
5   FastGeluPattern                       FastGelu                       1         Replaces Gelu by FastGelu.
6   FusedMatMulPattern                    FusedMatMul                    2         Replaces the sequence Transpose, MatMul by FusedMatMul.
7   FusedMatMulx2Pattern                  FusedMatMulx2                  3         Replaces a Div by a scalar consumed by two FusedMatMul nodes.
8   FusedMatMulDivPattern                 FusedMatMulDiv                 2         Replaces the sequence MatMul, Div by FusedMatMul.
9   FusedMatMulTransposePattern           FusedMatMulTranspose           3         Replaces (Fused)MatMul(A, B) + Transpose by FusedMatMul(B.T, A.T).
10  OrtBatchNormalizationTrainingPattern  OrtBatchNormalizationTraining  1         onnxruntime does not support batch normalization with training=1.
11  QuickGeluPattern                      QuickGelu                      1         Replaces Mul(x, Sigmoid(x)) by QuickGelu(x, alpha=1).
12  SimplifiedLayerNormalizationPattern   SimplifiedLayerNormalization   1         Fuses the decomposed simplified layer normalization into SimplifiedLayerNormalization.
13  SoftmaxGradPattern                    SoftmaxGrad                    1         Replaces the sequence Mul, ReduceSum, Mul, Sub by SoftmaxGrad.
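As a quick way to explore this list programmatically, the sketch below groups the patterns returned by get_onnxruntime_patterns by priority and prints the class names shown in the table above. It is a minimal sketch only, assuming each returned PatternOptimization exposes a priority attribute, as suggested by the priority column.

<<<
from experimental_experiment.xoptim.patterns_ort import get_onnxruntime_patterns

# Retrieve the default onnxruntime-specific patterns.
patterns = get_onnxruntime_patterns(verbose=0)

# Group the pattern class names by priority
# (assumes a ``priority`` attribute, as suggested by the table above).
by_priority = {}
for pattern in patterns:
    by_priority.setdefault(pattern.priority, []).append(type(pattern).__name__)

for priority in sorted(by_priority):
    print(priority, ", ".join(by_priority[priority]))
>>>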