yobx.tensorflow.ops.activations

Converters for TF activation ops: Relu, Relu6, Sigmoid, Tanh, Softmax, Elu, Selu, LeakyRelu.

yobx.tensorflow.ops.activations.convert_relu(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) → str

TF Relu → ONNX Relu.

yobx.tensorflow.ops.activations.convert_relu6(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation, verbose: int = 0) → str

TF Relu6 → ONNX Clip(min=0, max=6).
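
ONNX has no dedicated Relu6 operator, so the converter lowers it to Clip with constant bounds. A minimal NumPy sketch (illustrating the semantics, not the converter's actual code) showing the two are equivalent:

```python
import numpy as np

def relu6(x):
    # TF Relu6 semantics: min(max(x, 0), 6)
    return np.minimum(np.maximum(x, 0.0), 6.0)

def clip_0_6(x):
    # ONNX Clip(min=0, max=6) semantics
    return np.clip(x, 0.0, 6.0)

x = np.array([-3.0, 0.0, 2.5, 6.0, 10.0])
assert np.array_equal(relu6(x), clip_0_6(x))
```

Note that in recent ONNX opsets Clip takes its min/max bounds as optional inputs rather than attributes, so the emitted graph carries them as constant tensors.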

yobx.tensorflow.ops.activations.convert_sigmoid(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) → str

TF Sigmoid → ONNX Sigmoid.

yobx.tensorflow.ops.activations.convert_softmax(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation, verbose: int = 0) → str

TF Softmax → ONNX Softmax(axis=-1).
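
The axis is pinned to -1 because TF's Softmax normalizes over the last dimension. A NumPy sketch of the intended ONNX Softmax(axis=-1) semantics (illustrative only, not the converter's code):

```python
import numpy as np

def softmax_last_axis(x):
    # Softmax over the last dimension, with max-subtraction
    # for numerical stability
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [0.0, 0.0, 0.0]])
p = softmax_last_axis(x)
# each row sums to 1
assert np.allclose(p.sum(axis=-1), 1.0)
```

Setting the attribute explicitly avoids relying on the ONNX default axis, which has changed across opset versions.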

yobx.tensorflow.ops.activations.convert_tanh(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) → str

TF Tanh → ONNX Tanh.