yobx.tensorflow.ops.nn_ops#

Converters for tf.nn activation ops.

Activation functions#

Elu, Selu, LeakyRelu, LogSoftmax

yobx.tensorflow.ops.nn_ops.convert_elu(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) → str[source]#

TF Elu (tf.nn.elu) → ONNX Elu.

yobx.tensorflow.ops.nn_ops.convert_leaky_relu(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) → str[source]#

TF LeakyRelu (tf.nn.leaky_relu) → ONNX LeakyRelu.

The alpha (negative-slope) value is read from the TF op's attributes and forwarded to the ONNX node unchanged, since both frameworks define LeakyRelu identically: y = x for x >= 0 and y = alpha * x otherwise.
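A minimal NumPy sketch of the shared semantics (not the converter itself; the function name is illustrative), showing why the attribute can be forwarded without translation:

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.2) -> np.ndarray:
    """Reference LeakyRelu: y = x if x >= 0 else alpha * x.

    Both tf.nn.leaky_relu and ONNX LeakyRelu compute exactly this,
    so the alpha attribute maps one-to-one.
    """
    return np.where(x >= 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 3.0])
y = leaky_relu(x, alpha=0.2)
# negative inputs are scaled by alpha; non-negative inputs pass through
```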

yobx.tensorflow.ops.nn_ops.convert_log_softmax(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) → str[source]#

TF LogSoftmax (tf.nn.log_softmax) → ONNX LogSoftmax(axis=-1).

TensorFlow applies log-softmax along the last axis; the ONNX LogSoftmax operator defaults to axis=1 in older opsets, so the axis is set explicitly to -1 to match TF semantics.
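A small NumPy demonstration of why pinning the axis matters (a reference implementation for illustration, not the converter's code): for rank-2 inputs axis=1 and axis=-1 coincide, but for rank >= 3 they diverge.

```python
import numpy as np

def log_softmax(x: np.ndarray, axis: int) -> np.ndarray:
    # numerically stable log-softmax along the given axis
    shifted = x - x.max(axis=axis, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=axis, keepdims=True))

# rank 2: axis=1 and axis=-1 are the same axis
x2 = np.arange(6.0).reshape(2, 3)
assert np.allclose(log_softmax(x2, axis=1), log_softmax(x2, axis=-1))

# rank 3: the older-opset ONNX default (axis=1) no longer matches
# TF's last-axis behaviour, hence the explicit axis=-1 in the converter
x3 = np.arange(24.0).reshape(2, 3, 4)
assert not np.allclose(log_softmax(x3, axis=1), log_softmax(x3, axis=-1))
```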

yobx.tensorflow.ops.nn_ops.convert_selu(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) → str[source]#

TF Selu (tf.nn.selu) → ONNX Selu.

Both TensorFlow and ONNX use the same fixed coefficients: alpha = 1.6732632423543772 and gamma = 1.0507009873554805.
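Because the coefficients are fixed and identical on both sides, no attributes need to be copied. A NumPy reference of the shared formula (a sketch for illustration; the constant names are assumptions, not the converter's identifiers):

```python
import numpy as np

# Fixed Selu coefficients shared by tf.nn.selu and ONNX Selu
ALPHA = 1.6732632423543772
GAMMA = 1.0507009873554805

def selu(x: np.ndarray) -> np.ndarray:
    """Reference Selu: gamma * (x if x > 0 else alpha * (exp(x) - 1))."""
    return GAMMA * np.where(x > 0, x, ALPHA * np.expm1(x))

# selu(0) == 0, and the positive branch is a pure scale by gamma
```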