yobx.tensorflow.ops.nn_ops#
Converters for tf.nn activation ops.
Activation functions#
Elu, Selu, LeakyRelu, LogSoftmax
- yobx.tensorflow.ops.nn_ops.convert_elu(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) str[source]#
TF Elu (tf.nn.elu) → ONNX Elu.
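The mapping is direct because TF and ONNX define ELU identically. A minimal pure-Python sketch of that shared definition (not part of yobx, for illustration only):

```python
import math

def elu(x: float, alpha: float = 1.0) -> float:
    # ELU as defined by both tf.nn.elu and ONNX Elu (default alpha=1.0):
    # identity for positive inputs, alpha * (exp(x) - 1) for negative ones.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print(elu(2.0))   # positive inputs pass through unchanged → 2.0
print(elu(-1.0))  # negative inputs saturate toward -alpha
```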
- yobx.tensorflow.ops.nn_ops.convert_leaky_relu(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) str[source]#
TF LeakyRelu (tf.nn.leaky_relu) → ONNX LeakyRelu. The alpha (negative-slope) value is read from the TF op attribute and forwarded to the ONNX node unchanged.
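Forwarding alpha explicitly matters because the two frameworks share the formula but not the default value (tf.nn.leaky_relu defaults to alpha=0.2, ONNX LeakyRelu to alpha=0.01). A small reference sketch, independent of yobx:

```python
def leaky_relu(x: float, alpha: float = 0.2) -> float:
    # TF and ONNX agree on the definition: x if x >= 0 else alpha * x.
    # Only the default alpha differs (TF: 0.2, ONNX: 0.01), so the converter
    # must always write the TF attribute onto the ONNX node.
    return x if x >= 0 else alpha * x

print(leaky_relu(3.0))               # → 3.0
print(leaky_relu(-10.0, alpha=0.2))  # TF default slope → -2.0
```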
- yobx.tensorflow.ops.nn_ops.convert_log_softmax(g: GraphBuilderExtendedProtocol, sts: Dict[str, Any], outputs: List[str], op: Operation) str[source]#
TF LogSoftmax (tf.nn.log_softmax) → ONNX LogSoftmax(axis=-1). TensorFlow applies log-softmax along the last axis; the ONNX LogSoftmax operator defaults to axis=1 in older opsets, so the axis is set explicitly to -1 to match TF semantics.
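The axis distinction is invisible for rank-2 inputs (axis=1 and axis=-1 name the same axis) but changes results for higher ranks. A NumPy sketch of why pinning axis=-1 is required (illustrative only, not yobx code):

```python
import numpy as np

def log_softmax(x: np.ndarray, axis: int) -> np.ndarray:
    # Numerically stable log-softmax: subtract the max before exponentiating.
    shifted = x - x.max(axis=axis, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=axis, keepdims=True))

# Rank 2: axis=1 and axis=-1 are the same axis, so results coincide.
x2 = np.arange(6, dtype=float).reshape(2, 3)
print(np.allclose(log_softmax(x2, axis=1), log_softmax(x2, axis=-1)))  # True

# Rank 3: axis=1 and axis=-1 are different axes, so the older-opset ONNX
# default (axis=1) would diverge from TF's last-axis behaviour.
x3 = np.arange(24, dtype=float).reshape(2, 3, 4)
print(np.allclose(log_softmax(x3, axis=1), log_softmax(x3, axis=-1)))  # False
```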