yobx.sklearn.neural_network.mlp
- yobx.sklearn.neural_network.mlp.sklearn_mlp_classifier(g: GraphBuilderExtendedProtocol, sts: Dict, outputs: List[str], estimator: MLPClassifier, X: str, name: str = 'mlp_classifier') → Tuple[str, str] [source]
Converts a sklearn.neural_network.MLPClassifier into ONNX.

All hidden layers use the activation stored in estimator.activation; the output layer uses estimator.out_activation_, which is 'logistic' for binary classification and 'softmax' for multi-class.

Hidden layers (repeated for each weight matrix except the last):

    h_prev ──MatMul(coef_i)──Add(bias_i)──Activation──► h_i

Binary classification (out_activation_ == 'logistic'):

    h ──MatMul(coef_out)──Add(bias_out)──Sigmoid──► proba_pos
                                                        │
                                               Sub(1, ·) ──► proba_neg
                                                        │
                                                  Concat ──► probabilities
                                                        │
                                ArgMax──Cast──Gather(classes)──► label

Multi-class classification (out_activation_ == 'softmax'):

    h ──MatMul(coef_out)──Add(bias_out)──Softmax──► probabilities
                                                        │
                                ArgMax──Cast──Gather(classes)──► label

- Parameters:
  - g – the graph builder to add nodes to
  - sts – shapes defined by scikit-learn
  - estimator – a fitted MLPClassifier
  - outputs – desired output names (label, probabilities)
  - X – input tensor name
  - name – prefix for the names of the added nodes
- Returns:
  tuple (label_result_name, proba_result_name)
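The decomposition above can be checked against scikit-learn directly. The sketch below replays the hidden-layer and logistic-head graph in plain NumPy (not ONNX) from a fitted estimator's coefs_, intercepts_, and classes_, and compares it with predict_proba; the toy dataset and network size are illustrative assumptions, not part of this API.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Fit a small binary MLP; the converter reads coefs_, intercepts_,
# activation, out_activation_ and classes_ from a fitted estimator like this.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(8,), activation="relu",
                    max_iter=2000, random_state=0).fit(X, y)

# Hidden layers: MatMul ── Add ── Relu for each weight matrix except the last.
h = X
for coef, bias in zip(clf.coefs_[:-1], clf.intercepts_[:-1]):
    h = np.maximum(h @ coef + bias, 0.0)

# Binary head: Sigmoid, then Sub(1, ·) and Concat to build both columns,
# then ArgMax ── Gather(classes) for the label.
z = h @ clf.coefs_[-1] + clf.intercepts_[-1]
proba_pos = 1.0 / (1.0 + np.exp(-z))
probabilities = np.concatenate([1.0 - proba_pos, proba_pos], axis=1)
label = clf.classes_[np.argmax(probabilities, axis=1)]

assert np.allclose(probabilities, clf.predict_proba(X))
assert np.array_equal(label, clf.predict(X))
```

The Concat order (negative column first, positive second) mirrors predict_proba, whose columns follow classes_.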
- yobx.sklearn.neural_network.mlp.sklearn_mlp_regressor(g: GraphBuilderExtendedProtocol, sts: Dict, outputs: List[str], estimator: MLPRegressor, X: str, name: str = 'mlp_regressor') → str [source]
Converts a sklearn.neural_network.MLPRegressor into ONNX.

All hidden layers use the activation stored in estimator.activation ('identity', 'logistic', 'tanh', or 'relu'); the output layer always uses the 'identity' activation (i.e., the linear output is returned as-is).

    X ──hidden layers──► h ──MatMul(coef_out)──Add(bias_out)──► predictions
- Parameters:
  - g – the graph builder to add nodes to
  - sts – shapes defined by scikit-learn
  - estimator – a fitted MLPRegressor
  - outputs – desired output names (predictions)
  - X – input tensor name
  - name – prefix for the names of the added nodes
- Returns:
output tensor name
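The regressor graph is the same forward pass with an identity output layer. A minimal NumPy sketch (again, not ONNX; the dataset and network size are illustrative assumptions) replays it from a fitted estimator and compares against predict:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.neural_network import MLPRegressor

# Fit a small regressor; the converter reads coefs_ and intercepts_
# from a fitted estimator like this.
X, y = make_regression(n_samples=200, n_features=4, random_state=0)
reg = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                   max_iter=3000, random_state=0).fit(X, y)

# Hidden layers: MatMul ── Add ── Tanh; the output layer is
# MatMul ── Add with no activation (identity).
h = X
for coef, bias in zip(reg.coefs_[:-1], reg.intercepts_[:-1]):
    h = np.tanh(h @ coef + bias)
predictions = (h @ reg.coefs_[-1] + reg.intercepts_[-1]).ravel()

assert np.allclose(predictions, reg.predict(X))
```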