yobx.sklearn.preprocessing.power_transformer
- yobx.sklearn.preprocessing.power_transformer.sklearn_power_transformer(g: GraphBuilderExtendedProtocol, sts: Dict, outputs: List[str], estimator: PowerTransformer, X: str, name: str = 'power_transformer') → str
Converts a sklearn.preprocessing.PowerTransformer into ONNX.
Both method='yeo-johnson' (the default) and method='box-cox' are supported. When standardize=True (the default) the transformer also applies a StandardScaler to the output; this is inlined as Sub/Div nodes.
Yeo-Johnson — applied per column:
y >= 0, lam != 0 : ((y + 1)^lam - 1) / lam
y >= 0, lam == 0 : log(y + 1)
y < 0,  lam != 2 : -((-y + 1)^(2 - lam) - 1) / (2 - lam)
y < 0,  lam == 2 : -log(-y + 1)
Box-Cox — applied per column (input must be positive):
lam != 0 : (x^lam - 1) / lam
lam == 0 : log(x)
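The branch tables above can be sketched in NumPy. This is an illustrative reference implementation only, not yobx code (the helper names yeo_johnson and box_cox are invented here); the actual converter emits equivalent ONNX nodes (comparisons, Pow, Log, Sub, Div) instead of Python branches:

```python
import numpy as np

def yeo_johnson(y, lam):
    """Yeo-Johnson transform of one column with fitted lambda `lam`,
    following the four branches listed above."""
    y = np.asarray(y, dtype=float)
    out = np.empty_like(y)
    pos = y >= 0
    if abs(lam) > 1e-12:                  # lam != 0 branch
        out[pos] = ((y[pos] + 1.0) ** lam - 1.0) / lam
    else:                                 # lam == 0 branch
        out[pos] = np.log1p(y[pos])
    if abs(lam - 2.0) > 1e-12:            # lam != 2 branch
        out[~pos] = -(((-y[~pos] + 1.0) ** (2.0 - lam)) - 1.0) / (2.0 - lam)
    else:                                 # lam == 2 branch
        out[~pos] = -np.log1p(-y[~pos])
    return out

def box_cox(x, lam):
    """Box-Cox transform of one positive-valued column."""
    x = np.asarray(x, dtype=float)
    return (x ** lam - 1.0) / lam if abs(lam) > 1e-12 else np.log(x)

# With standardize=True the fitted StandardScaler is inlined as
# Sub/Div nodes, i.e. (t - mean) / std applied per column.
t = yeo_johnson([3.0, -3.0, 0.0], 0.5)
standardized = (t - t.mean()) / t.std()
```

Each column of the input uses its own fitted lambda, so the converter replicates this logic once per feature.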
- Parameters:
g – the graph builder to add nodes to
sts – shapes defined by scikit-learn
outputs – desired output tensor names
estimator – a fitted PowerTransformer
X – name of the input tensor
name – prefix used for names of nodes added by this converter
- Returns:
name of the output tensor