yobx.sklearn.preprocessing.power_transformer#

yobx.sklearn.preprocessing.power_transformer.sklearn_power_transformer(g: GraphBuilderExtendedProtocol, sts: Dict, outputs: List[str], estimator: PowerTransformer, X: str, name: str = 'power_transformer') → str[source]#

Converts a fitted sklearn.preprocessing.PowerTransformer into ONNX nodes.

Both method='yeo-johnson' (default) and method='box-cox' are supported. When standardize=True (the default) the transformer also applies a StandardScaler to the output; this is inlined as Sub / Div nodes.
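The standardization step can be reproduced directly with scikit-learn and NumPy. The sketch below (illustrative only, not part of yobx) checks that the standardize=True output equals the raw power transform followed by a per-column subtract-mean / divide-by-std, which is exactly the Sub / Div pair described above:

```python
import numpy as np
from sklearn.preprocessing import PowerTransformer

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))

# Same lambdas are fitted either way; standardize only adds a scaler.
pt = PowerTransformer(method="yeo-johnson", standardize=True).fit(X)
raw = PowerTransformer(method="yeo-johnson", standardize=False).fit(X)

# Apply the plain power transform, then standardize each column by
# its (biased) mean and std -- the Sub / Div nodes the converter emits.
z = raw.transform(X)
manual = (z - z.mean(axis=0)) / z.std(axis=0)

assert np.allclose(manual, pt.transform(X), atol=1e-6)
```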

Yeo-Johnson — applied per column:

y >= 0, lam != 0 :  ((y + 1)^lam  - 1) / lam
y >= 0, lam == 0 :  log(y + 1)
y < 0,  lam != 2 :  -((-y + 1)^(2-lam) - 1) / (2-lam)
y < 0,  lam == 2 :  -log(-y + 1)
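The four Yeo-Johnson branches above can be written out as a NumPy reference implementation and checked against scikit-learn. This sketch (illustrative, not the converter's actual code) applies one fitted lambda to one column:

```python
import numpy as np
from sklearn.preprocessing import PowerTransformer

def yeo_johnson(y, lam):
    """Yeo-Johnson transform of a single column for one lambda."""
    out = np.empty_like(y, dtype=float)
    pos = y >= 0
    if abs(lam) > 1e-12:                      # lam != 0 branch
        out[pos] = ((y[pos] + 1.0) ** lam - 1.0) / lam
    else:                                     # lam == 0 branch
        out[pos] = np.log1p(y[pos])
    if abs(lam - 2.0) > 1e-12:                # lam != 2 branch
        out[~pos] = -((-y[~pos] + 1.0) ** (2.0 - lam) - 1.0) / (2.0 - lam)
    else:                                     # lam == 2 branch
        out[~pos] = -np.log1p(-y[~pos])
    return out

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
pt = PowerTransformer(method="yeo-johnson", standardize=False).fit(X)
got = yeo_johnson(X.ravel(), pt.lambdas_[0])
expected = pt.transform(X).ravel()
assert np.allclose(got, expected)
```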

Box-Cox — applied per column (input must be positive):

lam != 0 :  (x^lam - 1) / lam
lam == 0 :  log(x)
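The two Box-Cox branches admit the same kind of check. A minimal sketch (illustrative, not the converter's actual code), using strictly positive input as the formula requires:

```python
import numpy as np
from sklearn.preprocessing import PowerTransformer

def box_cox(x, lam):
    """Box-Cox transform of a single positive column for one lambda."""
    if abs(lam) > 1e-12:          # lam != 0 branch
        return (x ** lam - 1.0) / lam
    return np.log(x)              # lam == 0 branch

rng = np.random.default_rng(1)
X = rng.uniform(0.5, 3.0, size=(100, 1))  # Box-Cox needs positive input
pt = PowerTransformer(method="box-cox", standardize=False).fit(X)
got = box_cox(X.ravel(), pt.lambdas_[0])
expected = pt.transform(X).ravel()
assert np.allclose(got, expected)
```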

Parameters:
  • g – the graph builder to add nodes to

  • sts – shapes defined by scikit-learn

  • outputs – desired output tensor names

  • estimator – a fitted PowerTransformer

  • X – name of the input tensor

  • name – prefix used for names of nodes added by this converter

Returns:

name of the output tensor