yobx.sklearn.isotonic.isotonic_regression
- yobx.sklearn.isotonic.isotonic_regression.sklearn_isotonic_regression(g: GraphBuilderExtendedProtocol, sts: Dict, outputs: List[str], estimator: IsotonicRegression, X: str, name: str = 'isotonic_regression') → str
Converts a `sklearn.isotonic.IsotonicRegression` into ONNX.

The prediction follows `sklearn.isotonic.IsotonicRegression.predict()`, which applies piecewise-linear interpolation via `numpy.interp()`:

```
predict(X) = np.interp(X.ravel(), X_thresholds_, y_thresholds_)
```

Values below `X_min_` are clamped to `y_thresholds_[0]` and values above `X_max_` are clamped to `y_thresholds_[-1]`.

ONNX graph structure (K breakpoints, K ≥ 2):
```
X (N, 1) or (N,)
  │
  Reshape(-1)          ──► x_flat (N,)
  │
  Clip(X_min_, X_max_) ──► x_clipped (N,)
  │
  Unsqueeze(-1)        ──► x_exp (N, 1)
  │
  xp (K,) ── GreaterOrEqual ──► cmp (N, K) [bool]
  │
  Cast(INT64)          ──► cmp_int (N, K)
  │
  ReduceSum(axis=1)    ──► seg_count (N,)
  │
  Sub(1)               ──► seg_lo_raw (N,)
  │
  Clip(0, K-2)         ──► seg_lo (N,) int64
  │
  Add(1)               ──► seg_hi (N,) int64
  │
  Gather(xp, seg_lo) ──► xp_lo (N,)    Gather(xp, seg_hi) ──► xp_hi (N,)
  Gather(fp, seg_lo) ──► fp_lo (N,)    Gather(fp, seg_hi) ──► fp_hi (N,)
  │
  dx  = xp_hi - xp_lo
  t   = (x_clipped - xp_lo) / dx
  out = fp_lo + t * (fp_hi - fp_lo)    ──► predictions (N,)
```
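The graph above can be mirrored step by step with plain numpy operations. This is a sketch, not library code: the helper name `interp_like_onnx` is illustrative, and it assumes strictly increasing thresholds with K ≥ 2 breakpoints, matching the case the diagram covers.

```python
import numpy as np

def interp_like_onnx(x, xp, fp):
    """Numpy mirror of the ONNX graph above (K >= 2, xp strictly increasing)."""
    K = xp.shape[0]
    x_clipped = np.clip(x.ravel(), xp[0], xp[-1])    # Reshape + Clip(X_min_, X_max_)
    cmp = x_clipped[:, None] >= xp[None, :]          # Unsqueeze + GreaterOrEqual
    seg_count = cmp.astype(np.int64).sum(axis=1)     # Cast(INT64) + ReduceSum(axis=1)
    seg_lo = np.clip(seg_count - 1, 0, K - 2)        # Sub(1) + Clip(0, K-2)
    seg_hi = seg_lo + 1                              # Add(1)
    xp_lo, xp_hi = xp[seg_lo], xp[seg_hi]            # Gather on thresholds
    fp_lo, fp_hi = fp[seg_lo], fp[seg_hi]
    t = (x_clipped - xp_lo) / (xp_hi - xp_lo)        # per-segment linear blend
    return fp_lo + t * (fp_hi - fp_lo)

# Agrees with np.interp on the clipped input.
xp = np.array([0.0, 1.0, 3.0, 6.0])
fp = np.array([0.0, 2.0, 2.0, 5.0])
x = np.array([-1.0, 0.5, 3.0, 7.0])
out = interp_like_onnx(x, xp, fp)
```

Clipping `seg_lo` to `K - 2` is what makes a query equal to the last breakpoint land in the final segment with `t = 1`, instead of indexing past the end of the threshold arrays.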
When all training samples collapse to a single breakpoint (K = 1), the graph simply broadcasts the constant `y_thresholds_[0]` to all rows.

- Parameters:
  - `g` – the graph builder to add nodes to
  - `sts` – shapes defined by scikit-learn
  - `outputs` – desired output names
  - `estimator` – a fitted `IsotonicRegression`
  - `X` – input tensor name (shape `(N,)` or `(N, 1)`)
  - `name` – prefix for added node names
- Returns:
  - output tensor name (shape `(N,)`)
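The semantics described above can be checked numerically against scikit-learn itself. This is a sketch assuming scikit-learn and numpy are installed; `out_of_bounds="clip"` is used so that `predict()` clamps out-of-range queries the same way the converted graph does.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# K >= 2: predict() is np.interp over the fitted thresholds, with clamping
# at X_min_ / X_max_.
ir = IsotonicRegression(out_of_bounds="clip").fit(
    np.array([1.0, 2.0, 3.0, 4.0]), np.array([1.0, 3.0, 2.0, 4.0])
)
q = np.array([0.0, 2.5, 9.0])  # below X_min_, interior, above X_max_
pred = ir.predict(q)
ref = np.interp(q, ir.X_thresholds_, ir.y_thresholds_)

# K = 1: a single breakpoint broadcasts y_thresholds_[0] to every row,
# which is exactly np.interp with one sample point.
const = np.interp(q, [2.0], [7.5])
```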