yobx.sklearn.decomposition.kernel_pca
- yobx.sklearn.decomposition.kernel_pca.sklearn_kernel_pca(g: GraphBuilderExtendedProtocol, sts: Dict, outputs: List[str], estimator: KernelPCA, X: str, name: str = 'kernel_pca') → str
Converts a sklearn.decomposition.KernelPCA into ONNX. The out-of-sample transform replicates sklearn.decomposition.KernelPCA.transform():

Compute the pairwise kernel matrix between X (shape (N, F)) and the training data X_fit_ (shape (M, F)):

- linear — K = X @ X_fit_.T
- rbf — K[i,j] = exp(−γ · ||X[i] − X_fit_[j]||²)
- laplacian — K[i,j] = exp(−γ · ||X[i] − X_fit_[j]||₁) (not supported; raises NotImplementedError)
- poly — K = (γ · X @ X_fit_.T + coef0) ^ degree
- sigmoid — K = tanh(γ · X @ X_fit_.T + coef0)
- cosine — K[i,j] = (X[i]/‖X[i]‖) · (X_fit_[j]/‖X_fit_[j]‖)
- precomputed / callable — not supported; raises NotImplementedError
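The supported kernel formulas above can be sketched in NumPy (a minimal illustration, not the library's implementation; the γ, coef0 and degree values are placeholders):

```python
import numpy as np

def pairwise_kernel(X, X_fit, kernel="rbf", gamma=0.5, coef0=1.0, degree=3):
    """Pairwise kernel matrix K of shape (N, M), mirroring the formulas above."""
    if kernel == "linear":
        return X @ X_fit.T
    if kernel == "rbf":
        # ||X[i] - X_fit[j]||^2 expanded as x.x + y.y - 2 x.y
        sq_dists = (
            (X ** 2).sum(axis=1, keepdims=True)
            + (X_fit ** 2).sum(axis=1)
            - 2.0 * X @ X_fit.T
        )
        return np.exp(-gamma * sq_dists)
    if kernel == "poly":
        return (gamma * X @ X_fit.T + coef0) ** degree
    if kernel == "sigmoid":
        return np.tanh(gamma * X @ X_fit.T + coef0)
    if kernel == "cosine":
        Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
        Fn = X_fit / np.linalg.norm(X_fit, axis=1, keepdims=True)
        return Xn @ Fn.T
    raise NotImplementedError(f"kernel={kernel!r} is not supported")

rng = np.random.default_rng(0)
X_fit = rng.normal(size=(5, 3))   # M = 5 training rows
X = rng.normal(size=(4, 3))       # N = 4 query rows
K = pairwise_kernel(X, X_fit, kernel="rbf")
print(K.shape)  # (4, 5)
```

As a sanity check, the rbf kernel of the training data with itself is symmetric with a unit diagonal, since ||x − x|| = 0.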
Centre the kernel using the statistics stored in _centerer:

```
K_pred_cols = K.sum(axis=1, keepdims=True) / M
K_centered  = K − K_fit_rows_ − K_pred_cols + K_fit_all_
```
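A NumPy sketch of the centering step, assuming the _centerer statistics (column means K_fit_rows_ and grand mean K_fit_all_) were computed from the training kernel at fit time, as in scikit-learn's KernelCenterer (the rbf helper here is only for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 6, 4
X_fit = rng.normal(size=(M, 3))
X = rng.normal(size=(N, 3))

def rbf(A, B, gamma=0.5):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

# Statistics stored at fit time: column means of the training kernel
# and the grand mean over all its entries.
K_fit = rbf(X_fit, X_fit)
K_fit_rows_ = K_fit.sum(axis=0) / M   # shape (M,)
K_fit_all_ = K_fit_rows_.sum() / M    # scalar

# Out-of-sample centering, exactly the formula above.
K = rbf(X, X_fit)                     # (N, M)
K_pred_cols = K.sum(axis=1, keepdims=True) / M
K_centered = K - K_fit_rows_ - K_pred_cols + K_fit_all_
```

Applying the same formula to K_fit itself yields a kernel whose row and column means are zero, which is what double centering guarantees.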
Compute scaled eigenvectors (zero-eigenvalue columns set to 0):

```
scaled_alphas[:, non_zeros] = eigenvectors_[:, non_zeros] / sqrt(eigenvalues_[non_zeros])
```

Project:

```
result = K_centered @ scaled_alphas   # (N, n_components)
```
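The scaling and projection steps can be sketched as follows (illustrative only: here eigenvalues_ and eigenvectors_ are obtained by eigendecomposing a synthetic centered training kernel, standing in for the attributes of a fitted KernelPCA):

```python
import numpy as np

rng = np.random.default_rng(1)
M = 8
K_fit = rng.normal(size=(M, M))
K_fit = K_fit @ K_fit.T                      # symmetric PSD "training kernel"

# Double-center the training kernel.
H = np.eye(M) - np.ones((M, M)) / M
K_fit_c = H @ K_fit @ H

# Top n_components eigenpairs, as a fitted KernelPCA would store them.
eigenvalues_, eigenvectors_ = np.linalg.eigh(K_fit_c)
order = np.argsort(eigenvalues_)[::-1][:3]   # n_components = 3
eigenvalues_ = eigenvalues_[order]
eigenvectors_ = eigenvectors_[:, order]

# Scale eigenvectors; zero-eigenvalue columns stay 0.
non_zeros = eigenvalues_ > 0
scaled_alphas = np.zeros_like(eigenvectors_)
scaled_alphas[:, non_zeros] = (
    eigenvectors_[:, non_zeros] / np.sqrt(eigenvalues_[non_zeros])
)

# Projecting the centered training kernel itself.
result = K_fit_c @ scaled_alphas             # (M, n_components)
```

With this scaling, each column of result has squared norm equal to the corresponding eigenvalue (K_fit_c v = λv, so the projection of column j is √λⱼ · vⱼ), which is a quick sanity check.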
The full ONNX graph (rbf example) is:

```
X (N, F)
│
└── ||X[i] − X_fit_[j]||² ──► sq_dists (N, M)
      │
      sq_dists × (−γ) ──► neg_scaled ──Exp──► K (N, M)
      │
      K − K_fit_rows_ − K.sum(axis=1)/M + K_fit_all_ ──► K_c (N, M)
      │
      MatMul(scaled_alphas) ──► output (N, n_components)
```

- Parameters:
g – the graph builder to add nodes to
sts – shapes defined by scikit-learn
estimator – a fitted KernelPCA
outputs – desired output names
X – input tensor name
name – prefix name for the added nodes
- Returns:
output tensor name
- Raises:
NotImplementedError – for kernel='precomputed', callable kernels, or kernel='laplacian'