yobx.sklearn.manifold.locally_linear_embedding
- yobx.sklearn.manifold.locally_linear_embedding.sklearn_locally_linear_embedding(g: GraphBuilderExtendedProtocol, sts: Dict, outputs: List[str], estimator: LocallyLinearEmbedding, X: str, name: str = 'lle') → str
Converts a sklearn.manifold.LocallyLinearEmbedding into ONNX.

The out-of-sample embedding follows the algorithm in sklearn.manifold.LocallyLinearEmbedding.transform():

1. Find the k nearest training neighbours for each query point, using the same distance metric as during fitting (default: Euclidean).
2. Compute the barycentric reconstruction weights: the coefficients w that minimise the local reconstruction error

       minimize ||x - w @ X_neighbours||²   s.t.   sum(w) = 1

   The (regularised) closed-form solution is:

       v     = x - X_neighbours          (local displacement vectors, k x F)
       C     = v @ v.T                   (k x k Gram matrix)
       R     = reg * trace(C) if trace(C) > 0, else reg
       C_reg = C + R * I_k               (regularised Gram matrix)
       w_raw = C_reg⁻¹ @ ones_k          (solved by Conjugate Gradient)
       w     = w_raw / sum(w_raw)        (normalised to sum to 1)
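The closed-form weight computation above can be sketched in NumPy for a single query point. The function name `barycentric_weights` is illustrative, and a direct linear solve stands in for the Conjugate Gradient step:

```python
import numpy as np

def barycentric_weights(x, X_neighbours, reg=1e-3):
    """Barycentric reconstruction weights for one query point.

    Minimal sketch of the regularised closed-form solution; a dense
    solve replaces the Conjugate Gradient used in the ONNX graph.
    """
    v = x - X_neighbours                  # (k, F) local displacement vectors
    C = v @ v.T                           # (k, k) Gram matrix
    trace = np.trace(C)
    R = reg * trace if trace > 0 else reg
    C_reg = C + R * np.eye(len(C))        # regularised Gram matrix
    w_raw = np.linalg.solve(C_reg, np.ones(len(C)))
    return w_raw / w_raw.sum()            # normalised to sum to 1
```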
3. Apply the weights to the training embedding:

       result = w @ embedding_           → (N, n_components)
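The three steps combine into the following NumPy sketch, checked against scikit-learn's own transform(). The dataset, hyper-parameters, and the dense solve (in place of Conjugate Gradient) are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Fit on training data, then embed slightly perturbed copies of a few
# training points with the three steps described above.
X_train, _ = make_swiss_roll(n_samples=200, random_state=0)
X_new = X_train[:5] + 0.01

lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2, random_state=0)
lle.fit(X_train)

k, reg = lle.n_neighbors, lle.reg
out = np.empty((len(X_new), lle.n_components))
for i, x in enumerate(X_new):
    # step 1: k nearest training neighbours (Euclidean)
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    # step 2: regularised barycentric weights
    v = x - X_train[idx]
    C = v @ v.T
    t = np.trace(C)
    C += (reg * t if t > 0 else reg) * np.eye(k)
    w = np.linalg.solve(C, np.ones(k))
    w /= w.sum()
    # step 3: apply weights to the training embedding
    out[i] = w @ lle.embedding_[idx]

ref = lle.transform(X_new)   # sklearn's reference out-of-sample embedding
```

The two results agree up to floating-point noise, since transform() applies exactly this neighbour/weights/combine pipeline.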
The full ONNX graph is:
    X (N, F)
    │
    ├── pairwise Euclidean distances ──► dists (N, M)
    │                                      │
    │                   TopK(k, largest=0) ┘
    │                                      │
    │                                   nb_idx (N, k)
    │                                      │
    │   X_train (M, F) ──Gather────────────┘
    │                      │
    │                   nb_feats (N, k, F)
    │
    │ v     = X[:, None, :] - nb_feats       → (N, k, F)
    │ C     = v @ v.T                        → (N, k, k)
    │ C_reg = C + R * I_k                    → (N, k, k)
    │ w_raw = CG(C_reg, ones)                → (N, k)
    │ w     = w_raw / sum(w_raw)             → (N, k)
    │
    │ embedding_ (M, n_comp) ──Gather(nb_idx)──► emb_nb (N, k, n_comp)
    │
    └─ result = w[:, None, :] @ emb_nb       → (N, n_components)
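The distance/TopK/Gather front-end of the graph can be mirrored with NumPy operations that have direct ONNX counterparts (MatMul/ReduceSumSquare, TopK, Gather). The squared-norm expansion of the Euclidean distance shown here is one common way such graphs are built; it is a sketch, not necessarily the exact nodes this converter emits:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))         # queries  (N, F)
X_train = rng.normal(size=(10, 3))  # training (M, F)
k = 3

# pairwise squared distances via ||x - y||² = ||x||² - 2·x·y + ||y||²
sq = (X**2).sum(1)[:, None] - 2 * X @ X_train.T + (X_train**2).sum(1)[None, :]
dists = np.sqrt(np.maximum(sq, 0.0))         # dists (N, M)

nb_idx = np.argsort(dists, axis=1)[:, :k]    # TopK(k, largest=0) → nb_idx (N, k)
nb_feats = X_train[nb_idx]                   # Gather → nb_feats (N, k, F)
```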
- Parameters:
g – the graph builder to add nodes to
sts – shapes defined by scikit-learn
outputs – desired output names (embedded inputs)
estimator – a fitted LocallyLinearEmbedding
X – input tensor name
name – prefix name for the added nodes
- Returns:
output tensor name