Fluent ONNX Builder (Light API)#

yobx.builder.light provides a chainable, expression-oriented API for building ONNX graphs without writing protobuf boilerplate. It is designed for small models, test fixtures, and quick experimentation.

Core classes#

The module exposes three core abstractions:

  • OnnxGraph — accumulates nodes, inputs, outputs, and initializers. Created via start (ModelProto output) or g (GraphProto output for subgraphs).

  • Var — represents a single tensor value (graph input, initializer, or node output). Supports Python operator overloads (+, -, *, /, @, etc.) and every standard ONNX operator as a method (Relu(), MatMul(), …).

  • Vars — a tuple of Var objects returned when a node produces multiple outputs (e.g. Split, TopK).
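Although the internals of yobx are not shown here, the chaining pattern itself is easy to illustrate. The following standalone sketch (plain Python with illustrative names only, not the actual yobx classes) shows how a Var can append a node to its owning graph and return a new Var, which is what makes expression-style chaining possible:

```python
# Minimal sketch of the chaining pattern behind a fluent graph builder.
# All class and method names here are illustrative, NOT the yobx internals.

class Var:
    """A handle to one tensor name inside a graph under construction."""

    def __init__(self, graph, name):
        self.graph = graph
        self.name = name

    def Neg(self):
        # Append a unary node and return a Var pointing at its output.
        out = self.graph.fresh_name("Neg")
        self.graph.nodes.append(("Neg", [self.name], [out]))
        return Var(self.graph, out)

    def __add__(self, other):
        # Operator overloads work the same way, with two inputs.
        out = self.graph.fresh_name("Add")
        self.graph.nodes.append(("Add", [self.name, other.name], [out]))
        return Var(self.graph, out)


class OnnxGraph:
    """Accumulates inputs and nodes as plain tuples."""

    def __init__(self):
        self.inputs, self.nodes = [], []
        self._counter = 0

    def fresh_name(self, prefix):
        self._counter += 1
        return f"{prefix.lower()}_{self._counter}"

    def vin(self, name):
        self.inputs.append(name)
        return Var(self, name)


g = OnnxGraph()
y = g.vin("X").Neg() + g.vin("B")
print([op for op, _, _ in g.nodes])  # → ['Neg', 'Add']
```

Every operator method follows the same recipe: generate a fresh output name, record a node on the shared graph, and hand back a Var wrapping that name.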

Building a simple model#

The minimal workflow is a single chained expression starting from start():

<<<

from yobx.builder.light import start

# Build Y = Neg(X)
onx = start().vin("X").Neg().rename("Y").vout().to_onnx()
print(f"inputs : {[i.name for i in onx.graph.input]}")
print(f"outputs: {[o.name for o in onx.graph.output]}")
print(f"nodes  : {[n.op_type for n in onx.graph.node]}")

>>>

    inputs : ['X']
    outputs: ['Y']
    nodes  : ['Neg']

Each call in the chain returns either an OnnxGraph (for methods such as vin) or a Var (for operator methods and rename). Calling to_onnx finalizes the model; it is available on the graph itself and as a shortcut on Var, so a chain can end on either object.

Two-input models#

When multiple inputs are needed, call vin for each input and then combine them with bring before applying the operator:

<<<

from yobx.builder.light import start

onx = start().vin("X").vin("Y").bring("X", "Y").Add().rename("Z").vout().to_onnx()
print(f"inputs : {[i.name for i in onx.graph.input]}")
print(f"outputs: {[o.name for o in onx.graph.output]}")

>>>

    inputs : ['X', 'Y']
    outputs: ['Z']

Python operator overloads#

Var supports the standard Python arithmetic operators so that expressions closely mirror the underlying math:

<<<

import numpy as np
from yobx.builder.light import start

gr = start()
x = gr.vin("X")
y = gr.vin("Y")
bias = gr.cst(np.ones(4, dtype=np.float32), "bias")

# (X * Y) + bias → renamed to Z, declared as graph output
(x * y + bias).rename("Z").vout()

onx = gr.to_onnx()
print(f"nodes  : {[n.op_type for n in onx.graph.node]}")
print(f"outputs: {[o.name for o in onx.graph.output]}")

>>>

    nodes  : ['Mul', 'Add']
    outputs: ['Z']

Constants and initializers#

cst adds a numpy array as a graph initializer and returns a Var pointing to it:

<<<

import numpy as np
from yobx.builder.light import start

gr = start()
x = gr.vin("X")
w = gr.cst(np.random.randn(4, 2).astype(np.float32), "W")
(x @ w).rename("Y").vout()

onx = gr.to_onnx()
print(f"initializers: {[i.name for i in onx.graph.initializer]}")
print(f"nodes       : {[n.op_type for n in onx.graph.node]}")

>>>

    initializers: ['W']
    nodes       : ['MatMul']

Multiple outputs#

Operators that produce more than one tensor return a Vars object. Individual outputs are accessed by indexing. Unique is an example of such an operator — it returns unique values, indices, inverse indices, and counts as four separate tensors:

<<<

from yobx.builder.light import start

gr = start()
x = gr.vin("X")
parts = x.Unique(axis=0, sorted=1)  # returns Vars with 4 outputs
parts[0].rename("vals").vout()  # unique values
parts[1].rename("inds").vout()  # indices

onx = gr.to_onnx()
print(f"outputs: {[o.name for o in onx.graph.output]}")

>>>

    outputs: ['vals', 'inds']

Subgraphs#

Use g() to build a GraphProto for use inside control-flow operators such as If. The graph is finalized with to_onnx, which returns an onnx.GraphProto in this mode:

<<<

from yobx.builder.light import g

then_branch = g().vin("X").Relu().rename("Y").vout().to_onnx()
else_branch = g().vin("X").Abs().rename("Y").vout().to_onnx()

>>>
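In the ONNX format, control-flow operators carry their branches as graph-valued attributes rather than as runtime inputs. The following standalone sketch uses plain Python dicts in place of the real GraphProto/NodeProto objects (illustrative structure only, not the yobx or onnx object model) to show the shape of the resulting If node:

```python
# Standalone sketch of how If carries its branches as graph-valued
# attributes. Plain dicts stand in for GraphProto/NodeProto; this is
# illustrative structure only, not the yobx or onnx object model.

def make_branch(op_type):
    """A one-node subgraph: Y = op_type(X)."""
    return {
        "inputs": ["X"],
        "outputs": ["Y"],
        "nodes": [{"op_type": op_type, "inputs": ["X"], "outputs": ["Y"]}],
    }

then_branch = make_branch("Relu")
else_branch = make_branch("Abs")

# The If node itself takes only the condition as a runtime input;
# the branches travel as attributes, mirroring GraphProto attributes
# on a real NodeProto.
if_node = {
    "op_type": "If",
    "inputs": ["cond"],
    "outputs": ["Y"],
    "attributes": {"then_branch": then_branch, "else_branch": else_branch},
}

print(if_node["attributes"]["then_branch"]["nodes"][0]["op_type"])  # → Relu
```

Note that ONNX requires both branches of If to produce the same number of outputs, which is why the two subgraphs above declare matching output lists.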

See also#

  • GraphBuilder — a lower-level but more feature-rich API for building and optimizing ONNX graphs programmatically.

  • Translate — translating an existing onnx.ModelProto back to Python source code using the light API or the onnx.helper functions.