experimental_experiment.model_run

experimental_experiment.model_run.create_feeds(sess: onnxruntime.InferenceSession, batch_size: int = 1) → Dict[str, Any]

Creates random feeds for a model.

Parameters:
  • sess – onnxruntime session

  • batch_size – batch size of the generated inputs

Returns:

feeds, a dictionary mapping input names to random arrays
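
A minimal usage sketch, assuming a valid ONNX file model.onnx is available locally:

    import onnxruntime
    from experimental_experiment.model_run import create_feeds

    sess = onnxruntime.InferenceSession(
        "model.onnx", providers=["CPUExecutionProvider"]
    )
    feeds = create_feeds(sess, batch_size=2)  # one random array per model input
    outputs = sess.run(None, feeds)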

experimental_experiment.model_run.create_tensor(shape: Tuple[int, ...], dtype: int, batch_size: int = 1) → ndarray

Creates a random tensor.

Parameters:
  • shape – shape of the tensor to create

  • dtype – onnx element type (an integer such as TensorProto.FLOAT)

  • batch_size – batch size used to create the tensor

Returns:

numpy array
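
A minimal sketch; the shape used here is an arbitrary example:

    from onnx import TensorProto
    from experimental_experiment.model_run import create_tensor

    # random float32 tensor of shape (1, 3, 224, 224)
    x = create_tensor((1, 3, 224, 224), TensorProto.FLOAT, batch_size=1)
    print(x.shape, x.dtype)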

experimental_experiment.model_run.model_run(model: str | ModelProto, repeat: int = 10, warmup: int = 5, batch_size: int = 1, processor: str = 'CPU', verbose: int = 0, validate: str | ModelProto | None = None) → Dict[str, Any]

Loads a model with onnxruntime and measures the inference time.

Parameters:
  • model – model to run

  • warmup – number of iterations to run before measuring

  • repeat – number of iterations to run to measure

  • batch_size – batch size of the inputs

  • processor – processor to run on (the default is 'CPU')

  • verbose – verbosity level

  • validate – optional second model (path or ModelProto) used to validate the outputs

Returns:

dictionary of metrics
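
A minimal sketch of a benchmarking call; model.onnx is an assumed file name:

    from experimental_experiment.model_run import model_run

    metrics = model_run(
        "model.onnx",   # path to the model to benchmark (assumed to exist)
        repeat=10,      # measured iterations
        warmup=5,       # warmup iterations, not measured
        batch_size=1,
        processor="CPU",
        verbose=1,
    )
    print(metrics)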