Apache-2.0 vision evaluation toolkit

YOLOZU — Framework-agnostic vision model evaluation toolkit

Bring your own inference. Export once. Evaluate fairly.

YOLOZU evaluates vision model outputs through a stable predictions.json interface contract instead of tying reports to one training or inference framework. It is built for teams comparing PyTorch, ONNX Runtime, TensorRT, and custom inference backends without silent metric drift.
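As an illustration only, a prediction artifact flowing through such a contract might look like the sketch below. Every field name here is an assumption for exposition, not the actual YOLOZU schema; consult the interface contract documentation for the real field definitions.

```
[
  {
    "image": "val/000001.jpg",
    "boxes": [[48.0, 120.5, 310.2, 415.9]],
    "scores": [0.87],
    "labels": [3]
  }
]
```

Because the artifact is plain JSON, any framework or runtime that can serialize its outputs into the contract can enter the evaluation lane without a framework-specific integration.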

Detection · Instance segmentation · Keypoints · Monocular depth · 6DoF pose

Use cases

What YOLOZU is for

YOLOZU is the evaluation lane for teams that already have, or can export, prediction artifacts and need comparable reports across model families or runtime backends.

Fair cross-framework evaluation

Compare outputs from Ultralytics, RT-DETR, Detectron2, MMDetection, or custom inference through the same prediction artifact path.

Backend parity checks

Qualify PyTorch, ONNX Runtime, TensorRT, TorchScript, and adapter targets while labeling real, placeholder, skipped, and dry-run support clearly.
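A backend parity check of this kind can be sketched as a numeric comparison between two backends' exported predictions. This is a minimal, generic sketch, not YOLOZU's implementation; the "boxes"/"scores" field names and the tolerance value are illustrative assumptions.

```python
# Minimal parity-check sketch between two backends' exported predictions.
# The "boxes"/"scores" field names and the tolerance are illustrative
# assumptions, not part of the actual YOLOZU contract.

def max_abs_diff(rows_a, rows_b):
    """Largest element-wise absolute difference between two box lists."""
    return max(
        abs(x - y)
        for row_a, row_b in zip(rows_a, rows_b)
        for x, y in zip(row_a, row_b)
    )

def parity_ok(preds_a, preds_b, tol=1e-2):
    # Two backends count as "at parity" when every box coordinate and
    # every confidence score agrees within the tolerance.
    if max_abs_diff(preds_a["boxes"], preds_b["boxes"]) > tol:
        return False
    return all(abs(a - b) <= tol
               for a, b in zip(preds_a["scores"], preds_b["scores"]))

pytorch_out  = {"boxes": [[10.0, 20.0, 50.0, 80.0]], "scores": [0.910]}
tensorrt_out = {"boxes": [[10.0, 20.0, 50.0, 80.1]], "scores": [0.909]}
print(parity_ok(pytorch_out, tensorrt_out, tol=0.2))  # prints True
```

A single scalar tolerance is the simplest possible policy; a real qualification pass would also have to account for missing detections, reordered outputs, and the placeholder/skipped/dry-run labels mentioned above.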

Reproducible reports

Keep dataset split, metric protocol, export settings, and artifact paths explicit enough for review and follow-up debugging.
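In that spirit, an explicit report header could record the evaluation context like the hypothetical fragment below. The keys and values are purely illustrative assumptions, not YOLOZU's actual report format.

```
{
  "dataset": "coco-val2017",
  "split": "val",
  "metric_protocol": "coco-map@[0.50:0.95]",
  "export": {"backend": "onnxruntime", "opset": 17},
  "artifacts": {"predictions": "reports/run1/predictions.json"}
}
```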

Beginner-friendly diagnostics

Use yolozu guide and yolozu doctor --explain when a JSON-only diagnostic is too dense for a first run.

Quickstart

Install and run a visible demo

The shortest path checks the environment, runs an instance-segmentation demo, and writes both a JSON report and PNG overlays.

python3 -m pip install -U yolozu
yolozu doctor --explain
yolozu demo instance-seg --run-dir reports/quickstart_instance_seg --progress

Evaluation flow

How the lane works

1. Export: bring predictions from any framework or runtime.
2. Validate: check the predictions interface contract before scoring.
3. Evaluate: run the pinned metric protocol for the task.
4. Compare: review reports, PNG evidence, and parity artifacts.
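The validate step above can be sketched as a contract check that runs before any scoring. This is a hedged illustration: REQUIRED_KEYS is an assumed field set for exposition, not the real predictions.json contract.

```python
# Sketch of the validate step: check each prediction record for required
# fields before scoring. REQUIRED_KEYS is an assumption for illustration,
# not the real predictions.json contract.

REQUIRED_KEYS = {"image", "boxes", "scores", "labels"}

def validate(records):
    """Return (index, missing-keys) pairs for records that break the contract."""
    errors = []
    for i, rec in enumerate(records):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            errors.append((i, sorted(missing)))
    return errors

records = [
    {"image": "a.jpg", "boxes": [[0, 0, 10, 10]], "scores": [0.9], "labels": [1]},
    {"image": "b.jpg", "boxes": [], "scores": []},  # "labels" missing
]
print(validate(records))  # prints [(1, ['labels'])]
```

Failing fast here is the point of putting validation before evaluation: a schema error surfaces as a named missing field on a named record, not as a silently wrong metric downstream.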

Resources

Official links

These are the canonical public surfaces for installation, source review, archived releases, and the manual.

Source and issues

Review the Apache-2.0 repository, file issues, and inspect the current README and documentation map.

Install package

Install the current Python package from PyPI and use the CLI entrypoint yolozu.

Manual and DOI records

Use the archived software and manual records when you need a stable citation trail.

Interface contract

Start with the predictions schema when wiring a new model, runtime, or export adapter into the evaluation lane.

License

Apache-2.0 boundary

YOLOZU is published under Apache-2.0. The primary lane is framework-agnostic prediction validation and evaluation; optional adapters and research paths are labeled separately so production users can choose the stable surface first.