Ultralytics is the home for cutting-edge, state-of-the-art computer vision models for tasks like image classification, object detection, image segmentation, and pose estimation. Not only does it host YOLOv8, the latest iteration in the YOLO series of real-time object detection models, but also other powerful computer vision models such as SAM (Segment Anything Model), RT-DETR, and YOLO-NAS. Besides providing implementations of these models, Ultralytics offers out-of-the-box workflows for training, fine-tuning, and applying them using an easy-to-use API.
Get started

Install `ultralytics` and `wandb`. The development team has tested the integration with `ultralytics` v8.0.238 and below. To report any issues with the integration, create a GitHub issue with the tag `yolov8`.
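Assuming a pip-based environment, the install might look like the following; the version pin mirrors the tested version mentioned above:

```shell
# Install the tested version of ultralytics along with wandb
pip install --upgrade "ultralytics==8.0.238" wandb
```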
Track experiments and visualize validation results
This section demonstrates a typical workflow of using an Ultralytics model for training, fine-tuning, and validation, with experiment tracking, model checkpointing, and visualization of the model's performance using W&B. You can also read more about the integration in this report: Supercharging Ultralytics with W&B.

To use the W&B integration with Ultralytics, import the `wandb.integration.ultralytics.add_wandb_callback` function. Then initialize the YOLO model of your choice and invoke `add_wandb_callback` on it before training or inference. This ensures that when you perform training, fine-tuning, validation, or inference, the experiment logs and the images, overlaid with both ground-truth and the respective prediction results using the interactive overlays for computer vision tasks on W&B, are saved automatically, along with additional insights in a `wandb.Table`.
Visualize prediction results
This section demonstrates a typical workflow of using an Ultralytics model for inference and visualizing the results using W&B. You can try out the code in Google Colab: Open in Colab. You can also read more about the integration in this report: Supercharging Ultralytics with W&B.

To use the W&B integration with Ultralytics, import the `wandb.integration.ultralytics.add_wandb_callback` function. Initialize your W&B run with `wandb.init()`. Next, initialize your desired YOLO model and invoke `add_wandb_callback` on it before you perform inference with the model. This ensures that when you perform inference, it automatically logs the images overlaid with your interactive overlays for computer vision tasks along with additional insights in a `wandb.Table`.

Note that you don't need to explicitly initialize a run using `wandb.init()` in case of a training or fine-tuning workflow. However, if the code involves only prediction, you must explicitly create a run.
Here’s how the interactive bbox overlay looks:
For more details, see the W&B image overlays guide.