Hugging Face Accelerate is a library that lets the same PyTorch code run across any distributed configuration, simplifying model training and inference at scale. Accelerate includes a W&B Tracker, which we show how to use below. You can also read more about Accelerate Trackers in the Hugging Face documentation.
Start logging with Accelerate
To get started with Accelerate and W&B, follow the pseudocode below:

- Pass `log_with="wandb"` when initialising the `Accelerator` class
- Call the `init_trackers` method and pass it:
  - a project name via `project_name`
  - any parameters you want to pass to `wandb.init()` via a nested dict to `init_kwargs`
  - any other experiment config information you want to log to your wandb run, via `config`
- Use the `wandb.Run.log()` method to log to Weights & Biases; the `step` argument is optional
- Call `.end_training()` when finished training
Access the W&B tracker
To access the W&B tracker, use the `Accelerator.get_tracker()` method. Pass in the string corresponding to a tracker's `.name` attribute; the method returns the tracker on the main process.