
Learn how to use the Serverless Inference API to access foundation models programmatically.

Base URL

Access the Inference service at:
https://api.inference.wandb.ai/v1
Important: To use this endpoint, you need:
  • A W&B account with Inference credits
  • A valid W&B API key
If you belong to more than one team, or if you want to attribute your usage to a specific project, you also need team and project IDs. In code samples, these appear as <your-team>/<your-project>. If unspecified, your default entity and the project name inference are used.

Available methods

The Serverless Inference API provides OpenAI-compatible endpoints for interacting with foundation models.

Authentication

All API requests require authentication using your W&B API key. Create an API key at wandb.ai/settings. Include your API key in the request headers:
  • For OpenAI SDK: Set as api_key parameter
  • For direct API calls: Use Authorization: Bearer <your-api-key>
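For direct API calls, the bearer header can be attached to a plain HTTP request. A sketch using only the Python standard library, with placeholder key and model values; the request is built but intentionally not sent:

```python
import json
import urllib.request

API_KEY = "<your-api-key>"  # placeholder: your W&B API key

# Build a direct POST to the OpenAI-compatible chat-completions endpoint.
request = urllib.request.Request(
    "https://api.inference.wandb.ai/v1/chat/completions",
    data=json.dumps({
        "model": "<model-id>",  # placeholder model name
        "messages": [{"role": "user", "content": "Hello!"}],
    }).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # The bearer-token scheme described above.
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)
# urllib.request.urlopen(request) would send it; omitted here so the
# sketch makes no live call.
```

With a real key and model, passing `request` to `urllib.request.urlopen` returns the JSON response body.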

Error handling

See API Errors for a complete list of error codes and how to resolve them.

Next steps