Track usage and costs

Manage the usage of your LLM APIs in Stack AI

You can track the usage of your deployed Stack AI LLMs via two mechanisms:

  1. Usage interface: go to 'Profile > Usage' to see a breakdown of your costs for each flow.

  2. API: a REST API that reports your LLM usage for each flow across all your projects.

Below is an example of how to call the usage API:

import requests

# API endpoint URL (replace FLOW_ID and YOUR_ORGANIZATION with your own values)
flow_id = "FLOW_ID"
org = "YOUR_ORGANIZATION"
url = f"https://www.stack-ai.com/get_usage_api?flow_id={flow_id}&org={org}"

# Request headers (replace <PRIVATE_API_KEY> with your private API key)
headers = {
    "Authorization": "Bearer <PRIVATE_API_KEY>"
}

# Make the API request
response = requests.post(url, headers=headers)

# Check the response
if response.status_code == 200:
    print("API request successful")
    print(response.json())
else:
    print("API request failed:", response.text)

This API tracks the number of tokens consumed, the model used, and the cost for each Stack AI run or API call. The results are aggregated daily, weekly, and monthly.
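Once you have the usage data, you will typically want to aggregate it, for example to total your spend over a period. The exact response schema is not documented in this section, so the structure below (a "daily" list of entries with "model", "tokens", and "cost" fields) is an assumption for illustration only:

```python
# Hypothetical usage data: the field names below are assumptions,
# not the documented Stack AI response schema.
sample_usage = {
    "daily": [
        {"date": "2024-01-01", "model": "gpt-4", "tokens": 1200, "cost": 0.036},
        {"date": "2024-01-02", "model": "gpt-4", "tokens": 800, "cost": 0.024},
    ]
}

def total_cost(usage: dict, period: str = "daily") -> float:
    """Sum the cost field across all entries for the given period."""
    return sum(entry["cost"] for entry in usage.get(period, []))

print(f"Total daily cost: ${total_cost(sample_usage):.2f}")
```

Adapt the field names to whatever the API actually returns for your organization.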
