Settings
Ollama Proxy
To capture Ollama token usage automatically, point your applications at the proxy instead of at Ollama directly:
http://localhost:5000/proxy/ollama
Example — using the CLI:
OLLAMA_HOST=http://localhost:5000/proxy/ollama ollama run llama3
Example — in code:
openai.base_url = "http://localhost:5000/proxy/ollama/v1"
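For clients that don't use the OpenAI SDK, the same routing works with plain HTTP. A minimal sketch, assuming the proxy is running locally on port 5000 and exposes Ollama's OpenAI-compatible API under /v1 (the model name "llama3" and the helper below are illustrative):

```python
import json
import urllib.request

# Base URL of the proxy's OpenAI-compatible endpoint
# (assumption: proxy running locally on port 5000).
PROXY_BASE = "http://localhost:5000/proxy/ollama"

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completion request routed through the proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{PROXY_BASE}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = chat_request("llama3", "Hello")
# urllib.request.urlopen(req) would send it once the proxy is running;
# the proxy records token usage before forwarding to Ollama.
```

Because the proxy sits in front of Ollama's API, no other client changes are needed: requests and responses pass through unchanged while usage is recorded.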