Documentation Index

Fetch the complete documentation index at: https://www.osohq.com/docs/llms.txt

Use this file to discover all available pages before exploring further.

The Oso edge proxy is a reverse proxy at agents.osohq.cloud that sits between AI agents and their LLM providers. It captures the full content of agent sessions, including prompts, completions, tool calls, and metadata, and feeds them into Oso for monitoring and alerting.

What it captures

When agent traffic flows through the edge proxy, Oso records:
  • Prompts sent to the LLM
  • Completions returned by the LLM
  • Tool calls and their parameters
  • Model metadata (which model was used, token counts)
  • User and session identifiers
This data powers session monitoring and alerts.

Supported agents

The edge proxy works with any agent that allows configuring a custom base URL for its LLM provider. Currently supported:
Agent            Environment
Claude Code      CLI
Codex            CLI
Gemini CLI       CLI
Claude Desktop   Desktop
Cursor           Desktop
Antigravity      Desktop
Agents with hardcoded LLM endpoints that don’t support custom base URLs cannot use the edge proxy. For those agents, consider using the browser extension or EDR integration for discovery.
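As a concrete sketch of what "configuring a custom base URL" means: many CLI agents read their provider endpoint from an environment variable. For example, Claude Code consults `ANTHROPIC_BASE_URL` for a custom Anthropic endpoint. The exact URL shape the proxy expects is not specified here, so treat the value below as illustrative and follow the Quickstart for the real configuration:

```shell
# Sketch: route an agent's LLM traffic through the Oso edge proxy
# by overriding its provider base URL. The URL value is illustrative;
# see the Quickstart for the exact endpoint to use.
export ANTHROPIC_BASE_URL="https://agents.osohq.cloud"

# Requests the agent makes now pass through the proxy on their way to
# the LLM provider, where Oso records prompts, completions, and tool calls.
```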

Setup

For step-by-step configuration instructions, see the Quickstart, which walks through:
  1. Getting your Environment ID from the Oso UI
  2. Configuring your agent to route through agents.osohq.cloud
  3. Verifying sessions appear in your dashboard
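The steps above can be sketched as a short shell session. The variable names and placeholder values here are assumptions for illustration (how the Environment ID is actually supplied to the proxy is covered in the Quickstart):

```shell
# Hypothetical sketch of the Quickstart flow. OSO_ENVIRONMENT_ID and
# ANTHROPIC_BASE_URL are illustrative names, not confirmed configuration.

# 1. Environment ID, copied from the Oso UI (placeholder value).
export OSO_ENVIRONMENT_ID="env_example_123"

# 2. Route the agent through the edge proxy.
export ANTHROPIC_BASE_URL="https://agents.osohq.cloud"

# 3. Run your agent as usual, then check the Oso dashboard
#    to verify the session appears.
echo "Proxy configured: $ANTHROPIC_BASE_URL (environment: $OSO_ENVIRONMENT_ID)"
```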