# Integrations Overview
Connect Future AGI with your existing AI frameworks, LLM providers, and tools.
## TraceAI
TraceAI provides pre-built auto-instrumentation for the following frameworks and LLM providers.
### LLM Models

- OpenAI
- OpenAI Agents SDK
- Vertex AI (Gemini)
- AWS Bedrock
- Mistral AI
- Anthropic
- Groq
- Together AI
- Google ADK
- Google GenAI
- Portkey
- Ollama
### Orchestration Frameworks

- LlamaIndex
- LlamaIndex Workflows
- LangChain
- LangGraph
- LiteLLM
- CrewAI
- Haystack
- AutoGen
- PromptFlow
- Vercel
- Mastra
- DSPy
- Instructor
- Guardrails AI
- Hugging Face smolagents
- MCP
- Langfuse (SDK tracing)
> **Note:** The Langfuse entry above is the SDK-level tracing integration (sending new traces via the Langfuse SDK). To import existing traces from a Langfuse account into Future AGI, see the Langfuse Import integration below.
### Voice

### Other
## Import Traces
Already using another observability platform? Pull your existing traces into Future AGI without re-instrumenting your code.
| Platform | Use when |
|---|---|
| Langfuse | You’re migrating from Langfuse or running both platforms side by side |
## Export & Alerts
Route Future AGI data to the tools your team already monitors. All exports are configured through **Settings > Integrations** — no code changes required.
| If you want to… | Use |
|---|---|
| Build dashboards and monitor infra | Datadog |
| Track LLM usage in product analytics | PostHog or Mixpanel |
| Archive trace data for compliance or cost | Cloud Storage (S3, Azure Blob, GCS) |
| Stream events to your own consumers | Message Queues (SQS, Pub/Sub) |
| Get paged when something breaks | PagerDuty |