Integrations
Auto-instrumentation for LLM applications.
What it is
Auto-instrumentation adds tracing to your LLM applications with minimal code changes. Install the traceAI package for your framework and register a trace provider; FutureAGI handles the rest, automatically capturing spans, inputs, outputs, latency, and metadata without any manual instrumentation code.
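In practice, setup is a one-time call at application startup. The sketch below shows the register-then-instrument pattern for the OpenAI package; the module and names used here (`fi_instrumentation.register`, `traceai_openai.OpenAIInstrumentor`, the `project_name` argument) are assumptions following traceAI's conventions, so verify them against the package's own README before relying on them.

```python
def enable_tracing(project_name: str = "my-llm-app") -> bool:
    """Register a trace provider and instrument OpenAI calls.

    Returns True if instrumentation was enabled, False if the traceAI
    packages are not installed. All imported names are assumptions based
    on traceAI's naming conventions.
    """
    try:
        from fi_instrumentation import register          # assumed registration helper
        from traceai_openai import OpenAIInstrumentor    # assumed instrumentor class
    except ImportError:
        # Packages not installed: pip install fi-instrumentation traceai-openai
        return False

    # Register a trace provider for this project, then attach the
    # instrumentor; subsequent OpenAI SDK calls are traced automatically.
    trace_provider = register(project_name=project_name)
    OpenAIInstrumentor().instrument(tracer_provider=trace_provider)
    return True

enabled = enable_tracing()
```

Once instrumented, you use the OpenAI client exactly as before; no per-call changes are needed.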
LLM Models
OpenAI: traceAI-openai
OpenAI Agents: traceAI-openai-agents
Anthropic: traceAI-anthropic
AWS Bedrock: traceAI-bedrock
Vertex AI (Gemini): traceAI-vertexai
Mistral AI: traceAI-mistralai
Groq: traceAI-groq
Together AI: traceAI-openai (OpenAI-compatible API)
Google ADK: traceAI-google-adk
Google GenAI: traceAI-google-genai
Ollama: traceAI-openai (OpenAI-compatible API)
Portkey: traceAI-portkey
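Several entries above reuse traceAI-openai because the provider exposes an OpenAI-compatible API: you point the standard OpenAI client at the provider's endpoint, and the same instrumentor captures those calls. A sketch for Ollama, assuming its default local endpoint (the model name is illustrative):

```python
# Sketch: using the OpenAI client against Ollama's OpenAI-compatible
# endpoint. With traceai-openai instrumented, these calls are traced
# exactly like calls to OpenAI itself.
try:
    from openai import OpenAI
except ImportError:
    OpenAI = None  # pip install openai

if OpenAI is not None:
    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",  # Ollama ignores the key, but the client requires one
    )
    # A traced call would look like (requires a running Ollama server):
    # client.chat.completions.create(model="llama3", messages=[...])
```

The same base_url swap is how Together AI traffic is captured by traceAI-openai.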
Orchestration Frameworks
LangChain: traceAI-langchain
LangGraph: traceAI-langchain
LlamaIndex: traceAI-llamaindex
LlamaIndex Workflows: traceAI-llamaindex
LiteLLM: traceAI-litellm
CrewAI: traceAI-crewai
AutoGen: traceAI-autogen
Haystack: traceAI-haystack
PromptFlow: traceAI-openai
Vercel AI SDK: @traceai/vercel (npm)
Mastra: @traceai/mastra (npm)
Pipecat: traceAI-pipecat