Integrations
Auto-instrumentation for LLM applications across Python, JavaScript, and Java.
About
Auto-instrumentation adds tracing to your LLM applications with minimal code changes. Install the relevant traceAI package for your framework, register a trace provider, and FutureAGI captures spans, inputs, outputs, latency, and metadata automatically.
Python and JS/TS integrations use instrumentors that patch the client libraries. Java integrations use explicit Traced* wrappers around your existing clients. Both approaches produce the same OpenTelemetry spans.
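As a rough illustration of that flow, the sketch below wires up the OpenAI integration in Python. It assumes a register helper from fi_instrumentation and an OpenAIInstrumentor from traceAI-openai, and that FutureAGI credentials are already configured in the environment; treat the exact imports and arguments as assumptions and follow the OpenAI integration page for the canonical setup.

```python
# Illustrative sketch only: the module and symbol names used here
# (fi_instrumentation.register, OpenAIInstrumentor) are assumptions;
# see the OpenAI integration page for the exact snippet.
# Assumes FutureAGI credentials are already configured in the environment.
from fi_instrumentation import register
from traceai_openai import OpenAIInstrumentor
from openai import OpenAI

# Register a trace provider that exports spans to FutureAGI.
trace_provider = register(project_name="my-llm-app")

# Patch the OpenAI client library so every call emits OpenTelemetry spans.
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Use the client as usual; inputs, outputs, latency, and metadata are captured.
client = OpenAI()
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
```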
LLM Providers
OpenAI: traceAI-openai
Anthropic: traceAI-anthropic
AWS Bedrock: traceAI-bedrock
Vertex AI: traceAI-vertexai
Google GenAI: traceAI-google-genai
Google ADK: traceai-google-adk
Groq: traceAI-groq
MistralAI: traceAI-mistralai
Together AI: traceAI-openai
Ollama: traceAI-openai
Portkey: traceAI-portkey
Frameworks & Agents
LangChain: traceAI-langchain
LangGraph: traceAI-langchain
LlamaIndex: traceAI-llamaindex
LlamaIndex Workflows: traceAI-llamaindex
LiteLLM: traceAI-litellm
CrewAI: traceAI-crewai
AutoGen: traceAI-autogen
Haystack: traceAI-haystack
DSPy: traceAI-DSPy
OpenAI Agents: traceAI-openai-agents
Smol Agents: traceAI-smolagents
Instructor: traceAI-instructor
PromptFlow: traceAI-openai
Guardrails: traceAI-guardrails
MCP: traceAI-mcp
Mastra: @traceai/mastra
Vercel AI SDK: @traceai/vercel
Voice & Realtime
Java
The Java SDK uses explicit Traced* wrappers instead of instrumentors. Add a Maven/Gradle dependency, wrap your client, and traces flow to FutureAGI. See the Java overview for core setup.
Spring Boot: traceai-spring-boot-starter
OpenAI: traceai-java-openai
Anthropic: traceai-java-anthropic
AWS Bedrock: traceai-java-bedrock
Cohere: traceai-java-cohere
Pinecone: traceai-java-pinecone
More LLM Providers: Google GenAI, Vertex AI, Azure OpenAI, Ollama, Watsonx
Vector Databases: Qdrant, Milvus, ChromaDB, Weaviate, and 5 more
Frameworks: LangChain4j, Semantic Kernel