Logging Prompt Templates & Variables
Instrument prompt templates so they appear in Future AGI's prompt playground — test changes to prompt text or variables directly in the UI without deploying a new version.
What it is
Logging Prompt Templates lets you attach prompt template data to your OpenTelemetry spans so Future AGI can surface it in the prompt playground. You use the using_attributes context manager to add a prompt template into the current OpenTelemetry context — FI auto-instrumentors read this context and pass the template fields as span attributes, following traceAI semantic conventions.
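To make the mechanism concrete, here is a simplified, illustrative stand-in for how a context manager like using_attributes can propagate attributes to whatever runs inside its block. This is not the real fi_instrumentation implementation (which uses the OpenTelemetry context API); it is only a sketch of the pattern.

```python
# Illustrative sketch only -- NOT the fi_instrumentation implementation.
# The real using_attributes stores fields in the OpenTelemetry context;
# this mock uses contextvars to show the same propagation pattern.
from contextlib import contextmanager
from contextvars import ContextVar

_attributes: ContextVar[dict] = ContextVar("attributes", default={})

@contextmanager
def using_attributes(**attrs):
    # Merge the new attributes into the current context for the block's duration
    token = _attributes.set({**_attributes.get(), **attrs})
    try:
        yield
    finally:
        # Restore the previous attributes when the block exits
        _attributes.reset(token)

def current_attributes() -> dict:
    # What an auto-instrumentor would read when it creates a span
    return dict(_attributes.get())

with using_attributes(prompt_template="weather-v1"):
    print(current_attributes())  # {'prompt_template': 'weather-v1'}
```

Any span created inside the with block would pick up the attributes; once the block exits, later spans are unaffected.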
Use cases
- Prompt playground — Experiment with prompt text and variable changes directly in the Future AGI UI without deploying a new template version.
- Template versioning — Track which template version and variables were used in each LLM call for reproducibility.
- Debugging — See exactly what prompt was sent and with which variables on every span.
How to
Install dependencies
Install the core instrumentation package and any framework instrumentors you need.
pip install fi-instrumentation-otel traceai_openai traceai_langchain openai langchain-openai
Implement prompt template tracing
Register your tracer provider, instrument your LLM clients, and wrap your LLM calls with using_attributes to attach the prompt template to spans.
import os

from fi_instrumentation import register, using_attributes, Transport
from fi_instrumentation.fi_types import ProjectType
from traceai_langchain import LangChainInstrumentor
from traceai_openai import OpenAIInstrumentor

os.environ["OPENAI_API_KEY"] = "futureagi_openai_api_key"
os.environ["FI_API_KEY"] = "futureagi_api_key"
os.environ["FI_SECRET_KEY"] = "futureagi_secret_key"

# Set up OpenTelemetry via the register function
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="<project_name>",  # Your project name
    transport=Transport.HTTP,      # Transport mechanism for your traces
)

OpenAIInstrumentor().instrument(tracer_provider=trace_provider)
LangChainInstrumentor().instrument(tracer_provider=trace_provider)

from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

with using_attributes(
    prompt_template="<prompt_template_name>",
    prompt_template_label="<prompt_template_label>",
):
    prompt = ChatPromptTemplate.from_template("{x} {y} {z}?").partial(x="why is", z="blue")
    chain = prompt | ChatOpenAI(model_name="gpt-3.5-turbo")
    result = chain.invoke({"y": "sky"})
    print(f"Response: {result}")
Key concepts
using_attributes — Context manager that enriches the current OpenTelemetry context with prompt template fields. All spans created by FI auto-instrumentors within the block carry the template data as span attributes.
- prompt_template — The name of the prompt template registered in Future AGI.
- prompt_template_label — A label identifying the specific version or variant of the template.
using_prompt_template — A related context manager that attaches the template text, version, and variables directly. Its parameters:
| Parameter | Type | Description | Example |
|---|---|---|---|
| template | str | The prompt template string, with placeholders in braces | "Please describe the weather forecast for {city} on {date}" |
| version | str | Identifier for the template version | "v1.0" |
| variables | Dict[str, str] | Dictionary of variables used to fill the template | {"city": "San Francisco", "date": "March 27"} |
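To see how the template and variables parameters relate, here is a plain-Python illustration of the substitution they describe: the variables dictionary fills the template's placeholders to produce the prompt text that accompanies the LLM call.

```python
# The same template and variables you would pass to using_prompt_template;
# filling the template locally shows the final prompt the LLM receives.
template = "Please describe the weather forecast for {city} on {date}"
variables = {"city": "San Francisco", "date": "March 27"}

filled = template.format(**variables)
print(filled)  # → Please describe the weather forecast for San Francisco on March 27
```

With using_prompt_template, you pass the unfilled template, a version string such as "v1.0", and the variables dictionary, and the instrumentors record them as separate span attributes so the playground can re-render the prompt with edited values.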
What you can do next
Set Up Tracing
Register a tracer provider and add instrumentation.
Add Attributes & Metadata
Attach custom data to spans for filtering and evals.
Instrument with traceAI Helpers
Use FITracer decorators and context managers for typed spans.
Set Session & User ID
Group traces into sessions and link them to end users.