Logging Prompt Templates & Variables
Attach prompt template data to spans so Future AGI can surface it in the prompt playground for testing changes without deploying.
About
LLM outputs depend entirely on the prompt, but the prompt itself is not captured in traces by default. Logging prompt templates attaches the template name, version, label, and variables to spans as attributes. Once logged, Future AGI surfaces them in the prompt playground where template text and variables can be edited and re-run directly in the UI without redeploying.
When to use
- Test prompt changes without deploying: Logged templates appear in the prompt playground where text and variables can be edited and re-run directly in the UI.
- Reproduce a past LLM call: Template version and variables are recorded on every span, so any call can be reconstructed exactly as it ran.
- Debug unexpected outputs: Open a span and see the full prompt that was sent, including which variables were filled in.
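Because the template string and variables are both recorded on the span, a past prompt can be re-rendered even outside the SDK. A minimal sketch, where the logged values are stand-ins for real span attributes:

```python
# Stand-ins for the template and variables recorded on a span.
logged_template = "Please describe the weather forecast for {city} on {date}"
logged_variables = {"city": "San Francisco", "date": "March 27"}

# Re-render the exact prompt that was sent to the model.
prompt = logged_template.format(**logged_variables)
print(prompt)
# Please describe the weather forecast for San Francisco on March 27
```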
How to
Install dependencies
Install the core instrumentation package and any framework instrumentors needed.
```bash
pip install fi-instrumentation-otel traceai_openai traceai_langchain openai langchain-openai
```
Log prompt templates with using_attributes
Wrap LLM calls with using_attributes to attach the prompt template to all spans created inside the block.
```python
import os

from fi_instrumentation import register, Transport, using_attributes
from fi_instrumentation.fi_types import ProjectType
from traceai_openai import OpenAIInstrumentor
from traceai_langchain import LangChainInstrumentor
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
os.environ["FI_API_KEY"] = "your-futureagi-api-key"
os.environ["FI_SECRET_KEY"] = "your-futureagi-secret-key"

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="your-project-name",
    transport=Transport.HTTP,
)
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)
LangChainInstrumentor().instrument(tracer_provider=trace_provider)

with using_attributes(
    prompt_template="your-template-name",
    prompt_template_label="your-template-label",
):
    prompt = ChatPromptTemplate.from_template("{x} {y} {z}?").partial(x="why is", z="blue")
    chain = prompt | ChatOpenAI(model_name="gpt-3.5-turbo")
    result = chain.invoke({"y": "sky"})
    print(f"Response: {result}")
```
Log with using_prompt_template (alternative)
For more granular control, use using_prompt_template to attach the template string, version, and variables separately.
```python
from fi_instrumentation import using_prompt_template

with using_prompt_template(
    template="Please describe the weather forecast for {city} on {date}",
    version="v1.0",
    variables={"city": "San Francisco", "date": "March 27"},
):
    # All spans in this block get prompt template attributes
    pass
```
Key concepts
- using_attributes: Context manager that enriches the current OpenTelemetry context with prompt template fields. All spans created by auto-instrumentors within the block carry the template data as span attributes.
- prompt_template: The name of the prompt template registered in Future AGI.
- prompt_template_label: A label identifying the specific version or variant of the template.
- using_prompt_template: Alternative context manager for attaching the raw template string, version, and variables.
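The propagation mechanism can be modeled with Python's contextvars: a context manager stores attributes in a context variable, and anything that creates a span inside the block reads them back. This is a simplified sketch of the pattern only, not the fi_instrumentation implementation:

```python
from contextlib import contextmanager
from contextvars import ContextVar

# Holds the attributes for the current context; empty outside any block.
_prompt_attrs: ContextVar[dict] = ContextVar("prompt_attrs", default={})

@contextmanager
def using_attributes_sketch(**attrs):
    """Attach attributes to every 'span' created inside the block."""
    token = _prompt_attrs.set({**_prompt_attrs.get(), **attrs})
    try:
        yield
    finally:
        _prompt_attrs.reset(token)

def make_span(name):
    """Stand-in for an auto-instrumentor creating a span."""
    return {"name": name, "attributes": dict(_prompt_attrs.get())}

with using_attributes_sketch(prompt_template="weather-v1", prompt_template_label="prod"):
    span = make_span("llm.call")

print(span["attributes"])
# {'prompt_template': 'weather-v1', 'prompt_template_label': 'prod'}
```

Because the attributes live in context (not a global), the pattern stays correct across threads and async tasks, which is why the real SDK can safely enrich spans from auto-instrumentors it does not control.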
using_prompt_template parameters:
| Parameter | Type | Description | Example |
|---|---|---|---|
| template | str | The string for the prompt template | "Please describe the weather forecast for {city} on {date}" |
| version | str | Identifier for the template version | "v1.0" |
| variables | Dict[str, str] | Dictionary containing variables to fill the template | {"city": "San Francisco", "date": "March 27"} |
using_attributes prompt parameters:
| Parameter | Type | Description |
|---|---|---|
| prompt_template | str | Name of the prompt template |
| prompt_template_label | str | Label for the template version or variant |
| prompt_template_version | str | Version identifier |
| prompt_template_variables | Dict[str, Any] | Variables to fill the template |
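Before attaching prompt_template_variables, it can be worth checking that the dictionary covers every placeholder in the template. A small standalone helper using the standard library (illustrative only, not part of the fi_instrumentation API):

```python
from string import Formatter

def missing_variables(template: str, variables: dict) -> set:
    """Return placeholder names in the template that are not supplied."""
    placeholders = {
        field for _, field, _, _ in Formatter().parse(template) if field
    }
    return placeholders - variables.keys()

template = "Please describe the weather forecast for {city} on {date}"
print(missing_variables(template, {"city": "San Francisco"}))  # {'date'}
print(missing_variables(template, {"city": "San Francisco", "date": "March 27"}))  # set()
```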
Next Steps
Set Up Tracing
Register a tracer provider and add instrumentation.
Add Attributes & Metadata
Attach custom data to spans for filtering and evals.
Instrument with traceAI Helpers
Use FITracer decorators and context managers for typed spans.
Set Session & User ID
Group traces into sessions and link them to end users.