Integrating Prompt Templates into Spans

By instrumenting the prompt template, users can fully utilize Future AGI’s prompt playground. There’s no need to deploy a new template version to test if changes in prompt text or variables achieve the desired effect. Instead, you can experiment with these modifications directly in the playground UI.

Implementation Details

We provide a using_prompt_template context manager to add a prompt template into the current OpenTelemetry Context. FI auto-instrumentors will read this Context and pass the prompt template fields as span attributes, adhering to the traceAI semantic conventions.

Required Parameters

| Parameter | Type | Description | Example |
| --- | --- | --- | --- |
| template | str | The string for the prompt template | "Please describe the weather forecast for {city} on {date}" |
| version | str | Identifier for the template version | "v1.0" |
| variables | Dict[str, str] | Dictionary containing variables to fill the template | {"city": "San Francisco", "date": "March 27"} |
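The `template` and `variables` parameters fit together through standard Python string formatting: each `{placeholder}` in the template string is filled from the matching key in the variables dictionary.

```python
template = "Please describe the weather forecast for {city} on {date}"
variables = {"city": "San Francisco", "date": "March 27"}

# Render the final prompt text by substituting the variables into the template
prompt = template.format(**variables)
print(prompt)  # → Please describe the weather forecast for San Francisco on March 27
```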

Sample Implementation

Begin by installing the necessary dependencies:

pip install fi-instrumentation-otel traceai_openai openai

Below is a comprehensive example demonstrating how to implement prompt template tracing:

import os
from fi_instrumentation import register, using_prompt_template
from fi_instrumentation.fi_types import ProjectType
from openai import OpenAI
from traceai_openai import OpenAIInstrumentor

# Set up Environment Variables
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
os.environ["FI_API_KEY"] = "your-futureagi-api-key"
os.environ["FI_SECRET_KEY"] = "your-futureagi-secret-key"

# Setup OTel via our register function
trace_provider = register(
    project_type=ProjectType.EXPERIMENT,
    project_name="Project_name",
    project_version_name="project_version_name",
)
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Setup OpenAI
client = OpenAI()

# Define the prompt template and its variables
prompt_template = "Please describe the weather forecast for {city} on {date}"
prompt_template_variables = {"city": "San Francisco", "date": "March 27"}

# Use the context manager to add template information
with using_prompt_template(
    template=prompt_template,
    variables=prompt_template_variables,
    version="v1.0",
):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "user",
                "content": prompt_template.format(**prompt_template_variables)
            },
        ]
    )