Template structure
Basic components
- Name: unique identifier (required)
- Messages: ordered list of messages
- Model configuration: model + generation params
- Variables: dynamic placeholders used in messages
Message types
- System: sets behavior/context
- User: contains the prompt; supports variables like `{{var}}`
- Assistant: few-shot examples or expected outputs
{ "role": "system", "content": "You are a helpful assistant." }
{ "role": "user", "content": "Introduce {{name}} from {{city}}." }
{ "role": "assistant", "content": "Meet Ada from Berlin!" }
Model configuration fields
`model_name`, `temperature`, `frequency_penalty`, `presence_penalty`, `max_tokens`, `top_p`, `response_format`, `tool_choice`, `tools`
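These fields are passed when constructing a `ModelConfig`. As a minimal sketch: only `model_name` appears in the examples below, so the other values here are illustrative assumptions, and the structured fields (`response_format`, `tools`, `tool_choice`) are omitted because their exact shapes are not shown in this guide.
```ts
import { ModelConfig } from "@futureagi/sdk";

// Illustrative values only; field names come from the list above.
const config = new ModelConfig({
  model_name: "gpt-4o-mini",
  temperature: 0.7,
  max_tokens: 512,
  top_p: 1,
  frequency_penalty: 0,
  presence_penalty: 0,
});
```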
Placeholders and compile
Add a placeholder message (`type="placeholder"`, `name="..."`) in your template. At compile time, supply an array of messages for that key; `{{var}}` variables are substituted in all message contents.
```ts
import { PromptTemplate, ModelConfig, MessageBase, Prompt } from "@futureagi/sdk";

const tpl = new PromptTemplate({
  name: "chat-template",
  messages: [
    { role: "system", content: "You are a helpful assistant." } as MessageBase,
    { role: "user", content: "Hello {{name}}!" } as MessageBase,
    { type: "placeholder", name: "history" } as any, // placeholder
  ],
  model_configuration: new ModelConfig({ model_name: "gpt-4o-mini" }),
});

const client = new Prompt(tpl);

// Compile with substitution and inlined chat history
const compiled = client.compile({
  name: "Alice",
  history: [{ role: "user", content: "Ping {{name}}" }],
} as any);
```
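Given those semantics, `compiled` should end up looking roughly like the array below; the exact return shape is not shown here, so treat it as illustrative.
```ts
// Roughly what `compiled` contains after substitution and history inlining
// (illustrative; exact shape assumed):
const expected = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello Alice!" },  // {{name}} -> "Alice"
  { role: "user", content: "Ping Alice" },    // history message inlined at the placeholder
];
```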
Create templates
```ts
import { Prompt, PromptTemplate, ModelConfig, MessageBase } from "@futureagi/sdk";

const tpl = new PromptTemplate({
  name: "intro-template",
  messages: [
    { role: "system", content: "You are a helpful assistant." } as MessageBase,
    { role: "user", content: "Introduce {{name}} from {{city}}." } as MessageBase,
  ],
  variable_names: { name: ["Ada"], city: ["Berlin"] },
  model_configuration: new ModelConfig({ model_name: "gpt-4o-mini" }),
});

const client = new Prompt(tpl);
await client.open();                                  // draft v1
await client.commitCurrentVersion("Finish v1", true); // set default
```
Versioning (step-by-step)
- Build the template (see above)
- Create draft v1 (JS/TS: `await client.open()`; Python: `client.create()`)
- Update the draft and save (JS/TS: `saveCurrentDraft()`; Python: `save_current_draft()`)
- Commit v1 and set it as default (JS/TS: `commitCurrentVersion("msg", true)`; Python: `commit_current_version()`)
- Open a new draft (JS/TS: `createNewVersion()`; Python: `create_new_version()`)
- Delete if needed (JS/TS: `delete()`; Python: `delete()`)
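The same lifecycle in JS/TS, as a minimal sketch (reusing `tpl` from the "Create templates" example above; error handling omitted):
```ts
import { Prompt } from "@futureagi/sdk";

const client = new Prompt(tpl);                        // tpl from "Create templates" above

await client.open();                                   // create draft v1
// ...edit the draft's messages / model configuration here...
await client.saveCurrentDraft();                       // persist draft changes
await client.commitCurrentVersion("Finish v1", true);  // commit v1 and set as default

await client.createNewVersion();                       // open a new draft (v2)
// ...iterate, save, and commit again...

// await client.delete();                              // remove the template entirely if needed
```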
Labels (deployment control)
- System labels: Production, Staging, Development (predefined by backend)
- Custom labels: create explicitly and assign to versions
- Name-based APIs: manage by names (no IDs needed)
- Draft safety: cannot assign labels to drafts; assignments are queued and applied on commit
Assign labels
```ts
// Assign by instance (current project)
await client.labels().assign("Production", "v1");
await client.labels().assign("Staging", "v2");

// Create and assign a custom label
await client.labels().create("Canary");
await client.labels().assign("Canary", "v2");

// Class helpers by names (org-wide context)
await Prompt.assignLabelToTemplateVersion("intro-template", "v2", "Development");
```
Remove labels
```ts
await client.labels().remove("Canary", "v2");
await Prompt.removeLabelFromTemplateVersion("intro-template", "v2", "Development");
```
List labels and mappings
```ts
const labels = await client.labels().list(); // system + custom
const mapping = await Prompt.getTemplateLabels({ template_name: "intro-template" });
```
Fetch by name + label (or version)
- Precedence: version > label
- Python default: if no label is provided, it defaults to "production"
```ts
import { Prompt } from "@futureagi/sdk";

const tplByLabel = await Prompt.getTemplateByName("intro-template", { label: "Production" });
const tplByVersion = await Prompt.getTemplateByName("intro-template", { version: "v2" });
```
A/B testing with labels (compile -> OpenAI gpt-4o)
Fetch two labeled versions of the same template (e.g., `prod-a` and `prod-b`), randomly select one, compile variables, and send the compiled messages to OpenAI.

The `compile()` API replaces `{{var}}` in string contents and preserves structured contents. Ensure your template contains the variables you pass (e.g., `{{name}}` and `{{city}}`).
```ts
import OpenAI from "openai";
import { Prompt, PromptTemplate } from "@futureagi/sdk";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY! });

// Fetch both label variants
const [tplA, tplB] = await Promise.all([
  Prompt.getTemplateByName("my-template-name", { label: "prod-a" }),
  Prompt.getTemplateByName("my-template-name", { label: "prod-b" }),
]);

// Randomly select a variant
const selected = Math.random() < 0.5 ? tplA : tplB;
const client = new Prompt(selected as PromptTemplate);

// Compile variables into the template messages
const compiled = client.compile({ name: "Ada", city: "Berlin" });

// Send to OpenAI gpt-4o
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: compiled as any,
});

const resultText = completion.choices[0]?.message?.content;
```
For analytics, attach the selected label/version to your logs or tracing so A/B results can be compared.
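For example (a minimal sketch continuing the snippet above; the event shape and field names are illustrative, not an SDK API):
```ts
// Record which variant produced this completion so A/B results can be compared later.
const selectedLabel = selected === tplA ? "prod-a" : "prod-b";
console.log(JSON.stringify({
  event: "ab_prompt_completion",   // illustrative event name
  template: "my-template-name",
  label: selectedLabel,
  output: resultText,
}));
```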