Bedrock
Set up auto-instrumentation for AWS Bedrock with Future AGI tracing. Install traceAI-bedrock to capture model invocation spans and metadata.
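All of the snippets in this guide authenticate through environment variables. A small pre-flight check can fail fast with a clear message before tracing is initialized; the helper below is illustrative and not part of traceAI-bedrock:

```python
import os

# Variables used by the snippets in this guide
# (FI_* for Future AGI, AWS_* for Bedrock).
REQUIRED_VARS = [
    "FI_API_KEY",
    "FI_SECRET_KEY",
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
]

def missing_vars(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example: fail fast before calling register().
# missing = missing_vars()
# if missing:
#     raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
```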
1. Installation
Install the traceAI-bedrock package along with the AWS SDK for your language.
```shell
pip install traceAI-bedrock
pip install boto3
```

```shell
npm install @traceai/bedrock @traceai/fi-core @opentelemetry/instrumentation
```

2. Set Environment Variables
Set up your environment variables to authenticate with both FutureAGI and AWS services.
```python
import os

os.environ["AWS_ACCESS_KEY_ID"] = "your-aws-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-aws-secret-access-key"
os.environ["FI_API_KEY"] = "your-futureagi-api-key"
os.environ["FI_SECRET_KEY"] = "your-futureagi-secret-key"
```

```typescript
process.env.AWS_ACCESS_KEY_ID = "your-aws-access-key-id";
process.env.AWS_SECRET_ACCESS_KEY = "your-aws-secret-access-key";
process.env.FI_API_KEY = "your-futureagi-api-key";
process.env.FI_SECRET_KEY = "your-futureagi-secret-key";
```

3. Initialize Trace Provider
Set up the trace provider to create a new project in FutureAGI and establish the telemetry data pipeline.
```python
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="bedrock_project",
)
```

```typescript
import { register, ProjectType } from "@traceai/fi-core";

const tracerProvider = register({
  project_type: ProjectType.OBSERVE,
  project_name: "bedrock_project",
});
```

4. Configure Bedrock Instrumentation
Instrument your project with the Bedrock instrumentor. This step ensures that all interactions with Bedrock are traced and monitored.
```python
from traceai_bedrock import BedrockInstrumentor

BedrockInstrumentor().instrument(tracer_provider=trace_provider)
```

```typescript
import { BedrockInstrumentation } from "@traceai/bedrock";
import { registerInstrumentations } from "@opentelemetry/instrumentation";

const bedrockInstrumentation = new BedrockInstrumentation({});
registerInstrumentations({
  instrumentations: [bedrockInstrumentation],
  tracerProvider: tracerProvider,
});
```

5. Create Bedrock Components
Set up your Bedrock client and use your application as you normally would. Our instrumentor will automatically trace your calls and send the telemetry data to our platform.
```python
import boto3

# The Converse API is served by the "bedrock-runtime" service,
# not the "bedrock" control-plane client.
client = boto3.client(
    service_name="bedrock-runtime",
    region_name="your-region",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)
```

```typescript
import { BedrockRuntimeClient } from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({
  region: "your-region",
});
```

6. Execute
Run your Bedrock application.
```python
def converse_with_claude():
    system_prompt = [{"text": "You are an expert at creating music playlists"}]
    messages = [
        {
            "role": "user",
            "content": [{"text": "Hello, how are you?"}, {"text": "What's your name?"}],
        }
    ]
    inference_config = {"maxTokens": 1024, "temperature": 0.0}
    try:
        response = client.converse(
            modelId="model_id",
            system=system_prompt,
            messages=messages,
            inferenceConfig=inference_config,
        )
        out = response["output"]["message"]
        messages.append(out)
        print(out)
    except Exception as e:
        print(f"Error: {str(e)}")

if __name__ == "__main__":
    converse_with_claude()
```

```typescript
import { ConverseCommand } from "@aws-sdk/client-bedrock-runtime";

async function converseWithClaude() {
  const system = [{ text: "You are an expert at creating music playlists" }];
  const messages = [
    {
      role: "user",
      content: [{ text: "Hello, how are you?" }, { text: "What's your name?" }],
    },
  ];
  const inferenceConfig = { maxTokens: 1024, temperature: 0.0 };
  try {
    const response = await client.send(
      new ConverseCommand({
        modelId: "model_id",
        system,
        messages,
        inferenceConfig,
      })
    );
    const out = response.output?.message;
    if (out) {
      console.log(out);
    }
  } catch (e) {
    console.error("Error:", e);
  }
}

converseWithClaude();
```
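Bedrock throttles bursty traffic, so production code often retries the converse call with exponential backoff. A minimal, library-agnostic sketch follows; the helper name and retry policy are illustrative, not part of traceAI-bedrock (each retried attempt still produces its own span):

```python
import random
import time

def with_backoff(call, retries=4, base_delay=0.5, retriable=(Exception,)):
    """Invoke `call`, retrying retriable errors with exponential backoff plus jitter."""
    for attempt in range(retries + 1):
        try:
            return call()
        except retriable:
            if attempt == retries:
                raise  # out of retries: surface the last error
            # Sleep base_delay * 2^attempt, with up to 100 ms of jitter.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

# Example (assumes the `client`, `messages`, etc. from the steps above):
# response = with_backoff(
#     lambda: client.converse(modelId="model_id", messages=messages)
# )
```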