1. Installation
Install the traceAI Bedrock and boto3 packages.
pip install traceAI-bedrock
pip install boto3
2. Environment Configuration
Set up your environment variables to authenticate with both FutureAGI and AWS services.
import os
os.environ["AWS_ACCESS_KEY_ID"] = "your-aws-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-aws-secret-access-key"
os.environ["FI_API_KEY"] = "your-futureagi-api-key"
os.environ["FI_SECRET_KEY"] = "your-futureagi-secret-key"
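Before continuing, it can help to confirm all four variables are actually set. The helper below is a hypothetical convenience, not part of the traceAI or boto3 APIs:

```python
import os

# All variables the rest of this guide relies on.
REQUIRED_VARS = [
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "FI_API_KEY",
    "FI_SECRET_KEY",
]

def missing_env_vars(required=REQUIRED_VARS):
    """Return the names of any required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]
```

Call `missing_env_vars()` right after the assignments above; an empty list means you are ready to proceed.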
3. Initialize Trace Provider
Set up the trace provider to create a new project in FutureAGI and establish the telemetry data pipeline.
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="bedrock_project",
)
4. Instrument Bedrock
Instrument your project with the Bedrock instrumentor. This step ensures that all interactions with Bedrock are tracked and monitored. Run this step before creating your Bedrock client.
from traceai_bedrock import BedrockInstrumentor
BedrockInstrumentor().instrument(tracer_provider=trace_provider)
5. Create Bedrock Client
Set up your Bedrock client and use your application as you normally would. The instrumentor will automatically trace your calls and send the telemetry data to the platform.
import boto3
client = boto3.client(
    service_name="bedrock-runtime",  # the Converse API lives on bedrock-runtime, not bedrock
    region_name="your-region",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)
6. Execute
Run your Bedrock application.
def converse_with_claude():
    system_prompt = [{"text": "You are an expert at creating music playlists"}]
    messages = [
        {
            "role": "user",
            "content": [{"text": "Hello, how are you?"}, {"text": "What's your name?"}],
        }
    ]
    inference_config = {"maxTokens": 1024, "temperature": 0.0}
    try:
        response = client.converse(
            modelId="model_id",
            system=system_prompt,
            messages=messages,
            inferenceConfig=inference_config,
        )
        out = response["output"]["message"]
        messages.append(out)
        print(out)
    except Exception as e:
        print(f"Error: {str(e)}")

if __name__ == "__main__":
    converse_with_claude()
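The Converse API returns the assistant's reply as a message whose "content" field is a list of blocks, which is why the example above prints a dict rather than plain text. A small helper (a sketch based on that response shape, not part of boto3) can join the text blocks into a single string:

```python
def extract_text(message):
    """Join the text blocks of a Converse API message into one string."""
    return "".join(
        block["text"] for block in message.get("content", []) if "text" in block
    )
```

For example, replacing `print(out)` with `print(extract_text(out))` prints only the assistant's text.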