AWS Bedrock (Java)
Trace AWS Bedrock model invocations in Java with TracedBedrockRuntimeClient. Supports both InvokeModel (raw JSON) and Converse (typed API).
- `TracedBedrockRuntimeClient` wraps `BedrockRuntimeClient` from the AWS SDK
- Two APIs: `invokeModel()` (raw JSON body) and `converse()` (typed messages)
- Provider auto-detected from model ID prefix (`anthropic.`, `amazon.`, `meta.`, etc.)
- Parses provider-specific JSON formats for Claude, Titan, Llama, and others
Prerequisites
Complete the Java SDK setup first.
Installation
Maven:

```xml
<dependency>
  <groupId>com.github.future-agi.traceAI</groupId>
  <artifactId>traceai-java-bedrock</artifactId>
  <version>main-SNAPSHOT</version>
</dependency>
```

Gradle:

```groovy
implementation 'com.github.future-agi.traceAI:traceai-java-bedrock:main-SNAPSHOT'
```

You also need the AWS Bedrock Runtime SDK:
Maven:

```xml
<dependency>
  <groupId>software.amazon.awssdk</groupId>
  <artifactId>bedrockruntime</artifactId>
  <version>2.25.0</version>
</dependency>
```

Gradle:

```groovy
implementation 'software.amazon.awssdk:bedrockruntime:2.25.0'
```

Wrap the client
```java
import ai.traceai.TraceAI;
import ai.traceai.bedrock.TracedBedrockRuntimeClient;
import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;

TraceAI.initFromEnvironment();

BedrockRuntimeClient client = BedrockRuntimeClient.create();
TracedBedrockRuntimeClient traced = new TracedBedrockRuntimeClient(client);
```
InvokeModel (raw JSON)
The invokeModel API takes a raw JSON body. The wrapper parses the JSON to extract inputs and outputs based on the provider format.
```java
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.services.bedrockruntime.model.*;

// Claude Messages format
String requestBody = """
    {
      "anthropic_version": "bedrock-2023-05-31",
      "messages": [{"role": "user", "content": "What is the capital of France?"}],
      "max_tokens": 1024
    }
    """;

InvokeModelResponse response = traced.invokeModel(InvokeModelRequest.builder()
    .modelId("anthropic.claude-3-haiku-20240307-v1:0")
    .body(SdkBytes.fromUtf8String(requestBody))
    .build());

String responseJson = response.body().asUtf8String();
System.out.println(responseJson);
```
Span created: “Bedrock Invoke Model” with kind LLM
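If you only need the answer text from that raw body, a minimal extraction can be done without a full JSON library. The sketch below is deliberately naive and `firstText` is a hypothetical helper shown for illustration; production code should use a real JSON parser such as Jackson:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ClaudeTextExtract {
    // Naive illustration: pull the first "text" value out of a Claude
    // Messages response body with a regex. Only handles simple payloads
    // (no nested quotes beyond standard JSON escapes).
    static String firstText(String responseJson) {
        Matcher m = Pattern
            .compile("\"text\"\\s*:\\s*\"((?:[^\"\\\\]|\\\\.)*)\"")
            .matcher(responseJson);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String body = "{\"content\":[{\"type\":\"text\",\"text\":\"Paris\"}],\"stop_reason\":\"end_turn\"}";
        System.out.println(firstText(body)); // prints: Paris
    }
}
```

Note the pattern requires a colon after the `"text"` key, so a `"type":"text"` field earlier in the body is not mistaken for the answer.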
The wrapper detects the provider from the model ID prefix and parses the JSON format accordingly:
| Model ID prefix | Provider | Input format | Output format |
|---|---|---|---|
| `anthropic.` | Anthropic | Messages API (`messages` array) | `content[].text` |
| `amazon.` | Amazon Titan | `inputText` field | `results[].outputText` |
| `meta.` | Meta Llama | `prompt` field | `generation` field |
| `ai21.` | AI21 | `prompt` field | `completions[].data.text` |
| `cohere.` | Cohere | `prompt` or `message` | `generations[].text` or `text` |
| `mistral.` | Mistral | `prompt` field | `outputs[].text` |
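The prefix match in the table amounts to taking everything before the first dot in the model ID. The sketch below illustrates the idea; `detectProvider` is a hypothetical helper, not part of the traceAI API, and IDs with a leading region segment (e.g. cross-region inference profiles) would need extra handling:

```java
public class ProviderFromModelId {
    // Illustration only: the provider name is the segment before the
    // first '.' in a Bedrock model ID, e.g. "anthropic" or "meta".
    static String detectProvider(String modelId) {
        int dot = modelId.indexOf('.');
        return dot > 0 ? modelId.substring(0, dot) : "unknown";
    }

    public static void main(String[] args) {
        System.out.println(detectProvider("anthropic.claude-3-haiku-20240307-v1:0")); // anthropic
        System.out.println(detectProvider("mistral.mistral-7b-instruct-v0:2"));       // mistral
    }
}
```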
Converse (typed API)
The converse API uses typed request/response objects instead of raw JSON. This is the recommended API for new integrations.
```java
import software.amazon.awssdk.services.bedrockruntime.model.*;
import java.util.List;

ConverseResponse response = traced.converse(ConverseRequest.builder()
    .modelId("anthropic.claude-3-haiku-20240307-v1:0")
    .messages(List.of(
        Message.builder()
            .role(ConversationRole.USER)
            .content(List.of(ContentBlock.fromText("What is the capital of France?")))
            .build()
    ))
    .inferenceConfig(InferenceConfiguration.builder()
        .maxTokens(1024)
        .temperature(0.7f)
        .topP(0.9f)
        .build())
    .build());

String text = response.output().message().content().get(0).text();
System.out.println(text);
```
Span created: “Bedrock Converse” with kind LLM
What gets captured
Both APIs capture the same core attributes:
| Attribute | Example |
|---|---|
| `llm.system` | `bedrock` |
| `llm.provider` | `anthropic` (extracted from model ID) |
| `llm.request.model` | `anthropic.claude-3-haiku-20240307-v1:0` |
| `llm.request.temperature` | `0.7` |
| `llm.request.top_p` | `0.9` |
| `llm.request.max_tokens` | `1024` |
| `llm.token_count.prompt` | `15` |
| `llm.token_count.completion` | `42` |
| `llm.token_count.total` | `57` |
| `llm.response.finish_reason` | `end_turn` |
| Input/output messages | Structured role + content |
| `fi.raw_input` / `fi.raw_output` | Full JSON body |
For `invokeModel`, the raw JSON body is stored in `fi.raw_input` and `fi.raw_output`. The wrapper does its best to extract structured messages from provider-specific JSON, but the raw JSON is always available as a fallback.
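As an illustration of the provider-specific shapes involved, an Amazon Titan text request and response look roughly like this (a sketch based on Titan's documented format; field names may vary by model version, so treat this as indicative rather than a normative schema):

```json
{
  "inputText": "What is the capital of France?",
  "textGenerationConfig": {
    "maxTokenCount": 512,
    "temperature": 0.7,
    "topP": 0.9
  }
}
```

```json
{
  "inputTextTokenCount": 8,
  "results": [
    {
      "tokenCount": 9,
      "outputText": "The capital of France is Paris.",
      "completionReason": "FINISH"
    }
  ]
}
```

Here the wrapper would map `inputText` to the input message and `results[].outputText` to the output, per the provider table above.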
Accessing the original client
```java
BedrockRuntimeClient original = traced.unwrap();
```