Anthropic (Java)
Trace Anthropic Messages API calls in Java with TracedAnthropicClient. Uses reflection for cross-version compatibility.
TracedAnthropicClient wraps any version of the Anthropic Java SDK:
- Uses reflection internally - the client is typed as Object, not a specific SDK class
- Traces createMessage() calls with full message, token, and model capture
- Works across different Anthropic SDK versions without recompilation
Prerequisites
Complete the Java SDK setup first.
Installation
Maven:
<dependency>
  <groupId>com.github.future-agi.traceAI</groupId>
  <artifactId>traceai-java-anthropic</artifactId>
  <version>main-SNAPSHOT</version>
</dependency>

Gradle:
implementation 'com.github.future-agi.traceAI:traceai-java-anthropic:main-SNAPSHOT'

You also need the Anthropic Java SDK (any version):
Maven:
<dependency>
  <groupId>com.anthropic</groupId>
  <artifactId>anthropic-java</artifactId>
  <version>1.0.0</version>
</dependency>

Gradle:
implementation 'com.anthropic:anthropic-java:1.0.0'

Why reflection?
Unlike the OpenAI wrapper (which imports com.openai types directly), the Anthropic wrapper accepts Object for both the client and message params. This is intentional - the Anthropic Java SDK has changed its API surface across versions, and the reflection approach means traceai-java-anthropic works with any version without needing to match exact class signatures.
The tradeoff: your IDE won’t autocomplete the createMessage() parameter type. You pass the Anthropic SDK’s own MessageCreateParams object, but the compiler sees it as Object.
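To make the reflection approach concrete, here is a minimal, self-contained sketch of how a wrapper can invoke a method on an untyped client. The `FakeMessagesApi` class and the `create` method name are stand-ins invented for illustration; the real wrapper resolves the actual Anthropic SDK method for the installed version.

```java
import java.lang.reflect.Method;

// Stand-in for an SDK client class. In the real wrapper, the Anthropic
// client is held only as Object, so no SDK types appear in signatures.
class FakeMessagesApi {
    public String create(String params) {
        return "response-for:" + params;
    }
}

public class ReflectionSketch {
    // Look up a method by name on the client's runtime class and invoke it.
    // This compiles against Object only, so it works regardless of which
    // SDK version is on the classpath.
    static Object invoke(Object client, String methodName, Object arg) throws Exception {
        Method m = client.getClass().getMethod(methodName, arg.getClass());
        return m.invoke(client, arg);
    }

    public static void main(String[] args) throws Exception {
        Object client = new FakeMessagesApi(); // compiler sees only Object
        Object result = invoke(client, "create", "hello");
        System.out.println(result); // response-for:hello
    }
}
```

Because the method is resolved at runtime, a signature change in a new SDK release fails at invocation time rather than at compile time, which is the tradeoff the wrapper accepts for cross-version compatibility.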
Wrap the client
import ai.traceai.TraceAI;
import ai.traceai.anthropic.TracedAnthropicClient;
import com.anthropic.AnthropicClient;
import com.anthropic.AnthropicOkHttpClient;
TraceAI.initFromEnvironment();
// Create the Anthropic client normally
AnthropicClient client = AnthropicOkHttpClient.builder()
.apiKey(System.getenv("ANTHROPIC_API_KEY"))
.build();
// Wrap it - note the client is accepted as Object
TracedAnthropicClient traced = new TracedAnthropicClient(client);
Create a message
import com.anthropic.models.*;
Object response = traced.createMessage(
MessageCreateParams.builder()
.model("claude-sonnet-4-20250514")
.maxTokens(1024)
.system("You are a helpful assistant.")
.addMessage(MessageParam.builder()
.role(MessageParam.Role.USER)
.content("What is the capital of France?")
.build())
.build()
);
// Cast to the SDK's Message type
Message message = (Message) response;
System.out.println(message.content().get(0).text());
The createMessage() return type is generic (<T>), so you need to cast the result to the Anthropic SDK’s Message type. This is the cost of the reflection approach.
Span created: “Anthropic Message” with kind LLM
What gets captured
| Attribute | Example |
|---|---|
| llm.system | anthropic |
| llm.provider | anthropic |
| llm.request.model | claude-sonnet-4-20250514 |
| llm.response.model | claude-sonnet-4-20250514 |
| llm.response.id | msg_abc123 |
| llm.request.max_tokens | 1024 |
| llm.request.temperature | 0.7 |
| llm.token_count.prompt | 20 |
| llm.token_count.completion | 35 |
| llm.token_count.total | 55 |
| llm.response.finish_reason | end_turn |
| Input messages | System prompt + user messages as structured JSON |
| Output messages | Assistant response content blocks concatenated |
| fi.raw_input / fi.raw_output | Full request/response serialized |
The wrapper handles multi-block content (text blocks in the response are concatenated). System prompts are captured as a separate “system” role message in the input messages.
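The concatenation behavior can be illustrated with a small self-contained sketch. The `ContentBlock` record here is a stand-in for the SDK's response content blocks, which may mix text and non-text (e.g. tool-use) block types; only text blocks contribute to the captured output message.

```java
import java.util.List;
import java.util.stream.Collectors;

// Illustrative stand-in for a response content block: a type tag plus
// optional text (non-text blocks carry no text).
record ContentBlock(String type, String text) {}

public class ConcatSketch {
    // Join the text of all text-type blocks, skipping other block types,
    // mirroring how the wrapper flattens multi-block output for the span.
    static String concatText(List<ContentBlock> blocks) {
        return blocks.stream()
                .filter(b -> "text".equals(b.type()))
                .map(ContentBlock::text)
                .collect(Collectors.joining());
    }

    public static void main(String[] args) {
        List<ContentBlock> blocks = List.of(
                new ContentBlock("text", "Paris"),
                new ContentBlock("tool_use", null),
                new ContentBlock("text", " is the capital."));
        System.out.println(concatText(blocks)); // Paris is the capital.
    }
}
```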
Accessing the original client
Object original = traced.unwrap();
// Cast back if you need typed access
AnthropicClient anthropic = (AnthropicClient) original;