Use the OpenAI Instrumentor to instrument your project, since the OpenAI client is used to interact with Ollama. This ensures that all interactions are traced and monitored. If you are using a different client to interact with Ollama, use that client's Instrumentor instead.
from traceai_openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument(tracer_provider=trace_provider)
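The trace_provider passed to instrument() is assumed to have been created earlier, typically by the platform's project registration or setup helper. If you are wiring things up by hand, a plain OpenTelemetry tracer provider also works; the snippet below is only a minimal stand-in for illustration, not the platform's own setup call.

# Minimal stand-in tracer provider using the OpenTelemetry SDK.
# Replace the console exporter with the provider returned by your
# platform's setup helper in a real deployment.
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter

trace_provider = TracerProvider()
trace_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))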
Interact with Ollama as you normally would. Our Instrumentor will automatically trace and send the telemetry data to our platform.
Make sure that Ollama is running and accessible from your project.
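As a quick sanity check, you can hit the local Ollama endpoint before sending any chat requests. This sketch assumes Ollama's default port (11434); adjust the URL if your setup differs.

import urllib.request

# Connectivity check against the local Ollama server (default port 11434).
# The root endpoint returns a short status message when the server is up.
try:
    with urllib.request.urlopen("http://localhost:11434", timeout=5) as resp:
        print(resp.read().decode())  # e.g. "Ollama is running"
except OSError as exc:
    raise SystemExit(f"Ollama does not appear to be reachable: {exc}")

Once the server responds, run the chat completion example below as usual.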
from openai import OpenAI

client = OpenAI(
    base_url='http://localhost:11434/v1',
    api_key='ollama',  # required by the client, but ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3.2:1b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is OpenAI?"},
    ],
)

print(response.choices[0].message.content)