## 1. Installation

Install the FutureAGI package to access the observability framework.

```bash
pip install futureagi
```
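
To confirm the installation, import the package's top-level module; the `futureagi` distribution installs as `fi`, which all the snippets below rely on:

```python
import fi  # the futureagi distribution ships the `fi` package used throughout this guide
```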

## 2. Environment Configuration

Set up your environment variables to authenticate with FutureAGI services. These credentials enable:

- Authentication with FutureAGI’s observability platform
- Encrypted telemetry data transmission

```python
import os

os.environ["FI_API_KEY"] = "your-futureagi-api-key"
os.environ["FI_SECRET_KEY"] = "your-futureagi-secret-key"  # issued alongside the API key
```
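
If you prefer to keep credentials out of source code, one common pattern (assuming the `python-dotenv` package and a local `.env` file) is to load them at startup:

```python
from dotenv import load_dotenv

load_dotenv()  # reads FI_API_KEY and other secrets from a .env file into os.environ
```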


## 3. Configure Evaluation Tags
Define evaluation criteria for monitoring LLM responses. Evaluation tags allow you to:
- Define custom evaluation criteria
- Set up automated response quality checks
- Track model performance metrics

```python
from fi.integrations.otel.types import EvalName, EvalSpanKind, EvalTag, EvalTagType

eval_tags = [
    EvalTag(
        eval_name=EvalName.DETERMINISTIC_EVALS,  # built-in deterministic evaluation
        value=EvalSpanKind.TOOL,                 # span kind the evaluation attaches to
        type=EvalTagType.OBSERVATION_SPAN,
        config={
            "multi_choice": False,               # a single choice is expected per check
            "choices": ["Yes", "No"],            # allowed evaluation outcomes
            "rule_prompt": "Evaluate if the response is correct",
        },
        custom_eval_name="det_eval_haystack_1",  # name under which results are reported
    )
]
```

## 4. Initialize Trace Provider

Set up the trace provider to establish the observability pipeline. The trace provider:

- Creates a new project in FutureAGI
- Establishes telemetry data pipelines
- Configures version tracking
- Sets up evaluation frameworks

```python
from fi.integrations.otel import register
from fi.integrations.otel.types import ProjectType

trace_provider = register(
    project_type=ProjectType.EXPERIMENT,
    project_name="haystack_app",
    project_version_name="v1",
    eval_tags=eval_tags,
)
```

## 5. Configure Haystack Instrumentation

Initialize the Haystack instrumentor to enable automatic tracing. Call this before running your pipeline so that its component calls are captured.

```python
from fi.integrations.otel import HaystackInstrumentor

HaystackInstrumentor().instrument(tracer_provider=trace_provider)
```

## 6. Install Required Dependencies

Install Haystack and the other libraries the example pipeline uses. Note that Haystack 2.x is published on PyPI as `haystack-ai`; `datasets` provides the sample corpus and `sentence-transformers` backs the embedders.

```bash
pip install haystack-ai datasets sentence-transformers
```

## 7. Create Haystack Components

Set up your Haystack components with built-in observability. The example builds a small RAG pipeline over the Seven Wonders dataset. Note that `OpenAIGenerator` reads the `OPENAI_API_KEY` environment variable, so set it before running.

```python
from datasets import load_dataset
from haystack import Document, Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.embedders import (
    SentenceTransformersDocumentEmbedder,
    SentenceTransformersTextEmbedder,
)
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()

# Load the sample corpus and wrap each record as a Haystack Document
dataset = load_dataset("bilgeyucel/seven-wonders", split="train")
docs = [Document(content=doc["content"], meta=doc["meta"]) for doc in dataset]

# Embed the documents and index them in the in-memory store
doc_embedder = SentenceTransformersDocumentEmbedder(
    model="sentence-transformers/all-MiniLM-L6-v2"
)
doc_embedder.warm_up()  # downloads/loads the embedding model

docs_with_embeddings = doc_embedder.run(docs)
document_store.write_documents(docs_with_embeddings["documents"])

# Query-time components: embed the question, retrieve, build a prompt, generate
text_embedder = SentenceTransformersTextEmbedder(
    model="sentence-transformers/all-MiniLM-L6-v2"
)

retriever = InMemoryEmbeddingRetriever(document_store)

# Jinja2 template filled in by PromptBuilder at run time
template = """
Given the following information, answer the question.

Context:
{% for document in documents %}
    {{ document.content }}
{% endfor %}

Question: {{question}}
Answer:
"""

prompt_builder = PromptBuilder(template=template)

generator = OpenAIGenerator(model="gpt-3.5-turbo")  # requires OPENAI_API_KEY

# Assemble the pipeline and wire component outputs to inputs
basic_rag_pipeline = Pipeline()
basic_rag_pipeline.add_component("text_embedder", text_embedder)
basic_rag_pipeline.add_component("retriever", retriever)
basic_rag_pipeline.add_component("prompt_builder", prompt_builder)
basic_rag_pipeline.add_component("llm", generator)

basic_rag_pipeline.connect("text_embedder.embedding", "retriever.query_embedding")
basic_rag_pipeline.connect("retriever", "prompt_builder.documents")
basic_rag_pipeline.connect("prompt_builder", "llm")
```

## 8. Execute

Run your Haystack application. With instrumentation in place, each pipeline run is traced and evaluated automatically.

```python
question = "What does Rhodes Statue look like?"

response = basic_rag_pipeline.run(
    {"text_embedder": {"text": question}, "prompt_builder": {"question": question}}
)

print(response["llm"]["replies"][0])
```