Experiment
Llama Index
1. Installation
Install the FutureAGI package to access the observability framework.
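A typical installation might look like the following; the package name is an assumption modeled on FutureAGI's published examples, so verify it against the current docs:

```shell
# Package name is an assumption; confirm the current name in FutureAGI's docs.
pip install fi-instrumentation
```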
2. Environment Configuration
Set up your environment variables to authenticate with FutureAGI services. These credentials enable:
- Authentication with FutureAGI's observability platform
- Encrypted telemetry data transmission
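One way to set the credentials from Python; the variable names `FI_API_KEY` and `FI_SECRET_KEY` are assumptions based on common FutureAGI examples, so confirm them in the platform docs:

```python
import os

# Variable names are assumptions; confirm against FutureAGI's documentation.
# Replace the placeholder values with the keys from your FutureAGI account.
os.environ["FI_API_KEY"] = "your-api-key"
os.environ["FI_SECRET_KEY"] = "your-secret-key"
```

Exporting the same variables in your shell profile works equally well and keeps secrets out of source code.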
3. Configure Evaluation Tags
Define evaluation criteria for monitoring LLM responses. Evaluation tags allow you to:
- Define custom evaluation criteria
- Set up automated response quality checks
- Track model performance metrics
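A sketch of what an evaluation-tag definition might look like. The module path, class, and enum names (`fi_instrumentation.fi_types`, `EvalTag`, `EvalName`, and so on) are assumptions modeled on FutureAGI's traceAI examples, not a confirmed API:

```python
# Hypothetical import path and names; check FutureAGI's docs for the real ones.
from fi_instrumentation.fi_types import (
    EvalName,
    EvalSpanKind,
    EvalTag,
    EvalTagType,
)

# One tag that runs a built-in toxicity evaluation on every LLM span.
eval_tags = [
    EvalTag(
        type=EvalTagType.OBSERVATION_SPAN,
        value=EvalSpanKind.LLM,
        eval_name=EvalName.TOXICITY,
        custom_eval_name="toxicity_check",
    )
]
```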
4. Initialize Trace Provider
Set up the trace provider to establish the observability pipeline. The trace provider:
- Creates a new project in FutureAGI
- Establishes telemetry data pipelines
- Configures version tracking
- Sets up evaluation frameworks
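The registration step above might be sketched as follows. The `register` function, `ProjectType` enum, and keyword arguments are assumptions drawn from FutureAGI's traceAI examples; treat this as a shape, not a confirmed signature:

```python
# Hypothetical API; confirm names and arguments in FutureAGI's docs.
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.EXPERIMENT,   # assumed enum value
    project_name="llama-index-demo",       # shows up as the project in FutureAGI
    eval_tags=eval_tags,                   # tags defined in the previous step
)
```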
5. Configure Llama Index Instrumentation
Initialize the Llama Index instrumentor to enable automatic tracing.
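A minimal sketch of the instrumentation call, assuming a `traceai_llamaindex` package exposing a `LlamaIndexInstrumentor` in the style of other traceAI instrumentors; verify the package and class names in the current docs:

```python
# Hypothetical package name; confirm in FutureAGI's docs.
from traceai_llamaindex import LlamaIndexInstrumentor

# Attach the instrumentor to the trace provider created during registration,
# so every Llama Index call is traced automatically.
LlamaIndexInstrumentor().instrument(tracer_provider=trace_provider)
```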
6. Install Required Dependencies
Install the necessary Llama Index components required for your project.
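For a basic retrieval pipeline this usually amounts to the core package; add integration packages (LLMs, embeddings, vector stores) as your project requires:

```shell
pip install llama-index
```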
7. Create Llama Index Components
Set up your Llama Index components with built-in observability.
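As a concrete example, a minimal retrieval setup using Llama Index's public API; the `data` directory is a placeholder for your own documents, and the default configuration assumes an OpenAI API key is available:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load documents from a local "data" directory (placeholder path).
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index and expose it as a query engine.
# With the instrumentor attached, these calls are traced automatically.
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
```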
8. Execute
Run your Llama Index application. Traces and evaluation results will appear in your FutureAGI project as requests flow through the instrumented components.
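Continuing the sketch above, issuing a query through the instrumented engine; the question text is illustrative:

```python
# Each query produces a trace (and any configured evaluations) in FutureAGI.
response = query_engine.query("What do the indexed documents say about observability?")
print(response)
```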