Experiment
Vertex AI
1. Installation
Install the FutureAGI package to access the observability framework.
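A minimal install command, assuming the package is published on PyPI under the name shown; confirm the exact package name against the FutureAGI documentation:

```shell
# Install the FutureAGI observability package (package name assumed;
# adjust if your distribution uses a different name)
pip install futureagi
```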
2. Environment Configuration
Set up your environment variables to authenticate with FutureAGI services. These credentials enable:
- Authentication with FutureAGI’s observability platform
- Encrypted telemetry data transmission
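The credentials can be supplied as environment variables. A sketch with placeholder values; the variable names follow FutureAGI's convention but should be verified against your dashboard:

```python
import os

# Placeholder credentials: replace with the API key and secret from your
# FutureAGI dashboard. The variable names are assumptions; confirm them
# against the current FutureAGI docs.
os.environ["FI_API_KEY"] = "your-api-key"
os.environ["FI_SECRET_KEY"] = "your-secret-key"
```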
3. Configure Evaluation Tags
Define evaluation criteria for monitoring LLM responses. Evaluation tags allow you to:
- Define custom evaluation criteria
- Set up automated response quality checks
- Track model performance metrics
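An illustrative sketch of what an evaluation-tag configuration might look like. The real tags come from the FutureAGI SDK; here they are modeled as plain dicts purely to show the shape of the configuration, and the criterion names ("toxicity", "conciseness") are examples, not required values:

```python
# Hypothetical evaluation-tag configuration, modeled as plain dicts.
# The actual EvalTag class and its fields come from the FutureAGI SDK.
eval_tags = [
    {
        "eval_name": "toxicity",            # automated response quality check
        "mapping": {"output": "response"},  # route model output into the eval
    },
    {
        "eval_name": "conciseness",
        "mapping": {"output": "response"},
    },
]

def tag_names(tags):
    """Return the evaluation criteria configured for monitoring."""
    return [t["eval_name"] for t in tags]
```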
4. Initialize Trace Provider
Set up the trace provider to establish the observability pipeline. The trace provider:
- Creates a new project in FutureAGI
- Establishes telemetry data pipelines
- Configures version tracking
- Sets up evaluation frameworks
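A configuration sketch of the registration call, assuming the SDK exposes a `register()` helper with the argument names shown; both the import path and the parameters are assumptions to check against the FutureAGI SDK reference:

```python
# Sketch only: import path, register() signature, and ProjectType values
# are assumptions; verify against the FutureAGI SDK documentation.
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.EXPERIMENT,   # matches the "Experiment" flow
    project_name="vertex-ai-demo",         # project created in FutureAGI
)
```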
5. Configure Vertex AI Instrumentation
Initialize the Vertex AI instrumentor to enable automatic tracing.
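A sketch of the instrumentation call. The package and class name follow the common instrumentor naming convention but are assumptions; confirm them against the FutureAGI docs:

```python
# Assumed instrumentor package and class name; verify against the docs.
from traceai_vertexai import VertexAIInstrumentor

# trace_provider is the object returned by the registration step above
VertexAIInstrumentor().instrument(tracer_provider=trace_provider)
```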
6. Install Required Dependencies
Install the necessary Vertex AI components required for your project.
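The Vertex AI SDK for Python is distributed as `google-cloud-aiplatform`:

```shell
# Google's Vertex AI SDK for Python
pip install google-cloud-aiplatform
```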
7. Create Vertex AI Components
Set up your Vertex AI components with built-in observability.
8. Execute
Run your Vertex AI application. Once instrumented, every model call is traced and evaluated automatically in your FutureAGI project.
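A minimal end-to-end sketch using the Vertex AI generative models API. The GCP project ID, region, and model name are placeholders; with the instrumentor active, this call is traced automatically:

```python
# Minimal Vertex AI call; project, location, and model are placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content("Summarize observability in one sentence.")
print(response.text)  # the traced call appears in your FutureAGI project
```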