Experiment
LangChain
1. Installation
First, install the FutureAGI package to access its observability framework.
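A minimal install, assuming the SDK is published on PyPI under the name `futureagi`:

```shell
pip install futureagi
```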
2. Environment Configuration
Set up your environment variables to authenticate with both OpenAI and FutureAGI services. These credentials enable:
- Secure access to OpenAI’s language models
- Authentication with FutureAGI’s observability platform
- Encrypted telemetry data transmission
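The credentials above can be set from code as well as the shell. The FutureAGI variable names (`FI_API_KEY`, `FI_SECRET_KEY`) are assumptions; check the platform docs for the exact names your account uses:

```python
import os

# Placeholders only -- replace with your real keys.
# FI_API_KEY / FI_SECRET_KEY are assumed names for the FutureAGI credentials.
os.environ.setdefault("OPENAI_API_KEY", "sk-...")
os.environ.setdefault("FI_API_KEY", "your-fi-api-key")
os.environ.setdefault("FI_SECRET_KEY", "your-fi-secret-key")
```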
3. Configure Evaluation Tags
Define the evaluation criteria used to monitor LLM responses. Evaluation tags allow you to:
- Specify custom evaluation criteria
- Set up automated response quality checks
- Track model performance metrics
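The tags might be declared roughly as follows. This is a sketch: the `EvalTag` class, the enums, and their field names are assumptions about the SDK's interface, so verify them against the FutureAGI reference before use:

```python
# Module path, class, and enum names below are assumed -- confirm in the docs.
from fi_instrumentation.fi_types import (
    EvalName,
    EvalSpanKind,
    EvalTag,
    EvalTagType,
)

# One tag per automated quality check to run against traced spans.
eval_tags = [
    EvalTag(
        type=EvalTagType.OBSERVATION_SPAN,  # which span type the check applies to
        value=EvalSpanKind.LLM,             # evaluate LLM call spans
        eval_name=EvalName.TOXICITY,        # hypothetical built-in quality check
    ),
]
```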
4. Initialize Trace Provider
Set up the trace provider to establish the observability pipeline. The trace provider:
- Creates a new project in FutureAGI
- Establishes telemetry data pipelines
- Configures version tracking
- Sets up evaluation frameworks
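A sketch of the initialization, assuming the SDK exposes a `register` helper that creates the project and returns a tracer provider; the function name, `ProjectType` enum, and parameter names are assumptions:

```python
# Names assumed -- verify against the FutureAGI SDK reference.
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

# Creates (or reuses) a FutureAGI project and returns a tracer provider
# wired to the platform's telemetry pipeline.
trace_provider = register(
    project_type=ProjectType.EXPERIMENT,  # hypothetical project type
    project_name="langchain-demo",        # hypothetical project name
    eval_tags=eval_tags,                  # tags defined in step 3
)
```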
5. Configure LangChain Instrumentation
Initialize the LangChain instrumentor to enable automatic tracing.
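Assuming the instrumentor follows the common OpenTelemetry instrumentation pattern (the `traceai_langchain` package name is an assumption), enabling tracing is a single call:

```python
# Package and class names assumed -- confirm in the FutureAGI docs.
from traceai_langchain import LangChainInstrumentor

# Patches LangChain so every chain, LLM, and tool call emits a trace span
# to the tracer provider configured in the previous step.
LangChainInstrumentor().instrument(tracer_provider=trace_provider)
```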
6. Install Required Dependencies
Install the LangChain components required for your project.
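For example, a pipeline built on OpenAI chat models needs the core package and the OpenAI integration:

```shell
pip install langchain langchain-openai
```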
7. Create LangChain Components
Set up your LangChain pipeline with built-in observability.
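A minimal pipeline sketch using the LangChain expression syntax; the prompt text and model name are illustrative choices, not requirements:

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# prompt -> model -> parser; with the instrumentor active, each stage of
# this chain is traced automatically, with no extra observability code.
prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
model = ChatOpenAI(model="gpt-4o-mini")  # any chat model works here
chain = prompt | model | StrOutputParser()
```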
8. Execute
Run your LangChain application. With instrumentation enabled, traces and evaluation results are sent to FutureAGI automatically.
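Invoking the chain from the previous step triggers a traced run (this call requires a valid `OPENAI_API_KEY`):

```python
# Each invoke produces a trace; spans and evaluation results appear in the
# FutureAGI dashboard under the project configured during registration.
result = chain.invoke(
    {"text": "LangChain pipelines compose prompts, models, and parsers."}
)
print(result)
```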