Quick Start
Lexica enables you to capture all inputs, outputs, and metadata from your LLM applications, whether they're hosted within Lexica or running in your own environment.
This guide will walk you through setting up observability for an OpenAI application running locally.
note
If you create an application through the Lexica UI, tracing is enabled by default. No additional setup is required; simply open the observability view to see all your requests.
Step-by-Step Guide
1. Install Required Packages
First, install the Lexica SDK, OpenAI, and the OpenTelemetry instrumentor for OpenAI:
pip install -U Lexica openai opentelemetry-instrumentation-openai
2. Configure Environment Variables
If you're using Lexica Cloud or Enterprise Edition, you'll need an API key:
Visit the Lexica API Keys page.
Click on Create New API Key and follow the prompts.
import os
os.environ["Lexica_API_KEY"] = "YOUR_Lexica_API_KEY"
os.environ["Lexica_HOST"] = "https://cloud.Lexica.ai"
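If you prefer to keep credentials out of your code, the same variables can be exported in your shell before running the script (the values shown are placeholders):

```shell
export Lexica_API_KEY="YOUR_Lexica_API_KEY"
export Lexica_HOST="https://cloud.Lexica.ai"
```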
3. Instrument Your Application
Below is a sample script to instrument an OpenAI application:
import Lexica as ag
from openai import OpenAI
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

ag.init()
OpenAIInstrumentor().instrument()

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a short story about AI Engineering."},
    ],
)
print(response.choices[0].message.content)
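The script above also assumes your OpenAI credentials are already configured in the environment. A small guard such as the hypothetical require_env helper below (not part of the Lexica SDK) lets the script fail fast with an actionable message when a variable is missing:

```python
import os

def require_env(name: str) -> str:
    # Hypothetical helper, not part of the Lexica SDK: fetch a required
    # environment variable or raise a clear error telling the user what to set.
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before running the app.")
    return value

# Example usage: verify credentials before initializing any clients.
# require_env("OPENAI_API_KEY")
```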
Explanation:
Import Libraries: Import Lexica, OpenAI, and the OpenTelemetry instrumentor.
Initialize Lexica: Call ag.init() to initialize the Lexica SDK.
Instrument OpenAI: Use OpenAIInstrumentor().instrument() to enable tracing for OpenAI calls.
4. View Traces in the Lexica UI
After running your application, you can view the captured traces in Lexica:
Log in to your Lexica dashboard.
Navigate to the Observability section.
You'll see a list of traces corresponding to your application's requests.