Quick Start

Lexica enables you to capture all inputs, outputs, and metadata from your LLM applications, whether they're hosted within Lexica or running in your own environment.

This guide will walk you through setting up observability for an OpenAI application running locally.

Note: If you create an application through the Lexica UI, tracing is enabled by default. No additional setup is required; simply open the observability view to see all your requests.

Step-by-Step Guide

1. Install Required Packages

First, install the Lexica SDK, OpenAI, and the OpenTelemetry instrumentor for OpenAI:

pip install -U Lexica openai opentelemetry-instrumentation-openai

2. Configure Environment Variables

If you're using Lexica Cloud or Enterprise Edition, you'll need an API key (for Lexica OSS running locally, set Lexica_HOST to your local instance instead):

  1. Visit the Lexica API Keys page.

  2. Click on Create New API Key and follow the prompts.

import os

# Provide your API key and point the SDK at your Lexica instance
os.environ["Lexica_API_KEY"] = "YOUR_Lexica_API_KEY"
os.environ["Lexica_HOST"] = "https://cloud.Lexica.ai"

3. Instrument Your Application

Below is a sample script to instrument an OpenAI application:

Explanation:

  • Import Libraries: Import Lexica, OpenAI, and the OpenTelemetry instrumentor.

  • Initialize Lexica: Call ag.init() to initialize the Lexica SDK.

  • Instrument OpenAI: Use OpenAIInstrumentor().instrument() to enable tracing for OpenAI calls.

4. View Traces in the Lexica UI

After running your application, you can view the captured traces in Lexica:

  1. Log in to your Lexica dashboard.

  2. Navigate to the Observability section.

  3. You'll see a list of traces corresponding to your application's requests.

Illustration of observability
