OpenAI Integration

OpenAI calls can be automatically instrumented with Lexica using the opentelemetry-instrumentation-openai package. This guide shows how to set it up.

Installation

Install the required packages:

pip install -U Lexica openai opentelemetry-instrumentation-openai

Instrument OpenAI API Calls

1. Configure Environment Variables

For Lexica Cloud or Enterprise, set your API key and host as environment variables. If you are running Lexica OSS locally, point Lexica_HOST at your local instance instead.

import os

os.environ["Lexica_API_KEY"] = "YOUR_Lexica_API_KEY"
os.environ["Lexica_HOST"] = "https://cloud.Lexica.ai"

2. Initialize Lexica and Instrument OpenAI

import Lexica as ag
from opentelemetry.instrumentation.openai import OpenAIInstrumentor
import openai

ag.init()
OpenAIInstrumentor().instrument()

client = openai.OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Write a short story about AI."},
    ],
)

print(response.choices[0].message.content)

After running this code, Lexica will automatically capture the details of this API call.

Instrumenting a Workflow with a Parent Span

If you have a function or workflow that makes multiple calls you want to monitor as a single trace, you can use the @ag.instrument decorator.

Associating Traces with Applications

In the previous examples, the traces are instrumented in the global project scope; they are not associated with a specific application, variant, or environment. To link traces to specific parts of your Lexica projects, you can store references inside your instrumented functions.

Note: The function must be decorated with @ag.instrument in order to use ag.tracing.store_refs.

Complete Example

Here's the full code putting it all together:
