Integrate Lumina with LlamaIndex applications.

Installation

npm install @uselumina/sdk llamaindex

Basic Usage

Initialize Lumina once at startup, then wrap individual LLM calls with traceLLM to record each call as a span:

import { initLumina } from '@uselumina/sdk';
import { OpenAI } from 'llamaindex';

const lumina = initLumina({
  endpoint: 'http://localhost:9411/v1/traces',
  service_name: 'llamaindex-app',
});

const llm = new OpenAI({ model: 'gpt-4' });

const response = await lumina.traceLLM(
  () => llm.complete({ prompt: 'Hello!' }),
  { name: 'llamaindex-complete', system: 'openai' }
);
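Conceptually, a traceLLM-style wrapper runs the operation, times it, and records the outcome. The sketch below is a hypothetical, self-contained illustration of that pattern, not Lumina's actual implementation; the `Span` shape and `traceCall` helper are invented for this example:

```typescript
// Hypothetical illustration of the traceLLM pattern: time an async
// operation and record its name, system, duration, and outcome.
interface Span {
  name: string;
  system: string;
  durationMs: number;
  status: 'ok' | 'error';
}

const spans: Span[] = [];

async function traceCall<T>(
  fn: () => Promise<T>,
  opts: { name: string; system: string }
): Promise<T> {
  const start = Date.now();
  try {
    const result = await fn();
    spans.push({ ...opts, durationMs: Date.now() - start, status: 'ok' });
    return result;
  } catch (err) {
    // Record the failure before rethrowing so the span is never lost.
    spans.push({ ...opts, durationMs: Date.now() - start, status: 'error' });
    throw err;
  }
}
```

Because the wrapper returns the operation's result unchanged, you can drop it around any existing `llm.complete` or `llm.chat` call without altering the surrounding code.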

Query Engine

Wrap a retrieval-augmented query in an outer trace span so the query-engine LLM call is recorded as a nested child of the overall RAG operation:

import { VectorStoreIndex, SimpleDirectoryReader } from 'llamaindex';

const documents = await new SimpleDirectoryReader().loadData('./docs');
const index = await VectorStoreIndex.fromDocuments(documents);
const queryEngine = index.asQueryEngine();

await lumina.trace('rag-query', async () => {
  const response = await lumina.traceLLM(
    () => queryEngine.query({ query: 'What is AI?' }),
    { name: 'query-engine', system: 'openai' }
  );
  return response;
});
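The outer trace call establishes a parent span, and spans started while it is active are attached as children, which is how the query-engine call above ends up nested under rag-query. A minimal, self-contained sketch of that parent-child bookkeeping (the `NestedSpan` shape and stack-based `trace` helper are hypothetical, not Lumina internals):

```typescript
// Hypothetical sketch of nested tracing: a stack of active span names
// determines each new span's parent.
interface NestedSpan {
  name: string;
  parent: string | null;
}

const recorded: NestedSpan[] = [];
const stack: string[] = [];

async function trace<T>(name: string, fn: () => Promise<T>): Promise<T> {
  // The currently active span (top of the stack) becomes the parent.
  recorded.push({ name, parent: stack[stack.length - 1] ?? null });
  stack.push(name);
  try {
    return await fn();
  } finally {
    // Pop even on error so later spans get the correct parent.
    stack.pop();
  }
}
```

Running `trace('rag-query', () => trace('query-engine', ...))` with this sketch records rag-query with no parent and query-engine with rag-query as its parent, mirroring the span hierarchy a tracing backend would display.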