Common patterns for using the Lumina SDK.

Single LLM Call

const response = await lumina.traceLLM(
  () =>
    openai.chat.completions.create({
      model: 'gpt-4',
      messages: [{ role: 'user', content: 'Hello!' }],
    }),
  {
    name: 'chat-completion',
    system: 'openai',
    prompt: 'Hello!',
  }
);
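Conceptually, a wrapper like `traceLLM` times the wrapped call and records whether it succeeded before returning or re-throwing. A minimal self-contained sketch of that pattern — the `Span` shape and in-memory recorder here are illustrative assumptions, not the Lumina internals:

```typescript
// Sketch of a traceLLM-style wrapper: time the call, record the outcome.
// The Span shape and `spans` array are assumptions for illustration only.
interface Span {
  name: string;
  system?: string;
  durationMs: number;
  status: 'ok' | 'error';
}

const spans: Span[] = [];

async function traceLLM<T>(
  fn: () => Promise<T>,
  opts: { name: string; system?: string }
): Promise<T> {
  const start = Date.now();
  try {
    const result = await fn();
    spans.push({ ...opts, durationMs: Date.now() - start, status: 'ok' });
    return result;
  } catch (err) {
    spans.push({ ...opts, durationMs: Date.now() - start, status: 'error' });
    throw err; // re-throw so callers still observe the failure
  }
}
```

The key property is that the wrapper is transparent: the caller gets the same resolved value or thrown error it would get without tracing.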

Adding Metadata

await lumina.traceLLM(
  () => llm.generate(prompt),
  {
    name: 'chat',
    metadata: {
      userId: 'user-123',
      sessionId: 'session-456',
    },
  }
);
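When many call sites share the same `userId` and `sessionId`, repeating the metadata object gets tedious. One way to avoid that is a small convenience wrapper that merges shared metadata into every trace call — note that `withCommonMetadata` is a hypothetical helper sketched here, not part of the Lumina SDK:

```typescript
// Hypothetical helper (not part of the Lumina SDK): bakes shared metadata
// into every trace call so userId/sessionId are not repeated per call site.
type TraceOpts = { name: string; metadata?: Record<string, string> };

function withCommonMetadata(
  trace: <T>(fn: () => Promise<T>, opts: TraceOpts) => Promise<T>,
  common: Record<string, string>
) {
  // Per-call metadata is spread last, so it wins on key collisions.
  return <T>(fn: () => Promise<T>, opts: TraceOpts): Promise<T> =>
    trace(fn, { ...opts, metadata: { ...common, ...opts.metadata } });
}
```

Usage would look like `const traced = withCommonMetadata(lumina.traceLLM, { userId: 'user-123' })`, after which each call only supplies what varies.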

Error Handling

try {
  await lumina.traceLLM(
    () => llm.generate(prompt),
    { name: 'chat' }
  );
} catch (error) {
  // Error automatically recorded
  console.error('LLM call failed:', error);
  throw error;
}
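Because the error is re-thrown after being recorded, you can layer retry logic on top, and each attempt still gets its own trace. A sketch of a generic retry helper — `withRetries`, its default attempt count, and the backoff values are illustrative, not part of the SDK:

```typescript
// Illustrative retry helper; withRetries is not part of the Lumina SDK.
// Failed attempts before the last one wait with a delay that doubles
// each time (simple exponential backoff), then the call is retried.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 250
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts - 1) {
        await new Promise((resolve) =>
          setTimeout(resolve, baseDelayMs * 2 ** attempt)
        );
      }
    }
  }
  throw lastError;
}
```

Wrapping the traced call, e.g. `withRetries(() => lumina.traceLLM(() => llm.generate(prompt), { name: 'chat' }))`, would record every failed attempt as an errored trace before the final result or error surfaces.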

Next Steps