Quickstart
A step-by-step guide to sending your first trace.
Prerequisites
- Lumina running (Docker or Kubernetes)
- Node.js 18+ or Bun
Step 1: Install SDK
npm install @uselumina/sdk @anthropic-ai/sdk
Step 2: Create Test File
Create first-trace.ts:
import { initLumina } from '@uselumina/sdk';
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

const lumina = initLumina({
  endpoint: 'http://localhost:9411/v1/traces',
  service_name: 'quickstart',
});

async function main() {
  console.log('Sending first trace...');

  await lumina.traceLLM(
    () =>
      anthropic.messages.create({
        model: 'claude-sonnet-4-5',
        max_tokens: 100,
        messages: [{ role: 'user', content: 'Hello!' }],
      }),
    {
      name: 'hello-world',
      system: 'anthropic',
      prompt: 'Hello!',
    }
  );

  console.log('✓ Trace sent! View at http://localhost:3000/traces');
}

main();
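Conceptually, traceLLM wraps an async call, measures its latency, and attaches your metadata to the result before exporting it. The sketch below illustrates that pattern in plain TypeScript; the type and function names here are hypothetical and are not the actual @uselumina/sdk internals.

```typescript
// Hypothetical sketch of the wrap-time-and-annotate pattern behind traceLLM.
type TraceMeta = { name: string; system: string; prompt: string };

type TraceRecord = TraceMeta & {
  latencyMs: number;   // how long the wrapped call took
  response: unknown;   // whatever the wrapped call returned
};

async function traceCall<T>(
  fn: () => Promise<T>,
  meta: TraceMeta
): Promise<TraceRecord> {
  const start = Date.now();
  const response = await fn(); // run the wrapped LLM call
  return { ...meta, latencyMs: Date.now() - start, response };
}

// Usage with a stubbed "LLM" call standing in for anthropic.messages.create:
traceCall(async () => 'Hi there!', {
  name: 'hello-world',
  system: 'anthropic',
  prompt: 'Hello!',
}).then((rec) => console.log(rec.name, rec.latencyMs));
```

The real SDK additionally exports the record to the endpoint you configured in initLumina, which is why the trace appears in the dashboard.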
Step 3: Run
Make sure ANTHROPIC_API_KEY is set in your environment, then run the file, for example with tsx (Node.js):
npx tsx first-trace.ts
Or with Bun:
bun first-trace.ts
Step 4: View in Dashboard
- Open http://localhost:3000/traces
- Find your trace with the service name quickstart
- Click to view details
You should see:
- Automatic cost calculation
- Token counts
- Latency measurement
- Full prompt and response
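The cost figure is derived from the token counts on the trace. As a rough illustration of that arithmetic (the per-token rates below are placeholders, not Anthropic's actual pricing):

```typescript
// Hypothetical USD rates per million tokens, for illustration only.
const RATES_PER_MTOK = { input: 3.0, output: 15.0 };

// Estimate cost from the input/output token counts recorded on a trace.
function estimateCost(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * RATES_PER_MTOK.input +
    (outputTokens / 1_000_000) * RATES_PER_MTOK.output
  );
}

console.log(estimateCost(12, 9).toFixed(6)); // cost of a tiny hello-world trace
```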
Next Steps
- SDK Documentation: learn SDK usage patterns
- Multi-Span Tracing: track complex workflows
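To preview the multi-span idea: a workflow trace is a parent span with timed child spans nested under it. This is a conceptual sketch in plain TypeScript with made-up types, not the Lumina SDK API; see the Multi-Span Tracing page for the real one.

```typescript
// Conceptual model of a multi-span trace: spans with timings and children.
type Span = { name: string; startMs: number; endMs: number; children: Span[] };

// Run `work` inside a named span, recording start and end times.
function span(name: string, work: (s: Span) => void): Span {
  const s: Span = { name, startMs: Date.now(), endMs: 0, children: [] };
  work(s);
  s.endMs = Date.now();
  return s;
}

// A workflow with two child steps nested under a root span.
const root = span('workflow', (parent) => {
  parent.children.push(span('retrieve-context', () => {}));
  parent.children.push(span('llm-call', () => {}));
});

console.log(root.children.map((c) => c.name));
```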