LadgerTracer
The LadgerTracer class is the main entry point for the Ladger SDK. It manages span creation, batching, and communication with the Ladger API.
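A typical lifecycle, as a quick orientation before the full reference below (the flow name is a placeholder):

```typescript
import { LadgerTracer } from '@ladger/sdk';

const tracer = new LadgerTracer({
  apiKey: process.env.LADGER_API_KEY!,
  flowName: 'my-flow', // placeholder - use your own flow name
});

const span = tracer.startSpan('do-work');
// ... your work here ...
span.end();

// Spans are batched and sent in the background;
// shutdown() delivers anything still pending before exit.
await tracer.shutdown();
```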
Constructor
```typescript
new LadgerTracer(config: LadgerConfig)
```

Creates a new tracer instance.
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| config | LadgerConfig | Yes | Configuration object |
LadgerConfig
```typescript
interface LadgerConfig {
  apiKey: string;          // Required - starts with 'ladger_'
  flowName: string;        // Required - groups traces
  projectUrl?: string;     // Default: 'https://ladger.pages.dev/api'
  batchSize?: number;      // Default: 10
  flushInterval?: number;  // Default: 5000 (ms)
  debug?: boolean;         // Default: false
}
```

Example
```typescript
import { LadgerTracer } from '@ladger/sdk';

const tracer = new LadgerTracer({
  apiKey: process.env.LADGER_API_KEY!,
  flowName: 'customer-support-agent',
  batchSize: 20,
  flushInterval: 10000,
  debug: process.env.NODE_ENV === 'development',
});
```

Validation
The constructor validates:
- `apiKey` is provided and starts with `'ladger_'`
- `flowName` is provided
```typescript
// Throws: "apiKey is required"
new LadgerTracer({ flowName: 'app' });

// Throws: "flowName is required"
new LadgerTracer({ apiKey: 'ladger_...' });

// Throws: "apiKey must start with 'ladger_'"
new LadgerTracer({ apiKey: 'invalid', flowName: 'app' });
```

Methods
startSpan()
```typescript
startSpan(name: string, options?: SpanOptions): LadgerSpan
```

Creates and returns a new span.
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| name | string | Yes | Human-readable span name |
| options | SpanOptions | No | Span configuration |
SpanOptions
```typescript
interface SpanOptions {
  parent?: LadgerSpan; // Parent span for nesting
}
```

Example
```typescript
// Simple span
const span = tracer.startSpan('generate-response');
span.end();

// Nested spans
const parent = tracer.startSpan('handle-request');
const child = tracer.startSpan('call-openai', { parent });
child.end();
parent.end();
```

trace()
```typescript
trace<T>(name: string, fn: (span: LadgerSpan) => Promise<T>, options?: SpanOptions): Promise<T>
```

Wraps an async function with automatic span management.
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| name | string | Yes | Span name |
| fn | (span: LadgerSpan) => Promise<T> | Yes | Async function to wrap |
| options | SpanOptions | No | Span configuration |
Features
- Automatically calls `span.end()` after the function completes
- Captures errors and adds them as span attributes
- Re-throws errors after recording them (see the sketch below)
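Because errors are recorded and then re-thrown, existing try/catch handling keeps working while the failure is still captured on the span. A minimal sketch, assuming the `tracer` created in the constructor example; `fetchUserRecord` is a hypothetical stand-in for an operation that can fail:

```typescript
// Hypothetical stand-in for an operation that can reject
async function fetchUserRecord(id: string): Promise<unknown> {
  throw new Error(`user ${id} not found`);
}

try {
  await tracer.trace('lookup-user', async (span) => {
    span.setAttributes({ 'user.id': 'user-123' });
    return await fetchUserRecord('user-123');
  });
} catch (err) {
  // The span has already been ended with the error recorded as
  // attributes; the error still propagates to the caller.
  console.error('lookup failed:', err);
}
```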
Example
```typescript
const response = await tracer.trace('chat-completion', async (span) => {
  span.setAttributes({ 'user.id': 'user-123' });

  const result = await openai.chat.completions.create({ ... });

  span.recordCost({
    provider: 'openai',
    model: 'gpt-4o',
    inputTokens: result.usage?.prompt_tokens,
    outputTokens: result.usage?.completion_tokens,
  });

  return result.choices[0].message.content;
});
```

newSession()
```typescript
newSession(): string
```

Starts a new session and returns the session ID.
Returns
| Type | Description |
|---|---|
| string | New session ID (format: sess-{timestamp}-{random}) |
Example
```typescript
// Start a new conversation
const sessionId = tracer.newSession();
console.log('New session:', sessionId);

// All subsequent spans belong to this session
await chat('Hello!');
await chat('How are you?');
```
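The `chat()` calls above stand in for your own application code. A hypothetical helper might look like the sketch below; every span it creates is attributed to the session most recently started with `newSession()`:

```typescript
// Hypothetical helper - each call produces one span in the current session
async function chat(message: string): Promise<string> {
  return tracer.trace('chat-turn', async (span) => {
    span.setAttributes({ 'chat.message.length': message.length });
    // ... call your model or business logic here ...
    return `echo: ${message}`;
  });
}
```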
getSessionId()

```typescript
getSessionId(): string
```

Returns the current session ID.
Example
```typescript
const currentSession = tracer.getSessionId();
console.log('Current session:', currentSession);
```

getProjectUrl()
```typescript
getProjectUrl(): string
```

Returns the configured project URL.
Example
```typescript
console.log('Sending traces to:', tracer.getProjectUrl());
// Output: https://ladger.pages.dev/api
```

getPendingSpanCount()
```typescript
getPendingSpanCount(): number
```

Returns the number of spans waiting to be sent.
Example
```typescript
console.log('Pending spans:', tracer.getPendingSpanCount());
```

flush()
```typescript
flush(): Promise<void>
```

Manually sends all pending spans to the API.
Example
```typescript
// Force send after important operations
await tracer.flush();
```
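`flush()` is mostly useful in short-lived environments (serverless handlers, CLIs) where the interval-based flush may never get a chance to run. A sketch under that assumption; the handler shape and span name are placeholders:

```typescript
// Hypothetical serverless-style handler
export async function handler(event: unknown) {
  const result = await tracer.trace('handle-event', async () => {
    // ... your work here ...
    return { ok: true };
  });

  // The runtime may freeze or exit before the flush timer fires,
  // so push any queued spans explicitly before returning.
  if (tracer.getPendingSpanCount() > 0) {
    await tracer.flush();
  }

  return result;
}
```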
shutdown()

```typescript
shutdown(): Promise<void>
```

Gracefully shuts down the tracer. Clears the flush timer and sends any remaining spans.
Example
```typescript
process.on('SIGTERM', async () => {
  console.log('Shutting down...');
  await tracer.shutdown();
  process.exit(0);
});
```

instrument() (Coming Soon)
```typescript
instrument(client: unknown): void
```

Auto-instruments an AI provider client. Currently logs a placeholder message.
Planned Support
- OpenAI SDK
- Anthropic SDK
- LangChain
- LlamaIndex
Complete Example
```typescript
import { LadgerTracer } from '@ladger/sdk';
import OpenAI from 'openai';

const tracer = new LadgerTracer({
  apiKey: process.env.LADGER_API_KEY!,
  flowName: 'customer-support',
  debug: true,
});

const openai = new OpenAI();

async function handleCustomerQuery(query: string) {
  // Start new session per customer interaction
  tracer.newSession();

  return tracer.trace('handle-query', async (parentSpan) => {
    // Classify intent
    const intent = await tracer.trace('classify', async (span) => {
      const result = await openai.chat.completions.create({
        model: 'gpt-4o-mini',
        messages: [{ role: 'user', content: `Classify: ${query}` }],
      });

      span.recordCost({
        provider: 'openai',
        model: 'gpt-4o-mini',
        inputTokens: result.usage?.prompt_tokens,
        outputTokens: result.usage?.completion_tokens,
      });

      return result.choices[0].message.content;
    }, { parent: parentSpan });

    // Generate response
    return tracer.trace('respond', async (span) => {
      span.setAttributes({ intent });

      const result = await openai.chat.completions.create({
        model: 'gpt-4o',
        messages: [{ role: 'user', content: query }],
      });

      span.recordCost({
        provider: 'openai',
        model: 'gpt-4o',
        inputTokens: result.usage?.prompt_tokens,
        outputTokens: result.usage?.completion_tokens,
      });

      return result.choices[0].message.content;
    }, { parent: parentSpan });
  });
}

// Graceful shutdown
process.on('SIGTERM', () => tracer.shutdown());
```