# LadgerTracer

The `LadgerTracer` class is the main entry point for the Ladger SDK. It manages span creation, batching, and communication with the Ladger API.

## Constructor

```ts
new LadgerTracer(config: LadgerConfig)
```

Creates a new tracer instance.

### Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `config` | `LadgerConfig` | Yes | Configuration object |

### LadgerConfig

```ts
interface LadgerConfig {
  apiKey: string;         // Required - starts with 'ladger_'
  flowName: string;       // Required - groups traces
  projectUrl?: string;    // Default: 'https://ladger.pages.dev/api'
  batchSize?: number;     // Default: 10
  flushInterval?: number; // Default: 5000 (ms)
  debug?: boolean;        // Default: false
}
```

### Example

```ts
import { LadgerTracer } from '@ladger/sdk';

const tracer = new LadgerTracer({
  apiKey: process.env.LADGER_API_KEY!,
  flowName: 'customer-support-agent',
  batchSize: 20,
  flushInterval: 10000,
  debug: process.env.NODE_ENV === 'development',
});
```

### Validation

The constructor validates:

- `apiKey` is provided and starts with `'ladger_'`
- `flowName` is provided

```ts
// Throws: "apiKey is required"
new LadgerTracer({ flowName: 'app' });

// Throws: "flowName is required"
new LadgerTracer({ apiKey: 'ladger_...' });

// Throws: "apiKey must start with 'ladger_'"
new LadgerTracer({ apiKey: 'invalid', flowName: 'app' });
```

## Methods

### startSpan()

```ts
startSpan(name: string, options?: SpanOptions): LadgerSpan
```

Creates and returns a new span.

#### Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `name` | `string` | Yes | Human-readable span name |
| `options` | `SpanOptions` | No | Span configuration |

#### SpanOptions

```ts
interface SpanOptions {
  parent?: LadgerSpan; // Parent span for nesting
}
```

#### Example

```ts
// Simple span
const span = tracer.startSpan('generate-response');
span.end();

// Nested spans
const parent = tracer.startSpan('handle-request');
const child = tracer.startSpan('call-openai', { parent });
child.end();
parent.end();
```

### trace()

```ts
trace<T>(name: string, fn: (span: LadgerSpan) => Promise<T>, options?: SpanOptions): Promise<T>
```

Wraps an async function with automatic span management.

#### Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `name` | `string` | Yes | Span name |
| `fn` | `(span: LadgerSpan) => Promise<T>` | Yes | Async function to wrap |
| `options` | `SpanOptions` | No | Span configuration |

#### Features

- Automatically calls `span.end()` after the function completes
- Captures errors and adds them as span attributes
- Re-throws errors after recording them
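The end/record/re-throw behavior above can be sketched as a minimal wrapper. This is illustrative only — `traceLike` and its `Span` shape are hypothetical stand-ins, not the SDK's internals:

```ts
// Hypothetical minimal re-implementation of trace()'s wrapper logic.
// `Span` here is a stand-in shape, not the SDK's LadgerSpan.
type Span = { attrs: Record<string, unknown>; ended: boolean };

async function traceLike<T>(
  name: string,
  fn: (span: Span) => Promise<T>,
): Promise<T> {
  const span: Span = { attrs: { name }, ended: false };
  try {
    return await fn(span);
  } catch (err) {
    // Capture the error as span attributes, then re-throw it
    span.attrs['error'] = true;
    span.attrs['error.message'] = err instanceof Error ? err.message : String(err);
    throw err;
  } finally {
    span.ended = true; // span.end() runs whether fn succeeds or throws
  }
}
```

With this shape, callers still see the original error, while the span records it and is always ended.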

#### Example

```ts
const response = await tracer.trace('chat-completion', async (span) => {
  span.setAttributes({ 'user.id': 'user-123' });

  const result = await openai.chat.completions.create({ ... });

  span.recordCost({
    provider: 'openai',
    model: 'gpt-4o',
    inputTokens: result.usage?.prompt_tokens,
    outputTokens: result.usage?.completion_tokens,
  });

  return result.choices[0].message.content;
});
```

### newSession()

```ts
newSession(): string
```

Starts a new session and returns the session ID.

#### Returns

| Type | Description |
| --- | --- |
| `string` | New session ID (format: `sess-{timestamp}-{random}`) |
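The documented format can be illustrated with a hypothetical generator — the SDK's actual scheme for the random suffix is not specified here:

```ts
// Illustrates the documented sess-{timestamp}-{random} format.
// Hypothetical -- the SDK's real generator may differ.
function makeSessionId(): string {
  const random = Math.random().toString(36).slice(2, 10); // short base-36 suffix
  return `sess-${Date.now()}-${random}`;
}
```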

#### Example

```ts
// Start a new conversation
const sessionId = tracer.newSession();
console.log('New session:', sessionId);

// All subsequent spans belong to this session
await chat('Hello!');
await chat('How are you?');
```

### getSessionId()

```ts
getSessionId(): string
```

Returns the current session ID.

#### Example

```ts
const currentSession = tracer.getSessionId();
console.log('Current session:', currentSession);
```

### getProjectUrl()

```ts
getProjectUrl(): string
```

Returns the configured project URL.

#### Example

```ts
console.log('Sending traces to:', tracer.getProjectUrl());
// Output: https://ladger.pages.dev/api
```

### getPendingSpanCount()

```ts
getPendingSpanCount(): number
```

Returns the number of spans waiting to be sent.

#### Example

```ts
console.log('Pending spans:', tracer.getPendingSpanCount());
```

### flush()

```ts
flush(): Promise<void>
```

Manually sends all pending spans to the API.

#### Example

```ts
// Force send after important operations
await tracer.flush();
```

### shutdown()

```ts
shutdown(): Promise<void>
```

Gracefully shuts down the tracer. Clears the flush timer and sends remaining spans.
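That sequence can be sketched as follows — `shutdownLike` is a hypothetical name, not an SDK internal:

```ts
// Hypothetical sketch of shutdown(): stop the periodic flush timer,
// then drain whatever is still buffered.
async function shutdownLike(
  flushTimer: ReturnType<typeof setInterval>,
  flush: () => Promise<void>,
): Promise<void> {
  clearInterval(flushTimer); // no more timed flushes
  await flush();             // send remaining spans
}
```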

#### Example

```ts
process.on('SIGTERM', async () => {
  console.log('Shutting down...');
  await tracer.shutdown();
  process.exit(0);
});
```

### instrument() (Coming Soon)

```ts
instrument(client: unknown): void
```

Auto-instruments an AI provider client. Currently logs a placeholder message.

#### Planned Support

- OpenAI SDK
- Anthropic SDK
- LangChain
- LlamaIndex

## Complete Example

```ts
import { LadgerTracer } from '@ladger/sdk';
import OpenAI from 'openai';

const tracer = new LadgerTracer({
  apiKey: process.env.LADGER_API_KEY!,
  flowName: 'customer-support',
  debug: true,
});

const openai = new OpenAI();

async function handleCustomerQuery(query: string) {
  // Start a new session per customer interaction
  tracer.newSession();

  return tracer.trace('handle-query', async (parentSpan) => {
    // Classify intent
    const intent = await tracer.trace('classify', async (span) => {
      const result = await openai.chat.completions.create({
        model: 'gpt-4o-mini',
        messages: [{ role: 'user', content: `Classify: ${query}` }],
      });
      span.recordCost({
        provider: 'openai',
        model: 'gpt-4o-mini',
        inputTokens: result.usage?.prompt_tokens,
        outputTokens: result.usage?.completion_tokens,
      });
      return result.choices[0].message.content;
    }, { parent: parentSpan });

    // Generate response
    return tracer.trace('respond', async (span) => {
      span.setAttributes({ intent });
      const result = await openai.chat.completions.create({
        model: 'gpt-4o',
        messages: [{ role: 'user', content: query }],
      });
      span.recordCost({
        provider: 'openai',
        model: 'gpt-4o',
        inputTokens: result.usage?.prompt_tokens,
        outputTokens: result.usage?.completion_tokens,
      });
      return result.choices[0].message.content;
    }, { parent: parentSpan });
  });
}

// Graceful shutdown
process.on('SIGTERM', () => tracer.shutdown());
```