Loom
The `Loom` module provides global configuration for OpenAI API settings. It is implemented as a singleton that maintains consistent settings across your application.
Basic Usage
```typescript
import { Loom } from 'loom-agents';

// Configure OpenAI API
Loom.api = 'completions'; // or 'responses'
Loom.openai_config = {
  apiKey: process.env.OPENAI_API_KEY,
  organization: process.env.OPENAI_ORG_ID
};

// Access the OpenAI client
const completion = await Loom.openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }]
});
```
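Because `Loom` is a singleton, configuration set once at startup applies everywhere the module is imported. A minimal sketch of that behavior (the file split shown is illustrative, not part of the library):

```typescript
// config.ts -- run once at application startup
import { Loom } from 'loom-agents';

Loom.api = 'responses';
Loom.openai_config = { apiKey: process.env.OPENAI_API_KEY };

// worker.ts -- imports the same singleton, so the settings above are visible here
import { Loom } from 'loom-agents';

console.log(Loom.api); // 'responses'
```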
Configuration Options
The `ConfigOptions` interface defines the available configuration:
```typescript
interface ConfigOptions {
  api?: 'completions' | 'responses'; // API mode to use
  openai_config?: ClientOptions;     // OpenAI client configuration
}
```
Setting API Mode
The `api` property controls which OpenAI endpoint to use:
```typescript
// Use OpenAI's Chat Completions API
Loom.api = 'completions';

// Use OpenAI's Responses API
Loom.api = 'responses';
```
The selected API mode determines which OpenAI endpoint Agents call when processing requests.
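As a rough sketch of what that means, the two modes correspond to different OpenAI endpoints; the branch below is illustrative only and does not show the actual Agent internals:

```typescript
import { Loom } from 'loom-agents';

// Illustrative mapping from API mode to OpenAI endpoint (not the Agent implementation).
async function ask(prompt: string): Promise<string> {
  if (Loom.api === 'responses') {
    // Responses API: single `input`, convenience `output_text` accessor
    const res = await Loom.openai.responses.create({
      model: 'gpt-4o',
      input: prompt,
    });
    return res.output_text;
  }

  // Chat Completions API: message array, content on the first choice
  const res = await Loom.openai.chat.completions.create({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: prompt }],
  });
  return res.choices[0].message.content ?? '';
}
```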
Configuring OpenAI Client
The `openai_config` property accepts any options supported by the OpenAI Node.js client:
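For example, a sketch passing a few common `ClientOptions` fields from the OpenAI Node.js SDK (the values shown are placeholders):

```typescript
import { Loom } from 'loom-agents';

Loom.openai_config = {
  apiKey: process.env.OPENAI_API_KEY,
  organization: process.env.OPENAI_ORG_ID,
  baseURL: 'https://api.openai.com/v1', // e.g. route through a proxy or gateway
  timeout: 60_000,                      // per-request timeout in milliseconds
  maxRetries: 2,                        // automatic retries on transient failures
};
```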