# Introduction

A unified infrastructure for converting between different LLM provider APIs.
## What is Amux?
Amux is a bidirectional API adapter that enables seamless conversion between different LLM provider formats.
Key benefits:

- **Provider flexibility** - Switch between OpenAI, Anthropic, DeepSeek, and others without changing your code
- **Bidirectional conversion** - Accept requests in any format, call any provider, return responses in any format
- **Type-safe** - Full TypeScript support with comprehensive type definitions
- **Zero dependencies** - The core package has no runtime dependencies
- **Production-ready** - Used in production applications
Unlike other solutions that only provide a unified interface, Amux enables true bidirectional conversion - you control both the input and output formats.
## Quick Example
```typescript
import { createBridge } from '@amux.ai/llm-bridge'
import { openaiAdapter } from '@amux.ai/adapter-openai'
import { anthropicAdapter } from '@amux.ai/adapter-anthropic'

// Create a bridge: OpenAI format in → Anthropic API out
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: {
    apiKey: process.env.ANTHROPIC_API_KEY
  }
})

// Send an OpenAI-format request, get an OpenAI-format response -
// but it actually calls the Claude API under the hood
const response = await bridge.chat({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }]
})

console.log(response.choices[0].message.content)
```

## How It Works
Amux uses an Intermediate Representation (IR) pattern to convert between any two provider formats:
```
Your App (OpenAI format)
        ↓
Inbound Adapter → Parse to IR
        ↓
Intermediate Representation (unified format)
        ↓
Outbound Adapter → Build Anthropic format
        ↓
Claude API
        ↓
Response flows back through IR
        ↓
Your App (OpenAI format)
```

The IR is a unified format that captures all LLM capabilities - messages, tools, streaming, multimodal content, and provider-specific extensions.
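To make the inbound/outbound split concrete, here is a toy sketch of the IR pattern in plain TypeScript. This is illustrative only, not the real `@amux.ai/llm-bridge` internals: the `IRMessage`/`IRRequest` types and the `parseOpenAI`/`buildAnthropic` helpers are made up for this example, and real adapters also handle tools, streaming, and multimodal content.

```typescript
// Unified intermediate representation (illustrative, heavily simplified)
interface IRMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

interface IRRequest {
  model: string
  messages: IRMessage[]
}

// Inbound side: parse an OpenAI-style chat request into the IR
function parseOpenAI(req: { model: string; messages: IRMessage[] }): IRRequest {
  return { model: req.model, messages: [...req.messages] }
}

// Outbound side: build an Anthropic-style request from the IR.
// Anthropic's Messages API takes the system prompt as a top-level
// field rather than as a message, so the adapter hoists it out.
function buildAnthropic(ir: IRRequest): {
  model: string
  system?: string
  messages: IRMessage[]
} {
  const out: { model: string; system?: string; messages: IRMessage[] } = {
    model: ir.model,
    messages: ir.messages.filter((m) => m.role !== 'system')
  }
  const system = ir.messages.find((m) => m.role === 'system')
  if (system) out.system = system.content
  return out
}

const anthropicReq = buildAnthropic(
  parseOpenAI({
    model: 'gpt-4',
    messages: [
      { role: 'system', content: 'Be concise.' },
      { role: 'user', content: 'Hello!' }
    ]
  })
)

console.log(anthropicReq.system) // 'Be concise.'
console.log(anthropicReq.messages) // only the user message remains
```

Because both directions go through the IR, adding a new provider means writing one adapter pair rather than a converter for every provider combination.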
## Supported Providers
| Provider | Package | Features |
|---|---|---|
| OpenAI | `@amux.ai/adapter-openai` | Chat, streaming, tools, vision, JSON mode |
| Anthropic | `@amux.ai/adapter-anthropic` | Messages, streaming, tools, vision, reasoning |
| DeepSeek | `@amux.ai/adapter-deepseek` | Chat, streaming, tools, reasoning |
| Moonshot | `@amux.ai/adapter-moonshot` | Chat, streaming, tools |
| Zhipu (GLM) | `@amux.ai/adapter-zhipu` | Chat, streaming, tools |
| Qwen | `@amux.ai/adapter-qwen` | Chat, streaming, tools, vision |
| Gemini | `@amux.ai/adapter-google` | Chat, streaming, tools, vision |
## Common Use Cases
**Multi-provider support** - Let users choose their preferred LLM provider:

```typescript
const providers = {
  openai: createBridge({ inbound: openaiAdapter, outbound: openaiAdapter }),
  anthropic: createBridge({ inbound: openaiAdapter, outbound: anthropicAdapter })
}

const response = await providers[userChoice].chat(request)
```

**Provider migration** - Switch providers without code changes:
```typescript
// Old: Using OpenAI
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: openaiAdapter
})

// New: Switch to Anthropic (no other code changes needed)
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter
})
```

**Cost optimization** - Route to cheaper providers for simple requests:
```typescript
const isSimple = request.messages.length < 3
const outbound = isSimple ? deepseekAdapter : anthropicAdapter

const bridge = createBridge({ inbound: openaiAdapter, outbound })
```
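Message count is a crude proxy for cost; a slightly better heuristic is to estimate request size in tokens. The sketch below is an assumption-laden example, not part of the Amux API: the `estimateTokens` helper, the 4-characters-per-token rule of thumb, and the 500-token threshold are all illustrative choices.

```typescript
interface Msg {
  role: string
  content: string
}

// Rough token estimate: ~4 characters per token for English text
// (a heuristic, not a real tokenizer)
function estimateTokens(messages: Msg[]): number {
  const chars = messages.reduce((sum, m) => sum + m.content.length, 0)
  return Math.ceil(chars / 4)
}

// Route small requests to a cheap provider, large ones to a stronger one
function pickOutbound(messages: Msg[]): 'deepseek' | 'anthropic' {
  return estimateTokens(messages) < 500 ? 'deepseek' : 'anthropic'
}

const short: Msg[] = [{ role: 'user', content: 'Summarize this sentence.' }]
console.log(pickOutbound(short)) // 'deepseek'
```

Since the inbound adapter stays the same either way, callers never see which provider handled a given request.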