# Adapters Overview

All official adapters supported by Amux.
Amux provides 7 official adapters supporting mainstream LLM providers. Each adapter handles bidirectional conversion between provider-specific formats and a unified intermediate representation (IR).
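A request therefore makes two hops: the inbound adapter parses the caller's format into the IR, and the outbound adapter builds the target provider's format from it (responses travel back the same way). The shape below is only an illustration of what such an IR might carry, not the actual types exported by `@amux.ai/llm-bridge`:

```ts
// Illustrative only: the real @amux.ai/llm-bridge IR types may differ.
interface IRMessage {
  role: 'system' | 'user' | 'assistant' | 'tool'
  content: string
}

interface IRChatRequest {
  model: string
  messages: IRMessage[]
  stream?: boolean
}

// e.g. inbound.parseRequest:  provider-format request -> IR
//      outbound.buildRequest: IR -> provider-format request
// (method names from the LLMAdapter interface shown under "Creating Custom Adapters")
```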
## Official Adapters
- **OpenAI**: GPT-4, GPT-3.5, function calling, vision
- **Anthropic**: Claude 3.5 Sonnet, Claude 3 Opus, 200K context
- **DeepSeek**: DeepSeek Chat, DeepSeek Coder, strong price-performance
- **Moonshot**: Kimi models, 200K ultra-long context
- **Zhipu**: GLM-4 series, web search support
- **Qwen**: Qwen series, vision and multimodal
- **Google Gemini**: Gemini Pro, 1M+ token context
## Feature Comparison
| Adapter | Streaming | Tools | Vision | Multimodal | Reasoning | JSON Mode |
|---|---|---|---|---|---|---|
| OpenAI | ✅ | ✅ | ✅ | ✅ | ❌ | ✅ |
| Anthropic | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
| DeepSeek | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ |
| Moonshot | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ |
| Zhipu | ✅ | ✅ | ✅ | ✅ | ❌ | ✅ |
| Qwen | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Gemini | ✅ | ✅ | ✅ | ✅ | ❌ | ✅ |
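When a code path depends on one of these features, the check can be made at runtime: each adapter declares a `capabilities` object (the shape shown in the LLMAdapter interface under "Creating Custom Adapters" below), so a feature gate might look like this sketch:

```ts
import { openaiAdapter } from '@amux.ai/adapter-openai'
import { anthropicAdapter } from '@amux.ai/adapter-anthropic'

// Prefer the first configured adapter that reports vision support.
// The `capabilities` fields mirror the LLMAdapter shape shown below.
const candidates = [anthropicAdapter, openaiAdapter]
const visionAdapter = candidates.find((adapter) => adapter.capabilities.vision)

if (!visionAdapter) {
  throw new Error('No configured adapter supports vision input')
}
```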
## Choosing an Adapter

### By Feature

**Vision & Multimodal** (request sketch after these lists)
- OpenAI (GPT-4o), Anthropic (Claude 3), Qwen (Qwen-VL), Gemini (Gemini Pro Vision), Zhipu (GLM-4V)
**Long Context**
- Gemini (1M+ tokens), Anthropic (200K), Moonshot (200K)
**Coding**
- DeepSeek (DeepSeek Coder), OpenAI (GPT-4), Anthropic (Claude 3.5 Sonnet)
**Reasoning**
- OpenAI (o3, o4-mini), DeepSeek (DeepSeek Reasoner), Moonshot (Kimi K2 Thinking)
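For the vision-capable adapters, image input travels in whatever format the inbound adapter accepts. The sketch below uses OpenAI-style `image_url` content parts (the shape from OpenAI's own chat API); whether the bridge forwards them unchanged to a vision-capable outbound adapter is an assumption to verify against the docs:

```ts
// Sketch: an OpenAI-style multimodal message (the image_url content-part
// shape follows OpenAI's chat API). Pass it to bridge.chat() on a bridge
// whose inbound adapter is openaiAdapter; see Quick Start below for setup.
const visionMessage = {
  role: 'user' as const,
  content: [
    { type: 'text', text: 'What is in this image?' },
    { type: 'image_url', image_url: { url: 'https://example.com/photo.png' } }
  ]
}
```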
### By Price

- **Most economical**: DeepSeek, Qwen
- **Mid-range**: OpenAI GPT-3.5, Moonshot
- **Premium**: OpenAI GPT-4, Anthropic Claude, Gemini Pro
### By OpenAI Compatibility

These adapters track the OpenAI API format to varying degrees; the closer the match, the less you need to change when swapping one for another (see the sketch after this list):
- DeepSeek (fully compatible)
- Moonshot (fully compatible)
- Zhipu (fully compatible)
- Qwen (mostly compatible, minor differences)
- Gemini (requires format conversion)
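Because the inbound format stays the same, switching between the compatible providers is mostly a matter of changing the outbound adapter. The sketch below assumes a DeepSeek adapter is published as `@amux.ai/adapter-deepseek` with a `deepseekAdapter` export; adjust the names to the packages you actually install:

```ts
import { createBridge } from '@amux.ai/llm-bridge'
import { openaiAdapter } from '@amux.ai/adapter-openai'
// Assumed package and export names; check the adapter's own docs.
import { deepseekAdapter } from '@amux.ai/adapter-deepseek'

// Same OpenAI-format requests in; only the outbound side changes.
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: deepseekAdapter,
  config: {
    apiKey: process.env.DEEPSEEK_API_KEY
  }
})
```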
## Quick Start

```bash
pnpm add @amux.ai/llm-bridge @amux.ai/adapter-openai @amux.ai/adapter-anthropic
```

### Basic Usage

```ts
import { createBridge } from '@amux.ai/llm-bridge'
import { openaiAdapter } from '@amux.ai/adapter-openai'
import { anthropicAdapter } from '@amux.ai/adapter-anthropic'

// Accept OpenAI format, call Claude API
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: {
    apiKey: process.env.ANTHROPIC_API_KEY
  }
})

const response = await bridge.chat({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }]
})
```
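Since the inbound adapter is `openaiAdapter`, the response presumably comes back in OpenAI's chat-completion shape, so it would be read the same way as an OpenAI SDK response; this is an assumption to confirm against the bridge docs:

```ts
// Assuming the bridge returns responses in the inbound (OpenAI) format.
console.log(response.choices[0].message.content)
```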
## Creating Custom Adapters

Need to support another LLM provider? Create a custom adapter:

```ts
import type { LLMAdapter } from '@amux.ai/llm-bridge'

export const myAdapter: LLMAdapter = {
  name: 'my-provider',
  version: '1.0.0',

  // Declare what the provider supports so the bridge can route features.
  capabilities: {
    streaming: true,
    tools: true,
    vision: false,
    multimodal: false,
    systemPrompt: true,
    toolChoice: true
  },

  // Inbound: parse this provider's wire format (requests, responses, streams, errors).
  inbound: {
    parseRequest: (request) => { /* ... */ },
    parseResponse: (response) => { /* ... */ },
    parseStream: (stream) => { /* ... */ },
    parseError: (error) => { /* ... */ }
  },

  // Outbound: build this provider's wire format from the IR.
  outbound: {
    buildRequest: (ir) => { /* ... */ },
    buildResponse: (ir) => { /* ... */ }
  },

  getInfo() {
    return {
      name: this.name,
      version: this.version,
      capabilities: this.capabilities,
      endpoint: {
        baseURL: 'https://api.my-provider.com',
        chatPath: '/v1/chat'
      }
    }
  }
}
```

See the Custom Adapter Guide for a complete tutorial.
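Once defined, the custom adapter should plug into `createBridge` like the official ones; a minimal sketch (the environment variable name is hypothetical):

```ts
import { createBridge } from '@amux.ai/llm-bridge'
import { openaiAdapter } from '@amux.ai/adapter-openai'
import { myAdapter } from './my-adapter'

// Accept OpenAI-format requests and forward them to the custom provider.
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: myAdapter,
  config: {
    apiKey: process.env.MY_PROVIDER_API_KEY // hypothetical env var
  }
})
```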