Amux
User Guide

Using in Your Project

Integrate the Amux API Platform in Python and Node.js projects

The Amux API Platform exposes an OpenAI-compatible API, so you can connect from any programming language using the standard OpenAI SDK.

Prerequisites

  • An Amux API Platform account and an API token (see Quick Start)

Python

Install the SDK

pip install openai

Basic chat completion

from openai import OpenAI

client = OpenAI(
    base_url="https://api.amux.ai/v1",
    api_key="your-amux-api-token",
)

response = client.chat.completions.create(
    model="gpt-5.4",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

print(response.choices[0].message.content)

Streaming responses

from openai import OpenAI

client = OpenAI(
    base_url="https://api.amux.ai/v1",
    api_key="your-amux-api-token",
)

stream = client.chat.completions.create(
    model="gpt-5.4",
    messages=[{"role": "user", "content": "Write a short poem"}],
    stream=True,
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

Async usage

import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI(
    base_url="https://api.amux.ai/v1",
    api_key="your-amux-api-token",
)

async def main():
    response = await client.chat.completions.create(
        model="gpt-5.4",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())

Using environment variables

export OPENAI_BASE_URL="https://api.amux.ai/v1"
export OPENAI_API_KEY="your-amux-api-token"

from openai import OpenAI

# Reads OPENAI_BASE_URL and OPENAI_API_KEY automatically
client = OpenAI()

response = client.chat.completions.create(
    model="gpt-5.4",
    messages=[{"role": "user", "content": "Hello!"}],
)

Node.js

Install the SDK

npm install openai

Basic chat completion

import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.amux.ai/v1',
  apiKey: 'your-amux-api-token',
});

async function main() {
  const response = await client.chat.completions.create({
    model: 'gpt-5.4',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'Hello!' },
    ],
  });

  console.log(response.choices[0].message.content);
}

main();

Streaming responses

import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.amux.ai/v1',
  apiKey: 'your-amux-api-token',
});

async function main() {
  const stream = await client.chat.completions.create({
    model: 'gpt-5.4',
    messages: [{ role: 'user', content: 'Write a short poem' }],
    stream: true,
  });

  for await (const chunk of stream) {
    const content = chunk.choices[0]?.delta?.content;
    if (content) process.stdout.write(content);
  }
}

main();

Using environment variables

export OPENAI_BASE_URL="https://api.amux.ai/v1"
export OPENAI_API_KEY="your-amux-api-token"

import OpenAI from 'openai';

// Reads OPENAI_BASE_URL and OPENAI_API_KEY automatically
const client = new OpenAI();

const response = await client.chat.completions.create({
  model: 'gpt-5.4',
  messages: [{ role: 'user', content: 'Hello!' }],
});

CommonJS (require)

const OpenAI = require('openai');

const client = new OpenAI({
  baseURL: 'https://api.amux.ai/v1',
  apiKey: 'your-amux-api-token',
});

async function main() {
  const response = await client.chat.completions.create({
    model: 'gpt-5.4',
    messages: [{ role: 'user', content: 'Hello!' }],
  });

  console.log(response.choices[0].message.content);
}

main();

Using fetch directly (no SDK)

If you'd rather not pull in the SDK, you can call the API directly:

const response = await fetch('https://api.amux.ai/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer your-amux-api-token',
  },
  body: JSON.stringify({
    model: 'gpt-5.4',
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
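
The same no-SDK approach works from Python. A stdlib-only sketch using urllib — the request is built but not sent, so you can inspect it without a live token; uncomment the last lines to actually call the API:

```python
import json
import urllib.request

# Build the same POST request as the fetch example, stdlib only
req = urllib.request.Request(
    "https://api.amux.ai/v1/chat/completions",
    data=json.dumps({
        "model": "gpt-5.4",
        "messages": [{"role": "user", "content": "Hello!"}],
    }).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer your-amux-api-token",
    },
    method="POST",
)

# Uncomment to send the request:
# with urllib.request.urlopen(req) as resp:
#     data = json.load(resp)
#     print(data["choices"][0]["message"]["content"])
```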

Available models

Provider   Models
OpenAI     GPT-5.4, GPT-4o, GPT-4, o3-mini, etc.
xAI        Grok series (text, image, video)
MiniMax    M2.5 and the M series

More providers will be added over time; check the platform for the latest model list.

Notes

  • The Amux API Platform is fully OpenAI-compatible; any OpenAI SDK or library will work
  • Store your API token in an environment variable; never hard-code it in source
  • All models support streaming responses
  • Rate limits and quotas depend on your account's plan
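
On the last point: when a request fails against the rate limit, the usual remedy is to retry with exponential backoff. A minimal, SDK-agnostic sketch (with_backoff and its parameters are names introduced here for illustration):

```python
import random
import time

def with_backoff(call, retry_on=Exception, max_retries=5, base_delay=1.0):
    """Run call(), retrying on failure with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except retry_on:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error to the caller
            # Wait base_delay, 2x, 4x, ... plus a little random jitter
            time.sleep(base_delay * 2 ** attempt + random.random() * base_delay)
```

With the OpenAI SDK this might be used as with_backoff(lambda: client.chat.completions.create(...), retry_on=openai.RateLimitError); note the SDK can also retry some failures itself via its max_retries client option.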
