# API Reference

Fully compatible with the OpenAI API format; migrate by replacing only the baseURL.
## Overview

- **Base URL**: `https://freetokenrouter.cn/api/v1`
- **Protocol**: HTTPS · REST · JSON · SSE streaming
- **Compatibility**: Fully compatible with OpenAI API v1
- **Billing**: Billed per token consumed, deducted from your balance
## Authentication

Every request must carry an API key in the HTTP header. Create a key in the Console; keys have the form `ftr_xxxxxxxxxx`.

```http
Authorization: Bearer ftr_your_api_key
```

⚠️ An API key is shown in full only once, at creation time. Store it somewhere safe.
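Before sending a request, the key format above can be checked locally, and the key masked for safe logging. A minimal sketch; the helper names `mask_api_key` and `auth_headers` are illustrative, not part of the API:

```python
def mask_api_key(key: str) -> str:
    """Return a log-safe form of a key, e.g. 'ftr_****cdef'."""
    if not key.startswith("ftr_"):
        raise ValueError("API keys are expected to start with 'ftr_'")
    return key[:4] + "****" + key[-4:]

def auth_headers(key: str) -> dict:
    """Build the Authorization header shown above."""
    return {"Authorization": f"Bearer {key}"}

print(mask_api_key("ftr_0123456789abcdef"))  # ftr_****cdef
```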
## Chat Completions

`POST /api/v1/chat/completions`

Creates a chat completion. Supports multi-turn conversations and system prompts.
### Request body

| Parameter | Type | Required | Description |
|---|---|---|---|
| `model` | string | required | Model ID, e.g. `deepseek-chat` |
| `messages` | array | required | List of messages; each item has a `role` and `content` |
| `stream` | boolean | optional | Enable streaming responses; defaults to `false` |
| `temperature` | number | optional | Sampling randomness, 0 to 2; defaults to 1 |
| `max_tokens` | integer | optional | Maximum number of output tokens |

```bash
curl https://freetokenrouter.cn/api/v1/chat/completions \
  -H "Authorization: Bearer ftr_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [
      {"role": "system", "content": "你是一个有帮助的 AI 助手"},
      {"role": "user", "content": "你好!"}
    ]
  }'
```

### Response example
```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1711900000,
  "model": "deepseek-chat",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "你好!有什么我可以帮助你的吗?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 18,
    "completion_tokens": 12,
    "total_tokens": 30
  }
}
```

### Streaming
Set `stream: true` to enable SSE streaming output; the response format is identical to OpenAI's.
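Without an SDK, the `data:` lines of the stream (see the streaming response format below) can be assembled by hand. A sketch using only the standard library; `parse_sse_chunks` is an illustrative name:

```python
import json

def parse_sse_chunks(lines):
    """Collect assistant text from the 'data:' lines of a chat.completion.chunk stream."""
    text = []
    for line in lines:
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # stream terminator
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:  # first chunk may carry only the role
            text.append(delta["content"])
    return "".join(text)

# Sample lines taken from the streaming response format shown below
sample = [
    'data: {"id":"chatcmpl-abc","object":"chat.completion.chunk","choices":[{"delta":{"role":"assistant"},"index":0}]}',
    'data: {"id":"chatcmpl-abc","object":"chat.completion.chunk","choices":[{"delta":{"content":"床前"},"index":0}]}',
    'data: {"id":"chatcmpl-abc","object":"chat.completion.chunk","choices":[{"delta":{"content":"明月光"},"index":0}]}',
    "data: [DONE]",
]
print(parse_sse_chunks(sample))  # 床前明月光
```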
```bash
curl https://freetokenrouter.cn/api/v1/chat/completions \
  -H "Authorization: Bearer ftr_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen-turbo",
    "messages": [{"role": "user", "content": "写一首诗"}],
    "stream": true
  }'
```

### Streaming response format
```text
data: {"id":"chatcmpl-abc","object":"chat.completion.chunk","choices":[{"delta":{"role":"assistant"},"index":0}]}

data: {"id":"chatcmpl-abc","object":"chat.completion.chunk","choices":[{"delta":{"content":"床前"},"index":0}]}

data: {"id":"chatcmpl-abc","object":"chat.completion.chunk","choices":[{"delta":{"content":"明月光"},"index":0}]}

data: [DONE]
```

## Models
`GET /api/v1/models`

Lists all available models.

```bash
curl https://freetokenrouter.cn/api/v1/models \
  -H "Authorization: Bearer ftr_your_api_key"
```

| Model ID | Provider | Context | Free |
|---|---|---|---|
| `deepseek-chat` | DeepSeek | 128K | — |
| `deepseek-reasoner` | DeepSeek | 128K | — |
| `qwen-turbo` | Tongyi Qianwen (Qwen) | 131K | ✓ Free |
| `qwen-plus` | Tongyi Qianwen (Qwen) | 131K | — |
| `qwen-max` | Tongyi Qianwen (Qwen) | 33K | — |
| `glm-4-flash` | Zhipu AI | 128K | ✓ Free |
| `glm-4-plus` | Zhipu AI | 128K | — |
| `moonshot-v1-128k` | Moonshot AI | 128K | — |
| `hunyuan-lite` | Tencent Hunyuan | 256K | ✓ Free |
| `hunyuan-pro` | Tencent Hunyuan | 32K | — |

Full model details and pricing → models page
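Since the API is OpenAI-compatible, the `/models` response presumably follows the OpenAI list shape (`{"object": "list", "data": [...]}`). That shape, and the sample entries below, are assumptions for illustration, not a live response:

```python
# Assumed OpenAI-style list response for GET /api/v1/models.
sample_response = {
    "object": "list",
    "data": [
        {"id": "deepseek-chat", "object": "model", "owned_by": "DeepSeek"},
        {"id": "qwen-turbo", "object": "model", "owned_by": "Qwen"},
    ],
}

# Extract just the model IDs, e.g. to check availability before a request.
model_ids = [m["id"] for m in sample_response["data"]]
print(model_ids)  # ['deepseek-chat', 'qwen-turbo']
```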
## Error Codes

| Code | Error | Description |
|---|---|---|
| 401 | Unauthorized | Invalid or missing API key |
| 402 | Payment Required | Insufficient token balance |
| 404 | Not Found | Model does not exist or is not available |
| 429 | Too Many Requests | Rate limit exceeded |
| 500 | Internal Server Error | Internal server error; retry later |
| 503 | Service Unavailable | Upstream model service temporarily unavailable |

### Error response format
```json
{
  "error": {
    "message": "Token 余额不足,请充值或签到获取更多 Token",
    "type": "insufficient_balance",
    "code": 402
  }
}
```

## SDK Examples
### Python (openai SDK)

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://freetokenrouter.cn/api/v1",
    api_key="ftr_your_api_key",
)

# Plain request
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "你好"}],
)
print(response.choices[0].message.content)

# Streaming request
stream = client.chat.completions.create(
    model="qwen-turbo",
    messages=[{"role": "user", "content": "写一首诗"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```

### JavaScript / TypeScript (openai SDK)
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://freetokenrouter.cn/api/v1",
  apiKey: "ftr_your_api_key",
});

// Plain request
const response = await client.chat.completions.create({
  model: "glm-4-flash",
  messages: [{ role: "user", content: "你好" }],
});
console.log(response.choices[0].message.content);

// Streaming request
const stream = await client.chat.completions.create({
  model: "deepseek-chat",
  messages: [{ role: "user", content: "写一首诗" }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

## Migrating an Existing OpenAI Project
Only two lines need to change; nothing else in your code is affected:

```diff
- base_url="https://api.openai.com/v1"
+ base_url="https://freetokenrouter.cn/api/v1"
- api_key="sk-..."
+ api_key="ftr_your_api_key"
```
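Under the hood, both SDK examples above reduce to a single HTTPS POST. A standard-library-only sketch that builds (but does not send) the equivalent raw request:

```python
import json
import urllib.request

body = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "你好"}],
}

# Construct the request exactly as shown in the curl examples above.
req = urllib.request.Request(
    "https://freetokenrouter.cn/api/v1/chat/completions",
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Authorization": "Bearer ftr_your_api_key",
        "Content-Type": "application/json",
    },
    method="POST",
)

# To actually send it: urllib.request.urlopen(req)
print(req.get_method(), req.full_url)
```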