Introduction to Chat
Welcome to the Chat service documentation!
This service provides a multi-protocol AI API gateway, supporting OpenAI Compatible API, OpenAI Responses API, Anthropic (Claude) Compatible API, and Google Gemini Native API. You can use official SDKs or any compatible client (such as Cherry Studio, Claude Code, Cursor, etc.) to access our powerful language models.
🌟 Key Features
Multi-Protocol Support
Our Chat API supports multiple API formats to meet different needs:
| Protocol | Base URL | Description |
|---|---|---|
| OpenAI Compatible | https://api.mountsea.ai/chat | Compatible with the OpenAI Chat Completions API |
| OpenAI Responses | https://api.mountsea.ai/chat | Compatible with the OpenAI Responses API format |
| Anthropic (Claude) | https://api.mountsea.ai/chat/claude | Compatible with the Anthropic Messages API; supports Claude Code |
| Gemini Native | https://api.mountsea.ai/chat/gemini | Compatible with the Google Gemini Native API |
Note the Base URL difference: Claude Code uses https://api.mountsea.ai/chat/claude, while all other protocols use https://api.mountsea.ai/chat.
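The protocol-to-URL split above can be encoded in a small lookup. The values are copied from the table; the helper function itself is just illustrative and not part of any SDK:

```python
# Map each protocol to its gateway base URL (values from the table above).
BASE_URLS = {
    "openai": "https://api.mountsea.ai/chat",
    "responses": "https://api.mountsea.ai/chat",
    "anthropic": "https://api.mountsea.ai/chat/claude",
    "gemini": "https://api.mountsea.ai/chat/gemini",
}

def base_url(protocol: str) -> str:
    """Return the base URL for a protocol, or raise for unknown ones."""
    try:
        return BASE_URLS[protocol]
    except KeyError:
        raise ValueError(f"unknown protocol: {protocol}") from None
```

Only the Anthropic entry points at the /claude path; everything else shares the plain /chat base.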
OpenAI Compatible
- ✅ Use the official OpenAI Python/Node.js SDK directly
- ✅ No code changes needed: just update base_url and api_key
- ✅ Full support for streaming responses
- ✅ Support for Function Calling / Tools
- ✅ Support for JSON Mode and structured outputs
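Function Calling rides on the same endpoint as plain chat. Below is a sketch of a request body with a single tool; the tool name get_weather and its JSON Schema are made up for illustration, and nothing is sent here:

```python
import json

# Chat-completions body with one tool attached; the model may answer with a
# tool_call instead of text when the tool fits the question.
body = {
    "model": "gpt-5.1",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for this sketch
                "description": "Get current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}
payload = json.dumps(body)  # POST this to the chat completions endpoint
```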
OpenAI Responses API
- ✅ Support for the new OpenAI Responses API format (/v1/responses)
- ✅ Compatible with Cherry Studio and other clients using Responses API
- ✅ Simplified input format with input and instructions
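As a sketch of that simplified shape, here is a Responses-style request body, assuming the gateway mirrors OpenAI's input and instructions field names; nothing is sent here:

```python
import json

# Responses API shape: 'instructions' plays the system-prompt role and
# 'input' replaces the messages array.
body = {
    "model": "gpt-5.1",
    "instructions": "You are a helpful assistant.",
    "input": "Hello!",
}
# POST this JSON to https://api.mountsea.ai/chat/v1/responses with the usual
# Authorization: Bearer header (sketch only; no request is made).
payload = json.dumps(body)
```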
Anthropic (Claude Code) Compatible
- ✅ Use Claude Code CLI directly with our service
- ✅ Compatible with Anthropic Python/TypeScript SDK
- ✅ Full support for streaming, tools, and system prompts
- ✅ Access Claude models (claude-4.5, claude-opus-4-6, claude-sonnet-4-6, etc.)
Supported Models
We provide access to various high-quality language models:
| Model | Type | Description |
|---|---|---|
| gpt-5.1 | OpenAI | Latest GPT-5.1 model |
| gpt-5.1-all | OpenAI | GPT-5.1 with all capabilities |
| gpt-5.1-thinking | OpenAI | GPT-5.1 with enhanced reasoning |
| gpt-5.1-thinking-all | OpenAI | GPT-5.1 thinking with all capabilities |
| gpt-5.1-2025-11-13 | OpenAI | GPT-5.1 snapshot (2025-11-13) |
| gpt-5.2 | OpenAI | Latest GPT-5.2 model |
| gemini-3-pro | Google | Google Gemini 3 Pro |
| gemini-2.5-pro | Google | Google Gemini 2.5 Pro |
| gemini-2.5-flash | Google | Google Gemini 2.5 Flash (faster) |
| gemini-3-flash | Google | Google Gemini 3 Flash (faster) |
| gemini-3.1-pro | Google | Google Gemini 3.1 Pro |
| claude-4.5 | Anthropic | Claude 4.5 model |
| claude-opus-4-6 | Anthropic | Claude Opus 4.6 model |
| claude-sonnet-4-6 | Anthropic | Claude Sonnet 4.6 model |
| claude-haiku-4-5-20251001 | Anthropic | Claude Haiku 4.5 (lightweight and fast) |
🚀 Quick Start
Using OpenAI Python SDK
```python
from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

response = client.chat.completions.create(
    model="gpt-5.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)
```
Using OpenAI Node.js SDK
```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'your-api-key',
  baseURL: 'https://api.mountsea.ai/chat'
});

const response = await client.chat.completions.create({
  model: 'gpt-5.1',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ]
});

console.log(response.choices[0].message.content);
```
Streaming Responses
```python
from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

stream = client.chat.completions.create(
    model="gpt-5.1",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
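If you need the full text after streaming, collect the deltas as they arrive. This is a sketch against the chunk shape used above; SimpleNamespace stands in for the SDK's real chunk objects so the example runs offline:

```python
from types import SimpleNamespace

def collect_stream(chunks):
    """Concatenate the delta.content pieces of a chat-completions stream."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta
        if delta.content:  # final chunks may carry content=None
            parts.append(delta.content)
    return "".join(parts)

# Fake chunks mimicking the SDK's shape, for illustration only.
fake = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ("Once ", "upon ", "a time.", None)
]
print(collect_stream(fake))  # -> Once upon a time.
```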
Using cURL
```shell
curl https://api.mountsea.ai/chat/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key" \
  -d '{
    "model": "gpt-5.1",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ]
  }'
```
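The same request can be built with Python's standard library alone. This sketch constructs the request but does not send it; swap in a real key and uncomment the urlopen call to execute:

```python
import json
import urllib.request

# Build the POST request for the chat completions endpoint shown above.
req = urllib.request.Request(
    "https://api.mountsea.ai/chat/chat/completions",
    data=json.dumps({
        "model": "gpt-5.1",
        "messages": [{"role": "user", "content": "Hello!"}],
    }).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer your-api-key",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; omitted in this sketch.
print(req.full_url)
```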
🤖 Claude Code Integration Guide
Claude Code is Anthropic's official AI coding CLI. By pointing it at our API, you can use Claude Code for AI-assisted programming directly.
Claude Code uses the Anthropic Messages API format with Base URL https://api.mountsea.ai/chat/claude. This differs from the other protocols, so take care to use the right one.
Configuration
Set the environment variables and you're ready to go:
macOS / Linux
```shell
# Set the API key and base URL
export ANTHROPIC_API_KEY="your-api-key"
export ANTHROPIC_BASE_URL="https://api.mountsea.ai/chat/claude"
# Start Claude Code
claude
```
Persistent configuration (add to ~/.bashrc or ~/.zshrc):
```shell
echo 'export ANTHROPIC_API_KEY="your-api-key"' >> ~/.zshrc
echo 'export ANTHROPIC_BASE_URL="https://api.mountsea.ai/chat/claude"' >> ~/.zshrc
source ~/.zshrc
```
Windows (PowerShell)
```powershell
# Set the environment variables
$env:ANTHROPIC_API_KEY = "your-api-key"
$env:ANTHROPIC_BASE_URL = "https://api.mountsea.ai/chat/claude"
# Start Claude Code
claude
```
Persistent configuration:
```powershell
[System.Environment]::SetEnvironmentVariable("ANTHROPIC_API_KEY", "your-api-key", "User")
[System.Environment]::SetEnvironmentVariable("ANTHROPIC_BASE_URL", "https://api.mountsea.ai/chat/claude", "User")
```
Windows (CMD)
```bat
:: Set the environment variables
set ANTHROPIC_API_KEY=your-api-key
set ANTHROPIC_BASE_URL=https://api.mountsea.ai/chat/claude
:: Start Claude Code
claude
```
Calling with the Anthropic SDK
```python
import anthropic

client = anthropic.Anthropic(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat/claude"
)

message = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=4096,
    system="You are a helpful coding assistant.",
    messages=[
        {"role": "user", "content": "Write a Python function to sort a list"}
    ]
)

print(message.content[0].text)
```
Streaming:
```python
with client.messages.stream(
    model="claude-sonnet-4-6",
    max_tokens=4096,
    messages=[{"role": "user", "content": "Explain async/await in Python"}]
) as stream:
    for text in stream.text_stream:
        print(text, end="")
```
Install:
```shell
npm install @anthropic-ai/sdk
```
```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
  apiKey: 'your-api-key',
  baseURL: 'https://api.mountsea.ai/chat/claude'
});

const message = await client.messages.create({
  model: 'claude-sonnet-4-6',
  max_tokens: 4096,
  system: 'You are a helpful coding assistant.',
  messages: [
    { role: 'user', content: 'Write a TypeScript function to sort an array' }
  ]
});

console.log(message.content[0].text);
```
Streaming:
```typescript
const stream = await client.messages.stream({
  model: 'claude-sonnet-4-6',
  max_tokens: 4096,
  messages: [{ role: 'user', content: 'Explain async/await in JavaScript' }]
});

for await (const event of stream) {
  if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
    process.stdout.write(event.delta.text);
  }
}
```
```shell
curl https://api.mountsea.ai/chat/claude/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: your-api-key" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-6",
    "max_tokens": 4096,
    "system": "You are a helpful assistant.",
    "messages": [
      {"role": "user", "content": "Hello, Claude!"}
    ]
  }'
```
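The Messages API returns content as a list of typed blocks rather than a single string. A small helper can pull the text out; the sample response below is ours, trimmed to the relevant fields:

```python
def message_text(message: dict) -> str:
    """Join the text of all text-type content blocks in a Messages API response."""
    return "".join(
        block["text"] for block in message["content"] if block["type"] == "text"
    )

# Abbreviated response shape, for illustration only.
sample = {
    "content": [{"type": "text", "text": "Hello! How can I help you today?"}],
    "stop_reason": "end_turn",
}
print(message_text(sample))  # -> Hello! How can I help you today?
```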
📱 Cherry Studio Integration Guide
Cherry Studio is a powerful AI desktop client that supports multiple API formats. By configuring our API, you can use all of our models inside Cherry Studio.
Configure OpenAI Compatible Mode (Recommended)
Open Settings
In Cherry Studio, go to Settings → Model Providers → click Add
Configure the API
Fill in the following:
| Setting | Value |
|---|---|
| Provider Name | MountSea AI |
| API URL | https://api.mountsea.ai/chat |
| API Key | Your API key |
Add Models
Click Add Model and enter model names manually, e.g. gpt-5.1, claude-sonnet-4-6, gemini-3-flash, etc.
Configure OpenAI Responses Mode
If Cherry Studio uses the OpenAI Responses API format, configuration is the same:
| Setting | Value |
|---|---|
| API URL | https://api.mountsea.ai/chat |
| API Key | Your API key |
| API Format | OpenAI Responses |
📋 Available Endpoints
OpenAI Compatible API
| Endpoint | Method | Description |
|---|---|---|
| /chat/chat/completions | POST | Create a chat completion |
| /chat/v1/chat/completions | POST | Create a chat completion (v1) |
| /chat/models | GET | List available models |
| /chat/models/{model} | GET | Get model details |
OpenAI Responses API
| Endpoint | Method | Description |
|---|---|---|
| /chat/v1/responses | POST | Create a response (Responses API format) |
Anthropic Compatible API (Claude Code)
Base URL: https://api.mountsea.ai/chat/claude (note: this differs from the other APIs)
| Endpoint | Method | Description |
|---|---|---|
| /chat/claude/v1/messages | POST | Create a message (Anthropic format) |
Gemini Native API
| Endpoint | Method | Description |
|---|---|---|
| /chat/gemini/{apiVersion}/models/{model}:generateContent | POST | Generate content (non-streaming) |
| /chat/gemini/{apiVersion}/models/{model}:streamGenerateContent | POST | Generate content (streaming) |
The Gemini Native API allows you to use the official @google/genai SDK directly. See Gemini API for details.
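The Gemini paths above template two placeholders, {apiVersion} and {model}. Filling them in can be sketched with a small helper; the function is illustrative and the "v1beta" default is our assumption, not something the gateway mandates:

```python
GEMINI_BASE = "https://api.mountsea.ai/chat/gemini"

def gemini_url(model: str, stream: bool = False, api_version: str = "v1beta") -> str:
    """Build a Gemini Native endpoint URL from the path templates above."""
    method = "streamGenerateContent" if stream else "generateContent"
    return f"{GEMINI_BASE}/{api_version}/models/{model}:{method}"

print(gemini_url("gemini-3-flash"))
# -> https://api.mountsea.ai/chat/gemini/v1beta/models/gemini-3-flash:generateContent
```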
Explore the API Documentation
OpenAI Compatible:
OpenAI Responses API:
Anthropic Compatible (Claude Code):
Gemini Native:
For more details, visit the full MountSea API Documentation.