Introduction to Chat

Welcome to the Chat service documentation! This service provides a multi-protocol AI API gateway, supporting OpenAI Compatible API, OpenAI Responses API, Anthropic (Claude) Compatible API, and Google Gemini Native API. You can use official SDKs or any compatible client (such as Cherry Studio, Claude Code, Cursor, etc.) to access our powerful language models.

🌟 Key Features

Multi-Protocol Support

Our Chat API supports multiple API formats to meet different needs:
| Protocol | Base URL | Description |
| --- | --- | --- |
| OpenAI Compatible | https://api.mountsea.ai/chat | Compatible with the OpenAI Chat Completions API |
| OpenAI Responses | https://api.mountsea.ai/chat | Compatible with the OpenAI Responses API format |
| Anthropic (Claude) | https://api.mountsea.ai/chat/claude | Compatible with the Anthropic Messages API; supports Claude Code |
| Gemini Native | https://api.mountsea.ai/chat/gemini | Compatible with the Google Gemini Native API |

Note the base URL difference: Claude Code uses https://api.mountsea.ai/chat/claude, while all other protocols use https://api.mountsea.ai/chat.

OpenAI Compatible

  • ✅ Use the official OpenAI Python/Node.js SDK directly
  • ✅ No code changes needed - just update base_url and api_key
  • ✅ Full support for streaming responses
  • ✅ Support for Function Calling / Tools
  • ✅ Support for JSON Mode and structured outputs
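JSON Mode in particular is requested through the standard `response_format` parameter. Below is a minimal, stdlib-only sketch of such a request body (field names follow the OpenAI Chat Completions spec; send it via the SDK or raw HTTP as shown in the Quick Start below):

```python
import json

def json_mode_payload(prompt: str, model: str = "gpt-5.1") -> dict:
    """Build a Chat Completions request body with JSON Mode enabled.

    `response_format={"type": "json_object"}` is the standard OpenAI
    parameter; pass it to `client.chat.completions.create(...)` the
    same way as `model` and `messages`.
    """
    return {
        "model": model,
        "messages": [
            # JSON Mode requires the word "JSON" to appear in the prompt.
            {"role": "system", "content": "Reply only with a JSON object."},
            {"role": "user", "content": prompt},
        ],
        "response_format": {"type": "json_object"},
    }

print(json.dumps(json_mode_payload("List three colors."), indent=2))
```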

OpenAI Responses API

  • ✅ Supports the new OpenAI Responses API format (/v1/responses)
  • ✅ Compatible with Cherry Studio and other clients that use the Responses API
  • ✅ Simplified input format with input and instructions
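Concretely, a Responses API request replaces the `messages` array with a single `input` string and an optional `instructions` field. A minimal, stdlib-only sketch of the request body (field names follow the OpenAI Responses API; POST it to the /v1/responses endpoint listed under Available Endpoints):

```python
import json

def responses_payload(prompt, instructions=None, model="gpt-5.1"):
    """Build a minimal /v1/responses request body."""
    body = {"model": model, "input": prompt}
    if instructions is not None:
        body["instructions"] = instructions
    return body

payload = responses_payload("Hello!", instructions="You are a helpful assistant.")
print(json.dumps(payload, indent=2))
```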

Anthropic (Claude Code) Compatible

  • ✅ Use Claude Code CLI directly with our service
  • ✅ Compatible with Anthropic Python/TypeScript SDK
  • ✅ Full support for streaming, tools, and system prompts
  • ✅ Access Claude models (claude-4.5, claude-opus-4-6, claude-sonnet-4-6, etc.)

Supported Models

We provide access to various high-quality language models:
| Model | Type | Description |
| --- | --- | --- |
| gpt-5.1 | OpenAI | GPT-5.1 model |
| gpt-5.1-all | OpenAI | GPT-5.1 with all capabilities |
| gpt-5.1-thinking | OpenAI | GPT-5.1 with enhanced reasoning |
| gpt-5.1-thinking-all | OpenAI | GPT-5.1 thinking with all capabilities |
| gpt-5.1-2025-11-13 | OpenAI | GPT-5.1 snapshot (2025-11-13) |
| gpt-5.2 | OpenAI | Latest GPT-5.2 model |
| gemini-3-pro | Google | Google Gemini 3 Pro |
| gemini-2.5-pro | Google | Google Gemini 2.5 Pro |
| gemini-2.5-flash | Google | Google Gemini 2.5 Flash (faster) |
| gemini-3-flash | Google | Google Gemini 3 Flash (faster) |
| gemini-3.1-pro | Google | Google Gemini 3.1 Pro |
| claude-4.5 | Anthropic | Claude 4.5 model |
| claude-opus-4-6 | Anthropic | Claude Opus 4.6 model |
| claude-sonnet-4-6 | Anthropic | Claude Sonnet 4.6 model |
| claude-haiku-4-5-20251001 | Anthropic | Claude Haiku 4.5 (lightweight and fast) |

🚀 Quick Start

Using OpenAI Python SDK

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

response = client.chat.completions.create(
    model="gpt-5.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)

Using OpenAI Node.js SDK

import OpenAI from 'openai';

const client = new OpenAI({
    apiKey: 'your-api-key',
    baseURL: 'https://api.mountsea.ai/chat'
});

const response = await client.chat.completions.create({
    model: 'gpt-5.1',
    messages: [
        { role: 'system', content: 'You are a helpful assistant.' },
        { role: 'user', content: 'Hello!' }
    ]
});

console.log(response.choices[0].message.content);

Streaming Responses

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

stream = client.chat.completions.create(
    model="gpt-5.1",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

Using cURL

curl https://api.mountsea.ai/chat/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key" \
  -d '{
    "model": "gpt-5.1",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ]
  }'

🤖 Claude Code Integration Guide

Claude Code is Anthropic's official AI coding CLI. Point it at our API and you can use Claude Code for AI-assisted programming directly.
Claude Code uses the Anthropic Messages API format with the base URL https://api.mountsea.ai/chat/claude, which differs from the other protocols; be sure to use the right one.

Configuration

Set two environment variables and you are ready to go:
# Set the API key and base URL
export ANTHROPIC_API_KEY="your-api-key"
export ANTHROPIC_BASE_URL="https://api.mountsea.ai/chat/claude"

# Launch Claude Code
claude
To persist the configuration, append it to ~/.bashrc or ~/.zshrc:
echo 'export ANTHROPIC_API_KEY="your-api-key"' >> ~/.zshrc
echo 'export ANTHROPIC_BASE_URL="https://api.mountsea.ai/chat/claude"' >> ~/.zshrc
source ~/.zshrc

Calling the API with the Anthropic SDK

import anthropic

client = anthropic.Anthropic(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat/claude"
)

message = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=4096,
    system="You are a helpful coding assistant.",
    messages=[
        {"role": "user", "content": "Write a Python function to sort a list"}
    ]
)

print(message.content[0].text)
Streaming:
with client.messages.stream(
    model="claude-sonnet-4-6",
    max_tokens=4096,
    messages=[{"role": "user", "content": "Explain async/await in Python"}]
) as stream:
    for text in stream.text_stream:
        print(text, end="")
Install:
pip install anthropic

📱 Cherry Studio Integration Guide

Cherry Studio is a powerful AI desktop client that supports multiple API formats. Configure our API once and you can use all of our models inside Cherry Studio.

Configure OpenAI-Compatible Mode (Recommended)

1. Open the settings

In Cherry Studio, go to Settings → Model Providers and click Add.

2. Configure the API

Fill in the following:

| Setting | Value |
| --- | --- |
| Provider name | MountSea AI |
| API URL | https://api.mountsea.ai/chat |
| API Key | your API key |

3. Add models

Click Add Model and enter the model names manually, for example: gpt-5.1, claude-sonnet-4-6, gemini-3-flash

4. Start chatting

Select the model in the chat view and start using it.

Configure OpenAI Responses Mode

If Cherry Studio is set to the OpenAI Responses API format, configuration works the same way:

| Setting | Value |
| --- | --- |
| API URL | https://api.mountsea.ai/chat |
| API Key | your API key |
| API format | OpenAI Responses |

📋 Available Endpoints

OpenAI Compatible API

| Endpoint | Method | Description |
| --- | --- | --- |
| /chat/chat/completions | POST | Create a chat completion |
| /chat/v1/chat/completions | POST | Create a chat completion (v1) |
| /chat/models | GET | List available models |
| /chat/models/{model} | GET | Get model details |
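Listing models, for example, is a plain authenticated GET. A stdlib-only sketch that builds the request without sending it (the API key is a placeholder):

```python
import urllib.request

BASE_URL = "https://api.mountsea.ai/chat"

def models_request(api_key: str) -> urllib.request.Request:
    """Build (but do not send) a GET request for the models endpoint."""
    return urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
        method="GET",
    )

req = models_request("your-api-key")
print(req.full_url)  # https://api.mountsea.ai/chat/models
# To actually send it (requires a valid key):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```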

OpenAI Responses API

| Endpoint | Method | Description |
| --- | --- | --- |
| /chat/v1/responses | POST | Create a response (Responses API format) |

Anthropic Compatible API (Claude Code)

Base URL: https://api.mountsea.ai/chat/claude (note that this differs from the other APIs)

| Endpoint | Method | Description |
| --- | --- | --- |
| /chat/claude/v1/messages | POST | Create a message (Anthropic format) |

Gemini Native API

| Endpoint | Method | Description |
| --- | --- | --- |
| /chat/gemini/{apiVersion}/models/{model}:generateContent | POST | Generate content (non-streaming) |
| /chat/gemini/{apiVersion}/models/{model}:streamGenerateContent | POST | Generate content (streaming) |
The Gemini Native API allows you to use the official @google/genai SDK directly. See Gemini API for details.
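A Gemini native request body uses `contents`/`parts` rather than OpenAI-style `messages`. A stdlib-only sketch of the payload and target URL (the `v1beta` API version shown here is an assumption; substitute whatever `{apiVersion}` your client uses):

```python
import json

def gemini_payload(prompt: str) -> dict:
    """Build a minimal generateContent request body (Gemini native format)."""
    return {
        "contents": [
            {"role": "user", "parts": [{"text": prompt}]}
        ]
    }

# Example target URL (non-streaming), following the endpoint table above.
# "v1beta" is an assumed {apiVersion}.
url = ("https://api.mountsea.ai/chat/gemini/v1beta/models/"
       "gemini-2.5-flash:generateContent")
print(url)
print(json.dumps(gemini_payload("Hello!"), indent=2))
```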

Explore the API Documentation

For more details, visit the full MountSea API Documentation.