
Introduction to Chat

Welcome to the Chat service documentation! This service provides a fully OpenAI-compatible API, allowing you to use the official OpenAI SDK or any OpenAI-compatible client to access our powerful language models.

🌟 Key Features

OpenAI Compatible

Our Chat API is 100% compatible with OpenAI’s Chat Completions API. This means:
  • ✅ Use the official OpenAI Python/Node.js SDK directly
  • ✅ No code changes needed - just update base_url and api_key
  • ✅ Full support for streaming responses
  • ✅ Support for Function Calling / Tools (see the sketch after this list)
  • ✅ Support for JSON Mode and structured outputs
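
Because the API mirrors OpenAI's Chat Completions interface, function calling uses the same tools parameter and JSON Schema format. The snippet below is a minimal sketch of a tool-calling request; the get_weather function and its schema are illustrative only, and it assumes the service passes the standard tools and tool_calls fields through unchanged.

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

# Illustrative tool definition in the standard OpenAI JSON Schema format
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"]
        }
    }
}]

response = client.chat.completions.create(
    model="gpt-5.1",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools
)

# If the model chose to call the tool, the call appears on the message
message = response.choices[0].message
if message.tool_calls:
    print(message.tool_calls[0].function.name, message.tool_calls[0].function.arguments)
else:
    print(message.content)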

Supported Models

We provide access to various high-quality language models:
Model                   Description
gpt-5.1                 Latest GPT-5.1 model
gpt-5.1-all             GPT-5.1 with all capabilities
gpt-5.1-thinking        GPT-5.1 with enhanced reasoning
gpt-5.1-thinking-all    GPT-5.1 thinking with all capabilities
gpt-5.2                 Latest GPT-5.2 model
gemini-3-pro            Google Gemini 3 Pro
gemini-2.5-pro          Google Gemini 2.5 Pro
gemini-2.5-flash        Google Gemini 2.5 Flash (faster)
gemini-3-flash          Google Gemini 3 Flash (faster)
claude-4.5              Claude 4.5 model
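
Model availability can differ between API keys, so it can be useful to check which of these models your key can actually access. A minimal sketch, assuming the service implements the standard OpenAI model-listing behavior:

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

# GET /chat/models - list the models available to this API key
for model in client.models.list():
    print(model.id)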

🚀 Quick Start

Using OpenAI Python SDK

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

response = client.chat.completions.create(
    model="gpt-5.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)

Using OpenAI Node.js SDK

import OpenAI from 'openai';

const client = new OpenAI({
    apiKey: 'your-api-key',
    baseURL: 'https://api.mountsea.ai/chat'
});

const response = await client.chat.completions.create({
    model: 'gpt-5.1',
    messages: [
        { role: 'system', content: 'You are a helpful assistant.' },
        { role: 'user', content: 'Hello!' }
    ]
});

console.log(response.choices[0].message.content);

Streaming Responses

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

stream = client.chat.completions.create(
    model="gpt-5.1",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

Using cURL

curl https://api.mountsea.ai/chat/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key" \
  -d '{
    "model": "gpt-5.1",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ]
  }'

📋 Available Endpoints

OpenAI Compatible API

Endpoint                  Method  Description
/chat/chat/completions    POST    Create a chat completion
/chat/models              GET     List available models
/chat/models/{model}      GET     Get model details
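
The two model endpoints map directly onto the SDK's models helpers. A minimal sketch, assuming the responses follow the standard OpenAI model object format:

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.mountsea.ai/chat"
)

# GET /chat/models/{model} - fetch details for a single model
model = client.models.retrieve("gpt-5.1")
print(model.id, model.owned_by)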

Gemini Native API

Endpoint                                                        Method  Description
/chat/gemini/{apiVersion}/models/{model}:generateContent        POST    Generate content (non-streaming)
/chat/gemini/{apiVersion}/models/{model}:streamGenerateContent  POST    Generate content (streaming)

The Gemini Native API allows you to use the official @google/genai SDK directly. See Gemini API for details.
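
As an example, the Python counterpart of @google/genai (the google-genai package) can be pointed at this service by overriding its base URL. This is only a sketch: the base_url below and passing the key through the SDK's api_key parameter are assumptions based on the endpoint paths above, so check the Gemini API page for the authoritative setup.

from google import genai
from google.genai import types

# Assumption: the SDK appends /{apiVersion}/models/{model}:generateContent
# to the base URL, so we point it at the /chat/gemini prefix shown above.
client = genai.Client(
    api_key="your-api-key",
    http_options=types.HttpOptions(base_url="https://api.mountsea.ai/chat/gemini"),
)

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Hello!",
)
print(response.text)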

Explore the API Documentation

API reference documentation is available for both the OpenAI-compatible endpoints and the Gemini-native endpoints. For more details, visit the full MountSea API Documentation.