DeepSeek-TUI Integration

Connect DeepSeek-TUI to QCode.cc: run Claude models in your terminal via an OpenAI-compatible provider

DeepSeek-TUI is a command-line AI coding agent that recently went viral (over 23K GitHub stars in 10 days) and ships with native support for OpenAI-compatible providers. Its ~/.deepseek/config.toml lets you switch the default provider: set provider = "openai" and configure base_url and api_key to point the backend at any service that implements the OpenAI Chat Completions API.

This guide shows you how to connect DeepSeek-TUI to QCode.cc so you can run Claude models behind a familiar TUI — no login, no regional restrictions, sharing the same API key and quota as Claude Code.

Why use DeepSeek-TUI with QCode

  • Familiar TUI experience: Plan / Agent / YOLO modes plus built-in MCP / Shell / Git / sub-agents
  • One API key: Shares the QCode plan quota with Claude Code and Codex CLI
  • Multi-provider switching: Flip between anthropic / ollama / vllm inside the same tool for easy debugging
  • China-friendly: direct HTTP access via the Shenzhen endpoint 103.236.53.153 for the lowest latency
  • Fully open source: MIT license, auditable configuration

1. Installation

Pick any one of the following (see the official INSTALL.md):

# npm (recommended, auto-downloads the platform binary)
npm install -g deepseek-tui

# Homebrew (macOS)
brew tap Hmbown/deepseek-tui
brew install deepseek-tui

# Scoop (Windows)
scoop install deepseek-tui

# Cargo (build from source)
cargo install deepseek-tui-cli --locked
cargo install deepseek-tui --locked

Verify the install:

deepseek --version

2. Configure ~/.deepseek/config.toml

Switch the default provider to openai and point the [providers.openai] sub-table at QCode.cc:

# ~/.deepseek/config.toml

provider = "openai"

[providers.openai]
api_key  = "cr_your_qcode_api_key"
base_url = "https://api.qcode.cc/openai/v1"
model    = "claude-sonnet-4-6"

Field reference:

Field      Description
provider   Top-level "openai" makes the OpenAI-compatible provider the default
api_key    Obtained from the QCode.cc console; starts with cr_
base_url   No trailing slash. DeepSeek-TUI appends /chat/completions itself; a trailing slash produces //chat/completions and returns 404
model      A Claude model ID exposed by QCode (see "Available models" below)
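The trailing-slash rule can be sanity-checked with a short sketch. It assumes the client joins base_url and the path by plain string concatenation, which is the joining behavior the rule describes; the helper name is illustrative:

```python
def chat_completions_url(base_url: str) -> str:
    """Join base_url and the fixed endpoint path the way a naive
    OpenAI-compatible client does: plain concatenation, no slash
    normalization."""
    return base_url + "/chat/completions"

# Without a trailing slash the path is clean:
print(chat_completions_url("https://api.qcode.cc/openai/v1"))
# https://api.qcode.cc/openai/v1/chat/completions

# With a trailing slash a doubled "//" appears, which the server
# rejects with 404:
print(chat_completions_url("https://api.qcode.cc/openai/v1/"))
# https://api.qcode.cc/openai/v1//chat/completions
```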

For mainland China users, switch to the HTTP direct endpoint (the only endpoint that supports probe.qcode.cc request lookups):

[providers.openai]
api_key  = "cr_your_qcode_api_key"
base_url = "http://103.236.53.153/openai/v1"
model    = "claude-sonnet-4-6"

3. Environment Variable Alternative

If you'd rather skip the config file, use environment variables:

export OPENAI_API_KEY="cr_your_qcode_api_key"
export OPENAI_BASE_URL="https://api.qcode.cc/openai/v1"
export OPENAI_MODEL="claude-sonnet-4-6"

deepseek --provider openai

To make it permanent, append the lines to ~/.zshrc or ~/.bashrc.
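If you want to see what those variables translate into on the wire, here is a minimal Python sketch. The variable names mirror the exports above; the fallback values are placeholders so the snippet runs standalone:

```python
import os

# Read the same variables the openai provider consumes; the defaults
# are placeholders for a self-contained run.
api_key = os.environ.get("OPENAI_API_KEY", "cr_your_qcode_api_key")
base_url = os.environ.get("OPENAI_BASE_URL", "https://api.qcode.cc/openai/v1")
model = os.environ.get("OPENAI_MODEL", "claude-sonnet-4-6")

# The headers and endpoint an OpenAI-compatible client derives from them:
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
endpoint = base_url + "/chat/completions"
```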

4. Available Models

QCode's OpenAI-compatible endpoint exposes the full Claude lineup:

Model ID                    Recommended Use
claude-opus-4-6             Heavy planning / complex architecture design
claude-sonnet-4-6           Daily coding (recommended)
claude-haiku-4-5-20251001   Quick small tasks / low-cost scenarios

Query the full list live with curl:

curl https://api.qcode.cc/openai/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
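To pull just the Claude IDs out of that response, a short sketch, assuming the standard OpenAI-style /v1/models list shape (the payload below is a trimmed illustration, not live data):

```python
# Trimmed example of the standard OpenAI-style /v1/models response
# shape (illustrative; query the live endpoint with the curl above).
models_response = {
    "object": "list",
    "data": [
        {"id": "claude-opus-4-6", "object": "model"},
        {"id": "claude-sonnet-4-6", "object": "model"},
        {"id": "claude-haiku-4-5-20251001", "object": "model"},
    ],
}

# Keep only the Claude model IDs:
claude_ids = [m["id"] for m in models_response["data"] if m["id"].startswith("claude-")]
print(claude_ids)
```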

5. Verify Connectivity

First, use curl to check the path and authentication:

# Path connectivity (a 401 means the path is reachable but missing the auth header)
curl -X POST https://api.qcode.cc/openai/v1/chat/completions

# End-to-end with a key
curl -X POST https://api.qcode.cc/openai/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"claude-sonnet-4-6","messages":[{"role":"user","content":"ping"}],"max_tokens":32}'

If the JSON response contains choices[0].message.content, you're in. Then launch deepseek-tui:

deepseek

Type a prompt and you should see a Claude-style reply.
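The success check (choices[0].message.content) is easy to script as well. This sketch parses an illustrative response in the standard OpenAI chat.completion schema; a real reply comes from the curl call above:

```python
import json

# Trimmed chat.completion response in the standard OpenAI schema
# (illustrative payload, not a live reply).
raw = """{
  "object": "chat.completion",
  "model": "claude-sonnet-4-6",
  "choices": [
    {"index": 0,
     "message": {"role": "assistant", "content": "pong"},
     "finish_reason": "stop"}
  ]
}"""

# The field to check for a successful end-to-end call:
reply = json.loads(raw)["choices"][0]["message"]["content"]
print(reply)
# pong
```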

6. Run Modes

DeepSeek-TUI has three run modes; switch between them inside the TUI with the /mode command:

Mode    Behavior                                                   Use Case
plan    Read-only exploration; the AI cannot modify files          Code reading / design work
agent   Multi-step tool calls; key actions require user approval   Daily coding (recommended)
yolo    Auto-approves every tool call                              Trusted sandboxes

7. Alternative Endpoint Domains

If the primary domain is unstable, switch to another region — the same API key works everywhere:

Domain           Audience                        base_url
api.qcode.cc     Global (Route 53 geo-routing)   https://api.qcode.cc/openai/v1
103.236.53.153   Mainland China HTTP direct      http://103.236.53.153/openai/v1
us.qcode.cc      North America fallback          https://us.qcode.cc/openai/v1
eu.qcode.cc      Europe fallback                 https://eu.qcode.cc/openai/v1
asia.qcode.cc    Asia fallback                   https://asia.qcode.cc/openai/v1
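If you script your setup, the table above is just data; a small, hypothetical helper can pick the base_url per region with the global domain as the fallback:

```python
# The endpoint table as data (URLs from the table above); the helper
# name and region keys are illustrative, not part of DeepSeek-TUI.
ENDPOINTS = {
    "global": "https://api.qcode.cc/openai/v1",
    "cn": "http://103.236.53.153/openai/v1",
    "us": "https://us.qcode.cc/openai/v1",
    "eu": "https://eu.qcode.cc/openai/v1",
    "asia": "https://asia.qcode.cc/openai/v1",
}

def base_url_for(region: str) -> str:
    """Return the base_url for a region, falling back to the global domain."""
    return ENDPOINTS.get(region, ENDPOINTS["global"])
```

The same API key works against every entry, so switching regions is a one-line config change.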

See Endpoints and API Paths for the full picture.

8. FAQ

Why does the backend run Claude instead of DeepSeek models?

QCode.cc currently focuses on relaying Claude and Codex models — the OpenAI-compatible endpoint /openai/v1 is backed by the Claude family. To run the official DeepSeek models, switch back to DeepSeek-TUI's default deepseek provider and use https://api.deepseek.com.

How does this relate to Claude Code and Codex CLI?

All three share the same API key and quota:

  • DeepSeek-TUI speaks the OpenAI Chat Completions protocol (/openai/v1)
  • Claude Code speaks the Anthropic Messages protocol (/api/v1/messages)
  • Codex CLI speaks the OpenAI Responses protocol (/openai/v1/responses)

Internally, QCode CRS maps all three to the same Claude model pool. See Endpoints and API Paths for details.

Should base_url end with /?

No. Under the openai provider, DeepSeek-TUI automatically appends /chat/completions; a trailing slash produces //chat/completions and returns 404.

Do function calling and SSE streaming work?

Yes. Under the openai provider, DeepSeek-TUI sends the standard OpenAI Chat Completions schema, and QCode CRS fully supports tools and stream: true — the behavior matches the OpenAI Python SDK.
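For reference, this is roughly what a tool-enabled streaming request body looks like in the standard Chat Completions schema; the list_dir tool definition is a made-up example, not a DeepSeek-TUI built-in:

```python
# Illustrative request body for a streaming, tool-enabled call in the
# standard OpenAI Chat Completions schema (the tool is hypothetical).
payload = {
    "model": "claude-sonnet-4-6",
    "stream": True,  # SSE streaming
    "messages": [{"role": "user", "content": "What's in ./src?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "list_dir",
            "description": "List files in a directory",
            "parameters": {
                "type": "object",
                "properties": {"path": {"type": "string"}},
                "required": ["path"],
            },
        },
    }],
}
```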

Empty model list / Model not found?

QCode doesn't expose DeepSeek-TUI's defaults deepseek-v4-pro / deepseek-v4-flash — you must set model to a Claude model ID supported by QCode (e.g. claude-sonnet-4-6). Type it in manually; no list selection required.

How do I configure MCP / sub-agents?

DeepSeek-TUI's MCP and sub-agent configuration is decoupled from the upstream model — see the official CONFIGURATION.md. No QCode-specific configuration is required.

How do I save tokens?

  • Let the AI read in plan mode first, then switch to agent mode to make changes
  • Use claude-haiku-4-5-20251001 for simple tasks
  • Run /compact periodically on long sessions to trim context

Next Steps

Related Documents

Cursor Editor Integration
Connect QCode.cc in the Cursor IDE via custom Anthropic / OpenAI endpoints; Agents Window and Design Mode from Cursor v3 are supported
Zed Editor Integration
Connect QCode.cc to Claude Code in the Zed editor via the Agent Client Protocol (ACP) and use the agent panel with Opus 4.7 and a 1M context
Claude Desktop Integration
Set up QCode.cc as a third-party inference gateway in Claude Desktop's developer mode and share the unified QCode quota