Integrations
Anthropic SDK & Claude Code
Use DEVUP AI models with the Anthropic Messages API, Claude Code, and the Anthropic SDK.
DEVUP AI exposes an Anthropic-compatible Messages API. This means tools that target the Anthropic API — Claude Code, the Anthropic Python and TypeScript SDKs, and any framework with an Anthropic adapter — can point at DEVUP AI and use open-source models.
Endpoint
https://api.devupai.com/anthropic

Two endpoints are available:
| Endpoint | Description |
| --- | --- |
| POST /anthropic/v1/messages | Create a message (chat completion) |
| POST /anthropic/v1/messages/count_tokens | Count tokens for a message request |
Authentication
Both standard Anthropic authentication methods are supported:
| Header | Value |
| --- | --- |
| Authorization | Bearer $DEVUP_API_KEY |
| x-api-key | $DEVUP_API_KEY |
You can also pass anthropic-version and anthropic-beta headers as needed.
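For example, a raw request to the messages endpoint using the x-api-key header (a sketch; the anthropic-version value shown is the standard Anthropic API version string, and the model is one of DEVUP AI's identifiers):

```shell
curl "https://api.devupai.com/anthropic/v1/messages" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $DEVUP_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "deepseek-ai/DeepSeek-V3",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

Using the Authorization: Bearer form instead is equivalent; both carry the same key.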
Using the Anthropic SDK
```shell
pip install anthropic
```

```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.devupai.com/anthropic",
    api_key="$DEVUP_API_KEY",
)

message = client.messages.create(
    model="deepseek-ai/DeepSeek-V3",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello!"}
    ],
)

print(message.content[0].text)
```

Using with Claude Code
Claude Code can use DEVUP AI as its backend. To keep your normal Claude Code setup untouched, add a dedicated shell function to your ~/.bashrc or ~/.zshrc:
```shell
devupai() {
  export ANTHROPIC_BASE_URL="https://api.devupai.com/anthropic"
  export ANTHROPIC_AUTH_TOKEN="$DEVUP_API_KEY"
  export ANTHROPIC_MODEL="deepseek-ai/DeepSeek-V3"
  claude "$@"
}
```

Then run devupai instead of claude to launch Claude Code via DEVUP AI. Your regular claude command stays unchanged.
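One design note: export inside the function leaves those variables set in your interactive shell after devupai returns, which can affect later plain claude runs in the same session. A variant (a sketch, relying on per-command environment assignments) passes them only to the single claude invocation:

```shell
# Same function, but the variables exist only for the claude process,
# not in the surrounding shell afterwards.
devupai() {
  ANTHROPIC_BASE_URL="https://api.devupai.com/anthropic" \
  ANTHROPIC_AUTH_TOKEN="$DEVUP_API_KEY" \
  ANTHROPIC_MODEL="deepseek-ai/DeepSeek-V3" \
  claude "$@"
}
```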
Model override environment variables
Claude Code uses model aliases (opus, sonnet, haiku) internally. You can remap each alias to a DEVUP AI model using these environment variables:
| Variable | Example value |
| --- | --- |
| ANTHROPIC_MODEL | deepseek-ai/DeepSeek-V3 |
| ANTHROPIC_DEFAULT_OPUS_MODEL | deepseek-ai/DeepSeek-R1 |
| ANTHROPIC_DEFAULT_SONNET_MODEL | deepseek-ai/DeepSeek-V3 |
| ANTHROPIC_DEFAULT_HAIKU_MODEL | Qwen/Qwen3-30B-A3B |
| CLAUDE_CODE_SUBAGENT_MODEL | Qwen/Qwen3-30B-A3B |
A more complete example with all overrides:
```shell
devupai() {
  export ANTHROPIC_BASE_URL="https://api.devupai.com/anthropic"
  export ANTHROPIC_AUTH_TOKEN="$DEVUP_API_KEY"
  export ANTHROPIC_MODEL="deepseek-ai/DeepSeek-V3"
  export ANTHROPIC_DEFAULT_OPUS_MODEL="deepseek-ai/DeepSeek-R1"
  export ANTHROPIC_DEFAULT_SONNET_MODEL="deepseek-ai/DeepSeek-V3"
  export ANTHROPIC_DEFAULT_HAIKU_MODEL="Qwen/Qwen3-30B-A3B"
  export CLAUDE_CODE_SUBAGENT_MODEL="Qwen/Qwen3-30B-A3B"
  export CLAUDE_CODE_MAX_OUTPUT_TOKENS=16384
  claude "$@"
}
```

ANTHROPIC_DEFAULT_HAIKU_MODEL is used for lightweight background tasks like tab completions and commit messages, so pick a fast, cheap model here to keep costs low. The older ANTHROPIC_SMALL_FAST_MODEL variable is deprecated; use ANTHROPIC_DEFAULT_HAIKU_MODEL instead.

Streaming
Streaming works the same way as with the Anthropic API: use stream=True (Python) or stream: true (JS/cURL):
```python
with client.messages.stream(
    model="deepseek-ai/DeepSeek-V3",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a short poem about open source."}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```

Token counting
```shell
curl "https://api.devupai.com/anthropic/v1/messages/count_tokens" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $DEVUP_API_KEY" \
  -d '{
    "model": "deepseek-ai/DeepSeek-V3",
    "messages": [{"role": "user", "content": "Hello, how are you?"}]
  }'
```

Notes
- You are running open-source models via the Anthropic protocol, not Anthropic's Claude models.
- Model names use DEVUP AI identifiers (e.g., deepseek-ai/DeepSeek-V3), not Anthropic model names.
- Not all Anthropic-specific features may be supported. Standard message creation, streaming, and token counting work as expected.