DevUp Docs

Chat Completions

Log Probabilities

Get per-token log probabilities from LLM responses.

You can retrieve the log probability of each generated token. This is useful for uncertainty estimation, token-level filtering, confidence scoring, or building custom sampling logic.

Log probabilities are supported across all request modes:

  • OpenAI-compatible API — using logprobs and top_logprobs parameters
  • DEVUP Native API — streaming and non-streaming

OpenAI-compatible API

Set logprobs: true in your request. Optionally set top_logprobs (an integer from 1 to 20) to also receive that many of the most likely alternative tokens at each position.

import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEVUP_API_KEY"],
    base_url="https://api.devupai.com/v1",
)

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V3",
    messages=[{"role": "user", "content": "Say hello in one word"}],
    logprobs=True,
    top_logprobs=3,
)

for token in response.choices[0].logprobs.content:
    print(f"{token.token!r}: {token.logprob:.4f}")
    for alt in token.top_logprobs:
        print(f"  alt {alt.token!r}: {alt.logprob:.4f}")
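Per-token values can be aggregated into a sequence-level score. A minimal sketch (the helper name and the example logprob values are illustrative, not part of the API): summing the logprobs gives the log of the joint probability of the completion, and exponentiating the negated length-normalized sum gives perplexity.

```python
import math

def sequence_confidence(logprobs):
    """Summarize per-token log probabilities for a whole completion."""
    total = sum(logprobs)                # log of the joint sequence probability
    joint_prob = math.exp(total)         # probability of this exact token sequence
    avg = total / len(logprobs)          # length-normalized log probability
    perplexity = math.exp(-avg)          # lower means more confident
    return joint_prob, perplexity

# With a real response, collect the scores first:
#   scores = [t.logprob for t in response.choices[0].logprobs.content]
joint, ppl = sequence_confidence([-0.0023, -1.42, -0.31])
```

Length normalization matters when comparing completions of different lengths, since the raw joint probability shrinks with every additional token.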

Response structure:

json
{
  "choices": [{
    "logprobs": {
      "content": [
        {
          "token": "Hello",
          "logprob": -0.0023,
          "top_logprobs": [
            {"token": "Hello", "logprob": -0.0023},
            {"token": "Hi", "logprob": -1.42},
            {"token": "Hey", "logprob": -3.87}
          ]
        }
      ]
    }
  }]
}

DEVUP Native API (streaming)

The native streaming API returns log probabilities inline with each token as it is generated.

bash
curl -X POST \
  -d '{"input": "I have this dream", "stream": true}' \
  -H "Authorization: Bearer $DEVUP_API_KEY" \
  -H 'Content-Type: application/json' \
  "https://api.devupai.com/v1/inference/deepseek-ai/DeepSeek-V3"

Response (streamed):

json
data: {"token": {"id": 29892, "text": ",", "logprob": -2.65625, "special": false}, "generated_text": null, "details": null}
data: {"token": {"id": 988, "text": " where", "logprob": -0.39575195, "special": false}, "generated_text": null, "details": null}
data: {"token": {"id": 1432, "text": " every", "logprob": -3.15625, "special": false}, "generated_text": null, "details": null}
data: {"token": {"id": 931, "text": " time", "logprob": -0.1385498, "special": false}, "generated_text": null, "details": null}
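Each streamed line is a data: prefix followed by a JSON payload, so extracting token text and logprob is a small parsing step. A minimal sketch, assuming the event shape shown above (the function name is illustrative):

```python
import json

def parse_sse_line(line):
    """Parse one `data: {...}` line from the native streaming response.

    Returns (token_text, logprob), or None for lines that carry no data payload.
    """
    if not line.startswith("data:"):
        return None
    event = json.loads(line[len("data:"):].strip())
    tok = event["token"]
    return tok["text"], tok["logprob"]

line = 'data: {"token": {"id": 988, "text": " where", "logprob": -0.39575195, "special": false}, "generated_text": null, "details": null}'
# → (" where", -0.39575195)
```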

The logprob field is the natural logarithm (base e) of the token's probability. Values close to 0 indicate high confidence; more negative values indicate less likely tokens.
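To recover the underlying probability, exponentiate the logprob. For example, the " time" token above had a logprob of -0.1385498:

```python
import math

# A logprob of -0.1385498 corresponds to a probability of roughly 0.87,
# i.e. the model was fairly confident in this token.
prob = math.exp(-0.1385498)
```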