GENERAL PUB_DATE: 2026.01.10

OpenAI Python SDK v2.15.0 adds Response.completed_at

OpenAI's Python client v2.15.0 adds a completed_at property on Response, exposing the server-side finish timestamp for requests. This enables cleaner latency/tracing metrics and easier event ordering. The release also includes internal codegen updates and notes no breaking changes.

[ WHY_IT_MATTERS ]
01.

You can measure model latency and enforce SLAs using a server-side completion time.

02.

It simplifies correlating responses across logs and streams for tracing.
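A minimal sketch of how a server-side completion timestamp supports latency measurement. The `ResponseStub` class and the assumption that `created_at`/`completed_at` are Unix timestamps in seconds are illustrative only; verify the field's actual type and units against the SDK before relying on this.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-in for the SDK's Response object. Field names mirror
# the release notes, but the types/units (Unix seconds) are assumptions.
@dataclass
class ResponseStub:
    created_at: float
    completed_at: Optional[float] = None

def server_latency_seconds(resp: ResponseStub) -> Optional[float]:
    """Server-side latency, or None when completed_at is absent."""
    if resp.completed_at is None:
        return None
    return resp.completed_at - resp.created_at

# A response created at t=100.0 and completed at t=101.5
print(server_latency_seconds(ResponseStub(created_at=100.0, completed_at=101.5)))  # 1.5
```

Deriving latency from two server-side timestamps avoids mixing client and server clocks, which is what makes SLA enforcement meaningful.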

[ WHAT_TO_TEST ]
  • 01.

    Verify completed_at is present and correctly populated for both non-streaming and streaming responses, and confirm its type/units before persisting.

  • 02.

    Update metrics/log schemas and parsers to include completed_at while remaining backward compatible with older responses that lack it.

[ BROWNFIELD_PERSPECTIVE ]

Legacy codebase integration strategies...

  • 01.

    If you serialize or log Response objects, add completed_at to schemas and handle nulls for historical data.

  • 02.

    Audit dashboards/SLIs to safely merge new completed_at-based latency with existing measurements without breaking alerts.
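One way to apply the null-handling advice above when serializing responses for logs. The record shape and class names are illustrative, not the SDK's; the key property is that `completed_at` serializes as JSON `null` for legacy data instead of raising.

```python
import json

def response_log_record(resp: object) -> str:
    """Serialize the fields we log; completed_at is optional so records
    built from pre-v2.15.0 responses emit null rather than erroring."""
    return json.dumps({
        "id": getattr(resp, "id", None),
        "created_at": getattr(resp, "created_at", None),
        "completed_at": getattr(resp, "completed_at", None),  # null for legacy data
    })

class LegacyResponse:  # hypothetical pre-v2.15.0 shape: no completed_at
    id = "resp_legacy"
    created_at = 1767052800

record = response_log_record(LegacyResponse())
print(record)  # {"id": "resp_legacy", "created_at": 1767052800, "completed_at": null}
```

Downstream parsers then treat a null `completed_at` as "unknown" and fall back to existing client-side latency measurements, which keeps historical dashboards and alerts intact.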

[ GREENFIELD_PERSPECTIVE ]

Fresh architecture paradigms...

  • 01.

    Standardize on completed_at for latency/tracing and emit metrics at request start and on response receipt.

  • 02.

    In streaming flows, capture completed_at from the final message and propagate it through your pipelines.
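The streaming pattern above, sketched with stub event objects. The event type string `"response.completed"` and the field layout are assumptions to check against the SDK's streaming documentation; the recoverable idea is scanning for the terminal event and propagating its timestamp.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

# Hypothetical stand-ins for streaming events; verify real event and
# field names against the SDK before using this pattern.
@dataclass
class FinalResponse:
    completed_at: Optional[float]

@dataclass
class StreamEvent:
    type: str
    response: Optional[FinalResponse] = None

def completed_at_from_stream(events: Iterable[StreamEvent]) -> Optional[float]:
    """Scan a stream and pull completed_at off the terminal event."""
    ts = None
    for event in events:
        if event.type == "response.completed" and event.response is not None:
            ts = event.response.completed_at
    return ts

events = [
    StreamEvent("response.output_text.delta"),
    StreamEvent("response.completed", FinalResponse(completed_at=1767052801.0)),
]
print(completed_at_from_stream(events))  # 1767052801.0
```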