howtonotcode.com

openai-python

AI Tool

A Python library for interacting with OpenAI's API services.

2 stories · First seen: 2026-02-11 · Last seen: 2026-02-24

Stories


OpenAI speeds up agent backends with Responses API WebSockets and gpt‑realtime‑1.5

OpenAI shipped a faster path for real-time, tool-calling agents by adding WebSockets to the Responses API and upgrading its voice model to gpt-realtime-1.5. OpenAI reports that [gpt-realtime-1.5](https://the-decoder.com/openai-ships-api-upgrades-targeting-voice-reliability-and-agent-speed-for-developers/) improves number/letter transcription (~10%), logical audio tasks (~5%), and instruction following (~7%), while the Responses API now supports [WebSockets](https://the-decoder.com/openai-ships-api-upgrades-targeting-voice-reliability-and-agent-speed-for-developers/) so agents can stream state and tool calls without resending the full context, yielding a claimed 20–40% speedup on complex graphs. For production use, OpenAI's docs emphasize hardened patterns: capability encapsulation via [Skills](https://developers.openai.com/api/docs/guides/tools-skills/) and secure prompting/tooling per [Cybersecurity checks](https://developers.openai.com/api/docs/guides/safety-checks/cybersecurity), while the cookbook on [long-horizon Codex tasks](https://developers.openai.com/cookbook/examples/codex/long_horizon_tasks/) remains relevant for workflows that still need multi-hour execution. Ecosystem notes: the Python SDK [v2.24.0](https://github.com/openai/openai-python/releases/tag/v2.24.0) adds a new API "phase" enum, and community threads flag rough edges such as fine-tune inconsistencies between the Chat and Responses APIs with GPT-4o, transient 401s on vector store creation, and disappearing service-account keys (see the OpenAI forum).
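The claimed speedup comes from not resending the full conversation on every turn over a persistent connection. A minimal local sketch of that delta pattern (the event name and payload shape here are illustrative assumptions, not the documented wire format):

```python
import json


class DeltaSession:
    """Tracks which context items have already been sent over a
    persistent connection, so each turn serializes only new items."""

    def __init__(self):
        self._sent = 0
        self._items = []

    def add(self, role, content):
        self._items.append({"role": role, "content": content})

    def next_frame(self):
        # Only the items added since the last frame go over the wire.
        delta = self._items[self._sent:]
        self._sent = len(self._items)
        # "input.delta" is a made-up event type for illustration.
        return json.dumps({"type": "input.delta", "items": delta})


session = DeltaSession()
session.add("user", "What's the weather in Paris?")
frame1 = session.next_frame()  # carries the one new user item
session.add("tool", "18°C, clear")
frame2 = session.next_frame()  # carries only the new tool result
```

On a stateless HTTP API, `frame2` would instead have to repeat the entire history; the WebSocket model keeps that history server-side, which is where the reported gains on complex agent graphs would come from.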

2026-02-24
openai gpt-realtime-15 responses-api realtime-api openai-python

OpenAI Python SDK adds Batch API image support, context management

OpenAI’s Python SDK shipped three quick releases adding Batch API image support, Responses context management, and new skills/hosted-shell features, alongside community-reported deployment and fine-tuning pitfalls. The notes for [v2.20.0](https://github.com/openai/openai-python/releases/tag/v2.20.0)[^1], [v2.18.0](https://github.com/openai/openai-python/releases/tag/v2.18.0)[^2], and [v2.19.0](https://github.com/openai/openai-python/releases/tag/v2.19.0)[^3], plus the [API docs](https://developers.openai.com/api/docs)[^4], confirm image inputs in the Batch API and context_management in the Responses API, while a thread on [401 ip_not_authorized on Render](https://community.openai.com/t/401-ip-not-authorized-on-render-works-locally-no-ip-allow-list-visible/1373825#post_2)[^5] flags network-allowlist gotchas and another on [vision fine-tuning failures](https://community.openai.com/t/why-does-my-vision-fine-tuning-job-keep-failing/1371510#post_6)[^6] highlights pipeline stability issues.

[^1]: Release notes confirming Batch API image support.
[^2]: Release notes detailing Responses API context_management.
[^3]: Release notes introducing skills and hosted shell.
[^4]: Official API docs for capabilities, limits, and best practices.
[^5]: Community report on IP allowlist/auth issues when deploying to Render.
[^6]: Community report on recurring failures in vision fine-tuning jobs.
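Batch jobs are submitted as a JSONL file of per-request lines. A sketch of one such line carrying an image, using the standard Chat Completions image-content shape (that the Batch endpoint accepts it is per the v2.20.0 release notes; the model name and URL are placeholders):

```python
import json

# One JSONL line for a Batch API request that includes an image input.
# custom_id/method/url/body is the documented batch-line envelope;
# the image_url content part is the Chat Completions vision format.
line = {
    "custom_id": "img-001",
    "method": "POST",
    "url": "/v1/chat/completions",
    "body": {
        "model": "gpt-4o-mini",  # placeholder model
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/cat.png"}},
            ],
        }],
    },
}

# One line of the .jsonl file uploaded before calling client.batches.create.
jsonl_line = json.dumps(line)
```

Each request in the file gets its own line like this; the `custom_id` is how you match results back when the batch completes.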

2026-02-10
openai openai-python openai-batch-api openai-responses-api render