howtonotcode.com

Azure

Platform

Azure is a cloud computing platform by Microsoft.

5 stories · First seen: 2026-02-09 · Last seen: 2026-02-17 · Website · Wikipedia

Resources

Links to check for updates: homepage, feed, or git repo.

Homepage

Stories

Showing 1-5 of 5

Choosing your LLM lane: fast modes, Azure guardrails, and lock‑in risks

Picking between Azure OpenAI, OpenAI, and Anthropic now requires balancing fast‑mode latency tradeoffs, enterprise guardrails, and ecosystem lock‑in that will shape your backend and data pipelines. Kellton’s guide argues that Microsoft’s Azure OpenAI service wraps OpenAI models in an enterprise‑ready envelope with compliance certifications, data residency, and cost control via reserved capacity, while integrating natively with Azure services ([overview](https://www.kellton.com/kellton-tech-blog/azure-openai-enterprise-business-intelligence-automation)). On performance, Sean Goedecke contrasts “fast mode” implementations: Anthropic’s approach serves the primary model at roughly 2.5x higher token throughput, while OpenAI’s delivers over 1,000 tokens/sec via a faster, separate variant that can be less reliable for tool calls; he hypothesizes that Anthropic leans on low‑batch inference and OpenAI on specialized Cerebras hardware ([analysis](https://www.seangoedecke.com/fast-llm-inference/)). A contemporaneous perspective frames OpenAI vs. Anthropic as a fight to control developer defaults: your provider choice becomes a dependency that dictates pricing, latency profile, and roadmap gravity, not just model quality ([viewpoint](https://medium.com/@kakamber07/openai-vs-anthropic-is-not-about-ai-its-about-who-controls-developers-51ef2232777e)).
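Throughput claims like these are easy to verify against your own workloads. A minimal sketch of measuring tokens/sec from a streaming client, with the streams simulated here rather than calling any real provider API:

```python
import time

def measure_throughput(token_stream):
    """Consume a token iterator and return (token_count, tokens_per_second)."""
    start = time.perf_counter()
    count = sum(1 for _ in token_stream)
    elapsed = time.perf_counter() - start
    return count, count / elapsed if elapsed > 0 else 0.0

def simulated_stream(n_tokens, tokens_per_sec):
    """Stand-in for a streaming API: yields tokens at a fixed rate."""
    for _ in range(n_tokens):
        time.sleep(1.0 / tokens_per_sec)
        yield "tok"

# Simulated baseline vs. a "fast mode" at ~2.5x the token rate.
base_count, base_tps = measure_throughput(simulated_stream(50, 100))
fast_count, fast_tps = measure_throughput(simulated_stream(50, 250))
print(f"baseline: {base_tps:.0f} tok/s, fast mode: {fast_tps:.0f} tok/s")
```

Swapping the simulated generator for a real provider’s streaming iterator gives a like-for-like comparison on your own prompts.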

2026-02-17
azure-openai-service azure microsoft openai anthropic

Copilot CLI stabilizes for long sessions as IDEs move to agentic, team‑scoped AI

GitHub Copilot CLI’s latest update focuses on memory reductions and long‑session stability while IDE workflows and AI agents mature around team‑level customization and modernization tasks. GitHub Copilot CLI v0.0.410 ships broad stability improvements—fixing high memory usage under rapid logging, reducing streaming overhead, improving long‑session compaction, and adding ergonomic shell features like Ctrl+Z suspend/resume, Page Up/Down scrolling, repo‑level validation toggles, and an IDE status indicator when connected ([release notes](https://github.com/github/copilot-cli/releases)). The momentum aligns with a wider agentic shift: The New Stack frames VS Code as a “multi‑agent command center” for developers ([coverage](https://thenewstack.io/vs-code-becomes-multi-agent-command-center-for-developers/)), and Microsoft’s Copilot App Modernization details AI agents that assess, upgrade, containerize, and deploy .NET/Java apps to Azure in days ([deep dive](https://itnext.io/how-microsoft-is-using-ai-agents-to-turn-8-month-app-modernizations-into-days-a-technical-deep-8340a33513e7)). For IDE standardization, JetBrains/Android Studio Copilot customizations support workspace‑scoped settings committed under .github so teams can share constraints and conventions across projects ([guide](https://www.telefonica.com/en/communication-room/blog/github-copilot-android-studio-customization/)); also watch cost dynamics—one report shows OpenCode using far more credits than Copilot CLI for the same prompt, warranting usage instrumentation and policy checks ([user report](https://www.reddit.com/r/GithubCopilot/comments/1r2fhs2/opencode_vs_github_copilot_cli_huge_credit_usage/)).
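The credit-usage discrepancy above argues for basic instrumentation before standardizing on a tool. A minimal sketch of per-tool credit accounting with a policy check; the tool names, numbers, and budget are illustrative, not drawn from any real billing API:

```python
from collections import defaultdict

class UsageTracker:
    """Track credits spent per AI CLI tool and flag tools over a fixed budget."""
    def __init__(self, budget_per_tool: float):
        self.budget = budget_per_tool
        self.spent = defaultdict(float)

    def record(self, tool: str, credits: float) -> None:
        self.spent[tool] += credits

    def over_budget(self) -> list:
        return [tool for tool, total in self.spent.items() if total > self.budget]

tracker = UsageTracker(budget_per_tool=100.0)
tracker.record("copilot-cli", 12.0)
tracker.record("opencode", 140.0)  # hypothetical: same prompt, far higher burn
print(tracker.over_budget())  # -> ['opencode']
```

In practice the `record` calls would be fed from each tool’s usage reporting, and `over_budget` wired into whatever policy gate your team already runs.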

2026-02-12
github-copilot-cli github visual-studio-code android-studio jetbrains

LLM safety erosion: single-prompt fine-tuning and URL preview data leaks

Enterprise fine-tuning and common chat UI features can quickly undermine LLM safety and silently exfiltrate data, so treat agentic AI security as a lifecycle with zero‑trust controls and gated releases. Microsoft’s GRP‑Obliteration shows that a single harmful prompt used with GRPO can collapse guardrails across several model families, reframing safety as an ongoing process rather than a one‑time alignment step [InfoWorld](https://www.infoworld.com/article/4130017/single-prompt-breaks-ai-safety-in-15-major-language-models-2.html)[^1], a finding reinforced by a recap urging teams to add safety evaluations to CI/CD pipelines [TechRadar](https://www.techradar.com/pro/microsoft-researchers-crack-ai-guardrails-with-a-single-prompt)[^2]. Separately, researchers demonstrate that automatic URL previews can exfiltrate sensitive data via prompt‑injected links, and a practical release checklist outlines SDLC gates to verify value, trust, and safety before launching agents [WebProNews](https://www.webpronews.com/the-silent-leak-how-url-previews-in-llm-powered-tools-are-quietly-exfiltrating-sensitive-data/)[^3] [InfoWorld](https://www.infoworld.com/article/4105884/10-essential-release-criteria-for-launching-ai-agents.html)[^4].

[^1]: Adds: original reporting on Microsoft’s GRP‑Obliteration results and cross‑model safety degradation.
[^2]: Adds: lifecycle framing and guidance to integrate safety evaluations into CI/CD.
[^3]: Adds: concrete demonstration of URL‑preview data exfiltration via prompt injection (OpenClaw case study).
[^4]: Adds: actionable release‑readiness checklist for AI agents (metrics, testing, governance).
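The URL-preview leak suggests gating which links a chat UI auto-fetches. A hedged sketch, assuming an allowlist policy (the hosts here are made up) and treating any query string in model output as a potential exfiltration channel:

```python
import re
from urllib.parse import urlparse, parse_qs

# Hypothetical allowlist; real deployments would load this from policy config.
ALLOWED_PREVIEW_HOSTS = {"docs.example.com", "github.com"}

URL_RE = re.compile(r"https?://\S+")

def safe_preview_urls(model_output: str) -> list:
    """Return only URLs safe to auto-preview: allowlisted host and no
    query-string parameters that could smuggle injected data out."""
    safe = []
    for url in URL_RE.findall(model_output):
        parsed = urlparse(url)
        if parsed.hostname not in ALLOWED_PREVIEW_HOSTS:
            continue  # unknown host: never fetch automatically
        if parse_qs(parsed.query):
            continue  # query params can carry exfiltrated secrets
        safe.append(url)
    return safe

output = ("See https://docs.example.com/guide and "
          "https://evil.example.net/log?secret=API_KEY_123")
print(safe_preview_urls(output))  # -> ['https://docs.example.com/guide']
```

This is deliberately conservative: blocked links can still render as plain text for the user to click, so the gate only removes the *automatic* fetch that makes the exfiltration silent.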

2026-02-10
microsoft azure gpt-oss deepseek-r1-distill google

VS Code Copilot Chat v0.38 (pre-release): Claude GA, memory tool, and CLI integration updates

VS Code Copilot Chat v0.38 (pre-release) graduates Claude from preview, adds Anthropic memory tooling (including local memory), renames /summarize to /compact with optional instructions, and migrates the Copilot CLI integration. See the extension’s pre-release notes for Anthropic memory tool support and checks, the Claude graduation, the /summarize ➜ /compact rename, subagent improvements, hooks stopReason/warningMessage, telemetry fixes, and the Copilot CLI integration migration [release notes](https://github.com/microsoft/vscode-copilot-chat/releases)[^1]. For enterprise enablement and procurement, this guide outlines how to subscribe to GitHub Copilot via Azure [implementation path](https://medium.com/@addozhang/subscribing-to-github-copilot-via-azure-enterprise-ai-programming-assistant-implementation-path-2504adeff1d8)[^2].

[^1]: Adds: official v0.38 pre-release changelog with specific features and fixes.
[^2]: Adds: enterprise subscription route via Azure for rolling out Copilot.

2026-02-07
vs-code-copilot-chat github-copilot copilot-cli claude claude-code