Visit NinjaChat: https://ninjachat.ai/
In this video, I'll walk you through the big Codex upgrades: Rust-powered terminal speed, GPT-5 model options, ChatGPT subscription support, a new VS Code extension, cloud-run tasks, and how to plug in other providers like DeepSeek via Requesty/OpenRouter.
—
Key Takeaways:
- Codex CLI has been ported to Rust for much faster terminal performance.
- Works with your ChatGPT subscription and supports GPT-5 High/Medium/Low.
- New Codex VS Code extension with Chat, Agent, and Agent Full Access modes.
- Run tasks in the cloud with automatic GitHub repo detection and progress tracking.
- Settings include reasoning effort, manual context, open-file context, and MCP support.
- Switch models with the /model slash command; use Requesty/OpenRouter and DeepSeek seamlessly.
- API keys are passed via environment variables; launch VS Code from a terminal so it inherits them.
- Snappier and lower memory usage than Roo; clean edited-files list and focused UI.
- Dynamic rate limits; the $20 ChatGPT plan is great value, and the $200 plan is mostly unlimited.
- Comparable to Gemini Code Assist; Codex + GPT-5 tool calls feel well-optimized.
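The environment-variable point above can be sketched like this (a minimal sketch; OPENROUTER_API_KEY follows OpenRouter's convention, but check your provider's docs for the exact variable name):

```shell
# Export the provider API key in your shell (or add this to ~/.bashrc / ~/.zshrc)
# Variable name assumed per OpenRouter convention; adjust for your provider.
export OPENROUTER_API_KEY="sk-or-..."

# Launch VS Code from this same terminal so the Codex extension inherits the variable:
#   code .
```

Launching VS Code from the Dock or Start menu skips your shell profile, which is why starting it from a terminal matters here.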
—
Timestamps:
00:00 – Introduction
00:07 – GPT-5 + ChatGPT integration and Rust speed
02:09 – Codex VS Code extension and modes
05:28 – NinjaChat
06:18 – Usage
08:33 – Setting up Codex with DeepSeek
10:32 – Ending
