
Codex on Windows in 2026: The Honest Setup Guide and How to Ship to Production

Francesc · 9 min read

OpenAI shipped Codex on Windows on May 13, 2026, and the change is bigger than the headline suggests. For two years, running OpenAI's coding agent on a Windows machine meant Windows Subsystem for Linux (WSL), a virtual machine, or a remote Linux box. Codex on Windows now runs natively in PowerShell, using OS-level sandboxing built on restricted tokens and filesystem access control lists instead of a Linux abstraction layer. The same week brought "Codex from anywhere" (May 14) and the broader "Running Codex safely" guardrails refresh (May 11, two days earlier). This guide is the practical setup, the real limits, and the missing layer: turning Codex output into a deployed, owned production app.

Codex on Windows desktop app running natively in PowerShell with the new Windows sandbox, alongside the Totalum MCP deploy panel for shipping the project to production

Quick Answer

  • Codex on Windows installs from the Microsoft Store or with winget install Codex -s msstore. You need an active ChatGPT plan or an OpenAI API key.
  • It runs natively in PowerShell with a Windows OS sandbox (restricted tokens plus filesystem ACLs). WSL2 still works if you prefer a Linux toolchain.
  • Parity with macOS and Linux is close but not perfect. Parallel agent threads, worktrees, in-app diffs, and CLI handoff work. Computer Use is macOS-only at launch.
  • The agent edits and runs code locally. It does NOT deploy. To ship to production, point it at an MCP server such as Totalum, which provides the database, auth, payments, hosting, and custom domain layer.
  • Trend signal: searches for "codex on windows" grew 333% quarter-over-quarter in early 2026 with CPC around $15. Buyer intent is high.

What shipped on May 11 to 14, 2026

OpenAI rolled out three Codex changes in four days. They are easier to think about as one release.

Codex on Windows (May 13, 2026). The Codex desktop app and the underlying CLI are now first-class on Windows. The app gives Windows users the same command-center surface that macOS and Linux users already had: parallel agent threads, integrated git worktrees, in-app diff review, and a built-in browser. The deeper change is the sandbox. On macOS and Linux, Codex uses Seatbelt and Landlock to isolate agent processes. On Windows, it uses restricted tokens and filesystem access control lists (ACLs), enforced by the Windows kernel. No WSL, no Docker, no Linux VM.

Codex from anywhere (May 14, 2026). OpenAI extended Codex beyond local installs. Tasks can now be delegated from the ChatGPT mobile app, the web client, or any other surface that hits the Codex API, and the agent picks up the work on a cloud machine or your local machine depending on configuration. For Windows users this matters because it removes the "I'm on my desktop, my Codex setup is on my laptop" friction.

Running Codex safely (May 11, 2026). The security follow-up post documents how OpenAI sandboxes Codex internally and what guardrails are exposed to end users. Filesystem boundaries, network egress controls, and approval gates for shell commands are the three surfaces to know.

The combined effect: Codex is now a credible default coding agent on Windows for the first time. Cursor, Cline, and Claude Code already worked on Windows. Codex was the holdout. This brings it level.

How to install Codex on Windows (5-minute setup)

The fast path is the Microsoft Store install. The CLI-first path uses winget.

Option A: Microsoft Store

  1. Open the Microsoft Store on Windows 10 or Windows 11.
  2. Search "Codex" or open the listing directly.
  3. Click Install.
  4. Sign in with your ChatGPT account or paste an OpenAI API key.
  5. Pick a workspace folder. Codex will scope its sandbox to that folder by default.

Option B: Winget

winget install Codex -s msstore

That is one command. The -s msstore flag tells winget to pull from the Microsoft Store source rather than a third-party manifest.

Option C: WSL2 (still supported)

If you already had Codex running in WSL2 with Ubuntu or Debian, nothing breaks. The Codex CLI continues to install via:

npm install -g @openai/codex

You then run it from inside your WSL distribution exactly as before. The native Windows app and the WSL CLI can coexist. They share auth state once you sign in to either surface.

Account requirements

You need one of:

  • An active ChatGPT Plus, Pro, Business, or Enterprise plan.
  • An OpenAI API key with credits on the Codex-eligible billing path.

The free ChatGPT tier does not include Codex agent runs at the time of writing. Pricing scales by the underlying model and the number of agent threads you run in parallel.

Codex on Windows vs Codex on macOS and Linux: parity table

| Feature | macOS | Linux | Windows (native) | Windows (WSL2) |
| --- | --- | --- | --- | --- |
| CLI | Yes | Yes | Yes (PowerShell) | Yes (bash/zsh) |
| Desktop app | Yes | No | Yes | N/A |
| Native OS sandbox | Yes (Seatbelt) | Yes (Landlock) | Yes (restricted tokens + ACLs) | Yes (Landlock inside WSL) |
| Parallel agent threads | Yes | Yes | Yes | Yes |
| Git worktrees | Yes | Yes | Yes | Yes |
| In-app diff review | Yes | Limited (TUI) | Yes | Limited (TUI) |
| In-app browser | Yes | No | Yes | No |
| Computer Use feature | Yes | No | Not yet | No |
| Cloud delegation ("from anywhere") | Yes | Yes | Yes | Yes |
| IDE handoff (VS Code, Cursor) | Yes | Yes | Yes | Yes |

The honest gap on Windows is Computer Use, the feature that lets Codex drive Chrome and other desktop apps with mouse and keyboard. At launch, it is macOS-only. Everything else is at parity.

Connecting Codex on Windows to Totalum for production deploys

This is the part most setup guides skip. The Codex agent on Windows edits, runs, and tests code inside the sandbox. It does NOT deploy. There is no built-in path that takes the project Codex just generated and ships it to a real URL with auth, a database, payments, and a custom domain.

That is the layer Totalum provides through its API and MCP server, and Codex on Windows can talk to it.

Step 1: Get a Totalum project

Create a free account at totalum.app, then create a new project. Totalum scaffolds a real Next.js application with TotalumSDK, authentication, file storage, a database, and a Cloudflare-backed hosting layer. You own the code. (We cover the MCP wiring end-to-end in our Claude Code MCP tutorial; the Codex flow is the same once the MCP endpoint is configured.)

Step 2: Configure the MCP server inside Codex

Codex on Windows reads MCP servers from a config file. On a native install, that file lives at %APPDATA%\Codex\config.json. Add a Totalum entry:

{
  "mcpServers": {
    "totalum": {
      "url": "https://www.totalum.app/api/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_TOTALUM_API_KEY"
      }
    }
  }
}

Restart Codex. The agent now has tools for creating projects, editing database tables, deploying versions, and reading production logs.
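If config.json already contains other MCP servers, hand-editing the JSON is easy to get wrong. Here is a minimal sketch of a safe merge, assuming the config layout shown above; the helper itself is ours, not part of Codex:

```python
import json
import os


def add_totalum_server(config_path: str, api_key: str) -> dict:
    """Merge a Totalum MCP entry into a Codex config.json without
    clobbering any servers that are already configured."""
    config = {}
    if os.path.exists(config_path):
        with open(config_path, "r", encoding="utf-8") as f:
            config = json.load(f)
    # setdefault preserves existing entries under "mcpServers"
    servers = config.setdefault("mcpServers", {})
    servers["totalum"] = {
        "url": "https://www.totalum.app/api/mcp",
        "headers": {"Authorization": f"Bearer {api_key}"},
    }
    with open(config_path, "w", encoding="utf-8") as f:
        json.dump(config, f, indent=2)
    return config


# On a native Windows install the file lives under %APPDATA%\Codex:
# add_totalum_server(os.path.expandvars(r"%APPDATA%\Codex\config.json"),
#                    "YOUR_TOTALUM_API_KEY")
```

Run it once, restart Codex, and any previously configured servers stay untouched alongside the new totalum entry.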

Step 3: Use it

Open a Codex thread and prompt it the way you would prompt Claude Code or Cursor:

> "Use the Totalum MCP to create a new project called client-portal. Add a Clients table with name, email, and status fields. Generate a login page and a dashboard. Deploy version 1."

Codex calls the MCP tools, runs the build, and returns the live URL. The whole flow happens inside the Codex window on Windows, sandboxed by the native Windows ACLs Codex just acquired.
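Under the hood, each of those steps is an MCP tool invocation. MCP is JSON-RPC 2.0 on the wire, so every call the agent sends to the endpoint has the same shape. A sketch in Python (the tool name and arguments here are illustrative; the real catalog comes from the server's tools/list response, not from this example):

```python
import json


def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 tools/call request, the message shape
    MCP clients use to invoke a tool on a server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


# Hypothetical tool name and arguments for illustration only:
payload = mcp_tool_call(1, "create_project", {"name": "client-portal"})
```

The agent handles this plumbing for you; the point is that there is no magic between the prompt and the deploy, just tool calls over an authenticated HTTP endpoint.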

If you want a deeper comparison of how this differs from doing the same flow with Claude Code or Cursor, see Claude Code vs Codex in 2026 and Cursor vs Claude Code in 2026.

Codex CLI vs Claude Code on Windows: which to pick

Codex on Windows did not eliminate the choice. Both agents now run natively on Windows. The decision comes down to four factors.

| Factor | Codex on Windows | Claude Code on Windows |
| --- | --- | --- |
| Model defaults | OpenAI Codex model family | Claude Opus 4.7, Sonnet 4.6, Haiku 4.5 |
| Sandbox | Native Windows (restricted tokens + ACLs) | Runs in WSL or via shell wrapper |
| Parallel threads | Built-in worktree-aware threading | One thread per terminal by default |
| Subscription | ChatGPT plan or OpenAI API key | Claude Pro/Max or Anthropic API key |
| MCP server support | Yes (native config) | Yes (native config) |
If you live in the ChatGPT ecosystem already and you want a polished desktop app with parallel threads, Codex is the obvious pick. If you prefer Anthropic's models or you already pay for Claude Pro, Claude Code is the obvious pick. Both can drive Totalum the same way. We covered the head-to-head trade-offs in Cline vs Claude Code and Claude Opus 4.7 with Totalum earlier this month.

FAQ

Does Codex on Windows require WSL?

No. The May 13, 2026 release added a native Windows sandbox built on restricted tokens and filesystem ACLs. WSL2 is still supported as an alternative, but it is no longer the default.

Is Codex sandboxed at the same level on Windows as on macOS?

The mechanism is different, the goal is the same. macOS uses Seatbelt, Linux uses Landlock, Windows uses restricted tokens and ACLs. All three confine agent file and process access to the working directory. Network egress and shell-command approval gates work identically across platforms.
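The shared idea behind Seatbelt, Landlock, and restricted tokens can be illustrated with a path-containment check: resolve a requested path (following symlinks and ".." segments), then verify it stays inside the workspace root. This is a conceptual sketch only, not Codex's implementation; real sandboxes enforce the boundary in the kernel, where a process cannot skip the check:

```python
from pathlib import Path


def is_inside_workspace(workspace_root: str, candidate: str) -> bool:
    """Return True if candidate, resolved relative to the workspace
    root, does not escape that root via symlinks or '..' segments."""
    root = Path(workspace_root).resolve()
    target = (root / candidate).resolve()
    return target == root or root in target.parents


# '..' escapes are caught only after resolution:
# is_inside_workspace("/workspace", "src/app.py")    -> True
# is_inside_workspace("/workspace", "../etc/passwd") -> False
```

The difference between platforms is where this logic lives, not what it decides.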

What is "Codex from anywhere" and does it work on Windows?

"Codex from anywhere" (released May 14, 2026) lets you delegate Codex tasks from the ChatGPT mobile app or the web client. The work runs on a cloud machine or on a configured local machine. Windows local machines are supported as a target.

How much does Codex on Windows cost?

The app itself is free. To run agent tasks you need an active ChatGPT plan (Plus, Pro, Business, or Enterprise) or an OpenAI API key with credits. Costs depend on the underlying model and the number of parallel agent threads you run.

Is Codex on Windows feature-equal to Codex on macOS?

Close, not equal. The Computer Use feature, which lets Codex control desktop apps with mouse and keyboard, is macOS-only at launch. Everything else (parallel threads, worktrees, in-app browser, diff review, IDE handoff) is at parity.

Can Codex on Windows deploy a project to production?

Not on its own. Codex edits and runs code; it does not host, register a domain, set up auth, or run a database. To deploy, point Codex at an MCP server that provides those layers. Totalum exposes those primitives through its MCP endpoint, so Codex can scaffold, build, and ship a real owned Next.js app from a single thread.

Ship Codex output to production

Codex on Windows finally gives Windows developers a serious OpenAI agent. The missing layer, on every platform, is the path from "agent finished editing" to "real app at a real URL with auth, payments, and a database." That is what Totalum provides through its API and MCP server.

Create a free Totalum project at totalum.app, connect it as an MCP server to Codex on Windows, and your next Codex thread can scaffold a Next.js app, set up the database, and deploy version 1 without leaving the window. For more on the broader landscape of agent-driven builders, see our Best vibe coding tools in 2026 roundup.

Francesc

Writes for the Totalum blog about AI app building, no-code development, and product engineering.

Start building with Totalum

Create your web app with AI in minutes. No code needed.

Try Totalum for free