
SpaceX Fuels Cursor Training With 1M GPUs for Composer 2.5

SpaceX and Cursor announce a strategic partnership using the Colossus supercomputer to train next-gen coding models, with a $60 billion acquisition on the table.

SpaceX and Cursor have established a strategic partnership to train the next generation of AI coding models. Announced in an official release, the deal grants Cursor access to a 1-million-GPU supercomputer to train Composer 2.5. The agreement includes a contractual option for SpaceX to acquire Cursor outright for $60 billion later in 2026. For developers building software with AI assistance, this shifts the landscape of independent tooling toward a vertically integrated ecosystem.

Infrastructure and Model Scaling

Cursor will move its primary training pipeline to Colossus. SpaceX absorbed this supercomputer from xAI in February 2026. The cluster contains 1 million H100-equivalent GPUs, establishing it as the largest known training cluster. This compute capacity directly targets the coding benchmark deficits previously observed in the xAI model family compared to Claude and OpenAI models.

The partnership focuses strictly on training Composer 2.5. This model advances the architecture of previous iterations by scaling up the reinforcement learning pipelines. If you rely on vibe coding for large-scale project edits, the expanded context-reasoning capabilities will shape how you structure your workflows. Earlier models laid the groundwork for this hardware scale: Composer 1.5 introduced a 20x expansion in reinforcement learning updates, and Composer 2 integrated continued pretraining into the primary loop. Composer 2.5 will apply both techniques across the full Colossus cluster.

| Model Version | Release Timeline | Key Architecture Update |
| --- | --- | --- |
| Composer | Late 2025 | First agentic model |
| Composer 1.5 | Early 2026 | 20x expansion in reinforcement learning |
| Composer 2 | March 2026 | Integrated continued pretraining |
| Composer 2.5 | In development | 1M-GPU Colossus training |

Acquisition Terms and Vertical Integration

The financial structure operates as a phased buyout. SpaceX holds the right to acquire Cursor for $60 billion, a 20 percent premium over the startup's reported $50 billion valuation from March 2026. If SpaceX declines the full acquisition, it must instead pay $10 billion for the joint intellectual property produced during the collaboration. The deal comes as SpaceX prepares for a projected $1.75 trillion IPO.
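The premium figure checks out against the stated valuations. A minimal sanity check in Python, using only the numbers quoted in the article (all in billions of USD):

```python
# Deal figures as reported (billions USD).
valuation = 50      # Cursor's reported March 2026 valuation
buyout_price = 60   # SpaceX's contractual acquisition option
ip_fee = 10         # fallback payment if SpaceX declines the buyout

# Premium over the last reported valuation.
premium = (buyout_price - valuation) / valuation
print(f"Acquisition premium: {premium:.0%}")  # 20%

# Floor on what the collaboration costs SpaceX either way.
print(f"Minimum SpaceX commitment: ${ip_fee}B")
```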

This structure accelerates SpaceX’s vertical integration of the AI stack. The company now controls the underlying Colossus compute, the Grok consumer interface, and the professional developer layer. The operational merger began in March 2026 when senior Cursor engineering leaders Andrew Milich and Jason Ginsberg joined SpaceX. They now report directly to Elon Musk, applying these coding capabilities to orbital and lunar infrastructure projects.

The Developer Ecosystem Shift

Independent coding assistants are rapidly becoming integrated features of larger compute providers. Analysts view the $60 billion valuation as a defensive move against Anthropic’s aggressive rollout of native coding features and OpenAI’s internal Canvas and Operator systems. SpaceX previously rented tens of thousands of GPUs to Cursor, acting as a specialized cloud provider before formalizing this joint venture.

When selecting a coding assistant for your engineering teams, you must evaluate the underlying compute provider. Tooling lock-in is moving from the IDE layer to the hardware layer. Prepare your infrastructure to support multiple AI coding environments to ensure your proprietary codebases are not tethered to a single vendor’s acquisition strategy.
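One practical way to keep codebases untethered is to route assistant configuration through a thin abstraction layer, so a vendor change is a registry edit rather than a tooling rewrite. A minimal sketch, with entirely hypothetical provider names, endpoints, and model identifiers (nothing here reflects a real Cursor or SpaceX API):

```python
from dataclasses import dataclass

@dataclass
class CodingBackend:
    """One interchangeable AI coding provider (all fields illustrative)."""
    name: str
    endpoint: str  # hypothetical API endpoint, not a real URL
    model: str

# Registry of backends; teams swap vendors here without touching tooling code.
BACKENDS = {
    "primary": CodingBackend("primary", "https://api.example-primary.test", "composer-2.5"),
    "fallback": CodingBackend("fallback", "https://api.example-alt.test", "generic-coder"),
}

def select_backend(preferred: str) -> CodingBackend:
    """Return the preferred backend, or the fallback if it is unavailable."""
    return BACKENDS.get(preferred, BACKENDS["fallback"])

print(select_backend("primary").model)    # composer-2.5
print(select_backend("acquired").model)   # generic-coder
```

The design choice is deliberate: the lock-in risk described above lives in the hardware and acquisition layer, so the mitigation lives in configuration, where switching costs are lowest.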
