Claude Platform Goes GA on AWS With Native API Parity
Anthropic has brought the Claude Platform on AWS to general availability, giving developers native API parity directly within their AWS environments.
On May 11, 2026, Anthropic moved the Claude Platform on AWS into general availability, bringing its first-party API features natively to AWS accounts. Unlike the existing Amazon Bedrock integration, where AWS processes the data, this platform is operated directly by Anthropic on infrastructure accessed through AWS. Developers can now access zero-day model updates and advanced agentic tools while drawing down their existing AWS spending commitments.
Native Parity and Cloud Integration
The general availability release standardizes the deployment architecture for enterprise developers. Authentication runs entirely through AWS IAM, removing the need for standalone API keys. Platform activity and inference logs feed directly into AWS CloudTrail for auditing and compliance tracking.
| Integration Model | Operator | Authentication | Feature Updates |
|---|---|---|---|
| Amazon Bedrock | AWS | AWS IAM | Delayed |
| Claude Platform on AWS | Anthropic | AWS IAM | Zero-day |
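Because authentication runs through IAM rather than standalone API keys, each request to the Anthropic-operated endpoint would carry a standard AWS Signature Version 4 `Authorization` header. The sketch below builds such a header using only the Python standard library; the service name `claude`, the host, and the path are illustrative assumptions, not documented values for this platform.

```python
import hashlib
import hmac


def _hmac(key: bytes, msg: str) -> bytes:
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()


def sigv4_authorization(access_key, secret_key, region, service,
                        method, host, path, payload, amz_date):
    """Build an AWS Signature Version 4 Authorization header value
    for a simple request with no query string."""
    date_stamp = amz_date[:8]
    payload_hash = hashlib.sha256(payload).hexdigest()
    canonical_headers = f"host:{host}\nx-amz-date:{amz_date}\n"
    signed_headers = "host;x-amz-date"
    # Canonical request: method, path, query string (empty here),
    # headers, signed header list, payload hash.
    canonical_request = "\n".join([
        method, path, "", canonical_headers, signed_headers, payload_hash,
    ])
    scope = f"{date_stamp}/{region}/{service}/aws4_request"
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    # Derive the signing key through the standard four-step HMAC chain.
    k = _hmac(("AWS4" + secret_key).encode(), date_stamp)
    k = _hmac(k, region)
    k = _hmac(k, service)
    k = _hmac(k, "aws4_request")
    signature = hmac.new(k, string_to_sign.encode(),
                         hashlib.sha256).hexdigest()
    return (f"AWS4-HMAC-SHA256 Credential={access_key}/{scope}, "
            f"SignedHeaders={signed_headers}, Signature={signature}")


# Dummy credentials and an assumed endpoint, purely for illustration.
auth = sigv4_authorization(
    access_key="AKIDEXAMPLE",
    secret_key="wJalrXUtnFEMI/EXAMPLEKEY",
    region="us-east-1",
    service="claude",                       # assumed service name
    host="claude.us-east-1.amazonaws.com",  # assumed host
    path="/v1/messages",
    method="POST",
    payload=b"{}",
    amz_date="20260511T000000Z",
)
```

In practice the AWS SDKs (e.g. botocore) perform this signing automatically from the IAM credentials already configured in the account, which is what removes the need to distribute separate Anthropic API keys.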
Billing is consolidated into a single AWS invoice. This structural change allows organizations to draw down their committed AWS spend using Anthropic’s frontier models. The platform is available across most AWS commercial regions, supporting both global and U.S. inference geographies.
Expanded Developer Capabilities
The launch provides immediate feature parity with Anthropic’s first-party API. Developers gain full access to the Messages API, Files API, and Message Batches API. New beta models and capabilities now ship to the AWS-integrated platform on the exact same day they launch natively.
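Since the surface is the first-party Messages API, request bodies stay identical whether sent to Anthropic directly or through the AWS-integrated platform; only the transport and authentication differ. A minimal sketch of a standard Messages API request body (the model ID shown is an assumption for illustration):

```python
import json

# A standard Messages API request body. With native parity, the same
# payload works against either the first-party API or the
# Anthropic-operated platform on AWS.
body = {
    "model": "claude-sonnet-4-5",  # model ID assumed for illustration
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "List three CloudTrail best practices."},
    ],
}
payload = json.dumps(body).encode("utf-8")
```

Migrating an existing integration is therefore largely a matter of repointing the endpoint and swapping API-key headers for IAM-signed requests, rather than rewriting request or response handling.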
The update also exposes advanced workflow tooling to the AWS ecosystem. Teams can provision secure sandboxes to run Claude Managed Agents without managing the underlying container infrastructure. The release includes access to Claude Code for agentic code generation and built-in agent skills for multi-step task orchestration. Internal demand at Amazon was high enough that the company authorized Claude Code for its tens of thousands of developers in early May, choosing the tool over its own internal Kiro platform.
Infrastructure and Scaling Mechanics
The deployment is backed by a major capital exchange finalized between the two companies in April 2026. Amazon increased its total investment in Anthropic by $25 billion, bringing the total to $33 billion. In parallel, Anthropic committed to spending over $100 billion on AWS infrastructure over the next decade.
Anthropic secured up to 5 gigawatts of compute capacity to support the load. This infrastructure relies heavily on Amazon’s custom silicon, utilizing Trainium2, Trainium3, and Trainium4 chips alongside tens of millions of Graviton CPU cores. This hardware agreement facilitates immediate regional expansion in Europe and Asia to meet international data residency demands. The scaling effort aligns with Anthropic’s surging utilization, as the company reported a $30 billion revenue run-rate in April 2026, up from $9 billion at the close of 2025. With this release, Claude is the only frontier AI model available natively on AWS, Google Cloud, and Azure Foundry.
If your organization builds multi-step autonomous workflows and standardizes on AWS, migrate your endpoints to the Anthropic-operated platform. You can maintain your centralized IAM controls and CloudTrail auditing while gaining immediate access to zero-day model updates and native agent harnesses.