Nanocode: The best Claude Code that $200 can buy in pure JAX on TPUs
agents anthropic claude training
Source: HN | Original article
A new open‑source project called **Nanocode** promises to deliver a Claude‑Code‑level coding assistant within a single $200 TPU‑hour budget. The repository, released on GitHub by Salman Mohammadi, is written entirely in JAX and is engineered to run on Google’s Tensor Processing Units. Its creator follows the same “Constitutional AI” recipe Anthropic used to train Claude: first writing a SOUL.md specification, then generating synthetic instruction data, and finally applying preference optimisation to align the model with that specification.
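The final alignment step in that recipe can be sketched with a DPO‑style preference loss in JAX. This is a minimal illustration, not code from the Nanocode repository; the function names, the `beta` temperature, and the use of summed per‑response log‑probabilities are all assumptions.

```python
# Hypothetical sketch of a preference-optimisation (DPO-style) loss in JAX.
# Inputs are summed token log-probs of the "chosen" and "rejected" responses
# under the policy being trained and under a frozen reference model.
import jax
import jax.numpy as jnp

def dpo_loss(policy_chosen_lp, policy_rejected_lp,
             ref_chosen_lp, ref_rejected_lp, beta=0.1):
    # Log-ratio of policy vs. reference for each response
    chosen_logratio = policy_chosen_lp - ref_chosen_lp
    rejected_logratio = policy_rejected_lp - ref_rejected_lp
    # Implicit reward margin; minimising the loss widens it, pushing the
    # policy toward the preferred (chosen) responses
    margin = beta * (chosen_logratio - rejected_logratio)
    return -jnp.mean(jax.nn.log_sigmoid(margin))

# Toy batch of one pair: the policy already leans toward the chosen response
loss = dpo_loss(jnp.array([-10.0]), jnp.array([-12.0]),
                jnp.array([-11.0]), jnp.array([-11.0]))
```

The loss shrinks as the policy's margin over the reference grows, which is what lets the synthetic preference pairs steer the model toward the SOUL.md specification without a separate reward model.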
Nanocode’s claim to fame is its minimal footprint: a single‑file, ~250‑line Python implementation with no dependencies beyond JAX, yet capable of producing the same agentic interface Claude Code offers—multiple tool calls, context‑aware code synthesis, and auto‑compact token management. The author estimates that a full training run on a single TPU v4 pod stays under $200, a stark contrast to Anthropic’s “Claude Pro” subscription billed annually, which costs $200 up‑front for a year of access (see our April 5 report on Claude Code pricing).
The project matters for two reasons. First, it lowers the barrier for startups and research labs that cannot afford Anthropic’s commercial licences, potentially widening the pool of developers who can experiment with high‑quality, self‑hostable code models. Second, it puts pressure on Anthropic to justify its pricing and subscription model, especially after the company’s recent move to block third‑party tools from accessing Claude subscriptions—a policy we covered on April 5.
What to watch next: benchmark results comparing Nanocode’s output quality, latency and token efficiency against official Claude Code; community uptake and contributions that could scale the model beyond the initial 7‑billion‑parameter baseline; and Anthropic’s response, which may range from legal challenges over training data to new pricing tiers. If Nanocode proves competitive, it could spark a wave of TPU‑based, open‑source LLMs that democratise access to agentic coding assistants across the Nordic AI ecosystem.