Claude Code runs git reset --hard origin/main against project repos every 10 minutes
Source: Hacker News
Claude Code, Anthropic’s AI coding assistant, has been found to execute a hard reset on users’ Git repositories every ten minutes. The behavior, uncovered in version 2.1.87, runs `git fetch origin && git reset --hard origin/main` programmatically, without spawning an external Git binary or prompting the developer. The command discards any uncommitted changes to tracked files, potentially wiping out hours of work each time it fires.
The issue surfaced after multiple developers reported sudden loss of local edits while Claude Code was active. A GitHub issue (#40710) posted yesterday details the bug and includes logs showing the silent reset loop. The problem is not isolated to a single project: the tool’s default configuration applies the same routine to every repository it is attached to, so any developer who enables Claude Code’s “auto‑sync” feature is at risk. Anthropic has acknowledged the report and pledged a hotfix, but the incident has already sparked a broader debate over how much authority AI agents should have over version‑control operations.
Why it matters: the fallout goes beyond a single bug. Claude Code has quickly become a staple in many Nordic development teams, praised for its ability to generate code, refactor, and even manage pull requests. The hard‑reset bug exposes a trust gap: when an AI can issue destructive Git commands without explicit consent, the risk of data loss, and of malicious exploitation, rises sharply. It also raises questions about the transparency of AI‑driven tooling, especially as similar concerns emerged last year when Claude executed an undocumented reset in a different context.
What to watch next: Anthropic is expected to release a patch within days, likely adding a confirmation step for any reset‑type operation. Developers should audit their Claude Code settings now and disable automatic remote sync until the fix lands. The episode may prompt tighter governance standards for AI assistants in CI/CD pipelines, and could influence upcoming policy updates from tools such as GitHub Copilot, which recently revised its interaction‑data usage rules. Keep an eye on Anthropic’s release notes and community forums for the definitive remediation timeline.
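Beyond tool settings, there is a generic safety net that works regardless of which tool misbehaves: snapshot work in progress as real commits, because committed states survive a hard reset and remain reachable through the reflog. A sketch (the repository and commit messages are examples, not a Claude Code feature):

```shell
# Set up a small repo with one committed baseline
set -e
tmp=$(mktemp -d)
git init -q -b main "$tmp/repo"
cd "$tmp/repo"
git config user.email dev@example.com
git config user.name Dev
echo "v1" > app.txt
git add app.txt
git commit -qm "initial"

# Periodic WIP snapshot (could run from a timer or a shell alias)
echo "wip edit" >> app.txt
git add -A
git commit -qm "wip snapshot"

# A tool hard-resets the branch out from under you...
git reset --hard -q HEAD~1

# ...but the snapshot is still in the reflog and can be restored
git reset --hard -q 'HEAD@{1}'
grep "wip edit" app.txt   # prints "wip edit": the work survived
```

Uncommitted changes get no such protection, which is exactly why a silent `reset --hard` loop is so damaging.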