Linux lays down the law on AI-generated code, says yes to Copilot, no to AI slop, and humans take the fall for mistakes — after months of fierce debate, Torvalds and maintainers come to an agreement
Source: Mastodon
Linus Torvalds and the core Linux‑kernel maintainers have just codified how artificial‑intelligence‑generated patches may enter the tree. After months of heated mailing‑list debates, the community voted to allow contributions that are clearly marked with an “Assisted‑by:” tag, while rejecting any code that arrives without disclosure or that is produced by generic “AI slop” tools. The new rule sits alongside the existing “Signed‑off‑by” requirement, but it makes the human submitter solely liable for any bugs, licensing breaches or security flaws that stem from the AI‑crafted sections.
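In practice, the disclosure works through git commit trailers, the same mechanism used for "Signed-off-by". A minimal sketch of what a compliant commit might look like (the tool name and commit details here are illustrative, not taken from an actual kernel patch):

```shell
# Create a throwaway repo and make a commit carrying both trailers.
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git -c user.name=Dev -c user.email=dev@example.com \
    commit --allow-empty -q \
    -m "example: demo patch" \
    -m "Assisted-by: GitHub Copilot
Signed-off-by: Dev <dev@example.com>"
# Show the full commit message, trailers included
git log -1 --format=%B
```

Because both lines sit in the trailer block, maintainers' existing tooling (e.g. `git interpret-trailers`, `git log --format=%(trailers)`) can filter or audit AI-assisted patches without any new infrastructure.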
The decision marks the first formal policy on AI‑assisted development in a major open‑source project. By acknowledging that developers will inevitably use assistants such as GitHub Copilot, the kernel leadership avoids a futile ban and instead focuses on transparency and accountability. Critics had warned that unchecked AI output could introduce subtle vulnerabilities or violate GPL terms, while proponents argued that prohibiting the tools would be as ineffective as outlawing a particular brand of keyboard. The compromise, permitting Copilot‑generated snippets but demanding explicit attribution, aims to preserve code quality without stifling productivity gains.
The move will reverberate across the broader open‑source ecosystem, where projects ranging from Apache to Rust are still wrestling with similar questions. Legal scholars note that placing responsibility on the human author aligns with existing copyright doctrine, yet it may expose contributors to heightened risk, especially in corporate environments. Vendors of AI coding assistants are likely to adjust their licensing and audit features to accommodate the “Assisted‑by” tag.
Watch for how quickly the new policy is enforced in upcoming kernel releases, whether other foundations adopt comparable disclosure standards, and whether any liability disputes arise from AI‑generated bugs. The Linux kernel's stance could become the de facto benchmark for AI governance in open‑source software.