I Built a Visual Spec-Driven Development Extension for VS Code That Works With Any LLM
Source: Dev.to
A developer‑led project called **Caramelo** has just been released on the Visual Studio Code Marketplace, promising to turn the emerging “spec‑driven development” (SDD) model into a fully visual workflow inside the editor. The extension imports GitHub’s Spec Kit pipeline—constitution, specification, planning, task breakdown and implementation—into a drag‑and‑drop UI, adds approval gates that pause AI‑generated code until a reviewer signs off, and syncs each step with Jira tickets. Most notably, Caramelo is LLM‑agnostic: it can call a locally hosted Ollama model, GitHub Copilot, or any corporate‑hosted proxy, letting teams stay within existing security boundaries while still leveraging generative AI.
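The article doesn't publish Caramelo's internals, but the LLM-agnostic routing it describes can be sketched as a small adapter that builds the request shape each backend expects before dispatching the same prompt. Everything here is a hypothetical illustration except Ollama's `/api/generate` route, which is its documented generation endpoint; the proxy URL and backend names are invented for the example.

```python
# Hypothetical sketch of LLM-agnostic routing, as the article describes for
# Caramelo: one prompt, multiple interchangeable backends. Only Ollama's
# /api/generate path is a real, documented endpoint; the proxy URL is invented.
from dataclasses import dataclass


@dataclass
class Backend:
    name: str
    url: str


BACKENDS = {
    # A locally hosted Ollama server (default port 11434).
    "ollama": Backend("ollama", "http://localhost:11434/api/generate"),
    # A corporate-hosted, OpenAI-compatible proxy (illustrative URL).
    "proxy": Backend("proxy", "https://llm-proxy.internal/v1/chat/completions"),
}


def build_request(backend: str, model: str, prompt: str) -> dict:
    """Return the URL and JSON body the chosen backend expects."""
    b = BACKENDS[backend]
    if backend == "ollama":
        # Ollama's native generate endpoint takes a flat prompt string.
        body = {"model": model, "prompt": prompt, "stream": False}
    else:
        # OpenAI-compatible proxies take a chat-style message list.
        body = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return {"url": b.url, "json": body}


req = build_request("ollama", "llama3:70b", "Break this spec into tasks.")
```

Swapping backends then means changing one string, which is the property that lets teams stay inside existing security boundaries while reusing the same spec-driven pipeline.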
The move matters because SDD aims to curb the “vibe‑coding” problem that has plagued AI‑assisted development: developers hand over vague prompts and receive code that fits the model’s biases rather than the product’s requirements. By requiring a structured specification before any generation occurs, Caramelo pushes teams to articulate intent, track changes, and enforce compliance, potentially reducing rework and technical debt. Its Jira integration also bridges the gap between product management and code, a pain point highlighted in recent industry surveys of AI‑coding tool adoption.
What to watch next is how quickly the extension gains traction among enterprises that have already deployed AI copilots but remain wary of uncontrolled code generation. The upcoming release of GitHub’s own Spec Kit UI for the web could create a competitive pressure point, while early adopters will likely test Caramelo’s performance with large on‑prem LLMs such as Llama 3‑70B. If the visual orchestration proves reliable, we may see a shift toward hybrid pipelines where human‑approved specs steer AI output, reshaping the balance between speed and governance in software engineering.