Perry — TypeScript → Native
Apple
Source: Mastodon
Perry, the open‑source framework that lets developers write bots in TypeScript and ship them as native Apple applications, has just gone public. The project, hosted at perryts.com, compiles TypeScript source directly into Swift‑compatible binaries, bypassing the need for a JavaScript runtime on iOS, iPadOS or macOS. By embedding the code in a native wrapper that can call Core ML models, Perry enables on‑device inference for large language models (LLMs) without relying on cloud APIs.
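To make the idea concrete, here is a minimal sketch of what a Perry-style bot might look like in plain TypeScript. The names (`LocalModel`, `Bot`, `complete`) are hypothetical, invented for illustration; the actual perryts.com API may differ. In a compiled native build, the model call would dispatch to a Core ML model on-device; here it is a stub so the shape of the code stands on its own.

```typescript
// Hypothetical sketch of a Perry-style bot. `LocalModel` and `Bot` are
// illustrative names, not the real perryts.com API.

interface LocalModel {
  // In a native build this would invoke an on-device Core ML model;
  // here it is a stub that echoes the prompt.
  complete(prompt: string): Promise<string>;
}

const stubModel: LocalModel = {
  async complete(prompt: string): Promise<string> {
    return `echo: ${prompt}`; // placeholder for on-device inference
  },
};

class Bot {
  constructor(private model: LocalModel) {}

  // All processing stays local: no network round trip, no cloud API key.
  async handle(message: string): Promise<string> {
    return this.model.complete(message.trim());
  }
}
```

Under this shape, the same TypeScript source could run in Node during development and, once compiled, against a Core ML backend on Apple silicon, which is the reuse-your-codebase pitch the announcement makes.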
The move matters because it lowers the barrier for web‑centric developers to enter the on‑device AI market. Until now, creating a native AI‑enabled app required fluency in Swift or Objective‑C and a separate pipeline for model integration. Perry’s TypeScript‑to‑native path lets teams reuse existing codebases, keep data processing local for privacy, and cut latency to milliseconds—critical for conversational agents, real‑time translation and interactive assistants. The announcement follows a wave of on‑device AI news, including Google’s Gemma 4 running offline on iPhone (reported 15 April) and OpenAI’s sandboxed agents SDK for native isolation (reported 17 April). Together they signal a shift toward edge‑first AI deployments on Apple silicon.
What to watch next is how quickly the community adopts Perry’s toolchain and whether Apple will endorse it through official SDKs or App Store guidelines. Early benchmarks comparing Perry‑generated binaries with hand‑written Swift will reveal performance trade‑offs, while support for other platforms—Android, Linux, Windows—could turn Perry into a cross‑ecosystem bridge. Finally, the integration of persistent memory features, similar to Claude‑mem, may extend Perry’s capabilities beyond stateless bots, opening the door to richer, context‑aware assistants that run entirely offline.