Riley Brown built 'Jerry,' an iMessage-style iOS app that generates and hosts web and mobile apps through natural-language conversation, with zero hand-written code. The stack: the OpenAI Codex desktop app for Swift code generation, Anthropic's Claude Agent SDK for the AI app-building logic, the Vibe Code CLI for hosting, and OpenAI Whisper for voice input. The entire UI was designed in Paper, an AI-native Figma alternative, before a single line hit Xcode.

The argument against Replit is implicit in the architecture: a native Swift app with a conversational interface removes the browser-based friction and gives users a direct pipeline from prompt to deployed app on their phone. The Claude Agent SDK integration, shown at the 17:32 mark, is where the real technical weight sits. That section alone justifies a full watch, because it shows how the agent loop actually handles code generation and execution, not just the polished final demo.
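The core of that agent loop is simple to state: the model proposes a tool call (write a file, run a build), the host executes it, and the result is fed back until the model decides it is done. The sketch below illustrates that shape only; it is not the Claude Agent SDK's actual API, and `fake_model`, `run_tool`, and the message format are all hypothetical stand-ins.

```python
# Illustrative agent loop, NOT the real Claude Agent SDK API.
# `fake_model` stands in for a call to an LLM; in the real SDK the model
# decides which tools to call and when to stop.

def fake_model(history):
    # Stand-in model: emit one tool call, then declare the task finished.
    if not any(m["role"] == "tool" for m in history):
        return {"type": "tool_call", "tool": "write_file",
                "args": {"path": "App.swift", "content": "// generated"}}
    return {"type": "done", "text": "App scaffold written."}

def run_tool(call, workspace):
    # Execute the tool call on the host side and return a result string
    # that gets appended to the conversation for the model to see.
    if call["tool"] == "write_file":
        workspace[call["args"]["path"]] = call["args"]["content"]
        return f"wrote {call['args']['path']}"
    return "unknown tool"

def agent_loop(prompt):
    history = [{"role": "user", "content": prompt}]
    workspace = {}  # pretend filesystem
    while True:
        step = fake_model(history)
        if step["type"] == "done":
            return step["text"], workspace
        history.append({"role": "tool", "content": run_tool(step, workspace)})

text, files = agent_loop("Build me a timer app")
print(text)         # -> App scaffold written.
print(list(files))  # -> ['App.swift']
```

The point of the loop structure, and what the 17:32 section of the video shows in practice, is that code generation and execution are interleaved: the model never works blind, because every tool result lands back in its context before the next step.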

Brown's process exposes something worth examining: how far a workflow with no hand-written code can actually get in 2025. The seams between Paper for design, Codex for Swift, and the Claude SDK for agent logic are where the interesting failures and workarounds live. The video runs through all of it, including a first-build test on a physical iPhone at the 10:54 mark, which is where most of these demos quietly fall apart; this one mostly does not.

[WATCH ON YOUTUBE →]