In the OpenClaw ecosystem, local-first AI assistants empower users to build custom tools that solve real-world problems without relying on cloud dependencies. A recent example emerged when a developer created a macOS presentation app called Present in approximately 45 minutes the night before a talk. This project demonstrates how OpenClaw’s agent-centric approach can streamline development for personal automation workflows.
The talk, titled “The State of LLMs, February 2026 edition” with the subtitle “It’s all changed since November!”, was delivered at Social Science FOO Camp in Mountain View. The event followed an unconference format in which participants could present without prior proposals. The developer has previously documented LLM advancements in December 2023, December 2024, and December 2025, and presented “The last six months in LLMs, illustrated by pelicans on bicycles” at the AI Engineer World’s Fair in June 2025. This was the first of these talks to cover just three months, highlighting the accelerating pace of the field, especially after the November 2025 inflection point. A Gemini 3 sweater worn during the talk, received only weeks earlier, had already been made outdated by Gemini 3.1, further illustrating this rapid change.
Incorporating the STAR moment principle learned at Stanford—Something They’ll Always Remember—the talk featured two gimmicks. The first was a coding-agent-assisted data analysis of the Kākāpō breeding season (complete with a themed mug), followed by a quick tour of new pelicans riding bicycles. The climax revealed that the entire presentation had been delivered with the newly vibe-coded Present app.
Present, built with Swift and SwiftUI, is a 355KB app, or 76KB compressed, showcasing the efficiency of Swift for lightweight local applications. Traditionally, the developer used Keynote or a browser with multiple tabs for presentations, but browser crashes posed a risk of losing the entire deck. Although URLs were backed up in notes files, recovering mid-talk was undesirable. The initial prompt for the app was: “Build a SwiftUI app for giving presentations where every slide is a URL. The app starts as a window with a webview on the right and a UI on the left for adding, removing and reordering the sequence of URLs. Then you click Play in a menu and the app goes full screen and the left and right keys switch between URLs.” A transcript of the implementation plan is available.
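The post doesn't include the app's data model, but the behaviour described above (an ordered, reorderable sequence of slide URLs) is easy to sketch. The following is an illustrative guess, not Present's actual code; the `Deck` type and its method names are assumptions:

```swift
import Foundation

// Hypothetical sketch of a slide deck model: an ordered list of URLs
// with add, remove, and reorder operations. Names are assumptions,
// not the actual types used in Present.
struct Deck {
    var slides: [String] = []

    mutating func add(_ url: String) {
        slides.append(url)
    }

    mutating func remove(at index: Int) {
        guard slides.indices.contains(index) else { return }
        slides.remove(at: index)
    }

    // Move a slide from one position to another, ignoring out-of-range indices.
    mutating func move(from source: Int, to destination: Int) {
        guard slides.indices.contains(source),
              slides.indices.contains(destination) else { return }
        let url = slides.remove(at: source)
        slides.insert(url, at: destination)
    }
}
```

A model this small is all the "editing experience" needs; SwiftUI can bind a sidebar list directly to `slides` and re-render on every change.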
In Present, a talk is an ordered sequence of URLs with a sidebar for management. The editing experience is minimal. Selecting “Play” from the menu or pressing Cmd+Shift+P switches to full-screen mode, where left and right arrow keys navigate slides, and font size can be adjusted or pages scrolled. Escape exits the mode. The app automatically saves URLs on changes, allowing recovery after crashes, and presentations can be saved as .txt files with newline-delimited URLs for later loading.
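The newline-delimited .txt format described above can be sketched in a few lines. This is a hypothetical reconstruction of the save/load logic, not the app's actual implementation:

```swift
import Foundation

// Hypothetical sketch of Present's .txt format: one URL per line.
// Blank lines and stray whitespace are ignored on load.
func serialize(_ urls: [String]) -> String {
    urls.joined(separator: "\n")
}

func parse(_ text: String) -> [String] {
    text.split(separator: "\n", omittingEmptySubsequences: true)
        .map { $0.trimmingCharacters(in: .whitespaces) }
        .filter { !$0.isEmpty }
}
```

A format this simple doubles as the crash-recovery mechanism: autosaving after every change is just rewriting one small text file.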
After the core functionality was quickly implemented, the developer expanded the app with remote control from a phone. The prompt was: “Add a web server which listens on 0.0.0.0:9123—the web server serves a single mobile-friendly page with prominent left and right buttons—clicking those buttons switches the slide left and right—there is also a button to start presentation mode or stop depending on the mode it is in.” With Tailscale on both laptop and phone, network access was seamless, allowing the phone to control the presentation from anywhere via http://100.122.231.116:9123/. Iterative prompts refined the interface to include a slide indicator, prev/next buttons, a “Start” button, font adjustment buttons, and a touch-enabled scroll bar for page scrolling that was clunky but functional.
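The server code itself isn't shown at this point in the story, but a no-dependency server like this has to hand-assemble its HTTP responses before writing them back to the socket. A minimal, hypothetical response builder along those lines (the function name and defaults are assumptions, not Present's actual code):

```swift
import Foundation

// Hypothetical helper: hand-assemble a minimal HTTP/1.1 response,
// the kind of thing a library-free socket server writes back verbatim.
func httpResponse(status: String = "200 OK",
                  contentType: String = "text/html; charset=utf-8",
                  body: String) -> String {
    // Content-Length must count bytes, not characters.
    let bytes = body.utf8.count
    return "HTTP/1.1 \(status)\r\n"
         + "Content-Type: \(contentType)\r\n"
         + "Content-Length: \(bytes)\r\n"
         + "Connection: close\r\n"
         + "\r\n"
         + body
}
```

With `Connection: close` the server can write one response per accepted connection and hang up, which keeps the socket handling trivial.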
Upon pushing the code to GitHub with a disclaimer noting it was vibe-coded and only tested on one machine, the developer reviewed the code by asking the model for a linear walkthrough, a pattern documented in the Agentic Engineering Patterns guide; the resulting walkthrough proved useful. The review revealed that Claude Code had implemented the web server with raw socket programming rather than a library, using a minimal HTTP parser for routing. For example, a private function dispatched requests by path:

    private func route(_ raw: String) -> String {
        let firstLine = raw.components(separatedBy: "\r\n").first ?? ""
        let parts = firstLine.split(separator: " ")
        let path = parts.count >= 2 ? String(parts[1]) : "/"
        switch path {
        case "/next":
            state?.goToNext()
            return jsonResponse("ok")
        case "/prev":
            state?.goToPrevious()
            return jsonResponse("ok")
        // ... remaining routes omitted from the excerpt
        }
    }

Using GET requests for state changes introduces CSRF vulnerabilities, but for this app it was deemed acceptable.
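To make the CSRF point concrete: requiring POST for the mutating endpoints would block drive-by GETs from images and links (a cross-site form POST would still need a token or an Origin check to stop). A hypothetical variant of the parser that also extracts the HTTP method, not code from the actual app:

```swift
import Foundation

// Hypothetical hardening sketch: parse the HTTP method as well as the
// path from the request line, and refuse to mutate state on anything
// but POST. This blocks the simplest CSRF vector (cross-site <img>
// tags and links can only issue GETs); a full fix would also check a
// token or the Origin header.
func methodAndPath(_ raw: String) -> (method: String, path: String) {
    let firstLine = raw.components(separatedBy: "\r\n").first ?? ""
    let parts = firstLine.split(separator: " ")
    let method = parts.count >= 1 ? String(parts[0]) : "GET"
    let path = parts.count >= 2 ? String(parts[1]) : "/"
    return (method, path)
}

func allowsMutation(_ raw: String) -> Bool {
    methodAndPath(raw).method == "POST"
}
```

For a single-user app reachable only over a Tailscale network, accepting GETs is a defensible trade-off; the sketch just shows how small the stricter version would be.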
This vibe coding story, while a common genre, highlights key insights for the OpenClaw ecosystem. Swift, a language unfamiliar to the developer, was ideal for creating a full-screen app with embedded web content and network control, and the resulting code is simple and focused. The app solved a genuine problem, turning a presentation into a sequence of URLs, without the developer ever opening Xcode. This doesn't make native Mac developers obsolete: existing technical knowledge and tooling still mattered, and an expert with real Swift and Xcode experience could have built a better version faster. Rather, it shows how software engineers can expand their horizons with local-first AI tools, losing their fear of unfamiliar languages like Swift and building small, personal macOS apps with the resources they already have.


