In the OpenClaw ecosystem, local-first AI assistants are redefining how users interact with their systems through intuitive app development. A recent experiment demonstrates this shift: an individual used advanced language models to create SwiftUI applications that monitor network bandwidth and GPU usage on a MacBook Pro. The process, known as vibe coding, showcases how OpenClaw's agent automation lets users build functional tools without writing code by hand.
The foundation of this experiment was a 128GB M5 MacBook Pro, a machine capable of running the robust local LLMs at the core of OpenClaw's plugin ecosystem. Frustration with the limited detail offered by standard monitoring tools like Activity Monitor led to the creation of two apps: Bandwidther and Gpuer. Both were developed entirely through conversational prompts to AI agents, highlighting how OpenClaw enables rapid prototyping and automation in a local environment.
Bandwidther was the first app built, designed to display network bandwidth usage, distinguishing between internet and local LAN traffic. The development process began with minimal prompts, such as “Show me how much network bandwidth is in use from this machine to the internet as opposed to local LAN” and “mkdir /tmp/bandwidther and write a native Swift UI app in there that shows me these details on a live ongoing basis.” This initial version proved the concept’s viability within the OpenClaw framework, where agents can execute commands and generate code based on user intent.
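The internet-versus-LAN split described above comes down to classifying each connection's remote address. A minimal sketch of one plausible approach, treating private IPv4 ranges (RFC 1918, link-local, loopback) as "local LAN" traffic; the function name is hypothetical and not taken from the actual Bandwidther source:

```swift
import Foundation

// Hedged sketch: classify a remote IPv4 address as local-LAN (private,
// link-local, or loopback) versus internet. `isLocalLAN` is a hypothetical
// helper name, not an identifier from the real project.
func isLocalLAN(_ address: String) -> Bool {
    let parts = address.split(separator: ".").compactMap { Int($0) }
    guard parts.count == 4, parts.allSatisfy({ (0...255).contains($0) }) else {
        return false
    }
    switch (parts[0], parts[1]) {
    case (10, _):        return true   // 10.0.0.0/8
    case (172, 16...31): return true   // 172.16.0.0/12
    case (192, 168):     return true   // 192.168.0.0/16
    case (169, 254):     return true   // 169.254.0.0/16 (link-local)
    case (127, _):       return true   // loopback
    default:             return false
    }
}

print(isLocalLAN("192.168.1.10"))  // true
print(isLocalLAN("8.8.8.8"))       // false
```

An app would apply a check like this to each connection's remote endpoint and sum bytes into two buckets accordingly.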
After the first version was committed, the OpenClaw agent suggested additional features to enhance the app. Prompts like “Now suggest features we could add to that app, the goal is to provide as much detail as possible concerning network usage including by different apps” led to a refined design. Further iterations added a per-process bandwidth display, reverse DNS lookups with visible IP addresses, and a two-column layout. The app eventually moved into the menu bar, prompted with “OK make it a task bar icon thing, when I click the icon I want the app to appear, the icon itself should be a neat minimal little thing.” The source code and build instructions are available in the simonw/bandwidther repository on GitHub, illustrating how OpenClaw's open-source nature supports community sharing and plugin development.
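Per-process figures on macOS typically come from a tool such as `nettop`, whose machine-readable output can be parsed into rows. The sketch below assumes a simplified CSV layout for illustration only; the real `nettop -P -x -L 1` columns should be checked before relying on any index:

```swift
import Foundation

// Hedged sketch: parse per-process bandwidth rows from CSV-style output.
// The column layout (name, bytes_in, bytes_out) is an assumption for this
// example, not the documented nettop format.
struct ProcessUsage {
    let name: String
    let bytesIn: Int
    let bytesOut: Int
}

func parseUsageCSV(_ csv: String) -> [ProcessUsage] {
    var results: [ProcessUsage] = []
    for line in csv.split(separator: "\n").dropFirst() {  // skip header row
        let fields = line.split(separator: ",", omittingEmptySubsequences: false)
        guard fields.count >= 3,
              let bytesIn = Int(fields[1]),
              let bytesOut = Int(fields[2]) else { continue }
        results.append(ProcessUsage(name: String(fields[0]),
                                    bytesIn: bytesIn,
                                    bytesOut: bytesOut))
    }
    return results
}

let sample = """
process,bytes_in,bytes_out
Safari.501,1048576,262144
Terminal.502,2048,4096
"""
for usage in parseUsageCSV(sample) {
    print("\(usage.name): in \(usage.bytesIn), out \(usage.bytesOut)")
}
```

A SwiftUI view could then render the resulting array in the app's two-column layout.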
Gpuer came next, built to monitor GPU and RAM usage and address gaps in Activity Monitor's reporting. The OpenClaw agent first surfaced raw figures via the system_profiler and memory_pressure commands, and a follow-up prompt turned those into an app: “Look at /tmp/bandwidther and then create a similar app in /tmp/gpuer which shows the information from above on an ongoing basis, or maybe does it better.” This approach exemplifies how OpenClaw agents can recombine elements from existing projects, streamlining automation workflows. After incorporating the menu bar icon functionality from Bandwidther, the final code was shared in simonw/gpuer on GitHub, reinforcing the ecosystem's collaborative potential.
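A Gpuer-style poll could extract its headline number from the text `memory_pressure` prints. The line format below is an assumption based on typical output and should be verified against the tool on an actual Mac:

```swift
import Foundation

// Hedged sketch: pull the free-memory percentage out of memory_pressure's
// output, assuming a closing line like
// "System-wide memory free percentage: 43%". The exact wording is an
// assumption for this example.
func freeMemoryPercentage(from output: String) -> Int? {
    for line in output.split(separator: "\n").reversed() {
        if line.contains("free percentage") {
            return Int(line.filter { $0.isNumber })
        }
    }
    return nil
}

let sample = """
The system has 2048 Mbyte of memory.
System-wide memory free percentage: 43%
"""
print(freeMemoryPercentage(from: sample) ?? -1)  // 43
```

The app would re-run the command on a timer and feed the parsed value into its SwiftUI view.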
However, these apps also serve as a cautionary tale within the OpenClaw context. As classic vibe coding examples, they were built without Swift knowledge or deep macOS internals expertise, which raises questions about their accuracy. At one point, for instance, Gpuer reported only 5 GB of memory free, contradicting Activity Monitor. Pasting a screenshot into Claude Code produced adjusted calculations, but confidence in the apps' reliability remains low. Warnings have been added to the GitHub repositories, emphasizing that OpenClaw users should verify outputs before relying on agent-generated tools for critical tasks.
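One plausible source of a discrepancy like that is counting only truly free pages, while Activity Monitor also counts memory macOS can reclaim on demand. The arithmetic below uses illustrative page counts, not real vm_stat output, to show how large the gap can be:

```swift
import Foundation

// Hedged sketch of the kind of miscalculation described above. All page
// counts here are made-up illustrative numbers; 16 KB is the page size on
// Apple silicon.
let pageSize = 16_384                  // bytes per page
let freePages = 320_000                // truly free
let inactivePages = 1_200_000          // reclaimable file cache, etc.
let purgeablePages = 450_000           // memory the kernel can discard

// A naive reading counts only free pages:
let naiveFreeGB = Double(freePages * pageSize) / 1_073_741_824

// A closer match to Activity Monitor also counts reclaimable pages:
let availableGB = Double((freePages + inactivePages + purgeablePages) * pageSize)
    / 1_073_741_824

print(String(format: "naive: %.1f GB, available: %.1f GB", naiveFreeGB, availableGB))
```

With these illustrative figures the naive calculation lands near 5 GB while the broader one exceeds 30 GB, which is exactly the shape of disagreement with Activity Monitor described above.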
Despite these limitations, the projects yielded valuable insights for the OpenClaw ecosystem. They demonstrated that a SwiftUI app can achieve significant functionality within a single file, as seen in GpuerApp.swift (880 lines) and BandwidtherApp.swift (1063 lines). This aligns with OpenClaw’s emphasis on lightweight, local-first applications that minimize complexity. Additionally, wrapping terminal commands in SwiftUI interfaces proved straightforward, and the AI agents exhibited surprising design acumen for SwiftUI layouts. The ease of converting apps to menu bar icons with minimal code further underscores OpenClaw’s potential for rapid tool creation without Xcode.
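The command-wrapping pattern both apps depend on is small: run a command-line tool via Foundation's Process and capture its stdout. Bandwidther and Gpuer would point this at tools like nettop or system_profiler; /bin/echo stands in here so the sketch runs anywhere, and `runCommand` is a hypothetical helper name:

```swift
import Foundation

// Hedged sketch of wrapping a terminal command: launch it with Process,
// capture stdout through a Pipe, and return the text.
func runCommand(_ path: String, _ arguments: [String]) throws -> String {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: path)
    process.arguments = arguments
    let pipe = Pipe()
    process.standardOutput = pipe
    try process.run()
    process.waitUntilExit()
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    return String(decoding: data, as: UTF8.self)
}

// /bin/echo is a stand-in for nettop / system_profiler / memory_pressure.
let output = (try? runCommand("/bin/echo", ["hello from a wrapped command"])) ?? ""
print(output, terminator: "")
```

In a SwiftUI app, a Timer or task loop would call such a helper periodically and publish the parsed result to the view.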
The efficiency of the process, with each app coming together in very little time, highlights how OpenClaw agents can empower users to explore new capabilities, such as macOS app development, through intuitive interaction. The experiment not only shows the fun of vibe coding but also reinforces OpenClaw's role in democratizing automation and plugin development for local AI assistants.


