PROWORKS
CASE STUDY — MOBILE + ANTHROPIC SDK

Forge Remote: putting Claude Code in your pocket

A custom mobile app built on the Anthropic SDK for remote access to Claude Code. Voice chat, text chat, project management, VPN-secured — so I can ship code from anywhere.

SDK

Anthropic SDK native

Interface

Voice + text

Security

VPN-secured

Build

Solo mobile build

Claude Code is a terminal tool. Terminals don't fit in pockets.

Claude Code is the most powerful AI-assisted development tool I use — but it lives in a desktop terminal. Generic mobile ChatGPT apps don't solve this. They don't know your project structure, can't read your files, can't run commands, can't maintain session context across a working session.

The need: something that speaks Claude Code's language, respects project context, lets me manage development workflows from mobile with proper security — not a consumer chat wrapper with no awareness of what I'm building.

The approach: Build the tool you wish existed.

A mobile app built directly on the Anthropic SDK: not a third-party wrapper, not a proxy, but native SDK integration. Architecture: the mobile client connects through a WireGuard VPN to a self-hosted backend, which orchestrates Claude sessions via the Anthropic SDK. The codebase never leaves infrastructure I control.
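To make the shape concrete, here is a minimal sketch of how a self-hosted backend might keep per-project session state and assemble a request for the Anthropic Python SDK's Messages API. The `ProjectSession` class and the model name are illustrative placeholders, not the app's actual code:

```python
# Sketch: per-project session state on the backend, assembled into the
# kwargs that anthropic.Anthropic().messages.create(...) expects.
# Class name and model string are placeholders for illustration.
from dataclasses import dataclass, field

@dataclass
class ProjectSession:
    """Conversation state for one project, kept server-side."""
    project: str
    system_prompt: str
    messages: list = field(default_factory=list)  # alternating user/assistant turns

    def add_user_turn(self, text: str) -> None:
        self.messages.append({"role": "user", "content": text})

    def add_assistant_turn(self, text: str) -> None:
        self.messages.append({"role": "assistant", "content": text})

    def to_request(self, model: str = "claude-sonnet-placeholder") -> dict:
        """Assemble keyword arguments for the SDK's messages.create call."""
        return {
            "model": model,
            "max_tokens": 4096,
            "system": self.system_prompt,
            "messages": list(self.messages),
        }

session = ProjectSession(
    "forge-remote",
    "You are a coding assistant with access to this project's files.",
)
session.add_user_turn("Summarise the diff in src/app.tsx")
request = session.to_request()
```

Keeping the session store server-side is what lets any client, mobile or desktop, resume the same conversation.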

Voice pipeline: Whisper for transcription, Claude for reasoning and code generation, ElevenLabs for voice response. Speak a request, hear the answer. Works on a commute. Works in a meeting break. Works anywhere I don't want to open a laptop.
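The round-trip is just three stages composed in order. A sketch of that shape, with stand-in stage functions where the real app would call Whisper, the Anthropic SDK, and ElevenLabs:

```python
# Illustrative shape of the voice round-trip: transcribe -> reason -> speak.
# The stage callables are stand-ins for Whisper STT, Claude via the
# Anthropic SDK, and ElevenLabs TTS respectively.
from typing import Callable

def voice_round_trip(
    audio_in: bytes,
    transcribe: Callable[[bytes], str],   # e.g. Whisper STT
    reason: Callable[[str], str],         # e.g. Claude via the Anthropic SDK
    synthesize: Callable[[str], bytes],   # e.g. ElevenLabs TTS
) -> tuple[str, bytes]:
    """Return the assistant's text reply and its spoken rendering."""
    prompt = transcribe(audio_in)
    reply = reason(prompt)
    return reply, synthesize(reply)

# Wiring it up with trivial stand-in stages:
reply, audio_out = voice_round_trip(
    b"...",
    transcribe=lambda _: "add a retry to the fetch call",
    reason=lambda p: f"Done: {p}",
    synthesize=lambda text: text.encode(),
)
```

Because each stage is a plain function, any one of them can be swapped (a different STT engine, a different voice) without touching the others.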

Security architecture: WireGuard VPN endpoint I control, client codebases never on shared servers, no third-party service except Anthropic and my own VPN endpoint. Built for someone who handles other people's codebases professionally.
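For orientation, a WireGuard client config for this kind of setup looks roughly like the following. All keys, addresses, and the endpoint hostname are placeholders, not the real deployment:

```ini
[Interface]
# Phone's private key and tunnel address (placeholders)
PrivateKey = <client-private-key>
Address = 10.8.0.2/32
DNS = 10.8.0.1

[Peer]
# Self-hosted backend's public key and endpoint (placeholders)
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820
# Route only backend traffic through the tunnel, not all phone traffic
AllowedIPs = 10.8.0.0/24
PersistentKeepalive = 25
```

Scoping `AllowedIPs` to the backend subnet keeps ordinary phone traffic off the tunnel while forcing every Claude session through it.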

What I built

Project management

Create projects, switch between them, persist context per project across sessions.

Text chat

Full Claude conversation with code context and project awareness — not a generic chatbot, but a developer tool.

Voice chat

Speak a request, Claude responds in voice with code output. Whisper for transcription, ElevenLabs for voice response.

File browsing

Read project files from mobile, review changes, approve commits — without needing a laptop.

Session persistence

Mobile and desktop share context through backend — pick up where you left off on either device.

VPN-enforced security

No third-party cloud, no credential exposure. Client codebases stay on their own infrastructure.
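Session persistence works because both clients read and write the same server-side record. A minimal sketch of that idea, with illustrative field names (the real store is on the backend, not an in-memory dict):

```python
# Sketch: device-agnostic session persistence. Mobile and desktop write
# to the same server-side record, so either can resume the transcript.
# Field names and the in-memory dict are illustrative stand-ins.
import time

def save_session(store: dict, project: str, messages: list, device: str) -> None:
    store[project] = {
        "messages": messages,
        "last_device": device,
        "updated_at": time.time(),
    }

def resume_session(store: dict, project: str) -> list:
    """Return the saved transcript, regardless of which device wrote it."""
    record = store.get(project)
    return record["messages"] if record else []

store: dict = {}
save_session(store, "forge-remote", [{"role": "user", "content": "hi"}], device="mobile")
# Later, from the desktop:
transcript = resume_session(store, "forge-remote")
```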

Tech stack

Anthropic SDK
React Native
Self-hosted backend (Node/Python)
WireGuard VPN
Whisper STT
ElevenLabs TTS
Session persistence
MCP-style tool orchestration

What I'd do differently

Built without a formal roadmap — started as a weekend exploration and became a real tool. That was fine for getting started, but it meant some features were built in the wrong order. Should have spent a day on a feature spec with explicit prioritization before writing the first component.

Should have invested earlier in automated testing. Mobile + voice + VPN + Anthropic SDK packs a lot of failure surfaces into one stack. Manual testing caught most issues but added unpredictable QA time with every feature addition. End-to-end tests would have caught regressions faster.

Want to build something on the Anthropic SDK?

Custom apps on Anthropic's SDK are where most of the interesting AI products of 2026 are being built. If you've got an idea and need a partner who actually knows the SDK, let's scope it.

Book a call →