Why I built CommandIt
I keep a folder of commands. You probably do too.
Mine was scattered across Apple Notes, a few Alfred snippets, half a dozen GitHub gists, and the shell history I’d Ctrl-R through for a while before giving up and retyping. Docker incantations. Git rebase recipes. AWS CLI one-liners. The ffmpeg flag combination that finally worked. After 25 years of shipping software, the “stuff I keep needing” pile is huge.
CommandIt is what I built to fix that. macOS, keyboard-first, local-first. A floating palette you summon, search, paste from. It’s been a nights-and-weekends project since I left Google last year, and it’s now the tool I reach for ten times a day.
The itch got worse, not better
A funny thing happened when I started leaning on AI coding agents. Prompts are commands too. They have arguments. They get reused. “Refactor this React component to use hooks instead of class lifecycle, preserve the existing API, write tests” — that’s a snippet. The model behind it might change, but the prompt rarely does.
So now I had two piles: shell stuff and prompt stuff. Neither system handled both well. Snippet managers treated text as text. Prompt libraries lived inside one tool and didn’t talk to anything else. And nothing — nothing — let an agent paste a command back into a script the way I’d paste it into a terminal.
That gap is what made me actually build the thing instead of just complaining about it.
The bets
A few decisions I made early and haven’t regretted:
- Local-first SQLite. Your snippet library shouldn’t depend on someone else’s uptime. CommandIt stores everything on your Mac, encrypted with AES-256 (SQLCipher). iCloud sync is optional and uses CloudKit — your iCloud, not mine.
- Privacy by default. No account. No login. AI features take your OpenAI or Claude API key and call the provider directly — CommandIt never proxies prompts or sees your text. There’s also on-device AI via MLX on Apple Silicon if you’d rather not send anything anywhere.
- MCP-native. Claude Code, Codex, Gemini CLI, Cursor, Windsurf — they all speak the Model Context Protocol. CommandIt ships an MCP server, so the agent and the human share one library. Ask Claude to “save this command” and it goes into the same SQLite file you’re searching from the palette.
- Keyboard-first, ten-second capture. If saving a snippet takes longer than retyping the command, no one saves anything. Highlight a command in any app, hit ⌃⇧Space, and the New Snippet form opens with the selection pre-filled and arguments (ports, paths, URLs) auto-detected. Templates with {arg} placeholders and dynamic variables (date, clipboard, shell output) handle the variable parts.
- Honest pricing. Free covers what most people need: unlimited snippets, encrypted local storage, search, auto-paste, templates. Plus is $4.99/mo or $49/yr and covers the things that actually cost money to run — iCloud sync, cloud AI, full MCP write access, the composition queue. Free trial, money-back guarantee, no dark patterns.
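To make the template idea concrete, here is a toy sketch of how {arg} placeholders and a dynamic variable could expand. The function name, the `{date}` variable, and the example command are my inventions for illustration, not CommandIt’s actual template engine or syntax:

```python
import datetime
import re

def expand(template: str, args: dict[str, str]) -> str:
    """Expand {arg} placeholders, falling back to dynamic variables.

    A minimal sketch of the template idea; the real app's syntax and
    variable names may differ.
    """
    builtins = {
        # One example of a dynamic variable: today's date.
        "date": datetime.date.today().isoformat(),
    }

    def substitute(match: re.Match) -> str:
        name = match.group(1)
        if name in args:
            return args[name]          # user-supplied argument wins
        if name in builtins:
            return builtins[name]      # otherwise try a dynamic variable
        raise KeyError(f"missing argument: {name}")

    return re.sub(r"\{(\w+)\}", substitute, template)

cmd = expand("ssh -p {port} {user}@{host}",
             {"port": "2222", "user": "chris", "host": "dev"})
# cmd == "ssh -p 2222 chris@dev"
```

The point of the fallback order is that a saved snippet can mix fixed text, prompted arguments, and values computed at paste time without the user thinking about which is which.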
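For the MCP-native bet, the wire format an agent uses is a standard MCP `tools/call` request over JSON-RPC. The tool name `save_snippet` and its argument shape below are assumptions for illustration; only the envelope is part of the protocol:

```python
import json

# Hypothetical request an agent might send to CommandIt's MCP server.
# "tools/call" and the JSON-RPC envelope come from the MCP spec; the
# "save_snippet" tool name and its arguments are invented for this sketch.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "save_snippet",
        "arguments": {
            "title": "Prune merged branches",
            "command": "git branch --merged | grep -v main | xargs git branch -d",
            "tags": ["git"],
        },
    },
}

print(json.dumps(request, indent=2))
```

Because the server writes into the same local SQLite file the palette reads, a snippet saved this way shows up in the next human search with no sync step in between.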
What’s in it today
A short list of the things I use most:
- Fuzzy + FTS5 search ranked by relevance, frequency, and recency — the snippets you use most rise to the top automatically.
- Auto-paste into any app: ↵ and you’re done.
- Categories, tags, favorites, smart filters for when the library gets big.
- Snippet Packs — curated, shareable bundles. Mine has the dozen-ish git incantations I never want to look up again.
- Composition queue: stack multiple snippets, decide how they join.
- A terminal CLI for the moments you’re already in iTerm and don’t want to leave.
The full feature list lives at commandit.ai. It’s grown faster than I expected it to.
Try it
If any of this resonates — the scattered notes, the Ctrl-R archaeology, the prompts you keep losing — give it a spin. I’d genuinely love feedback. Email me at chris@commandit.ai. The good ideas have all come from people telling me what they keep reaching for and not finding.