Show HN: AICTL – A native AI agent for terminal and macOS, in Rust

2 points, posted 12 hours ago
by piotrwittchen

1 comment

piotrwittchen

12 hours ago

Author here. AICTL is an open-source AI agent I've been building in Rust as an alternative to the various closed-source CLI assistants. It's a single workspace with three frontends sharing one engine:

- CLI (aictl): REPL with slash commands, agents, skills, plugins, hooks

- Desktop (macOS, work-in-progress): Tauri-based, same engine

- HTTP server (aictl-server): OpenAI/Anthropic-compatible proxy so you can put Aider/Continue/Cline in front of it and get redaction, audit, and prompt-injection guards for free

What I tried to get right:

- 11 providers in one binary: OpenAI, Anthropic, Gemini, Grok, Mistral, DeepSeek, Kimi, Z.ai, Ollama, plus native GGUF (llama.cpp) and MLX (Apple Silicon) for fully local inference.

- Security as a first-class layer, not a checkbox: CWD jail for tool calls, keyring-backed API keys, three-layer outbound redaction (regex + entropy + optional NER via gline-rs), prompt-injection detection, full JSONL audit log. Local providers skip redaction by default; cloud calls don't.

- A working MCP client with stdio + Streamable HTTP + legacy SSE transports (hand-rolled JSON-RPC, no extra deps).

- User-extensible: drop a manifest + executable in ~/.aictl/plugins/ for custom tools; lifecycle hooks can block, rewrite, or augment any turn.
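Schematically, the manifest-plus-executable shape looks something like this — field names below are simplified for this comment, so check the repo for the actual schema:

```json
{
  "name": "weather",
  "description": "Fetch current weather for a location",
  "command": "./weather.sh",
  "parameters": {
    "type": "object",
    "properties": {
      "location": { "type": "string" }
    },
    "required": ["location"]
  }
}
```

The manifest tells the engine what the tool does and what arguments it takes; the executable receives the call and writes its result to stdout.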
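To give a feel for the entropy layer of the redaction pipeline: the idea is that secrets like API keys are high-entropy strings, so you can flag them statistically even when no regex matches. This is a minimal sketch, not AICTL's actual implementation — the token splitting, length cutoff, and threshold are assumptions:

```rust
use std::collections::HashMap;

/// Shannon entropy of a string, in bits per character.
fn shannon_entropy(s: &str) -> f64 {
    let mut counts: HashMap<char, usize> = HashMap::new();
    for c in s.chars() {
        *counts.entry(c).or_insert(0) += 1;
    }
    let len = s.chars().count() as f64;
    counts
        .values()
        .map(|&n| {
            let p = n as f64 / len;
            -p * p.log2()
        })
        .sum()
}

/// Replace long, high-entropy tokens (likely secrets) before text
/// leaves the machine for a cloud provider.
fn redact_high_entropy(text: &str, min_len: usize, threshold: f64) -> String {
    text.split_whitespace()
        .map(|tok| {
            if tok.len() >= min_len && shannon_entropy(tok) > threshold {
                "[REDACTED]".to_string()
            } else {
                tok.to_string()
            }
        })
        .collect::<Vec<_>>()
        .join(" ")
}

fn main() {
    let msg = "my key is sk-9fQ2xL7vB1pZr8TqWd3YmK5nJ0aHs6Ue4GcVi";
    println!("{}", redact_high_entropy(msg, 20, 4.0));
    // prints: my key is [REDACTED]
}
```

The regex and NER layers catch structured and named-entity PII; the entropy pass is the backstop for opaque random-looking tokens.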
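"Hand-rolled JSON-RPC" concretely means building the 2.0 envelope with plain string formatting instead of pulling in a serde stack. A rough illustration of the shape (not AICTL's code; newline framing for the stdio transport is an assumption here):

```rust
/// Build a JSON-RPC 2.0 request by hand -- no JSON crate needed
/// when the message shape is this regular.
fn jsonrpc_request(id: u64, method: &str, params_json: &str) -> String {
    format!(
        r#"{{"jsonrpc":"2.0","id":{},"method":"{}","params":{}}}"#,
        id, method, params_json
    )
}

fn main() {
    // An MCP-style tools/list call, sent as one newline-delimited
    // line over the server's stdin.
    let msg = jsonrpc_request(1, "tools/list", "{}");
    println!("{}", msg);
    // prints: {"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}
}
```

Responses get matched back to requests by id; the Streamable HTTP and legacy SSE transports carry the same envelopes over different wires.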

What's honestly rough:

- Native GGUF and MLX inference are experimental. Tool-call formatting on small local models is hit-or-miss; chat templates are mostly ChatML. Cloud providers are the recommended daily driver.

- Desktop is macOS-only for now and still WIP.

- Not aimed at coding specifically — it's general-purpose. For dedicated coding agents I'd still point people at Claude Code, Codex, or opencode.
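On "chat templates are mostly ChatML": for anyone unfamiliar, that's the `<|im_start|>` / `<|im_end|>` markup many small local models were trained on. A sketch of what the renderer produces (illustrative, not AICTL's code):

```rust
/// Render (role, content) turns into the ChatML prompt format,
/// ending with an open assistant turn for the model to complete.
fn to_chatml(turns: &[(&str, &str)]) -> String {
    let mut out = String::new();
    for (role, content) in turns {
        out.push_str(&format!("<|im_start|>{}\n{}<|im_end|>\n", role, content));
    }
    out.push_str("<|im_start|>assistant\n");
    out
}

fn main() {
    let prompt = to_chatml(&[
        ("system", "You are a helpful assistant."),
        ("user", "List files in the current directory."),
    ]);
    print!("{}", prompt);
}
```

Models that expect a different template (or a nonstandard tool-call syntax) are exactly where local inference gets hit-or-miss today.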

Install: curl -sSf https://aictl.app/install.sh | sh

Source: https://github.com/pwittchen/aictl

Happy to answer questions about the architecture, the security model, or the server's Anthropic-passthrough mode (the trickiest part — keeping tool_use blocks, prompt caching, and extended thinking intact across the proxy).