auvira_systems
5 hours ago
I love seeing more 'Local-First' tools hitting the front page. We’ve been moving our own project (a financial IDE) toward a purely local-first architecture, and the biggest challenge is always balancing zero-cloud with useful features. Using Claude Code as the base is a clever move because it already has the context of the filesystem. Have you looked into adding a local vector DB (like LanceDB or similar) to give 'pawmode' long-term memory without needing an external API?
daxaur
4 hours ago
Thank you!
Memory right now is flat markdown files (MEMORY.md, SOUL.md) that Claude reads at session start. Honestly it works better than you'd expect. Claude's context window is big enough that a few hundred lines of memory covers most personal assistant use cases. I'd rather keep it as plain text you can read and edit yourself than add a vector DB dependency for marginal recall gains.
The filesystem context thing is exactly right, btw.
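The flat-file memory approach described above is simple enough to sketch: read a fixed set of markdown files at session start and prepend them as a preamble. This is a minimal illustration, not pawmode's actual code; the function name and file layout are assumptions.

```python
from pathlib import Path

def load_memory(workdir: Path, files=("MEMORY.md", "SOUL.md")) -> str:
    """Concatenate flat markdown memory files into one preamble string.

    Missing files are skipped, so a fresh project still starts cleanly.
    """
    sections = []
    for name in files:
        path = workdir / name
        if path.exists():
            sections.append(f"## {name}\n{path.read_text().strip()}")
    return "\n\n".join(sections)

# Demo with a throwaway directory (hypothetical file contents):
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "MEMORY.md").write_text("- User prefers concise answers\n")
    preamble = load_memory(root)
    print(preamble)
```

Because the memory is just text, "editing your assistant's memory" is opening a file in any editor, which is the whole appeal of skipping the vector DB.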
stuaxo
4 hours ago
Going to object to calling anything that uses Claude "local."
Still: this totally could be hooked up to a local LLM.