Run and manage open-source LLMs from your terminal

2 points, posted 5 hours ago
by clanking7150

1 comment

clanking7150

5 hours ago

Hey HN! I built llm-launchpad, a terminal UI for deploying and managing open-source LLMs. It's designed to make model deployment feel simple from the CLI, while running on serverless GPU infrastructure under the hood via Modal. I built it because working with OSS models often involves too much setup and infrastructure friction. I'd love feedback on the terminal UX, and on which models or workflow features would make it more useful.