augusteo
37 minutes ago
Curious about the MCP integration. Are people using this for production workloads or mostly experimentation?
mythz
22 minutes ago
MCP support is available via the fast_mcp extension: https://llmspy.org/docs/mcp/fast_mcp
I use llms.py as a personal assistant, and MCP support is needed to access tools that are only exposed via MCP servers.
MCP is a great way to make features available to AI assistants; here are a couple I've created after enabling MCP support:
- https://llmspy.org/docs/mcp/gemini_gen_mcp - Gives AI agents the ability to generate Nano Banana images or TTS audio
- https://llmspy.org/docs/mcp/omarchy_mcp - Manage Omarchy Desktop Themes with natural language
I will say there's a noticeable delay in using MCP vs native tools, which is why I ended up porting Anthropic's node filesystem MCP to Python [1] to speed up common AI assistant tasks. So MCP servers aren't ideal for frequent, small tasks, but they're great for long-running tasks like image/audio generation.
[1] https://github.com/ServiceStack/llms/blob/main/llms/extensio...
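For anyone curious what exposing a tool over MCP looks like, here's a minimal standalone sketch using the FastMCP API from the official MCP Python SDK. It's illustrative only, not the actual llms.py fast_mcp extension code, and the file-reading tool is just a hypothetical example:

    # Minimal standalone MCP server sketch (illustrative; not llms.py's fast_mcp extension).
    # Uses the FastMCP API from the official MCP Python SDK (pip install mcp).
    from pathlib import Path

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("filesystem-demo")

    @mcp.tool()
    def read_file(path: str) -> str:
        """Return the contents of a text file at the given path."""
        return Path(path).read_text()

    if __name__ == "__main__":
        # Serves over stdio by default, so an MCP-capable client spawns it as a
        # subprocess; every tool call pays that round-trip, which is where the
        # extra latency vs an in-process tool comes from.
        mcp.run()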