Show HN: First Claude Code client for Ollama local models

20 points, posted 5 hours ago
by SerafimKorablev

8 Comments

oceanplexian

41 minutes ago

The Anthropic API was already supported by llama.cpp (the project Ollama ripped off, and which it typically lags by 3-6 months in features), and it already works perfectly fine with Claude Code by setting a simple environment variable.
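
A minimal sketch of that setup, assuming a llama.cpp build whose llama-server exposes the Anthropic-compatible endpoint; the model path, port, and token value are placeholders:

    # serve any local GGUF model (path and port are placeholders)
    llama-server -m ./model.gguf --port 8080

    # point Claude Code at the local server instead of api.anthropic.com
    export ANTHROPIC_BASE_URL=http://localhost:8080
    # some setups also expect a token to be set; a dummy value is enough locally
    export ANTHROPIC_AUTH_TOKEN=dummy
    claude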

xd1936

28 minutes ago

And they reference that announcement and related information in the second line.

eli

an hour ago

There are already various proxies to translate between OpenAI-style models (local or otherwise) and an Anthropic endpoint that Claude Code can talk to. Is the advantage here just one less piece of infrastructure to worry about?
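
For concreteness, the two wire formats such a proxy translates between look roughly like this; the ports and model name are placeholders (11434 is Ollama's default), and the fields follow the public Anthropic and OpenAI specs:

    # Anthropic Messages shape (what Claude Code speaks)
    curl http://localhost:4000/v1/messages \
      -H 'content-type: application/json' \
      -H 'x-api-key: dummy' \
      -H 'anthropic-version: 2023-06-01' \
      -d '{"model": "local-model", "max_tokens": 256,
           "messages": [{"role": "user", "content": "hello"}]}'

    # OpenAI chat-completions shape (what Ollama and most local servers expose)
    curl http://localhost:11434/v1/chat/completions \
      -H 'content-type: application/json' \
      -d '{"model": "local-model",
           "messages": [{"role": "user", "content": "hello"}]}'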

g4cg54g54

12 minutes ago

sidetracking here - but has anyone got one that _actually_ works?

in particular, I'd like to call Claude models - served in OpenAI schema by a reseller - through a proxy that presents the Anthropic format to my Claude Code, but nothing seems to fully line things up (double-translated tool names, for example)

the reseller is abacus.ai - tried BerriAI/litellm, musistudio/claude-code-router, ziozzang/claude2openai-proxy, 1rgs/claude-code-proxy, and fuergaosi233/claude-code-proxy
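
For anyone debugging the same thing: the mangling presumably comes from round-tripping tool definitions between the two schemas. A sketch of the two payload shapes, with a hypothetical run_bash tool (the name and fields are illustrative, but the structure follows each spec):

    # Anthropic tool definition (what Claude Code sends)
    {"name": "run_bash",
     "description": "Run a shell command",
     "input_schema": {"type": "object",
                      "properties": {"command": {"type": "string"}},
                      "required": ["command"]}}

    # OpenAI function-calling equivalent (what an OpenAI-schema reseller expects);
    # a proxy converting this in both directions can end up renaming or
    # prefixing the tool, so the round trip no longer matches
    {"type": "function",
     "function": {"name": "run_bash",
                  "description": "Run a shell command",
                  "parameters": {"type": "object",
                                 "properties": {"command": {"type": "string"}},
                                 "required": ["command"]}}}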

d0100

23 minutes ago

Does this UI work with Open Code?

dosinga

an hour ago

this is cool. not sure it is the first Claude Code-style coding agent that runs against Ollama models though. goose, opencode and others have been able to do that for a while now, no?