Ask HN: A good model to choose in Ollama to run with Claude Code

3 points, posted 13 days ago
by sujayk_33

Item id: 46750752

1 comment

parthsareen

11 days ago

Hey! I'm one of the maintainers of Ollama. 8GB of VRAM is a bit tight for coding agents, since their prompts are quite large. You could try qwen3 with at least a 16k context length and see how it works.
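
A minimal sketch of the suggestion above, not from the thread: it asks a local Ollama server to run qwen3 with a 16k context window by setting the num_ctx option. It assumes Ollama is running at its default address (http://localhost:11434) and that the qwen3 model has already been pulled; the prompt text is just a placeholder.

    # Sketch: call a local Ollama server with qwen3 and a 16k context window.
    # Assumes the default endpoint http://localhost:11434 and that
    # `ollama pull qwen3` has been run beforehand.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "qwen3",                  # model suggested in the comment
            "messages": [
                {"role": "user", "content": "Write a function that reverses a string."}
            ],
            "options": {"num_ctx": 16384},     # at least 16k context, per the comment
            "stream": False,                   # return a single JSON response
        },
    )
    resp.raise_for_status()
    print(resp.json()["message"]["content"])

The same num_ctx setting can also be baked into a Modelfile or set when starting the server, so that Claude Code (or any other client) gets the larger context without passing options on every request.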