Hacker News
Running LLMs locally? Cut your VRAM consumption by 45% with one line of code
2 points
posted an hour ago
by CarlosCosta_
Item id: 47652954
1 comment
starkeeper
28 minutes ago
Isn't this just an ad? I'm confused.