LLM Chat via SSH

38 points | posted 17 days ago
by wey-gu

24 Comments

demosthanos

15 days ago

Skimming the source code, I was really confused to see TSX files. I'd never seen Ink (React for CLIs) before, and I like it! (A minimal example is sketched after the links below.)

Previous discussions of Ink:

July 2017 (129 points, 42 comments): https://news.ycombinator.com/item?id=14831961

May 2023 (588 points, 178 comments): https://news.ycombinator.com/item?id=35863837

Nov 2024 (164 points, 106 comments): https://news.ycombinator.com/item?id=42016639
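For anyone who hasn't used Ink before, here is a minimal sketch of what it looks like. This is not taken from this project's source; only the `render` and `Text` exports, which are part of Ink's documented API, are assumed:

```tsx
// Minimal Ink app: React components render to the terminal instead of the DOM.
// Requires `npm install ink react` and a TSX-aware runner.
import React from 'react';
import {render, Text} from 'ink';

const App = () => <Text color="green">Hello from the terminal</Text>;

render(<App />);
```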

ccbikai

13 days ago

Many CLI applications now use Ink to build their UIs.

I suspect React will eventually become the standard way to write UIs everywhere.

amelius

15 days ago

I'd rather apt-get install something.

But that no longer seems possible with how software is distributed these days, especially for GPU-dependent stuff like LLMs.

So yeah, I get why this exists.

halJordan

14 days ago

What is the complaint here? There are plenty of binaries you can invoke from your CLI that will query a remote LLM API.

dncornholio

15 days ago

Using React to render a CLI tool is something. I'm not sure how I feel about that. It feels like 90% of the code is handling rendering issues.

demosthanos

15 days ago

I mean, it's a thin wrapper around LLM APIs, so it's not surprising that most of the code is rendering. I'm not sure what you mean by "handling issues with rendering", though; it looks like a pretty bog-standard React app. Am I missing something?

xigoi

14 days ago

It’s not clear from the README what providers it uses and why it needs your GitHub username.

ccbikai

13 days ago

It connects to any OpenAI-compatible API.

The GitHub username is used to prevent abuse.
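For illustration, "OpenAI-compatible" usually means the provider exposes a `POST /v1/chat/completions` endpoint with the request and response shape below. The base URL, model name, and `API_KEY` variable here are placeholders, not this project's actual configuration:

```ts
// Hedged sketch of a chat call against an OpenAI-compatible endpoint.
// The host, model name, and API_KEY env var are hypothetical.
const res = await fetch('https://llm.example.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.API_KEY}`,
  },
  body: JSON.stringify({
    model: 'some-model',
    messages: [{role: 'user', content: 'Hello over SSH!'}],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```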

ryancnelson

15 days ago

This is neat... whose Anthropic credits am I using, though? Sonnet 4 isn't cheap! Would I hit a rate limit if I used this for daily work?

ccbikai

17 days ago

I am the author; thank you for your support.

You're welcome to help me maintain it.

kimjune01

17 days ago

Hey, I just tried it. It's cool! I wish it were more self-aware.

ccbikai

17 days ago

Thank you for your feedback; I will optimize the prompt.

t0ny1

15 days ago

Does this project send requests to LLM providers?

cap11235

15 days ago

Are you serious? Yeah, it's using Gemini 2.5 Pro without a server, sure.

eisbaw

15 days ago

Why not telnet?

accrual

15 days ago

I'd love to see an LLM outputting over a Teletype. Just tschtschtschtsch as it hammers away at the paper feed.

cap11235

15 days ago

A week or so ago, an LLM finetune was posted here that speaks like a 19th-century Irish author. I'm somewhat looking forward to having an LLModem model.

RALaBarge

15 days ago

No HTTPS support

benterix

15 days ago

I bet someone can write an API Gateway for this...