Choosing the Right GPU for Training and Inference

3 points | posted 3 days ago
by redohmy

2 Comments

jamesbsr

3 days ago

The trend is that not everyone has the capability to train a Large Language Model, so the majority will focus on fine-tuning or prompt engineering :)
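
To put rough numbers on it, here's a back-of-the-envelope VRAM estimate (just a sketch; the ~16 bytes/parameter figure for full fine-tuning with Adam and the 1.2x overhead factor are rule-of-thumb assumptions, and activations/batch size add more on top):

    def vram_gb(params_billion, bytes_per_param, overhead=1.2):
        # Memory just to hold model state, with a fudge factor for
        # framework overhead; activations are not included.
        return params_billion * 1e9 * bytes_per_param * overhead / 1024**3

    # Common rules of thumb:
    #   fp16 inference         ~2 bytes/param
    #   int4 inference         ~0.5 bytes/param
    #   full fine-tune (Adam)  ~16 bytes/param (weights + grads + optimizer states)
    for label, bpp in [("fp16 inference", 2),
                       ("int4 inference", 0.5),
                       ("full fine-tune, Adam", 16)]:
        print(f"7B model, {label}: ~{vram_gb(7, bpp):.0f} GB")

Even for a 7B model, full fine-tuning blows well past a single 24 GB consumer card, which is why quantized inference and parameter-efficient methods like LoRA are what most people actually run at home.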

user

3 days ago

[deleted]