Alibaba Cloud: AI Models, Reducing Footprint of Nvidia GPUs, and Cloud Streaming

13 points, posted 4 months ago
by ekianjo

5 Comments

Havoc

4 months ago

>they already have developed their own CUDA-like support, that is not 100% compatible, but works well enough for their internal uses at Alibaba.

I wonder whether that's inference or training.

ekianjo

4 months ago

From what I could gather, this was mostly for inference.

daft_pink

4 months ago

Not sure about everyone else, but I can't justify sending my data workflows to China.

rightbyte

4 months ago

Do you imply you can justify sending your data anywhere else?

daft_pink

4 months ago

Unfortunately, it's not yet affordable to run your own LLM, but since I work in a regulated industry, the data can't go to China.

But you are correct: the moment it becomes practical to do it in-house, it will be done that way.