

You must be American. I am talking about Kimi, Mistral, GLM, Liquid, MiniMax, Arcee, Qwen, DeepSeek, Xiaomi.
And you are of course allowed to use cloud inference if you don’t have the hardware to run locally. Just choose an inference service that is not in bed with fascists. There are plenty. Good luck and have a nice day.
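A minimal sketch of what that looks like in practice, assuming your chosen provider exposes an OpenAI-compatible endpoint (many independent hosts of open-weight models do); the base URL, model name, and environment variable below are placeholders, not any particular provider's values:

```python
import os
from openai import OpenAI  # pip install openai; the client works against any OpenAI-compatible endpoint

# Placeholders: swap in your provider's base URL, your own API key,
# and whichever open-weight model (Kimi, Qwen, DeepSeek, Mistral, ...) they host.
client = OpenAI(
    base_url="https://inference.example.com/v1",
    api_key=os.environ["INFERENCE_API_KEY"],
)

response = client.chat.completions.create(
    model="open-weights-model-name",
    messages=[{"role": "user", "content": "Hello from a setup with no local GPU."}],
)

print(response.choices[0].message.content)
```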


Oooh so they will have these high-end GPUs just sitting there on house walls? What tools would be needed to unmount one? Asking for a friend…