their new datacenter hardware is hyper tuned for LLMs at the expense of general compute, unlike AMD
This is not true. The AMD MI300X/MI325X are, if anything, even more tuned for AI. They’re missing ROPs when Nvidia’s datacenter GPUs (last I checked) still have them.
…And honestly the demand for datacenter GPUs outside of AI is pretty small, anyway.
Also, CUDA has always been, and will remain, the dominant compute API.
I’m not trying to shill Nvidia here. Screw them. The MI cards are better hardware anyway, just with a worse and (ironically) more AI-specialized software stack that has utterly sabotaged them.