The 7900 XTX is 24 GB, and the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes.
The AI Pro isn't even available! And 32 GB isn't enough anyway.
I think you underestimate how desperate ML (particularly LLM) tinkerers are for VRAM; they're working with ancient MI50s and weird stuff like that. If AMD had sold the 7900 with 48 GB for a small markup (instead of $4000), AMD would have grassroots support everywhere, because that's what devs would spend their time making work. And these are the same projects that trickle up to the MI325X and newer.
I was in this situation: I desperately wanted a non-Nvidia ML card a while back. I contribute little bugfixes and tweaks to backends all the time, but I ended up with a used 3090 because the 7900 XTX was just too expensive for 'only' 24 GB plus all the fuss.
There are lingering bits of AMD support everywhere: Vulkan backends in popular projects, unfixed ROCm bugs, stuff that works with tweaks but isn't optimized yet; the problem is AMD isn't making it worth anyone's while to maintain them when devs can (and do) just use 3090s or whatever.
They kind of took a baby step in this direction with the AI 395 (effectively a 110 GB VRAM APU, albeit very compute-light compared to a 7900/9700), but it's still $2K, effectively mini-PC only, and kind of too little, too late.
I'm well aware; I'm one such tinkerer. It's a catch-22: no good software support means no one really wants to use it, and since no one really wants to use it, AMD doesn't build the stuff. Also, AMD is using much denser memory chips, so an easy doubling of VRAM capacity isn't as feasible.
It took them a few months, IIRC, to get proper support for the 9070 in ROCm.