• notthebees@reddthat.com · 2 days ago

    The 7900 XTX is 24 GB; the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes. There’s no point in shoving that much VRAM into a card if software support is painful and makes it hard to develop for. I’m probably biased because of my 6800 XT, one of the earliest cards still supported by ROCm, so there’s a bunch of stuff my GPU can’t do. ZLUDA is painful to get working (and I have it easier thanks to the 6800 XT). ROCm mostly works, but VRAM utilization is very inefficient for some reason, and it’s Linux only, which is fine, but I’d like more cross-platform options. Vulkan compute is deprecated within PyTorch. AMD HIP is annoying as well, but idk how much of that was just my experience with ZLUDA.
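
    A quick way to sanity-check the ROCm side of this (a minimal sketch; ROCm builds of PyTorch reuse the torch.cuda API, and the HSA_OVERRIDE_GFX_VERSION trick is the usual community workaround for cards that have fallen off the official support list):

    ```python
    import torch

    # ROCm builds of PyTorch expose the GPU through the torch.cuda API;
    # torch.version.hip is a version string on ROCm builds, None on CUDA builds.
    print("HIP build:", torch.version.hip)
    print("GPU visible:", torch.cuda.is_available())
    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))

    # For cards ROCm no longer officially lists, the usual workaround is
    # spoofing a supported gfx target before launching, e.g. for RDNA2:
    #   HSA_OVERRIDE_GFX_VERSION=10.3.0 python this_script.py
    ```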

    Intel actually has better cross-platform support with IPEX, but that’s just PyTorch. Again, fine.
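
    For comparison, the IPEX flow is pretty minimal (a sketch following Intel’s documented usage; the Linear model here is just a placeholder):

    ```python
    import torch
    import intel_extension_for_pytorch as ipex

    # Placeholder model; any nn.Module works the same way.
    model = torch.nn.Linear(1024, 1024).to("xpu")  # "xpu" is the Intel GPU device
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    # ipex.optimize applies Intel-specific kernel and memory-layout optimizations.
    model, optimizer = ipex.optimize(model, optimizer=optimizer)

    x = torch.randn(8, 1024, device="xpu")
    y = model(x)
    ```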

    • brucethemoose@lemmy.world · 2 days ago

      The 7900 XTX is 24 GB; the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes.

      The AI Pro isn’t even available! And 32 GB is not enough anyway.

      I think you underestimate how desperate ML (particularly LLM) tinkerers are for VRAM; they’re working with ancient MI50s and weird stuff like that. If AMD had sold the 7900 with 48 GB at a small markup (instead of $4,000), AMD would have grassroots support everywhere, because that’s what devs would have spent their time making work. And these are the same projects that trickle up to the MI325X and newer.
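
      Rough numbers show why 32 GB is short (a back-of-the-envelope sketch; the ~4.5 bits/weight figure is a ballpark for typical Q4 quants, not a measurement):

      ```python
      # Rough VRAM estimate for quantized LLM weights alone (KV cache,
      # activations, and runtime overhead all come on top of this).
      def weights_gb(params_billion: float, bits_per_weight: float) -> float:
          return params_billion * 1e9 * bits_per_weight / 8 / 1e9

      for params in (13, 32, 70):
          need = weights_gb(params, 4.5)  # ~4.5 bits/weight for a typical Q4 quant
          print(f"{params}B @ ~Q4: {need:.0f} GB -> fits in 32 GB? {need < 32}")

      # 13B @ ~Q4:  7 GB -> fits
      # 32B @ ~Q4: 18 GB -> fits
      # 70B @ ~Q4: 39 GB -> doesn't, even before the KV cache
      ```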

      I was in this situation: I desperately wanted a non-Nvidia ML card a while back. I contribute little bugfixes and tweaks to backends all the time, but I ended up with a used 3090 because the 7900 XTX was just too expensive for ‘only’ 24 GB plus all the fuss.

      There are lingering bits of AMD support everywhere: Vulkan backends in popular projects, unfixed ROCm bugs, stuff that works but isn’t optimized yet. The problem is AMD isn’t making it worth anyone’s while to maintain any of it when devs can (and do) just use 3090s or whatever.

      They kind of took a baby step in this direction with the AI 395 (effectively a ~110 GB VRAM APU, albeit very light on compute compared to a 7900/9700), but it’s still $2K, effectively mini-PC only, and kinda too little, too late.

      • notthebees@reddthat.com · 1 day ago

        I’m well aware; I’m one such tinkerer. It’s a catch-22: no good software support means no one really wants to use the hardware, and since no one really wants to use it, AMD doesn’t invest in the software. Also, AMD is already using much denser memory chips, so an easy doubling of VRAM capacity isn’t really possible.
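
        The capacity math is mostly dictated by the memory bus (a rough sketch using public specs; “clamshell” means two chips per 32-bit channel, which is how the 48 GB W7900 doubles the 7900 XTX’s memory):

        ```python
        # GDDR capacity is bus-width driven: each memory chip sits on a 32-bit
        # channel, so capacity = (bus_width / 32) * GB_per_chip, doubled in
        # clamshell mode (two chips per channel, as on the 48 GB W7900).
        def vram_gb(bus_width_bits: int, gb_per_chip: int, clamshell: bool = False) -> int:
            chips = bus_width_bits // 32
            return chips * gb_per_chip * (2 if clamshell else 1)

        print(vram_gb(384, 2))                  # 24 -> 7900 XTX (12 x 2 GB GDDR6)
        print(vram_gb(384, 2, clamshell=True))  # 48 -> W7900-style clamshell
        print(vram_gb(256, 2))                  # 16 -> 9070-class 256-bit board
        ```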

        IIRC, it took them a few months to get proper ROCm support for the 9070.