• Merlin@lemm.ee · +21 · 9 hours ago

    The consumer GPU market is becoming a dystopia at the top end. AMD has publicly retreated from it and Intel is likely a decade away from competing there. I guess I’ll stay in the midrange moving forward. Fuck Nvidia.

    😞

    • AtHeartEngineer@lemmy.world · +3 · 2 hours ago

      If AMD were smart, they’d release an upper-mid-range card with 40+ GB of VRAM. It doesn’t even have to be their high-end card; people wanting to do local/self-serve AI stuff would swarm on those.

      • Eager Eagle@lemmy.world · +2 · edited 12 minutes ago

        Yeah, I’ve been wanting a card like that to run local models since 2020, when I got a 3080. Back then I’d have spent a bit more to get one with the same performance but with some 20GB of VRAM.

        Nowadays, if they released an RX 9070 with at least 24GB at a price between the 16GB model and an RTX 5080 (also 16GB), that would be neat.
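
        As a very rough, hypothetical back-of-the-envelope sketch (illustrative numbers only, not from any benchmark), here’s why those capacities matter for local models: the weights alone take up most of the VRAM before you even count the KV cache or activations.

        ```python
        # Rough, hypothetical estimate of the VRAM needed just to hold a model's
        # weights at common quantization levels. Real usage is higher once you add
        # the KV cache, activations, and framework overhead, so treat these as
        # lower bounds.
        BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

        def weight_vram_gb(params_billions: float, quant: str) -> float:
            """Approximate GiB of VRAM occupied by the weights alone."""
            return params_billions * 1e9 * BYTES_PER_PARAM[quant] / 1024**3

        for params in (7, 13, 34, 70):
            row = ", ".join(f"{q}: {weight_vram_gb(params, q):5.1f} GB" for q in BYTES_PER_PARAM)
            print(f"{params:>2}B params -> {row}")
        ```

        By that rough math, a 16GB card tops out around 13B models at 8-bit, while something in the 24-40GB range starts to make quantized 30-70B models practical on a single card.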

    • Diplomjodler@lemmy.world · +16 · 7 hours ago

      I don’t get why people are so keen on handing over such a huge amount of money just for bragging rights. The midrange is perfectly fine for playing any game these days. Those top-end GPUs get an inordinate amount of attention compared to their relevance to most people.

      • filister@lemmy.world · +4/-1 · 4 hours ago

        The problem is that Nvidia is consistently gimping the midrange, making it a very unattractive proposition.

      • poleslav@lemmy.world · +3 · 6 hours ago

        As someone who does VR in flight sims on one of the least optimized games (DCS), I can see the allure. Aside from that one niche, though, I can’t think of many uses for a 90-series card.

      • sugar_in_your_tea@sh.itjust.works · +2 · 7 hours ago

        Yup, my 6650 XT is perfectly fine, and my SO has a 6700 XT. Both are way more than we need, and we paid $200-300 for them on sale. Why get the top end? Mine is roughly equivalent to current consoles, so I doubt I’m missing out on much except RTX, but I also don’t care enough about RTX to 10x my GPU cost.

    • latenightnoir@lemmy.world · +2 · edited 7 hours ago

      This was my exact thinking the moment I realised I, yet again, needed a GPU upgrade (thanks, Unreal 5…), which is why I seared my soul and dished out for a 4080 Super, in the hope that I’ll be covered for at least a decade. The 40 series at least still seems to be built mainly for pretty pictures.

      It’s genuinely not worth paying attention to this nonsense. Maybe - MAYBE - AMD will pull a Comrade and shift full focus to making genuinely good and progressively better GPUs, meant for friggin’ graphics processing and not this “AI” tumor. But that’s a big-ass “maybe.”

    • halcyoncmdr@lemmy.world · +31 · 11 hours ago

      Nvidia doesn’t give a shit about gamers anymore. The incremental improvements are a side effect. That’s why they’re now so focused on software enhancements like DLSS instead: it gives them the marketing numbers without having to make hardware improvements for gaming.

      Their bread and butter now is AI and large-scale machine learning, where businesses are buying thousands of cards at a time. It’s also why they’re so stingy with VRAM on their cards: large amounts of VRAM aren’t as necessary for most workloads outside of gaming now, and it saves them millions of dollars every generation.

      • TheGrandNagus@lemmy.world · +4 · edited 3 hours ago

        You’re right, though I’d say Nvidia has always been stingy with VRAM. The 1060 had 6GB while the RX 480 had 8GB, for example; the 970 effectively had 3.5GB of usable VRAM while the R9 390 had 8GB; and there are similar examples going back a long way.

        It has gotten pretty bad recently, though, worse than usual. AI is also very VRAM-intensive (even more so than gaming), so I imagine they’ve been diverting those memory chips to their AI/enterprise cards.

      • TheDemonBuer@lemmy.world · +8/-2 · 8 hours ago

        Nvidia doesn’t give a shit about gamers anymore… Their bread and butter now is AI and large-scale machine learning, where businesses are buying thousands of cards at a time.

        I’m just quoting this for emphasis.