• tankplanker@lemmy.world
    2 days ago

    What this chart is missing is the impact of the quality of the screen and the source material being played on it.

    A shit screen is a shit screen, just like a badly filmed TV show from the 80s will look like crap on anything other than an old CRT.

    People buy a 4K screen from Walmart for $200, then wonder why they can't tell it's any better than their old 1080p screen.

    The problem with pushing up resolution is that the cost of a good set right now is so high it's a niche within a niche of people who actually want it. Even a good 4K set with proper HDR support, big enough to make a difference, is expensive. Even when 8K moves past early-adopter markups it's still going to be expensive, especially compared to the tat you can buy at the supermarket.

    • sp3ctr4l@lemmy.dbzer0.com
      2 days ago

      It is totally true that things are even more complex than just resolution, which is why I linked the much more exhaustive write-up.

      It's even more complicated in practice than all the things they bring up, since they focus mainly on the movie-watching experience, not the video-game-playing experience.

      They do not go into LED vs QLED vs OLED vs other actual display techs, don't go into response latency, refresh rates, or, as you say, all the different kinds of HDR and colour gamut support… I am sure I am forgetting things…

      Power consumption may be a significant thing for you, as may image quality at various viewing angles…

      Oh right, FreeSync vs GSync, VRR… blargh there are so many fucking things that can be different about displays…