Honestly, I have to agree with the article - while you could say graphics have improved in the last decade, the improvement is nowhere near as dramatic as in the decade before that.
I’d easily argue that the average AAA game from a decade ago looks just as good on a 1080p/1440p display as the average AAA game today - and I’d still bet the difference wouldn’t be that noticeable at 4K either.
And what do we get for those diminishing returns on graphics?
Singleplayer games are being made smaller, or turned into vapid “open worlds”, and cost more because more resources go to the design teams than to the rest of the game.
Meanwhile, multiplayer games get smaller, less frequent updates, and the gaps get padded out with aggressive micro-transactions.
I hate that “realistic” graphics have become such an over-hyped selling point that they’re consuming AAA gaming in its entirety.
I would love for AAA games to go back to being reasonably priced with plainer-looking graphics, so that resources can actually be put into making them more than just glorified tech demos.
Well, it’s a scaling effect with diminishing returns.
To the human eye, 480p vs. 1080p is a significant jump, but 4K vs. 8K is hard to tell apart.
I think focusing on new technologies such as AI upscaling/world generation or VR is a better use of developers’ time, and it pushes the industry back into the innovative space it’s supposed to be in.
VR will always stay a niche technology just because of the limited circumstances where people can use it (e.g. not on the move, not while watching kids, …).

I agree - I should’ve clarified VR/AR. I do think AR will be a large part of daily life, and will apply far beyond video games, in the not-too-distant future.
Depends a bit on screen size and placement, too. I play on a 27" 1440p display about 3 feet from my face, and my eyeballs are definitely the lowest-resolution link in the chain. On a 32" screen on my desk, or a 60" screen in front of the couch, 1080p-1440p will start showing its pixels. I’m in no hurry to upgrade my screen, because 1440p gives me great framerates with a cheaper video card - and at a 3' viewing distance it’s hard to take in all of a 32" screen anyway.
I’d much rather have a good game that runs fast at 1080p than need a $700 card for an OK framerate and style-over-substance gameplay just to get 4K.
Agree that using VR to get immersive, wide-field graphics from fewer pixels is a great alternative.
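To put rough numbers on the “eyeballs are the lowest resolution link” point above: here’s a minimal sketch of the pixels-per-degree math, assuming a 16:9 panel, treating ~60 pixels per degree as the usual rule-of-thumb limit for 20/20 acuity, and using my own guesses at typical desk/couch viewing distances:

```python
import math

# Back-of-the-envelope pixels-per-degree (PPD) calculator for a 16:9 screen.
# Assumption: ~60 PPD is the common rule-of-thumb limit for 20/20 acuity.
def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # physical screen width
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_px / fov_deg

# 27" 1440p at ~3 feet: ~70 PPD, already past what a typical eye resolves.
print(pixels_per_degree(27, 2560, 36))
# 60" 1080p at ~6 feet (a guess at couch distance): ~48 PPD, pixels visible.
print(pixels_per_degree(60, 1920, 72))
```

By that estimate a 27" 1440p desk monitor already out-resolves the eye, which is the diminishing-returns argument for 4K and beyond in a nutshell.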
> I’d easily argue that the average AAA game from a decade ago looks just as good on a 1080p/1440p display as the average AAA game today - and I’d still bet the difference wouldn’t be that noticeable at 4K either.
If you just count pixels, yes. But what really took a big step forward this decade was realistic animation - and it takes a lot of effort and time to get right.
Honestly, I’d still argue there are diminishing returns on this front as well.
I play plenty of older titles, and I wouldn’t say I notice that much of a difference - though that’s a very subjective opinion.
There are hundreds of great games on PC that don’t put all their focus on graphics. You just have to look past the industry-giant devs. Go play Stardew Valley, or Hades, or Subnautica.
Subnautica is a game I play for the audio, and that’s really saying something because the visuals are great. I bought open-back headphones for that game.
Of course there are, and I do - but the focus of the article, and thus this thread, was on the AAA gaming space and its obsession with graphics.
Smaller studios and indies have already figured out the whole “you don’t need to be able to see every fibre of a character’s hair for a game to be good” thing.
Halo 4 at 1440p looks very good, and it’s 12 years old. Fully agree. I’d rather see more entities on screen, more particles, and longer draw distances. Polygon counts and textures don’t really impress me anymore.
I’d rather see highly stylized games with a lot going on in the world than waste half of my frame render time on a character’s face.
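To make the “half of my frame render time” line concrete, here’s a trivial back-of-the-envelope, assuming a 60 fps target (the 50/50 split is illustrative, not taken from any real profile):

```python
# Frame-time budget at a 60 fps target (illustrative numbers only).
TARGET_FPS = 60
frame_budget_ms = 1000 / TARGET_FPS        # ~16.7 ms to draw everything
face_ms = frame_budget_ms / 2              # hypothetical hero-face share
rest_ms = frame_budget_ms - face_ms        # world, physics, AI, UI, post-FX
print(f"{frame_budget_ms:.1f} ms budget: {face_ms:.1f} ms face, {rest_ms:.1f} ms for everything else")
```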