

In a perfect world, you want to run all of your games at your monitor's native resolution. I started gaming back when we hooked up bulging TVs to our computers (C-64), and we were happy to play at 320x200. These days, I have multiple 4K and ultrawide monitors, and the difference in graphics quality is amazing. Still, there are plenty of games where even the fastest current hardware simply isn't capable of running a new game at 4K, maximum quality, and 60 fps. Look no further than Control, Gears of War 5, and Borderlands 3 if you want examples.

Depending on the game, it might be possible to play at 4K with a lower quality setting, and the difference between ultra and high settings is often more of a placebo than something you'd actually notice without comparing screenshots or running benchmarks. Dropping from ultra to medium, on the other hand, might be too much of a compromise for some. There are also games like Rage 2 where even going from maximum to minimum quality will only improve framerates by 50 percent or so.

Back when we all used CRTs, running at a lower resolution than your monitor's native resolution was commonplace. However, CRTs were inherently less precise and always had a bit of blurriness, so we didn't really notice. I hated dealing with pin cushioning, trapezoidal distortion, and the other artifacts caused by CRT technology far more than the potential blurriness of not running at a higher resolution. When we shifted to LCDs and digital signals, suddenly all the pixels were perfectly square, and running at a different resolution than native presented more visible problems.

Consider a simple example of a 160x90 resolution display with a diagonal black stripe running through it. Now try to stretch that image to 256x144. We run into the problem of not easily being able to scale the image, and there are different techniques. One option is to use nearest neighbor interpolation, or you can do bilinear or bicubic interpolation. There are pros and cons to any of those, but all look worse than the native image.

I've taken a source image at 160x90 and blown that up to 1280x720 (using nearest neighbor and bicubic interpolation) in Photoshop. Then I've scaled that to 256x144 using both NN and bicubic, and afterward blown that up to 720p via NN (so you can see what the individual pixels would look like).

[Image: 160x90 scaled to 256x144 via bicubic interpolation, then blown up to 720p]

The result is a closeup of what "native" rendering looks like on an LCD, compared to the two different scaling algorithms for non-native resolutions.
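
If you want to reproduce that comparison without Photoshop, here's a minimal sketch using Python and Pillow. It's an illustration under my own assumptions, not the exact process described above: the two-pixel stripe width and the output filenames are arbitrary choices.

```python
# Sketch: recreate the 160x90 "diagonal black stripe" test image, upscale it to
# 256x144 with nearest neighbor and with bicubic filtering, then blow each result
# up to 1280x720 via nearest neighbor so the individual pixels are easy to inspect.
from PIL import Image, ImageDraw

# Build the 160x90 source: white background with a diagonal black stripe.
src = Image.new("RGB", (160, 90), "white")
draw = ImageDraw.Draw(src)
draw.line((0, 0, 159, 89), fill="black", width=2)  # stripe width is arbitrary

for name, resample in (("nearest", Image.NEAREST), ("bicubic", Image.BICUBIC)):
    scaled = src.resize((256, 144), resample=resample)             # non-integer 1.6x stretch
    closeup = scaled.resize((1280, 720), resample=Image.NEAREST)   # clean 5x blow-up for viewing
    closeup.save(f"stripe_256x144_{name}_to_720p.png")
```

The nearest neighbor output shows the uneven stair-stepping, while the bicubic output shows the softened, slightly blurry edges.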

LCDs and video drivers have to take care of the interpolation, and while the results can look decent, there's no denying the deficiencies of both nearest neighbor and bicubic scaling. NN results in some pixels getting doubled and others not, while bicubic scaling causes a loss of sharpness. I intentionally started with an extreme example using black on white; with game images, it's far less problematic.

The other factor is what resolution you're using relative to native. Running 1080p on a 4K display ends up being one fourth the native resolution. If your graphics card drivers support integer scaling, you can double the width and height and get a "sharper" picture. Intel and Nvidia now support integer scaling, though it requires an Ice Lake 10th Gen CPU for Intel (laptops) or a Turing GPU for Nvidia. Otherwise, you get the bicubic fuzziness that some people dislike.
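
To make the "some pixels get doubled and others not" point concrete, here's a small pure-Python sketch. It uses a simplified floor-based nearest neighbor mapping of my own; real scalers may round slightly differently, but the pattern is the same.

```python
# Count how often each source row gets reused when nearest-neighbor scaling 1080
# rows up to 1440 rows (1080p -> 1440p, a 1.33x stretch) versus 2160 rows
# (1080p -> 4K, an exact 2x integer scale).
from collections import Counter

def sample_counts(src_rows: int, dst_rows: int) -> Counter:
    """Count how many destination rows map back to each source row under NN."""
    scale = src_rows / dst_rows
    return Counter(int(d * scale) for d in range(dst_rows))

for dst_rows in (1440, 2160):
    counts = sample_counts(1080, dst_rows)
    print(f"1080 -> {dst_rows}: each source row used {sorted(set(counts.values()))} times")

# 1080 -> 1440: each source row used [1, 2] times  (some rows doubled, others not)
# 1080 -> 2160: each source row used [2] times     (every row doubled, a clean 2x integer scale)
```

That uneven duplication at 1.33x is exactly what produces the irregular, chunky look, whereas the 2x case maps every source pixel to a tidy 2x2 block.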

It's perhaps better to show what this looks like when dealing with real resolutions, like 1080p scaled to 1440p and 4K, using integer scaling (nearest neighbor) vs. bicubic filtering.

[Image: 1080p scaled to 1440p via bicubic filtering]

Why am I talking about integer scaling and various filtering techniques in the drivers when I started with discussing playing games at lower than native resolutions? It's because the two topics are intertwined. Integer scaling is a great feature for pixel art games, but it's often less important (and perhaps even undesirable) when dealing with other games and content.

If you look at the above images, you can definitely see the differences in scaling modes, and when you're not running at your display's native resolution you'll usually end up with some form of interpolation, either from the GPU or from the display scaler. Most displays these days do some form of bicubic upscaling; some really old LCDs did nearest neighbor interpolation, which could look horrible (e.g., when scaling high-contrast text like the black line on white background example above), but that approach has since gone away.

Integer scaling can sometimes be used while running at lower resolutions, if you have the right hardware, or you can pass the signal along to the display and let it handle the scaling.
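
Whether a given render resolution can be integer scaled on your display comes down to simple divisibility: the native width and height both need to be the same whole-number multiple of the render resolution. Here's a short illustrative sketch; the resolution lists are only examples, not an exhaustive or authoritative set.

```python
# Check which common render resolutions integer scale to a given native panel
# resolution, i.e. width and height both divide evenly by the same whole number.
RENDER_RESOLUTIONS = [(2560, 1440), (1920, 1080), (1280, 720), (960, 540)]
NATIVE_RESOLUTIONS = [(3840, 2160), (2560, 1440)]

def integer_scale_factor(native, render):
    """Return the whole-number scale factor if render integer scales to native, else None."""
    nw, nh = native
    rw, rh = render
    if nw % rw == 0 and nh % rh == 0 and nw // rw == nh // rh:
        return nw // rw
    return None

for native in NATIVE_RESOLUTIONS:
    for render in RENDER_RESOLUTIONS:
        factor = integer_scale_factor(native, render)
        if factor and factor > 1:
            print(f"{render[0]}x{render[1]} -> {native[0]}x{native[1]}: {factor}x integer scale")

# 1920x1080 -> 3840x2160: 2x integer scale
# 1280x720 -> 3840x2160: 3x integer scale
# 960x540 -> 3840x2160: 4x integer scale
# 1280x720 -> 2560x1440: 2x integer scale
```

Notice that 1080p is a clean 2x fit on a 4K panel but has no integer fit on 1440p, which is why the 1080p-to-1440p comparisons above have to fall back on bicubic filtering or uneven nearest neighbor.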
