Help Finding a New Monitor for Gaming

The issue I see with scaling lower resolutions up is text antialiasing, which tries to activate individual subpixels to look smoother. When that's scaled up to 4x the resolution, things end up looking just a little blurry. It's not a severe problem or anything, but it's not as nice as native res is.

However, it's no worse than any non-native resolution is now on your 1080p screen. That is, run something like 1680x1050 on the screen you have, and if you can live with the game text, you'll be fine expanding 1080p to 4K.

The problem, to whatever degree you think it's a problem, doesn't apply to anything but text. Regular game graphics will look about the same as they do now. The HUDs might suffer a little, but the graphics are fine. This typically means that FPSes and that sort of thing are fine, but older strategy games that don't understand scaling will tend to be the ones that are most bothersome. Example: on my 2560x1600 screen, I notice this problem the most on the Total War Shogun games, which don't scale text to higher resolution. I run them at 1280x800, and the UIs come out fuzzy. They're hardly unplayable, though, just a little blurry compared to native resolution.

Newer HUD-heavy/strategy games, the ones that know how to scale up to 4K, will be fine. And keep in mind that this type of game tends not to push video hardware as hard, so even if your card isn't that great, you may be fine driving a true 4K with games that use text very heavily.

Thanks Malor. I got the AOC monitor, and after some hiccups (had to reroute some cables since the receiver only handles up to 1080p, swap out some DVI cables, etc.) everything works smoothly now. I couldn't be happier with the image quality in 1080p, 2K, or 4K. The whites actually look white, the blacks black, well, you get the gist of it. Amazing quality for the price!

Oh, one note: if you do the scaling from 1080p to 4K on your graphics card, the latency may be lower. Most monitor scalers aren't that great.

On an NVidia card, right click the desktop and pick NVidia Control Panel off the popup. The setting you want is under Adjust Desktop Size and Position. Choose your scaling mode (I like Aspect Ratio, which allows black bars but keeps everything in correct proportion), and then choose where the scaling happens in the Perform Scaling On dropdown box. You probably want "GPU".

On AMD, I'm not sure where that setting is, but they must have something similar.

Comparing so-called "2K" or "4K" monitors to older resolutions can be confusing, because manufacturers have switched from quoting vertical resolution to horizontal, presumably because bigger numbers sound more impressive. A 2K monitor is about 2000 pixels horizontally, and a 4K monitor around 4000 horizontally, whereas resolutions like 1080p or 1440p are vertical pixel counts. So a 4K monitor is nowhere near 4x the resolution of a 1080p (unless you're going by area rather than linear size). In practice, a "2K" monitor is typically something like 2560x1440 (i.e. exactly the same as 1440p); a "4K" monitor is typically 3840x2160 or so (twice the linear resolution of 1080p, four times the pixel count).
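To put numbers on the linear-vs-area distinction, here's a quick Python sketch (just illustrative arithmetic over the common marketing resolutions mentioned above, nothing monitor-specific):

```python
# Compare common "marketing" resolutions against 1080p, both linearly and by area.
resolutions = {
    "1080p (Full HD)":  (1920, 1080),
    "1440p ('2K')":     (2560, 1440),
    "2160p ('4K' UHD)": (3840, 2160),
}

base_w, base_h = resolutions["1080p (Full HD)"]
for name, (w, h) in resolutions.items():
    print(f"{name:18} {w}x{h}  "
          f"linear: {w / base_w:.2f}x wide, {h / base_h:.2f}x tall  "
          f"area: {w * h / (base_w * base_h):.2f}x the pixels of 1080p")
```

Running that shows 4K at 2.00x the width and height of 1080p, but 4.00x the total pixels, which is exactly the linear-vs-area gap described above.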

At home I recently upgraded to a pair of 27in 4K monitors (Dell U2718Q), from a pair of 24in 1080p, and I'm very happy with them. I don't play a lot of graphically demanding AAA games though, mostly WOW and Diablo 3.

If you run 4 games at 1920x1080 in windowed borderless mode, all four will fit and display on a 4K screen with no scaling.

It is quite literally four times the screen real estate.

It's no different than square footage measurements for a room in a house. If you double the room's size in both width and length, you've quadrupled the actual floor space in the room.
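Here's a tiny Python sketch of that 2x2 tiling, assuming the usual 3840x2160 desktop; the window origins are hypothetical desktop coordinates, purely for illustration:

```python
# Four 1920x1080 borderless windows tile a 3840x2160 desktop exactly, in a 2x2 grid.
screen_w, screen_h = 3840, 2160
win_w, win_h = 1920, 1080

origins = [(x, y) for y in range(0, screen_h, win_h)
                  for x in range(0, screen_w, win_w)]
print(origins)                                               # [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]
print(len(origins) * win_w * win_h == screen_w * screen_h)   # True: no pixels left over, no scaling needed
```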

So a 4K monitor is nowhere near 4x the resolution of a 1080p (unless you're going by area rather than linear size).

You should think of it that way, because the GPU has to push 4x as many pixels as on a 1080 screen. And I'm not sure that's a linear cost increase, as I believe full-screen effects (like antialiasing) can incur more than a 4x increase in GPU load.

Because that kind of resolution makes a GPU work so much harder, you need to buy a very strong card for 4K. I suspect that even a 1080Ti is probably not really fast enough for that res; you'll be able to make it work okay, but it won't be an ideal match, and won't likely be able to provide a smooth framerate with maxed-out settings.

A single 1080Ti will struggle at high or ultra details with newer games at 4K. Certainly Witcher 3 ain’t going to max out at 4K with a single 1080Ti.

We are getting close. We will see what the next gen brings for single GPU 4K gaming.

Yeah, that's basically what I thought, that a 1080Ti would probably be just about ideal for 2560x1440 or 1600, "2.5K" res. Making the jump to 4K, even from there, is a big one, and I'm suspicious it'll take at least one more die shrink to make it happen. And even then, heat is going to be a huge issue.

Just hold out for 3 (or 5 or ??) months and get a nice new 11 series when they release.

Malor wrote:

Yeah, that's basically what I thought, that a 1080Ti would probably be just about ideal for 2560x1440 or 1600, "2.5K" res. Making the jump to 4K, even from there, is a big one, and I'm suspicious it'll take at least one more die shrink to make it happen. And even then, heat is going to be a huge issue.

For the record, I run Witcher 3 at 2560x1440 at maximum settings with no hiccups at all and no noticeable lag or slowdowns. For whatever that's worth. This is using an Asus Strix GTX 1080.

Yeah, that's roughly what I was expecting. If you cranked it up to 4K, that would be about 2.25x as many pixels, so at least that much harder to drive, possibly worse. And NVidia cards tend to be bandwidth-starved, so it might not even hit half speed at 4K.
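For what it's worth, the back-of-the-envelope arithmetic behind that ratio (assuming the usual 3840x2160 for 4K):

```python
# Pixel counts behind the 1440p -> 4K jump (illustrative arithmetic only).
qhd = 2560 * 1440           # 3,686,400 pixels
uhd = 3840 * 2160           # 8,294,400 pixels
print(uhd / qhd)            # 2.25 -> 4K pushes 2.25x the pixels of 2560x1440
print(uhd / (1920 * 1080))  # 4.0  -> and 4x the pixels of 1080p
```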

I'll tell you what has become very noticeable with the new monitor and higher resolution: initial loading times. Noticeable increases in almost all the games I play.

maverickz wrote:

I'll tell you what has become very noticeable with the new monitor and higher resolution: initial loading times. Noticeable increases in almost all the games I play.

That might actually be the patching done to OSes to mitigate Spectre and Meltdown. It badly impacted I/O speed to most devices. The newer your CPU, the less it hurts. Even as recently as Haswell (4XXX chips), it's a pretty major hit.

Oh, well, I guess that's possible. I have an i7-6700K. I don't know when those patches were deployed, but the loading times definitely changed between my old monitor and the new one. So I don't know.