Help me build my PC 2017 Catch All

Thanks Robear! I'll wait one more generation then... and perhaps buy a new monitor in the interim.

That's what I'd do too. I'm planning to upgrade my monitor, and I'd like my 970 freed up to drop into my son's computer (which will probably also involve an entire rebuild at this point), but if I weren't doing either, I'd continue on with the 970 until it actually started having issues running stuff. But right now I'm on a 1080p60 monitor running every game I have at high or ultra settings.

If you're going to stay with 1080p, on the other hand, don't bother with the Ti version, unless you're thinking of an eventual upgrade. The regular 1660 will run nearly everything at 1080p amazingly well, again according to reviews.

That's the idea, future-proofing. I am currently running 2 monitors at 1080p, with my primary at 144Hz, but I will be dipping my toe into 1440p right after the GPU upgrade. Then I will hand my GTX 970 down to my sons when I build them a new gaming rig.

Anyone running 4k setups? Happy with it, or still a little too far past the bleeding edge?

Jonman wrote:

Anyone running 4k setups? Happy with it, or still a little too far past the bleeding edge?

I run a 4K OLED as my gaming PC's monitor... so far, so good. You can get pretty good 4K performance from a 2080/2080 Ti and the Radeon VII.

Downside is you won't go past 60Hz for the most part... older games you could push into the 120Hz range. The biggest issue is that there isn't much in the way of good 120Hz+ options at 4K.

TheGameguru wrote:
Jonman wrote:

Anyone running 4k setups? Happy with it, or still a little too far past the bleeding edge?

I run a 4K OLED as my gaming PC's monitor... so far, so good. You can get pretty good 4K performance from a 2080/2080 Ti and the Radeon VII.

Downside is you won't go past 60Hz for the most part... older games you could push into the 120Hz range. The biggest issue is that there isn't much in the way of good 120Hz+ options at 4K.

Good to know. I'm probably waiting out this round of upgrades anyway, my 1070 is still doing fine on my 1440p monitor. Next generation of upgrades will likely have me eyeing a 4k monitor though, and it sounds like that might be the sweet spot for cost/performance.

What the world needs are 4K displays with high refresh rate AND built-in integer-based scaling of evenly-divisible lower resolutions.

There is no excuse for 1080p and 720p content to be interpolated on a 4K display. That goes for display built-in scalers and display driver scaling both. The latter is especially egregious.

I always thought the whole point of defining 3840x2160 as the standard PC "4K" resolution was for that even divisibility of both 1080p and 720p, so that the two major previous HD resolutions would display with perfect scaling. Nope, apparently not.

If such a display existed, it would be easy to just recommend buying that one, playing perfectly scaled 1080p for now, and going native 4K when your hardware gets upgraded later.
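For the curious, the integer scaling being asked for here is trivial, which is part of why the lack of support is so frustrating. A toy sketch of the idea (pure Python, my own illustration, not any vendor's actual scaler):

```python
# Integer (nearest-neighbor) scaling: 3840x2160 is exactly 2x
# 1920x1080 and exactly 3x 1280x720, so every source pixel maps to a
# clean 2x2 or 3x3 block -- no interpolation, no blur.
def integer_upscale(pixels, factor):
    """pixels: 2D list of pixel values; factor: whole-number scale."""
    out = []
    for row in pixels:
        # Widen: repeat each pixel `factor` times across the row.
        scaled_row = [p for p in row for _ in range(factor)]
        # Heighten: repeat the widened row `factor` times.
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

# A 2x2 "image" becomes 4x4; every output pixel is an exact copy of a
# source pixel, so edges stay perfectly sharp.
src = [[1, 2],
       [3, 4]]
print(integer_upscale(src, 2))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

That's the whole trick: no filtering math at all, which is why it's strange that so few displays and drivers offer it.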

All parts are ordered minus the HDD, I'll wait to purchase that and install it later. Decided to get the MSI Nvidia 2060 instead of the 1060. Since the 2060 only has HDMI and Displayport connections, I ordered 2 ASUS 28" 4k refurb monitors. My friend has those and highly recommended them, and he got refurb ones as well and never had an issue.

LG leaked their upcoming OLED lineup, and there was a 43" model which should feature VRR and HDMI 2.1, making it a fantastic gaming monitor. The 55" TV I use now is just too big, even for my largish desk.

I bought an AOC U2777PQU 4K monitor almost a year ago, and am very happy with it given my use case and budget.

I paid €440 for the screen, a 27" IPS 4K with 60Hz max. My use case:

- Almost never any twitch gaming, so I care less about refresh rate. Assassin's Creed Odyssey runs at 2K with very high (not max) detail on an AMD 2700X with an RX 580 card just fine, except in a few edge cases.

- I really really like a screen that looks nice, and this one reviewed as a monitor with great color reproduction. (still sRGB and not AdobeRGB, but it does look nicer than any screen I ever owned before). When I work from home, it's nice to have the desktop real estate and something that's easy on the eyes. It looks waaaay better than my monitors at work, that's for sure.

I play other more 'static' recent games like CIV VI in 4K, or older shooters like Borderlands 2. The PS4 in 1080P looks great too, in my opinion, on the same screen. So does the digital TV decoder, the Switch and SNES Classic when connected.

I'm definitely not saying this monitor is good for everyone, and everything Legion and Gameguru say about refresh rates and resolution upscaling etc. are probably true, but I just wanted to put my experience out there as well.

mrtomaytohead wrote:
WipEout wrote:

What is this "Op-tickle drive" of which you speak?

It's what I use to take my media from those round discs to my computer and make sure I get the audio and visual quality level I want. Also, the only way to get a cupholder installed on your desktop computer.

Let's be honest, we're all in here talking about building desktop PCs like it's some sort of normal thing people do these days. We're just like drivers who only drive manual transmission cars.

That's not a bad analogy, actually, because we end up with machines that cost less and run faster, but take more attention. It breaks down a little, in that self-built machines are just as easy to use on an ongoing basis as premades are, but you do have the higher time investment up front, both in figuring out your parts and then assembling them.

It's probably lower skill than what a modern mechanic does, though. Building a computer these days probably isn't as hard as assembling a medium-size LEGO set. I don't really work on cars, but it sure as heck looks complex.

*Legion* wrote:

What the world needs are 4K displays with high refresh rate AND built-in integer-based scaling of evenly-divisible lower resolutions.

There is no excuse for 1080p and 720p content to be interpolated on a 4K display. That goes for display built-in scalers and display driver scaling both. The latter is especially egregious.

I always thought the whole point of defining 3840x2160 as the standard PC "4K" resolution was for that even divisibility of both 1080p and 720p, so that the two major previous HD resolutions would display with perfect scaling. Nope, apparently not.

If such a display existed, it would be easy to just recommend buying that one, playing perfectly scaled 1080p for now, and going native 4K when your hardware gets upgraded later.

What I often do on this 1600p monitor is run at quarter-res, 1280x800. This works well for games that don't self-scale. Doing the same thing should be easy on a 4K screen, just using 1080p as the base instead.

Now, if the display is taking that and messing it up, well, screw that. This is an old HP ZR30W, which explicitly doesn't have a scaler, to reduce lag to the minimum amount possible. Well, it has a really primitive one that will scale quarter-res only, but I don't use it. I just scale on the video card, which is fine.

NVIDIA supports integer scaling in Linux, and there are reports of support in Intel's Linux driver, but that's about it as far as driver-based support goes.

Dumb question, you guys.

So my monitor spec says: WQHD (2560×1440) resolution @ 60Hz

Steam's on-screen FPS counter often goes above 60 Hz.

What's happening then? Is the monitor just discarding the extra frames?

Jonman wrote:

Dumb question, you guys.

So my monitor spec says: WQHD (2560×1440) resolution @ 60Hz

Steam's on-screen FPS counter often goes above 60 Hz.

What's happening then? Is the monitor just discarding the extra frames?

Sort of? The results tend to be inconsistent, as it doesn't always happen the same way. One example of an issue it can cause is screen tearing, when the framerate exceeds the display's refresh rate. GSync and FreeSync combined with high-refresh-rate panels are basically designed to eliminate a lot of those weird issues.

Jonman wrote:

What's happening then? Is the monitor just discarding the extra frames?

To (completely unnecessarily) expand on Thin_J's response:

The monitor itself has no concept of "frames".

The video card has a couple of chunks of memory we call framebuffers, let's call them A and B. It draws a frame in framebuffer A, and sets a pointer to it that says "this is the current image". It then starts drawing the next frame in B. When it finishes the one in B, it changes the "current image" pointer to point at B, and then starts drawing the next frame in A. It goes back and forth, so one buffer always has the current frame, while the other one is used to draw the next frame.

The monitor draws whatever image is in the framebuffer that's currently being pointed at by that pointer. So for a 60Hz monitor, 60 times a second, it's going to reach over to the video card, get pointed to the current framebuffer, and draw that image. Whatever image that "current image" pointer is pointing at, that's what the monitor is going to draw. It's unaware of anything else behind the scenes that makes that image happen. It just says to the video card, "point me to whichever block of memory has the image that I'm supposed to draw, and Imma go draw it."

If your game is running at 60 frames per second, then this typically means that every time your 60Hz monitor goes to draw whatever the "current image" pointer is pointing at, it will be a new frame.

If your game is running below 60 frames per second, then sometimes when the monitor goes to draw whatever the "current image" pointer is pointing at, it will be the same frame as last time.

OK, so both of those are pretty simple, right, which now leads us to your question: what happens when the game is running *above* 60fps? And that can go a few ways.

So you know how the video card flips between the two framebuffers as it draws frames. Now consider what happens if it makes that flip *while* the monitor is in the middle of drawing the current frame (because the process of drawing the "current image" onto the monitor is not instantaneous). Suddenly, the pointer is pointing at a completely different frame. This is where screen tearing happens: the monitor starts drawing what was in framebuffer A and ends up finishing the image with what's in framebuffer B, because the video card changed the pointer halfway through the monitor's drawing process.

Now, depending on how high the framerate is, the video card could render multiple frames in between each cycle of the monitor. 60Hz = 60 updates a second = the monitor comes back for a new image to draw every 16.67 milliseconds. The video card could possibly have drawn a frame into A that the monitor got on its last cycle, then drawn the next frame into B, and a third one back into A again before 16.67 ms have passed and the monitor comes knocking again. In that case, the frame drawn into B is never shown on the monitor, as it was already replaced before the monitor's next update.
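To put numbers on that pointer-flipping, here's a toy simulation (my own sketch, not how any real driver works): at each monitor refresh, just show whichever frame finished most recently.

```python
# Toy model of the "current image" pointer. At the monitor's k-th
# refresh (time k / refresh_hz), the newest completed frame is
# floor(k * render_fps / refresh_hz). In the output list, repeated
# numbers are frames shown more than once (game slower than the
# monitor); skipped numbers are frames that were rendered but never
# displayed (game faster than the monitor).
def simulate(render_fps, refresh_hz, duration_s=1):
    return [tick * render_fps // refresh_hz
            for tick in range(duration_s * refresh_hz)]

print(simulate(120, 60)[:5])  # [0, 2, 4, 6, 8] - odd frames never shown
print(simulate(30, 60)[:5])   # [0, 0, 1, 1, 2] - each frame shown twice
print(simulate(60, 60)[:5])   # [0, 1, 2, 3, 4] - one new frame per refresh
```

Same story as above: at 120fps on a 60Hz monitor, half the rendered frames get overwritten before the monitor ever comes knocking for them.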

Hopefully that wasn't too garbage of an explanation.

No, those were excellent explainers, thanks Legion and Thin_J.

This is the problem that Vsync (Vertical Sync) solves: the game renders a new frame into the inactive buffer, but waits for the monitor to reach the bottom of the screen before marking the new image active. You never get tearing with Vsync.

There are two main problems with that approach, one minor and one potentially major. First, you're limiting yourself to your monitor's frame rate, which can reduce responsiveness in some games. TF2, for instance, always felt awful with VSync on to me, laggy and annoying to control. Turn it off, and it was crisp and fast, albeit with occasional screen tears. It's not that noticeable in most games, but particularly with games that are running at very high internal frame rates, you can sometimes feel the difference.

The major problem crops up when the computer is just a little bit too slow. If it doesn't quite finish a frame in time, and spills into the next frame's timeslice even the tiniest bit, then it pauses for the entire next frame. Taking 1% of the next frame's time means wasting the other 99% waiting for the vertical sync.

This instantly reduces your frame rate to a whole-number divisor of the refresh rate... if the monitor's at 60Hz, but the game is only fast enough to render 59 frames per second, the actual frame rate instantly becomes 30fps (the game draws a frame a bit too slow, waits out a refresh, draws a frame a bit too slow, waits again, and so on). It stays at 30fps as long as the engine is rendering at 59fps or slower. If the game drops to 29fps, the actual refresh drops again to the next divisor, 20fps. And if a game is right on the cusp between 59 and 60, rendering some frames fast enough but not quite finishing others, the frame rate bounces between 30 and 60. This is "visual stuttering", and it's incredibly annoying.
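The divisor behavior reduces to one line of math. Back-of-envelope sketch (my own illustration, not measured data):

```python
import math

# Under Vsync with double buffering, a frame that misses a vblank
# waits for the next one, so each frame occupies a whole number of
# refresh intervals: ceil(refresh_hz / render_fps). The effective
# frame rate is the refresh rate divided by that count.
def vsync_fps(render_fps, refresh_hz):
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals

print(vsync_fps(59, 60))   # -> 30.0: barely too slow, half the rate
print(vsync_fps(29, 60))   # -> 20.0: next divisor down
print(vsync_fps(144, 60))  # -> 60.0: capped at the refresh rate
```

Note the cliff: rendering at 59fps gets you 30, and 61fps gets you a full 60. That one-frame-per-second difference is exactly the "cusp" stutter described above.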

So you want to avoid Vsync when the game is running slower than the refresh rate, even a tiny bit. It's most useful for games that run at crazy frame rates internally, like 150Hz or higher, usually because they're old and running on hardware that's much more powerful than they expect. Turning Vsync on reduces CPU load dramatically, and will usually improve battery life and noise levels. But then that first issue can crop up, where the game doesn't react as quickly to your inputs.

GSync and AMD's FreeSync try to fix the stutter problem by changing the monitor's refresh to run at the speed of the game, on a frame-by-frame basis. With one of the Sync technologies enabled, the monitor starts drawing when the game tells it to, so it can update at exactly, say, 58 frames per second without dropping frames and without tearing. It doesn't fall back to 30fps, it renders an actual 58 frames each second.

But! You gotta read the fine print, because some monitors will only honor certain frame rates. I haven't spent much time analyzing this, but I've gotten the impression that monitors that will sync correctly to any frame rate below their maximum are rare. Some have a very narrow band of acceptable refresh rates, sharply limiting the usefulness of the feature.

Big changes coming to HardOCP and their forums. While I certainly didn't visit as much as I used to (partially the amount of garbage posters, but mostly my overall malaise with PC building/gaming and the community), I still found HardOCP a useful place for reviews (especially Brett, their GPU and motherboard reviewer). I could sometimes get good assistance with weird bleeding-edge bugs, and there were like-minded enthusiasts I could ping.

But it looks like it's all coming to an "end" as Kyle is going to Intel.

https://www.hardocp.com/article/2019...

Kyle Bennett will be taking on new challenges very soon with Intel working as its Director of Enthusiast Engagement.

That job title is taken right out of comedy.

Never really read the site, but I've used the forum quite a bit, when looking around at what hardware is getting recommended whenever it was time for a new PC build. I can't imagine it is a good community (it is on the internet after all), but beggars can't be choosers I guess.

So now Anandtech and Tom's are owned by the same company, and HardOCP is mothballed. Apple got Anand, Intel got Kyle, but who wanted Tom?

Tom's was sold off over 10 years ago; I don't think the original guy has been involved in a long time.

LeapingGnome wrote:

Tom's sold off over 10 years ago, I don't think the original guy has been involved in a long time.

That was tongue-in-cheek, though not done well, as usual. Gotta maintain my standards!

TheGameguru wrote:

Big changes coming to HardOCP and their forums. While I certainly didn't visit as much as I used to (partially the amount of garbage posters, but mostly my overall malaise with PC building/gaming and the community), I still found HardOCP a useful place for reviews (especially Brett, their GPU and motherboard reviewer). I could sometimes get good assistance with weird bleeding-edge bugs, and there were like-minded enthusiasts I could ping.

But it looks like it's all coming to an "end" as Kyle is going to Intel.

https://www.hardocp.com/article/2019...

Ugh, one of the few PC hardware review places worth visiting. On top of that, pretty much the only place with the guts to call NVIDIA on their crap. Not going to be happy with this.