Help me build my PC 2017 Catch All

HardOCP's first sentence there pretty much wraps it up: "If you were waiting for huge IPC gains out of the new Coffee Lake CPU from Intel, you might be waiting for a very long time." It just isn't moving the needle very much. If you were already inclined to go with Ryzen over Kaby Lake, Coffee Lake doesn't appear to offer much to warrant changing that.

Agreed. For most people, the AMD chip seems like the most cost-effective way to go. Not that the Coffee Lake chips aren't good, but unless you're trying to squeeze out maximum performance, they don't seem to be as good a value.

How do they compare on power consumption / heat?

So a Western Digital Black 6TB is coming in today. Download all the things

Question - I'm starting to look at GTX 1080 builds. My current motherboard can only take a CPU up to a 3770 (which I have in it right now), so I'm at the end of upgrading my current rig without just buying a completely new one.

For my current GPU, I'm rocking either a GTX 1060 or 1070 card right now (can't remember off the top of my head). It's the 6 GB version of whichever, though.

So, the question: would it be more feasible/more bang for my buck to just put a 1080 in the current mobo I have, since the gains I'd get by ALSO upping to an i7-7700 CPU+mobo combo wouldn't give as much return on my investment? I hear advances in CPUs have really dwindled over the last few years, but I haven't bought a new CPU in five, so I'm curious what the experts think.

The GPU is absolutely going to be where your dollar makes more difference in performance.

As someone who's done the one-year-to-18-month system upgrades and gone from a 3770K to a 4790K to a 6700K... things haven't moved all that much. The biggest updates have been to other system features: things like M.2 slots on the motherboards, updated USB standards, and all that.

I'm on a Ryzen 1800X right now, which is technically an inferior processor for gaming compared to the 6700K system I have sitting dormant, but since I game at 1440p, where the vast majority of games are far more GPU-limited than CPU-limited, the difference is usually under 10fps or so. And I gain all the utility of 8 cores/16 threads from the 1800X.

Ok, so is moving up to the GTX 1080 pretty much a clean upgrade? Just wipe the drivers, remove the old card, put in the new one, and then install drivers? I have a PSU that I think is 600W, so I imagine that is more than enough. Thanks for the help so far!

You probably don't even need to do the driver wipe. You can, just for a nice clean "reset", but in all likelihood you could just stick your 1080 in there and go.

When you go to the NVIDIA page and put in your exact video card to do a driver download, it's all smoke and mirrors. You download the same EXE that everyone else with a (modern) NVIDIA card does.

I did just that. I took out my 970 and placed a 1080 ti in its place. Booted fine and everything.
*Shoulder Shrug*

JohnKillo wrote:

I did just that. I took out my 970 and placed a 1080 ti in its place. Booted fine and everything.
*Shoulder Shrug*

Yeah, the reason that the driver download has ballooned up to over 400MB is, well...

IMAGE(https://i.imgur.com/aOMFXGl.png)

You install it and your system has drivers for basically everything NVIDIA going back quite a ways.

Well.. that and all the extra stuff Nvidia installs with their "base" driver package.

TheGameguru wrote:

Well.. that and all the extra stuff Nvidia installs with their "base" driver package.

Is it worse than AMD though? I think one of the reasons I switched from an AMD GPU to Nvidia after many years of loyalty was their asinine driver installers that constantly tried to install spyware on my computer.

MoonDragon wrote:
TheGameguru wrote:

Well.. that and all the extra stuff Nvidia installs with their "base" driver package.

Is it worse than AMD though? I think one of the reasons I switched from an AMD GPU to Nvidia after many years of loyalty was their asinine driver installers that constantly tried to install spyware on my computer.

AMD has cleaned up that nonsense dramatically. You're right that there used to be a good bit of garbage included in their driver package. Part of it was that they were using third-party software to provide some of the features NVIDIA does with their home-brewed software (e.g. Shadowplay). AMD has since created its own in-house software for those features (ReLive is their Shadowplay equivalent), so all that Raptr and Plays.tv stuff has been kicked to the curb.

At this point, I think AMD's package is a little less bloaty than NVIDIA's, but that wasn't the case before when they were installing Plays.tv and "Gaming Evolved" and junk.

What would be the best video card to pair with this monitor?

I currently have a cheapo 1080p 144hz TN with a 1060.

FreeSync, so you're going to want AMD. At that resolution, you need one of the Vega GPUs. An RX 580 wouldn't be enough to push 1440p, and especially not 21:9 1440p.

DigitalFoundry's 1440p Vega benchmarks can be seen here:

You'll have to knock some off those numbers for ultrawide performance, as going from 2560x1440 to 3440x1440 adds about a third to the pixel count.
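Back-of-the-envelope, the scaling looks something like this (a rough sketch assuming the benchmarks were run at 2560x1440, the monitor is a 3440x1440 ultrawide, and fps drops roughly in proportion to pixel count, which is a simplification):

```
# Rough sketch: discounting 2560x1440 benchmark numbers for a 3440x1440
# ultrawide, assuming fps scales roughly inversely with pixel count
# (a simplification -- real scaling varies by game and by bottleneck).

bench_pixels = 2560 * 1440    # 3,686,400 -- resolution the benchmarks were run at
target_pixels = 3440 * 1440   # 4,953,600 -- 21:9 ultrawide

ratio = target_pixels / bench_pixels   # ~1.34, i.e. about a third more pixels

def estimated_ultrawide_fps(bench_fps):
    """Naive estimate: fps drops in proportion to the extra pixels."""
    return bench_fps / ratio

print(f"Pixel increase: {ratio - 1:.0%}")                                # ~34%
print(f"90 fps at 2560x1440 -> ~{estimated_ultrawide_fps(90):.0f} fps")  # ~67 fps
```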

I would go with the Vega 64. It's still pretty much just reference boards at this point, so if you mind the blower style, you may need to wait a little longer (or spring for the water cooled one).

I suppose if you don't care about FreeSync, you could get a 1080 or a 1080ti. Though if it were me, I wouldn't spend $500+ on a monitor and not get the benefit of some form of adaptive sync out of it.

Yeah, that's the debate here. There's just nothing else like this at this price point and quality.

Yeah if you were to go NVIDIA and G-Sync on a 34" 1440p ultrawide, you're looking at $1000-1200 for the monitor alone.

For that much you can get that monitor and the Vega 64 both.

Would getting that monitor with FreeSync and a 1080 Ti be that much of a dumb move? It seems to be the best bang for the buck monitor-wise. Not sure G-Sync/FreeSync is that big of a deal to begin with.

Edwin wrote:

Not sure G-Sync/FreeSync is that big of a deal to begin with.

I mean I can only share my opinion, and my personal opinion is that adaptive sync is essential and I won't ever tolerate not having it again. Eliminating tearing and reducing stutter is what you pay that kind of money for IMO. It's become non-negotiable for me the same way high refresh rate has.

I'd rather have 90fps being drawn on screen in sync with the GPU than have 110fps being drawn out of sync with the rendering pipeline.

I'm sure other people feel differently, but I spent decent money on an AMD card and FreeSync monitor, and then quite a bit of money on an NVIDIA card and G-Sync monitor upgrade, specifically because once I tasted adaptive sync, there was no going back.

I'm very attached to my G-Sync as well, and for much the same reasons. I suppose whether that's worth it comes down to how much stuttering/tearing bothers you.

The reason FreeSync monitors are cheaper is that a lot more people have Nvidia cards and G-Sync is way more in demand.

No direct experience, but if you can believe forumgoers (not always the case anymore), G-Sync is much more tightly controlled by Nvidia and tends to work much better. Whether it's enough better to justify the extra money, I don't know.

Well, in fact, I don't even know that it's better. It's getting hard to discern the truth about things with the number of shills that are constantly marketing these days.

From what I understand, G-Sync is a proprietary technology that requires specialized hardware from NVIDIA in the actual monitor (ergo it costs more), while FreeSync is built on a royalty-free industry standard for the DisplayPort and HDMI video protocols, supported by AMD video cards.

Yes, G-Sync requires an additional hardware component inside the monitor itself. There was a DIY G-Sync add-on kit you could buy a few years ago, but I think it was only ever compatible with one specific model of Asus monitor.

That said, it's not like Nvidia is selling those G-sync boards to the monitor manufacturers at cost.

In my experience, FreeSync is pretty much just as good as G-Sync in terms of normal operating performance, at least based on the one of each that I've owned.

The one main area where FreeSync can stumble is in the operating "range" that any given panel supports. Some manufacturers have put out FreeSync panels with narrow FreeSync ranges. Ones where the range doesn't even go to the maximum refresh rate of the panel are particularly egregious.

The panel Edwin's looking at has a FreeSync range of 49-100Hz. The 100Hz is good, as the panel maxes out at 100. The 49 on the low end is a bit higher than ideal; it means 48fps and below wouldn't activate FreeSync on its own. But since the top end of the range is 2x the bottom end, Low Framerate Compensation should kick in at that point to frame-double the output, pushing it back into FreeSync territory and keeping adaptive sync working. (AMD's slides on LFC say a 2.5x range is needed, but AMD has since tweaked LFC to work in only a 2x range.)
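To make the LFC behavior concrete, here's a minimal sketch of the idea (simplified, not AMD's actual driver logic; the 49-100Hz numbers are this panel's range from above):

```
# Simplified illustration of Low Framerate Compensation (LFC) -- not AMD's
# actual driver logic. When the frame rate falls below the panel's FreeSync
# floor, the driver repeats each frame an integer number of times so the
# effective refresh rate lands back inside the supported range.

FREESYNC_MIN_HZ = 49    # floor of this panel's FreeSync range
FREESYNC_MAX_HZ = 100   # ceiling of this panel's FreeSync range

def lfc_refresh(fps):
    """Return (frame multiplier, effective refresh rate) for a given frame rate."""
    if fps >= FREESYNC_MIN_HZ:
        return 1, fps                      # already in range, no compensation
    multiplier = 1
    while fps * (multiplier + 1) <= FREESYNC_MAX_HZ:
        multiplier += 1                    # double, triple, etc. each frame
    return multiplier, fps * multiplier

for fps in (30, 40, 48, 60):
    mult, hz = lfc_refresh(fps)
    print(f"{fps} fps -> show each frame {mult}x, panel refreshes at {hz} Hz")
# 30 fps -> 3x (90 Hz), 40 fps -> 2x (80 Hz), 48 fps -> 2x (96 Hz), 60 fps -> 1x
```

This is also why the range needs to be roughly 2x or wider: with a narrower range, frame rates just below the floor couldn't be multiplied back into it.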

Note that FreeSync didn't originally have LFC at all. FreeSync has required some updates to reach parity with G-Sync, which is why some older stuff out there may proclaim the superiority of G-Sync.

I'm pretty sure I was reading about ghosting problems on FreeSync monitors, though, which apparently didn't happen on GSync; the explanation I saw at the time was that the custom NVidia hardware did a better job of driving the screen. Is that still a thing?

G-Sync needs to die... especially with HDTVs soon to be shipping with adaptive refresh rate in the HDMI specification. We need one adaptive refresh rate that both consoles and PCs can use, and one that isn't based on a locked-in proprietary spec.

TheGameguru wrote:

G-Sync needs to die... especially with HDTVs soon to be shipping with adaptive refresh rate in the HDMI specification. We need one adaptive refresh rate that both consoles and PCs can use, and one that isn't based on a locked-in proprietary spec.

I couldn't agree more!

TheGameguru wrote:

G-Sync needs to die... especially with HDTVs soon to be shipping with adaptive refresh rate in the HDMI specification. We need one adaptive refresh rate that both consoles and PCs can use, and one that isn't based on a locked-in proprietary spec.

If Nvidia announced tomorrow that they were dropping G-Sync and making all of their cards compatible with FreeSync displays instead... they could basically drive AMD out of the consumer GPU market completely. Right now, apart from the few hardcore AMD fans and those who feel they need to buy AMD on principle just to prevent an Nvidia monopoly, the primary selling point of AMD cards is that FreeSync displays are cheaper than G-Sync ones.