Help me build my PC 2016 Edition Catch All

And as for the Vega 64 and "trading blows" with the 1080, let's be real: they're comparing it with a stock-clocked Founders Edition 1080, no doubt. Even those $509 1080s are factory overclocked.

I suspect it's a generous, optimistic take, but between Vulkan and the HBM memory, we may well see AMD shine above and beyond Nvidia simply due to architecture.

*Legion* wrote:

Everyone with a G-Sync should do:
* Enable G-Sync in the NVIDIA panel
* Enable V-Sync in the NVIDIA panel
* Set Preferred Refresh Rate to "Highest Available" in the NVIDIA panel
* Set Max Prerendered Frames to 1 in the NVIDIA panel
* Disable V-Sync within games [1]

Good overview; I'm going to have to have a bit of a play around with it. For V-Sync I've got options of On, Fast, and Adaptive - which have you found to be better?

TheGameguru wrote:

Yeah, I'm not sure about the strategy... you can get a 1080 for $515 or so, so it's not like AMD is pricing these to move. I would have liked to see the Vega 64 start at $449 to at least provide some price appeal, then climb above the 1080 in price for the liquid-cooled versions that provide more performance and perhaps creep into 1080 Ti territory at a lower price.

These seem DOA before they even hit retail... unless they're efficient for miners despite the 300 W draw. My understanding is that miners tend to go for lower-wattage cards that still deliver efficient compute performance, and just run them in parallel on a massive scale.
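The parallel-efficiency point is easy to sanity-check with quick arithmetic. A minimal sketch, where every hashrate and wattage is an invented placeholder rather than a benchmark of any real card:

```python
# Hypothetical mining-efficiency comparison. The hashrates and wattages
# below are invented placeholders, purely to illustrate the argument --
# they are not benchmarks of any real card.
cards = {
    "big 300 W card":   {"mhs": 33.0, "watts": 300},
    "small 150 W card": {"mhs": 25.0, "watts": 150},
}


def mhs_per_watt(card):
    """The metric miners actually rank cards by: hashrate per watt."""
    return card["mhs"] / card["watts"]


# Two small cards in parallel deliver 50 MH/s at 300 W, beating the big
# card's 33 MH/s at the same total draw.
for name, card in sorted(cards.items(), key=lambda kv: -mhs_per_watt(kv[1])):
    print(f"{name}: {mhs_per_watt(card):.3f} MH/s per watt")
```

With numbers like these, the smaller card wins on efficiency, which is why a big 300 W part has to be dramatically faster to interest miners at all.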

I agree with GG. I'm not impressed at all. They are playing catchup to the 1080? We got that a year ago. Was hoping for a more dramatic improvement with all the teasing.

Building an HTPC for a friend, and I'm trying to find an HTPC case with a front-mounted HDMI port for VR. I know I can get the Gigabyte 1080/Ti with the 5.25" bay that does something similar, but I already have a 1080 Ti that I want to use.

Anyone seen one that's large enough to hold a 1080 Ti with a custom fan?

Sonicator wrote:
*Legion* wrote:

Everyone with a G-Sync should do:
* Enable G-Sync in the NVIDIA panel
* Enable V-Sync in the NVIDIA panel
* Set Preferred Refresh Rate to "Highest Available" in the NVIDIA panel
* Set Max Prerendered Frames to 1 in the NVIDIA panel
* Disable V-Sync within games [1]

Good overview; I'm going to have to have a bit of a play around with it. For V-Sync I've got options of On, Fast, and Adaptive - which have you found to be better?

For G-Sync, you want it set to "On".

Adaptive V-Sync is V-Sync that is only engaged while your framerate is at or above the panel's refresh rate. This can be nice in a non-variable-refresh situation for people who want to lock in at a V-Sync'd 60 fps (assuming a 60 Hz monitor) but not have the bad framerate drop that happens if you have V-Sync on and fall below your refresh rate. Adaptive V-Sync just disables V-Sync temporarily in that scenario.
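That snap-down behavior can be sketched in a few lines. This is a simplified double-buffered model of my own (triple buffering and modern drivers soften it), just to show why the drop is so abrupt:

```python
def vsynced_fps(render_fps, refresh_hz):
    """With plain V-Sync on, a frame that misses a refresh interval waits
    for the next one, so the effective rate snaps down to refresh_hz / n
    (simplified double-buffered model; triple buffering softens this)."""
    n = 1
    while refresh_hz / n > render_fps:
        n += 1
    return refresh_hz / n


def adaptive_vsync_on(render_fps, refresh_hz):
    """Adaptive V-Sync: keep V-Sync engaged only while the GPU holds the
    panel's refresh rate; below it, disable V-Sync to avoid the snap."""
    return render_fps >= refresh_hz


print(vsynced_fps(55, 60))        # plain V-Sync snaps 55 fps down to 30.0
print(adaptive_vsync_on(55, 60))  # adaptive V-Sync disengages: False
```

So a GPU rendering at 55 fps on a 60 Hz panel gets hammered down to an effective 30 fps with plain V-Sync, which is exactly the drop Adaptive V-Sync exists to avoid.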

But for G-Sync, as I referenced in my footnotes, we want V-Sync on at all times, because some G-Sync anti-tearing capability depends on V-Sync being enabled (specifically: frame time variance compensation). This is slightly confusing, because it essentially adds a second, somewhat separate purpose to the V-Sync setting, but it is what it is.

Since the point of G-Sync is to prevent tearing, I don't see the purpose of being able to set things such that G-Sync can be enabled but still have tearing happen. And originally that wasn't even possible. But that was changed and has led to some confused users, trying to understand why they still have tearing while G-Sync'ing.

(If you Google "tearing while G-Sync enabled", you'll find lots of forum threads of people saying, "you don't need V-Sync if you stay within the G-Sync range", and other people replying, "then why am I getting tearing at 50fps?". The interaction between G-Sync and V-Sync is not at all made clear or easy to understand.)
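The interaction described above can be condensed into a toy decision table. This is my reading of the post, not NVIDIA documentation, and it only captures the headline cases (real driver behavior, like frame time variance compensation and low-framerate compensation, is more involved):

```python
def gsync_behavior(fps, refresh_hz, vsync_on):
    """Toy model of the G-Sync + V-Sync interaction described above.
    Illustrates the post's point only; actual driver behavior
    (frame time variance compensation, LFC) is more involved."""
    if fps > refresh_hz:
        # Above the G-Sync range, the V-Sync setting takes over.
        return "frames capped at refresh" if vsync_on else "tearing"
    # Within the range the panel follows the GPU, but a frame time spike
    # can still land mid-scanout unless V-Sync compensates for it.
    return "no tearing" if vsync_on else "possible tearing on frame time spikes"


# The case from those confused forum threads: 50 fps on a 144 Hz panel,
# G-Sync on but V-Sync forced off.
print(gsync_behavior(50, 144, vsync_on=False))
```

Which is why "just stay within the G-Sync range" is incomplete advice: with V-Sync off, frame time spikes can still tear even well inside the range.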

How about using the front bay to install something to pass it through? They sell card readers that also pass through HDMI. I've never used one, but here is an example:

https://www.overclockers.co.uk/ocuk-...

IMAGE(https://en.intos.de/media/image/thumbnail/0_5368_0_720x600.jpg)

*Legion* wrote:

For G-Sync, you want it set to "On".

Adaptive V-Sync is V-Sync that is only engaged while your framerate is at or above the panel's refresh rate. This can be nice in a non-variable-refresh situation for people who want to lock in at a V-Sync'd 60 fps (assuming a 60 Hz monitor) but not have the bad framerate drop that happens if you have V-Sync on and fall below your refresh rate. Adaptive V-Sync just disables V-Sync temporarily in that scenario.

But for G-Sync, as I referenced in my footnotes, we want V-Sync on at all times, because some G-Sync anti-tearing capability depends on V-Sync being enabled (specifically: frame time variance compensation). This is slightly confusing, because it essentially adds a second, somewhat separate purpose to the V-Sync setting, but it is what it is.

Since the point of G-Sync is to prevent tearing, I don't see the purpose of being able to set things such that G-Sync can be enabled but still have tearing happen. And originally that wasn't even possible. But that was changed and has led to some confused users, trying to understand why they still have tearing while G-Sync'ing.

(If you Google "tearing while G-Sync enabled", you'll find lots of forum threads of people saying, "you don't need V-Sync if you stay within the G-Sync range", and other people replying, "then why am I getting tearing at 50fps?". The interaction between G-Sync and V-Sync is not at all made clear or easy to understand.)

Gotcha, thanks.

Early report of Vega being a mining powerhouse.

Honestly, it would make the whole Vega thing make more sense.

*Legion* wrote:

Early report of Vega being a mining powerhouse.

Honestly, it would make the whole Vega thing make more sense.

Guess we know where AMD's mindset is right now. Sell some cards to gamers or sell boatloads of cards to miners.

TheGameguru wrote:

Guess we know where AMD's mindset is right now. Sell some cards to gamers or sell boatloads of cards to miners.

Does that mean prices will remain high for a while? I was hoping for price drops after the Vega release.

I was even thinking of purchasing a Vega since I have a 1440p FreeSync monitor, to replace my R9 390 and give it to my son. But the news about power draw and heat has dissuaded me.

That leaves a GTX 1080, but their prices have remained over $500. I'll bite the bullet this month either way.

JeffreyLSmith wrote:
TheGameguru wrote:

Guess we know where AMD's mindset is right now. Sell some cards to gamers or sell boatloads of cards to miners.

Does that mean prices will remain high for a while? I was hoping for price drops after the Vega release.

I was even thinking of purchasing a Vega since I have a 1440p FreeSync monitor, to replace my R9 390 and give it to my son. But the news about power draw and heat has dissuaded me.

That leaves a GTX 1080, but their prices have remained over $500. I'll bite the bullet this month either way.

Can't say right now. I would guess the best way to lock in the price is to preorder the initial batch. Coming from a 390, you might consider the Vega 56 at $399, which is rumored to outperform the 1070.

TheGameguru wrote:

Can't say right now. I would guess the best way to lock in the price is to preorder the initial batch. Coming from a 390, you might consider the Vega 56 at $399, which is rumored to outperform the 1070.

That's a good thought. I just set up an HTC Vive and it runs well on the R9 390. I'm really just upgrading to give my son my old card to build a PC around, so maybe it makes sense to save $100 now.

JeffreyLSmith wrote:

That leaves a GTX 1080, but their prices have remained over $500. I'll bite the bullet this month either way.

If you keep your eyes open, you can get them under $500. I just grabbed mine for $469, and I've been seeing a lot of them creep down into that range over at Reddit's /r/buildapcsales.

Like here's one right now, with a $510 base price and a site code that will knock $50 off it. It's listed on eBay but sold by Newegg, which is exactly like my purchase, except mine was Newegg selling through Jet. Same model as mine, too.

I wouldn't go with a 1080 if I had an existing FreeSync monitor. Vega will perform on par.

With all of this rabble-rousin' and hubba bubba in the GPU market, is now perhaps not a great time to build out a new machine from scratch? I know the loop: there's always something better coming in some fashion, so waiting for it is an instance of the Halting Problem. But given the still-pricey GPU market and the fact that the AMD offerings are not a slam dunk, might an interested PC builder do well to wait in this case?

Well, midrange GPUs are badly overpriced at the moment. You'll pay too much for too little relative performance, or you'll have to go up into the high end to find anything close to list price. This will probably be over by Christmas, so that may be a better time to build.

However, if you were going to go $500+ on the video card anyway, then there's no real reason to wait.

GTX 1060s are starting to trickle back into the market at closer to their intended price. The 6GB versions are popping back up around $275, and the 3GB versions can be found at $230 pretty easily.

So if you were looking for NVIDIA sub-$300, those options are coming back. But the $300-450 range is wrecked. GTX 1070s all appear to be going for $450-500.

Who is ready for some new CPUs?? Intel is set to announce alongside the eclipse. The bad (good?) news is that all-new motherboards are needed for these CPUs.

https://www.hardocp.com/news/2017/08...

TheGameguru wrote:

Who is ready for some new CPUs?? Intel is set to announce alongside the eclipse. The bad (good?) news is that all-new motherboards are needed for these CPUs.

https://www.hardocp.com/news/2017/08...

I was considering upgrading in order to get two more cores, but it seems like Intel doesn't want my money.

TheGameguru wrote:

Who is ready for some new CPUs?? Intel is set to announce alongside the eclipse. The bad (good?) news is that all-new motherboards are needed for these CPUs.

https://www.hardocp.com/news/2017/08...

I hope it's something innovative and interesting, and not just better clock speeds and more cores.

Delbin wrote:
TheGameguru wrote:

Who is ready for some new CPUs?? Intel is set to announce alongside the eclipse. The bad (good?) news is that all-new motherboards are needed for these CPUs.

https://www.hardocp.com/news/2017/08...

I hope it's something innovative and interesting, and not just better clock speeds and more cores.

More clock speeds and better cores. Eureka!

Probably a non-event to try to steal AMD's thunder. Promise the world, six months out, to damage your competitor today.

Pulled the trigger on the PC. Excited to do my first full build in nearly 5 years. I think once I get everything up and running I'm going to actually get an HTC Vive as well.

Carlbear95 wrote:

Pulled the trigger on the PC. Excited to do my first full build in nearly 5 years. I think once I get everything up and running I'm going to actually get an HTC Vive as well.

You do know that the Rift is currently $400 cheaper while on sale, and even after the sale ends it will still be $300 cheaper?

For reasons that would take way too long to describe, I'm not a fan of the Oculus Rift; none of those reasons are technology-based, either.

AMD Threadripper 1950X review: Better than Intel in almost every way

He's stronger about his opinion than I would be, but I mostly agree with his conclusions. The tl;dr: A high-end i9 is a little faster in games, but costs twice as much. If you literally don't care what it costs, and want the very best gaming you can get, the i9 is better. For almost everyone else, Threadripper is a compelling option for a high end desktop. If you've got a workload that can take advantage of lots of cores, TR is both much cheaper and a fair bit faster, with the caveat that you'll need to pay attention to both cooling and your power supply, as these systems take more power than we've gotten used to. (also true of the i9; this isn't a weak point, it's just what happens when you get lots of cores.)
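The price/performance argument above reduces to simple division. A quick sanity check with placeholder figures, where every price and score is a made-up stand-in rather than a number from the review:

```python
# Back-of-the-envelope value check. Every number here is a made-up
# stand-in, not a figure from the review.
chips = {
    "16-core chip at $999":   {"price": 999,  "mt_score": 3000},
    "high-end chip at $1999": {"price": 1999, "mt_score": 2800},
}

for name, c in chips.items():
    value = c["mt_score"] / c["price"]
    print(f"{name}: {value:.2f} multithreaded points per dollar")
```

With any numbers shaped like these, the cheaper chip delivers roughly twice the multithreaded performance per dollar, which is the whole of the "compelling for almost everyone else" conclusion.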

Man, this is miles from the Bulldozer era, where there were almost no reasons to buy an AMD chip. For the first time in years and years, they're compellingly better most of the time for most workloads, with the caveat that you're trading off a little gaming performance due to slightly lower per-thread throughput. Intel has been bilking us for years, and TR shows us just how bad it's gotten.

One thing I find particularly maddening is Intel's insistence on sandbagging their chip cooling by using thermal paste instead of solder. This is literally a two-cent cost difference, with ENORMOUS implications on the final performance of the chip. For this reason alone, I'd tend to avoid anything they're presently making.

I hope the chipset drivers are good. The reason I left AMD, years ago, was because I just couldn't trust their motherboards. I hope that's no longer true.

Carlbear95 wrote:

For reasons that would take way too long to describe, I'm not a fan of the Oculus Rift; none of those reasons are technology-based, either.

I'm uninterested in the Rift for the simple reason that I refuse to deal, in any way, with Facebook.

I went with the 1950X and the Asus ROG Zenith Extreme. I was impressed with Ryzen; my 1800X is my PC at work, and now the 1950X will be my main PC at home. For the first time in as long as I can remember, AMD is in both of my primary PCs.

I'll still use Intel in my HTPC, as I want the very best gaming performance for my 4K PC gaming, and honestly I can't imagine games using more than 8 threads anytime in the near future.

Yeah, for pure gaming, a quadcore Intel is probably still the best bet at the moment, giving the best performance without costing that much. (relatively speaking, anyway.) In a couple years, that may not be true anymore, but it's what you want right now.

But if you're doing anything resembling actual work with your machines, AMD's looking pretty darn good.