Help me build my PC 2017 Catch All

The fact that they don't says there's more money to be made selling G-Sync modules than there is in taking AMD's market share.

Middcore wrote:
TheGameguru wrote:

G-Sync needs to die... especially with HDTVs soon to be shipping with the adaptive refresh rate in the HDMI specification. We need one adaptive refresh rate that both consoles and PCs can use, and one that isn't based on a locked-in proprietary spec.

If Nvidia announced tomorrow that they were dropping G-Sync and making all of their cards compatible with FreeSync displays instead... they could basically drive AMD out of the consumer GPU market completely. Because right now, apart from the few hardcore AMD fans and those who feel they need to buy AMD on principle just to prevent an Nvidia monopoly, the primary selling point of AMD cards is that FreeSync displays are cheaper than G-Sync ones.

Nvidia has had the lion's share of the market since well before G-Sync existed. We all think the market revolves around enthusiasts, but the bulk of the GPUs sold are mid-range and embedded.

TheGameguru wrote:
Middcore wrote:
TheGameguru wrote:

G-Sync needs to die... especially with HDTVs soon to be shipping with the adaptive refresh rate in the HDMI specification. We need one adaptive refresh rate that both consoles and PCs can use, and one that isn't based on a locked-in proprietary spec.

If Nvidia announced tomorrow that they were dropping G-Sync and making all of their cards compatible with FreeSync displays instead... they could basically drive AMD out of the consumer GPU market completely. Because right now, apart from the few hardcore AMD fans and those who feel they need to buy AMD on principle just to prevent an Nvidia monopoly, the primary selling point of AMD cards is that FreeSync displays are cheaper than G-Sync ones.

Nvidia has had the lion's share of the market since well before G-Sync existed.

Yes, they did. FreeSync is the only thing allowing AMD to hold on to the scraps they have. That's my point.

I doubt it. AMD has both Sony and Microsoft, and has several OEM contracts for GPUs (notably Apple). I think you overstate the size and importance of the enthusiast market.

TheGameguru wrote:

I doubt it. AMD has both Sony and Microsoft, and has several OEM contracts for GPUs (notably Apple). I think you overstate the size and importance of the enthusiast market.

I said the consumer GPU market, i.e. cards sold directly and separately to consumers. I make no assertions about its size or importance, and I'm not saying AMD would fold as a company without it or anything. I'm just saying FreeSync is the only thing keeping them in the game there.

In terms of GPUs inside pre-builts, AMD's market share is still dwarfed by Nvidia's. The Apple contract is nice for AMD, but I suspect that computers with a discrete GPU (upper-end iMacs and Mac Pros) are a pretty small fraction of Apple's sales.

AMD would still have a very nice revenue stream making graphics hardware for the console manufacturers, it's true.

How long will Apple keep AMD around with Apple starting to do their own GPU design for phones?

Middcore wrote:
TheGameguru wrote:

I doubt it. AMD has both Sony and Microsoft, and has several OEM contracts for GPUs (notably Apple). I think you overstate the size and importance of the enthusiast market.

I said the consumer GPU market, i.e. cards sold directly and separately to consumers. I make no assertions about its size or importance, and I'm not saying AMD would fold as a company without it or anything. I'm just saying FreeSync is the only thing keeping them in the game there.

In terms of GPUs inside pre-builts, AMD's market share is still dwarfed by Nvidia's. The Apple contract is nice for AMD, but I suspect that computers with a discrete GPU (upper-end iMacs and Mac Pros) are a pretty small fraction of Apple's sales.

AMD would still have a very nice revenue stream making graphics hardware for the console manufacturers, it's true.

So FreeSync is the only reason anyone buys an AMD GPU? Got it. Let me tell the bitcoin mining company down the hall why they bought thousands of 470/480/570s.

TheGameguru wrote:
Middcore wrote:
TheGameguru wrote:

I doubt it. AMD has both Sony and Microsoft, and has several OEM contracts for GPUs (notably Apple). I think you overstate the size and importance of the enthusiast market.

I said the consumer GPU market, i.e. cards sold directly and separately to consumers. I make no assertions about its size or importance, and I'm not saying AMD would fold as a company without it or anything. I'm just saying FreeSync is the only thing keeping them in the game there.

In terms of GPUs inside pre-builts, AMD's market share is still dwarfed by Nvidia's. The Apple contract is nice for AMD, but I suspect that computers with a discrete GPU (upper-end iMacs and Mac Pros) are a pretty small fraction of Apple's sales.

AMD would still have a very nice revenue stream making graphics hardware for the console manufacturers, it's true.

So FreeSync is the only reason anyone buys an AMD GPU? Got it. Let me tell the bitcoin mining company down the hall why they bought thousands of 470/480/570s.

Again, crypto-currency mining companies are not the consumer GPU market.

(The number of individuals mining is not to be discounted, but the vast majority of mining GPU sales go to companies like the one you mentioned, especially in China, which have farms with literally thousands of cards. Miner sales have no doubt given AMD a welcome short-term cash infusion, but the mining market is too unpredictable to be counted on in the long term - anyway, while interesting, this is a separate discussion.)

For people who buy GPUs to play games on their PCs and aren't committed to one company for reasons of ideology or pure irrational fanboyism, FreeSync is the only meaningful argument for AMD.

And AMD knows it, which is why they tried to market their newest Vega cards as part of "bundles" with FreeSync monitors: "Our card isn't faster, cheaper, cooler, or less power-hungry than the competition's, and it came out way late. BUT if you consider the price of a card and a monitor together, go with us and you'll save a couple hundred bucks."

I would suggest the 56 matches up nicely in its price segment (if not for miners) and is every bit as viable a gaming GPU as Nvidia's. You seem to have a negative view of what AMD is doing, but between their forward thinking on DX12 and Ryzen and Threadripper, they are a viable alternative to the Nvidia and Intel monopolies. You just have to drown out the fanboyism and marketing plants.

Vega 56 is a good competitor to the GTX 1070 but only came out after the 1070 had already been on the market for over a year.

On the CPU front, AMD has made amazing strides this year and given Intel a kick in the pants that was badly needed.

On the GPU front, they are treading water.

The Vega 56 is so good per dollar that NVIDIA is conjuring up a 1070ti to provide a head-on competitor, as the 56 falls into that range between the 1070 and 1080.

*Legion* wrote:

The Vega 56 is so good per dollar that NVIDIA is conjuring up a 1070ti to provide a head-on competitor, as the 56 falls into that range between the 1070 and 1080.

Sure. That's the problem with AMD taking so long to get their competitor out. It was always going to be edged out again pretty shortly after by a new Nvidia release.

Most people thought Vega would at least get a few months of breathing room before Nvidia releases their Volta cards in 2018. But no, here comes the 1070 ti.

AMD is at risk of falling a generation behind and needing the equivalent of a "Ryzen, but for GPUs" type of leap to get caught up.

Edit: Meant 2018 for Volta not 2019

Honestly, if Gsync genuinely works better, I'd rather have that. A standard would be better, but it should be a standard that can match the proprietary solution, and it sounds like FreeSync can be pretty hit-or-miss with some monitors and manufacturers.

Middcore wrote:
*Legion* wrote:

The Vega 56 is so good per dollar that NVIDIA is conjuring up a 1070ti to provide a head-on competitor, as the 56 falls into that range between the 1070 and 1080.

Sure. That's the problem with AMD taking so long to get their competitor out. It was always going to be edged out again pretty shortly after by a new Nvidia release.

Most people thought Vega would at least get a few months of breathing room before Nvidia releases their Volta cards in 2018. But no, here comes the 1070 ti.

The 1070ti isn't something that's going to crush Vega 56, though. There's not enough room in terms of price and performance between the 1070 and the 1080 to do anything drastic. 1070s are still selling well over MSRP, while the 1080 can pretty easily be found at MSRP, so the space that should exist there is a lot tighter. The 1070ti could be slightly cheaper and slightly faster than Vega 56, but nothing that's going to make Vega 56 nonviable.

Malor wrote:

I'm pretty sure I was reading about ghosting problems on FreeSync monitors, though, which apparently didn't happen on GSync; the explanation I saw at the time was that the custom NVidia hardware did a better job of driving the screen. Is that still a thing?

Any benefit G-Sync has in terms of blurring is pretty much due to restricting the technology to high end monitors, which bring high quality panels to the table. Early FreeSync panels were particularly trash.

Also, some earlier panels had issues with using FreeSync with pixel overdrive enabled on the monitor, and pixel overdrive is important for blurring/ghosting. But there again, it's a function of the panel, not the adaptive sync tech, and any FreeSync monitor you get now shouldn't have any problem operating alongside pixel overdrive.

G-Sync monitors come with ULMB backlight strobing; however, enabling G-Sync disables ULMB. They cannot be used together (outside of some hacks to enable it on certain panels, and even then, it tends to not be worth doing). So if you plan on actually using the G-Sync part, ULMB doesn't help you any. (If you don't care about G-Sync but want to reduce blurring as much as possible, buying a G-Sync monitor and using ULMB instead of G-Sync would be the way to do that.)

Note: "G-Sync" on laptops is actually VESA Adaptive Sync (ie. FreeSync), they don't use the module in laptops. But it's still good enough to call it "G-Sync" there.

Also on that topic: kids, when you buy new monitors, make sure you enable pixel overdrive, and figure out what setting it should be on. In my experience it's usually one of the middle settings that reduces blurring without introducing artifacts.

It remains to be seen exactly how prevalent HDMI 2.1 ends up being and how widely it gets adopted on monitors, but I have to hope that it ends up replacing both G-Sync and FreeSync (though I suspect that for a good while monitors will support both FreeSync over DisplayPort and "Game VRR" over HDMI).

One hopes that we can eventually settle on HDMI 2.1 as the universal spec for both monitors and HDTVs... it will have plenty of bandwidth, that's for sure. Though I'm sure some PC diehards will still want above-120Hz refresh rates.

I enjoy my G-Sync displays, but I've used FreeSync and it seems to work fine... hopefully Game VRR ends up being good enough as well. The Xbox One X supports it, and rumors are that LG's 2018 OLED lineup will as well, so hopefully early next year I can get a new TV and try it out (depends on how many games are early adopters... probably not many).

Though I'm sure some PC diehards will still want above-120Hz refresh rates.

I did a lot of playing around with refresh rates in the 90s, back when you could do that sort of thing, and what I found was that there wasn't much point, at least in terms of screen stability on a CRT, in going past 85Hz. I could still see a difference up to about 80Hz, but 85 seemed absolutely rock-solid to me, no matter how I looked at the screen. (The edges of your eyes are more refresh-rate sensitive than the center.) 50Hz was painful, 60 was glaringly obvious, 72Hz was visible if I looked sideways, 80 was occasionally perceptible in just the right lighting, but 85 was stable from every angle.

The big advantage to 120Hz is that it divides evenly by so many other common rates (24, 30, and 60, at least). The advent of the Syncs makes that less important, but it's still attractive, because integer frame multiples work with pretty much anything.

120Hz has very short latency: 4.2ms average, 8.3ms maximum, at least assuming that the panel lives up to the advertised refresh rate. I don't see much additional benefit to 144 from there; 3.5/6.9ms is better, but 0.7/1.4ms is such a minor difference that I don't think even the gnarliest of pro gamers would benefit much from it. And I'm quite unsure about whether the screen tech can even truly keep up with 120, never mind 144.
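If it helps to see where those numbers come from, here's a quick back-of-the-envelope sketch in plain Python (purely my own illustration, not anything from the posts above): the maximum wait for the next refresh is one full frame time (1000/Hz ms), and the average wait is half of that. It also checks the "divides evenly by 24/30/60" point from the previous paragraph.

```python
# Back-of-the-envelope frame-time math for the refresh rates discussed above.
# Max wait for the next refresh = one full frame time (1000 / Hz ms);
# average wait = half of that, assuming input arrives at a random moment.

def frame_latency_ms(hz):
    """Return (average, maximum) wait in milliseconds for a display running at `hz`."""
    frame_time = 1000.0 / hz
    return frame_time / 2, frame_time

for hz in (60, 120, 144, 165, 240):
    avg, peak = frame_latency_ms(hz)
    multiples = [rate for rate in (24, 30, 60) if hz % rate == 0]
    note = ", ".join(str(r) for r in multiples) if multiples else "none of 24/30/60"
    print(f"{hz:3d} Hz: avg {avg:.1f} ms, max {peak:.1f} ms, even multiple of: {note}")
```

Running it reproduces the 4.2/8.3ms and 3.5/6.9ms figures quoted above, and shows why 120Hz and 240Hz line up neatly with 24/30/60fps content while 144Hz and 165Hz mostly don't.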

I do know that, even as a young person, screen flicker was entirely gone by 85, so I'm pretty sure I'll be completely happy at 120, particularly with persistent-style displays, as opposed to the raster beams I was testing with.

I think that may be another case of something like CD audio: with the sample rate and bit depth of that format, the problem of per-channel sound reproduction is solved. You can go further, but for playback, there's literally no reason to do so. I suspect 120Hz will be a similarly permanent solution.

To argue the "gnarliest of pro gamers" point... I was *never* as fast as those folks to begin with, am older and slower now and can still tell the difference between 144hz and 165hz when moving a window around on the desktop on the same monitor. When I got the Acer I'm using right now and was reading about "overclocking" it from 144hz to 165 I kind of giggled, and almost rolled my eyes while I was enabling the higher refresh rate. Then much to my surprise I could actually feel it in the mouse movement.

It's not a big difference, but it's absolutely there on my Acer monitor. Past 120Hz, IMO, it's not about seeing motion fluidity; it becomes about feeling it in the controls. I'm not saying it's 144 or 165 or above or die from now on, that's just silly, but it is noticeable. I could easily see an advantage to one of those crazy 240Hz monitors for folks who are good at something like CS:GO or Quake. Even people who are openly and admittedly bad at games have flat out said they can absolutely feel the difference in fluidity between a game running at 144Hz and one at 240Hz when they test one. Or at least they can when the system they're on can push the framerate high enough.

120Hz is likely to become/stay more of a standard because it's far more approachable technically, and specifically because it's divisible by 24, 30, and 60 as you mentioned, but to say going past 120Hz doesn't hold any advantages for the people who are after every last 1ms they can get doesn't sit quite right with me.

Will we ever see a 4K 240Hz monitor? I mean, I don't know. I don't even have the slightest idea what it would take to get professional gamers to move past 1080p at this point. They just don't have a reason to.

Huh, that's unexpected, but I'm certainly not going to substitute theory for practical experience. I'll defer to your judgement, at least until I have some hands-on time of my own.

So I've looked at the thread, but figured I'd post as well. I'm upgrading the processor/RAM/mobo in my computer. I already have an SSD in there for the OS and certain programs, and I'm running a 6GB GTX 1060. I'm looking at the following:

https://pcpartpicker.com/list/DDytbj

Ryzen 5 1600 Processor
Gigabyte - GA-AB350-Gaming 3 motherboard
Corsair Vengeance LPX 16GB DDR4-3000

Any suggested changes? This is right about the price point I want, so not looking for anything crazy. Thanks!

120Hz would be a nice baseline, but yeah, the difference between 120Hz and 144-165Hz is noticeable.

I haven't had the opportunity to experience 240Hz, but even at 165Hz, dragging a window across the desktop still has perceptible judder that wouldn't exist with a real object moving slowly across your field of view. When people say they notice the difference at 240Hz, I absolutely believe them.

Looking forward to trying 120Hz myself. I've always believed that it couldn't possibly make much of an actual difference - but everyone says it does. 2018 monitors can't come soon enough.

Not directly related to PC building, but Newegg is being sued by some South Korean banks for alleged massive fraud:

https://gizmodo.com/computer-parts-s...

I finally put together my build this weekend, and I'm crushed to find out that I have a bad SSD. Once I got Windows installed and the drivers updated, it gave me a blue screen and rebooted itself, and the SSD wouldn't show up in the UEFI. A hard power off and back on, and it was back and booting normally.

This went on over and over until finally the PC would only stay on a couple of minutes before crashing and rebooting. Oddly enough, there was a clicking sound coming from the M.2 slot - arcing, maybe? I RMA'ed the drive and I'm picking up a regular SATA SSD, a Samsung 850 EVO 500GB, to replace the M.2 drive. Not sure if it was the drive slot on the mobo or a bad SSD module, but it was a huge bummer.

When I get it all put back together tonight and clean up my workspace I'll share a photo.

For $709 shipped, yea or nay?

https://www.newegg.com/Product/Produ...

https://www.engadget.com/2017/10/26/...

Should be good news for thin-and-light laptops that can still do some light gaming. This APU will churn out much better frame rates than Intel's integrated GPUs.

Yeah an integrated GPU being able to functionally play Overwatch is a big win.

Some more info at Ars

https://arstechnica.com/gadgets/2017...

I sure hope we see these in the next Surface Pro refresh..

TheGameguru wrote:

Some more info at Ars

https://arstechnica.com/gadgets/2017...

I sure hope we see these in the next Surface Pro refresh..

That would be neat!