Help me build my PC 2017 Catch All

I just assumed Staples was going to cancel them all, so I didn't bother to jump through the hoops required. Looks like the 5700XT is a great 1440p card and probably the go-to card at $400, but probably not the best bang for the buck against the Super versions of the RTX 2060/2070.

Interested in how the 2080 Super does at 4K... that could be a "deal" for 4K gamers at $700, at least over the Radeon VII.

Looks like the new X570 motherboards (which are hella expensive!) are drawing significantly more power than their X470 predecessors. No one is exactly sure why, and unless someone does a like-for-like model comparison, we probably won't know. But that will certainly cloud some of the benchmarks that might be showing the 3000 series as less efficient than hoped.

https://www.extremetech.com/computin...

I saw that Linus Tech Tips also noticed that Ryzen 3000 performance in some games could improve drastically when locking the game to a core, as the scheduler was bouncing the process inefficiently around cores, crossing the boundaries to different CCX units.

I think it will take some BIOS and/or Windows updates before we see Ryzen 3000's complete performance profile, which is pretty much par for the course when it comes to AMD.

You might see the same behavior on Linux for a bit. It's got good support for weird CPU configurations (since it runs on so many different architectures, some of which are very strange indeed), but it will probably be subpar until it's explicitly configured to understand how the 3K chips work. And it may get some additional tuning, in later releases, as people understand the chips better.
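The core-pinning experiment mentioned above is easy to try yourself on Linux, since the kernel exposes per-process CPU affinity. Here's a minimal sketch using Python's standard library (Linux-only; which core numbers map to a single CCX varies by chip and kernel, so "the first four cores" here is just an assumption):

```python
import os

# Restrict this process to the first four logical CPUs, on the
# assumption that they sit on one CCX -- check your own chip's
# topology (e.g. lstopo) before relying on that.
available = os.sched_getaffinity(0)          # 0 = this process
ccx_cores = {c for c in available if c < 4}
os.sched_setaffinity(0, ccx_cores)

# Confirm the kernel accepted the new mask.
print(os.sched_getaffinity(0))
```

The same idea applies on Windows via Task Manager's "Set affinity" or `start /affinity`, which is roughly what the manual core-locking tests did.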

It'll get worked out. People will be very happy to see the competition and will want to make it shine.

I managed to snag a 3900X on Newegg this afternoon. Now to get a motherboard.

I'm hoping by the end of the year or early next year, I will be able to upgrade. Right now I am thinking AMD 3000 series chips. Hopefully by then all the bugs with power draw etc. will be ironed out!

Logitech wireless products, it appears, have a whole bunch of nasty security bugs, many of which are not going to be fixed.

It seems to me that wireless keyboards and mice are an appealing idea, but a pretty poor decision from a security standpoint. It's a variant form of "Internet of sh*t", low-quality devices directly exposed to attack.

Wrong thread, Malor?

Well, I intended to put it here because I couldn't think of any better spot for it... deciding what keyboard and mouse to buy would seem pretty relevant to system builds. Would it fit better elsewhere?

Sorry, I thought we had a general security thread. You're right that picking accessories is relevant. Personally, I don't really care about proximity attacks like that; it's not something almost any home user has to worry about.

Remember, 'proximity' is defined by anyone with a good enough antenna. And this kind of attack would be nearly impossible to trace, so you wouldn't understand why you kept getting hacked.

Malor wrote:

Remember, 'proximity' is defined by anyone with a good enough antenna. And this kind of attack would be nearly impossible to trace, so you wouldn't understand why you kept getting hacked.

This is kind of terrifying. Why do people have to suck?

Malor wrote:

Remember, 'proximity' is defined by anyone with a good enough antenna. And this kind of attack would be nearly impossible to trace, so you wouldn't understand why you kept getting hacked.

Those IR receivers that Logitech uses are pretty weak, though... and even with a significant booster on the other end, you'd still need a reasonable line of sight to transmit. There's a reason we need IR extenders when equipment goes in a rack or behind a wall... IR, no matter how boosted, won't get past a wall or two reliably.

I could maybe see an attack working in an apartment building, with someone in the hallway sticking their extender under the door and blasting away to see what they get, but the likelihood of a non-proximity attack is still tiny.

Also worth pointing out that the majority of those exploits are for the Unifying receivers, which are the "one receiver, multiple devices" receiver used by their non-gaming product lines. The LogitechG gaming/enthusiast products do not use the Unifying receivers.

Okay, but keep in mind that your keyboard and mouse are two of the most security-sensitive devices you have. Exposing them to invisible attack from anyone with a good enough antenna, particularly with devices from a company showing such a poor track record and security focus, may not be in your best interest.

I'm not, in general, a big fan of wireless signals. They're convenient, but you're not the only one they're convenient for.

If someone's going to sit in my front yard, about 75 yards from the street, and try to hack my mouse's wireless through the wall of the house, they're welcome to try, honestly.

Either my neighbors will see them and call the police, I'll see them and call the police, or my Blue Iris server will record a motion event of some dumbass in my front yard with a laptop, and I'll send that to the police.

Meh.

I'm glad these vulnerabilities are getting revealed, because Logitech does need to do better with this. There is no good reason why a wireless peripheral connection should not be secure. It should be a safe connection traveling over insecure space, the same way a VPN connection over the untrusted Internet is.
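For what a "safe connection traveling over insecure space" could look like at the packet level, here's a hypothetical sketch. This is NOT Logitech's actual protocol; the packet format, key size, and truncated tag are all assumptions for illustration. The principle is that each report carries a counter plus a MAC keyed at pairing time, so an attacker without the key can neither inject nor replay input:

```python
import hashlib
import hmac
import secrets
import struct

# Hypothetical report format, purely illustrative:
#   8-byte big-endian counter || payload || 8-byte truncated HMAC
KEY = secrets.token_bytes(32)  # shared secret established at pairing

def seal(counter: int, payload: bytes) -> bytes:
    """Build an authenticated report the receiver can verify."""
    msg = struct.pack(">Q", counter) + payload
    tag = hmac.new(KEY, msg, hashlib.sha256).digest()[:8]
    return msg + tag

def open_packet(packet: bytes, last_counter: int):
    """Verify the MAC and reject replays; return (counter, payload) or None."""
    msg, tag = packet[:-8], packet[-8:]
    expected = hmac.new(KEY, msg, hashlib.sha256).digest()[:8]
    if not hmac.compare_digest(tag, expected):
        return None  # forged or corrupted packet -> drop it
    counter = struct.unpack(">Q", msg[:8])[0]
    if counter <= last_counter:
        return None  # replayed packet -> drop it
    return counter, msg[8:]

pkt = seal(1, b"key_down:A")
print(open_packet(pkt, 0))  # fresh packet: accepted
print(open_packet(pkt, 1))  # same packet again: rejected -> None
```

A real design would also encrypt the payload, since the keystrokes themselves are sensitive, but authentication plus a monotonic counter is what stops the injection and replay attacks described in these advisories.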

But there's nothing in these vulnerabilities that is going to make me stop using my various Logitech wireless mice.

So... Gamers Nexus opened up a 5700XT, already notorious for its blower-style cooler, added like five cents' worth of washers, and replaced the stock thermal pad with paste.

It reduced load temps by 8C and eliminated thermal throttling at equivalent noise levels.

Who designs these things?

Yeesh.

It wouldn't be AMD if they didn't do some silly unforced error.

Wait for the aftermarket cards, potential 5700 buyers. (Or do GN's simple mod I guess)

I almost look forward to weirdly busted products just to see what Steve Burke might hack together to try and make it function the way it should have to start with.

AMD's use of blower coolers on their reference design looks all the worse now that Nvidia has gone to a semi-respectable dual-fan cooler on the reference 20-series cards.

And nobody is buying their excuses about cases with poor air flow. They do it because it's cheap.

Blowers have their niche, though... I use them in small builds where I want the hot air exhausted outside the case, or in general low-profile builds where airflow is restricted.

TheGameguru wrote:

Blowers have their niche, though... I use them in small builds where I want the hot air exhausted outside the case, or in general low-profile builds where airflow is restricted.

I agree with you, for what that's worth.

That said, the issue here is more that they couldn't be bothered to make sure the cooler for their card had enough clamping force to make solid contact with the GPU.

You've already gone with a blower again, and you know people will give you crap for it to start with. At least maybe do the blower cooler as right as it can be done instead of half-assing it?

Thin_J wrote:
TheGameguru wrote:

Blowers have their niche, though... I use them in small builds where I want the hot air exhausted outside the case, or in general low-profile builds where airflow is restricted.

I agree with you, for what that's worth.

That said, the issue here is more that they couldn't be bothered to make sure the cooler for their card had enough clamping force to make solid contact with the GPU.

You've already gone with a blower again, and you know people will give you crap for it to start with. At least maybe do the blower cooler as right as it can be done instead of half-assing it?

No argument there. One would have hoped they'd design it at least that well. I can't fault them for using pads over paste, though. These things are mass-produced, and pads probably lend themselves to that better than paste does.

On a side rant, I really hate this new trend toward 2.5- or 2.75-slot-wide GPUs. Wtf

Has anyone had good experience enabling G-Sync on an unsupported 144Hz monitor?

I'm actually considering a 5700XT as a replacement for my GTX 970, to power a CV1 Rift and so I can crank the detail in 1080p gaming. Considering I'm comfortable enough around thermal paste and taking computer things apart, is the general consensus that it would be a good use of a $400ish budget? The only ray-traced product I'm currently interested in is Quake 2, and I figure the tech will have matured (or flopped) enough in the 2-4 years I'm hoping to keep this card to keep ray tracing out of must-have territory for at least that long.

Jonnypolite, I'm pretty sure G-Sync uses a proprietary scaler in the monitor, so... I don't think it would work at all.

Robear wrote:

Jonnypolite, I'm pretty sure G-Sync uses a proprietary scaler in the monitor, so... I don't think it would work at all.

You're a little behind the times. Nvidia all but admitted defeat early this year and allowed G-sync to function on Freesync monitors. There's a (short) list of monitors they've officially certified as "G-sync compatible" but a lot of others also work pretty well.

Speaking of which, is there any good reason FreeSync won't work on G-Sync monitors, then? That would be really great.

Middcore wrote:

Robear wrote:

Jonnypolite, I'm pretty sure G-Sync uses a proprietary scaler in the monitor, so... I don't think it would work at all.

You're a little behind the times. Nvidia all but admitted defeat early this year and allowed G-sync to function on Freesync monitors. There's a (short) list of monitors they've officially certified as "G-sync compatible" but a lot of others also work pretty well.

Huh. My understanding was that both G-Sync and FreeSync use special scaler components, meaning that without that hardware, you aren't going to get anywhere. Is that incorrect? Can I buy a random DisplayPort monitor and run FreeSync or G-Sync on it?