Help me build my PC 2020 Catch All

My last GPU purchase was a GTX 970.

https://pcpartpicker.com/user/cecoat...

After my last Linux major upgrade (like all the other major Linux upgrades, honestly) my system crapped the bed and I had to basically re-install.

tl;dr I'm over Nvidia and its pain-in-the-butt proprietary drivers. Looking to upgrade/sidegrade to AMD.

It's interesting to me that mid-range-ish level cards seem comparable to the GTX 970 even all these years later.

Since I have a Mini ITX board, a case that can accommodate full size GPUs, and a 460w power supply, that tends to limit my options.

A Sapphire Pulse RX 570 4GB seems to cost $120-140 new, and an MSI RX 570 "MK2" 8GB seems to be $175-200. I like those two models because of their dBA specs; they seem really quiet.

Performance-wise it seems like it would be a very minor downgrade/sidegrade. And the difference between the 4GB and 8GB seems non-existent, especially with the 4GB at pretty close to half the price.

Given how comparable everything below a 5700 XT seems, and that TDP gets a lot higher once you go above the 570 (which would probably be pushing it in terms of my power supply), is there any reason that wouldn't be the way to go? Any logic/considerations I'm missing?

Beyond that it looks like I'd have to consider upgrading my power supply and CPU to really get the full benefit of a more powerful card.

The most demanding games I play are probably DOOM and Prey, and I'm okay not having everything on ultra settings.

If you're not going above 1920x1080, then perhaps that GTX 970 will hold you?

ccoates wrote:

tl;dr I'm over Nvidia and its pain-in-the-butt proprietary drivers. Looking to upgrade/sidegrade to AMD.

The NVidia drivers are basically the Windows drivers, with a thin shim layer to translate from Linux to Windows. They're typically some of the most robust solutions you can find; they're a bit of a pain to install, but they work better than almost anything else.

AMD drivers have been getting better, but the last I looked into it, AMD-on-Linux was still quite buggy and unhappy-sounding. This was a major improvement over the totally inept and crashy earlier versions, but it still didn't sound all that wonderful.

Intel graphics are the only ones I know where you get open source, good quality, and comparable performance to Windows, but the Intel silicon is terribly weak in comparison.

Given the available options, NVidia's approach of bundling their Windows driver with a shim layer might be the best one. Install kinda sucks. Ubuntu makes it pretty easy, but the constant recompiles for new kernels get a bit old. The actual run environment is slightly faster than Windows (because Linux is a little more efficient) and the stability is excellent.

Malor wrote:

Given the available options, NVidia's approach of bundling their Windows driver with a shim layer might be the best one. Install kinda sucks. Ubuntu makes it pretty easy, but the constant recompiles for new kernels get a bit old. The actual run environment is slightly faster than Windows (because Linux is a little more efficient) and the stability is excellent.

This is inevitably what hoses my system. New release of a distro means a new kernel, and if I'm not extra super cautious something goes wrong and I end up trying to recover my system via the command line. Or giving up and reinstalling completely if I'm in a hurry. I full-dist upgraded Pop!_OS recently and it was a sh*tshow (I swear I used the NVIDIA image they make, but who knows, maybe I was careless.)

I'd say I'm somewhere between "average" and "power user", so I know just enough to be dangerous re: Linux. Specifically between the GPU and my dual-booting with Linux and Windows on separate encrypted partitions, a Linux screwup has painful repercussions.

I do like changing distros from time to time for that full dist-upgrade smell. I know the performance will take a hit, but overall AMD's drivers are pretty "out of the box" by comparison, aren't they?

Robear wrote:

If you're not going above 1920x1080, then perhaps that GTX 970 will hold you?

From a performance standpoint, definitely. As far as I can tell you have to get into the $500-ish price range to see a worthwhile enough boost to justify a new purchase (unless I'm missing something. Has everyone switched to 4k gaming?)

This would definitely be a quality of life oriented switch as opposed to a raw FPS one.

Phoronix makes it seem like performance would be okay, but I guess I could suck it up, choose Debian LTS, and just stop distro hopping.

I'd look at 1440p cards for the "meaningful performance boost", as a waypoint leading to 4K in the future. And that puts you at around $200 - $300, unless you want to add a G-Sync/FreeSync monitor. So the nVidia 1660 Ti or AMD RX 590.

Robear wrote:

I'd look at 1440p cards for the "meaningful performance boost", as a waypoint leading to 4K in the future. And that puts you at around $200 - $300, unless you want to add a G-Sync/FreeSync monitor. So the nVidia 1660 Ti or AMD RX 590.

That one looks intriguing. I'm worried I'd be pushing the wattage.

PC Part Picker puts my setup with an RX 590 at ~400 watts.

https://outervision.com/power-supply... puts me at 450 watts (if I select the 24/7 "GAME ON" estimate). With the GTX 970 it estimates 428 watts, and with the RX 570, 392 watts.

My power supply is 460 watts, but most spec sheets recommend a 500-550 watt power supply at a minimum for cards in the RX 590 range.

How close do you generally push those types of specs?
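Here's the back-of-the-envelope math I've been doing, as a sketch. The per-component draws are ballpark assumptions pulled from typical spec sheets, not measured numbers, and the 80% figure is just the common rule of thumb for PSU headroom:

```python
# Rough PSU headroom estimate. Component draws are ballpark figures
# (assumptions, not measurements) based on typical spec sheets.
components = {
    "Ryzen 7 1700 (65W TDP, spikes higher)": 90,
    "RX 590 (~225W board power)": 225,
    "Motherboard + RAM": 50,
    "SSD/HDD + fans + USB": 35,
}

total = sum(components.values())
psu_watts = 460

# Rule of thumb: keep sustained load under ~80% of the PSU's rating
# so transient spikes don't trip over-current protection.
headroom = psu_watts * 0.8

print(f"Estimated peak draw: {total}W")
print(f"80% of a {psu_watts}W PSU: {headroom:.0f}W")
print("Cutting it close!" if total > headroom else "Should be fine.")
```

By that estimate a 460W unit is right on the edge with an RX 590, which is why I'm hesitating.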

I'd want that 550, honestly. Or just go to like a 750W and be set for quite a while.

When my PSU crapped out, I went with a reputable 750 replacement.
I had a 500 and only wanted a 550 or 600 but the only corsair brand they had were 750 and above.
It was only $15 more than the crappy brand 600's so I grabbed it and haven't looked back.

I love how silent my setup is with those Noctua fans and the fanless heatsink.

I can hear it when it's on, but measuring from my bed about 10 feet away, it barely registers an increase in dBA above my room's base "quiet" level. And those 650W fanless PSUs are so expensive...

I could bite the bullet and settle for "quiet" instead of completely silent, but once you start adding in a new PSU, it seems like I might as well go for a RX 5700, and then the costs stack up a bit.

I should probably suck it up and wait for a good sale or something. Crypto has made buying GPUs on eBay pretty risky, right? So new seems like the way to go.

Thanks for the insight, y'all.

ccoates wrote:

I should probably suck it up and wait for a good sale or something. Crypto has made buying GPUs on eBay pretty risky, right? So new seems like the way to go.

Thanks for the insight, y'all.

I've bought my last two GPUs used from eBay with zero issues, most recently this October. Not a large sample size, and I'm sure there are people selling mining cards in bad faith on there, but I don't think it's quite as bad as many people say.

I bought a 1070 off of eBay in August. No issues so far. I'd just check the seller rating to make sure they're reputable.

The stuff coming out of Intel and their partners at CES about the NUC 9 line-up is pretty interesting. One of the most interesting bits to me is the NUC 9 Compute Element:

IMAGE(https://cdn.mos.cms.futurecdn.net/NuEbfj76kN5i7ESVAKpaTc.jpg)

That is basically a full computer on a PCIe card. It plugs into another board that gives it expanded capabilities like additional ports and PCIe cards. Razer, CyberPower, and Adata have shown off concepts using this module and expansion board, with Razer's being the most polished of the bunch so far:

IMAGE(https://d4kkpd69xt9l7.cloudfront.net/sys-master/root/hc6/hc7/9381953568798/razer-tomahawk-desktop_1500x1000_gallery_02.jpg)

Prices have not been announced, but the non-modular Intel NUC 9 Extreme (which can handle an 8-inch PCIe card) looks like it will run from about $1000 for an i5 version to around $1700 for an i9. These are bring-your-own-RAM-and-storage prices.

An Ars Technica article, "AMD's third shoe finally drops at CES 2020—7nm Zen 2 mobile CPUs," mentions kind of sideways that Radeon cards are doing a lot better in Linux than the NVidia ones are. It's just an aside, a simple statement of fact, not a thing the author finds surprising or novel.

I've been talking on the Ars forums with Jim Salter for years, and if he says it's good, it's good. It looks like my info is out of date.

For the rest of us, it's looking like AMD laptops this year may be the thing to buy.

Malor wrote:

An Ars Technica article, "AMD's third shoe finally drops at CES 2020—7nm Zen 2 mobile CPUs," mentions kind of sideways that Radeon cards are doing a lot better in Linux than the NVidia ones are. It's just an aside, a simple statement of fact, not a thing the author finds surprising or novel.

Wait, what? OK, at least he mentioned Tensorflow. That's not the only thing though. Handbrake and many high-performance computing apps are ported to run on CUDA. There's a reason cloud providers that offer a GPU option all offer Nvidia.

It is the Consumer Electronics Show though, and Nvidia's support when it comes to desktop Linux still sucks. That screen tearing issue he mentions is an easy fix, but it shouldn't be a problem in the first place.

The H sounds like a beast. Plus the U, an 8-core/16-thread CPU at 15W TDP, is mind-boggling...
And the 3990X, I mean wow. Talk about a 2020 coming-out-party lineup.

I managed to resist upgrading my 1st gen Ryzen systems. I don't think that restraint will hold once the 4th gen Ryzen desktop lines come out.

I want a 15w TDP chip that outperforms a 9700k. Can we please have a way to put that in a desktop form factor?

I saw a comment in the attached Ars thread to that article, claiming that Intel might be in real trouble. They apparently botched their 10nm transition entirely, and they're now two full process nodes behind Samsung and TSMC. It used to be that Intel could outspend everyone on building fabs, where AMD was barely able to pay them off before they were obsolete.

But the x86/desktop market is shrinking a little, and ARM and all its variants have absolutely exploded. Intel is no longer capturing the lion's share of global chip revenue, and seems to have bled off a lot of talent... other comments in the thread suggested that the beancounters had turned it into an unpleasant place to work and had gotten rid of a ton of their most senior people to save money.

If that's true, then the musing that Intel may never again catch up in fab tech may have some real credence to it. It raises the question of whether Intel will get squeezed out of what was once its core market by Asian chip fabs.

Chairman_Mao wrote:

I want a 15w TDP chip that outperforms a 9700k. Can we please have a way to put that in a desktop form factor?

I wish those chips were easier to build your own systems with. It could open up a whole new avenue of small gaming builds.

Question. I am running an intel i5 6400k pc. How much life does this cpu have before it becomes a bottleneck?

Heretk wrote:

Question. I am running an intel i5 6400k pc. How much life does this cpu have before it becomes a bottleneck?

What are you doing with the computer, and what resolution is your monitor?

Malor wrote:

An Ars Technica article, "AMD's third shoe finally drops at CES 2020—7nm Zen 2 mobile CPUs," mentions kind of sideways that Radeon cards are doing a lot better in Linux than the NVidia ones are. It's just an aside, a simple statement of fact, not a thing the author finds surprising or novel.

Definitely matches my experience! I can't overstate how every major failure I've had with a Linux distro over the past 5-6 years has been because of an Nvidia driver + distro upgrade failure. And when they fail, oof. There goes your weekend if you want to try to recover data.

screen-tearing during Linux video playback, among many other irritations, becomes a thing of the past when you yank out your Nvidia GPU and replace it with a Radeon.

The promised land is near!

Maybe the new stuff will drive down the price of the mid-range stuff I'm interested in?

I think I'm gonna stick it out and save up until I can upgrade from my Ryzen 7 1700 + GTX 970 to a Ryzen 7 3700X + RX 5700. The 3800X and up have a higher TDP, and if I aim for the Sapphire/PowerColor RX 5700s, they have a dual-BIOS switch for a "silent" mode that draws less power, which might make it easier to work into my Mini-ITX build. I could potentially get away with a 600W or lower PSU (fanless PSUs seem to top out at 550-600W).

Y'all successfully curtailed my impulse buying instincts for an overall better setup.

deftly wrote:

Wait, what? OK, at least he mentioned Tensorflow. That's not the only thing though. Handbrake and many high-performance computing apps are ported to run on CUDA. There's a reason cloud providers that offer a GPU option all offer Nvidia.

It is the Consumer Electronics Show though, and Nvidia's support when it comes to desktop Linux still sucks. That screen tearing issue he mentions is an easy fix, but it shouldn't be a problem in the first place.

Is that the same as NVENC? I've been experimenting with it. It's amazing how fast it is, but the quality is pretty bad.

For streaming I grok it. But for Handbrake encodes it's just not worth using compared to CPU encoding imo.

It is nice to get an *acceptable* encode of a Blu-ray rip in just 30 minutes. But the filesize is around 12GB or more, and the quality is still worse than a multi-threaded CPU encode that takes 3-4 times as long but ends up around 7-8GB.

If they could work that into Plex, though, it would be amazing. A one-time, temporary, close-to-real-time transcode for streaming sounds great, and filesize doesn't really matter as much in that case.
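To put some rough numbers on that quality gap, here's a quick sketch of the average bitrates implied by those two encodes. The runtime and file sizes are the ballpark figures from my own rips (assumptions), and I'm using decimal GB:

```python
# Back-of-the-envelope comparison of the two encodes described above.
movie_minutes = 120  # assumed runtime for a typical feature film

def avg_bitrate_mbps(size_gb, minutes):
    """Average bitrate (Mbps) implied by a file size and runtime."""
    bits = size_gb * 8 * 1000**3  # decimal GB -> bits
    return bits / (minutes * 60) / 1e6

nvenc_mbps = avg_bitrate_mbps(12, movie_minutes)    # fast NVENC encode
cpu_mbps = avg_bitrate_mbps(7.5, movie_minutes)     # slow CPU encode

print(f"NVENC encode: ~{nvenc_mbps:.1f} Mbps average")
print(f"CPU encode:   ~{cpu_mbps:.1f} Mbps average")
# NVENC is spending ~60% more bitrate here and still looks worse --
# hardware encoders trade compression efficiency for raw speed.
```

Which is exactly why it makes sense for streaming (where the file is throwaway) and not for archiving.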

I am chomping at the bit to get a 3900x or 3950x. The only thing holding me back from the 3950x is that, while it's a good value, it costs as much as what I usually spend on an entire system upgrade.

The other thing I'd be interested in instead is a cheaper 2-in-1 with the new Ryzen chip. I need something to carry with me that gives me the flexibility to be creative on the go or away from home.

ccoates wrote:

Is that the same as NVENC? I've been experimenting with it. It's amazing how fast it is, but the quality is pretty bad.

For streaming I grok it. But for Handbrake encodes it's just not worth using compared to CPU encoding imo.

It is nice to get an *acceptable* encode of a Blu-ray rip in just 30 minutes. But the filesize is around 12GB or more, and the quality is still worse than a multi-threaded CPU encode that takes 3-4 times as long but ends up around 7-8GB.

If they could work that into Plex, though, it would be amazing. A one-time temporary not that far off from real-time transcode for streaming sounds great, and the filesize doesn't really matter as much in that case.

The quality and features of nvenc vary between generations. I was suggesting that Nvidia is just as focused on datacenters and cloud providers as gaming. Handbrake was probably a poor example, but when someone uploads a video to a website, it'll get sent to a compute farm where the fastest-possible encoding will get put up first as a "preview" version, with the high-quality version showing up later.

deftly wrote:

The quality and features of nvenc vary between generations. I was suggesting that Nvidia is just as focused on datacenters and cloud providers as gaming. Handbrake was probably a poor example, but when someone uploads a video to a website, it'll get sent to a compute farm where the fastest-possible encoding will get put up first as a "preview" version, with the high-quality version showing up later.

Ah, okay! I wasn't thinking on the YouTube/Vimeo scale.

I never thought about that, re: having a really quick nvenc first that they replace later.

fangblackbone wrote:

I am chomping at the bit to get a 3900x or 3950x. The only thing holding me back from the 3950x is that, while it's a good value, it costs as much as what I usually spend on an entire system upgrade.

The other thing I'd be interested in instead is a cheaper 2-in-1 with the new Ryzen chip. I need something to carry with me that gives me the flexibility to be creative on the go or away from home.

For sure. I get really wrapped up in the "performance per dollar" aspect. Like the 3800x seems basically pointless over the 3700x if the price difference is more than $20.

Jumping from $330 to $500 to $790... And at the end of the day I guess waiting an extra 40 minutes for a handbrake encode isn't going to kill me. OR IS IT?
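For my own sanity I sketched the performance-per-dollar math. The prices are the ones quoted above; the multi-core scores are placeholder relative numbers (assumptions, roughly proportional to core count for an all-core workload like a Handbrake encode), not real benchmark results:

```python
# Quick perf-per-dollar sanity check. Scores are placeholder relative
# numbers (assumptions), NOT actual benchmark results.
cpus = {
    # name      (price USD, relative multi-core score)
    "3700X": (330, 100),
    "3900X": (500, 145),
    "3950X": (790, 185),
}

ppd = {name: score / price for name, (price, score) in cpus.items()}

for name, value in ppd.items():
    print(f"{name}: {value:.3f} score per dollar")
```

Under those assumptions the value per dollar drops at each step up the stack, which is the classic diminishing-returns curve: the top chips are fastest, but you pay a premium per unit of performance.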

Chairman_Mao wrote:

I want a 15w TDP chip that outperforms a 9700k. Can we please have a way to put that in a desktop form factor?

Another spot a chip like that would be extremely useful is as an edge firewall/router. Goddamn, a box like that could move some bits. It could probably host a VPN server as well without even noticing.

WizKid wrote:
Heretk wrote:

Question. I am running an intel i5 6400k pc. How much life does this cpu have before it becomes a bottleneck?

What are you doing with the computer, and what resolution is your monitor?

It's a gaming rig. I recently upgraded the GPU to a 6GB 1060, and I have a 1440p monitor.

My instinct is that if you run at 1080p, you'll be fine for most games at decent graphics levels (probably High). If you run at 1440p, you might have to cut back some on the more demanding games, but it'll still be pretty good. I wouldn't worry unless you're really into taxing games that you want to play at 1440p.

I totally forgot I upgraded my 460W PSU to a 520W (both fanless Seasonics with a Platinum rating).

Still a far cry from a 650W PSU, but for my Mini-ITX build, and considering the quality of the PSU, I think it might actually be enough for an RX 5700 (which seems to be less power hungry than the RX 590).

Gonna save my pennies and keep an eye out for sales.

Keeping my eye on the 3700x too. The 3900x/3950x definitely look amazing, but they're 1.5-2.5 times the price of the 3700x, and the gains don't justify the extra cost for my use cases.

I am waiting anxiously for my Amazon gift card, so I can splurge on a 2060 Super RTX gpu!