Help me build my PC 2017 Catch All

Yeah IMO those cards are still terribly priced.

At the rate videocards are going I don't expect any of them will give me an even moderately decent reason to upgrade my 1080Ti for another year or two.

Thin_J wrote:

Yeah IMO those cards are still terribly priced.

At the rate videocards are going I don't expect any of them will give me an even moderately decent reason to upgrade my 1080Ti for another year or two.

It's not even just the price. Nvidia made reviewers sign very restrictive agreements before they could get cards to review. We can bitch about AMD's embargo, but like I said, it's still not even close to what Nvidia has been doing to control reviews.

Yeah AMD has nothing on the "GeForce Partner Program" stuff that NVIDIA tried to pull.

RIP HardOCP, going to miss them calling this sort of stuff out. Most tech sites don't have the guts.

RTX 2060, 2070, and 2080 were bad buys, no way around it. (I say this as someone who bought a 2060.) Lots of failures right after release last year, then the "Space Invaders" bug, ray tracing barely implemented in games, non-ray-tracing performance barely better than the Pascal cards, and now "Super" versions coming out at about the same price. The 2080 Ti is the only one that made sense in retrospect; assuming money was no object, at least it was clearly the fastest GPU available. However, as discouraging as Nvidia's strategy is for their customers, I suspect it will be pretty effective in stealing the thunder of AMD's Navi release.

I saw an interesting perspective on YouTube where they talked about paradigm-busting technologies following an S-curve, and how the technology that replaces one starts out in the middle of the preceding technology's rapid-growth phase.

It is an interesting concept, and thought-provoking. They posited that ARM, or things like Navi, could be the replacement technology that breaks through in the next few years. It makes sense, since dedicated GPUs' generation-to-generation power increases are tapering off and GPU makers like Nvidia are already promoting sidegrade technology like ray tracing.

So it sounds plausible that the escalating growth of Navi and its descendants will match the plateauing power of discrete GPUs in the next, say, 5 years or 2-3 generations. While that seems remote now, it would be an incredible game changer. It would also probably see a lot more people adopt whatever Xbox, Switch, or PlayStation of that generation as a home computer, and that generation's console will probably be laptop-sized. And wouldn't tablets with that technology be insane? In that generation, your phone could be as powerful as a current-gen console, and your tablet could be the equivalent of a high-end Ryzen 3 with a 2080 in it. Server farms would be racks of Switch- or iPad Pro-sized devices.

Huh? Navi is an architecture for a discrete GPU. And ARM is just another CPU architecture, much like PowerPC was (both are RISC designs). It has some benefits over x86_64 and some drawbacks, just like PPC did. PPC made big strides against x86 when x86 ran into scaling issues, but then PPC ran into similar issues farther down the road. x86_64 is having some issues now, so of course ARM looks good in comparison, but eventually ARM will hit a roadblock or x86_64 will clear one, and things will flip-flop again.

Middcore wrote:

The 2080 ti is the only one that made sense in retrospect, assuming money was no object at least it was clearly the fastest GPU available. However, as discouraging as Nvidia's strategy is for their customers, I suspect it will be pretty effective in stealing the thunder of AMD's Navi release.

For $500 more than what the 1080 Ti released at? Meh. Still awful pricing. The whole lineup was/is massively overpriced.

Thin_J wrote:
Middcore wrote:

The 2080 ti is the only one that made sense in retrospect, assuming money was no object at least it was clearly the fastest GPU available. However, as discouraging as Nvidia's strategy is for their customers, I suspect it will be pretty effective in stealing the thunder of AMD's Navi release.

For $500 more than what the 1080 Ti released at? Meh. Still awful pricing. The whole lineup was/is massively overpriced.

Well, I did say if money was no object. I didn't say it was a good value. Although I guess if you REALLY didn't care about money and just wanted the fastest card available you would have bought the RTX Titan.

Just bought a refurb 1070 for $360, but with the announcement of the Super 2000s, now I'm wondering if I should just return it and go for a Super 2070 for $140 more.

Middcore wrote:

Well, I did say if money was no object. I didn't say it was a good value. Although I guess if you REALLY didn't care about money and just wanted the fastest card available you would have bought the RTX Titan.

Between that pricing and what they tried to make reviewers do before release I plan to hold this grudge for quite a while.

Thin_J wrote:

Yeah IMO those cards are still terribly priced.

At the rate videocards are going I don't expect any of them will give me an even moderately decent reason to upgrade my 1080Ti for another year or two.

20XX cards have been terribly priced for sure - they performed the same as 10XX, with RTX as a nearly useless feature for years to come - with $100 added to the price for good measure.
But has there ever really been much reason to upgrade from the best card in one generation, into the next one?

Shadout wrote:
Thin_J wrote:

Yeah IMO those cards are still terribly priced.

At the rate videocards are going I don't expect any of them will give me an even moderately decent reason to upgrade my 1080Ti for another year or two.

20XX cards have been terribly priced for sure - they performed the same as 10XX, with RTX as a nearly useless feature for years to come - with $100 added to the price for good measure.
But has there ever really been much reason to upgrade from the best card in one generation, into the next one?

For a long time, absolutely. You were doubling performance with each new generation.

Nowadays, the speed increases are much smaller, so it often makes more sense to skip a generation or two.

Even the 1080 Ti was on average ~30% faster than the regular 1080 less than a year later.

This is the first generation in a while, as far as I can remember, where there's just no good reason to upgrade at any price tier that isn't "I will pay whatever price they put on the top end."

The 2080Ti is a great piece of hardware, but it's essentially the Titan of this release and they shifted the prices of all the other cards up a tier.

All the lower and mid tier cards are bad products for the money IMO.

This is the first video card generation I've sat completely out of in as long as I can remember. I'm an every-12-to-18-months build kind of person if there's cool fun new stuff to build with.

And I'm still riding a 6700K and my 1080Ti that's pushing three years old.

Though the 6700K is likely getting the boot in the next week or so; CPUs are a different thing right now.

I only went with a 2080 because it was only £80 more expensive than buying a used 1080 Ti here in the UK. And who knows what a used 1080 Ti has been through in the last couple of years? Possibly nothing bad? For £80, I wasn't willing to take a 'possibly', not when we're talking £550+.

Next up for me is CPU/mobo and RAM, and NVMe, probably in March/April next year. Going from an i7-6700 (non-K). It's an mITX build, so my options aren't as free-flowing as with a full-size tower.

Tempted for the first time in many, many years to ditch Intel and go for one of the new multicore Ryzens, but I haven't done much research into mITX versions of their motherboards and cooling. Will save that fun for the new year.

Yeah, I went with a 2080 as well. The alternative would have been a 1080 Ti for more or less the same price, which didn't make much sense. It was time for an upgrade; longest I've ever gone without a new GPU, I think. In hindsight, getting a 1080/1080 Ti at release would have been much better, but it was hard to know how badly the GPU market would fail after that.

Went back and looked at the 'recent' GPUs I bought. Can't remember/find what I had before ~2008:
Radeon 4870 - July 2008
Radeon 6850 - January 2011
GTX 970 - October 2014
RTX 2080 - November 2018 - costing more than twice what I paid for the 970, ouch...

Guess I am on track for a new card in late 2021. Though it would be nice to get back to hitting the decent card updates; the GTX 970 was a really nice card.
Having recently bought a G-Sync monitor, I guess I am stuck with Nvidia for a while. God, I hate proprietary hardware/software standards.

Everything else in my PC is from January 2017. With CPU progress being even slower than GPUs', I can't imagine an upgrade for quite a while. Went from an i7-2700 to an i7-7700 with the last CPU upgrade. That is what, 6 years?

Not the best SSD performance wise but it still beats a spinning disk. 1TB SanDisk SSD $110 via Amazon.

LOL just noticed that is only $10 off. Not as good of a deal as I thought it was.

Rykin wrote:

Not the best SSD performance wise but it still beats a spinning disk. 1TB SanDisk SSD $110 via Amazon.

LOL just noticed that is only $10 off. Not as good of a deal as I thought it was.

Seriously, SSDs are just that cheap now.

For real. I picked up a 2TB Samsung (QVO, so not quite tip top, but more than good enough) for $200 recently.

PCIe 4.0 NVMe SSDs are starting to surface.

The fancy new Aorus M.2 NVMe SSDs follow what is becoming a familiar script—1TB and 2TB capacities, both of which can tap into the PCIe 4.0 bus to deliver sequential read and write speeds of up to 5,000MB/s and 4,000MB/s, respectively.

RRP for the 2TB version is $260. Fair to say 'disk' isn't going to be the bottleneck any more.

Yeah, I love the NVMe drives in my home server and laptop (Razer Blade Stealth). They both boot super fast, even without much tweaking of Windows 10 to optimize for them.

From the link:

Only Sabrent is offering a version of its next-gen SSDs without a heatsink, for a slightly cheaper price tag. However, the company notes that a "heatsink is required" to run at full bore, otherwise thermal throttling is going to kick in really fast.

I was going to make a joke about water cooling an M.2 drive, but it looks like there are already water blocks available for them.

Last night I realized that when I went to buy parts a few weeks ago for a PC build, Microcenter forgot to charge me for the graphics card. I didn't even notice I'd come in several hundred dollars under budget until I watched a video of a build with the new Ryzen CPUs. Guess I should've splurged for a 2080 Ti.

Reviews are looking good for the Ryzen 3000s and the 5700/XT. Not just competitive on price+performance, but on power consumption too.

Chairman_Mao wrote:

Reviews are looking good for the Ryzen 3000s and the 5700/XT. Not just competitive on price+performance, but on power consumption too.

Yeah AMD is really jumping back into the game with these new offerings. Ceding the high end to Nvidia isn't a bad move considering the consoles are all AMD as well. Fighting over a small percentage of 4K PC gamers has not worked well for them in the past.

I was originally going to return the $300 Vega card I bought and get a 5700XT with a $100 gift card I got from credit card points. With the recent price drop it came out the same. Now that it turns out the Vega was free, it's oddly harder to justify spending the money. At 1440p it's running pretty well. I may as well just wait.

Wow. That's pretty special.

In other news, the 2070 SUPER (emphasis for stupid name) and Radeon 5700 XT are both roughly 1080 Ti performance for at or under $500. The 1080 Ti can finally be matched or beaten without paying its original release price, or significantly more, like before.

They finally did something right. It just took ages.

In other news: the Ryzen 3000 release is kind of a mess? A looooooot of people are trying to get hold of 3900X's and there just aren't any in stock anywhere. Reddit is full of stories of people who waited hours in a line of 50+ people in front of their Microcenter, were 3rd or 4th in line, and left empty-handed because the store ended up not getting any 3900X's after assuring people who called that they would have "plenty" of them.

I checked my Microcenter about an hour before they opened and they had 10+ 3700X's, 10+ 3600X's, and 10+ 3600's. They did not have a single 3900X. And nobody has the 3800X yet; it is apparently not due until later in the week. If there were 3800X's available I probably would have just gone with one and been perfectly happy.

If you want a 3700X or a 3600X or a 3600, they're available pretty much everywhere. And because there's no stock of 3900X's, and a lot of people are just reusing their X470 or B450 motherboards, X570 boards are also in stock everywhere.

Seems really good for the GPUs. A bit like the first Ryzen: still a little behind the competition, but the big jump from their previous generation (compared to Nvidia's, at least) gives lots of hope for what they do next. Especially when we can see what they actually did next with Ryzen, which pretty much seems to have surpassed Intel.

On the other hand, Nvidia is a step behind on process, using 12nm vs 7nm (if those numbers are even comparable), so maybe Nvidia's architecture is still relatively better, which would show when they catch up to the 7nm process, presumably in their next generation.

Nvidia prices gonna drop (extreme example)

Possibly, but one could have said the same thing about Intel, and their transition to 10nm is still an issue.

From my limited understanding, every company labels their process differently, so you can't really treat the "X nm" claims as anything other than marketing these days. I guess what most of them are doing is cherry-picking the smallest feature in the process and using that as the label, when about 99% of the features will be substantially larger.

Apparently, you can't even tell within a given company, because they change what they're measuring in new processes.

On Intel's process woes: IIRC, they had big layoffs a while back, and now I'm wondering if they didn't end up dumping the talent they needed to get their yields to the proper level. I have some vague idea that QA took a major hit, and if that memory is accurate, that might be exactly what happened: some beancounter decided it was too expensive to stay good at making chips.

TheGameguru wrote:

Possibly but one could have said the same thing about Intel and their transition to 10nm is still an issue.

Sure, but Intel was struggling to make their own 10nm. Nvidia was never going to do that; like AMD, they buy from someone else (TSMC for this generation).

Malor wrote:

From my limited understanding, every company labels their process differently, so you can't really treat the "X nm" claims as anything other than marketing these days. I guess what most of them are doing is cherry-picking the smallest feature in the process and using that as the label, when about 99% of the features will be substantially larger.

That is what I have read too. It sounds like the smaller these things get, the harder it is to keep up the density. Supposedly Intel's 10nm is roughly comparable to AMD's 7nm.
With TSMC behind both Nvidia's and AMD's GPUs, those numbers might be more directly comparable, though.

From Anandtech:

AMD in turn comes in with the edge on manufacturing process, as they’re using TSMC 7nm versus the 16nm offshoot that NVIDIA uses

Chairman_Mao wrote:

Nvidia prices gonna drop (extreme example)

So annoyed that I missed that one.

Need to watch more carefully over the next few weeks. Planning to pick up a 2080 Super, but if I can get a Ti for a price like that, I'll take it in a heartbeat.

zeroKFE wrote:
Chairman_Mao wrote:

Nvidia prices gonna drop (extreme example)

So annoyed that I missed that one.

Don't be; lots of people who took advantage of that have already had their orders cancelled. But I WOULD expect some deals on the non-Super RTX cards, briefly, before they disappear from stock entirely since they're "EOL."