Help me build my PC 2017 Catch All

A supposed leaked Intel CPU roadmap claims Intel will remain on 14nm processes into 2021, at which time their 10nm process will finally reach market.

If this is true, AMD's Zen 2 and Zen 3 (the expected 2020 refresh, still on 7nm) may go relatively unchallenged by Intel. Before, the expectation was that Intel's 10nm process would come to market late, but still within AMD's 7nm lifespan. With this roadmap, 10nm may not arrive until AMD is already preparing to move from 7nm to the next smaller process (likely 5nm).

Intel is going to need to do some magic with their 14nm chips to stay in the ball game these next couple years.

*Legion* wrote:

A supposed leaked Intel CPU roadmap claims Intel will remain on 14nm processes into 2021, at which time their 10nm process will finally reach market.

If this is true, AMD's Zen 2 and Zen 3 (the expected 2020 refresh, still on 7nm) may go relatively unchallenged by Intel. Before, the expectation was that Intel's 10nm process would come to market late, but still within AMD's 7nm lifespan. With this roadmap, 10nm may not arrive until AMD is already preparing to move from 7nm to the next smaller process (likely 5nm).

Intel is going to need to do some magic with their 14nm chips to stay in the ball game these next couple years.

Marketing and their hordes of ravenous fanboys.

TheGameguru wrote:

Marketing and their hordes of ravenous fanboys.

If AMD's ever going to change their market perception, this is the opportunity to do it.

The outlook reminds me a lot of Intel in the Pentium 4 era. They couldn't measure up to the Athlon XP and then the Athlon 64.

Granted, what came next was AMD strangling themselves with the ATI acquisition, Intel violating the hell out of antitrust law, a Bush-appointed FTC chairperson blocking an inquiry into that antitrust behavior for years until she left the position, and then just paying out a settlement to AMD later after the damage was done. So, there's that.

I'm not partial to either side, but I do love healthy competition, and I'm currently rocking an AMD Ryzen 5 chip.

AMD really needs to throw some money into their marketing machine over the next couple of years so they can claim some ground back from Intel.

It's interesting how badly chip clock speeds have stalled. The first 4GHz P4 parts shipped around 2005 (well, sort of: the fastest stable ones seem to have topped out at 3.8GHz, and the 4.0GHz parts were cancelled), and here we are all these years later, *still* clustered around 4GHz, with 5GHz being about as fast as mainstream chips get.

If chips were still doubling clock rates every other year like they once did, they'd be shipping 512GHz parts sometime this year. Can you even imagine what we could do with single 512GHz cores? Heck with all this multiprocessing and multicore weirdness, just run one really, REALLY fast CPU.
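The arithmetic behind that figure, sketched out (a toy projection, assuming a ~4GHz baseline around 2005 and a doubling every other year):

```python
# Hypothetical projection: if clock speeds had kept doubling every
# other year from a ~4 GHz Pentium 4 baseline around 2005.
BASE_YEAR = 2005
BASE_CLOCK_GHZ = 4.0

def projected_clock_ghz(year):
    """Clock speed if frequency doubled every two years after BASE_YEAR."""
    doublings = (year - BASE_YEAR) // 2
    return BASE_CLOCK_GHZ * 2 ** doublings

print(projected_clock_ghz(2019))  # 7 doublings: 4 * 128 = 512.0 GHz
```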

Heat, plus the fact that the length of a clock tick depends on the hardware scale. Because a clock tick cannot *usefully* be longer than the slowest operation, and the instruction set has already been chopped into the smallest steps possible (speaking generally), the only way to shorten a clock tick is to make the hardware smaller. That in turn requires a smaller slowest operation (or the chip will just be waiting faster), and also more cooling (combined with a throttling mechanism to prevent overheating).

If you think about this combination of issues, you'll see that a 512GHz CPU would require some pretty much impossible changes to the instruction set, the physical die size, and the cooling. At 300GHz+, the clock signal would be out of the microwave (radio) spectrum and into the IR, which would mean entirely new technology for chip design as well.
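To put rough numbers on the tick-length constraint, here's a toy model (the gate delays are illustrative, not real silicon figures): the clock period can't be shorter than the critical path, i.e. the slowest chain of logic that must settle each tick, so frequency only rises if gates get faster or the chain gets shorter:

```python
# Toy model: max clock frequency is limited by the critical path,
# the slowest chain of logic that must settle within one tick.
# Numbers below are illustrative, not real silicon figures.

def max_clock_ghz(gate_delay_ps, gates_in_critical_path):
    """1 / (critical path delay), converted to GHz."""
    period_ps = gate_delay_ps * gates_in_critical_path
    return 1000.0 / period_ps  # 1000 ps per ns; 1/ns = GHz

# Shrinking the hardware shortens gate delays, raising the ceiling:
print(max_clock_ghz(10.0, 25))  # 250 ps period -> 4.0 GHz
print(max_clock_ghz(8.0, 25))   # faster gates  -> 5.0 GHz
```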

Those are some of the problems. The solution (today) is multiple cores and threads and parallel programming, both to preserve current software practices and to keep machines from melting. Intel was late to that party, so they pushed single-thread performance as the most important marketing benchmark, but they hit the wall just like everyone before them and had to grow out instead of up. Because of that, we still think of CPU "strength" as speed, and that's true to a point. But the really remarkable thing is that we've now got CPUs offering *32* cores at around 5GHz, running 64 threads simultaneously. Still, it doesn't seem like we'll go much beyond that in the near future, even with die shrinks, as the market has rejected 64-core CPUs for general-purpose workloads over the last few years.

Innovation continued, it just butted up against physics and went off in the direction of multi-thread/multi-core as a result.

One of the things we can look forward to in the next few years is a move away from today's monolithic single-die chip designs toward packages where different features are manufactured separately, on different processes, and then linked together with something like AMD's Infinity Fabric. It almost sounds like a step backwards, in that it may be the return of daughterboard-style CPUs instead of the chip-and-socket design we're used to today. Though that may have some advantages for cooling, since you might be able to cool the various chips on a daughterboard from both sides.

Currently, lots of lower-end chips are just higher-end chips with defects that can be worked around. An i5 is an i7 with defective cores or features disabled, and even the high-end processors are often designed with tolerance for some defects in mind. For example, your 8-core i7 might actually have 9 cores; by including that one extra core that will be disabled (whether it works or not), they might increase their yield rate from 60% to 80% (not actual numbers, just examples). And if one of those i7s has two defective cores, they just got a 6-core i5 out of that wafer, which brings yields up even more. Three or four defective cores? You've got an i3.
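The spare-core math works out something like this. Assuming each core is independently defect-free with some probability (made-up numbers, per the disclaimer above), a die with one redundant core only fails when two or more cores are bad:

```python
from math import comb

def die_yield(cores_needed, cores_on_die, core_good_prob):
    """Probability that at least `cores_needed` of `cores_on_die`
    cores are defect-free (binomial survival function)."""
    p = core_good_prob
    return sum(
        comb(cores_on_die, k) * p**k * (1 - p) ** (cores_on_die - k)
        for k in range(cores_needed, cores_on_die + 1)
    )

# Made-up numbers: each core is good 95% of the time.
print(round(die_yield(8, 8, 0.95), 3))  # no spare core  -> 0.663
print(round(die_yield(8, 9, 0.95), 3))  # one spare core -> 0.929
```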

In the future, instead of making a wafer of complete packages and working around defects like this, they will be able to make a wafer that just has a bunch of single features or sets of features, and if a part has a flaw, it can be discarded cheaply. This will also let them mix parts built at different scales, so components that don't benefit much from a smaller scale can be made on more mature, higher-yield processes.

Basically, you might end up with a processor package that includes 2 single-thread high-performance cores made on a 7nm process (either as a single part or two separate parts), plus 2 multithreaded cores running at a lower clock speed made on a higher-yield 10nm process, with the other components of the chip produced on an even higher-yield 14nm process. By building smaller components instead of trying to do everything at once, discarding the parts with defects gets less expensive.

I recently watched or read something about this, and I wish I could find it to link. This is all super over-simplified, of course, but it gets the basic idea across.
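One way to see why smaller chiplets are cheaper to yield is the classic Poisson defect model, where yield is roughly exp(-defect_density * area). The numbers here are made up for illustration; the point is that one defect scraps a whole monolithic die, but only one small chiplet:

```python
from math import exp

def poisson_yield(area_cm2, defects_per_cm2):
    """Classic Poisson model: P(die has zero defects)."""
    return exp(-defects_per_cm2 * area_cm2)

DEFECT_DENSITY = 0.5  # defects per cm^2 (made-up figure)

# One big 4 cm^2 monolithic die vs four 1 cm^2 chiplets.
mono = poisson_yield(4.0, DEFECT_DENSITY)
chiplet = poisson_yield(1.0, DEFECT_DENSITY)

print(round(mono, 3))     # whole die must be clean   -> 0.135
print(round(chiplet, 3))  # each small part separately -> 0.607
# A bad chiplet discards only itself, not three good
# chiplets' worth of silicon along with it.
```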

Recommendation request. I have a 32-inch LCD TV that I've been using for 1080p gaming on my PC. A new issue has cropped up where the TV (not Windows) displays a lost-signal error on screen, even though the underlying video still works. If I power-cycle the TV, it goes away, but it has been returning with increasing frequency.

I would like to replace it with a 32-inch PC monitor. 1080p is fine for now, but I'd like to go to 1440p eventually. I'm currently using a 1060 6GB video card. There are a lot of monitor choices out there; what's the best mid-range option? Any insight is appreciated. Thanks.

There is an active Monitor Recommendation thread; I'd skim the last 2-3 pages and, if nothing mentioned there fits, post in it.

For the TV, have you already tried a different cable? It kind of sounds like the cable or one of the connectors is loose or bad.

Malor wrote:

Can you even imagine what we could do with single 512GHz cores?

Spend a lot of money on fire extinguishers?

The dead-core thing has been done for real, except that companies will also "kill" good cores if they need more of the lower config than they're getting. I think the scale direction will be up, not out, since that reduces the distance between parts and also allows cooling materials to touch both sides of each layer. And indeed, Intel and HPE and others are working on stacked elements for their concept systems. (The stacks scale horizontally, of course.)

This video is dedicated to Malor.

Yeah, that guy presented sh*t we've known about for years: a good air cooler will perform as well as, if not slightly better than, an AIO for half the price and never fail, yadda yadda yadda.

I'll stick with AIOs, because those giant copper air coolers suck ass to install and work around, and look like crap.

The good news is that for low-profile or small ITX builds you can get air cooling that works really well, so I'm glad they've kept up that innovation; it's still super important for certain builds.

Yeah. News to nobody who pays even four seconds of attention.

If my AIO fails, I will be mad once, when I have to open my case and replace it.

If I use a giant cooler that matches my AIO's performance, I will be mad every time I open my case.

*Legion* wrote:

If my AIO fails, I will be mad once, when I have to open my case and replace it.

If I use a giant cooler that matches my AIO's performance, I will be mad every time I open my case.

Honest question: how often do you open your case?

This isn't me weighing in on the cooler debate. This is a tangent of me actually wondering how often people physically open up their machines and tinker with things.

Twice a year, to blow out dust, I guess...

Robear wrote:

Twice a year, to blow out dust, I guess...

That, pretty much. But this is my first tower (only laptops prior to this), and I've not had to do any maintenance, upgrades or repairs yet.

I watch a lot of LTT videos, but I often find the way they test things leaves a lot to be desired. That said, I went with a giant air cooler in my last build, but I've been considering an AIO for my next one. My last build was a giant home server with 12 hot-swap bays and dual-drive redundancy, built in a large CaseLabs (R.I.P.) Mercury S8 case, while my next one is more likely to be Micro-ATX or ITX in a small form factor case.

Vargen wrote:

Honest question: how often do you open your case?

More than once.

I clean it out, I upgrade parts of the system, all of that. I've added and replaced drives a few times, including an M.2 drive that made me really happy to not have any giant things blocking large portions of my motherboard.

I have a secondary system that I put a cheaper AIO into, and that system has had (non-AIO related) problems that had me back inside that system a lot, and again I was happy, or at least as happy as I could be when dealing with a troublesome system.

LouZiffer wrote:

This video is dedicated to Malor.

(redacted)

Yepyep. Air is just as good (if you buy a good one), and more reliable. It's more of a pain to install, but on an Intel chip, you normally only have to do that once. If I put an Intel chip on a board, it will never move again. (barring motherboard failure, anyway.) And the big annoying cooler isn't really in the way after it's installed, since any work I'll do on the computer is centered on the PCI slots and SATA cabling.

On an AMD chip, the relative ease of working around a watercooler might matter a little more, because you can theoretically upgrade those. But you'll probably only do that once at most, so two total installs per motherboard lifetime. Probably.

Reliability, on the other hand, pays off every single day.

Vargen wrote:
*Legion* wrote:

If my AIO fails, I will be mad once, when I have to open my case and replace it.

If I use a giant cooler that matches my AIO's performance, I will be mad every time I open my case.

Honest question: how often do you open your case?

This isn't me weighing in on the cooler debate. This is a tangent of me actually wondering how often people physically open up their machines and tinker with things.

I rarely open my cases anymore. I take the front dust filter off every few months and wash that out, but rarely crack the side panels. When I do, it's usually the wrong-side panel so I can unplug drives.

I like having both Windows and Linux available, and Windows has a nasty habit of writing boot information to the wrong drive during an install, so I unplug the other drives to make sure it can't. I do the same thing with Linux installs, out of an abundance of caution, but that's more to make sure I don't get something wrong: Linux will do what I tell it, so I make darn sure I can't overwrite my Windows install by accident. I do both of these with reasonable frequency... maybe once a year or so each.

It's that, and occasionally replacing the video card, but I haven't touched the video card since I put the 970 in there.

Just reporting that I feel like such a noob sometimes. I bought my current PC about 18 months ago. Built up from parts that were pretty good at the time.

An i7-7700K, 16GB of 3200MHz RAM, a 1080 Ti, ..

Just this weekend I started doing a little research and realized that I've had XMP off all this time, so despite buying faster RAM, I really wasn't getting the most out of it.

I also went online and found some color profiles for my 24" monitors (Dell 2417 DG), and I swear it's like I have a brand new video card. I was never satisfied with the color when playing games; most of the time it looked way too washed out. These monitors don't come with gamma controls, only brightness/contrast. I did go through the Windows calibration, which got the gamma a bit better, but I kept going back to the Nvidia tool to lower it. Digging around in Win10, I found out that I'd never actually set the calibrated ICC profiles as default, so they were never loading. Still not satisfied, I did more research, and lo and behold, people much smarter than me maintain a database of ICC profiles for each monitor. I loaded the one for my monitor and holy crap, everything suddenly looked the way I'd expect. On top of that, I was able to lower my brightness/contrast significantly, reducing power consumption.

All of this started because I was interested in overclocking. I play a lot of flight sims and just started recording/streaming my flights, so CPU usage is quite high, even though video performance was good with my setup. I did do some minor OC (I'm up to 4.7GHz), and heat/power barely changed from before.

So between OC, XMP, and a real color profile, I feel like I just got a new computer. Maybe that's a strategy: buy a high-end computer but leave settings at default. Two years later, tweak it to what it should have been, and you've just bought yourself more time with your rig!

What was the site you found with the database of icc profiles?

LeapingGnome wrote:

What was the site you found with the database of icc profiles?

http://www.tftcentral.co.uk/articles...

Anybody ever buy from Bottom Line Telecommunications (BLT)? I think I've finally found a single-blower-fan GTX 1660 Ti for my Alienware X51:

https://www.shopblt.com/item/pny-tec...

I think I'd wait until the 11th and see if they are available from PNY directly. You could also reach out to them and see if there's a plan for them to be available for sale on that date. Looks like it'd be $5 cheaper with free shipping if you can buy direct.

As regards ICC profiles... I recently got my first G-Sync monitor (middle of the pack, not a screamer), and I really like the color and greys/blacks. But I haven't done an ICC config on it. Should I, in hopes that it gets even better? Or is it likely to be just a slight tweak?

The proper way to do it is with a meter. If you're using a published ICC profile, it's not tuned for your specific display, just your model. That's usually a fair bit better than nothing, but typically won't look as good as a proper calibration of your actual screen.

There's a problem there, though: a proper colorimeter is expensive as hell. One decent option for basic monitors is the ColorHug2, an open-source colorimeter that's not too expensive (95 English pounds). However, it doesn't work well with LED-backlit screens and doesn't really characterize wide-gamut displays properly. You also have to run Linux from a boot CD to do the actual calibration, then export the resulting ICC file to other OSes. (There's some support for the ColorHug hardware in a Windows-based open source calibrator too, but I forget what that software is called.)

He's been working on an upgraded version that should handle LED and wide gamut, as well as paper prints for calibrating an entire production process, at an expected price of about 300 pounds. That's a great price for something with those capabilities; equivalents you'd order from, say, Amazon would probably set you back about US$1000. But he hasn't been able to get 100 preorders, and it's been languishing in limbo for at least a year or two.

If all of that is too much hassle, then a model-based ICC profile should be a substantial improvement for no out-of-pocket expense.

edit: I also just realized that he's out of stock on the ColorHug2, so that may not be available anymore. He probably isn't getting enough orders to make a production run worthwhile.

Malor wrote:
LouZiffer wrote:

This video is dedicated to Malor.

(redacted)

Yepyep. Air is just as good (if you buy a good one), and more reliable. It's more of a pain to install, but on an Intel chip, you normally only have to do that once. If I put an Intel chip on a board, it will never move again. (barring motherboard failure, anyway.) And the big annoying cooler isn't really in the way after it's installed, since any work I'll do on the computer is centered on the PCI slots and SATA cabling.

On an AMD chip, the relative ease of working around a watercooler might matter a little more, because you can theoretically upgrade those. But you'll probably only do that once at most, so two total installs per motherboard lifetime. Probably.

Reliability, on the other hand, pays off every single day.

Thing is, I also agree that Linus' team's testing methods are hilariously clownshoe. (GIANT BEAN BAGS?) However, just like their ridiculously complicated water-cooled rigs, they're presented honestly and for entertainment - and they don't invalidate the point.

"Cool factor" and "I like to tinker" are genuine values for many people. Practicality isn't exactly on the same side of the scale as a gaming rig. So, in the end, the more varied the points of view are out there, the easier everyone will find solutions which will match up with their own tastes.

To sum it up: I liked the video and your POV enough to post it, Malor. Keep on keepin' on.