Help me build my PC: 2024 Catch-All

A 6700 XT is just right for 1440p, and is overkill for 1080p unless you're really cranking up settings or trying to drive high refresh rates in demanding titles.

Thanks Legion. I pulled the trigger on the 6700 XT. I'm coming from an AMD R9 280, so I'm sure it'll be just fine =)

My stock Wraith cooler is making an ugly buzzing noise every time I start up my PC. It goes away after a few minutes, though, so I wonder if it's something with the fan bearing. (Could it be anything else?) In any case, I'm thinking of replacing the CPU cooler with an aftermarket one. I have a Ryzen 3700X that I'm not pushing to OC in a Phanteks P400 so there's room but it's not huge. I value simplicity and reliability more than anything so I want an air cooler that is easy to install and reasonably quiet. I'm leaning towards a Noctua NH-U12S Redux. Any better suggestions?

Pink Stripes wrote:

My stock Wraith cooler is making an ugly buzzing noise every time I start up my PC. It goes away after a few minutes, though, so I wonder if it's something with the fan bearing. (Could it be anything else?) In any case, I'm thinking of replacing the CPU cooler with an aftermarket one. I have a Ryzen 3700X that I'm not pushing to OC in a Phanteks P400 so there's room but it's not huge. I value simplicity and reliability more than anything so I want an air cooler that is easy to install and reasonably quiet. I'm leaning towards a Noctua NH-U12S Redux. Any better suggestions?

Just a few weeks ago I got a fan for my 3600 and Legion recommended a few others in this post.

Stele wrote:

Just a few weeks ago I got a fan for my 3600 and Legion recommended a few others in this post.

Thanks! Good to see the U12S there too. I'm guessing that will be my pick. I saw the Vetroo in a Gamers Nexus review and it looks like a great budget option. I don't really want an RGB fan, though. On the other hand...

Just started getting the black screen of death.
Checked the monitor with my laptop, and at least the monitor pops up on HDMI. It gets "no signal" from the desktop on both HDMI and DP.

It got progressively worse, too. At first it would come up during POST and then make it into Windows for a bit before going black. Then it went black after less than 5 minutes. Then it would be black on reset and wouldn't even show the POST screen.

I had issues a while back with my prior RX 580, and I had to turn off all the Windows 10 overlay bells and whistles for it to stop crashing during games or video playback. So I am hoping it's the same thing, where some recent Windows update turned some of the offenders back on. Sort of the same symptoms, where it would crash and disable my GPU and I would play the hunt-for-the-working-driver game.

But it could be that my weeks-old 6650 XT is now borked. Getting progressively worse and not showing the POST screen is very troubling. And I don't have a replacement GPU in the meantime.

Possibly the wrong thread for this but: I have received an unexpected Christmas bonus from my employers - it’s enough to get a substantial upgrade on my monitor.

I’m now running a 3070 Ti - am I better going for 1440p or the full 4K?

4K no question!

fangblackbone wrote:

4K no question!

Hmmmm. I can get a basic spec 4k or a reasonable spec 1440p.

Sorbicol wrote:

Possibly the wrong thread for this but: I have received an unexpected Christmas bonus from my employers - it’s enough to get a substantial upgrade on my monitor.

I’m now running a 3070 Ti - am I better going for 1440p or the full 4K?

I think your card is well balanced at 1440p and would be straining at 4k. Monitor size and tech also start mattering a lot here.

I have a 32 inch 1440p monitor and sometimes it feels almost too big at normal viewing distances. It's really nice for productivity things, but it consumes so much of my field of view that it can trigger a bit of vertigo.

I think refresh rate, sync tech, HDR specs (black and light levels), IPS vs other panel techs, and so forth all may matter more than 1440p vs. 4K.

Sorbicol wrote:
fangblackbone wrote:

4K no question!

Hmmmm. I can get a basic spec 4k or a reasonable spec 1440p.

Maybe go for the best 1440p possible within budget?

Feeank wrote:
Sorbicol wrote:
fangblackbone wrote:

4K no question!

Hmmmm. I can get a basic spec 4k or a reasonable spec 1440p.

Maybe go for the best 1440p possible within budget?

I’d say it depends on what you value most. Do you want the prettiest graphics or highest framerate? Some combination?

4k at high settings with HDR can look amazing but you pay in framerate to get it.

Since you didn’t mention it, I’m assuming you likely aren’t trying to push a high framerate for esports, so don’t bother getting anything above a 120/144 Hz refresh rate. Getting a monitor that is G-Sync compatible is likely worth it.

I usually suggest people go to RTINGS, find the “best of” section that most closely matches their needs, then go from there. Not necessarily to find the exact model to buy but to get an idea of what kinds of models match your preference and price range. Here’s the “best of” monitor sections:
https://www.rtings.com/monitor/revie...

Maybe start with the best by price section and see what you find. Then dig into similar models.

Getting a 4K monitor does not mean you have to play games in 4K natively.

We're now in the era of resolution sliders, DLSS/FSR, etc.

Playing at 100% render resolution on a 1440p monitor, and playing at 66.6% render resolution on a 4K monitor, is the same thing as far as the 3D parts go.
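
If it helps to see the arithmetic, here's a quick sketch (Python, just assuming the standard 2560x1440 and 3840x2160 panel resolutions):

# Pixels the GPU actually renders in each case (standard panel resolutions assumed).
pixels_1440p_native = 2560 * 1440                          # 100% render resolution on a 1440p monitor
pixels_4k_scaled = int(3840 * 2 / 3) * int(2160 * 2 / 3)   # ~66.6% render resolution on a 4K monitor

print(pixels_1440p_native)  # 3686400
print(pixels_4k_scaled)     # 3686400 -- identical 3D workload either way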

I don't think there's any reason people need to shy away from 4K monitors any longer. I don't think I've seen a recent 3D game with high performance demands that doesn't have either DLSS/FSR or a render resolution slider. And usually they have both.

And honestly, even running 1440p resolution natively on a 4K screen isn't that bad as far as the scaling goes, not nearly as bad as scaling lower resolutions to a 1080p screen was. The pixels are a lot smaller and more numerous, so the "blurriness" from the uneven division is much less distinct. I run my living room system at 1440p on my 4K TV. (The only reason I don't do 4K with 66.6% render res instead is because my TV only does 4K@60, while it does 1440p@120, so I have to stay on 1440 to get 120hz).
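
For a rough sense of why the fuzz is less noticeable on the denser panel, here's a little sketch (Python; the 27-inch 4K and 24-inch 1080p sizes are just example numbers, not anything from this thread):

import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch along the panel diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))  # ~163 PPI on an example 27" 4K monitor
print(round(ppi(1920, 1080, 24)))  # ~92 PPI on an example 24" 1080p monitor
print(2160 / 1440)                 # 1.5x per-axis upscale from 1440p to 4K (still non-integer, but each output pixel is tiny)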

When I bought my last monitor, 4K monitors were still largely 60hz, so that was a no-go. But there's a good selection of 4K monitors with 144hz+ to choose from now.

*Legion* wrote:

I don't think there's any reason people need to shy away from 4K monitors any longer.

I can think of one: Budget.

If I'm paying extra to leave display performance on the table because I can't afford the matching components, I might be better off saving the money or buying a better performing lower resolution monitor.

(Isn't this the same argument it's always been though? Granted, 4K is well... sort of coming into reach of more people. The monitors at least. GPUs are crazy-go-nuts these days.)

LouZiffer wrote:
*Legion* wrote:

I don't think there's any reason people need to shy away from 4K monitors any longer.

I can think of one: Budget.

If I'm paying extra to leave display performance on the table because I can't afford the matching components, I might be better off saving the money or buying a better performing lower resolution monitor.

I'm not quite sure what you mean by "matching components". Because if you're talking about a GPU that can drive 1440p natively but isn't up to snuff for native 4K, well, that's the whole point about the render resolution sliders. You're going to get the exact same performance driving a game on a 1440p monitor at 100% render resolution as you are with the same game on a 4K monitor at 66.6% render resolution.

If you're talking about monitor feature sets, i.e. getting feature X and Y on a 1440p monitor being cheaper than a 4K monitor, then sure, although that delta is closing.

The point though is that people were avoiding 4K monitors because they didn't have GPUs powerful enough to drive 4K, and they didn't want to deal with ugly scaling of lower resolutions to 4K. But now that games pretty consistently let you scale the 3D render res separately from the output resolution, this should be much less a concern.

The decoupling of 3D render resolution and output resolution is one of those "this is how it should have always been" situations.

It's kind of how it all started with texture smoothing/sampling way back in GLQuake...
Then when texture pixels got small enough to matter less, it became about compression to take up less disk space and load faster.

*Legion* wrote:

The point though is that people were avoiding 4K monitors because they didn't have GPUs powerful enough to drive 4K, and they didn't want to deal with ugly scaling of lower resolutions to 4K. But now that games pretty consistently let you scale the 3D render res separately from the output resolution, this should be much less a concern.

The decoupling of 3D render resolution and output resolution is one of those "this is how it should have always been" situations.

Lots of older games don't have scaling sliders.

Just like some folks are extra sensitive to refresh rate, others are sensitive to color gradients, and others are really bugged by anti-aliasing, some people are really bothered by the visual fuzz and decreased clarity that comes from running at a non-native resolution.

It really bothers me. Having a 4K monitor without 4090-class hardware to drive it is a major perceptual negative for me. It's really bad for people whose eyes can resolve fine detail and who are sitting 2.5 feet away from their screen.

I have a 4K monitor with a 3080 Ti and it runs most of those older games at 100+ FPS. I don't think you need 4090-class hardware to drive it. With newer games you have the resolution sliders. The only issue I've had so far is that if I want or need to run games at lower resolutions, I have to turn off the monitor's overclock (240 Hz down to 100 Hz), otherwise the usual resolutions aren't even available to me. Not a big issue, just an inconvenience of my monitor.

I'll add to that. 3080 here. I use a native 4K 144 Hz monitor over DP 1.4, and I can crank most games up to Ultra with no issues.

What hardware caused you to run into issues with a 4K monitor on a 30X0 card, polq37? Just curious.

4K gaming was good with the 20-series generation. It’s gotten progressively better with the previous generation, and now it’s totally fine unless you want 4K @ 120+ fps, in which case it’s an issue.

polq37 wrote:

Lots of older games don't have scaling sliders.

Older games also don't require as much GPU power to push 4K natively. A GPU that can do new games at 1440p can do old games at 4K.

*Legion* wrote:
LouZiffer wrote:
*Legion* wrote:

I don't think there's any reason people need to shy away from 4K monitors any longer.

I can think of one: Budget.

If I'm paying extra to leave display performance on the table because I can't afford the matching components, I might be better off saving the money or buying a better performing lower resolution monitor.

I'm not quite sure what you mean by "matching components". Because if you're talking about a GPU that can drive 1440p natively but isn't up to snuff for native 4K, well, that's the whole point about the render resolution sliders. You're going to get the exact same performance driving a game on a 1440p monitor at 100% render resolution as you are with the same game on a 4K monitor at 66.6% render resolution.

I mean GPU and CPU, along with everything needed to drive those components. From folks who can only afford integrated graphics on up, there's a spectrum. 4K isn't close to entry-level gaming. As long as it isn't, the primary reason for many remains.

I also mean the potential to run at better frame rates for less money. And...

If you're talking about monitor feature sets, i.e. getting feature X and Y on a 1440p monitor being cheaper than a 4K monitor, then sure, although that delta is closing.

That's part of it as well. EDIT: It's really the main part. While 1080p and 1440p monitors with higher frame rates and better features are out there for less money, matching your monitor with your hardware is a smart choice for those on a budget. Until price parity arrives or the lower resolutions phase out, it'll stay that way.

The point though is that people were avoiding 4K monitors because they didn't have GPUs powerful enough to drive 4K, and they didn't want to deal with ugly scaling of lower resolutions to 4K. But now that games pretty consistently let you scale the 3D render res separately from the output resolution, this should be much less a concern.

Yes. That point was made well. I am not arguing against it. I will readily argue against the claim that there's "no longer any reason to shy away from 4K," though, because it still comes at a cost if you want to get the most you can out of your hardware on a budget.

The decoupling of 3D render resolution and output resolution is one of those "this is how it should have always been" situations.

Agreed.

What Legion said. It gets even easier with older but popular titles like Diablo 3. I could run that in 4K mostly high on an RX 580.

I think even raytracing at 4K with some of the 3000 series cards was possible so long as you are not looking for 120+ fps.

I am really bummed about my computer. I think the only thing I can do now is buy another GPU to see if that works, and then I can return my current 6650 XT.

Per the Avatar 2 discussion over in the movies thread, we really should be targeting 24fps for a more "cinematic" look.

Based on these monitor comments, I'm assuming that a 1080p monitor with a high refresh rate is all I need / all my 1070 can do. The only reason I would really want to go any higher than that would be a bit of future-proofing, or if I wanted higher-quality video?

Actually, that 1070 will drive 1440p at good frame rates with low graphics settings, or decent frame rates at medium settings. Going high still gives you functional frame rates.

I'd future-proof by buying a monitor with a higher resolution than my card can do (but in the same aspect ratio, and at one of the common resolutions: 1080p, 1440p, 4K). I would not buy too far ahead of my purchasing ability - spending $1K on a 4K monitor in the hopes that you can upgrade your 1070 to a 1650 is pretty much a waste of time, as that 4K monitor will die before the 30x0s come down to your price level, most likely. But buying a 1440p and growing into it with, say, a used 1650 would be a perfectly viable upgrade strategy. I've done it myself.

Robear wrote:

Actually, that 1070 will drive 1440p at good frame rates with low graphics settings, or decent frame rates at medium settings. Going high still gives you functional frame rates.

I'd future-proof by buying a monitor with a higher resolution than my card can do (but in the same aspect ratio, and at one of the common resolutions: 1080p, 1440p, 4K). I would not buy too far ahead of my purchasing ability - spending $1K on a 4K monitor in the hopes that you can upgrade your 1070 to a 1650 is pretty much a waste of time, as that 4K monitor will die before the 30x0s come down to your price level, most likely. But buying a 1440p and growing into it with, say, a used 1650 would be a perfectly viable upgrade strategy. I've done it myself.

A 1070 is better than a 1650. Heck, a 1060 is better than a 1650.

pandasuit wrote:

A 1070 is better than a 1650. Heck, a 1060 is better than a 1650.

And the 1650 is the most common GPU in the Steam hardware survey.

Huh. I'm surprised. I went to the 1650 instead of the 1070 because it was newer... guess I should have checked further. (This was a while ago.)

FWIW, the 1650 did well for me as an upgrade from the 960.

Shoot, I drove my 1440p monitor with a GTX 970 for years. I had to turn off anything resembling a dynamic shadow, but I got by OK.