Help Me Build My PC Catch-All

So, I've noticed RAM is super cheap.
Also, my video card is a little long in the tooth, and my motherboard is the older 1156 socket, so while there's a lot of room for CPU upgrading, that's not really cost-effective any more. I waited too long, I think.

Anyway, would I get more bang for the buck with a GPU upgrade or a RAM upgrade? Maybe around $100-150. I know it won't take much GPU to make my system CPU-bound, but if I do decide to rebuild later, I could always carry the GPU forward.

Aside from the below, I have 2 HDs. One is a Maxtor 6L200S0 (189 GB, maybe SATA, maybe IDE; only using about 125 GB of that - Windows and most apps on it).
The other is a WD 10EALS-00Z8A0 (1 TB, SATA; only using about 125 GB of that - Steam, mainly).

Current system:
Intel Core i3 530 @ 2.93GHz
4 GB G.Skill Ripjaws RAM
ATI Radeon HD 5770, 1 GB GDDR5
Win 7 Home Premium 64-bit

Monitor is only 1680x1050 (Asus VW224U 22-Inch LCD Monitor http://www.amazon.com/gp/product/B00... ).
Not sure, but I might be using the 500W EarthWatts PSU that came with the Antec Sonata III 500 (http://www.newegg.com/Product/Produc... ).

Intel Core i3-530 Clarkdale 2.93GHz 4MB L3 Cache LGA 1156 73W Dual-Core Desktop Processor BX80616I3530
http://www.newegg.com/Product/Produc...

GIGABYTE GA-P55A-UD3 LGA 1156 Intel P55 SATA 6Gb/s USB 3.0 ATX Intel Motherboard
http://www.newegg.com/Product/Produc...

G.SKILL Ripjaws Series 4GB (2 x 2GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Desktop Memory Model F3-12800CL9D-4GBRL
http://www.newegg.com/Product/Produc...

SAPPHIRE 100283L Radeon HD 5770 (Juniper XT) 1GB 128-bit GDDR5 PCI Express 2.0 x16 HDCP Ready CrossFireX Support Video Card
http://www.newegg.com/Product/Produc...

I recently acquired a GTX 480 (should be a good upgrade for my aging GTX 260 at 1920x1200), but my ancient power supply doesn't appear up to the task (it doesn't even have a PCI-E 6+2-pin power plug). Reference specs on the 480 seem to indicate a 600W minimum, though I suspect the real requirement is a solid 12V rail (it may need up to 25A?).

I was looking at replacing my existing no-name PS with:

http://www.newegg.com/Product/Produc...

It's rated lower than the reference spec, but it has 40A on the 12V rail. Think that'll be sufficient? Or should I step up to a higher-wattage unit?
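Back-of-the-envelope, here's how I'm reasoning about the 12V rail. The GPU figure is the 480's 250W board power; the CPU and misc numbers are pure guesses on my part, so correct me if they're way off:

```python
# Rough 12V-rail budget for the GTX 480 swap. Only the GPU number is a spec
# (250 W board power); the CPU and misc figures are guesses.
rail_amps = 40                      # advertised 12V rating of the candidate PSU
rail_watts = rail_amps * 12         # 480 W available on the 12V rail

gpu_watts = 250                     # GTX 480 board power (worst case)
cpu_watts = 130                     # guess for an older CPU under load
misc_watts = 40                     # drives, fans, board overhead (guess)

load = gpu_watts + cpu_watts + misc_watts
print(f"12V capacity {rail_watts} W, estimated load {load} W, "
      f"headroom {rail_watts - load} W")
# -> 12V capacity 480 W, estimated load 420 W, headroom 60 W
```

So on paper it looks workable, but that roughly 60W margin is exactly why I'm asking.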

duckilama wrote:

Just general "go faster, turn on more shiny details" pretty much. The entire system was bang-for-buck when I built it in ... 2010? Maybe spring 2011?

For the longest time all I played was L4D2 and TF2. Now it's usually D3 and Civ 5. Skyrim runs OK but not great. Same with GTA4.

Edit: one thing I really like about my current system is that it doesn't seem very loud. I'm no expert, though. I've never had any ATI driver problems, and the only graphical issues I ever saw were due to an improperly seated CPU HSF.

I had a look around Newegg; you can pick up a 6850 for $150 and get a rebate on that. At the resolution you're gaming at, you'll probably be able to crank most settings in most games. It's actually probably overkill, but the 5770 was a decent budget card, and I don't think anything else below $150 will be enough of an upgrade to be worth it.

*edit*
It will also potentially give you some legs if you upgrade the monitor and/or MB/CPU platform.

RAM is nice to have for general computing, but after 4 gigs it doesn't help gaming directly.

How would http://www.newegg.com/Product/Produc... (550Ti) compare to a 6850? I may have been looking in the wrong place, but there seem to be a lot of 550 Tis under $150, and people really seem to love nVidia these days. I haven't used nVidia since the GeForce3-series jet-engine noise-makers.

I also see a lot of 7770s for under $150. Is that one of those series where they use the chips from a "previous" series and cut costs in various ways?

I hate GPU shopping.

I don't think you're going to do wildly better for $150. At $250, the speed levels would be a major step forward, and $300 would get you up to a modern mainstream card like the 660Ti, which would find 1680x1050 a nice skip through the daisies.

The major area that's weak on the 5770 is memory bandwidth, which is the most expensive thing on video cards. It's what's needed to drive most full-screen effects, especially at higher resolutions. If you're careful to disable both full-screen antialiasing (FSAA) and multisample antialiasing (MSAA), that should give you pretty good framerates in Skyrim. My expectation is that it should run quite well. If it doesn't, you might want to verify that you're not having a heat issue before spending any money. I could be wrong about the performance you should see from a 5770, as I've never actually owned one, but checking into a heat problem doesn't cost anything but time.
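If you want to put a number on that, memory bandwidth is just effective transfer rate times bus width. The clocks and bus widths below are the reference specs as I remember them, so double-check them before trusting the output:

```python
# Bandwidth (GB/s) = effective transfer rate (MT/s) * bus width (bits) / 8 / 1000.
# GDDR5 transfers 4 bits per pin per memory clock, so effective rate = clock * 4.
# Reference specs from memory -- verify against your actual card.
def gddr5_bandwidth(mem_clock_mhz, bus_width_bits):
    effective_mts = mem_clock_mhz * 4
    return effective_mts * bus_width_bits / 8 / 1000

print(f"HD 5770 (1200 MHz, 128-bit): {gddr5_bandwidth(1200, 128):.1f} GB/s")  # ~76.8
print(f"HD 6850 (1000 MHz, 256-bit): {gddr5_bandwidth(1000, 256):.1f} GB/s")  # ~128.0
```

That roughly 1.7x gap is a big part of why the 6850 keeps coming up in this thread.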

ATI has rather fallen out of favor with us, between buggy driver releases and the completely botched 7XXX series. The 7970s were released at super-premium pricing, offering no more bang-per-buck than the prior generation, and the drivers were awful, at least at first. The 7970 was faster, but it carried a price tag that reflected that speed exactly. We perceived it as massive price-gouging. Then NVidia just slapped them silly with the 680, and then the 670, and our loyalties have shifted. In the compute market, I gather that ATI is a fair bit stronger, but for gamers, not so much.

Also worth noting is that the 7XXX series has abandoned the VLIW architecture of the earlier chips, so my expectation is that you'll see bitrot in the older drivers much faster than you ordinarily would, as most of their engineering effort will now be devoted to driving their NVidia-like 7XXX architecture, instead of their older offerings. The chips are so different that maintaining the old driver stack will be quite expensive, and with the financial difficulties they're having, I would be fairly surprised if they stay committed to keeping those old cards working well for very much longer.

Ah, thanks for the lesson man!
Like I said, my last nVidia card was one of those loud, hot, power-sucking GeForce 3 cards waaaaaay back when. EverQuest times, maybe.

I've never had driver issues with ATI, but I also don't ride the bleeding edge like some folks here. I don't have room for 3 monitors either, so Crossfire never even came up. While wraparound FPSes would be neat, it just won't happen for me.

One of the first things I do in a game like Skyrim is go "fix" the autodetected graphics settings. I usually cut AA at least in half, knock shadows and water down at least a notch, and often try to extend max draw distance a tad. I'm used to reducing settings even at 1680. It's a shame there's not much to upgrade around $150, but it's also nice to know I did an OK job with this bang-for-buck build. Kinda hard to believe it's still chugging away pretty solidly. Then again, with the console cycle lengthening, I'm still running higher resolution than a 720p TV, right?

So I guess I should just sock some money away for the next cycle. Or get an iPad.

As a real-world counterpoint: with the 12.8 ATI drivers, my Crossfire 7970s offer superior gaming to my 680s with 3x27" LCDs.

7970s are just beastly cards that continue to get better with every driver release. I think they get a bad rap on this site that isn't entirely justified. Agreed, they're overpriced, but that's the market we exist in.

The first camp that makes PLP work for multi-monitor gaming will please a great deal of high-end PC gamers. My money is on AMD with the 8000 release.

PLP?

duckilama wrote:

PLP?

Portrait Landscape Portrait

Basically the ideal multi-monitor setup is a single 30" Dell U3011 in landscape, flanked on either side by a Dell FP2007 20" in portrait. Three 30" or 27" panels are nice, but they're massive in all-landscape and very tall in all-portrait. The PLP layout is ideal for pixel count and visibility/immersion.
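Rough pixel math, assuming the U3011 runs 2560x1600 and the 20-inch Dells run 1200x1600 in portrait (those resolutions are from memory, so double-check):

```python
# Pixel counts and total width for the layouts being discussed.
# Resolutions are assumed from memory; bezel compensation is ignored.
layouts = {
    "PLP (1200x1600 | 2560x1600 | 1200x1600)": [(1200, 1600), (2560, 1600), (1200, 1600)],
    "3 x 27in landscape (2560x1440 each)": [(2560, 1440)] * 3,
    "Single 1920x1080": [(1920, 1080)],
}

for name, panels in layouts.items():
    megapixels = sum(w * h for w, h in panels) / 1e6
    total_width = sum(w for w, _ in panels)
    print(f"{name}: {megapixels:.1f} MP, {total_width} px wide")
# PLP comes out around 7.9 MP at 4960 px wide, versus ~11.1 MP at 7680 px wide
# for triple 27s and ~2.1 MP for a single 1080p screen.
```

So PLP sits between a single screen and a full triple-landscape array in raw pixels while staying much narrower, which is the visibility/immersion argument.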

TheGameguru wrote:

As a real-world counterpoint: with the 12.8 ATI drivers, my Crossfire 7970s offer superior gaming to my 680s with 3x27" LCDs.

7970s are just beastly cards that continue to get better with every driver release. I think they get a bad rap on this site that isn't entirely justified. Agreed, they're overpriced, but that's the market we exist in.

The first camp that makes PLP work for multi-monitor gaming will please a great deal of high-end PC gamers. My money is on AMD with the 8000 release.

Are you using 4 GB 680s, out of curiosity? I'm running two 4 GB 680s, and they are monsters. The main issue is that it's difficult to switch between monitor setups because of the lack of hotkeys like AMD has. I have to open up the actual control panel and go through setting up Nvidia Surround every time I switch. I'm hoping Nvidia will add hotkey support soon. It's pretty crazy to be able to play Crysis 2 with the high-res textures on ultra at 6020x1080. I wasn't getting 60 fps, but it was very playable.

I have two 4GB 680s, two 2GB 680s, two 4GB 670s, two 2GB 670s, and a single 660 and 560 currently running or sitting in something partially running right now. On the AMD side I'm down to my single X79-based 64GB dual-7970 system, but it's my primary system, running three 27s.

With the 12.8 ATI drivers, my Crossfire 7970s offer superior gaming to my 680s with 3x27" LCDs.

Can you be more specific? 'Superior' is awfully vague.

Don't you two get started again!

This stuff kinda matters, Gaald. I was running an ATI card up until a few months ago, and if they're genuinely doing anything better than NVidia, I'd like to hear about it.

Malor wrote:

This stuff kinda matters, Gaald. I was running an ATI card up until a few months ago, and if they're genuinely doing anything better than NVidia, I'd like to hear about it.

Their multi-monitor gaming stuff has been better for a long while now. It's just more refined and more fully featured. It's the one and only thing they're doing better, IMO, but if you're going to do the multiple-monitors thing... Eyefinity is better than Nvidia Surround. Not by leaps and bounds, but it is at least a bit better.

I still don't think it's worth it, because they're so miserably slow with things like updated CrossFire profiles for new games. But then, I've not been interested in multi-monitor gaming anyway, because I simply don't have the space on my desk for the monitors.

So, maybe I've just not been paying attention lately (I haven't), but there are two variations on the Cooler Master Hyper 212: the + and the EVO. From what I can tell, the EVO is $5 more on Newegg, but the specs say it's slightly lighter and quieter, has the heat pipes lined up to be continuous without any aluminum between them where they touch the CPU, and doesn't have the silly gaps at the edges of the heat pipes like the current version of the 212+ does. You can see this clearly in the images on the Cooler Master pages for the EVO and the +.

From the few builds I looked at, it seems like some people haven't picked up on the EVO model or want to save that $5. Any thoughts?

There's also a Hyper 101 that was released last year. It has an 80mm fan (instead of 120mm) with a fixed speed and a much lower CFM rating, a good noise rating, and only 2 heat pipes, and it's shorter and much lighter at about half the weight. It only costs $22 normally ($13 after rebate at Newegg currently), and of course it doesn't do quite as well as the 212, but from the bit of info I can find it does well enough that it might be a decent choice if someone needs a shorter cooler, or isn't interested in overclocking but has CPU cooling issues with the stock fan. It's also more limited in its mounting options, with separate models for Intel and AMD.

Thin_J wrote:
Malor wrote:

This stuff kinda matters, Gaald. I was running an ATI card up until a few months ago, and if they're genuinely doing anything better than NVidia, I'd like to hear about it.

Their multi-monitor gaming stuff has been better for a long while now. It's just more refined and more fully featured. It's the one and only thing they're doing better, IMO, but if you're going to do the multiple-monitors thing... Eyefinity is better than Nvidia Surround. Not by leaps and bounds, but it is at least a bit better.

I still don't think it's worth it, because they're so miserably slow with things like updated CrossFire profiles for new games. But then, I've not been interested in multi-monitor gaming anyway, because I simply don't have the space on my desk for the monitors.

Nvidia Surround is not as good as AMD's. I know some people have found it to be better, but I seriously question how much they did in the past, or have done, with AMD. I find the micro-stutter worse with Nvidia, or perhaps I'm just sensitive to Nvidia's implementation. I get a more consistent frame rate with my 7970s in multi-monitor gaming as well; far fewer peaks and valleys, for sure.

For sure, I would still recommend any of the three current Nvidia offerings over AMD at the $300, $400, and $500 price points... but if someone was setting up a triple-display gaming system, I would steer them towards AMD.

duckilama wrote:

How would http://www.newegg.com/Product/Produc... (550Ti) compare to a 6850? I may have been looking in the wrong place, but there seem to be a lot of 550 Tis under $150, and people really seem to love nVidia these days. I haven't used nVidia since the GeForce3-series jet-engine noise-makers.

I also see a lot of 7770s for under $150. Is that one of those series where they use the chips from a "previous" series and cut costs in various ways?

I hate GPU shopping.

While I agree with what Malor says regarding the relative superiority of Nvidia in terms of performance and drivers, AMD basically has the lower end locked down. The Nvidia 550 Ti is weaker than the 5770 you have now, so it's definitely not worth looking at. If you want to go Nvidia, the 560 Ti is where you need to start looking.

The upgrade from the 5770 to the 6770 was a case of AMD changing the card BIOS and the sticker on the cooler; the 7770 is a different card. Its performance is a little better, but the main benefit is lower power usage. As Malor says, the main limit of the 5770 is the narrow memory bus. If you go to the 6850, that bus width doubles, so you get a noticeable performance increase.

I do think you would be better served by saving a little more and going for a 560ti or 660ti.

McIrishJihad wrote:

AnimeJ - yeah, there's a Microcenter about 35 minutes away, so I'll give that a look. Of course, buying local with a deal + taxes barely breaks even with buying online.

Thanks for the help everyone, now I've just got to convince the wife that this is a good idea...

If you can find an i5-2500K and an AsRock Extreme4 P8Z77 motherboard for less than $260+tax online, then you are a better man than I am. I don't know what your tax rate is, but for me that'd work out to ~$275. Newegg has the mobo for ~$135 and the processor for $220 last time I checked.

That fails to be funny in every possible way.

Thin_J wrote:

That fails to be funny in every possible way.

I know, right? That guy is clearly from AL or MS.

Inside the second: Gaming performance with today's CPUs

Very nice Ars writeup on a bunch of common CPUs. What they're focusing on here is latency, rather than directly on framerates, and whether there's much difference between the various CPUs.

Turns out that it's pretty visible. Intel chips are substantially better. Overall, in a scatter plot comparing price to 99th-percentile frame rates, the 3470 and the 3570 offer the best bang per buck. (They didn't test the 3550, which looked fairly appealing in a recent build.) Our old favorite, the 2500K, isn't too far off the pace, and of course if you overclock it, it'll probably come out much more even, performance-wise. I'm seeing a strong confirmation here of our consensus that we 2500K and 2600K owners have no need of an Ivy Bridge upgrade.
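If the frame-latency approach is new to you, here's a toy illustration of why the 99th-percentile number matters more than average FPS (the frame times below are invented purely for the example):

```python
# Toy example: average FPS can look fine while the slowest 1% of frames
# (the 99th-percentile frame time) is what you actually feel as stutter.
frame_times_ms = [16.7] * 95 + [40.0] * 5   # mostly smooth, a few long frames

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

ranked = sorted(frame_times_ms)
p99_ms = ranked[int(0.99 * len(ranked)) - 1]   # simple nearest-rank percentile

print(f"Average: {avg_fps:.0f} FPS")                              # ~56 FPS, looks fine
print(f"99th percentile: {p99_ms} ms (~{1000 / p99_ms:.0f} FPS)")  # 40 ms, feels like 25 FPS
```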

They also concur with our general group wisdom that the AMD offerings are about equivalent to the Core 2, and that the current Bulldozer chips are much worse, for gamers, than the prior generation Phenom II. Bulldozer is just sh*t on a stick, and you really don't want it if you're a home user.

I wish they'd covered the i3 chips. Kind of bummed that they didn't. I would have been very interested to see where they placed.

Overall, I was very pleased to see how closely these results matched with what we've been telling people.

Malor wrote:

Inside the second: Gaming performance with today's CPUs

Very nice Ars writeup on a bunch of common CPUs. What they're focusing on here is latency, rather than directly on framerates, and whether there's much difference between the various CPUs.

Turns out that it's pretty visible. Intel chips are substantially better. Overall, in a scatter plot comparing price to 99th-percentile frame rates, the 3470 and the 3570 offer the best bang per buck. (They didn't test the 3550, which looked fairly appealing in a recent build.) Our old favorite, the 2500K, isn't too far off the pace, and of course if you overclock it, it'll probably come out much more even, performance-wise. I'm seeing a strong confirmation here of our consensus that we 2500K and 2600K owners have no need of an Ivy Bridge upgrade.

They also concur with our general group wisdom that the AMD offerings are about equivalent to the Core 2, and that the current Bulldozer chips are much worse, for gamers, than the prior generation Phenom II. Bulldozer is just sh*t on a stick, and you really don't want it if you're a home user.

I wish they'd covered the i3 chips. Kind of bummed that they didn't. I would have been very interested to see where they placed.

Overall, I was very pleased to see how closely these results matched with what we've been telling people.

Interesting to see that DICE really nailed the smooth performance metric with BF3. It'll play well on just about any processor of the last two or possibly three generations.

Chairman_Mao wrote:

Interesting to see that DICE really nailed the smooth performance metric with BF3. It'll play well on just about any processor of the last two or possibly three generations.

Kind of the opposite of BFBC2, which liked a lot of processor.

It will be interesting to see, if and when AMD can ever produce a competitive chip, whether Intel has been holding back considerably because of AMD's failures. The difference in the end is mostly 10-15%, so the hurdle isn't exactly that large for AMD to become competitive again.

TheGameguru wrote:

It will be interesting to see, if and when AMD can ever produce a competitive chip, whether Intel has been holding back considerably because of AMD's failures. The difference in the end is mostly 10-15%, so the hurdle isn't exactly that large for AMD to become competitive again.

I can just see Intel, upon AMD releasing a competitive chip, saying: "Oh, hai! You've caught up with us? Here's what we've been holding in reserve that blows your sh*t right out of the water! LOL!"
The image both saddens and amuses me.

Rallick wrote:
TheGameguru wrote:

It will be interesting to see, if and when AMD can ever produce a competitive chip, whether Intel has been holding back considerably because of AMD's failures. The difference in the end is mostly 10-15%, so the hurdle isn't exactly that large for AMD to become competitive again.

I can just see Intel, upon AMD releasing a competitive chip, saying: "Oh, hai! You've caught up with us? Here's what we've been holding in reserve that blows your sh*t right out of the water! LOL!"
The image both saddens and amuses me.

I read somewhere that Haswell's GPU is triple the performance of IB's, so they could be catching up to AMD's lead in integrated graphics.

Well, we know that the 2500 and 2600Ks have a TON of room, and that Intel actively sandbagged the 35XX series with a sh*tty internal thermal material, presumably to impair their overclockability.

If Intel really wanted to, I suspect they could probably ship 5GHz parts today. There wouldn't be very many, but I'm sure they could ship at least some. About the only thing that would make them do that, though, is competitive pressure, and there just isn't any.

It's been said that Intel is its own strongest competitor; they're competing with their own installed base, so moving as slowly as possible means they're making their own lives easier in future years.

Something I wonder about is what's happening to Intel's R&D investment. Without the pressure to release something fast and impressive, and assuming a constant rate of improvement, wouldn't they get backlogged with stuff to release? The other way of looking at it (and I know chip R&D and design is a very long-term process) is that they'll shift the focus of their R&D to other areas where they can currently see a return on that investment.