Help Me Build My PC Catch-All

I've been running a Phenom II X6 1035T for a long time now, most recently paired with a GTX 560 SE. This gets me ~45 fps in Titanfall on low/med and ~30 fps in BF4 on med at 900p. I'm looking for a minimum upgrade to get me to 60 fps on med/high at 1080p for these games and games likely to come out over the next year.

I'm worried because the 1035T and the 560 SE seem really well matched. If I upgrade one, I think the other would become a bottleneck.

Do you think a GTX 760 alone would do the trick, or should I save up for a matched GPU/CPU upgrade to get where I need to be? Anyone running a 760 with an older AMD?

Schmootle wrote:

I've been running a Phenom II X6 1035T for a long time now, most recently paired with a GTX 560 SE. This gets me ~45 fps in Titanfall on low/med and ~30 fps in BF4 on med at 900p. I'm looking for a minimum upgrade to get me to 60 fps on med/high at 1080p for these games and games likely to come out over the next year.

I'm worried because the 1035T and the 560 SE seem really well matched. If I upgrade one, I think the other would become a bottleneck.

Do you think a GTX 760 alone would do the trick, or should I save up for a matched GPU/CPU upgrade to get where I need to be? Anyone running a 760 with an older AMD?

I'm running dual 6950s with the Phenom II X6 1100T, and so far BF4 (Crossfired) and Titanfall (Crossfire off, since there's no support for it in-game yet) both run pretty damned smooth. Not the highest settings, but at 1920x1200 and nearly-high settings they're super smooth (haven't checked exact framerates, tbh).

EDIT: That is to say, I'd think just upgrading your graphics card would do you wonders for now, especially if you're on a budget.

I went and looked, and they make it sound like BF4 really uses a lot of CPU, but it's possible that your large number of cores may make up for the fact that the individual cores aren't mega-fast.

The Phenom II was pretty good, about like a Core 2, and the 1035T is clocked at 2.6GHz. That's not lightning speed, but it's not too bad. And then you've got six of those, two more than nearly all Intel chips, and with BF4 being widely multithreaded, it might really work pretty well. (The new Bulldozer chips suck terribly, but the Phenom II was pretty good, just late to the party... they matched the Core 2 about 18 months after it first shipped.)

It also sounds like BF4 likes lots of memory: it's really a pig. The article I was reading on HardOCP suggested that they were running into problems at 8 gigs, and felt they needed more. But memory is real spendy right now, so I'd probably hold off on doing anything there until you know you need it.

So, yeah, I think I'd probably try just a strong video card, and see how it goes. The 760 has been more or less our default recommendation, but I think it might not be enough for BF4 on high settings. Unfortunately, HardOCP, who are the people I really trust for video card reviews, haven't done any 760-class cards anytime recently, and in the last review, they didn't include any BF4 numbers.

I'm suspicious that you may want an AMD 280X, a $350 card, if you really want BF4 to sit up and sing, but I'm not certain yet. I'm gonna keep digging around and see if I can come up with some hard numbers.

Update on my media/BU project: despite my testing the system I intended to use before purchasing HDDs for it, apparently that boot was this PSU's last gasp. Just purchased a cheap single-rail PSU (still a Thermaltake, in case I need to use it for a while) and a new mobo battery (no sense in not having one on hand if that isn't the problem). We'll see. I think I've reached the amount of money I'm willing to spend on this, so if the PSU isn't the problem and it's the mobo, this is going on hold until I rebuild my regular PC. I can just toss those HDDs into my regular rig; I've still got 3 bays open.

Yeah, in looking around, if you want BF4 at max settings at 1920x1080, you're gonna want either an AMD 280X or an NVidia 770. The 280X is just a hair faster, and I think it's a little cheaper.

Thanks for the help, folks. I managed an overclock on the 560 SE yesterday (775 to 900MHz, plus some memory) that took my FPS from 45 to 55 and even let me increase the details a little bit, although I have to keep shadows low and textures medium. This agrees with what you are saying, that there is still ceiling left on the CPU. I think I will grab the 770 after the next drop so it's ready for the next CPU, whenever that happens. AMD drivers have left me... sad... in the past.

Keep an eye on this subreddit for GPU deals. They come up all the time.

http://www.reddit.com/r/buildapcsales

Current AMD drivers are very robust.. you wouldn't have any more or fewer problems with Nvidia.

side note

[H] had a preview of AMD's Mantle API in BF4.. the short of it is that it's nothing game-changing, but it's a nice free boost: anywhere from a 20% gain over D3D at low resolutions to 5-7% at high resolutions.

TheGameguru wrote:

Current AMD drivers are very robust.. you wouldn't have any more or fewer problems with Nvidia.

Now if they'd just fix the weird "corrupt pointer on one monitor bug" that randomly happens occasionally. That one's been going on for a long time.

side note

[H] had a preview of AMD's Mantle API in BF4.. the short of it is that it's nothing game-changing, but it's a nice free boost: anywhere from a 20% gain over D3D at low resolutions to 5-7% at high resolutions.

Anandtech's stuff seemed to show Mantle helping more on low-powered CPU systems, with less of an increase (your 5-7%) on higher-end processors.

So if someone's thinking about a graphics card upgrade specifically for a Frostbite 3 game (Battlefield 4, Dragon Age 3, Mass Effect 4, NFS, etc.), and they don't have a fast CPU, AMD might now be even more compelling.
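To make those percentages concrete, here's a trivial back-of-envelope sketch (only the percentages come from the Mantle preview; the baseline fps figures below are made up for illustration):

```python
def with_gain(fps, pct_gain):
    """Apply a claimed percentage framerate gain to a baseline fps figure."""
    return fps * (1 + pct_gain / 100)

# CPU-bound low-resolution case: the ~20% claimed gain over D3D
print(round(with_gain(45, 20), 1))  # 45 fps -> 54.0 fps

# GPU-bound high-resolution case: the ~5-7% claimed gain
print(round(with_gain(60, 6), 1))   # 60 fps -> 63.6 fps
```

In other words, the lower your fps is because of the CPU, the more Mantle's draw-call savings matter; once the GPU is the limiter, there's much less left to claw back.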

Now if they'd just fix the weird "corrupt pointer on one monitor bug" that randomly happens occasionally. That one's been going on for a long time.

That's fair.. happens to me occasionally, so I know the one you are talking about.. But that's not to say that Nvidia doesn't have its own share of weird bugs (especially with multi-monitor). I have 2 that happen occasionally as well (about as frequently as the corrupt pointer).

My point is that it's disingenuous to say AMD has driver issues and Nvidia does not (especially if you look at the release notes in their drivers), since I would argue they probably have equal amounts of issues across different games and hardware scenarios.

TheGameguru wrote:
Now if they'd just fix the weird "corrupt pointer on one monitor bug" that randomly happens occasionally. That one's been going on for a long time.

That's fair.. happens to me occasionally, so I know the one you are talking about.. But that's not to say that Nvidia doesn't have its own share of weird bugs (especially with multi-monitor). I have 2 that happen occasionally as well (about as frequently as the corrupt pointer).

My point is that it's disingenuous to say AMD has driver issues and Nvidia does not (especially if you look at the release notes in their drivers), since I would argue they probably have equal amounts of issues across different games and hardware scenarios.

Oh, I agree. I have had weird AMD bugs, but nothing bad enough that I wouldn't buy another AMD if it was the best bang for the buck in my price range.

The bad thing about the cursor bug is that the fix that will work is to turn on pointer trails. But if you turn on pointer trails, you often lose the cursor entirely from some game UI's (like Mechwarrior).

I get that what you're saying about drivers is true, but can't reconcile that with my own experiences.

I had more issues in 6 months with a pair of 6950's than I've had in the entire ownership period of my current and previous two pairs of Nvidia cards.

I know stats don't lie, but as long as performance is equal or relatively close, I'm going to keep going with the one that treats me best.

Thin_J wrote:

I get that what you're saying about drivers is true, but can't reconcile that with my own experiences.

I had more issues in 6 months with a pair of 6950's than I've had in the entire ownership period of my current and previous two pairs of Nvidia cards.

I know stats don't lie, but as long as performance is equal or relatively close, I'm going to keep going with the one that treats me best.

I've got a 6950, but am not running Crossfire. I think that's where both sides end up falling down more often than in single-card installations. Maybe AMD's Crossfire is worse than Nvidia's SLI?

Yeah, crossfire is still struggling although it's improved in the last couple months. It was pretty much worthless until they started frame pacing a couple months ago. When I had 5870s in crossfire I always thought something was off. Even if Fraps showed 60fps, it always felt less smooth than my Nvidia laptop when it held a stable 30. The framerate tended to fluctuate a lot too. It's at least something to consider before going with a crossfire setup. I'm so happy I don't have to deal with their drivers and delays on crossfire compatibility with newly released games.

That said, their drivers are way better if you want to do an Eyefinity/Surround setup.

At the time that was kind of true. I don't know that it is now. *This is a response to Mannish, not Tuffalo

I had a brief experience with a pair of 280Xs that performed well, though they dumped heat like nothing I've ever owned: they turned a case that actually has really solid airflow into a hotbox and raised CPU temps by about 10C over the pair of cards they replaced.

When I was testing that PC in my little PC room it had me sweating through my clothes.

My (apparently unlucky) history with drivers, and that experience with those 280s, is what led to my pair of 770s.

At the risk of going off on a tangent, can anyone recommend a shop that will build a PC for me?

I've bought two machines over the years from CyberPower and have had good results both times. Is it cheaper/better to look for something off the shelf so to speak, or is there a custom shop that offers good deals? Or would I be better off finding a friend-of-a-friend to do this?

I'm in the market for a low-to-mid-range gaming machine that'll set me back $800 to $1,000.

Thanks for any leads!

(I'm going to head off the why-don't-you-build-your-own-you-lazy-ass by saying I don't have the time in my day or the space in my house to do it, and my tolerance for frustration is at an especially low ebb these days. I wish I could be more Zen about this, and in a parallel life I'd build my own rig, but this just ain't happening right now.)

I have a pretty strong preference for NVidia, because I do a lot of retrogaming, and AMD drivers are really crap at doing that. Old games blow up on their hardware with distressing regularity.

I thought the drivers were pretty okay when I got my 5870, but they just kept getting worse and worse, and with the entirely new architecture on the horizon, I figured an upgrade would not be a good idea: my assumption was that the bugs would get worse on both architectures, as they ignored the older 5870 and learned the new 7XXX. It was already showing pretty severe bitrot, and I didn't think it was likely to get better during a major architecture switch. So, I swapped back to an NVidia 680, and it's been treating me very nicely.

I was just reading yesterday that the 280X can be really hard to find. AMD cards in this generation are strong at compute (NVidia cards are horrible at compute: they crippled them to force people to buy the pro cards), and people are buying it for cryptocoin mining. The 280X has a ton of compute muscle without a giant price sticker, so it's very appealing for people who want serious grunt, but want to minimize their fiscal risk level.

For myself, I definitely wouldn't buy an AMD for actual graphics, because old games are too likely to fail. But if I needed compute power, I'd be eyeing them very seriously.

MannishBoy wrote:
TheGameguru wrote:
Now if they'd just fix the weird "corrupt pointer on one monitor bug" that randomly happens occasionally. That one's been going on for a long time.

That's fair.. happens to me occasionally, so I know the one you are talking about.. But that's not to say that Nvidia doesn't have its own share of weird bugs (especially with multi-monitor). I have 2 that happen occasionally as well (about as frequently as the corrupt pointer).

My point is that it's disingenuous to say AMD has driver issues and Nvidia does not (especially if you look at the release notes in their drivers), since I would argue they probably have equal amounts of issues across different games and hardware scenarios.

Oh, I agree. I have had weird AMD bugs, but nothing bad enough that I wouldn't buy another AMD if it was the best bang for the buck in my price range.

The bad thing about the cursor bug is that the fix that will work is to turn on pointer trails. But if you turn on pointer trails, you often lose the cursor entirely from some game UI's (like Mechwarrior).

I don't do that to fix it. I just refresh the desktop a few times and it fixes it.

To ThinJ's dual 6950 experiences:

I own those exact cards now (bought them off ThinJ a while back ) and haven't honestly noticed micro-stuttering, though I am aware of the issue. My rig isn't high-end at all, so I get slow-down due to other factors (I like to up my graphics settings then work down from there until the game is playable, rather than buttery-smooth-- so I get something like 50FPS average in BF4, for example, but most of the settings are turned up to ultra). For the most part I haven't had any major issues with the Crossfire setup, save for some graphics glitches in a BF4 patch (was quickly fixed), or a temporary lack of Crossfire support for a brand new game (Titanfall Beta has to run sans-Crossfire, for example). They were running pretty warm, so I installed Arctic's Accelero Twin Turbo fans and OC'ed the hell out of the cards and they run without a hitch or excess heat (office is noticeably cooler than it used to be), and they're quieter, too. I've never had a major problem with drivers that has rendered any game I've played totally unplayable or even ugly. Like I said, BF4 had some Crossfire issues, but that was on DICE, not AMD, and it was resolved rather quickly.

Would I buy two cards again? Probably not. But I think I'm kind of an edge case, as I use my PC for both gaming and art, and Crossfire/SLI is useless in art software-- nothing takes advantage of dual GPUs. Hell-- AMD is practically useless in art software, as programs like Maya or 3DS Max don't run DirectX properly on AMD cards, so no proper Viewport 2.0 or alpha sorting on an AMD GPU. I've also found a lot of CGFX shaders run like sh*t (if at all), and HLSL shaders tend to be programmed with DX in mind rather than OpenGL, so they are kind of a crapshoot as well (lots of floating-point differences in the code that DX handles but OGL is a stickler about).

tl;dr: I've had good experiences to counter ThinJ's bad ones with the very same cards, but I don't plan on running a dual-GPU setup (or even AMD, for that matter) in my next build.

Not that this is an argument against anyone else's experiences/knowledge, just my experience/thoughts on running dual GPUs.

So I'm sure there's a software solution out there somewhere, and I'm figuring out how to diagnose this stuff myself, but I thought I'd ask folks here first. My system has handled everything like a champ. I've tried out BioShock Infinite and Skyrim (amongst other games) and everything is absolutely stunning. I'm having one problem, though, with NBA 2K14. It's basically a port of the PS3/Xbox 360 game, so it's actually quite last-gen. But I experience constant tearing when I play it. With situations like this, where's the best place to start? It seems like with games like this that aren't very popular, there isn't a lot of movement on a patch or a fix from the community around the game. So I'm wondering if there's some obvious setting that might be the cause of it.

TheGameguru wrote:

I don't do that to fix it. I just refresh the desktop a few times and it fixes it.

What I meant was to prevent it, you can set cursor trails on (I use low, so it's almost invisible). I can't remember it ever happening after doing that.

DSGamer wrote:

So I'm sure there's a software solution out there somewhere, and I'm figuring out how to diagnose this stuff myself, but I thought I'd ask folks here first. My system has handled everything like a champ. I've tried out BioShock Infinite and Skyrim (amongst other games) and everything is absolutely stunning. I'm having one problem, though, with NBA 2K14. It's basically a port of the PS3/Xbox 360 game, so it's actually quite last-gen. But I experience constant tearing when I play it. With situations like this, where's the best place to start? It seems like with games like this that aren't very popular, there isn't a lot of movement on a patch or a fix from the community around the game. So I'm wondering if there's some obvious setting that might be the cause of it.

Maybe check your GPU's settings for a hardware-forced V-sync? You know, the graphics control panel-- there might be something in there about forcing certain settings or letting the application dictate video settings. Otherwise, I've heard the port of NBA 2K14 was complete crap by any standard, so that might just be something you have to live with, as far as I know. This is all assuming there are no V-sync options in the graphics settings of the game itself, though. If you can, turn on V-sync with triple buffering; that usually nets the best results (at the cost of some extra VRAM and a bit of input latency, rather than the framerate hit you can get from plain double-buffered V-sync).

Yeah if it's a last gen console port it's probably just running at a higher framerate than your monitor can display. Thus, the tearing.

VSync will fix it, though my favorite "fix" for that issue when I was still on 60hz monitors was to just get in the driver profile for whatever game it was and crank up AA/AF until the framerate stayed around or right below 60.

That makes the game look nicer and kills the tearing without some of the weird drawbacks of VSync.
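For the curious, the idea behind any software frame cap (whether from a driver-level limiter or just cranking AA until the card can't exceed refresh) is keeping frame delivery at or below the monitor's refresh interval. Here's a toy Python sketch of the sleep-based version; this is purely illustrative, since real limiters live in the driver, not in application-level sleeps:

```python
import time

REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~16.7 ms per frame at 60 Hz

def run_capped(n_frames, render=lambda: None):
    """Render n_frames, sleeping away whatever is left of each frame's budget.

    Returns total elapsed seconds; with the cap active this can't come in
    meaningfully under n_frames * FRAME_BUDGET, so frames never outpace refresh.
    """
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        render()  # stand-in for the actual frame work
        spent = time.perf_counter() - frame_start
        if spent < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - spent)
    return time.perf_counter() - start

elapsed = run_capped(30)
print(f"30 frames took {elapsed:.3f}s (floor is {30 * FRAME_BUDGET:.3f}s)")
```

With frames arriving no faster than the display refreshes, the swap has far fewer chances to land mid-scanout, which is why capping near 60 on a 60Hz panel mostly kills the tearing even without VSync.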

Enix wrote:

At the risk of going off on a tangent, can anyone recommend a shop that will build a PC for me?

I've bought two machines over the years from CyberPower and have had good results both times. Is it cheaper/better to look for something off the shelf so to speak, or is there a custom shop that offers good deals? Or would I be better off finding a friend-of-a-friend to do this?

I'm in the market for a low-to-mid-range gaming machine that'll set me back $800 to $1,000.

Thanks for any leads!

(I'm going to head off the why-don't-you-build-your-own-you-lazy-ass by saying I don't have the time in my day or the space in my house to do it, and my tolerance for frustration is at an especially low ebb these days. I wish I could be more Zen about this, and in a parallel life I'd build my own rig, but this just ain't happening right now.)

The best thing is to go to your local computer shop. They will often be cheaper than CyberPower or the like, and you get to support a local business.

To my horror, I discovered 2... Cat 5 cables in use in my home network. I know, I know, I am the worst at computers (my whole house is wired for Cat 6; I must have had these lying around and forgotten to replace them).

Still, it doesn't matter since right now my biggest bottleneck is I/O due to this crappy drive I was using for data. But that's been sorted. New machine is up and rolling and things are actually loading faster off the new machine since the drive in there is so much better.

Thanks for the help, all.

Took me a while, but I've finally got my PC up and running. Not sure what did it. I took everything out and then reinstalled it all. Got it up and going. Thanks for all the prior suggestions; they were much appreciated.

To my horror, I discovered 2... Cat 5 cables in use in my home network. I know, I know, I am the worst at computers (my whole house is wired for Cat 6; I must have had these lying around and forgotten to replace them).

Actually, as long as they're Cat5e, that's all you need for gigabit. Cat6, unfortunately, isn't better. Well, more specifically, it's not enough better. If you want to run 10 gigabit, you want Cat6a. 10GbE may work over Cat6, especially over short runs, but you want 6a to be sure.

There was an aborted gigabit standard that required straight Cat6, which I think is where the idea spread that that's what you need, but it's not true: 5e is enough. I think the failed standard (1000BASE-TX) used only two pairs in each direction over Cat6, while the standard that succeeded (1000BASE-T) uses all four pairs at the 5e level. And then 10 gig needs 6a. (That's probably about as far as we'll go with copper... after that, you really want fiber... and you probably want fiber even at 10 gig.)

Now, if those cables you removed were regular Cat5, not 5e, then you had the same situation as running 10GbE over plain Cat6: it might work, especially over short runs, but it's not certain.
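If you ever want to confirm a run actually negotiated gigabit, `ethtool <interface>` on Linux reports the link speed. Here's a small Python sketch that parses that style of output; the interface name and sample text are made up for illustration:

```python
import re

def link_speed_mbps(ethtool_output):
    """Pull the negotiated link speed, in Mb/s, out of ethtool-style text."""
    m = re.search(r"Speed:\s*(\d+)\s*Mb/s", ethtool_output)
    if m is None:
        raise ValueError("no Speed line found; link may be down")
    return int(m.group(1))

# Hypothetical ethtool output for a gigabit link that came up fine over Cat5e:
sample = """Settings for eth0:
\tSpeed: 1000Mb/s
\tDuplex: Full
\tLink detected: yes
"""

speed = link_speed_mbps(sample)
print("gigabit OK" if speed >= 1000 else f"only {speed} Mb/s; check the cable")
```

A bad or marginal cable often shows up here as a link that silently fell back to 100Mb/s, which is a quicker tell than waiting to notice slow transfers.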

This is a fun article about the 750 Ti.

Edit: Think about it from the perspective of being a kid in high school. You could upgrade the family computer to play games for $150.

That's way under the performance class we usually point people at, but it's a heck of a lot better than onboard video, and it just uses slot power, so you should be able to put it in very small and/or power-constrained computers.

Something we'll have to keep in mind. Someone was looking at a rig for an RV, and I think this card might be just the thing.

Looks like a great card at its price point. I also saw Nvidia refreshed the Titan line.