On AMD/Intel, commoditisation, stagnation, the need for power, etc

There's this article on Ars today regarding AMD: Op-ed: AMD may be facing irrelevance

It touches on many of the topics that have come up in the build-a-PC thread: that AMD really isn't very competitive in the desktop PC market, and Intel has responded by focusing its efforts elsewhere. As noted in the article and comments, desktop isn't the only game in town for these companies, and it must bear its share of the common R&D efforts that produce a wide range of hardware.

There are lots of specifics you can go into, lots of directions processing chips can take, and the merits of each will suit different types of user. I guess what I'm wondering, as a group of people who generally care about gaming, is whether a move away from what has traditionally been good for us (more processing power) towards other areas, such as efficiency at a given capability level, could actually be better, and what the limits of that are.

The other side of the coin to hardware that can do lots of sums is the software that makes those demands, and runs well or poorly depending on what's available. There will always be software that consumes all the resources available to it, but I wonder whether an extended period of limits could be beneficial in reining in inefficient code. The parallel I'd make here is to the current extended console cycle, where developers keep pulling rabbits out of hats to do most of what they want.

Having limits has a cost, as some things won't be possible (say you need to run a few thousand complex AI actors in a world) and the developer would need to produce a different game because they can't make what they want. Something I've seen, though, is that when a new 'generation' of hardware comes out, while there are genuinely things that couldn't be done on previous generations, there are also developers who use that power trivially and inefficiently. There's a balance to be struck somewhere.

Developing more efficient and cheaper-to-produce hardware could also have the effect of turning it into a commodity good. Rather than raising the ceiling of capability at a high cost, you raise the floor and bring the majority of users up to the current 'standard'. Another factor I can see is efficiency influencing form factors: moving away from big boxes to a point where a small, quiet ITX box would be an option worth considering for the majority of PC users.

My one thing is that for most games the CPU is all but irrelevant anyway.. You certainly can't go rocking a 10-year-old CPU and expect to perform great in modern games with a nice GPU.. but you probably would have a completely fine gaming experience with any camp's $100 CPU these days paired with a nice GPU relative to your gaming resolution.

The author makes good points though.. these days Intel's main competitor and threat is ARM.. and to a lesser extent Qualcomm.... certainly not AMD.

Yeah, well, just try playing Dwarf Fortress, and see how long you stick with any AMD chip, if you've got an Intel offering available.

It has taken Intel a few years to develop serious, competitive chips to combat the ARM-based offerings prevalent in tablets and smartphones today, but by this time next year I expect Intel to be available in many more mobile devices than it is presently.

I'm not sure if that's going to happen in any major way unless Windows 8 x86 tablets are massively popular. x86 compatibility is almost a disadvantage in the mobile space given the massive infrastructure built up around ARM. The 10W he quotes in the article is still a lot of power compared to modern ARM chips.

Perhaps Android x86 will be a surprise massive hit?

Malor wrote:

Yeah, well, just try playing Dwarf Fortress, and see how long you stick with any AMD chip, if you've got an Intel offering available.

My one thing is that for most games the CPU is all but irrelevant anyway

Generally I'm not sure I'd say the CPU is irrelevant, but that it's got to a point already where there's generally 'enough' processing power for most current software. You have to go down a class of PC, to something where design and component choice is optimised for portability, before you're really lacking, and that's the floor which is coming up. You could still make a PC where the CPU is a bottleneck, but assuming you spend more than two minutes reading a guide or a boilerplate spec for what you intend to do with your PC, then yes, the risk of choosing a poor CPU is irrelevant for a lot of self-builds.

Zelos wrote:

Perhaps Android x86 will be a surprise massive hit?

I don't know about "massive hit", but Android on x86, via Intel's Atom chips, is already here and devices are coming.

It's not like games are ever worse on an Intel chip, Guru. And the motherboard drivers are usually a lot better. I've had remarkably stable, problem-free systems since I switched over to the Core 2.

AMD has just mismanaged itself to the point that it's largely a bad idea to buy their products. They've got, like, one tiny niche left in desktop CPUs, and that's it. They're not doing proper engineering anymore, relying on automated layout tools to do most of their design work, and they're supposedly an engineering company. The one thing you never, ever do is outsource your central competence, and that's exactly what they've done. Their competitors have access to all the same tools, so AMD has nothing special to offer anymore. As the article (or the comments) say, they're going to end up like Via, serving tiny, hyper-specialized markets.

It seems very unlikely to me that we gamers will ever again be interested in their CPUs. Considering their announced design principles, I expect their GPUs to fall farther and farther behind those from NVidia, which is hammering them with equal performance out of much smaller chips. Even with all NVidia's process problems and general ineptitude when it comes to actually making the chips they design, they're just wrecking AMD in terms of wafer efficiency and, almost certainly, profit per board sold.

And then you look at what Apple is doing with its A6 design.... way better than anything else in its class, because they took the time to do it right, with real human engineers.

There may be a day when programs are as good as people at chip layout, but it's not today.

Scratched wrote:
Zelos wrote:

Perhaps Android x86 will be a surprise massive hit?

I don't know about "massive hit", but Android on x86, via Intel's Atom chips, is already here and devices are coming.

Wow, I really like the word 'massive' today, don't I?

I just had a look at the AnandTech review of that Intel Android phone; it does look impressive.

It seems very unlikely to me that we gamers will ever again be interested in their CPUs. Considering their announced design principles, I expect their GPUs to fall farther and farther behind those from NVidia, which is hammering them with equal performance out of much smaller chips. Even with all NVidia's process problems and general ineptitude when it comes to actually making the chips they design, they're just wrecking AMD in terms of wafer efficiency and, almost certainly, profit per board sold.

I don't disagree on the CPU side.. but I do on the GPU side.. your assumption that AMD is falling behind is laughable based on current performance across all price points. At the $200 price point it appears AMD is in fact the better option. If anything AMD is ahead of Nvidia and will launch the 8X series shortly, which should provide a nice speed boost over Nvidia's flagship models at equal prices. Nothing so far has shown us that the current flip-flopping of price/performance leadership will shift any time soon, despite your best efforts at FUD.

Clearly AMD has some serious management and performance issues that they need to quickly address.. whether that's a complete exit from the CPU market, or at least a complete exit from the desktop CPU market (I'm told that AMD still sells a decent number of Opterons to white-box guys for massively dense VDI), they need to figure it out. As gamers and consumers we desperately need AMD to continue to provide competitive pressure on Nvidia (for sure) and Intel (maybe), but my point was that the CPU market for gamers is largely irrelevant.. increased competition isn't really going to get us that much more out of the CPU anymore.. thus my point that most gamers would be perfectly happy with an Intel Core i3 at $100.

It's not like games are ever worse on an Intel chip, Guru. And the motherboard drivers are usually a lot better. I've had remarkably stable, problem-free systems since I switched over to the Core 2.

This is you trying to make an argument out of nothing.. never did I say it was, nor did I even try to push AMD as a viable CPU choice.. My point was fairly clear.

Scratched wrote:

Generally I'm not sure I'd say the CPU is irrelevant, but that it's got to a point already where there's generally 'enough' processing power for most current software. You have to go down a class of PC, to something where design and component choice is optimised for portability, before you're really lacking, and that's the floor which is coming up. You could still make a PC where the CPU is a bottleneck, but assuming you spend more than two minutes reading a guide or a boilerplate spec for what you intend to do with your PC, then yes, the risk of choosing a poor CPU is irrelevant for a lot of self-builds.

Zelos wrote:

Perhaps Android x86 will be a surprise massive hit?

I don't know about "massive hit", but Android on x86, via Intel's Atom chips, is already here and devices are coming.

Unless Intel is holding back on a CPU that would magically increase FPS in current games, I stand by that it's mostly irrelevant. Look at how most game software is designed.. and what, in terms of PC architecture, currently bottlenecks gaming performance.

If my baseline is 1080p@60 with Max Details (AA/AF/Detail) then for the most part I'm picking a $100 Core i3. What benefit to that baseline does a $300 Core i7 buy me?

Oh, and you say:

This is you trying to make an argument out of nothing.. never did I say it was, nor did I even try to push AMD as a viable CPU choice.. My point was fairly clear.

But earlier, you said, boldface mine:

but you probably would have a completely fine gaming experience with any camp's $100 CPU these days paired with a nice GPU relative to your gaming resolution.

You're pretty directly claiming there that AMD is a viable CPU choice. And I agree, it IS viable, for many games. But it's not your best value for money spent.

your assumption that AMD is falling behind is laughable based on current performance across all price points.

Not price points, profit. An AMD 7970 is 365mm^2. An NVidia 680 is 294mm^2. TSMC's wafers are 300mm in diameter, roughly 70,700mm^2 of area, meaning that out of one wafer, AMD could get at most about 193 chips, and NVidia at most about 240. This means that NVidia is about 25% more efficient. And then, in turn, the AMD design has a lot more memory bandwidth, and that's really expensive to implement. It's real nice for compute work, and really nice at super high resolutions, but it costs a lot of money. This means that NVidia has much higher margins, ergo a much higher budget for R&D and, if necessary, price wars.
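To sanity-check those die counts, here's a back-of-envelope sketch using the common gross-die-per-wafer approximation (circle area minus an edge-loss term). It ignores scribe lines and defect yield, so the absolute numbers are only a ceiling, but the efficiency ratio is the robust part:

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diam_mm=300):
    # Circle area divided by die area, minus an edge-loss correction;
    # ignores scribe lines and defect yield, so treat this as a ceiling.
    radius = wafer_diam_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diam_mm / math.sqrt(2 * die_area_mm2))

tahiti = gross_dies_per_wafer(365)  # AMD 7970 die
gk104 = gross_dies_per_wafer(294)   # NVidia 680 die
print(tahiti, gk104, round(gk104 / tahiti - 1, 2))  # -> 158 201 0.27
```

However you slice the edge loss, NVidia comes out roughly a quarter more dies per wafer.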

AMD can sell its chips at any dollar figure it likes, but if it's not making a profit, it's not going to be doing it for very long. I don't know what the exact dollar figures are, but it's obvious that, on the manufacturing side, NVidia's boards cost a lot less to reach roughly comparable performance levels. If I were NVidia, I'd know I was in the driver's seat, and I'd be trying to set prices so that I was nicely profitable, and AMD was losing a little money on every board they sold.

On Newegg right now, I'm seeing the 7970 in the $400-$450 price range, where the 680 is around $500. So, the card that's cheaper to manufacture commands a HIGHER price to end users.

NVidia is beating them up bad, Guru. Really, really bad. The only way AMD wins in this scenario is if they are moving huge, huge volume, or if NVidia is having disastrously bad yields, and I don't see strong evidence of either being the case.

And I agree that I want AMD to be around and healthy, but their management is so terrible that I just don't see it happening.

I have to agree with Guru. I've used AMD chips for the last 10 years in a variety of price ranges. I've never had a problem running any games and, in fact, even now older "inferior"* chips are perfectly able to run the majority of games with a decent graphics card on mid-high and sometimes very/extremely high settings.

Mostly I attribute this to the consoles being the main design constraint on a lot of games plus the move away from AAA to micro- to mid-sized A and AA games.

Let's put it this way: my Phenom II X4 955, 4GB RAM and GeForce GTX 560 will play any and all games I've tried without a problem. It's not a new system and it's not as powerful as an i5 with 8GB RAM and a more modern video card, but I don't need that because I'm not driving three 1080p displays. Even Crysis runs at high settings on my monitor's native 1600x900 resolution (haven't tried Metro 2033 yet, though).

In fact I only have the 560 because I couldn't source an AMD card on this tiny island (I could have gotten a better/newer gen model for the same price otherwise)...

*That includes the older intel chips as well.

Malor wrote:

NVidia is beating them up bad, Guru. Really, really bad. The only way AMD wins in this scenario is if they are moving huge, huge volume, or if NVidia is having disastrously bad yields, and I don't see strong evidence of either being the case.

And I agree that I want AMD to be around and healthy, but their management is so terrible that I just don't see it happening.

Is it possible we're just in the middle of yet another flip-flop where AMD and Nvidia trade places? Two years ago, AMD was creaming Nvidia up and down the GPU line (5000 family vs. "Tesla"?). I think Nvidia finally fixed whatever issues they were having earlier this year / late last year while AMD is kinda falling apart from a management perspective.

Right... the Phenom IIs are about the same as a Core 2. The newer Bulldozer chips are worse, substantially worse, for most applications, most of the time. It's the biggest step backward I remember in a chip maker since Intel's P4 fiasco. It's not as bad as P4, but it's bad. Phenom IIs are acceptable, but AMD doesn't make those anymore. Once any remaining inventory is sold out, they're gone.

As a consumer, if you were buying new today, there is only one niche where you'd actually be better off buying AMD: if you have to have the absolute most powerful on-CPU graphics you can get, but you can't buy an add-in video card. This is a tiny, tiny niche.

Otherwise, there's basically always an Intel solution that will run faster for the same or less money. In the great majority of cases where you're CPU-bound, single-threaded performance is king, and Intel is absolutely dominant in this area. It's so dominant that, even in widely-multithreaded programs, the ones that should really show off the wide architecture of Bulldozer, a similarly-priced Intel chip, running with fewer cores, will still win. Even in the center of what it claims to be its performance strength and its design goal, Bulldozer still loses, most of the time.

It runs too hot, it doesn't perform well, even compared to the last chip from the same company, and it's not really any cheaper than a faster, cooler Intel solution. It's like comparing AAA ball to the major leagues.

Yes, there are plenty of times when you don't need that much CPU power, but why would you saddle yourself with an inferior chip for the same price? When you do need raw compute power, you can get more if you buy one of the i-series chips, without spending more money.

And, no, you don't need a high-end i5. Even the Pentium G850, down at $80, is a fast little chip.

Maybe prices are different where you are, but when I looked last year, Intel chips that performed comparably to or better than an AMD chip were 30-50% more expensive... mostly because AMD chips slotted into places in between Intel chips and so were never really directly comparable - at least, IMO.

[edit]
And that price increase, of course, includes the more expensive motherboards too.

Well, I took the time to actually go look up some hardware, comparing AMD's flagship, the 8150, with a motherboard, versus spending about the same amount on an Intel chip+motherboard. This is what I found.

I chose ASUS for motherboard brand because they make a lot of them, they're pretty good, and they make both Intel and AMD boards, so hopefully that will help control for margin and quality differences between manufacturers.

Using Newegg pricing.

On the AMD side:

  • FX-8150 Zambezi AM3+, $189
  • Expensive motherboard option:
    M5A99FX Pro, $155, no PCIe3, 2 PCIe2 x16, 2x4, 1x1, 7 SATA 6Gb/sec, yucky realtek networking
  • Cheap motherboard:
    M5A97, $99, no PCIe3, 1x16 slot, 1x4, 6 SATA6, icky realtek network chip.

Total: $344 expensive version, $288 cheap.

On the Intel side:

  • i5-3330, 3.0GHz, 4 cores, $189. (Compare to 3570K at $230, 3.4GHz.)
  • Expensive motherboard:
    p8z77-m pro, $150, 2PCIe3, either 1x16 or 2x8 (note that the speed is double, so that's as much actual bandwidth as 2 x16 slots in PCIe2), 1x4, 1x1. 4 SATA 3Gb/s, 2x6Gb/s, blech realtek networking.
  • Cheap motherboard:
    P8B75-V, $99, 1 PCIe3 x16, 1x4, 2x1, 3 pci. 5 SATA 3, 1 SATA6. Realtek again.

Total: $339 expensive version, $288 cheap.

The major difference there: that Intel chip is going to be a much stronger gaming choice than the 8150. It's got four cores, and it's got the superb Ivy Bridge per-clock throughput. I couldn't find direct benchmarks of that specific chip, but it's 3GHz compared to the 3.4GHz of a 3570K, so it should be about 90% as fast. And the 3570K will usually beat the 8150 even where it's strong, and just dominates it where it's weak. CPU often doesn't matter, but when it does, the Intel chip should be much stronger for the great majority of current uses.

The AMD boards have more SATA 6G/s ports, though, which is nicer. The Intel boards have twice the bandwidth to their expansion cards. The SATA ports will probably matter more in the short to medium run, but if you were to keep these systems for a long time, the high bandwidth available in the PCIe3 slots would let you run new interface types (like, say, USB 4) twice as fast as on the AMD board.

Kinda pie in the sky there, though. SATA 6G/s is useful right now, faster PCIe may be useful someday.
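The "twice the bandwidth" point checks out if you work the link rates from the PCIe specs: PCIe 2.0 signals at 5 GT/s with 8b/10b encoding, PCIe 3.0 at 8 GT/s with 128b/130b, so a Gen3 x8 link carries almost exactly what a Gen2 x16 does. A quick sketch (the function name is just illustrative):

```python
def lane_bandwidth_mb_s(transfer_rate_gt_s, payload_bits, encoded_bits):
    # raw transfer rate times encoding efficiency, divided by
    # 8 bits per byte -> MB/s per lane
    return transfer_rate_gt_s * 1000 * payload_bits / encoded_bits / 8

pcie2_lane = lane_bandwidth_mb_s(5, 8, 10)     # 8b/10b -> 500 MB/s/lane
pcie3_lane = lane_bandwidth_mb_s(8, 128, 130)  # 128b/130b -> ~985 MB/s/lane
print(pcie2_lane * 16, round(pcie3_lane * 8))  # x16 Gen2 vs x8 Gen3
```

So two Gen3 x8 slots really do give you about the same raw bandwidth as two Gen2 x16 slots.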

Regardless, for gaming, I'd definitely go Intel. When CPU matters, the system will run much faster. And you will always, always be saving money, whether you're taxing the CPU or not, because the AMD chip will burn up to 125 watts, where the Intel chip is rated at 77 watts with its video chip, which you probably won't be using, if you're a gamer. 50 watts, under heavy load, is more likely. That will be much quieter, and cheaper, than 125.
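To put a rough dollar figure on that wattage gap (the hours-per-day and electricity price below are made-up assumptions; plug in your own):

```python
def yearly_power_cost_usd(watts, hours_per_day=4, usd_per_kwh=0.12):
    # watts -> kW, times yearly hours under load, times price per kWh
    return watts / 1000 * hours_per_day * 365 * usd_per_kwh

amd_cost = yearly_power_cost_usd(125)   # FX-8150 at its full 125W TDP
intel_cost = yearly_power_cost_usd(50)  # estimated Intel gaming load
print(round(amd_cost - intel_cost, 2))  # -> 13.14
```

Not a fortune per year, but it compounds with the quieter cooling you get for free.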

Me, I'd spend the extra $40 to go up to a 3570K, which will be quite a lot faster with an overclock. If you were upgrading an existing computer, that $40 would be pretty noticeable, but if you were buying a whole system, you'd barely notice, and you'd have a substantially faster machine.

In an Ars Technica writeup on the new Piledriver desktop chips, they have a nice scatterplot at the end. They didn't bench the 3330, but per their graph, the 3470 is a dollar or two more than the 8150, and decidedly faster. And, of course, it does it in a much smaller power envelope.

Still going through the article, so no real strong conclusions yet.

The idea that processor performance has no impact on gaming is, I'm sorry, just wrong. It's one of those little things people take as common sense without backing it up.

The Tech Report's recent CPU gaming performance round-up was enlightening. In Skyrim, for example, the AMD FX-8150 reached an average of 68 FPS. Intel's Core i5 3470 reached an average of 102 FPS. And this was no low-resolution CPU-oriented benchmark, mind you. They were running the game at 1920x1200 with most settings at Ultra and FXAA. A Radeon HD 7950 was used for the video card.

They received similar results in other games. In my opinion buying a new AMD processor for a gaming rig is nothing short of foolish.

There are a few things the AMD processors can do well but in most areas they are easily bested by the Intel competition and sometimes by a very large margin. This is despite the fact AMD processors use more power. It's just a mess.

Yes, AMD's graphics processors are still good, but that's all they have going for them and the profit they make in that market is not impressive.

In Skyrim, for example, the AMD FX-8150 reached an average of 68 FPS. Intel's Core i5 3470 reached an average of 102 FPS. And this was no low-resolution CPU-oriented benchmark, mind you. They were running the game at 1920x1200 with most settings at Ultra and FXAA. A Radeon HD 7950 was used for the video card.

Right. But it is worth pointing out that, when you're on an LCD monitor, everything over 60FPS looks identical. 68 or 5,000 frames a second, it doesn't matter. A fast-refresh LCD would show the difference, but not many of us have those yet.
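The frame-time arithmetic behind that: a 60Hz panel refreshes roughly every 16.7ms, and both of those Skyrim averages fit inside that budget. (Averages hide frame-time spikes, of course, which is a separate question.)

```python
def frame_time_ms(fps):
    # average milliseconds spent rendering each frame at a given rate
    return 1000 / fps

refresh_budget = frame_time_ms(60)  # 60Hz LCD: ~16.7ms per refresh
fx8150 = frame_time_ms(68)          # Skyrim average from the post
i5_3470 = frame_time_ms(102)
print(round(refresh_budget, 1), round(fx8150, 1), round(i5_3470, 1))
```

Both chips deliver frames faster than the panel can show them, so on average the display can't tell them apart.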

Now, keep in mind that I agree with you. I think it's kind of silly to buy an AMD processor. But, at the same time, when people say raw speed isn't that important, they're right in many cases. But then the Intel chip also uses half the power, so even if you're not using the extra speed, you're certainly saving on your electricity bill.

I wish it weren't so. I want AMD to be competitive. Intel gets very abusive when they're in control of the market, as they are now. But, no matter what I want to be true, if you're going to spend $180 on a processor, the Intel alternative will just beat the sh*t out of the AMD offering. And then if you want to really go fast, spending a little more money ($230ish) will get you into the K chips, which run away laughing.

And then consider the HTPC market, where the machines need to be as cool as possible, so that their fans can spin slow and they won't make much noise. The Ars article points out that the AMD chips have such a ridiculous thermal envelope that going with an Intel chip, and adding an entire aftermarket 7750 to a PCIe slot, ends up with a system with only 10 additional watts of TDP -- while, of course, being much faster both in the CPU and GPU sides.

This is what you get when you use automated tools for chip layout, instead of human engineers. AMD was talking this up, that they were really proud of the initiative to switch to cheaper tools-driven development. But, lo and behold, the engineers are proving they're worth paying for.

Faceless Clock wrote:

The idea that processor performance has no impact on gaming is, I'm sorry, just wrong. It's one of those little things people take as common sense without backing it up.

The Tech Report's recent CPU gaming performance round-up was enlightening. In Skyrim, for example, the AMD FX-8150 reached an average of 68 FPS. Intel's Core i5 3470 reached an average of 102 FPS. And this was no low-resolution CPU-oriented benchmark, mind you. They were running the game at 1920x1200 with most settings at Ultra and FXAA. A Radeon HD 7950 was used for the video card.

They received similar results in other games. In my opinion buying a new AMD processor for a gaming rig is nothing short of foolish.

There are a few things the AMD processors can do well but in most areas they are easily bested by the Intel competition and sometimes by a very large margin. This is despite the fact AMD processors use more power. It's just a mess.

Yes, AMD's graphics processors are still good, but that's all they have going for them and the profit they make in that market is not impressive.

Actually the article shows in the scatter plot just how close modern processors are to each other, which was basically the point of my original post. But since you are calling me out in your post, I will actually go back and look at my post again.

My one thing is that for most games the CPU is all but irrelevant anyway.. You certainly can't go rocking a 10-year-old CPU and expect to perform great in modern games with a nice GPU.. but you probably would have a completely fine gaming experience with any camp's $100 CPU these days paired with a nice GPU relative to your gaming resolution.

Looking at the article, it seems like I was dead on.. none of the CPUs reviewed in that article would result in a "bad" gaming experience at 1080p provided you are matching up with a "nice" GPU relative to your gaming resolution.

Nowhere did I say that an AMD CPU is a better choice than an Intel CPU. I haven't purchased an AMD CPU in years.. why would I? I haven't recommended anyone purchase an AMD CPU in years.. I don't build any of my systems with AMD CPUs anymore and I won't until there is a compelling reason to do so.

But the facts are the facts.. while it would be silly to spend $$ on an AMD CPU when you can spend just a few more $$ on an Intel CPU, both CPUs would indeed be basically fine for gaming.

Apparently you did not read the review. It points out several times that gaming does feel a bit better on Intel chips, as is evidenced by the frametime tests. Skyrim and Batman: Arkham Asylum were smoother when running on Intel chips than AMD, and by a large margin. Go back and look at the frametime portion of the review if you'd like to know more.

In addition, it's safe to assume that games will become more demanding as time goes on. So while a basic AMD processor might run many modern games well, it will need to be replaced very soon. Tom's look at Guild Wars 2 performance is a good example, as it shows very clearly that there is a large difference in performance between Intel and AMD in this new and demanding title.

I guess the AMD chip is "good enough" in the sense that it will, in fact, play games. And those games will often play acceptably. But pressing this line of thought seems silly when we all know that Intel is better, and sometimes by a large margin.

I had a long rebuttal typed up.. then I realized you are probably one of those people. So I will bow to your superior knowledge and say yes you are 100% correct... I'm a nimrod who can't read.

Actually the article shows in the scatter plot just how close modern processors are to each other.

But Intel chips use half the power. Or less. And the fast Intel chips, which are only about $40 more than AMD's flagship, are a LOT faster. And they still use less power!

It's just dumb to buy AMD at this point. You might get an okay gaming experience, but you can get an Intel chip that's a little faster for the same amount of upfront money, and then spend less on power and cooling bills, and probably have a quieter and more reliable computer to boot. And if you're willing to spend a little more, and you have any real use for CPU power, the K chips just kick sand in AMD's face, and force it to run home to mommy.

omg... I hate human life right about now.. I'm so close to losing it.

Then why, Guru, do you insist on constantly bringing up AMD? You just hammer on that all the time, and then you get angry when we tell you that you shouldn't be recommending them, and then you insist that you weren't recommending them.

Why even mention them in the first place? Why are you in this thread, if you're not trying to push AMD chips?

Malor wrote:

Then why, Guru, do you insist on constantly bringing up AMD? You just hammer on that all the time, and then you get angry when we tell you that you shouldn't be recommending them, and then you insist that you weren't recommending them.

Why even mention them in the first place? Why are you in this thread, if you're not trying to push AMD chips?

I'm sorry.. I made my point very clear.. it seems easy enough for me, and I'm certainly not that smart.. so if you don't get it I can't help you anymore. Why you haven't been banned is a complete mystery to me and, judging by the emails I get, to many others.

TheGameguru wrote:

I'm sorry.. I made my point very clear..

Unless your point is that people should buy AMD chips because they don't care about CPU speed, I don't think you have. That's all I'm seeing.

We're a Tech and Help forum. We're supposed to be helping. And telling people they should buy chips that cost twice as much to run doesn't strike me as helpful. We should be pointing people to the best value for money we can find.

And, again, Guru, the very first comment you made in this thread:

but you probably would have a completely fine gaming experience with any camp's $100 CPU these days paired with a nice GPU relative to your gaming resolution.

And what we've been pointing out since is that this is not true. You probably want something up at $150 or so at least, and you've been getting links and data from multiple people that "any camp's CPU" is not the best solution. At basically every price point, you want Intel. And there are real uses for a fast CPU in the current generation of games; going up to a $220 CPU will give you very real performance increases in numerous current titles.

I know you love AMD, Guru, but you're making claims that just aren't true. In the Phenom II era, a couple of years ago, what you're saying would have been perfectly valid. But it's now the end of 2012, computer games are finally exceeding what the consoles can handle, and AMD's CPU offerings are slower than they were two years ago.

So you are saying that if I bought a $100 AMD CPU and an Nvidia GTX 680 I would not have a completely fine gaming experience at 1080p?? Because that is exactly all I said.. anything you have added has come from your own demented mind, to somehow make it seem like I'm going into threads telling people to buy AMD over Intel at X price point.

I mean it's trivial for me to set up this test and produce real-world benchmarks at 1080p to prove my point... trivial in $$ but not time.. but say, for example, we did something like: if I'm right, you leave this website for good and never come back.. then I'm in.

Depends on the game. Many of the games out now wouldn't run that well anymore. Guild Wars 2, Starcraft 2, heck, maybe even WoW in a crowded place might not get very good frame rates. I gather Battlefield 3 would get choppy if you were on a big server... running by yourself, it would be wicked quick, but get into a big firefight and that rig would probably chug pretty hard.

And, by the way, when you accuse me of 'turning this into AMD versus Intel', try reading the thread title.

Did you even read the original linked article? Did you even have the same context that the rest of us did?