8800 GT - Give me!

Been reading some great things about the 8800 GT, a card that I can actually hope to afford and that should provide enough of a jump over my 7600 GT to justify picking it up.

I like this quote from ExtremeTech:

ExtremeTech wrote:

This is, by far, the best bang-for-the-buck we've seen in years. It's a killer card, likely destined to be remembered with the same fondness as the GeForce 4 Ti 4200, Radeon 9700 Pro, or GeForce 6800 GT.

Bang-for-the-buck is the name of my PC building game.

I just shipped my 320MB GTS off with the EVGA Step-Up to exchange it for one of these. The benchmarks look just superb on this card, and hopefully the extra RAM will future-proof me better than the GTS would.

It's an amazing deal. It's not just an inexpensive card with so-so performance, or a broken/neutered one, either.

I think it's actually cheaper right now to pick up a pair of 8800GTs for SLI than a single 8800GTX...

I wonder if the rest of the 8-series will get a price adjustment since they're basically boat anchors at current price points vs. the GT.

That's close to the card I've been waiting for, I think. My 7800 GTX is still doing pretty well in games now that I've upgraded my proc and stuff, but Crysis still makes it weep a little. How long will ATI/AMD let Nvidia stomp all over them, I wonder? I'd like to see a new card from them before I make any decisions.

According to reports citing an official AMD notice, the announcement of the video card based on the RV670 chip is now planned for November 15, 2007. Cards based on the RV670 chip should go on sale immediately after the announcement.

AMD and NVIDIA have staggered the announcements of their new graphics solutions and chipsets. For example, the GeForce 8800 GT will be released on October 29, and the nForce 750/780 chipset series will be available on November 12. Similarly, following the RV670 announcement on November 15, AMD plans to announce the AMD 790FX (RD790) and AMD 770 (RX780) chipsets.

I'm also waiting to see what ATI has in store for us with their new announcement. My card is still very capable (ATI X1950XTX), but my next upgrade will have to be for a DX10 card.

I've got a 7950GT and it runs all my games nicely (except Crysis; I have to run 1024x768 with mixed settings), and I'm not sure the 8800GT will make me pull out my credit card. But it is dangerously close. I promised myself I'd wait for a generation to pass before purchasing, but all the reviews have got my mouth watering.

According to all the reviews I've read, the 8800GT is VERY close to the GTX in terms of performance, and if you OC you can close the gap with ease, but it does run a little hot with the tiny single-slot cooling. I still wish someone would come out with a passive cooler like the one on my XFX 7950GT; I'm really starting to enjoy having no noise except my hard drives.

Hopefully they'll drop the new 9-series shortly; then I may jump on the best bang for the buck there.

Actually I think the single slot design of the card is a huge draw. If they'd do a 65nm version of the GTX in a single slot design that kept the performance I'd buy one.

I'd wait and see what ATI's new refresh brings... historically, since the 9800 Pro (ATI's last slam dunk), ATI's initial GPU is always underwhelming, but their refresh is usually something special.

That is true. I hope they can get their act together before we're down to only one choice for gaming graphics. Is the 8800GT 512MB? I love my X1950 Pro 512MB, but there's some stuff it can't handle well (like, say, Crysis), and if the GT's the right price and 512MB, I might consider jumping over and putting this card on eBay while it's still worth something.

Actually I think the single slot design of the card is a huge draw.

Seconded; something about dual-slot cards makes me feel quite overzealous.

I'm beginning to regret my GTS more and more every minute since this was announced. I wonder if I can get anyone to buy it off me still. The size is a huge draw to me.

I've always liked dual-slot cards myself... exhausting air outside the case is always better than the internal exhaust you get with a single-slot card.

I like the GTS because it's so very quiet. That single-slot cooling solution worries me.

If I were to pick up a pair of these, what's the best SLI capable motherboard available?

Parallax Abstraction wrote:

Is the 8800GT 512MB?

There is a 512MB version as well as a cheaper 256MB version.

Very cool. I'm really having to restrain myself from going out and getting an eVGA 512MB today. Is anyone interested in a 4 month-old X1950 Pro 512MB?

Parallax Abstraction wrote:

Very cool. I'm really having to restrain myself from going out and getting an eVGA 512MB today. Is anyone interested in a 4 month-old X1950 Pro 512MB? :)

Heh, I'm in the same boat you are, except my card is a little newer. I'm going to resist the new hotness for the time being and wait to see how ATI/AMD responds. I think I can live w/ Crysis on this card, and by the time my budget can absorb another video card purchase, pricing options will have changed.

I just ordered an 8800GT from Newegg. I was interested to see that, after being sold out for the last few days, it appears they've made an across-the-board price hike of about $10. At least, this appears to be true of the EVGA cards I was interested in. Still, this is a mandatory upgrade; my 7600GT simply won't drive graphics at 1680x1050 for any game beyond HL2 or WoW. But now I'm hopefully set for a little while.

Newegg always hikes prices a little after impressive initial opening sales. It's not something I like, but I deal with it from them because in the hundred or more orders I've placed on that site over the years, I've never had a single issue.

So what's the deal here with modern graphics cards? Is there such a thing as CPU-limited anymore, or are graphics cards so fully featured that you can use them in a single-core machine and still get full or near-full performance?

In other words, can I throw one of these in my Pentium D 2.8 GHz and run things well? HGL is bringing my X800 XL to its knees. It runs OK and looks OK with moderate settings, but it looks so phenomenal with cranked settings that I want to play it like that.

fangblackbone wrote:

So what's the deal here with modern graphics cards? Is there such a thing as CPU-limited anymore, or are graphics cards so fully featured that you can use them in a single-core machine and still get full or near-full performance?

In other words, can I throw one of these in my Pentium D 2.8 GHz and run things well? HGL is bringing my X800 XL to its knees. It runs OK and looks OK with moderate settings, but it looks so phenomenal with cranked settings that I want to play it like that.

The CPU is still the heart of the system and if it ain't pumping enough blood to the other body parts they are going to underperform. The CPU still needs to route information to the graphics subsystem. If the GPU is forced to sit there waiting for instructions from the CPU you're not going to be getting the full potential from the graphics card.

My understanding is rather limited, but I think a lot of the muscle in the new cards is about driving resolution more than number of objects.

Very loosely speaking, I think the CPU mostly determines how many things can be on screen at once, while graphics card muscle tends to be the bottleneck for driving high resolution, FSAA, and, um, I've lost the name for the third thing -- the texture smoothing. Edit: I remember now... anisotropic filtering.

The CPU power more or less determines the maximum possible framerate for your game. GPU power lets you reach that maximum at higher and higher resolutions.
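That two-part rule (the CPU sets a framerate ceiling, and the GPU determines how close you get to it at a given resolution) can be sketched as a toy model. All of the numbers here are invented purely for illustration, not measured from any real hardware:

```python
# Toy bottleneck model: frame rate is capped by whichever stage is slower.

def effective_fps(cpu_fps_cap, gpu_pixels_per_sec, width, height):
    """The CPU caps fps regardless of resolution; the GPU's cap shrinks as
    the pixel count grows."""
    gpu_fps = gpu_pixels_per_sec / (width * height)
    return min(cpu_fps_cap, gpu_fps)

for w, h in [(1024, 768), (1680, 1050), (1920, 1200)]:
    # Hypothetical rig: CPU can prepare 90 frames/sec; GPU pushes 120M pixels/sec.
    fps = effective_fps(cpu_fps_cap=90, gpu_pixels_per_sec=120e6, width=w, height=h)
    print(f"{w}x{h}: {fps:.0f} fps")
```

At low resolution the GPU could go faster than 90 fps, so the CPU is the limit; at 1920x1200 the GPU side drops well below the CPU cap and becomes the bottleneck.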

fangblackbone wrote:

So what's the deal here with modern graphics cards? Is there such a thing as CPU-limited anymore, or are graphics cards so fully featured that you can use them in a single-core machine and still get full or near-full performance?

In other words, can I throw one of these in my Pentium D 2.8 GHz and run things well? HGL is bringing my X800 XL to its knees. It runs OK and looks OK with moderate settings, but it looks so phenomenal with cranked settings that I want to play it like that.

Jumping from my single core Athlon 64 3500+ to a dual-core E6750 and sticking with my 7800GTX saw a dramatic performance increase for me. The x800XL may not scale to a level that extreme, but you should notice a difference.

Sorry, my mistake. My Pentium D 2.8 is a dual core. My understanding is that its performance should be comparable to a Core 2 Duo of lower clock speed, with the side effect of using more power and running hotter.

I guess at least since the 8800GT has 512MB, I could conceivably run better texture and shader settings on it. The X800 has 256MB.

I couldn't care less about higher resolutions. I've always preferred lower resolutions with all the non-trivial bells and whistles on and higher performance. I thought that 3D data was largely streamed from the CPU on modern GPUs, with the GPU handling the processing. Is this not the case?

fangblackbone wrote:

I guess at least since the 8800GT has 512MB, I could conceivably run better texture and shader settings on it. The X800 has 256MB.

I couldn't care less about higher resolutions. I've always preferred lower resolutions with all the non-trivial bells and whistles on and higher performance. I thought that 3D data was largely streamed from the CPU on modern GPUs, with the GPU handling the processing. Is this not the case?

Well, the CPU still has to handle all the game logic and AI. The 3D data that's being sent to the GPU doesn't come from thin air. The CPU is still on the hook for calculating the 3D position of all the vertices for all the polys being displayed and moving about.

CPU, GPU & RAM all play together to maximize gaming performance. Crap RAM can force the CPU to spend lots of cycles waiting for data. Slow CPU means you can't generate the data to send to the GPU fast enough. Slow GPU and it won't render the frame very quickly.

You can see pretty easily the CPU impact on performance if you poke around Tom's Hardware's Charts.

P4D vs. C2D isn't close (sorry). Neither is an X800 vs. an 8800GTX -- the 8800GT isn't on the chart yet, but performance is similar.

My understanding is that its performance should be comparable to a Core 2 Duo of lower clock speed, with the side effect of using more power and running hotter.

Well, it'd be a lot slower... I suspect it'd be roughly like a 1.6 or 1.8 GHz C2D.

Depending on your board, you might be able to just drop a C2D in... it has to do with whether or not your voltage regulator can handle the lower C2D voltage. If you bought early, it probably won't, but if you bought later, it's likely to work.

I thought that 3D data was largely streamed from the CPU on modern GPUs, with the GPU handling the processing. Is this not the case?

I think that's roughly correct, yes... how fast the CPU can generate geometry (which is fairly resolution-independent) determines your maximum possible framerate, and then the output resolution slows you down to a greater or lesser degree based on GPU power. An 8800GT should run 1920x1200 flawlessly at max settings, I'd think.

FiringSquad recently did a nice comparison between a wide range of CPUs on a rig equipped with an 8800GT.

In general, a faster CPU will give you better frame rates if you aren't bottlenecking your video card. If you're trying to run at really high resolutions with all the goodies on, then your $500+ CPU probably won't be any faster (frame-rate-wise) than an $80 Athlon X2 4000.

This is basically what the FS tests show on 3 new games.
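That pattern, where the expensive CPU stops mattering once the card is saturated, drops straight out of a simple "slowest stage wins" picture. The CPU caps and GPU throughput below are invented for illustration only:

```python
# Why CPU differences vanish at high resolution: once the GPU is the slower
# stage, both CPUs produce the same frame rate.

def fps(cpu_cap, gpu_pixels_per_sec, width, height):
    # Whichever stage is slower sets the frame rate.
    return min(cpu_cap, gpu_pixels_per_sec / (width * height))

budget_cpu, fast_cpu = 70, 140   # hypothetical frame-rate caps for two CPUs
gpu = 100e6                      # hypothetical GPU pixel throughput

for w, h in [(1024, 768), (1920, 1200)]:
    print(f"{w}x{h}: budget CPU {fps(budget_cpu, gpu, w, h):.0f} fps, "
          f"fast CPU {fps(fast_cpu, gpu, w, h):.0f} fps")
```

At 1024x768 the faster CPU pulls well ahead; at 1920x1200 both land on the same GPU-limited number, which is roughly the shape of the FiringSquad results.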

Well, just got my 8800GT yesterday. Verdict: oh, it's sweet. I can finally run Bioshock and Oblivion maxed (although the latter still has the annoying HDR/AA trade-off). World in Conflict runs like butter and I'll be testing out CoH and the CoD4 demo shortly. Basically, it was an incredible upgrade over the 7600GT I had in there before.

That said, Neverwinter Nights 2 *still* runs like crap. Makes Coldstream mad. Coldstream smash!

That means it's crap code. The 8800GTS, a lesser version of what you have, is a monstrously fast card. Any current software that doesn't run well on it is written badly.
