
8800GTX preview

Originally posted by: Elfear
Originally posted by: nanaki333

ack! where'd you see that at?

i'm so torn. it's not like there's going to be many games taking advantage of dx10 immediately so i COULD wait.

they better have a damn good card coming out to be 2 months behind nvidia!

Here is where I saw the rumor. Scroll down to post #2282. The other guys seem to respect CJ so I figured the rumor must be at least as credible as the others we've heard.

Also here: Link

ATI R600 GPU is expected to be released in Early Q1 on an 80nm (and perhaps 65nm too) process. The R600 is the successor to the R580 core and is expected to be built on a 65nm process and will be fully DirectX 10 compliant, utilising a Unified Shader Model architecture. Current rumours suggest that R600 will feature 64 Shader pipelines (processing both vertices and pixels) with 32 TMUs and 32 ROPs running at a clock speed of around 800MHz. R600 is expected to interface to 512MB of 2GHz+ GDDR4 Memory over a 512-bit interface. R600 is reported to consume up to 250W (twice that of R580) and may require two PCI Express power connectors.


Ok they tested on DT using a QX6700, so should I assume my "crappy" X2 3800+ will limit this card's capabilities? I really do not want to do a full upgrade until RD600, but I would love to up my frames in Minesweeper.

 
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the g80 not using a ridiculous amount of power; even my 430 watt psu would easily handle it.

I agree, but why the dual 6-pin molex plugs if it only draws 4% more juice than an X1950XT?

I'm wondering the same thing... is it possible the test wasn't done with GPU at 100% load? are we hitting some kind of bottleneck?
 
Those scores look really degrading to the X1950XTX.

Here Anandtech scored ~20 more than they did with the same card at a higher resolution with the same amount of AA in Quake 4.
 
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!
 
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?
 
"When compared to AMD's current flagship ATI Radeon X1950 XTX, the GeForce 8800GTX only consumes 24% more power at idle. The power consumption difference under load decreases to around 4%. Considering the performance differences, the GeForce 8800GTX is no worse than AMD's ATI Radeon X1950 XTX in terms of performance-per-watt."

so relatively speaking, the power consumption is about the same as the ati.. where are all those condemning the power consumption of the xtx now (tho if rumors hold true regarding r600 requiring 2x the power of r580, seems there would be good reason to complain about that)?

at any rate, looks to be a very nice product. i didn't see it in the article referenced here, but some have mentioned it will have "angle independent" AF, which will be very nice. i'll be curious to see where the price/performance comes out before i get too excited tho - last generation things got back to sane levels ($299 gt's at release, etc) and i'd hate to see it reach ridiculous levels again..

and an inch and a half longer than the radeon? sadly, this won't fit in my mid-tower case (ttake tsunami) without some modifications (or i'll have to remove all the hard drives). why can't they make the pcb a bit taller instead of going longer all the time? would make fitting it a lot easier 🙁


 
Originally posted by: KeithTalent
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I'm pretty sure you're going to be CPU limited at only 2GHz, but then again, at 1920x1200, you might be ok because you're mostly GPU limited at such a high resolution. Especially with AA and all the goodies.
 
Originally posted by: Matt2
Originally posted by: KeithTalent
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I'm pretty sure you're going to be CPU limited at only 2GHz, but then again, at 1920x1200, you might be ok because you're mostly GPU limited at such a high resolution. Especially with AA and all the goodies.

Good stuff, thanks! I may need to sell my 3800 and 1900xt and upgrade to a 4400 & 8800gtx now, but only if the 8800 comes in under $700 Canadian :frown:
 
Originally posted by: KeithTalent
Originally posted by: Matt2
Originally posted by: KeithTalent
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I'm pretty sure you're going to be CPU limited at only 2GHz, but then again, at 1920x1200, you might be ok because you're mostly GPU limited at such a high resolution. Especially with AA and all the goodies.

Good stuff, thanks! I may need to sell my 3800 and 1900xt and upgrade to a 4400 & 8800gtx now, but only if the 8800 comes in under $700 Canadian :frown:

You could always overclock! Think of it as a FREE upgrade to that 4400!
 
Originally posted by: Matt2
Originally posted by: KeithTalent
Originally posted by: Matt2
Originally posted by: KeithTalent
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I'm pretty sure you're going to be CPU limited at only 2GHz, but then again, at 1920x1200, you might be ok because you're mostly GPU limited at such a high resolution. Especially with AA and all the goodies.

Good stuff, thanks! I may need to sell my 3800 and 1900xt and upgrade to a 4400 & 8800gtx now, but only if the 8800 comes in under $700 Canadian :frown:

You could always overclock! Think of it as a FREE upgrade to that 4400!

I would love to, but have absolutely no idea how to do it. I'd probably end up melting something.
 
Originally posted by: KeithTalent
I would love to, but have absolutely no idea how to do it. I'd probably end up melting something.

It's really a lot less painful than most people think.

You could probably get 2.4-2.5GHz without even increasing the voltages, while keeping the temperatures low.

You should read up on the AMD OC threads in the CPU/Overclocking forum. Even if you still don't want to do it, at least you'll know how and have the option.
 
Originally posted by: Matt2
Originally posted by: KeithTalent
I would love to, but have absolutely no idea how to do it. I'd probably end up melting something.

It's really a lot less painful than most people think.

You could probably get 2.4-2.5GHz without even increasing the voltages, while keeping the temperatures low.

You should read up on the AMD OC threads in the CPU/Overclocking forum. Even if you still don't want to do it, at least you'll know how and have the option.

Will do, thanks man. I'm kind of lazy and I don't want to have to put a new heatsink and everything on it, so if I can get away with a mild OC without doing that stuff I will definitely give it a try.

Then get my 8800 and be off to the races....
 
Originally posted by: josh6079
Those scores look really degrading to the X1950XTX.

Here Anandtech scored ~20 more than they did with the same card at a higher resolution with the same amount of AA in Quake 4.

I think everyone's jumping the gun too fast here. The scores look suspicious indeed; I don't doubt the GTX will be that fast or faster, but the X1950XTX is a lot faster than that.
 
Originally posted by: KeithTalent

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I highly doubt the 8800GTX will be able to run games maxed out at 1920x1200. Most games it will fly through at that res, but I imagine some will still bring it to its knees. With the clocks in my sig I still get slowdown in a few areas in Oblivion at 1920x1200 HDR+4xAA, and I doubt the 8800GTX will be much faster.
 
Originally posted by: Elfear
Originally posted by: KeithTalent

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I highly doubt the 8800GTX will be able to run games maxed out at 1920x1200. Most games it will fly through at that res, but I imagine some will still bring it to its knees. With the clocks in my sig I still get slowdown in a few areas in Oblivion at 1920x1200 HDR+4xAA, and I doubt the 8800GTX will be much faster.

Yeah, but for the majority of games I play at the moment it will.

It would be nice to bump up some of the effects on Oblivion though, that's for sure. That game kills my system big time at my native res. :frown:
 
The power consumption measured is based on the power that DailyTech's G80 required to run those particular applications. Naturally, the apps aren't quite using the G80 core to its full potential, and power consumption can be upwards of 225W per card.

 
Originally posted by: josh6079
Those scores look really degrading to the X1950XTX.

Here Anandtech scored ~20 more than they did with the same card at a higher resolution with the same amount of AA in Quake 4.

i think those scores are very suspicious. a 1950XTX only puts up 34 frames in quake 4 @ 1600x1200 w/4x AA? i don't think so...
 
Originally posted by: Corporate Thug
Originally posted by: josh6079
Those scores look really degrading to the X1950XTX.

Here Anandtech scored ~20 more than they did with the same card at a higher resolution with the same amount of AA in Quake 4.

i think those scores are very suspicious. a 1950XTX only puts up 34 frames in quake 4 @ 1600x1200 w/4x AA? i don't think so...

DT =/= AT.

Remember, different sites use different methods, settings and timedemos.

They could, for one, have used 16x HQ AF along with TRAA/AAA in a different timedemo.

But it looks like G80 is on 80nm.

Link

While ATI prepares to roll out five 80nm graphics processor units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with the rival, according to a Chinese-language Commercial Times report.

The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions from mid-range to high-end market segments, the paper noted.

This could explain the lower power consumption.
 
Originally posted by: CaiNaM
"When compared to AMD's current flagship ATI Radeon X1950 XTX, the GeForce 8800GTX only consumes 24% more power at idle. The power consumption difference under load decreases to around 4%. Considering the performance differences, the GeForce 8800GTX is no worse than AMD's ATI Radeon X1950 XTX in terms of performance-per-watt."

so relatively speaking, the power consumption is about the same as the ati.. where are all those condemning the power consumption of the xtx now (tho if rumors hold true regarding r600 requiring 2x the power of r580, seems there would be good reason to complain about that)?

actually the power consumption isn't an extra 24% at idle. the power consumption of the system as a whole is 24% higher. a radeon 1950xtx consumes just 33 watts at idle. a gf8800 gtx consumes 45 more watts than that, or 136% more watts at idle (actually a little lower, as power supply inefficiencies enlarge the consumption difference, so more like 110%). most people's computers are idle more often than not, so that is a sizeable increase.

at load the x1950xtx consumes about 125 watts, so the 8800 consumes about 135 watts, or 61% more than the 7900GTX (and only ~8% more than the xtx).
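The card-level percentages in this post can be sanity-checked with a quick script. The wattage figures (33 W idle / 125 W load for the X1950XTX, roughly +45 W idle and 135 W load for the 8800GTX) are the ones claimed in the post above, not independent measurements:

```python
def percent_increase(base, other):
    """Percent by which `other` exceeds `base`."""
    return (other - base) / base * 100

# Card-level draws as claimed in the post above (watts)
xtx_idle, xtx_load = 33, 125
gtx_idle, gtx_load = xtx_idle + 45, 135  # 8800GTX: ~45 W more at idle

print(round(percent_increase(xtx_idle, gtx_idle)))  # 136 -> ~136% more at idle
print(round(percent_increase(xtx_load, gtx_load)))  # 8   -> ~8% more at load
```

The quoted 24% idle figure refers to whole-system draw; measured at the card, the same ~45 W gap works out to roughly 136%, which is the point the post is making.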
 
Originally posted by: Elfear
Originally posted by: KeithTalent

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I highly doubt the 8800GTX will be able to run games maxed out at 1920x1200. Most games it will fly through at that res, but I imagine some will still bring it to its knees. With the clocks in my sig I still get slowdown in a few areas in Oblivion at 1920x1200 HDR+4xAA, and I doubt the 8800GTX will be much faster.

We all know there will be speed penalties when using dual cards in Crossfire/SLI. What I mean by that is that you'll never get 2x single card performance.

Take into consideration that these benches show G80 >90% faster in HL2 and Quake 4; I doubt that Crossfired X1900XTXs are 90% faster than a single X1900XTX. Probably more like 60-70% depending on resolution and settings (if you get Crossfire working properly).

Every generation we also see GPUs become more efficient. While your X1900s might get slowdowns, it is very possible that G80 can handle higher resolutions without as much overhead.
 