NV35 Benchmarks from Cebit: 131% faster than NV30... and it's downclocked! *Post from Anand Inside*

Page 3

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
Originally posted by: AtomicDude512
I hope this isn't the core that produces 120W like Nvidia's CEO said a while back... :(

On the plus side, if it isn't 60 dB, that's nice. :)

120W is a bit extreme. I think 120W would be right if NVIDIA were planning on releasing a dual-chip GeForce card.
 

Kommet

Junior Member
Jan 7, 2002
2
0
0
I've been waiting and waiting for someone to point out that the NV30 runs with DDR-II memory (which someone finally has) and that DDR-II runs its core differently than DDR-I did (which nobody has yet).

DDR-II runs its address clock at twice the frequency of its internal DRAM core. Thus DDR-II 400 (200 MHz address clock) runs its core at 100 MHz, for a 10 ns cycle time. The DDR-II 1000 (500 MHz address clock) memory used by the NV30 and presumably the NV35 is pushing data at 1 Gbit/s per pin, but its DRAM cells are only (heh. only.) cycling at 4 ns, not 2 ns like people have been discussing.

DDR-II memory cells actually have a longer cycle time at 1 Gbit/s per pin (500 MHz DDR) than DDR-I operating at 650 Mbit/s per pin (325 MHz DDR), in fact appreciably so. This is why we are seeing a push to DDR-II now, and I guess GDDR-3 in the near-ish future: the memory interface can be scaled up faster than DRAM cycle times can be scaled down. Maintaining bandwidth while doubling cycle time is a huge load off the mind for desktop DRAM makers (DDR-I 400 vs. DDR-II 400), and maintaining cycle times while doubling theoretical bandwidth is a huge win for device manufacturers like nVidia.
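Kommet's clock arithmetic checks out. Here is the same math as a tiny Python sketch (the function names are mine, and this is the simplified core-vs-address-clock model from the post, not a full DRAM timing model):

```python
# DDR-I's address clock equals its core clock; DDR-II's address clock
# runs at twice the core clock.  Both transfer data on both clock edges.

def ddr1_core_cycle_ns(data_rate_mbps_per_pin):
    core_clock_mhz = data_rate_mbps_per_pin / 2   # two transfers per clock
    return 1000.0 / core_clock_mhz

def ddr2_core_cycle_ns(data_rate_mbps_per_pin):
    address_clock_mhz = data_rate_mbps_per_pin / 2
    core_clock_mhz = address_clock_mhz / 2        # core at half the address clock
    return 1000.0 / core_clock_mhz

print(ddr2_core_cycle_ns(400))    # 10.0 ns  (DDR-II 400, 200 MHz address clock)
print(ddr2_core_cycle_ns(1000))   # 4.0 ns   (DDR-II 1000 on the NV30)
print(ddr1_core_cycle_ns(650))    # ~3.08 ns (DDR-I at 650 Mbit/s per pin)
```

Note the last two lines: the DDR-II part delivers more bandwidth per pin while its cells run at a *longer* 4 ns cycle than DDR-I's ~3.08 ns, which is exactly the win described above.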

The R350 was "released" in early March and the NV35 is expected to be "released" in May. As "released" vs. "available" is a gray area, the NV35 could realistically be available anywhere from 1 to 3 months after the R350, although anywhere from negative to positive infinity is possible. This is NOT the 6-month lead time some people seem to have assumed. nVidia has stated publicly that the delayed release/availability of the NV30 did not greatly change the release date of the NV35. The GeForce FX 5800 is thus almost stillborn, but longer-term roadmaps remain largely in place.

Finally, the NV35 is unlikely to be pitted against the R400 unless ATI moves up the R400 design from 2004 or nVidia misses a release. Based on the headroom in the R350, really ATI should be just fine for a Fall Refresh part (switch to DDR-II or GDDR-3, tune the core a little, raise clocks, done). Both companies expect their next major cores to arrive in 2004, presumably about this time next year.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: GTaudiophile
According to Anand's review of the Radeon 9800 PRO, it scores 128.5 FPS in QIII with 4X AA/8X Performance Aniso at 1600x1200x32.

I don't see the big news here. By the time the NV35 is released, the Radeon 9800 PRO will cost much less and trail by, say, 20-40 FPS in said benchmark, if that. And if nVidia releases the NV35 this summer or fall, they'll be getting close to the R400's release date.

The big news here, GTaudiophile, is that the NV35 was UNDERCLOCKED to 250MHz while the ATI 9800 Pro was at full clock speed. The NV35 at 250MHz got 111 FPS, compared to ATI's 128 FPS at full speed. When the NV35 gets full clock speeds, this thing is gonna be tough to beat. However... the price!
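For what it's worth, naive linear scaling from those leaked numbers is easy to run. The 450 MHz "full" clock below is my guess purely for illustration (nVidia hadn't confirmed final clocks), and real frame rates won't scale linearly with core clock because of memory bandwidth and CPU limits:

```python
# Back-of-envelope scaling of the leaked, downclocked NV35 score.
# Treat the result as an optimistic upper bound, not a prediction.
downclocked_mhz = 250
downclocked_fps = 111
assumed_full_mhz = 450  # hypothetical final clock -- unconfirmed

upper_bound_fps = downclocked_fps * assumed_full_mhz / downclocked_mhz
print(round(upper_bound_fps, 1))  # ~200 FPS vs. the 9800 Pro's 128.5
```

Even if the real card lands well short of that ceiling, the per-clock numbers explain the excitement.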
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
Originally posted by: Shamrock
Originally posted by: GTaudiophile
According to Anand's review of the Radeon 9800 PRO, it scores 128.5 FPS in QIII with 4X AA/8X Performance Aniso at 1600x1200x32.

I don't see the big news here. By the time the NV35 is released, the Radeon 9800 PRO will cost much less and trail by, say, 20-40 FPS in said benchmark, if that. And if nVidia releases the NV35 this summer or fall, they'll be getting close to the R400's release date.

The big news here, GTaudiophile, is that the NV35 was UNDERCLOCKED to 250MHz while the ATI 9800 Pro was at full clock speed. The NV35 at 250MHz got 111 FPS, compared to ATI's 128 FPS at full speed. When the NV35 gets full clock speeds, this thing is gonna be tough to beat. However... the price!


I also question the IQ settings those benchmarks used. If it's the same as on the NV30 (Performance or Balanced), you are comparing apples to oranges. If you want to compare it to ATI's quality-settings FPS, I hope they benched it in APPLICATION mode. I have my doubts about that, however.
 

jbirney

Member
Jul 24, 2000
188
0
0
The big news here, GTaudiophile, is that the NV35 was UNDERCLOCKED to 250MHz while the ATI 9800 Pro was at full clock speed. The NV35 at 250MHz got 111 FPS, compared to ATI's 128 FPS at full speed. When the NV35 gets full clock speeds, this thing is gonna be tough to beat. However... the price!

You're also comparing different AA modes. I would take ATI's rotated-grid, gamma-corrected AA over the ordered-grid 4x of the FX any day :)
 

Wurrmm

Senior member
Feb 18, 2003
428
0
0
Ooooo....looks good but when??? Will Nvidia announce in May??? Does anybody know?
 

Wurrmm

Senior member
Feb 18, 2003
428
0
0
The way I look at it... Nvidia needs to say something soon, because there are a lot of people out there in the May upgrade mood looking for a good GPU. I might have to do like a lot of people and get a 9800 Pro. Great and innovative tech doesn't do anybody any good if it isn't here.
 

DannyBoy

Diamond Member
Nov 27, 2002
8,820
2
81
www.danj.me
I'm holding back, waiting for nVidia to come up with something good.

I've stayed with nVidia right from the days I got my first GeForce 256 GPU.

I'm used to nVidia now, and I don't like the idea of moving to ATI.

I remember using ATI graphics cards long before nVidia was around, and I don't miss them.

Perhaps I will be considering the 9800 Pro though, unless I hear of something good from nVidia soon.

I start a new job in 2 weeks, and after my first 4 weeks, when I receive my first paycheck (which will be a nice one), I will be buying a new computer setup.

The mobo & CPU I have chosen are going to be the obvious nForce2 combo.

As for the graphics card, well, I hope they do something good soon :(

Dan
 

Wurrmm

Senior member
Feb 18, 2003
428
0
0
Man...I hear you....I have also been with Nvidia for a long long time and when I get my P4 Canterwood system, I would like a nice GPU to match it. Hopefully Nvidia will announce next month around the time that Intel CPU prices drop.
 

DannyBoy

Diamond Member
Nov 27, 2002
8,820
2
81
www.danj.me
Originally posted by: Wurrmm
Man...I hear you....I have also been with Nvidia for a long long time and when I get my P4 Canterwood system, I would like a nice GPU to match it. Hopefully Nvidia will announce next month around the time that Intel CPU prices drop.

Yeah, I would possibly consider Intel, but only if the price was low enough, which I'm pretty sure it won't be.

I will be going nForce2 due to the insanely low prices compared to Intel chipsets & chips, and the nice overclockability :)

Dan
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Adul
Originally posted by: Wurrmm
Ooooo....looks good but when??? Will Nvidia announce in May??? Does anybody know?

nvidia knows

Rumours . . . WE got RUMOURS - NV 40 and beyond (the Inq of course, yesterday)


NV40 will give Nvidia performance lead by year end

Pricing and other machinations of the graphics folk

THIS IS WHERE we see pricing and technology from ATI and Nvidia for their family of products close to the end of this year and the beginning of next year.

Remember that by this time next year, PCI Express is bound to upset the applecart and change all the rules of the game.

ATI
Up to $100 - R9200 family based on RV280 (still DX8 chip)
Up to $200 - R9600 family on RV350
Up to $300 - R9800 family on R350
Up to $400 - R9900 family on R350 with DDRII memory

Nvidia
Up to $100 - NV34 in a lot of different core/memory speeds (DX9!)
Up to $200 - NV36 the true half of NV35
Up to $300 - NV35 256 bit DDR + twice the floating power of NV30
Up to $400 - NV40 - we'll probably see it as a Christmas present.

But we don't see any NV30/31 at the end of this year. Seems like this was a kind of error from Nvidia's point of view, and Nvidia has attempted to correct this "error" just as fast as it possibly can.

And it also seems as if NVDA is doing a sort of half step forward: not only will it have DX9 in all segments but it will also try to introduce the new-generation chip NV40 sometime around the end of the year, a couple of months earlier than ATI's R400 schedule.

Let's explore this a little further.

The NV35 - DX9 chip with PS/VS 2.0+ functionality and base OpenGL 2.0 fragment shading capabilities. 4×2 architecture with an 8×1 turbo mode for single texturing, with two universal floating/integer ALUs and two TMUs on each pipe. It will also have an internal register file twice as wide, which removes the known NV30 bottleneck and boosts complicated 2.0 shader performance a lot.

NV36 is a true half-slice of the NV35 - with 2×2 or 4×1 modes. Pixel power is halved but the vertex power is pretty much the same as the NV35.

The NV40 - DX9 chip with PS/VS 3.0 functionality and sophisticated OpenGL 2.0 fragment shading capabilities. 8×1 (only) plain architecture with two universal floating/integer ALUs and one TMU on each pipe. So no more sophisticated schemes like 4×2 or 8×1 depending on the application, only raw power.

Yes, it's hard to believe right now, but probably by the end of the year history will repeat itself, and Nvidia stands to be a bit faster and a bit more revolutionary, but ATI will be less expensive and will give practically the same benefits.

So developers will be able to rest a little. Only NV40 will create problems for them, everything else will remain pretty similar right up to the end of the year.

So, guys, now's the time to sit down and write a couple of 2.0 shaders, eh? µ
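Those rumored pipe layouts are easy to sanity-check with fillrate arithmetic. A tiny sketch follows; the 450 MHz clock is purely an assumption for illustration (no clocks appear in the rumor), and the helper function is mine:

```python
# Theoretical fillrates for a pipes-by-TMUs configuration at a given clock.
def fillrates(pipes, tmus_per_pipe, clock_mhz):
    pixel_gps = pipes * clock_mhz / 1000.0                  # Gpixels/s
    texel_gps = pipes * tmus_per_pipe * clock_mhz / 1000.0  # Gtexels/s
    return pixel_gps, texel_gps

assumed_clock_mhz = 450  # illustrative only

print(fillrates(4, 2, assumed_clock_mhz))  # NV35-style 4x2: (1.8, 3.6)
print(fillrates(8, 1, assumed_clock_mhz))  # NV40-style 8x1: (3.6, 3.6)
```

This shows why an 8×1 layout (or the NV35's rumored 8×1 "turbo mode") matters: at the same clock and the same texel rate, it doubles single-textured pixel throughput over a plain 4×2.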

I got more rumours . . . :D

 

DannyBoy

Diamond Member
Nov 27, 2002
8,820
2
81
www.danj.me
Originally posted by: apoppin
Originally posted by: Adul
Originally posted by: Wurrmm
Ooooo....looks good but when??? Will Nvidia announce in May??? Does anybody know?

nvidia knows

Rumours . . . WE got RUMOURS - NV 40 and beyond (the Inq of course, yesterday)


NV40 will give Nvidia performance lead by year end

Pricing and other machinations of the graphics folk

THIS IS WHERE we see pricing and technology from ATI and Nvidia for their family of products close to the end of this year and the beginning of next year.

Remember that by this time next year, PCI Express is bound to upset the applecart and change all the rules of the game.

ATI
Up to $100 - R9200 family based on RV280 (still DX8 chip)
Up to $200 - R9600 family on RV350
Up to $300 - R9800 family on R350
Up to $400 - R9900 family on R350 with DDRII memory

Nvidia
Up to $100 - NV34 in a lot of different core/memory speeds (DX9!)
Up to $200 - NV36 the true half of NV35
Up to $300 - NV35 256 bit DDR + twice the floating power of NV30
Up to $400 - NV40 - we'll probably see it as a Christmas present.

But we don't see any NV30/31 at the end of this year. Seems like this was a kind of error from Nvidia's point of view, and Nvidia has attempted to correct this "error" just as fast as it possibly can.

And it also seems as if NVDA is doing a sort of half step forward: not only will it have DX9 in all segments but it will also try to introduce the new-generation chip NV40 sometime around the end of the year, a couple of months earlier than ATI's R400 schedule.

Let's explore this a little further.

The NV35 - DX9 chip with PS/VS 2.0+ functionality and base OpenGL 2.0 fragment shading capabilities. 4×2 architecture with an 8×1 turbo mode for single texturing, with two universal floating/integer ALUs and two TMUs on each pipe. It will also have an internal register file twice as wide, which removes the known NV30 bottleneck and boosts complicated 2.0 shader performance a lot.

NV36 is a true half-slice of the NV35 - with 2×2 or 4×1 modes. Pixel power is halved but the vertex power is pretty much the same as the NV35.

The NV40 - DX9 chip with PS/VS 3.0 functionality and sophisticated OpenGL 2.0 fragment shading capabilities. 8×1 (only) plain architecture with two universal floating/integer ALUs and one TMU on each pipe. So no more sophisticated schemes like 4×2 or 8×1 depending on the application, only raw power.

Yes, it's hard to believe right now, but probably by the end of the year history will repeat itself, and Nvidia stands to be a bit faster and a bit more revolutionary, but ATI will be less expensive and will give practically the same benefits.

So developers will be able to rest a little. Only NV40 will create problems for them, everything else will remain pretty similar right up to the end of the year.

So, guys, now's the time to sit down and write a couple of 2.0 shaders, eh? µ

I got more rumours . . . :D

Dats happy man, thats real happy.

I think the whole Inquirer Site is Happy.

I feel sorry for the people who believe that crap.

Would be kinda cool to see "NV40" at christmas, although we know this is not true...although who knows, perhaps nVidia is keeping a BIG secret (Which we all know is very unlikely) ;)
 

mechBgon

Super Moderator, Elite Member
Oct 31, 1999
30,699
1
0
If that card really does draw over 100W of power, it will be interesting to watch what happens when it's installed in a cube with a ~200W PSU. Hopefully the real power draw is much lower than that.
 

DannyBoy

Diamond Member
Nov 27, 2002
8,820
2
81
www.danj.me
Originally posted by: mechBgon
If that card really does draw over 100W of power, it will be interesting to watch what happens when it's installed in a cube with a ~200W PSU. Hopefully the real power draw is much lower than that.

Lol, yeah, I agree with you, Tom; there would be a lot of annoyed people if that were the case.

Do you or anyone else know what the current draw is on the 9800?
 

mechBgon

Super Moderator, Elite Member
Oct 31, 1999
30,699
1
0
Originally posted by: DannyBoy
Originally posted by: mechBgon
If that card really does draw over 100W of power, it will be interesting to watch what happens when it's installed in a cube with a ~200W PSU. Hopefully the real power draw is much lower than that.

Lol, yeah, I agree with you, Tom; there would be a lot of annoyed people if that were the case.

Do you or anyone else know what the current draw is on the 9800?
Ok, Google coughed this up: Power output of GFFX is ~75W and they say that's about 1/3 higher than a 9700 Pro, so the 9700 Pro must be in the neighborhood of 50-55W. The 9800 is built on 0.15-micron process like the 9700's, so it should be a bit higher, say 55-60W? Sound logical?
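That estimate, spelled out as arithmetic (all inputs are the rough, unofficial figures quoted above, not vendor specs):

```python
# Work backwards from the reported GeForce FX figure.
gffx_watts = 75.0                       # reported GFFX draw (unofficial)
r9700_watts = gffx_watts / (4.0 / 3.0)  # GFFX is said to be ~1/3 higher
print(round(r9700_watts, 1))            # ~56 W -- right at the top of the
                                        # 50-55 W neighborhood guessed above
```

So a 0.15-micron 9800 at slightly higher clocks landing around 55-60 W does sound logical, with the caveat that every number here is secondhand.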
 

DannyBoy

Diamond Member
Nov 27, 2002
8,820
2
81
www.danj.me
Originally posted by: mechBgon
Originally posted by: DannyBoy
Originally posted by: mechBgon
If that card really does draw over 100W of power, it will be interesting to watch what happens when it's installed in a cube with a ~200W PSU. Hopefully the real power draw is much lower than that.

Lol, yeah, I agree with you, Tom; there would be a lot of annoyed people if that were the case.

Do you or anyone else know what the current draw is on the 9800?
Ok, Google coughed this up: Power output of GFFX is ~75W and they say that's about 1/3 higher than a 9700 Pro, so the 9700 Pro must be in the neighborhood of 50-55W. The 9800 is built on 0.15-micron process like the 9700's, so it should be a bit higher, say 55-60W? Sound logical?

Do you trust what comes from the Inquirer?

I mean, I often wonder if half the stuff that ends up on their site was written by an 11-year-old & then stolen off this kid's site


Is there no reliable evidence for the power ratings of the NV35 based cards?
 

mechBgon

Super Moderator, Elite Member
Oct 31, 1999
30,699
1
0
Originally posted by: DannyBoy
Originally posted by: mechBgon
Originally posted by: DannyBoy
Originally posted by: mechBgon
If that card really does draw over 100W of power, it will be interesting to watch what happens when it's installed in a cube with a ~200W PSU. Hopefully the real power draw is much lower than that.

Lol, yeah, I agree with you, Tom; there would be a lot of annoyed people if that were the case.

Do you or anyone else know what the current draw is on the 9800?
Ok, Google coughed this up: Power output of GFFX is ~75W and they say that's about 1/3 higher than a 9700 Pro, so the 9700 Pro must be in the neighborhood of 50-55W. The 9800 is built on 0.15-micron process like the 9700's, so it should be a bit higher, say 55-60W? Sound logical?

Do you trust what comes from the Inquirer?

I mean, I often wonder if half the stuff that ends up on their site was written by an 11-year-old & then stolen off this kid's site


Is there no reliable evidence for the power ratings of the NV35 based cards?
For starters, what they publish seems often to come true. If you've got a laundry list of failed Inquirer predictions, I'd be interested to see :D

As for their numbers in this case, it certainly sounds reasonable to me for the GeForce FX to have 75W of thermal output, or else they wouldn't need that crazy Dustbuster stuck to it (or an auxiliary power lead either). Eh? ;) However, I'll see if I can dredge up some hard data on either GPU at ATI's and nVidia's sites. Typically this is a lost cause because for some reason, they like to hide that kind of stuff from the consumers. Anand, you got any input on the official power draw of any of these GPUs?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: GTaudiophile
Originally posted by: jdogg707
But they said it was clocked down to half the original spec for a fair comparison, so I would think that by increasing the speed by another 250MHz for the final product we will see more than a 20-40 FPS difference.

I don't think another 250mhz is going to produce 222 FPS in that benchmark, sorry.

Yes, I expect nVidia to take the crown with NV35, but I also expect ATi to respond swiftly with R400.

I agree, and I also don't see where another 30 FPS even matters when you are running 1600x1200 at 8X/4X and 100+ FPS. I mean, come on.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I have to question their benchmarks in Q3 though because Q3 is very old and not very good at stressing ANY GPU anymore. Maybe a GF4 MX, but definately not a GFFX no matter what clock speed.