G70 does 7703 in 3DMark05


Nil Einne

Member
May 4, 2005
40
0
66
Sure, 3DMark05 or any of them won't tell you EXACTLY how gaming performance will be. But if a card scores 2000 in 3DMark05, you know for a fact it's too slow for today's games, given that today's cards are in the 5000/6000 range. A score can still indicate the rough performance potential of a card. Generally, if a card scores 10000, you can expect it to be 2x as fast as a 6800 Ultra in games -- the raw performance needed to give 10000 points is there, and it isn't going to disappear in real-world game benchmarks. As much as people hate 3DMark benches, they show time and time again which card is the most adept at handling shaders and intensive graphics (i.e. ATI cards in the nature scene, 9800 Pro > 5900 series, X800 XT > 6800U). Even though the difference isn't great and it depends on the game (OpenGL, whatever), generally the higher the score in the 3DMark03/05 benches, the more capable the video card is of handling shader-intensive games -- that is where ATI cards continue to slightly lead, every generation since the 9700 Pro. But it certainly won't tell you whether or not you can get 100 frames in Quake 4 or Unreal 3.
Lol, 2000 in 3DMark05 is too slow for today's games? For you maybe, but not for a lot of people. You also seem confused. Whatever today's cards may do doesn't matter, because no one designs games so they can only run on 'today's' cards; they're designed with cards from at least two generations ago as the baseline.

Heck, for many people a card/system that scores 2000 in 3DMark05 will probably be enough for many of 'tomorrow's' games.

Personally, I'm still wondering whether the 7800 is going to be widely available over the next few weeks. Nvidia's marketing blitz suggests it should be, but who knows?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: AnnihilatorX
Isn't that expected?

Rumour was that the name GeForce 7800 came from the 3DMark05 score.

Yes, but those rumours are stupid.
Unless previous cards' names came from their 3DMark scores, like the GeForce4 Ti4800, or the FX5800, or the 6800.
7800 for the top-end product is just, well, obvious.
 

Nil Einne

Member
May 4, 2005
40
0
66
Originally posted by: xtknight
If the image is correct, it says the card is a G70 GTX, unfortunately. Anyway how the hell do these products get over to Chinese speculation sites before they get to US ones? NVIDIA and ATI are in North America...do these people steal the cards from somewhere or do they just leak NDAs like there's no tomorrow?

You seem confused. Nvidia and ATI might be North American companies, but most of their cards are produced in China or Taiwan. Nvidia is launching their card next week. Do you think their partners can produce the cards in a day or two? Er, no... Do you really think their important partners don't get reference cards from the moment they're first produced? Simple fact is, these cards probably end up in a lot more places and people in China and Taiwan than they do in North America. Other than Nvidia and ATI themselves, most of the people who will get them in North America will be review sites and the like, which probably aren't that many in number. Some of the engineering departments of these companies in North America may get them, but the bulk of the cards are probably in China or Taiwan. Maybe some game developers and the like as well. But in terms of the people these end up with, the numbers are going to be much smaller than the numbers that exist in Taiwan and China, and there's also going to be a much tighter lid on things (let's face it, if a 7800 disappears from id Software's offices or AnandTech's office, it's going to be noticed).

Of course, the NDA issue is probably another important one. It's likely a lot harder for Nvidia or ATI to track down and sue someone for breaking an NDA in China or Taiwan, for quite a number of reasons, including the ones listed above (there are just too many possible sources).
 

Nil Einne

Member
May 4, 2005
40
0
66
Originally posted by: Lonyo
Originally posted by: AnnihilatorX
Isn't that expected?

Rumour was that the name GeForce 7800 came from the 3DMark05 score.

Yes, but those rumours are stupid.
Unless previous cards' names came from their 3DMark scores, like the GeForce4 Ti4800, or the FX5800, or the 6800.
7800 for the top-end product is just, well, obvious.

I agree the rumour is a bit silly, but your example is a bit flawed. Unless my memory fails me, the Ti4600 was the original top-end GeForce4 model; the Ti4800 only came out later.

The trend since the 5800 and 6800 does seem to hold, though...
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: Nil Einne
Originally posted by: Lonyo
Originally posted by: AnnihilatorX
Isn't that expected?

Rumour was that the name GeForce 7800 came from the 3DMark05 score.

Yes, but those rumours are stupid.
Unless previous cards' names came from their 3DMark scores, like the GeForce4 Ti4800, or the FX5800, or the 6800.
7800 for the top-end product is just, well, obvious.

I agree the rumour is a bit silly, but your example is a bit flawed. Unless my memory fails me, the Ti4600 was the original top-end GeForce4 model; the Ti4800 only came out later.

The trend since the 5800 and 6800 does seem to hold, though...

I cannot even begin to comprehend how somebody could think Nvidia would name one of its products after a 3dmark score. It's so stupid I can't believe it's still being discussed.... :roll:
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
I hope people realize that a hypothetical 24-pipe NV40 at 430MHz would score close to 9000 in 3DMark05 if the score scaled with fill rate, yet this card gets 7703. So obviously the benchmark is not fill-rate limited at default resolution. Something to chew on......
 

klah

Diamond Member
Aug 13, 2002
7,070
1
0
Originally posted by: xtknight
If the image is correct, it says the card is a G70 GTX, unfortunately. Anyway how the hell do these products get over to Chinese speculation sites before they get to US ones? NVIDIA and ATI are in North America...do these people steal the cards from somewhere or do they just leak NDAs like there's no tomorrow?

Most of these cards are manufactured in China/Taiwan. Hardspell and GZeasy always have access to new hardware before other sites.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I hope people realize that a hypothetical 24-pipe NV40 at 430Mhz would score close to 9000 in 3dmark05.
Can you explain that to the rest of us?

...fact is these cards probably end up in a lot more places and people...

ok, that's just plain gross...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
As far as the score goes... while it certainly doesn't blow my mind, it's nice to think that in a few months, with a bit of money, I might be able to get the same 3DMark05 score out of the computer I check email on and my wife does her taxes on as this guy's rig:

http://forums.vr-zone.com.sg/showthread.php?t=22984

(without the big f***ing clamp either)
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: nitromullet
I hope people realize that a hypothetical 24-pipe NV40 at 430Mhz would score close to 9000 in 3dmark05.
Can you explain that to the rest of us?

Simple speculative math.

A stock Ultra scores around 5700.

So, assuming that efficiency remains constant:

430MHz = 30MHz increase = 7.5%.
24 pipes = 8-pipe increase = 50%.

So a 430MHz 16-pipe NV40 = 5700 x 1.075 = 6127.5
And add 8 pipes = 6127.5 x 1.5 = 9191.25

So if we were fill-rate limited, that's roughly the lowest score we should expect from a 24-pipe G70. But since we don't get that, we are most likely limited somewhere else. That's my best guess.
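For anyone who wants to play with the numbers, here is the same back-of-the-envelope estimate as a few lines of Python. The 5700 baseline, the 400MHz/16-pipe Ultra figures and the assumption of perfectly linear scaling all come from the reasoning above; this is a rough sketch, not a prediction.

# Rough 3DMark05 estimate, assuming the score scales linearly with
# core clock and pixel pipeline count (a big simplification).
baseline_score = 5700   # stock 6800 Ultra score quoted above
baseline_clock = 400    # MHz
baseline_pipes = 16

target_clock = 430      # MHz, rumoured G70 core clock
target_pipes = 24

clock_scale = target_clock / baseline_clock   # 1.075
pipe_scale = target_pipes / baseline_pipes    # 1.5

estimate = baseline_score * clock_scale * pipe_scale
print(round(estimate))  # ~9191, versus the 7703 actually reported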
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: trinibwoy
So if we were fill-rate limited, that's roughly the lowest score we should expect from a 24-pipe G70. But since we don't get that, we are most likely limited somewhere else. That's my best guess.

Take a look at the top graph. As 3DMark05 becomes more fill-rate limited, the Ultra closes in on the PE. If these results are accurate, they tell me two things:

1) The PE's advantage at lower resolutions is due to the higher clock on its vertex engines.
2) The Ultra is more efficient at shading, and this begins to show as the bottleneck moves away from the vertex load.

Disclaimer: this could all be a bunch of crock, but it's fun anyway ;)
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Even if it were true, it's still unreliable information. 3DMark might have to put out a new build of its engine, and of course, game benchmarks will be the ultimate test of its performance.

There's no reason to get your panties in a bunch just yet... guys.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Nil Einne

Heck, for many people a card/system that scores 2000 in 3DMark05 will probably be enough for many of 'tomorrow's' games.

3dmark05 scores
A score of 2000 puts us at GeForce 6600 level. You think the 6600 is a card that's good enough for tomorrow's games? The 6600GT will become low-end in about 10 days. Even today the 6600GT is not capable of playing 90% of games at anything above 1024x768 4AA/8AF - LINK. Sure, it's adequate for most. But then again, Intel has 60% market share with its integrated graphics, so does that mean it's adequate for gaming? Remember, this is a site for hardware enthusiasts. Sure, not everyone owns X800 XTs and 6800 Ultras, but usually it's not a good idea to be more than one generation behind. So when the G70 is around, the X800 XT/XL and 6800GT are the perfect sweet spots for another year or 1.5 years. They score 5000/6000 in 3DMark05. I don't think 7700 is anything amazing by any means from a 24-pipeline card 12+ months after the NV40 was introduced. By the way, ATI is Canadian....

Whatever today's cards may do doesn't matter, because no one designs games so they can only run on 'today's' cards; they're designed with cards from at least two generations ago as the baseline.

This statement contradicts itself. If it doesn't matter whether today's cards are powerful and support the latest features like SM3.0, how do you expect them to run future games? The general rule is in fact the opposite: next-generation cards are meant for today's games, not tomorrow's games. Look at history: by the time DX9 and PS2.0 rolled around, the 9800 Pro and 5900 were too slow to play any of those games smoothly. Even if the 6800 series supports SM3.0 now, by the time HDR and SM3.0 features are widely used, the 6800 will be brought to its knees. And saying that games are designed for cards from two generations ago or beyond is misleading. Sure, Doom 3 can be run on a Voodoo5 and HL2 on a Radeon 64, but both Doom 3 and HL2 were designed to run on current-generation cards like the X800 XT and 6800 Ultra. Anything below that just means the game isn't performing the way the developers intended (i.e. at reduced settings and image quality). Of course, this is my opinion, so you are free to disagree.
 

chinkgai

Diamond Member
Apr 4, 2001
3,904
0
71
lol, i love your posts RS

I'm unimpressed by these prelim scores if they're true... my X800 XT running at PE speeds yields 6300ish.
 

biostud

Lifer
Feb 27, 2003
19,478
6,543
136
Originally posted by: Avalon
Originally posted by: JBT
Maybe this is the vanilla low-end G70? Sort of like the 6800NU of this gen. It's a bit faster than the 5950U, the high end of two gens ago, but it's the low end of this gen?

That's what I was thinking.

It says 7800 GTX in the 7703 screen dump.
 

Intelia

Banned
May 12, 2005
832
0
0
No way is that the right score. I think Nvidia might be trying to suck ATI in.
ATI isn't going to show their hand, no way.
If these are the correct scores, Nvidia is in deep trouble, very deep.
It really is upsetting; now ATI will stay with their game plan. Damn it.
I am waiting for the second generation anyway, the R600 or whatever number it will be.
For you guys interested in CrossFire: don't buy this generation. There seems to be something up with the roadmaps; the R520 may bring another version of CrossFire. Take this with a grain of salt, as I am reading between the lines.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The latest "rumour" (or facts?) by the Inquirer:

"Geforce 7800 GTX - The chip is made using a 110 nanometre process and will have 302 million transistors. So far, this is the biggest chip ever built for graphics use. As we revealed before, the chip will be clocked at 430MHz and will use 1200MHz memory with a 256 bit GDRR3 interface.

It will have eight vertex shader units and will be able to process 24 pixels per clock. Nvidia claims that it has 24 pipelines. Some senior editors are referring to this chip as NV47 as it's nothing more than the NV47 was supposed to be, an NV40 with more pipelines and two more vertex shaders.
The peak fill rate of the card is 6.88 billion pixels/second (16 ROPs at 430MHz). Bilinear-filtered texel fill rate is 10.32 billion texels/second when all 24 pipelines work at the full 430MHz.

The peak power consumption of the chip is 100 to 110W, all the information and benchmarks of the Geforce 7800 Ultra or two cards under SLI will be revealed at six in the morning European time, on the 22nd of June."
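Those fill-rate figures are just unit count times core clock, by the way; here's a quick sanity check in Python using only the numbers quoted above (purely illustrative):

# Sanity check of the quoted fill rates: unit count * core clock.
core_clock_hz = 430e6   # 430 MHz, as quoted
rops = 16               # pixels written per clock
pipelines = 24          # bilinear-filtered texels per clock

pixel_fill = rops * core_clock_hz        # 6.88e9 pixels/s
texel_fill = pipelines * core_clock_hz   # 10.32e9 texels/s
print(pixel_fill / 1e9, texel_fill / 1e9)   # 6.88 and 10.32 billion per second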

OH NO!!!! Now all the Nvidia zealots are going to say that the NV40 was an amazing design in the first place, so simply adding more pipes and vertex shaders and faster memory while keeping everything else the same is OK since it has SM3.0 already. But but... but the R420 is just a dumbed-down R300, they screamed back in the day. Time to run for cover...
 

imported_X

Senior member
Jan 13, 2005
391
0
0
I'm definitely looking forward to seeing the results of the G70 Ultra SLI tests next Wednesday.