G70 does 7703 in 3DMark05


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Well, according to The Inquirer, they were referring to the 7800GTX version - pictured below.

Genx87, the 6800GT/Ultra is 222 or 223 million transistors, I believe. I remember when NV40 was introduced, it was mentioned that a large portion of the additional transistors over R420 was allocated to SM3.0. If that's true, we can roughly estimate:

R420 = 160 million transistors
NV40 = 222 million - so perhaps 62 million were allocated to Nvidia-specific features and had nothing to do with pipes, since both cards are 16-pipe.
G70 = 302 million. We get 24 pipes, a 50% increase, so scale the transistor count: take ATI's SM2.0 design with no Nvidia shadow technology or CineFX (or whatever it's called) = 160 x 1.5 = 240 million transistors. If G70 is largely unchanged otherwise, add back the remaining 62 million. That gives the 302 million specified. Of course this could be a coincidence and I could be blabbing.....
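The back-of-the-envelope math above can be checked directly; here is a minimal sketch, assuming the post's unofficial transistor figures (these are forum estimates, not official die analyses):

```python
# Rough check of the estimate above, using the post's unofficial figures.
r420 = 160  # millions of transistors, 16-pipe SM2.0 part
nv40 = 222  # millions of transistors, 16-pipe SM3.0 part

# Transistors NV40 spends beyond a 16-pipe SM2.0 baseline
# (SM3.0, shadow hardware, etc., per the post's assumption).
nv_extra = nv40 - r420  # 62

# Scale the SM2.0 baseline from 16 to 24 pipes, then add that budget back.
g70_estimate = r420 * (24 / 16) + nv_extra
print(g70_estimate)  # 302.0
```

Which does land exactly on the rumored 302 million - though, as the post concedes, that could be coincidence.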
 

imported_Pressure

Senior member
Jun 9, 2005
200
0
0
Originally posted by: RussianSensation
Well, according to The Inquirer, they were referring to the 7800GTX version - pictured below.

Genx87, the 6800GT/Ultra is 222 or 223 million transistors, I believe. I remember when NV40 was introduced, it was mentioned that a large portion of the additional transistors over R420 was allocated to SM3.0. If that's true, we can roughly estimate:

R420 = 160 million transistors
NV40 = 222 million - so perhaps 62 million were allocated to Nvidia-specific features and had nothing to do with pipes, since both cards are 16-pipe.
G70 = 302 million. We get 24 pipes, a 50% increase, so scale the transistor count: take ATI's SM2.0 design with no Nvidia shadow technology or CineFX (or whatever it's called) = 160 x 1.5 = 240 million transistors. If G70 is largely unchanged otherwise, add back the remaining 62 million. That gives the 302 million specified. Of course this could be a coincidence and I could be blabbing.....
Is this a good or bad thing? I don't know what all this means.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Pressure

Is this a good or bad thing? I don't know what all this means.

I was just saying that, based on transistor count and so on, G70 is very unlikely to be a 32-pipeline card. Also, given the transistor budget, it seems very likely that it's just an enhanced NV40 core and not a brand-new design the way NV40 was compared to NV30. Again, I don't necessarily think it's bad to keep the NV40 design, since it's both excellent in feature set and efficient. But given that Nvidia fans kept dissing ATI for rehashing R300 into R420, I wonder what they'll say now about their G70?
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: RussianSensation
Originally posted by: Pressure

Is this a good or bad thing? I don't know what all this means.

I was just saying that, based on transistor count and so on, G70 is very unlikely to be a 32-pipeline card. Also, given the transistor budget, it seems very likely that it's just an enhanced NV40 core and not a brand-new design the way NV40 was compared to NV30. Again, I don't necessarily think it's bad to keep the NV40 design, since it's both excellent in feature set and efficient. But given that Nvidia fans kept dissing ATI for rehashing R300 into R420, I wonder what they'll say now about their G70?

Doesn't surprise me if they rehash R4xx technology. This round of video cards may be a lame duck, since it appears the more ambitious plans were put off due to the delays of Longhorn and DX Next.

The rumors flying around say the R520 is a rehash of the R420, which is a rehash of the R300. Nvidia is probably playing it safe and going with a 24-pipe NV40 design.

If they bring it out six months before the competition, they'll have gained the upper hand with minimal effort.

Most likely I'll be getting a 7800GT, since I need to build a new machine anyway.
 

imported_Rampage

Senior member
Jun 6, 2005
935
0
0
Originally posted by: Rudee
I was expecting a 10k 3DMark at a minimum. :(

Get real.

The only cards that will do that are the G70 Ultra and the highest-end Radeon... IF the luck of the Irish pours down from heaven.

People do this same thing every generation: expect the moon, and when they don't get it out of the high end (or don't want to pay for it, like SLI), they want $400 cards that do it.

You want 10K in 3DMark05? You buy SLI. Why wait months and months for cards that, when they're finally available, don't do it, and then complain?
If you're that serious about performance, get fvckin' serious already.
 

Kalessian

Senior member
Aug 18, 2004
825
12
81
Originally posted by: RussianSensation
I was just saying that, based on transistor count and so on, G70 is very unlikely to be a 32-pipeline card. Also, given the transistor budget, it seems very likely that it's just an enhanced NV40 core and not a brand-new design the way NV40 was compared to NV30.

Yes, I remember someone on these forums saying pretty much the same thing. I agreed which is why I remembered it. I think it was Pete?

G70 is really NV47, and NV5x, the real next-gen part, won't come out until after the PS3, which makes sense. Nvidia is probably concentrating on the RSX, which it will adapt for the PC later.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: RussianSensation
OH NO!!!! Now all the Nvidia zealots are going to say that NV40 was an amazing design in the first place, so simply adding more pipes and vertex shaders and higher memory and keeping everything else the same is OK since it has SM3.0 already. But but...but R420 is just a dumbed down R300 they screamed back in the days. Time to run for cover...


Don't you think it's a little early to be claiming they kept everything else the same, when no one who knows can tell us yet?

Hypothetically speaking, even if your rumor is true: if these cards deliver 6800GT SLI-level performance on one card, or close to it, sell for a $560 MSRP, run on any PCIe motherboard, and use single-slot cooling, you think that's somehow a "bad" thing?

LOL - I've got news for you: ATI won't be selling many cards till R520 is released, if the above is true. If they can get these parts out the door in enough quantity for there to be vendor competition, I predict some serious nVidia ownage in the months to come.

If I were ATI, I'd be sweating like that guy in "Office Space" whose cube keeps shrinking. What do they think? People are going to buy ULi-based motherboards, master/slave cards, and dongles to get a little better (maybe) SM2 performance? I sort of doubt it.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: biostud
Originally posted by: Avalon
Originally posted by: JBT
Maybe this is the vanilla low-end G70? Sorta like the 6800NU of this gen. It's a bit faster than the 5950U, the high end of two gens ago, but it's the low end of this gen?

That's what I was thinking.

it says 7800 GTX in the 7703 screen dump.

Yeah, but screen dumps don't mean anything. The screenshot shows the Global Settings profile in the drivers, not the 3DMark05 profile. You can force AA/AF to 4x/16x in the 3DMark05 profile in the control panel, leave the Global Settings at "off", and when you run 3DMark05 it will show "none" for AA even though the drivers are forcing it.

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Rollo

Don't you think it's a little early to be claiming they kept everything else the same when no one who knows can tell us yet?

Hypothetically speaking, even if your rumor is true, if these cards deliver 6800GT SLI level performance on one card, or close to it, and sell for $560MSRP, run on any PCIE motherboard, with single slot cooling you think that is somehow a "bad" thing?

I wasn't saying it's 100% an NV40 design, just most likely. And I never said I wouldn't be happy with G70. $560 is too much, though. Last generation the cards doubled the speed of the previous one - even the $399 X800 Pro was 2x faster than the 5950U and 9800XT. Since it doesn't look like G70 is anywhere near 10,000 marks, I don't think this generation is good enough to pay $560 for less than a 2x increase. Of course, I'll wait for gaming benchmarks.
 

Intelia

Banned
May 12, 2005
832
0
0
As long as ATI had the R300 and kept the performance crown, who cares what people say or think? If G70 beats R520, no one should say a word. There are rumors that R520 is R300-based, LOL. To be honest, the R300 is a living legend.
 
SLIorCROSSFIRE

Jun 11, 2005
70
0
0
I doubt this is true, because I read that two Ultras in SLI got 11,000+ in 3DMark05, and I'm sure you could get them up to 12,000+ if you overclocked them, so I doubt this is accurate. I'm guessing the scores will be more like this:

1xG70: 9000~

2xG70: 1700~

...but that's just a guess, no facts behind this at all.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
1700 huh ;) .... you mean 17,000

I don't really care about all this speculation anymore. We'll know the truth in a few weeks.

Additionally, NvNews has some links up regarding the G70. I just want to know where everyone claims to be getting this info. No one states their sources, and no one says where the numbers are from - they were just "obtained". :thumbsdown:

-Kevin
 

Intelia

Banned
May 12, 2005
832
0
0
Originally posted by: SLIorCROSSFIRE
I doubt this is true, because I read that two Ultras in SLI got 11,000+ in 3DMark05, and I'm sure you could get them up to 12,000+ if you overclocked them, so I doubt this is accurate. I'm guessing the scores will be more like this:

1xG70: 9000~

2xG70: 1700~

...but that's just a guess, no facts behind this at all.

Yeah, it would be nice if it scaled like that, but it won't.
So far the only sources (unless you can post one) give 7737 as the G70 score (not reliable).
It seems that one of our members - you know who you are - has tested both the G70 and the R520, and stated that the G70 creams the R520. He must be under a gag order, because he posted no proof. If the G70 doesn't cream the R520, he will have exposed himself as a blatant liar.

 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: Intelia
It seems that one of our members - you know who you are - has tested both the G70 and the R520, and stated that the G70 creams the R520. He must be under a gag order, because he posted no proof. If the G70 doesn't cream the R520, he will have exposed himself as a blatant liar.

Or they could just be extremely stupid and simply not care what strangers on a forum think of them. Just like TKers and spawn campers in online shooters.

 

Intelia

Banned
May 12, 2005
832
0
0
Originally posted by: trinibwoy
Originally posted by: Intelia
It seems that one of our members - you know who you are - has tested both the G70 and the R520, and stated that the G70 creams the R520. He must be under a gag order, because he posted no proof. If the G70 doesn't cream the R520, he will have exposed himself as a blatant liar.

Or they could just be extremely stupid and simply not care what strangers on a forum think of them. Just like TKers and spawn campers in online shooters.

Hey, you just got David's attention. What's a TKer? Also, he wants to know if you think it's easy to get to the other team's spawn and camp out there.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
OH NO!!!! Now all the Nvidia zealots are going to say that NV40 was an amazing design in the first place, so simply adding more pipes and vertex shaders and higher memory and keeping everything else the same is OK since it has SM3.0 already. But but...but R420 is just a dumbed down R300 they screamed back in the days. Time to run for cover...
The big difference here is that there isn't another standard readily available that nVidia could support with G70. You want DirectX Next support? Why? On the other hand, there are quite a few games that do offer SM3.0 and HDR to those whose cards support them. Additionally, another very popular FPS would probably already be using SM3.0 and HDR if it weren't for the developer's relationship with a certain chipmaker that, as of yet, supports neither. I'm not necessarily saying that being fed the same technology year after year by anyone is a good thing, but there should at least be a foreseeable need for a new tech for it to be reasonable. I think it could fairly be argued that nVidia was behind the curve in tech and performance with NV30, and that ATI was behind the curve with R420 in tech, but not performance.

BTW... don't even try to paint me with the fanboy brush either. Aside from what my sig says... I sold my GT yesterday and am now proudly running an ATI Radeon 7200, like the good Canadian that I am. :)
 

Avalon

Diamond Member
Jul 16, 2001
7,569
172
106
Originally posted by: biostud
Originally posted by: Avalon
Originally posted by: JBT
Maybe this is the vanilla low-end G70? Sorta like the 6800NU of this gen. It's a bit faster than the 5950U, the high end of two gens ago, but it's the low end of this gen?

That's what I was thinking.

it says 7800 GTX in the 7703 screen dump.

I see. Is the GTX second in line to the Ultra model?
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: Intelia
Originally posted by: trinibwoy
Originally posted by: Intelia
It seems that one of our members - you know who you are - has tested both the G70 and the R520, and stated that the G70 creams the R520. He must be under a gag order, because he posted no proof. If the G70 doesn't cream the R520, he will have exposed himself as a blatant liar.

Or they could just be extremely stupid and simply not care what strangers on a forum think of them. Just like TKers and spawn campers in online shooters.

Hey, you just got David's attention. What's a TKer? Also, he wants to know if you think it's easy to get to the other team's spawn and camp out there.

TKers are team-killers - players who run around killing their teammates and destroying friendly vehicles. It's easy enough to camp enemy spawns in some games, depending on the map. It's definitely easy to camp the MEC base in the BF2 demo.
 

bersl2

Golden Member
Aug 2, 2004
1,617
0
0
Originally posted by: Gamingphreek
1700 huh ;) .... you mean 17,000

I don't really care about all this speculation anymore. We'll know the truth in a few weeks.

Additionally, NvNews has some links up regarding the G70. I just want to know where everyone claims to be getting this info. No one states their sources, and no one says where the numbers are from - they were just "obtained". :thumbsdown:

-Kevin

Right on.

Unless you have a reputation to put on the line and are willing to name yourself as the source, it's pretty worthless to give information without any kind of sourcing and expect it to mean anything. And even then, always keep an appropriate amount of skepticism, depending on the source.

The amount of bullshiat you people put up with from these companies and shills is amazing. And you all buy it hook-line-sinker every single time.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
the latest

G70, Geforce 7800 GTX final specs out
302 million transistors, eight vertex shaders

The chip is made using a 110 nanometre process and will have 302 million transistors. So far, this is the biggest chip ever built for graphics use. As we revealed before, the chip will be clocked at 430MHz and will use 1200MHz memory with a 256-bit GDDR3 interface.

It will have eight vertex shader units and will be able to process 24 pixels per clock. Nvidia claims that it has 24 pipelines. Some senior editors are referring to this chip as NV47, as it's nothing more than what NV47 was supposed to be: an NV40 with more pipelines and two more vertex shaders.

The peak fill rate of the card is 6.88 Billion/second (16 ROPs at 430 MHz). Bilinear-filtered texel fill rate is 10.32 billion/second when all 24 pipelines work at the full 430MHz.

The peak power consumption of the chip is 100 to 110W.
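The quoted fill-rate figures are just unit count times clock; a quick sanity check, assuming the clock and unit counts The Inquirer quotes above:

```python
# Verify the quoted G70 fill rates: units x clock.
clock_hz = 430e6   # 430 MHz core clock
rops = 16          # pixel output units (ROPs)
pipelines = 24     # texture pipelines, one bilinear texel per clock each

pixel_fill = rops * clock_hz        # peak pixel fill rate
texel_fill = pipelines * clock_hz   # bilinear texel fill rate

print(pixel_fill / 1e9)  # 6.88  (billion pixels/s)
print(texel_fill / 1e9)  # 10.32 (billion texels/s)
```

Both match the article's 6.88 billion/s and 10.32 billion/s numbers.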

:D

almost forgot :eek:

Nvidia G70 benchmarks published
A WEB SITE has published 3DMark05 scores for the G70.

According to Hardspell, that amounts to 7703 3DMarks.

That contrasts slightly with another benchmark for the G70 that a mole whispered to us - 7737, and 12070 for an SLI configuration.

but it's been posted before . . the 2 stories go together like ham and cheese :Q

:laugh:
 

shabby

Diamond Member
Oct 9, 1999
5,782
45
91
I still can't believe the 7800 GTX is using a single-slot cooling solution. It runs faster than the 6800 Ultra and has 50% more pipes, yet it has a single-slot heatsink/fan.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Whoa... slow down.

You sold a 6800GT for a Radeon 7200!!!!!!

What in God's name was going through your mind?

-Kevin
 

imported_Pressure

Senior member
Jun 9, 2005
200
0
0
Anyone else see that The Inquirer article said:

The peak power consumption of the chip is 100 to 110W, all the information and benchmarks of the Geforce 7800 Ultra or two cards under SLI will be revealed at six in the morning European time, on the 22nd of June. µ

So are they going to be showing a card above the GTX too, or are they talking about the GTX when they say Ultra?
 

Avalon

Diamond Member
Jul 16, 2001
7,569
172
106
Originally posted by: Intelia
Originally posted by: SLIorCROSSFIRE
I doubt this is true, because I read that two Ultras in SLI got 11,000+ in 3DMark05, and I'm sure you could get them up to 12,000+ if you overclocked them, so I doubt this is accurate. I'm guessing the scores will be more like this:

1xG70: 9000~

2xG70: 1700~

...but that's just a guess, no facts behind this at all.

Yeah, it would be nice if it scaled like that, but it won't.
So far the only sources (unless you can post one) give 7737 as the G70 score (not reliable).
It seems that one of our members - you know who you are - has tested both the G70 and the R520, and stated that the G70 creams the R520. He must be under a gag order, because he posted no proof. If the G70 doesn't cream the R520, he will have exposed himself as a blatant liar.

You mean the member in my sig? He's a blatant liar, and I put his quote in my sig to expose him as such once the benchmarks are out.