8800GTX to be 30% faster than ATI's X1950XTX. GTS to be about equal to it.


JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
hmm, can't leave this thread alone for too long, jesus...

nice pics Gstan. Hey, holy moses, I just solved the power supply problem. Now, did anybody else notice the 2 SLI bridge connectors? GOOD GOD, now I need another bridge? Will I have to hit eBay to get another one?
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Gstanfor

EDIT: Couldn't resist posting up an image of Batshit all decked out in his "armour".

*sigh* I guess I just don't get it. I am sure you are being extremely funny or something, but to outsiders you seem kinda retarded. Can you please focus on the topic at hand instead of trying so hard to make a nickname stick? You can do it sometimes, like those photos you posted of the naked G80; those were pretty good. Just leave the he-said-she-said bullshit to soap operas and bon-bon boxes. K?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Errr, try looking under the PR part of my image post for the G80 rendered chick.

i don't just post pic1 pic2 pic3 pic4...pic1,000 :p

your "reality redefined" needed a separate pic
:roll:

with a comment directed just for you ;)

:D
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: apoppin
Originally posted by: redbox
Originally posted by: josh6079
Originally posted by: 5150Joker
30% faster sounds good to me, who wants to buy my X1900XTX? It clocks very nicely. I'd like to have read the thread but all the goons and batshit got in the way. :D

Like I said, I hope there are better visuals to be reaped from this than just raw frames. Current X1900's/7900's in SLI or CF will perform well with any game out and then some. If G80 can bring better IQ while maintaining these supposed improvements then I think it will be money well spent.

The thing you have to keep in mind is that the games the G80 will be playing in its lifetime aren't out yet, and we don't know how our current SLI or CF systems will run those new games. Guild Wars, Dark Messiah, Neverwinter Nights: those are going to be some fun games and will no doubt stress our systems. The question is by how much.

current DX9 games are what g80 will be judged by - on Nov 8th . . . not future games . . .

There are a lot of games coming just around the bend. When the x1900s and x1800s came out, Oblivion wasn't out yet. They still got judged by how they handled it. I'm not really saying that the G80 will be tested that much with DX10 games, as there will probably be few of those games in its lifetime. However, to assume that this card won't be judged by, say, Quake Wars is a little outlandish, I think.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: redbox
Originally posted by: apoppin
Originally posted by: redbox
Originally posted by: josh6079
Originally posted by: 5150Joker
30% faster sounds good to me, who wants to buy my X1900XTX? It clocks very nicely. I'd like to have read the thread but all the goons and batshit got in the way. :D

Like I said, I hope there are better visuals to be reaped from this than just raw frames. Current X1900's/7900's in SLI or CF will perform well with any game out and then some. If G80 can bring better IQ while maintaining these supposed improvements then I think it will be money well spent.

The thing you have to keep in mind is that the games the G80 will be playing in its lifetime aren't out yet, and we don't know how our current SLI or CF systems will run those new games. Guild Wars, Dark Messiah, Neverwinter Nights: those are going to be some fun games and will no doubt stress our systems. The question is by how much.

current DX9 games are what g80 will be judged by - on Nov 8th . . . not future games . . .

There are a lot of games coming just around the bend. When the x1900s and x1800s came out, Oblivion wasn't out yet. They still got judged by how they handled it. I'm not really saying that the G80 will be tested that much with DX10 games, as there will probably be few of those games in its lifetime. However, to assume that this card won't be judged by, say, Quake Wars is a little outlandish, I think.

i am not assuming that g80 will not be judged by Quake Wars . . . or Crysis . . . it most definitely will.

However, as some have posted . . . 'time to market' is important . . . all the g80 Hype comes to a 'head' on Nov 8 as the cards are reviewed and judged on current game benchmarks . . . most g80 "buying decisions" will be made then - on the drivers released by 11/8.

nvidia doesn't have a second chance to make a real impact with g80 . . . if it is perceived a 'flop' then it will lose that 'time to market' . . . if it is proclaimed a 'success' on Nov 8 by the reviewers, it will gain substantially.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
'Keep out' revealed

The truly interesting part of this picture is the fact that the GTS has 12 RAM chips - just like the GTX does.

Makes you wonder what the 384- vs 320-bit buses are all about.

It has been (jovially, it would seem) suggested that the chip to the left is 16MB of 55 nm NEC EDRAM, used for z-ROP rendering & direct draw, but it seems an awfully long way from the actual main die (and it sits outside the main memory to boot) if that's what it really does. Note also that NEC won't even have such EDRAM until mid-2007.

I'm wondering if it isn't SoundStorm mark II myself (total speculation on my part, but Huang did say SoundStorm II was coming, and in a way we wouldn't expect).
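[Editor's note: the chip-count puzzle above can be sanity-checked with simple arithmetic. The sketch below assumes a 32-bit interface per memory chip, which was typical for GDDR3 of that era but is not a confirmed G80 specification.]

```python
# Assumption: each GDDR3 chip contributes a 32-bit slice of the memory bus
# (typical for the era, NOT a confirmed G80 specification).
BITS_PER_CHIP = 32

def bus_width(num_chips: int) -> int:
    """Memory bus width implied by the number of RAM chips."""
    return num_chips * BITS_PER_CHIP

def chips_needed(bus_bits: int) -> int:
    """RAM chip count implied by a quoted bus width."""
    return bus_bits // BITS_PER_CHIP

print(bus_width(12))      # 384 -> 12 chips matches the GTX's 384-bit bus
print(chips_needed(320))  # 10  -> a 320-bit GTS would only need 10 chips
```

Under that assumption, 12 chips on a 320-bit card is indeed odd: either some chips sit on a narrower slice of the bus, or part of the memory is disabled or reserved.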
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: apoppin
Originally posted by: redbox
Originally posted by: apoppin
Originally posted by: redbox
Originally posted by: josh6079
Originally posted by: 5150Joker
30% faster sounds good to me, who wants to buy my X1900XTX? It clocks very nicely. I'd like to have read the thread but all the goons and batshit got in the way. :D

Like I said, I hope there are better visuals to be reaped from this than just raw frames. Current X1900's/7900's in SLI or CF will perform well with any game out and then some. If G80 can bring better IQ while maintaining these supposed improvements then I think it will be money well spent.

The thing you have to keep in mind is that the games the G80 will be playing in its lifetime aren't out yet, and we don't know how our current SLI or CF systems will run those new games. Guild Wars, Dark Messiah, Neverwinter Nights: those are going to be some fun games and will no doubt stress our systems. The question is by how much.

current DX9 games are what g80 will be judged by - on Nov 8th . . . not future games . . .

There are a lot of games coming just around the bend. When the x1900s and x1800s came out, Oblivion wasn't out yet. They still got judged by how they handled it. I'm not really saying that the G80 will be tested that much with DX10 games, as there will probably be few of those games in its lifetime. However, to assume that this card won't be judged by, say, Quake Wars is a little outlandish, I think.

i am not assuming that g80 will not be judged by Quake Wars . . . or Crysis . . . it most definitely will.

However, as some have posted . . . 'time to market' is important . . . all the g80 Hype comes to a 'head' on Nov 8 as the cards are reviewed and judged on current game benchmarks . . . most g80 "buying decisions" will be made then - on the drivers released by 11/8.

nvidia doesn't have a second chance to make a real impact with g80 . . . if it is perceived a 'flop' then it will lose that 'time to market' . . . if it is proclaimed a 'success' on Nov 8 by the reviewers, it will gain substantially.

You are right. Those who are planning on buying G80 right out of the gates will really look at the release, and for those buyers Nvidia will have to put on one heck of a show to prove that their pony deserves the audience's attention. The x1900xtx did pretty well at its release; however, ATI didn't sell all of their cards opening week. Once Oblivion came out and was so popular, people started looking at what cards would play that game without many problems and with the most eye candy. The x1k series did well with that game. When they started lowering the price, even more people bought the card.

So you are right, Nvidia needs a good show early on. However, I would hesitate to say most G80 buying decisions will be made right at release. ATI really had a second chance with their x1800s with Oblivion, and IMO those cards, though late, came out to be pretty decent cards in their own right. Nvidia has the same chance here. Most games that are out right now get quite playable frames on current-gen hardware. Even if G80 gives us "only 30%" in current games but holds the potential for stomping current cards in future games (i.e. those coming this month or next month), then the outcome for Nvidia won't be so bad.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
However, as some have posted . . . 'time to market' is important . . . all the g80 Hype comes to a 'head' on Nov 8 as the cards are reviewed and judged on current game benchmarks . . . most g80 "buying decisions" will be made then - on the drivers released by 11/8.
Time to market also serves another purpose: it tends to cement public opinion about a particular generation of hardware (providing the hardware performs well) - if it's first to market and it performs well, the opposition has no real chance of catching up.

Also, it means that developers are more likely to code for that chip - since more of that family will be circulating and they have had more time to get used to it.

Time to market is critical for many reasons, and only a fool would play down its advantages (providing the hardware performs as reasonably expected).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: redbox
Originally posted by: apoppin
i am not assuming that g80 will not be judged by Quake Wars . . . or Crysis . . . it most definitely will.

However, as some have posted . . . 'time to market' is important . . . all the g80 Hype comes to a 'head' on Nov 8 as the cards are reviewed and judged on current game benchmarks . . . most g80 "buying decisions" will be made then - on the drivers released by 11/8.

nvidia doesn't have a second chance to make a real impact with g80 . . . if it is perceived a 'flop' then it will lose that 'time to market' . . . if it is proclaimed a 'success' on Nov 8 by the reviewers, it will gain substantially.

You are right. Those who are planning on buying G80 right out of the gates will really look at the release, and for those buyers Nvidia will have to put on one heck of a show to prove that their pony deserves the audience's attention. The x1900xtx did pretty well at its release; however, ATI didn't sell all of their cards opening week. Once Oblivion came out and was so popular, people started looking at what cards would play that game without many problems and with the most eye candy. The x1k series did well with that game. When they started lowering the price, even more people bought the card.

So you are right, Nvidia needs a good show early on. However, I would hesitate to say most G80 buying decisions will be made right at release. ATI really had a second chance with their x1800s with Oblivion, and IMO those cards, though late, came out to be pretty decent cards in their own right. Nvidia has the same chance here. Most games that are out right now get quite playable frames on current-gen hardware. Even if G80 gives us "only 30%" in current games but holds the potential for stomping current cards in future games (i.e. those coming this month or next month), then the outcome for Nvidia won't be so bad.

i dunno . . . the x1800 series could be considered a 'failure' ... it never overcame the late, failed launch . . . ati's situation looked pretty grim until x1900 [which did have a decent launch].

looking back even further, the radeon 8500 never could overcome its poor initial showing - even though it later managed to [nearly] catch up with the Ti series with improved drivers . . . need i mention the DustBuster and its failed launch and what it did for nvidia's credibility? :p

With R600 relatively near, ATi will release a lot of small details to help offset the "+30% performance lead" [evidently] held by g80 - as well as price drops to compete with the GTS . . . if G80 is PERCEIVED as disappointing or overpriced . . . then it WILL 'hurt' nvidia's sales.

this launch is important . . . as noted by others . . . this "time to market" is especially important . . .

nvidia stands to gain or lose with g80 . . . more so than usual . . . it is a NextGen GPU on which many refreshes will likely be based . . . if G80 is flawed . . . bad news.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Actually, this will be only the second time since R200 that nvidia has managed to launch ahead of ATi in recent years. ATi's slipping release dates allowing nvidia to take the launch lead may end up proving extremely costly to them. It was a very large levelling factor ATi had going for it while it lasted.

You can rest assured that nvidia finding themselves in this happy situation was no accident either.
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
I am going to continue to be biased and predict that...

1) All along Crytek has been working with the superior card, ATI's R600, during the development of Crysis.
2) ATI's R600 will launch when Crysis launches, perhaps even bundled together.
3) Crysis will be the first major test of DX10 technology.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Awesome finds Gstanfor. I dropped them in lopri's thread, so he can add them to his first post. Good job, and nice pics...
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
the 'keep out' is the quantum physics processor. guaranteed.

please someone answer me on the dual SLI bridge issue
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Actually, this will be only the second time since R200 that nvidia has managed to launch ahead of ATi in recent years. ATi's slipping release dates allowing nvidia to take the launch lead may end up proving extremely costly to them. It was a very large levelling factor ATi had going for it while it lasted.

You can rest assured that nvidia finding themselves in this happy situation was no accident either.
RECENT years? :p

the rivalry between ATI and nVidia has taken place ALL in "recent years"

Your History is plainly wrong . . . unlike you - who flatters me by posting mini errors i make in YOUR sig - you are NOT important enough for ANYone to make a sig of your errors . . . nor is there enough room in PAGEs of signatures to put 1/4 of the misleading statements, fud, hate, personal attacks, misquotes, childish name calling and outright lies you continually spew
:thumbsdown:

let's see: radeon 8500 was 6 months after the GF3 release . . . then GF4 Ti was out next . . . THEN r300 [ahead of nvidia] . . . then x800 [a refresh] . . . and x1800 was 6 months late :p

and YOU are ASSuming that g80 will be perceived a "success"
:Q

if g80 is a 'failure' or 'neutral' then ati finds themselves in a happy situation . . . that may be no accident . . . either ;)

:)
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
retards, can you stop fighting and please answer some questions, since it seems like you all know more than anyone else!
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: JAG87
the 'keep out' is the quantum physics processor. guaranteed.

please someone answer me on the dual SLI bridge issue

I don't think anyone really knows the answer to the dual bridge yet. Maybe it's a two-way communication thing like the latest version of CrossFire...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: JAG87
retards can you stop fighting and please answer some questions, since it seems like you all know more than anyone else!

ask them :p

:roll:

and no one knows or can say till nov 8 about your 'bridges' :p

ask another

:D
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: JAG87
the 'keep out' is the quantum physics processor. guaranteed.

please someone answer me on the dual SLI bridge issue

Looks a lot like the new internal CrossFire bridge on the X1950 PRO.

PIC
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I think the GTX needs the extra interconnect. I can't see triple SLI or quad SLI (with 4 cards) happening.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
no, I mean you need 2 bridges to run GTXs in SLI.

Right now, for comparison, SLI runs in simplex (one transmission, one way). Now SLI is going to be full duplex (two transmissions, two ways).

That could help quite a bit with synchronization issues.
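[Editor's note: the simplex-vs-duplex reasoning above can be sketched as a toy model. The per-bridge rate below is a purely hypothetical placeholder; nvidia published no bandwidth figure for these connectors.]

```python
# Toy model of one bridge vs two. PER_BRIDGE_RATE is HYPOTHETICAL --
# nvidia never published a figure for this connector.
PER_BRIDGE_RATE = 1.0  # arbitrary units; a bridge carries one direction at a time

def simplex_throughput(bridges: int = 1) -> tuple[float, float]:
    """One shared path: at any instant, traffic flows only one way."""
    return (bridges * PER_BRIDGE_RATE, 0.0)

def duplex_throughput(bridges: int = 2) -> tuple[float, float]:
    """One dedicated path per direction: both GPUs can send at once."""
    per_direction = bridges / 2 * PER_BRIDGE_RATE
    return (per_direction, per_direction)

print(simplex_throughput())  # (1.0, 0.0): reverse-direction sync must wait
print(duplex_throughput())   # (1.0, 1.0): sync data flows both ways at once
```

The point of the model is only that a second bridge removes contention between the two directions, which is where the claimed synchronization benefit would come from.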
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
I thought there was quite a lot of speculation going around about how nVidia and ATi were going to migrate to 3 GPUs in sync, since they're putting physics processors on board?