G80 De-Mystified; GeForce 8800GTX/GT

Page 5

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Schadenfroh
Originally posted by: BFG10K
All water cooling sucks.

When you spring a leak...well, you already know what happens then.

Agreed, I don't like the idea of fluids flowing inside a $650+ video card.

I was a bit wary as well. Once you leak-test everything, with all of your components safe since no electricity is flowing through them, you begin to like it. Once you put a loop together without messing up, the reward is awesome.

It just takes a lot more care and time to set up a water cooling loop than to slap on a heatsink with a fan and forget about it. The maintenance is a little more complicated than just spraying a heatsink to clear it of dust as well. Then again, there is less dust in your case because there aren't as many fans.
 

CVSiN

Diamond Member
Jul 19, 2004
9,289
1
0
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.

You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..

good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..
How so? I don't think either side has "buried" anyone. Both are pretty close in a lot of things and differ only in AA/AF algorithms.

good job spouting no facts..
Was that a pat on the back?
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..
Depends on the benchmark you're looking at, what settings were used, etc. Ulfhednar was talking about the price/performance ratio, and last I checked, Nvidia's cards seemed too expensive for the performance they gave, whereas ATI's lower X1900XT models are priced amazingly for the kind of performance you get.

 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Let's keep this thread a discussion about the upcoming 8 series, OK?

We can argue about current cards and listen to the fanboys in other threads. :D
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: Cookie Monster
Originally posted by: josh6079
Originally posted by: Cookie Monster
The stuff has been taken down from both hardspell and vr-zone.

What does this all mean? :D

...that Shamino messed up again :p

Who knows. I say just sit back and relax. Even when it hits I'm going to wait and see just how good it does against the R600. If it beats it in what I want it to, then I'll gladly buy a G80. I think Nvidia has what it takes to be an industry leader, but I think ATI has that too. We'll just have to see what happens and not pre-ejaculate before the veil even falls.

:laugh:

I wonder if the whole "... scaling up to 1.5ghz.." means GDDR4 and not the core clock speed.

edit - wonder if this 700M chip means 350 + 350 (dual GPU), similar to the 7950GX2 except on 1 PCB.

Good guess on the 1.5ghz thing

 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: CVSiN
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.
You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..
I don't recall mentioning the 7800 series but the 7900 series was terrible in terms of price/performance ratio (at least here in the UK), especially the 7950GX2, when compared with ATI alternatives which have been at rock-bottom prices and offering similar and sometimes better performance (including better image quality.)

Originally posted by: CVSiN
good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..
Good job being an offended little fanboy. Was I or was I not just praising these G80 specs and saying I may change back to Nvidia if they are real? :confused: Your green knickers are in such a tight twist it seems your balls are getting crushed in them. Grow up.
 

CP5670

Diamond Member
Jun 24, 2004
5,692
796
126
Interesting stats, but as remarked already, that 1.5GHz number looks way off the wall. So is this supposed to have two distinct memory interfaces with separate memory chips on each one?
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Let me add to what I said earlier. A 700M transistor count would only be possible if this new card were a dual-die product like Presler.

Presler pic

Others on this thread have already mentioned that they think the G80 must be a dual-core product to meet the 700M transistor spec, but I think they actually meant to say dual die. Of course, being dual die means that the thing is at least a dual core.

A product like that would make some sense since it would essentially be a successor to the GX2 that was actually economical to make.

Like Presler, the two dies could ride on the same memory bus, making the end-product G80 card much more RAM-frugal than the GX2, which requires two sets of RAM holding near-duplicate data. The extra "odd bus" could be a 2x64-bit bus, with each die having one 64-bit bus of its own. This would relieve memory pressure on the shared bus, which would become specialized for tile-based access patterns that are useful for texture access.
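To put rough numbers on the layout speculated above (a shared bus plus a private 64-bit bus per die), here's a quick back-of-the-envelope bandwidth calc. The 256-bit shared width and 1500MT/s effective data rate are assumptions for illustration, not leaked specs:

```python
# Back-of-the-envelope bandwidth for the speculated dual-die bus layout.
# All figures are hypothetical illustrations, not confirmed G80 specs.

def bus_bandwidth_gbps(width_bits: int, data_rate_mts: float) -> float:
    """Peak bandwidth in GB/s for a bus of `width_bits` at an effective
    data rate of `data_rate_mts` MT/s (DDR/GDDR effective, not base clock)."""
    return width_bits / 8 * data_rate_mts * 1e6 / 1e9

# Speculated layout: one 256-bit bus shared by both dies,
# plus a private 64-bit bus per die (the "extra odd bus").
shared = bus_bandwidth_gbps(256, 1500)      # 48.0 GB/s
private = 2 * bus_bandwidth_gbps(64, 1500)  # 24.0 GB/s total across both dies
total = shared + private

print(f"shared: {shared:.1f} GB/s, private: {private:.1f} GB/s, total: {total:.1f} GB/s")
```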
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,407
8,595
126
Originally posted by: zephyrprime
Let me add to what I said earlier. A 700M transistor count would only be possible if this new card were a dual-die product like Presler.

Presler pic

Others on this thread have already mentioned that they think the G80 must be a dual-core product to meet the 700M transistor spec, but I think they actually meant to say dual die. Of course, being dual die means that the thing is at least a dual core.

A product like that would make some sense since it would essentially be a successor to the GX2 that was actually economical to make.

Like Presler, the two dies could ride on the same memory bus, making the end-product G80 card much more RAM-frugal than the GX2, which requires two sets of RAM holding near-duplicate data. The extra "odd bus" could be a 2x64-bit bus, with each die having one 64-bit bus of its own. This would relieve memory pressure on the shared bus, which would become specialized for tile-based access patterns that are useful for texture access.

just thinking out loud here... the problem with the pentium pro was that if either of the dice in the package was bad, the whole thing had to be tossed (including the good die).

but the pentium pro was a processor. and a processor needs both the logic unit and the cache to be any good.

this is a video processor, and both parts would be identical.

so, even if one part is bad when mounted, they could still sell it as a midrange model, with one of the dice simply disabled.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: ElFenix
just thinking out loud here... the problem with the pentium pro was that if either of the dice in the package was bad, the whole thing had to be tossed (including the good die).

but the pentium pro was a processor. and a processor needs both the logic unit and the cache to be any good.

this is a video processor, and both parts would be identical.

so, even if one part is bad when mounted, they could still sell it as a midrange model, with one of the dice simply disabled.
Well, the pentium pro had to be packaged together before it could be tested, for reasons I don't know, but nowadays I've read that they can test a die before it's even cut from the wafer, so I don't think the same problem exists anymore.

 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: CVSiN
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.

You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..

good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..

LOL, stfu. The x1900 series wins against the 7900GTX hands down overall, particularly when quality settings are equalized.

The 7950GX2 wins (barely) against a stock x1950xtx but certainly won't beat 2x1900xts in crossfire

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Frackal
Originally posted by: CVSiN
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.

You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..

good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..

LOL, stfu. The x1900 series wins against the 7900GTX hands down overall, particularly when quality settings are equalized.
That depends on what the user defines as a "win". Sometimes the features of the 7900GTX may benefit the user more than an X1900, depending on the type of game they play.

The 7950GX2 wins (barely) against a stock x1950xtx but certainly won't beat 2x1900xts in crossfire

Sometimes the 7950GX2 beats a stock X1950XTX by a good margin, sometimes it doesn't. Two 7900GTXs in SLI compete pretty well with X1900s in CF and even X1950s in CF in certain instances.

Neither company is absolutely "killing" the other; both have their strengths and weaknesses. For me, HDR+AA/HQAF/playable frames/better overclocking options make the ATI choice a better one. However, for someone else TrAA/lower power draw/quieter cooling/8xAA/EVGA/playable frames, etc. make the Nvidia choice the better one. It depends on the user.
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
Originally posted by: Schadenfroh
Originally posted by: inspire
Originally posted by: Ulfhednar
Originally posted by: SpeedZealot369
If this is true, time for external power bricks :(
That rumour is already in full circulation and the flames are burning ever hotter due to this article from June 5th.

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770

ATI and Nvidia can bite me if this comes to pass, I will buy a Wii.

:thumbsup: Wii FTW.

That is right! Defy ATI by buying a Wii! That will show them!

:laugh:

I find these specs to be a little extreme myself. 700 million transistors on an 80nm die? Ye Gods
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: josh6079
playable frames
I agree with absolutely everything you said in your post, but this one is purely game-dependent. Oblivion is a good example of a game in which Nvidia hardware really lags behind; the minimum framerates on the 7900GT I borrowed were horrendous compared to the X1800XT I owned at the time.

And likewise there are some games where ATI performs poorly.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Ulfhednar
Originally posted by: josh6079
playable frames
I agree with absolutely everything you said in your post, but this one is purely game-dependent. Oblivion is a good example of a game in which Nvidia hardware really lags behind; the minimum framerates on the 7900GT I borrowed were horrendous compared to the X1800XT I owned at the time.

And likewise there are some games where ATI performs poorly.

That's what I meant. The user will have to determine whether the card will provide playable frames for the games they will be using.
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: josh6079
That's what I meant. The user will have to determine whether the card will provide playable frames for the games they will be using.
Yeah, I completely agree, and this is why I have gone ATI for my last three graphics purchases (which some members on this forum clearly find insulting.) The two things I look for when I am upgrading are a.) performance and b.) cost, also known as "bang-for-buck."

It just so happens that Nvidia cards cost more than their ATI counterparts here in the UK, sometimes obscenely so, and with Nvidia cards missing features like the ability to render HDR and antialiasing at the same time I have simply seen no reason to buy their products.

As the insulted Nvidia flag-wavers seem to miss though, I am hoping that these "8800GT/GTX" specifications are correct because it's time Nvidia supported HDR+AA and I have been interested in unified shader architecture since first hearing about it (so, naturally I have leaned even closer toward ATI as Nvidia was rumoured not to be interested.)

I also like to switch brands often for variety. Even if these specifications are true though, something tells me that the price trends in my country won't change any time soon.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: CVSiN
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.

You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..

good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..



^^^LOL hahahaha...this is definitely one of the dumbest posts I've read in a while.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: 5150Joker
Originally posted by: CVSiN
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.

You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..

good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..



^^^LOL hahahaha...this is definitely one of the dumbest posts I've read in a while.

Nope, I beat it by finding this one ^^^^ LOL LOL LOLLLLLOOOLLLLLLL, you tool.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Crusader
Originally posted by: 5150Joker
Originally posted by: CVSiN
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.

You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..

good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..



^^^LOL hahahaha...this is definitely one of the dumbest posts I've read in a while.

Nope, you beat it by finding this one LOL LOL LOLLLLLOOOLLLLLLL.


True true, nobody could match your sheer stupidity.

 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: 5150Joker
Originally posted by: Crusader
Originally posted by: 5150Joker
Originally posted by: CVSiN
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.

You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..

good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..



^^^LOL hahahaha...this is definitely one of the dumbest posts I've read in a while.

Nope, you beat it by finding this one LOL LOL LOLLLLLOOOLLLLLLL.


True true, nobody could match your sheer stupidity.

:laugh:
 

ebeattie

Senior member
May 22, 2005
328
0
0
Ouch, those are some steep specs for power! Is the transistor count going to decide how much power is used? How will die shrinking affect power consumption with that many transistors?
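As a rough answer to that question: first-order dynamic power scales with transistor count, switched capacitance, voltage squared, and clock. A shrink helps mainly through lower capacitance and voltage. The capacitance and voltage figures below are invented purely for illustration:

```python
# Rough first-order model of dynamic power: P ~ N * C * V^2 * f,
# where N = transistor count, C = per-transistor switched capacitance,
# V = supply voltage, f = clock. All numbers below are made up for illustration.

def relative_power(n_transistors, cap_scale, voltage, freq_mhz):
    # Returns a relative (unitless) power figure for comparing two designs.
    return n_transistors * cap_scale * voltage**2 * freq_mhz

# 90nm baseline vs a hypothetical 80nm shrink of the same 700M-transistor chip:
p_90 = relative_power(700e6, 1.00, 1.40, 600)
p_80 = relative_power(700e6, 0.85, 1.30, 600)  # lower C and V after the shrink

print(f"shrink uses {p_80 / p_90:.0%} of baseline dynamic power")
```

So even with the transistor count fixed, a shrink can cut dynamic power noticeably; the count alone doesn't decide consumption.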
 

twjr

Senior member
Jul 5, 2006
627
207
116
Originally posted by: 5150Joker
Originally posted by: Crusader
Originally posted by: 5150Joker
Originally posted by: CVSiN
Originally posted by: Ulfhednar
If that's real, then this X1900XT 512MB might be my final ATI purchase for a while. :)

Nvidia have been disgraceful the last year or two in terms of price/performance ratio, in my opinion, and they have missed out on important features that have only kept ATI in my favour (HDR+AA for one.) It has been assumed all along that Nvidia was not going to bother with unified shader architecture for a long time too, and this pretty much decided for me that my next upgrade would be ATI again.

I don't plan to upgrade it for a year though, so I'll see how the entire situation pans out.

You're kidding, right? The 7800 and 7900 series kicked the crap outta ATI.. and the 7950 just buried them again..

good job spouting no facts..
just about every single bench shows Nvidia faster in most apps.. there are some where ATI excels and I'm not blind.. but the overall winner the last few years definitely goes to Nvidia..



^^^LOL hahahaha...this is definitely one of the dumbest posts I've read in a while.

Nope, you beat it by finding this one LOL LOL LOLLLLLOOOLLLLLLL.


True true, nobody could match your sheer stupidity.


I'd say that nobody could disagree with that statement there.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
It's about time the vertex and geometry pipelines received some attention. In the past all the love has gone to the pixel shaders (because it's easier to provide visible proof of your improvements that way).
No, all the love has gone there because that's where your performance bottleneck primarily is with modern games.

Furthermore vertex/geometry load is constant regardless of resolution, unlike pixel load. That's why it makes sense for the most resources to go to pixel shaders.
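A quick illustration of that point: per-frame vertex work is fixed by the scene, while pixel work grows with the frame buffer. The vertex count here is invented for the sake of the comparison:

```python
# Why pixel shading dominates at high resolution: pixel work grows with the
# frame buffer, vertex/geometry work stays fixed per frame.
# The scene's vertex count is invented for illustration.

VERTS_PER_FRAME = 1_000_000  # fixed geometry load, resolution-independent

for w, h in [(1024, 768), (1600, 1200), (2048, 1536)]:
    pixels = w * h
    ratio = pixels / VERTS_PER_FRAME
    print(f"{w}x{h}: {pixels:,} pixels -> {ratio:.1f}x the vertex count")
```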

That was below the belt!!
Oh it wasn't supposed to be a personal attack, just an illustration of the dangers of water + electricity. ;)

Agreed, I don't like the idea of fluids flowing inside a $650+ video card.
Yeah, not to mention what happens when it drips onto a pricey motherboard and CPU. I've heard some rather spectacular stories on the 'net when that happened.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Water cooling is more dangerous than air cooling but as long as you know what you're doing it's safe. I haven't had any problems with my water cooling setup for as long as I've had it.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
When they say the core clock can be scalable up to 1.5GHz, what if they mean 750MHz x 2? After all, most of you guys seem to agree that this will likely be a dual-die GPU, so 750MHz per GPU does not seem unreasonable for a next-gen item.
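The arithmetic works out the same for the two readings floated in this thread; both figures below are speculation, not confirmed clocks:

```python
# Two ways the rumored "1.5GHz" could be read; both are speculation.

# Reading 1: dual-die, 750MHz core per die.
core_per_die_mhz = 750
dies = 2
print(f"dual-die reading: {dies} x {core_per_die_mhz}MHz = {dies * core_per_die_mhz}MHz")

# Reading 2: GDDR4 memory. DDR-style memory transfers twice per clock,
# so a 750MHz memory clock advertises as a 1500MT/s ("1.5GHz effective") rate.
mem_clock_mhz = 750
effective_mt_s = mem_clock_mhz * 2
print(f"GDDR4 reading: {mem_clock_mhz}MHz memory clock -> {effective_mt_s}MT/s")
```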