G80 Taped out.

Page 3

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: keysplayr2003
Yes, very little leaked so far. I have heard that G80 will not be a unified architecture. And a small glimpse of what it might be is Here.

Yeah, the INQ has been saying Nvidia will come out with 32 pixel pipes for a long time. I do hope they are right this time though, because if they aren't doing more pixel pipes then they will have a hard time beating ATI without clocking their card extremely high. Considering the power consumption these cards are supposed to have, I would hope Nvidia wouldn't use clocks to overcome architecture.
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: redbox
Originally posted by: keysplayr2003
Yes, very little leaked so far. I have heard that G80 will not be a unified architecture. And a small glimpse of what it might be is Here.

Yeah, the INQ has been saying Nvidia will come out with 32 pixel pipes for a long time. I do hope they are right this time though, because if they aren't doing more pixel pipes then they will have a hard time beating ATI without clocking their card extremely high. Considering the power consumption these cards are supposed to have, I would hope Nvidia wouldn't use clocks to overcome architecture.

The 32/16 figure was thrown in as an example (I'm assuming). Absolutely nobody knows the specifics of the architecture, much less the actual build of it (dual-core, traditional, etc.?).

Only time will tell, but I'd be very hesitant to follow the INQ.

 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,673
583
126
Originally posted by: redbox
As much as I get tired of hearing about the INQ bullsh*t, they at least give me some new GPU to read about. I say down with G70; it's been great and all, but I'm kinda getting tired of ya. :)

I always get a laugh out of this place. Reminds me of how poor I am :laugh: I'm still gaming on a GeForce 4 MX420 and MX440 :eek:
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Nelsieus
Originally posted by: 5150Joker
Originally posted by: keysplayr2003
Originally posted by: beggerking
Good news for NV, but there is no reason to get it now... Vista won't be out until 2007.

Why does everyone think new-gen video cards are synonymous with Vista? DX9 performance should be more than stellar on the new hardware. Better than it is now, for certain. You don't have to have Vista in order to buy a G80 or an R600. XP will be just fine.
Now when Vista does come out and people are actually willing to use it, then is the time to concern yourself with a video card that can properly run Vista with all the candy. That is not for a good while yet. Probably two gens/refreshes of vid cards from now.



Keys,

With X1900 Crossfire, 7900 GT/X SLI or 7950 quad, what DX9 game out there can you honestly say requires more power than they already provide? The G80 and R600 are pointless for now because they are built for DX10, yet Vista isn't even available. By the time Vista is available, the refresh for G80/R600 will be ready to roll, and that's when people should give serious consideration to upgrading to the next gen.


I don't understand this misconception about:

a.) G80 and R600 being DX10-only cards:
G80 and R600, aside from sporting the cool DX10 capabilities, will more importantly bring increased performance and support for DX9 - not only in Vista, but in XP as well. And that DX9 performance will be faster than both the GeForce 7 and Radeon X1K series. (Which you'd think would be obvious, but apparently you're not understanding? :confused: )


I understand just fine. My contention is that the extra power G80/R600 might bring would be pretty much useless in the timeframe they're being released, since there aren't many games out there that can bring down a high-end SLI/Quad/Crossfire setup. The only title on the horizon that I can think of that will remotely come close is Crysis, but I bet even that will run beautifully on the aforementioned setups. I think it makes more sense (for those that have quad SLI/SLI/Crossfire) to wait on a G80/R600 refresh when Vista is ready to roll.




b.) My ultra-highend X1K / GeForce 7 GPU does me good as it is!:
I think it's insulting that you have such strong disregard for the enthusiasts who desire maximum settings at a decent resolution (which I consider to be 1600x1200 or higher) at an average of 60+ FPS, while being able to run all the cool features and eye candy the game possesses.
What right do you have to tell gamers what's good enough for them, and what isn't? You might be satisfied with 1024x768 4xAA, but surprisingly, there are people who'd prefer to run things on higher settings, with a higher resolution, and optimal performance. And I'm sorry, but even the ultra-highend cards of today can't handle that for the most demanding games, much less the games coming out later this year through 2007.


What does any of this have to do with "rights" and "telling gamers what's good enough for them"? I bet you could count on one hand the number of games that could even get close to bringing down a quad SLI/SLI/Crossfire system at 1600x1200 4xAA (the only one I can think of is Oblivion, and that RPG only needs 30 fps outdoors).

Your logic makes no sense. I don't understand how something offering support for technology not yet on the market automatically means it becomes obsolete for everything else. But I think it's because your logic is just based on misinformation, and hopefully I cleared things up for you.

All you offered is opinion, no facts to back up your assertion that I'm wrong and you're right; you offered your own viewpoint just like I offered mine. I use a single OC'd X1900 XTX with my system at 1680x1050 with 4xAAA/16x HQ AF in every game I play except Oblivion (which I noted above). The most popular titles on the market with any replay value are WoW, BF2, Oblivion (b/c of mods), various MMOs, Source-based games and a few others, and aside from Oblivion, they all run beautifully at 1600x1200 4xAA/16xAF. Very few people will buy a G80 just to experience FEAR again at a higher res + AA. They may play FEAR once they get a G80 to see how it runs, but not because of FEAR or any other demanding SP title, including Oblivion. So like I said, the only title on the horizon that I can think of that might even come halfway close to bringing quad SLI/SLI/Crossfire to their knees at 1600x1200 4xAA is probably Crysis - yet I doubt it will.

 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: thilan29
Originally posted by: 5150Joker
With X1900 Crossfire, 7900 GT/X SLI or 7950 quad, what DX9 game out there can you honestly say requires more power than they already provide? The G80 and R600 are pointless for now because they are built for DX10, yet Vista isn't even available.

There is definitely plenty of power, but think about it... next-gen cards usually give about 50% more performance than the last, right? Dual GPU also gives about a 40-60% performance boost, but the thing is that dual GPU will usually cost more than the next-gen single GPU, so you get the same performance benefit for less money, and without dual-GPU hassles.

For people that already have dual GPU then of course it's not worth it but for people that don't have dual GPU it is definitely worth it. This is of course assuming the performance jump to R600/G80 is at about 50%.
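A rough back-of-the-envelope sketch of that price/performance argument (all prices and scaling factors below are hypothetical, chosen only to illustrate the point):

```python
# Hypothetical comparison: dual current-gen GPUs vs. one next-gen GPU.
# Scaling factors follow the rough figures above (~50% either way);
# the dollar amounts are made up purely for illustration.

baseline_fps = 60.0          # single current-gen card (hypothetical)
dual_gpu_scaling = 1.5       # ~40-60% boost from SLI/CrossFire, using 50%
next_gen_scaling = 1.5       # assumed ~50% generational jump

dual_gpu_cost = 2 * 450      # two current-gen cards (hypothetical prices)
next_gen_cost = 600          # one next-gen card (hypothetical price)

dual_fps = baseline_fps * dual_gpu_scaling
next_fps = baseline_fps * next_gen_scaling

print(f"Dual GPU:     {dual_fps:.0f} fps for ${dual_gpu_cost} "
      f"({dual_gpu_cost / dual_fps:.2f} $/fps)")
print(f"Next-gen GPU: {next_fps:.0f} fps for ${next_gen_cost} "
      f"({next_gen_cost / next_fps:.2f} $/fps)")
```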



Well, that's the assumption I'm working off, of course. Maybe I wasn't clear, but I never said people with 9800s/X800s etc. would have no reason to get a G80/R600. I'm saying that people with X1900 Crossfire/Quad SLI/7900 SLI won't need a G80/R600, since their setups can pretty much run anything out there until G80/R600's refresh + Vista roll around. Even those of us with a single 7900 GTX/X1900 can opt to go dual card when G80 rolls around, and it's almost certain prices for those cards will continue to decline. Right now it's all speculation that G80/R600 might offer a 50% performance jump in DX9. R600 is rumored to have 64 shaders, which isn't a substantial jump from the X1900 XTX. It will definitely be faster, but who knows by how much? It could end up being only 20-30% faster in DX9. So yeah, realistically people will jump to get the newest hardware because that's what enthusiasts do, and I'm not saying they shouldn't, just that those with uber setups today would have little reason to do so aside from satisfying their craving for new toys.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: thecoolnessrune
Originally posted by: redbox
As much as I get tired of hearing about the INQ bullsh*t, they at least give me some new GPU to read about. I say down with G70; it's been great and all, but I'm kinda getting tired of ya. :)

I always get a laugh out of this place. Reminds me of how poor I am :laugh: I'm still gaming on a GeForce 4 MX420 and MX440 :eek:

Ha, but I think you still have better AF than most of us (it was the GeForce 4 series that had good AF, right?)
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76

Originally posted by: 5150Joker
Well, that's the assumption I'm working off, of course. Maybe I wasn't clear, but I never said people with 9800s/X800s etc. would have no reason to get a G80/R600. I'm saying that people with X1900 Crossfire/Quad SLI/7900 SLI won't need a G80/R600, since their setups can pretty much run anything out there until G80/R600's refresh + Vista roll around. Even those of us with a single 7900 GTX/X1900 can opt to go dual card when G80 rolls around, and it's almost certain prices for those cards will continue to decline. Right now it's all speculation that G80/R600 might offer a 50% performance jump in DX9. R600 is rumored to have 64 shaders, which isn't a substantial jump from the X1900 XTX. It will definitely be faster, but who knows by how much? It could end up being only 20-30% faster in DX9. So yeah, realistically people will jump to get the newest hardware because that's what enthusiasts do, and I'm not saying they shouldn't, just that those with uber setups today would have little reason to do so aside from satisfying their craving for new toys.


How's that? It's an entirely new architecture for the R600. It may as well be 10 times as fast as the X1900 XTX for all we know. It's like comparing the MHz war of the processors, or the fact that the X1800 XT has 16 pipes and 16 shaders (both fewer than the 7800 GTX) but still equals and mostly beats it.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I just want IQ improvements...

I'm thinking G80 will be on the 80nm process (since ATi and NV are moving onto that process).
But the refresh of G80 and ATi's R600 will be on 65nm.

Another thing to remember is: what if the new-gen parts have less of a performance hit when enabling AA, like 5% at maximum? Even better would be being able to play ALL your current DX9 titles at 8xTR S at smooth fps (60 and up).

Yes, the GeForce4 had good AF, except it was WAY too slow to use it properly. NV had the right idea, but at the wrong time.

 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: tanishalfelven
Originally posted by: thecoolnessrune
Originally posted by: redbox
As much as I get tired of hearing about the INQ bullsh*t, they at least give me some new GPU to read about. I say down with G70; it's been great and all, but I'm kinda getting tired of ya. :)

I always get a laugh out of this place. Reminds me of how poor I am :laugh: I'm still gaming on a GeForce 4 MX420 and MX440 :eek:

Ha, but I think you still have better AF than most of us (it was the GeForce 4 series that had good AF, right?)

Geforce 4 MX is not a Geforce 4, it's a Geforce 2 ;)
So he might have better AF, but not the best AF.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: Lonyo
Originally posted by: tanishalfelven
Originally posted by: thecoolnessrune
Originally posted by: redbox
As much as I get tired of hearing about the INQ bullsh*t, they at least give me some new GPU to read about. I say down with G70; it's been great and all, but I'm kinda getting tired of ya. :)

I always get a laugh out of this place. Reminds me of how poor I am :laugh: I'm still gaming on a GeForce 4 MX420 and MX440 :eek:

Ha, but I think you still have better AF than most of us (it was the GeForce 4 series that had good AF, right?)

Geforce 4 MX is not a Geforce 4, it's a Geforce 2 ;)
So he might have better AF, but not the best AF.

Ah, so it has Xbox-like graphics. Well, they were good too.
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,673
583
126
Originally posted by: tanishalfelven
Originally posted by: Lonyo
Originally posted by: tanishalfelven
Originally posted by: thecoolnessrune
Originally posted by: redbox
As much as I get tired of hearing about the INQ bullsh*t, they at least give me some new GPU to read about. I say down with G70; it's been great and all, but I'm kinda getting tired of ya. :)

I always get a laugh out of this place. Reminds me of how poor I am :laugh: I'm still gaming on a GeForce 4 MX420 and MX440 :eek:

Ha, but I think you still have better AF than most of us (it was the GeForce 4 series that had good AF, right?)

Geforce 4 MX is not a Geforce 4, it's a Geforce 2 ;)
So he might have better AF, but not the best AF.

Ah, so it has Xbox-like graphics. Well, they were good too.

I wouldn't really know... I've never been able to successfully run AF :eek: I get around 7 fps in UT2K4 at 800x600, all low, with 2xAF turned on :laugh:
 
Oct 4, 2004
10,515
6
81
Originally posted by: tanishalfelven
Originally posted by: Lonyo

Geforce 4 MX is not a Geforce 4, it's a Geforce 2 ;)
So he might have better AF, but not the best AF.

Ah, so it has Xbox-like graphics. Well, they were good too.

Actually, the Xbox has more like GeForce 3 Ti graphics.
According to the Wiki article (a quick check of the fill-rate math follows the spec list):

Graphics Processor: 233 MHz custom chip developed by Microsoft and NVIDIA (approximately equal to a GeForce 3 in capability). Enhanced vertex processing with 2 vertex shaders, and more flexible pixel shading than DirectX 8.

* Theoretical Geometry Rate: 115+ million vertices/second
* Theoretical Particle Performance: 125 M/s
* Pipeline Configuration: 4 pixel pipelines with 2 texture units each
* Theoretical Pixel Fill Rate: 932 Megapixels/second (233 MHz x 4 pipelines)
* Theoretical Texture Fill Rate: 1,864 Megatexels/second (932 MP x 2 texture units)
* Simultaneous Textures: 4
* Compressed Textures: Yes (6:1 through DDS)
* Full Scene Anti-Aliasing: Yes
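A minimal sketch verifying the fill-rate arithmetic in the quoted specs, assuming the usual definitions (pixel fill = core clock x pixel pipelines, texel fill = pixel fill x texture units per pipe):

```python
# Sanity check of the quoted Xbox GPU fill-rate figures.
core_clock_mhz = 233            # NV2A core clock from the spec list
pixel_pipelines = 4             # pixel pipelines
texture_units_per_pipe = 2      # texture units per pipeline

pixel_fill = core_clock_mhz * pixel_pipelines       # 932 Mpixels/s
texel_fill = pixel_fill * texture_units_per_pipe    # 1,864 Mtexels/s

print(f"Pixel fill rate: {pixel_fill} Mpixels/s")   # matches the 932 figure
print(f"Texel fill rate: {texel_fill} Mtexels/s")   # matches the 1,864 figure
```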
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: redbox
Nope, you won't even have to wait for games to be written to support DX10. 'Cause a game is, after all, just software. Your hardware will just take input data and then output said data. Doesn't matter what it's written with, 'cause we all know software has nothing to do with hardware. ;)

The problem is, a non-DX10 GPU will not know how to process DX10 instructions... and please stop derailing the thread.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: beggerking
Originally posted by: redbox
Nope, you won't even have to wait for games to be written to support DX10. 'Cause a game is, after all, just software. Your hardware will just take input data and then output said data. Doesn't matter what it's written with, 'cause we all know software has nothing to do with hardware. ;)

The problem is, a non-DX10 GPU will not know how to process DX10 instructions... and please stop derailing the thread.

:cookie: