Nvidia talks G80

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
If the g80 is not going to be a unified architecture, it raises the question of when Nv will transition to unified shaders, given that DX10 will use a unified programming model. If this is the new big thing they've been working on since 2002, how long can they ride the architecture before releasing a yet newer, unified GPU? And what are the performance ramifications of handling unified shader code at the driver level and running it on dedicated PS/VS hardware?

At least I think this will be the big bad 32 pipe card everyone and their momma have been speculating about.
 

Steelski

Senior member
Feb 16, 2005
700
0
0
Originally posted by: Matt2
500 million invested in G80?

Let's hope it doesnt flop.

What they are saying is utter crap. That 500 million can be split in many ways, not least of which is what has already gone into all the cores before this one; the extra 250 will go into the chips they will actually be selling.
It's strange we have not heard anything about chips after this one. The G80 and R600 have been rumoured for a long time!!!! Is it not time for newer cores and rumours?
 
Jun 14, 2003
10,442
0
0
Originally posted by: munky
If the g80 is not going to be a unified architecture, it raises the question of when Nv will transition to unified shaders, given that DX10 will use a unified programming model. If this is the new big thing they've been working on since 2002, how long can they ride the architecture before releasing a yet newer, unified GPU? And what are the performance ramifications of handling unified shader code at the driver level and running it on dedicated PS/VS hardware?

At least I think this will be the big bad 32 pipe card everyone and their momma have been speculating about.


Something about handling the unified API at the driver level... surely it will incur a performance hit.

And 32 pipes, yeah, that will be awesome, but if it's 32x1 then the R580 may still have the legs on it. Though given that NV's 24x1 can keep pace with ATI's 16x3, I doubt it. Of course, ATI will have their next best thing out too.
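Rough paper math on those configs, reading 24x1 as 24 pipes with one shader ALU each and 16x3 as 16 pipes with three each (that reading is my assumption):

Code:
#include <cstdio>

// Back-of-the-envelope pixel shader ALU counts, illustrative only.
// "24x1" = 24 pipelines x 1 ALU; "16x3" = 16 pipelines x 3 ALUs.
int main() {
    const int nvPipes = 24, nvAlusPerPipe = 1;   // assumed G71-style layout
    const int atiPipes = 16, atiAlusPerPipe = 3; // assumed R580-style layout
    std::printf("NV:  %d shader ALUs\n", nvPipes * nvAlusPerPipe);   // 24
    std::printf("ATI: %d shader ALUs\n", atiPipes * atiAlusPerPipe); // 48
    // Raw ALU count isn't the whole story: clocks, texturing, and
    // scheduling efficiency decide whether 24x1 keeps pace with 16x3.
    return 0;
}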
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
You're still operating on the assumption that you're going to see Vista sometime this year. Who cares if the G80 doesn't have a unified architecture if DX10 isn't there to be dealt with?
 

TecHNooB

Diamond Member
Sep 10, 2005
7,458
1
76
Originally posted by: wizboy11
Doesn't DX10 state that the card HAS TO use a unified architecture?

Yeah, I thought DX10 was no BS. Either the card meets the requirements or it's not DX10.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: wizboy11
Doesn't DX10 state that the card HAS TO use a unified architecture?

DX10 doesn't care what the configuration of Pixel/Vertex Shaders is. All it cares about is that they are present so it can execute Vertex/Pixel Shading Operations.

I'm not quite sure why people keep saying that DX requires a certain number or a certain type of pipelines, because it doesn't.

-Kevin
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
AFAIK, DX10 has the shader code written using a unified model. That doesn't mean the hardware must have unified shaders, but it would involve an additional step of translating unified code into legacy PS/VS code in the driver, with a possible performance hit. Anyway, unified shader HW will not be of much use until DX10+Vista, but if the g80 is Nv's new architecture, then it will have to last them at least a year if not longer, and in the long run the lack of unified shader hardware may become a problem. Unless, of course, the g80 is nothing more than a 32 pipe g70, and the real new GPU will be released after the g80.
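In rough C++ terms, that extra driver step looks something like this (every name here is made up, just to show where the hit would come from):

Code:
#include <cstdio>

enum class ShaderStage { Vertex, Pixel };

// Stand-ins for real driver structures; everything here is invented.
struct UnifiedShader   { ShaderStage stage; };
struct HardwareProgram { const char* targetUnit; };

// On dedicated PS/VS hardware the driver has to pick a target unit and
// translate the unified bytecode for it. This dispatch-and-translate step
// is exactly what unified hardware would not need.
HardwareProgram lowerToLegacyHardware(const UnifiedShader& s) {
    if (s.stage == ShaderStage::Vertex)
        return {"dedicated vertex units"};  // translation path #1
    return {"dedicated pixel units"};       // translation path #2
}

int main() {
    UnifiedShader vs{ShaderStage::Vertex}, ps{ShaderStage::Pixel};
    std::printf("vs -> %s\n", lowerToLegacyHardware(vs).targetUnit);
    std::printf("ps -> %s\n", lowerToLegacyHardware(ps).targetUnit);
    return 0;
}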
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Ronin
You're still operating on the assumption that you're going to see Vista sometime this year. Who cares if the G80 doesn't have a unified architecture if DX10 isn't there to be dealt with?

I'm sure we will see Vista out at the end of this year, but DX10 games and mass migration to Vista will still take a while. I know that I, and I'm sure countless other people inside and outside of the IT world, will be waiting for the first service pack before even considering switching to Vista full time.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: jiffylube1024
Originally posted by: Ronin
You're still operating on the assumption that you're going to see Vista sometime this year. Who cares if the G80 doesn't have a unified architecture if DX10 isn't there to be dealt with?

I'm sure we will see Vista out at the end of this year, but DX10 games and mass migration to Vista will still take a while. I know that I, and I'm sure countless other people inside and outside of the IT world, will be waiting for the first service pack before even considering switching to Vista full time.

I'm sure we won't. There's too much in flux (DX10, for one) for it to happen this year. Expect it sometime Q2 next year.

Everyone has this hard-on for Vista, when it's apparent that most of the people who do are either bandwagoners or just following suit because it's the new thing.

News flash, folks. While it's pretty and all, there's a LONG way to go.
 

Alaa

Senior member
Apr 26, 2005
839
8
81
Originally posted by: munky
AFAIK, DX10 has the shader code written using a unified model. That doesn't mean the hardware must have unified shaders, but it would involve an additional step of translating unified code into legacy PS/VS code in the driver, with a possible performance hit. Anyway, unified shader HW will not be of much use until DX10+Vista, but if the g80 is Nv's new architecture, then it will have to last them at least a year if not longer, and in the long run the lack of unified shader hardware may become a problem. Unless, of course, the g80 is nothing more than a 32 pipe g70, and the real new GPU will be released after the g80.

Munky, missed you man... haven't seen you for a long time. Guys, if you want the next card to be 100 pipes, ask Munky to pray for it. It works! I guess the G80 will be 40 pipes!! :p
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: munky
If the g80 is not going to be a unified architecture, it raises the question of when Nv will transition to unified shaders, given that DX10 will use a unified programming model. If this is the new big thing they've been working on since 2002, how long can they ride the architecture before releasing a yet newer, unified GPU? And what are the performance ramifications of handling unified shader code at the driver level and running it on dedicated PS/VS hardware?

At least I think this will be the big bad 32 pipe card everyone and their momma have been speculating about.

A unified hardware architecture and a unified programming model are two different things. DX10 doesn't care how the hardware is laid out, just that it's compatible with the programming model.
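A loose analogy in C++ (all names invented): the API programs against one interface, and either hardware layout can sit behind it.

Code:
#include <cstdio>
#include <memory>

// The "unified programming model": one interface the API is written against.
struct ShaderBackend {
    virtual void run(const char* shader) = 0;
    virtual ~ShaderBackend() = default;
};

// One possible layout: dedicated PS/VS units behind the interface.
struct DedicatedUnits : ShaderBackend {
    void run(const char* shader) override {
        std::printf("routing '%s' to fixed PS or VS units\n", shader);
    }
};

// Another layout: a pool of unified ALUs behind the same interface.
struct UnifiedUnits : ShaderBackend {
    void run(const char* shader) override {
        std::printf("scheduling '%s' on a shared ALU pool\n", shader);
    }
};

int main() {
    // The calling code is identical either way; only the backend differs.
    std::unique_ptr<ShaderBackend> hw = std::make_unique<DedicatedUnits>();
    hw->run("some DX10 shader");
    hw = std::make_unique<UnifiedUnits>();
    hw->run("some DX10 shader");
    return 0;
}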
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
And what are the performance ramifications of handling unified shader code at the driver level and running it on dedicated PS/VS hardware?

And is that performance hit that much smaller than the very large performance hit of trying to use general purpose hardware instead of dedicated hardware?

I'm not sure whether the NV50 will be segmented or unified, but it is asinine to me that people care as much as they do. If AMD releases a chip with separate functional units for SIMD and FPU and Intel releases a chip with shared functionality, is it that big of a deal? How it will perform is the main issue.

what they are saying is utter crap.

The NV50 is enormously important to nVidia. If the part fails, they are going to suffer enormously. This is the part that has been nVidia's focus for many years now; everything we have seen released since the Xbox has been of secondary importance to them. While the current 7900GTX is in essence an evolution of the GF3, the NV50 will be a complete overhaul of the GeForce line. This is a do or die part for nVidia (at least in a generalized business sense, as far as GPUs are concerned). The NV50 is far more important to nVidia than the R600 is to ATi. That isn't to say anyone should automatically assume it will be better, but if it doesn't end up being the superior technology nV is in big trouble, and even if it only manages to be a bit better nV is still in trouble; either one of those scenarios should leave ATi feeling fairly confident.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: BenSkywalker
The NV50 is far more important to nVidia than the R600 is to ATi.

So if the R600 flops it will be no big deal to ATi? Hardly... failure of either part in any generation is of significant importance to each company.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
The hardware must meet the DX10 specification, and it says that it uses a unified shader model, just like the Xbox 360. So when there is more pixel shader data than vertex shader data, those general purpose pipelines can be reassigned: for example, out of 48 pipelines, 12 for vertex operations and 36 for pixel shader operations. How can that be done with a video card with 32 pixel pipelines and maybe 10 vertex shaders, like the G80 may be? And that's just about compatibility. Or do we want to replay the NV30 with its Cg language, and how it bogged down the performance of DX9 titles by using drivers to translate the software at the API level??
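A toy sketch of that reallocation in C++ (the 48-pipe pool is from the Xbox 360 comparison; the proportional balancing rule is just made up for illustration):

Code:
#include <cstdio>

// Toy load balancer for a unified pool: split 48 general purpose pipelines
// between vertex and pixel work in proportion to the pending workload.
int main() {
    const int totalPipes = 48;
    int vertexWork = 100, pixelWork = 300;  // pending ops in a pixel-heavy frame

    int vertexPipes = totalPipes * vertexWork / (vertexWork + pixelWork);
    int pixelPipes  = totalPipes - vertexPipes;

    // A 1:3 vertex-to-pixel load lands on the 12/36 split from the post
    // above. A fixed 32 PS + 10 VS design cannot rebalance like this.
    std::printf("%d vertex pipes, %d pixel pipes\n", vertexPipes, pixelPipes);
    return 0;
}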
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
The hardware must meet the DX10 specification

NO IT DOESN'T

The API doesn't care what configuration the hardware is in, as long as there is vertex and pixel shader hardware there. We could have a card with 100 pixel pipes and the DirectX 9 API couldn't care less.

-Kevin

Edit: As far as features go, yes, you must be able to do advanced shading, thereby meeting the specification. But it DOES NOT care about the configuration of the hardware.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
So if the R600 flops it will be no big deal to ATi? Hardly...

About the level of insight I have grown to expect from you recently. Why don't you explain how ATi's R600 R&D costs and all related products have any chance of failing to generate significant revenue? And let's talk about the amount of R&D funding, on a total or percentage basis of expenditures, that the product consumed versus a typical generational part. The R600 flopping is of little concern to ATi in relative terms: the majority of the R&D is already being used and is generating significant revenue without creating overhead. The R600 PC part could fail completely and the odds are pretty high that ATi would still see an enormous return on their R&D.

failure of either part in any generation is of significant importance to each company.

ATi has, in relative terms, had one solid generation (the R3x0) when factoring in performance and panache in the gamer market. They were at the pinnacle of their success in terms of marketshare back in the RageIIc and RagePro days, when they sucked by any reasonable standard. OTOH, nVidia had an abject failure in the NV30, and within a few quarters they were posting record profits. No, not every generation is of significant or even major importance to each company, except to fanboy wankers. If nVidia had dropped $50 million on the R&D for the NV50, it would be of little importance to them, in relative terms, how well it did.