R600 may have external 512-bit memory controller.


josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: apoppin
are we absolutely certain that G80 shaders are actually "unified"?
:confused:
From what I hear, yes.
i mean really unified like ATi's? this will be 2nd gen 'unified shaders' for ati and first for nvidia.
It's not exactly like ATI's, but from what I gather it has a unified architecture nonetheless.
and yes, i DO read theInq and take it for what it is.... there are some things that are 'scooped' here first and not all of it belongs with the used kitty litter. :p
I can agree with that. The speculation must start somewhere. However, I see it as a forum's job to come out with the uneducated guesses, and a site's job to uncover the more confidential, and therefore better supported, info.
i don't link to theinq 'hidden' . . . and even took your advice to note the source in the Topic's summary and provide an alternate source --whenever possible. What more could you ask?
:Q
Thank you for doing that, by the way. I guess I'm just so sick of hearing their BS that I don't understand why anyone links to them at all. I'm not attempting to change your posting habits, just attempting to tell people that the only way BS sites like theirs will cease to exist is to ignore them.
...just don't ask me to stop posting what i find interesting . . . whatever the source . . . please ignore what you are not interested in . . . and if it turns out to be crap as many of my threads do . . . it will fall to the bottom quickly.
Fair enough. I can respect that.
...there is nothing worse than bringing more attention to crap by calling it crap
Unless it keeps people from making the same mistake twice. Still, if you want to read the Inquirer that's not my business. However, if you read the Inquirer and then complain about them, that's a different story.
... and i created 'that' thread because theInq brought up something i was interested in . . . it is a valid reason and certainly better than any flamethread.
While I agree with making topics about things that hold your interest, I can't get past the fact that the source instigating that interest is false 99% of the time. Making threads about something that interests you is one thing; making threads about something the Inquirer knew you would have an interest in is another.

Anyway, I wasn't trying to rain on your parade, and if it seemed like it I apologize. I just hate sites like the Inquirer, that's all. We shouldn't continue this off-topic Inquirer debate in here anyway, and if I feel the need to comment on it any further to you, I'll make sure to do so through PMs.

One thing I'll be interested in is the shader/bandwidth game both vendors may be playing. I mean, if you think about it, the X1900 has twice as many shaders as the 7900 but on average doesn't provide twice the performance. Although the rest of the architecture is going to be different, I'd expect the gap between the two new GPUs to be pretty thin in terms of frames. It looks like in this case Nvidia will have more shaders and less bandwidth, and ATI will have fewer shaders and more bandwidth. Seeing how the X1900s didn't blow out the 7900 series, I don't think the G80's 128 shaders will blow out the R600's 96 shaders. In any case, it sounds like AA is going to become more "free" than AF has been for this last round of competition.
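As a rough back-of-the-envelope sketch of that trade-off (every number here is invented for illustration and is not an actual spec of the G80, R600, X1900 or 7900), a frame can be modeled as limited by whichever resource runs out first, shader throughput or memory bandwidth:

```python
# Rough roofline-style sketch of the shader-vs-bandwidth trade-off.
# All figures below are made up for illustration; they are not real
# specs for the G80, R600, X1900 or 7900.

def frame_time_ms(shader_units, shader_ghz, bandwidth_gbs,
                  ops_per_frame, bytes_per_frame):
    """Frame time is set by whichever resource runs out first."""
    shader_time = ops_per_frame / (shader_units * shader_ghz * 1e9)
    memory_time = bytes_per_frame / (bandwidth_gbs * 1e9)
    return max(shader_time, memory_time) * 1000.0

# Hypothetical workload with heavy memory traffic relative to math.
ops_per_frame = 1.0e9     # shader operations per frame (invented)
bytes_per_frame = 1.2e9   # bytes read/written per frame (invented)

# "More shaders, less bandwidth" vs. "fewer shaders, more bandwidth".
card_a = frame_time_ms(128, 0.6, 60, ops_per_frame, bytes_per_frame)
card_b = frame_time_ms(96, 0.7, 80, ops_per_frame, bytes_per_frame)

print(f"card A: {card_a:.1f} ms/frame")   # bandwidth-bound: 20.0 ms
print(f"card B: {card_b:.1f} ms/frame")   # 15.0 ms despite fewer shaders
```

With a memory-heavy workload like this, the card with fewer shaders but more bandwidth comes out ahead, which is one way to see why doubling shader count rarely doubles frame rates.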

*Sigh* We'll just have to wait some more...
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: josh6079
Originally posted by: apoppin
are we absolutely certain that G80 shaders are actually "unified"?
:confused:
From what I hear, yes.
i mean really unified like ATi's? this will be 2nd gen 'unified shaders' for ati and first for nvidia.
It's not exactly like ATI's, but from what I gather it has a unified architecture nonetheless.

I see we are getting ready for another round of semantics. Neither card is likely to be 100% unified (whatever that is), which will leave a hole the size of a truck for them both to claim to be unified. Now, who knows if either will perform that great with these expected Vista games - but who cares; by the time those games actually come out in any kind of numbers, these cards will be history. Personally I expect Crytek to push the envelope and everyone else to bolt on a bit of fog or something.

 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: apoppin
are we absolutely certain that G80 shaders are actually "unified"?
:confused:

i mean really unified like ATi's? this will be 2nd gen 'unified shaders' for ati and first for nvidia.

Let's get down to brass tacks here - it really doesn't matter if Nvidia's shaders are "really really" unified or "sorta" unified if it performs well.

Especially if you're fighting on the "DX10 won't be a factor for at least another year" side of the fence ;) .
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: jiffylube1024
Originally posted by: apoppin
are we absolutely certain that G80 shaders are actually "unified"?
:confused:

i mean really unified like ATi's? this will be 2nd gen 'unified shaders' for ati and first for nvidia.

Let's get down to brass tacks here - it really doesn't matter if Nvidia's shaders are "really really" unified or "sorta" unified if it performs well.

Especially if you're fighting on the "DX10 won't be a factor for at least another year" side of the fence ;) .

well, Microsoft is allowing DX10 HW to be "fully DX10 compatible" and yet not 'fully unified' . . . i understood that nvidia's weren't . . . that there were still some specific assignments made to the shaders

i understand that ati's are . . . that's all :p
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Originally posted by: josh6079

Originally posted by: Elfear
On a side note about the Inquirer's trustworthiness, check this thread out, post #88 and #90. Kinda funny actually.
That link goes to a thread with only 6 posts.
Back on topic- I don't want to get my hopes up for either G80 or R600 but dang, the rumored specs look sweet.
Which means the refreshes will be even sweeter!

Whoops. :eek: Link is fixed now and here it is again (posts #88 and #90). Summary for those who don't care to link through: When ATI engineers get bored they call the Inq and make up stuff just to see if Fuad will post it.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Elfear
Originally posted by: josh6079

Originally posted by: Elfear
On a side note about the Inquirer's trustworthiness, check this thread out, post #88 and #90. Kinda funny actually.
That link goes to a thread with only 6 posts.
Back on topic- I don't want to get my hopes up for either G80 or R600 but dang, the rumored specs look sweet.
Which means the refreshes will be even sweeter!

Whoops. :eek: Link is fixed now and here it is again (posts #88 and #90). Summary for those who don't care to link through: When ATI engineers get bored they call the Inq and make up stuff just to see if Fuad will post it.

LOL!!! Like Bart Simpson and Moe!!!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
The New Graphics - A Tale of Direct X 10
From what we hear, the new [nvidia] card might not adopt the Microsoft graphics standard of a true Direct 3D 10 engine with a unified shader architecture. Unified shader units can be changed via a command change from a vertex to geometry or pixel shader as the need arises. This allows the graphics processor to put more horsepower where it needs it. We would not put it past Nvidia engineers to keep a fixed pipeline structure. Why not? They have kept the traditional pattern for all of their cards. It was ATI that deviated and fractured the "pipeline" view of rendering; the advent of the Radeon X1000 introduced the threaded view of instructions and higher concentrations of highly programmable pixel shaders, to accomplish tasks beyond the "traditional" approach to image rendering.

One thing is for sure; ATI is keeping the concept of the fragmented pipeline and should have unified and highly programmable shaders. We have heard about large cards - like ones 12" long that will require new system chassis designs to hold them - and massive power requirements to make them run.

Why shouldn't the new cards with ATI's R600 require 200-250 watts apiece and the equivalent of a 500 W power supply to run in CrossFire or G80 in SLI? We are talking about adding more hardware to handle even greater tasks and functions, like geometry shaders that can take existing data and reuse it for subsequent frames. More instructions and functions mean more demand for dedicated silicon. We should assume that there will need to be a whole set of linkages and caches designed to hold the data from previous frames, as well as other memory interfaces. Why wouldn't we assume that this would require more silicon?

Although we are not in favour of pushing more power to our graphics processors and the massive amounts of memory on the card, we are excited to think about the prospect of more in our games. Currently one can run Folding at Home on the graphics processor, and soon we will be able to do effects physics on it too.
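To make the "unified vs. fixed" distinction in that excerpt a bit more concrete, here is a toy sketch (unit counts and workload figures are invented, and this is not how either GPU actually schedules work): a unified pool can reassign the same units between vertex, geometry and pixel work from frame to frame, while a fixed split leaves units idle whenever the mix changes.

```python
# Toy illustration of the "unified shader" idea described above: a unified
# pool can put all its units on whatever stage is busiest, while a fixed
# split strands units on under-loaded stages. All numbers are invented.

def frame_passes_fixed(work, units_per_stage):
    """Passes needed when each stage only gets its own fixed units."""
    return max((work[s] + units_per_stage[s] - 1) // units_per_stage[s]
               for s in work)

def frame_passes_unified(work, total_units):
    """Passes needed when every unit can work on any stage."""
    total = sum(work.values())
    return (total + total_units - 1) // total_units

# A vertex-heavy frame followed by a pixel-heavy frame (arbitrary work items).
frames = [
    {"vertex": 90, "geometry": 10, "pixel": 20},
    {"vertex": 10, "geometry": 10, "pixel": 100},
]

fixed_split = {"vertex": 16, "geometry": 8, "pixel": 24}  # 48 units, fixed roles
unified_total = 48                                        # same 48, any role

for i, frame in enumerate(frames):
    print(f"frame {i}: fixed={frame_passes_fixed(frame, fixed_split)} passes, "
          f"unified={frame_passes_unified(frame, unified_total)} passes")
# frame 0: fixed=6 passes, unified=3 passes
# frame 1: fixed=5 passes, unified=3 passes
```

In both frames the fixed split wastes units on the under-loaded stages, which is essentially the "put more horsepower where it needs it" argument from the article.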
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
A chassis designed to help hold the card in place?

:shocked:

I wonder if that chassis will be compatible with a lot of full tower cases.
 

Baked

Lifer
Dec 28, 2004
36,052
17
81
Oh sh!t, I don't think my case is big enough (long enough) to accommodate these DX10 cards. I am planning on getting a 2nd-generation DX10 card. Or maybe get an X1950 Pro and save money.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: josh6079
A chassis designed to help hold the card in place?

:shocked:

I wonder if that chassis will be compatible with a lot of full tower cases.

i guess i bolded the wrong part . . . it looks like G80 is not "unified" :p

the new [nvidia] card might not adopt the Microsoft graphics standard of a true Direct 3D 10 engine with a unified shader architecture. Unified shader units can be changed via a command change from a vertex to geometry or pixel shader as the need arises. This allows the graphics processor to put more horsepower where it needs it. We would not put it past Nvidia engineers to keep a fixed pipeline structure.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: apoppin
Originally posted by: josh6079
A chassis designed to help hold the card in place?

:shocked:

I wonder if that chassis will be compatible with a lot of full tower cases.

i guess i bolded the wrong part . . . it looks like G80 is not "unified" :p

the new [nvidia] card might not adopt the Microsoft graphics standard of a true Direct 3D 10 engine with a unified shader architecture. Unified shader units can be changed via a command change from a vertex to geometry or pixel shader as the need arises. This allows the graphics processor to put more horsepower where it needs it. We would not put it past Nvidia engineers to keep a fixed pipeline structure.

Well, whether it is or isn't unified, as long as it does what it has to do, that's probably quite alright.