
GeForce FX has 4 pipelines not 8!!!

Originally posted by: Evan Lieb
And that's why you don't go to The Inq for GPU reviews. 😉

GPU reviews? Some of us don't go there at all. 😉 Unfortunately it has a way of coming to US. :disgust:

Chiz
 
theinquirer is what it is... A fun site to read about every rumor in this industry.

The problem is when people (usually fans of whatever product) take what they have to say as definite fact, without any other sources to confirm the notion.
 
Originally posted by: Wingznut
theinquirer is what it is... A fun site to read about every rumor in this industry.

The problem is when people (usually fans of whatever product) take what they have to say as definite fact, without any other sources to confirm the notion.

Hear, hear... finally...
 
I know the Inquirer is crap, but if you read the article, their info comes from someone at nVidia. It's also been discussed on other sites... (I read about this on a French hardware site last week.)



Anyways, it doesn't change anything, so who cares?
 
what do the pipelines do anyways? looking to upgrade video cards after a long hiatus (gf2 gts) and i'm confused with a lot of these terms
 
Originally posted by: UCDznutz
what do the pipelines do anyways? looking to upgrade video cards after a long hiatus (gf2 gts) and i'm confused with a lot of these terms

It determines (together with clock-rate) the pixel fill-rate the card is able to achieve. Example:

8 pipelines @ 300MHz = 2.4 gigapixels/sec
4 pipelines @ 300MHz = 1.2 gigapixels/sec

Then there are the texturing units (TMUs). 8x1 means eight pixel pipelines with one texturing unit each; 4x2 would mean four pixel pipelines with two texturing units each.
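
A minimal sketch of that arithmetic in Python (the function names are mine, purely for illustration):

```python
def pixel_fillrate_gps(pipelines, clock_mhz):
    # Pixel fill rate in gigapixels/sec: pixels written per clock x clock rate.
    return pipelines * clock_mhz / 1000.0

def texel_fillrate_gts(pipelines, tmus_per_pipe, clock_mhz):
    # Texel fill rate in gigatexels/sec: pipelines x TMUs per pipeline x clock rate.
    return pipelines * tmus_per_pipe * clock_mhz / 1000.0

print(pixel_fillrate_gps(8, 300))     # 2.4 (8x1 @ 300MHz)
print(pixel_fillrate_gps(4, 300))     # 1.2 (4x2 @ 300MHz: half the pixel rate...)
print(texel_fillrate_gts(4, 2, 300))  # 2.4 (...but the same texel rate)
```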
 
Now I recommend 3DMark2003 becomes a very valid benchmark and nVidia just shuts up and starts making a better card
 
Here's an interesting discussion about this.

I'm getting a number of reports from people saying that they have not managed to get more than 4 pixels per clock out of the GeForce FX. Normally, if running in 32bit, the 3DMark fillrate tests will not show more than four pixels per clock on the GFFX because of bandwidth limitations - however, even if the colour is reduced to 16bit and 16bit textures are used, the multitexturing performance is still twice the single texturing performance, and the single texturing is still less than half the theoretical performance of an 8x1 card (1.4Gp/s single tex, 3.4Gt/s multitex).

When the Radeon 9500 PRO is run with these settings it does achieve a rate that is greater than 4 pixels per clock (as I pointed out to XBit labs in their pulled review).

There is something fishy about GF FX.
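
To put rough numbers on the quote: the NV30 core runs at 500MHz, so here is a sketch (my own arithmetic, not from the post) of the theoretical ceilings next to the measured 1.4Gp/s and 3.4Gt/s figures:

```python
CLOCK_MHZ = 500  # GeForce FX 5800 Ultra core clock

def gpixels(pipes):           # theoretical Gpixels/s at CLOCK_MHZ
    return pipes * CLOCK_MHZ / 1000.0

def gtexels(pipes, tmus):     # theoretical Gtexels/s at CLOCK_MHZ
    return pipes * tmus * CLOCK_MHZ / 1000.0

print(gpixels(8))      # 4.0 Gp/s -- what an 8x1 part should approach single-texturing
print(gpixels(4))      # 2.0 Gp/s -- the single-texturing ceiling of a 4x2 part
print(gtexels(4, 2))   # 4.0 Gt/s -- the multitexturing ceiling of a 4x2 part

# Measured: 1.4 Gp/s single-tex, 3.4 Gt/s multitex. Single-tex sits under the
# 4x2 ceiling (well under half of 4.0 Gp/s), and multitex is roughly double the
# single-tex rate -- the signature of 4 pipelines x 2 TMUs, not 8x1.
```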
 
More bad news for the GeForce FX, I'm afraid, as an 8x1 config is always equal to or better than a 4x2 config, especially when multitexturing with odd texture counts.

Still, the card is already on the back foot in terms of specs, yet it still manages to do quite well against the 9700 Pro. I'll definitely be keeping my Radeon, but it'll still be interesting to see how the FX does in the retail reviews.
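
A minimal sketch of that clock-counting argument (my own model, assuming ideal TMU loopback and no bandwidth limits):

```python
import math

def pixels_per_clock(pipes, tmus_per_pipe, textures):
    # Each pipeline loops back through its TMUs until every texture is
    # applied, so a batch of `pipes` pixels takes ceil(textures / TMUs) clocks.
    return pipes / math.ceil(textures / tmus_per_pipe)

for n in (1, 2, 3, 4):
    print(n, pixels_per_clock(8, 1, n), pixels_per_clock(4, 2, n))
# 1 texture:  8x1 -> 8.0,   4x2 -> 4.0
# 2 textures: 8x1 -> 4.0,   4x2 -> 4.0
# 3 textures: 8x1 -> ~2.67, 4x2 -> 2.0  (the odd count idles one TMU per pipe)
# 4 textures: 8x1 -> 2.0,   4x2 -> 2.0
```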
 
Originally posted by: Czar
Now I recommend 3DMark2003 becomes a very valid benchmark and nVidia just shuts up and starts making a better card

Personally, I'd like to see 3DMark2003 get thrown in the dump, and for nVidia to make a better card anyway 😉
 
Originally posted by: Sunner
Originally posted by: Czar
Now I recommend 3DMark2003 becomes a very valid benchmark and nVidia just shuts up and starts making a better card

Personally, I'd like to see 3DMark2003 get thrown in the dump, and for nVidia to make a better card anyway

I agree with Sunner. Also, nVidia makes a good point:
Again, NVIDIA stated in their 3DMark03 lab report that the important tests will be how it performs in actual games like UT2003, Morrowind, and Splinter Cell.

Real gaming benchmarks are what really count.
There will always be better, faster cards down the road from both ATi and nVidia; the best way to test them is, yes, in real games 😉.


🙂

 
Originally posted by: GTaudiophile
theInq continues their investigation...
Geez, theInq is such a POS

"Developers appeared shocked to learn that FX is actually 8x1 in Color + Z rendering since developers were led to understand that the technology was actually 8x1."

They're trying to bash nVidia (sorta) but can't even figure out what they're trying to say....... Developers were shocked to find out that 8x1 was really 8x1..... DUH!
🙄


Thorin
 