**Official FX thread** HardOCP, Tom's Hardware, and AnandTech are up (with min FPS), and now Hexus.net has been added


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: zephyrprime
I think that the FX still has some promise. It has excellent performance on 3DMark's lighting test, and its overdraw is lower than the 9700's. These two things will become relatively more important in the games of the future.
Sure . . . the NV35 or NV40 . . . but this current design is SO lost. :p
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
I'm looking forward to the ongoing comparison between the NV30 and the R3xx. This should prove interesting; the only interesting times in the graphics industry have been the times when there was no clear-cut performance leader.

TNT2 vs Voodoo3 -> This filled the forums with endless fanboi sp3ak for months and months, until the release of the GeForce 256 (and the subsequent delay of the VSA-100).

GeForce2 GTS vs Voodoo5 5500 -> Although nowhere near as interesting as the above, this was still good for a laugh, as the 3dfx fanbois had to learn a new vocabulary, claiming their advantage to be IQ.

GeForce3 vs Radeon 8500 -> Although the release of the 8500 was plagued by driver issues and the Quack fiasco, subsequent driver releases pushed the R200 past the NV20. Just in time for the GeForce4. This one wasn't much fun either, as the nVidiots were deaf to the ATIdiots' cries about image quality.

R300 -> Boring! This thing smashed the competition upon release like nothing before it. I've been reading these reviews since before there were benchmarks, and not since the Voodoo2 SLI crushed its younger brother has a video card leapfrogged the competition like the 9700 did.

R300 vs NV30 -> Finally! This is gonna be meaty! Tom says the GFFX holds the performance crown, but at matched IQ settings it loses to the R300. Or does it? This has the makings of a good rivalry... unless, of course, the R350 is released in March and leaves no questions.

 

EdipisReks

Platinum Member
Sep 30, 2000
2,722
0
0
It'd be pretty funny if nVidia released the NV35 next week. Maybe they just announced the NV30 to make everyone go "nVidia is teh Suxors, we are teh stoopid fanboys," and then they'll drop the bomb. It would be nice just to get some of you to shut up. The card [GeForce FX] doesn't suck. It's not as good as it should be, and it is awfully loud, but everyone's condemnation of the card ignores that it is still freaking fast as hell. I wish everyone would talk like human beings instead of like monkeys jumping up and down while flinging poo. And LANs being affected by video cards?
:rolleyes:
 

Dufusyte

Senior member
Jul 7, 2000
659
0
0
Soon we'll need a whole new form factor to accommodate such heat!


We'll need to make an "external video card," analogous to an "external modem": an external box with its own cooling system, connected to the PC via some kind of cable.
 

element

Diamond Member
Oct 9, 1999
4,635
0
0
Originally posted by: OS
Geesh, I'm glad I didn't buy that GeForce FX t-shirt. :Q

WTF happened?

What happened is they went with a 128-bit memory architecture, which is akin to sticking wooden cart wheels on a Ferrari. Look at the memory bandwidth specs.

I sure hope Nvidia wakes up and realizes they need to restructure that memory architecture for the next iteration of this lineup, or they will be doomed yet again. I doubt they'll make the same mistake again, though; that's not like them.
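
For the curious, the wooden-cart-wheels point falls straight out of the arithmetic. Here's a minimal back-of-the-envelope sketch in Python, assuming the commonly cited launch specs (a 128-bit bus at 1000 MT/s effective DDR-II on the FX 5800 Ultra versus a 256-bit bus at 620 MT/s effective DDR on the 9700 Pro); the function name and figures are illustrative, not official:

```python
# Back-of-the-envelope peak memory bandwidth: (bus width / 8) bytes per
# transfer, times the effective transfer rate. Specs are the commonly
# cited launch numbers, treated here as nominal.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_mt_s: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_mt_s * 1e6 / 1e9

fx_5800_ultra = peak_bandwidth_gb_s(128, 1000)   # 128-bit DDR-II @ 1000 MT/s
radeon_9700_pro = peak_bandwidth_gb_s(256, 620)  # 256-bit DDR @ 620 MT/s

print(f"GeForce FX 5800 Ultra: {fx_5800_ultra:.1f} GB/s")    # -> 16.0 GB/s
print(f"Radeon 9700 Pro:       {radeon_9700_pro:.1f} GB/s")  # -> 19.8 GB/s
```

Despite DDR-II's much higher clock, the narrow bus leaves the FX with less raw bandwidth than the six-month-old 9700 Pro.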
 

308nato

Platinum Member
Feb 10, 2002
2,674
0
0
...I wish everyone would talk like human beings instead of like monkeys jumping up and down while flinging poo...
:rolleyes:



Ha ha ha ha... ROFL... that would defeat the purpose of the internet altogether.


That's the truest statement I have read all day.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: EdipisReks
It'd be pretty funny if nVidia released the NV35 next week. Maybe they just announced the NV30 to make everyone go "nVidia is teh Suxors, we are teh stoopid fanboys," and then they'll drop the bomb. It would be nice just to get some of you to shut up. The card [GeForce FX] doesn't suck. It's not as good as it should be, and it is awfully loud, but everyone's condemnation of the card ignores that it is still freaking fast as hell. I wish everyone would talk like human beings instead of like monkeys jumping up and down while flinging poo. And LANs being affected by video cards?
:rolleyes:

Uh sure . . . it'd be "funny" alright.
:rolleyes:


It'd also be "funny" if ATI released the 9900 Pro next week . . . actually, THIS scenario is MUCH more likely. :Q However, Nvidia won't find it "amusing" . . .

 

element

Diamond Member
Oct 9, 1999
4,635
0
0
Originally posted by: Dufusyte
Soon we'll need a whole new form factor to accommodate such heat!


We'll need to make an "external video card," analogous to an "external modem": an external box with its own cooling system, connected to the PC via some kind of cable.

Yeah, some kind of cable that picks up EMI like a cheap Radio Shack outdoor broadcast TV aerial antenna. Leave the engineering to the engineers, Dufus.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: element
Originally posted by: OS
Geesh, I'm glad I didn't buy that GeForce FX t-shirt. :Q

WTF happened?

What happened is they went with a 128-bit memory architecture, which is akin to sticking wooden cart wheels on a Ferrari. Look at the memory bandwidth specs.

I sure hope Nvidia wakes up and realizes they need to restructure that memory architecture for the next iteration of this lineup, or they will be doomed yet again. I doubt they'll make the same mistake again, though; that's not like them.

The memory bandwidth situation is much worse than it appears. DDR-II only provides its theoretical bandwidth on the assumption that the memory interface can keep up with it. It's pretty clear the 128-bit architecture is the bottleneck, choking off most or all of the benefit DDR-II provides. The core looks to be a winner, especially in the shader and T&L benches. Looks like the NV35, with a 256-bit memory bus, will be what the FX should've been.

Chiz
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Looks like the NV35, with a 256-bit memory bus, will be what the FX should've been.

Yes, they should have gone with a 256-bit memory bus. Unfortunately, I hear a 256-bit bus is not on their list for the near future. Anyway, if and when they do it, I just hope they don't make the card any bigger ;).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
I hate to say it, but the results are disappointing, to say the least. If you're a Ti4600 owner and you absolutely must go with nVidia, then the FX is a viable option; otherwise, for most people the 9700 Pro is the logical upgrade.

We're witnessing history today: the 9700 Pro has survived one of nVidia's product cycles and is still the top dog. The more I see, the more I'm impressed with just how good the 9700 Pro really is and how well it has stood the test of time.

Also, look at Anand's 16x anisotropic performance tests: the 9700 Pro still achieves 89% of its baseline performance at that setting, and it looks better than nVidia's Balanced 8x setting. We all know the 9700 Pro is fast and good-looking, but that really helps put things into perspective. :cool:
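
To make that 89% figure concrete, here's a trivial sketch; the frame rates below are made-up placeholders, not Anand's measurements:

```python
# Hypothetical illustration of performance retention with 16x anisotropic
# filtering enabled. Frame rates are placeholders, NOT measured numbers.
baseline_fps = 100.0   # anisotropic filtering off
aniso_16x_fps = 89.0   # 16x anisotropic filtering on

retention = aniso_16x_fps / baseline_fps
print(f"Performance retained at 16x aniso: {retention:.0%}")  # -> 89%
```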
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: EdipisReks
Originally posted by: apoppin
Uh sure . . . it'd be "funny" alright.
:rolleyes:

You obviously missed my point.
You missed mine. :p

And before I offend the entire forum membership (well, at least the bitterly disappointed and former Nvidia fans), I apologize for my sarcasm (which is being so misconstrued today).

TO SET THE RECORD STRAIGHT:
1) I have always preferred ATI's video philosophy over Nvidia's.
2) I respect Nvidia's engineering and product.
3) I too am disappointed with the NV30, to the point of deriding it, even though it would be an AWESOME card if it weren't so HYPED, so delayed, and so late, AND if we didn't already have a "superior" card that is 25% cheaper: the 9700.
4) I am NOT one of those who wish Nvidia ill (try to find a single statement to that effect by me, ever!). We NEED both companies healthy.

Now I am not going to reply further since I am heading to work . . . but I may later tonight.

Best regards and aloha,
Mark
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
The memory bandwidth situation is much worse than it appears. DDR-II only provides its theoretical bandwidth on the assumption that the memory interface can keep up with it. It's pretty clear the 128-bit architecture is the bottleneck, choking off most or all of the benefit DDR-II provides. The core looks to be a winner, especially in the shader and T&L benches. Looks like the NV35, with a 256-bit memory bus, will be what the FX should've been.
I totally agree. They should have made the product compatible with DDR-II for scaling purposes, but that is it. This is what ATi has done, and it is paying off. The core is a monster in terms of clock speed, but they are just constraining it too much. If they come out with a refresh of the FX (an Ultra Ultra) with a higher memory clock, it could be a real winner.
 

sandorski

No Lifer
Oct 10, 1999
70,819
6,366
126
Originally posted by: Dean
Stop harping on Nvidia, people!! It costs considerably less per pound than the 9700 Pro ;)

LOL, that could be their marketing slogan!
 

sandorski

No Lifer
Oct 10, 1999
70,819
6,366
126
Hey! Does the FX use HSR, or was it supposed to have HSR as a feature? If HSR is a feature that hasn't been enabled yet, there may yet be a significant performance boost to be had.
 

element

Diamond Member
Oct 9, 1999
4,635
0
0
Originally posted by: Bovinicus
The memory bandwidth situation is much worse than it appears. DDR-II only provides its theoretical bandwidth on the assumption that the memory interface can keep up with it. It's pretty clear the 128-bit architecture is the bottleneck, choking off most or all of the benefit DDR-II provides. The core looks to be a winner, especially in the shader and T&L benches. Looks like the NV35, with a 256-bit memory bus, will be what the FX should've been.
I totally agree. They should have made the product compatible with DDR-II for scaling purposes, but that is it. This is what ATi has done, and it is paying off. The core is a monster in terms of clock speed, but they are just constraining it too much. If they come out with a refresh of the FX (an Ultra Ultra) with a higher memory clock, it could be a real winner.

But it doesn't appear feasible to increase the memory clock. Right now it maxes out at 500/1000 MHz; Anand tried to overclock it to 1100 effective, but no go. And keep in mind the size of the heatsink needed just to get it to the speed it's at now. There is no headroom for overclocking, not just yet. Maybe when yields improve, but by that time ATI's next part will be appearing in reviewers' hands (unless they too suffer a delay for some reason).

Nvidia really needs to widen the memory bus to a 256-bit data path. You can't send sports cars into a traffic jam and expect traffic to move smoothly just because the cars are capable of high speed. Frankly, I don't know what Nvidia was thinking when they went with this memory architecture. Keeping costs down, perhaps? They would have done better to go with a 256-bit memory bus and a somewhat slower clock speed, like that of the standard 5800 FX (the non-Ultra part).
 

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
Originally posted by: sandorski
Hey! Does the FX use HSR, or, was it supposed to have HSR as a feature? If HSR is a feature that hasn't been enabled yet, the FX may yet have a significant performance boost to be had.

All current cards use HSR now, and yes, it is enabled.
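
For anyone wondering what HSR means in practice: it boils down to depth testing, i.e., rejecting pixels that sit behind geometry that has already been drawn so no shading work is wasted on them. Here's a toy software sketch of the idea, purely illustrative (real GPUs do this in dedicated hardware, often before shading):

```python
# Minimal sketch of z-buffer hidden surface removal (HSR).
# Real GPUs implement this in hardware (often as "early z" before shading);
# this toy version just shows the core idea.

W, H = 4, 4
FAR = float("inf")
depth_buffer = [[FAR] * W for _ in range(H)]
color_buffer = [[None] * W for _ in range(H)]

def draw_pixel(x, y, depth, color):
    """Write the pixel only if it is nearer than what is already there."""
    if depth < depth_buffer[y][x]:      # depth test: smaller = closer
        depth_buffer[y][x] = depth
        color_buffer[y][x] = color      # visible: shade and store
    # else: occluded, so all shading work is skipped (the HSR win)

draw_pixel(1, 1, depth=5.0, color="red")    # far surface drawn first
draw_pixel(1, 1, depth=2.0, color="blue")   # nearer surface overwrites it
draw_pixel(1, 1, depth=9.0, color="green")  # hidden: rejected, no work done
print(color_buffer[1][1])                   # -> "blue"
```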
 

ed21x

Diamond Member
Oct 12, 2001
5,411
8
81
The GFFX did not perform horribly... it performed about on par with the 9700 Pro, which is still pretty damn good.
 

sharkeeper

Lifer
Jan 13, 2001
10,886
2
0
Times are sad indeed when reviews have to include aural measurements, i.e., posting sound levels, because of the stinking cooling "solution." Unplug the fan, connect a Seebeck genny distribution plate to that GPU, and use the power for something useful. At 75 W there would be enough thermal drive to run an NH3 exchanger that could drop the temp of your CPU cores ten (possibly twenty!) degrees centigrade. Of course, R717 is too nasty for most enthusiasts to mess with.

Cheers!
 

PCMarine

Diamond Member
Oct 13, 2002
3,277
0
0
Originally posted by: ed21x
The GFFX did not perform horribly... it performed about on par with the 9700 Pro, which is still pretty damn good.

Yes, but it is also $100 more expensive than the 9700 Pro, which is ~6-month-old technology.
 

FreshPrince

Diamond Member
Dec 6, 2001
8,361
1
0
I have a feeling nVidia will once again pull their trick of releasing "special" Detonator drivers when ATI releases the R350. This new driver will make the card run much faster, probably equal to ATI's counterpart. It'll also miraculously lower the core temp somehow.

nVidia, I smell wut ur cooking ;)
 

Leo V

Diamond Member
Dec 4, 1999
3,123
0
0
Originally posted by: FreshPrince
I have a feeling nVidia will once again pull their trick of releasing "special" Detonator drivers when ATI releases the R350. This new driver will make the card run much faster, probably equal to ATI's counterpart. It'll also miraculously lower the core temp somehow.

nVidia, I smell wut ur cooking ;)

I doubt NVIDIA is pulling any punches here; they had a reasonable amount of time to tune their drivers, what with the NV30 delays and all. The drivers will probably improve with time, but I really doubt there is some secret master plan at work. (Edit: I really should stop posting here :eek: )
 

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
Originally posted by: FreshPrince
I have a feeling nVidia will once again pull their trick of releasing "special" Detonator drivers when ATI releases the R350. This new driver will make the card run much faster, probably equal to ATI's counterpart. It'll also miraculously lower the core temp somehow.

nVidia, I smell wut ur cooking ;)



and I SMELL what you're cooking :Q

share it!