**Official FX thread** HardOCP, Tom's Hardware, and AnandTech are up with min FPS, and now Hexus.net added


GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: NFS4
Isn't it kinda funny how the GeForce FX is turning out to be just like the Voodoo 5 6000?? Too big, too long, too power hungry, too loud, and not all that fast? :)

I hope that some day, after all the dust settles, someone will write a "behind-the-scenes" story on what exactly happened at nVidia in the last 1-2 years, especially concerning the acquisition of 3dfx and what their engineers did for them.

I guess ATi made a better purchase in ArtX.

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
I'm surprised there are any people who are surprised by the GeForce FX release :)

The reality is that nVidia is (for the most part) starting over with the NV30. The TnT through the GeForce4 4600 have all basically shared the same architecture. That is what enabled their furious run through all competition: piecing together faster cards every 6-9 months by adding pipelines, then adding T&L, pixel shader, and vertex shader blocks on top of it all. The high-end models feature the latest "blocks", while the low-tier products are integrated into chipsets and cheap-o cards that share the same driver base and driver stability. The end result was a very financially successful model for nVidia, and I think it showed in the performance and reliability of the product line.

On that note, kudos to nVidia for raising the bar for all vendors over the last several years.

One (decent) analogy is to the original Pentium architecture. That architecture carried Intel for many years, culminating in the Pentium 3, which is still being produced today. The reality is that these architectures can only be "built up" so far before the returns diminish to the point where there is no good return on investment.

Just as the original P4 floundered at release yet became the foundation for Intel's next several years, the NV30 will be (it already is) nVidia's foundation for the next ~4 years.

Heat problems, poor optimizations, yield problems, schedule delays: these are all common occurrences with a revamped architecture.

nVidia's next 4 years won't depend on how awesome NV30 isn't (or is, if you read Tom's), but on how well they ramp on their current architecture base. If they can return to the model of die shrinks and "building-block" feature additions onto this core, I believe they will remain competitive not just on a "high-end performance" model, but on the same low-, mid-, and high-tier product offering which made them successful in the past. It's all up in the air.

That being said, mega-kudos to ATI for releasing such a killer product in the R300 core. If the R300 turns out to be a solid base for scaling and feature addition, then nVidia had better get their act together in a big way.

Either way, the next year or so is going to be amazing for the graphics industry. We have seen products released which eclipse the previous generation by 100% (in some benchmarks)!!! That's amazing!!!



Mem

Lifer
Apr 23, 2000
21,476
13
81
Isn't it kinda funny how the GeForce FX is turning out to be just like the Voodoo 5 6000?? Too big, too long, too power hungry, too loud, and not all that fast?


I was thinking about how many ex-3dfx engineers were involved in the design of the FX compared to Nvidia engineers; probably 90%, if you go by the size ;).

Bottom line is, it's too little an improvement, and too late :(.

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: kuk
All review sites are getting hammered ... it's impossible to read Anand's and Tom's articles. But something caught my attention on page 3 of Tom's article. In the comparison table, both the R9700 and the FX are said to have $399 estimated prices. Something is not right ....

The Radeon 9700 PRO still has an MSRP of $399 over at ATI.com, so, in theory, he is correct. What he doesn't mention is that you can go over to newegg.com and pick up FIC's Radeon 9700 PRO for $295. The FX will debut around $399, but prices should fall. Of course, at the moment, you can't even buy an FX.

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: NFS4
Isn't it kinda funny how the GeForce FX is turning out to be just like the Voodoo 5 6000?? Too big, too long, too power hungry, too loud, and not all that fast? :)

now we know what bits of old 3dfx were used in the fx ;)

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: merlocka


nVidia's next 4 years won't depend on how awesome NV30 isn't (or is, if you read Tom's), but on how well they ramp on their current architecture base. If they can return to the model of die shrinks and "building-block" feature additions onto this core, I believe they will remain competitive not just on a "high-end performance" model, but on the same low-, mid-, and high-tier product offering which made them successful in the past. It's all up in the air.

That being said, mega-kudos to ATI for releasing such a killer product in the R300 core. If the R300 turns out to be a solid base for scaling and feature addition, then nVidia had better get their act together in a big way.

Just how much more can nVidia build on a product that already runs as hot and as loud as it does? From the looks of it, it's the end of a product line instead of the beginning.

As far as ATi goes, time will tell if they experience the same pains growing into .13mu production. The R300 as a .15mu part doesn't offer much room to grow, but it remains to be seen whether the R400 will be a totally different core or just a .13mu version of the R300, tweaked with a much higher clock. Rumor has it that the R350 and RV350 will debut in March, with the R400 paper-released in mid-to-late summer, ready to be released in time for NV35.

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
Originally posted by: GTaudiophile
Originally posted by: merlocka


nVidia's next 4 years won't depend on how awesome NV30 isn't (or is, if you read Tom's), but on how well they ramp on their current architecture base. If they can return to the model of die shrinks and "building-block" feature additions onto this core, I believe they will remain competitive not just on a "high-end performance" model, but on the same low-, mid-, and high-tier product offering which made them successful in the past. It's all up in the air.

That being said, mega-kudos to ATI for releasing such a killer product in the R300 core. If the R300 turns out to be a solid base for scaling and feature addition, then nVidia had better get their act together in a big way.

Just how much more can nVidia build on a product that already runs as hot and as loud as it does? From the looks of it, it's the end of a product line instead of the beginning.

As far as ATi goes, time will tell if they experience the same pains growing into .13mu production. The R300 as a .15mu part doesn't offer much room to grow, but it remains to be seen whether the R400 will be a totally different core or just a .13mu version of the R300, tweaked with a much higher clock. Rumor has it that the R350 and RV350 will debut in March, with the R400 paper-released in mid-to-late summer, ready to be released in time for NV35.


I am willing to bet they will find a way to make it run cooler through revisions, but that won't be until the NV35.

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: GTaudiophile
As far as ATi goes, time will tell if they experience the same pains growing into .13mu production. The R300 as a .15mu part doesn't offer much room to grow, but it remains to be seen whether the R400 will be a totally different core or just a .13mu version of the R300, tweaked with a much higher clock. Rumor has it that the R350 and RV350 will debut in March, with the R400 paper-released in mid-to-late summer, ready to be released in time for NV35.

OTOH, the 9700 overclocks quite nicely, unlike the FX, so it seems that the 9700 has more room to grow than the FX does. And the transition to .13 micron will be easier for ATi, since the fab worked out the bugs in the process while making the FX. Now that the fab's .13 micron process is more mature, it will be easier for ATi to migrate to it.

Double Trouble

Elite Member
Oct 9, 1999
9,270
103
106
Isn't it kinda funny how the GeForce FX is turning out to be just like the Voodoo 5 6000?? Too big, too long, too power hungry, too loud, and not all that fast?

That was my first thought after reading the initial part of the AT review of the card. As soon as I saw the picture and read about the heat, the fan, the extra power connector, and the extra slot, the Voodoo 5 cards came to mind. I can't imagine nVidia is in the same 'desperation mode' as 3dfx was, though.

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: GTaudiophile
Originally posted by: merlocka

nVidia's next 4-5 years won't depend on how awesome NV30 isn't (or is, if you read Tom's), but on how well they ramp on their current architecture base. If they can return to the model of die shrinks and "building-block" feature additions onto this core, I believe they will remain competitive not just on a "high-end performance" model, but on the same low-, mid-, and high-tier product offering which made them successful in the past. It's all up in the air.

That being said, mega-kudos to ATI for releasing such a killer product in the R300 core. If the R300 turns out to be a solid base for scaling and feature addition, then nVidia had better get their act together in a big way.

Just how much more can nVidia build on a product that already runs as hot and as loud as it does? From the looks of it, it's the end of a product line instead of the beginning.

As far as ATi goes, time will tell if they experience the same pains growing into .13mu production. The R300 as a .15mu part doesn't offer much room to grow, but it remains to be seen whether the R400 will be a totally different core or just a .13mu version of the R300, tweaked with a much higher clock. Rumor has it that the R350 and RV350 will debut in March, with the R400 paper-released in mid-to-late summer, ready to be released in time for NV35.

The end of a product line? Your comments lead me to believe that you think the high-end cards are the foundation of a company's "bread-n-butter". Although I agree that it's paramount for a company to have a stake as the performance leader, the GeForce4 MX is selling like hotcakes right now in OEM, and I think we can agree that it's technically challenged compared to some of its price competitors.

The TnT -> GeForce -> GF2 -> GF3 -> GF4 series started with a delayed part with poor drivers, which didn't oust the then-current performance champion. It was overhyped and underwhelming at the onset.

The series started at 350nm and is now produced at 180nm; there were four process shrinks over nearly five years, and they were able to quadruple the number of rendering pipelines. It was a very scalable architecture.
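(A rough sanity check on that scaling claim, as a sketch using only the process numbers in the post; real dies also grew in size and transistor budget over those years, which this ignores:)

```python
# Back-of-the-envelope check: how much extra logic a 350nm -> 180nm
# shrink buys in the same die area. Uses only the process numbers
# quoted in the post above; everything else about the chips is ignored.

start_nm = 350  # process node at the start of the TnT line
end_nm = 180    # process node the line is produced on now

linear_shrink = start_nm / end_nm   # ~1.94x smaller features
area_scaling = linear_shrink ** 2   # ~3.8x transistors per unit area

print(f"linear shrink: {linear_shrink:.2f}x")
print(f"same-area transistor budget: {area_scaling:.1f}x")
# ~3.8x is in line with the "quadrupled rendering pipelines" the post
# describes riding on top of those four process shrinks.
```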

Do I think the GeForceFX 5800 Ultra sucks? Compared to the 9700... yup. I wouldn't buy one of those leafblower things; I don't have confidence in the drivers like I do with the GeForce 1/2/3/4 series (confidence I don't have in the current ATI drivers either); it's too hot; and it's expensive as heck with no "bargain" part in sight until the fall.

I guess the main question in my mind is whether the nV30 architecture will turn out to be as scalable as the previous architecture. If so, nVidia will be in good shape this Christmas. If not... let's hope S3 rocks the boat with that DeltaChrome ;)

BlvdKing

Golden Member
Jun 7, 2000
1,173
0
0
Originally posted by: NOX
I'm shocked! Who would have thought nvidia would produce a sub-par video card months after ATI released the 9700?

The days of the Ti look to be gone. I'm actually holding on to my Ti 4200 for as long as I can. However, the more time passes, the more I think about buying a 9500 Pro (to save cash) or a 9700 Pro (for more speed).

Btw, the .avi files were funny LOL!!!

I agree. I'll hold onto my Ti 4200 until June. By then the ATI refresh should be out, and the ATI 9700 Pro should be relatively cheap.

Ferocious

Diamond Member
Feb 16, 2000
4,584
2
71
Actually it might be a case of 3dfx engineers leaving Nvidia.

Nvidia's sudden rise to fame came about largely because a lot of engineers left 3dfx during the V2 era.

Many of the engineers behind the "revolutionary" Geforce were ex-3dfx engineers.

KickItTwice

Member
Apr 28, 2002
113
0
0
It seems ironic that, with all the engineers from 3dfx, the FSAA image quality of the FX would fall behind the IQ of the Radeon 9700 Pro.

I really like using just 2x FSAA at a fairly high resolution with the 9700 Pro. The 2x FSAA makes a big difference in the Radeon's image quality, and the higher resolution keeps the textures, among other things, looking crisp.
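(As a rough illustration of why that combination can be a sweet spot, here is a minimal sketch of the sample cost; illustrative arithmetic only, since real FSAA cost also depends on memory bandwidth and the AA method used:)

```python
# Sample cost of FSAA vs. resolution. Illustrative arithmetic only;
# real AA cost also depends on bandwidth and the technique used.

def samples_per_frame(width: int, height: int, aa_factor: int) -> int:
    """Total samples the card must shade for one frame."""
    return width * height * aa_factor

configs = [
    ("1600x1200, no AA",   1600, 1200, 1),
    ("1600x1200, 2x FSAA", 1600, 1200, 2),
    ("1024x768, 4x FSAA",  1024,  768, 4),
]
for label, w, h, aa in configs:
    print(f"{label}: {samples_per_frame(w, h, aa):,} samples")
# 2x FSAA at 1600x1200 costs roughly as much as 4x FSAA at 1024x768,
# but keeps the extra texture crispness the higher resolution provides.
```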

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,360
16,193
136
To OS and Adul (regarding the comment about blaming my LAN game problems on a video card):

OS: I spent three weeks and many emails on this, including some threads here. Running my LAN game, the one computer in four running an Nvidia card would drop out of the game after 5-10 minutes. The other video cards were a mix of ATI and Matrox. After all of the research and emails to Westwood tech support, my last try was to remove the video card and swap in a different one. When it worked with an ATI card, I ended up trying about 10 different video cards, two of which were Nvidia (a TNT Vanta and a GF2 Ti 200). Any time an Nvidia card was put in any of the four machines, that computer would drop out of the game. Any other video card worked fine. I tried many video card combinations and many computers (4), and the only constant was that the machine with the Nvidia card would drop out. If you had gone through what I did to figure this out, you would hate Nvidia too. It seems insane that it would be the video card, but after three weeks of experiments, that's where I ended up.

Also, on topic: I think the FX is a success if you like a small jet engine in your PC and you like Nvidia.

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: Markfw900
To OS and Adul (regarding the comment about blaming my LAN game problems on a video card):

OS: I spent three weeks and many emails on this, including some threads here. Running my LAN game, the one computer in four running an Nvidia card would drop out of the game after 5-10 minutes. The other video cards were a mix of ATI and Matrox. After all of the research and emails to Westwood tech support, my last try was to remove the video card and swap in a different one. When it worked with an ATI card, I ended up trying about 10 different video cards, two of which were Nvidia (a TNT Vanta and a GF2 Ti 200). Any time an Nvidia card was put in any of the four machines, that computer would drop out of the game. Any other video card worked fine. I tried many video card combinations and many computers (4), and the only constant was that the machine with the Nvidia card would drop out. If you had gone through what I did to figure this out, you would hate Nvidia too. It seems insane that it would be the video card, but after three weeks of experiments, that's where I ended up.

Also, on topic: I think the FX is a success if you like a small jet engine in your PC and you like Nvidia.

***CONFIRMED*** GeforceFX is DOA because this knucklehead can't get a TnT Vanta to work at a lan party. LoL

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,360
16,193
136
merlocka, the LAN computers are all my own, and the GF2 Ti 200 is what started the mess; the Vanta was just an old card I put in to test my theory. Once you have gone down the same road, then you can talk.

virusag11

Senior member
May 22, 2002
336
0
0
All I have to say is that I am hella disappointed! I wanted to get an FX for SWG, but I am definitely going to get a Radeon 9700 Pro after reading the Anand review. Thanks for the info, all.

Leo V

Diamond Member
Dec 4, 1999
3,123
0
0
NVIDIA will not share their heat specs with us, but one manufacturer told us off the record that the GFFX 5800 Ultra GPU would be generating somewhere between 60 and 70 watts of heat.

:Q

Soon we'll need a whole new form factor to accommodate such heat! Maybe this was inevitable, just like the 100-watt 3GHz P4, but it is a horrific trend all the same.

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
Originally posted by: Leo V
I am going to add my $.02 to your discussion on whether 1600x1200 is a reasonable resolution to strive for.

My belief is that ultimately it is; however, not at the expense of far more important steps forward such as FSAA/AA, highly detailed geometry, and realistic lighting. I will argue that even 800x600 could be excellent if it enabled significant improvements in other aspects of visual fidelity.

Before going further, note that I run my 21" Trinitron at 1600x1200 desktop resolution all day. Also, I am not defending the GFFX or NVIDIA; I have no particular interest in joining another company-vs.-company debate.

Anyway, I will restate my arguments (which I've made many times before) for why less resolution is often better, contrary to popular belief:

1) The TV argument: television has a resolution approximating 640x480, yet "graphics" seen on TV look infinitely more realistic than those rendered by a computer even at 1600x1200. You'll say that is because they are showing "real" live footage on TV. But what about Pixar-type graphics or any other rendered special effects seen on TV? They still look a million times more realistic than your 1600x1200 game, and the TV's low resolution is certainly not the deciding factor.

My point is that there are other factors in realism vastly more important than screen resolution. They are much more complicated, and they deserve far greater attention than simply increasing raw fillrate to draw more, smaller pixels.

2) You see polygonation much more easily at higher resolutions. If you noticed how triangular and rectangular Quake (1) monsters began to look once you could run above 640x480, you know what I mean. Realism suffers because the vertices/triangles used to build the models become obvious.

3) When everything else remains fixed, you inevitably lose processing time to increased resolution. This is obvious: either your framerate goes down or you must disable visual effects to maintain it.

Now, some people prefer high resolution even if that means playing at 8FPS, or without any decent lighting whatsoever. However, most of us would not consider this to be a "happy medium" for realistic graphics.

You will naturally argue that this is not a problem anymore, because your latest Radeon/GeForce can tackle even 1600x1200 without a hitch. That is not true, however; a compromise is still being made when you run at 1600x1200. In this case, the compromise is the primitiveness of the game/application that you're running successfully at 1600x1200. If it had more complex, realistic graphics, you would go right back to 1024x768 or even 800x600 to make it run acceptably. And most likely, it would be a favorable tradeoff! (Especially if the game was designed with that resolution in mind.)

In conclusion: resolution is by no means bad; however, it only gets you so far. Higher resolution in no way compensates for other graphical capabilities (software as well as hardware), and it can even be detrimental when high resolution reveals their limitations.

I'm probably not returning to this thread; I just couldn't resist the temptation to write about this.
There are a few things you fail to mention in this post. The reason things look good on television is that we watch televisions from several feet away. Can you imagine how good the images on a monitor would look at that distance? The image quality of a monitor is FAR greater than that of a television; it is just much easier to notice defects one foot away from the viewing area. Watching DVDs on my monitor produces an image far crisper than a TV's, and I have a cheap monitor. Also, increasing the resolution helps improve texture quality, especially for textures in the distance.

Also, in response to point #2, models in games are becoming increasingly complex as time goes on, so this effect is becoming far less noticeable in current games. Sure, higher resolutions made blocky models more obvious years ago, but Q3 scales to higher resolutions pretty nicely. UT2K3 looks pretty good too. Doom 3 looks as though it will really have some nice, vertex-heavy, complex models in it.

I don't totally disagree with what you are saying, however. I always set the resolution as high as it can go without sacrificing any other quality options. Resolution isn't everything, but it is important and does improve the overall quality of the game. Things like lighting, color, contrast, textures, bump-mapping, and many other IQ enhancements are all very important.
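(To put numbers on the fill-cost tradeoff both posts are debating, a minimal sketch; the fill budget below is a made-up constant, and real games are also geometry- and CPU-bound, which this ignores:)

```python
# Pixel cost of resolution when everything else stays fixed.
# The fill budget below is a made-up constant for illustration;
# real games are also geometry- and CPU-limited.

FILL_BUDGET = 48_000_000  # hypothetical pixels shaded per second

for w, h in [(800, 600), (1024, 768), (1600, 1200)]:
    pixels = w * h
    fps = FILL_BUDGET / pixels  # framerate if purely fill-limited
    print(f"{w}x{h}: {pixels:>9,} pixels -> ~{fps:.0f} fps")

# 1600x1200 pushes 4x the pixels of 800x600, so a purely fill-limited
# game drops to ~1/4 the framerate: that is the headroom Leo V would
# rather spend on lighting and geometry detail.
```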


zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
I think that the FX still has some promise. It has excellent performance on 3DMark's lighting test, and its overdraw is lower than the 9700's. These two things will become relatively more important in the games of the future.
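(For anyone unfamiliar with the term, overdraw is roughly how many times each screen pixel gets shaded per frame; a minimal sketch with invented numbers, not measured figures for the FX or the 9700:)

```python
# Overdraw: average number of times each screen pixel is shaded per
# frame. The figures below are invented for illustration, not
# measurements of the GeForce FX or the Radeon 9700.

def effective_fill(raw_mpixels_per_s: float, overdraw: float) -> float:
    """Visible pixels per second once redundant shading is paid for."""
    return raw_mpixels_per_s / overdraw

RAW_FILL = 2000.0  # hypothetical raw fillrate in Mpixels/s, same for both

for name, overdraw in [("higher-overdraw card", 3.0),
                       ("lower-overdraw card", 2.0)]:
    print(f"{name}: ~{effective_fill(RAW_FILL, overdraw):.0f} Mpixels/s visible")

# The same raw fillrate stretches further with lower overdraw, which is
# why it matters more as future games layer on more geometry and effects.
```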

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: Markfw900
merlocka, the LAN computers are all my own, and the GF2 Ti 200 is what started the mess; the Vanta was just an old card I put in to test my theory. Once you have gone down the same road, then you can talk.

I've walked down many off-topic roads, but haven't brought up any of them in a Geforce FX thread....



Aquaman

Lifer
Dec 17, 1999
25,054
13
0
Maybe this is the 3dfx engineers' way of getting some revenge on nVidia for putting them out of business ;) :D

Cheers,
Aquaman