**Official FX thread** HardOCP, Tom's Hardware, and AnandTech are up (with MIN FPS), and now Hexus.net has been added

OS

Lifer
Oct 11, 1999
15,581
1
76
Originally posted by: Markfw900
Chiz, ever since Nvidia wouldn't play nice with my other video cards on my local LAN (the computer with an Nvidia card would disconnect from the LAN game), I have hated them. That was two different cards that did the same thing. You sound like the fanboy that won't die! Good luck if you stay with them. I go with the one that plays fair and works well. (And I don't fault them for being a little finicky in a complicated configuration like you do.)

uhh, blame your LAN problems on a video card?


 

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
Originally posted by: OS
Originally posted by: Markfw900
Chiz, ever since Nvidia wouldn't play nice with my other video cards on my local LAN (the computer with an Nvidia card would disconnect from the LAN game), I have hated them. That was two different cards that did the same thing. You sound like the fanboy that won't die! Good luck if you stay with them. I go with the one that plays fair and works well. (And I don't fault them for being a little finicky in a complicated configuration like you do.)

uhh, blame your LAN problems on a video card?

 

flashbacck

Golden Member
Aug 3, 2001
1,921
0
76
Originally posted by: RaynorWolfcastle
The FX slams the Radeon in a couple of benchmarks, but most of the time they're neck and neck. Once AF/AA get turned on, ATI lays the smack down in a lot of cases. I'm not impressed by this showing; nVidia took 6 extra months to get this product out the door and it doesn't look all that strong beside the 9700 Pro. I certainly wouldn't pay an extra ~$100 or so to get that kind of boost in performance, and I'm not sure many people would.

Barring any spectacular boosts in performance in the production drivers, booooo nVidia.

yeah... I was all hyped up about it too. I thought I heard someone say ATI has another chip almost ready that was supposed to compete with the FX (R350?). Psh. They don't even need it. They should, however, release it just to smash the FX into the ground.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Adul
to repeat what has been said already

(exactly)

It also shows "agreement" or gives "emphasis" to what was said. If someone does it to your quote, consider it "flattery" (like "I couldn't have said it better myself").
 

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
Originally posted by: apoppin
Originally posted by: Adul
to repeat what has been said already

(exactly)

It also shows "agreement" or gives "emphasis" to what was said. If someone does it to your quote, consider it "flattery" (like "I couldn't have said it better myself").

or we are too lazy to type out a reply, but this is close enough.

 

flashbacck

Golden Member
Aug 3, 2001
1,921
0
76
Originally posted by: GTaudiophile
Despite this review, the folks over at Rage3D are guessing as to how much sunshine the likes of Anand and Tom are going to blow up nVidia's a$$ tomorrow morning:


Just hope I don't see crap like this:

3dmark2001 @1024x768 - Radeon Pro = 16000
3dmark2001 @1024x768 - GFFX = 16500

Reviewer: "As you can see the GFFX crushes the Radeon Pro in this benchmark"

LOL, I love it when Tom's does that. Cracks me up.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Adul
Originally posted by: apoppin
Originally posted by: Adul
to repeat what has been said already

(exactly)

It also shows "agreement" or gives "emphasis" to what was said. If someone does it to your quote, consider it "flattery" (like "I couldn't have said it better myself").

or we are too lazy to type out a reply, but this is close enough.
I am trying (hard) to be polite. ;)

:D

 

SteelyKen

Senior member
Mar 1, 2000
540
0
0
I am sure the GeForce FX will sell well no matter what the final tally of website benchmarks is. They have a good reputation, earned over the last few years, and that won't go away overnight. Those of us who stay up all hours of the night reading Tom, Dick, and Harry's benchmarks might have more insight into the performance of video cards, but Joe Average will still go for the Nvidia card based on reputation alone. It's a good card, just not the R300 killer many hoped for.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SteelyKen
I am sure the GeForce FX will sell well no matter what the final tally of website benchmarks is. They have a good reputation, earned over the last few years, and that won't go away overnight. Those of us who stay up all hours of the night reading Tom, Dick, and Harry's benchmarks might have more insight into the performance of video cards, but Joe Average will still go for the Nvidia card based on reputation alone. It's a good card, just not the R300 killer many hoped for.
I doubt it after the tech sites and PC magazines get through with the reviews of this "disappointment". Plus, it's hard to justify $100 extra for nearly identical performance.

Can you imagine an OEM opting to put an NV30 into a machine and sell it for the same price as the R300?

Unless Nvidia's marketing is THAT good. ;)



 

SteelyKen

Senior member
Mar 1, 2000
540
0
0
I still think it will take time for any of that info to trickle down to the mass populace. It takes time to build a reputation, and it will take time and more than one blunder (not that I should characterize the FX as such - it isn't) to tear it down.

BTW, I have owned 3 ATI cards and 0 Nvidia cards, so don't flame me. Just my 2 cents.
 

DaFinn

Diamond Member
Jan 24, 2002
4,725
0
0
My next upgrade will be a laptop with either a GF-FX?? or an ATI Radeon R3??, depending on which hits the market first... My Ti 4200/128 serves me well in my box...



-DaFinn
 

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
Tom's conclusion:


NVIDIA takes the crown! No question about it - the GeForceFX 5800 Ultra is faster than the competition from ATI's Radeon 9700 PRO in the majority of the benchmarks. However, its lead is only slight, especially compared to the distance that ATI put between its Radeon 9700 PRO and the Ti 4600. Still, when compared to its predecessor, the GeForce4 Ti, the FX represents a giant step forward.

The GeForceFX 5800 Ultra is irrefutably the fastest card in most of the tests - but at what price? The power of the FX relies on high clock speeds, which in turn require high voltages and produce an enormous amount of heat. The consequence is that extensive (and expensive) cooling is necessary. Add to that the DDR-II memory, the price of which is quite high, due to the small production numbers. Even the 12-layer board layout is complex and expensive.

It will be difficult for NVIDIA to push its GeForceFX 5800 Ultra. Radeon 9700 PRO cards are only slightly slower, and, because they've been out on the market for months now, they're much less expensive. Also, because they deliver 3D performance with much slower clock speeds, they do not require extensive cooling - and that's nice for your pocketbook as well as your ears.

Still, despite expectations to the contrary, the official price for the FX 5800 is $399 plus tax and that seems pretty aggressive and attractive. This makes it identical to the launch price of the GeForce4 Ti4600 and the ATI Radeon 9700 Pro. The "normal" version of the 5800 will be somewhat less expensive. It's surprising that the GeForceFX GPU, clocked at 500 MHz, only gains a small lead over the R300 GPU (VPU), which is modestly clocked at 325 MHz in comparison.

It remains to be seen how long NVIDIA, with its FX 5800, can maintain a lead over ATI. ATI has already started to hint at a faster-clocked R350 to come in the next weeks (according to rumor, it will have a 400-425 MHz core and 800 MHz memory).

Nevertheless, enthusiasts will, without a doubt, love the GeForceFX 5800 Ultra. It is a monster card! And it has a look that is similarly spectacular to the 3dfx Voodoo5 6000 at the time of its launch.
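The clock-speed puzzle in that conclusion is easy to put numbers on. As a back-of-the-envelope illustration (my figures from period spec sheets, not from Tom's article): the FX 5800 Ultra pairs its 500 MHz core with a 128-bit DDR-II bus at 1 GHz effective, while the 9700 Pro uses a 256-bit bus at 620 MHz effective. A minimal sketch of the resulting raw memory bandwidth, treating those figures as assumptions:

```python
# Back-of-the-envelope memory bandwidth comparison. The bus widths and
# clocks are assumed from period spec sheets, not taken from the review.

def bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    """Raw bandwidth = bytes per transfer x effective transfer rate."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_mhz * 1e6 / 1e9

# GeForce FX 5800 Ultra: 128-bit DDR-II at 500 MHz (1000 MHz effective)
nv30 = bandwidth_gb_s(128, 1000)
# Radeon 9700 Pro: 256-bit DDR at 310 MHz (620 MHz effective)
r300 = bandwidth_gb_s(256, 620)

print(f"NV30: {nv30:.1f} GB/s, R300: {r300:.1f} GB/s")
# NV30: 16.0 GB/s, R300: 19.8 GB/s
```

If those numbers are right, the 9700 Pro actually has roughly 24% more raw memory bandwidth despite its much lower core clock, which would go some way toward explaining why the FX's lead stays slight once AA and AF start eating bandwidth.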
 

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
looking at the image quality pics of the 9700 here

and the GeForce FX here


You will notice that the edge of the balcony is smoother on the 9700 than it is on the GF FX card, despite it being only 6x AA compared to the GF FX's 8x AA.

Also look at the roof as well. There are no jaggies at all on the 9700, and there are some on the GF FX.

The wire near the top is another point of reference. I also notice that the shading is darker on the GF FX than it is on the 9700 Pro.
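One plausible explanation for the 6x-beats-8x result (my speculation, not something stated above): the R300's 6x mode uses a sparse sample pattern, while the FX's 8x mode is built on an ordered grid, and sparse patterns produce more distinct coverage levels along shallow edges like that balcony rail. A minimal sketch with illustrative (not actual hardware) sample positions:

```python
# Count the distinct coverage levels a near-horizontal edge can produce
# inside one pixel for two AA patterns. The sample positions below are
# illustrative stand-ins, not the real NV30/R300 patterns.

def coverage_levels(sample_ys):
    """A horizontal edge at height y covers every sample below it, so the
    distinct coverage fractions = distinct sample heights + 1 (for zero)."""
    return len(set(sample_ys)) + 1

# Ordered 4x2 grid: 8 samples, but only 2 distinct heights
ordered_8x = [0.25] * 4 + [0.75] * 4
# Sparse 6-sample pattern: every sample at a different height
sparse_6x = [0.08, 0.25, 0.42, 0.58, 0.75, 0.92]

print("ordered 8x:", coverage_levels(ordered_8x), "gradation steps")  # 3
print("sparse 6x: ", coverage_levels(sparse_6x), "gradation steps")   # 7
```

Fewer gradation steps means a coarser, more visible stair-step on near-horizontal edges, which would match the balcony and roof observations even though the FX takes more samples.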
 

Aquaman

Lifer
Dec 17, 1999
25,054
13
0
What are the chances of ATI pulling a driver boost like nvidia used to do?

Cheers,
Aquaman
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,404
8,575
126
Originally posted by: apoppin
Originally posted by: SteelyKen
I am sure the GeForce FX will sell well no matter what the final tally of website benchmarks is. They have a good reputation, earned over the last few years, and that won't go away overnight. Those of us who stay up all hours of the night reading Tom, Dick, and Harry's benchmarks might have more insight into the performance of video cards, but Joe Average will still go for the Nvidia card based on reputation alone. It's a good card, just not the R300 killer many hoped for.
I doubt it after the tech sites and PC magazines get through with the reviews of this "disappointment". Plus, it's hard to justify $100 extra for nearly identical performance.

Can you imagine an OEM opting to put an NV30 into a machine and sell it for the same price as the R300?

Unless Nvidia's marketing is THAT good. ;)


I can't imagine any OEM putting that thing into a machine without a serious overhaul of the cooling. Heck, even the Radeon 9700 was too loud (or something) for Dell, so you have the 9700 TX (TX for Texas), which runs at 275 MHz or so. So you'd have a completely castrated version of the NV30 as the only possible way of getting this into OEM machines. Not to mention the OEMs would probably want a cheaper RAM type. I don't really see a possibility for OEMs to pick this thing up. The NV31 can't come fast enough.
 

Leo V

Diamond Member
Dec 4, 1999
3,123
0
0
I am going to add my $.02 to your discussion on whether 1600x1200 is a reasonable resolution to strive for.

My belief is that ultimately it is, but not at the expense of far more important steps forward such as FSAA, AF, highly detailed geometry, and realistic lighting. I will argue that even 800x600 could be excellent if it enabled significant improvements in other aspects of visual fidelity.

Before going further, note that I run my 21" Trinitron at 1600x1200 desktop res all day. Also, I am not defending GFX or NVIDIA; I have no particular interest in joining another company vs. company debate.

Anyway, I will restate my arguments (that I've used many times before) on why less resolution is often better, contrary to popular belief:

1) The TV argument: television has a resolution approximating 640x480. However, "graphics" seen on TV look infinitely more realistic than those rendered by a computer, even at 1600x1200. You'll say that's because TV shows "real" live footage. But what about Pixar-type graphics or any other rendered special effects seen on TV? They still look a million times more realistic than your 1600x1200 game, and the TV's low resolution is certainly not the deciding factor.

My point is that there are other factors in realism vastly more important than screen resolution. They are much more complicated, and deserve far greater attention than simply increasing raw fillrate and drawing more, smaller pixels.

2) You see polygonation much more easily at higher resolutions. If you noticed how triangular and rectangular Quake (1) monsters began to look once you were able to run above 640x480, you know what I mean. Realism suffers because the vertices/triangles used to build the models become obvious.

3) When everything else remains fixed, you inevitably lose processing time to increasing resolution. This is obvious: either your framerate goes down or you must disable visual effects to maintain it. (See the quick arithmetic sketch after this post.)

Now, some people prefer high resolution even if that means playing at 8FPS, or without any decent lighting whatsoever. However, most of us would not consider this to be a "happy medium" for realistic graphics.

You will naturally argue that this is not a problem anymore, because your latest Radeon/GeForce can tackle even 1600x1200 without a hitch. That is not true, however; a compromise is still being made when you're running at 1600x1200. In this case, the compromise is the primitiveness of the game/application that you're running successfully at 1600x1200. If it had more complex, realistic graphics, then you would go right back to 1024x768 or even 800x600 to make it run acceptably. And most likely, it would be a favorable tradeoff! (Especially if it was designed with that resolution in mind.)

In conclusion: resolution is by no means bad; however, it only gets you so far. Higher resolution in no way compensates for other graphical capabilities (in software as well as hardware), and can even be detrimental when it reveals their limitations.

I'm probably not returning to this thread; I just couldn't resist the temptation to write about this.
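To put a number on Leo V's third point (my arithmetic, not his): in a purely fillrate-limited scenario, framerate scales inversely with pixel count, so 1600x1200 costs exactly four times what 800x600 does. A minimal sketch under that assumption, using an R300-like fillrate figure:

```python
# Toy model of a fillrate-limited framerate ceiling: assumes the GPU shades
# a fixed number of pixels per second and nothing else is the bottleneck.

FILL_RATE = 2.6e9  # pixels/sec, e.g. 8 pipes x 325 MHz (R300-like, assumed)
OVERDRAW = 3.0     # assumed average times each screen pixel is drawn

def max_fps(width: int, height: int) -> float:
    return FILL_RATE / (width * height * OVERDRAW)

for w, h in [(800, 600), (1024, 768), (1600, 1200)]:
    print(f"{w}x{h}: {max_fps(w, h):7.1f} fps ceiling")

# 800x600:   ~1805.6 fps ceiling
# 1024x768:  ~1102.0 fps ceiling
# 1600x1200:  ~451.4 fps ceiling
```

The ceilings are unrealistically high because real games are limited by far more than fillrate, but the 4x ratio between 800x600 and 1600x1200 is the point: that is headroom which could instead pay for AA, lighting, or geometry detail, exactly the tradeoff Leo V describes.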