Some GeForce FX previews are up!!!

rahvin

Elite Member
Oct 10, 1999
8,475
1
0
Intellisample, baby! Oh yeah... FSAA has been severely neglected since 3dfx died. Thank god NVIDIA is bringing it back to the front.
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
I must say, I wasn't expecting this quality until the end of next year at the earliest :Q
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Darn, that chip is hot though! That's the last thing I want... a big space heater with a noisy fan. :|
Fast is good, but come on! Do a little optimization! I wouldn't think a 0.13µm part would be so hot, except they also went and clocked it to the moon! :confused: Seems to me like they're overclocking it as much as possible to give it as much *cough* "edge" over the 9700 Pro as possible. ;)

I'll give that to the R300 chip - it doesn't need NEARLY as much cooling! A plain Radeon 9700 non-pro would run nice 'n cool with only the slowest of fans.
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Holy Jebus, that's a big card!!!

I guess the 128-bit memory pipe is official: the GeForce FX will be using 2ns chips. With DDR-II at an effective 1GHz on a 128-bit memory bus, the raw memory bandwidth comes out to 16GB/sec. The raw bandwidth of the Radeon 9700 Pro, with its 256-bit memory bus and DDR at an effective 620MHz, is 19.8GB/sec.
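For anyone who wants to check those numbers, here's a quick back-of-the-envelope sketch (the function is just my own throwaway helper, using effective data rates and 1GB = 10^9 bytes, the way vendors quote it):

```python
# Raw peak memory bandwidth = (bus width in bits / 8) * effective clock rate.
def bandwidth_gb_per_sec(bus_width_bits, effective_clock_mhz):
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

print(bandwidth_gb_per_sec(128, 1000))  # GeForce FX: 16.0 GB/s
print(bandwidth_gb_per_sec(256, 620))   # Radeon 9700 Pro: 19.84 GB/s
```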

Edit: It needs a second power connector???
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Now THIS is what really caught my eye!!

The end result of this compression engine is that anti-aliasing now becomes a very low cost operation, since very little memory bandwidth is wasted. Essentially the only memory bandwidth used is on the edges of polygons, which make up a much smaller percentage of a scene than everything else. This should sound quite familiar as the results are similar to what Matrox promised (and delivered) with their Fragment Anti-Aliasing technology - only anti-aliasing the edges of polygons - however the difference is that there are no compatibility problems with NVIDIA's approach as it is still conventional multisampled AA.

The compression engine is completely invisible to the rest of the architecture and the software running on the GeForce FX, which is key to its success. It is this technology that truly sets the GeForce FX apart from the Radeon 9700 Pro.

The compression engine and the high clock speed of the GeForce FX enabled NVIDIA to introduce two new anti-aliasing modes: 6XS under Direct3D, and 8X AA under both OpenGL and Direct3D. Because of the compression engine, performance with AA enabled should be excellent on the GeForce FX.

THAT is a feature I'd pay for! I'd take a good AA routine over 1600x1200 any day! You can still see the creepy-crawlies at that res, but a good AA routine would eliminate them even at 1024x768! Especially at 6XS or 8X! :Q
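To see why edge-only AA is such a bandwidth win, here's a toy cost model (the 10% edge fraction is a made-up illustrative number; the real figure depends entirely on the scene):

```python
# Toy model: relative framebuffer bandwidth cost per pixel.
samples = 4            # e.g., 4X multisampling
edge_fraction = 0.10   # assume ~10% of pixels lie on polygon edges

# Brute-force supersampling pays the full sample cost on every pixel.
brute_force_cost = samples * 1.0

# With the compression engine, interior pixels compress down to ~1 write;
# only edge pixels pay the full multisample cost.
edge_only_cost = (1 - edge_fraction) * 1.0 + edge_fraction * samples

print(f"brute force: {brute_force_cost:.2f}x bandwidth")  # 4.00x
print(f"edge-only:   {edge_only_cost:.2f}x bandwidth")    # 1.30x
```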
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Oops... looky here:

As we've mentioned before, the kicker is that the GeForce FX won't be available until February of next year. The chip itself is done and production is ramping up at TSMC; one of the causes for the delay is that producing the 0.13-micron wafers apparently takes several more weeks at TSMC than the older 0.15-micron process.

By that time, ATI will have its DDR-II Radeon 9700 out. Competition is still VERY close here. Cool beans or what, eh? :D

 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Dang, that's a hot card!

From HardOCP: Right now the card is not ready to ship. Production will not start until December and retail availability will not be there until February 2003.

Perhaps the coolest thing, though, is nVidia's Cg: NVIDIA wants to speed this up by providing their Cg program to help developers program interesting and unique effects and shaders. Their hope is to make this as easy as they can, so developers can quickly make and implement them in games. But again, Cg isn't the only game in town, as DX9 has its own HLSL.
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: Czar
next order of business... when will we see benchmarks?

Kyle has this to say at HardOCP: Of course, how fast it will run DOOM3 is what is on everyone's mind. From the numbers that NVIDIA showed us compared to a 9700, it looked as if the GeForceFX would be approximately 25% faster in frame rate.

Edit: I'm not too impressed...yet...especially since ATi pulled this rabbit out of their hat 6 months ago...
 

Coherence

Senior member
Jul 26, 2002
337
0
0
Please tell me that is not a power connector I see on the back edge of the card.

And egads, this thing will never fit into an XPC; it takes up 2 slots! At least, it won't work if you don't plan on using the on-board sound, since the second slot covers the XPC's only PCI slot. (I'll never use on-board sound again; I want a dedicated card.) Looks like I'm gonna go back to building a normal tower PC.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
I hope Xbox 2 will feature the NV30. Can you imagine an Xbox 2 w/ NV30 for only $199? :) That would be awesome.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Deeko
That dual slot interface is G-A-Y

Hehe, you don't like plumbing in your case?

In all honesty though, I don't think it's that big a deal. I (and many others I know) use a slot cooler there anyway. I just wish they'd used that extra slot for extra outputs, like the Abit card does. Dual-DVI should definitely be standard on a card this expensive, and taking up 2 slots leaves them no excuse not to include the extra output.

Also, I tend to leave PCI slot 1 open anyway, as it shares an IRQ with the AGP slot on most mobos. That's never a good thing.

Chiz

 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
Those screenshots are incredible. It's the first time in at least a year that I've looked at a technology demo and simply said "Wow!"
All the screenshots in the THG preview are simply stunning, and quite a large jump from the 9700 demos. (I'm gonna get flamed for that ;))

I'm sure that both the 9700 and the FX will be able to run any game in the next two years perfectly well. I'm also sure that I'm not gonna be able to afford either card...