How can the integrated video on the nForce2 be so good?!

CZroe

Lifer
Jun 24, 2001
I thought that the only justification nVidia had for calling the GeForce4 MX anything more than a GeForce2 (MX?) is its "Lightspeed Memory Architecture II," as its core feature set is identical to the GeForce2. However, the nForce2 uses shared system memory INSTEAD, and therefore benefits little. From what I was told, without the LMA II speed boost a GeForce4 MX is effectively a GeForce2, which in this case makes it effectively a higher-clocked nForce1. I would expect an nForce1 to be slower than other GeForce2 MXs simply because it didn't have dedicated memory when a real MX card did. I was expecting the nForce2's nForce1-class graphics to CRAWL with a modern game like Unreal II.

Just for laughs, when my brother brought home an MSI K7N2G-LISR, we decided to run Unreal II on it before throwing in an AGP8x GeForce4 Ti4200. We really didn't expect it to run smoothly, but it did (albeit at 640x480)! It was smooth enough to play comfortably, even in wide-open terrain. Then we discovered that the 512MB (2x256MB DualDDR266) wasn't even in the right slots for dual-channel operation. That means it was sharing memory bandwidth with the AthlonXP, effectively stealing every bit of video bandwidth straight from the AthlonXP's FSB! We swapped some DIMMs around, cranked up all the settings, and were able to play comfortably at much higher resolutions.
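Rough peak-bandwidth numbers show why the slot placement mattered so much. This is just a back-of-the-envelope sketch in Python, assuming textbook figures for DDR266 and the Athlon XP's 266MT/s EV6 bus rather than anything measured on this board:

# Back-of-the-envelope peak memory bandwidth (theoretical, not measured).
def peak_bandwidth_gb_s(transfers_mhz, bus_bits, channels=1):
    # transfers per second * bytes per transfer * number of channels
    return transfers_mhz * 1e6 * (bus_bits / 8) * channels / 1e9

single_ddr266 = peak_bandwidth_gb_s(266, 64, channels=1)  # ~2.1 GB/s
dual_ddr266   = peak_bandwidth_gb_s(266, 64, channels=2)  # ~4.3 GB/s
athlon_xp_fsb = peak_bandwidth_gb_s(266, 64)              # 266MT/s EV6 bus, ~2.1 GB/s

print(f"Single-channel DDR266: {single_ddr266:.1f} GB/s")
print(f"Dual-channel DDR266:   {dual_ddr266:.1f} GB/s")
print(f"Athlon XP FSB:         {athlon_xp_fsb:.1f} GB/s")

# In single-channel mode the CPU's FSB alone can saturate system memory,
# so every byte the IGP reads is taken directly from the CPU's share.
# Dual-channel roughly doubles the peak, leaving headroom for the IGP.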

It can't run 3DMark2003 (the GeForce2-class feature set only supports one test, and that one happens to crash on this machine), but now I almost feel that it's a waste to stick an AGP8x GeForce4 Ti4200 in here!
 

jeffrey

Golden Member
Jun 7, 2000
Originally posted by: CZroe

It can't run 3Dmark2003 (GeForce2 feature set only supports one test, and it happens to crash on this machine), but now I almost feel that it's a waste to stick an AGP8x GeForce4 Ti4200 in here!

Stick the Ti4200 in there before you write off how much of an improvement you will get out of it.

 

Lord Evermore

Diamond Member
Oct 10, 1999
http://www.anandtech.com/video/showdoc.html?i=1583&p=6

The multisample AA unit and video features are the main differences from the GF2. Even though LMA-II is limited by not having dedicated memory, the dual memory controllers should still enhance performance compared to a GF2. The core clock is 250MHz, the same as a GF2 Ultra.
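For a sense of scale, here's a quick, hedged comparison of the memory pools involved. The figures assume a retail GF2 MX with 128-bit SDR at 166MHz, a GF2 Ultra with 128-bit DDR at 230MHz, and the IGP drawing from dual-channel DDR266; they're textbook peak numbers, not benchmarks:

# Peak memory bandwidth in GB/s (assumed configurations, theoretical peaks)
gf2_mx_dedicated = 166e6 * 16 / 1e9      # 128-bit SDR @ 166MHz  -> ~2.7 GB/s
igp_shared_pool  = 266e6 * 8 * 2 / 1e9   # dual-channel DDR266   -> ~4.3 GB/s
gf2_ultra        = 460e6 * 16 / 1e9      # 128-bit DDR @ 230MHz  -> ~7.4 GB/s

print(f"GF2 MX (dedicated):    {gf2_mx_dedicated:.1f} GB/s")
print(f"nForce2 IGP (shared):  {igp_shared_pool:.1f} GB/s")
print(f"GF2 Ultra (dedicated): {gf2_ultra:.1f} GB/s")

# Even after the CPU takes its cut, the shared dual-channel pool is in the
# same ballpark as a GF2 MX's dedicated memory, which is part of why the
# IGP doesn't simply crawl.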

Sure, you can run a game at 640x480 and it'll play nicely, but with a Ti4200 you'll be able to play at even higher resolutions "comfortably" (what framerate do you consider comfortable?).

Try it with other games as well. I don't know how much Unreal II depends on memory-to-CPU bandwidth and video memory bandwidth, but anything that needs a lot of both is going to see increasingly poor performance with the integrated video.

Basically, it sounds like you've managed to hit the exact market NVIDIA designed the nForce2 IGP for.
 

John

Moderator Emeritus / Elite Member
Oct 9, 1999
If you play games @ 640x480, then stick with the integrated GF4 MX440; otherwise, use the Ti4200 for higher-res gaming and AF/AA.
 

CZroe

Lifer
Jun 24, 2001
It was 640x480 only with the single-channel configuration. Going to dual-channel DDR266 made it "playable" at up to 1024x768 with all detail settings up (except the lighting effects under the other, non-graphical menus; those were left at default). I consider "playable" to be well over 30FPS, though I could see the framerate dip a little by eye at the higher resolutions. For example, Quake 1 was playable in software on my IBM P-150 CPU (133MHz? Piss-poor FPU?) in Windows (lowest resolution) and DOS (higher resolution), but sucked ass on a 97MHz N64 with SGI hardware acceleration.

I know this much: I couldn't play Unreal II as comfortably on my other system with a GeForce3 Ti500, which is generally considered worlds better than any MX card. Imagine if it were running DualDDR333 (DDR400 is only supported when the IGP graphics aren't in use)!

Originally posted by: jeffrey

Stick the Ti4200 in there before you write off how much of an improvement you will get out of it.
I'm not talking about a benchmark improvement. I'm saying it can run even today's very latest games with very little sacrifice. To keep the Ti4200 in this system, I'd be sacrificing a lot from another system that's otherwise incapable of running modern games.