Matrox G400 MAX vs. GeForce DDR?


Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Who cares about this crap? Seriously dude, get a life.

Infraction issued for thread crapping. Two days off for points accumulation.
-- stahlhart

Discussions about old graphics are actually quite interesting to many of us ;)

Don't get me started on comparing the PS2, GC, and Xbox and how they match up.

Honestly, I know hardly anything about pre- and early GeForce era cards, so these discussions are often quite enlightening. It's interesting to hear how some of these old cards can still play games from up to a few years ago lol.
 

Majcric

Golden Member
May 3, 2011
1,409
65
91
I'm not sure why, but my father has kept the original box for the first video card I ever bought on top of his gun cabinet. That was 15-16 years ago. Still have it, and it was working the last time I used it. Pretty sure I have a Matrox 550 lying around also.

xykp.jpg

Lmao, that image looks like it came from Avatar.
 

Z15CAM

Platinum Member
Nov 20, 2010
2,184
64
91
www.flickr.com
Don't forget the ATi Mach 64 with 24-bit colour and TV out, and the Rage 128 32-bit colour PCI cards. ATi pretty much ruled the AGP era starting with the Radeon 9700, until the PCIe era when the GTX 280 came out.
 
Last edited:

XiandreX

Golden Member
Jan 14, 2011
1,172
16
81
Don't forget the ATi Mach 64 with 24-bit colour and TV out, and the Rage 128 32-bit colour PCI cards. ATi pretty much ruled the AGP era starting with the Radeon 9700, until the PCIe era when the GTX 280 came out.

The Radeon 9700 (which I owned) was truly one of ATI's best cards ever. It trounced the competition, had great image quality, and for all intents and purposes was a stable card.

I am looking forward to AMD hopefully pulling off a similar feat one day.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
I actually find it amazing to think that transform and lighting were ever done on the CPU, given how critical they are to 3D rendering.
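
For anyone curious what "T&L on the CPU" actually meant in practice, here is a minimal illustrative sketch (the struct names and column-major matrix layout are my own assumptions, not taken from any particular engine): before hardware T&L, the CPU had to run a loop like this over every vertex, every frame, transforming positions and evaluating per-vertex lighting before handing anything to the card.

[CODE]
/* Minimal sketch of per-vertex software transform & lighting, the work a CPU
 * did every frame before hardware T&L (GeForce 256 era). Names and the
 * column-major matrix layout are illustrative assumptions, not any real engine. */
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { float m[16]; } Mat4;        /* column-major 4x4 matrix */

/* Transform a position by a 4x4 matrix (w assumed to be 1). A real pipeline
 * would follow this with projection and the perspective divide, omitted here. */
static Vec3 transform_point(const Mat4 *mv, Vec3 p) {
    Vec3 out;
    out.x = mv->m[0]*p.x + mv->m[4]*p.y + mv->m[8]*p.z  + mv->m[12];
    out.y = mv->m[1]*p.x + mv->m[5]*p.y + mv->m[9]*p.z  + mv->m[13];
    out.z = mv->m[2]*p.x + mv->m[6]*p.y + mv->m[10]*p.z + mv->m[14];
    return out;
}

/* Simple per-vertex Lambertian term: N dot L, clamped at zero. */
static float vertex_diffuse(Vec3 n, Vec3 l) {
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d > 0.0f ? d : 0.0f;
}

/* The loop the CPU had to run over every vertex, every frame. */
static void software_tnl(const Mat4 *mv, const Vec3 *pos, const Vec3 *nrm,
                         Vec3 light_dir, Vec3 *out_pos, float *out_lum, int count) {
    for (int i = 0; i < count; ++i) {
        out_pos[i] = transform_point(mv, pos[i]);
        out_lum[i] = vertex_diffuse(nrm[i], light_dir);
    }
}

int main(void) {
    /* One triangle, identity modelview, light pointing down +Z. */
    Mat4 identity = {{ 1,0,0,0,  0,1,0,0,  0,0,1,0,  0,0,0,1 }};
    Vec3 pos[3] = { {0,0,0}, {1,0,0}, {0,1,0} };
    Vec3 nrm[3] = { {0,0,1}, {0,0,1}, {0,0,1} };
    Vec3 light  = { 0, 0, 1 };
    Vec3 out_pos[3]; float out_lum[3];

    software_tnl(&identity, pos, nrm, light, out_pos, out_lum, 3);
    for (int i = 0; i < 3; ++i)
        printf("v%d: (%.1f, %.1f, %.1f) lum %.2f\n",
               i, out_pos[i].x, out_pos[i].y, out_pos[i].z, out_lum[i]);
    return 0;
}
[/CODE]

Hardware T&L simply moved that per-vertex loop off the CPU and onto dedicated silicon, which is why the GeForce 256/DDR made such a difference in geometry-heavy scenes.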
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I'm not sure why, but my father has kept the original box for the first video card I ever bought on top of his gun cabinet. That was 15-16 years ago. Still have it, and it was working the last time I used it. Pretty sure I have a Matrox 550 lying around also.

xykp.jpg
I had the same card with the same box.
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
Years ago I built a dual-CPU PC for fun, back before dual cores were common. I think it was 2 x Pentium III 800MHz on a Tyan Tiger 100 mobo. Anyway, I got a cheap graphics card from eBay just to boot the thing, pretty sure it was a Matrox G400, and the POS couldn't even run StarCraft 1 quickly... it lagged in every game I joined, and there was much rage from the other players :cool:
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
I don't think a 3D card could cause StarCraft 1 to lag - probably something else. StarCraft 1 was 2D, remember - everything would be done on the CPU.
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
I don't think a 3D card could cause StarCraft 1 to lag - probably something else. StarCraft 1 was 2D, remember - everything would be done on the CPU.

Might've been a G200. Either way, it was the only time I ever thought "god, this card is trash" lol. The GeForce MX400 that replaced it fared much better.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
Still shouldn't matter. The only things that should cause lag in SC1 are the CPU, RAM, and network speed.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
May I ask specifically what you didn't like about the image quality of the ATI 9700 Pro or what features you thought it should have?
Sure. :) The pixel shaders were only partial precision, which means less accuracy, so they were certainly less able to emulate old features faithfully if the old fixed functions had been calculated in 32 bits (although things like the Dreamcast's modifier volumes weren't 32 bits at every step; the T&L was calculated with 32-bit float precision on the Dreamcast's CPU, although the R300 vertex shaders were as good there). It would've been better for it to have a 32-bit fixed-point log z-buffer (UT 99 had to have depth hacks or severe flicker without it, so we haven't seen many games that look quite like Enter the Matrix since the GeForce FX or even the original Xbox; even the DX10 renderer for the original Unreal Engine-based games uses a reverse float 32-bit z-buffer, and that still isn't quite as good) and a D24FS8 format. Few attempts at new features were made for OpenGL. ATI made no attempt at dev relations like NVIDIA did, so the OpenGL drivers wound up sucking for the R300 as a result. It did have z-bias, but that could only be done at 24 bits if you wanted it per pixel (rather than per vertex, which most games used then and which was done in full precision with the vertex shaders), since the z-buffer and pixel shaders were only 24-bit. Full trilinear filtering wasn't possible on the R300, and AMD's filtering still wasn't up to snuff with the 6k series; haven't tried GCN though. And finally, R300's depth range still appeared smaller than it did on the GeForce 4 Ti series in games where both used 24-bit z-buffers. R300 was geared for performance to the max like the GameCube's Flipper was (they were designed by the same team IIRC), with everything low precision (remember all the artifacts in REmake that were required in order to get cool-looking alpha blending effects?).

ATI also had some over-gammaed colors mixed in with over-vibrant colors.

There were plenty of rendering errors in Max Payne, at least with the drivers and settings I tried (perhaps forcing things through the drivers caused them, I don't remember; I was also always at a loss to understand why they offered a 6x MSAA option given how terrible it looked, when the sampling pattern for 2x and 4x MSAA was really the only thing R300 did right image-quality-wise).

AMD's alpha blending, or alpha testing, whichever it was, looked pretty terrible, particularly on the windows in Max Payne.

Basically, I would make hardware with 64-bit float (and fixed, since what is best depends on the app) data precision at every stage of the pipeline; all decompression should be lossless; and of course less-than-64-bit instruction sets are out of date, so 64-bit instruction sets are necessary too. And Larrabee failed because it wasn't radically parallel; it even used multiple Pentium 1 cores... it simply combined all the old ideas into one, and that's not going to work. You can get a lot more performance out of software and a lot better IQ out of software, although you can't have a balance of both like you do with modern hardware unless you use more than one general-purpose die; some can be geared towards being better for gaming, and any noticeable problems should be taken care of with multiple dies (AFR sucks, for example).

I am happy for AMD though, and I wish them nothing but the very best. I just think that Sega's programmers made Burning Rangers look really good for its day (at the tech level in addition to the art level) with all the software transparency and creative use of the hardware functions it did have; as did PS2 games like Devil May Cry 1 and 3 (they wouldn't have looked anywhere near as good on the GameCube unless the whole game was done in software, at which point it would've been unplayable, although RE4 on the PS2 had terrible textures and I heard the fight with Krauser wasn't the same because the PS2 didn't have as much geometry processing power as the GC's fixed-function T&L unit did). The Saturn port of the DOA arcade game looked better than the new one made for the PSX. Astal's water effects looked damn fine and the game's art style was cool, if a bit depressing (the game was a bit hard though). VF2 played and sounded just like the arcade one. Panzer Dragoon Saga wouldn't have looked the same on the PS2. The 3D level in Sonic Jam looked really good and so did the unique user interface.
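
On the z-buffer precision points above, here is a rough, illustrative sketch (the near/far planes and the whole comparison are my own construction, not tied to any specific game, driver, or the R300 itself): it estimates how far apart two surfaces must be at a given distance before a conventional 24-bit depth buffer can tell them apart, and contrasts that with a "reverse float 32-bit z-buffer" style mapping like the one mentioned in the post.

[CODE]
/* Rough illustration of the depth-precision discussion above. Near/far values
 * are arbitrary, picked only to show the trend; nothing here models a real
 * driver or game. A conventional perspective mapping piles almost all of a
 * 24-bit z-buffer's resolution next to the near plane (hence z-fighting on
 * distant geometry), while storing depth reversed (near -> 1.0, far -> 0.0)
 * in a 32-bit float spreads the precision far more evenly. */
#include <math.h>
#include <stdio.h>

#define NEAR 0.5
#define FAR  10000.0

/* Conventional mapping: view-space depth d in [NEAR, FAR] -> buffer value in [0, 1]. */
static double depth_standard(double d)     { return (FAR / (FAR - NEAR)) * (1.0 - NEAR / d); }
static double unproject_standard(double z) { return FAR * NEAR / (FAR - z * (FAR - NEAR)); }

/* Reversed mapping: near plane stores 1.0, far plane stores 0.0. */
static double depth_reversed(double d)     { return (NEAR / (FAR - NEAR)) * (FAR / d - 1.0); }
static double unproject_reversed(double z) { return FAR * NEAR / (NEAR + z * (FAR - NEAR)); }

int main(void) {
    const double depths[] = { 1.0, 10.0, 100.0, 1000.0, 9000.0 };
    const double step24   = 1.0 / (1 << 24);   /* one quantisation step of a 24-bit buffer */

    for (int i = 0; i < 5; ++i) {
        double d = depths[i];

        /* 24-bit fixed, conventional: view-space distance covered by one step. */
        double sep_fixed = unproject_standard(depth_standard(d) + step24) - d;

        /* 32-bit float, reversed: view-space distance covered by one ULP of the
         * stored float value at this depth. */
        float  zr      = (float)depth_reversed(d);
        double sep_rev = fabs(unproject_reversed(nextafterf(zr, 0.0f)) -
                              unproject_reversed(zr));

        printf("depth %8.1f : 24-bit standard step %.3e   float32 reversed step %.3e\n",
               d, sep_fixed, sep_rev);
    }
    return 0;
}
[/CODE]

With these numbers the 24-bit buffer can only resolve surfaces several units apart at the far end of the range, while the reversed float mapping stays at a tiny fraction of a unit throughout, which is the kind of difference behind the depth-hack and flicker complaints above.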
 

Blue_Max

Diamond Member
Jul 7, 2011
4,223
153
106
As far as StarCraft 1 lag goes - make sure it's actually using the Matrox drivers, not "Generic VGA" or something!
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
I am happy for AMD though, and I wish them nothing but the very best. I just think that Sega's programmers made Burning Rangers look really good for its day (at the tech level in addition to the art level) with all the software transparency and creative use of the hardware functions it did have; as did PS2 games like Devil May Cry 1 and 3 (they wouldn't have looked anywhere near as good on the GameCube unless the whole game was done in software, at which point it would've been unplayable, although RE4 on the PS2 had terrible textures and I heard the fight with Krauser wasn't the same because the PS2 didn't have as much geometry processing power as the GC's fixed-function T&L unit did). The Saturn port of the DOA arcade game looked better than the new one made for the PSX. Astal's water effects looked damn fine and the game's art style was cool, if a bit depressing (the game was a bit hard though). VF2 played and sounded just like the arcade one. Panzer Dragoon Saga wouldn't have looked the same on the PS2. The 3D level in Sonic Jam looked really good and so did the unique user interface.

IIRC, the vast majority of PS2 games hardly used the full breadth of both of the Emotion Engine's vector units. The thing was a bitch to program, supposedly, so you can't blame them. There are plenty of PS2 games that show off what could really be done when using both VUs - many things that were not seen at the same quality on the Xbox or GC, like dynamic water in a few games (Ghosthunter). The EE was a beast, and had it been paired with a 3D T&L GPU, the possibilities..........:awe:
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
IIRC, the vast majority of PS2 games hardly used the full breadth of both of the Emotion Engine's vector units. The thing was a bitch to program, supposedly, so you can't blame them. There are plenty of PS2 games that show off what could really be done when using both VUs - many things that were not seen at the same quality on the Xbox or GC, like dynamic water in a few games (Ghosthunter). The EE was a beast, and had it been paired with a 3D T&L GPU, the possibilities..........:awe:

I still think the GameCube had the best hardware design out of all the 6th gen systems, besides maybe the Dreamcast (but I'm extremely biased in that regard).

The TEV pipeline in the GameCube made for some very cool effects in games that utilized it well, and the fact that Flipper had T&L and that Gekko had several "graphics" specific enhancements to its FPU (I won't say SIMD instructions because some say it didn't have any, but I'm not sure right now) made the PS2's VUs look like a waste of silicon. However, the PS2's design was still very interesting.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I still think the GameCube had the best hardware design out of all the 6th gen systems, besides maybe the Dreamcast (but I'm extremely biased in that regard). The TEV pipeline in the GameCube made for some very cool effects in games that utilized it well, and the fact that Flipper had T&L and that Gekko had several "graphics" specific enhancements to its FPU (I won't say SIMD instructions because some say it didn't have any, but I'm not sure right now) made the PS2's VUs look like a waste of silicon. However, the PS2's design was still very interesting.
I say the Dreamcast, then the PS2, then the GC.

The GameCube provided the most colorful-looking, most lit-up graphics and could render lights the fastest, but had the most artifacts, while the Dreamcast sat in between the PS2's precision (except for the RAM limitations generally not allowing 32-bit textures) and the GameCube's emphasis on color/lighting. And the Dreamcast's dithering isn't as noticeable if you don't play in progressive scan, which was only 60Hz (on a CRT, no less) anyway, but putting the GC in progressive scan really makes the artifacts look awful. So I would say the PS2 was the second best, because it was fully programmable and because Devil May Cry 3 looks a lot better than Resident Evil 4 does IMO. Even Sonic Adventure and Crazy Taxi looked and ran better on the Dreamcast than they did on the GC, although I can't comment on the PS2 versions. So the GameCube wasn't even all that easy to program for, if the devs couldn't make the GameCube versions look and run at least as good as the Dreamcast originals did.

And I am just not happy with fixed-function hardware (like ZFreeze) or when tricks have to be done (like decompressing/compressing anything lossy on the fly) that try to or have to make up for low precision. Hardware should have few modes, with the highest precision possible and the most programmability. That's part of why I favor going back to software for everything and part of why I think 3D graphics hardware was a mistake. Intel should never have had IP, and Microsoft having it just as much as Intel has retarded good growth... society would've been even happier. Pretty much only one part should be fixed function for now, rather than having programmable scaling, since getting rid of NVIDIA's current display logic right away would be a disaster.

Anyway, the GameCube's graphics would have been a lot better had a 32-bit z-buffer and ARGB8 been used... 24 bits is always going to look like crap because it is only partial precision. But I have noticed ATi and AMD have always liked partial precision, and now NVIDIA doesn't like double precision unless they can make extra money off of it.
 
Last edited: