matrox g400 max vs. geforce DDR?

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
I think the G400 was better because it had the better feature set: better circuitry (including the DAC), full fixed-point Z precision that could be forced via the driver, and full trilinear mipmapping that could be forced via the driver. All it really lacked was Glide, and it was actually a good move not to design it around DX6.1, because Matrox's OpenGL ICD eventually delivered great performance and because there was no risk of them implementing DXT poorly.

T&L wouldn't have been necessary if Intel had tried to design CPUs to be better for gaming (they had very little competition, thanks to superior marketing, which meant they didn't have to make the best and most groundbreaking products), though that applies to any graphics processor with dedicated hardware features. We would be a lot better off with systems built from multiple general-purpose processor dies, some with an architecture designed for graphics and some more balanced (like the GameCube's CPU); more than one core per die was a huge mistake, and it's a shame that can't be reversed. There is really no point in dedicated hardware anyway if you want maximum framerates, because software can potentially be made as fast as the end user wants. It is more versatile even if it is more expensive (and without IP we would get lower prices, more popular products, and more efficiency, since individuals would pay to have what they want made rather than relying on a business model where the inventor can't really know whether his product will turn a profit).
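
Just to put the Z-precision talk in perspective, here is a rough back-of-the-envelope sketch of my own (the near/far planes are made-up example values, not anything from Matrox or NVIDIA): with the standard perspective depth mapping d(z) = (far / (far - near)) * (1 - near / z), the eye-space size of one depth step grows with the square of the distance, so the number of fixed-point bits matters a lot at long range.

Code:
/* Back-of-the-envelope sketch (illustrative values only): one step of an
 * N-bit fixed-point depth buffer, under the usual perspective mapping
 *     d(z) = (far / (far - near)) * (1 - near / z),   z in [near, far],
 * covers roughly
 *     dz ~= (far - near) * z^2 / (far * near * (2^N - 1))
 * units of eye-space distance at depth z. */
#include <stdio.h>
#include <math.h>

static double depth_step(double z, double zn, double zf, int bits)
{
    double steps = pow(2.0, bits) - 1.0;   /* number of representable steps */
    return (zf - zn) * z * z / (zf * zn * steps);
}

int main(void)
{
    const double zn = 1.0, zf = 10000.0;            /* example near/far planes */
    const int    bits[]  = { 16, 24, 32 };
    const double dists[] = { 10.0, 100.0, 1000.0 };

    for (int b = 0; b < 3; ++b) {
        printf("%d-bit fixed-point Z:\n", bits[b]);
        for (int i = 0; i < 3; ++i)
            printf("  at z = %6.0f, one depth step spans ~%g eye-space units\n",
                   dists[i], depth_step(dists[i], zn, zf, bits[b]));
    }
    return 0;
}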

NVIDIA wouldn't have a feature set that wasn't biased toward pure performance until the GeForce FX (and even that still wasn't done right), and G80 was really the first time they went all out on features: NVIO (or whatever the new digital transmitter was called), it could finally do full fixed-point log Z buffers, the angle dependence of anisotropic filtering was handled just right for the first time (and to date no change has been needed there), and trilinear filtering was perfect; I can't imagine how that could be any better eight years later, other than that I wish they still let the end user force trilinear mipmaps and clamp the negative texture LOD bias. It added SGSSAA (even though it couldn't be selected until much later), it could do floating-point render targets with any AA mode at the time, and it could do full fixed-point precision (if I'm not mistaken, AMD could do FP32 but only 24 bits of fixed point). It's just a shame how NVIDIA's drivers have fallen, and that they put a programmable fuse into GK110 that can cripple FP64... GTX 780s and 780 Tis are basically all damaged. And they'll probably charge at least $1.1k for a Maxwell with regular FP64, which is really low class, because it will hold back progress.
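
For anyone who never dug into those driver options, here is a rough sketch of the per-texture settings they correspond to on the application side. This is plain desktop OpenGL (the per-texture LOD bias parameter needs a reasonably recent GL version); the function name, texture handle, and bias value are all made up for illustration, and this is not how any vendor's control panel is implemented, only the state it effectively overrides.

Code:
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_TEXTURE_LOD_BIAS on older gl.h headers */

/* Illustrative only: the kind of per-texture state that driver-level
 * "force trilinear" and "clamp negative LOD bias" options override.
 * `tex` is assumed to be an existing, mipmapped 2D texture, and a
 * current OpenGL context is assumed to exist. */
static void set_trilinear_with_clamped_bias(GLuint tex, GLfloat requested_bias)
{
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Trilinear mipmapping: bilinear filtering within a mip level plus a
     * linear blend between the two nearest levels. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* A negative LOD bias picks sharper mip levels at the cost of texture
     * shimmering; "clamping" simply refuses any value below zero. */
    GLfloat bias = (requested_bias < 0.0f) ? 0.0f : requested_bias;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, bias);
}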

As for ATI, there would have been no ATI for AMD to buy if performance benchmarks hadn't been what sold cards, rather than real analysis of image quality and compatibility in reviews... ATI, and now AMD, can thank the tech sites for that, because I would have called out the lack of features and given the R300 series no more than a 50% if I'd had the brains to write an expert review of it. The performance was no doubt groundbreaking, and the size of that jump is legendary to this day, but the only good feature it had was properly rotated-grid AA.
Anyway, which of the two do YOU think was the better product?
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
I recall that the G400 had superior 2D image quality, in both sharpness and color, to the GeForce cards of the time. I had a 32MB DualHead -- still own it.

It was just unfortunate that their OpenGL support was dodgy at best, and that they then jumped the shark with the Parhelia disaster.

G400, TurboGL and Aureal 3D for Half-Life. Good times while they lasted.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Who cares about this crap? Seriously dude, get a life.

Infraction issued for thread crapping. Two days off for points accumulation.
-- stahlhart
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I recall that the G400 had superior 2D image quality, in both sharpness and color, to the GeForce cards of the time. I had a 32MB DualHead -- still own it.

It was just unfortunate that their OpenGL support was dodgy at best, and that they then jumped the shark with the Parhelia disaster.

G400, TurboGL and Aureal 3D for Half-Life. Good times while they lasted.

Indeed, those were good times. A3D was pretty much a game-changer when I first heard it; I still have my Diamond MM Monster Sound 3D lying around somewhere. I have an AWE64 Gold in the original box, mint condition too. I just can't bring myself to get rid of either of those cards. :biggrin:

Side note: I loved Diamond Multimedia back then; they were hands down the best AIB you could buy from. They were sorta like the "EVGA" of those early days. Too bad they went belly up; the new Diamond Multimedia that exists now is nowhere near the same. Back then, though (mid-to-late '90s), if you wanted premium quality you went with Diamond MM. IMO anyway.

Also, RAMDAC quality made such a huge difference in 2D output back then. ATI was more or less ahead of NVIDIA in that respect, although it took ATI a LONG time to catch up in 3D performance. And Matrox was basically better than everyone at 2D image quality. I remember that quite well. Thank goodness everything is digital now and we don't have to base purchases on that.
 

XiandreX

Golden Member
Jan 14, 2011
1,172
16
81
Indeed, those were good times. A3D was pretty much a game-changer when I first heard it; I still have my Diamond MM Monster Sound 3D lying around somewhere. I have an AWE64 Gold in the original box, mint condition too. I just can't bring myself to get rid of either of those cards. :biggrin:

Side note: I loved Diamond Multimedia back then; they were hands down the best AIB you could buy from. They were sorta like the "EVGA" of those early days. Too bad they went belly up; the new Diamond Multimedia that exists now is nowhere near the same. Back then, though (mid-to-late '90s), if you wanted premium quality you went with Diamond MM. IMO anyway.

Also, RAMDAC quality made such a huge difference in 2D output back then. ATI was more or less ahead of NVIDIA in that respect, although it took ATI a LONG time to catch up in 3D performance. And Matrox was basically better than everyone at 2D image quality. I remember that quite well. Thank goodness everything is digital now and we don't have to base purchases on that.

I have been in the video card game since day one. I still remember using an Asus SP-97 motherboard (might have the name wrong) with its built-in S3 ViRGE, or whatever it ran, and a pass-through cable to connect to my Diamond Voodoo 1 4MB for 3D. I knew at the time that Matrox was superior for 2D, but for games it wasn't even close.

I also remember the RAMDAC difference back then. I went to ATI for their Riva cards, and tried both sides off and on over the years.

My best memory is running FIFA 96 (or maybe 97) with my Voodoo 1 and seeing hardware 3D for the first time. Heck, I even called the entire family in to see it.
Doom and Quake were never the same again. :)
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
They were different competitors, and really from different generations.

I really enjoyed the Dagoth Moor Zoological Gardens tech demo with my GeForce 256 DDR!
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
I had the G400 MAX 32MB (sent by a kind tech PR rep), and it was better than anything from NVIDIA or ATI at the time by a significant margin: better fps AND image quality. It was pretty much Matrox's last serious attempt to compete in the 3D space, to my knowledge. Obviously it was soon superseded, but I loved mine for the few months it was ahead.
 

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
3dfx, Matrox, and ATI had a huge one-up on NVIDIA in the early, early days when it came to 2D quality. I remember owning a GeForce 2 GTS, and I couldn't get over how much the dang thing hurt my eyes when browsing the web; that was something I never experienced with my Voodoo3 2000. By the GeForce 3, though, NVIDIA had fixed the issue.

Otherwise, driver quality was something NVIDIA had head and shoulders above Matrox, ATI, and to a lesser extent 3dfx back in those early days. It was one of the reasons NVIDIA put 'em all (besides ATI) out of business.
 

ctk1981

Golden Member
Aug 17, 2001
1,464
1
81
I'm not sure why, but my father has kept the original box for the first video card I ever bought on top of his gun cabinet. That was 15 or 16 years ago. I still have the card, and it was working the last time I used it. Pretty sure I have a Matrox G550 lying around also.

 

Blue_Max

Diamond Member
Jul 7, 2011
4,223
153
106
I recall that the G400 had superior 2D image quality, both sharpness and color, to GeForce cards at the time. I had a 32Mb dual head -- still own it.

It was just unfortunate that their OpenGL support was dodgy at best, and then when they jumped the shark with that Parhelia disaster.

G400, TurboGL and Aureal 3D for Half-Life. Good times while they lasted.

Agreed on the quality! The G400 really had a sharp picture when other VGA cards were known for being fuzzy!

Still, it performed only as well as a GeForce SDR at best; the DDR2 version was ALMOST as fast as a GeForce2.
 

serpretetsky

Senior member
Jan 7, 2012
642
26
101
As for ATI, there would have been no ATI for AMD to buy if performance benchmarks hadn't been what sold cards, rather than real analysis of image quality and compatibility in reviews... ATI, and now AMD, can thank the tech sites for that, because I would have called out the lack of features and given the R300 series no more than a 50% if I'd had the brains to write an expert review of it. The performance was no doubt groundbreaking, and the size of that jump is legendary to this day, but the only good feature it had was properly rotated-grid AA.
Anyway, which of the two do YOU think was the better product?

May I ask specifically what you didn't like about the image quality of the ATI 9700 Pro, or what features you thought it should have?

I thought it was amazing that the card could run at the same fps as other cards while driving higher resolutions with AA and AF enabled. I think it was the first fully capable DirectX 9 card as well.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I think the NVIDIA TNT 2 Ultra was my favorite card of theirs ever. That's where they set themselves apart from the pack, specifically from 3dfx: the Voodoo 3 was pretty disappointing (just IMO), while the TNT 2 had 32-bit color support, AGP texturing, and large texture support. It was also a bit faster than the V3 (at least the Ultra variant was), with way better image quality.

The TNT1 and Voodoo 2, on the other hand, were pretty close; I'd say the V2 was a bit better. I still have a Creative Labs Voodoo 2 lying around somewhere... heh. I was so stoked to buy that card from Best Buy. Remember back in the day when you bought *everything* from brick-and-mortar stores? I loved Best Buy back then. Anyway, that was a close battle, with the V2 edging it out. TNT 2 vs. V3 wasn't close; the TNT 2 was just better.

Man, those were good times for PC gaming. Multiplatform games didn't really exist back then either; there were lots of PC-exclusive titles. It was a different era for sure, and a better one for PC gaming IMHO.
 

NTMBK

Lifer
Nov 14, 2011
10,401
5,639
136
I have a lot of fond memories of the Riva TNT 2. Our local game store had some 3-for-2 deal on Star Wars video games, and we wound up coming home with Rogue Squadron, Podracer, and X-Wing Alliance... we got a lot of use out of that graphics card!

On a side note, I really miss old-school joysticks. I wish my SideWinder Pro would work under Windows 7. :(
 

SimianR

Senior member
Mar 10, 2011
609
16
81
I think the NVIDIA TNT 2 Ultra was my favorite card of theirs ever. That's where they set themselves apart from the pack, specifically from 3dfx: the Voodoo 3 was pretty disappointing (just IMO), while the TNT 2 had 32-bit color support, AGP texturing, and large texture support. It was also a bit faster than the V3 (at least the Ultra variant was), with way better image quality.

The TNT1 and Voodoo 2, on the other hand, were pretty close; I'd say the V2 was a bit better. I still have a Creative Labs Voodoo 2 lying around somewhere... heh. I was so stoked to buy that card from Best Buy. Remember back in the day when you bought *everything* from brick-and-mortar stores? I loved Best Buy back then. Anyway, that was a close battle, with the V2 edging it out. TNT 2 vs. V3 wasn't close; the TNT 2 was just better.

Man, those were good times for PC gaming. Multiplatform games didn't really exist back then either; there were lots of PC-exclusive titles. It was a different era for sure, and a better one for PC gaming IMHO.

TNT 2 was a great card, but I think Glide still gave the V3 the edge at that time; there was still a lot of support for it. Once the GeForce 256 was out, though, there was really no recovering for 3dfx :(
 

Ajay

Lifer
Jan 8, 2001
16,094
8,111
136
I recall that the G400 had superior 2D image quality, both sharpness and color, to GeForce cards at the time. I had a 32Mb dual head -- still own it.

It was just unfortunate that their OpenGL support was dodgy at best, and then when they jumped the shark with that Parhelia disaster.

G400, TurboGL and Aureal 3D for Half-Life. Good times while they lasted.

Yes, 2D IQ was way better. I picked up a Voodoo 2 for 3D games. Starsiege Tribes and the original Unreal Tournament FTW!
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
TNT 2 was a great card, but I think Glide still gave the V3 the edge at that time; there was still a lot of support for it. Once the GeForce 256 was out, though, there was really no recovering for 3dfx :(

Yep. I was sad to see 3dfx go. I remember how flat-out awesome they were in the early days; everyone and their brother bought those cards for GLQuake and Quake II, which were more or less the "COD" games of that era: by far the most popular FPS games you could buy, and unlike COD, they were great games with great multiplayer action.

But the writing was on the wall when they bought STB and started stalling in terms of performance. I'll always have fond memories of the early days, though. Oh man, GLQuake was just flat-out unreal on the Voodoo cards back in the day. It was a jaw-dropping difference compared to software rendering.
 

Yreka

Diamond Member
Jul 6, 2005
4,084
0
76
Aah, memories. I still remember when I noobed out and bought a Voodoo Rush to upgrade my IBM Aptiva ;) I didn't find out until later that the standalone cards were much faster.
 

nenforcer

Golden Member
Aug 26, 2008
1,767
1
76
Image quality was always first on my list, ahead of pure rendering speed.

I went from a Rendition Verite V2200 8MB PCI to a Matrox G400 16MB AGP 4X.

I remember playing Max Payne at 1024x768 w/o T&L in 2001 with that card.

It's amazing that, to this day, Matrox Graphics has maintained driver support for these cards (up through the G550 PCIe), covering Windows Vista / 7 as well as Windows Server 2003 & 2008.

I actually purchased a Hercules Evil Kyro II (PowerVR, no T&L) before buying my first NVIDIA GeForce4 Ti 4200 with AGP 8X in 2003.
 

SPBHM

Diamond Member
Sep 12, 2012
5,065
418
126
I went from an S3 GPU to a GeForce2 MX. At that time NVIDIA had such a huge advantage in terms of drivers, or simply in having hardware that actually worked with the advertised capabilities.
 

Informant X

Senior member
Jan 18, 2000
840
1
81
Wasn't Matrox the first to feature EMBM (environment-mapped bump mapping)? I was always enamored with its possibilities back in the day.