Would you like to see Matrox back in gaming?


Mem

Lifer
Apr 23, 2000
What was the point of the filter in the first place if removing it made such a marked improvement to the image quality? The component added cost, so surely there was some manner of justification for it being there...what would that justification be?

I remember a lot of the companies used inferior parts for the filter. Leadtek used good-quality parts for their cards back in those days and was one of the better card companies.
 

lifeblood

Senior member
Oct 17, 2001
They were never in gaming to begin with. Unless you call side-scrolling 2D or Doom 1 a game.
Dude, you have no idea what you're talking about. There were games before 3D, and Matrox was a big player in the market. The Millennium and Millennium II were stalwarts of the industry.

And yes, I would definitely call Doom 1 a game. When it first came out it was awesome.
 

chucky2

Lifer
Dec 9, 1999
The absolute best graphics setup at one point was a Matrox Millennium G200 w/ 16MB of SGRAM and two Voodoo2s w/ 12MB each.

If you had that, you basically had the best setup one could have.

The other thing about Matrox at that time was that their drivers were stable (for the time period), unlike ATI's and nVidia's stuff, which was certainly not as stable.

Those were the good old days for sure... :wub:

Chuck
 

Scali

Banned
Dec 3, 2004
If I'm not mistaken, 2D acceleration was offered by video cards even a few years older than the Millennium II. As long as the drivers supported Direct2D (called DirectDraw at the time).

2D acceleration predates DirectDraw actually.
The first Windows acceleration was done in Windows 3.x, via the GDI API.
I didn't say it was THE first, but I said it was 'one of the first'.
S3 was also among the early 2D accelerators, and even late versions of the Tseng Labs ET4000 architecture had some 2D acceleration features.

And if we don't look at Windows specifically... you could even look at some early SVGA cards or IBM's XGA for some amount of 'acceleration', by having special bitblt operations and simple hardware filling.
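(For the curious, here's a minimal software sketch of those two primitives, in Python and purely for illustration: a rectangle fill and a bitblt. These are exactly the operations that chips like the 8514/A and XGA moved into hardware, so the CPU no longer had to push every pixel across the bus itself.)

Code:
# Purely illustrative: what a 2D accelerator's fill and bitblt do,
# written as plain software against a flat 8-bit framebuffer.

def fill_rect(fb, pitch, x, y, w, h, colour):
    # Fill a w*h rectangle at (x, y) with a solid colour.
    for row in range(y, y + h):
        base = row * pitch + x
        fb[base:base + w] = bytes([colour]) * w

def bitblt(dst, src, pitch, sx, sy, dx, dy, w, h):
    # Copy a w*h block from (sx, sy) in src to (dx, dy) in dst.
    for row in range(h):
        s = (sy + row) * pitch + sx
        d = (dy + row) * pitch + dx
        dst[d:d + w] = src[s:s + w]

fb = bytearray(640 * 480)                # a 640x480, 8-bit framebuffer
fill_rect(fb, 640, 10, 10, 100, 50, 15)  # solid 100x50 rectangle
bitblt(fb, fb, 640, 10, 10, 200, 200, 100, 50)  # copy it elsewhere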

I agree with you guys that in those days Matrox and ATi had the best 2D image quality and performance, though S3 wasn't far behind. nVidia's early cards had worse image quality, but 2D performance improved by the time the TNT/TNT2 were released.

The main difference between Matrox/ATi and the others (S3, nVidia, Tseng Labs, Cirrus Logic and whatnot) is that Matrox and ATi didn't use third-party OEMs to market their cards; they built their own, under strict quality control.
Sure, there may have been SOME good cards from the other brands as well, but the majority of OEM cards had very poor quality, even from major OEMs such as Diamond.
With Matrox and ATi, the image quality was guaranteed.
 

pukemon

Senior member
Jun 16, 2000
My first 3D accelerator was an 8MB Matrox M3D using PowerVR. It was the first time I ever saw a 3D-accelerated game (GLQuake). It blew my mind!

Oh man, I thought I was the only one that had one of those. I had a Millennium II + M3D. I can't think of too many things that ran decently on it though.

I ended up upgrading to a Diamond Viper V550 (nVidia Riva TNT) because it could do 2D and 3D in one card! w00t!
 

Lavans

Member
Sep 21, 2010
Personally, I always thought the TH2G (TripleHead2Go) was a waste. There's little point in even considering buying it with Eyefinity and Nvidia Surround on the market.
 

Kuzi

Senior member
Sep 16, 2007
2D acceleration predates DirectDraw actually.
The first Windows acceleration was done in Windows 3.x, via the GDI API.
I didn't say it was THE first, but I said it was 'one of the first'.
S3 was also among the early 2D accelerators, and even late versions of the Tseng Labs ET4000 architecture had some 2D acceleration features.

And if we don't look at Windows specifically... you could even look at some early SVGA cards or IBM's XGA for some amount of 'acceleration', by having special bitblt operations and simple hardware filling.

You're right Scali. The first PC I got in '94 had an ISA VGA card. I replaced that with a Trident VLB (512K?) card and my DOS games ran much faster. Not sure if that was because of the faster VESA Local Bus, the Trident card, or both, but I believe the Trident card did have some 2D acceleration. Some awesome DOS games I tested were:

SimCity 2000 - Dune 2 - Warcraft - Sam & Max - Syndicate - Master of Magic - SSF2T - Full Throttle - Doom 1 & 2 - The Settlers - Duke Nukem 3D. Oh yeah, the good old days :p
 

Flipped Gazelle

Diamond Member
Sep 5, 2004
You're right Scali. The first PC I got in '94 had an ISA VGA card. I replaced that with a Trident VLB (512K?) card and my DOS games ran much faster. Not sure if that was because of the faster VESA Local Bus, the Trident card, or both, but I believe the Trident card did have some 2D acceleration. Some awesome DOS games I tested were:

SimCity 2000 - Dune 2 - Warcraft - Sam & Max - Syndicate - Master of Magic - SSF2T - Full Throttle - Doom 1 & 2 - The Settlers - Duke Nukem 3D. Oh yeah, the good old days :p

Still haven't had more fun with video games than these. :D
 

Ross Ridge

Senior member
Dec 21, 2009
If I'm not mistaken, 2D acceleration was offered by video cards even a few years older than the Millennium II. As long as the drivers supported Direct2D (called DirectDraw at the time).

Direct2D and DirectDraw are not the same thing. The IBM 8514/A, which was the first commonly used PC 2D graphics accelerator, came out in 1987. That was long before DirectDraw and the Millennium II. It was (eventually) capable of accelerating GDI, the graphics foundation Windows uses to draw things like windows, controls and text.
 

Martimus

Diamond Member
Apr 24, 2007
Actually, back around the Banshee time frame they had some decent cards for gaming.

Of the cards available at that time, I went with a Matrox (Millennium?) card as it benched very competitively with what was available.

I had a Tseng Labs ET6000 that benched better than a Matrox Millennium, and was cheaper to boot. I think the Millennium was a 2D-only card though, as I seem to remember it coming out around the same time as the original Voodoo.
 

Scali

Banned
Dec 3, 2004
You're right Scali. The first PC I got in '94 had an ISA VGA card. I replaced that with a Trident VLB (512K?) card and my DOS games ran much faster. Not sure if that was because of the faster VESA Local Bus, the Trident card, or both, but I believe the Trident card did have some 2D acceleration.

In DOS there really wasn't any acceleration, apart from basic (S)VGA/VESA trickery (and some DOS APIs for early 3D cards, at about the same time that games started moving to Windows).
It was mainly about the bus speed and how fast the memory on the video card was.
VLB played a huge role in that. With the ISA bus, the bandwidth just was not sufficient for anything above 320x200 (and various notoriously slow ISA cards, such as Trident and Realtek, were not capable of good framerates even at 320x200).
VLB made it possible to play games at higher resolutions (like 640x400) and/or in truecolour. Pretty much all VLB cards, even the slowest ones (such as Trident), were capable of 320x200 at 70+ fps, enough to saturate VGA's standard 70 Hz refresh.
But at high resolutions, fast local-bus cards like the Matrox ones really made a difference: their faster memory allowed considerably higher framerates. No acceleration, just raw power.
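(A back-of-the-envelope calculation makes the bottleneck obvious. The bus figures below are rough, commonly quoted ballpark numbers, not measurements.)

Code:
# Bytes per second needed just to redraw the whole screen every frame,
# with the CPU pushing every pixel across the bus (no acceleration).
def fb_mb_per_s(width, height, bytes_per_px, fps):
    return width * height * bytes_per_px * fps / 1e6

print(fb_mb_per_s(320, 200, 1, 70))  # ~4.5 MB/s:  320x200, 8-bit, 70 Hz
print(fb_mb_per_s(640, 400, 1, 70))  # ~17.9 MB/s: 640x400, 8-bit, 70 Hz
print(fb_mb_per_s(640, 400, 3, 70))  # ~53.8 MB/s: 640x400, truecolour, 70 Hz

# Assumed ballpark throughput: 16-bit ISA manages maybe ~5 MB/s in
# practice, so only the 320x200 case fits; VLB (32-bit at 33 MHz,
# ~130 MB/s theoretical) has headroom for all three.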
 

Scali

Banned
Dec 3, 2004
The IBM 8514/A, which was the first commonly used PC 2D graphics accelerator, came out in 1987.

Yup, 8514 was the high-end part, VGA was mainstream, and then there was the low-end MCGA chip (which supported only the low-res 320x200 256-colour VGA mode).
8514, and its successor, XGA, supported some basic acceleration features such as line-drawing.

However, neither 8514 nor XGA managed to make much of an impact in the market.
Most other IHVs made their clones only VGA-compatible, and then started adding their own proprietary SVGA extensions.
So most games only supported MCGA/VGA graphics, and the 320x200 256 colour mode became the standard modus operandi for games for many years.
SVGA never caught on very well, because the hardware was too diverse, and very difficult to support. By the time it became standardized with a VESA BIOS API, the market was about ready to move to Windows and DirectDraw, which eventually became the de-facto standard for accelerated SVGA, and was later extended with Direct3D.
 

jacktesterson

Diamond Member
Sep 28, 2001
It was all about Voodoos in my newb years, but I'm only 26.

I owned a Kyro II and it was pretty good bang for the buck.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
The top-end G400 card was the fastest 3D card before the original GeForce 256 came out. After that...


<--- kicked it with a 4MB Millennium and a 6MB Pure3D back in the day
 

abbadaba

Member
Aug 9, 2010
I had a 2MB S3 ViRGE integrated on my P133 system from 1995 or so. It came with a version of MechWarrior 2 which was "optimized" for the ViRGE. My god, it ran slow and looked awful. I remember getting my Diamond Voodoo 1 card, which had the 3dfx-optimized version of MechWarrior 2 bundled. What a huge difference.
 

Dark_Archonis

Member
Sep 19, 2010
No. I never really cared that much for Matrox.

The only company I miss is 3dfx, but since they are a part of Nvidia now, I have no real problems with that. I am fairly content with the way the graphics market currently is.
 

Scali

Banned
Dec 3, 2004
I remember a lot of the companies used inferior parts for the filter. Leadtek used good-quality parts for their cards back in those days and was one of the better card companies.

It was not just the components themselves though.
The reference design from nVidia was a 3rd-order filter; Matrox used a 5th-order filter. This allowed for a much steeper slope at cut-off (the nVidia filter already started attenuating within the frequency spectrum of the actual signal, which is what caused the blurring at high resolutions/refresh rates). Even with the most expensive components in the world, nVidia's filter would not have been as good.
Anandtech did a nice explanation with pretty pictures:
http://www.anandtech.com/show/911/8

I haven't heard of any OEM that used a custom-designed filter.

ATi eventually solved it by putting the filter inside the GPU itself, so that OEMs could not mess it up.
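(For anyone who wants to play with the numbers: a rough sketch of why the filter order matters, using an ideal Butterworth response as a stand-in for the actual filter networks on the cards. The pixel clock and cut-off figures are made up for illustration.)

Code:
import math

def butterworth_db(f, fc, order):
    # Attenuation in dB of an ideal nth-order Butterworth low-pass at f.
    return -10 * math.log10(1 + (f / fc) ** (2 * order))

# Hypothetical figures: a 165 MHz pixel clock (roughly 1600x1200 @ 85 Hz)
# with the filter cut-off placed at 200 MHz.
fc = 200e6
for f in (100e6, 165e6, 200e6, 400e6):
    print(f"{f / 1e6:4.0f} MHz: 3rd order {butterworth_db(f, fc, 3):7.2f} dB, "
          f"5th order {butterworth_db(f, fc, 5):7.2f} dB")

# At 165 MHz (actual image content) the 5th-order filter attenuates only
# about half as much (~-0.6 dB vs ~-1.2 dB: less blur), while at 400 MHz
# (out-of-band noise) it attenuates far more (~-30 dB vs ~-18 dB).

In other words, the higher-order filter can stay flat across the whole signal band and still kill the out-of-band junk; a 3rd-order design can't do both at once.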
 

Mem

Lifer
Apr 23, 2000
It was not just the components themselves though.
The reference design from nVidia was a 3rd-order filter; Matrox used a 5th-order filter. This allowed for a much steeper slope at cut-off (the nVidia filter already started attenuating within the frequency spectrum of the actual signal, which is what caused the blurring at high resolutions/refresh rates). Even with the most expensive components in the world, nVidia's filter would not have been as good.

I haven't heard of any OEM that used a custom-designed filter.

ATi eventually solved it by putting the filter inside the GPU itself, so that OEMs could not mess it up.

You don't remember the Anandtech review years ago where they mentioned the image quality was very good on the Leadtek?

Btw, I'm not saying it was as good as Matrox cards, but there was a difference in IQ between nVidia-based cards from various companies back in those days, and yes, quality components can help improve IQ.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
You don't remember the Anandtech review years ago where they mentioned the image quality was very good on the Leadtek?

No, I don't.
I'm not saying the image quality isn't good. Just saying that I haven't heard of any OEM that used a better filter design than nVidia's reference design, which was flawed to begin with.
Does the review say anything about their filter design?