Are Matrox Video Cards Considered Good?

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Stefan Payne

Senior member
Dec 24, 2009
I should have said...Matrox's last good card was the Millennium II.
Nope, it was the Millennium G400 (MAX).

At that time Matrox could keep up really well with nVidia, ATi and 3dfx (who were already falling behind by then, but whose performance was still OK).
But after the G400 Matrox lost its ambition, and nothing really happened for years...
What about the Parhelia, with its 4 pixel pipelines, 16 TMUs, 4 vertex shaders, 256-bit bus and 16x Fragment Anti-Aliasing? The Millennium P750 is only half of a Parhelia.
Well, the memory interface was crap...
It was just one monolithic 256-bit bus, while other cards such as the R300 had, AFAIR, four 64-bit buses which could operate independently, so the Parhelia wasted a lot of bandwidth...
Even the older G-series cards had better memory interfaces.
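To put toy numbers on that bandwidth argument (a deliberately simplified model made up for illustration, not real Parhelia or R300 memory timings): if every transaction occupies a whole channel, a narrow access on one wide bus wastes most of the transfer, while independent narrow channels stay fully used.

```python
# Toy model (illustrative only): assume each memory transaction occupies
# a whole channel, so only the requested bits out of channel_bits count
# as useful work.

def useful_fraction(access_bits, channel_bits):
    """Fraction of a channel's raw bandwidth that carries requested data."""
    return min(access_bits, channel_bits) / channel_bits

# One monolithic 256-bit bus serving a 64-bit access: 3/4 of the transfer wasted.
print(useful_fraction(64, 256))  # 0.25
# Four independent 64-bit channels: each 64-bit access fills its own channel.
print(useful_fraction(64, 64))   # 1.0
```

Real memory controllers are far more complicated (bursts, banks, write coalescing), but that's the gist of why a crossbar of narrow independent channels wastes less bandwidth on small scattered accesses than one wide bus.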

And the rest isn't that good either...
The 4 TMUs per pipe couldn't be fully used, among other things...

The Parhelia was a GeForce 3 competitor (DirectX 8.1, IIRC, on the Matrox part) and underperformed even then. Most integrated graphics are faster.
Wish that were true...
Feature-wise that's correct, but let's not talk about the performance, which was behind a GF2 GTS...

I had a GF2 GTS (Pro, IIRC) and a Parhelia a while ago...
Parhelia was a hybrid DX8/DX9 card: it could process DX9 pixel shaders, but it could only do DX8 vertex shaders.
No, it was the other way around.
The pixel shaders were version 1.3 and the vertex shaders were almost version 2.0 (or even fully 2.0).

While the Parhelia was slower overall than the GeForce Ti 300, it was never that far behind, especially when anti-aliasing was cranked up. The Parhelia lost hopelessly to the 4600 in this review with anti-aliasing off, but with it on, the performance was close and the image quality was great. ATi and nVidia should do something like Fragment Anti-Aliasing: little performance impact and great image quality. Super-sampling, CFAA and TrAA are simply too much of a burden on current GPUs with many of today's demanding games.
http://firingsquad.com/hardware/parhelia/page11.asp
1. Well, the truth is the Parhelia had problems even with the Ti 200 and wasn't able to compete with it.

2. FAA looks good - on paper!
First, it used 16 samples, but in an ordered grid pattern, so it can only be compared to 4x RGMSAA.
And FAA on the AGP 4x Parhelia works just on the outer edges of an object, which is quite useless...
I think that was fixed on the AGP 8x version, which also got a much higher clock (I've heard about 300 MHz), but no one really reviewed that thing, so I can't tell for sure how much better the AGP 8x version was...
 

evolucion8

Platinum Member
Jun 17, 2005
No, it was the other way around.
The pixel shaders were version 1.3 and the vertex shaders were almost version 2.0 (or even fully 2.0).

Yeah, sorry hehe, a bit sleepy at that time...


2. FAA looks good - on paper!
First, it used 16 samples, but in an ordered grid pattern, so it can only be compared to 4x RGMSAA.
And FAA on the AGP 4x Parhelia works just on the outer edges of an object, which is quite useless...

But nVidia's anti-aliasing was also based on an ordered grid at that time; nVidia introduced a better grid with the GeForce 6 series. That's why the Parhelia had better anti-aliasing quality than anything else at that time.

Those were the good times, when competing technology would come out of nowhere and make huge leaps in performance and image quality. I still remember my Radeon 9700 PRO and its tech demos, which were jaw-dropping back in 2002/2003; it wasn't until approximately two years ago that DX9 games caught up with those demos in terms of realism and quality. And I'm still waiting for DX9 games to catch up with the X800 demos, like Double Cross with its CG style and ToyShop with its excellent raindrop simulation. Software is so far behind the hardware...
 

WaitingForNehalem

Platinum Member
Aug 24, 2008
He needs exactly the opposite. Apple's adapter will let him hook up a Mac with a Mini DisplayPort to an LCD with a DVI plug, and there are cheaper alternatives to the Apple-branded adapter from Monoprice. What he needs is a plug that converts the male Mini DisplayPort hard-wired to his Apple monitor into a DVI plug that will go into his PC, and that doesn't exist.



This would probably work, but he will still need a Mini DisplayPort to DisplayPort adapter.

http://www.monoprice.com/products/p...=10428&cs_id=1042804&p_id=6502&seq=1&format=2

http://www.monoprice.com/products/p...=10246&cs_id=1024603&p_id=5995&seq=1&format=2

http://www.newegg.com/Product/Produc...82E16814121310
 

nitromullet

Diamond Member
Jan 7, 2004

Stefan Payne

Senior member
Dec 24, 2009
But nVidia's anti-aliasing was also based on an ordered grid at that time; nVidia introduced a better grid with the GeForce 6 series. That's why the Parhelia had better anti-aliasing quality than anything else at that time.
nVidia was way behind at that time.
And the R300 wasn't far away - it came a month later, IIRC.

So it's not that good...
If it had come a year earlier - who knows...
 

Luddite

Senior member
Nov 24, 2003

So, if I got the coupler, the cable and then any video card, it would work? (It didn't say anything about a DisplayPort out on the Asus card in your link.) Doesn't a DVI signal need to be converted to a DisplayPort signal?

After some more research, I found out that some people have reported this card to work in a PC:

http://store.apple.com/ca/product/MC002ZM/A
 

BFG10K

Lifer
Aug 14, 2000
I think Nvidia bit off Matrox actually except that Nvidia isn't as efficient.
Uh, I don’t think you know enough about the AA methods to make that comment. I know this because your comment is false.

There’s nothing efficient about an ordered grid pattern that only affects first-class polygon edges.
 

BFG10K

Lifer
Aug 14, 2000
But nVidia's anti-aliasing was also based on an ordered grid at that time,
That isn't quite true; 2x and 2xQ both employed rotated grid patterns. Also, even 2xAA affects more of the scene than 16x FAA does. And you forget that 16x FAA is ordered grid too, which means on near-horizontal/vertical edges it's only about as effective as 4xRG while taking four times as many samples.
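A rough way to see the grid argument (the sample positions below are made up for illustration, not the actual hardware patterns): on a near-vertical edge only the distinct x offsets of the samples matter, and a 4x4 ordered grid has no more of them than a well-chosen 4-sample rotated grid.

```python
# Illustrative sample patterns inside one pixel (not the real hardware
# offsets): 16x ordered grid vs. a 4x rotated/sparse grid.

# 16x ordered grid: a regular 4x4 lattice of sample points.
og16 = [((i + 0.5) / 4, (j + 0.5) / 4) for i in range(4) for j in range(4)]

# 4x rotated grid: four samples, each with a unique x and a unique y offset.
rg4 = [(1/8, 5/8), (3/8, 1/8), (5/8, 7/8), (7/8, 3/8)]

def edge_shades(samples):
    """Distinct x offsets = gradient steps available on a near-vertical edge."""
    return len({x for x, _ in samples})

print(len(og16), "samples ->", edge_shades(og16), "shades")  # 16 samples -> 4 shades
print(len(rg4), "samples ->", edge_shades(rg4), "shades")    # 4 samples -> 4 shades
```

Both patterns resolve only four coverage levels on such an edge, so the ordered grid pays for 16 samples to get the edge quality of 4 well-placed ones.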

That's why the Parhelia had better anti-aliasing quality than anything else at that time
This is absolutely untrue. Arguably the finest AA ever seen in consumer space was that of the Radeon 8500, which employed pseudo stochastic super-sampling. Granted, the performance hit was quite steep, but you were only talking about image quality.

In terms of usable AA with excellent quality, the 9700 Pro was far ahead of the Parhelia given it had 4xRGMS and 6xSGMS.
 

Stefan Payne

Senior member
Dec 24, 2009
Let's not talk about the R200, which seems to be broken in many ways (AnandTech had a test with drivers which enabled multisampling on it); it even sometimes had a rotated grid for 2x, and sometimes even for 4x, if there was no fogging.

That FAA only affects the outer edges was, AFAIK, fixed on later (AGP 8x-capable) versions of the Parhelia, which should be clocked higher than the AGP 4x version.
 

evolucion8

Platinum Member
Jun 17, 2005
That isn't quite true; 2x and 2xQ both employed rotated grid patterns. Also, even 2xAA affects more of the scene than 16x FAA does. And you forget that 16x FAA is ordered grid too, which means on near-horizontal/vertical edges it's only about as effective as 4xRG while taking four times as many samples.

I already knew that FAA was OG. 2x Quincunx anti-aliasing was never that good; it was a cheap way to do some sort of anti-aliasing with less performance impact compared to traditional anti-aliasing, which was a heavy burden for GPUs at that time.

This is absolutely untrue. Arguably the finest AA ever seen in consumer space was that of the Radeon 8500, which employed pseudo stochastic super-sampling. Granted, the performance hit was quite steep, but you were only talking about image quality.

In terms of usable AA with excellent quality, the 9700 Pro was far ahead of the Parhelia given it had 4xRGMS and 6xSGMS.

I was comparing the Parhelia only to the GeForce 4 and earlier models. The Radeon 9700 PRO came after the Parhelia, and the Radeon 8500 definitely had better anti-aliasing quality than the GeForce 4/3. But the Radeon 9700 PRO demolished everything at that time in terms of image quality and performance.
 

Knowname

Member
Feb 17, 2005
Matrox's current cards likely can't keep up with intel's integrated graphics in 3d.

I doubt they are THAT bad; Matrox has silently (perhaps more loudly to OEMs, but silently to enthusiasts) announced PCI Express-based Parhelia units. They have 'kept up'. But still, par for the course for Matrox in 3D is probably 4400 levels - NOT worth $400 for a card. But do NOT underestimate the suckiness of integrated Intel :|.

An old G550 / K6-2+ I had couldn't even play HL1, though, and this was YEARS after HL came out :(.
 

Fox5

Diamond Member
Jan 31, 2005
I doubt they are THAT bad; Matrox has silently (perhaps more loudly to OEMs, but silently to enthusiasts) announced PCI Express-based Parhelia units. They have 'kept up'. But still, par for the course for Matrox in 3D is probably 4400 levels - NOT worth $400 for a card. But do NOT underestimate the suckiness of integrated Intel :|.

An old G550 / K6-2+ I had couldn't even play HL1, though, and this was YEARS after HL came out :(.

The G550 should have been able to play Half-Life 1. Heck, the K6-2+ should have been able to play Half-Life 1 by itself, albeit slowly in the 'dramatic' scenes. I had one, and that's how I perceived the slowdown whenever particle effects occurred. However, my Athlon ripped through the game and could even run it at fairly high res.

Oh, and Intel's integrated graphics have been at about GeForce 2 to GeForce 3 level for years; they're sucky, but they're not Parhelia sucky.