***OFFICIAL*** Parhelia Countdown thread, constant updates, Anand's article is up!


DaFinn

Diamond Member
Jan 24, 2002
4,725
0
0
BENCHMARKS... AAAAHHH... WE NEED BENCHMARKS... CAN'T WAIT...
WHEN, OH WHEN DO WE GET BENCHMARKS? ALPHA BOARD OR NOT, FIRE IT UP!
 

PG

Diamond Member
Oct 25, 1999
3,426
44
91
Originally posted by: DaFinn
BENCHMARKS...AAAAHHH...WE NEED BENCHMARKS...CANT WAIT...
WHEN..OH WHEN DO WE GET BENCHMARKS. ALPHA BOARD OR NOT, FIRE IT UP!
From Anand:

- In "simple" games like Quake III Arena, the Parhelia-512 will definitely lose out to the GeForce4 Ti 4600. By simple we mean games that generally use no more than two textures and are currently bound by fill rate. NVIDIA's drivers are highly optimized (much more so than Matrox's) and in situations where the majority of the Parhelia's execution power is going unused, it will lose out to the Ti 4600. This can change by turning on anisotropic filtering and antialiasing however, where the balance will begin to tilt in favor of the Parhelia.

- In stressful DX8 games, Matrox expects the Parhelia-512 to take the gold - either performing on par or outperforming the GeForce4 Ti 4600. Once again, as soon as you enable better texture filtering algorithms and antialiasing the Parhelia-512 should begin to seriously separate itself from the Ti 4600. The quad-texturing capabilities of the core as well as the 5-stage pixel shaders will be very handy in games coming out over the next several months.

- The Parhelia-512 has the ability to take the short-term performance crown away from NVIDIA.




Sorry, but I don't think that's good enough. The NV30 is just around the corner, and Matrox can't keep up with NVIDIA's six-month product cycle, so they will soon fall behind again. :(


 

mastabog

Member
May 1, 2002
48
0
0
Quite interesting ...
Reading every page of Anand's Parhelia review except the last makes you want to rush out and buy the board (like any other review that only presents board features). Reading the last page, though, makes you want to wait until the NV30 comes out and outperforms the Parhelia, so you'll be glad you haven't spent your money on it.

I for one am not a gamer, and I've been using Matrox cards since 1994, when I first bought a mysterious VGA board named MGA (dunno if anyone here remembers how much everyone dreamed of having a VGA card with VRAM back then). Then I continued with the Millennium and Mystique products, and now I own a G450.

I've always looked at signal quality because I use a high-quality monitor (if you don't have a monitor that can resolve the kind of signal Matrox cards put out, then ignore the signal-quality feature if you plan on going with the Parhelia).

Benchmarking the board now with current games might not be too relevant, as the Parhelia has features not yet used by present games.

Anyway, I will surely buy the Parhelia, regardless of how the new NV30 performs. Matrox has always made the highest-quality video boards, no matter what product they released, and being a Matrox fan I'll surely stick with them in the future as well.

Cheers,
BoG
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
that news blurb yesterday mentioned that NV30 and R300 won't be around until September at the earliest. Without occlusion culling I wonder how competitive the Parhelia will be, and whether there will be a Parhelia 2 a year from now with full programmability and culling?
 

GrumpyMan

Diamond Member
May 14, 2001
5,780
266
136
Yes, the reason I bought high-quality monitors is for a high-quality signal. Gaming is fun, but 2D is the most important aspect of my next card, which will be a Matrox even though I'm sure it will have its drawbacks in gaming. Since I already run dual monitors, three will be even better. As long as I get GeForce4 performance with outstanding 2D, I'm sold.
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
Someone at the Rage3D forums posted this; I'm not sure if it's true or not, though.

Originally posted @ www.hexus.net: "We have also learnt that they are playing down the speed - It can run Quake 3 in Triplehead 1280 * 1024 at 110fps. This is serious power - roughly 3 times that of a Geforce 3."
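The quoted Triplehead figure is easy to sanity-check with some quick arithmetic. The sketch below only counts displayed pixels; the "3 times a GeForce 3" reading at the end is an assumption about what the quote means, not a measured result.

```python
# Triplehead at 1280x1024 means three screens side by side: 3840x1024.
w, h, screens, fps = 1280, 1024, 3, 110

pixels_per_frame = w * h * screens       # pixels drawn per frame
pixels_per_sec = pixels_per_frame * fps  # pixels put on screen per second

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{pixels_per_sec / 1e6:.1f} Mpixels/s displayed")

# The "roughly 3 times a GeForce 3" line follows if you assume a GF3
# manages ~110fps on a single 1280x1024 screen: the same frame rate at
# triple the pixel count is triple the rendering throughput.
```

That works out to about 3.9 million pixels per frame, or roughly 432 Mpixels/s on screen, before counting any overdraw.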
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
I like it :). Let's see now...

Performance: As was said, it might lose in "simple" games. But I think "losing" is the wrong word here. One card gets 200FPS, the other gets 190FPS. Both are plenty fast.

But as the games get more complex, the Parhelia should shine. Its texturing capabilities kill the GF4, and it's a monster when it comes to memory bandwidth. Also, texture filtering should have very little performance impact, and the same can be said for FSAA.

Features: In short, this thing has everything you could ask for! It may not have DX9, but that isn't coming for a while yet.

Image quality: Seems to be the best there is.

All in all, it looks good :). Some might whine that "the GF4 gets 250FPS and this gets only 240FPS in Quake 2!", but that really is irrelevant. The Parhelia has more than enough performance in simple games :). In more complex games it should really shine.

I wonder what kind of upgrades we can expect in the .13-micron version? They might (at least I hope they do) add Z-compression and occlusion culling. As Anand stated, those seem to be the two weak points of the Parhelia.
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: Dean
Someone at the rage3d forums posted this, i'm not sure if its true or not though.

Originaly posted @ www.hexus.net "We have also learnt that they are playing down the speed - It can run Quake 3 in Triplehead 1280 * 1024 at 110fps. This is serious power - roughly 3 times that of a Geforce 3."

pretty darn nice, but I'm thinking that Quake 3 is becoming an obsolete benchmark. Look at the GeForce4 Ti 4200 Q3 benchmarks: cards are reaching over 200fps, and that's just pointless
 

Eug

Lifer
Mar 11, 2000
24,165
1,809
126
The 2D testing looks interesting. I've always thought Matrox 2D was amongst the best, and the Radeons were close. Dunno about the GeForce4s, but some of the GeForce2s are terrible for 2D, at least at high resolution.

Matrox is suggesting that even in ideal situations the Radeon 8500 can never be as good as the Parhelia-512, and that the GeForce4 might be better than the Radeon.

We all know it's not just the raw design of the cards, though, as the 3rd-party manufacturers making GeForce cards can really screw up 2D quality even if the card design is capable of better. If Matrox's graphs are to be believed (and that's a big if), it would seem that the Radeon's pretty good 2D is due not so much to the design as to the quality control put into the final shipping card. GeForces might have an OK design for 2D, but the lack of QC by 3rd-party manufacturers can really hurt them.

The other issue, though, is that most people don't have monitors capable of displaying good 2D at 1600x1200, so all of this may be moot. I would suggest that almost any consumer-level aperture-grille monitor is not up to an accurate assessment of 2D. Even with a Matrox card, high-resolution 2D looks like crap on a mid-range aperture-grille monitor (e.g. Samsung 900NF) compared to a mid-range shadow-mask monitor (e.g. Samsung 950p, which is usually much cheaper, by the way).
 

LostHiWay

Golden Member
Apr 22, 2001
1,544
0
76
Originally posted by: Nemesis77
I like it :). Let's see now...

Performance: Like it was said, it might lose in "simple" games. But I think "losing" is wrong word here. One card gets 200FPS, other gets 190FPS. Both are plenty of fast.


But think about who would be buying this card when it comes out... the people who care about those 10 extra FPS. I really don't see the point in Matrox putting so much money & R&D into this card. It will most likely be dethroned in 2 or 3 months when the NV30 comes out.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
I still don't understand why the Parhelia doesn't have any bandwidth-saving features. NVIDIA and ATi have been doing it for the past year and a half, so there's really no excuse IMHO (seeing the performance increase with it on vs. off is what makes me scratch my head at Matrox). I mean, it's nice and all that the Parhelia has 20GB/sec of bandwidth, but don't let it go to waste :eek:
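To put that point in numbers, here's a rough, illustrative estimate of how quickly raw color, Z, and texture traffic can eat into 20GB/sec without occlusion culling or Z-compression. The overdraw, frame rate, and texture counts below are assumptions chosen for the sketch, not Parhelia measurements.

```python
# Hypothetical scene: 1600x1200, 100fps, average depth complexity of 3,
# 4 texture fetches per shaded pixel, no compression or culling.
w, h, fps = 1600, 1200, 100
overdraw = 3            # assumed average overdraw
color_bytes = 4         # 32-bit color write per shaded pixel
z_bytes = 4 + 4         # 32-bit Z read + Z write per depth test
tex_bytes = 4 * 4       # 4 textures x 32-bit texel (ignores texture cache)

bytes_per_frame = w * h * overdraw * (color_bytes + z_bytes + tex_bytes)
gb_per_sec = bytes_per_frame * fps / 1e9

print(f"~{gb_per_sec:.1f} GB/s of raw memory traffic")

# Occlusion culling would skip the color and texture work for hidden
# pixels, and Z-compression would shrink the Z read/write traffic.
```

Under these made-up but plausible numbers, raw traffic alone is already in the neighborhood of the card's total bandwidth, which is exactly why the competition bothers with those features.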
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: NFS4
I still don't understand why the Parhelia doesn't have any bandwidth saving features. I mean, NVIDIA and ATi have been doing it for the past year and a half so it's really no excuse IMHO (seeing the performance increase with it on vs off is what makes me scratch my head at Matrox). I mean, it's nice and all that the Parhelia has 20GB/sec of bandwidth, but don't let it go to waste:eek:

probably because of the die size. They had to sacrifice some stuff, like the... occlusion-culling thing... to make room for all the other features, and since they have so much bandwidth it's not that big a deal at the moment. My guess is that they will make another version of this card in about a year, done on 0.13 micron and fully DX9 compatible.


btw, Tom's article is up :p
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: LostHiWay
But think about who would be buying this card when it comes out....the people who care about those 10 extras FPS. I really don't see the point in Matrox putting so much money & RD into this card. It will most likely be dethroned in 2 or 3 months when the NV30 comes out.

Well, no. The people who care about those 10FPS are usually the n00bs who run out and buy the greatest NVIDIA card every 6 months. It seems that people who run Matrox are a bit more... professional ;). They care about other things besides those 10FPS. And this card costs so much that those kids will have a hard time convincing their parents that they need it.

And like I said, all of that applies only to older games (like Q3). With newer games (like Doom 3 and Unreal Tournament 2003) things should look mighty different.

As for the Parhelia being dethroned by the NV30... That we don't know yet, since we don't really know anything about the NV30. All we have is the regular hype from NVIDIA. It might offer better raw performance, but will it beat the Parhelia in other departments, like image quality and features?
 

jm0ris0n

Golden Member
Sep 15, 2000
1,407
0
76
Nice article, Anand. You had a bit more insight than Tom, as you did go to San Francisco and see the silicon in the flesh :)

I can hardly contain myself when thinking of triple-monitor Quake 3! It'd be like playing Sega's F355 at the arcade, except it would be Quake 3 / Unreal Tournament :D I hope Matrox brings the goods to QuakeCon this year. Whew, that'd be sweet!

2002 is gonna be known as "The Year of the Gamer!" :)
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: jm0ris0n
Nice article Anand. You had a bit more insight than Tom as you did go to sanfrancisco and see the silicon in flesh :)
Matrox has been holding press conferences all over the world for the last few weeks; most reviewers have seen the card in action.


<-- still reading through Tom's article; Anand's was great :)

 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Nemesis77
Originally posted by: LostHiWay
But think about who would be buying this card when it comes out....the people who care about those 10 extras FPS. I really don't see the point in Matrox putting so much money & RD into this card. It will most likely be dethroned in 2 or 3 months when the NV30 comes out.

Well, no. People who care about those 10FPS are usually the n00bs, who runout and buy the greates NVIDIA-card every 6 months. It seems that people who run Matrox are a bit more... professional ;). They care about other things besides those 10FPS. And this card costs so much that those kids have hard time assuring their parents that they need this card.

And like I said, all that applies only to older games (like Q3). With newer games (like Doom 3 and Unreal Tournament 2003) things should look mighty different.

As for Parhelia being dethroned by NV30... That we don't know yet, since we don't really know anything about NV30. All we have is the regular hype from NVIDIA. It might offer better raw performance, but will it beat Parhelia in other departments? Like image quality and features?

So you're calling people who buy the latest technology n00bs? Come on, be fair here. If people can afford to buy the latest and fastest, let them do it. Don't look down on them because of it. If NVIDIA only catered to kiddie gamers, they wouldn't be as large as they are today. They get money from all sectors of the marketplace, so don't give me this "mommy, buy me an NVIDIA card" routine.

While it's true that Matrox caters more to the professional crowd, their products have been stagnant for the past three years, and thankfully the Parhelia will change that dramatically.

Lastly, 2D image quality has been suspect with NVIDIA for some time now (not on all cards, just on boards where manufacturers choose to use crappy filters), but how have they been down and out feature-wise? They've always touted the latest and greatest, so I don't see where you're heading with that one. nView is a great dual-output feature and solid competition for DualHead.

As for the NV30 dethroning the Parhelia, I'll just sit back and enjoy the ride :)

Anyway, all everyone wants to talk about is Matrox vs. NVIDIA. PLEASE don't leave ATi out. If anyone can outdo Matrox in the feature department, it's ATi. And don't forget 3DLabs with the P10.



Anyway, I've been watching the Parhelia ponderings for the past few months, and it's really quite funny: Matrox hasn't had a competitive card since the G400 MAX, yet when the Parhelia specs come out sans benchmarks, the Matrox fans claim they'll lay waste to EVERYONE at the drop of a hat. It just doesn't work that way. Sit back and wait for the benchmarks before making brash assumptions. You waited 3 years for a worthy successor; you can wait another month and a half to see if it really destroys the competition. And by that time, you'll be able to see how it will stack up against the NV30 and R300. If those both support 256-bit memory, higher clock speeds, faster memory, and .13-micron technology, it won't be much of a show to watch. Not to mention the usual feature-count increases, improvements to their dual-head capabilities (possibly triple-head), and NVIDIA's claim that they will put a stop to the inferior 2D image quality of their cards.
 

microAmp

Diamond Member
Jul 5, 2000
5,988
110
106
Originally posted by: BFG10K
I'll be very disappointed indeed if I don't see actual benchmarks along with the technical information.

And disappointed I am.
 

ToBeMe

Diamond Member
Jun 21, 2000
5,711
0
0
I gotta be honest............ I've owned almost every type and brand of card out there, and have Matrox, ATi, and NVIDIA in systems as we speak.............. this card is impressive in a pure hardware sense, but I'm not overly impressed otherwise........ it does have shortcomings in some areas mechanically, and if you want the high-end version it costs even more than what NVIDIA asks for theirs! I'll wait for some benchmarks, but as of right now I don't know if this is a card that really excites me, unless it shows some really impressive qualities and benchmarks beyond even the capability of the NV30 & R300............
 

vash

Platinum Member
Feb 13, 2001
2,510
0
0
This Matrox review is a bit disappointing -- and that goes for all the websites.

I WAS hoping that after the NDA was lifted we'd see some actual benches and some serious screenshots, with comparisons of image quality and AA. What did we all get instead? Basically the same bits of leaked information everyone already had. I'm thinking the drivers are still VERY immature for this board, and they'll ship the card with better drivers that still aren't fully optimized.

On a second note, I wasn't happy to hear how slow the third RAMDAC would be for "surround" gaming. Only a 230MHz RAMDAC for the third screen will definitely hold back the gaming experience. I knew the third output would be compromised, but I was hoping it wouldn't be this bad.

Matrox has plenty of potential in this card, so let's hope they can shrink the die to .13 micron, bump up the speeds on all the RAMDACs, and get the drivers out in a timely manner. This card may hold the performance crown over the Ti 4600 for a few months, but when the next NVIDIA card comes around, Matrox will be playing catch-up to both ATi and NVIDIA.

For all the hardcore Matrox fans: go reserve your card now. For everyone else who wants the most gaming performance: not much to see here, move along.

vash
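The RAMDAC number translates directly into a refresh-rate ceiling: the DAC's pixel clock has to cover every pixel plus the blanking intervals. A small sketch of that arithmetic, assuming a rough GTF-style ~32% blanking overhead (the exact overhead depends on the timing standard) and taking 400MHz as the primary-head figure from the spec sheets:

```python
def max_refresh_hz(ramdac_mhz, width, height, blanking_overhead=1.32):
    """Approximate refresh-rate ceiling for a given RAMDAC pixel clock."""
    return ramdac_mhz * 1e6 / (width * height * blanking_overhead)

# Primary head vs the reported 230MHz third-head RAMDAC, at 1600x1200:
for mhz in (400, 230):
    hz = max_refresh_hz(mhz, 1600, 1200)
    print(f"{mhz} MHz RAMDAC -> ~{hz:.0f} Hz max at 1600x1200")
```

By this estimate the third head tops out around 90Hz at 1600x1200 while the primary heads have plenty of headroom, which is the gap vash is complaining about.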
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: NFS4
So you're calling people who buy the latest technology n00bs?

Don't put words into my mouth ;). I called n00bs those people who buy the latest and greatest every 6 months (even when they don't need it) and then cry when they get only 255FPS @ 640x480 while someone else gets 260FPS. Those people ARE n00bs! People who know better know that it really doesn't matter whether you get close to 300FPS; what matters is the minimum FPS.

Lastly, 2D image quality has been suspect with NVIDIA for some time now (not on all cards, but just on boards where manufacturers choose to use crappy filters), but how have they been down and out feature-wise? They've always touted the latest and greatest so I don't see where you're heading with that one. nView is a great dual output feature and it is great competition for dual head.

True, NVIDIA has had good features for years now. What I meant by features are things like hardware-accelerated font AA (the Glyph AA in the Parhelia), surround gaming, the GigaColor tech, TripleHead... Those are all features that are not currently in any NVIDIA product. Will the NV30 change that? We shall see, I guess :)

As for NV30 dethroning Parhelia, I'll just sit back and enjoy the ride:)

Same here. I won't be upgrading for a while, so I'll just wait 'n' see what each company has to offer :)

Anyway, all everyone wants to talk about is Matrox vs NVIDIA. PLEASE don't leave ATi out. If anyone can outdo Matrox in the feature department, it's ATi. And don't forget 3DLabs with the P10.

I am mighty impressed with the P10 :). The only thing holding back my enthusiasm right now is the lack of specs for the Creative-branded consumer version.

As for ATi... Their hardware has always been good; it's the drivers that have been lacking. And if they can't improve their Linux support, they will not be getting my money.

Sit back and wait for the benchmarks before making brash assumptions. You wait 3 years for a worthy successor, you can wait another month and a half to see if it really destroys the competition. And by that time, you'll be able to see how it will stack up against NV30 and R300.

That's EXACTLY what I'm planning to do :) (although I didn't wait 3 years). I'm in no hurry to upgrade just yet; I'll just wait and see what the other companies have to offer.

As to the performance of the Parhelia... The specs at least point to stellar performance. But we'll know for sure once the card gets out
 

Eug

Lifer
Mar 11, 2000
24,165
1,809
126
BTW:

Message to Matrox:

I won't buy any card over CAD$400 (which is about US$250). Hopefully you'll come out with a "lower-end" card soon enough. I use an inexpensive Matrox dual-head card at work on one of my machines, but I haven't done so at home because the 3D sucked so badly. Give me something with great 2D and adequate 3D at a more reasonable price, and I will buy a dual-head setup for home too.
 

mastabog

Member
May 1, 2002
48
0
0
In response to NFS4:

I personally think Nemesis77 was right about the 'n00bs'. Maybe it's just the word he used that bothered you.
Do you really think that if it weren't for the 'n00bs' who bought nVidia cards every 6 months, nVidia would have such a market??

Get serious: if it weren't for gamers, graphics cards would suck like they did 5 years ago, and nVidia would not be what it is now.

Oh, and also... ATi the only one who could dethrone Matrox??????? ... man oh man.
Think about who came out first with 3D hardware ...

Cheers,
BoG