
More info on the Parhelia, and now some numbers

Originally posted by: Athlon4all
Amish, I dunno if this is the case with other ppl. but I don't doubt Matrox's ability to release stable drivers (or ATi's for that matter), but to be honest, what I doubt is Matrox's/ATi's ability to match nVidia's optimization level. There is no doubt that nVidia's drivers provide their cards with an additional speed boost that definately makes a differenece, and that is what I feel will end up being the decicive factor with Parhelia is once again, nVidia's drivers. We'll see. But, I definately wouldn't bet on Matrox's drivers being at nVidia's level. Where Parhelia wins in benchys vs the GF4, it'll because of superior technology.

Of course it will be. 🙂

Nvidia has been working on their drivers for the past what, 4 generations of cards (Nvidia generations anyway). There hasn't been much of a technology upgrade between generations, so I would expect a decent level of performance from them.

The Parhelia, being totally new from the ground up, will definitely take several driver revisions before it truly begins to shine.

amish
 
So, let me get this straight. You don't care about the card's performance in today's games, but in games that will not be available for another 6-12 months. Not sure if you have noticed, but the product cycle tends to be 6 months. Stop buying into all the hype.

Surround gaming:

Quake 3, 1024x768x32, Medium Quality: 50fps average

So you are going to buy a $400 card and a couple of $300 monitors so you can use this great feature (I couldn't care less, but hey ...), and then you are going to use 1024x768 to play a 3-year-old game? Not only that, but anybody who actually plays Q3 will know that 50fps is unacceptable due to major slowdowns when there is a decent rocket fight going on. Please, have some sense.

Performs poorly in Q3 but will perform great in UT2003 and Doom3

Can somebody please provide some proof of that? Has anybody actually played UT or Doom3 on a Parhelia? How about on an ATI R300? On an nVidia NV30, perhaps? How about all those cards that will be out by the time Doom3 is out? You cannot buy a card in order to play a game that will not be available for another year. Please have some sense.

Anisotropic Filtering and AA
Has anybody played a game using anisotropic filtering and 16x AA on a Parhelia? Can anybody provide anything more than a guess as to what the performance hit will be? If a GF3 is faster than a Parhelia at 1024x768x32 Medium Quality, what makes you think that 1024x768x32 Max Quality with 4x AA will be any different? The chances are that it won't be.

Don't try to convince yourself that paying $400 for a card that performs worse than a $100 card in TODAY'S games is the right thing, because it is not. It's packed with features that you will either never use, or by the time you are able to, another card will be faster! Don't claim that it will have great (or crap, for that matter) drivers, because you CAN'T KNOW!
 
MrGrim,
the product cycle for Matrox is not 6 months; also, Matrox has said they will release a 0.13 micron, fully DX9-compliant card in about a year.
UT2003: when Epic got the card they tried it with triple-head, and they said it worked great and would definitely include that feature in the game. So I think it performs well enough 🙂
 
Originally posted by: Czar
MrGrim,
the product cycle for Matrox is not 6 months; also, Matrox has said they will release a 0.13 micron, fully DX9-compliant card in about a year.

When did I say that Matrox's product cycle is 6 months?

UT2003: when Epic got the card they tried it with triple-head, and they said it worked great and would definitely include that feature in the game. So I think it performs well enough 🙂

Epic said that the triple-head worked great (meaning it didn't crash?) and that is proof that it performs well enough. Well, I hope I'm not the only one who can't make any sense out of that.
 
I was referring to all the video card manufacturers: nVidia, ATI, Matrox, etc. It's common for a new card to be released every 6 months; whether it's a completely new core or just an improved version doesn't really matter as long as the performance enhancement is there.

What's playable? 640x480x16bit @ 30fps? Because you can still play, but I doubt you'd enjoy it.
 
Originally posted by: MrGrim
I was referring to all the video card manufacturers: nVidia, ATI, Matrox, etc. It's common for a new card to be released every 6 months; whether it's a completely new core or just an improved version doesn't really matter as long as the performance enhancement is there.

What's playable? 640x480x16bit @ 30fps? Because you can still play, but I doubt you'd enjoy it.
Matrox doesn't go by the 6-month product cycle (more like 1 1/2 years or more); ATI goes by a 1-year product cycle.

I think they were running at 1024x768x3, though I'm not entirely sure.

 
Well, nVidia has a 6-month product cycle, and updated versions of cards with ATI chipsets have been released as well (Radeon 8500 128MB). There will be at least 2 more generations of graphics cards before Doom3 is released.

About the performance ... is this leading anywhere? If you think that paying all that money for the card and the monitors and having games be merely "playable" is OK, then I can't argue with that. It's your money.
 
Originally posted by: MrGrim
Well, nVidia has a 6-month product cycle, and updated versions of cards with ATI chipsets have been released as well (Radeon 8500 128MB). There will be at least 2 more generations of graphics cards before Doom3 is released.

About the performance ... is this leading anywhere? If you think that paying all that money for the card and the monitors and having games be merely "playable" is OK, then I can't argue with that. It's your money.
About one or two generations between now and Doom3; still, a lot can happen.
I haven't quite gotten the point of buying a graphics card every 6 months, or even every year. I own a G400, and only recently have I found it not to be good enough with current games. At school we had a TNT2 and it worked great in Medal of Honor and Jedi Knight 2. I just want my next card to last me as well as my G400 has done 🙂
 
Then perhaps the argument is that your current card is not good enough any more and that you need a new one right now. I can understand that.

I don't agree though with those that have a GF3 or a Radeon 8500 and want a Parhelia without even knowing its potential.
 
Originally posted by: Czar
Originally posted by: MrGrim
Well, nVidia has a 6-month product cycle, and updated versions of cards with ATI chipsets have been released as well (Radeon 8500 128MB). There will be at least 2 more generations of graphics cards before Doom3 is released.

About the performance ... is this leading anywhere? If you think that paying all that money for the card and the monitors and having games be merely "playable" is OK, then I can't argue with that. It's your money.

About one or two generations between now and Doom3; still, a lot can happen. I haven't quite gotten the point of buying a graphics card every 6 months, or even every year. I own a G400, and only recently have I found it not to be good enough with current games. At school we had a TNT2 and it worked great in Medal of Honor and Jedi Knight 2. I just want my next card to last me as well as my G400 has done 🙂
Frankly, if you find a TNT2 satisfying in MOH:AA, then your expectations can't be very high.
I play games (though not much these days, since most new games have sucked recently), and I do a lot of work as well.
Now, for the games I want good performance, and for the work I want good 2D.
My current GF3 (LeadTek) performs well enough for the games I play, but isn't up to the task in the 2D department.
The Parhelia will surely be more than up to the 2D task, but I'm beginning to doubt whether it will be up to the 3D performance I want.

Especially for $400. If it were more like $200, I'd be OK with it, since then it would be a budget high-end card; but $400 is the high-high end of gaming cards, and I'd expect it to perform like one, features or not.
 
Sunner,
Medal of Honor plays just fine, though only at 640x480 on the TNT2; but I finished the game at home at 1024x768 on my G400 and that was just fine also (around 30fps on average, I think).
 
I'd say MOH:AA was borderline playable on my GF DDR running at 800x600x32, and that was on an AXP at 1466 MHz; a GF DDR is quite a bit faster than a TNT2 or G400.

30 FPS is indeed having low expectations, especially if you would be satisfied getting it from a $400 card.
 
Anisotropic Filtering and AA
Has anybody played a game using anisotropic filtering and 16x AA on a Parhelia? Can anybody provide anything more than a guess as to what the performance hit will be? If a GF3 is faster than a Parhelia at 1024x768x32 Medium Quality, what makes you think that 1024x768x32 Max Quality with 4x AA will be any different? The chances are that it won't be.

Don't try to convince yourself that paying $400 for a card that performs worse than a $100 card in TODAY'S games is the right thing, because it is not. It's packed with features that you will either never use, or by the time you are able to, another card will be faster! Don't claim that it will have great (or crap, for that matter) drivers, because you CAN'T KNOW!

Have you even read AnandTech's review of the card?! Do you even know anything about the technology of the card that makes it so different?!

Quotes from the article:

Each one of these "quad texturing units" is flexible enough to allocate processing resources depending on the application at hand. For example, in a predominantly dual-textured game such as Quake III Arena, the Parhelia-512 can use the unused texturing resources to perform 8-tap anisotropic and trilinear filtering at virtually no performance hit; granted that this is more of a feature for today's games than tomorrow's.

Compared to a GeForce4, the Parhelia-512's pixel shading stage is superior in that it has five pixel shader stages in each rendering pipeline (compared to the GeForce4's two). This gives the Parhelia-512 the ability to multipass much less frequently than the competition as it is not only able to process 5 pixel shader operations in a single pass per pipeline but it can also process 10 pixel shader operations across two pixel pipelines in a single pass if necessary. And as you know, the fewer passes made the more bandwidth and resources are conserved.

Given this type of information, I don't understand how people can sit here and deny that the card has some huge potential. I hardly ever use AA because of the hit on fps with my Radeon 7500, but imagine if it didn't affect performance: huge advantage!
 
Originally posted by: Sunner
I'd say MOH:AA was borderline playable on my GF DDR running at 800x600x32, and that was on an AXP at 1466 MHz; a GF DDR is quite a bit faster than a TNT2 or G400.

30 FPS is indeed having low expectations, especially if you would be satisfied getting it from a $400 card.
I'm not expecting 30fps on the Parhelia; I'm just saying that with current games my old G400 is getting around 30fps on a 700MHz machine. An old card, still in use and still performing well enough.
 
I am not saying that Parhelia is not a card with potential. It will probably be the best thing around when it comes out. The question is: for how long? With R300 and NV30 (we don't really know anything about either of them) around the corner, can you be positive that a Parhelia will be the best buy? That's all I'm saying: don't buy the hype!

I know very little about video cards and I'm not afraid to admit it. I can only judge one by what it offers me in terms of performance and features, unfortunately I can't appreciate good architecture and design.

For example, in a predominantly dual-textured game such as Quake III Arena, the Parhelia-512 can use the unused texturing resources to perform 8-tap anisotropic and trilinear filtering at virtually no performance hit

First of all, this is quite a special case if I'm not mistaken, right? The game has to be "predominantly dual-textured", whatever that means. 🙂 And even in those cases, does it really matter? It takes no hit to apply 8-tap anisotropic filtering ... so?! I don't know about you, but I can't tell the difference between no AF and 8-tap AF. And even if somebody can see a difference between the two, the same question arises: does it really matter? If a GF3 can outperform a Parhelia in games such as Quake3, then the chances are that R300 and NV30 (as well as GF4) will be able to do so too, by a much bigger margin. Even if 8-tap AF results in a performance hit for R300 and NV30, the difference in performance will be such that the end result will be Parhelia being slower again.

Compared to a GeForce4, the Parhelia-512's pixel shading stage is superior in that it has five pixel shader stages in each rendering pipeline (compared to the GeForce4's two). This gives the Parhelia-512 the ability to multipass much less frequently than the competition as it is not only able to process 5 pixel shader operations in a single pass per pipeline but it can also process 10 pixel shader operations across two pixel pipelines in a single pass if necessary. And as you know, the fewer passes made the more bandwidth and resources are conserved.

Who cares about the GF4? If I'm not mistaken, your argument is that it doesn't matter that it is slower in today's games because it will be fast in future games. Well, excuse me, but I'm not going to buy a GF4 to play Doom3. I'm going to buy an R300, an NV30, the newer Parhelia model, or whatever is out at the time!

My point is that buying a card that doesn't perform well in today's games but will perform better in future games is not the wisest thing. A lot happens in 6-12 months; standards and specs change.
 
mrgrim ... when people get bitten by the hype-bug they'll grasp at any excuse to justify their purchases ... I have the feeling that if the situation were reversed, Matrox fanatics could come up with all sorts of reasons why FPS is more important than surround gaming and 8-tap anisotropic stuff.

 
I think I agree with what you are saying. Bottom line is that you should look at everything objectively. Don't worry about who makes the chip, worry about what it has to offer you. 🙂
 
The game has to be predominantly dual-textured, whatever that means. And even in those cases does it really matter?

All games utilize dual-texturing these days.

It takes no hit to apply 8tap anisotropic filtering ... so?!
Currently, nVidia takes a substantial performance hit with anisotropic filtering. ATI's degradation is less, but their implementation of anisotropic isn't as "complete" as nVidia's.

I don't know about you but I can't tell the difference between no AF and 8tap AF.

Perhaps you should try some Windex on your monitor screen? 🙂

And even if somebody can see a difference between the two, the same question arises: does it really matter? If a GF3 can outperform a Parhelia in games such as Quake3 then the chances are that R300 and NV30 (as well as GF4) will be able to do so too by a much bigger margin.

I think the point is ... no one knows how it will or will not perform. That's what the "buzz" is about. Everyone wants to know how well this card will hold up when AF and AA are turned on together. If they can really use the extra pipelines to get "free" anisotropic filtering, the card could become the 3D quality leader.

Even if 8tap AF results in a performance hit for R300 and NV30 the difference in performance will be such that the end result will be Parhelia being slower again.

This is speculation and doesn't amount to a hill of beans either way until the cards ship.

Don't worry about who makes the chip, worry about what it has to offer you.

Exactly, and right now nobody knows what any of these cards will offer (other than the triple-head thing).
 
I worry about who makes the chip because Matrox has always treated me right and their quality is unmatched in my experience.

As long as the games/apps I use are playable, that's all I care about. That's why I'm still using my G400. I tried Quake 3 a couple of times, but I don't like it and don't play it. So Quake numbers, no matter how beta the hardware or drivers, don't mean a whole lot to me.

amish
 
merlocka, I agree 100% with your post ... so many people in here were knocking the Parhelia because its fps on the first known benchmark didn't beat out that of a GF4, on an old game ... they aren't giving the thing a chance to shine ... some people just shoot new technology down immediately ... it's almost like there are now nVidia/ATI/Matrox zealots in the same way there are AMD/Intel ones! ... I don't see how anybody can deny that the Parhelia has a lot of potential to be a kick-ass card.
 
merlocka I agree that we don't know how any of the new cards are going to perform. All I'm saying is don't buy the hype!

I explained my points over and over again and I'm starting to think it's a waste of time because nobody takes the time to read them thoroughly. Here is an example:

If they can really use the extra pipelines to get "free" anisotropic filtering the card could become the 3D quality leader.

The only thing free that you may get in SOME games is 8-tap AF. 8-tap AF is really not that big of a deal! I'm not saying it's not worth it; I'm saying that it is not as FANTASTIC as it sounds. It's NOT free AF, and it's not going to be free ANYTHING in future games.

I'm not saying that you shouldn't buy it, just don't get excited over nothing, don't get caught up in all the hype. I'm just trying to show you that nothing that has been said is worth the remark "I'm going to get it no matter how much it costs!!!!!".

It may turn out to be the greatest card of all time but in order to decide that you will have to see it in action and evaluate what it is going to offer you.
 
It's obvious the Parhelia won't be trampling anything anytime soon. nVidia is working on anisotropic filtering with their new driver set, according to this thread (use the link in grant2's post if you are not a subscriber). It was said that nVidia put performance under AA as a top priority in their driver sets; now they are working on other things. We won't know what the Parhelia's final performance will be for some time, but it's safe to say that this card is in no way anything monumental to the graphics industry. 3-monitor support will most likely go unused by most, and be used by an insanely small number of gamers. I'm sure ATI's and nVidia's current sales tactics will outdo Matrox's tenfold in their upcoming product releases, among the public and the enthusiasts alike. This card is purely for the enthusiast, which is most likely why Anand said only some would benefit from it.

edt: Thanks grant2
 