Parhelia-512 vs. Radeon 9700...


EddNog

Senior member
Oct 25, 1999
227
0
0
I just remembered something very important: Fragment AA. Parhelia's 16X Fragment AA not only provides superior AA results to standard FSAA, but it avoids blurring unnecessarily, which is the main complaint most people have with FSAA; fragment AA only antialiases polygon edges, not on-screen text etc. How do the R9700's AA capabilities compare? Also, which card has better aniso capabilities?
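Roughly what I mean, as a toy sketch of the idea (my own simplification with made-up sample values, not Matrox's actual hardware):

def resolve_fsaa(pixels):
    # Full-scene AA: every pixel becomes the average of its sub-samples,
    # so single-pixel detail like text strokes gets softened too.
    return [sum(samples) / len(samples) for samples in pixels]

def resolve_fragment_aa(pixels, is_edge):
    # Fragment/edge AA: only pixels flagged as covering a polygon edge are
    # blended; interior pixels (textures, 2D text) keep one crisp sample.
    return [sum(s) / len(s) if edge else s[0]
            for s, edge in zip(pixels, is_edge)]

pixels  = [[1.0, 0.0, 1.0, 0.0],   # thin text stroke, not a polygon edge
           [1.0, 1.0, 0.0, 0.0]]   # pixel straddling a polygon edge
is_edge = [False, True]
print(resolve_fsaa(pixels))                  # [0.5, 0.5] - text softened too
print(resolve_fragment_aa(pixels, is_edge))  # [1.0, 0.5] - text left alone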

-Ed
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
As somewhat of a connoisseur when it comes to ATI cards, I'd like to give you some advice. I've owned ATI cards as far back as the Rage 2 chipset, and not once have I had any problems with the 2D output. I've been doing a lot of reading on the Radeon 9700, and basically it boils down to this:

- both the R9700 and the Parhelia have 10-bit colour for 2D graphics
- the Matrox card does some awesome anti-aliasing on text, which the ATI card does not do
- the R9700 will revolutionize 3D graphics as we know it, with performance that is 3x faster than a GF4 with all features turned on (yes...that's 300%)

I'm somewhat of a graphics professional, and as such 2D means a lot to me. I consider myself very demanding and very picky, and even on my Radeon 8500 card I have no complaints at all about the 2D output. You should really wait for the prices to drop on these cards though...$400 is outrageous. In 6 months you will be able to get something with that type of performance for $200. If I were you I would invest in a cheap older Radeon card or the G400 card you mentioned. Good luck.

--dave.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) The Parhelia's Fragment AA IS an amazing technological leap, but it has quite a few downsides. It can't always be applied, misses some jaggies and results in erratic frame rates. Where FxAA can't be used the 4xAA mode is laughable; you are talking GF4MX perf, not what you expect from a $400, or even $150, gfx card. When it comes to Aniso, nVidia is king. The quality is superb and the perf hit, while larger than the Rad8500's (roughly double), is still not huge. The Rad8500 has very fast Aniso but the quality is inferior. As the Rad9700 hasn't been finalised yet it is difficult to comment, but I would expect it to land somewhere in between the current ATI and nVidia approaches.

:D Well, all of the links relating to the Rad9700 that I found either didn't mention 10bit RGB (what Matrox calls Gigacolor) or said that the Rad9700 can do it; not one link said it couldn't. As I said, the Rad9700 hasn't truly been finalised yet, mostly clock speeds and a few driver tweaks, so we will know NOTHING for certain until late August if not September. It would seem that DX9 calls for 128bit colour and floating point precision. That would be far superior to 10bit RGB, and would work for games too, if you don't mind perf being quartered! If you like 2D apps which can benefit from 10bit RGB (not very many at all) then it may be worth having, but paying $400 for only GF3-level 3D perf is quite a price to pay. I would suggest you wait for solid Rad9700 info, price drops and actual releases, if not the Rad9500. In the meantime try a Rad8500 (built by ATI) or GF4TI4200 (pref Leadtek as '2D' IQ varies); you may find these cards suit your needs and are a world apart from the GF3 you have, and you may decide saving $250-300 is worth it if you only lose 10bit RGB. In 6-12 months prices, availability and loads more info should be out, and that would be a much better time to buy a Parhelia, Rad9500/9700 or the new nVidia card.
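To put the 'quartered' remark in very rough numbers (purely back-of-the-envelope; I'm assuming 10bit RGB means a 10:10:10:2 layout at 32 bits per pixel, and that 128bit colour means four 32-bit floats, i.e. 16 bytes per pixel):

def framebuffer_mb(width, height, bytes_per_pixel):
    # Size of a single colour buffer at the given format.
    return width * height * bytes_per_pixel / (1024 ** 2)

w, h = 1600, 1200
print(framebuffer_mb(w, h, 4))    # ~7.3 MB per buffer at 10:10:10:2
print(framebuffer_mb(w, h, 16))   # ~29.3 MB per buffer at 128bit FP colour

Four times the storage and four times the memory traffic per colour buffer, so a rough quartering of perf in bandwidth-limited cases is about the right ballpark.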
 

TheGoose

Junior Member
Jul 23, 2002
15
0
0
From what I have seen in the white papers for both products, they both support 10bit RGB, but what Matrox has that ATI doesn't is an Adobe plugin that allows Photoshop and After Effects to use the 10 bit RGB mode. As far as I can tell this plugin ONLY supports Matrox's hardware. I wouldn't be surprised if Adobe denies ATI the software rights, seeing how Adobe is fully backing Matrox in all of their special hardware that's designed for Adobe products...such as the RT series cards and some of their high-end imaging solutions.

I am in a similar boat right now because I was all ready to buy a Parhelia until the Radeon 9700 was announced. :confused:...But now I find myself rethinking...I need a powerhouse jack-of-all-trades card to support my needs, because along with being an avid PC gamer I also do a lot of video editing through Premiere as well as 3D animation using Lightwave and After Effects. Some of the things that I need are rock-solid stability, GOOD multi-monitor support, fast OpenGL rendering, crystal-clear 2D performance, and support that is geared towards both pro and consumer level applications.

All of these needs should point to the Parhelia; however, the gaming performance is only comparable to the competition when you run at high resolutions with FAA-16x turned on. Only then does it beat a GF4 Ti4600.

Personally I think I am going to sit this one out for a couple of months or so and see where things go. I really can't wait to see the 256MB Parhelia; I bet that thing will be a rendering beast with all of that memory bandwidth.

Scott
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
TheGoose, I'm afraid the Parhelia barely beats the GF3 and Rad8500 in most benchmarks, even with full details. Even with its FxAA on it still struggles, and as already said, FxAA is technically brilliant but it does have its shortfalls. In short, if you want the top '2D' IQ then expect to pay $400 and make do with GF3 perf. If you can wait, the cheaper Parhelia should be out, as well as the Rad9500. Then the new nVidia cards should be out and prices should become very nice for the consumer.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
Bingo13, I don't see why that is relevant.

Since you ask, I personally own an Inno3D GF4TI4200 128MB using 4.0ns memory. But what relevance does that have?
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
Although I for one welcome a review based more upon '2D' IQ than just perf, it is clear that the review and reviewer are biased, much in the same way many reviewers lean towards ATI or nVidia; it is very easy to be selective with what you show and how you show it. That's not to say the Parhelia doesn't have the best '2D' IQ, I'm sure it does, but I would expect the diffs to be imperceptible between the Rad8500 and GF4 cards. As I said, even assuming the Parhelia has the best '2D' IQ and that 10bit RGB is significant and will be used by many apps, the price tag is still very hard to swallow even for a user able to take advantage of the 3 monitor support, esp since the Rad9700 uses FP precision throughout, 128bit colour (FAR better than 10bit RGB) and has more than double the 3D perf, esp with high res, AA & Aniso.
 

TheGoose

Junior Member
Jul 23, 2002
15
0
0
AnAndAustin, I think that you should check out the Parhelia review at http://www.sharkyextreme.com/hardware/videocards/article.php/3211_1376851__7. Here you will see, as with 3DMark 2001SE, Quake 3 at 1024x768 shows the GeForce4 Ti 4600 keeping up with the various Matrox options. Once we hit 1280x1024, the tables start turning and the Parhelia's FAA-16x offers the highest framerates. The same anti-aliasing performance trends happen with Serious Sam 2, further solidifying the inherent advantage the Matrox edge anti-aliasing technology has as the resolutions are increased...Not to mention that I am not really concerned with price. If I am waiting for anything it's the 256MB higher-clocked Parhelia, which will cost quite a bit more...As of yet I have not heard mention of ATI planning to release a 256MB R300-based card.

Scott
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D LOL, to seriously say the Parhelia can compete with the GF4 is a laugh. I could pick some benchmarks for you showing the 4200 beating the 4600 when both are at defaults, etc.; that certainly wouldn't be representative of the whole picture. Check this out.

http://firingsquad.gamers.com/hardware/parhelia/
In 3Dmark2001, P512 is behind the 4600 (only other card) in ALL parts of benchmark inc 1600x1200x32, 4600 50% faster.
In Q3A noAA, the 4600 miles ahead, and gap widens (not narrows) as res increases.
In Q3A AA, 4600 uses worst perf 4xAA and still beats P512 FxAA until 1280 where about even, and 1600 both unplayable.
In JK2 noAA & AA, 4600 still whips the P512 and still has a narrow lead all the way to 1600x1200x32xAA although unplayable.

http://www.anandtech.com/video/showdoc.html?i=1647
UT2003 1600x1200-HI DETAIL, P512 slower than R8500 & 0.5FPS faster than GF3TI500, 4200 is 25% faster!
UT2003 1600x1200-MID DETAIL, P512 EVEN slower, now 20% slower than GF3TI500!
In the more demanding Asbestos benchmark things are even worse! The GF3TI500 is 20% faster at 16x12-HI, & at MID the P512 is slower than a GF2ultra!

http://www.tech-report.com/reviews/2002q2/parhelia/index.x?pg=1
CodeCreatures, both R8500 & 4600 squirt pee on the P512 in all res up to 1600x1200x32.
SerSam2, both R8500 & 4600 beat it again right up to 1600x1200x32.
3Dmark2001, 8500 & 4600 win again.
AA, P512 takes huge perf hit when switching from noAA to 4xAA esp compared to even the 8500, FxAA does cost little perf tho.
Aniso, P512 can't do anything other than 2xAni but should be able to do 8xAni with updated drivers, what perf hit tho?

http://www17.tomshardware.com/graphic/02q2/020625/index.html
Giants, again all the way up to 1600x1200x32 the 8500 & 4600 easily laugh off the P512.
MaxPayne, the P512 can now equal the 8500, but 4600 is more than 60% faster than both in all res up to 1600x1200x32.
Aquanox, same as MaxPayne, can equal the 8500 but 4600 streets ahead.
Comanche4, at last the P512 beats the 8500 in all res, albeit marginally at 1600x1200x32, at which the 4600 has 200% of the perf!
DungeonSiege, Again P512 just beats the 8500 but never nears the 4600.
JK2, P512 is miles behind both 8500 and 4600 in ALL res up to 1600x1200x32. Both more than 60% faster at 1600x1200x32!
Q3A, P512 nowhere near either card AGAIN! At 1600x1200x32 the 4600 is 200% of the P512's perf AGAIN!
3Dmark2001, Once again both cards are flying ahead of P512. Again 1600x1200x32 shows the 4600 more than 60% faster.
Q3A Aniso, not only does 4600 look better but is much faster and takes less of a hit in ALL res.
MaxPayne, Aniso likes P512 most but 4600 still a LOT faster.
AA, FxAA doesn't work in all games or conditions so P512 has to use 4xAA, which it does very slowly, big perf hit.
MaxPayne, AA, 4600 QxAA easily beats the P512 FxAA and even the 4600 4xAA is about even with P512 FxAA, P512 4xAA VERY slow.
Q3A AA, Same story as MaxPayne but now just about playable at 1280x1024x32, 4600 QxAA double P512 FxAA.
Q3A AA&Aniso, 4600 QxAA (not need for 4xAA when using Aniso) actually faster than P512 without anything enabled!
MaxPayne AA&Aniso, pretty much the same as Q3A.

;) Please excuse my shorthand but I believe most people will still understand it. You may think that the 4600 is an unfair comparison, but remember that is where the P512's price point places it. The Radeon8500 is about a third the cost and beats the P512 almost every time too!
 

TheGoose

Junior Member
Jul 23, 2002
15
0
0
I think that you're missing my point
http://www.athlonxp.com/modules.php...owFile&file_wrap=html/reviews_parhelia_6.html

In more than one benchmark I have shown that the P512 is fast only under certain circumstances. I think you think that I am saying it's an all-around better card...which is not the case from the view of a framerate-drunken gamer. If you take a look at the card from the view of someone who is obsessed with image quality (both 2D and 3D), the P512 is the best thing going (currently).

Also I have noticed that some of the hardware sites that got the P512 first had much slower marks. I think that this has something to do with the drivers changing 4 times in the last 3 weeks.

To address the comment about developers not supporting Matrox in the future, let's not forget that Matrox just got a voting seat on the OpenGL board.

I am not just blindly supporting Matrox; as an RT2000 and RT2500 owner I have a unique love/HATE relationship with Matrox.

Scott
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D So if I wanted to quote a few extreme examples (see the above links for the sources):

In 3Dmark2001 at 1600x1200x32 the 4600 50% faster.
In JK2 at 1600x1200x32 the 4600 is over 50% faster.
In JK2 at 1280x1024x32xAA the 4600 is over 40% faster, even with the P512 using FxAA!
In UT2003 at 1600x1200 with HIGH DETAIL the GF3TI500 is completely equal to the P512, while the Rad8500 is faster!
In UT2003 at 1600x1200 with MEDIUM DETAIL the GF3TI500 is 20% faster than the P512.
In UT2003's more demanding Asbestos BM at 1600x1200 with HIGH DETAIL the GF3TI500 is 20% faster.
In UT2003's more demanding Asbestos BM at 1600x1200 with MEDIUM DETAIL the GF2ultra is faster while the GF3TI500 is more than 50% faster!
In CodeCreatures at 1600x1200 the 4600 is 80% faster than the P512, in fact the Rad8500 is nearly 20% faster than the P512.
In SerSam2 at 1280x1024 the 4600 is more than double the speed of the P512, that's over 200% the perf!
In 3Dmark2001 the 4600 is more than 35% faster than P512, even the Rad8500 is still more than 20% faster. In DOT3 bump mapping the 4600 is 50% faster than the P512.
In 3Dmark2001 1024x768 let's take a look at 4xAA perf; the 4600 is more than 2.5 times as fast as the P512, that's more than 250% of the perf. Even the Rad8500 is still more than 10% faster.
Even when the P512 uses its FxAA it still can't compete with the 4600 in 2xAA mode; the 4600 is over 80% faster, and even in 4xAA mode the 4600 is still over 10% faster than the P512 using its FxAA! This is before we consider that FxAA misses jaggies and has erratic FPS.
In Q3A AA the 4600 uses its absolute worst-perf 4xAA and still beats P512 FxAA. The 4600 is MUCH faster at 2xAA, or even QxAA WITH Aniso enabled, which is still much faster than 4xAA alone. Plus MOST people say QxAA & Aniso IS far better than 4xAA whilst still significantly faster. This is before we look at maximum and minimum FPS, which shows how erratic the P512's FxAA is, not to mention the missed jaggies and the HUGELY slow 4xAA mode of the P512, which is its only other option when unable to use FxAA.

;) Easy to pick a few odd benchmarks to say something, isn't it?
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) Hey TheGoose, I mean no disrespect, but I do find it very annoying when a couple of select benchmark results are quoted. I am more aware than most of the power of statistics and how they can be used (and misused) to make a point.

:D I still stand 100% behind the advice I gave earlier. IMHO the best thing to do would be to WAIT; if you're peed off with your GF3 then get a GF4TI4200 (pref Leadtek for assured great IQ) or Rad8500 (or LE, but definitely a true ATI either way). If they aren't enough of an improvement then you've lost nothing (but could have saved yourself $250+), as in a few months' time prices of the Parhelia512 and Rad9700 will have fallen, plus there should be plenty of concrete info and reviews AND budget versions of both cards (Parhelia256 & Rad9500?).

:) If 3D perf doesn't really bother you then I would recommend a Matrox card (but prob not the hugely priced Parhelia), but since you must want superior 3D for your $400 the only real choice is the Rad9700: not only hugely faster and superior to the GF3 in all areas, but the IQ is as good as guaranteed to be far superior to the GF3 too. If you didn't work at such high resolutions then a GF4TI would be the wisest choice, or the Rad9700 if you can afford it. Since you have a GF3, and apparently one of the worse-made ones, you already have good solid 3D perf; it is really the IQ that sucks, and as such you must want an improvement in both IQ & 3D if you're spending over $100, let alone $400!
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) Of course the Rad9700 also sports 10bit RGB, which is essentially all Gigacolor is. But it is also far better technically, with full FP precision throughout and support for 128bit colour, which is far more likely to become a widespread standard as part of DX9 AND is applicable to games as well as applications. ATI & nVidia have both caught up with Matrox; gone are the days when Matrox were best for multi-monitor support and IQ, ATI for multimedia (VIVO/TV etc) and nVidia for pure 3D perf. With a market as active as the gfx arena is, it makes sense for ALL companies to close in on their adversaries' strengths and enhance their own weaknesses, as such we have already witnessed nVidia improving IQ, standards and dual monitor support, ATI catching up with AA and pure 3D speed (catching up?), and finally Matrox catching up (just about) with 3D perf. If ATI and nVidia weren't closing in on Matrox's specialities, then Matrox would be unlikely to bother enhancing the IQ, or providing 3 monitor support ... Matrox wants to play with the big boys in the mass market, and I hope they succeed, but it will take a lot more than an overpriced card with imperceptibly better IQ, 3 monitor support (precious few have room or money for that), technically brilliant but slightly flawed AA and last year's 3D perf. Sure, a few loony Matrox fanboys and IQ enthusiasts will buy one at $400, heck that's why they set that price, but it will never become mainstream unless they halve the price, and even then it would still be tough. Whatever did they do with that 20GB/s of bandwidth, esp considering the 4600's 10GB/s?

:( I simply can't understand the logic behind someone recommending the Parhelia512 at $400 to EddNog, who is looking for better IQ and surely an improvement over his GF3's 3D, something the P512 really doesn't offer. Sure the IQ is great, and since he works at 'ridiculous resolutions' it makes sense to have the best IQ possible. But even the Matrox PDF file inadvertently shows that both the Rad8500 and GF4 have caught right up in IQ, just as everywhere seems to be saying. I believe the difference between the Rad's and GF4's IQ and the P512's will be very slight and as such NOT worth the huge price tag the P512 carries. EddNog will notice a huge improvement for a simple $100 Rad8500 or $150 GF4TI4200 (better IQ & 3D), and in a few months, if still unsatisfied, he should find both cards easy to sell on at a very similar price to what he paid, and can then decide to purchase a Parhelia, Rad9x00 or new nVidia card at significantly lower prices and with far more reviews, user experiences and good solid information to go on.

;) Just as a note, I say '2D' image quality and put the '2D' in quotes because we both know this holds true for both 2D and 3D (beware of WinXP setting the 3D refresh rate to 60Hz though); it is just far more noticeable in 2D, AND too many people get confused and think that 'image quality' refers to Anisotropic Filtering, TriLinear Filtering, mipmap levels, AA, even supported refresh rates and simple RAMDAC speed. So I find '2D' IQ is the best way to put it.
 

EddNog

Senior member
Oct 25, 1999
227
0
0
WHEW is it getting hot in here or is it just me? ;-)

Now, I'm not trying to start a war...

Anyway it seems more like a highly civilized verbal bitch slap fight.

Moving right along...

I believe I will take the neutral route and wait a little bit of time for: A) Matrox to get their drivers a little more mature B) Matrox to release the 64MB and 256MB versions (I need to see which model has the fastest memory, 64MB, 128MB or 256MB; the fastest, not biggest one, may get my money) C) ATI to release the Radeon 9700 for real so I can see some proper pricing D) ATI to release the "9500" version and finally E) to see which models overclock how well (Parhelia-512 64MB, 128MB, 256MB, Radeon 9700, 9500).

As you guys know I am a pretty hardcore overclocker (not totally hardcore; no liquid cooling here, yet...) so the overclockability of the card is highly important to me. If the Parhelia overclocks well enough, I'll get it. If it can't overclock much, I'll prolly get the Radeon, overclockable or not. Then I must see how much faster or slower each model is and at what price point. Things will get more complex.

Finally I gotta' see if I can get my eyes on somebody's Leadtek GF4Ti4X00 so I can see the quality for myself. Hopefully they have a decent display.

Thanx 4 all j00r help, fellas! I'll be checking back here only once a day from now on. ;-)

BTW Take it easy!

-Ed
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
gone are the days when Matrox were best for multi-monitor support and IQ, ATI for multimedia (VIVO/TV etc) and nVidia for pure 3D perf. With a market as active as the gfx arena is, it makes sense for ALL companies to close in on their adversaries' strengths and enhance their own weaknesses, as such we have already witnessed nVidia improving IQ, standards and dual monitor support, ATI catching up with AA and pure 3D speed (catching up?), and finally Matrox catching up (just about) with 3D perf.

I don't think those days are completely gone, the distinction has lessened but it's still there.
ATi still has the best DVD playback and TV-Out capabilities, and ATi still has the most feature-filled boards in the AIW series.

nVidia still has the mainstream 3D performance, and the top of the line GF4 Ti4600 (at least until the R9700 debuts).

Matrox still rules in 2D with the Parhelia, and the G400/450/550 still outpaces ATi/nVidia (this last one is debatable though, I suppose).
Matrox still rules the 3+ monitor realm, especially when 4+ monitors are required, e.g. medical imaging, air-traffic controller displays, financial market analysts etc.
Matrox still has the most versatile, flexible and fully featured multi-monitor implementation available, head and shoulders above all but Appian's top card, and even there it retains the lead.

Each manufacturer is beginning to step into the others' typical domains, and each is improving its weaknesses... but the king in each area hasn't truly changed.


but it will never become mainstream unless they halve the price

Matrox doesn't have the cash, or the marketing team, or the manufacturing capabilities to handle the mainstream.
Certainly they'd like to get a foothold in mainstream consumer 3D, but I doubt they have any intentions of seriously striving to take the crown, because they don't have the finances or resources to cope with the demand if that were to happen.


Now, I'm not trying to start a war...

Personally I've been quite pleased with this thread; for the most part there has been a calm and rational discussion of the relative merits of different graphics cards. The discussion on the Parhelia's 2D image quality has been perfectly civil, and all parties involved have presented their viewpoints calmly and logically.
No flames or insults have been thrown, and I see no clear indications of biased zealotry on any side.

Considerably better than what is the standard in most threads on these forums.




I am in a similar boat right now because I was all ready to buy a Parhelia until the Radeon 9700 was announced.....But now I find myself rethinking....I need a powerhouse jack-of-all-trades card to support my needs, because along with being an avid PC gamer I also do a lot of video editing through Premiere as well as 3D animation using Lightwave and After Effects. Some of the things that I need are rock-solid stability, GOOD multi-monitor support, fast OpenGL rendering, crystal-clear 2D performance, and support that is geared towards pro and consumer level applications.

TheGoose, you've pretty much perfectly laid out my own needs for a graphics card there. :)
Except in my case 3D performance is secondary as I really only need GF2 MX400 level gaming performance... though extra performance beyond that is nice to put towards FSAA.
I work quite a bit in 3D Studio Max, so respectable pro-3D application performance is necessary, though not so much as to require a true pro 3D board.
I also do a bit of video editing through Premiere.
I only need a basic multi-monitor feature set for 2 monitors though.
The biggest factor for me is solid 2D at high res though, as I do a considerable amount of 2D graphics design through a myriad of applications.
 

butch84

Golden Member
Jan 26, 2001
1,202
0
76
Not to burst your bubble or anything, but the GF3 Ti500 defaults to a 240 core and 500 mem, so your Ti200 is faster in mem, but not GPU. Either way, it's prolly still faster than a Ti500, but I just wanted to make the correction.

Butch
 

TheBug

Junior Member
Jul 20, 2002
23
0
0
Joining from the thread I started on an identical topic:

Matrox Parhelia Image Quality

I just ordered a Parhelia and will test it over the weekend against my Radeon 8500 and the Radeon 7000, 7500, GeForce2 MX, and GeForce4 MX at work. Will keep you guys posted.
 

Bingo13

Golden Member
Jan 30, 2000
1,269
0
0
Bingo13, I don't see why that is relevant.

I ask this because unless you have actually used the product, any information you provide is purely subjective, based upon information from other sources. You are merely providing links to opinions, opinions that differ widely considering this subject matter. My opinion is no different, but at least I have experience with all of the cards mentioned. Matrox does have better 2D quality than ATI/Nvidia. It might not be as evident on a $250 monitor but it surely is on the Sony F520 monitors that I use. Matrox does have better TV-Out, multi-monitor support, 2D display drivers, color/clarity, and Adobe support than ATI/Nvidia. These are facts and can be proven in an objective manner. In the same light, Matrox costs more, offers lower 3D performance (by wide margins at times), offers lower professional OpenGL performance, and is not as widely supported in games as ATI/Nvidia.

If you only game (mainly FPS) then this card is probably not for you. I work and game, so this card has been wonderful to date. In some situations I will use one of my Ti4600 machines, mostly due to the ability to hold a higher average frame rate in certain games. However, if you toss out the benchmarks you would have a hard time telling any difference in true in-game motion between the cards in most situations.

Was I disappointed with the performance of the card based upon the reviews? YES! Am I disappointed with the card now that I have used it for two weeks? Slightly, as I know Matrox probably could have done more on the core clock speed to improve FPS gaming performance, but overall I have been very satisfied with the card, to the point of ordering two more. In fact, if the core clock speed had been around 300 there would probably be very few "issues" with this card from the hardcore gaming community. :)
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D LOL Bingo13, I handle a lot more PC hw than I personally own. And I agree to a small extent that you can't beat personal experience. But even then, handling 1 or 2 samples of a product doesn't add up to all that much, esp with the vast array of options and configurations out there; almost every setup is different, and so even using a particular component doesn't ensure you can say very much with any degree of certainty. I don't need to drive a BMW to know it will be nicer than a Mini, I don't need to have 8 children to know they'll take more time and care than 2, I don't need to use a 500MB HD to realise its limitations, and I don't need to handle every single component out there in order to have formed an educated opinion and give advice. In any case there is no denying that the P512 falls very short on gaming perf, esp for somebody already very used to GF3 3D perf; this is something that needs more than clock increases in order to even reach GF4TI4200 levels, let alone $400 Rad9700 levels. What the heck have Matrox done with the double memory bandwidth that the P512 has?

;) In any case, where have you been Bingo13? You are pretty much just echoing what has been said already. The P512 most likely does have the best '2D' IQ, 3 monitor support and moderate GF3-level 3D perf. The P512 has its market, but I don't think it is worth EddNog shelling out $400 only to improve his '2D' IQ. IMHO he would be far better off getting a Rad8500 ($100) or GF4TI4200 ($150) and seeing if these fit his needs; if not, it will at least tide him over the few months' wait for prices to fall and concrete info to come out, if not budget versions of the Rad9700 and P512. Now what's wrong with that? There is no doubt that both Rad8500 and GF4TI cards will have significantly improved '2D'; the Rad8500 won't break the bank and the GF4TI4200 won't either, and will increase 3D perf at the same time. In my educated opinion, based upon many individuals' experiences and independent reviews from multiple reliable sources, I have given it how I see it, without bias or malice. So what exactly are you getting at, that I should not express an opinion without having tried every flavour of component out there? Unlike some, I don't simply recommend a product based upon my own purchases in order to justify to myself that I have not wasted my money. If EddNog wasn't too concerned about 3D perf, needed the absolute top in '2D' IQ and didn't mind shelling out $400, then the P512 would certainly be the top candidate, but I'm sure EddNog expects an increase in his 3D as well as his '2D' IQ even when spending $200, let alone $400. Pretty much every user I've spoken to, even avid Matrox fans, admits that the Rad8500 and GF4 have caught up with Matrox's pre-P512 '2D' IQ. Surely it makes sense to give the Rad8500 or GF4TI4200 a spin for the relatively small cost and see how they fit his requirements; even if they don't, he will likely find he can sell them on in a few months at pretty much the same price and still get a better deal on the P512 or Rad9700, that is if the P256, Rad9500 or new nVidia cards aren't out by then!
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
To cut right to it, I had the impression that Matrox was going to attempt to enter the gaming niche with the Parhelia. The board is disappointing in the sense that it is just about as "fast" as a 200/230 GeForce 3 that was available around May '01. Granted, GeForce 3 performance isn't a joke, but the Parhelia doesn't give us the GeForce 4 Ti performance many of us were expecting (if not better). GeForce 3 performance surely is more than enough to be able to play most of today's games, and the 256-bit memory architecture allows for huge bandwidth, which in turn allows for little loss of performance at high resolutions.

So what is wrong with the picture? Right now you can buy an OEM Parhelia for $344. This version sports core and RAM clocks slightly lower than those of the retail version, which costs $399 and as far as I know is not yet available for sale. The Parhelia is an 80-million-transistor 512-bit video chip, compared to the 63-million and 60-million transistor 256-bit GF4 Ti and R8500 respectively. On top of that, the Parhelia sports a 256-bit DDR memory interface and a total of 128MB of onboard video memory, whereas the GF4 Ti and R8500 have a 128-bit DDR memory interface.

I don't know about the rest of you, but from those specs the Parhelia sounds like a hell of a gamer's video card. Am I right? What's wrong? Seems like Matrox flubbed up. Maybe they don't intend for the Parhelia to be a beast of a gaming board, but why is that? That same $344 OEM Parhelia I described in the previous paragraph sounds superior to the GF4 Ti and R8500 until you get a load of the speeds Matrox has the Parhelia running at: 200MHz core and 250MHz RAM for the available OEM board, and 220MHz core and 275MHz RAM for the retail. The RAM clock isn't so bad considering 250MHz * 2 (DDR) * 32 bytes (256-bit memory interface) = roughly 16GB/sec of memory bandwidth, 6GB/sec greater than the Ti 4600. But then the average core clock speed of the GF4 Ti line and the R8500 is 275MHz, nearly 40% higher than the OEM Parhelia's 200MHz. No wonder it is getting an ass kicking. Sheer lack of raw core clock and immature drivers, that is what is bringing the Parhelia down.
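Spelling that arithmetic out as a quick sketch (the Ti 4600 figure assumes its usual 325MHz DDR memory clock on a 128-bit bus, which isn't quoted above):

def bandwidth_gb_s(mem_clock_mhz, bus_width_bits):
    # clock * 2 for DDR, bus width / 8 for bytes per transfer
    return mem_clock_mhz * 2 * (bus_width_bits // 8) / 1000

print(bandwidth_gb_s(250, 256))   # 16.0 GB/s - OEM Parhelia
print(bandwidth_gb_s(275, 256))   # 17.6 GB/s - retail Parhelia
print(bandwidth_gb_s(325, 128))   # 10.4 GB/s - GF4 Ti 4600 (assumed clock)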

ATI initially failed with their 8500. All the pre-release previews we saw showed the 8500 trailing pathetically behind the GF3 in just about every, if not every, benchmark. What did ATI do? They made sure the 8500 was clocked as high as they could risk, cut prices (the 8500 went from a $399 MSRP to $299 and could be found for around $250 on the web shortly after its release), and worked their butts off drastically improving their drivers. It paid off for ATI. Solid drivers, competitive pricing, and solid products...business is chugging along for the Canadian company, and I'm sure nVidia is on their toes preparing to defend their spot on the hill, especially now that ATI is showing off their new BFG, and ATI certainly hasn't slacked off even after they've already wowed us with the R300.

Back to the Parhelia...

All that impressive technology packed into a $344 board, and all you end up getting for it is top-notch 2D if you own an awesome, expensive, and large CRT monitor and you are running very high resolutions. You also get such features as surround gaming, for which you need another two monitors (which will also greatly add to the bill), great TV-out and a cool form of FSAA, and to top it off, or rather take out the knees...sub-par 3D performance. Looks very much like a niche (and a rather small niche, I think) card to me, when it seems that Matrox could easily make some changes to broaden interest in the Parhelia and ultimately achieve more sales and make more money... Matrox hasn't failed, especially after such a long time of "nothing" from them. The Parhelia would have been the shizzle if it had been released a year ago ;)

just my thoughts...

To add to what Bingo has said... I wonder how a heavily overclocked Parhelia would perform... a 300MHz core certainly wouldn't hurt, methinks ;)
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
What the heck have Matrox done with double the memory bandwidth that the P512 has?

Hmm... maybe it has to do with all the bandwidth-saving technologies ATI and nVidia have been working on ever since the GeForce 256 and GeForce 2 GTS... Those chips were extremely fast but were very inefficient in handling memory bandwidth, which in turn forced producers to pair the chips with extremely fast RAM. The GF256 got to show off once it was paired with DDR, and the GF2 was owning all with the 7.36 GB/sec bandwidth of the GF2 Ultra...

The Radeon 9700 will sport the third generation of HyperZ technology, a technology introduced with the R100 Radeon 256 to save memory bandwidth.

Perhaps Matrox lacks such technology, or perhaps their technology is inferior as they don't have the benefit of building upon previous generations of it :p
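Just to show the shape of the argument, here's a toy model with completely made-up numbers (the overdraw factor and rejection rate are guesses for illustration, not measurements of any real chip):

def relative_mem_traffic(overdraw, hidden_reject_rate):
    # With overdraw N, each screen pixel is drawn about N times; an early-Z
    # style reject discards some fraction of the hidden fragments before
    # they ever cost framebuffer bandwidth.
    hidden = overdraw - 1
    return 1 + hidden * (1 - hidden_reject_rate)

print(relative_mem_traffic(3, 0.0))   # 3.0x traffic - no occlusion rejection
print(relative_mem_traffic(3, 0.8))   # 1.4x traffic - 80% of hidden pixels rejected

Less than half the traffic per frame in that made-up case, which is how a narrower but smarter memory subsystem can keep pace with a fat but naive one.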
 

SSXeon5

Senior member
Mar 4, 2002
542
0
0
AnAndAustin

I don't know where you got that the IQ on the GF4 is anywhere close to the R8500 .... and for performance .... the 9700 rapes the Ti4600 in the anus 100% .... so there goes all this talk. The 9700 will be the same price as the Parhelia-512, so what is the better choice? Yup, ATi!

SSXeon
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D SSXeon5, regarding point number 1: I got the GF4's IQ being easily on par with, and usually exceeding, the very capable and respectable Rad8500 from a whole wealth of GF4 reviews. Admittedly most review sites do fixate on benchmarks and such, but if you look hard you'll find most sites talk about IQ as well. Many users, and I'm not just talking nVidia fanboys here, have also mentioned the improved IQ of the GF4 cards, mostly after upgrading from GF2, GF3, Matrox and Radeon cards. But I think the most telling and impartial account of the GF4 being better at IQ than the Rad8500 (and actually very close to the Parhelia) comes inadvertently from Matrox themselves:

Matrox.com

;) As I said, you have to scour GF4 reviews to find IQ references; Parhelia reviews are also worth checking. As I simply don't have the time to go through each and every site, here are the main review sites I use:

TomsHW

AnAndTech

Tech-Report

Firing Squad

X-Bit Labs

HardOCP

Dans Data

Extreme Tech

Gamers Depot

;) I use others but these are the ones I use most. I won't hold your hand nor do the groundwork for you, but if you look through the above for GF4 and Parhelia reviews you will find that, where mentioned, the GF4's IQ is a world apart from the GF3's and right up there with ATI.

:D As to your other point, YES! The Rad9700 does stick the 4600 in the anus, as you so eloquently put it; whoever said it didn't? However, bear in mind that no independent has conducted a real review of a final product yet; the clocks aren't even finalised yet! However, the 4200 at $150 o/c's to either 4400 or 4600 perf and as such will give a nice boost over the GF3 EddNog currently uses in ALL departments, but especially IQ. Sure the Rad9700 whips the Parhelia, but if you listened to EddNog you'd realise it is IQ he's more concerned about; he still wants 3D, but is unsure of the actual diffs in IQ between the GF4, Rad and Parhelia. $400 is a lot to spend when a $150 card may well do the job, added to which he should find a 4200 easy to sell on in a few months, AND by then a lot of hard info will be out, as well as possible budget versions of the Parhelia and Rad9700 (even if not, the prices will have dropped). If it was my $400 (and I had it to spend) I would not hesitate to go Rad9700, that thing rocks, but everybody's requirements and preferences are different, now aren't they, SSXeon5?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I would be willing to put money on the 2D on the ATI card being just as good as, if not better than, the Matrox's.