Parhelia-512 vs. Radeon 9700...

EddNog

Senior member
Oct 25, 1999
227
0
0
Yes, it may seem at first glance that the Radeon 9700 is a far superior choice; however, my situation is sort of unique...

First off my current setup involves a PNY Verto, GeForce3 Ti200, o/c'd to 233 core/555 memory. Yes that's with stock cooling. Yes that's faster than a Ti500. :cool:

Anyway, my problem with this, my first ever NVIDIA-based card, is that the damn 2D output is so friggin' fuzzy I feel like I'm going blind, but whenever I look away from my monitor everything is perfectly sharp (thank goodness for contact lenses). Before this GF3, I owned almost all Matrox boards (with the exception of a KyroII, and a Voodoo3 eons ago). Their 2D was sharp as a Masamoto sushi knife (the Matroxes, not the Voodoo3 or KyroII). This GF3's 2D is killing me. I blame it on the third-party (ahem, PNY) filtering components. For this reason and from this experience I am no longer buying any cards that are not made by the chip manufacturer.

So this leads me to ATI or Matrox. I've never owned an ATI; how is their 2D? I'm confident the Parhelia's 2D image will r0x0r my b0x0rz. The serious issue, however, is that a Parhelia is only about half as fast as a GF4 Ti4600 (is it even as fast as my ridiculously o/c'd GF3?), whereas the Radeon 9700 seems to be some 20-60% faster than a GF4 Ti4600. That makes the Radeon 9700 roughly 2.5-3x as fast as a Parhelia (choke!!!). Of course most people say that sacrificing some 2D fidelity is worth the incredible performance difference, but I'm the person who wants to go back to his G400 MAX 'cause his stupid-fast GF3's 2D output makes him want to disgorge his dietary tract contents.

Another thing to keep in mind is this: Parhelia-512 OEM for $344, free shipping @ www.newegg.com. Radeon 9700 boards: $400 and UP, shipping not even counted yet.

So, can anyone provide me a concrete answer here? Is the Radeon 9700 worth the extra $60-$70 over the Parhelia-512? How does the 2D image quality compare to the Parhelia's? And yes, even though I'm looking at approx. $375 video cards, the $60-$70 really does matter to me.

BTW My main reason for caring so much about 2D quality is that not only am I a hardware freak and avid PC gamer like many of us in this l33t community, but I am also a professional desktop publisher; I work at ridiculous resolutions many hours straight in Photoshop, PageMaker, Illustrator, QuarkXPress and InDesign. With this GF3 I am going nuts coping with the fuzzy, unstable image.

One last thing. I noticed ATI has a component HDTV (720p/1080i!!!) video output adapter for Radeon 8500s. Will they make this for the Radeon 9700, or does it already include HDTV output? I haven't seen Matrox mention HDTV even once anywhere in any of their literature; I own a Toshiba 42H80 TV set and it would totally r0x0r my b0x0rz to play some hot 3D games on it.

THANX IN ADVANCE 2 ALL 4 ANY HELP J00 CAN PROVIDE!!!

PS Parhelia offers the amazing Gigacolor feature; does Radeon 9700 offer similar?
 

AZGamer

Golden Member
May 14, 2001
1,545
0
0
Ed - I'm pretty much in the same boat as you.

I bought a PNY Verto Ti500, and the 2D Sux0rz my B0xorz (heh).

I'm going to upgrade to the Radeon 9700, as the price isn't going to be $400 and up (I can guarantee you that the price will be $400 at most, and most likely Newegg will have it at least a few bucks cheaper).

"
PS Parhelia offers the amazing Gigacolor feature; does Radeon 9700 offer similar? "

I think the Radeon has something like this too. Either way, the Parhelia doesn't really do well in gaming, and I'm not giving up on playing UT2003 :) - whenever it comes out.
 

Vegito

Diamond Member
Oct 16, 1999
8,329
0
0
My friend bought the Matrox already... I'm going to find out if it works decently or not tonight. I would wait a while before jumping on either; there might be a few revisions to fix some bugs...
 

EddNog

Senior member
Oct 25, 1999
227
0
0
Thanx, Force; I have noticed that on the machine of the person who has my G400 MAX (I gave it to a buddy after casting it away for my presbyopic GF3), and also on my parents' G450-equipped machine, there are some graphics artifacts here and there in 3D games (specifically in WarCraft III and NeverWinter Nights); I'm merely hoping the Parhelia will not exhibit similar issues. I've never owned any ATI cards, but I'm assuming that since ATI is such a popular card manufacturer, people will program their software to properly support the hardware. I can see that people don't always consider Matrox when writing their software, and the glitches come up. I hope you can get back to me on that firsthand Parhelia experience; it would be handy in juggling this equation.

AZ, I know of course that radeon prices will drop sooner or later, but I'm seriously thinking of buying a new video card, like, NOW. And right NOW the Radeon is showing at $400-$420 all over the web in the few stores that carry it; ATI's own online store does not carry it. None of my local shops (chain or mom & pop) have it in stock either. So in my case I would be paying $400 and up for the Radeon 9700 if I decide to go with it.

Soul, damn I WISH rofl! Anyway I only have one ubercomputer and it has only one AGP slot. ;-)
Athlon T-bird 1G o/c'd to 1467 (uberheatsink, Delta 7K RPM fan); GF3 Ti200 o/c'd to 233core/555memory; dual IBM 75GXPs (15GB each) on Promise FastTrak100, RAID 0, 64KB striping; SB Live! "classic" (with the i/o board) for my FPS2000's via Digital-DIN; Kenwood Zen 72X; Plextor 24x10x4... that's the important stuff.

-Ed
 

Piano Man

Diamond Member
Feb 5, 2000
3,370
0
76
ATI has also been known for their good image quality. Another thing to think about is support. The Parhelia will never be a major player in games, so it might not get the driver support, but ATI will. I would go with ATI.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) Well, to put it simply, the Parhelia will have much better '2D', but 3D perf is probably worse if anything than your o/c'ed GF3. As such it would be a ridiculous amount to spend. However, GF4TI cards have HUGELY improved '2D' image quality; they are easily on par with ATI and the older (non-Parh) Matrox cards. I seriously doubt anybody would be able to perceive the '2D' benefits of Parhelia over ATI or GF4TI cards. They are all VERY closely matched in this respect. However, as you point out, when cards are made by different companies IQ does vary somewhat, although from what I've seen and read GF4TI still have GREAT IQ. However, ATI would definitely be a wise choice, esp since you love high resolutions. Just be sure to only consider true 'made by ATI' Radeon cards.

:) I would suggest you wait a little longer; ATI Rad9700 prices will come down a bit over the next few months (esp with the new nVidia cards due November), PLUS ATI will almost certainly be releasing a Rad9500, a slightly slower Rad9700, to combat the GF4TI4200. This could save you a LOT of cash and give you the '2D' IQ you want while still providing you with better gaming for your outlay.

:D If you want to buy sooner, then I would suggest a GF4TI4200 ('2D' is a world apart from GF3, $150), Rad8500 / 8500LE (no better gaming ability BUT much better '2D' IQ and 'only' $100) or Rad9700 (SHOULD certainly have fantastic '2D' IQ AND the best perf too, BUT $400). You may think the 4200 won't give you much of a boost, but with 4.0ns RAM they reach 300/550 (4400 speed) and with 3.6ns they reach 300/620 (VERY near 4600 speed). VERY cost-effective solutions. I'll root around and see if I can find info on which 4200 brands are better for IQ, if you like?
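For reference, a quick back-of-the-envelope sketch of how those RAM speed ratings map to clocks (this is my own arithmetic, not from any review; the 550/620 figures above are overclocks beyond the nominal ratings):

```python
# Nominal clock for DDR memory rated at X ns is roughly 1000/X MHz,
# or double that in "effective" DDR terms. (Illustrative sketch only;
# actual overclocking headroom varies from card to card.)

def rated_ddr_speed(access_time_ns: float) -> tuple[float, float]:
    """Return (real clock MHz, effective DDR MHz) for a given RAM rating."""
    real_clock = 1000.0 / access_time_ns
    return real_clock, real_clock * 2

for ns in (4.0, 3.6):
    real, ddr = rated_ddr_speed(ns)
    print(f"{ns} ns RAM -> ~{real:.0f} MHz real, ~{ddr:.0f} MHz DDR effective")

# 4.0 ns RAM -> ~250 MHz real, ~500 MHz DDR effective
# 3.6 ns RAM -> ~278 MHz real, ~556 MHz DDR effective
```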
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
Another factor is your CPU. An Athlon 1.4GHz still kicks some a$$, but I have heard that the Rad9700 is VERY CPU dependent; 1.4GHz may choke its perf somewhat. Of course, if you can shell out $400 for a gfx card then you can surely shell out $100 for a fast AthlonXP and even a new mobo ($50) if that proves necessary.
 

SSXeon5

Senior member
Mar 4, 2002
542
0
0
Originally posted by: AnAndAustin
AnAnd showing GF4TI4600 vs Rad9700 vs Parhelia

The benchmarks relate to 4600, so 1.5=150% perf of 4600, ie 50% faster! The AA isn't entirely fair on the 4600 as QxAA coupled with Aniso is faster AND better looking than 4xAA, but it does show just how fast the Rad9700 is at 4xAA though.

QxAA doesn't look better than 4xAA! It's just so the performance won't drop a lot... and lookie here... we have what I have also been anticipating... the ATi Radeon 9700 :D Man, it's sad when they bench a GF4 Ti4600 w/o AA and the 9700 with 4xAA 16xAF and it still beats it by 30% in almost all games. Don't believe me? Read up at HardOCP. Well, I would recommend the Radeon 100%... and now it has improved TruForm (awesome, so truly awesome), the image quality is about the same as Matrox... and for raw speed you just can't beat it. I don't like that 3-monitor thing on the Matrox... cuz who has 3 monitors for one computer... that's a crapload of money. I'm sticking with my 8500 and not getting the 9700 when it launches, like I did with this one... $300 was a lot... and it dropped to half price a while after... really pissed me off. So I'll get the Hercules 9700 128MB in about 3-4 months, wait till prices drop. That's my $.02

SSXeon
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
SSXeon5 dude, read my post. I said QxAA WITH ANISO was both faster and better looking than 4xAA. Not to take anything away from the Rad9700's perf, which is awesome; I'm just pointing out that the 4600 performs a LOT faster with QxAA, and the Aniso clears up the blurriness, giving excellent results, much better than 4xAA alone. The Aniso perf of the Rad9700 is awesome too, truly worth $400 (for those who can cough up the cash), unlike the Parhelia, which struggles against the Rad8500 and GF3!
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
Well, to put it simply, the Parhelia will have much better '2D', but 3D perf is probably worse if anything than your o/c'ed GF3. As such it would be a ridiculous amount to spend. However, GF4TI cards have HUGELY improved '2D' image quality; they are easily on par with ATI and the older (non-Parh) Matrox cards. I seriously doubt anybody would be able to perceive the '2D' benefits of Parhelia over ATI or GF4TI cards. They are all VERY closely matched in this respect.

I would tend to disagree; the best 2D image quality I have ever seen from an nVidia-based graphics card was a LeadTek GeForce3 Ti500. From the GeForce4 Ti/MX boards I've seen thus far, I've yet to notice any consistent improvement in image quality over the GF1/2/3.
nVidia claimed tremendous improvements in image quality with the GF2, and with the GeForce3 also... they didn't deliver then, and I've yet to see any definitive proof they've delivered this time either.
I've only seen four nVidia GeForce4 Ti based graphics cards thus far, but all were almost identical in image quality to GeForce3 boards from the same manufacturer.
Four boards is far from enough to give a solid opinion on GeForce4 image quality, but from what I've seen it's not improved in the least.
The last consistent improvement I saw across almost all GeForce boards was the switch from the TNT2 to the GeForce1, which IMHO did yield quite an improvement in most cases.

I most definitely have never in my life seen any graphics card from nVidia or ATi that was able to match the image quality offered by the Matrox G400/450/550. Indeed, I tend to put even the now ancient G200 on par with ATi's best.
From my own personal experience I generally consider VisionTek, LeadTek and Gainward to have the best 2D image quality among nVidia board manufacturers, with VisionTek a notch above the other two.
The LeadTek Ti500 being an exception as it was unusually good, and very nearly on-par with the ATi boards I've seen.

I generally tend to put ATi as being above any nVidia board manufacturer, not drastically superior but I do find the difference noticeable after prolonged usage of the system at reasonably high resolutions.

Surprisingly enough, I generally consider the 2D quality offered by some 3dfx Voodoo Banshee, Voodoo3 3000/3500 and V4/5 boards as being superior to that of ATi. Though again, they definitely can't match up to Matrox IMHO.

2D output tends to be a HIGHLY subjective subject though.
I've heard some people state they cannot see any difference between even PNY and MSI compared to Matrox's best. This despite the fact that PNY and MSI are widely regarded to have among the worst quality of nVidia board manufacturers.
On the flip side of things I've seen people say they consider even ATi to be horribly blurry and inconsistent compared to Matrox.

On a personal level I generally consider myself to be relatively sensitive to 2D quality; the immediate first impression usually isn't too noticeable a difference for me... but after a few hours of working on the system I can usually see a distinct and clear difference between the quality offered by differing boards. I put this down to the fact that my eyes likely tend to get tired and stressed by looking at a monitor for long periods of time.

I've always found Matrox to offer almost crystal-clear image quality, whereas with alternative boards I've almost always found the images and text slowly start appearing mildly jagged, the colors washed out, and, most noticeably, the text slightly fuzzy to my eyes.
After a long enough period of time I have a definite tendency to get a terrible headache.

As said above, 2D quality is a highly subjective element though, and can be impacted to a large degree by the monitor and even the cabling used.
For many the quality difference might be negligible, but I assure you there are some that can notice quite a dramatic difference.



To go back to answering the original subject though, if 2D quality is the only factor the Parhelia offers that interests you then I'd probably lean towards the Radeon 9700.
2D quality alone is unlikely to be enough to make it worth purchasing the Parhelia.
ATi's quality with the R9700 should be quite a bit better than that from the PNY GF3 board you presently have, as PNY is usually regarded as having quite poor 2D quality even among nVidia graphics cards.
Besides 2D quality, the R9700 sounds like a much better match for your requirements than the Parhelia would be.


PS Parhelia offers the amazing Gigacolor feature; does Radeon 9700 offer similar?

I've heard differing answers on this... some say it does support 10-bit RGB, others say it cannot. I haven't seen anything that states definitively whether it does or does not support 'GigaColor'.
Architecturally it certainly seems as though it should be capable of it, and it's definitely able to implement 10/10/10 RGB when rendering 32-bit color in 3D. Unfortunately I've not yet heard any definitive statements as to whether it is capable of it in 2D; the vague comments I've heard thus far seem to indicate it cannot.
ATi's own whitepaper doesn't even give a definitive answer.

I'm STRONGLY hoping it does, as this is one of the most interesting new features I have seen in a long time. Everything I've heard makes it sound as though GigaColor could be truly beneficial for graphics artists. I spend a number of hours working in a couple of pro 2D rendering apps, so I'll be extremely pleased if it does support it.


I said QxAA WITH ANISO was both faster and better looking than 4xAA.

I've always strongly disliked Quincunx. The blur filter applied at the end just completely and totally ruins texture quality. Anisotropic filtering can offset this to a great degree, but even with maximum AF I still find it makes textures look a bit unusual.

Just my opinion though, and there's no reason to go in-depth in this topic.
I'd appreciate it if you might look into this thread. There is a discussion of FSAA going on there, and it'd be nice to have another opinion presented.
Especially since you prefer Quincunx, I'd be very interested in hearing your opinions of the various implementations available in consumer graphics cards.
There have been a number of comments brought up thus far both in favor of and against Quincunx.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:) Rand, in terms of '2D' image quality, every article and review I've seen has stated that GF4 cards have improved dramatically over GF2 & GF3. Even in the impartial Matrox PDF file, which compares the IQ of the GF4, Rad8500 and Parhelia, the GF4 beats the Rad8500 in every test, IIRC. I have noticed the diff going from GF2 to GF4; there is certainly a diff there. From a lot of independent people and resources I have heard that GF4 ARE now easily on par with both ATI and non-Parh Matrox. I too have heard that 3dfx had great IQ, far better than GF2 certainly. You said, "2D output tends to be a HIGHLY subjective subject though", and I agree whole-heartedly that it is. Some people need only a 70Hz refresh rate while others need 100Hz in order to use monitors for short let alone prolonged periods. Many people love AA & Aniso while others can't tell the difference. Beauty is certainly in the eye of the beholder, but many sources all seem to agree that GF4 are easily up there with ATI & non-Parh Matrox. In any case, most people couldn't tell the diff between GF2 and Rad8500, let alone notice any benefit of the Parhelia's IQ.

:D My take on GIGA-COLOR is that the name is trademarked by Matrox, but it is just marketing blurb for 10bit RGB. In any case a few sources have said the Rad9700 does this. To make it simple, it is unlikely to ever help games, the perf hit is too big, but for still pictures and short animations the diff is meant to be superb. Essentially 32bit colour uses 8 bits for each of R, G & B, giving 256 shades of each colour; 10bit gives 1024 shades, and thus a much finer gradation between maximum lights and maximum darks. Nice to have but unlikely to ever be used by the average Joe/Julie (but that's what they used to say about AA, LOL).
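Just to put rough numbers on that (my own quick arithmetic, not from any review):

```python
# Shades per channel and total displayable colours for 8-bit vs 10-bit RGB.
# (My own arithmetic to illustrate the point above, not vendor figures.)
for bits_per_channel in (8, 10):
    shades = 2 ** bits_per_channel        # levels of each of R, G and B
    total = shades ** 3                   # every R/G/B combination
    print(f"{bits_per_channel}-bit/channel: {shades} shades, {total:,} colours")

# 8-bit/channel: 256 shades, 16,777,216 colours
# 10-bit/channel: 1024 shades, 1,073,741,824 colours (64x as many)
```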

;) Respect to you Rand.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Rad9500, a slightly slower Rad9700 to combat the GF4TI4200

If it is only "slightly slower" than the 9700, I would think that a 9500 could still kick the shnits out of a Ti 4600 let alone a Ti 4200, which in turn is being nipped at by the 8500 and the 9000 Pro slightly farther behind.

The Radeon 9700 prices on the web now are pre-release prices; they are inflated because everyone and their dog wants one and they aren't available (supply and demand there for ya). I heard that 9700 boards would be available for sale 30 days from the official release from ATI, so we still have a few weeks, days and hours to wait... The OEM Parhelia can be had now for around $350, and the OEM is (God help Matrox) slower than the retail Parhelia, which is already slower than your overclocked GeForce3. ATI is known to have good IQ, and I'm sure the IQ is only getting better with the 9700.

Matrox would probably be the way to go if you want the absolute best 2D, but for how it performs, $350-400 is $150-200 more, IMHO, than Matrox should be asking for...

I think the ATI 9700 would be a good compromise; it would be by far the best gaming card you can get (for now) and it should have great IQ as well.

go with the 9700... if it doesn't have the 2D quality you want, just return it and get a Parhelia...
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D LOL, All sounds good to me bunnyfubbles!

;) It will be very interesting to see what ATI leave out of the Rad9500 and how they clock it (esp since the Rad9700 clocks aren't finalised yet), but they will ensure they can churn it out at $150 and convincingly beat the GF4TI4200. Perhaps this will force nVidia into releasing a sub-$200 version of their new gfx card; at least the Rad9000 should hopefully wipe out those pesky and deceiving GF4MX cards.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
Originally posted by: AnAndAustin
:) Rand, in terms of '2D' image quality, every article and review I've seen has stated that GF4 cards have improved dramatically over GF2 & GF3. Even in the impartial Matrox PDF file, which compares the IQ of the GF4, Rad8500 and Parhelia, the GF4 beats the Rad8500 in every test, IIRC. I have noticed the diff going from GF2 to GF4; there is certainly a diff there. From a lot of independent people and resources I have heard that GF4 ARE now easily on par with both ATI and non-Parh Matrox. I too have heard that 3dfx had great IQ, far better than GF2 certainly. You said, "2D output tends to be a HIGHLY subjective subject though", and I agree whole-heartedly that it is. Some people need only a 70Hz refresh rate while others need 100Hz in order to use monitors for short let alone prolonged periods.

The only reviews I've personally read on the GeForce4 that did any in-depth testing on its 2D quality were those at AnandTech and Digit-Life. Anand rated all of the GF4 boards he tested quite highly in terms of 2D output, but as much as I respect Anand I can't say I put much faith in that, because he rated all GF3 boards very highly in terms of 2D output also... even those such as PNY/MSI that have received a substantial number of complaints.
In any case, I've grown somewhat wary of claims regarding improved 2D after nVidia has said they'd improve it a couple of times already without any result.

But, as you've said it's all subjective... and I can only speak for my own opinion based on the boards I've seen thus far.

My take on GIGA-COLOR is that the name is trademarked by Matrox, but it is just marketing blurb for 10bit RGB. In any case a few sources have said the Rad9700 does this. To make it simple, it is unlikely to ever help games, the perf hit is too big, but for still pictures and short animations the diff is meant to be superb. Essentially 32bit colour uses 8 bits for each of R, G & B, giving 256 shades of each colour; 10bit gives 1024 shades, and thus a much finer gradation between maximum lights and maximum darks. Nice to have but unlikely to ever be used by the average Joe/Julie (but that's what they used to say about AA, LOL).

Can you link me to any of the articles that have definite information on the R9700 supporting 'GigaColor' in 2D?
I've been searching, but what I've read so far is awfully mixed. Some sites say it does, others say it doesn't, and none seem to have any definite proof one way or another.

BTW, why do you think it would cause a significant performance hit in most cases?
I tend to agree, though, that GigaColor won't be too useful in games for the most part, primarily because it'll steal 6 bits of alpha away... and too many current games require those bits, which will leave you with rendering errors.
Beyond that, most games tend to use a uniform color palette, and besides those occasional rare games such as 'Thief' etc., they're unlikely to hit the extremes that would tend to benefit from 'GigaColor'.

Actually, I think the biggest advantage of GigaColor will come in DVDs rather than in any specific 2D or 3D applications.


BTW, thanks for popping into the AntiAliasing thread AnAndAustin, it was nice to read some differing viewpoints. Definitely can't say I agree with your perspective, but then.. that's why it's worth discussing :)
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"Rand, in terms of '2D' image quality, every article and review I've seen has stated that GF4 cards have improved dramaticly over GF2 & GF3."

The GF4 is much improved over the GF2/3, but don't believe the hype, it still stinks. I bought a GF4 Ti4600 for a system I was putting together for someone and the card was unusable at 1600x1200; it was barely usable at 1280, certainly nothing I would use for an extended period of time.

"From a lot of independent people and resources I have heard that GF4 ARE now easily on par with both ATI and non-Parh Matrox."

2D is subjective and the monitor you use plays a role too, but I completely disagree that the GF4 is anywhere near the Radeon 8500, let alone Matrox. I can run 1600x1200 on my Radeon 8500, though I do get eye fatigue after a while.

"I too have heard that 3dfx had great IQ, far better than GF2 certainly."

It absolutely does, second only to Matrox, which is why I still have a Voodoo5 in my main system while the Radeon got moved to my gaming/multimedia system. Hopefully NVidia can pick up some IQ tips from the 3dfx tech they acquired.

 

EddNog

Senior member
Oct 25, 1999
227
0
0
Whew! *Eyes going blind reading so much text on a GF3-equipped machine* I'm happy so many people have responded to my little thread! So far from what I can tell, and to no surprise, everyone here is more gaming oriented than pure high-res 2D and desktop publishing, which is fine since I expected it. And I thank you all very much for your insight. I would like more info pertaining to whether or not the R9700 has 10-bit/channel in 2D, whether the Parhelia supports HDTV output, and any info regarding a really concise 2D lab test comparing GF4 to ATI to Matrox. Any additional help would be great; everyone has been wonderful so far. I do, however, spend more time working than playing, and the clarity of 2D output, not to mention the 10-bit/channel capability, is probably of higher importance to me than to most of you, and has a great effect on my decision. HOWEVER, it is a very good thing to know that the OEM Parhelia is slower; I will now consider the additional cost of buying a RETAIL Parhelia vs. an R9700. Thanx again so far, and thanx in advance for any more help anyone can provide! Keep it coming folks!

-Ed
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) Respect to you Rand. Most people who have differing opinions simply bitch and moan; it is very refreshing for people to give their own experiences and viewpoints with reasoning and rationale. When it comes to subjective matters such as '2D' there is little 100% rock-solid info. I had a scout around to find some relevant info on the Rad9700, esp to do with 10bit RGB and 128bit FP.

Tech-Report Rad9700 info

"Full floating-point accuracy throughout ? I saved the best for last, because I'm sneaky like that. This chip's internal accuracy is a staggering leap over anything that came before, because it can handle pixel data all throughout its pipeline?and most importantly, in its pixel shaders?using floating-point datatypes. We're talking about representing a range of values with exponentially more granularity than conventional integer color representations. And because the chip can comprehend fractional numbers, all kinds of complex math?in the form of pixel shader operations?are now possible ... Even R300's frame buffer is floating point. The one exception is the chip's RAMDAC. Final outputs are converted to a 10:10:10:2 RGBA format before they are sent to a display, which only makes sense, frankly."

:) The 10:10:10:2 RGBA thing doesn't seem to relate to a Gigacolor type feature, but the 'Full Floating-point Accuracy Throughout' sounds like a very good image enhancing initiative.

TomsHW Rad9700

"The real key feature of DirectX 9, however, is the introduction of RGBA values in 64 (16-bit FP per color) as well as 128-bit (32-bit FP per color) floating point precision. This great increase of color precision allows a stunningly new amount of visual effects and picture quality."

TomsHW Rad9700 P06

"In the case of a 128-bit FP number, each color channel has a 32-bit floating point precision (IEEE single precision, remember 'SSE'?), which consists of 1 sign bit, 8 exponent bits (7 plus sign) and 23 mantissa bits. This allows a dynamic range from <infinitely small> to <ridiculously large>, and with 23-bit, a much higher precision than the 8-bit precision of the 32-bit color values. It opens the door for a lot of new effects that weren't really possible before. Unfortunately, 128-bit color requires four times the memory bandwidth of 32-bit color. It's going to take a while until memory technology will caught up with that.

:D 'Floating Point Precision Color' would not seem to be the ATI equivalent of Gigacolor; it would be a few steps above it.
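To give a feel for the memory cost that quote mentions, here's a rough sketch (my own numbers, counting a single colour buffer only and ignoring Z/stencil) of what 128-bit FP colour means for framebuffer size:

```python
# Why 128-bit FP colour costs four times the memory and bandwidth of
# ordinary 32-bit colour: 16 bytes per pixel instead of 4.
# (Illustrative only; assumes one colour buffer at 1600x1200, no Z/stencil.)
width, height = 1600, 1200
pixels = width * height

for name, bits_per_pixel in (("32-bit integer (8:8:8:8)", 32),
                             ("128-bit FP (32:32:32:32)", 128)):
    mbytes = pixels * bits_per_pixel / 8 / (1024 ** 2)
    print(f"{name}: {mbytes:.1f} MB colour buffer at {width}x{height}")

# 32-bit integer (8:8:8:8): 7.3 MB colour buffer at 1600x1200
# 128-bit FP (32:32:32:32): 29.3 MB colour buffer at 1600x1200
```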

TomsHW Rad9700 P11

"Display Output: Matrox introduced it already with Parhelia and it will be a part of Microsoft's DirectX 9; by "it" we mean the new 10/10/10 bit color precision output format that should provide us with a more vibrant image experience. Usual 32-bit color is only using 24-bit for the actual color information, while the remaining 8-bit are not used for the output to a CRT or flat panel. Radeon 9700 is able to use 10-bit precision for each color channel, supplying 1024 different levels of red, green and blue rather than the mere 256 different levels known so far. I am sure that analog output devices have a good chance of benefiting from this new format, while I wouldn't know how digital flat panels are supposed to handle the additional two bits per color channel."

;) This would certainly suggest 10bit RGB just like the Parhelia.

AnAnd Rad9700

"This time around, ATI will be the first to jump on the next-generation bandwagon by outfitting the R300 with eight 128-bit floating point pixel rendering pipelines. This is a huge improvement over the four 64-bit integer rendering pipelines of the GeForce4 and Radeon 8500, and it also explains where a lot of those transistors went in the R300?s design. Not only did ATI double the number of rendering pipelines but they also doubled the precision and moved to a fully floating point pipeline to increase precision further. A fully floating point 3D pipeline will be a DirectX 9 requirement, and moving forward you?ll see the fastest DX9 parts employ a similar 8-pipe configuration ... The additional bits of precision achieved through the new floating point pipelines can be used to more accurately depict lighting. With the old 32-bit integer pipelines each RBGA value was limited to 8-bits, or 256 distinct values. Instead of only having 256 values per component, each color component can now be represented by, in effect, an unlimited number of values, giving you a truly dynamic range of brightness."

;) Once again another vote for the huge superiority of 128bit FP precision over both 8bit RGB and 10bit RGB (Matrox Gigacolor). Not a sniff here of 10bit RGB support from the Rad9700 though.

HardOCP Rad9700 specs (p)review

"The advanced DirectX 9.0 pixel shader engines of the RADEON 9700 are designed to handle floating point operations, which provide increased range and precision compared to the integer operations used in earlier designs. The engines provide up to 96-bits of precision for all calculations, which is a necessity for re-creating studio-quality visual effects ... The RADEON 9700 supports a new, high precision 10-bit per color channel frame buffer format enabled by DirectX 9.0. This enhancement of the standard 32-bpp color format (which supports just 8 bits per color channel) is capable of representing over one billion distinct colors, resulting in sharper, clearer images with more faithful color reproduction."

:D Yet again a clear indication of the 10bit RGB capability that Matrox spin the Gigacolor name around.

ExtremeTech Rad9700

"Each color channel (RGBA) gets single-precision floating point. ATI considered a 64-bit implementation where each channel would be have what's called a "short float" precision of 16 bits, but ATI instead opted to shoot higher and went ahead with a 128-bit implementation. This 128-bit precision is maintained throughout Radeon 9700's entire rendering pipeline, right up until the RAMDAC scan-out, where it's dithered down to 32-bit. However the bit allocation at this point is 10 bits per color (RGB), allowing for just over one billion possible colors. This allocation is similar to that found in Matrox's GigaColor technology, and this feature is now exposed in DirectX 9. ATI pointed out that while final values written to the color buffer are 32-bit values, the R9700 can write temporary 128-bit float values to scratch memory in the frame buffer in the course of rendering operations, and then read those values back into the VPU for additional rendering with the floating-point precision preserved."

:D Another vote for Rad9700 ability for both 10bit RGB and 128bit FP precision.

SharkyExtreme Rad9700

"RADEON 9700 Video Processing Engine: In addition to high-end 3D power, ATI hasn't neglected the other parts of the video game, and has enhanced the Radeon 9700 with VIDEOSHADER technology and upgraded the color depth to 10-bit per color channel. VIDEOSHADER essentially lets the Radeon 9700 use its programmable pixel shaders when dealing with more conventional video playback. Streaming video is one area ATI is excited about, and VIDEOSHADER can smooth over video artifacts and yield a much cleaner image. This is also true of DVD movies or any form of video playback (even TV), and it also provides real-time noise filtering on captured video, along with the ability to apply real-time effects such as blurring. The ATI RADEON 9700 also supports a new, high precision 10-bit per color channel format, this is quite similar to the Gigacolor Technology found on the Matrox Parhelia. This lets the Radeon 9700 use over one billion colors, which should be useful for graphics professionals."
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"So, can anyone provide me a concrete answer here? Is the Radeon 9700 worth the extra $60-$70 over the Parhelia-512? How does the 2D image quality compare to the Parhelia's?"

I don't really see how you expect anyone to know this when the 9700 won't be available until August and no one that I have seen here has a Parhelia. Unless you expect Anand himself to chime in here no one has seen both of these products in action in a controlled environment.

"Will they make this for Radeon 9700, or does it already include HDTV output function"

It's integrated into the board. Matrox doesn't offer HDTV output, but it does have 10bit DVD decoding, which ATi doesn't.

"Parhelia offers the amazing Gigacolor feature; does Radeon 9700 offer similar?"

No, it doesn't.

 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D I know I have read a LOT of positive info about GF4 '2D' image quality. It is certainly a world apart from the GF3 you are currently using. As you can see I've spent some time scouring the web already, so you'll have to forgive me if I only supply one link showing that GF4 '2D' IQ is above that of the Radeon 8500 and not far behind the hugely expensive and under-powered Parhelia. It is a PDF file (Adobe Acrobat Reader file) distributed by Matrox to cover how their 'fantastic' Parhelia is the king of '2D' IQ, but I think it shows us that both ATI and GF4 have truly caught up with Matrox for IQ.

Matrox PDF

;) My advice would be that $400 for the Parhelia would be a ridiculous waste considering you get almost NO perf gain for your expense. If you need something NOW then a GF4TI4200 would prove very cost effective ($150) and a great improvement in all departments. Otherwise a Rad8500 / 8500LE card ($100) MADE BY ATI (not just 'Powered by ATI') would give about even perf but significantly improve '2D' IQ. If you can wait a bit then the Rad9500 could certainly be your answer. By waiting, the Parhelia and Rad9700 should come down in price, particularly when the new nVidia card is out in November; whether or not you are directly interested in it, it will still help to bring prices down. You could get a 4200 NOW, and then wait until early 2003 to see how the new cards compare, esp price-wise, and there should be plenty of image quality reviews by then; who knows, you may find GF4 more than suits your needs (esp if you buy a brand Rand suggested). It is important to remember that the Rad9700 isn't even available yet; the clock speeds haven't even been 100% finalised. As such I would suggest you get a 4200 or Rad8500 type card, enjoy increased perf and hugely better '2D', and wait a few months to see what happens and what rock-solid info there is.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D I think we should take a look at Matrox's Gigacolor.

TomsHW Parhelia

"With regard to 24-bit True Color display, the latest graphics cards limit themselves to 16,777,216 simultaneously displayed colors (256³). For each value of red, green and blue in the VGA signal, there are 8 bits per value available. A green tone can therefore be displayed in 256 different intensities. By comparison, Parhelia is capable of displaying each channel in 10-bit, which means that 1,073,741,824 colors can be displayed simultaneously - this is 64 times more than the standard cards. Matrox calls this mode 'GigaColor'."

;) So Gigacolor is the (dumb and unnecessary) marketing term for using 10+10+10+2 bits for R+G+B+A, giving 1024 shades each of R, G & B instead of the usual 8+8+8+8 bits for R+G+B+A, which 'only' gives 256 shades each. However, Alpha handles the fogging, smoke and transparency effects, doesn't it? Won't these be severely reduced when using 10bit mode? If so, games wouldn't really benefit from 10bit as much as you might think, so it would be much more of an app-enhancing feature.
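To make the alpha trade-off concrete, here's a little sketch of how the two 32-bit layouts split their bits (my own illustration of the arithmetic, not Matrox's or ATI's actual format code; real formats such as D3D's A2R10G10B10 may order the channels differently):

```python
# Both layouts spend exactly 32 bits per pixel; GigaColor-style packing
# trades 6 bits of alpha (256 -> 4 levels) for 2 extra bits on each colour.

def pack_8888(r, g, b, a):
    """Standard 32-bit colour: 8 bits each for R, G, B and A (0-255 each)."""
    return (a << 24) | (r << 16) | (g << 8) | b

def pack_1010102(r, g, b, a):
    """10:10:10:2 colour: 10 bits each for R, G, B (0-1023), only 2 for alpha (0-3)."""
    return (a << 30) | (r << 20) | (g << 10) | b

# Same mid-grey pixel in both layouts; note how little alpha resolution is left.
print(hex(pack_8888(128, 128, 128, 255)))    # 0xff808080 (alpha has 256 levels)
print(hex(pack_1010102(512, 512, 512, 3)))   # 0xe0080200 (alpha has only 4 levels)
```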

"10-bit GigaColor: One of the big features in the Parhelia that people are talking about is GigaColor. Your usual color arrangement in 16.7 million colors is 8-8-8-8 for R-G-B-A respectively (A for alpha). GigaColor takes a total of 6-bits from the Alpha channel and distributes it evenly over RGB for a 10-10-10-2 arrangement. This gives you a palette of over 10 billion colors but leaves the Alpha channel with a little something to be desired. In Matrox?s effort to eliminate banding and increase overall image quality, docking that many bits from the Alpha channel increases banding in transparency effects you often see in games.

FiringSquad Parhelia

"To test the effectiveness of GigaColor as an everyday use feature we heavily scrutinized the viewer included with the Parhelia which is currently the only application available that is capable of displaying 10-bit GigaColor. We wanted to ensure that there weren?t any tricks being played with the viewer since the supplied images (used by the viewer for comparing 16.7m versus 1b colors) contained heavy banding to begin with that 16.7m colors could have easily eliminated ... While GigaColor does make a difference (still require a non-filtered version of GigaColor Viewer to be absolutely certain), it does not make a noticeable difference in everyday use, especially because you can?t see the difference simply by using any image viewer or surfing around your desktop. GigaColor is mainly positioned at the graphics artist who wants to see what his or her masterpiece looks like in GigaColor and print in GigaColor as well. Microsoft?s next generation OS will have support for higher color support out of the box and Matrox will be ready when the new OS is released. It?s not everyday that you stare at 5 or 6 highly contrasting colors (only way to significantly see banding) on the screen, but rather looking at many millions of colors mixed together in games and photographs, making banding virtually impossible to tell. Matrox insures us that they will be quickly updating the viewer to support large files. We?re approaching GigaColor from a gamer?s point of view but we have heard others in different areas of the industry like photography and film who think very highly of GigaColor. Don?t get us wrong though. We think GigaColor is a great feature to have, it?s just not that practical for gamers at the moment."

:D The above link/review also shows that Parhelia '2D' IQ is better than GF4 (Leadtek), BUT ONLY VERY SLIGHTLY, and it involves a lot of squinting and staring at high-contrast purpose-made images, not very real world. Worth the $400 price tag? I don't think so. Certainly a vote for a $150 Leadtek GF4TI4200, at least until a cheaper version of the Rad9700 or Parhelia is released.
 

Bingo13

Golden Member
Jan 30, 2000
1,269
0
0
I can truly tell you that the Parhelia is at least one level better on 2D quality than the Leadtek Ti4600/Radeon 8500 using a Sony F520. Not only that, but the IQ in games is better than on either of the above cards. In fact, if you were to sit down and play games on the Parhelia you would hardly notice any "measurable" difference in the performance of the cards unless you turned on the FPS counters. This is at a 1280x1024 resolution with high quality settings.
I was disappointed that the true performance potential of the Parhelia was not realized by Matrox, and I strongly believe that the next release of FPS-MP games will probably cause issues with the card unless the drivers (OpenGL/overclock tweaks) have reached their full potential. I will be buying the Radeon 9700 to replace my Ti4600 cards in my game machines, but my work machines will all have the Parhelia.
I have not had any gaming issues to date, the drivers are very solid, image quality is fantastic, TV out is leagues better than ATI/nVidia, and DualHead operation is great (I've left out TripleHead). However, the card is still priced higher than it should be, and if you are a gamer after ultra-high framerates (required in certain situations) then this is not your card.
 

EddNog

Senior member
Oct 25, 1999
227
0
0
"Ho boy" this little thread is slowing down; nonetheless, it seems we have a couple people who read past the first page. :-D Well AnAnd I can probably agree that because Gigacolor reduces Alpha channels down to 2 bits, it's basically useless in games; however the Gigacolor function doesn't matter to me in games anyway. I'm here for the Gigacolor function in 2D and on the desktop, where I'm often doing my graphic manipulation, page layouts, illustrating etc. etc. etc. This would also be why I am seriously needing a replacement for my GF3; if I didn't do so much in my 2D environment, relying on still images remaining STILL, I'd stick to my GF3; in games, the poor filtering hardly has a great effect.

-Ed
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
However, Alpha handles the fogging, smoke and transparency effects, doesn't it? Won't these be severely reduced when using 10bit mode? If so, games wouldn't really benefit from 10bit as much as you might think, so it would be much more of an app-enhancing feature.

Which is precisely why I can't see GigaColor being of benefit in most current games, as it would tend to induce significant image corruption in some areas.
Even were one to play an older game that wouldn't need the extra bits for alpha, most games tend to use a fairly uniform color palette and most likely wouldn't be in a position to benefit much from GigaColor.

There will of course be exceptions, but by and large I don't think Matrox's implementation is well suited to games.
Besides DVD playback, which would seem to be an almost ideal scenario for GigaColor, I wouldn't expect it to have a dramatic advantage unless one is a 2D graphics artist.

Bingo13, thanks for giving us a first hand impression of the Parhelia relative to the Ti4600/Radeon 8500.
Undoubtedly their multi-monitor implementation is still as flexible and feature-filled as that which gained such high accolades with the G4/5XX series. I'd like to see how well it extends to three monitors though.
Prior to the Parhelia 3+ monitors was solely the domain of Appian and Matrox's MMS series boards.

I notice you mentioned the TV-Out is "leagues better than ATI/Nvidia"; I must say that surprises me. ATi specifically has tended to thrive upon multimedia features like TV-Out. That said, I have heard the TV-Out capabilities have improved tremendously on the Parhelia....
I shall be interested to see how it matches up from my own perspective once I get a chance to test her out.

Quite a roundup of ATi + GigaColor comments AnAndAustin :)
We still seem to be getting a mixed opinion on whether the R9700 supports 10bit RGB though.