Parhelia-512 vs. Radeon 9700...


TheBug

Junior Member
Jul 20, 2002
23
0
0
AnAndAustin,

Thanks for the analysis. The CRT monitors used in the test were Sony Multiscan CPD-G400s set to the recommended 1280x1024 @ 85Hz. They were bought in the same order batch, so I'm assuming their manufacture dates were close. In addition, I did a hardware reset on the monitors before presenting them to the testers.

I'm not sure about the brand of the GeForce cards since they were bought before I was hired. I will dissect the machines after hours today and post the brands here. As for the GF3MX, it was my mistake. I meant to write GF4MX. I guess the lateness of the night does get to me sometimes. :eek:

Rand,

Personally, I was surprised to see the GF cards being out-voted by the Rad8500. I will find out what GF cards were used after hours today.

The video was set to 32-bit on all boards. What I did was open a Word document (at 100% zoom), an Excel spreadsheet (at 100% zoom), an Internet Explorer page (mostly text), and Adobe Photoshop with one of its sample images, and then asked the testers to go through all of them.

No thanks needed. We were all on company time anyway. Getting paid to do interesting things is definitely worth the time.

On a separate note, I installed Parhelia on my home machine last night and my girlfriend worked on the computer for about 30 minutes after that. This morning, she asked if I did anything to the system because the video was clearer even though I never told her about the switch from Rad8500 to Parhelia. I would still say that the 2D IQ depends greatly on the person, but there are definitely differences between the Parhelia and its competitors.

Also, I tried playing JKII (yes, I finally got a copy) last night on both the Rad8500 and the Parhelia. I didn't see any difference in 3D quality (maybe I had to enable/disable something?), but under the exact same video settings (in JKII, the highest setting on everything but texture quality, I think, or whatever setting had "low", "medium", "high", and "very high"), the performance was about the same @ 1152x864 32-bit. I will definitely be keeping the Parhelia. :D
 

Bingo13

Golden Member
Jan 30, 2000
1,269
0
0
TheBug,

New drivers (231) were released today by Matrox. Be sure you have the latest patch for JKII and turn on GL Extensions, as it increases performance.

A very good site for the Parhelia is here:

Link
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) Yup, it seems that for most of us the GF4TI and Rad8500 cards are excellent (esp price:perf) and certainly have more than good enough IQ for the vast majority of users. If you don't mind paying the extra then the Parhelia has the best IQ (which should be most apparent at 1600x1200 and above); the downside is that you only get GF3/Rad8500 perf from the $400 card, but of course if you happen to have 3 monitors then 'surround gaming' should be a very sweet bonus. It is just a shame that the Rad9700 retails for the same price yet offers full DX9 compliance (FP precision and 128-bit colour blow away 10-bit RGB) and more than DOUBLE the 3D perf!

:( Regarding OEM versions coming with lower clocks: nVidia cards don't, but Matrox and Radeon cards do, so be warned. Generally you are only talking a 10% perf hit at worst from the lower clocks, so if the price diff is more than 10% an OEM Matrox or Radeon card will do you no harm. In any case, I think the non-ATI-branded Radeons ('powered by ATI') are the biggest rip-off: they use stealthily lowered clocks and cheaper RAM, and from what I hear the IQ suffers too. Again, whether or not the non-ATI Radeons are worth it comes down to the price diffs.

:) I think the belief among most people on this forum that the Rad8500 has better IQ than the GF4TI mostly comes from the Rad8500's superiority over the very variable GF3 cards. The Rad8500 certainly does have excellent IQ; I think the Parhelia and GF4TI tested were simply that little bit better. IQ certainly seems to be something nVidia has addressed with the GF4, certainly the GF4TI. In any case, both the Rad8500 and the GF4TI have excellent IQ and are excellent buys for the vast majority of buyers: the Rad8500 is slower but costs around $90 while the GF4TI4200 starts from around $140, and both are exceptional value for money.

;) Regarding RAMDACs and dual-display functionality, the GF3 certainly couldn't operate 2 devices simultaneously, let alone at different resolutions or refresh rates; however, this again has all been remedied by the GF4 cards.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
TheBug, you said no thanks is required but you truly deserve it!

This is extremely interesting information, and your testing was evidently conducted fairly and reasonably with a solid range of different scenes viewed.
One seldom gets to see the results of a completely unbiased survey like this, and I suspect it will definitely be beneficial for both myself and others.

I look forward to hearing what brand of GeForce cards were used, thanks for checking :)


The CRT monitors used in the test were Sony Multiscan CPD-G400s set to the recommended 1280x1024 @ 85Hz. They were bought in the same order batch, so I'm assuming their manufacture dates were close. In addition, I did a hardware reset on the monitors before presenting them to the testers.

It is highly likely the results of the test would change at higher resolutions, where solid and crisp 2D output becomes even more important.
This may lead to a re-ordering of the graphics cards in the votes received, but most likely the boards would stay in the same order and the differences between the respective cards would become even more pronounced.

This certainly shows that even at 1280x1024, there are definitely visible differences between the graphics cards that would likely become more/less pronounced depending upon the resolution selected.


I installed Parhelia on my home machine last night and my girlfriend worked on the computer for about 30 minutes after that. This morning, she asked if I did anything to the system because the video was clearer even though I never told her about the switch from Rad8500 to Parhelia. I would still say that the 2D IQ depends greatly on the person, but there are definitely differences between the Parhelia and its competitors.

I am going to presume from that response that the differences are clearly visible even to the average person, if she noticed the improved quality so easily that it was still uppermost in her mind the next morning and she asked about it.


New drivers (231) were released today by Matrox

Christ! What is this... the fourth or fifth driver release in about 2 weeks?!
I'm not sure whether to be disturbed or pleased by that. Either way, Matrox certainly isn't slacking off on driver support!
That's a very positive sign, given that their usually very solid drivers have had OpenGL issues with past graphics cards.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Thanks for the analysis. The CRT monitors used in the test were Sony Multiscan CPD-G400s set to the recommended 1280x1024 @ 85Hz. They were bought in the same order batch, so I'm assuming their manufacture dates were close. In addition, I did a hardware reset on the monitors before presenting them to the testers.

Very nice analysis. One thing I would like to contribute.

A few months ago I did a mini-review of 2D quality to try and "debunk" the Matrox myth. My testing wasn't as unbiased and controlled as yours, but it was informative to me.

I basically tested an ATI Radeon LE, a Gainward GeForce3, a Creative Labs GeForce2, a 3dfx Voodoo5, and a Matrox G400 on my Sony G400 monitor. There was an awful lot of card swapping, so I was relying on my "visual memory", which isn't a very good method.

So why am I bringing this up??? I ended up running the test twice because I found that the default gamma settings for each card varied enough to make me believe they were more "different" than they really were. So I ran through each card again, normalizing all the color settings using some program that came with a Nokia monitor that I had.

My end results were pretty similar to your "test". The only card which really stood out multiple times was the Matrox. The ATI Radeon, the Gainward, and the Voodoo5 were basically a wash. The GeForce2 was noticeably worse than the rest.

 

TheBug

Junior Member
Jul 20, 2002
23
0
0
So it turned out that the video cards were Dell generic ones without any noticeable markings (at least none that I could find). I didn't have time to check all the tiny text, though.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
Originally posted by: EddNog
Hey, I just realized; Radeon 9700 is DX9. Is Parhelia fully DX9 capable?

-Ed


Nope, it's only partially DX9 compliant. It's fully DX8.0 compliant and implements part of the DX9 feature set but not all of it.
More-so then the GF3/4/R8500 but it doesn't have the full compliancy of the R9700.
 

sash1

Diamond Member
Jul 20, 2001
8,896
1
0
Originally posted by: Rand
Nope, it's only partially DX9 compliant. It's fully DX8.0 compliant and implements part of the DX9 feature set but not all of it.
More-so then the GF3/4/R8500 but it doesn't have the full compliancy of the R9700.
Should be "more-so than." :)

Does it matter that it's not fully DirectX 9 compliant, though? I mean, all the GF2 cards, and even my Kyro II are only DirectX 7 cards, yet they are still working fine even now...

~Aunix

 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) In short, not really. Games have only really started to use DX8 funcs, so it will be a good year or more before many games begin to utilise DX9 funcs. There's no doubting the benefits and superiority of DX9, nor the excellent perf and features of the Rad9700, but DX9 compliance isn't of much significance at this moment in time; buying the Rad9700 for that reason wouldn't be very wise, as by the time it proves useful there will be cheaper and faster DX9 cards anyway.
 

EddNog

Senior member
Oct 25, 1999
227
0
0
Right now the primary concern is to replace my GF3Ti200 with either the Parhelia-512 or the Radeon 9700. It looks like the Radeon 9700 is the better idea; for basically the same price, I get a card that will last me longer because of its high performance and full DX9 compliance. The Parhelia isn't as fast, and it lacks full DX9 compliance; if I were to purchase a Parhelia, it might last me around 1-1.5 years. Perhaps the Radeon 9700 would last me 2-2.5. For that reason alone, the Radeon 9700 looks like a far more intelligent choice. Although I give up GigaColor on the desktop (at least until ATI releases special software support for it in Photoshop like Matrox does), Fragment AA, and perhaps some (probably only barely noticeable, so long as it's better than my seizure-inducing GF3's) 2D quality, I get better performance, full DX9 support (not important now, but important later) and HDTV output. The DX9 support and higher performance mean a longer-lasting card. I have probably made up my mind; but, like I said, I'm still watching for maturity in Matrox's Parhelia drivers, as well as overclockability tests of both cards. Now, if ATI would release the damn card already!

Say, is it true that the 9700 will be the slower piece and that a 9900 part will be available? I would like to know the clocking differences, differences in memory equipped, and differences in price.

-Ed
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
Originally posted by: AunixP35
Originally posted by: Rand
Nope, it's only partially DX9 compliant. It's fully DX8.0 compliant and implements part of the DX9 feature set but not all of it.
More-so then the GF3/4/R8500 but it doesn't have the full compliancy of the R9700.
Should be "more-so than." :)

Does it matter that it's not fully DirectX 9 compliant, though? I mean, all the GF2 cards, and even my Kyro II are only DirectX 7 cards, yet they are still working fine even now...

~Aunix

Right now?
Not really. It'll likely be approaching the DX10 time-frame before it truly becomes appreciable. Only within the last few months have DX8 features started to become strongly utilized, and there are still only a few games that absolutely need a DX8 board to experience the best image quality from the game.

In the future it will certainly benefit the Parhelia. Its texturing units are impressively versatile, the Parhelia will likely show pretty strong vertex shader performance relative to the GF3/4/R8500, and a reasonably effective depth-adaptive tessellation implementation will help as well. All of the above should in the future yield performance dividends for someone with the Parhelia rather than a purely DX8 board.

Additional effects like displacement mapping, which the Parhelia can handle but present DX8 boards cannot, are a nice visual addition as well.

In the short to mid term it won't mean much at all. In the long term it'll be a huge benefit for the Parhelia or any other partially, or preferably fully, DX9 compliant graphics card.


Say, is it true that the 9700 will be the slower piece and that a 9900 part will be available? I would like to know the clocking differences, differences in memory equipped, and differences in price.

Unfortunately that's unknown at present. Current rumours indicate a possible Radeon 9500, which would likely be nothing more than a lower-clocked R9700.
Other rumours indicate an R9900/10,000 which would likely be a dumb shrink to .13u fabrication process with slightly higher clockspeeds.
That would likely appear at the end of this year or in Q1 2003, though.
 

sash1

Diamond Member
Jul 20, 2001
8,896
1
0
Other rumours indicate an R9900/10,000 which would likely be a dumb shrink to .13u fabrication process with slightly higher clockspeeds.
That would likely appear at the end of this year or in Q1 2003, though.
Why would they want to make their chip smaller? The cost of revamping chip fabrication is becoming increasingly high. But if you look at FS:
"Other details that we do know is that NV3x will be built off TSMC's 0.13-micron manufacturing process, will fully support AGP 8X, and supports high-speed DDR-II memory."
ATi might just be wanting to follow suit with nVidia's plans.

But will AGP 8x really be worth it? Do we really need a transfer rate of 2GB/sec?

~Aunix
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
QUOTE: "will AGP 8x really be worth it? Do we really need a transfer rate of 2Gb/sec?"

;) The short answer is NO. AGP8x is much more about marketing and selling new hw than actual perf gains, it is superior and it has its potential and it will become the dominant standard, but don't expect much, if any, perf gain over running in 4x mode. There is actually little speed loss when switching from 4x to 2x with any current gfx card, and as such AGP8x bandwidth is unlikely to be of benefit, much like ATA133 and ATA100 really. USB2 and DDR-II should bring a big leap in perf, but some things are just marketing blurb.
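For anyone who wants the back-of-envelope numbers, here's a rough sketch in Python using the nominal spec figures (assumed peaks, not anything measured) of where that ~2GB/sec figure comes from:

# Nominal AGP bandwidth: ~66.67MHz base clock, 32-bit (4-byte) bus,
# multiplied by the number of transfers per clock for each mode.
BASE_CLOCK_HZ = 66.67e6
BUS_WIDTH_BYTES = 4

for mode in (1, 2, 4, 8):
    mb_per_s = BASE_CLOCK_HZ * BUS_WIDTH_BYTES * mode / 1e6
    print(f"AGP {mode}x: ~{mb_per_s:.0f} MB/s")

# -> roughly 267, 533, 1067 and 2133 MB/s: AGP8x is the ~2GB/sec quoted above,
#    and 4x already sits at around 1GB/sec.

Those are theoretical peaks, of course; real transfers never get close, which is part of why 4x vs 8x makes so little practical difference.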

:D There is no doubt in my mind, EddNog: either get a GF4TI4200-128MB or pay the price for the top-of-the-range Rad9700 if you can't wait a few months. Just remember that buying top of the range is always a sure way to get the product which devalues the quickest, but if you have $400 to spend, in your case and most other people's, the Rad9700 is the clear choice.

:) Regarding future cards, the new nVidia cards (Nov/Dec) should be faster than the Rad9700, but I doubt it will be by much. I would expect the Rad9500 to come out pretty soon, aimed at the $150-200 price range; even if it 'only' offered half the perf of the Rad9700 it would still be very competitive and still much faster than the Parhelia512. I would expect ATI to release a couple of new revisions of the Rad9700 to compete with nVidia's new offerings, much in the way nVidia released the GF3TI200 and GF3TI500 when the Radeon8500 cards were giving the standard GF3 far too much heat, and of course ditto for the GF2MX & TI cards.
 

EddNog

Senior member
Oct 25, 1999
227
0
0
Since I basically give my old stuff away to family and friends, the immediate devaluation of a video card is of little importance; anyway, whether I buy a $400 card or a $150 card, they'll both be worth about the same in 3 years. The point, though, is this: spend $400 now and don't spend any money on a video card for another 2-3 years, versus spend $150 now and then, in only a year or a year and a half, spend another $150 or more on a newer card. This is the main reason why I've decided that I'll probably get the R9700 over the P512; the P512's performance makes the card less valuable sooner because I would replace it sooner. The Radeon would last me longer because of its stellar performance; sure, it won't be the fastest thing out in a couple of years, but at least it won't be the slowest thing around. My Parhelia could end up that way, left to the merits of its current performance (current implying performance with current drivers; later drivers, one hopes for Matrox's sake, should be faster).

-Ed
 

Bothware

Member
Jul 26, 2002
25
0
0
Originally posted by: AnAndAustin
QUOTE: "will AGP 8x really be worth it? Do we really need a transfer rate of 2Gb/sec?"

;) The short answer is NO. AGP8x is much more about marketing and selling new hw than actual perf gains, it is superior and it has its potential and it will become the dominant standard, but don't expect much, if any, perf gain over running in 4x mode. There is actually little speed loss when switching from 4x to 2x with any current gfx card, and as such AGP8x bandwidth is unlikely to be of benefit, much like ATA133 and ATA100 really. USB2 and DDR-II should bring a big leap in perf, but some things are just marketing blurb.

Actually, it should make a difference. 4x AGP bandwidth can be completely filled by some current texture transfers, and 4x AGP also limits the level of improvement a faster CPU makes. The R9700 has ~20GB/s of internal memory bandwidth and current main RAM has transfer rates of over 3.2GB/s, so having a ~1GB/s connection between them is a limiting factor and should make a difference (albeit a small one in most games).

ATA133 could be considered a marketing ploy because drives can't physically transfer fast enough, even with 2 drives on the connection; however, this doesn't compare to AGP, where the AGP bus is slower than the components on either side of it.
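To put those numbers side by side, here's a rough sketch (Python, nominal/approximate figures rather than measurements) of the peak bandwidth at each hop between main RAM and the card's local memory:

# Approximate peak bandwidths (MB/s) along the path from system RAM to the
# R9700's local memory -- nominal figures, real throughput will be lower.
bandwidths_mb_s = {
    "System RAM (DDR, PC2700-ish)": 2700,
    "AGP 4x link": 1066,
    "AGP 8x link": 2133,
    "R9700 local memory (256-bit DDR)": 19800,
}

for hop, bw in bandwidths_mb_s.items():
    print(f"{hop:<34} {bw / 1000:5.1f} GB/s")

# The AGP 4x link is the narrowest hop in the chain, which is the point above:
# texture traffic from system RAM can saturate it even though both ends are faster.

Whether that narrowest hop actually hurts depends on how much texture data has to cross it each frame, which is why the real-world gain is usually small.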
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:eek: So that's basically what I said: "The short answer is NO. AGP8x is much more about marketing and selling new hw than actual perf gains, it is superior and it has its potential and it will become the dominant standard, but don't expect much, if any, perf gain over running in 4x mode."

;) It will of course be interesting to see precisely how the Rad9700 compares when plugged into both AGP 2.0 (4x) and AGP 3.0 (8x) slots. However, very little is lost with most cards going from 4x to 2x, which shows that in actual perf terms even the 133MHz (2x) rate isn't too limiting. Much in the same way as 256MB of gfx RAM: in theory and on paper its benefits are obvious, but in real games, even games out over the next 12 months, it will make no difference. In a couple of years, who knows, but in the near future I severely doubt it.
 

EddNog

Senior member
Oct 25, 1999
227
0
0
Would it not be that with the ????JB SE models from WD, the 8MB cache could take slight advantage of the UDMA133 connection?

-Ed
 

Bothware

Member
Jul 26, 2002
25
0
0
Originally posted by: EddNog
Would it not be that with the ????JB SE models from WD, the 8MB cache could take slight advantage of the UDMA133 connection?

-Ed

NO, the 8MB won't make any difference in normal use. For it to be effective you would have to request reads of the SAME data over and over again, WITHOUT changing it: this is unrealistic behaviour for applications.

Yes, the burst throughput of the cache is capable of >133MB/s, but the disk can't fill the cache that quickly. Also, a bigger cache could result in the loss of more data in the case of power loss.
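Rough arithmetic to illustrate (Python; the ~45MB/s sustained platter rate is just my assumption for a typical 7200rpm drive of this era, not a measured figure):

# Time to fill an 8MB drive cache from the platters vs. time to drain it over
# the ATA interface -- shows why the interface speed is rarely the bottleneck.
CACHE_MB = 8
PLATTER_MB_S = 45       # assumed sustained read rate for a 2002-era drive
ATA100_MB_S = 100
ATA133_MB_S = 133

print(f"Fill from platters: {CACHE_MB / PLATTER_MB_S * 1000:.0f} ms")   # ~178 ms
print(f"Drain over ATA100:  {CACHE_MB / ATA100_MB_S * 1000:.0f} ms")    # ~80 ms
print(f"Drain over ATA133:  {CACHE_MB / ATA133_MB_S * 1000:.0f} ms")    # ~60 ms

# Either interface empties the cache far faster than the platters can refill it,
# so ATA133 over ATA100 only matters on the rare reads served straight from cache.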

This is starting to get a bit OT...