***OFFICIAL*** Parhelia Countdown thread, constant updates, Anand's article is up!


Mem

Lifer
Apr 23, 2000
21,476
13
81
I don't know if Parhelia will satisfy those (but I doubt that it won't), but I'm sure I won't say "I won't buy the board because it doesn't have bandwidth saving".

It really comes down to what you want in a graphics card. I'm a gamer and will wait until the 4th quarter before I decide to buy my next graphics card. Is the Parhelia the graphics card for me? I don't think so. However, there's always room in the market for the Matrox Parhelia, which means more choice for the consumer, and Matrox's famous 2D quality will be more than enough for some buyers who want a new card.

You will always get both positive and negative comments with any new graphics card. We still have to wait for real benchmarks and real prices, and to see how the drivers perform. Remember, not every gamer upgrades every 6 months (I'm not one of them, even though I have the funds to do so), and not everybody runs at 1600x1200, so there are many reasons why people buy different graphics cards. The old saying is true: "it's your money, your choice".

As a gamer I would like to see more choices when it comes to graphics cards.

:)



 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: grant2
Dude, you are taking the early (yes, early being the key word) impressions of the card and quoting people like some plague! You need to give this card more time; there are no official benchmarks yet!

When people write "This card sounds great, I will definitely buy it!" you don't say a word, but when people write "This card doesn't sound so good, I probably won't buy it" you chastise them for taking early impressions & not giving it enough time.

Hypocrite.
Oh my, I'm shocked, not at the fact that you are calling me a hypocrite, but at the fact that you totally fabricated that quote!

When did anyone say "This card sounds great, I will definitely buy it!"?!?!

If they did then they are obviously walking blind. I've looked up and down in this thread, and found nothing even comparable to what you are saying!?!?

I've read "This card is looking sweeter as I read more about it.", "this card looks very tempting for me, though I do game I spend far more time doing 2D work.", and "As long as they keep producing boards like they have till now and provide me with super quality and features and some decent 3D, I'm sold." The last quote can be attributed to a previous experience, which in this case was a plus for that person. Or maybe a G400/550 owner, who is sold because this would be a technological leap for them, from what they currently have to what their next card might be.

Suggesting I'm a hypocrite for not chastising someone because they agree this card "is looking sweet" or "looks very tempting" is getting VERY desperate, wouldn't you think?

If you already own an 8500 or a GF3/4 Ti, then this card is most likely not an option unless benchmarks show otherwise!

There is nothing wrong with praising this card for its technology, but to suggest this card has nothing exciting is a matter of opinion. All you need to do is look at anyone's review, and you will see there is a lot to be excited about, for all video card manufacturers. This is just a small step into the future of where video cards are headed. If that doesn't get you excited then I don't know what will. Personally, I think it's all personal feelings at this point!

 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: mastabog
p.s. if you have the time, try reading this thread from top to bottom (well, not all of it) and you will see how at the start everyone was thrilled, and after the 'reviews' (only spec stories) it ended up in some kind of dispute between the ones who already don't like it because they've read that Anand said something or Tom said something (no offense intended at all), and others who already love it or give it credit (which I don't blame them for).
Agreed!

Though the part I don't understand is that many of them (if not all) had exciting things to say!

 

socketman

Member
Mar 4, 2002
116
0
0
Some of the features of the Parhelia look good to me: 16x FSAA, displacement mapping, and more. But it's just not the card for me at that price tag. What is most exciting is the competition. Right now there is only Nvidia vs. ATI/Matrox/3DLabs.
I think we can all agree that Nvidia has the upper hand against all 3 (in market share). So while the Parhelia isn't the pinnacle of all things electronic, it's a good card with good features = competition = good for end users.
So color me happy :)

If it ever comes down in price I would love to try it out on a quality monitor to check out this 2D stuff everyone keeps raving about. Personally I didn't think 2D quality was an issue... after reading this thread I guess it is. Anyone know of a review or article that compares 2D quality?

 

Hulk

Diamond Member
Oct 9, 1999
5,173
3,798
136
Even if for all intents and purposes it is a preview, there should be some benches.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Hulk
Even if for all intents and purposes it is a preview, there should be some benches.

You can't bench what you don't have.
 

CocaCola5

Golden Member
Jan 5, 2001
1,599
0
0
Anyone else think CRTs in surround gaming are NOT really a viable option? I mean, for all practical purposes (weight first, then power draw), you'd probably need to go with LCDs. Three 15" flats would still rock. :)
 

RedRooster

Diamond Member
Sep 14, 2000
6,596
0
76
Originally posted by: NOX
Originally posted by: grant2
Dude, you are taking the early (yes, early being the key word) impressions of the card and quoting people like some plague! You need to give this card more time; there are no official benchmarks yet!

When people write "This card sounds great, I will definitely buy it!" you don't say a word, but when people write "This card doesn't sound so good, I probably won't buy it" you chastise them for taking early impressions & not giving it enough time.

Hypocrite.
Oh my, I'm shocked, not at the fact that you are calling me a hypocrite, but at the fact that you totally fabricated that quote!

When did anyone say "This card sounds great, I will definitely buy it!"?!?!

If they did then they are obviously walking blind. I've looked up and down in this thread, and found nothing even comparable to what you are saying!?!?

Guess you missed the "Now I've found a card that suits me so I'll buy this one for sure :)" care of Mr. Czar. ;)
There, that's my two cents. Post count+1




NVidia, much like O'Doyle, rules!

 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Though I agree there wasn't anything terribly exciting, I too would like to know what exactly you consider exciting. It's a video card; when is it ever exciting, and what were you expecting?

When the Athlon XP came out, I got excited because the T-Bird was aging and made insane amounts of heat. It offered less heat, SSE (which is something AMD didn't have), a good deal more performance, and higher clock speeds. When the KT266A came out I got excited about DDR; PC133 was becoming pretty slow. The GF4 Ti4600 was, at the time, exciting: it offered a second vertex shader and simply insane performance. I didn't buy a GF4 because I simply don't need such power, but since I have seen what an Xbox can do, I was interested in seeing that type of graphics power come to the computer.

I'm not excited about the Parhelia because: 1) It doesn't have any memory bandwidth saving features, which have been in use for quite some time now. Instead they opted to make the thing extremely expensive with an insane amount of memory bandwidth, and when a lesser version is released, it will most likely be crippled by a lack of sufficient bandwidth, so a high-performance, cost-effective card seems impossible. 2) There are too many drawbacks to its "new" features. 3) I just plain don't feel this product would justify its hefty price tag for me.

Now these are my feelings; I could be wrong in my assumptions, and the card could end up being truly amazing and also decently priced. The important thing is that I'm not casting any judgment, I'm just voicing what I feel about the product with the current info I have. I was not arguing; I was trying to explain why I feel the way I do about the card, and apparently people were taking it wrong. Either way, I'll be keeping my eye on the card to see what becomes of it, just like most any computer aficionado would.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
NVidia, much like O'Doyle, rules!

Oh dear..... :rolleyes:
Someone is bound to bite your head off for that one.
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: bdog231
NVidia, much like O'Doyle, rules!

Oh dear..... :rolleyes:
Someone is bound to bite your head off for that one.
Naa... I wouldn't do that. If he reads my post again I'm sure he'll figure it out. :)
 

TheWart

Diamond Member
Dec 17, 2000
5,219
1
76
THE PARHELIA WILL BE THE BEST CARD EVER!!


















Just joking..... Seriously, almost all of these arguments are going to change once benchmarks come out, and even then they will change after the initial price goes down. I personally don't see the big fuss, but maybe I am just jealous because I am on a GF2, but oh well. I say we just calm down and wait for the /sarcasm/ really important features - synthetic benchmark performance /sarcasm/ to tell us how much this card is worth. :)
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: grant2
Ya, as if Matrox wouldn't like to sell a $500 card to the same person every 6 months. I have the funny feeling Matrox isn't allergic to profit...

Last time I checked, Matrox wasn't on the 6-month product cycle. In fact, a lot of people here have been complaining that Matrox's product cycle is too long! There IS a .13 micron Parhelia (with improvements) in the pipeline, though.

And are you saying we have anything but hype from Matrox?

Huh? Last time I checked, we have a multi-page article on AnandTech regarding the Parhelia. It's loaded with technical data. All we have regarding the NV30 is the "It'll blow you away!" comments from Jen-Hsun Huang.

I wonder where this "nv30 vs. parhelia" talk is coming from since there are absolutely *0* products and *0* benchmarks available for either product.

We have the specs of the Parhelia, so we can already make some guesstimates regarding its performance. As for Parhelia vs. NV30... beats me. There seem to be people here who disregard the Parhelia because "NV30 is just around the corner". Go ask them why they turn this into a Parhelia vs. NV30 debate.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
The lack of HSR really concerns me, especially when it comes to pixel shading operations executing on useless (i.e. hidden) pixels. We'll have to wait for the benchmarks to see just how well 20 GB/sec stacks up against nVidia's ~10 GB/sec + LMA II.

Image quality: Seems to be the best there is.

According to the specs it certainly has the potential but we'll have to see just how well it does against nVidia's offerings (GF4 in particular).

Mingon, I think your 25% estimate is a tad conservative; the figure might be higher. In some cases the Det4s increased performance by 50%, and that was pretty much because nVidia optimised around their first version of LMA on the GF3.
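
As a rough illustration of the kind of arithmetic behind these bandwidth claims, here is a small back-of-envelope sketch in Python. Every figure in it (resolution, overdraw, per-pixel traffic, the 30% savings) is an assumption picked for illustration, not a measured Parhelia or GeForce number.

# Per-frame framebuffer traffic, and what a hypothetical ~30% bandwidth-saving
# scheme (occlusion culling / compression, LMA-style) does to the requirement.
# All figures here are illustrative assumptions.

width, height = 1600, 1200        # assumed resolution
framebuffer_bytes = 4 + 4         # 32-bit colour write + 32-bit Z access per pixel (simplified)
texture_bytes = 8                 # assumed texel traffic per pixel per pass (simplified)
overdraw = 3.0                    # assumed average overdraw factor
fps = 60                          # target frame rate

bytes_per_frame = width * height * overdraw * (framebuffer_bytes + texture_bytes)
raw_gb_per_sec = bytes_per_frame * fps / 1e9

savings = 0.30                    # assumed effectiveness of the bandwidth-saving scheme
needed_with_savings = raw_gb_per_sec * (1 - savings)

print(f"Raw traffic needed:  {raw_gb_per_sec:.1f} GB/s")
print(f"With ~30% savings:   {needed_with_savings:.1f} GB/s")

The point is only that a bandwidth-saving scheme lowers the traffic a card needs rather than raising what it has, which is why raw GB/sec figures alone don't settle the comparison.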
 

dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
I think this card is THE card for hardcore gamers...

Seriously, with triple head you have a HUGE advantage compared to anyone else. Especially if you play Counter-Strike/Quake III (FPS) or any RTS games, seeing beyond the center perspective is a HUGE advantage. I can easily imagine that in professional tournaments, every pro would have something similar to tripleview (unless it was banned). I mean, imagine in Counter-Strike (I used to play that game a lot) having the ability to see to your left without actually TURNING around. It would be immensely harder to catch you by surprise. Even if it ran at a lower overall FPS, I think most pro gamers would actually want a wider field of view because it gives a bigger advantage. The peripheral view of tripleview is almost the same as our ACTUAL peripheral view, although I would have to say that the monitors' edges would be kind of annoying.
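
To put a number on how much wider the view gets, here is a small sketch in Python of the standard FOV relationship (horizontal FOV grows with the aspect ratio when the vertical FOV is held fixed). The 90-degree single-screen horizontal FOV is an assumed typical FPS setting, not a figure from any particular game.

import math

# Horizontal FOV as a function of aspect ratio, with the vertical FOV held
# constant. The 90-degree baseline below is an assumed typical FPS setting.

def horizontal_fov(vertical_fov_deg, aspect):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

single_aspect = 1280 / 1024                 # one 1280x1024 screen
triple_aspect = (3 * 1280) / 1024           # three screens side by side

# Derive the vertical FOV that gives 90 degrees horizontally on one screen
vert_fov = math.degrees(2 * math.atan(math.tan(math.radians(90) / 2) / single_aspect))

print(f"Single screen:  {horizontal_fov(vert_fov, single_aspect):.0f} deg horizontal")
print(f"Triple screen:  {horizontal_fov(vert_fov, triple_aspect):.0f} deg horizontal")

With those assumed numbers the horizontal view goes from 90 degrees to a bit over 140 degrees, which is why the triple-head view feels close to real peripheral vision.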

Anyone else think CRTs in surround gaming are NOT really a viable option? I mean, for all practical purposes (weight first, then power draw), you'd probably need to go with LCDs. Three 15" flats would still rock.

LCDs currently suck for gaming. Their response time is too high compared to a CRT. Try playing FPS games (like counterstrike/quake3) and download some demos of pros playing. You will most likely be dizzy or see a lot of blurring when they turn around really fast.
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: BFG10K
The lack of HSR really concerns me, especially when it comes to pixel shading operations executing on useless (i.e. hidden) pixels. We'll have to wait for the benchmarks to see just how well 20 GB/sec stacks up against nVidia's ~10 GB/sec + LMA II.

I don't think that LMA2 can double the effective bandwidth (maybe something like +15-30%, I guess). What it can help with (as Anand noted) is the shading of pixels. That is one of the two shortcomings of the Parhelia (the other being the loss of FAA when the stencil buffer is used; I think both of those faults will be corrected in the .13 micron version).
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
One question I have, if anyone knows: the TV-out DVD playback is said to be 10-bit, but do you have to enable 10-bit on the desktop to make the TV-out 10-bit, or is the TV-out always 10-bit?
 

Chumster

Senior member
Apr 29, 2001
496
0
0
Tim Sweeney, Epic's chief 3D guru working on next-generation Unreal engine technology, had this to say about it: "We've had our hands on a Parhelia for the past few days, and have Unreal Tournament 2003 up and running in triple-monitor mode -- it's very immersive, and surprisingly fast (rendering at 1280*3 x 1024)." The UT engine has had an adjustable FOV for quite some time now, and UT 2003 obviously does too, so when that title ships this summer, it will in all likelihood support Surround Gaming.

To me, "Surround Gaming" is a gimic. Granted, I would love to try games with 3 monitors, but it's just not financially feasible for most gamers. The cheapest 17" LCD I could find on pricewatch sells for ~$500 (plus S&H) - meaning you'd have to spend ~$1500 dollars to configure your setup like those shown by Matrox (or where they using LARGER LCDs?).

Chum