Matrox Parhelia.. NDA BROKEN!!!

Page 5

BlvdKing

Golden Member
Jun 7, 2000
1,173
0
0
I want to know if the Parhelia will be supported with good drivers for Linux as well. Right now no consumer-level 3D graphics card can beat NVidia in the Linux support department, IMO.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Sounds like the textbook definition of a Matrox zealot to me...whatever dude. Considering every video card thread that comes along requires a post coming from you that it won't matter since NVidia will bring something else out so much better, it sounds an awful lot like the pot calling the kettle black.
I don't recall saying that NVIDIA would blow anybody out of the water. I simply said that the NVIDIA Ti4600 is already very fast. Most likely, they have Det5 drivers which will increase performance on top of that. Secondly, NVIDIA has already been tooling around with the .13 micron process and already has samples from TSMC. That allows for higher clock speeds and room to add more features on the die. How much a Ti4600 w/Det5 would close the gap with the Parhelia, and what NV30 brings, would be interesting no doubt.

It's still gonna be a while before we see the G1000 cards on the market so only time will tell.

And the only reason why I've been mentioning NVIDIA (concerning NV30) is b/c they are the only ones that have been leaking information as to what their next card is going to be like. ATi has been surprisingly quiet.

As for the "zealotry" comment, I simply stated that b/c you said that Matrox **WILL** have the best of 2D/multi-monitor/TV-out. How can you say that for an unreleased product? I mean, you're not even giving the other guys a chance based on unconfirmed specs;)

Sure, NVIDIA may not have the best reputation when it comes to 2D quality, but their cards now mostly come with DVI ports. So wouldn't that pretty much discredit that finding now? I mean, even if you were to use an analog monitor, it would still be off the DVI connector and would be converted using the DVI --> Analog adapter and not the onboard RAMDAC (or maybe I'm thinking about it the wrong way?:)).

And ATi's 2D quality has always been good IMHO as has been their TV/recording features. I don't know what 3D Labs is gonna bring to the table 2D/TV-out wise, but I'll at least give them the chance.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
I don't need all the 'features' of the Parhelia..namely dual/triple head...DVD/HDTV decoding and blah blah.

All that counts for me is D3D/OpenGL performance and image quality.

Parhelia hasn't convinced me yet that it's "the" killer card - but maybe I am (as a person) just not part of the market the card seems to be targeted at.
(Home theater PC people, 2D people, people who own [cough] three 19" LCD monitors and see a use in the triple-head feature)

It's a "feature card" - so to speak...

Besides....there are already (of course) rumours going around that "...[ATI's] R300 will destroy everything"....NV30 on the way.

Great time for following and reading all these reviews!!!

Personally....for some reason I am more confident in what ATI will release with their R300..not that I am an "ATI fanboy"....but for some reason I think ATI is always good when it comes to innovative technologies. But I haven't forgotten NV30 yet, either.

It's just that my overall "Parhelia" impression is "too much overhead" with things I don't have a priority for...and some disappointing things like no memory optimizations. The Parhelia may be a good SSFAA card (benchmarks needed, I hope the 3fps in 3DMark2k1 are not right!!!) and displacement mapping...this looks great, just great.

But as with ATI's PS1.4 and TruForm....it doesn't make sense if displacement mapping is only supported by one card.....then you might sit on a card with great features which no app/game will use. (I am a Radeon 8500 owner, I know what I am talking about :))
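For anyone unsure what displacement mapping actually buys you over bump mapping: it moves real geometry rather than just faking the lighting, so silhouettes change. A toy sketch in plain Python (all names mine, purely illustrative — the Parhelia does this in the vertex hardware, not like this):

```python
# Minimal displacement mapping: each vertex of a flat grid is
# pushed along its normal (+z here) by a height sampled from a
# map. Unlike bump/normal mapping, which only alters shading,
# the geometry itself moves, so the outline of the mesh changes.

height_map = [
    [0.0, 0.2, 0.2, 0.0],
    [0.2, 1.0, 1.0, 0.2],
    [0.2, 1.0, 1.0, 0.2],
    [0.0, 0.2, 0.2, 0.0],
]
SCALE = 0.5  # how far a full-white texel displaces the surface

def displace(grid):
    """Return (x, y, z) vertices of a flat grid displaced along +z."""
    verts = []
    for y, row in enumerate(grid):
        for x, h in enumerate(row):
            verts.append((float(x), float(y), h * SCALE))
    return verts

vertices = displace(height_map)
# Centre of the grid pops out by the full scale, edges stay flat.
print(vertices[5])   # (1.0, 1.0, 0.5)
print(vertices[0])   # (0.0, 0.0, 0.0)
```

Which is exactly flexy's point: the heightmap is useless unless the game ships one and the API exposes the feature, so a single-vendor feature tends to sit idle.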





 

Athlon4all

Diamond Member
Jun 18, 2001
5,416
0
76
I have to admit, I am one of those that is excited, very excited, about Parhelia. I am not a zealot of anything, though these days I would definitely pick nVidia over ATi. But I am not a huge gamer at all; as a matter of fact, from April 2001-March 2002 I was on a ProSavage, and for the past 2 months I was on a GF2MX, so really my demands aren't that high. What I do demand is GF2MX-level performance and, now, dual-monitor capability. Just this past week I sold my GF2MX and started hunting for a dual-monitor card, and I would've gone with a Matrox if only the G550 were closer to a GF2MX, but instead I went for a R7500. So the bottom line is I'd take a Matrox any day as long as performance is at GF2MX level, and by the way it seems, this card is more than just at a GF2MX:p

Btw, it seems that the Parhelia 512 does not have nVidia's Visibility Subsystem, or HyperZ, or anything similar, and while it has these massive memory bandwidth numbers, will it use them so ineffectively compared to HyperZ/Visibility that the performance won't be there??? jw
 

zippy

Diamond Member
Nov 10, 1999
9,998
1
0
Matrox Parhelia
Finally! :D

I'm getting a new card this summer! WOO!

Perfect timing...get it for summer when I'll have some free time and at the same time as I get my new LCD. :) Maybe I'll get three? No, no I won't get three because I'm not a money tree, which never ceases to disappoint me. Maybe I'll get a 17" and 2 decent 15" LCDs? Heh, we'll see how much $ I can make this summer and how much I want to spend on a d@mn computer. Sorry, in my excitement I have digressed.

Anyway, this rocks. Matrox rocks. Heh, talk about a long cycle between really new products...almost 3 years. :)
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: zippy
Matrox Parhelia
Finally! :D

I'm getting a new card this summer! WOO!

Perfect timing...get it for summer when I'll have some free time and at the same time as I get my new LCD. :) Maybe I'll get three? No, no I won't get three because I'm not a money tree, which never ceases to disappoint me. Maybe I'll get a 17" and 2 decent 15" LCDs? Heh, we'll see how much $ I can make this summer and how much I want to spend on a d@mn computer. Sorry, in my excitement I have digressed.

Anyway, this rocks. Matrox rocks. Heh, talk about a long cycle between really new products...almost 3 years. :)

HOLY HELL!!! It's zippy!!!! Where ya been man? :D
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"I don't recall saying that NVIDIA would blow anybody out of the water. I simply said that the NVIDIA Ti4600 is already very fast. Most likely, they have Det5 drivers which will increase performance on top of that. Secondly, NVIDIA has already been tooling around with the .13 micron process and already has samples from TSMC. That allows for higher clock speeds and room to add more features on the die. How much a Ti4600 w/Det5 would close the gap with the Parhelia, and what NV30 brings, would be interesting no doubt."

I don't care. I'll say it again: all of the upcoming cards will be more than fast enough for at least a year after release. The next NVidia card may very well be the fastest thing on the planet. It could be 3x faster than a Ti4600, I don't care. The problem with NVidia has never been performance, yet that appears to be the only thing they care about.

I don't own a flat panel, and have no intention of buying one. They don't fit my needs. NVidia's analog output stinks and always has. The likelihood of them having stellar 2D with their next card is not good.

"And ATi's 2D quality has always been good IMHO as has been their TV/recording features. I don't know what 3D Labs is gonna bring to the table 2D/TV-out wise, but I'll at least give them the chance."

Let's go back in time, about 4 years. The G200 is released continuing Matrox's tradition of top notch 2D quality. Nvidia is currently selling the Riva series which has the worst 2D of the 3 companies in question. ATi is somewhere in the middle. Fast forward to today, Nvidia still is easily at the bottom, Matrox using their rehashed G400 now named G550 is still the best 2D available despite being basically 3 years old, and ATi is still somewhere in the middle. We could go back further for Matrox and ATi, but NVidia was not a player before the Riva. If ATi or Nvidia were able to create top notch 2D cards I think they would have been able to catch up to 3 year old Matrox tech by now.

ATi has decent 2D, and a definite step above NVidia, but the 8500 was not good enough to replace my V5, so I don't have much hope for the R300 to top the Parhelia.

I don't consider 3DLabs a player in this as their first cards will be workstation cards. A gaming board is too far off in the future to be concerned with right now. Their card may well be the best when it is released, but I don't plan on waiting that long.

The Parhelia is not for everyone, especially not the top-of-the-line, everything-and-the-kitchen-sink, cost-be-damned card. It contains a lot of features most people don't need, and the top notch 2D is probably irrelevant for the majority of people here inexplicably using GeForce4's on 17" CRT monitors or flat panels. But for some of us, me included, the Parhelia is the card we have been waiting for since the G400Max became outdated, regardless of whether it is the fastest available or not. I don't live my life through benchmarks. If it gives you a woody to be able to brag about 350fps in Quake3 at 640x480 or 15000 in 3DMark 2000SE, good for you, brag all you want, doesn't bother me, but don't tell me that I should care, because I don't.
 

zippy

Diamond Member
Nov 10, 1999
9,998
1
0
HOLY HELL!!! It's zippy!!!! Where ya been man? :D
Haha, it's nice to know I was missed. :)

I've been stressin over school, SATs, APs, Nationals for Science Olympiad - I've been posting a few times a week in OT and lurking around a bit - I just haven't had the time. Plus GH has gone down the tubes recently, but posts like this remind me of the good old days.

Nice to see you too Brandon! hehe
 

Parhelia2002

Junior Member
May 12, 2002
6
0
0
you're wrong.

woofffff..............wooooooooofffffffffffffffffffffffffffffffffffff...............wooooooooooofffffffffffffffffffffffffffffffff


 

Barnaby W. Füi

Elite Member
Aug 14, 2001
12,343
0
0
Originally posted by: Parhelia2002
you're wrong.

woofffff..............wooooooooofffffffffffffffffffffffffffffffffffff...............wooooooooooofffffffffffffffffffffffffffffffff

wtf?!?!!?
 

Electric Amish

Elite Member
Oct 11, 1999
23,578
1
0
Originally posted by: rickn
Difference is that you get a lot for that money. With NVIDIA you just get a rehash of rehashed tech

You actually believe that? The G400 couldn't even run Quake 3 at a decent speed at 1024 32bpp. I'd imagine it runs everything like a slideshow now. You sure get a lot for your money alright, after you have to run out and buy three monitors. Matrox should go into the monitor business with Parhelia

Apparently you haven't owned a G400. I'm still using mine in my gaming machine. It ran Quake 3 just fine at 1024. It also runs UT at 1024 very smoothly. The only game I ever had a problem playing was the Tribes series.

People around here just don't see the big picture. Video cards are not only used to play games.

As for Linux support, I don't know where you've gotten your info, but Matrox has always been known to have the best Linux support.

amish

 

BD231

Lifer
Feb 26, 2001
10,568
138
106
This thread is enough to make your head spin. So many words, so little relevance :confused:. Thanks for keeping us updated Amish ;).
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: bdog231
This thread is enough to make your head spin. So many words, so little relevance :confused:. Thanks for keeping us updated Amish ;).
Agreed, the card hasn't even come out yet and already all this bickering. Funny…
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: kgraeme
Originally posted by: Nemesis77
If you are in the business of creating 3D, then a Quadro would have been a better choice; that is not the field the Gxxx was meant for. But the Parhelia addresses that. It's targeted as much to gamers as it is to 3D-professionals.

How can you know the market this card is meant for until it's actually on the market? I'm aware of the Quadro. Just because Matrox hasn't been in the 3D workstation market before doesn't mean they don't want to break into it. This card looks to me like that opportunity.

Umm, isn't that what I just said :confused:? Parhelia is targeted towards performance-enthusiasts (read: gamers), 3D-professionals, businesses/people who need excellent 2D and excellent multidisplay and the like.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Matrox is the only company that has provided good support for Linux and other UNIX-like OSes. I'll continue to purchase their cards.
 

dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
A LOT of you people need to brush up on your video card history... The G400/MAX was the *best* when it came out. It had really bad timing though, because of all the hype that was surrounding the NV10, the GeForce SDR.

Most people still don't realize that video cards are primarily used for display purposes and that 3D features are a relatively recent concept. The market that Matrox has been after for the past 2-3 years is the business workstation. Those buyers don't care about 3D performance and are only after crisp 2D quality with a good RAMDAC. If I were on a workstation at work, coding or something practical like that, I would want a very large monitor with crystal clear image quality. I wouldn't really care if it played Jedi Knight II at 100 fps, because that is what I have a home computer for.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
they still have not got their Detonator4 drivers stable!
They are stable.

I'm still using mine in my gaming machine. It ran Quake 3 just fine at 1024. It also runs UT at 1024 very smoothly.



Absolute rubbish. The G400 will be generating total slideshows in those games using those settings. Would you care to post some benchmark results to back your claims?
 

ElDonAntonio

Senior member
Aug 4, 2001
967
0
0
Guys guys guys, calm down, why all the bickering??? I think we should all just say "YEAHHHH A NEW COOL CARD IS COMING, MORE COMPETITION, MORE INNOVATION, LET'S ALL BE HAPPY-HAPPY-JOY-JOY!!!". :D

I personally like Matrox (worked there for a while), but I'll be just as happy when an nVidia or ATI card comes out!!! It's just great news, it makes some competition, which means lower prices and better technology for all of us.

So why fight over which company's best? let's just all be happy and enjoy what we have while we have it. :)
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: BFG10K
they still have not got there detonator4 drivers stable!
They are stable.

I'm still using mine in my gaming machine. It ran Quake 3 just fine at 1024. It also runs UT at 1024 very smoothly.



Absolute rubbish. The G400 will be generating total slideshows in those games using those settings. Would you care to post some benchmark results to back your claims?
Well, I know that was not directed towards myself, but being a previous G400 (MAX) owner I would like to comment on my personal experience while playing Q3 and UT.

I was able to play Q3 at 1024x768 at decent frame rates (about 30-80fps), nothing earth shattering like my Ti500 (150+), which is why I sold my G400 Max in the first place. ;)

Same for UT; in fact, I had my G400 running at 1024x768 in D3D smoothly, much better than what my GeForce2 was doing, and I didn't have to download any Loki patch either (which was a headache, because the gamma would stay bright after exiting UT with the Loki patch).

This is my personal experience with Q3 and UT. The G400 Max… it was a great card back then, and still a great card, but more for 2D than 3D with today's titles.
 

Valinos

Banned
Jun 6, 2001
784
0
0
And I know guys with GF2MX's that say they get great performance in their games at 1024x768...then I go over to their house and see the game struggling to reach 20 fps.

I don't know about you, but I don't consider anything less than 30 fps in a First Person Shooter as playable - and 60+ fps to be competitive online. Other games like RPGs (Dungeon Siege) are fine with 20 fps, but can cause some slight nausea or headaches being that low. I like a nice, crisp, smooth framerate.

I'm sure you G400 owners that say Q3 at 1024 is playable have fat wives too.
 

rickn

Diamond Member
Oct 15, 1999
7,064
0
0
Apparently you haven't owned a G400. I'm still using mine in my gaming machine. It ran Quake 3 just fine at 1024. It also runs UT at 1024 very smoothly. The only game I ever had a problem playing was the Tribes series.

Umm, yes I did own a G400. And at 1024x768 32bpp with trilinear filtering and maximum texture detail = a slideshow on the G400.
 

Nemesis77

Diamond Member
Jun 21, 2001
7,329
0
0
Originally posted by: rickn
Apparently you haven't owned a G400. I'm still using mine in my gaming machine. It ran Quake 3 just fine at 1024. It also runs UT at 1024 very smoothly. The only game I ever had a problem playing was the Tribes series.

Umm, yes I did own a G400. And at 1024x768 32bpp with trilinear filtering and maximum texture detail = a slideshow on the G400.

So why not drop the details a bit?
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
I've been using a G400 ever since it came out; great gaming card no matter what you people say.

Not the fastest of the bunch, but fast enough to make every game playable that I've tried. It wasn't until like 5-6 months ago that I had to turn down the resolution a bit.
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
About Jedi Knight 2: at school we have 800MHz PIII machines with TNT2 and the game plays great. The fact is that you don't need an uber powerful card for every new game there is. Most games aren't even designed with the GeForce1 in mind, because not everyone has one and poor game developers must think of the poor ones :p
 

gregor7777

Platinum Member
Nov 16, 2001
2,758
0
71
Btw, it seems that the Parhelia 512 does not have nVidia's Visibility Subsystem, or HyperZ, or anything similar, and while it has these massive memory bandwidth numbers, will it use them so ineffectively compared to HyperZ/Visibility that the performance won't be there??? jw




HyperZ and whatnot are the technologies that allow data that won't be displayed on screen to be eliminated, right? This is a somewhat "standard" thing now, right? I'm sure the Parhelia will include some such technology, especially seeing how they want to be able to run on three separate monitors.

Otherwise that "3fps" screen wouldn't be too far off.
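That's the gist of it: occlusion schemes like HyperZ reject pixels before the expensive shading/texturing work, saving fill rate and memory bandwidth. A toy depth-buffer sketch in plain Python (names are mine; nothing to do with any actual driver or hardware):

```python
# Toy depth buffer: a pixel is shaded only if it is nearer than
# whatever is already stored at that screen position. Hardware
# occlusion schemes (HyperZ, nVidia's visibility subsystem)
# make this rejection happen early, before costly shading work.

W, H = 4, 4
FAR = float("inf")
depth = [[FAR] * W for _ in range(H)]
shaded = 0  # count of pixels that actually got "shaded"

def draw_pixel(x, y, z):
    """Shade (x, y) only if z is closer than the stored depth."""
    global shaded
    if z < depth[y][x]:      # depth test: nearer wins
        depth[y][x] = z
        shaded += 1          # stand-in for expensive shading work
        return True
    return False             # occluded: rejected, no shading cost

# Near quad drawn first, far quad second: every pixel of the
# far quad fails the depth test and is skipped entirely.
for y in range(H):
    for x in range(W):
        draw_pixel(x, y, z=1.0)   # near surface
for y in range(H):
    for x in range(W):
        draw_pixel(x, y, z=5.0)   # far surface, fully occluded

print(shaded)  # 16: only the near quad's pixels were shaded
```

Without early rejection, the far quad's 16 pixels would still be textured and blended before being thrown away, which is exactly the bandwidth waste people are worried the Parhelia will suffer from.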