***OFFICIAL*** Parhelia Countdown thread, constant updates, Anand's article is up!


NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: bdog231
Originally posted by: Electric Amish
the card offers nothing new in terms of technological advancement, so that's more than enough reason to be disappointed.

Displacement Mapping, 16X AA, 10bit color, Hardware Glyph rendering, triple head....

Naw, nothing new here...
rolleye.gif


amish

Nothing exciting...
But obviously "new"!
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
what really interests me is this

16x fsaa
2x 400mhz ramdacs :)
improved tv out
faster than I need 3d speed :D
the new drivers


this is basically what I use my g400 for, only way better :) so I'm very pleased
 

HaVoC

Platinum Member
Oct 10, 1999
2,223
0
0
Wow...a lot of lukewarm responses to this announcement. Given Matrox's track record with regard to high-performance 3D graphics, it's understandable.

However, I think the 16x sub-sample AA and the displacement mapping are pretty good features, provided they're implemented well and developers actually use them. I'm hoping the dismap will allow for much more detailed HUGE 3D outdoor environments at a good performance level. I'm getting pretty sick of walking through dungeons and sewers in 3D first-person shooters....
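The "dismap" idea is simple to sketch: instead of modeling every bump, the hardware tessellates a coarse mesh and pushes each vertex along its normal by a height read from a texture. A toy CPU-side illustration in Python (the grid size, `hills` heightmap, and scale are made-up examples, not anything from Matrox's actual hardware):

```python
import math

def displace_grid(width, height, heightmap, scale=1.0):
    """Displace a flat grid of vertices along +Z by sampled heights.

    heightmap(u, v) returns a value in [0, 1]; real hardware would
    sample a displacement texture per tessellated vertex instead.
    """
    vertices = []
    for j in range(height):
        for i in range(width):
            u = i / (width - 1)
            v = j / (height - 1)
            z = scale * heightmap(u, v)  # push vertex along its normal (straight up here)
            vertices.append((u, v, z))
    return vertices

# A cheap procedural "terrain": rolling sine hills.
hills = lambda u, v: 0.5 + 0.5 * math.sin(6 * u) * math.cos(6 * v)
mesh = displace_grid(16, 16, hills, scale=10.0)
```

The payoff is memory: a small displaced grid plus a height texture stands in for what would otherwise be thousands of hand-modeled terrain vertices.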
 

Electric Amish

Elite Member
Oct 11, 1999
23,578
1
0
Originally posted by: bdog231
Originally posted by: Electric Amish
the card offers nothing new in terms of technological advancement, so that's more than enough reason to be disappointed.

Displacement Mapping, 16X AA, 10bit color, Hardware Glyph rendering, triple head....

Naw, nothing new here...
rolleye.gif


amish

Nothing exciting...

I'm curious as to what exactly would elicit an "exciting" reaction out of you...

500fps in Quake, no doubt....
rolleye.gif


amish

 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: Electric Amish
Originally posted by: bdog231
Originally posted by: Electric Amish
the card offers nothing new in terms of technological advancement, so that's more than enough reason to be disappointed.

Displacement Mapping, 16X AA, 10bit color, Hardware Glyph rendering, triple head....

Naw, nothing new here...
rolleye.gif


amish

Nothing exciting...

I'm curious as to what exactly would elicit an "exciting" reaction out of you...

500fps in Quake, no doubt....
rolleye.gif


amish

There really isn't anything there that is exciting to me either. I mean, the Hardware Glyph stuff looks pretty nice, but so does ClearType in Windows XP on my 18.1" LCD monitor. Triple head is nice, but it's rather far-fetched and unrealistic IMHO (even in this market space) to think that people would actually USE 3 monitors on their desk. Only the most insanely rich gamer would bother with this in Quake3 or JKII. Even dual head is a stretch for most people.

That being said, I just so happen to have a 21" Dell Trinitron monitor here that my dad uses at home, and I have my own 18.1" LCD. I'm getting a dual-monitor Ti4200 or Ti4400 to hook them both up to. The ONLY reason I'm bothering is b/c I JUST SO HAPPEN to have this new desk in my room that is uber awesome and has plenty of desktop space, and I JUST SO HAPPEN to have a 21" monitor lying around. Otherwise, I'd say screw dual head ;)
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
But obviously "new"!

rolleye.gif


Displacement Mapping

Is it just me, or does every video card release claim better curves and surfaces?

16X AA

Better FAA? OK, so all new cards have been boasting better AA. Unfortunately, all isn't perfect with the Parhelia's FAA engine; there are situations where the fragment detection doesn't work perfectly, and the result is annoying artifacts in the game. Hmmmmm.....
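Fragment antialiasing, for the curious, only spends AA effort on pixels the hardware flags as polygon-edge "fragments": those get up to 16 coverage samples, everything else gets one. A toy sketch of the idea in Python (the 4x4 mask and sample index are invented for illustration; this shows the general scheme, not Matrox's actual logic):

```python
def resolve_pixel(mask, fg, bg, detected_as_edge):
    """Resolve one pixel from a 16-entry (16x) coverage mask of a foreground triangle.

    mask: 16 booleans, True where a sub-sample is covered by the triangle.
    Edge fragments blend by coverage; everything else keeps a single sample,
    which is what makes a missed edge detection visible as aliasing.
    """
    if detected_as_edge:
        cov = sum(mask) / 16.0  # fraction of the 16 sub-samples covered
        return tuple(cov * f + (1 - cov) * b for f, b in zip(fg, bg))
    # Non-fragment path: a single sample (an arbitrary sub-sample stands in
    # for the pixel centre here).
    return fg if mask[5] else bg

half = [True] * 8 + [False] * 8  # a pixel straddling a triangle edge
smooth = resolve_pixel(half, (255, 255, 255), (0, 0, 0), detected_as_edge=True)
jagged = resolve_pixel(half, (255, 255, 255), (0, 0, 0), detected_as_edge=False)
```

The `detected_as_edge=False` branch is where the artifacts come from: a missed fragment keeps its single jagged sample while its neighbours get smoothed.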

10-bit color

That will be useful if and when it becomes a standard.
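For reference, "10-bit color" here means a 10-10-10-2 framebuffer: 10 bits per channel plus 2 bits of alpha in the same 32 bits as a normal 8-8-8-8 pixel. A quick illustrative packing in Python (the layout shown is the common A2R10G10B10 arrangement, used here just to show the bit budget):

```python
def pack_a2r10g10b10(r, g, b, a):
    """Pack 10-bit R/G/B (0-1023) and 2-bit alpha (0-3) into one 32-bit word."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (r << 20) | (g << 10) | b

def unpack_a2r10g10b10(word):
    """Recover (r, g, b, a) from a packed 32-bit word."""
    return ((word >> 20) & 0x3FF, (word >> 10) & 0x3FF, word & 0x3FF, word >> 30)

word = pack_a2r10g10b10(1023, 512, 0, 3)
```

Same 32 bits per pixel, but 1024 shades per channel instead of 256, which is where the smoother gradients come from.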

Hardware Glyph rendering

Which falls into the whole 2D quality category, and it's obvious they have the crown there. This feature is only really good for LCD monitors, which most people don't have.

triple head

Whatever floats your boat ;)
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Originally posted by: Electric Amish
Originally posted by: bdog231
Originally posted by: Electric Amish
the card offers nothing new in terms of technological advancement, so that's more than enough reason to be disappointed.

Displacement Mapping, 16X AA, 10bit color, Hardware Glyph rendering, triple head....

Naw, nothing new here...
rolleye.gif


amish

Nothing exciting...

I'm curious as to what exactly would elicit an "exciting" reaction out of you...

500fps in Quake, no doubt....
rolleye.gif


amish

Riiiiiiight, and that's why I have a Radeon 8500? As long as the game runs halfway decent I'm good. I'll be excited when a really amazing game with insanely immersive environments and gameplay comes out. By the time that happens, the Parhelia will have been replaced by some other video card with "new" features. I'll stick with my 8500 and Morrowind till something worth a financial splurge comes out. ;)
 

Electric Amish

Elite Member
Oct 11, 1999
23,578
1
0
Tim Sweeney, Epic's chief 3D guru working on next-generation Unreal engine technology had this to say about it: "We've had our hands on a Parhelia for the past few days, and have Unreal Tournament 2003 up and running in triple-monitor mode -- it's very immersive, and surprisingly fast (rendering at 1280*3 x 1024)." The UT engine has had adjustable FOV for quite some time now, and UT 2003 obviously does too, so when that title ships this summer, it will in all likelihood support Surround Gaming.

How's this for exciting??

Quoted from Extremetech.

amish :D
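The surround-gaming setup is mostly field-of-view math: at the same eye distance, a screen three panels wide triples the tangent of the horizontal half-angle, not the angle itself. A quick back-of-the-envelope check in Python (the 90-degree single-monitor FOV is my own example, not a number from the article):

```python
import math

def surround_hfov(single_hfov_deg, panels=3):
    """Horizontal FOV when the screen is `panels` times wider at the same eye distance."""
    half = math.radians(single_hfov_deg) / 2
    return math.degrees(2 * math.atan(panels * math.tan(half)))

wide = surround_hfov(90)  # one 90-degree monitor widened to three side by side
```

So three panels turn a 90-degree view into roughly 143 degrees, which is the "very immersive" effect Sweeney describes.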
 

joohang

Lifer
Oct 22, 2000
12,340
1
0
Originally posted by: Electric Amish
Tim Sweeney, Epic's chief 3D guru working on next-generation Unreal engine technology had this to say about it: "We've had our hands on a Parhelia for the past few days, and have Unreal Tournament 2003 up and running in triple-monitor mode -- it's very immersive, and surprisingly fast (rendering at 1280*3 x 1024)." The UT engine has had adjustable FOV for quite some time now, and UT 2003 obviously does too, so when that title ships this summer, it will in all likelihood support Surround Gaming.

How's this for exciting??

Quoted from Extremetech.

amish :D

:D:D:D
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: bdog231

Riiiiiiight, and that's why I have a Radeon 8500? As long as the game runs halfway decent I'm good. I'll be excited when a really amazing game with insanely immersive environments and gameplay comes out. By the time that happens, the Parhelia will have been replaced by some other video card with "new" features. I'll stick with my 8500 and Morrowind till something worth a financial splurge comes out. ;)
you already have a good card that will last you a while; you have the latest generation of cards and can therefore skip the next one.

I have a G400 because no card I've seen has been able to fulfill my "quality" needs up until now. ATI has been able to come close except for the stupid drivers; nvidia has also come close except for the crappy 2D.

Now I've found a card that suits me, so I'll buy this one for sure :)
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Tim Sweeney, Epic's chief 3D guru working on next-generation Unreal engine technology had this to say about it: "We've had our hands on a Parhelia for the past few days, and have Unreal Tournament 2003 up and running in triple-monitor mode -- it's very immersive, and surprisingly fast (rendering at 1280*3 x 1024)." The UT engine has had adjustable FOV for quite some time now, and UT 2003 obviously does too, so when that title ships this summer, it will in all likelihood support Surround Gaming.
How many AT'ers have/will have/will buy two additional monitors for that though? :D

It's a nice thought but one that most likely will not be taken advantage of very often. How many people with G400's here actually use dual head?
 

grant2

Golden Member
May 23, 2001
1,165
23
81
Well, no. People who care about those 10FPS are usually the n00bs, who run out and buy the greatest NVIDIA card every 6 months. It seems that people who run Matrox are a bit more... professional ;). They care about other things besides those 10FPS. And this card costs so much that those kids have a hard time assuring their parents that they need it.

ya as if Matrox wouldn't like to sell a $500 card to the same person every 6 months. I have the funny feeling Matrox isn't allergic to profit...

As for Parhelia being dethroned by NV30... That we don't know yet, since we don't really know anything about NV30. All we have is the regular hype from NVIDIA. It might offer better raw performance, but will it beat Parhelia in other departments? Like image quality and features?

And are you saying we have anything but hype from Matrox?

I wonder where this "nv30 vs. parhelia" talk is coming from since there are absolutely *0* products and *0* benchmarks available for either product.
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: NFS4
Tim Sweeney, Epic's chief 3D guru working on next-generation Unreal engine technology had this to say about it: "We've had our hands on a Parhelia for the past few days, and have Unreal Tournament 2003 up and running in triple-monitor mode -- it's very immersive, and surprisingly fast (rendering at 1280*3 x 1024)." The UT engine has had adjustable FOV for quite some time now, and UT 2003 obviously does too, so when that title ships this summer, it will in all likelihood support Surround Gaming.
How many AT'ers have/will have/will buy two additional monitors for that though? :D

It's a nice thought but one that most likely will not be taken advantage of very often. How many people with G400's here actually use dual head?
I used two 20" for a few months, wonderful :) now I only have one 20", but I use the tv-out feature every day. I would never part with my tv-out :D
 

grant2

Golden Member
May 23, 2001
1,165
23
81
Dude, you are taking the early (yes, early being the key word) impressions of the card and quoting people like some plague! You need to give this card more time; there are no official benchmarks yet!

When people write "This card sounds great, I will definitely buy it!" you don't say a word, but when people write "This card doesn't sound so good, I probably won't buy it," you chastise them for taking early impressions & not giving it enough time.

Hypocrite.
 

Electric Amish

Elite Member
Oct 11, 1999
23,578
1
0
Originally posted by: grant2
Dude, you are taking the early (yes, early being the key word) impressions of the card and quoting people like some plague! You need to give this card more time; there are no official benchmarks yet!

When people write "This card sounds great, I will definitely buy it!" you don't say a word, but when people write "This card doesn't sound so good, I probably won't buy it," you chastise them for taking early impressions & not giving it enough time.

Hypocrite.

Wow.

Good point!

amish
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
If (and that's a big IF) the Parhelia lives up to its promises, I might get one. After getting a bad taste of nvidia's visual quality (er... lack thereof), I've basically decided to stick with other sources for graphics cards. However, if ATI's or 3Dlabs' cards end up being better, I'll probably go for one of those instead.
The lesson I'm learning over and over again is that consumer-level equipment is mostly crap. Sometimes it has good quality (like my Kodak DX3500, which can take very good pictures... if the light level is to its liking), but it almost always lacks very much needed features. Ergo, my next computer will probably have lots of "unnecessary" and expensive things - like dual Xeons, SCSI, a workstation-class video card, a professional sound card capable of 24/96 audio I/O, etc.

Oops...this wasn't supposed to turn into a rant about computer equipment in general. Sorry. ;)
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Originally posted by: grant2
Dude, you are taking the early (yes, early being the key word) impressions of the card and quoting people like some plague! You need to give this card more time; there are no official benchmarks yet!

When people write "This card sounds great, I will definitely buy it!" you don't say a word, but when people write "This card doesn't sound so good, I probably won't buy it," you chastise them for taking early impressions & not giving it enough time.

Hypocrite.


:Q I just gave him a roll of the eyes, but this works as well.

I was more impressed by the Parhelia's ability to run UT2003 in triple-monitor mode at 1280*3 x 1024.

With all those vertex shaders and quad texturing, I would imagine the card would fly in super-intensive games like the new Unreal.
 

mastabog

Member
May 1, 2002
48
0
0
I don't wanna sound like a smartass, nor like a "man who knows everything," nor like I have an opinion on everything, BUT...

Lots of people have already begun saying the Parhelia $ucks, like they've tested it or seen it in action. "Awwwwww, no bandwidth-saving support" etc etc...

Whenever a new killer board appears everybody gets confused and, like now, there will be people loving the board and there will be people dismissing it, both without getting to know it for real.

My opinion is: every board gets deprecated eventually. It's pointless to say "I will not buy board X because board Y will come out in 6 months and will be better." Considering that neither board X nor board Y is even out, you have nothing in your hands to talk about and the term "better" can't even be defined, so all discussions/disputes based on card specs are pointless. It's all relative to what you will mostly use it for. Whether you like great quality, or multi-monitor, or peak 3D performance, or etc etc, buy the card that fits the most of your REAL needs (not just because you want the latest piece of technology at home). You can never have it all in one board. There will always be another board with feature F better but feature F' worse than a chosen reference board.

I like quality and multi-monitor, and I'm not really a gamer. I don't know if the Parhelia will satisfy those needs (though I doubt it won't), but I'm sure I won't say "I won't buy the board because it doesn't have bandwidth saving" when I'm sure I will barely notice the lack of bandwidth saving. People tend to be that way nowadays: seeking a board with everything at the max, and then after the purchase they only use 60-70% of its power.

Seeing all these reviews and horror stories and reading forums gets anyone confused. Seeking information in reviews, forums, etc. is like seeking a signal in Gaussian noise: you can't really find it. All you get is 50% "yes" and 50% "no" (or an equal amount for each option). Define your needs and go with the board that satisfies them. Don't hunt for the best board only because it's the best from other people's point of view or because a review said it rocks... you might end up finding that it's not the best for you.

Remember, it's only my opinion.

Cheers,
BoG

p.s. If you have the time, try reading this thread from top to bottom (well, not all of it) and you will see how at the start everyone was thrilled, and after the "reviews" (really just spec stories) it ended up in some kind of dispute between the ones who already don't like it because they've read that Anand said something or Tom said something (no offense intended at all), and the others who already love it or give it credit (which I don't blame them for).
 

Pocatello

Diamond Member
Oct 11, 1999
9,754
2
76
I guess I can order the GF4 4200 now without worrying about Matrox producing something similar in price with better performance.
 

Aquaman

Lifer
Dec 17, 1999
25,054
13
0
Originally posted by: Czar
mastabog,
extremely good post :D

Great post :D

I'm also tired of people whining and moaning about no benchmarks
rolleye.gif


It is, for all intents and purposes, a preview of the technology, like all other new video cards or new tech when they're first announced.

People have to be patient.......... wait for the real reviews in late June (when reviewers actually have a board) to make an informed choice.

Cheers,
Aquaman
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
"It is, for all intents and purposes, a preview of the technology, like all other new video cards or new tech when they're first announced."

Unless it's an NVIDIA-based card; in that case you look at everyone else's technology previews from the past couple of generations to see what they are finally adding.

"Nothing exciting..."

Though I agree there wasn't anything terribly exciting, I too would like to know what exactly you consider exciting. It's a video card; when is it ever exciting, and what were you expecting?