XGI Volari V8 Duo - Review of Final

McArra

Diamond Member
May 21, 2003
3,295
0
0
PS performance (both 2.0 and 1.1/1.4) seems really weak.
:rolleyes:


It has very strong and very weak points. We'll have to see how drivers mature.

AA performance is horrible.
 

GZFant

Senior member
Feb 18, 2003
437
0
76
Damn, and I was hoping the Volari line of cards was going to be amazing. Then Nvidia and Ati would be shi**ing bricks and have to work that much harder.

There goes next-gen Nvidia and ATi truly kicking ass.

It's like the car market: Volari came in with its dual 4-cylinder Honda against Nvidia's (biased, because I like Nvidia) McLaren F1 and ATi's Corvette...

*cough*

that was dumb.
 

beatle

Diamond Member
Apr 2, 2001
5,661
5
81
Too bad... also, the cost of putting two chips on the board, the pricey-looking cooling setup, and the resulting heat of those chips add up to an uncomfortable feeling in the wallet and the case.
 

lchen66666

Senior member
Aug 11, 2000
359
0
0

It took ATI more than six months to stabilize the drivers after the introduction of the Radeon 9700.

I bet it will take XGI much longer to come up with a good driver. Plus, right now there are better offerings from ATI and Nvidia.

I definitely will not consider one within the next 6-9 months.


leland
 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
It does quite well in terms of fill rate (especially multi-texturing) and good old pure T&L (not PS/VS), obviously because it's got two processors to work on this, which probably lends itself to the good Quake3 performance. It seems like it will be a good card for older types of games that rely on strong texturing performance and a lot of T&L work.

If the chips themselves are cheap, then maybe the single-chip solutions will compare well with the Radeon 9600/GFFX 5600 level of performance. We will have to wait until the full line-up is released.
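As a back-of-the-envelope illustration of why the dual-chip layout helps raw texturing throughput, here is a minimal sketch; the clock, pipeline, and TMU figures are illustrative assumptions, not confirmed Volari specs.

```python
# Rough theoretical fill-rate arithmetic (illustrative numbers, not confirmed Volari specs).
def fill_rate_mtexels(core_mhz, pipelines, tmus_per_pipe, num_chips=1):
    """Peak multi-texturing fill rate in Mtexels/s."""
    return core_mhz * pipelines * tmus_per_pipe * num_chips

single = fill_rate_mtexels(350, 8, 1, num_chips=1)  # 2800 Mtexels/s for one hypothetical chip
duo = fill_rate_mtexels(350, 8, 1, num_chips=2)     # 5600 Mtexels/s with both chips working
print(single, duo)
```

Doubling the chips doubles the theoretical texel throughput, which lines up with the strong multi-texturing and fill-rate numbers in the review.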
 

Shinei

Senior member
Nov 23, 2003
200
0
0
And here I was, hoping to proudly showcase an XGI Volari in my computer as an alternative to the iffy ATI and weaker nVidia cards. Their AF had better be phenomenal at low levels in order to compete with ATI/nVidia for IQ without AA... Maybe in release 2 of the Reactors? I'm really curious to see what these cards can do when given a proper driver, considering that the "RAID 0" GPU setup and massive memory bandwidth are pretty much on par with nVidia's latest offering as far as hardware goes.
[/jumbled thoughts]
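On the memory-bandwidth point, a minimal arithmetic sketch; the bus width and clock below are example values, not confirmed Volari V8 Duo specs.

```python
# Illustrative peak-bandwidth arithmetic (example bus width and clock, not official specs).
def bandwidth_gbs(bus_width_bits, effective_mhz):
    """Peak bandwidth in GB/s for one memory interface."""
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

per_chip = bandwidth_gbs(128, 900)  # hypothetical 128-bit bus at 900 MHz effective
print(per_chip, per_chip * 2)       # 14.4 GB/s per chip, 28.8 GB/s total across two chips
```

The caveat is that each chip only sees its own memory pool, so the combined figure is not directly comparable to a single card with one wide bus.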
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
They say it's about 350 dollars for the card.

They have terrible image quality, but it clears up with AA; that's a fault in the drivers. I think that because of their excellent T&L and fill rate they show a lot of power and potential. The drivers aren't mature enough with pixel shaders, but a couple of months might make this card really shine as a cheaper and better alternative to both ATi and nVIDIA.

They have better AA than ATi, or at least similar. The AA performance problems are driver problems. They have a lot of power in both chips.

Do you think it would have been a lot cheaper to make a single 16-pipe chip?
 

stardust

Golden Member
May 17, 2003
1,282
0
0
The card is very strong in high-resolution multitexturing; just look at the benchies.
 

sandorski

No Lifer
Oct 10, 1999
70,822
6,366
126
It seems strong on some points, but very weak in others. Hopefully they can survive and improve their tech for a few generations. Maybe then we'll have some XGI vs ?? flamewars. :)
 

bpt8056

Senior member
Jan 31, 2001
528
0
0
Drivers can make or break a product in the video industry. They've got a lot of work to do if they ever hope to be competitive. I was also impressed by the fill rate of this card, which only shows how much potential it has.
 

Shinei

Senior member
Nov 23, 2003
200
0
0
At this rate, it might be XGI against ATI, unless nVidia gets their act together for NV40... Though with XGI's drivers just out of pre-production, it may be a while before we see real potential, and by then we might be looking at PS/VS3.0 enabled cards, making these models rather obsolete...
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
XGI is a combination of Trident and SiS, one or both of whom "cheated" by lowering texture quality to mud with previous cards, IIRC (look up a Xabre review). If memory serves, the Xabre also misadvertised its pixel pipeline configuration, saying it was 4x2 when it was 2x4. I believe XbitLabs or Digit-Life showed this in a review using 3DMark.
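For anyone wondering why the 4x2 vs. 2x4 label matters, here is a quick sketch under illustrative assumptions (the clock value is made up, not an actual Xabre spec): both layouts have eight texture units and the same peak multi-texture rate, but the 2x4 arrangement halves the single-texture pixel rate.

```python
# Illustrative 4x2 vs 2x4 pipeline comparison (pipes x TMUs-per-pipe).
# The clock speed is an example value, not a real Xabre spec.
CLOCK_MHZ = 250

def rates(pipes, tmus_per_pipe, clock_mhz=CLOCK_MHZ):
    pixel_rate = clock_mhz * pipes                  # Mpixels/s when laying down one texture
    texel_rate = clock_mhz * pipes * tmus_per_pipe  # Mtexels/s when multi-texturing
    return pixel_rate, texel_rate

print("4x2:", rates(4, 2))  # (1000, 2000): full pixel rate
print("2x4:", rates(2, 4))  # (500, 2000): same texel rate, half the pixel rate
```

That is why a multi-texturing fill-rate test alone can't tell the two configurations apart, and presumably the kind of difference the 3DMark comparison mentioned above exposed.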
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Shinei
At this rate, it might be XGI against ATI, unless nVidia gets their act together for NV40... Though with XGI's drivers just out of pre-production, it may be a while before we see real potential, and by then we might be looking at PS/VS3.0 enabled cards, making these models rather obsolete...
LOL
A. nVidia sells more video card processors than ATI.
B. You can bleat about PS 2.0/DX9 all you like, but before doing so, please list the games where a GF 5900 owner would be severely disadvantaged by this. Only games I can buy at the store now.
C. Besides the wonderful and wildly successful "Tomb Raider: Angel of Darkness", it would be fairly difficult to tell the difference between a 5900 and a 9800 in your box.
:rolleyes:
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
Originally posted by: shady06
Originally posted by: edmundoab
Originally posted by: Schadenfroh
they shoulda stuck 4 GPUs on there

YEAH! dedicate all to rendering..

at least XGI dont cheat like ATi and Nvidia.. as yet

please refresh my memory about ati cheating...

There was the HardOCP thing about a month ago, where they showed the ATI cards don't draw all the fog and smoke.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
OK, ATi cheated back in the day, I think it was with the original RADEON, but I'm not sure. They had a benchmark detector just like nVIDIA had, but it was for Quake II I believe. So, they cheated.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I'm a new supporter of XGI.

nVIDIA and ATi suck compared to XGI.

As if they ever had any competition.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: hans007
There was the HardOCP thing about a month ago, where they showed the ATI cards don't draw all the fog and smoke.
Are you referring to AquaMark 3 (which I don't believe [ H ] broke the story on)? Because that wasn't cheating; it was basically an error/quirk of ATi's rounding in hardware (or something like that). More proof that it wasn't cheating? ATi rendered AM3 the exact same way with drivers released before AM3 came out.

Maybe you're talking about something else, though, as I don't keep up with [ H ]'s front page.