the new Alienware Video Array


Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: MercenaryForHire
Originally posted by: Naustica
Originally posted by: MercenaryForHire
the new Alienware Video Array ...

... is an overpriced POS that will be available on every other PCIe board in short order, making this purchase only mildly less moronic, as you're still spending twice what you need to just for the logo.

- M4H


Harsh but true.

Yeah, that about sums me up as a whole. :D

- M4H


Perhaps you didn't read exactly WHAT this does. It's not simply running two cards on two or more monitors, it's using two cards in PARALLEL to render one image. Think of it as a Video RAID setup.
 
Jan 31, 2002
40,819
2
0
Originally posted by: Creig
Perhaps you didn't read exactly WHAT this does. It's not simply running two cards on two or more monitors, it's using two cards in PARALLEL to render one image. Think of it as a Video RAID setup.

Really, I mean, I never would have guessed ... having replied on this issue the other twenty times it was posted on various forums, having already compared it to the SLI setup of the olden-day Voodoo cards, and alluding to the fact that Metabyte's PGC tech was purchased by Alienware back in the TNT2 era? :roll:

I got two words for you - driver issues. Alienware hopes to do this by splitting the screen into two halves. I'd like to see how AA and AF are handled across the divide, let alone the fundamental vertex/texture data of planes/objects that cross it. Let's all think back to the last time ATI tried to mate two cards ... all together now, Rage Fury MAXX. ;)

And then, shall we say, another bolded buzzword? CPU limited. Even with a single R420/NV40 card, you need the fastest chip possible to avoid being choked at the processor. By that theory, you'll need a pair of those to keep up.

When I see some working silicon, flawless screenshots, and the same compatibility as a single card - then I'll pay attention. But by then, I'll be paying attention because another OEM made it better, faster, stronger, and cheaper.

- M4H
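The seam problem M4H raises (AA and AF across the divide) can be illustrated with a toy sketch. The guard-band idea below is a common general technique for split-frame rendering, offered here as an illustrative assumption - there is nothing in the thread confirming Alienware worked this way. All function names are made up for the example.

```python
# Toy sketch of split-frame rendering with a guard band, showing why
# filtering at the divide is tricky: a filter kernel on the seam row
# needs pixels from the OTHER half. Each "GPU" therefore renders a few
# extra rows past the split, and the merge step discards the overlap.

def render_scanline(y, width):
    """Stand-in renderer: returns one row of 'pixel' values."""
    return [(x + y) % 256 for x in range(width)]

def split_frame_render(width, height, guard=2):
    """Render the frame on two 'GPUs', each doing its half plus a
    guard band of rows, then merge into one full frame."""
    mid = height // 2
    # GPU 0 renders rows [0, mid + guard); GPU 1 renders [mid - guard, height).
    gpu0 = [render_scanline(y, width) for y in range(0, mid + guard)]
    gpu1 = [render_scanline(y, width) for y in range(mid - guard, height)]
    # Merge: keep only each GPU's own half; guard rows are discarded.
    return gpu0[:mid] + gpu1[guard:]

frame = split_frame_render(8, 8)
assert len(frame) == 8
# The seam row matches what a single renderer would have produced.
assert frame[4] == render_scanline(4, 8)
```

Without the guard band, any filter wider than one pixel (AA resolve, AF footprints) would read garbage at the split line, which is exactly the artifact class M4H is worried about.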
 

Buz2b

Diamond Member
Jun 2, 2001
4,619
0
0
Originally posted by: MercenaryForHire

Really, I mean, I never would have guessed ... having replied on this issue the other twenty times it was posted on various forums, having already compared it to the SLI setup of the olden-day Voodoo cards, and alluding to the fact that Metabyte's PGC tech was purchased by Alienware back in the TNT2 era? :roll:

I got two words for you - driver issues. Alienware hopes to do this by splitting the screen into two halves. I'd like to see how AA and AF are handled across the divide, let alone the fundamental vertex/texture data of planes/objects that cross it. Let's all think back to the last time ATI tried to mate two cards ... all together now, Rage Fury MAXX. ;)

And then, shall we say, another bolded buzzword? CPU limited. Even with a single R420/NV40 card, you need the fastest chip possible to avoid being choked at the processor. By that theory, you'll need a pair of those to keep up.

When I see some working silicon, flawless screenshots, and the same compatibility as a single card - then I'll pay attention. But by then, I'll be paying attention because another OEM made it better, faster, stronger, and cheaper.

- M4H

I respect your "lifer" status and knowledge on this issue, but have you read the FAQ page from Alienware on this product? Granted, they will always present the best-case scenario when possible, but I think they address the driver issue pretty well.

As for the CPU-limited issue, keep in mind that each card is working with the faster bandwidth of PCI Express and is NOT required to render the entire screen. Collectively they do, but not individually. I would think (just using logic here, NOT techno know-how) that this would offset the taxing of the single CPU by two cards.

Bottom line is that I don't think Alienware would make such a big to-do about the release of this product without some sort of proof of technology. They are debuting it at the E3 conference as well, where they have "working" models up and running. Yes, it's a long way from being in our homes (even if we could afford it), but I would say it does show some promise. Also, keep in mind that, unlike previous attempts at this, they have a sort of "controller" (referred to as a "software solution as well as a video merger hub") that works with the system. Look, I'm as much a "show me if you can really do it" kind of guy as the next, but I think you are being a bit pessimistic on this one.
 
Jan 31, 2002
40,819
2
0
Originally posted by: Buz2b
I respect your "lifer" status and knowledge on this issue, but have you read the FAQ page from Alienware on this product? Granted, they will always present the best-case scenario when possible, but I think they address the driver issue pretty well.

As for the CPU-limited issue, keep in mind that each card is working with the faster bandwidth of PCI Express and is NOT required to render the entire screen. Collectively they do, but not individually. I would think (just using logic here, NOT techno know-how) that this would offset the taxing of the single CPU by two cards.

Bottom line is that I don't think Alienware would make such a big to-do about the release of this product without some sort of proof of technology. They are debuting it at the E3 conference as well, where they have "working" models up and running. Yes, it's a long way from being in our homes (even if we could afford it), but I would say it does show some promise. Also, keep in mind that, unlike previous attempts at this, they have a sort of "controller" (referred to as a "software solution as well as a video merger hub") that works with the system. Look, I'm as much a "show me if you can really do it" kind of guy as the next, but I think you are being a bit pessimistic on this one.

Alienware's driver handling can be summed up as - "It'll work. We promise. No, really. We'll make it work this time." That demo video is cute - but then again, I can make high-bandwidth prerendered video too and fake a four-way Video Array in a heartbeat. And if this is the "next generation of no-compromises video cards" ... why are they running only DX7 (3DMark03 Test 1) and elementary OGL (Q3A) applications instead of showing off how badass Far Cry looks at 2048x1536 with all the goodies on?

CPU limiting - right now, there isn't a single CPU fast enough to pump out the requisite data to saturate an AGP 8X pipe, never mind a pair of PCIe x16 slots. A pair of R420/NV40 cards will likely outrun a dual-CPU system similarly.

And yes, I'm being pessimistic. I haven't seen anything that should indicate I should act otherwise. If Alienware actually manages to pull this off flawlessly, I'll be genuinely impressed and more than happy to eat a serving of crow with a side of humble pie.

- M4H
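The CPU-limited argument both posters are circling can be put into a toy model: per-frame CPU work (game logic, physics, draw submission) stays serial no matter how many GPUs you add, while fill/shading work splits across cards. The numbers below are purely illustrative, not measurements of any real system.

```python
# Toy model of why two GPUs don't double frame rate when the CPU is
# the bottleneck: frame time is bounded by the slower of the two
# stages, and only the GPU stage parallelizes.

def frame_time(cpu_ms, gpu_ms, num_gpus):
    """CPU work is serial per frame; GPU work divides across cards.
    The longer stage determines the frame time."""
    return max(cpu_ms, gpu_ms / num_gpus)

# GPU-bound case: a second GPU nearly doubles the frame rate.
assert frame_time(5.0, 20.0, 1) == 20.0
assert frame_time(5.0, 20.0, 2) == 10.0

# CPU-bound case: the second GPU buys almost nothing.
assert frame_time(15.0, 16.0, 1) == 16.0
assert frame_time(15.0, 16.0, 2) == 15.0
```

This is just Amdahl's law in miniature: the serial CPU portion caps the speedup, which is M4H's point, while Buz2b's counter is that halving each card's rendering load still helps whenever the GPU stage is the longer one.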
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,738
156
106
It already costs like 400 or 500 for a top-end card.

How many of you would pay 1000 for a video card setup like this if you could only get a 25 or 30% performance increase in Far Cry, for example??
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,738
156
106
Originally posted by: Buz2b
Originally posted by: Creig
When I see some working silicon, flawless screenshots, and the same compatibility as a single-card - then I'll pay attention.


Then pay attention. http://www.pcper.com/images/reviews/43/Alienware_VideoArray.wmv (for working silicon, at the very least).


Boy, that link sure doesn't like Mozilla! Got it fine with IE but Firefox choked on me.


Netscape and Netscape-like browsers hate WMV unless you've got the right plugins to handle it.
Crashed Netscape for me too :)
 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
If this actually works, though, the plus will be situations where memory bandwidth and GPU fillrate limits are eased by a large margin. AA and AF are going to be easier to enable, but when the CPU is a factor, such as in physics calculations, you won't see as much gain. So basically this is more of an eye-candy upgrade. And who will buy this? The same people that would buy the top-end cards today (2x the cost for 25% better performance, i.e. XT and Ultra cards).
 

ForceCalibur

Banned
Mar 20, 2004
608
0
0
No, it won't cost 1000 dollars. More like 400 dollars for two cards (like Radeon 9800 Pros). As time passes, the X800 XT will drop a little bit (not much until R480/R500), but the X800 Pro will drop like a F8Cken rock, down to where Radeon 9800 Pros are right now.

Sure, if you're an early adopter of new-gen video cards, it will cost you 1000 maybe, but it's for the super-enthusiast market; most of us couldn't care less.
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,738
156
106
The dual PCI Express capable boards won't be cheap either, I bet :)

Kinda reminds me of dual-channel memory hehe :)


Yeah, I think it would be great to throw two budget cards in this config and get the performance of a top-end card or better :)
 

mrwxyz

Senior member
Feb 7, 2004
334
0
71
Originally posted by: Soulkeeper
Yeah, I think it would be great to throw two budget cards in this config and get the performance of a top-end card or better :)

Now that's something I can go for.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Creig
When I see some working silicon, flawless screenshots, and the same compatibility as a single-card - then I'll pay attention.


Then pay attention. http://www.pcper.com/images/reviews/43/Alienware_VideoArray.wmv (for working silicon, at the very least).

Hahah, I like the way the guy quickly moves in front of the monitors displaying 3DMark03... he's like, oh f**k, it's knackered... cover it up!

But IMO I think this is pretty pointless... I mean, the two halves have to be stitched together by another piece of hardware, right? So wouldn't the performance rest on how well this other hardware can handle things?
 

Buz2b

Diamond Member
Jun 2, 2001
4,619
0
0
If nothing else it shows that, first, they DO HAVE working systems in place and are gearing up for the market. Second, you have to keep in mind that the figures shown (whether true or not) will undoubtedly be a bit inflated; that is the nature of those kinds of marketing stats. That being said, if it only offers a 35-40% average improvement instead of 77%, there will still be a lot of folks trying to figure out how they can put together enough scratch for one of these systems. I'm not a gamer, so I wouldn't be one of them, but I do appreciate any and all new developments that attempt to raise the bar on performance, no matter what area they're intended for. :D
 

somekid617

Member
Mar 27, 2004
143
0
0
The game looked sorta glitchy while I was watching the movie... was it me, or did Quake 3 look all distorted and funky? They prolly took an old Voodoo card, took out whatever made the two of them work together, then sent it to some manufacturer to make/enhance it.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Lol, I doubt that.

Hey guys, just out of curiosity... why the hell are they running a single Xeon processor? If they want high-end Intel, WHY OH WHY go Xeon... 533FSB, C'MON... high-end Intel = P4 Emergency Edition (Extreme). And if they really wanted to show it off, slap in a pair of Athlon 64 FX-53's. I think they can make this system better, lol... yeah, seriously... they may have maxed-out video cards, but a XEON? The performance gains are impressive though.

-Kevin
 
Jun 13, 2004
28
0
0
The systems come with FX-53's and start at $4,800 USD. Dual X800 XTs or 6800U's, take your pick. The load-balancing system looks to work pretty well. I'd get it if I could afford it. They come stocked with liquid cooling as well. And you can't really run an ATI card in one slot and an NV card in the other; the drivers have to match up. And if you don't use the same card, the load will be very off-balance, which takes away from the performance gain in the first place.
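The load balancing mentioned above can be sketched as a simple feedback loop: shift the screen split toward the faster card based on the previous frame's timings. This is a guess at the general technique such a "video merger hub" setup might use, not Alienware's actual implementation; the function name and step size are made up for the example.

```python
# Toy sketch of dynamic split-line load balancing for split-frame
# rendering: after each frame, move the divide a few scanlines toward
# whichever GPU finished later, so the workload evens out over time.

def rebalance(split, t_top_ms, t_bottom_ms, height, step=8):
    """Return the new split row. The slower half gets fewer rows
    next frame; the split is clamped so neither half vanishes."""
    if t_top_ms > t_bottom_ms:
        split -= step   # top half was slower: shrink it
    elif t_bottom_ms > t_top_ms:
        split += step   # bottom half was slower: shrink it
    return max(step, min(height - step, split))

split = 384  # start with an even split of a 768-row frame
# If the top GPU keeps finishing late, the split drifts upward.
for _ in range(3):
    split = rebalance(split, t_top_ms=14.0, t_bottom_ms=10.0, height=768)
assert split == 360
```

The point of balancing per frame rather than using a fixed 50/50 split is that scene complexity is rarely symmetric (sky on top, geometry-heavy ground below), so an even row count does not mean even work.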
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Pete
Someone at B3D reminded us that this is similar to what Alienware tried to do previously. They bought custom tech from Metabyte (remember them? :)) that allowed them to SLI anything, as it worked on a software level. Should be interesting, but a 700W PSU seems ridiculous.

Ahh! That makes perfect sense then. I was just thinking about Metabyte's Wicked3D drivers (I think that was them), useful for using those LCD 3D shutter glasses with games not originally designed to support them. They were some sort of "shim" driver, sitting between the game and the actual hardware D3D/OpenGL drivers.
I was wondering to myself if Alienware's software setup was similar, and hearing that they actually bought the technology makes so much sense now. Thanks for the info.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: fwtong
This will come in handy for the day when games sponsored by Nvidia will only work with Nvidia cards and games sponsored by ATI will only work with ATI cards.

Sounds... just like the days of older console game systems!

Ugh. :(
 

T9D

Diamond Member
Dec 1, 2001
5,320
6
0
Wow you guys are so negative.

This is an awesome technology. I'm excited about it. So much potential here.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: tk109
Wow you guys are so negative.

This is an awesome technology. I'm excited about it. So much potential here.

It only has potential if we have a CPU that can actually push that much graphics power. Honestly, what's to get excited about if you spend more than twice as much for your graphics cards only to get a 30% boost in most situations?

Once games are multithreaded I can see technology like this having its place. As of right now, we are CPU limited even with just a single 6800U in most situations.