
Alienware video array benchmarks!!

3DMark 03 is probably the worst benchmark they could have used. We all know it is almost 100% video-card bound, in stark contrast to most games today, which are CPU bound. So in a video-bound application, adding a second card can nearly double performance? Some real breaking news there... Let's wait until we see some REAL game benchmarks.

Didn't Alienware announce the prices for some of their ALX systems already? I believe the high end default configuration was already over $5000, and was single CPU, no video array technology. Lord knows what a fully decked out SMP/video SLI system will cost.
 
Yes, I know no one here will actually buy one of those machines, not at their prices. The impressive implication of the technology, though, is that it could be licensed out to other makers, or other companies could develop their own similar technology after seeing it's possible.
 
Originally posted by: SgtZulu
I'm sure everyone here has $5000 burning a hole in their pocket to blow on a PC :roll:
Of course not... But that certainly doesn't mean it isn't an interesting topic.
 
Originally posted by: Wingznut
Originally posted by: SgtZulu
I'm sure everyone here has $5000 burning a hole in their pocket to blow on a PC :roll:
Of course not... But that certainly doesn't mean it isn't an interesting topic.

I dunno, I'm definitely hardcore into tweaking my own PC, but you can't argue with 77% improvement @ that level of detail. How about buying a stripped down one and putting in all your own parts, and selling off the stock ones? 😀
 
How hard would it have been for them to run a few UT2K4 & Far Cry benchmarks? What a joke.....
 
Originally posted by: Nebor
Originally posted by: Wingznut
Originally posted by: SgtZulu
I'm sure everyone here has $5000 burning a hole in their pocket to blow on a PC :roll:
Of course not... But that certainly doesn't mean it isn't an interesting topic.

I dunno, I'm definitely hardcore into tweaking my own PC, but you can't argue with 77% improvement @ that level of detail. How about buying a stripped down one and putting in all your own parts, and selling off the stock ones? 😀
You would basically have to use all their parts: the 2x 16x PCI-E motherboard, the Xeons, the video cards, the merger hub, the power supply, probably the case, and you'd be smart to keep the liquid cooling system too (2 GPUs + 2 CPUs = hot).
 
Originally posted by: Xernex
Yes, I know no one here will actually buy one of those machines, not at their prices. The impressive implication of the technology, though, is that it could be licensed out to other makers, or other companies could develop their own similar technology after seeing it's possible.
Didn't ATI do something similar with their Rage Fury MAXX GPUs?

Sounds like it could even be a "budget" solution with two less-than-top-of-the-line GPUs
 
From what I understand, although their approach might be proprietary, the technology can be utilized on any board with two PCI Express slots, which means this could come to the masses soon enough. Them using Xeons is ridiculous, unless they are the Xeons with 1MB of L2 (and even then an FX would probably still beat it in some games). Of course, AMD doesn't have the chipset for this yet.

Exciting, but probably too expensive. I'd have to see the benchmarks at lower resolutions. Even doing this yourself would cost a fortune; I'm sure a motherboard supporting this (non-multi-CPU) will cost upwards of $200-250.
 
Originally posted by: apoppin
Originally posted by: Xernex
Yes, I know no one here will actually buy one of those machines, not at their prices. The impressive implication of the technology, though, is that it could be licensed out to other makers, or other companies could develop their own similar technology after seeing it's possible.
Didn't ATI do something similar with their Rage Fury MAXX GPUs?

Sounds like it could even be a "budget" solution with two less-than-top-of-the-line GPUs

I don't remember if they licensed them or not, and I don't think anyone else used ATi's AFR (Alternate Frame Rendering) technique. 3dfx used SLI (Scan-Line Interleave), and I can't find what the Matrox cards used. Alienware actually developed a way to use two 3dfx cards together (NOT the Voodoo 2's), but by the time they got it working and ready to sell, the GeForce came out and stomped all over the Voodoo3. This is a distant child of that technology.
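For anyone fuzzy on the difference between the two schemes mentioned above, here's a tiny illustrative sketch (hypothetical, not any vendor's actual driver code): 3dfx's Scan-Line Interleave splits each frame's scanlines between the GPUs, while ATI's Alternate Frame Rendering hands out whole frames round-robin.

```python
def split_frame_sli(height, num_gpus=2):
    """Scan-Line Interleave: each GPU renders every Nth scanline of a frame."""
    return {g: list(range(g, height, num_gpus)) for g in range(num_gpus)}

def split_frames_afr(num_frames, num_gpus=2):
    """Alternate Frame Rendering: each GPU renders every Nth whole frame."""
    return {g: list(range(g, num_frames, num_gpus)) for g in range(num_gpus)}

# For an 8-line frame, SLI gives GPU 0 the even scanlines, GPU 1 the odd ones.
print(split_frame_sli(8))    # {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}
# Over 6 frames, AFR gives GPU 0 the even frames, GPU 1 the odd ones.
print(split_frames_afr(6))   # {0: [0, 2, 4], 1: [1, 3, 5]}
```

The practical difference: SLI shares the load within a single frame (good for latency), while AFR keeps each GPU working on an independent frame (simpler, but frames must be buffered).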
 
apoppin, my understanding is that linking GPUs together like that is getting harder and harder due to high speeds. There's a point where it's better to link than to build a better GPU, but that's at the high end, where a 400mm^2 GPU die would cost more than 2 GPUs plus the linkage.

PS remagavon, I believe the Xeons are due to the chipset used: Tumwater. Evidently, there aren't enough PCI-E lanes on a 915/925 chipset for two PCI-E 16x slots (you need 32 lanes; the 915/925 has 20).
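The lane arithmetic behind that claim is simple enough to write out (lane counts as quoted in the thread, not verified against Intel datasheets):

```python
# Two full x16 slots need 16 lanes each; compare against the desktop
# chipset's lane budget cited above.
lanes_per_x16_slot = 16
slots_needed = 2
lanes_needed = lanes_per_x16_slot * slots_needed
desktop_chipset_lanes = 20  # 915/925, per the post above

print(lanes_needed)                           # 32
print(lanes_needed <= desktop_chipset_lanes)  # False: 915/925 falls 12 short
```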
 
Originally posted by: ViRGE
apoppin, my understanding is that linking GPUs together like that is getting harder and harder due to high speeds. There's a point where it's better to link than to build a better GPU, but that's at the high end, where a 400mm^2 GPU die would cost more than 2 GPUs plus the linkage.

PS remagavon, I believe the Xeons are due to the chipset used: Tumwater. Evidently, there aren't enough PCI-E lanes on a 915/925 chipset for two PCI-E 16x slots (you need 32 lanes; the 915/925 has 20).

I see, the number of lanes didn't occur to me. Thanks for the info 🙂 (Damn, I don't want to buy Xeons again; I had dual 2.4s that cost me a fortune and didn't perform that much smoother than a single-CPU system, and I do a lot of things at once, too.)
 
Originally posted by: Xernex
Yeah why would they use Xeons....

If you were Alienware, would you rather go with a first generation PCI-e motherboard based on Intel tech or use an AMD compatible (VIA, SiS?) board?
 
Originally posted by: Pariah
Originally posted by: Xernex
Yeah why would they use Xeons....

If you were Alienware, would you rather go with a first generation PCI-e motherboard based on Intel tech or use an AMD compatible (VIA, SiS?) board?

If I were Alienware, I would be like, WTF, I'm a company? 😕
 