R600 supports Quad setups

Furen

Golden Member
Oct 21, 2004
1,567
0
0
So does SLI, and look at just how much performance sucks when using it... I don't doubt that ATI's quad-CrossFire implementation will be better, only whether it will be good enough to warrant the insane cost increase compared to dual- and single-card setups.
 

hardwareking

Senior member
May 19, 2006
618
0
0
Even if it's true, I don't see the point of having 4 physical cards in a typical desktop. Even if they go 7950 GX2 style, imagine the power consumption.
This'll be limited to the really high-end workstations which need that much power.
 

JungleMan1

Golden Member
Nov 3, 2002
1,321
0
0
Originally posted by: hardwareking
Even if it's true, I don't see the point of having 4 physical cards in a typical desktop. Even if they go 7950 GX2 style, imagine the power consumption.
This'll be limited to the really high-end workstations which need that much power.
Or to the people who absolutely, absolutely have to have the fastest gaming system, period.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Makes sense as AMD is pushing their 4x4 platform.
4xCPU
4xGPU

AFAIK they were going to be forced to use Nvidia as a launch partner for this, but it appears they are working on something in-house for us.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
mmmm

I'm dreaming of the day when I can use 2x Kentsfield CPUs OCed to 3.4 GHz and 4x X2800 XTXs

:: Drool ::

Not to mention 4 GB of RAM. Where would I put my X-Fi Fatal1ty board, though?
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
Originally posted by: JungleMan1
Originally posted by: hardwareking
Even if it's true, I don't see the point of having 4 physical cards in a typical desktop. Even if they go 7950 GX2 style, imagine the power consumption.
This'll be limited to the really high-end workstations which need that much power.
Or to the people who absolutely, absolutely have to have the fastest gaming system, period.

Quad-CrossFire warning: make sure you unplug all electrical appliances and turn off all lights in the house when running a quad-CrossFire setup; fire extinguisher highly recommended.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
seems senseless to me... aside from the fact that performance certainly does not scale well, you have to consider cost, thermal output, etc.

dual-gpu setups have already shown us they are short-lived; so far, every generation a single-gpu card has outperformed the previous gen's SLI setup.

the marketing and hype by the manufacturers may have convinced some, but I just don't see the sense in it, at least from a consumer standpoint. it's great from the manufacturer's perspective to have gamers spend $1200 on their gfx tho...
 

terentenet

Senior member
Nov 8, 2005
387
0
0
4x R600? That's insane! They would have to convince the mobo industry to build boards with 4 PCI-E slots. They would have to change the ATX standard, because 4 video cards would leave no room for any other card (sound, network, SCSI/RAID controller).
I look at my computer and I can't even get my hand inside; there's no room left. And I only have 2 video cards and a sound card.
Quad R600 would only be possible GX2-style, and only if they can cool 2 of them at once. I trust AMD; they can do it.
Regardless of performance, I'll stick with Nvidia; their solutions seem more "elegant".
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
Meh, with 4 R600s one's case would melt down before you even finished loading Windows.
 

40sTheme

Golden Member
Sep 24, 2006
1,607
0
0
Originally posted by: mooncancook
Originally posted by: JungleMan1
Originally posted by: hardwareking
Even if it's true, I don't see the point of having 4 physical cards in a typical desktop. Even if they go 7950 GX2 style, imagine the power consumption.
This'll be limited to the really high-end workstations which need that much power.
Or to the people who absolutely, absolutely have to have the fastest gaming system, period.

Quad-CrossFire warning: make sure you unplug all electrical appliances and turn off all lights in the house when running a quad-CrossFire setup; fire extinguisher highly recommended.

Yeah, really... how far can we go without tripping the circuit breaker?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: munky
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.

The G80 wasn't innovative? :confused:
 
Oct 4, 2004
10,515
6
81
I can see the future...
Dual Quad-Core CPUs
Quad GPUs with SLIFire and HavokPhysX
Quad Raptors in RAID-0

All so we can play the same old Half-Life/Quake/Doom/Unreal/Elder Scrolls/Need for Speed sequels and go gaga over more individual leaves on trees and blood splatters unlike anything seen before.

I think music, movies, and video games all have the same potential to be a legitimate art form. Unfortunately, all these industries are hell-bent on reducing them to mass-produced commodities that look attractive enough to fly off shelves with ease.
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
I don't get some of you guys. You want faster video cards but you hate this idea? It's not like both AMD and NV aren't making their cards nearly 100% faster than the last gen, and this also lets the folks who are really into gaming, or who have spare money, spend it on something they enjoy...
It would be one thing if the next-gen cards were barely faster than the last, but they still catch up to the previous generation's SLI or CrossFire setups with no problem.
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Hmm, some people will definitely be needing that 1KW+ power supply after all.
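A rough back-of-envelope power budget backs this up. All the wattage figures below are guesses for the sake of illustration (nobody had measured R600 draw at this point), not real numbers:

```python
# Hypothetical power budget for a quad-GPU rig; every wattage here is an assumption.
gpu_w = 200        # assumed draw per R600-class card (speculative)
cpu_w = 125        # high-end CPU (speculative)
other_w = 150      # board, RAM, drives, fans (speculative)

total_dc = 4 * gpu_w + cpu_w + other_w   # DC load the PSU must supply
psu_efficiency = 0.8                     # typical PSU efficiency of the era
wall_draw = total_dc / psu_efficiency    # what the wall outlet actually sees

print(f"{total_dc} W DC load, ~{round(wall_draw)} W at the wall")
```

Even with conservative guesses, the DC load alone lands over the 1 kW mark, and the wall draw is higher still once PSU efficiency losses are counted.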
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Nightmare225
Originally posted by: munky
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.

The G80 wasn't innovative? :confused:

Other than surprising everyone with the unified scalar shader architecture, I'd have to say no. HDR+AA had already been done on X1K cards, angle-independent AA was done many years ago, and the default driver settings still use brilinear filtering, although the G80 at least doesn't suffer from texture crawling. On top of that, they disabled the supersampling AA modes that were present in previous-generation cards.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: munky
Originally posted by: Nightmare225
Originally posted by: munky
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.

The G80 wasn't innovative? :confused:

Other than surprising everyone with the unified scalar shader architecture, I'd have to say no. HDR+AA had already been done on X1K cards, angle-independent AA was done many years ago, and the default driver settings still use brilinear filtering, although the G80 at least doesn't suffer from texture crawling. On top of that, they disabled the supersampling AA modes that were present in previous-generation cards.
The G80 was as innovative and impressive as the R300 was IMO.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: SickBeast
Originally posted by: munky
Originally posted by: Nightmare225
Originally posted by: munky
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.

The G80 wasn't innovative? :confused:

Other than surprising everyone with the unified scalar shader architecture, I'd have to say no. HDR+AA had already been done on X1K cards, angle-independent AA was done many years ago, and the default driver settings still use brilinear filtering, although the G80 at least doesn't suffer from texture crawling. On top of that, they disabled the supersampling AA modes that were present in previous-generation cards.
The G80 was as innovative and impressive as the R300 was IMO.
However, the G80 and the R300 both made some IQ sacrifices to boost performance. And now that I think of it, they were both the first cards from ATI and Nvidia to drop full-screen SSAA support.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SickBeast

The G80 was as innovative and impressive as the R300 was IMO.

I would still list the 6xxx series at the top: SM3.0, PureVideo, HDR, SLI, etc.

Although the G80 could have a trick up its sleeve with those "stream processors".