Originally posted by: JungleMan1
Originally posted by: hardwareking
Even if it's true, I don't see the point of having 4 physical cards in a typical desktop. Even if they go 7950 GX2 style, imagine the power consumption.
This'll be limited to the really high-end workstations which need that much power.

Or to the people who absolutely, absolutely have to have the fastest gaming system, period.
Originally posted by: mooncancook
Originally posted by: JungleMan1
Originally posted by: hardwareking
Even if it's true, I don't see the point of having 4 physical cards in a typical desktop. Even if they go 7950 GX2 style, imagine the power consumption.
This'll be limited to the really high-end workstations which need that much power.
Or to the people who absolutely, absolutely have to have the fastest gaming system, period.
Quad-crossfire warning: make sure you unplug all electrical appliances and turn off all lights in the house when running a quad-crossfire setup; fire extinguisher highly recommended.
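The warning is tongue in cheek, but the arithmetic behind it is easy to sanity-check. A minimal back-of-envelope sketch, with every figure assumed rather than measured (roughly 130 W per high-end card of the era, about 200 W for the rest of the system, 80% PSU efficiency):

```python
# Back-of-envelope power draw for a quad-GPU box.
# Every number below is an assumption for illustration, not a measured figure.
NUM_GPUS = 4
WATTS_PER_GPU = 130    # assumed load draw for a high-end card of the era
REST_OF_SYSTEM = 200   # assumed CPU, motherboard, drives, fans
PSU_EFFICIENCY = 0.8   # assumed power supply efficiency at load

dc_load = NUM_GPUS * WATTS_PER_GPU + REST_OF_SYSTEM
wall_draw = dc_load / PSU_EFFICIENCY
print(f"DC load ~{dc_load} W, draw at the wall ~{wall_draw:.0f} W")
```

On those assumptions the box pulls on the order of 900 W at the wall, which was power-supply territory very few off-the-shelf systems of the time were built for.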
Originally posted by: munky
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.
Originally posted by: Matt2
HAHA.
Anyone who drops $2400 on 4 GPUs should die.
Originally posted by: Nightmare225
Originally posted by: munky
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.
The G80 wasn't innovative?!
Originally posted by: munky
Originally posted by: Nightmare225
Originally posted by: munky
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.
The G80 wasn't innovative?!

Other than surprising everyone with the unified scalar shader architecture, I'd have to say no. HDR+AA has already been done on x1k cards, angle-independent AA was done many years ago, and the default driver settings still use brilinear filtering, although the g80 now doesn't suffer from texture crawling. On top of that, they disabled the supersampling AA modes that were present in previous-generation cards.
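Since the thread keeps coming back to the dropped SSAA modes, here is a minimal sketch of what supersampling does conceptually: shade at a multiple of the target resolution, then average each block of samples down to one pixel. The function names and the simple box filter are illustrative assumptions, not how any driver actually implements it:

```python
import numpy as np

# Ordered-grid supersampling, conceptually: shade the scene at
# (factor x factor) the target resolution, then box-filter each
# block of samples down to a single output pixel.
def supersample(shade, width, height, factor=2):
    hi_res = np.array([[shade(x / factor, y / factor)
                        for x in range(width * factor)]
                       for y in range(height * factor)])
    # Average each factor-by-factor block into one pixel.
    return hi_res.reshape(height, factor, width, factor).mean(axis=(1, 3))

# A hard diagonal edge: without AA each pixel is 0 or 1; with 2x2
# supersampling, edge pixels land on intermediate coverage values.
edge = lambda x, y: 1.0 if x >= y else 0.0
print(supersample(edge, 4, 4))
```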
Originally posted by: SickBeast
Originally posted by: munky
Originally posted by: Nightmare225
Originally posted by: munky
Reminds me of one of those shaving razor insanities: when you have no better ideas, throw more blades into it. This goes for both Nv and Ati - I'd rather see some real innovation in the IQ and features department.
The G80 wasn't innovative?!
Other than surprising everyone with the unified scalar shader architecture, I'd have to say no. HDR+AA has already been done on x1k cards, angle-independent AA was done many years ago, and the default driver settings still use brilinear filtering, although the g80 now doesn't suffer from texture crawling. On top of that, they disabled the supersampling AA modes that were present in previous-generation cards.

The G80 was as innovative and impressive as the R300 was, IMO.

Originally posted by: munky
Originally posted by: SickBeast
The G80 was as innovative and impressive as the R300 was, IMO.

However, the g80 and the r300 both made some IQ sacrifices to boost performance. And now that I think of it, they were both the first cards from Ati and Nvidia to drop fullscreen SSAA support.