The speculation on Nvidia's dual part has died down in this thread. We should create a friendly bet/poll on what people think will end up forming the x2 GPU part. I personally think it's going to be a hybrid sort of GF110: it will have all 512 shaders like the GTX 580 on each chip, but downclocked to around GTX 570 speeds. Further, Nvidia may also opt to cut the memory bus down to 320-bit, both to keep it within the 300 W thermal envelope AND to keep the price down with slightly less RAM.
Have to totally agree on that one. Remember that the GTX 295 was not a dual 280 or 285; rather, it was two of the then-unreleased 275s. I would be happy with dual 570s and a little less RAM. It wouldn't be a far stretch to go from 1280 MB to 1024 MB to save on cost. Then we may see the card split later as the GTX 565.
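(For anyone following the bus-width math: GDDR5 chips hang off 32-bit channels, so framebuffer size moves in lockstep with bus width. Here's a quick back-of-the-envelope sketch; the 128 MB-per-chip figure assumes the 1 Gbit GDDR5 parts current boards use.)

```python
# Rough GDDR5 capacity math: each 32-bit channel drives one memory chip.
# CHIP_MB = 128 assumes the 1 Gbit GDDR5 parts on current GF110 boards.
CHIP_MB = 128

def vram_for_bus(bus_width_bits):
    chips = bus_width_bits // 32   # one chip per 32-bit channel
    return chips * CHIP_MB         # total framebuffer in MB

for bus in (384, 320, 256):
    print(f"{bus}-bit bus -> {bus // 32} chips -> {vram_for_bus(bus)} MB")

# 384-bit -> 12 chips -> 1536 MB  (GTX 580)
# 320-bit -> 10 chips -> 1280 MB  (GTX 570)
# 256-bit ->  8 chips -> 1024 MB
```

Note that 1024 MB doesn't divide evenly across a 320-bit bus with uniform chips, so a drop to 1024 MB would actually suggest a 256-bit cut rather than 320-bit.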
If I had any interest in multi-GPU, my only concern would be performance.
Griping about wattage on a dual-GPU card makes even less sense than on a single GPU... it's like whining about the MPG on a Veyron (dual V8)...
No, it's not really like that at all.
It's like complaining about a sports car that gets worse gas mileage while being slower in most metrics than a competing car.
You say you are all about performance, so why do you dismiss multi-GPU setups? There are times when multi-GPU won't work as it's supposed to, so you'll only get single-GPU performance. But far more often than not, you'll be faster. A 5970 is/was faster than a GTX 480 much more often than it wasn't (while managing to actually use a little less power!)
It's fine that you like single GPUs, but I don't see how you can claim to only be about performance when that appears not to be the case... your comparison to cars is flawed.
Russian said it best.
Are we really gonna cry about GPU wattage when we overclock our CPUs past 200 watts?
Seems kinda dumb, huh?
That is why.
Did you hack my account, or do we think exactly alike?
Just cause you don't have a need for DP doesn't mean other people don't. There is nothing wrong with covering all your bases.
Pretty much this. Hahaha, I could have kept going with my theories but I stopped just short. I don't think Nvidia can release two downclocked GTX 570s and expect to have the hands-down fastest dual-GPU solution. And they won't release a dual-card solution unless it's faster than AMD's dual card. So they're either going to break the 300 W TDP specification, or they're going to release a 512-core part at GTX 570 speeds and memory bus.
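To put rough numbers on the 300 W argument: dynamic power scales roughly with frequency times voltage squared, so clock/voltage cuts buy headroom fast, but maybe not fast enough. A sketch under stated assumptions (the 30 W board overhead and the voltage-tracks-clock scaling are simplifications, not measurements):

```python
# Back-of-the-envelope power scaling: dynamic power ~ f * V^2.
# Assuming voltage is dropped in proportion to clock (a simplification),
# power scales roughly with the cube of the clock ratio.
GTX580_TDP_W = 244      # Nvidia's rated TDP for a single GTX 580
BOARD_BUDGET_W = 300    # the PCIe power ceiling the thread is arguing about
OVERHEAD_W = 30         # assumed non-GPU board power (VRMs, fan, etc.)

per_gpu_budget_w = (BOARD_BUDGET_W - OVERHEAD_W) / 2          # ~135 W each
clock_scale = (per_gpu_budget_w / GTX580_TDP_W) ** (1 / 3)    # cube-root rule

print(f"Per-GPU budget: {per_gpu_budget_w:.0f} W")
print(f"Clock scale needed: {clock_scale:.2f}x of GTX 580 clocks")
# ~0.82x of the 580's 772 MHz reference clock is roughly 630 MHz, below
# even GTX 570 speeds (732 MHz).
```

That ~630 MHz result landing below GTX 570 clocks is exactly why the "break the 300 W spec" outcome looks plausible.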
Welcome to the forums Zerocarestate! :thumbsup:
But you claim to only care about performance. So you'd rather have a card that is slower 80% of the time than a dual-GPU/card setup? I guess I don't see the advantage when all you want is the most performance... you're going to be slower more often than not.
And be affected by the other quirks of multi-GPU? No thanks.
No cookie for you.
(I still run CRTs because I can't stand the image quality of LCDs; don't "advise" me to degrade my visual experience.)
People still run CRTs? Uhh
If I had a larger one I would probably still be using it. Dark games look like poop on any LCD I have ever used. In fact, I think I will be pulling the CRT out of the closet to play Amnesia.
Considering the fact that Nvidia has been teasing a dual card for the greater part of 2010, I would be somewhat shocked if a dual-Fermi ever sees the light of day.
Now, let's say it does happen (and assuming it's in quantities that could be considered an actual product release and not, say, 10 cards), I doubt it would be based on the 570/580.
While Nvidia has gotten its thermals and TDP a little more under control, they are still not where they need to be for a top product.
But the question is, will Nvidia actually do something like that? Sometimes it's not about whether it's possible, but whether it's feasible as a business.
You are going to take chips that normally sell in a certain market at a certain performance range, castrate them, and spend the resources to undervolt them... for what, exactly?
