EVGA offers a sneak peek at Nvidia's next dual-GPU monster


Wreckage

Banned
Jul 1, 2005
5,529
0
0
Another fail is also his grammar: "consern" :biggrin:

Irony:

"But it would be interesting to use a mobile part to create the GTX 595, but also would means that it would underperform."

:whiste:


This post is propagating a pre-existing thread-derail. Please don't do this anymore.

Either contribute to the thread's topic or refrain from posting in the thread.

Moderator Idontcare
 
Last edited by a moderator:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
I think a 480 OC'd at full load could pull close to 400 W.

Nvidia measured the GTX 480's TDP as average power consumption, and OC'd behavior isn't specified by Nvidia themselves, so as far as the official specifications are concerned they haven't really passed the limit yet.
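
A quick toy illustration of that distinction (the power trace below is made up; only the 250 W average is chosen to match the GTX 480's official rating):

    # Made-up per-sample power draw in watts. The average lands on the GTX 480's
    # 250 W rating, while individual samples spike well above it, which is the gap
    # between an average-based TDP and what an OC'd card can momentarily pull.
    samples_w = [230, 210, 260, 220, 250, 270, 180, 380]

    average_w = sum(samples_w) / len(samples_w)
    peak_w = max(samples_w)
    print(f"average draw: {average_w:.0f} W  (what an average-based TDP reflects)")
    print(f"peak draw:    {peak_w} W  (what the card can momentarily hit)")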
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
English is far from my first language; I speak four... but thanks for the ad hominem :thumbsdown:

And probably far from being your third language too!! :awe:

Irony


:whiste:

At least I speak two languages and am practicing Italian; you only speak one, English (nVidia US), which shows your lack of execution resources to accommodate additional registers and logic for more language and instruction support. That means you have computing power equivalent to a Willamette-based 1.7 GHz Celeron. Hence the issue: clearly no one here can sustain a smart conversation with you, due to the lack of computing resources and instruction sets. Thank you! :awe:

Back on topic: I can't wait to see what nVidia and AMD have next in terms of dual-GPU solutions; nVidia's last dual-GPU card was the GTX 295, which was released centuries ago. Gotta work now, good night everybody. :)


You are attacking the poster and making it personal rather than presenting evidence that contradicts the contents of the post in question.

Please reconsider your approach.

AnandTech Forum Guidelines
We want to give all our members as much freedom as possible while maintaining an environment that encourages productive discussion. It is our desire to encourage our members to share their knowledge and experiences in order to benefit the rest of the community, while also providing a place for people to come and just hang out.

We also intend to encourage respect and responsibility among members in order to maintain order and civility. Our social forums will have a relaxed atmosphere, but other forums will be expected to remain on-topic and posts should be helpful, relevant and professional.

We ask for respect and common decency towards your fellow forum members.

Your moderation staff joins your fellow forum members in asking you to please help us in keeping this forum civil, respectful, and open-minded towards ALL of its members.

If you have nothing good to say regarding any one of your fellow members then please refrain from posting your personal opinions regarding them.

If you feel you have a valid concern or complaint regarding a post or poster then you are expected to report the post and restrict your communication of this concern to the comment box therein and leave the matter to the discretion of your moderator staff.

Moderator Idontcare
 
Last edited by a moderator:

DrPizza

Administrator Elite Member Goat Whisperer
Mar 5, 2001
49,601
166
111
www.slatebrookfarm.com
Phasers are set to "ban" - get back on topic or I'll remove those derailing the thread and having little side discussions insulting each other about grammar.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I think whatever dual card Nvidia brings out has to be a GF110 card. It could be two slightly downclocked GTX 570s (to stay within a 300 W TDP), or two downclocked GTX 580s with one memory controller disabled (a hybrid GTX 570/580). Personally, I believe that with AMD's much more competitive CrossFire scaling in their 6000-series parts, Nvidia will probably need something better than two downclocked GTX 570s to regain the undisputed fastest-card crown. So my guess is that this card will feature two 512-core GPUs with 320-bit memory buses.
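
A rough sketch of the power budget behind that 300 W constraint, using the published single-GPU TDPs (GTX 570 ~219 W, GTX 580 ~244 W); the shared-board overhead figure is a guess, and power doesn't scale linearly with clocks, so this only shows the rough size of the cut needed:

    # Back-of-envelope: how far each GPU would have to come down to fit two of them
    # under a 300 W board limit. The overhead figure is assumed, not an official number.
    BOARD_LIMIT_W = 300       # the PCIe-spec ceiling a dual card would target
    SHARED_OVERHEAD_W = 20    # assumed allowance for the PCB, bridge chip, fan, etc.

    per_gpu_budget_w = (BOARD_LIMIT_W - SHARED_OVERHEAD_W) / 2
    for name, tdp_w in [("GTX 570", 219), ("GTX 580", 244)]:
        cut = 1 - per_gpu_budget_w / tdp_w
        print(f"2x {name}: ~{per_gpu_budget_w:.0f} W per GPU, roughly {cut:.0%} below stock power")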
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
IMO, having two GF110s would make sense. Two 460s wouldn't make sense at all, especially now, after the new releases and with the 6990 coming; they would have already done it by now if it were worthwhile.

Two 560s probably wouldn't pose much of a threat to the 6990 (which it would make sense to build from at least two 6950s), but it could also happen.

BTW, the 460 was also launched with low clocks, and an overclock is never guaranteed, but it seems Nvidia found the sweet spot for the clocks: the card performs plenty for the bucks at stock, people get pretty good overclocking results with just about all of them, and it's probably still selling very well. So I don't believe that would be a reason for them to raise the clocks from the start on a dual-GPU card, whichever GPUs it's based on.

But I do believe they will shoot for the absolute crown with this card, as they always do, and that might be enough reason to use two highly clocked 560s or two 570s with as little clock reduction as possible.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Sweet-looking card; it should put smiles on the NV fans' faces.

Who releases first: AMD with the 6990, or NV with the 585/590? Lots of power coming our way, but the game choice for these high-powered cards is a joke.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Sweet-looking card; it should put smiles on the NV fans' faces.

Who releases first: AMD with the 6990, or NV with the 585/590? Lots of power coming our way, but the game choice for these high-powered cards is a joke.


Card looks awesome. I too am left wondering what games are going to realize its potential above and beyond what is already offered by cards at half the price this will likely debut at.

But it's not a bitter complaint; there are lots of great games to play now, even if they haven't moved as quickly towards the Heaven-style visuals that had me drooling over a year ago (a year and a half?).
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Sweet-looking card; it should put smiles on the NV fans' faces.

Who releases first: AMD with the 6990, or NV with the 585/590? Lots of power coming our way, but the game choice for these high-powered cards is a joke.

Crysis 2


Speaking of 'BIG' video cards, over at TPU
ASUS Shows Off Humongous Radeon HD 6970 DirectCu II Graphics Card

 
Last edited:

MrHydes

Junior Member
Jan 16, 2011
7
0
0
Makes me wonder who's writing the drivers. It has to be EVGA, but will the guys who buy this card get the same quality of support that Nvidia provides?

It's weird that no one talks about this.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
The GTX 460 is a fantastic overclocker from what I have seen. I wouldn't be surprised if this card were downclocked and sold as a very overclockable card, like the 5970.

But out of the box, I think two GF104s (or maybe even two GF114s) might have a problem being clearly faster than two Cayman-based GPUs.

I'm not buying either of those cards, so I don't really care. :) But it will be nice to see the new fastest cards.

*edit - Also, Nvidia has shown us with the GTX 480 that they have no problem rating a card lower than what it actually draws. I thought there was a review out there that showed the GTX 480 pulling over 300 watts in Crysis, never mind Furmark.

I can't see the point of NV releasing a dual card unless it's for the fastest-card crown!
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
The speculation on Nvidia's dual-GPU part has died down in this thread. We should create a friendly bet/poll on what people think will end up forming the x2 part. I personally think it's going to be a hybrid sort of GF110: each chip will have all 512 shaders like the GTX 580, but downclocked to around GTX 570 speeds. Further, Nvidia may also opt to cut the memory bus down to 320 bits, which would both help keep it within the 300 W thermal envelope and keep the price down with slightly less RAM.
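
For what it's worth, the arithmetic behind "320-bit bus = slightly less RAM", assuming the usual layout of one 1 Gbit (128 MB) GDDR5 chip per 32-bit channel, which is how the GTX 580 (384-bit, 1536 MB) and GTX 570 (320-bit, 1280 MB) are populated:

    # Memory capacity implied by bus width, assuming one 1 Gbit (128 MB) GDDR5
    # chip per 32-bit channel.
    CHIP_MB = 128
    CHANNEL_BITS = 32

    for bus_bits in (384, 320):
        chips = bus_bits // CHANNEL_BITS
        print(f"{bus_bits}-bit bus -> {chips} chips -> {chips * CHIP_MB} MB per GPU")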
 

MrHydes

Junior Member
Jan 16, 2011
7
0
0
Am I alone in wondering who will write the drivers? This is clearly not an Nvidia reference design, so I think it's a legitimate question.
 

MrHydes

Junior Member
Jan 16, 2011
7
0
0
You are discussing the VRAM amount, 1024 MB or 2 GB; well, that's pointless!

For Christ's sake, don't you know that when you have two GPUs, the data stored in GPU0 is mirrored in GPU1? :colbert:

At least that's how it works with both SLI and CrossFire, whether alternate-frame or split-frame rendering.
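
A minimal sketch of what that mirroring means for usable memory (the card and its 1024 MB per GPU below are hypothetical):

    # With SLI/CrossFire (alternate-frame or split-frame), each GPU holds its own
    # full copy of textures and buffers, so the memory a game can use is the
    # per-GPU amount, not the total soldered onto the board.
    def usable_vram_mb(per_gpu_mb: int) -> int:
        # the second GPU's copy is a mirror, so it adds no capacity
        return per_gpu_mb

    PER_GPU_MB = 1024                 # hypothetical dual-GPU card, 1024 MB per GPU
    board_total_mb = 2 * PER_GPU_MB
    print(f"on the box: {board_total_mb} MB; usable per frame: {usable_vram_mb(PER_GPU_MB)} MB")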
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
You are discussing the VRAM amount, 1024 MB or 2 GB; well, that's pointless!

For Christ's sake, don't you know that when you have two GPUs, the data stored in GPU0 is mirrored in GPU1? :colbert:

At least that's how it works with both SLI and CrossFire, whether alternate-frame or split-frame rendering.

Why is it pointless? A dual card such as this will be for 2560x1600, which is where you need 2 GB of RAM per GPU, therefore 4 GB in total!
 

MrHydes

Junior Member
Jan 16, 2011
7
0
0
Why is it pointless? A dual card such as this will be for 2560x1600, which is where you need 2 GB of RAM per GPU, therefore 4 GB in total!

What does the VRAM amount matter if you don't even get decent drivers? First you should be worried about whether they can deliver good support, because on a card like this, without proper drivers, you won't be able to play decently with filters on at high resolution and still get a good frame rate. I doubt it, so what does the memory amount matter? It might even come with 4 GB per GPU, but if the support sucks, then the card sucks.
 
Last edited: