
GeForce GTX 590 at PAX 2011?

"The Masses" have been waiting for a dual-GPU nVidia card a lot longer than they've been waiting for the 590. nVidia hasn't had a competitive dual-GPU card for over a year now.

Why did/does Nvidia need a dual card? They already have/had the GTX 580, which gives you the best gaming experience of any card on the planet.

Simple answer is they don't, until AMD releases their dual 6990.
Maybe that's why the dual 4-series was cancelled? Why make a dual 4-series card when there's really no competition and it's gonna get overtaken by the 6990 anyway?
 
^ Yeah, I don't know why anyone would use the dual card from either side when SLI/X-fire boards are a dime a dozen now, not like SLI on 775.

Get two cards you can independently OC with better cooling.

These cards are only going to bring the same performance you can already achieve now.
 
How does the 570 core have 512 SPs? 😱

512 SPs is the 580.

And the 590 is supposed to be two 570s.

So how does it have 1024 SPs on 570 cores?
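For reference, the shader-count arithmetic behind this exchange works out in a few lines of Python, using nVidia's published Fermi core counts (GTX 570 = 480 SPs, GTX 580 = 512 SPs):

```python
# Published Fermi shader (CUDA core) counts
GTX_570_SPS = 480
GTX_580_SPS = 512

two_570s = 2 * GTX_570_SPS  # 960 SPs from two 570 cores
two_580s = 2 * GTX_580_SPS  # 1024 SPs from two full 512-SP GPUs

print(two_570s, two_580s)  # 960 1024
```

So a 1024-SP dual card can't be built from two 480-SP 570 cores; it would need two fully enabled 512-SP chips.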


You know what, lately I'm not impressed at all with nVidia's launch windows.
They like to stir the masses, and then make us wait by holding inventory until it's too late.

And the ones remaining are the true nvidia lovers.

Believe me, if my last cards weren't ATI, I'd be on ATI right now and not have suffered the understock bullshit I went through to get my 580s.

The 590s are gonna be even more bullshit to pull inventory off of.

The 570s have 480 SPs. I don't know what the 590s will or will not have. However, if one looks at nVidia's past offerings, they have gone with full SPs, and I wouldn't be surprised to see this again if there is indeed a dual-GPU SKU.

I don't understand what is or what is not a true nVidia lover.
 
Why did/does Nvidia need a dual card? They already have/had the GTX 580, which gives you the best gaming experience of any card on the planet.

Simple answer is they don't, until AMD releases their dual 6990.
Maybe that's why the dual 4-series was cancelled? Why make a dual 4-series card when there's really no competition and it's gonna get overtaken by the 6990 anyway?

They're nice SKUs to me: they allow SLI on motherboards with a single PCIe slot, or quad with two slots.
 
So the dual BIOS is difficult? Or do you think only AMD can do it? There is a dual-BIOS GTX 560, you do know that, don't you?

Never said anything about the dual BIOS not being possible. I'm just stating that making a card that's faster at 300 W than what AMD can make will be the challenge, considering that the Fermi-based cards are less efficient than AMD's latest offerings in perf/W. That doesn't mean nVidia can't work some kind of software solution to help. I'm curious to see what they do. I just don't think they'll release a reference design that doesn't meet PCIe spec and isn't faster than what AMD can come up with. I think they're both playing a bit of cat and mouse with each other to see what the other can offer. We'll have to see who flinches first.
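The perf/W point is easy to illustrate; a minimal Python sketch, with made-up fps and wattage figures (placeholders, not benchmark results):

```python
# Toy performance-per-watt comparison; all numbers are illustrative.
def perf_per_watt(fps: float, board_power_w: float) -> float:
    return fps / board_power_w

card_a = perf_per_watt(100.0, 250.0)  # 0.40 fps/W
card_b = perf_per_watt(105.0, 300.0)  # 0.35 fps/W

# The slightly slower card can still win on efficiency:
print(card_a > card_b)  # True
```

This is why a card that's merely faster isn't automatically the better engineered one at a fixed power ceiling.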

We don't know much about either upcoming card, so it's hard to say. My guess is that for most people it really won't matter which one you get, and that it will come down to pricing.

BTW I hope you and your family are OK (looking at your location).

For many, I agree and pricing will be the overriding concern. They'll both be fast as stink. 😀

For some, the only thing that will matter is which is faster. There are people who will pay extra for 110 fps over 105 fps. Others are just fans of one brand over the other and will buy that company's offering. Others prefer one's feature set.

I'm figuring that nVidia is working as hard as they can to overcome AMD's efficiency advantage. Like most, I'm just interested in seeing the tech advance. Not like a lot of us are actually waiting to buy either one.

Thanks for wishing me well. My family, friends, and myself are all fine. I'm blessed. Just a bit inconvenienced with bored and screaming women and children. 😉
 
I'm just stating that making a card that's faster at 300 W than what AMD can make will be the challenge.

With a dual BIOS they can ship the first BIOS at 600 MHz core and the second BIOS at whatever clocks they want over 300 watts. So to me 375 watts is possible with two 8-pin connectors and a dual BIOS. Maybe they can call it a turbo button or something?

How hard would that be?
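The 375-watt figure follows from the connector budget; a quick sketch of that arithmetic, using the per-connector limits from the PCIe spec (75 W from the slot, 75 W per 6-pin plug, 150 W per 8-pin plug):

```python
# PCIe power delivery limits, per the PCI Express spec
SLOT_W = 75        # x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin auxiliary connector

spec_limit = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # the 300 W "in-spec" ceiling
dual_8pin = SLOT_W + 2 * EIGHT_PIN_W           # what two 8-pins actually feed

print(spec_limit, dual_8pin)  # 300 375
```

So the connectors can physically deliver 375 W even though the add-in-card spec tops out at 300 W, which is exactly the gap a second BIOS would exploit.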
 
With a dual BIOS they can ship the first BIOS at 600 MHz core and the second BIOS at whatever clocks they want over 300 watts. So to me 375 watts is possible with two 8-pin connectors and a dual BIOS. Maybe they can call it a turbo button or something?

How hard would that be?

I don't know, won't they be flagged for exceeding the PCIe spec with the second BIOS?
 
With a dual BIOS they can ship the first BIOS at 600 MHz core and the second BIOS at whatever clocks they want over 300 watts. So to me 375 watts is possible with two 8-pin connectors and a dual BIOS. Maybe they can call it a turbo button or something?

How hard would that be?

Hard? I'd imagine not too hard. Will they release a reference card that breaks the spec, though? I wouldn't if I were running either company, and I'll be disappointed if they do that rather than engineer a proper solution. We'll have to see what happens.
 
I don't know, won't they be flagged for exceeding the PCIe spec with the second BIOS?

Do they get flagged for advertising an overclocking spec that exceeds 300 watts? I doubt it.

Like I said, they could just call it a turbo button; it would be the same as overclocking a 5970 or GTX 295 over 300 watts.

I think it's a great idea for both companies.
 
Hard? I'd imagine not too hard. Will they release a reference card that breaks the spec, though? I wouldn't if I were running either company, and I'll be disappointed if they do that rather than engineer a proper solution. We'll have to see what happens.

I'd imagine that both the FULL 6990 and GTX 590 will exceed 300 watts, so maybe they will both use this idea?
 
I'd imagine that both the FULL 6990 and GTX 590 will exceed 300 watts, so maybe they will both use this idea?

What do you mean by FULL? Are you saying at the same clocks as the single-GPU cards? If so, then I agree. If you mean with fully functioning chips (all shaders, etc.), not necessarily. The 5970 would have exceeded 300 W too if released at full clocks; they held the clock speeds back for just that reason.
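The clock/voltage trade-off described here can be roughed out with the usual dynamic-power approximation (P ∝ f·V²); the baseline wattage, clocks, and voltages below are illustrative assumptions, not 5970 measurements:

```python
# Rough dynamic-power scaling: P is proportional to frequency * voltage^2.
def scaled_power(p_base_w, f_base_mhz, v_base, f_new_mhz, v_new):
    return p_base_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

# Hypothetical dual card: two GPUs at 188 W each at full clocks...
full = 2 * 188.0  # 376 W, over the 300 W spec ceiling

# ...but a ~15% clock cut plus a small voltage drop brings it back under:
binned = 2 * scaled_power(188.0, 850, 1.10, 725, 1.05)

print(round(full), round(binned))  # 376 292
```

That's the lever both vendors have: ship fully enabled chips but bin clocks and voltage down until the board squeaks under 300 W.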
 
Do they get flagged for advertising an overclocking spec that exceeds 300 watts? I doubt it.

Like I said, they could just call it a turbo button; it would be the same as overclocking a 5970 or GTX 295 over 300 watts.

I think it's a great idea for both companies.

My Palit 4870 had one of those... always left it on turbo, and I'm sure it's far less than 300 W at full draw...

Still, use it or don't; it can be done and has been done.
 
19 days left on my "step-up".

Might pull the trigger with EVGA if something happens before then for my 580; otherwise, I'll wait to add a second 580 some months down the road.
 
19 days left on my "step-up".

Might pull the trigger with EVGA if something happens before then for my 580; otherwise, I'll wait to add a second 580 some months down the road.

You can't step-up in the same generation, right? I recently bought two EVGA 580s and, well, if the 590 turns out to be a monster... 🙂
 