
Nvidia's Future GTX 580 Graphics Card Gets Pictured (Rumours)

Someone over at HardOCP caught Amazon posting a product page for a GTX 580 early. The card was listed at $499.99. The page has since been taken down, but in light of this it's reasonable to believe the $599 price point that was floating around was FUD. I'm guessing the GTX 570 is going to come in at $399.99, and both cards will stay around those price points until GF100s can be cleared out and/or AMD releases Cayman with aggressive MSRP pricing.
 
I will most likely buy a 580 at the $499 price point, unless the 6970 (the new flagship single GPU, correct me if I'm wrong) smokes it.
 
I will most likely buy a 580 at the $499 price point, unless the 6970 (the new flagship single GPU, correct me if I'm wrong) smokes it.
You already have a GTX 480. I think it's time to stop spending 500 bucks on GPUs and upgrade to a modern i7/i5 quad core first. There are already many games where you would not see the difference between the 480 and a 580/6970 anyway.
 
You already have a GTX 480. I think it's time to stop spending 500 bucks on GPUs and upgrade to a modern i7/i5 quad core first. There are already many games where you would not see the difference between the 480 and a 580/6970 anyway.

Nope, going Sandy Bridge. The i7 didn't do the trick for me in single-/double-threaded performance.

I am lucky enough to have a career that allows me to spend money on computer hardware, even if it doesn't always make sense. I like toys. I sell my old stuff to my gaming team (clan, I guess?) at prices that let people who could never afford modern hardware enjoy it.

Besides, you sound like my SO! 😉
 
Nope, going Sandy Bridge. The i7 didn't do the trick for me in single-/double-threaded performance.

I am lucky enough to have a career that allows me to spend money on computer hardware, even if it doesn't always make sense. I like toys. I sell my old stuff to my gaming team (clan, I guess?) at prices that let people who could never afford modern hardware enjoy it.

Besides, you sound like my SO! 😉
well then, carry on. 😉
 
I'd personally wait the extra 10-14 days or so and see what Cayman brings. If Cayman smokes the GTX 580, then just pick up a Cayman (which is supposed to be priced under the 580 anyway), and worst case, the 580 gets a price drop. Win-win, especially if you aren't in dire need of a replacement.

Of course, the GTX 580's performance is making me think this should've just been named the GTX 485 instead of a whole new 5xx generation.
 
The upcoming GeForce GTX 580 graphics card from Nvidia has been detailed. Lots of benchmarks and spec sheets have leaked for the card, which is built on the GF110 core, a refined version of the GF100 core that powered the GTX 480/470/465. First, an Inno3D-based card was shown on the net, featuring the reference design pictured below along with the PCB design. The card is powered by one six-pin and one eight-pin connector, which together with the PCIe slot can deliver up to 300W under full load (75W from the slot, 75W from the six-pin, and 150W from the eight-pin).

[Image: GTX 580 reference design and PCB]

[Image: Inno3D GTX 580 with 8-pin and 6-pin power connectors]


The card will feature 1.5GB (1536MB) of GDDR5 memory on a 384-bit memory interface running at an effective 4008MHz. The clocks are set at 772/1002/1544MHz for core/memory/shader respectively. The card boasts 512 CUDA cores to provide extreme gaming performance. One really cool thing about the card, which can be seen in the GPU Caps Viewer screenshot below, is that it idles at 40°C, which is quite impressive for such a beastly card. This suggests the heat issues of the GF100 chip have been addressed in the new one.
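Quick sanity check on those numbers, for anyone curious: GDDR5 transfers four times per clock, so the 1002MHz base clock gives the quoted 4008MHz effective rate, and across a 384-bit bus that works out to roughly 192GB/s of theoretical bandwidth. A minimal sketch of the arithmetic, using only the figures from the spec sheet above:

```cpp
#include <cstdio>

int main()
{
    // Rumored GTX 580 memory figures from the post above.
    const double mem_clock_mhz  = 1002.0;               // base GDDR5 clock
    const double effective_mhz  = mem_clock_mhz * 4.0;  // GDDR5: 4 transfers/clock -> 4008 MHz
    const double bus_width_bits = 384.0;                // memory interface width

    // Bandwidth = bytes per transfer * effective transfer rate.
    const double bandwidth_gbs = (bus_width_bits / 8.0) * effective_mhz / 1000.0;

    printf("Effective memory clock: %.0f MHz\n", effective_mhz);   // 4008 MHz
    printf("Theoretical bandwidth:  %.1f GB/s\n", bandwidth_gbs);  // ~192.4 GB/s
    return 0;
}
```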

[Image: GPU Caps Viewer showing the GTX 580 idling at 40°C]

[Image: GTX 580 card shot (PCOnline)]


You can also see that this card is much quieter than the GTX 480, as shown in the acoustics chart below:

[Image: GTX 580 vs GTX 480 acoustics comparison]


You can also check out the benchmarks below, which show that this card even outperforms 2x HD 5870 in CrossFire, meaning that it will easily outperform the HD 5970, which is based on two HD 5870 cores. Another benchmark is also posted below in which it is compared to an HD 6870/HD 5870/GTX 480, and it tops all of them.




[Image: GTX 580 vs HD 6870/HD 5870/GTX 480 benchmark comparison]

[Image: GTX 580 SLI vs HD 5870 CrossFire benchmark]

[Image: GTX 580 Unigine Heaven score]

[Image: GTX 580 HAWX 2 score]

[Image: GTX 580 Lost Planet 2 score]


This means that Nvidia has finally come up with a graphics card that can deal a blow to the upcoming Antilles-based dual-GPU card from AMD. The release date has not been finalized, but it could be really close. The card is said to be priced at $499.99. You can check out more news on the GTX 580 at the link below:

Read more: http://wccftech.com/2010/11/06/nvidia-geforce-gtx-580-detailed-specs-revealed/#ixzz14YRWnu00



 
This means that Nvidia has finally come up with a graphics card that can deal a blow to the upcoming Antilles-based dual-GPU card from AMD. The release date has not been finalized, but it could be really close. The card is said to be priced at $499.99.

AMD's 6970 (single GPU) will be within ~5% or less of the 580's performance. Some rumors say it's beating the 580.

AMD's 6990 (dual-GPU single card) will offer MUCH better performance than a single 580.

Just like the current 5970 is faster than a 480 today.
 
Those charts are misleading. Why do they start at 0.8 but measure in increments of 0.2? Look how big it makes the green bars go LOL
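To put numbers on that complaint, here's a minimal sketch with made-up scores in the range those charts use: starting the axis at 0.8 instead of 0 turns an ~11% lead into a 2x visual gap.

```cpp
#include <cstdio>

int main()
{
    // Hypothetical scores, roughly in the range shown on the leaked charts.
    const double score_a = 1.00, score_b = 0.90;  // real ratio: ~1.11x
    const double axis_start = 0.80;               // truncated axis baseline

    // Bar lengths as drawn when the axis starts at 0.8 rather than 0.
    const double bar_a = score_a - axis_start;    // 0.20
    const double bar_b = score_b - axis_start;    // 0.10

    printf("Real ratio:   %.2fx\n", score_a / score_b);  // 1.11x
    printf("Visual ratio: %.2fx\n", bar_a / bar_b);      // 2.00x
    return 0;
}
```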
 
A 580 is not faster than two 5870s; 580 SLI is faster than 5870 CrossFire (which is definitely expected, as it's ~$1000 vs ~$650).

I don't think anyone is expecting this mid-gen refresh to double the previous performance. That isn't where the goalposts are 😉
 
I know. He just said, "You can also check out the benchmarks below, which show that this card even outperforms 2x HD 5870 in CrossFire, meaning that it will easily outperform the HD 5970, which is based on two HD 5870 cores. Another benchmark is also posted below in which it is compared to an HD 6870/HD 5870/GTX 480, and it tops all of them."

I was just correcting him.
 
AMD's 6970 (single GPU) will be within ~5% or less of the 580's performance. Some rumors say it's beating the 580.

AMD's 6990 (dual-GPU single card) will offer MUCH better performance than a single 580.

Just like the current 5970 is faster than a 480 today.


Link to the rumors, please.
Actually, never mind. Just realized how stupid it was for me to ask for confirmation of rumors. LOL.
 
That's pretty cool.
If it's just DX11 code, it should run on AMD hardware as well. I'd like to hear some performance figures 🙂

Actually, it doesn't. According to that.

15fps on the 460 sounds nice. You have a 460, right? You could download it and let us know about performance.
 
Actually, it doesn't. According to that.

Did I miss something?

Edit: Ah, the message box... I've seen so many of those in my life that I don't even look at them anymore.
That's fixable, though... assuming it doesn't use CUDA code, and I don't see why it would.
 
Did I miss something?

Edit: Ah, the message box... I've seen so many of those in my life that I don't even look at them anymore.
That's fixable, though... assuming it doesn't use CUDA code, and I don't see why it would.

Could you do it?
 
Could you do it?

Looking into it now...
I see that it uses some CUDA code, but at first sight, it doesn't look like much more than an initialization routine.
So it could be that they just used an off-the-shelf nVidia demo graphics engine, and CUDA is being initialized as a side effect.
I'd have to remove that code and see what it does on an nVidia system.
If it still works without the CUDA code, then getting it to work on AMD hardware is just a formality from there on in.
I've already noticed that it checks for an 'nvidia' string in the device info, so that part is probably easy to 'fix'.
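For what it's worth, here's roughly what that kind of check looks like in a D3D10/D3D11 app via DXGI. The demo's actual code isn't shown in this thread, so the function name and structure below are purely illustrative, a sketch of the pattern being described:

```cpp
// Hypothetical reconstruction of a vendor check like the one described above.
// Matching on the adapter description string is the fragile part; the PCI
// vendor ID (0x10DE for nVidia, 0x1002 for AMD) is what a robust check uses.
#include <dxgi.h>
#include <string>

bool AdapterLooksLikeNvidia()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return false;

    bool isNvidia = false;
    IDXGIAdapter* adapter = nullptr;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter)))
    {
        DXGI_ADAPTER_DESC desc;
        if (SUCCEEDED(adapter->GetDesc(&desc)))
        {
            // The demo apparently gates itself on finding 'nvidia' in here.
            isNvidia = std::wstring(desc.Description).find(L"NVIDIA")
                       != std::wstring::npos;
        }
        adapter->Release();
    }
    factory->Release();
    return isNvidia;
}
```

If the gate really is just a string match like this, patching it (or the branch that tests its result) is exactly the kind of easy 'fix' being suggested.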
 