
Nvidia's Future GTX 580 Graphics Card Gets Pictured (Rumours)

Early last month, we reported that AMD's Radeon HD 6000 Series was in trouble and that NVIDIA was apparently brimming with confidence, ready to counter-attack. In the past two weeks or so, production at the rumor mill has ramped up; we have seen leaked photos of the NVIDIA GeForce GTX 580 card itself, its heatsink, and even supposed specifications.
In our review of the AMD Radeon HD 6870 and HD 6850 cards, the NVIDIA GeForce GTX 460 1GB and 768MB versions performed decently against both AMD cards. In fact, a factory-overclocked GeForce GTX 460 1GB at the same price will easily outperform a Radeon HD 6870. The tussle for the performance crown in the upper mid-range segment is over, and both graphics card makers are now focusing on their high-end (flagship) products.
The GeForce GTX 580 will take on AMD's high-end Radeon HD 6900 Series (the Cayman chip). Early leaks estimate the GeForce GTX 580 to be about 15% to 20% faster than the GeForce GTX 480. Performance figures for the Radeon HD 6900 Series are still well guarded (so far).
At present, there has been much speculation that NVIDIA is announcing the GeForce GTX 580 on the 8th of November in a bid to make enthusiasts and gamers 'think twice' about buying Radeon HD 6900 Series cards, which are slated for a late-November release. The general consensus is that the GeForce GTX 580, despite being unveiled on the 8th, will be no more than a paper launch.
However, in an unexpected twist, we stumbled upon an online posting made just a few hours ago, which suggests the GeForce GTX 580 is expected to cost US$599 and will be available on November 9th. These GeForce GTX 580 cards will come from a number of NVIDIA's partners, but all are reference boards.
We do expect some price adjustments once the Radeon HD 6900 Series becomes available, assuming the GeForce GTX 580 indeed hits retail first...
http://vr-zone.com/articles/nvidia-...-599-to-be-available-november-9th-/10222.html
 
The latest info shows Cayman XT (6970) to be more than 225W and less than 300W, and the HD 6950 to be less than 225W.

I would put the GTX 580 ahead of the 6970 by 10-15% in performance and near the 5970, say at 90-95%. I don't believe Cayman XT (6970) will get more than 20-30% more performance than the 5870 at resolutions up to 1920x1080.

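For fun, here's a quick sanity check of those guesses in Python. All the multipliers are the speculation above, not benchmarks, and the 5970-as-1.5x-5870 figure is my own rough assumption for typical dual-GPU scaling:

```python
# Rough consistency check of the guesses above (all speculation, no benchmarks).
hd5870 = 1.00                # baseline
hd6970 = hd5870 * 1.25       # midpoint of the "20-30% over the 5870" guess
gtx580 = hd6970 * 1.125      # midpoint of "ahead of the 6970 by 10-15%"
hd5970 = hd5870 * 1.50       # ASSUMED: rough dual-GPU scaling for dual-Cypress
print(f"Implied GTX 580 vs. HD 5970: {gtx580 / hd5970:.0%}")  # ~94%, inside the 90-95% guess
```

So the three guesses at least hang together.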


Well, the 6870 has a 151 W TDP, and that gives us GTX 470 performance. And we know the dual-GPU card will be right up against the 300 W wall. So it's not too hard to imagine a 6970 around 225 W. Even if it's rated at 250 W, it will likely use less juice than a 244 W TDP GTX 580.
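Spelling that bracket out (the TDPs are the rumored/official figures quoted in this thread; reading ~225 W as roughly the midpoint is my own interpretation of the argument):

```python
# Power-budget bracket for a single big Cayman, per the reasoning above.
hd6870_tdp   = 151   # W, official Barts TDP
pcie_ceiling = 300   # W, the wall a dual-GPU board has to stay under
midpoint = (hd6870_tdp + pcie_ceiling) // 2
print(f"A single big Cayman should land in the {hd6870_tdp}-{pcie_ceiling} W window, "
      f"e.g. around {midpoint} W")  # 225 W
```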
 
However, in an unexpected twist, we stumbled upon an online posting made just a few hours ago, which suggests the GeForce GTX 580 is expected to cost US$599 and will be available on November 9th
Holy mother of... $600 for the 580.

Buy two 6870s and put them in CrossFire for much better performance at ~$480.
How do the people who come up with these prices justify them?

2x 6870 gives about +90% performance over a single 480 card, for ~$480.
1x 580 gives about +20% performance over a single 480 card, for $600.

If the $600 price is true, I don't see this card selling much.
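Here's that math as a quick Python sketch; the performance multipliers are the rough guesses above, not benchmark results:

```python
# Performance per dollar using the rumored numbers above (speculation, not measured).
# Baseline: a single GTX 480 = 1.0 relative performance.
cards = {
    "2x 6870 CrossFire": (1.90, 480),  # "+90% over a single 480" for ~$480
    "GTX 580":           (1.20, 600),  # "+20% over a single 480" for $600
}
for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 100:.2f} perf per $100 ({perf:.2f}x for ${price})")
```

By these numbers the CrossFire pair delivers roughly twice the performance per dollar.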
 
All a high markup like that does is push people toward AMD's graphics cards at a lower price. I don't think a $600 card is worth buying when it's close to the price of a dual-GPU card.
 
This been posted?

[GPU-Z screenshots]

Probably fake but very close to the truth. 772 MHz is apparently correct, not 775 MHz. That said, perhaps the screens were taken with pre-production stats and the final stats weren't set until very recently.

http://www.zdnet.com/blog/hardware/asus-leaks-gtx-580-spec/10220

I visited the site myself before they took it down and can confirm that ZDNet was accurately reporting. It is possible that NV bumped clocks at the very last second so that even ASUS's numbers are wrong, though.
 
This been posted?

[GPU-Z screenshots]

Honestly, not that impressed. They bumped up the speeds, added the parts they couldn't get working before, and it doesn't seem like it will give much better performance. Even worse is the possibility of 75 MHz less overclocking headroom. We'll see; it might blow everybody's socks off, but if it is $600, it will be a pretty sizable failure.
 
The latest info shows Cayman XT (6970) to be more than 225W and less than 300W, and the HD 6950 to be less than 225W.

I would put the GTX 580 ahead of the 6970 by 10-15% in performance and near the 5970, say at 90-95%. I don't believe Cayman XT (6970) will get more than 20-30% more performance than the 5870 at resolutions up to 1920x1080.

So Barts at ~255 mm² is nearing Cypress performance, and Cayman will be much bigger, but you don't think it's going to increase performance over the 5870 by more than ~25%? Not to mention the new, more efficient architecture and hugely improved tessellator.

AMD wouldn't even bother releasing such a big GPU if it isn't going to obliterate the previous gen.
 
So Barts at ~255 mm² is nearing Cypress performance, and Cayman will be much bigger, but you don't think it's going to increase performance over the 5870 by more than ~25%? Not to mention the new, more efficient architecture and ***hugely improved tessellator.

AMD wouldn't even bother releasing such a big GPU if it isn't going to obliterate the previous gen.

Barts is a great part considering its size, but honestly, now that the reviews have been posted and all the information is coming together, I'm underwhelmed and only mildly impressed with its performance overall. At equal image quality settings the 6870 is 12-15% slower than the 5870, its overclocking headroom is very unimpressive, and the tessellator is only improved up to a certain level of tessellation.

On the same note of disappointment, if the GTX 580 is only 15% faster than a GTX 480, NVIDIA will be leaving the door open for AMD to take both the single-GPU crown and the single-card dual-GPU crown.

***only under certain conditions
 
Honestly, not that impressed. They bumped up the speeds, added the parts they couldn't get working before, and it doesn't seem like it will give much better performance. Even worse is the possibility of 75 MHz less overclocking headroom. We'll see; it might blow everybody's socks off, but if it is $600, it will be a pretty sizable failure.

You expected what? Most people didn't even think they could pull this off.

In fact, there were articles saying a fully functional GF100 would pull down 500w or something ludicrous like that.


It is great that we will have competition for Xmas. 🙂
 
You expected what? Most people didn't even think they could pull this off.

In fact, there were articles saying a fully functional GF100 would pull down 500w or something ludicrous like that.


It is great that we will have competition for Xmas. 🙂

That isn't a GF100.
 
Well, just in case everyone doesn't know yet, review sites have already received these new bad boys to test, or are in the process of getting them. It looks like cards will also officially go on sale Nov. 9th. While it remains to be seen just how well these cards will perform and at what price they'll come in, NVIDIA has done a pretty damn good job keeping this launch and respin a secret for the better part of the past few months, and it's commendable that they didn't let the GF100 delays hurt the timetable for refreshing the high end of their lineup.
 
You expected what? Most people didn't even think they could pull this off.

In fact, there were articles saying a fully functional GF100 would pull down 500w or something ludicrous like that.


It is great that we will have competition for Xmas. 🙂

There were articles claiming that the current chip would do such things. No one ever said a redone chip would not be possible.

But for those who weren't wowed by the 6870 bringing only a touch more performance to its price point, this isn't going to wow them either.
 
It seems as though GPUs are hitting the wall with current manufacturing technology. I'll bet Intel could really take over the GPU business if they put their minds to it. Their manufacturing tech is second to none.
 
It seems as though GPUs are hitting the wall with current manufacturing technology. I'll bet Intel could really take over the GPU business if they put their minds to it. Their manufacturing tech is second to none.
RotFL

Intel can't design a GPU to save their company, at least not in this decade. They belly-flopped not once but twice. Their engineers are "all CPU"... they think sequentially, and that will never change. AMD had the right idea integrating ATi into their company. Now they speak CPU-GPU.
 
Let's wait until the GPUs are released...

Speculation is just speculation; what we need are facts, performance tests, etc. =)
 
RotFL

Intel can't design a GPU to save their company, at least not in this decade. They belly-flopped not once but twice. Their engineers are "all CPU"... they think sequentially, and that will never change. AMD had the right idea integrating ATi into their company. Now they speak CPU-GPU.

Are you sure? SB (Sandy Bridge) has the potential to completely rain on the "Fusion" APU parade and increase Intel's already mammoth lead in graphics market share.
 
RotFL

Intel can't design a GPU to save their company, at least not in this decade. They belly-flopped not once but twice. Their engineers are "all CPU"... they think sequentially, and that will never change. AMD had the right idea integrating ATi into their company. Now they speak CPU-GPU.

There is talk of Intel contracting out their 22nm fabrication technology. If they manufactured GPUs for Nvidia, AMD would be in real trouble.

Not only that, but as OCGuy said, SB appears to be a very good APU, perhaps better than Fusion. I don't think it's all that complicated to scale up a low-end GPU into a midrange part. Correct me if I'm wrong.
 