
GTX 580 appears (briefly) on NVIDIA website


n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
AMD doesn't seem interested in building a 300W single chip solution.

the slides were showing Cayman at "<300w", which means it's probably pretty close... if the slides turn out to be accurate
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Guys, I'm pretty sure T2K was talking about the OP (potential product "leak" and/or PR stunt from NVIDIA) and not the lowering of prices. At least that's how I saw it...

If there really is a 580 in the works soon to be released, cool stuff. However, if there isn't such a product, or there is but its release date is over 6 months from now, then that is pretty pathetic of NVIDIA.

That's how I saw it too, and my response to him was based on that.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
TDPs rise significantly for *any* (AMD or Nvidia) graphics card when the clocks are increased. But there are several big factors you didn't consider:

Compare the GTX 465 to the GTX 470 in AnandTech's original GTX 465 review. Both run at the same clock speeds and use the same GF100 chip; however, the GTX 470 is around 25% faster while drawing only 5% more power in games, despite having more of the chip enabled.

Then compare a GTX 480 to a GTX 470: more of the chip enabled and 100 MHz higher clock speeds. The GTX 480 is around 20% faster while drawing around 15% more power in games.

This is a bit irrelevant because the GTX 465 is a heavily crippled chip. If you look, the 5830 is in a similar situation when compared to the 5850.

Now look at the latest reviews of newer-production GTX 480s - some are running 50 MHz faster than standard GTX 480s while simultaneously pulling less power than the initial GTX 480s did at stock speeds.

So, in a nutshell, and especially with binned parts, TDP will be reasonably contained when, not if, Nvidia releases the 384-shader GF104 part.

It is hard to tell whether that is due to the chip being less leaky, the board being of better quality, or both.
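
As a rough sketch of the perf-per-watt arithmetic behind those comparisons (the percentage deltas are the approximate figures quoted above, not exact measurements from any single review):

    # Rough perf/W comparison using the approximate deltas cited above.
    # The inputs are the thread's own rough figures, not measured data.

    def relative_perf_per_watt(perf_gain, power_gain):
        """Perf/W of the faster card relative to the baseline card."""
        return (1.0 + perf_gain) / (1.0 + power_gain)

    # GTX 470 vs GTX 465: ~25% faster for ~5% more power
    print(relative_perf_per_watt(0.25, 0.05))  # ~1.19 -> roughly 19% better perf/W

    # GTX 480 vs GTX 470: ~20% faster for ~15% more power
    print(relative_perf_per_watt(0.20, 0.15))  # ~1.04 -> only ~4% better perf/W

In other words, enabling more of the chip at the same clocks scales perf/W much better than pushing clocks, which is the point being made about a binned, fully enabled GF104.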
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
If Cayman is as efficient as Barts, it's going to easily beat a 5970.

That would be a tremendous accomplishment on the same node... I think it would take quite a bit more die area to do, though: >450mm^2.

The leaked slides I saw had Cayman XT slotted in between the 480 and the 5970, so I doubt it's going to reach that high.....
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
<300W could mean anything.

Certainly true, but the assumption is that if it's closer to 250, then that would have been on the chart. Logic would presume that they would try to show it as being as low as possible.

The only scenario where that wouldn't be the case is if the slide is part of the misinformation campaign, but the numbers on it made sense to me for the most part based on what we know of barts.

Of course, if Antilles really is 2 Cayman and not 2 Barts, they would either have to be pretty heavily crippled/down-clocked Cayman dies to fit two of them under 300W, or really amazing cherry-picked dies. The more I think about that, the more it seems like Cayman will be much less than 300W, like you imply, because if they had to use two Barts for Antilles it wouldn't be much of an upgrade over the 5970.
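
As a back-of-the-envelope sketch of that power-budget argument (300W is the board ceiling being discussed; the board overhead and single-card numbers below are purely hypothetical, for illustration only):

    # Hypothetical power-budget math for a dual-Cayman Antilles card.
    # Every number here is an assumption for illustration, not a known spec.

    BOARD_CAP_W = 300        # the ceiling the thread is reasoning around
    BOARD_OVERHEAD_W = 40    # hypothetical: memory, VRMs, fan, conversion losses

    per_die_budget_w = (BOARD_CAP_W - BOARD_OVERHEAD_W) / 2
    print(per_die_budget_w)  # 130.0 W available per Cayman die

    # If a single-GPU Cayman card drew close to 250-300 W, each die on the
    # dual card would have to run at roughly half that - i.e. heavily binned
    # and/or downclocked parts, which is the dilemma described above.
    for single_card_w in (250, 300):
        print(per_die_budget_w / single_card_w)  # ~0.52 and ~0.43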
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
By your logic, why the hell is HD 6870 not named HD 5890?
():)

It depends entirely upon the chip design. Right now the 6800s are, according to rumor, new designs with fewer but more powerful shaders than the 5800s, and thus more than worthy of a new name. The aspect we should be criticizing about AMD's naming is the fact that they're changing the naming scheme around and going with 6800s and 6900s instead of 6700s and 6800s.

If GF110 is just a complete GF100, then I'd agree it's really not worthy of a new series, but nVidia will label it as such for marketing purposes.

If it's a decently altered design like the GF104 and can deliver the performance, then maybe it will be worthy of a new name; we'll just have to wait and see.

Although if the GTX 580 is a dual-GPU solution, then I won't know what to think. Performance would be excellent for a single-card solution, but it's a dual-GPU solution...
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
[image: thermi580.gif]
 

formulav8

Diamond Member
Sep 18, 2000
7,004
523
126


Yuk Yuk Yuk :D - The GTX 580 is their brand new fermonuclear model.


Anyways, it's just a typical childish PR stunt by nvidia. They apparently know they are going to lose the single-GPU crown by year's end and are thus trying to get people to hold off buying AMD and buy their brand new fermo model instead.

The real news when it comes to nvidia is the price drops. Wondering how low the prices will go..... :thumbsup:


Jason
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Guys, I'm pretty sure T2K was talking about the OP (potential product "leak" and/or PR stunt from NVIDIA) and not the lowering of prices. At least that's how I saw it...

If there really is a 580 in the works soon to be released, cool stuff. However, if there isn't such a product, or there is but its release date is over 6 months from now, then that is pretty pathetic of NVIDIA.


Don't come in here with your rational common sense interpretation. Adhering to that kind of interpretation wouldn't leave any room to misrepresent arguments and ideas surrounding this stuff.



My take on the 580 announcement:

We have recent history to look at for how nVidia promotes a new product.
1) Announcement/Leak
2) Announcement of an upcoming announcement of said announcement/leak
3) Official Announcement of a product
4) ???
5) Release... or not... of the actual product.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Don't come in here with your rational common sense interpretation. Adhering to that kind of interpretation wouldn't leave any room to misrepresent arguments and ideas surrounding this stuff.



My take on the 580 announcement:

We have recent history to look at for how nVidia promotes a new product.
1) Announcement/Leak
2) Announcement of an upcoming announcement of said announcement/leak
3) Official Announcement of a product
4) ???
5) Release... or not... of the actual product.

Yeah. Release of something that is a cut-down version, with all the fanbois saying, "Wait til they release the full version, you'll be sorry then!"
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Saving the link to this thread in case someone ever questions which way this forum leans again ;)

All of these posts would normally be considered thread-crapping....
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
If it's a fully enabled GF100 chip it makes more sense to call it a GTX 485 or 490 than a GTX 580. Either way, I'm personally not interested. Fermi isn't a bad chip (just warm), but with the 6-series launch imminent I just don't see why someone in the market for a high end card would want this.

Without seeing how either of them perform, how can you make a judgement either way?

If it's based on a fully functioning GF100 chip, it can only be so fast. We know for certain it would be hot. Sure, I guess theoretically there is a very small chance that it outperforms the upcoming 6970, but I doubt it. If it does I'll be surprised, and you can quote me as underestimating the GTX 580. :)
 

thilanliyan

Lifer
Jun 21, 2005
12,062
2,275
126
Saving the link to this thread in case someone ever questions which way this forum leans again ;)

All of these posts would normally be considered thread-crapping....

And it was the opposite when nVidia was crushing ATI after G80 came out. You shouldn't be surprised.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Guys, I'm pretty sure T2K was talking about the OP (potential product "leak" and/or PR stunt from NVIDIA) and not the lowering of prices. At least that's how I saw it...

Bingo! :)

If there really is a 580 in the works soon to be released, cool stuff. However, if there isn't such a product, or there is but its release date is over 6 months from now, then that is pretty pathetic of NVIDIA.

Exactly, and unless we're missing some really, really secretly developed chip manufactured on the Moon, it's pretty much out of the question that NV will release a new architecture/generation anytime this year.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
If it's based on a fully functioning GF100 chip, it can only be so fast. We know for certain it would be hot. Sure, I guess theoretically there is a very small chance that it outperforms the upcoming 6970, but I doubt it. If it does I'll be surprised, and you can quote me as underestimating the GTX 580. :)

Fermi can only get so fast, we know that - it's a failed architecture as it is now, at least for gaming cards. And even so, calling it GTX 5xx would require a new architecture, even by NV's crooked, customer-deceiving naming practices - nothing less.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Saving the link to this thread in case someone ever questions which way this forum leans again ;)

All of these posts would normally be considered thread-crapping....

How does a 580gtx fit in the lineup? Would it not make a lot more sense if we were talking about a leaked slide that showed a 485gtx or a 490gtx? Why is the inconsistency of the name of this, as of this point, fictional card being completely dismissed by those who support its likely existence in the near future?

It doesn't make a lot of sense for a 580gtx to be released in the near future. There's no conspiracy behind the reasonable thought that presents such an argument.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
I wouldn't be surprised if the GTX 580 is, spec-wise, about 50% bigger than the GF104 in most areas (384 > 576 shaders, 256 > 384-bit memory bus, 64 > 96 TMUs). Since it is also using the GF110 moniker for the codename, it will probably be tweaked further from the GF104 to get performance/efficiency gains without making it too hot, but this is all assuming that the GF104's core is 334mm^2, according to GPU-Z and BSN.

Wow, some actual architecture speculation. More of this and fewer Photoshop follies, please.

nV obviously refocused with GF104. What can they do to make the 580 solid? Detailed architecture speculation would be much appreciated. GF104 is sweet. What's in store? Please bring your thoughts here, or start a "GTX 580 Architecture Speculation" thread.

"The biggest and most visible change is that NVIDIA beefed up the number of various execution units per SM. The 32 CUDA cores from GF100 are now 48 CUDA cores, while the number of SFUs went from 4 to 8 along with the texture units. As a result, per SM GF104 has more compute and more texturing power than a GF100 SM. This is how a “full” GF104 GPU has 384 CUDA cores even though it only has half the number of SMs as GF100."

http://www.anandtech.com/show/3809/nvidias-geforce-gtx-460-the-200-king/2
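
As a quick sanity check on the shader-count arithmetic in that quote and in the 384 > 576 speculation above (the 576-core configuration is speculation only, not a confirmed GF110 spec):

    # CUDA-core arithmetic for the GF100 vs GF104 SM layouts described in the
    # quoted article. The 12-SM line is the thread's speculation, not a spec.

    def cuda_cores(sm_count, cores_per_sm):
        return sm_count * cores_per_sm

    print(cuda_cores(16, 32))  # GF100: 512 total (480 enabled on the GTX 480)
    print(cuda_cores(8, 48))   # GF104: 384 total (336 enabled on the GTX 460)
    print(cuda_cores(12, 48))  # hypothetical GF104-style chip scaled up to 576 cores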