Nvidia's Future GTX 580 Graphics Card Gets Pictured (Rumours)


blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
What's the point of gaming on three screens? That's actually pulling more power than a 480. Eyefinity is just as power hungry as a 480, so why all the fuss about Fermi on a 480?

Untrue. Even my non-LED screens (I will change that when I can) don't eat that much power; the combined wattage for my two side monitors is 41.4 W. A 6850 at stock can run L4D2 at ~70 fps average, and it doesn't eat that much power either (less than 100 W draw on the card).

Idle watts matter more for electricity costs, anyway.

And screens don't add to system load, so you can get away with a smaller PSU.

I already explained FOV and other uses for Eyefinity in this thread; look a few posts up.

And stop spreading random numbers. I hate it when people who haven't tried Eyefinity/Surround pull numbers out of thin air and spew nonsense like that.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Wait, didn't W1zzard just plainly say a few days ago that he wasn't given a card? Did he later admit that he was joking? His forum regulars were ragging on nVidia about it and ended up blasting some biased sites, all because their favorite reviewer was not given a card. So I am genuinely confused as to why he not only seems to have a card, but is also showing it off earlier than everybody else.

Perhaps his public bellyaching about it shamed NV into giving him a card after all.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Exactly. I tried to say this, but all these fanb-- er, 'people', haha, came out of the woodwork and lynched me.

Sour grapes. Betcha they'd be saying something else if they had their own Eyefinity/Surround setups. :)
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
His point is that, because graphics have stagnated for multiple reasons, there are popular games that can be run at three-monitor resolutions (Left 4 Dead and other games that aren't that demanding). Eyefinity is relatively unanswered, and he feels it is something Nvidia could/should address in later releases.

Nothing odd about his views.
Left 4 Dead is 3 years old. That's not graphics stagnation, it's just an old game.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Left 4 Dead is 3 years old. That's not graphics stagnation, it's just an old game.

Fallout 3 in Eyefinity works as well (and probably New Vegas too, but I haven't looked into it yet). Let's face it, console ports aren't very demanding. It's what I mean by consolification of PC games.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Perhaps his public bellyaching about it shamed NV into giving him a card after all.

Or maybe he got it elsewhere. Maybe a friendly distributor or retailer slipped him one, under the table? No officially supplied card = no NDA = "Suck on this nVidia. I'm going to release my review early".
 

ugaboga232

Member
Sep 23, 2009
144
0
0
You are right about that one. I should have said something like L4D2, Fallout: New Vegas, and other stuff like that.
 

Dark_Archonis

Member
Sep 19, 2010
88
1
0
I don't think you're serious, because if you thought about it for more than a microsecond, you would know that with 2 monitors the crosshairs get split down the middle (or disappear, with bezel compensation). Odd numbers are better. Three is already stressful on current hardware, so let's focus on that before we go up to 5 or more. You will start reaching diminishing returns after 3 screens anyway, because that already takes up so much peripheral vision. Unless you were born with eyes on the sides or back of your head or something.

Something people sometimes don't get with Eyefinity (I will use Eyefinity here, but it applies to Surround too) is that it actually changes the aspect ratio. You literally gain more peripheral vision. Playing on a 30" is not the same as 3x24", because you are still stuck in a narrow field of vision (FOV).
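
To put rough illustrative numbers on that (a quick back-of-the-envelope sketch of my own, not from any review; it assumes the game uses Hor+ scaling, i.e. the vertical FOV stays fixed while the horizontal FOV grows with aspect ratio, and it ignores bezel compensation):

import math

def horizontal_fov(vertical_fov_deg, aspect_ratio):
    # Hor+ relation: tan(hFOV/2) = aspect_ratio * tan(vFOV/2)
    half_v = math.radians(vertical_fov_deg) / 2
    return math.degrees(2 * math.atan(aspect_ratio * math.tan(half_v)))

print(horizontal_fov(60, 16 / 9))   # one 16:9 screen        -> ~91 degrees
print(horizontal_fov(60, 48 / 9))   # three 16:9 in Eyefinity -> ~144 degrees

Exact figures vary per game and FOV setting, but going from roughly 90 to roughly 144 degrees of horizontal view is the extra peripheral vision I'm talking about.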

Let me put it to you this way: would you want to drive a car where the side windows were spray-painted black? Because that's what it feels like to play on one monitor, after you've gamed on Eyefinity.

As for bezel-haters: guess what, your single monitor STILL HAS BEZELS. The bezels are there whether or not you have Eyefinity; Eyefinity simply extends the screen real estate beyond them. Your eyes will get used to bezels, it's not a big deal. After a while, people hardly notice the pillars holding up their windshield when they drive.

Not to mention productivity benefits outside of gaming.

If you are truly interested in this subject and aren't trolling, make a thread about it or something. I don't really care to derail this thread into an Eyefinity/Surround discussion.

With regards to field of vision and multiple monitors, it depends how they are arranged, what size the monitors are, and what aspect ratio the monitors have. You're frankly making a lot of assumptions if you feel that 3 is the best for peripheral vision.

Anyways, I won't go further off-topic with this discussion.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Fallout 3 in Eyefinity works as well (and probably New Vegas too, but I haven't looked into it yet). Let's face it, console ports aren't very demanding. It's what I mean by consolification of PC games.

Try STALKER: CoP maxed out on 3 monitors... it will bring your machine to its knees unless you have a dual 5970 setup. :D
 

cscgo

Junior Member
Nov 9, 2010
2
0
0
They both have vapor chambers. Sapphire pretty much made vapor chambers, so it will be interesting to see if that gives Cayman an advantage in this one. On the flip side, the Nvidia solution looks very promising.

Vapor chambers have been around for a couple of decades... by "made vapor chambers" did you mean that they invented them or something?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Vapor chambers have been around for a couple of decades....by "made vapor chambers" did you mean that they invented them or something?

Welcome to the forums. I think he means "popularized the usage of vapor chambers on video cards" or something.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Well, the 580 sure shows the 'large die nay-sayers'...
nVidia managed to make a GPU that is quite a bit faster than the 480, while being slightly smaller and a lot less power-hungry.
It seems to be giving the 5970 a good run for its money, even though the combined die size of the 5970's two GPUs is larger than the 580's single GPU; the 580's power consumption is lower as well. And is this the best stock cooler ever? Apparently it's very quiet.

The 6900-series will probably put AMD back on top in gaming... but still, nVidia has gotten a lot closer, and didn't have to sacrifice any of the 480's strengths. Tessellation and GPGPU are better than ever, and still ahead of anything AMD has, and I don't think the 6900 will change that.

I guess Fermi will really come into its own once the process shrink finally arrives. AMD, on the other hand, will probably have to refresh their architecture *again* to close the gap in GPGPU and tessellation.
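
For reference, the rough ballpark figures behind that comparison (going by the commonly published specs, so treat them as approximate):

HD 5970: 2 x ~334 mm^2 (Cypress) ≈ 668 mm^2 combined, 294 W rated board power
GTX 580: ~520 mm^2 (GF110), 244 W rated board power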
 

Dark_Archonis

Member
Sep 19, 2010
88
1
0
Well, the 580 sure shows the 'large die nay-sayers'...
nVidia managed to make a GPU that is quite a bit faster than the 480, while being slightly smaller and a lot less power-hungry.
It seems to be giving the 5970 a good run for its money, even though the combined die size of the 5970's two GPUs is larger than the 580's single GPU; the 580's power consumption is lower as well. And is this the best stock cooler ever? Apparently it's very quiet.

The 6900-series will probably put AMD back on top in gaming... but still, nVidia has gotten a lot closer, and didn't have to sacrifice any of the 480's strengths. Tessellation and GPGPU are better than ever, and still ahead of anything AMD has, and I don't think the 6900 will change that.

I guess Fermi will really come into its own once the process shrink finally arrives. AMD, on the other hand, will probably have to refresh their architecture *again* to close the gap in GPGPU and tessellation.

AMD will also face much more intense GPU competition from Intel, starting early next year with Intel's Sandy Bridge CPUs with integrated GPUs.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
AMD will also face much more intense GPU competition from Intel, starting early next year with Intel's Sandy Bridge CPUs with integrated GPUs.

Well yea... I wonder if Fusion isn't just going to fall flat on its face.
After all, the biggest market for such processors will be the notebook market and simple office machines.
CPU performance and power consumption will be more important there than GPU performance. And Intel has AMD beat in those areas.
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
Well, the 580 sure shows the 'large die nay-sayers'...
nVidia managed to make a GPU that is quite a bit faster than the 480, while being slightly smaller and a lot less power-hungry.

It seems to be giving the 5970 a good run for its money, even though the combined die size of the 5970's two GPUs is larger than the 580's single GPU; the 580's power consumption is lower as well.

Yeah, it's awesome. It only took them 13 months to compete with a halo product on their own, but hey! Sure showed us large-die naysayers that large dies will rule the day. Darn it. It's great for us consumers that Nvidia were able to retaliate in the high end with such a speedy response. Oh wait...!

Wouldn't it be a shame if AMD had a new dual-GPU solution in the pipeline, called Antilles? A card which will presumably be faster than the HD 5970, which in turn is faster than the GTX 580.


And is this the best stock cooler ever? Apparently it's very quiet.

Might well be.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Well, the 580 sure shows the 'large die nay-sayers'...
nVidia managed to make a GPU that is quite a bit faster than the 480, while being slightly smaller and a lot less power-hungry.
It seems to be giving the 5970 a good run for its money, even though the combined die size of the 5970's two GPUs is larger than the 580's single GPU; the 580's power consumption is lower as well. And is this the best stock cooler ever? Apparently it's very quiet.

The 6900-series will probably put AMD back on top in gaming... but still, nVidia has gotten a lot closer, and didn't have to sacrifice any of the 480's strengths. Tessellation and GPGPU are better than ever, and still ahead of anything AMD has, and I don't think the 6900 will change that.

I guess Fermi will really come into its own once the process shrink finally arrives. AMD, on the other hand, will probably have to refresh their architecture *again* to close the gap in GPGPU and tessellation.

NV won't have to reinvent the wheel much for 28nm; a die shrink and some tweaks *cough* single-GPU Surround *cough* would be fine. I am curious as to what AMD will do. If Cayman is a smash-hit architecture with legs, then they can do a die shrink and be competitive, but if not, at some point they will need to rebuild their architecture from the ground up and take their lumps like NV had to when it made Fermi. I think that "some point" is next year. R600 has been around forever....
 
Feb 19, 2009
10,457
10
76
Scali, I dunno which reviews you read; I thought I saw the GTX 580 using more power than the 5970 when gaming. It's a good GPU, but don't make it out to be something fantastic when it's not. It is, after all, just 15% better than a GTX 480 at slightly lower TDP.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Scali, I dunno which reviews you read; I thought I saw the GTX 580 using more power than the 5970 when gaming. It's a good GPU, but don't make it out to be something fantastic when it's not. It is, after all, just 15% better than a GTX 480 at slightly lower TDP.

The GTX480 was already the fastest GPU on the market, so 15% faster than the GTX480 is nothing to scoff at.
And if that is combined with less power, all the better. After all, power consumption was one of the biggest downsides of the GTX480. The reason why I didn't buy one.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Nice power curves. Looks like nV will own this holiday season with reported Cayman delays.
 

Dark_Archonis

Member
Sep 19, 2010
88
1
0
This is a valid point, Zebo. Rumors are that AMD's 6900 cards are being delayed by none other than TSMC yield issues. If this is true, it's very ironic. The 69xx cards might not be available in time for the holiday season, which would give Nvidia a really nice boost.

Well yea... I wonder if Fusion isn't just going to fall flat on its face.
After all, the biggest market for such processors will be the notebook market and simple office machines.
CPU performance and power consumption will be more important there than GPU performance. And Intel has AMD beat in those areas.

Exactly. AMD's bet with Zacate, Ontario, and Llano is that people don't need/want more CPU performance, but do need/want more GPU performance. The irony is that the average person doesn't need more GPU performance; they just need "good enough" performance. Sandy Bridge's GPU will be able to do everything Joe Public needs, and soon all Atom platforms will be fully able to handle any HD video content.

I think it's a big risk AMD is taking though.
 

Ares1214

Senior member
Sep 12, 2010
268
0
0
I suspect that Cayman is a similar deal to GTX480->GTX580.
Slightly tweaked architecture, giving a bit of extra performance, and a bit of power savings.

I 100% disagree. Nvidia started with a massive die and shrank things down a bit. AMD started with a small die and had room to make it bigger, possibly up to 400 mm².

GF110 was just enabling everything, raising clocks, and shaving away some things that were partially unused, partially CUDA-related. That got more performance from enabling everything as well as from the higher clocks, 10-15% higher, and the energy savings came from what they cut down. AMD, by contrast, seems to be going for about 20% more die area than the 5870 with the 6970, and the TPU review put the 580 at about 25% above the 5870.

Cayman is really a fairly new arch; GF110 really wasn't. GF110 was a lot more like Barts: cut things away for efficiency, raise clocks, and that yields GF110. Cayman is much more of a new arch, so I'd expect a lot more performance from the 6970, as well as it requiring more power. Where have you been, haven't you seen the power consumption charts of the 6970? 6+8 pin? No?

BTW, the price gouging on Newegg for the GTX 580s makes them a pretty pathetic buy right now.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I 100% disagree. Nvidia started with a massive die and shrank things down a bit. AMD started with a small die and had room to make it bigger, possibly up to 400 mm².

GF110 was just enabling everything, raising clocks, and shaving away some things that were partially unused, partially CUDA-related. That got more performance from enabling everything as well as from the higher clocks, 10-15% higher, and the energy savings came from what they cut down. AMD, by contrast, seems to be going for about 20% more die area than the 5870 with the 6970, and the TPU review put the 580 at about 25% above the 5870.

Cayman is really a fairly new arch; GF110 really wasn't. GF110 was a lot more like Barts: cut things away for efficiency, raise clocks, and that yields GF110. Cayman is much more of a new arch, so I'd expect a lot more performance from the 6970, as well as it requiring more power. Where have you been, haven't you seen the power consumption charts of the 6970? 6+8 pin? No?

BTW, the price gouging on Newegg for the GTX 580s makes them a pretty pathetic buy right now.

Here's the thing. We (this means all of us here) do not have any definitive information on what Cayman actually is. Probably won't for quite a while.