Nvidia's Future GTX 580 Graphics Card Gets Pictured (Rumours)


toyota

Lifer
Apr 15, 2001
12,957
1
0
Yeah, they have a true lifetime warranty on all GTX models, and a 5-year warranty on GTS models. They have also started accepting BFG's RMAs. So, if you have a BFG card and need service, they are going to do it.

I was surprised as well to find out about their new warranty, but considering it just went into effect after BFG went under, it seems they are trying to pick up where BFG left off and increase their market share with nVidia.

So far, I am more than pleased with this card. Time will tell of course, but build quality is good. I think the RAM they use is not rated as high as the other manufacturers', and they don't include any overclocking software, but I don't overclock anymore...so for me it is no big deal. But if I want to, all I have to do is get Afterburner. But meh. Considering the factory overclock is 765 on the core, I won't really miss those extra 3-4 fps in gameplay.

Edit:

And no, I am not wrong. I know how to read.

Limited Lifetime Warranty
  • One year (will be extended for the lifetime of the original purchaser upon completion of a registration form on PNY's website)
Right, but then read what their "Limited Lifetime Warranty" actually means, which is what I posted. It says: "Lifetime is defined as the lifetime of the product on the market."
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Right, but then read what their "Limited Lifetime Warranty" actually means, which is what I posted. It says: "Lifetime is defined as the lifetime of the product on the market."
That is their old 'Lifetime' warranty. The new warranty went into effect after September 9th. It is the lifetime of the original purchaser, not the lifetime of the card. It has changed now.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
That is their old 'Lifetime' warranty. The new warranty went into effect after September 9th. It is the lifetime of the original purchaser, not the lifetime of the card. It has changed now.
Yeah, I see that now after reading the Lifetime Warranty for the XLR8 GeForce GTX Series more closely. So I assume you remembered to register your card???
 

Ares1214

Senior member
Sep 12, 2010
268
0
0
After seeing the TechPowerUp review...not too impressed. 10-13% more performance, 20-25% over the 5870. I mean, better, but 20-25% were the minimum numbers expected for the 6970 over the 5870.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
It's actually for your benefit, not the mods'. We are assuming you guys do not want VC&G to become Grand Central Station for free-advertising spammers to bombard you with links to their articles 24x7.

You guys get the forum you deserve...if you don't care to keep it cleaned up then it will slowly become a mess. Report the posts, make it easier on your volunteer mods to make the best use of their limited free time to keep your forums in the condition you want them to be.


Moderator Idontcare

THIS.


(I should stop using this comment as a sign of my agreement. :D)
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
So I assume you remembered to register your card???
Damn right!

:p

It looks like they've removed the link to the BFG RMA program though, so I don't know if they are still doing that. I know they were, but they may have stopped now.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
After seeing the TechPowerUp review...not too impressed. 10-13% more performance, 20-25% over the 5870. I mean, better, but 20-25% were the minimum numbers expected for the 6970 over the 5870.

I'm still happy - it will put some pressure on AMD so we can expect more reasonable prices on 6970/6900, I think.
 

Dark_Archonis

Member
Sep 19, 2010
88
1
0
No. Reread the last few pages yourself, find out what the conversation was about, and stop replying to single posts without knowing how they fit.

If you want to be so stubborn about it, fine we'll do it your way.

You ORIGINALLY posted this:

http://forums.anandtech.com/showpost.php?p=30734169&postcount=623

I'm actually kinda surprised Nvidia hasn't brought out a 3+ monitor card. Seems a bit odd. Eyefinity is still in a league of its own imo.

You find it "odd", well good for you. However as I mentioned in my subsequent replies to you, it is NOT PRACTICAL to have one card running 3+ monitors (unless it's a dual GPU card) so maybe that is why Nvidia does not have such a card currently?

Is that CLEAR enough for you? :rolleyes:

Or maybe Nvidia feels that there isn't a big enough market for a single card powering 3+ monitors.

I find it "odd" that you don't even try to understand the reasoning as to why Nvidia might not have such a product available on the market.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
How many gamers that have 3 monitors actually settle for the performance they'll get out of 1 card driving 3x the pixels of one monitor?

*raises hand*

Most people already know the drawbacks to 2+ GPUs: possible broken SLI/CF (see, e.g., http://www.hardocp.com/article/2010/10/08/ati_crossfirex_vs_nvidia_sli_new_games_performance/3), higher power/noise/heat than one card, possibly more microstuttering, possibly having to upgrade your mobo and/or PSU and/or case/cooling to accommodate more than 1 card, etc.

I went with mini-Eyefinity (3 x 22" 5040x1050) for flexibility:

1) Eyefinity is useful OUTSIDE OF games, too! You can use it for productivity and multitasking even when you aren't gaming, and for me that's important because I don't game all the time. I can watch TV on one screen, surf the web on another, and type into Excel on another, for instance. You can sorta do that with a 30" but it's easier to have one app per screen, esp. in Windows 7, which was built with multi-monitor setups and widescreen displays in mind.

2) Single-GPU is more than enough to drive older games. I mostly play Source games and can tell you that Left 4 Dead 2 at 5140x1050 (bezel-compensated) runs at ~70fps average at stock clocks on a 6850 at max detail settings and 4xMSAA. That's the framerate for a 6850--not even a 6870 or 5870 or overclocked 5870! TF2 runs almost as well. With overclocking, I can really crank up the fps in those games. Lots of console ports are similarly easy on GPUs and you can actually run Eyefinity with them. The fps hit isn't 67% like you might expect. It's not linear like that. It's more like a 40-60% hit depending on the game.

3) If you really need to go single-monitor, it's easy. I play BFBC2 on one screen with maximum eye-candy. No problem.

4) Lower cost and less deskspace than 3x 1080p or 3x 19x12 resolution. A highly oc'd 6850 can power those resolutions just as well as a stock-clocked 6850 can power 3x 16x10, but the displays a) cost more and b) 3x24" would almost spill over the edges of my desk.
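
A quick back-of-the-envelope sketch of the pixel math behind point 2 (the resolutions are the ones quoted above; the ~67% figure is just the naive inverse-scaling expectation, not a benchmark):

```python
# Rough pixel math: one 22" 1680x1050 panel vs. a 3x22" Eyefinity group.
# Shows why a naive "fps scales inversely with pixels" guess predicts a
# ~67% hit, versus the 40-60% observed in practice.

single = 1680 * 1050          # pixels on one panel
triple = 5040 * 1050          # pixels across the 3x1 group (no bezel comp)

ratio = triple / single       # 3.0x the pixels
naive_hit = 1 - 1 / ratio     # expected fps drop if scaling were linear

print(f"pixel ratio: {ratio:.1f}x")      # pixel ratio: 3.0x
print(f"naive fps hit: {naive_hit:.0%}") # naive fps hit: 67%
```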

In short, stop making excuses for NV's hardware limitations. I hope they don't settle on being second-best in ANYTHING, and that they therefore enable Kepler to run at least 3 monitors at once.
 

ugaboga232

Member
Sep 23, 2009
144
0
0
His point is that due to stagnation of graphics for multiple reasons, there are games that can be run at 3 monitor resolutions that are popular (Left 4 Dead, and other games that aren't that demanding). Eyefinity is relatively unchecked and would be something that he feels nvidia could/should do in later releases.

Nothing odd about his views.
 

Dark_Archonis

Member
Sep 19, 2010
88
1
0
So using this logic, why doesn't either AMD OR Nvidia have a single-card product that can power 6+ monitors? What excuses do Nvidia and AMD have here?

I mean what makes 3 monitors better than 2? Why not have 6 or more monitors? Where do you draw the line? Where does it end, if ever?

How about 3D? Why are there no single cards offered that can power 6+ monitors in 3D? What if some people want to run Windows 7 in 3D on 6 monitors? What if somebody wants to play some really old game on 6+ monitors?

Why is this tiny market niche being ignored by both Nvidia and AMD?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Or maybe Nvidia feels that there isn't a big enough market for a single card powering 3+ monitors.

Sure, Surround is a tiny market, but guess what? SLI is itself a TINY market. 3D is also a TINY market. Hell if we want to be rude about it, we can call other things TINY too, like GPU PhysX, but NV tries like hell to grow them anyway. Why? Because consolification of PC gaming is reducing the need for people to continually upgrade their GPUs, which is a VERY BAD THING for both ATi and NV. (Not to mention that the PC gaming market has been losing ground to consoles in recent years, so you can't rely on the overall market size of PC gaming to increase like in the past, either.)

What NV wants to do is to encourage people to get 3D and/or Surround in order to keep the sales pace going.

If NV devotes so much time to niches like 3D, I'm sure they will devote time to making single-GPU Surround work in hardware. The problem is that, like Fermi, NV started working on Kepler's design a long time ago, so I don't know if they have time to change things midstream to include native Surround support.

Even if Kepler doesn't, Maxwell will. I guarantee it. You seem to think that NV is the kind of company that would willingly settle for second place. NV is not that kind of company. It doesn't settle for second place if it can help it--even in niche markets.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
So using this logic, why doesn't either AMD OR Nvidia have a single-card product that can power 6+ monitors? What excuses do Nvidia and AMD have here?

I mean what makes 3 monitors better than 2? Why not have 6 or more monitors? Where do you draw the line? Where does it end, if ever?

How about 3D? Why are there no single cards offered that can power 6+ monitors in 3D? What if some people want to run Windows 7 in 3D on 6 monitors? What if somebody wants to play some really old game on 6+ monitors?

Why is this tiny market niche being ignored by both Nvidia and AMD?

Is this a serious question or are you trolling? AMD does have single-GPU cards with 6-monitor support. Eyefinity6 ring a bell?

And if you don't see why 3 monitors are FAR better than 2, I don't know what to say other than you must not play very many FPS games. Or racing games. Or flight sims.

Your other questions are simply argumentative; the GTX 580 can't even run six-screen Metro 2033, for instance, because the VRAM limitation is too great. Stop being argumentative and a Surround/Eyefinity hater. If you had a Surround/Eyefinity setup, you'd understand why those of us with such systems like it.
 

Dark_Archonis

Member
Sep 19, 2010
88
1
0
Sure, Surround is a tiny market, but guess what? SLI is itself a TINY market. 3D is also a TINY market. Hell if we want to be rude about it, we can call other things TINY too, like PhysX, but NV tries like hell to grow them anyway. Why? Because consolification is reducing the need for people to continually upgrade their GPUs, which is a VERY BAD THING for both ATi and NV.

What NV wants to do is to encourage people to get 3D and/or Surround in order to keep the sales pace going.

If NV devotes so much time to niches like 3D, I'm sure they will devote time to making single-GPU Surround work in hardware. The problem is that, like Fermi, NV started working on Kepler's design a long time ago, so I don't know if they have time to change things midstream to include native Surround support.

Even if Kepler doesn't, Maxwell will. I guarantee it. You seem to think that NV is the kind of company that would willingly settle for second place. NV is not that kind of company. It doesn't settle for second place if it can help it--even in niche markets.

I never said that Nvidia would "settle" or never offer it. I wouldn't be surprised if Kepler and/or Maxwell offer 3+ monitor support on a single card.

My original point was in reply to Gloomy's post about how he found it "odd" that Nvidia did not have support for this yet. AMD cards support some things that Nvidia cards do not, and vice versa. I fail to see why someone would find anything "odd" at all in this situation.

Yes, I agree, both Nvidia and AMD continue to introduce and offer tiny niche features in order to drive the market forward.

As for the possible "consolification" of the PC market, Nvidia and AMD must act differently here. Continually introducing new features only to high-end products is not a great strategy. In order to drive the PC market forward, AMD and Nvidia need to start introducing their brand new features into mainstream and mid-range cards all at once.

Pushing new features on high-end products helps each company make some nice profit, but it isn't really the best way to push the PC market forward.

A single AMD Barts card can drive 6 monitors.

It seems you missed my point. Can it do 6+ monitors in 3D? Point is, where does it end? Where do you draw the line?

I guess never, just for the sake of getting tech enthusiasts to keep buying the "latest new thing".

Is this a serious question or are you trolling? AMD does have single-GPU cards with 6-monitor support. Eyefinity6 ring a bell?

And if you don't see why 3 monitors are FAR better than 2, I don't know what to say other than you must not play very many FPS games. Or racing games. Or flight sims.

Your other questions are simply argumentative; the GTX 580 can't even run six-screen Metro 2033, for instance, because the VRAM limitation is too great. Stop being argumentative and a Surround/Eyefinity hater. If you had a Surround/Eyefinity setup, you'd understand why those of us with such systems like it.

I am not trolling anything, I am serious.

How about 8 monitors on a single card? As I have already asked, WHERE do you draw the line? Where does it end?

Going back to my original point, finding it "odd" that Nvidia currently doesn't offer this is meaningless in my opinion, due to reasons I have already pointed out.
 

Aenslead

Golden Member
Sep 9, 2001
1,256
0
0
Because it is against forum policy. Imagine what your forum will look like if every author of every review article out there decides to come here and spam every thread with self-promotion links to their websites and articles.

Moderator Idontcare

Fine. I do-do care now.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I am not trolling anything, I am serious.

How about 8 monitors on a single card? As I have already asked, WHERE do you draw the line? Where does it end?

Going back to my original point, finding it "odd" that Nvidia currently doesn't offer this is meaningless in my opinion, due to reasons I have already pointed out.

I don't think you're serious, because if you thought about it for more than a microsecond you would know that with 2 monitors the crosshairs get split down the middle (or disappear, with bezel compensation). Odd numbers are better. Three screens are already stressful on current hardware; let's focus on that before we go up to 5 or more. You will start reaching diminishing returns after 3 screens anyway, because that already takes up so much peripheral vision. Unless you were born with eyes on the sides or back of your head or something.

Something people sometimes don't get with Eyefinity (I will use Eyefinity here, but it applies to Surround too) is that it actually changes the aspect ratio. You literally gain more peripheral vision. Playing on a 30" is not the same as 3x24" because you are still stuck in a narrow field of vision (FOV).
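
The aspect-ratio point is easy to see with quick arithmetic (the 1920x1200 panel size here is illustrative, assuming 16:10 monitors):

```python
# Aspect ratio of one 16:10 panel vs. a 3x1 landscape group.
w, h = 1920, 1200                  # a single 24" 16:10 panel (illustrative)

single_ar = w / h                  # 1.6 -> the familiar 16:10
group_ar = (3 * w) / h             # 4.8 -> 48:10: triple the horizontal FOV

print(f"single: {single_ar:.2f}, triple-wide: {group_ar:.2f}")
```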

Let me put it to you this way: would you want to drive a car where the side windows were spray-painted black? Because that's what it feels like to play on one monitor, after you've gamed on Eyefinity.

As for bezel-haters: guess what, your single monitor STILL HAS BEZELS. The bezels are there whether or not you have Eyefinity. Eyefinity simply extends the screen real estate beyond those bezels. Your eyes will get used to bezels; it's not a big deal. People hardly notice the pillars framing their windshield once they've been driving a car for a while.

Not to mention productivity benefits outside of gaming.

If you are truly interested in this subject and aren't trolling, make a thread about it or something. I don't really care to derail this thread into an Eyefinity/Surround discussion.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
His point is that due to stagnation of graphics for multiple reasons, there are games that can be run at 3 monitor resolutions that are popular (Left 4 Dead, and other games that aren't that demanding). Eyefinity is relatively unchecked and would be something that he feels nvidia could/should do in later releases.

Nothing odd about his views.

Exactly. I tried to say this but all these fanb-- er, 'people', haha, came out of the woodwork and lynched me.
 