The GTX 780, 770, 760 Ti Thread *First review leaked, $700+?*

Status
Not open for further replies.

toyota

Lifer
Apr 15, 2001
12,957
1
0
A lot of us have been enjoying Titan-level performance on 7970's and GTX 680's for almost a year and a half. It's hardly significant.

NVIDIA made a nice cash-grab move and some people bought it. The significance of which is hard to determine without some actual sales figures.
A stock GTX 680 is 30-35% faster than a GTX 580 overall, and even less in some demanding games. A stock 7970 GHz (which is not even out that long) is maybe 40-45% faster than a GTX 580. Titan overall is about 85-90% faster than a GTX 580. So again, Titan was the first significant upgrade in 2.5 years. And yes, I know you want to talk about overclocking, but I am comparing ALL the cards at stock.
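Those stock-vs-stock percentages can be sanity-checked with a quick script. The figures are the rough midpoints of the estimates above, not benchmark data:

```python
# Relative stock performance, normalized to a GTX 580 baseline of 1.0.
# Percentages are the midpoints of the post's estimates, not measurements.
relative_perf = {
    "GTX 580": 1.0,
    "GTX 680": 1.325,      # ~30-35% faster than a 580
    "HD 7970 GHz": 1.425,  # ~40-45% faster
    "GTX Titan": 1.875,    # ~85-90% faster
}

# Titan's gap over the next-fastest stock card:
gap = relative_perf["GTX Titan"] / relative_perf["HD 7970 GHz"] - 1
print(f"Titan vs 7970 GHz at stock: ~{gap:.0%} faster")  # ~32%
```

Even against the 7970 GHz rather than the old 580, these numbers put stock Titan roughly a third faster, which is the "first significant upgrade" claim in arithmetic form.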
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
A stock GTX 680 is 30-35% faster than a GTX 580 overall, and even less in some demanding games. A stock 7970 GHz (which is not even out that long) is maybe 40-45% faster than a GTX 580. Titan overall is about 85-90% faster than a GTX 580. So again, Titan was the first significant upgrade in 2.5 years. And yes, I know you want to talk about overclocking, but I am comparing ALL the cards at stock.
Well, this being an enthusiast forum, overclocking is considered.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
The rumors for this card are all over the place: either a 320-bit or a 384-bit bus, and 5GB or 3GB of VRAM depending on the bus size. I'd say if the GTX 780 ends up with 2.5-3GB of VRAM at no more than $650, preferably $550-600, then folks won't complain about the price much. It's high, but not exorbitant.

There's another rumor indicating $800. Now, I realize Nvidia can do whatever they want, but I feel that's just exorbitant. Some will still buy it, as Nvidia has a lot of fans, of course, but it definitely won't win them any goodwill. That's just too much for the x80 part, IMHO. They already have the Titan for folks wanting to part with an arm and a leg; why put the x80 in the same stratosphere?

Personally, I think the former is the preferable approach: make it 384-bit with 3GB and 6GB configurations the user can choose between; 5GB is far too much for the typical user. You can say "future-proof," but we don't have games yet that can make adequate use of that much VRAM unless the end user tries awfully hard with 17 gabillion mods and 8x SGSSAA at a single-screen resolution. Surround can use 5GB, and while I like Surround, most users just aren't using Surround, period.

So yeah, I think the best price point would be around $600 for a 3GB card, with the 6GB version higher. They already have Titan, as mentioned, for those who want something in the stratosphere. Assuming 384-bit, as some rumors indicate, the latest Sweclockers reports suggest it may indeed be 384-bit with fewer CUDA cores and less VRAM than Titan.

If it's $599-649, then I don't see it having all its memory controllers enabled; performance would be just too close to Titan at that price range. And the rub there is that if it's a 320-bit bus, I doubt Nvidia would want to use a mixed-memory configuration on their high-end card, and 2.5GB of VRAM would be awkward in the 700 series lineup if the GTX 770 and GTX 760 Ti having 4GB of VRAM standard turns out to be true.
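The mixed-memory point comes down to chip-count arithmetic. A rough sketch, assuming 32-bit GDDR5 chips at the then-common 2Gb (256MB) density, with clamshell mode doubling the chips per channel:

```python
CHIP_WIDTH_BITS = 32   # a GDDR5 chip presents a 32-bit interface
CHIP_SIZE_GB = 0.25    # 2 Gb chip = 256 MB

def vram_gb(bus_width_bits, clamshell=False):
    """Total VRAM with one (or two, in clamshell mode) chips per 32-bit channel."""
    channels = bus_width_bits // CHIP_WIDTH_BITS
    return channels * (2 if clamshell else 1) * CHIP_SIZE_GB

print(vram_gb(320))                  # 2.5 GB -- the awkward number
print(vram_gb(320, clamshell=True))  # 5.0 GB
print(vram_gb(384))                  # 3.0 GB
print(vram_gb(384, clamshell=True))  # 6.0 GB
```

Hitting 3GB on a 320-bit bus would mean mixing chip densities across channels, which is exactly the mixed-memory configuration the post doubts Nvidia would ship on a flagship.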
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Add another ~7% for GHz OC and it's still not even close, Titan is what GHz wishes it was, fast and efficient.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,116
136
If it's $599-649, then I don't see it having all its memory controllers enabled; performance would be just too close to Titan at that price range. And the rub there is that if it's a 320-bit bus, I doubt Nvidia would want to use a mixed-memory configuration on their high-end card, and 2.5GB of VRAM would be awkward in the 700 series lineup if the GTX 770 and GTX 760 Ti having 4GB of VRAM standard turns out to be true.

If the 780 is really GK110 with 13 SMXs enabled and sells for $700-750, it will likely be because it has 5GB of GDDR5 on a 320-bit interface. That's an added differentiator against 2-4GB GK104 models, especially at higher resolutions (1440p and up).
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I honestly hope it isn't exclusively a 5GB version. I'd much rather have a choice (e.g. 3 vs. 6GB) because 5GB just isn't beneficial outside of Surround. Even with 2GB, I'll argue all day about this because I've TRIED -- VRAM isn't an issue even at 2560x1600 unless you go overboard with mods or use tons of SGSSAA. Now, with Surround, VRAM is always an issue and you can easily make the case for more being better -- I would not disagree there.

But the fact of the matter is most users are not using Surround; most are gaming at a single-screen resolution. Therefore, 5GB would largely go to waste. You can argue about the future all you want, but we're in the here and now, and right now it's a waste on a single screen. Someone may say that with the PS4 coming out, games will require more VRAM. My response is the same: we're in the here and now. We don't know what effect next-gen consoles will have YET, and I certainly can't imagine a next-gen console targeting anything HIGHER than 1080p. So with that said, I hope the 5GB rumor is wrong. Some rumors at Sweclockers also indicate it could be a 384-bit part with 3 or 6GB; I hope that is correct. Then users will have the *choice* of buying a part that isn't completely exorbitant in price. I certainly can't see this forum going nuts over the 780 if it costs $800. Sure, some will buy it, but cost will definitely be a deterrent for many.

Sweclockers posted 3/6GB just a few days ago, FYI, so it may not be 5GB. I *hope* it isn't 5GB; I would prefer 3 or 6. Give users a choice on VRAM and cost: $600 vs. $800. That sounds much less exorbitant to me than an $800 5GB card.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I honestly hope it isn't exclusively a 5GB version. I'd much rather have a choice (e.g. 3 vs. 6GB) because 5GB just isn't beneficial outside of Surround. Even with 2GB, I'll argue all day about this because I've TRIED -- VRAM isn't an issue even at 2560x1600 unless you go overboard with mods or use tons of SGSSAA. Now, with Surround, VRAM is always an issue and you can easily make the case for more being better -- I would not disagree there.

But the fact of the matter is most users are not using Surround; most are gaming at a single-screen resolution. Therefore, 5GB would largely go to waste. You can argue about the future all you want, but we're in the here and now, and right now it's a waste on a single screen. Someone may say that with the PS4 coming out, games will require more VRAM. My response is the same: we're in the here and now. We don't know what effect next-gen consoles will have YET, and I certainly can't imagine a next-gen console targeting anything HIGHER than 1080p. So with that said, I hope the 5GB rumor is wrong. Some rumors at Sweclockers also indicate it could be a 384-bit part with 3 or 6GB; I hope that is correct. Then users will have the *choice* of buying a part that isn't completely exorbitant in price. I certainly can't see this forum going nuts over the 780 if it costs $800. Sure, some will buy it, but cost will definitely be a deterrent for many.

Sweclockers posted 3/6GB just a few days ago, FYI, so it may not be 5GB. I *hope* it isn't 5GB; I would prefer 3 or 6. Give users a choice on VRAM and cost: $600 vs. $800. That sounds much less exorbitant to me than an $800 5GB card.

I'm sure AMD and Nvidia know something about upcoming PC games that has them pushing up the amount of RAM on their upcoming cards. My guess is that it has something to do with the lowest common denominator going from 512MB of RAM to 8GB in November. I can just imagine how much RAM usage we'll be seeing when current games designed around 512MB consoles already use up to 3GB of RAM on the PC.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I honestly hope it isn't exclusively a 5GB version. I'd much rather have a choice (e.g. 3 vs. 6GB) because 5GB just isn't beneficial outside of Surround. Even with 2GB, I'll argue all day about this because I've TRIED -- VRAM isn't an issue even at 2560x1600 unless you go overboard with mods or use tons of SGSSAA. Now, with Surround, VRAM is always an issue and you can easily make the case for more being better -- I would not disagree there.

But the fact of the matter is most users are not using Surround; most are gaming at a single-screen resolution. Therefore, 5GB would largely go to waste. You can argue about the future all you want, but we're in the here and now, and right now it's a waste on a single screen. Someone may say that with the PS4 coming out, games will require more VRAM. My response is the same: we're in the here and now. We don't know what effect next-gen consoles will have YET, and I certainly can't imagine a next-gen console targeting anything HIGHER than 1080p. So with that said, I hope the 5GB rumor is wrong. Some rumors at Sweclockers also indicate it could be a 384-bit part with 3 or 6GB; I hope that is correct. Then users will have the *choice* of buying a part that isn't completely exorbitant in price. I certainly can't see this forum going nuts over the 780 if it costs $800. Sure, some will buy it, but cost will definitely be a deterrent for many.

Sweclockers posted 3/6GB just a few days ago, FYI, so it may not be 5GB. I *hope* it isn't 5GB; I would prefer 3 or 6. Give users a choice on VRAM and cost: $600 vs. $800. That sounds much less exorbitant to me than an $800 5GB card.

Like I explained, I think it's a foregone conclusion that a GK110-based GTX 780 will have a memory controller disabled to differentiate its performance from Titan, and I seriously doubt Nvidia will opt for a mixed-memory configuration on its flagship 700 series card (3GB of VRAM on a 320-bit memory interface, resulting in an odd number of memory chips per controller). And with rumors (which are totally believable) that the GTX 770 will have 4GB standard, I don't see the GTX 780 ending up with 2.5GB of VRAM. I'd be totally fine with it, just like you; I just don't see it happening.

It would be nice if Nvidia released two reference models for the GTX 780, one with 2.5GB of VRAM and the other with 5GB, as well as 2GB and 4GB versions of the GTX 770 and GTX 760 Ti. Charge a $50 premium and let the customers decide which one to get. In my vision of this scenario, we'd see 2GB GTX 770 cards slightly outperforming GTX 680s for $399, with the 4GB cards going for $449; 2.5GB GTX 780 cards at $549 and 5GB cards at $599. It'd be nice, but I just don't see it happening. With all the rumors swirling, I think the GTX 780 is going to be $649 with 5GB of VRAM and no 2.5GB option, and the 4GB GTX 770 will be $449, essentially replacing the GTX 680 at its current lowest price.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I'm sure AMD and Nvidia know something about upcoming PC games that has them pushing up the amount of RAM on their upcoming cards. My guess is that it has something to do with the lowest common denominator going from 512MB of RAM to 8GB in November. I can just imagine how much RAM usage we'll be seeing when current games designed around 512MB consoles already use up to 3GB of RAM on the PC.

First, you're arguing the future, and we don't know what's happening in the future. But while we're on that discussion, VRAM usage is determined by:

1) resolution
2) anti aliasing
3) game assets

Again, I can't see any PS4 game having assets in excess of the best current PC exclusives, such as Crysis 3. Crysis 3 uses 16k textures at the absolute highest quality setting (the console versions obviously have far worse texture quality), and those assets certainly don't exceed 2GB at single-screen resolutions, much less 5GB. MSAA is the second factor: with consoles reliant on shader-based AA (FXAA, MLAA, SMAA), I don't see that being a factor unless you make it one (e.g. forcing SGSSAA). Resolution for next-gen consoles, unless I'm crazy, is likely to be 1080p.

And 8GB just isn't going to be needed for 1080p unless a game does something absolutely stupid like 2048k textures. And that day isn't even remotely approaching anytime soon; you'd be kidding yourself if you think any next-gen console will use anything remotely close to that.

Also, another obvious point is that neither upcoming console has 8GB of dedicated VRAM. I just want the end user to have a choice; 3 or 6GB sounds perfect to me, being forced into 5GB for a 780, not so much. Wouldn't you rather have a choice? GDDR5 is *not* cheap. Being able to say, "Oh hey, I use Surround, I want 6GB" would be great; similarly, having a 3GB option would be great. Know what I mean? The alternative is no choice and an $800 5GB card landing. And let's not kid ourselves: if it's 5GB, it IS going to sit in that $700-800 range as the baseline. With a choice, it could be cheaper for those using single screens. Again, GDDR5 is not cheap.
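The three factors above can be put in rough numbers. A back-of-the-envelope sketch of the framebuffer side only; game assets, the third factor, dominate in practice and are entirely title-dependent:

```python
def framebuffer_mb(width, height, msaa_samples=1, bytes_per_pixel=4):
    """Rough color + depth/stencil render-target footprint, scaled by MSAA samples."""
    color = width * height * bytes_per_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples  # 32-bit depth/stencil
    return (color + depth) / 1024 ** 2

print(framebuffer_mb(2560, 1600))                  # 31.25 MB, single screen
print(framebuffer_mb(2560, 1600, msaa_samples=8))  # 250 MB with 8x MSAA
print(framebuffer_mb(5760, 1080, msaa_samples=4))  # ~190 MB, Surround + 4x
```

Even with heavy MSAA the render targets come to a few hundred MB; the case for 5GB rests almost entirely on Surround resolutions plus very large asset sets, which is the point being argued.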
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
I can confirm the GTX 780 will come out in May: 3GB GDDR5, 384-bit bus. I haven't had a chance to look at the core config yet.

It should be out very very soon. :)
 

DalekDoc

Junior Member
May 4, 2013
10
0
0
Some more details about the 7xx series refresh - http://www.fudzilla.com/home/item/31286-gtx-760-ti-770-780-detailed

The GTX 760 Ti has a 256-bit memory interface and in reality represents a rebranded GTX 670. It looks dangerously similar to existing GTX 670 boards.

Geforce GTX 770 is based on a somewhat faster GK 104 425 chip, again a 256-bit unit, and our sources are telling us that the card is more or less a GTX 680 renamed for 2013

The fastest of them all, based on the huge GK110 chip is the Geforce GTX 780. The card also shares a 256-bit memory buss and is based on GTX Titan LE.

Hit up the source link for more info
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Some more details about the 7xx series refresh - http://www.fudzilla.com/home/item/31286-gtx-760-ti-770-780-detailed



Hit up the source link for more info
So much nonsense and so many errors in that little write-up. And there is zero chance they would make a GK110 GTX 780 with just a 256-bit bus: its clocks would be so low that the GTX 680, GTX 770, and even the 760 Ti would already blow it away in effective ROP performance. Not to mention that the bandwidth itself being no better than the GTX 680, GTX 670, GTX 770, and GTX 760 Ti would defeat the purpose of having far more shaders from GK110.
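The bandwidth half of that argument is simple arithmetic. A sketch, assuming the ~6 Gbps effective GDDR5 data rate these cards shipped with:

```python
def bandwidth_gbs(bus_width_bits, effective_gbps=6.0):
    """Peak memory bandwidth: bus width in bytes times the per-pin data rate."""
    return bus_width_bits / 8 * effective_gbps

print(bandwidth_gbs(256))  # 192 GB/s -- GTX 680/670 class
print(bandwidth_gbs(320))  # 240 GB/s
print(bandwidth_gbs(384))  # 288 GB/s -- Titan class
```

On a 256-bit bus, a GK110 part would be stuck at GTX 680 bandwidth, leaving its extra SMXs starved, which is exactly the objection here.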
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Yeah, there are spelling mistakes etc., but we'll see soon.

The card was supposed to launch on May 23rd but out sources now insist that it might show a week earlier on the May 16th, but we will try to find out which of these two dates is right. (Referring to the 780 aka Titan LE)
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Given the size of GK110 and its transistor count, the performance is pretty poor. It has double the transistors of the 680 but only really performs 30-40% better.

No wonder it's so expensive. I don't hold much hope for the GTX 780, but I hope I'm wrong. I think I'd rather take the GTX 770 and run them in SLI if it's based on a better 680 chip with more memory.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Given the size of GK110 and its transistor count, the performance is pretty poor. It has double the transistors of the 680 but only really performs 30-40% better.

No wonder it's so expensive. I don't hold much hope for the GTX 780, but I hope I'm wrong. I think I'd rather take the GTX 770 and run them in SLI if it's based on a better 680 chip with more memory.
Titan is very efficient for as big as it is, though. It delivers more performance per watt than the 7970 or the 680.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Given the size of GK110 and its transistor count, the performance is pretty poor. It has double the transistors of the 680 but only really performs 30-40% better.

No wonder it's so expensive. I don't hold much hope for the GTX 780, but I hope I'm wrong. I think I'd rather take the GTX 770 and run them in SLI if it's based on a better 680 chip with more memory.

transistors ≠ performance

You pay for the DP.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
transistors ≠ performance

You pay for the DP.

So, without the added DP performance the Titan would be cheaper? What makes you think that? It's not like Nvidia is offering any professional-grade support to go along with the DP switch. I think they did it because Tahiti is very powerful in compute, and they wanted the K20's flagship sibling to have something to brag about.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Titan is very efficient for as big as it is, though. It delivers more performance per watt than the 7970 or the 680.

Do you think anyone dropping $1k on a GPU gives a toss about efficiency?

If I'm paying for 7 billion transistors, then I'd want 7 billion transistors pushing my games.

I run my 7970 at a 30% OC 24/7; to hell with efficiency.

Titan has 7.1 billion transistors vs. the 7970's 4.3 billion. That's about 65% more, but you only get 30-35% more FPS, and even less if the 7970 is overclocked.

Big, hot, expensive chips are not the future of graphics. We need smaller, cheaper GPUs that can run in parallel with less reliance on software to make it work.
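That scaling complaint in numbers (the transistor counts are public figures; the FPS delta is the poster's own estimate):

```python
titan_transistors = 7.1e9   # GK110 (Titan)
tahiti_transistors = 4.3e9  # Tahiti (HD 7970)

extra_transistors = titan_transistors / tahiti_transistors - 1
extra_fps = 0.325  # midpoint of the ~30-35% FPS estimate above

print(f"extra transistors: ~{extra_transistors:.0%}")  # ~65%
print(f"fraction of extra silicon showing up as FPS: "
      f"{extra_fps / extra_transistors:.2f}")          # ~0.50
```

Roughly half the extra silicon shows up as frame rate; the rest, as the surrounding posts note, is compute/DP hardware that does nothing for games.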
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
So, without the added DP performance the Titan would be cheaper? What makes you think that? It's not like Nvidia is offering any professional-grade support to go along with the DP switch. I think they did it because Tahiti is very powerful in compute, and they wanted the K20's flagship sibling to have something to brag about.

Yes, I think so. NV advertised the DP capability of Titan in their marketing slides too. Titan may be attractive to the select few who want DP but not ECC (however weird that is).
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Do you think anyone dropping $1k on a GPU gives a toss about efficiency?

If I'm paying for 7 billion transistors, then I'd want 7 billion transistors pushing my games.

I run my 7970 at a 30% OC 24/7; to hell with efficiency.

Titan has 7.1 billion transistors vs. the 7970's 4.3 billion. That's about 65% more, but you only get 30-35% more FPS, and even less if the 7970 is overclocked.

Big, hot, expensive chips are not the future of graphics. We need smaller, cheaper GPUs that can run in parallel with less reliance on software to make it work.
Way to miss the point. You were talking about how big the chip is, which usually results in a very inefficient, hot-running, power-hungry product. GK110's main purpose was not gaming, so of course many of those transistors do nothing for gaming.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yes, I think so. NV advertised the DP capability of Titan in their marketing slides too. Titan may be attractive to the select few who want DP but not ECC (however weird that is).

Almost everyone that buys it has absolutely no use for DP. They surely aren't paying extra for the card for that capability.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Almost everyone that buys it has absolutely no use for DP. They surely aren't paying extra for the card for that capability.

They are, even without (probably) knowing. People see that Titan is the fastest single-GPU card available and buy it without bothering with the specifications.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Way to miss the point. You were talking about how big the chip is, which usually results in a very inefficient, hot-running, power-hungry product. GK110's main purpose was not gaming, so of course many of those transistors do nothing for gaming.

Who cares?

If GK110 was made for something else, then don't sell it as a gaming GPU.

Who wants to pay supercomputer prices for a gaming chip?

This is the problem with leftover parts from other projects: they are not ideal for gamers.

Nvidia sounds like it's just cobbling together a rebadge-and-reject-bin GPU lineup for 2013, which it will likely want to sell for top dollar.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Who cares?

If GK110 was made for something else, then don't sell it as a gaming GPU.

Who wants to pay supercomputer prices for a gaming chip?

This is the problem with leftover parts from other projects: they are not ideal for gamers.

Nvidia sounds like it's just cobbling together a rebadge-and-reject-bin GPU lineup for 2013, which it will likely want to sell for top dollar.
YOU were the one that brought up the size of the damn GPU. I simply said it was very efficient for its size. Now you keep asking who cares about anything else? Well, who the hell cares about the size or number of transistors? Stop trying to make a new argument when all I said was that the massive chip was efficient.
 
Last edited: