The GTX 780, 770, 760 Ti Thread *First review leaked $700+?*


Xarick

Golden Member
May 17, 2006
1,199
1
76
Why would Galaxy allow their fan profile to push their cards above 70C if the card throttled at 70C? That wouldn't make any sense.
I have no problem with GPU-Z. It matches every other program I have monitored boost with.
When I get home I will download Afterburner, set this thing up, run well above 70, and post my graph.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I can guarantee you it does not throttle at 70C.
I have run Heaven well above 80C with no throttling.
I know many others who have seen the same. I have also spoken to the Galaxy rep, because they allow their card to go well above 70 in the fan profile, and they have guaranteed me it will not throttle. Since I have seen no evidence of throttling above 80C (my card is set to hit 1175 and I have monitored its boost through GPU-Z for over 30 minutes above 80C), I have to assume there is a disconnect in the logic that all cards throttle.
BS, because throttling at 70 and above is part of EVERY Kepler card with Boost 1.0.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Why would Galaxy allow their fan profile to push their cards above 70C if the card throttled at 70C? That wouldn't make any sense.
I have no problem with GPU-Z. It matches every other program I have monitored boost with.
When I get home I will download Afterburner, set this thing up, run well above 70, and post my graph.

It's part of Nvidia's intentional design to increase efficiency and to prevent excessive RMAs through customer abuse. It's also to protect your card from you. It's a nice system when you think about it. There is no Kepler card with boost that doesn't throttle; they all do - again, I'm not saying this with a negative connotation, just stating facts. There isn't a perceptible performance hit; it is only a one-bin throttle to maintain proper thermal characteristics.

Nvidia doesn't want tons of RMAs from folks running their cards at 95C 24/7, as was the case with some GTX 480 cards.
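
If it helps to see it spelled out, the behavior boils down to roughly this (just a sketch of the logic, not Nvidia's actual firmware; the 13MHz bin and 70C trip point are the values commonly reported for Boost 1.0 cards):

Code:
# Rough sketch of the one-bin thermal step-down described above.
# Not Nvidia's actual algorithm; the 13 MHz bin and 70 C trip point
# are just the values commonly reported for GK104 Boost 1.0 cards.
BIN_MHZ = 13
TRIP_C = 70

def boost_clock(max_boost_mhz, gpu_temp_c):
    """Return the effective boost clock after the thermal step-down."""
    if gpu_temp_c >= TRIP_C:
        # Drop a single bin (~13 MHz) to rein in thermals.
        return max_boost_mhz - BIN_MHZ
    return max_boost_mhz

print(boost_clock(1150, 65))  # 1150 - below the trip point, full boost
print(boost_clock(1150, 75))  # 1137 - one bin down, barely noticeable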
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
By default my card only boosts to 1136.7, which is low for Galaxy cards... and a sore point with me.
It only OCs to 1189, but I did these tests at default.


At the start
http://imageshack.us/f/825/beginningj.png/

At 75c
http://imageshack.us/f/22/62262742.png/

at 77c (I know it says 74, but it dipped for a second)
http://imageshack.us/f/824/97731509.png/


I don't throttle.
I couldn't get it above 80 though... I wonder if it does above 80.

I just gamed. BF3 on Ultra hit 74, but never dropped below 1137 according to AB.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I think what's happening with you is that 1137 is not your full boost below 70. My card sometimes takes a few minutes before it hits its final max boost. My card never goes above 70 though, so once I hit full boost I stay there if my card needs it. I play nearly every game with vsync on though, so I usually never hit full boost.

To be clear, it seems you are hitting 70C before you actually see your full boost anyway. It just looks like you are not throttling because you never hit that last little 13MHz boost bin before your card heats up.

And saying that others are not throttling, and that even Galaxy told you it won't throttle, is simply not true. And you can't even get to 80C now, where earlier you claimed it did not throttle either, which is not true.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
How likely would we be to see a GTX 790? If the GTX 780 is GK110, pretty unlikely right?
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
No, because when I use Precision to set a fan profile that keeps me at 65 or below, I still never go past 1137.
I hit 80 in Kombustor. Here, I will do more tests just for you.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
No, because when I use Precision to set a fan profile that keeps me at 65 or below, I still never go past 1137.
I hit 80 in Kombustor. Here, I will do more tests just for you.
lol, then congratulations on having the only Kepler card in existence that does not throttle.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
I get it, I get it...
So... more testing.
My card NEVER drops in MHz,
but it does lower the voltage at 73C.
I drop from 1.175V to 1.162V,
but still do not lose any performance.

I can even run 1175 at 1.162V.

Well guys, I stand corrected. I guess it does throttle, but the way my card is designed, I never see the throttle at 70 because it can run my boost at a lower voltage, and it can even run +40 at a lower voltage.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,116
136
They do throttle, but in different ways. GK110 uses Boost 2.0, while the 600s use Boost 1.0. The original point remains: thermal throttling is present on any and every GTX 600 card; it is programmed into the drivers and is ingrained into the chip itself. Any assertion of a 600 card (with boost) not throttling at the determined presets (70C, 80C) is wrong - it cannot be avoided because it's part of GK104's core logic. (Aside from the MSI Lightning series, maybe EVBot Classifieds from EVGA - but Nvidia put an

I wasn't disagreeing with the fact that GK110s throttle; I just recalled reading somewhere that they didn't throttle @ 80C. Now, since I can't find that article, my point is unsubstantiated - I'll take your word for it for now.
 

Wreckem

Diamond Member
Sep 23, 2006
9,564
1,150
126
I was not a fan of boost when I owned a 670.

It's amazing you guys defend boost when we all know it's crap.

Give me unlocked voltage and a slider for the clocks.

I understand where Nvidia is coming from, and they shouldn't have to suffer the financial consequences of people overclocking and ruining their cards and then claiming warranties.

What they should do is offer the option to unlock a card, but have that unlocking void all warranties.
 

iMacmatician

Member
Oct 4, 2012
88
0
66
youtube.com
How likely would we be to see a GTX 790? If the GTX 780 is GK110, pretty unlikely right?
Even Fermi got a dual-chip card for < 375 W so I think a dual-GK110 card is feasible for 375 W. I'm not sure it will be called the 790 though, more likely "Titan II" or something like that. I feel like any well binned chips (that seem to be wanted for a dual card) would first go to a future Tesla upgrade though.

I've been wondering about the possibility of a second dual-GK104 card with 8 GB memory, 1.05-1.1 GHz core, possibly > 6 Gbps memory, and 300 W to replace the 690 (something like dual 770s).
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Even Fermi got a dual-chip card for < 375 W so I think a dual-GK110 card is feasible for 375 W. I'm not sure it will be called the 790 though, more likely "Titan II" or something like that. I feel like any well binned chips (that seem to be wanted for a dual card) would first go to a future Tesla upgrade though.

I've been wondering about the possibility of a second dual-GK104 card with 8 GB memory, 1.05-1.1 GHz core, possibly > 6 Gbps memory, and 300 W to replace the 690 (something like dual 770s).

Yeah, the 2GB-per-GPU memory limit on the GTX 690 makes it a pretty useless card for the resolutions that need that kind of GPU power.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Even Fermi got a dual-chip card for < 375 W so I think a dual-GK110 card is feasible for 375 W. I'm not sure it will be called the 790 though, more likely "Titan II" or something like that. I feel like any well binned chips (that seem to be wanted for a dual card) would first go to a future Tesla upgrade though.

I've been wondering about the possibility of a second dual-GK104 card with 8 GB memory, 1.05-1.1 GHz core, possibly > 6 Gbps memory, and 300 W to replace the 690 (something like dual 770s).

770 ~== 680
690 = 680 x 2
790 != 690

I doubt they'd just bump the clocks up a touch.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Well, back to the exact topic at hand... I'm fearing Nvidia's greed. $649 for a GK110-based GTX 780 is back to GTX 280 release-day prices... I was expecting/hoping for a $599 release, but now that notion is seemingly a long shot. Even $649 sounds like it might actually be lowballing what Nvidia wants to come in at. :( In the past I had no qualms paying an extra $10-20 for an equivalent-performing Nvidia part over AMD for personal reasons, but with AMD's stellar bundles as of late and Nvidia's (rumored) upcoming high prices, any goodwill that has been built up may be thrown completely out the window. Thankfully I'm in no position to upgrade until both camps release their next gen, and as always with next gen there is a clean slate to work from, but as things are going it looks like AMD is trying much harder to earn my money than Nvidia.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
So how fast over a 680? I paid $550 for mine back around October/November. Titan is roughly ~35% faster for $1K, so if the 780 is ~25% faster for $650, that's not too bad.

How fast do you think?
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
So how fast over a 680? I paid $550 for mine back around October/November. Titan is roughly ~35% faster for $1K, so if the 780 is ~25% faster for $650, that's not too bad.

How fast do you think?

I doubt we'll see a 10% performance difference between the 780 and Titan for a $350 price difference. I'd expect at least a 20-25% delta between the two for those prices. Maybe something like 770 (680 performance) for $400-450, 780 (680 + 20%) for $650, and Titan (680 + 40%) for $1000.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I doubt we'll see a 10% performance difference between the 780 and Titan for a $350 price difference. I'd expect at least a 20-25% delta between the two for those prices. Maybe something like 770 (680 performance) for $400-450, 780 (680 + 20%) for $650, and Titan (680 + 40%) for $1000.

It's simple math: a GK110 chip with a 320-bit bus and an 850MHz boost will be about 15% slower than Titan. I am guessing on the core speed, but that guess can't possibly be far off.
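
For anyone who wants the back-of-the-envelope version (Titan's ~876MHz boost and 384-bit bus are my assumed reference figures; the 780 numbers are the rumored ones):

Code:
# Rough scaling estimate vs. Titan. Assumed Titan reference: ~876 MHz boost,
# 384-bit bus. GTX 780 figures are the rumored 850 MHz / 320-bit.
titan_boost_mhz, titan_bus_bits = 876, 384
gk110_boost_mhz, gk110_bus_bits = 850, 320

clock_ratio = gk110_boost_mhz / titan_boost_mhz     # ~0.97 -> ~3% lower clock
bandwidth_ratio = gk110_bus_bits / titan_bus_bits   # ~0.83 -> ~17% less bandwidth

# Real games land somewhere between the two limits, which is where the
# "about 15% slower" ballpark comes from.
print(f"clock ratio {clock_ratio:.2f}, bandwidth ratio {bandwidth_ratio:.2f}")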
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
How likely would we be to see a GTX 790? If the GTX 780 is GK110, pretty unlikely right?
Depends on the TDP of the GTX 780. If it's 235W as rumored, we could maybe see two 780s as a GTX 790. The GTX 680 had a TDP of 195W; two of them together would make 390W, but Nvidia managed to squeeze it down to 300W for the GTX 690.

So 2 × 780 would make 470W, but Nvidia can squeeze it down to 390W, and it would have about the same TDP as the 7990 and GTX 590. So it can absolutely happen.

After all, it's another year until Maxwell is here, so they might push it out to offer a new product after some months.
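
Spelled out, using the 690 as the yardstick (just my arithmetic on the rumored numbers):

Code:
# How much Nvidia trimmed the GTX 690 relative to two full GTX 680s,
# applied to two of the rumored 235 W GTX 780 chips.
gtx680_tdp_w, gtx690_tdp_w = 195, 300
scale = gtx690_tdp_w / (2 * gtx680_tdp_w)        # ~0.77

gtx780_tdp_w = 235                               # rumored
dual_780_estimate_w = 2 * gtx780_tdp_w * scale   # ~361 W

print(f"{dual_780_estimate_w:.0f} W")  # inside a 375 W dual-card envelope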
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
How likely would we be to see a GTX 790? If the GTX 780 is GK110, pretty unlikely right?

It could definitely happen. Titan draws nearly the same power as a GTX 580 under load, and Nvidia has proven they can manage the heat with excellent reference coolers. The question would be whether the card would have 10GB of VRAM (assuming a 320-bit bus), or if it would have 5GB of VRAM (2.5GB effective per GPU). 10GB of VRAM on a single card might be logistically unfeasible. Also, the price would be absolutely, positively, without a doubt, stupid.
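
The VRAM options fall straight out of the bus width if you assume the usual 2Gbit (256MB) GDDR5 chips, 32 bits each:

Code:
# A 320-bit bus is ten 32-bit GDDR5 channels. With the usual 2 Gbit (256 MB)
# chips that's 2.5 GB per GPU, 5 GB in clamshell mode, and 10 GB total for a
# dual-GPU card running clamshell on both GPUs.
bus_width_bits = 320
chip_width_bits = 32
chip_capacity_gb = 0.25                             # 2 Gbit chip

chips_per_gpu = bus_width_bits // chip_width_bits   # 10
vram_per_gpu = chips_per_gpu * chip_capacity_gb     # 2.5 GB
vram_clamshell = 2 * vram_per_gpu                   # 5.0 GB
dual_card_total = 2 * vram_clamshell                # 10.0 GB

print(vram_per_gpu, vram_clamshell, dual_card_total)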
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The rumors for this card are all over the place. They indicate either a 320-bit or a 384-bit bus, and 5GB or 3GB of VRAM depending on the bus size. I'd say if the GTX 780 ends up at 2.5-3GB of VRAM at a cost of no more than $650, preferably $550-600, then I think folks won't complain about the price an awful lot. It's high, but not awfully exorbitant.

There's another rumor indicating $800. Now, I realize Nvidia can do whatever they want, but I feel that's just exorbitant. Some will still buy, as Nvidia has a lot of fans of course, but it definitely won't win them any goodwill. That's just too much for the x80 part IMHO. They already have the Titan for folks wanting to part with an arm and a leg; why put the x80 in the same stratosphere?

Personally I think the former is the preferable approach. Make it 384-bit with 3GB and 6GB configurations which can be chosen by the user; 5GB is far too much for the typical user. You can say "future-proof", but we don't have games yet that can make adequate use of that much VRAM unless the end user tries awfully hard to use 17 gabillion mods and 8x SGSSAA at a single-screen resolution. Surround can use 5GB, and while I like surround, most users just aren't using surround, period.

So yeah... I think the best price point would be around $600 for a 3GB card, and higher for a 6GB version. They already have Titan, as mentioned, for those who want something in the stratosphere. Assuming 384-bit as some rumors indicate, the latest SweClockers references suggest it may indeed be 384-bit with fewer CUDA cores and less VRAM than Titan.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
They've already stated that the Titan was intended to be a niche product, just like the 690, and that's why it's so expensive. I really doubt they'll let it become the new normal.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
They've already stated that the Titan was intended to be a niche product, just like the 690, and that's why it's so expensive. I really doubt they'll let it become the new normal.
So you're saying that anytime we finally get a GPU worth upgrading to, they will call it a niche product and slap a $1000-or-more price tag on it? By that I mean the Titan was the first significant single-GPU upgrade since the 2.5-year-old GTX 580, which was already 500 bucks when it launched.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
So you're saying that anytime we finally get a GPU worth upgrading to, they will call it a niche product and slap a $1000-or-more price tag on it? By that I mean the Titan was the first significant single-GPU upgrade since the 2.5-year-old GTX 580, which was already 500 bucks when it launched.
A lot of us have been enjoying Titan-level performance on 7970s and GTX 680s for almost a year and a half. It's hardly significant.

NVIDIA made a nice cash-grab move and some people bought it, the significance of which is hard to determine without actual sales figures.
 