[Guru3D] Galaxytech GeForce GTX 780 Ti Spec Sheet Leaks


moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I would also expect the Ti to have a better cooler, considering they skimped on the memory and it costs 25% more.

I'm sure the cooler is great and will work like the other cards from Nvidia. Skimping on memory for these cards is, well, typical of Nvidia. They know damn well that two of these cards with 6GB of RAM would carry you through another full generation and then some. So the low RAM is planned obsolescence in my eyes, and it works beautifully.
The cooler is also all about perceived value. Perceived value is so critical to a product, and it's almost never consciously thought about by the consumer. For instance, manufacturers sometimes add weight to a product to make it feel more expensive and substantial when that weight isn't otherwise needed. Other times they add big shiny heatsinks to RAM chips that do absolutely nothing. Or they include a shiny aluminum shroud on a video card to make it look like the luxury sports car of GPUs.
I'd take the 290X if I were in the market. Don't care about the noise. I use headphones anyway.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
So are you going to sell those 780s and get the big-daddy ultimate GK110 card, groover?

Not without a step-up via EVGA, which I can't do unless I swap these cards for reference cards, so I'm not. Nvidia has put all the high-end buyers on a never-ending treadmill through 28nm. How many 28nm flagships are they going to release?

I had Titans at 1200 and they were basically a hair slower than my current setup. Another SMX is going to give a minimal gain over cards clocked as high as mine are, and with only reference, locked-down designs available for the initial release, I'm not sure I could even get more performance out of 780 Ti SLI. I'm riding these cards out till 20nm big die; hopefully I'll be able to skip the next set of mid-range cards being sold as high end that we're sure to see next year.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
Not without a step-up via EVGA, which I can't do unless I swap these cards for reference cards, so I'm not. Nvidia has put all the high-end buyers on a never-ending treadmill through 28nm. How many 28nm flagships are they going to release?

I had Titans at 1200 and they were basically a hair slower than my current setup. Another SMX is going to give a minimal gain over cards clocked as high as mine are, and with only reference, locked-down designs available for the initial release, I'm not sure I could even get more performance out of 780 Ti SLI. I'm riding these cards out till 20nm big die; hopefully I'll be able to skip the next set of mid-range cards being sold as high end that we're sure to see next year.


Good strategy on their part. It worked, that's for sure. Nvidia's been playing their enthusiasts for fools, you know, the way they were meant to be played.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Tens shocked :eek: ;)

Enjoy lavaheadache!

I paid just about $500 each for both of my 780s, so I'm really not going to lose much money on them when I sell. Also, some sap just bought an Apple Thunderbolt display off me on eBay for twice what I thought I would get here.... Looking at almost nothing out of pocket.

That Apple monitor was part of a big score on Craigslist. I got 3 monitors from some lady locally for $650: an original 30-incher, a Thunderbolt 27 and a Thunderbolt 24. Needless to say I made some good coin reselling them.

I'm not sure if I'm going to go SLI Ti's or just one. I wanted to try out SLI this gen for fun, and while I am quite impressed, one card is plenty for nearly every game with a click or two down on the settings at 2560. Two cards just allow you to crank everything, at the cost of two cards.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
I'm sure the cooler is great and will work like the other cards from Nvidia. Skimping on memory for these cards is, well, typical of Nvidia. They know damn well that two of these cards with 6GB of RAM would carry you through another full generation and then some. So the low RAM is planned obsolescence in my eyes, and it works beautifully.
The cooler is also all about perceived value. Perceived value is so critical to a product, and it's almost never consciously thought about by the consumer. For instance, manufacturers sometimes add weight to a product to make it feel more expensive and substantial when that weight isn't otherwise needed. Other times they add big shiny heatsinks to RAM chips that do absolutely nothing. Or they include a shiny aluminum shroud on a video card to make it look like the luxury sports car of GPUs.
I'd take the 290X if I were in the market. Don't care about the noise. I use headphones anyway.

I don't know what the issue is with memory; every time AMD comes out with a card that's got more memory, we get the "such and such isn't enough" scenario. We just went from 2GB to 3GB, and now that the 290X has 4GB, suddenly BF4 needs 4GB... LOL...
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Like most people.
Again with that? Nearly every person can see tearing and can choose to let it bother them or not. That's different from your claim that you get no tearing. That's nonsense and means you have problems with your vision or neurological issues.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I don't know what the issue is with memory; every time AMD comes out with a card that's got more memory, we get the "such and such isn't enough" scenario. We just went from 2GB to 3GB, and now that the 290X has 4GB, suddenly BF4 needs 4GB... LOL...

Haha, come on man. That's funny actually, but I didn't suggest BF4 needed 4GB. I'm just not comfortable assuming 2GB can feed the game perfectly.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Again with that? Nearly every person can see tearing and can choose to let it bother them or not. That's different from your claim that you get no tearing. That's nonsense and means you have problems with your vision or neurological issues.

It's too bad a poll can't be done on this, because I personally think anyone suggesting vsync off doesn't produce tearing has to be severely lacking in terms of visual acuity. It's so apparent - it's the first thing I and all of my "gamer friends" mention when vsync is off: look at the tearing.

It's one thing to just let it fly and ignore it, but to say it's not there... uh... I DUNNO. I mean, why are we doing all of this adaptive vsync and G-Sync stuff if nobody noticed tearing? Come on. Needless to say, I agree with you on this one.
 
Last edited:

janii

Member
Nov 1, 2013
52
0
0
Haha, come on man. That's funny actually, but I didn't suggest BF4 needed 4GB. I'm just not comfortable assuming 2GB can feed the game perfectly.

I have the GTX 580 with (I think?) 1.5GB of RAM.

Everything maxed on my system, it was mostly over 40fps. Felt fluid anyway.
I was sitting on that "non-BF4-optimized" driver, and it was the beta build of BF4.
No idea how much RAM it used. 1200p monitor.

So what's the deal with all this RAM talk? I can't remember anyone caring about it when I built my system back in '10/'11 or so.

I only know that more RAM was better for mods (Skyrim).
 

tg2708

Senior member
May 23, 2013
687
20
81
I need to know which single graphics card currently available will give a minimum of 50-60 fps in every current-gen game with almost all settings maxed at 1080p. I'm asking because MSAA (FXAA is fine though) and the like are settings I can't touch if I want a very smooth-performing game. I have a BenQ 120Hz monitor, and that refresh rate is a godsend if I can keep my fps above 70. My 7970 performs adequately, no issue apart from just wanting to improve my minimum fps. Not steering away from SLI or CF, but I would like a single-GPU setup.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I play Battlefield 4 with vsync off and a framerate cap of 65. I see no tearing. If I turn off the frame cap then I get tearing constantly. You don't need to use vsync to eliminate tearing.
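For anyone wondering what a cap actually does mechanically, it's roughly this: the limiter holds each frame back until the target frame time has elapsed. A minimal sketch in Python (update_and_render is just a made-up stand-in for the game's per-frame work, not any real API):

[CODE]
import time

TARGET_FPS = 65                  # my cap
FRAME_TIME = 1.0 / TARGET_FPS    # ~15.4 ms budget per frame

def update_and_render():
    pass  # stand-in for the game's actual per-frame work

def capped_loop(frames=300):
    deadline = time.perf_counter()
    for _ in range(frames):
        update_and_render()
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # burn off the leftover budget
        else:
            deadline = time.perf_counter()  # overran the budget; rebase
[/CODE]

Strictly speaking a frame can still land mid-scanout, so a tear is still technically possible; the cap just makes them rare and subtle enough that I never notice one.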
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I play Battlefield 4 with vsync off and a framerate cap of 65. I see no tearing. If I turn off the frame cap then I get tearing constantly. You don't need to use vsync to eliminate tearing.

Curious, why not use adaptive vsync? I don't think I would bother with a 65 fps cap with a U3011 - that's me though. I just use adaptive vsync, forget it, and call it a day. I really love that feature.
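The whole trick, as I understand it (a rough sketch of the decision rule, not NVIDIA's actual driver code): sync to the refresh when you can hold it, drop sync when you can't, so you skip the fps halving that plain double-buffered vsync gives you on a missed refresh.

[CODE]
# Rough sketch of the adaptive vsync idea; REFRESH_HZ is an example value.
REFRESH_HZ = 60

def wait_for_vblank(current_fps: float) -> bool:
    # Sync (no tearing) only while we can hold the refresh rate;
    # otherwise run unsynced instead of dropping to 30 fps.
    return current_fps >= REFRESH_HZ

for fps in (75, 61, 59, 40):
    print(fps, "fps ->", "vsync ON" if wait_for_vblank(fps) else "vsync OFF")
[/CODE]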
 

Xed

Golden Member
Nov 15, 2003
1,452
0
71
Maybe Nvidia will throw us a bone and price the 3GB model at $599 and the 6GB at $699.

It could happen!
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Curious, why not use adaptive vsync? I don't think I would bother with a 65 fps cap with a U3011 - that's me though. I just use adaptive vsync, forget it, and call it a day. I really love that feature.

I can't play any online FPS with any sort of vsync; there is too much delay added.
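Back-of-the-envelope on where that delay comes from (simplified; real pipelines vary): with vsync a finished frame sits in the back buffer until the next vertical blank, so worst case it waits a full refresh interval, and more if the driver is queuing frames ahead.

[CODE]
# Simplified worst-case display delay added by vsync. Assumes each
# queued frame can wait up to one full refresh interval.
for hz in (60, 120, 144):
    interval_ms = 1000.0 / hz
    for queued in (1, 2, 3):  # frames buffered ahead of the display
        print(f"{hz} Hz, {queued} queued: up to ~{queued * interval_ms:.1f} ms extra")
[/CODE]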
 
May 13, 2009
12,333
612
126
I really can't see tearing. Also don't really notice AA either. I'm sure if someone pointed it out to me it'd bother me to no end. So I'll just stay ignorant of it.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Again with that? Nearly every person can see tearing and can choose to let it bother them or not. That's different from your claim that you get no tearing. That's nonsense and means you have problems with your vision or neurological issues.

Again with that? Only some people see tearing; it's not an issue for most people, and that is why it's an OPTION and not on in most games. Hell, GeForce Experience does not enable it in lots of my games.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Maybe Nvidia will throw us a bone and price the 3GB model at $599 and the 6GB at $699.

It could happen!

That would be awesome, no doubt; I would be willing to pay a little more for a 6GB version. They are probably not doing the 6GB version until Titan hits EOL; I would venture to guess they will eventually just sell the card with 6GB.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I really can't see tearing. Also don't really notice AA either. I'm sure if someone pointed it out to me it'd bother me to no end. So I'll just stay ignorant of it.

Please tell me how you cannot see this. This is Dead Space 3 at 144Hz, still tearing like mad because of the flickering lights, which gives a worst-case scenario. Even regular scene tearing when looking around is obvious to anyone with normal vision.

http://www.youtube.com/watch?v=Y3T6chyW2Vo


I guess imaheadcase will say he does not see tearing in the video or see it in the screenshot below.



[screenshot of a torn frame]
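If anyone wants to see why the tear is so visible, here's a crude model (made-up numbers; it assumes the display scans out top to bottom at a constant rate): with vsync off, each buffer flip lands partway through a refresh, and that point drifts every frame, so the tear line wanders around the screen.

[CODE]
# Crude model of where tear lines land with vsync off.
# Assumes scanout sweeps top to bottom over one refresh interval.
REFRESH_HZ = 144
RENDER_FPS = 200      # GPU flipping buffers faster than the refresh
SCREEN_LINES = 1080

refresh_t = 1.0 / REFRESH_HZ
frame_t = 1.0 / RENDER_FPS

for i in range(1, 9):
    flip_time = i * frame_t                       # when frame i is presented
    phase = (flip_time % refresh_t) / refresh_t   # fraction into the scanout
    print(f"frame {i}: tear at scanline ~{int(phase * SCREEN_LINES)}")
[/CODE]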
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126

Nintendesert

Diamond Member
Mar 28, 2010
7,761
5
0
Lol, qft.

Pretty great if these are the final specs, this is turning into a good ol' slugfest.



Yeah it is. I'm pretty excited about the new developments and the effort both sides seem to be putting into gaming again. I'm hoping audio takes off again; there's nothing like playing with good sound. And things like G-Sync look to really change the landscape for the better.

Unfortunately the fragmentation needs to be worked out and hopefully it will be.
 

ICDP

Senior member
Nov 15, 2012
707
0
0
It looks like the 780 Ti will trade blows with the R9 290X depending on resolution, with the Nvidia card being $150 more expensive. The GTX 780 Ti reference has a much better cooler; the R9 290X has a wider memory bus and 1GB more VRAM. Yay, bring on the GPU price wars... oh wait, as you were, Nvidia: charge more for similar performance. :rolleyes:

How can this be a great card at $200 more than a GTX 780 and not even 15% faster?
 
Last edited:

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
It's good because it's a new silicon stepping (probably clocks higher) and it has faster memory modules (which also clock higher). And it has the extra CUDA cores to deliver the full GK110 as intended. The price is high for those advancements; the only drawback will be the 3GB instead of 6GB. The 6GB and higher boost clocks are likely being held back for the higher-binned ASICs going into a 'Titan Ultra'. Looks like a sweet card to give the 290X a run for top HWBot scores.