Is a GTX 780 a future proof GPU?


Belial88

Senior member
Feb 25, 2011
261
0
0
Hello everyone! Well, I was wondering exactly that. Is a GTX 780 a future-proof choice? Since the new generation of consoles is arriving with 8 GB of GDDR5 RAM shared between the system and the GPU, I'm very worried about being limited in next-gen game titles. I had a GTX 770 DCU2 which I was planning to SLI with another, but I received replies here that my i5 4670 non-K could bottleneck both cards, so when I had a chance to sell my 770 I did. Now I have the chance to buy an ASUS GTX 780 DCU2, and what stopped me is that I've been thinking about next-gen games: only 3GB of VRAM, and it doesn't even support full DX11.1. I'd hate to spend $749 and find that after 6 months the card can't max out a next-gen game.
What do you think, guys? Titan is not a solution, no way I'm spending $1000 on that. Should I take the 780, or wait... months? Years maybe? Lol, thanks in advance.

The new-gen consoles won't use anywhere close to 8GB of VRAM, and they aren't really coming out with 8GB of VRAM anyway; they have 8GB of RAM total and use shared memory, like an APU or iGPU. In short, the new-gen consoles are going to be about as strong as a Phenom X4 + GTX 470, more or less (the Xbox's GPU is weaker, more like a 460), or an i3 + 7770 for a more modern equivalent. Something with that little power is hardly going to be setting the standards for future PC games.

We saw the same thing happen when the Xbox 360 and PS3 launched. "Omg, buy a Phenom X6, new games are all going to use 6 cores like the new consoles!"

To date, we have only four AAA games that use 6 threads - BF3, FC3, BioShock Infinite and Crysis 3 - and all of them still prefer stronger IPC over 6 threads (i.e. a stronger quad-core like an i5 > Phenom X6/FX-6300 or even FX-8350), all of them are much, much more GPU dependent than CPU dependent (i.e. an i3 + 7950 will way outperform an i7 + 7850 system), and only the last of them dips below 100 FPS even with an i5.

Furthermore, if all you do is play, say, StarCraft II for the next 10 years, then your GTS 450/GT 640/7750 will easily last 10 years. It's all about the applications you use.

The idea of 'future proof' is just retarded. Computer components, especially graphics cards, are practically liquid assets - like cash, they sell off easily. Buy the exact graphics card you need for the games you play today, and if you need more later, sell it off and buy a new one. Every extra dollar you spend on unnecessary 'future proofing' means a dollar less in the rest of your system, where the performance would've made a difference. Your monitor, your sound, etc. - lots of things can be upgraded, and those can all be resold too.

So don't buy the 780 if you don't need it. Get a 7950 instead, sell your monitor, and put the money you saved toward an even better one.

Most games hardly use more than 2GB of VRAM, so don't worry about VRAM. You can easily Google how much VRAM a given game uses, or use a program like HWiNFO or RivaTuner to see how much VRAM you're currently using. VRAM doesn't really have a significant impact on your FPS until you're about 500MB or more short of what a game wants. You really only need to worry about it if you play at larger resolutions, like 1200p or 1440p.
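If you'd rather measure it yourself than Google it, here's a minimal sketch of reading current VRAM usage through NVIDIA's NVML library - the same counter that tools like HWiNFO expose. It assumes an NVIDIA card plus the NVML header/library from the CUDA toolkit and driver, and GPU index 0 is just my assumption for a single-card system:

#include <cstdio>
#include <nvml.h>   // NVIDIA Management Library (header from the CUDA toolkit, library from the driver)

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;            // attach to the driver

    nvmlDevice_t gpu;
    nvmlMemory_t mem;
    // GPU index 0 is an assumption - change it if you run more than one card.
    if (nvmlDeviceGetHandleByIndex(0, &gpu) == NVML_SUCCESS &&
        nvmlDeviceGetMemoryInfo(gpu, &mem) == NVML_SUCCESS) {
        // NVML reports bytes; convert to MB for readability.
        std::printf("VRAM in use: %llu MB of %llu MB\n",
                    (unsigned long long)(mem.used  / (1024 * 1024)),
                    (unsigned long long)(mem.total / (1024 * 1024)));
    }
    nvmlShutdown();
    return 0;
}

Run it (or just watch HWiNFO) while your game is running and compare against the card's 3GB; per the rule of thumb above, you only need to care once you're several hundred MB short.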

I read the news about AMD and their new GPUs coming in September that will compete with the Titan.

yea! Just like Ivy was going to be this huge jump over Sandy! Oh, and remember Bulldozer? Smashing success! And dang, I'm so glad I waited for the 7xx series!

Next gen will never be a huge step up. As for AMD GPUs competing with Titan, sure, and with a similar price tag too. Titan is still seen as ridiculous overkill for 99.9% of gamers anyway; it'll be a while before Titan-level performance is really at an affordable price or worth getting for most gamers.

My point is: don't wait on new tech. Like I said, you can always sell what you've got and buy the new thing. People around here act like when you build a custom computer you can never swap out the parts, and that you must throw them away afterwards or something.

If you bought, say, a GTX 460 three years ago for $100 and you sold it today for $60, you basically 'rented' it for $40, which is a damn good deal. Put that $60 toward, say, a 7850 you buy for $150 - that's $90 out of pocket for a card you'll sell off in the future and make most of it back on - and you end up paying very little.
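To put rough numbers on that, here's a tiny sketch of the same 'renting' math, using only the example prices from this post (they are illustrative figures, not real market data):

#include <cstdio>

int main() {
    // Example figures from the post above - not real market prices.
    const int gtx460Buy  = 100;   // paid 3 years ago
    const int gtx460Sell = 60;    // sold today
    const int hd7850Buy  = 150;   // the replacement card

    std::printf("GTX 460 effectively 'rented' for $%d over 3 years\n",
                gtx460Buy - gtx460Sell);                    // $40
    std::printf("HD 7850 costs $%d out of pocket after the trade-in\n",
                hd7850Buy - gtx460Sell);                    // $90
    // Most of that $90 comes back again when the 7850 is sold in its turn.
    return 0;
}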

All these computer components... at the same time, they will never be future proof, and they will always be future proof. A Phenom X4 + GTX 460 system bought for $350 three years ago will still play Crysis 3 or any other top-tier game on decent settings, for example, and will still destroy the vast majority of games.

Buying a 780 to be 'future proof' is just about the dumbest thing you can do. No matter what, in a few years, or even a few months, there will be a better purchase for the dollar. If you really want 'future proof'... buy on Black Friday. Just get what you need, and sell it off and replace it as necessary. You have a custom computer; the whole point of it is being able to swap out parts.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
OP, what's your current card? At 1080p an HD 7950 OC is more than enough; you can max any game at 1080p with an HD 7950 OC'd to 1100-1150 MHz. There are some crazy-ass deals for USD 250.

http://www.amazon.com/Sapphire-Radeo...dp/B00CPLI74S/

And by next July you could buy 20nm GTX 880 GPUs which will kick Titan in the nuts for USD 500-550.

Frankly, Nvidia's pricing at the high end is awful due to the lack of competition, which should change in October. In early-to-mid October the Hawaii HD 9970 is going to launch with a BF4 game bundle; the press event is Sep 25th, and usually about 2 weeks after the press event products launch at retail with reviews going live. Also, BF4 uses DX11.1 to improve performance, though you need Win 8 for that.

http://forums.anandtech.com/showthread.php?t=2333042

think and decide. :thumbsup:
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Hello everyone! Well, I was wondering exactly that. Is a GTX 780 a future-proof choice? Since the new generation of consoles is arriving with 8 GB of GDDR5 RAM shared between the system and the GPU, I'm very worried about being limited in next-gen game titles.

Future-proofing for next-gen games with a 780 is a waste of time. PS4/XB1 will last 7-8 years. The best way to future-proof is to keep upgrading from 28nm to 20nm to 14-16nm, etc. Buying a $650 card and holding it for 5-6 years is the worst strategy ever. If you cannot afford to buy $650 GPUs every 2 years, and do not care to resell, just buy what you need for today's games and upgrade every 2 years to something faster.

We are a few months away from products based on the next process which will have a genuine performance increase.

TSMC reports that 20nm production won't start in volume until Q1 2014. We are not a few months away. More like 6-9 months away. It is only August 2013. 20nm flagship GPUs probably won't launch until May-June 2014 at the earliest.

780 over 770 is a useless upgrade.

There is a larger performance difference between a 780 OC and a 770 OC than there is between a 770 OC and a 760 OC. If you say upgrading from 770 to 780 is useless, then upgrading from 760 to 770 is an even bigger waste of $.

The cards that make the most sense right now are the 760/7950 V2, 1GHz 7970 and 780. The 770 makes the least sense out of all the options on the market, especially the $450 4GB versions. Those are a total waste of $. With the 780 you are paying for a card that in after-market form has > Titan performance. With a 770 4GB, you are paying $100 more than 7970 cards that are barely 5-6% slower. The 780 at least represents the next level of performance beyond the 770/7970.
 
Last edited:

zlatan

Senior member
Mar 15, 2011
580
291
136
Hello everyone! Well, I was wondering exactly that. Is a GTX 780 a future-proof choice? Since the new generation of consoles is arriving with 8 GB of GDDR5 RAM shared between the system and the GPU, I'm very worried about being limited in next-gen game titles. I had a GTX 770 DCU2 which I was planning to SLI with another, but I received replies here that my i5 4670 non-K could bottleneck both cards, so when I had a chance to sell my 770 I did. Now I have the chance to buy an ASUS GTX 780 DCU2, and what stopped me is that I've been thinking about next-gen games: only 3GB of VRAM, and it doesn't even support full DX11.1. I'd hate to spend $749 and find that after 6 months the card can't max out a next-gen game.
What do you think, guys? Titan is not a solution, no way I'm spending $1000 on that. Should I take the 780, or wait... months? Years maybe? Lol, thanks in advance.
The memory won't be a problem. 3 GB is plenty for the next-gen titles.
The 8 supported UAV slots could be a problem, though. The next-gen consoles have effectively unlimited UAVs, so there is a chance that the main graphics pipeline in next-gen games won't be portable to PC at all, or will only be portable to DX11.1, which provides 64 UAVs.
Tiled Resources support in DX11.2 is another potential problem. The Tier 1 implementation doesn't support the shader feedback instructions, and the Kepler-based GeForce cards only support that tier.
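For the curious, here's a minimal sketch of how a PC game or tool can query exactly these two things through the D3D11 API: whether the device runs at feature level 11_1 (which is what raises the UAV slot count from 8 to 64) and which Tiled Resources tier the driver exposes under DX11.2. It assumes an already-created ID3D11Device and the Windows 8.1 SDK headers; the helper name is just for illustration:

#include <cstdio>
#include <d3d11_2.h>   // Windows 8.1 SDK header with the DX11.2 options struct

void ReportCapabilities(ID3D11Device* device) {
    // Feature level 11_1 is what provides the 64 UAV slots discussed above.
    D3D_FEATURE_LEVEL fl = device->GetFeatureLevel();
    std::printf("Feature level 11_1: %s\n",
                fl >= D3D_FEATURE_LEVEL_11_1 ? "yes (64 UAV slots)" : "no (8 UAV slots)");

    // Tiled Resources tier is a DX11.2 query (needs the 11.2 runtime, i.e. Windows 8.1).
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts1 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                              &opts1, sizeof(opts1)))) {
        switch (opts1.TiledResourcesTier) {
            case D3D11_TILED_RESOURCES_NOT_SUPPORTED: std::printf("Tiled Resources: not supported\n"); break;
            case D3D11_TILED_RESOURCES_TIER_1:        std::printf("Tiled Resources: Tier 1\n");        break;
            default:                                  std::printf("Tiled Resources: Tier 2\n");        break;
        }
    }
}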
 
Last edited:

zlatan

Senior member
Mar 15, 2011
580
291
136
Also, the 700 series are all DirectX 11.1 compatible at the hardware level, with the exception of a few non-gaming features.
I said before that this statement is bullshit. The 64 supported UAV slots and UAV access in non-pixel-shader stages are definitely gaming features.
The availability of shader feedback instructions for tiled textures is also a gaming-related feature.

People must understand that NVIDIA is a profit-oriented company, and they simply can't say that these are the most useful gaming features in DX11.1/11.2, because potential customers would then buy GCN-based Radeons. It's more profitable to lie about these things.
 
Last edited:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Future-proofing for next-gen games with a 780 is a waste of time. PS4/XB1 will last 7-8 years. The best way to future-proof is to keep upgrading from 28nm to 20nm to 14-16nm, etc. Buying a $650 card and holding it for 5-6 years is the worst strategy ever. If you cannot afford to buy $650 GPUs every 2 years, and do not care to resell, just buy what you need for today's games and upgrade every 2 years to something faster.



TSMC reports that 20nm production won't start in volume until Q1 2014. We are not a few months away. More like 6-9 months away. It is only August 2013. 20nm flagship GPUs probably won't launch until May-June 2014 at the earliest.



There is a larger performance difference between a 780 OC and a 770 OC than there is between a 770 OC and a 760 OC. If you say upgrading from 770 to 780 is useless, then upgrading from 760 to 770 is an even bigger waste of $.

The cards that make the most sense right now are the 760/7950 V2, 1GHz 7970 and 780. The 770 makes the least sense out of all the options on the market, especially the $450 4GB versions. Those are a total waste of $. With the 780 you are paying for a card that in after-market form has > Titan performance. With a 770 4GB, you are paying $100 more than 7970 cards that are barely 5-6% slower. The 780 at least represents the next level of performance beyond the 770/7970.

I agree. I believe the OP already had a 770; I doubt he will see any benefit going to a 780 at 1080p. I think a 770 is plenty for that res, save a few games. People who already purchased a 760 should buy another, which will make their setup even faster than a Titan.
 

seitur

Senior member
Jul 12, 2013
383
1
81
You'll have to specify what exactly 'future proof' means to YOU.

I personally would not buy it.
 

BigChickenJim

Senior member
Jul 1, 2013
239
0
0
OP already bought the 780. There's an important lesson to be learned here: take upgrade suggestions, even the ones here, with a grain of salt. He's now had to pay far, far more than he needed to for a card that will show no noticeable gains over his old setup for quite some time simply because he was given bad information. That's a real bummer.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I don't know if anyone else mentioned this, but don't think for a second that 8GB on the new consoles means 8GB for the GPU.

Microsoft already said 3.5GB is reserved for the OS and Kinect functions. And Sony is tip-toeing around a similar statement, claiming it is entirely up to the developers, but something like 2GB is dedicated to the OS and system, and there is 1GB (or 512MB) considered Flex Memory which the devs can choose to use (or not).

To be practical, I'd say these consoles are going to have anywhere from 4-5GB available to the GPU, and that doesn't even mean games will use it fully.

Plus, that pool has to serve as system RAM too - games allocate memory for physics calculations and whatever other non-shader-heavy work. They've got 8GB total for GPU + CPU + system.
 

Spjut

Senior member
Apr 9, 2011
931
160
106
The memory won't be a problem. 3 GB is plenty for the next-gen titles.
The 8 supported UAV slots could be a problem, though. The next-gen consoles have effectively unlimited UAVs, so there is a chance that the main graphics pipeline in next-gen games won't be portable to PC at all, or will only be portable to DX11.1, which provides 64 UAVs.
Tiled Resources support in DX11.2 is another potential problem. The Tier 1 implementation doesn't support the shader feedback instructions, and the Kepler-based GeForce cards only support that tier.

Is DX11.1/11.2 really that much of a problem though, given that Windows 7 is by far the biggest OS and only fully supports DX11.0?
This generation we saw plenty of PC exclusives that required DX10 cards but still only supported DX9.

Anyway, thanks for going a little technical
 
Jun 23, 2013
95
0
66
Well, future-proof for me means something that can give good performance even after a few years. I don't really keep the same hardware too long, I upgrade very often, but this time I want something that lasts longer, at least a year and a half. I'll get a PS4 for sure as well; consoles are crap, but I must admit the best exclusives are often for PlayStation users. I also own a PS3 which I'm not selling yet. I'm waiting for the MGS compilation, and playing Ground Zeroes and The Phantom Pain on the PS4 is a MUST for me since I'm a fan of the MGS saga. But I love gaming on the PC online (BF3, Arma 3, etc., and many single-player titles). It probably wasn't a good move (again), but I hope I can enjoy it a bit. I chose ASUS because of its looks; I like good-looking hardware. When my GPU arrives I'll check how well it performs against the newest games, and I'll see how long it lasts in my hands. An ASUS GTX 780 is something almost impossible to get where I live because of its price, so I can sell it after a year and get good money for it. Thank you guys again. I wasn't expecting many answers and wow, you all really taught me a lot here. Next time it will be a smart move, I swear, lol. My wallet will cry this weekend.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Well, future-proof for me means something that can give good performance even after a few years. I don't really keep the same hardware too long, I upgrade very often, but this time I want something that lasts longer, at least a year and a half. I'll get a PS4 for sure as well; consoles are crap, but I must admit the best exclusives are often for PlayStation users. I also own a PS3 which I'm not selling yet. I'm waiting for the MGS compilation, and playing Ground Zeroes and The Phantom Pain on the PS4 is a MUST for me since I'm a fan of the MGS saga. But I love gaming on the PC online (BF3, Arma 3, etc., and many single-player titles). It probably wasn't a good move (again), but I hope I can enjoy it a bit. I chose ASUS because of its looks; I like good-looking hardware. When my GPU arrives I'll check how well it performs against the newest games, and I'll see how long it lasts in my hands. An ASUS GTX 780 is something almost impossible to get where I live because of its price, so I can sell it after a year and get good money for it. Thank you guys again. I wasn't expecting many answers and wow, you all really taught me a lot here. Next time it will be a smart move, I swear, lol. My wallet will cry this weekend.

If this gen is anything like last gen: I used an HD 5870 for 2 years and only ran into bottlenecks at the end of 2011, when more tessellation-heavy games started to appear.

My first HD 7970 is going to be hitting two years in about 5 months, and so far I still haven't run into issues (I got a second one just because I'm stupid, haha).

I'm pretty sure a current top tier GPU will give you 1.5 years without issue.
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
My first HD 7970 is going to be hitting two years in about 5 months, and so far I still haven't run into issues (I got a second one just because I'm stupid, haha).
I would not say you are stupid for buying a second HD 7970; all that means is that you want all the eye candy maxed while still maintaining a minimum framerate of 60fps+, which is ultimately what most of us gamers are after.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Well hey, OP, enjoy it. I know how hard it is to play the "wait and see" game for new hardware when you have the money now. Asus makes great cards, with very robust PCBs and cooling. I prefer MSI, but only for very minor reasons over Asus.
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
Is DX11.1/11.2 really that much of a problem though, given that Windows 7 is by far the biggest OS and only fully supports DX11.0?
This generation we saw plenty of PC exclusives that required DX10 cards but still only supported DX9.

Anyway, thanks for going a little technical
DX10 was a flop. The .1 iterations of DX are nothing to behold; in fact, they are mostly just marketing tools and a side show to the real deal, which is DX11 and DX12. You are not missing out on anything if you can't run DX11.1, and it will not be supported very well by game developers anyway - it will only show up in a few games, much like PhysX.
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
I don't know if anyone else mentioned this, but don't think for a second that 8GB on the new consoles means 8GB for the GPU.
A low-power APU does not have the power to fill up a 1GB frame buffer, let alone anything more. People seem to think that VRAM determines how powerful the card is, when in fact the GPU can only handle so much before it bogs down. For instance, an HD 7850, which is a fairly robust graphics card, only needs 1GB of VRAM because in reality that's all it can effectively handle before the GPU bottlenecks and creates slowdowns.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
A low-power APU does not have the power to fill up a 1GB frame buffer, let alone anything more. People seem to think that VRAM determines how powerful the card is, when in fact the GPU can only handle so much before it bogs down. For instance, an HD 7850, which is a fairly robust graphics card, only needs 1GB of VRAM because in reality that's all it can effectively handle before the GPU bottlenecks and creates slowdowns.

Many launch games are already using 3GB for the GPU. Killzone on the PS4 is one of them.
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
Many launch games are already using 3GB for the GPU. Killzone on the PS4 is one of them.
Funny, because I never have any slowdowns or anything; even when the framebuffer on my HD 7850 1GB fills up, it still performs just as an HD 7850 with 2GB should. Where did you hear that Killzone will use 3GB of VRAM? That weak GPU/APU in the consoles can never fill it up without the graphics bogging down the GPU first.
 

NIGELG

Senior member
Nov 4, 2009
852
31
91
The 780 is a good card to have for a couple of years, so yes, it is future proof.
 
Jun 23, 2013
95
0
66
Guys, I just made a new decision: I canceled the order on Amazon and bought 2x EVGA GTX 760 ACX 4GB GDDR5 cards to run 2-way SLI. Now I think that's way better and more future-proof than a single GTX 780.
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
Guys, I just made a new decision: I canceled the order on Amazon and bought 2x EVGA GTX 760 ACX 4GB GDDR5 cards to run 2-way SLI. Now I think that's way better and more future-proof than a single GTX 780.
More VRAM does not make it better or more powerful. 'Future proof' is a term that is not well received in the PC world because it is an outright lie. Metro: Last Light is arguably the best-looking game of 2013 and it uses less than 1GB of VRAM even at 2560x1440. VRAM is not the way of the future; optimization is.
 
Jun 23, 2013
95
0
66
More VRAM does not make it better or more powerful. 'Future proof' is a term that is not well received in the PC world because it is an outright lie. Metro: Last Light is arguably the best-looking game of 2013 and it uses less than 1GB of VRAM even at 2560x1440. VRAM is not the way of the future; optimization is.

Agreed. I went with them just to be safe; it was way cheaper than the 780, and both of them will probably look outstanding with the Sabertooth Z87.

Everyone here is right about the future-proof stuff. I can afford to change the rig every two years since I work really TOO hard from Monday to Friday. So I'll leave the rig with those two 4GB 760s and focus on the PS4. PC and console are the best combo: the PC to play the best stuff out there (FPS, MMO games and fun stuff online), and the console for the exclusives and playing multiplayer with a friend.

As you can see, guys, I'm very impatient and a bit stupid... I've wasted so much money on tech! But I love this stuff, and the 760 SLI will be my first SLI setup ever.

I invite you all to keep discussing this topic - it's a good way to teach people how to stop wasting money, lol. It didn't work too well for me, but I'm really happy with all your responses! :) It's cool that you all can understand me just fine; I'm still learning English.
 
Last edited:

desprado

Golden Member
Jul 16, 2013
1,645
0
0
First of all, the GTX 780 is much better than a 7950.
Second, I had a 7950 and I couldn't even max things out on that card at 1080p - for example, games like Max Payne 3, Metro 2033, Metro LL, Crysis 2 and 3, etc. If I enabled MSAA or even SMAA, the performance was a huge disaster.

Great choice. GTX 760 SLI > GTX 780.