GTX 780 Ti in 2016: Benchmark vs. GTX 1080, 1070, RX 480, More

Status
Not open for further replies.

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
290/290X not included, mainly compared to modern options

There's a 390X in there, which is a 290X @ 1100MHz. At least it's a valid datapoint.

Wow... the 390X demolishes the 780 Ti in performance now, and that 390X is using architecture from the same generation as the 780 Ti.

In the 780 Ti's defense, it's at stock clocks and throttling (nV's shiny stock blower doesn't do well on each generation's big chips). There's a healthy 200-300MHz extra on the GPU that could boost performance there. If we could also slap a better heatsink on there, it'd do better... just like on Hawaii cards. Having said that, it's embarrassing for a card that cost $700... what a scam. I feel even more sorry for those who bought the OG Titan for gaming, not for its DP prowess.

On the other hand, GCN keeps improving over time. We see this with the RX480 vs 1060: the two are now tied, whereas at the 1060's launch the 1060 was more or less the better card. Vega is a new architecture, so we'll see if GCN 1-4 keeps getting some of that driver / console effect / DX12-Vulkan magic in 2017.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
It's incredible how AMD has used the console advantage to make such a long-lasting architecture. It's obvious nVIDIA in these past few years has designed for the short term, whereas AMD has designed for the long term.

nVIDIA is still using an architecture with roots all the way back to Fermi, so it's impressive that they've come so far on efficiency, but I think Volta will be a big change for them in light of GCN's longevity.
 

DamZe

Member
May 18, 2016
187
80
101
This shows how nVIDIA purposefully designs their architectures to be viable for a 2-year period, then marks them down with inadequacies or abandons driver support for them. Kepler is a shining example of an architecture touted as the next best thing since sliced bread; its power efficiency was marketed as a feature versus the more capable GCN HD7000 cards back in the day, but nVidia hid the fact that it couldn't do compute for <redacted>! When you can twist the tech media into buying into your hype and have an advantage at a given time in a certain API (DX11), the masses will flock to your product. The $700 tag for a 3GB card is/was a rip-off of massive proportions, especially when the 290X with 4GB was around 10-15% slower in nVidia-favored titles at worst! A deficit it has reclaimed over the years, and it is still a viable gaming card, all while the 780 Ti now has to compromise on the most basic of things, like texture resolution, for it not to choke.

What a perfect example of hype over substance. The Maxwell cards are next: they have no muscle to flex in DX12 and see relatively little gain in Vulkan compared to modern GCN cards. And Pascal will suffer the same fate; it still doesn't see the same gains in DX12 compared to GCN, and high core clocks can only carry you so far. And what do you think nVidia's pitch is going to be for Volta? Full DX12 capabilities, hardware async? The more I look at it, the more I see nVidia being the Apple of the GPU world: it looks good when it's new and hot, but man does it lose value over time; it's shiny but it ain't worth the cost relative to the competition. This is why I will never buy another x80 card from nVIDIA ever again.




No profanity in the tech forums.


esquared
Anandtech Forum Director
 
Last edited by a moderator:


Azix

Golden Member
Apr 18, 2014
1,438
67
91
This shows how nVIDIA purposefully designs their architectures to be viable for a 2-year period, then marks them down with inadequacies or abandons driver support for them. Kepler is a shining example of an architecture touted as the next best thing since sliced bread; its power efficiency was marketed as a feature versus the more capable GCN HD7000 cards back in the day, but nVidia hid the fact that it couldn't do compute for shit! When you can twist the tech media into buying into your hype and have an advantage at a given time in a certain API (DX11), the masses will flock to your product. The $700 tag for a 3GB card is/was a rip-off of massive proportions, especially when the 290X with 4GB was around 10-15% slower in nVidia-favored titles at worst! A deficit it has reclaimed over the years, and it is still a viable gaming card, all while the 780 Ti now has to compromise on the most basic of things, like texture resolution, for it not to choke.

What a perfect example of hype over substance. The Maxwell cards are next: they have no muscle to flex in DX12 and see relatively little gain in Vulkan compared to modern GCN cards. And Pascal will suffer the same fate; it still doesn't see the same gains in DX12 compared to GCN, and high core clocks can only carry you so far. And what do you think nVidia's pitch is going to be for Volta? Full DX12 capabilities, hardware async? The more I look at it, the more I see nVidia being the Apple of the GPU world: it looks good when it's new and hot, but man does it lose value over time; it's shiny but it ain't worth the cost relative to the competition. This is why I will never buy another x80 card from nVIDIA ever again.

I don't know if it's on purpose or just how their GPUs work. More software dependent, so the hardware specs don't speak for themselves. Works well near launch and a bit later, but as soon as something else comes along it starts getting neglected due to limited resources and falls behind. It could be a consequence of whatever they do to get lower power usage. I think the hardware scheduler in GCN is more flexible than the partly static one in Kepler through Pascal. My guess anyway. I would bet they could get Kepler cards up to par in these new games if they tried. A GPU should not need that kind of babysitting, though.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Load up Skyrim Special Edition with a handful of texture mods and that mighty $700 paperweight will start to choke even at 1080p. That 3GB of VRAM doomed it.
You mean a game with texture packs specifically designed to use untold amounts of RAM? Sure thing. I'll get right on that to exploit the weakest link of a 780Ti if it serves my purposes.
 

laamanaator

Member
Jul 15, 2015
66
10
41
You mean a game with texture packs specifically designed to use untold amounts of RAM? Sure thing. I'll get right on that to exploit the weakest link of a 780Ti if it serves my purposes.
The thing is, in Skyrim SE the textures are from the HQ texture pack originally released for the vanilla game. Those textures are at most 2K; most are between 1K and 2K, some even 512x512. They don't take a lot of space, so no, Skyrim SE is not designed to use "untold amounts of RAM". Hell, in the original Skyrim I've got almost all textures replaced by 4K ones, and I've never seen more than 3.8GB of VRAM used.
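To put rough numbers on that claim, here's a back-of-envelope sketch (my own illustration, not measured data; the bytes-per-pixel figures are the standard DXT1/DXT5 block-compression rates, and the texture count is a made-up example):

```python
# Rough VRAM footprint of block-compressed (DXTn/BCn) textures,
# illustrating why a set of 1K-2K textures alone rarely fills 3GB.

def texture_bytes(width, height, bytes_per_pixel=1.0, mipmaps=True):
    """DXT1 is ~0.5 B/px, DXT5 is ~1 B/px; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_pixel
    return base * 4 / 3 if mipmaps else base

# Hypothetical load: 500 distinct 2K DXT5 textures resident at once.
total = 500 * texture_bytes(2048, 2048)
print(f"{total / 2**30:.2f} GiB")  # prints 2.60 GiB
```

Doubling resolution to 4K quadruples each texture's footprint, which is why a full 4K replacement set is what actually pushes a card toward its VRAM ceiling.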
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
The thing is, in Skyrim SE the textures are from the HQ texture pack originally released for the vanilla game. Those textures are at most 2K; most are between 1K and 2K, some even 512x512. They don't take a lot of space, so no, Skyrim SE is not designed to use "untold amounts of RAM". Hell, in the original Skyrim I've got almost all textures replaced by 4K ones, and I've never seen more than 3.8GB of VRAM used.
Let's explore this. I want to see just how much RAM Skyrim can be made to use.
Unfortunately, I don't own Skyrim because IMHO 3rd-person games suck. But that's just me.
If anyone who owns large-memory cards is interested in assisting, that would be great. I'd like to see if laamanaator isn't just spouting random numbers.
 

laamanaator

Member
Jul 15, 2015
66
10
41
Let's explore this. I want to see just how much RAM Skyrim can be made to use.
Unfortunately, I don't own Skyrim because IMHO 3rd-person games suck. But that's just me.
If anyone who owns large-memory cards is interested in assisting, that would be great. I'd like to see if laamanaator isn't just spouting random numbers.
Just a note, Skyrim is a 1st/3rd person game, not only 3rd ;). My numbers were only for the modded original Skyrim, but as SE uses the same textures as the original with the HQ DLC, it shouldn't use more VRAM than a heavily modded version. The max the original can use is 4GB of VRAM due to DX9 limitations.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Just a note, Skyrim is a 1st/3rd person game, not only 3rd ;). My numbers were only for the modded original Skyrim, but as SE uses the same textures as the original with the HQ DLC, it shouldn't use more VRAM than a heavily modded version. The max the original can use is 4GB of VRAM due to DX9 limitations.
Well, it would be great to find out. You've already said that you've seen it use 3.8GB. The 780 Ti has 3GB. I think you may have made my point for me anyway. Still, it would be great to have actual numbers.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
OG Skyrim with ENB and all of the 4K texture mods I could find never had an issue even at 1440p with my old GTX 780.

 

laamanaator

Member
Jul 15, 2015
66
10
41
Hmm, I did some googling and found out that for some reason Skyrim SE typically uses 2.8GB-3GB of VRAM, sometimes even more. The textures are from the HQ DLC, so the increased VRAM usage must come from the increased foliage and from some unknown (at least to me) factor. The meshes are the same as vanilla, so it must come from an increased number of objects. I was wrong, apparently. :oops:
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
Just a note, Skyrim is a 1st/3rd person game, not only 3rd ;). My numbers were only for the modded original Skyrim, but as SE uses the same textures as the original with the HQ DLC, it shouldn't use more VRAM than a heavily modded version. The max the original can use is 4GB of VRAM due to DX9 limitations.

Skyrim can use all the VRAM you have, supposing you have Boris Vorontsov's ENB installed. There are two settings that influence VRAM usage (can't remember the specifics).

UnsafeMemoryHacks causes the game to not mirror VRAM in system RAM, as is standard with Direct3D 9. The cost is the game crashing when alt-tabbing out of fullscreen. Solution? Use borderless windowed fullscreen.

ExpandSystemMemoryX64 uses ENB's ENBHost.exe to work around D3D9's memory mirroring. The software launches another enbhost.exe process containing all the meshes and textures that couldn't be held in the TESV.exe process. If the memory cap is reached on enbhost, another process is spawned.

The first setting invalidates the latter if it's set to true. It greatly reduces stuttering, though; always have it on.
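For reference, those two toggles live in ENB's enblocal.ini. A minimal sketch from memory (section and key names as I recall them, so double-check against your ENB version's docs):

```ini
; enblocal.ini (fragment) - the settings discussed above.
[MEMORY]
; Skip D3D9's system-RAM mirror of VRAM. Crashes on alt-tab from
; exclusive fullscreen, so pair it with borderless windowed mode.
UnsafeMemoryHacks=true
; Spill meshes/textures into extra enbhost.exe processes when the
; game process is full; ignored when UnsafeMemoryHacks=true.
ExpandSystemMemoryX64=false
```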
 

f2bnp

Member
May 25, 2015
156
93
101
This almost feels like a meme. "Kepler! Losing performance since launch!"

Probably time for 780 Ti owners to chime in so they can tell us all the games they can't play anymore.

https://i0.wp.com/www.babeltechreviews.com/wp-content/uploads/2016/03/Launch-Chart-780-Ti.jpg

There is no greater meme than "Member of Nvidia Focus Group".

The overall performance of Kepler is absolutely disgraceful. I'd love to see how Maxwell performs same time next year.


Trolling is not allowed
Markfw
Anandtech Moderator
 
Last edited by a moderator:

daxzy

Senior member
Dec 22, 2013
393
77
101
To be fair, it's the 3GB version of the 780 Ti. The 6GB version fares a lot better.

That doesn't excuse the low-VRAM Keplers being probably some of the worst buys in GPU history (the 2GB GTX 770/670/680).
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
There is no greater meme than "Member of Nvidia Focus Group".

The overall performance of Kepler is absolutely disgraceful. I'd love to see how Maxwell performs same time next year.


Trolling is not allowed
Markfw
Anandtech Moderator
Only problem is, that isn't a meme. ;)
The overall performance of Kepler is OK. Not stellar, not legendary, but OK. You, like so many others, try to make it seem like Kepler actually "LOST" performance since its launch. I couldn't help but laugh about it. And they were persistent because they were desperate to discourage people from buying Maxwell and to push them toward GCN. Yes, GCN aged a little better, for sure. But let's not make this out to be the sensationalism that it isn't, shall we?
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
To be fair, it's the 3GB version of the 780 Ti. The 6GB version fares a lot better.

That doesn't excuse the low-VRAM Keplers being probably some of the worst buys in GPU history (the 2GB GTX 770/670/680).
When those launched, 2GB was perfectly acceptable and performance was just fine. Just like the 3GB 7970: when it launched, it was just fine, and it's still not too bad today. I honestly don't know where you people get the idea that such tremendous amounts of VRAM are needed to enjoy games of yesterday or games of today, even down the road a piece. Memory management does wonders, and not many notice a big difference in IQ going from High to Ultra settings (depending on the game) despite using mountains more memory.

More is better when it comes to memory; I can't disagree with that. But in lower price brackets, settings can be turned down to accommodate.
Back when the 780 Ti launched, it was competing against 4GB cards, the 290s, and doing quite well.
Stop sensationalizing, please.
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,414
401
126
Got rid of my 780s for 290/290Xs a while back, and it appears to have been a prudent course of action.
I do own a pair of 780s again for use in spare gaming rigs, but only because the price was right ($50 apiece on CL) :)
 