Somebody find a way to talk me out of another 980Ti


TennesseeTony

Elite Member
Aug 2, 2003
EVGA 1070 $420, or $390 after 'rebate'

This is the top 150-watt model; the next model up uses 170 watts, which I don't feel is justified for an extra 100 MHz or so, especially if the goal is to reduce power usage. The current promotion includes a Gears of War 4 game code; the game is on pre-order for $60. Might could sell the code for a quick $30 rebate.

Might could. Hmm. One might think I'm from the south using such phraseology. :flushed:
 

Orange Kid

Elite Member
Oct 9, 1999
Oops. Looks like I used 250 watts in my calculations for the 1080, but it's only 180 watts, making the 1080 by far the most efficient (but awfully expensive).
To do a real cost analysis you would have to figure in all costs for a certain amount of gain on a particular task. For example, if you were to do 50M points in POEM, it would take the 1070 longer than the 1080 to reach that goal, so the cost of running the machine for the longer period would have to be factored in along with the initial price. Other factors, such as driver maturity and future app changes, would be considerations too.
Kinda like, should I take Social Security at 62 or wait till 70 when I would get the max? But then what if I die at 69? Then I would have gotten nothing. :) Life is such a conundrum. What to do, what to do. Spend more to get more, or wait more to get more.
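
If you want to play with the math, here's a rough Python sketch of that comparison. The prices, wattages, points-per-day, and electricity rate below are all made-up placeholders, not benchmarks:

```python
# Rough cost sketch for the "50M points in POEM" example above.
# Prices, wattages, PPD, and electricity rate are placeholder assumptions.

def total_cost(card_price, watts, points_per_day, goal_points, usd_per_kwh=0.12):
    """Purchase price plus electricity cost to reach goal_points."""
    days = goal_points / points_per_day
    kwh = watts * 24 * days / 1000.0
    return card_price + kwh * usd_per_kwh, days

for name, price, watts, ppd in [("GTX 1070", 420, 150, 500_000),
                                ("GTX 1080", 650, 180, 650_000)]:
    cost, days = total_cost(price, watts, ppd, goal_points=50_000_000)
    print(f"{name}: {days:.0f} days, ~${cost:.0f} all-in")
```

With placeholder numbers like these, the slower card's longer runtime only adds a few tens of dollars in power, so the initial price still dominates. Real PPD figures could change that.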
 

TennesseeTony

Elite Member
Aug 2, 2003
That's more than my little brain can calculate, Kid. :D Good insight though, and very true.

GTX 1070 (lower-spec 6171 model) FahBench = 88.8, factory clocked/OC'd, at 150 watts plus

vs. Mark's 980Ti at 1400 MHz = 87.3 at 250 watts plus

Wow. :openmouth: Nice. :hearteyes:
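
Just for fun, the points-per-watt math on those two numbers (board power taken from the rough "150 watts plus" / "250 watts plus" figures above):

```python
# Points-per-watt from the two FahBench runs above (wattages approximate).
cards = {"GTX 1070 (6171)": (88.8, 150), "980Ti @ 1400 MHz": (87.3, 250)}
for name, (score, watts) in cards.items():
    print(f"{name}: {score / watts:.3f} FahBench points per watt")
# ~0.592 vs ~0.349 -- roughly a 70% efficiency edge for the 1070.
```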

Edit: I first ran the test with only 1 thread available and got 77-something. You definitely need a full core (or more) per GPU for folding. Tested on an i7-5820K, which I finally figured out how to run at stock speeds (3.3GHz). :blush: (Yeah, now I don't have to worry about AVX tasks sending 300 watts to my processor.)
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
Not sure the 980Ti is 250 watts. I have a box with an E5570 running 100% load on both CPU and GPU, and it's only pulling 266 watts from the wall (kill-a-watt), but that's under a bad WU. Under a good one I have seen it at 300. But even worst-case, wouldn't the CPU, hard drive, etc. take at least 50 watts? I think the card uses less than 250.
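
For what it's worth, here's that back-of-the-envelope in Python. The PSU efficiency and the 50-watt overhead are assumptions, not measurements:

```python
# Estimating card draw from a kill-a-watt wall reading.
# PSU efficiency and system overhead are assumptions, not measurements.

def card_draw(wall_watts, overhead_watts=50, psu_efficiency=0.90):
    """AC wall power -> rough DC power left over for the GPU."""
    dc_total = wall_watts * psu_efficiency   # PSU wastes the rest as heat
    return dc_total - overhead_watts         # CPU, drives, fans, etc.

print(card_draw(300))  # ~220 -- consistent with "less than 250" for the card
```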
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
Also, the variability may be great, but it's less in the 980 series than the 1080 series. I think it's the CUDA cores: the 1070 only has 1920, where the 980Ti has 2816. For now I'll stick with them, but I may try a 1070.
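
For reference, throughput per CUDA core works out like this, using the FahBench scores posted earlier in the thread:

```python
# Throughput per CUDA core, from the FahBench scores posted earlier.
cards = {"GTX 1070": (88.8, 1920), "980Ti": (87.3, 2816)}
for name, (score, cores) in cards.items():
    print(f"{name}: {1000 * score / cores:.1f} FahBench points per 1000 cores")
# ~46 vs ~31 -- Pascal's higher clocks offset the smaller core count.
```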
 

TennesseeTony

Elite Member
Aug 2, 2003
980Ti on a 3930K (stock) = 87.321. That's the 980Ti at 1400 MHz; the card's stock clock is 1190 MHz.
....

The GTX 980's TDP is 165 watts, plus overclocking. On the 1070, I've only noticed a max increase of 100-118 watts at the wall, using a kill-a-watt, running Einstein and POEM.

The 1070 fails all tasks immediately on Asteroids and GPUGrid. GPUGrid hasn't been programmed for the 10 series yet; they're waiting on Nvidia to release CUDA 8/8.5 for some reason. Not sure if the same holds true for Asteroids; I haven't looked into it yet.