Geforce GTX 1060 Thread: faster than RX 480, 120W, $249

Feb 19, 2009
10,457
10
76
The way NV and AMD sponsor game development these days, it's no longer considered fair or accurate when those titles are used in GPU reviews, unlike the old days when all titles were neutral.

Vendor-specific optimizations used to be frowned upon, but we're no longer living, or gaming, in those old days.

Is it fair or accurate to use mostly sponsored titles? No. But it's what gamers play, and that's what matters.

If it were purely based on the games players actually play, however, the selection should come from the top Steam concurrent-player charts.

http://steamcharts.com/top

I'm sure there are quite a few outstanding, graphically impressive games on that list to pick from for any benchmark suite.

Definite entries would have to include: Witcher 3, Fallout 4, Total War: Warhammer, Civilization V, Evolve, Arma 3 (the Apex expansion just released, a top seller on Steam!)... and go down from there.
 

MarkizSchnitzel

Senior member
Nov 10, 2013
476
121
116
It doesn't matter if the games are popular or not; as long as it's sponsored data, it becomes useless when you want to use it to calculate the average.
...

Why hide the fact that they are sponsored games? Give gamers all the info and let them decide. When you hide it, that means you've got something to hide. :thumbsdown:

I don't understand your reasoning. The GPU is a means to an end. Games are the end. Whichever card is best for the games I play will win (all else being equal).

I only play Civ, so if this game is sponsored, I could not care less. If both cards are quiet, whichever has better perf/$ will be my pick.
Others who play more games would do something similar, no?

All this arguing about which games to use - just use a random pick of games released within the past year. This makes perfect sense to me.

This makes sense to me too. After all, you're buying a GPU to play games. Whichever card can do that best within your budget, that's the one you take.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
The way NV and AMD sponsor game development these days, it's no longer considered fair or accurate when those titles are used in GPU reviews, unlike the old days when all titles were neutral.

Vendor-specific optimizations used to be frowned upon, but we're no longer living, or gaming, in those old days.

Is it fair or accurate to use mostly sponsored titles? No. But it's what gamers play, and that's what matters.

If it were purely based on the games players actually play, however, the selection should come from the top Steam concurrent-player charts.

http://steamcharts.com/top

I'm sure there are quite a few outstanding, graphically impressive games on that list to pick from for any benchmark suite.

Definite entries would have to include: Witcher 3, Fallout 4, Total War: Warhammer, Civilization V, Evolve, Arma 3 (the Apex expansion just released, a top seller on Steam!)... and go down from there.

Agree 100%. Reviews are meant to inform consumers, and as such, the closer the game selection comes to that of the average reader, the better, regardless of whether or not said games may be "sponsored".

People may dislike that AMD/Nvidia are sponsoring games and thus causing said games to vary wildly in performance between vendors, but at the end of the day, if your GPU of choice performs significantly better than the competition in your preferred games, then that is quite clearly a value add, and one that tech sites should make consumers aware of.

And as far as averages go, it is braindead easy for people to recalculate the averages of, say, TPU without any games they consider biased. So instead of forcing sites to start excluding games, people can just recalculate the average themselves. But I guess the real issue is that this would force people to do a bit of work instead of being spoon-fed the info, and some people don't like it when they have to work.
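
Just to show how little work that actually is, here's a minimal sketch with made-up FPS numbers (not TPU's real data, and not necessarily their exact formula):

[code]
# Made-up per-game FPS numbers, purely to show the arithmetic --
# plug in the actual figures from whatever review you're reading.
results = {
    "Witcher 3":    {"RX 480": 62.0, "GTX 1060": 66.0},
    "Hitman":       {"RX 480": 75.0, "GTX 1060": 70.0},
    "Fallout 4":    {"RX 480": 71.0, "GTX 1060": 78.0},
    "Project CARS": {"RX 480": 58.0, "GTX 1060": 79.0},
}

def relative_average(results, card, baseline, exclude=()):
    """Average performance of `card` relative to `baseline`,
    skipping any titles the reader considers biased."""
    ratios = [fps[card] / fps[baseline]
              for title, fps in results.items() if title not in exclude]
    return sum(ratios) / len(ratios)

print(relative_average(results, "GTX 1060", "RX 480"))                             # all games
print(relative_average(results, "GTX 1060", "RX 480", exclude={"Project CARS"}))   # reader's own cut
[/code]

Five minutes in a spreadsheet does the same thing.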
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Yes. I've seen FE 1070/80 cards boosting that high at stock settings. GPU Boost 3.0 does take the clock speeds much higher than the rated boost, though sadly most of it is a meaningless initial peak.

Ah, yes. So even if it was at 1911 MHz for a nanosecond, that is what will be reported in the bench. I guess that is technically correct, then.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
I don't understand your reasoning. The GPU is a means to an end. Games are the end. Whichever card is best for the games I play will win (all else being equal).

I only play Civ, so if this game is sponsored, I could not care less. If both cards are quiet, whichever has better perf/$ will be my pick.
Others who play more games would do something similar, no?
That is because we were talking about averages, which you seem to have missed when reading the conversation between the few of us. :\ You also seem to have missed the post where I stated that you can test any sponsored games, just not include them in the averages. I have said this multiple times in the conversation so far. :\
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
That is because we were talking about averages, which you seem to have missed when reading the conversation between the few of us. :\ You also seem to have missed the post where I stated that you can test any sponsored games, just not include them in the averages. I have said this multiple times in the conversation so far. :\

The average is meant to be an average of all the games you want to play; if it doesn't include some of those games, then it's no more useful than not benchmarking individual games due to some perceived bias. To be a fair average of the games (for people who don't care about fanboy GPU wars and just want to play), it will have some games that are much stronger for one GPU company than the other. If one company is consistently able to get stronger performance in games (the people playing them don't care why), then that should be reflected in the average.

By removing games from performance metrics you are applying your fanboy goggles (mostly "Nvidia GameWorks is evil") to people who don't wear them because they just don't care. GPUs are no different from any other stuff they buy (phones, cars, breakfast cereals, clothes) - they don't apply some higher moral reasoning to those either.
 
Last edited:

linkgoron

Platinum Member
Mar 9, 2005
2,599
1,238
136
The average is meant to be an average of all the games you want to play; if it doesn't include some of those games, then it's no more useful than not benchmarking individual games due to some perceived bias. To be a fair average of the games (for people who don't care about fanboy GPU wars and just want to play), it will have some games that are much stronger for one GPU company than the other. If one company is consistently able to get stronger performance in games (the people playing them don't care why), then that should be reflected in the average.

By removing games from performance metrics you are applying your fanboy goggles (mostly "Nvidia GameWorks is evil") to people who don't wear them because they just don't care. GPUs are no different from any other stuff they buy (phones, cars, breakfast cereals, clothes) - they don't apply some higher moral reasoning to those either.

Average scores do not always represent average gameplay. If there is a game that skews the scores significantly, then it skews the average too much, and the average no longer represents the expected performance for a typical gamer.

If, for example, GPU A is a bit slower than GPU B in 9 games out of 10, and significantly faster in the remaining one, you might see that GPU A is faster (on average) than GPU B. Except what you'll actually see in 90% of the games is that GPU B is faster. The average in this case does not really represent what you'll see in your average game.
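
To put some made-up numbers on that (nothing to do with any real card), here's a quick sketch of how a single outlier can flip the arithmetic average:

[code]
# Hypothetical relative scores for GPU A vs GPU B (1.0 = same speed).
scores = [0.95] * 9 + [1.60]   # A loses slightly in 9 games, wins big in 1

average = sum(scores) / len(scores)
print(round(average, 3))       # 1.015 -> A looks ~1.5% "faster" overall

b_wins = sum(1 for s in scores if s < 1.0)
print(b_wins)                  # 9 -> yet B is ahead in 90% of the games
[/code]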

This has nothing to do with fanboyism. As already stated, if there is a large outlier or whatever, include it in the review; just remove it from the average, since it has an outsized effect and makes the average less meaningful.

Look at Project CARS at 1440p: without it, the Fury X has a 3% higher average FPS.
[Image: relative performance chart at 2560x1440 (perfrel_2560.gif)]


edit:

Look at the techreport review:
http://techreport.com/blog/28624/reconsidering-the-overall-index-in-our-radeon-r9-fury-review

However, prompted by your questions, I went back to the numbers this morning and poked around some. Turns out the impact of that change may be worthy of note. With Cars out of the picture, the overall FPS average for the R9 Fury drops by 1.2 FPS and the score for the GeForce GTX 980 drops by 2.8 FPS. The net result shifts from a 0.6-FPS margin of victory for the GTX 980 to a win for the R9 Fury by a margin of 1.1 FPS.
[...]
At the end of the day, I think the Cars-free value scatter plots are probably a more faithful reflection of the overall performance picture than our original ones, so I'm going to update the final page of our Fury review with the revised plots.
 
Last edited:

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Average scores do not always represent average gameplay. If there is a game that skews the scores significantly, then it skews the average too much, and the average no longer represents the expected performance for a typical gamer.

That depends entirely upon who the typical gamer is. For instance, if the game in question is one that is very commonly played by the typical gamer, then you would actually bias the average more by leaving it out than by including it.

If, for example, GPU A is a bit slower than GPU B in 9 games out of 10, and significantly faster in the remaining one, you might see that GPU A is faster (on average) than GPU B. Except what you'll actually see in 90% of the games is that GPU B is faster. The average in this case does not really represent what you'll see in your average game.

What you will see in your average game is irrelevant; the question is what the average gamer will see, and while that might sound like the same thing, it really isn't, since the average gamer may not play your average game.

This has nothing to do with fanboyism. As already stated, if there is a large outlier or whatever, include it in the review; just remove it from the average, since it has an outsized effect and makes the average less meaningful.

No, it does not make the average less meaningful. You guys have to remember that you are not taking multiple samples from the same population; instead you are taking single samples from multiple populations. The question then becomes which populations (i.e. games) best represent the average gamer, and if a game like, for instance, Project CARS is highly representative of what the average gamer plays, then it absolutely should be included in the average.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Average scores do not always represent average gameplay. If there is a game that skews the scores significantly, then it skews the average too much, and the average no longer represents the expected performance for a typical gamer.

If, for example, GPU A is a bit slower than GPU B in 9 games out of 10, and significantly faster in the remaining one, you might see that GPU A is faster (on average) than GPU B. Except what you'll actually see in 90% of the games is that GPU B is faster. The average in this case does not really represent what you'll see in your average game.

This has nothing to do with fanboyism. As already stated, if there is a large outlier or whatever, include it in the review; just remove it from the average, since it has an outsized effect and makes the average less meaningful.

Look at Project CARS at 1440p: without it, the Fury X has a 3% higher average FPS.

If one company has games it completely fails in (the reason why doesn't matter), then yes, that should be reflected in the average. It could be argued it's actually more important, as most gamers would prefer a GPU that performs OK in all their games, not one that sometimes performs great and other times performs badly.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
That is why all those sponsored games should have their own benchmarks; I don't see how this point is so hard to understand.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
A "logo'd" game vs "non-logo'd" game analysis would be interesting for sure. So anything with AMD or nVidia logo in the loading screen or otherwise known to be an AMD GE or nVidia Gameworks game gets put on one list, the others on another list, and a 3rd list with the average of all titles. Extra work but I'd personally read that review as I find that data interesting.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
According to the article, those are the figures Nvidia based its average performance uplift on.
 

SPBHM

Diamond Member
Sep 12, 2012
5,076
440
126
It looks like the stuff with MSAA is killing the RX 480 in those benchmarks.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Yup, Nvidia's RX 480 scores are quite a bit low. If we ignore the RX 480 scores for the moment and just look at the GTX 1060 scores, it looks like it will come in just a bit short of the GTX 980. That might be the reason Nvidia did not include GTX 980 scores here on purpose.
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
Geforce GTX 1060 6GB listed in US stores ($259)

6GB model is already available for pre-order in the US, for $259 and $263:

https://www.sabrepc.com/catalogsearch/result/?q=gtx+1060&order=relevance&dir=desc
http://www.shopblt.com/search/order_id=%21ORDERID%21&s_max=25&t_all=1&s_all=gtx+1060

Not bad at all for pre-launch prices, closer to MSRP than other Pascal cards.


Geforce GTX 1060 6GB OC - 3DMark Fire Strike Extreme & Ultra Results

Fire Strike Extreme Graphics Score: 6517
Fire Strike Ultra Graphics Score: 3116

http://www.3dmark.com/compare/fs/9258975/fs/9249139/fs/9247799#
http://www.3dmark.com/compare/fs/9259050/fs/9248958/fs/9247684#
Does anyone really pre-order them? From what I've seen in that chart so far, it looks very close to the 480, at least for 1080p.

Either card will do fine at 1080p, it seems. I wonder why I haven't found pre-orders for any of the custom 480 cards.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
I'd like to see 480 8GB vs 1060 6GB performance in a couple of years please :cool:

This is called spreading Fear, Uncertainty and Doubt.

Like me saying,

The GCN architecture has run its course and is losing its legs. In two years' time its performance will drop off drastically.

You guys are eventually going to wish you hadn't clung so tightly to the Kepler gen not having the legs GCN did at the time.