Why does Nvidia hold its value better than AMD?

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Z15CAM

Platinum Member
Nov 20, 2010
It seems to me that nVidia is also reaping some handsome benefits from brand recognition alone.
If you ask me AMD would never have these issues if they had left ATi in Canada ;o)
 

raghu78

Diamond Member
Aug 23, 2012
The role of the press in all of this should not be underestimated. Many sites are clearly biased in favour of Nvidia, and since many people rely on these reviews for their buying decisions, that bias tilts the market further in Nvidia's favour. The people who run these sites have a business to run, so they make compromises in objectivity and fairness.

http://hardocp.com/article/2013/09/09/msi_n650ti_tf_2gd5oc_be_video_card_review/10#.Ui7i0H_3x8E

"That makes the MSI N650Ti TF 2GD5/OC BE a fantastic value for your hard earned dollars. If you are trying to decide between the AMD Radeon HD 7850 or a NVIDIA GeForce GTX 650 Ti BOOST based video card, we'd opt for the GTX 650 Ti BOOST. The MSI N650Ti TF 2GD5/OC BE would be a perfect selection for the best gameplay under $200."

This review is a clear indication of the problem facing the industry: tech sites are losing their relevance by catering to their business interests rather than providing objective, fair comparisons. This review is so disconnected from reality that it's laughable. HD 7850 (1 GHz) cards are selling for USD 170-185, a similar price to the reviewed GTX 650 Ti Boost card. I mention this because HardOCP likes to argue about stock performance. HD 7850 owners know the card can hit 1 GHz easily at stock voltage and on average overclocks to 1.1-1.15 GHz with voltage tweaking.

http://www.newegg.com/Product/Produc...82E16814131472
http://www.newegg.com/Product/Produc...82E16814161416

An earlier review by HardOCP shows that an HD 7850 (1.1 GHz) is faster than a GTX 650 Ti Boost (1.2 GHz) across the board. Even in TWIMTBP titles the HD 7850 (1.1 GHz) matches the GTX 650 Ti Boost (1.2 GHz), as both scale roughly 20-25% in performance from the stock 860 MHz. In AMD Gaming Evolved titles like Tomb Raider, Hitman and Sleeping Dogs, the HD 7850 (1.1 GHz) crushes the GTX 650 Ti Boost (1.2 GHz) by around 20%.

http://www.hardocp.com/article/2013..._directcu_ii_video_card_review/4#.Ui7m9n_3x8F

Far Cry 3, 1920x1080, Ultra, 2x MSAA, HDAO

HD 7850 (1.1 GHz) - 35.8 fps
GTX 650 Ti Boost (1.2 GHz) - 33.0 fps

Hitman: Absolution, Ultra, 4x MSAA

HD 7850 (1.1 GHz) - 54.9 fps
GTX 650 Ti Boost (1.2 GHz) - 46.4 fps

The HD 7850 OC scales exceptionally well and competes with an overclocked GTX 660 at 1.1+ GHz speeds. What's worse, HardOCP says the GTX 650 Ti Boost provides the best gameplay under $200. That's far from reality: HD 7870 cards are selling for USD 180-200, and those cards are faster than the GTX 660 and will run circles around the GTX 650 Ti Boost.

www.amazon.com/gp/product/B007HYIRES/
www.amazon.com/gp/product/B00A2J4ROE/
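For what it's worth, the gaps in those HardOCP numbers are easy to quantify. A quick Python sketch (the fps and clock figures are the ones quoted above; nothing else is assumed):

```python
# Percentage deltas from the HardOCP figures quoted above.
def pct_faster(a, b):
    """How much faster a is than b, in percent."""
    return (a - b) / b * 100.0

results = {
    "Far Cry 3 (1080p Ultra, 2x MSAA, HDAO)": (35.8, 33.0),  # HD 7850 @ 1.1 GHz vs GTX 650 Ti Boost @ 1.2 GHz
    "Hitman: Absolution (Ultra, 4x MSAA)": (54.9, 46.4),
}
for title, (amd_fps, nv_fps) in results.items():
    print(f"{title}: HD 7850 ahead by {pct_faster(amd_fps, nv_fps):.1f}%")
# → 8.5% in Far Cry 3, 18.3% in Hitman

# Clock headroom from the stock 860 MHz to a 1.1 GHz overclock:
print(f"Clock gain: {pct_faster(1100, 860):.1f}%")  # → 27.9%
```

So "crushes by 20%" is roughly right for Hitman, and the ~28% clock bump is in the same ballpark as the claimed 20-25% performance scaling.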

I posted in the HardOCP forums and was not alone in my view that HardOCP reviews have been biased in Nvidia's favour for this entire Kepler generation.

http://hardforum.com/showthread.php?t=1780402
 

blastingcap

Diamond Member
Sep 16, 2010
Ignore the recommendations and just use HardOCP for their timegraphs. The site's financial interests clearly impact its recs to some degree, as the ceaseless pleas to install anti-newegg plug-ins demonstrate. They need to grow up instead of publicly throwing a temper tantrum about newegg cutting off the gravy train. The entire PC industry is in decline and newegg is passing along the hurt. It's not a conspiracy against HardOCP or anything, just belt-tightening by everyone in the declining industry.
 

Jaydip

Diamond Member
Mar 29, 2010
We are saying [H] is biased while at the same time AT now has an official AMD zone lol :)
 

ICDP

Senior member
Nov 15, 2012
We are saying [H] is biased while at the same time AT now has an official AMD zone lol :)

This kind of response is actually part of the problem. The playground "well you did it as well" mentality does not help. If you notice bias then point it out. If the problem persists stop reading the articles from the offending author or web site.

The problem is many people gravitate towards their favourite sites because the bias on show lines up with their own. Confirmation bias is quite rampant among the AMD/Nvidia fundamentalists :)

It is quite possible to find GPU reviews from different sites that show a massive performance difference in the same game at the same settings.

HardwareCanucks GTX780 review. Crysis 3, very high preset, FXAA, 2560x1440. Average FPS 39.01
http://www.hardwarecanucks.com/foru...ws/61310-nvidia-geforce-gtx-780-review-7.html

Same game, same settings, Anandtech. Average FPS 53.1 (EDIT: the settings are different - Thanks Jaydip)
http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/16

According to the Anandtech review, the 7970GE is actually faster than the GTX 780 in the HardwareCanucks review. :)
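To put a number on that discrepancy (Python, using only the two averages quoted above; as noted later in the thread, the settings turned out to differ):

```python
# Cross-site gap for the same card in the same game at 2560x1440.
hwc_fps = 39.01  # HardwareCanucks, GTX 780, Crysis 3
at_fps = 53.1    # AnandTech, GTX 780, Crysis 3

gap = (at_fps - hwc_fps) / hwc_fps * 100.0
print(f"AnandTech's average is {gap:.0f}% higher")  # → 36% higher
```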

Reviewers are only human; they have biases and make mistakes the same as the rest of us. It is up to us as consumers to make our own decisions rather than rely on a single website for a review. It is quite amazing how some people are unable to put stats/results in context (I'm guilty of this above) and will happily misrepresent information to prove their bias.

Avoid sites with obvious bias.
Avoid authors/reviewers with obvious bias.
Avoid sites that do a quick benchmark using a built-in game test for results.
Avoid sites that don't reveal the actual level tested, etc.

I find [H] reviews quite good, only the conclusions leave me scratching my head.

EDIT: Jaydip pointed out my error in that the setting for the Crysis 3 benchmarks were different. Though this does demonstrate how easy it is to get confused with so much conflicting info flying about.
 

Jaydip

Diamond Member
Mar 29, 2010
This kind of response is actually part of the problem. The playground "well you did it as well" mentality does not help. If you notice bias then point it out. If the problem persists stop reading the articles from the offending author or web site.

The problem is many people gravitate towards their favourite sites because the bias on show lines up with their own. Confirmation bias is quite rampant among the AMD/Nvidia fundamentalists :)

It is quite possible to find GPU reviews from different sites that show a massive performance difference in the same game at the same settings.

HardwareCanucks GTX780 review. Crysis 3, very high preset, FXAA, 2560x1440. Average FPS 39.01
http://www.hardwarecanucks.com/foru...ws/61310-nvidia-geforce-gtx-780-review-7.html

Same game, same settings, Anandtech. Average FPS 53.1
http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/16

According to the Anandtech review the 7970GE is actually faster than the GTX780 in the HardwareCanuck review. :)

Reviewers are only human, they have bias and make mistakes the same as the rest of us. It is up to us as consumers to make our own decisions rather than rely on a single web site for a review.

Good points, but the task gets very difficult. Nowadays the cards are ~5% apart from each other at the same price point, so just awarding 1 or 2 fps extra can change the scenario. You have to trust some reviews unless you have the same hardware as the reviewers and are testing the same scene; otherwise you will be left with truckloads of disjointed and conflicting data. You see, if reviewers start to slant towards different vendors, we might as well accept the marketing slides the vendors put forth as gospel truth, because the reviews won't do any better. This is unfortunately a very sad situation for consumers.
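That ~5% point is easy to illustrate with a toy calculation (all numbers here are hypothetical):

```python
# Two cards ~5% apart; a 2 fps run-to-run swing is enough to flip the ranking.
card_a = 60.0   # "true" average of card A
card_b = 57.0   # "true" average of card B, ~5% behind
noise = 2.0     # plausible variation from scene choice, drivers, margin of error

print(card_a > card_b)                  # True: A wins on the true numbers
print(card_a - noise > card_b + noise)  # False: one unlucky run each and B "wins"
```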

Edit:

It seems AT was using HQ instead of VHQ?
 

ICDP

Senior member
Nov 15, 2012
Good points, but the task gets very difficult. Nowadays the cards are ~5% apart from each other at the same price point, so just awarding 1 or 2 fps extra can change the scenario. You have to trust some reviews unless you have the same hardware as the reviewers and are testing the same scene; otherwise you will be left with truckloads of disjointed and conflicting data. You see, if reviewers start to slant towards different vendors, we might as well accept the marketing slides the vendors put forth as gospel truth, because the reviews won't do any better. This is unfortunately a very sad situation for consumers.

Some people already accept marketing slides as gospel. We are already at the point where the majority of reviews have a bias (intentional or unintentional). The trick is taking the various results and putting them in context.

Edit:

It seems AT was using HQ instead of VHQ?

Yes they were, my mistake.

Though this perfectly demonstrates one of the issues with mainstream GPU reviews: there are no set standards other than what the authors set for themselves. Even testing a game in a different level can skew results massively. [H] are now famous for the fact that they test for the maximum playable settings on the most demanding levels. The problem is that this is entirely subjective and means nothing for the majority of users. I frequently ignore the max-playable settings and go straight to the apples-to-apples results. Having said that, other people may find this info invaluable.

It has become such a minefield that it is literally possible to find wildly conflicting information. As you alluded to in your post, the data available is so disjointed and conflicting that it is possible to come away confused. I didn't mean to but I demonstrated this quite well with my stupid reading comprehension error. :)
 

spat55

Senior member
Jul 2, 2013
Nvidia does better... drivers only, but by far, I have to admit.

I think I agree slightly, but I have also heard that Nvidia drivers have been melting some cards; not sure how reliable the info is, but it is out there.

This kind of response is actually part of the problem. The playground "well you did it as well" mentality does not help. If you notice bias then point it out. If the problem persists stop reading the articles from the offending author or web site.

The problem is many people gravitate towards their favourite sites because the bias on show lines up with their own. Confirmation bias is quite rampant among the AMD/Nvidia fundamentalists :)

It is quite possible to find GPU reviews from different sites that show a massive performance difference in the same game at the same settings.

HardwareCanucks GTX780 review. Crysis 3, very high preset, FXAA, 2560x1440. Average FPS 39.01
http://www.hardwarecanucks.com/foru...ws/61310-nvidia-geforce-gtx-780-review-7.html

Same game, same settings, Anandtech. Average FPS 53.1 (EDIT: the settings are different - Thanks Jaydip)
http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/16

According to the Anandtech review the 7970GE is actually faster than the GTX780 in the HardwareCanuck review. :)

Reviewers are only human; they have biases and make mistakes the same as the rest of us. It is up to us as consumers to make our own decisions rather than rely on a single website for a review. It is quite amazing how some people are unable to put stats/results in context (I'm guilty of this above) and will happily misrepresent information to prove their bias.

Avoid sites with obvious bias.
Avoid authors/reviewers with obvious bias.
Avoid sites that do a quick benchmark using a built-in game test for results.
Avoid sites that don't reveal the actual level tested, etc.

I find [H] reviews quite good, only the conclusions leave me scratching my head.

EDIT: Jaydip pointed out my error in that the setting for the Crysis 3 benchmarks were different. Though this does demonstrate how easy it is to get confused with so much conflicting info flying about.

Maybe they tested it on different levels? But I do agree that there seems to be so much variance; I think it might be best to go with benchmarking software from now on.
 

Elfear

Diamond Member
May 30, 2004
Some people already accept marketing slides as gospel. We are already at the point where the majority of reviews have a bias (intentional or unintentional). The trick is taking the various results and putting them in context.



Yes they were, my mistake.

Though this perfectly demonstrates one of the issues with mainstream GPU reviews: there are no set standards other than what the authors set for themselves. Even testing a game in a different level can skew results massively. [H] are now famous for the fact that they test for the maximum playable settings on the most demanding levels. The problem is that this is entirely subjective and means nothing for the majority of users. I frequently ignore the max-playable settings and go straight to the apples-to-apples results. Having said that, other people may find this info invaluable.

It has become such a minefield that it is literally possible to find wildly conflicting information. As you alluded to in your post, the data available is so disjointed and conflicting that it is possible to come away confused. I didn't mean to but I demonstrated this quite well with my stupid reading comprehension error. :)

All good points. That's why I've been using 3D Center lately since they compile performance results from multiple reviews. Very easy to get an overall picture of how cards perform against one another. They also include power draw from multiple sites for those who put a lot of emphasis on that metric.
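A meta-review like 3D Center's boils down to combining per-site performance ratios. A minimal sketch of the idea (the ratios below are made up for illustration):

```python
from math import prod

# Hypothetical card-X-vs-card-Y fps ratios reported by three different sites.
ratios = [1.08, 0.97, 1.05]

# The geometric mean is the standard way to average relative results,
# since it treats a 10% win and a 10% loss symmetrically.
geomean = prod(ratios) ** (1 / len(ratios))
print(f"Overall: card X is {100 * (geomean - 1):+.1f}% vs card Y")  # → +3.2%
```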
 

ICDP

Senior member
Nov 15, 2012
Back on the topic of the thread.

My own guess is that it is mostly peer pressure. It is the same reason people pay over the odds for iPhones and iPads compared to cheaper yet equally or better spec'd alternatives.

Everyone who is objective and knows GPUs will agree it isn't a build-quality thing. Driver issues may be used as an excuse, but to be honest it is more to do with preference/familiarity than poor quality. Plenty of issues come up in the drivers of both vendors, and we could make a long list of current issues from both vendors' current drivers.

Most people here have been gaming for a long time and old habits die hard. The idea that Nvidia somehow has better quality or feature sets is not based on fact. There have been plenty of instances where AMD was first to market with DX10 or DX11, yet little is ever made of this fact. But PhysX and CUDA = teh aswum.
 

boxleitnerb

Platinum Member
Nov 1, 2011
You forget that it takes time to develop games for a new DX version; adoption is very slow. PhysX, downsampling etc. work "instantly". For example, there will be only one DX11.1/2 game out in the near future (BF4), but 3 titles with GPU PhysX (CoD: Ghosts, Batman: AO and The Witcher 3).

In my opinion, many Nvidia features are used quicker (or require no adoption at all) and thus are more useful in regard to timing. The only thing I really envy AMD for is ZeroCore, but only with Multi-GPU configurations. In Single-GPU mode it's not useful imo. Or when AMD had SGSSAA in DX9 and then automatic LOD adjustment in DX10/11. Those were times when they really held something over Nvidia in terms of quality.

More often than not, Nvidia is first with things (either by direct action or circumstantially). GPU-based physics, TrAA, SLI, Frame Metering, SSAA (via nHancer back then), GPGPU/CUDA, adaptive VSync, Boost+Framelimiter, Downsampling, Application Profiles, 3D (3DVision)... this has provided them with an image as a "doer" rather than a "talker". This isn't completely accurate anymore since AMD is gaining quickly in several areas, but this perception doesn't change overnight.

Personally I see Nvidia as the more innovative company in the GPU sector, at least regarding enthusiasts. As long as that doesn't change, I'm willing to disregard perf/$ discrepancies because I feel better taken care of at Nvidia as an enthusiast.
 

Elfear

Diamond Member
May 30, 2004
You forget that it takes time to develop games for a new DX version, the adoption is very slow. PhysX, downsampling etc. work "instantly". For example there will be only one DX11.1/2 game out there (BF4) in the near future, but 3 titles with GPU-PhysX (CoD Ghosts, Batman AO and Witcher 3).

In my opinion, many Nvidia features are used quicker (or require no adoption at all) and thus are more useful in regard to timing. The only thing I really envy AMD for is ZeroCore, but only with Multi-GPU configurations. In Single-GPU mode it's not useful imo. Or when AMD had SGSSAA in DX9 and then automatic LOD adjustment in DX10/11. Those were times when they really held something over Nvidia in terms of quality.

More often than not, Nvidia is first with things (either by direct action or circumstantially). GPU-based physics, TrAA, SLI, Frame Metering, SSAA (via nHancer back then), GPGPU/CUDA, adaptive VSync, Boost+Framelimiter, Downsampling, Application Profiles, 3D (3DVision)... this has provided them with an image as a "doer" rather than a "talker". This isn't completely accurate anymore since AMD is gaining quickly in several areas, but this perception doesn't change overnight.

Personally I see Nvidia as the more innovative company in the GPU sector, at least regarding enthusiasts. As long as that doesn't change, I'm willing to disregard perf/$ discrepancies because I feel better taken care of at Nvidia as an enthusiast.

So if you were buying a high-end GPU today, Physx and Boost would be worth the $100 difference (35% more expensive in absolute terms) between the cheapest 1Ghz 7970 and 770? Or is it the fact you'd be willing to spend an extra $100 to support a company that you see as more innovative?

Genuinely curious here.
 

SiliconWars

Platinum Member
Dec 29, 2012
You forget that it takes time to develop games for a new DX version, the adoption is very slow.

As opposed to PhysX where adoption is basically on a sponsorship basis only.

PhysX, downsampling etc. work "instantly". For example there will be only one DX11.1/2 game out there (BF4) in the near future, but 3 titles with GPU-PhysX (CoD Ghosts, Batman AO and Witcher 3).
In the long run there will be far more DX11.1/2 games than there will ever be PhysX games. Somebody has to be first in order to get the ball rolling, and it's always been AMD.

More often than not, Nvidia is first with things (either by direct action or circumstantially). GPU-based physics, TrAA, SLI, Frame Metering, SSAA (via nHancer back then), GPGPU/CUDA, adaptive VSync, Boost+Framelimiter, Downsampling, Application Profiles, 3D (3DVision)... this has provided them with an image as a "doer" rather than a "talker". This isn't completely accurate anymore since AMD is gaining quickly in several areas, but this perception doesn't change overnight.

Personally I see Nvidia as the more innovative company in the GPU sector, at least regarding enthusiasts. As long as that doesn't change, I'm willing to disregard perf/$ discrepancies because I feel better taken care of at Nvidia as an enthusiast.
HSA (hUMA)
ZeroCore
PowerTune
MLAA
Eyefinity
DX11
Tessellation
DX10
DX9
Almost always first to a new node.

Most of AMD's firsts are true advances to gaming, most notably Eyefinity (yes I'm aware of Triplehead2Go) which is worth more than everything Nvidia has developed on your list, imo.
 
Jul 29, 2012
AMD really needs to catch up in drivers: native adaptive vsync or an fps limiter, to start. And frame pacing for Crossfire is still not done.

While it's a newer feature, adaptive vsync is desirable for me. An fps limit can be set with tools like RadeonPro, but new and average users will not have any clue.

Crossfire may serve a limited subset of users, but it will be even more limited without the fix. Right now it is unusable.
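For anyone curious what an fps limiter actually does, the core loop is tiny. A minimal sketch (illustrative only; this is roughly the mechanism tools like RadeonPro provide, not their actual code):

```python
import time

def run_limited(render_frame, fps_cap=60, frames=120):
    """Call render_frame repeatedly, sleeping off the rest of each frame's time budget."""
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # the expensive per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # pace the next frame for more even delivery
```

Capping fps this way also evens out frame delivery, which is why limiters are sometimes suggested as a band-aid for pacing issues.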
 

Dribble

Platinum Member
Aug 9, 2005
People look beyond performance. In particular, the rule of "if it ain't broke, don't fix it" applies: if you've used company X and it's worked fine, why change? Most people will just stick with what they know unless it's letting them down.

Simply put, Nvidia has been more reliable over the years, and that leads to customer retention. If anything, AMD is doing better than it should in the consumer market, as a lot of people will root for the underdog. Look at the pro market, where professionals take a colder, more logical view: it's completely dominated by Nvidia.
 

SirPauly

Diamond Member
Apr 28, 2009
As opposed to PhysX where adoption is basically on a sponsorship basis only.

In the long run there will be far more DX11.1/2 games than there will ever be PhysX games. Somebody has to be first in order to get the ball rolling, and it's always been AMD.

HSA (hUMA)
ZeroCore
PowerTune
MLAA
Eyefinity
DX11
Tessellation
DX10
DX9
Almost always first to a new node.

Most of AMD's firsts are true advances to gaming, most notably Eyefinity (yes I'm aware of Triplehead2Go) which is worth more than everything Nvidia has developed on your list, imo.


nVidia offered the first unified DirectX 10 architecture -- G80!
 

boxleitnerb

Platinum Member
Nov 1, 2011
As opposed to PhysX where adoption is basically on a sponsorship basis only.

In the long run there will be far more DX11.1/2 games than there will ever be PhysX games. Somebody has to be first in order to get the ball rolling, and it's always been AMD.

Irrelevant, since once a significant number of DX11.1/2 games have been released (probably even before that, in Q1 2014), the playing field will be even (Maxwell). But you still cannot play PhysX games on AMD, so that advantage stays relevant for the foreseeable future. That's the problem with open technologies: adoption is slow, so even if you support one first, the advantage is very limited.

HSA (hUMA)
ZeroCore
PowerTune
MLAA
Eyefinity
DX11
Tessellation
DX10
DX9
Almost always first to a new node.

HSA: Not relevant yet.
ZeroCore: I counted that already. Its usefulness is limited imo, as I said.
PowerTune: Yes.
MLAA: Yes.
Eyefinity: Yes.

The rest are not special features of AMD that required some kind of innovation; it was "just" releasing their hardware on time. There was no conscious effort to differentiate themselves from the competition (except perhaps with DX10.1 and 11.1), so they are definitely not comparable with the things I listed. Tessellation is part of DX11 and was much faster on Nvidia at first, and Nvidia was first with DX10 (G80).

Most of AMD's firsts are true advances to gaming, most notably Eyefinity (yes I'm aware of Triplehead2Go) which is worth more than everything Nvidia has developed on your list, imo.

In your opinion. I disagree. I hate bars, so I would never buy multiple monitors.
 

SiliconWars

Platinum Member
Dec 29, 2012
ATI had hardware tessellation long before DX11 (way back in 2001, I believe); it just didn't get adopted much until then, through no fault of theirs. I forgot they were first with DX10.1, not 10.
 

boxleitnerb

Platinum Member
Nov 1, 2011
Had TruForm been used more and not died off, it would have been more relevant. But it disappeared pretty quickly. I guess one could call it ATI's PhysX ;)

No one knows who contributed more, or how that contribution could be judged; it's entirely a matter of personal perspective and preference. I was saying that I personally feel more at home with Nvidia, at least since 2006. I've come to appreciate most of their features (I forgot CSAA, AA, AO and SLI compatibility bits!) and found them most useful. That may change or it may not, who knows.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
HSA (hUMA)
ZeroCore
PowerTune
MLAA
Eyefinity
DX11
Tessellation
DX10
DX9
Almost always first to a new node.


You can't count something that has no practical benefit currently.

ZeroCore is garbage in multi-GPU, and probably the reason most cards don't wake from sleep.

PowerTune came out when Nvidia wasn't limiting their cards, so not really a benefit... and Kepler's Boost isn't an improvement as far as overclocking goes.

MLAA - I never use it.

Eyestutter? Still no fix.

DX9/10/11, lol?

First to a node, but slowest on it.
 

SiliconWars

Platinum Member
Dec 29, 2012
Most of that only affects the <1% of people crazy enough to go dual-GPU in the first place. I just don't see the point in spending more money for a worse experience, which any dual-GPU setup surely is.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
My single-GPU AMD experience is similar to my dual-GPU Nvidia experience.

Except I haven't had a DAII issue with a single AMD card like I did with SLI.

AMD dual-card has been hit or miss for me: great bang-for-your-buck performance when it works right, but I spent the first four months of owning the cards stuttering and the last few months troubleshooting driver/software issues.

I'm just rather miffed when something works for months, then I update the driver because a new title needs it and it breaks something I've used and enjoyed for months.