AMD vs. Nvidia - Prefs & Why


Arzachel

Senior member
Apr 7, 2011
903
76
91
Actually, a family like the GTX 2XX series gained multi-monitor support, GPU PhysX support, 3D Vision support, improvements to super-sampled support, ambient occlusion enhancements, adaptive V-sync and frame limiters, and has not been placed on legacy support. These are examples of being pro-active: adding tools, features and enhancements to improve immersion, flexibility and the gaming experience for their customers over the life of the product. This is why I give them money. A far cry from removing features, one may imagine.

Removing features was aimed at the vendor filter placed by Nvidia in Batman Arkham Asylum to disable the in-game AA options for AMD cards. I was also pretty angry about having to choose between the DX9 render path or my GTX 460 being brought to its knees by the unnecessary amount of tessellation in Crysis 2, just so Nvidia's high end would look better in benches.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
Removing features was aimed at the vendor filter placed by Nvidia in Batman Arkham Asylum to disable the in-game AA options for AMD cards. I was also pretty angry about having to choose between the DX9 render path or my GTX 460 being brought to its knees by the unnecessary amount of tessellation in Crysis 2, just so Nvidia's high end would look better in benches.
This statement ^ +1. I agree with it; it's not something officially supported by Nvidia corp, but nonetheless there is much truth behind it.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
What does playing games @ 60fps have to do with you making somewhat false, definitely misleading, and borderline false claims as far as Nvidia goes?

You have to be kidding me on misleading -- from a poster who claims the features are all busted! :)
 

tigersty1e

Golden Member
Dec 13, 2004
1,963
0
76
No they don't

How much does a used 6950 go for?

Actually, AMD cards usually hold their value better when the new gen comes out.

But that is only because the previous 6000, 5000, and 4000 series held a tremendous price/perf advantage over competing Nvidia cards.

When cards become older, people usually buy the better-value card.

But when both cards are still current, Nvidia cards hold their value better because more people want Nvidia cards.

In b4 you talk about 7900 prices before the price drop.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
Actually, a family like the GTX 2XX series gained multi-monitor support, GPU PhysX support, 3D Vision support, improvements to super-sampled support, ambient occlusion enhancements, adaptive V-sync and frame limiters, and has not been placed on legacy support. These are examples of being pro-active: adding tools, features and enhancements to improve immersion, flexibility and the gaming experience for their customers over the life of the product. This is why I give them money. A far cry from removing features, one may imagine.

You have to be kidding me on misleading -- from a poster who claims the features are all busted! :)

PhysX adds NOTHING but lowers frame rates = BUSTED. Adaptive V-sync still screen tears = BUSTED. Supersampling on a 2xx series card is not really an option, so it's busted for 2xx cards; 8x supersampling works nicely in some select games on the 5xx series and above, whereas HQ AA/supersampling works in all games on AMD cards. Ambient occlusion, when and if it even wants to work, makes weird and ugly image anomalies on 2xx series cards and the frame rate takes a dive = BUSTED. 3D Vision is busted because it takes two cards, not one like AMD/Radeon. :)
 

tigersty1e

Golden Member
Dec 13, 2004
1,963
0
76
I usually buy the better performance/price card.

Anyone who agrees with this probably last owned an Nvidia card back in the 8800 GT/GTX days (not counting the GTX 670).
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
I usually buy the better performance/price card.

Anyone who agrees with this probably last owned an Nvidia card back in the 8800 GT/GTX days (not counting the GTX 670).
I paid $650 plus tax, so close to $700, for my XFX 8800GTX back in the day (in '07, I think), and I have to say it was one of the worst GPU investments in gaming I have ever made, to be honest.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Removing features was aimed at the vendor filter placed by Nvidia in Batman Arkham Asylum to disable the in-game AA options for AMD cards. I was also pretty angry about having to choose between the DX9 render path or my GTX 460 being brought to its knees by the unnecessary amount of tessellation in Crysis 2, just so Nvidia's high end would look better in benches.

Good lord! It was the developer's decision! Not some zany conspiracy.

Batman AA GOTYE

http://store.steampowered.com/app/35140

This title has in-game AA for Radeons.

The developers added more tessellation to improve the title, as in these examples -- not some zany conspiracy.

http://forums.anandtech.com/showpost.php?p=32286552&postcount=554
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
PhysX adds NOTHING but lowers frame rates = BUSTED. Adaptive V-sync still screen tears = BUSTED. Supersampling on a 2xx series card is not really an option, so it's busted for 2xx cards; 8x supersampling works nicely in some select games on the 5xx series and above, whereas HQ AA/supersampling works in all games on AMD cards. Ambient occlusion, when and if it even wants to work, makes weird and ugly image anomalies on 2xx series cards and the frame rate takes a dive = BUSTED. 3D Vision is busted because it takes two cards, not one like AMD/Radeon. :)

Exactly why your view doesn't hold any weight with me.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
Exactly why your view doesn't hold any weight with me.
Hey, the truth hurts, but Nvidia's drivers -- or rather their developer relationships and, in turn, their laundry list of well-optimized games -- plus EVGA, are why I stuck with Nvidia for a second go-around since my last Radeon card.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Sorry, your truth doesn't hurt, but your views -- like it taking two GPUs to run 3D Vision and only one to run AMD -- don't make any sense.

I get it -- anything under 60 FPS -- you have no use for.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
Sorry, your truth doesn't hurt, but your views -- like it taking two GPUs to run 3D Vision and only one to run AMD -- don't make any sense.

I get it -- anything under 60 FPS -- you have no use for.

PhysX is a placebo; it adds nothing over the competition, IMHO, other than a serious performance hit. Does 3D Vision take only one GPU now? It used to take two, right? Or am I getting 3D gaming and triple-monitor support mixed up, LOL. I like to do most of my gaming at 60fps with V-sync, but some games play all right at around 30fps, like Metro 2033; I also played the Trine 2 demo at 35fps and had a blast.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Sorry, your truth doesn't hurt, but your views -- like it taking two GPUs to run 3D Vision and only one to run AMD -- don't make any sense.

I get it -- anything under 60 FPS -- you have no use for.

Are you two really going to argue about whose opinion is right? Live and let live, bro. I personally don't notice a difference in image quality anymore, but I think SGSSAA is far easier to enable on AMD hardware, and it also works in more titles (one click in CCC vs. fiddling around in Nvidia Inspector). This is just my opinion; I'm sure others may disagree, which is fine by me.

But comparing apples to apples, they basically look the same to me. I do really enjoy FXAA in the driver control panel, and transparency SS. No, TrSS is not nearly the same as SSAA, but it looks damn good and works in most titles. Nvidia really does it right in terms of software even if the hardware isn't a dramatic leap forward in performance; I wish AMD would step it up in that respect.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
They both suck, but AMD sucks more. Nvidia is better because they make an effort to add more features and they don't force as many optimizations, but they're far from sufficient. Those who are interested in old games are screwed, though: no W-buffer emulation, no ability to force trilinear mipmaps, an OpenGL performance bug, no texel origin adjustment, forced rotated-grid AA broken with some older games, broken 16-bit color quality, no emulation of shadow buffers; the list goes on.

One thing AMD did do a good job with was making sure FP64 precision wasn't super slow. Double fp precision means higher IQ capability, not just better compute capability.
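As a rough, hedged illustration of that last point (plain Python/NumPy, nothing GPU- or driver-specific; the values are only for demonstration): single precision (FP32) carries roughly 7 significant decimal digits while double precision (FP64) carries about 15-16, so rounding error becomes visible much sooner in FP32.

```python
# Illustration only: the FP32 vs. FP64 precision gap the post above mentions.
# Plain NumPy, nothing GPU-specific; values are just for demonstration.
import numpy as np

# 0.1 has no exact binary representation; FP32 rounds it far more coarsely.
print(f"{np.float32(0.1):.20f}")  # ~0.10000000149011611938
print(f"{np.float64(0.1):.20f}")  # ~0.10000000000000000555

# Accumulating many small values sequentially makes the gap obvious:
# the FP32 running total drifts away from the exact answer (100000),
# while the FP64 total stays essentially exact.
n = 1_000_000
acc32 = np.float32(0.0)
acc64 = np.float64(0.0)
for _ in range(n):
    acc32 = np.float32(acc32 + np.float32(0.1))
    acc64 = acc64 + np.float64(0.1)

print(acc32)  # visibly off from 100000 due to accumulated FP32 rounding
print(acc64)  # very close to 100000
```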
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Removing features was aimed at the vendor filter placed by Nvidia in Batman Arkham Asylum to disable the in-game AA options for AMD cards. I was also pretty angry about having to choose between the DX9 render path or my GTX 460 being brought to its knees by the unnecessary amount of tessellation in Crysis 2, just so Nvidia's high end would look better in benches.

That wasn't done at Nvidia's request or anything like that. Basically it comes down to Crytek being lazy; they tested on Nvidia hardware anyway, determined it was good enough and moved on. They even have a vsync bug in their engine that dates back to the original Crysis: if you run 1920x1080 and enable vsync, you are locked to 50Hz, not 60.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
They both suck, but AMD sucks more. Nvidia is better because they make an effort to add more features and they don't force as many optimizations, but they're far from sufficient. Those who are interested in old games are screwed, though: no W-buffer emulation, no ability to force trilinear mipmaps, an OpenGL performance bug, no texel origin adjustment, forced rotated-grid AA broken with some older games, broken 16-bit color quality, no emulation of shadow buffers; the list goes on.

One thing AMD did do a good job with was making sure FP64 precision wasn't super slow. Double fp precision means higher IQ capability, not just better compute capability.
"Double fp precision means higher IQ capability" Could this be the reasoning as to why Radeon cards look like better IQ to my eyes ? I have been trying to find a way to explain it but Radeon cards just make games look better IMHO.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Are you two really going to argue about whose opinion is right? Live and let live, bro. I personally don't notice a difference in image quality anymore, but I think SGSSAA is far easier to enable on AMD hardware, and it also works in more titles (one click in CCC vs. fiddling around in Nvidia Inspector). This is just my opinion; I'm sure others may disagree, which is fine by me.

But comparing apples to apples, they basically look the same to me. I do really enjoy FXAA in the driver control panel, and transparency SS. No, TrSS is not nearly the same as SSAA, but it looks damn good and works in most titles. Nvidia really does it right in terms of software even if the hardware isn't a dramatic leap forward in performance; I wish AMD would step it up in that respect.

I specifically used the GTX 2XX series as an example of nVidia being pro-active. Over the life cycle of this product it received, and still continues to receive, full support -- features like 3D Vision and GPU PhysX support, but also improvements like enhancements to ambient occlusion, multi-monitor support, additional super-sampled modes, adaptive V-sync, frame limiters and FXAA. All kinds of goodies that may be appealing to gamers over the life of the product. These additions are not opinion.

It is nVidia's pro-active nature and what they offer their customers that is the differentiation for me.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm probably going to stay away from Nvidia for a while, due to bumpgate. Their sheer incompetence and deception pissed me off and left me with two dead laptops.

It is slightly annoying that they do things like buy Physx then lock it up, ensuring that most game companies do nothing with it, thus making it essentially worthless to everyone.

I'm not an AMD fanboy but when there are only two choices and one of them kind of pissed you off, you only have one left. The Nvidia CEO also has no problems with blatant lies and deception (woodscrews anyone?).

AMD isn't perfect but again I'm not rewarding Nvidia for acting like they do, so I don't have any other choice. There isn't a significant difference between their offerings so it isn't like I'm sacrificing anything by avoiding Nvidia until I think they have straightened out.

Most game companies did nothing with PhysX anyway; in fact, since Nvidia acquired that tech it has actually been used in a couple of major titles that likely would not have used it otherwise. I don't like the performance hit, but the features that GPU PhysX brought to Batman: Arkham City, for example, were pretty nice additions to the eye candy.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I specifically used the GTX 2XX series as an example of nVidia being pro-active. Over the life cycle of this product it received, and still continues to receive, full support -- features like 3D Vision and GPU PhysX support, but also improvements like enhancements to ambient occlusion, multi-monitor support, additional super-sampled modes, adaptive V-sync, frame limiters and FXAA. All kinds of goodies that may be appealing to gamers over the life of the product.

It is nVidia's pro-active nature and what they offer their customers that is the differentiation for me.

I'm totally with you there. Like I mentioned, Nvidia really puts a lot of effort into their software and it generally shows. It's the little attention-to-detail stuff; it's usually not a major feature, but it makes life slightly easier.

I've always thought AMD made fine hardware; I did enjoy my 7970s a lot when I used them -- it's just unfortunate that they're hurting in terms of software support, I presume because of layoffs at AMD. I remember back when the 5000 series had just come out, they had made huge strides in software support for their cards, and this year they sort of stagnated. Hopefully this changes whenever their next 8000 series hits.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
I specifically used the GTX 2XX series as an example of nVidia being pro-active. Over the life cycle of this product it received, and still continues to receive, full support -- features like 3D Vision and GPU PhysX support, but also improvements like enhancements to ambient occlusion, multi-monitor support, additional super-sampled modes, adaptive V-sync, frame limiters and FXAA. All kinds of goodies that may be appealing to gamers over the life of the product. These additions are not opinion.

It is nVidia's pro-active nature and what they offer their customers that is the differentiation for me.
All these enhancements are moot on 2xx series cards, because the 2xx series cards just do not output a quality image like the 4xx series and up do with Shader Model 5.0, DX11 and OpenGL 4.1.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I would say existing GTX 2XX series owners enjoyed receiving adaptive V-sync and FXAA for their games.

I would say when nVidia first introduced surround gaming with the GTX 4XX series, GTX 2XX SLI owners enjoyed the new surround gaming feature without having to buy another platform for it.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
All these enhancements are moot on 2xx series cards, because the 2xx series cards just do not output a quality image like the 4xx series and up do with Shader Model 5.0, DX11 and OpenGL 4.1.

So you're saying that when you used a DX10 card it wasn't a quality image? I bet back in 2009 it was, no? What changed?

Image quality has little to do with whether or not it used DX11; we had this discussion in the Crysis IQ thread.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
I would say existing GTX 2XX series owners enjoyed receiving adaptive V-sync and FXAA for their games.

I would say when nVidia first introduced surround gaming with the GTX 4XX series, GTX 2XX SLI owners enjoyed the new surround gaming feature without having to buy another platform for it.
I already stated adaptive V-sync does not work; the screen still tears. And FXAA is a low-quality AA, so I'd rather use normal AA, supersampling, or nothing. I have my EVGA GTX 275 sitting on my desk right in front of me, and the reason I upgraded to this GTX 5xx card was that the IQ was all busted on the GTX 275.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
nVidia's image quality has been pretty consistent since the 8800 GTX with AA sample positioning and anisotropy quality. I don't know what you mean by busted IQ.