- Feb 11, 2011
- 480
- 0
- 0
Why do Nvidia cards with fewer shaders beat AMD cards?
I used to buy almost exclusively ATI cards back in the good ol' DX9 days. But since the advent of DX10 and unified shaders, it has always bugged me why, with so much raw horsepower on paper, AMD cards are so much slower than their Nvidia counterparts. I'm OCDish about certain things, and having unused things in my life is one of them. So I've avoided AMD cards except for the one time I bought a 4850 for the hell of it. I had an 8800 GTS that lasted me five years, then I bought the 4850, and now I have a 460.
I've heard that with the old Vec5 (VLIW5) architecture, and even with the new Vec4 (VLIW4) one, only about 50% of the shader power actually gets utilized. So why doesn't AMD just cut out the parts that don't get used, or make them work? Or is their architecture so poorly designed that they can't do that without sacrificing overall horsepower?
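To illustrate what I mean by utilization, here's a toy back-of-the-envelope model in C. This is just my own sketch, not anything from AMD's actual compiler or scheduler: the idea is that each VLIW bundle has 5 slots, only independent operations can be packed into the same bundle, and a chain of dependent ops therefore leaves most lanes idle.

```c
#include <stdio.h>

/* Hypothetical toy model of VLIW5 issue packing (my own illustration,
 * not AMD's real scheduler): each cycle the compiler may bundle up to
 * 5 independent scalar ops. An op that depends on the previous result
 * cannot share a bundle with it, so a serial chain fills 1 of 5 slots. */
int main(void) {
    const int SLOTS = 5;   /* VLIW5: 5 ALU lanes per bundle */
    int total_ops, bundles;

    /* Case 1: 20 fully independent ops (e.g. per-channel color math)
     * pack 5 per bundle -> 100% lane utilization. */
    total_ops = 20;
    bundles = (total_ops + SLOTS - 1) / SLOTS;
    printf("independent: %d ops in %d bundles -> %.0f%% of lanes used\n",
           total_ops, bundles, 100.0 * total_ops / (bundles * SLOTS));

    /* Case 2: 20 ops forming a serial dependency chain (each needs
     * the previous result) issue one per bundle -> 20% utilization. */
    bundles = total_ops;
    printf("dependent:   %d ops in %d bundles -> %.0f%% of lanes used\n",
           total_ops, bundles, 100.0 * total_ops / (bundles * SLOTS));

    /* Real shaders are a mix of the two cases, which is roughly why
     * average lane occupancy on VLIW designs lands well below 100%. */
    return 0;
}
```

If that picture is right, the "unused" hardware isn't a fixed dead block you could simply cut out; which lanes sit idle depends on the instruction mix of each shader, which is presumably why the VLIW4 redesign narrowed the bundle rather than removing anything outright.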