I am wondering why the demo and the earlier digital-download builds of Lords of Shadow worked fine with forced AA on NVIDIA, yet after the newest patch it doesn't work. AMD had severe rendering errors with forced AA, and I can't think of any other reason Mercury Steam/David Cox would have removed the ability to force driver AA.
And Tomb Raider: Underworld, while not perfect, still turned out a lot smoother (not just in frame rate, but also in AA options) than the series reboot did. I never played the reboot, but isn't the SSAA pattern in TR2013 just ordered grid? And why did AMD not ask for more? Was it because their hardware couldn't do it? Crystal Dynamics is perhaps one of my favorite Western devs, and one would think their standards would be higher than what shipped in Tomb Raider 2013, unless they got paid a ton of cash by their newest game's sponsor.
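For anyone wondering why ordered grid is the complaint here, a minimal sketch (the sample offsets below are illustrative, not TR2013's actual pattern) of why 4x ordered-grid SSAA resolves fewer shades on near-vertical/near-horizontal edges than a rotated-grid pattern does:

```python
# 4x ordered grid: a plain 2x2 lattice inside the pixel
ordered = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]

# 4x rotated grid: the lattice rotated so every sample lands on a
# unique row and column (offsets here are just an example pattern)
rotated = [(-0.375, -0.125), (0.125, -0.375), (0.375, 0.125), (-0.125, 0.375)]

for name, pattern in (("ordered grid", ordered), ("rotated grid", rotated)):
    unique_x = {x for x, _ in pattern}  # distinct horizontal sample columns
    # A near-vertical edge sweeping across the pixel can only produce
    # len(unique_x) + 1 coverage levels, so more columns = smoother gradient.
    print(f"{name}: {len(unique_x)} columns -> {len(unique_x) + 1} edge shades")
```

The ordered grid gives only 2 distinct columns (3 edge shades) for 4x the shading cost, while the rotated grid gets 4 columns (5 shades) from the same sample count, which is why driver-level SGSSAA/RGSSAA is the pattern people actually want.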
And I am not an NVIDIA fanboy; AMD did do one thing right with the newest lineup and the previous one, and that was non-crippled DP. But AMD's current CEO, the fact that they don't support driver-forced AA to the level NVIDIA does, and the low frame rates and less-than-optimal AA in the GE titles all lead me to believe that GE is a lot more exclusive than TWIMTBP ever was. (I don't think TWIMTBP was ever really exclusive, given that ATi made little attempt at dev relations, no sincere effort went into their OpenGL drivers for a long time since Microsoft bailed them out with the R300, and ATi's hardware was missing a lot of features that made anything close to the TWIMTBP implementations impossible.) The only genuinely low-class thing I know of that NVIDIA did to their competitor's customers was the Doom 3 command, and that was almost ten years ago. Maybe the tessellation too, maybe not; I don't know enough about that to say.
Dev relations are a good idea, and they really would have been worth more under OpenGL, because there is no way MS can make DX's implementation the same on all hardware no matter how hard it tries, nor can it impose a maximum specification, which only exacerbates the disaster that DX has pretty much always been... I am thinking GE is just being used to cover for the features AMD lacks on top of its poor DX implementations (they have gained a lot more from Microsoft's minimum specs, starting with SM2.0, at the expense of IQ). And it doesn't help that most professional reviews only analyze frame rates, not several different driver versions, their stability, and image quality... I think IQ and compatibility are far more important once you're at 60 fps and beyond, though then I don't value super-high screen resolutions (but we could have had 240 Hz signals and DDM on better-than-the-best-available IPS panels had it not been for HDMI and the fact that higher resolutions eat more bandwidth).
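To put rough numbers on that bandwidth point, here's a back-of-the-envelope sketch (ignoring blanking intervals and auxiliary data, and assuming 24-bit color) comparing a few modes against HDMI 1.4's roughly 10.2 Gbit/s TMDS ceiling (about 8.16 Gbit/s of actual video payload after 8b/10b encoding):

```python
# Raw pixel data rate only; real links need noticeably more than this
# because of blanking intervals, audio, and encoding overhead.
BITS_PER_PIXEL = 24  # assumed 8 bits per channel, RGB

def raw_gbps(width, height, refresh_hz, bpp=BITS_PER_PIXEL):
    """Raw pixel data rate in Gbit/s for a given mode."""
    return width * height * refresh_hz * bpp / 1e9

modes = [
    ("1920x1080 @ 60 Hz", 1920, 1080, 60),
    ("1920x1080 @ 240 Hz", 1920, 1080, 240),
    ("2560x1440 @ 120 Hz", 2560, 1440, 120),
]

# HDMI 1.4 video payload ceiling after 8b/10b encoding, in Gbit/s.
HDMI_14_PAYLOAD_GBPS = 8.16

for name, w, h, hz in modes:
    need = raw_gbps(w, h, hz)
    verdict = "fits" if need <= HDMI_14_PAYLOAD_GBPS else "exceeds HDMI 1.4"
    print(f"{name}: ~{need:.1f} Gbit/s ({verdict})")
```

1080p at 240 Hz alone needs roughly 11.9 Gbit/s of raw pixel data, well past what HDMI 1.4 can carry, which is exactly the kind of ceiling that kept high-refresh signals off the table for so long.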