But I feel that if they had known the default was 64x, they would have pushed Nvidia to drop it to 16x. A ~75% performance boost with no visible quality difference is a pretty big incentive to make that change.
You place far too little blame on Project RED. Of course they knew the default was 64x. Do you really think they would implement a feature without testing it? Even if the code was already written, it's a lot of work to integrate into the engine. Furthermore, the performance loss is not something competent developers would just write off. They would investigate what is causing the 40-60% FPS drops and try to work around it.
This looks like Nvidia being stupid and simply not optimizing.
As far as performance differences between AMD and Nvidia...
Nvidia wrote the code, so of course it's optimized better for their chips. It's poorly written, but the fact that it performs better on their hardware is no surprise and (assuming this is the only reason for the performance difference) should be expected. If it's doing anything low level, why on earth would you assume it would perform well on AMD or Intel setups?
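To get a feel for why a 64x default is so much heavier than 16x: the triangle count produced by tessellating a patch grows roughly with the square of the tessellation factor, so 64x generates on the order of 16 times the geometry of 16x. Here's a back-of-envelope sketch of that scaling. The base triangle count and the quadratic model are illustrative assumptions, not HairWorks' actual numbers:

```python
def approx_triangles(base_tris: int, tess_factor: int) -> int:
    """Rough model: triangles generated after tessellation grow
    ~quadratically with the tessellation factor. Illustrative only."""
    return base_tris * tess_factor ** 2

base = 1_000  # hypothetical base mesh triangle count, not from the game

tris_16x = approx_triangles(base, 16)
tris_64x = approx_triangles(base, 64)

print(tris_16x)              # triangles at 16x
print(tris_64x)              # triangles at 64x
print(tris_64x // tris_16x)  # 64x produces ~16 times the geometry of 16x
```

Under this rough model, the GPU is pushing an order of magnitude more triangles at 64x for hair strands that look essentially identical at 16x, which is why capping the factor recovers so much performance.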
