Sad to see even AMD fans want AMD to go out of business.
Probably... just probably, they aren't AMD fans, but gaming fans. I would also rather have both companies die in a fire than see PC gaming get consolized, if you know what I mean.
I know you are probably projecting onto others, thinking "uh huh, they must be AMD fans, smash keyboard!!!!one!!eleventy" because they don't support your opinion. At this point that leads me to think your opinion is based on a bias/preference for a certain company, and that company obviously isn't AMD.
So, to be clear: I don't want any bribing going on, and no crappy graphics effects tied to a GPU brand or made to gimp the other one (my eyes bleed that in 2016 we still treat AO effects as some kind of "graphics fidelity", when in reality AO was born as a cheap hack to approximate the indirect lighting and secondary light bounces of GI without the performance penalty, or the quality, of the latter). No other shenanigans either that make all GPUs less performant for minimal or no IQ gain, and/or mess up the dev cycle (which, considering today's gaming industry, is already pretty messed up by itself) and thus the games themselves, shipping with bugs or even flat-out FUBAR (the last Batman, if I recall right).
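To make the "cheap hack" point concrete, here is a toy hemisphere-sampling sketch of what AO actually computes: the fraction of directions above a surface point that are not blocked by nearby geometry. This is my own minimal illustration (spheres as occluders, names like `ao_factor` are made up, not from any real engine or screen-space technique), not how a shipping SSAO shader works:

```python
import math
import random

def sphere_hit(origin, direction, sphere, max_dist):
    """Ray-sphere intersection test within max_dist (direction must be unit length)."""
    center, radius = sphere
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return False
    t = -b - math.sqrt(disc)
    return 0.0 < t < max_dist

def ao_factor(point, normal, occluders, samples=256, max_dist=1.0, seed=0):
    """Fraction of hemisphere rays that escape nearby geometry:
    1.0 = fully open (bright), 0.0 = fully occluded (dark crevice)."""
    rng = random.Random(seed)
    unoccluded = 0
    for _ in range(samples):
        # Rejection-sample a uniform direction on the unit sphere...
        while True:
            d = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
            length = math.sqrt(sum(c * c for c in d))
            if 0.0 < length <= 1.0:
                d = tuple(c / length for c in d)
                break
        # ...then flip it into the hemisphere around the surface normal.
        if sum(a * b for a, b in zip(d, normal)) < 0.0:
            d = tuple(-c for c in d)
        if not any(sphere_hit(point, d, s, max_dist) for s in occluders):
            unoccluded += 1
    return unoccluded / samples

# Open sky: nothing blocks the hemisphere, no darkening.
print(ao_factor((0, 0, 0), (0, 0, 1), []))                     # 1.0
# A sphere hovering above the point blocks part of the hemisphere.
print(ao_factor((0, 0, 0), (0, 0, 1), [((0, 0, 0.5), 0.3)]))   # < 1.0
```

Note what is missing: no light sources, no bounced light, no color bleeding. It only darkens corners based on local geometry, which is exactly why it is a cheap stand-in for real GI rather than "fidelity".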
ALL OF THAT IS BS, no matter who it comes from.
And this is where the second point comes in. Nvidia fans/shills/stockholders/whatever get mad because Async Compute is starting to become a thing in DX12 games. Well, AC by itself doesn't cripple performance in exchange for some gimmicky graphics; on the contrary, it ENHANCES performance at no compromise.
If the next Hitman game uses this feature a lot, for example, it will be free performance for anyone running a card able to do AC (which, FYI, isn't all GCN GPUs; we have seen less-than-moderate gains on GCN 1.0 GPUs because they have fewer ACEs than 1.1/1.2). The cards that can't do AC at all, or at least not properly, just won't gain anything, and they get the option to turn it off for cases like Maxwell v2, where a small performance loss shows up when AC is enabled (the reasons behind that behaviour are obvious; we can all agree on that at this point).
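The "free performance" argument can be boiled down to a toy frame-time model (all numbers below are made up for illustration, not benchmarks of any real GPU): async compute only helps to the extent that compute work can fill time the shader units would have spent idle during the graphics pass.

```python
def frame_time_ms(gfx_ms, compute_ms, idle_ms, async_compute):
    """Toy model of one frame. Without async compute, the compute work
    runs serially after the graphics pass; with it, compute fills the
    time the shader units would otherwise have sat idle during graphics."""
    if not async_compute:
        return gfx_ms + compute_ms
    overlapped = min(compute_ms, idle_ms)  # can only hide what fits in the idle gaps
    return gfx_ms + (compute_ms - overlapped)

# 10 ms graphics pass with 4 ms of idle shader time, plus 3 ms of compute work:
print(frame_time_ms(10, 3, 4, False))   # 13   -> serial: compute just adds on top
print(frame_time_ms(10, 3, 4, True))    # 10   -> compute fully hidden, "free"
# A GPU that can only overlap a little (think few queues/ACEs):
print(frame_time_ms(10, 3, 0.5, True))  # 12.5 -> modest gain
```

That last case is the hand-wavy version of why GCN 1.0 gains less than 1.1/1.2, and why a card that can't overlap at all sees no benefit.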
Some people like to link this to an "unnecessary amount of compute to cripple NV GPUs" and cite Nvidia's past TWIMTBP behaviour with over-tessellation, an obvious jab at AMD's VLIW uarch. Well, it is not the same at all: the first is the result of devs targeting the AC-capable hardware in the consoles, with the ports carrying that over to PC, while the second was indeed absurd, because the amounts of tessellation involved couldn't actually be seen by the gamer. It was sub-pixel tessellation of a concrete wall, or rendered water occluded from the camera's POV.
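The "sub-pixel" part is simple arithmetic (the triangle count below is a made-up example, not a figure from any specific game): once the average triangle covers less than one pixel, extra tessellation adds vertex and setup cost with zero visible detail.

```python
def avg_triangle_area_px(triangle_count, screen_px=1920 * 1080):
    """Average on-screen area per triangle, in pixels, if the mesh fills the screen."""
    return screen_px / triangle_count

# A mesh tessellated into 8 million triangles, filling a 1080p screen
# (~2.07 million pixels): each triangle averages well under one pixel.
print(avg_triangle_area_px(8_000_000))  # ~0.26 px per triangle
```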
Hopefully AC is here to stay and will indeed make this and next-gen's GPUs use their transistor budget to the fullest, putting the compute capabilities already sitting in the hardware to good use instead of letting them idle.
I know DX12 suddenly went from "the gaming revolution, that API JHH proposed like 78 years ago (yeah, he knows how to stay in shape and hide his age) and designed on a coffee napkin on a rainy Tuesday morning while doing sit-ups, then handed to MS's devs so all humanity can enjoy better performance" to "that API with meh gains that, even though we support it to its fullest (LOOK, the box says DX 12.1 capable, must be good!), came to destroy PC gaming because of MS's greediness with their gaming store (because ALL DX12 games will only be released on that platform, right? RIGHT??), locking in refresh rate and Vsync, not letting you stream properly, making your bacon taste bad, and God knows what other horrors they will force on you". Goebbels would tear up with joy because of you guys, really, but that is not what is happening.