Is no one else reading this graph right? Nvidia's power consumption is crazy at higher refresh rates, while the Strix doesn't even budge.
Am I reading this wrong, or are we all downplaying how much less power AMD is using compared to Nvidia?
If this is the case, any power consumption savings Nvidia offers while gaming are thrown out the window for high refresh rate gamers. (This sadly doesn't apply to me, since I can't get a high refresh rate panel at 4K.)
Yeah, right? I always treated higher idle clocks as a normal thing since I was running 3x 1440p with Titan Xs, but seeing AMD handle 3x 1080p @ 144Hz with low idle clocks, I'd really like to see if the same holds true for AMD at 3x 1440p @ 144Hz.
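If anyone with an Nvidia card wants to compare, a quick way to log idle power and clocks is to poll nvidia-smi. This is just a rough sketch I'd use; the query fields (power.draw, clocks.gr, clocks.mem) are standard nvidia-smi ones, but the polling interval and output format are whatever you prefer:

```python
# Rough sketch: poll nvidia-smi every few seconds and print idle power/clocks.
# Run this at the desktop (no game running) with your multi-monitor setup active.
import subprocess
import time

QUERY = "power.draw,clocks.gr,clocks.mem"  # watts, graphics clock MHz, memory clock MHz

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(5)  # sample every 5 seconds; adjust as you like
```

On multi-monitor high refresh setups you'd typically watch whether the memory clock stays pinned at its max instead of dropping to its low-power state, since that's usually where the extra idle draw comes from.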