IntelUser2000
I would assume that under more realistic gaming conditions the user would reduce GFX settings to achieve a playable framerate, which would drive CPU usage up, leaving less power for the GPU.
Wouldn't you think so?
No, and this is why Furmark is unrealistic. Games don't behave like Furmark; the fundamental code is different. It's Linpack for 3D. Linpack doesn't care about memory bandwidth, about the ILP limits that even HPC codes run into, about working sets that don't fit conveniently in the CPU caches, or about interconnects. You'll notice it right away if you run them on both an x86 desktop and a laptop.
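To make the distinction concrete, here's a minimal C sketch of the two kinds of load (my own illustration with made-up function names, not actual Furmark or Linpack code): the first loop is pure register-resident ALU work, which maxes out power draw the way a stress test does, while the second streams through an array far bigger than any cache, so DRAM bandwidth sets the ceiling, which is closer to how real code behaves.

```c
#include <stdio.h>
#include <stdlib.h>

/* Power-virus style: all operands stay in registers, so the ALUs run
 * flat out and memory bandwidth is irrelevant. */
double compute_bound(size_t iters) {
    double a = 1.0001, b = 0.9999, acc = 0.0;
    for (size_t i = 0; i < iters; i++)
        acc = acc * a + b;      /* dependent multiply-add, pure ALU work */
    return acc;
}

/* Bandwidth-bound: every iteration touches new memory, so the cache
 * hierarchy and DRAM, not the ALUs, limit throughput. */
double bandwidth_bound(const double *data, size_t n) {
    double acc = 0.0;
    for (size_t i = 0; i < n; i++)
        acc += data[i];         /* one load per flop; DRAM-limited for large n */
    return acc;
}

int main(void) {
    size_t n = 1 << 24;         /* 128 MB of doubles, far larger than any CPU cache */
    double *data = malloc(n * sizeof *data);
    if (!data) return 1;
    for (size_t i = 0; i < n; i++)
        data[i] = 1.0;
    printf("compute-bound:   %f\n", compute_bound(n));
    printf("bandwidth-bound: %f\n", bandwidth_bound(data, n));
    free(data);
    return 0;
}
```

Time both loops and the gap shows up immediately: the compute-bound one scales with clock speed and runs hot, while the bandwidth-bound one barely cares about the CPU and is dominated by the memory subsystem.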
Real-world games also have developers working closely with the video card vendors so that people can actually play them. They'll have people sitting in front of the screen dialing back features that stress the GPU too much relative to their visual impact; you wouldn't enable 16x AF on the lowest preset. Furmark doesn't care, because it's a stress test that pretends to be a 3D benchmark.
Do you know what a good 3D benchmark is? 3DMark before they started copying mobile benchmarks (Fire Strike, Cloud Gate, Time Spy). The last of that line is 3DMark 11, and the older versions were good back in their day. I can approximate how iGPUs will perform simply by looking at 3DMark 11 scores.