BFG10K
Lifer
You're still not getting it. In any given situation, you need X amount of CPU power to saturate a graphics card so it becomes the primary bottleneck. X is derived from the CPU's clock speed, number of cores, IPC, etc. Once you reach X, increasing CPU power by adding more cores will do nothing.

The reason I say this is that there appear to be differences between a dual core running at 3.6 GHz and a quad core running at 2.4 GHz, according to some benchmarks I've seen. Since the quad is still faster despite its lower clock speed, it stands to reason that adding or removing cores can matter differently depending on how the game is programmed. In single-threaded games, clock speed was all that mattered.
Thus I just wanted to see whether, in your results, changing the number of cores would have mattered in any particular game, since it may change how the game behaves.
For my particular tests, I demonstrated that X was often reached with just two cores running at 2 GHz. Throwing in more cores simply adds CPU power beyond X, and has no impact since the graphics card was already the primary bottleneck.
Or to put it another way, if the number of cores is making a difference, it means the GPU isn't the primary bottleneck. That wasn't the case in my tests, because I demonstrated the GPU was the primary bottleneck.
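As a rough toy model of that saturation point (the workloads, clock speeds and 0.7 scaling factor below are made up for illustration, not taken from my actual benchmark runs): each frame is gated by whichever of the CPU or GPU takes longer, so once the CPU side is faster than the GPU, extra cores stop showing up in the frame rate.

# Toy model: frame time is set by the slower of the CPU and GPU.
# All numbers here are invented purely to show the shape of the argument.

def frame_rate(cpu_work, gpu_ms, cores, clock_ghz, scaling=0.7):
    # Assume the game only uses extra cores with ~70% efficiency,
    # and that CPU time shrinks with clock speed.
    cpu_ms = cpu_work / (clock_ghz * (1 + scaling * (cores - 1)))
    return 1000.0 / max(cpu_ms, gpu_ms)  # slower component sets the pace

for cores in (1, 2, 4, 8):
    fps = frame_rate(cpu_work=50, gpu_ms=20, cores=cores, clock_ghz=2.0)
    print(cores, "cores:", round(fps, 1), "fps")

# Prints ~40 fps for 1 core, then 50 fps for 2, 4 and 8 cores: once the
# GPU's 20 ms per frame is the bottleneck, adding cores changes nothing.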
Sure, but this level of standardizing is based on the GPU, not on the CPU. Most performance changes come from reducing or increasing graphics detail. That's my point: it's the available GPU power, by far, that determines how playable a game is and what settings can be used.

Perhaps I'm not being clear about what I mean by standardizing. When I build a system I'm looking for a certain minimum level of performance, and anything extra is gravy.
That minimum level of performance might be triple 2560 x 1600 screens at high settings with 2xAA/16xAF at over 60 fps average, or it could be a 1360 x 768 screen at medium with 0xAA/0xAF that has enough fps to feel smooth. If a game cannot perform at the required minimum, that means some component needs upgrading to play that game. So what I want to know is playability at some fixed standard, which I can extrapolate to the parts I use when building or upgrading.
I agree it's subjective, but again I'd challenge everyone to always configure their graphics card to the highest playable settings. I'd also encourage everyone to buy the biggest monitor they can afford instead of running crappy 1680x1050 displays with 4 GHz i7 CPUs.

Thus the problem I have with your test is that it's too variable: it doesn't tell me whether the equipment I buy can perform at the level I want across most games. You are using a subjective test based on what you consider playable. I think this basically involves sacrificing fps to raise the graphics options as high as they can go until the game becomes unplayable to you, and also lowering resolution or certain settings to make a game playable.
In other words, playability is your constant when it really should be the thing you're measuring (in the form of fps at some standard chosen for all games).
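A minimal sketch of that "fixed standard" idea (the games, averages and 60 fps floor here are hypothetical, just to make the comparison concrete): pick one target and check every game against it, rather than tuning each game until it feels playable.

# Hypothetical benchmark averages at one fixed standard
# (resolution, detail, AA/AF all held the same across games).
REQUIRED_FPS = 60

results = {"Game A": 74.2, "Game B": 52.8, "Game C": 61.0}

for game, avg_fps in results.items():
    verdict = "meets the standard" if avg_fps >= REQUIRED_FPS else "needs an upgrade"
    print(f"{game}: {avg_fps} fps average, {verdict}")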
No, not always. I can have a lower split-second minimum due to benchmarking noise, but the benchmark run as a whole can have a higher frame rate, which makes the average higher. If you base your decision on the minimum, then it's completely misguided.

I agree that minimum FPS isn't a perfect indicator of performance. However, more data is better than less data, and a lower minimum can show that the game has become unplayable.
In the absence of a benchmark plot putting a minimum into context, an average is the best single number you can use.
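To illustrate the noise point (frame times invented for the example): a single hitch in an otherwise smooth run barely moves the average, but it wrecks the split-second minimum.

# 300-frame run at a steady 16.7 ms per frame, with one 100 ms hitch.
frame_times_ms = [16.7] * 299 + [100.0]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000.0 / max(frame_times_ms)  # worst single frame

print(round(avg_fps, 1))  # ~58.9: the run as a whole is still close to 60 fps
print(round(min_fps, 1))  # 10.0: the minimum alone makes it look unplayable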