All things being equal

Nov 26, 2005
15,197
403
126
With all things being equal, do more cores raise the minimum FPS during dips? I understand overclocking a chip will help raise the minimums, but I'm curious about it with all other variables being equal. I'm on an i7 920. I've tried disabling 2 cores in the BIOS but I can't get it to boot that way...

I don't generally set a specific number of pre-rendered frames to be buffered (1 through 4); I have it set to 'Use the 3D application setting' (Nvidia).

CPU usage:
[attached screenshot: cpuusageut3.jpg]
 

ninaholic37

Golden Member
Apr 13, 2012
1,883
31
91
I think things will still not be equal, because:

- different apps use threads/cores differently
- some apps can use more threads than others
- apps use each thread/core differently at different times based on what they're doing
- background OS activity may fluctuate, and different OSes handle threads differently

More info that would help:

- What are you trying to run
- How many cores are you trying to run it with
- Your program settings

Then someone who's been in the same scenario might be able to give you a more specific and accurate answer. The best answer, I think, at this point would be: if the app can utilize more threads/cores than are available when it dips, then it might not dip with more cores.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
No, because what's really important for that will not be equal between games, even potentially between games using the same base engine. If a game can be found to use up to X cores well, then generally having at least X+1 cores is good, but past that, it's going to be game-dependent.

Now, going forward, having more than needed shouldn't be detrimental, but may not be advantageous, either. Games are moving towards using work queues (anything needing doing goes in the front of the queue, and then a loop picks things out of the back of the queue, and sends them to a thread to be run), instead of old coroutines (each thing in a thread strictly triggers the next actions), allowing for better/simpler scaling up. That won't mean game X will use 20 cores, but odd performance dips from threads bouncing around or not being scheduled ideally shouldn't be much of a problem, going forward.
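To make the work-queue idea concrete, here's a minimal sketch, assuming a plain thread pool (Python purely for illustration; a real engine would use a native job system, and the task names here are made up): every task a frame needs goes into one shared queue, a pool of worker threads drains it, and adding cores just means adding workers.

[code]
import queue
import threading

# Work-queue job system (sketch): anything a frame needs done is pushed onto
# one shared queue, and a pool of workers pulls tasks off and runs them.
jobs = queue.Queue()

def worker():
    while True:
        task = jobs.get()
        if task is None:          # sentinel: shut this worker down
            jobs.task_done()
            break
        task()                    # run the unit of work (AI tick, animation, ...)
        jobs.task_done()

workers = [threading.Thread(target=worker, daemon=True) for _ in range(4)]
for w in workers:
    w.start()

# One simulated frame: push the frame's tasks, then wait for them all to finish.
for name in ("animation", "physics", "AI", "particles"):
    jobs.put(lambda n=name: print(f"processed {n}"))
jobs.join()

# Shut the pool down cleanly.
for _ in workers:
    jobs.put(None)
jobs.join()
[/code]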
 
Nov 26, 2005
15,197
403
126

So something like setting processor scheduling to 'Programs' in the Advanced Performance Options might help with the FPS dips, correct?

If so, will changing the Win32PrioritySeparation registry value have a beneficial impact on FPS dips?
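For anyone who wants to look before touching anything: that value lives under HKLM\SYSTEM\CurrentControlSet\Control\PriorityControl, and a quick, read-only check could look like the sketch below (Python's winreg module; this only reads the value and doesn't change the scheduling behaviour that the 'Programs'/'Background services' toggle controls).

[code]
import winreg

# Read the current Win32PrioritySeparation value. It encodes the quantum
# length and the foreground-process priority boost used by the scheduler.
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\PriorityControl"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    value, value_type = winreg.QueryValueEx(key, "Win32PrioritySeparation")

print(f"Win32PrioritySeparation = {value:#x} (registry type {value_type})")
[/code]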
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
In my experience, no.

The problem is that threaded games tend to have 1 master thread where much of the game logic takes place, and then several worker threads that process things like animations, physics and AI. Those worker threads can often dip and run at slower rates without holding back the primary thread.

Dips in FPS happen when the primary logic thread sees too much load; the minimums are best increased by having faster cores rather than more of them.

From memory, isn't the Intel i7 920 4 physical cores plus 4 HT (logical) cores? Double-check this, but I think overclocking those chips netted you higher speeds with HT turned off. Most games still don't take advantage of more than 4 cores and instead benefit from higher clock speeds.
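A toy model of the master-thread point above, just to show the shape of the argument (Python, with made-up millisecond numbers; not how an engine is actually written): the frame can't finish before the single logic thread does, so past the point where the worker jobs fit in the available cores, only a faster core, not more of them, raises the minimum FPS.

[code]
# Toy model: frame time is gated by the single game-logic ("master") thread.

def frame_time_ms(logic_ms, worker_jobs_ms, worker_cores):
    # Idealized: worker jobs divide perfectly across the cores given to them.
    worker_ms = sum(worker_jobs_ms) / worker_cores
    return max(logic_ms, worker_ms)

logic_ms = 12.0                        # game logic on the master thread
worker_jobs_ms = [6.0, 5.0, 4.0, 3.0]  # animation, physics, AI, etc.

for cores in (1, 2, 4, 8):
    ft = frame_time_ms(logic_ms, worker_jobs_ms, cores)
    print(f"{cores} worker core(s) -> {ft:.1f} ms/frame ({1000 / ft:.0f} FPS)")

# Going from 1 to 2 worker cores helps (18 ms -> 12 ms), but beyond that the
# frame is pinned at the 12 ms logic thread; only shortening logic_ms (a
# faster core) raises the minimum FPS in this model.
[/code]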
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Would there be any benefit in terms of cooling the chip more, if you disable some cores? Perhaps if you achieve a tiny bit cooler temperature, it would let you overclock the remaining active core(s) a tiny bit more?
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
As mentioned above, a game itself won't use more cores than it is programmed to use. The advantage to having more cores than the exact number a game can use is that other programs you have running and background OS tasks won't negatively impact game performance.

Nine times out of ten, disabling cores and trying to overclock the remaining cores a little further ends up being slower overall than just having all of the cores active at whatever clocks you can achieve with them all enabled.
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
It probably does as a side effect of more cores coming with more L3 cache.

That's the only explanation I can think of for why reviews say Civ V performs better with more cores: you can see in perfmon that it never uses more than 12.5% CPU on a quad-core with Hyper-Threading, meaning it is pegging one thread. I also tested this by setting the CPU affinity to allow it only one logical core, with no difference whatsoever.

Disabling cores won't give you the smaller cache that a chip which really had fewer cores would have.
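If anyone wants to repeat that kind of affinity test, here's a rough sketch using the third-party psutil package ("CivilizationV.exe" is just a placeholder for whatever the game's process is actually called): pin the running process to a single logical core and compare its behaviour against an unrestricted run.

[code]
import psutil  # third-party: pip install psutil

# Restrict an already-running process to one logical core, mimicking the
# affinity test described above. Replace TARGET with the real process name.
TARGET = "CivilizationV.exe"

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        print("before:", proc.cpu_affinity())   # e.g. [0, 1, 2, 3, 4, 5, 6, 7]
        proc.cpu_affinity([0])                  # allow only logical core 0
        print("after: ", proc.cpu_affinity())   # [0]
[/code]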