Question about Intel HD Graphics

tipoo

Senior member
Oct 4, 2012
245
7
81
Does the GPU run at the base clock or the turbo boost clock most of the time in games? I'm curious because of the lower base clock of GT3 vs GT2 in Haswell (200 MHz vs 400 MHz), while the turbo clock is about the same. The GT3 has double the execution resources, but if it runs at half the clock speed most of the time, the performance gains would be lower. If it's always hitting turbo that's a different case, but then why call it turbo? Or does it constantly jump in between, in which case wouldn't performance change from second to second with the thermal load of the CPU cores?
And is the double-performance claim for the GT3 with or without eDRAM? It seems that at up to double the performance you would start to be memory-bandwidth limited, like AMD's APU GPUs are, so maybe the claim assumes the eDRAM?
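A quick back-of-the-envelope sketch of the EU-vs-clock tradeoff the question describes. The EU counts (20 for GT2, 40 for GT3) and the shared ~1150 MHz turbo are illustrative assumptions; only the 200/400 MHz base clocks come from the question, and real performance won't scale perfectly linearly:

```python
# Theoretical shader throughput ~ EUs * clock (very rough model).
def throughput(eus, clock_mhz):
    return eus * clock_mhz

gt2_base  = throughput(20, 400)    # GT2 at its base clock
gt3_base  = throughput(40, 200)    # GT3 at its (halved) base clock
gt2_turbo = throughput(20, 1150)   # assuming both turbo to ~1150 MHz
gt3_turbo = throughput(40, 1150)

print(gt3_base / gt2_base)    # 1.0 -> at base clocks, doubling EUs is cancelled out
print(gt3_turbo / gt2_turbo)  # 2.0 -> the full doubling appears only at equal clocks
```

So in this toy model, GT3's advantage over GT2 depends entirely on how close to its turbo clock it can sustain.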
 
Last edited:

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
At the Turbo Boost (dynamic) frequency, of course. There are many steps between the base and max dynamic frequency.
 

tipoo

Senior member
Oct 4, 2012
245
7
81
There are many steps between the base and max dynamic frequency.


And what frequency does Intel's HD Graphics in Ivy Bridge usually run at in games? I seem to remember AnandTech saying once that it almost always stayed at the Turbo Boost clock, but I can't find that article, which is why I'm asking. If that's the case, GT3 will perform much better than GT2; but if it often drops below the turbo clock toward the base clock, the difference won't be as great, I would guess.

I guess what I'm confused about is: if the highest clock is something it can usually hit, why call it a turbo? AMD and Nvidia GPUs also have a broad range of clock frequencies they scale to with load, but Nvidia's turbo is a different matter; it only goes to that clock when some parts of the chip are underused and there is thermal headroom. If Intel's Turbo Boost clock is really just a max frequency, why call it a boost?
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
And what frequency does Intel's HD Graphics in Ivy Bridge usually run at in games? I seem to remember AnandTech saying once that it almost always stayed at the Turbo Boost clock, but I can't find that article, which is why I'm asking. If that's the case, GT3 will perform much better than GT2; but if it often drops below the turbo clock toward the base clock, the difference won't be as great, I would guess.

I guess what I'm confused about is: if the highest clock is something it can usually hit, why call it a turbo? AMD and Nvidia GPUs also have a broad range of clock frequencies they scale to with load, but Nvidia's turbo is a different matter; it only goes to that clock when some parts of the chip are underused and there is thermal headroom. If Intel's Turbo Boost clock is really just a max frequency, why call it a boost?

:biggrin:
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
And what frequency does Intel's HD Graphics in Ivy Bridge usually run at in games? I seem to remember AnandTech saying once that it almost always stayed at the Turbo Boost clock, but I can't find that article, which is why I'm asking. If that's the case, GT3 will perform much better than GT2; but if it often drops below the turbo clock toward the base clock, the difference won't be as great, I would guess.


It depends on the model. I have an i5-3570K with HD 4000. Even with Prime95 + FurMark, my iGPU stays at 1150 MHz, which is the max dynamic frequency. ULV models don't stay at max; as you can see from AnandTech, they might run around 1000 MHz on average. Standard notebook models with more TDP headroom usually hit the max dynamic frequency, as far as I know. For GT3 models with Haswell we don't know yet.
 

Hulk

Diamond Member
Oct 9, 1999
5,162
3,779
136
Does the GPU run at the base clock or the turbo boost clock most of the time in games? I'm curious because of the lower base clock of GT3 vs GT2 in Haswell (200 MHz vs 400 MHz), while the turbo clock is about the same. The GT3 has double the execution resources, but if it runs at half the clock speed most of the time, the performance gains would be lower. If it's always hitting turbo that's a different case, but then why call it turbo? Or does it constantly jump in between, in which case wouldn't performance change from second to second with the thermal load of the CPU cores?
And is the double-performance claim for the GT3 with or without eDRAM? It seems that at up to double the performance you would start to be memory-bandwidth limited, like AMD's APU GPUs are, so maybe the claim assumes the eDRAM?


Turbo mode was initially developed so that multicore chips could increase the clock speed of the active cores when not all of the cores were in use, thus using all of the available TDP. When the GPU moved on-die with Westmere (I believe), there was the added complication of balancing GPU and CPU speeds within the TDP. If the GPU and CPU are both running at full speed and the chip is getting hot, which should throttle? This is really more of a concern with mobile parts, since with a desktop part it's pretty hard to run into TDP issues these days if you aren't overclocking. Even with mobile parts, the turbo speed of the CPU/GPU is more of a marketing tool, in my opinion. Have a look at some CPU spec sheets and you'll see a crazy range of base clocks and turbo steps for both the CPU and GPU across dozens of basically identical cores, in order to establish price/performance targets.
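The CPU/GPU balancing act described above can be sketched as a toy shared-TDP model (all numbers are made up for illustration; real power management is far more elaborate):

```python
# Toy shared-TDP model: whatever the CPU cores draw leaves less
# headroom for the iGPU, which then has to lower its clock.
TDP = 17.0  # watts; e.g. a hypothetical ULV part

def gpu_headroom(cpu_draw_w, tdp=TDP):
    """Power left over for the iGPU once the CPU takes its share."""
    return max(tdp - cpu_draw_w, 0.0)

print(gpu_headroom(5.0))   # 12.0 -> room for a high GPU turbo clock
print(gpu_headroom(14.0))  # 3.0  -> GPU must fall back toward its base clock
```

This is why the same chip can hold max turbo in a GPU-only benchmark yet drop hundreds of MHz the moment the CPU cores are loaded too.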
 

Piroko

Senior member
Jan 10, 2013
905
79
91
From what I've gathered:
The desktop chips will almost always stay at max turbo.
The 55/45/35 W notebook chips will stay at or close to max turbo if your notebook's cooling is up to the task.
From personal experience:
The 17 W Ultrabook chips will be near their max turbo in benchmarks, hover around 900 MHz in typical games, and drop to around 700 MHz when you additionally push the CPU.

Haswell might be a bit more sensitive in the 35 W versions as well, considering it's the same process with slight tweaks but a substantial increase in theoretical GPU power. That's personal opinion though, so take it with a grain of salt.
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
37 W GT2 isn't that big of an increase; I don't think it will change anything.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
I haven't tested much, but from my experience with 35 W SB, it almost always stays at the max turbo clock. I've only seen it constantly below that when running the OCCT PSU test (which is something like FurMark + LinX).

Judging by how well 18 W Ivy Bridge can run games,
https://www.youtube.com/watch?v=cKWoKtNjL74

it's also probably not running at 350 MHz.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Just remembered, I played through HL2 on an overclocked Athlon 500 and a GeForce 2 Pro back then; that whole system was below 100 watts in 2004 :colbert:
It does run better on IB though, especially at that resolution.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Just remembered, I played through HL2 on an overclocked Athlon 500 and a GeForce 2 Pro back then; that whole system was below 100 watts in 2004 :colbert:
It does run better on IB though, especially at that resolution.

That's EP2, from 2007; it uses DX9c, while your GeForce 2 was running the older version without many of the effects (DX7 mode).
Also, with the Athlon 500 even the original HL2 was probably really bad; I played it on a 2 GHz+ K7. But again, that's irrelevant, because EP2 was a lot more demanding.

Also, the video includes other games, and there is a second video with even more games.

The video is no proof that the IGP isn't locked at 350 MHz, but considering how it runs games like BioShock, I think it's running at a higher clock:
http://youtu.be/cKWoKtNjL74?t=1m23s

Still, not bad for an 18 W Intel CPU+IGP (inside a tablet: http://www.eurogamer.net/articles/digitalfoundry-surface-pro-review)
 
Last edited:

Piroko

Senior member
Jan 10, 2013
905
79
91
Also, with the Athlon 500 even the original HL2 was probably really bad; I played it on a 2 GHz+ K7.
Nah, it ran fine; that was the surprising part back then. My old PC was the center of attention at some local LANs because we threw everything at it (Painkiller: fine, but bullet-time-like slowdown with stable fps during large waves of enemies, actually very cool to play; Doom 3: ran, but really badly; UT03: fine, bar some mods; Serious Sam: brilliant).

And I've got a fairly good idea what that chip can and can't play, I have one. :hmm:
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Does the GPU run at the base clock or the turbo boost clock most of the time in games? I'm curious because of the lower base clock of GT3 vs GT2 in Haswell (200 MHz vs 400 MHz), while the turbo clock is about the same. The GT3 has double the execution resources, but if it runs at half the clock speed most of the time, the performance gains would be lower. If it's always hitting turbo that's a different case, but then why call it turbo? Or does it constantly jump in between, in which case wouldn't performance change from second to second with the thermal load of the CPU cores?
And is the double-performance claim for the GT3 with or without eDRAM? It seems that at up to double the performance you would start to be memory-bandwidth limited, like AMD's APU GPUs are, so maybe the claim assumes the eDRAM?

The lower base clock allows the GT3 to potentially use less power in low-demand 3D scenarios (like Aero or browser 3D) or in 2D rendering: basically anything that can't idle and completely power down but is a very light load. Also, with GT3 the peak might be similar to GT2, but it's expected that it won't be reached for any sustained period of time (part of the reason for GT3 is better perf/watt); it's just for bursts. Since the performance gain is estimated at 50% over GT2, that also implies some loss of clocks.

Leaked specs also indicate that the GT3 parts feature lower CPU clocks than their GT2 equivalents.
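The "some loss of clocks" implied by that estimate can be worked out directly, assuming performance scales linearly with EUs × clock (only a rough approximation; bandwidth limits and other factors will bend the curve):

```python
# If GT3 doubles the execution resources but is only estimated at
# ~1.5x GT2 performance, the implied average sustained clock ratio is:
eu_ratio = 2.0    # GT3 vs GT2 execution resources
perf_ratio = 1.5  # estimated performance gain over GT2

sustained_clock_ratio = perf_ratio / eu_ratio
print(sustained_clock_ratio)  # 0.75 -> roughly 75% of GT2's sustained clock
```

In other words, under this linear model a 50% gain from doubled EUs is consistent with GT3 averaging about three-quarters of GT2's clock.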
 
Last edited:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I assume notebook parts will continue to use a different socket than their desktop counterparts?

I'd be interested in building an SFF Haswell + GT3 rig and hope I'm given the option to do so.