
Gaming with Intel HD 4000

krnmastersgt

Platinum Member
Well, I wasn't really sure where to go with this question, but I figured this was the best sub-forum to post in.

Has anyone attempted to play, or does anyone play, games with the HD 4000 on the 3rd-gen i5/i7s? Just curious as to how they perform with some modern games. I'm wondering if (this is if GW2 isn't out by late July) I'd be able to do my normal routine + League of Legends, which is the only thing I'm really playing right now, on the integrated HD 4000 on my 3570K, or if that'd be a bit too much for it with everything maxed out; though I guess if it ran fine without shadows, that'd be an OK sacrifice. I play at 1920x1080 currently.
 
I'm interested in this as well, but for a different reason. I've been gaming much more than I thought I would on my Sandy Bridge MacBook Air. I'm thinking about selling my current MBA and, if IVB integrated graphics are good enough, just getting the next-gen Air. If not, I might be moving to a Pro with dedicated graphics. Anyone know any rumors about the next-gen Air/Pro specs?
 
Anyone know any rumors about the next-gen Air/Pro specs?

You probably mean Intel's Hashwell. It will supposedly be a graphics monster, probably several times better on the GPU side than IVB. The CPU is also supposed to be up to 15-20% faster, if I'm not mistaken. Hashwell is supposed to be a big jump in CPU technology. That's the common rumor, at least.
 
Yeah, me too; my card is coming 2 weeks later. So how will a 3570K run, say, Ghost Recon: Advanced Warfighter 2 at 1920x1080, as well as D3?
 
I tried out the HD 4000 graphics on my 3570, and it was meh. I bought everything but a GPU, thinking I could limp along with the onboard graphics for a bit, but my old 9800 GTX outperforms it by quite a bit. I don't have any benchmarks for you, just real-world experience. I play at 1680x1050, and while the HD 4000 could play Bad Company 2, it was less than ideal. It will work in a pinch, but it's not going to replace a discrete card.
 
I tried out the HD 4000 graphics on my 3570, and it was meh. I bought everything but a GPU, thinking I could limp along with the onboard graphics for a bit, but my old 9800 GTX outperforms it by quite a bit. I don't have any benchmarks for you, just real-world experience. I play at 1680x1050, and while the HD 4000 could play Bad Company 2, it was less than ideal. It will work in a pinch, but it's not going to replace a discrete card.

Awesome feedback. Forget numbers, experience is what matters.
 
You probably mean Intel's Hashwell. It will supposedly be a graphics monster, probably several times better on the GPU side than IVB. The CPU is also supposed to be up to 15-20% faster, if I'm not mistaken. Hashwell is supposed to be a big jump in CPU technology. That's the common rumor, at least.

Hashwell? Do you mean Haswell? Ivy Bridge MacBooks haven't even been released yet. Seems like a lot of speculation.

I would like to see how IVB compares to SNB on Airs in newer games. Right now I can play TF2, but at a low resolution and low settings. I'd like to at least play at native res.
 
I used HD 2000 graphics to play a bit of Half-Life 2 and a lot of Torchlight 1, and it worked perfectly for those. So the HD 4000 will at least work for old games and light 3D.
 
Everything prior to the HD 4000 from Intel was pretty much useless for games. The HD 4000 still sucks, but it's usable as long as you don't play modern games or at high res.

I game on a 24" 1920x1200. I play a lot of CS:S, and I tried using the HD 4000 as my only GPU; it was the first time in about 8 years I've seen my FPS drop below the 299 fps cap. It averages around 90-100 fps, so it's still playable, but it definitely shows its weakness.

The F.E.A.R. series, Fallout 3, S.T.A.L.K.E.R., L4D, and Dead Island were all unplayable.
 
Everything prior to the HD 4000 from Intel was pretty much useless for games. The HD 4000 still sucks, but it's usable as long as you don't play modern games or at high res.

I game on a 24" 1920x1200. I play a lot of CS:S, and I tried using the HD 4000 as my only GPU; it was the first time in about 8 years I've seen my FPS drop below the 299 fps cap. It averages around 90-100 fps, so it's still playable, but it definitely shows its weakness.

The F.E.A.R. series, Fallout 3, S.T.A.L.K.E.R., L4D, and Dead Island were all unplayable.

Yeah, stupid Intel won't give us a GTX 680 class IGP, this sucks.
 
Yeah, stupid Intel won't give us a GTX 680 class IGP, this sucks.

I sense a hint of sarcasm :sneaky:

Well, it's not so much to ask that Ivy at least compete with Llano. It's coming close, but still not there. All the games I mentioned that I can't play on my 3770K, I can play just fine on my A8, as well as on my 3770K with a GeForce 8800 GT.
 
Well, thanks for the info on your experiences with running it, everyone!
Guess I won't be decommissioning my card during the summer; I might underclock everything, though.
 
The HD 4000 is a decent GPU. While waiting for my GTX 670, which just arrived today, I comfortably played Mass Effect 3 and Diablo III on it at 1080p. I set the quality fairly low, but was able to maintain a very playable frame rate regardless of the situation.
 
The HD 4000 is a decent GPU. While waiting for my GTX 670, which just arrived today, I comfortably played Mass Effect 3 and Diablo III on it at 1080p. I set the quality fairly low, but was able to maintain a very playable frame rate regardless of the situation.

Really? I have a 3570K and I'm waiting for my 7850. What CPU did you use to play D3 at 1080p?
 
Diablo 3 played alright at lower settings. Do NOT try Skyrim, though.

Those are the only games I've tried. I was impressed though. It's a decent backup.
 
I've been gaming solely with my freshly arrived i5-3570K for about a week now. It's more than adequate for games like Dirt 3 and Total War: Shogun 2 with low settings at 1920x1080. Metro 2033 needs to drop to DX10 to become playable (I don't care much about boring FPSes in any case; I'm just trying out the 4000's capabilities). In older DX9 titles I am seeing a ~200% framerate increase over my previous Athlon X2 3800 / Radeon 2600 Pro platform.
 
I've been gaming solely with my freshly arrived i5-3570K for about a week now. It's more than adequate for games like Dirt 3 and Total War: Shogun 2 with low settings at 1920x1080. Metro 2033 needs to drop to DX10 to become playable (I don't care much about boring FPSes in any case; I'm just trying out the 4000's capabilities). In older DX9 titles I am seeing a ~200% framerate increase over my previous Athlon X2 3800 / Radeon 2600 Pro platform.

Seems promising. I think for me it's probably worth it to upgrade to the next MBA, Retina or not.

How much of a difference will there be between desktop and laptop Intel HD 4000 graphics? I try to stick with less taxing games on my current MacBook Air, but it sucks when I can't even play games like HL2 at full res.
 
Yes... let's put every heat source in a case into one tiny 1"x1" chip. On the plus side, it'd be really easy to cook eggs every morning or heat up some kind of snack while gaming.

Personally I wish Intel would butt out of integrated graphics and focus on just making a CPU. Or at least make a chip every generation that focused completely on computing and didn't have an inkling of graphics processing. Make that chip the OC chip... I'd probably buy it.

Oh, and if my laptop with HD Graphics 2500 can play LoL, the HD 4000 should be able to play it just fine.
 
Just curious; outside of measuring your e-penis sizes (this would be a general "you", not meant to hurl insults at anyone - and I am exempt because my only computer would be a lowly laptop running on an i5-2410M NO I'M NOT JEALOUS YOU SHUT YOUR FILTHY MOUTH anyhow), why not just get Trinity if you're interested in running with an IGP?

I can't imagine that you'd notice a difference in CPU speed during everyday use (assuming you're not, like, an engineer, film editor, or possibly a weird combination of both?), and granted, looking at your specs it looks like you've already made your purchase (so my question is a moot one, really), but I'm wondering if it was a consideration before you made your purchase.
 
I've had the GTX for over a year now 😛

But it's a pretty fierce source of heat, and my room can get pretty hot in the summer, so I was wondering how the system would perform overall if I took the card out for a few months and just ran it off the integrated HD 4000. I planned on downclocking my chip a bit as well; that, or just crank up the AC, I suppose, but my house isn't very efficient when it comes to cooling due to its design, so it's a lot of wasted energy (as is the heater during the winter). Just a question out of curiosity more than anything, really.
 
Just curious; outside of measuring your e-penis sizes (this would be a general "you", not meant to hurl insults at anyone - and I am exempt because my only computer would be a lowly laptop running on an i5-2410M NO I'M NOT JEALOUS YOU SHUT YOUR FILTHY MOUTH anyhow), why not just get Trinity if you're interested in running with an IGP?

I can't imagine that you'd notice a difference in CPU speed during everyday use (assuming you're not, like, an engineer, film editor, or possibly a weird combination of both?), and granted, looking at your specs it looks like you've already made your purchase (so my question is a moot one, really), but I'm wondering if it was a consideration before you made your purchase.

I'm really only interested in laptop performance, probably either a MacBook Air or Pro, and since a Pro with a dedicated GPU is much more money, I'd like to stay the Air route. So if the HD 4000 is a lot better than the HD 3000, then it's worth it for me to upgrade.

Not for anything crazy, but I would like to play mid-range games at native res, something I can't do right now on my current Air.
 
why not just get Trinity if you're interested in running with an IGP?

I like the size and lightness of the Ultrabook. Give me that form factor with an AMD Trinity heart that doesn't burn my skin if I set it in my lap, and I'll buy in a heartbeat. 🙂

AT has a couple of articles with detailed benchmarks on the HD 4000 now. In a nutshell, it's crap. An improvement over the HD 3000, but performance was already nearly rock bottom with it. If you're used to playing games at higher detail settings, be prepared for a shock.

'Course, one doesn't buy an ultrabook for gaming.
 