How does the HD graphics on the G1820 compare to other series?

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
On the brink of building a new tower for my wife; the chip will most likely be the G1820 or the G1840.

Intel's specs seem vague on the "HD Graphics" in the G1820. Is it faster than HD 2000 or HD 3000? I doubt it's as fast as HD 4000, but what do I know. :)

Wife plays two games currently: Wizard101 and, lately, a sudden interest in World of Tanks (WoT). Wizard101, of course, runs on a potato. WoT runs well enough on HD 3000 and HD 4000.

Eventually I'll be dropping in a respectable GPU like a 750 Ti, but I was wondering where the G1820 would land with its IGP in the meantime.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
On the brink of building a new tower for my wife; the chip will most likely be the G1820 or the G1840.

Intel's specs seem vague on the "HD Graphics" in the G1820. Is it faster than HD 2000 or HD 3000? I doubt it's as fast as HD 4000, but what do I know. :)

Wife plays two games currently: Wizard101 and, lately, a sudden interest in World of Tanks (WoT). Wizard101, of course, runs on a potato. WoT runs well enough on HD 3000 and HD 4000.

Eventually I'll be dropping in a respectable GPU like a 750 Ti, but I was wondering where the G1820 would land with its IGP in the meantime.

Intel Haswell GT1 (10 EUs) performance will be lower than HD 4000 (16 EUs). Also, HD 4000 was only found on the high-end CPUs with large L3 caches (up to 8MB), making the Celeron G1820's GT1 even slower per EU.
 

Seba

Golden Member
Sep 17, 2000
1,596
258
126
Intel HD Graphics (GT1) from Celeron and Pentium Haswell processors has 10 Execution Units.

HD 4000 (from some Core i3/i5/i7 Ivy Bridge CPUs) has 16 Execution Units, so it is probably faster.

I have a Core i5-4440 Haswell with HD 4600 (20 Execution Units) and I can tell you that even this is very weak for games. Still, it is usable in some games (such as Left 4 Dead 2 and Portal 2 at 720p and some reduced settings, Dota 2 at 1080p with reduced settings, and Limbo).
 

NTMBK

Lifer
Nov 14, 2011
10,400
5,635
136
If you're planning on using the IGP, have you considered an AMD chip? Something like an A8-7600 would be pretty good for her.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Resolution - 1024x768 :rolleyes:

As silly as that resolution sounds, it's the resolution of the monitor my wife uses. Well, it will be until I can get her something like a 900p panel later on down the road.

She was using a bigger 17'' 1280x1024 old CRT, but it honestly gave me migraines and her headaches. It was an older 60 Hz one. D:

I've got the 19'' TV with 1280x800 in our room; it's our Netflix/movie/YouTube tower, and if she catches me off the PC for half a second she jumps on and does whatever. That's how she suddenly decided to play WoT. :D She's awful and I want another platoon buddy, so a new rig it is.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
As silly as that resolution sounds, it's the resolution of the monitor my wife uses. Well, it will be until I can get her something like a 900p panel later on down the road.

Then don't even think about it. ;)

Looks like the GTX 650 is held back by the G1820, but I do know that pumping up some of those settings will easily put the 650 right back to being the bottleneck. Having had a 650 before, I know what it can do. :p

[benchmark charts: 74555.png, 74553.png, thief.gif]

Stay away from the $22 more expensive A6-7400K if you're going to use a discrete GPU at some point.
 

MrTeal

Diamond Member
Dec 7, 2003
3,900
2,622
136
What's your budget for this build, and is it from scratch or are you reusing components? A thread in General Hardware might be able to help you get the best value if you are building a full tower.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I emulated my i5-2500 as a dual core at 2.7 GHz and tested out the HD 2000. It certainly plays Wizard101 perfectly fine up through 1366x768. In WoT it appears to be bottlenecked by the CPU speed, as GPU usage is anywhere from about 75%-90% (other times it will hit 98%), with frames sitting around 45-63 fps. This is with everything on bare-minimum low at 1024x768.

The emulated chip may not be the real G1820, but perhaps the IPC increase from Sandy Bridge to Haswell could make up for the i5's extra cache? I know the G1820 has a better IGP than HD 3000, so the experience could be the same or just a bit better, maybe.

I'll toss my GTX 660 back in and see where the frames land for that game with the same settings. Maybe Nvidia driver overhead or something would dramatically change the experience?