What's faster/better? AMD HD 2400 or Intel HD 3000/4000?

ibex333

Diamond Member
Mar 26, 2005
Can't find conclusive info on this...

I have an old ATI/AMD HD 2400 card and I wonder if it is better to use that over the built-in graphics on my 2500K. And if it is, what if I had an Ivy Bridge? Or a Haswell or a Broadwell? Are any of them truly better purely graphics-wise?

I know this comparison is kind of stupid and pointless to begin with, but I'm just curious and I want to know.
 

ibex333

Diamond Member
Mar 26, 2005
How interesting... Thanks very much. I didn't know about this website.

Got rid of my 290X because I hardly have time to play games, but I do still play Diablo 3 and Command & Conquer 3/Generals on occasion.

Given how well Diablo 3 scales to old hardware, the HD 2400 really is a piece of crap when it comes to gaming. It is really, really bad. I play Diablo 3 at 800x600 in a window and the HD 2400 only pushes 19 fps on average. What's worse, for some reason AMD OverDrive is disabled for it, so it can't be overclocked.

I compared it to an AMD 5450 that I have and got 46 fps on average, and about 54 fps when overclocked to the max (again, that's at 800x600).

I will test the built in Intel GFX later just to compare.


There is something really fun for me about benchmarking older hardware to see just how much abuse it can take and how far it can be pushed in relation to newer games.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
AMD's drivers are better than Intel's so that's a point for the red team.
Normally I'd agree with you; however, AMD's DX10 parts have been legacy for the last couple of years. AMD hasn't pushed out a driver update in quite some time, so at this point the Intel drivers would likely be newer.
 

SPBHM

Diamond Member
Sep 12, 2012
Yes, AMD support for anything older than the HD 5000 series is abandoned; it even feels like older Intel IGPs have better support. You have to use Windows 7 drivers on newer versions of Windows, things like video acceleration in the web browser don't work, and so on...

And the hardware itself is horrible for the HD 2400: only 40 SPs, a very old UVD for video acceleration... The HD 3000 is a way better choice... I would even take the HD Graphics from gen 1 i3/i5s over the HD 2400.
 

Auric

Diamond Member
Oct 11, 1999
To be fair, the HD 2400 is a low-end 64-bit card from eight years ago. Maybe a third-party utility would allow overclocking.

But apparently it wouldn't be worth it compared to the Intel HD Graphics 3000, which itself may be overclockable via the BIOS, a motherboard utility, or the Intel Extreme Tuning Utility.

In my experience casually testing Intel graphics, clock increases result in roughly linear increases in benchmark scores; e.g., first-gen from 733 to 900 MHz averages 21%, and the HD 4600 from 1150 to 1600 MHz averages 37%. DDR3 1333 to 1600 MHz is good for about 3% as well (0-5% across different benchmark components).
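For anyone curious, those figures really are close to linear: the clock bumps work out to about +23% and +39% respectively, against the reported +21% and +37% score gains. A quick Python sketch (numbers taken straight from the post above, nothing else assumed) to check:

```python
# Compare reported benchmark gains to the underlying clock increases
# to see how close to linear the scaling is.

cases = {
    # name: (base clock MHz, overclock MHz, reported score gain %)
    "first-gen HD Graphics": (733, 900, 21),
    "HD 4600":               (1150, 1600, 37),
}

for name, (base, oc, gain_pct) in cases.items():
    clock_pct = (oc - base) / base * 100
    print(f"{name}: +{clock_pct:.0f}% clock -> +{gain_pct}% score")
```

Both gains land just a couple of points below the clock increase, which is about what you'd expect with memory bandwidth held constant.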
 

MrTeal

Diamond Member
Dec 7, 2003
I actually installed my old 2400 in my system when I got my 2500K, and the processor graphics were a couple of times faster in 3DMark06 than the HD 2400. I don't have that card anymore to test, but Anand tested the HD 3000 IGP as being faster than the 5450, and the 5450 is much, much faster than the 2400 Pro.
 

cdebbie

Junior Member
Apr 23, 2015
AMD's HD 2400 Pro/XT is at the level of Intel's HD 2000/HD 3000. Both are DX10, so it depends on which one you prefer. In my opinion AMD's drivers are better than Intel's, so that's an advantage for the red team. Also in my opinion Ivy's HD 4000 and above is better.
 

SPBHM

Diamond Member
Sep 12, 2012
AMD's HD 2400 Pro/XT is at the level of Intel's HD 2000/HD 3000. Both are DX10, so it depends on which one you prefer. In my opinion AMD's drivers are better than Intel's, so that's an advantage for the red team. Also in my opinion Ivy's HD 4000 and above is better.

I have both an Intel HD 2000 and an old Radeon (HD 4670), and the experience with the Intel IGP is better: newer drivers, working video acceleration in the web browser... it automatically installs drivers on the Win10 preview (with the Radeon I have to install Windows 7 drivers manually; even on Windows 8.1 the drivers are already a pain).

The HD 2400 is even worse than the HD 4670 in terms of driver support.

The HD 3000 can keep up with an HD 5450, which is a lot faster than the HD 2400; the HD 2000 should also outperform the HD 2400...

The HD 2400 is only clearly better than really bad IGPs like the GMA 4500.
 

ibex333

Diamond Member
Mar 26, 2005
Didn't want to start a new thread just for that, so I will ask this question here...

I switched from my discrete video card to the HD 3000 and now I can't run my monitor at its native res of 2560x1440 (not in games obviously; I wouldn't expect it to).

It will do a max of 1080p, which looks blurry on a 1440p monitor.

My motherboard is a GIGABYTE GA-Z68A-D3H-B3. I did some research and some people claim that this is the mobo's limitation. But is this really so?

I actually went to Gigabyte's website and installed the Intel driver they list for my mobo, and there is no change. 1080p is the max.

Can someone who knows about this elaborate please?

If this really is the mobo's limitation, it sort of defeats the purpose of built-in graphics. If you have to invest more money in a "fancy" mobo that supports higher than 1080p, then you might as well invest in a video card...
 

ibex333

Diamond Member
Mar 26, 2005
4,086
119
106
Yep, found it. Clearly a mobo or port limitation. According to Gigabyte's manual for my mobo:

Onboard Graphics (Integrated in the Chipset):
- 1 x D-Sub port
- 1 x DVI-D port, supporting a maximum resolution of 1920x1200
* The DVI-D port does not support D-Sub connection by adapter.
- 1 x HDMI port, supporting a maximum resolution of 1920x1200


*** And that, my fellow AT'ers, is why even a crappy HD 2400 is still at least in one way better than the HD 3000. ;) I checked, and the HD 2400 does indeed easily do 2560x1440, while the HD 3000 requires a fancy mobo to support that.
 

MrTeal

Diamond Member
Dec 7, 2003
Yep, found it. Clearly a mobo or port limitation. According to Gigabyte's manual for my mobo:

Onboard Graphics (Integrated in the Chipset):
- 1 x D-Sub port
- 1 x DVI-D port, supporting a maximum resolution of 1920x1200
* The DVI-D port does not support D-Sub connection by adapter.
- 1 x HDMI port, supporting a maximum resolution of 1920x1200


*** And that, my fellow AT'ers, is why even a crappy HD 2400 is still at least in one way better than the HD 3000. ;) I checked, and the HD 2400 does indeed easily do 2560x1440, while the HD 3000 requires a fancy mobo to support that.

That's interesting. My Z68X-UD3H-B3 will do 1920x1200 over DVI-D or HDMI but 2560x1600 over DisplayPort. Probably just a single-link DVI port issue.
 

rootheday3

Member
Sep 5, 2013
Yes, Intel IGPs support single-link DVI but not dual-link transmitters on the motherboard.

If your motherboard has DP out and your monitor has DP in, then you can get 25x16 that way. If your motherboard has DP out and your monitor only has DVI-D in, you can get a DP->DVI-D active dongle; that should work too.

I am a little fuzzy on the exact HDMI spec revision supported in the HD 3000/Sandy Bridge generation vs. later chips. Certainly anything with HDMI 1.4 support can drive 25x16.

In my opinion, DP isn't a "fancy" feature: it was introduced in 2008 and, together with HDMI, is the clear path forward from VGA and DVI (in all their many variants).
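To put some rough numbers on the single-link limit: single-link DVI TMDS is capped at a 165 MHz pixel clock, while 2560x1440@60 needs roughly 240 MHz even with reduced blanking; 1920x1200@60 squeezes in at about 154 MHz, which is why the board's manual caps DVI-D at 1920x1200. A quick Python sketch (the blanking totals below are approximate CVT reduced-blanking figures, not numbers from this thread):

```python
# Why single-link DVI tops out around 1920x1200@60:
# pixel clock = total pixels per frame (including blanking) * refresh rate,
# and single-link TMDS is limited to 165 MHz; dual link doubles that to 330 MHz.

SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 330

def pixel_clock_mhz(h_total, v_total, refresh_hz=60):
    """Pixel clock in MHz for a mode with the given total timings."""
    return h_total * v_total * refresh_hz / 1e6

# active resolution -> (h_total, v_total) with approximate CVT-RB blanking
modes = {
    "1920x1200@60": (2080, 1235),
    "2560x1440@60": (2720, 1481),
}

for name, (h, v) in modes.items():
    clk = pixel_clock_mhz(h, v)
    link = "fits single link" if clk <= SINGLE_LINK_MHZ else "needs dual link (or DP/HDMI)"
    print(f"{name}: {clk:.0f} MHz -> {link}")
```

The ~242 MHz result for 2560x1440 is comfortably over the 165 MHz single-link ceiling but well under the 330 MHz dual-link one, which matches what the HD 2400's dual-link DVI output can do and the board's single-link wiring can't.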
 

sm625

Diamond Member
May 6, 2011
lol, an HD 2400 XT (the top 2400 model) is only one third as fast as an 8600 GTS. It was a complete dog back in the day, so of course it would be even more so today. The floor for old discrete cards is basically the 8800 GT and HD 4650. Go any older than that and you are better off with Intel HD graphics. Luckily, those two card models were featured in plenty of reviews, so you can use their benchmarks to compare against your own older GPU and see if it is better or not.
 

ibex333

Diamond Member
Mar 26, 2005
Yeah, no DisplayPort on my mobo, so if I want to use integrated graphics for anything higher than 1080p I am pretty much out of luck.