Onboard Graphics vs. Separate Card

Pudnana

Junior Member
Jan 9, 2012
3
0
0
Sorry if this sounds like a noob question, but I recently purchased an Asus desktop with an AMD A8 APU to replace an aging HP. The HP had a Radeon HD 4650, which I believe is a DirectX 10 card, while the onboard graphics in the new Asus support DirectX 11. I know a separate graphics card frees up resources such as CPU time and system memory, but would it be worth doing with a DirectX 10 card?

I don't play many games, maybe some City of Heroes, DCUO or Civ 4 here or there, but I think I'm going to be buying Civ 5 soon. Any input or opinions would be welcome!
 

ScottAD

Senior member
Jan 10, 2007
735
77
91
Anand had this to say when he reviewed it:

As we mentioned in our preview, the integrated Radeon HD 6550D generally performs between a Radeon HD 6450 and 5570 depending on memory speed.

So depending on your RAM, it sounds like the onboard graphics are the better value, I guess. If not, someone will tell me otherwise! I'm still learning things myself and am not very familiar with AMD's card numbering.
 

Pudnana

Junior Member
Jan 9, 2012
3
0
0
Thanks for the info, I was thinking along those lines already. Hopefully, if there's a differing opinion, someone will chime in.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
The Radeon HD 4650 has 320 shaders clocked at 600 MHz and 500 MHz DDR2 memory over a 128-bit memory bus. The Radeon HD 6550D integrated into the A8 has 400 shaders clocked at 444 MHz and uses the system's DDR3 RAM (DDR3 > DDR2). The 6550D can also make use of DirectX 11 and all of AMD's latest video and graphics processing technology, while the 4650 will be rather limited in that regard. It should work better with AMD's latest drivers as well. The APU's graphics really should be the better choice over the 4650.
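To put rough numbers on the comparison, here's a quick back-of-the-envelope sketch in Python. The dual-channel DDR3-1600 figure is my assumption about the system RAM, not something confirmed in this thread; the GFLOPS formula is the usual 2 FLOPs per VLIW5 shader per clock.

# Rough peak memory bandwidth and shader throughput for the two GPUs.
# Assumption: the A8 system runs dual-channel DDR3-1600 (2 x 64-bit).

def bandwidth_gb_s(transfers_mt_s, bus_width_bits):
    # peak bandwidth = transfers per second * bus width in bytes
    return transfers_mt_s * 1e6 * (bus_width_bits / 8.0) / 1e9

def peak_gflops(shaders, clock_mhz):
    # each VLIW5 stream processor does one multiply-add (2 FLOPs) per clock
    return shaders * 2 * clock_mhz / 1000.0

# Radeon HD 4650: 500 MHz DDR2 (1000 MT/s effective), 128-bit bus, 320 SPs @ 600 MHz
print("HD 4650 : %.1f GB/s dedicated, %.0f GFLOPS" % (bandwidth_gb_s(1000, 128), peak_gflops(320, 600)))

# Radeon HD 6550D: system DDR3-1600 shared with the CPU, 400 SPs @ 444 MHz
print("HD 6550D: %.1f GB/s shared,    %.0f GFLOPS" % (bandwidth_gb_s(1600, 128), peak_gflops(400, 444)))

That works out to roughly 16 GB/s and 384 GFLOPS for the 4650 versus 25.6 GB/s (shared with the CPU) and 355 GFLOPS for the 6550D, so they're close on paper; the feature set and driver support are the real difference.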
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I can't help but think of this video:
http://www.youtube.com/watch?v=lth_M25cXjE

My jaw dropped when I watched it. It also convinced me that IGPs are going to swallow up the mainstream PC market once they get something to help with the memory bandwidth issue (e.g. DDR4, triple-channel RAM?, some sort of on-die RAM).

Air-cooled overclock:
CPU: 2.9 GHz -> 3.6 GHz
FSB: 100 MHz -> 140 MHz
RAM: 2200+ MHz
IGP: 600 MHz -> 840 MHz

3DMark Vantage score: ~6200
Then it runs Crysis at 40+ fps with decent-ish graphics, though at a low resolution.
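Just for context, the arithmetic on those overclocks (a quick sketch, nothing more):

# percentage gains implied by the overclock figures above
for name, stock, oc in [("CPU", 2.9, 3.6), ("FSB", 100.0, 140.0), ("IGP", 600.0, 840.0)]:
    print("%s: %+.0f%%" % (name, (oc / stock - 1) * 100))

That's roughly +24% on the CPU and +40% on the FSB and IGP.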
 

brybir

Senior member
Jun 18, 2009
241
0
0

I think they will start doing what the consoles have been doing for some time: integrating a small amount (say 32 MB) of fast eDRAM or something of that nature into the socket package. A small amount of fairly fast memory feeding an integrated GPU, combined with next-gen memory standards, could very well provide solid entry- to mid-level gaming on larger monitors and really solid gaming on notebooks.

I have a GT 540M in my Acer notebook; it uses Optimus, I get 8+ hours of battery life, and at 1366x768 I can run most of my games at high detail with solid frame rates. GT 540M-level performance is certainly not out of reach of integrated GPUs!
 

Pudnana

Junior Member
Jan 9, 2012
3
0
0
That video probably solidifies it. I'll stick with my onboard graphics. The only thing is, I have a retail pre-assembled PC; it looks like the BIOS is locked to prevent overclocking, and the ATI utility doesn't show any way to overclock it either.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
http://www.notebookcheck.net/NVIDIA-GeForce-GT-540M.41715.0.html

The mobile GT 540M scores ~6600 in 3DMark Vantage.

You can compare that to the heavily overclocked Llano IGP scoring ~6200.
That level of performance from CPU IGPs isn't that far off in the future.
 

brybir

Senior member
Jun 18, 2009
241
0
0

Yeah, that is certainly true. I expect my GT 540M to be on par with (or slightly beaten by) the next-gen integrated graphics from AMD. I OC my 540M to GT 550M levels, so hopefully that is the mark to shoot for. I really like my notebook, but the Optimus tech is sort of frustrating (since it never kicks in when I want it to).
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
231
106
I had an HD 4650; it was a pretty decent card, especially from a power/performance point of view (around 48 W TDP). If yours has DDR3 memory, why not?

However, it didn't run modern (DX11) games well even on medium settings, and hence I ditched it. I believe Civ 5 is DX11. It's best to pair your A8 with a 5xxx or 6xxx series card instead.
 
Aug 11, 2008
10,451
642
126

Pairing a discrete GPU with the A8, last time I read about it anyway, was sort of a mixed bag. Some games show good performance increases, while others show little improvement or even a decrease. I am also pretty sure it does not work with DX9, if you want to play older games.

Personally, I am not sure asymmetric CrossFire is worth the hassle. Besides, using a discrete GPU sort of seems to defeat the purpose of an APU, in my opinion. If I wanted to add a discrete card, I would get either an Intel CPU or one of the Llano chips without the integrated GPU.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@Frozenthundra,

Hopefully the drivers for that stuff are better now than they were at release.
I still love the idea of hybrid CrossFire:
the CPU's IGP + a small card = massive performance per dollar.


Some pics:

[image: LLANO-22.png]



IGP alone = 1802 score
6570 alone = 2332 score

Together in hybrid CrossFire = 3108 (that's a ~33% gain from using the IGP with the card).

But it's free extra performance, so why not?



[image: LLANO-23.png]



Here the 6570 alone = 55.69
Hybrid CrossFire = 76.89 (a ~38% gain)



[image: LLANO-26.png]


6570 alone = 37.54
Hybrid CrossFire = 39.33 (a ~5% gain)



Like I said, hopefully the drivers are better now than they were at release.
But I like the idea of not *wasting* your IGP doing nothing; it might as well be helping.
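For what it's worth, the percentages are just (hybrid score / 6570-alone score) minus one; recomputing them from the numbers quoted above:

# gain of hybrid CrossFire over the HD 6570 alone, from the scores above
def gain_pct(card_alone, hybrid):
    return (hybrid / card_alone - 1) * 100

print("first chart : %+.0f%%" % gain_pct(2332.0, 3108.0))  # ~ +33%
print("second chart: %+.0f%%" % gain_pct(55.69, 76.89))    # ~ +38%
print("third chart : %+.0f%%" % gain_pct(37.54, 39.33))    # ~ +5%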
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
231
106
Some people I have been talking to are reporting microstutter with these kinds of setups.