^There is no benefit to combining a 4770 with a 3300. In fact, I don't even think ATI allows it, so it won't work.
Well, better late than never: I have some benchmarks comparing the HD3200 to an older system.
The following benchmarks were run on different platforms, but I still feel the numbers are useful. The first platform is an Athlon XP-M 2400+ at two different speeds, a Radeon X800GT, Windows XP Pro, and 2GB RAM; the second is an Athlon X2 at 3.08 GHz, the integrated 780G (HD3200), Windows Vista Ultimate 32-bit, and 4GB RAM.
I'll use this format:
First result | Second result | Third result
First result is the HD3200 paired with the Athlon X2 @ 3.08 GHz
Second result is the X800GT paired with the Athlon XP @ 1.75 GHz
Third result is the X800GT paired with the Athlon XP @ 2.275 GHz
I feel that at 1.75 GHz the Athlon XP is representative of a low-end Athlon XP, while at 2.275 GHz it stands in for a high-end Athlon XP or possibly a low-end Athlon 64 or Sempron. I ran it at both speeds to show CPU scaling, and I also tested at two different resolutions to show GPU scaling.
Crysis SP Demo, 1024x768, All low settings
(HD3200 was run in DX10 because the game crashed when I tried DX9 mode)
Avg: 21.28 | 33.02 | 37.86
Min: 15.00 | 20.00 | 23.00
Max: 32.00 | 51.00 | 58.00
Crysis SP Demo, 1280x1024, All low settings
(HD3200 was run in DX10 because the game crashed when I tried DX9 mode)
Avg: 14.08 | 31.87 | 35.06
Min: 10.00 | 20.00 | 23.00
Max: 20.00 | 48.00 | 52.00
Half-Life 2: Lost Coast, 1024x768, All High
45.84 | 48.47 | 56.11
Half-Life 2: Lost Coast, 1280x1024, All High
31.58 | 48.22 | 56.03
Half-Life 2: Lost Coast, 1280x1024, All Low
41.18 | 50.16 | 57.93
Half-Life 2: Lost Coast, 1280x1024, All High, 2xAA/2xAF
20.82 | 47.05 | 45.41
Call of Duty 4 Demo, 1024x768, All Low, Fraps 2 minute run
(Crashed on the HD3200 platform, so the three values here are all the X800GT paired with the Athlon XP @ 1.75, 2.00, and 2.275 GHz to show CPU scaling. I have previously posted my subjective experience with the HD3200 and found the X800GT more playable.)
Avg: 20.53 | 25.29 | 32.98
Min: 09.00 | 13.00 | 16.33
Max: 46.33 | 52.50 | 70.33
Call of Duty 4 Demo, 1280x1024, All Low, Fraps 2 minute run
(Crashed on the HD3200 platform. The two values are the X800GT with the Athlon XP @ 1.75 and 2.275 GHz. As above, my subjective experience was that the X800GT is more playable; I got microstuttering and low framerates with the HD3200.)
Avg: 19.52 | 31.81
Min: 08.33 | 16.33
Max: 46.00 | 66.00
3DMark06 Proxycon
3.11 | 4.96 | 5.08
3DMark06 Firefly
3.99 | 6.45 | 6.51
Conclusion: The results were more telling than I expected. Even the older Athlon XP paired with a 4-5 year old midrange card provides a better gaming experience than a dual-core processor paired with an integrated graphics chipset. At 1024x768 or lower the new HD3200 is pretty competitive. However, once the resolution is cranked up and/or AA+AF is enabled, the X800GT swiftly pulls ahead, as the Half-Life 2 results show. And even though the Crysis results can't be compared directly, since the HD3200 was running in DX10 while the X800GT was running in DX9, it is obvious that the higher resolution killed the HD3200: going from 1024x768 to 1280x1024 cost the HD3200 33% of its average framerate, versus only 3% and 7% for the two X800GT tests.
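If anyone wants to sanity-check those scaling numbers, here is a quick Python sketch that just plugs in the Crysis averages from above (the labels in the dictionary are mine):

def pct_drop(at_1024, at_1280):
    # Relative drop in average fps going from 1024x768 to 1280x1024.
    return (at_1024 - at_1280) / at_1024 * 100

# Crysis SP Demo averages quoted earlier in this post (all low settings).
crysis_avg = {
    "HD3200 + Athlon X2 @ 3.08 GHz": (21.28, 14.08),
    "X800GT + Athlon XP @ 1.75 GHz": (33.02, 31.87),
    "X800GT + Athlon XP @ 2.275 GHz": (37.86, 35.06),
}

for setup, (at_1024, at_1280) in crysis_avg.items():
    print(f"{setup}: {pct_drop(at_1024, at_1280):.1f}% drop")

# Prints roughly 33.8%, 3.5%, and 7.4% - the drops quoted above.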
If I had more time and resources I would love to make everything equal besides the hardware platform, but I really can't. I do think these results back up the claim I was making: don't expect a faster processor to make up for a lack of GPU power.