The 5450 has its own memory controller and cache; the Intel one shares all of that with the CPU, so you can't compare the two designs 1:1 like that.
I would assume that figure doesn't include video decode hardware either, unlike the 5450's count. That's a good 100-million-transistor chunk right there.
The Radeon 3450 comprises 182 million transistors with 40 SPs + 4 TMUs + 4 ROPs and is DX10.1 capable. The 4550 has 80 SPs + 8 TMUs + 4 ROPs, and its transistor count is 242m (only a 60m increase to double the SPs and TMUs while remaining DX10.1). The Radeon 5450 has 80 SPs + 8 TMUs + 4 ROPs like the 4550, and the same relative per-clock performance, but being DX11 capable (and I guess OpenCL too) costs it a hefty 50m more transistors. I'm sure the video decode block grew too, among other things, but clearly AMD has its performance per transistor figured out in comparison to Intel when it comes to actual 3D performance.
The Radeon 64xx series increases the count to 370 million, with 160 SPs + 8 TMUs + 4 ROPs and DX11 capability, only an ~80m increase for double the SP performance. Fascinating, considering the 26xx series was 390m transistors and the 36xx GPUs were 378m, both being 120:8:4 configurations. AMD managed to reduce the transistor count for the same, if not slightly better, performance plus DX10.1 capability when going to the 3000 series. IIRC the 4xxx series introduced some change in ROP design too (not too sure, really). Basically, AMD has its head in the game, and uses less power to do it compared to Nvidia and Intel. If only they were as good as Nvidia at drivers, though they are very good now.
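To make those generation-to-generation deltas explicit, here's a quick sketch using the transistor counts and shader configs cited above (the 5450's 292m total follows from 242m + 50m):

```python
# Transistor counts (millions) and shader-core counts cited above.
gpus = {
    "HD 3450": {"transistors": 182, "sps": 40,  "tmus": 4, "rops": 4},
    "HD 4550": {"transistors": 242, "sps": 80,  "tmus": 8, "rops": 4},
    "HD 5450": {"transistors": 292, "sps": 80,  "tmus": 8, "rops": 4},
    "HD 6450": {"transistors": 370, "sps": 160, "tmus": 8, "rops": 4},
}

names = list(gpus)
for prev, cur in zip(names, names[1:]):
    delta = gpus[cur]["transistors"] - gpus[prev]["transistors"]
    print(f"{prev} -> {cur}: +{delta}m transistors, "
          f"{gpus[prev]['sps']} -> {gpus[cur]['sps']} SPs")
```

Doubling the SP count twice over (40 to 80 to 160) cost only 60m and ~80m transistors respectively, while the DX11 feature bump alone cost 50m with no shader growth at all.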
Finally, it's crazy on Intel's part to ramp their IGP clock so high, running twice as fast as most AMD GPUs at its max. Doesn't Intel understand that power goes up much faster than linearly with clock speed, roughly with the cube once you account for the voltage bump higher clocks need, much like aerodynamic drag with speed? Of course they do, but most customers won't notice the extra heat or see their CPU's energy use on the electric bill, though it could shorten the processor's life if the stock cooling wasn't designed to take it. Too bad the only thing policing it is the TDP limit relative to how hard the CPU is being used.
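The clock-versus-power point can be sketched with the standard CMOS dynamic power approximation, P ≈ C·V²·f. The numbers below are purely illustrative (not measurements of any real chip), but they show why chasing clock speed is so expensive when higher clocks also require higher voltage:

```python
# Dynamic power in CMOS scales roughly as P = C * V^2 * f. Raising the clock
# alone is linear in power, but at the top of the range you usually have to
# raise the voltage too, and V enters squared, so power climbs far faster
# than the clock does. Illustrative numbers only, not real chip data.
def dynamic_power(cap, voltage, freq_ghz):
    """Relative dynamic power: capacitance * voltage^2 * frequency."""
    return cap * voltage**2 * freq_ghz

base = dynamic_power(cap=1.0, voltage=1.0, freq_ghz=1.0)

# Doubling the clock with no voltage change: power merely doubles.
clock_only = dynamic_power(cap=1.0, voltage=1.0, freq_ghz=2.0)

# Doubling the clock AND (hypothetically) doubling voltage to sustain it:
# power goes up eightfold, the "cubic" behavior mentioned above.
clock_and_volts = dynamic_power(cap=1.0, voltage=2.0, freq_ghz=2.0)

print(clock_only / base)       # 2.0
print(clock_and_volts / base)  # 8.0
```

Real voltage/frequency curves are shallower than a 1:1 scaling, but the shape of the tradeoff is the same: the last few hundred MHz cost disproportionately more power and heat than they return in performance.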