Which ultimately means that games are usually optimized to issue fewer draw calls, and kept that way across the board.
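To make that concrete: the usual trick is to batch many objects into a single submission, for example via hardware instancing. Here's a rough D3D11-style sketch; the struct, buffer, and function names are my own placeholders, not from any shipping engine:

    #include <d3d11.h>
    #include <DirectXMath.h>
    #include <cstring>
    #include <vector>

    // Per-instance data the vertex shader reads; the layout is illustrative.
    struct Instance { DirectX::XMFLOAT4X4 world; };

    // Replaces N DrawIndexed calls with one DrawIndexedInstanced call.
    // ctx and instanceVB are assumed to be created elsewhere (a dynamic
    // buffer bound to an input slot with a per-instance step rate).
    void DrawBatch(ID3D11DeviceContext* ctx,
                   ID3D11Buffer* instanceVB,
                   const std::vector<Instance>& objects,
                   UINT indexCount)
    {
        // Upload every object's transform in one Map/Unmap...
        D3D11_MAPPED_SUBRESOURCE mapped;
        if (FAILED(ctx->Map(instanceVB, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
            return;
        std::memcpy(mapped.pData, objects.data(),
                    objects.size() * sizeof(Instance));
        ctx->Unmap(instanceVB, 0);

        // ...then pay the per-draw CPU overhead once for the whole batch
        // instead of once per object.
        ctx->DrawIndexedInstanced(indexCount,
                                  static_cast<UINT>(objects.size()), 0, 0, 0);
    }

The catch is that everything in a batch has to share the same mesh and material, which is a big part of why engines get architected around that constraint.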
But back then the delta wasn't as great. The Xbox 360 in particular had GPU technology that was absent from PCs in 2005, and the PS3's Cell gave it considerably more raw FLOPS than 2006 desktop CPUs.
The PS4 has a GPU of the same basic design as, but half as powerful as, PC parts released over a year ago, and a CPU that lags even older desktop CPUs - and no, a high core count doesn't change that. The delta will be bigger still by the end of 2013, when the PS4 actually ships. Low-level API advantages and unified RAM do help, but they don't negate what you can get with a >2x difference in transistor budget and an even bigger difference in power budget on the same or better process technology.
But regarding draw calls, isn't it likely that they're going to be a problem down the road for PCs on DX11? (And even if a much-improved DX or OpenGL becomes available, it will probably take a very long time to be adopted.)
Almost all current games are made with the PS3/360 as the lowest common denominator, and the PC version of Crysis 3 still scales down to an HD 5770 and any quad-core CPU. And devs are already complaining about draw call costs.
Once games are being made with the next-gen consoles in mind, can devs really just optimise away any draw call performance issues in DX11?
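From what I gather, the standard workarounds boil down to something like this rough sketch (I'm only illustrating; Draw, BindMaterial, and IssueDraw are invented names, not a real API): sort submissions by material so the expensive state changes happen once per material rather than once per object.

    #include <algorithm>
    #include <vector>

    struct Draw { int materialId; int meshId; };

    // Stand-ins for real engine/driver work.
    void BindMaterial(int id) { /* set shaders, textures, constants */ }
    void IssueDraw(int id)    { /* e.g. a DrawIndexed under the hood */ }

    void SubmitSorted(std::vector<Draw>& draws)
    {
        // Group draws sharing a material so new state is validated
        // as rarely as possible.
        std::sort(draws.begin(), draws.end(),
                  [](const Draw& a, const Draw& b) {
                      return a.materialId < b.materialId;
                  });

        int bound = -1;
        for (const Draw& d : draws) {
            if (d.materialId != bound) {  // state change only on a switch
                BindMaterial(d.materialId);
                bound = d.materialId;
            }
            IssueDraw(d.meshId);          // per-call CPU cost still remains
        }
    }

But even sorted and batched like that, each call still burns CPU time in the runtime and driver, which seems to be the crux of the complaints.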
I'm no developer (which you certainly have noticed), but I'm reading plenty of forums where some developers post, and they certainly aren't optimistic about DX11's limitations when compared to the next-gen consoles.
Then there are also those who have complained openly, as seen in bit-tech's article about the DirectX performance overhead.
We'll have better parts in 2013, but it doesn't seem like there will be anything ground-breaking. Haswell seems to have improved IPC by around 10-15%, and the new GPUs on 20nm are pretty much confirmed for sometime in 2014.