The PS3 had more theoretical performance on offer, but its odd architecture took a lot more effort to unlock. With most games being straight ports, the PS3 ended up being hard to extract performance from, and rather than spend considerable time playing to the PS3's strengths (work no other platform would benefit from), many developers simply let the PS3 port ship with worse graphics and resolution.
To some extent I think we're seeing the reverse of that with the Xbox One. Its fancy high-speed cache is faster than the PS4's main memory, but utilising it properly requires specialist programming. With the base hardware somewhat slower and this specialist feature needed to close the gap, cross-platform developers are going for the easier option of reducing graphics quality.
The Xbox 360 and PS3 at least had strengths and weaknesses relative to each other, although I'd give the edge to the 360. (EDRAM + a better GPU + more VRAM is a huge triple-threat advantage, although the Cell is essentially a second GPU for the PS3, just a really hard-to-use one with serious memory constraints.)
The Xbox One's cache isn't particularly hard to use, it's just tiny, and not even all that fast. It seems Microsoft bet on GDDR5 being far more expensive and supply-limited than it turned out to be, and lost badly: the Xbox One SoC ended up more expensive than the PS4's, with slower compute resources and lower bandwidth. The ESRAM's bandwidth is comparable to the PS4's GDDR5, but it's a small pool of RAM.
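The "small pool" problem above is easy to see with rough arithmetic. A minimal sketch (the 32 MB ESRAM capacity is real; the G-buffer layout and tile-counting helper are illustrative assumptions): a single 1080p 32-bit colour target fits in ESRAM, but a typical deferred G-buffer does not, so a renderer has to tile its work or spill targets to slower DDR3.

```python
# Illustrative sketch: why a 32 MB fast pool forces extra work at 1080p.
ESRAM_BYTES = 32 * 1024 * 1024  # Xbox One ESRAM capacity

def target_bytes(width, height, bytes_per_pixel):
    # Size of one uncompressed render target.
    return width * height * bytes_per_pixel

# One 1080p 32-bit colour target fits comfortably (~8.3 MB):
colour = target_bytes(1920, 1080, 4)

# An assumed deferred setup (four 32-bit targets plus 32-bit depth)
# comes to ~41.5 MB and does not fit:
gbuffer = 5 * target_bytes(1920, 1080, 4)

def min_strips(total_bytes, pool_bytes):
    # Hypothetical helper: fewest horizontal strips so each fits the pool
    # (ceiling division).
    return -(-total_bytes // pool_bytes)

print(colour <= ESRAM_BYTES)             # True
print(gbuffer <= ESRAM_BYTES)            # False
print(min_strips(gbuffer, ESRAM_BYTES))  # 2
```

This is the kind of scheduling decision the PS4's single unified GDDR5 pool simply doesn't require, which is the "easier option" cross-platform developers keep taking.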
Apparently the PS4 was originally supposed to have 2 GB or 4 GB of GDDR5; it was only last-minute luck that GDDR5 density advanced enough, and prices fell enough, for Sony to fit 8 GB. Had it been a 2 GB or 4 GB PS4 versus an Xbox One with ESRAM, things wouldn't have been so clear cut.