Originally posted by: bunnyfubbles
Originally posted by: LCD123
The GPU will need more bandwidth than the HD 5770 provides at anything above 800x600 and in any modern game. If there were no need, ATI would have gone 128-bit on their high-end cards, but the fact that ATI has used a 256-bit bus ever since the Radeon 9700 tells me there is a need for bandwidth! Nvidia is actually going 384-bit for their new cards, and they have already gone above 256-bit on their current high-end cards. A 50% increase in memory bandwidth doesn't always give 50% more performance, but if the game requires every bit of bandwidth, then the bandwidth will be there!
High definition is 1280x720; full high definition is 1920x1080. My 32" LCD is considered high definition with a native resolution of 1360x768. A tiny resolution would be 800x600 or lower, which is what I had on the CRT monitors I owned before I finally went LCD in late 2008. There are plenty of LCDs (small monitors and medium TVs) at 1360x768, so this will be the lowest common resolution except for those still on a CRT or those running non-native resolutions on LCDs. 1920x1080 is a common resolution nowadays and can be found on LCD monitors in the 23" range and on LCD TVs 32" and up.
ATI went from a 512-bit bus on the 2900 XT to a 256-bit bus on the 3870 and saw virtually no performance drop-off, even in bandwidth-intensive situations.
Then there was the G80 -> G92 transition, where nVidia dropped from a 384-bit bus to a 256-bit bus, and again the 256-bit bus was more than enough. And now they're also dropping from 512-bit to 384-bit going from GT200 to Fermi.
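For anyone who wants to put rough numbers on that: peak memory bandwidth is just the bus width (in bytes) times the effective memory data rate. Here's a back-of-the-envelope sketch in Python; the data rates are approximate launch figures from memory, so treat them as illustrative rather than official specs:

def bandwidth_gb_s(bus_width_bits, effective_mts):
    # Peak bandwidth = (bus width in bytes) * (effective transfer rate in MT/s)
    return bus_width_bits / 8 * effective_mts / 1000  # GB/s

cards = {
    "HD 2900 XT (512-bit GDDR3, ~1656 MT/s)":   (512, 1656),
    "HD 3870 (256-bit GDDR4, ~2250 MT/s)":      (256, 2250),
    "8800 GTX / G80 (384-bit, ~1800 MT/s)":     (384, 1800),
    "8800 GTS 512 / G92 (256-bit, ~1940 MT/s)": (256, 1940),
}

for name, (bus, mts) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus, mts):.1f} GB/s")

The point being that faster memory on the narrower bus claws back a lot of the raw bandwidth, and even where the total still drops, performance barely moves.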
Also, you can delude yourself about your TV resolution all you want; anything lower than 1680x1050 is a tiny resolution by PC gaming standards. Like I said before, 1440x900 and 1280x1024 are both much larger than your tiny, non-HDTV resolution, and those two resolutions are about as small as you'll see tested on most review sites.
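Since "larger" here really just means total pixels pushed per frame, a quick pixel-count comparison (same back-of-the-envelope Python style, resolutions picked to match the ones argued about above) shows where 1360x768 actually sits:

resolutions = {
    "800x600 (old CRT)":    (800, 600),
    "1360x768 (32\" HDTV)": (1360, 768),
    "1440x900":             (1440, 900),
    "1280x1024":            (1280, 1024),
    "1680x1050":            (1680, 1050),
    "1920x1080 (full HD)":  (1920, 1080),
}

for name, (w, h) in sorted(resolutions.items(), key=lambda kv: kv[1][0] * kv[1][1]):
    print(f"{name}: {w * h:,} pixels")

Both 1440x900 (~1.30M pixels) and 1280x1024 (~1.31M pixels) work out to roughly 25% more pixels than 1360x768 (~1.04M), which is the sense in which it's the smaller resolution.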