The 8800 GTX has 768 MB of memory and 86.4 GB/sec of bandwidth.
Can someone explain how all that bandwidth gets used up?
Some numbers would be nice. For example:
a game at 1600x1200 with 32-bit color needs ## MB per frame; 8xAA multiplies that by ##, and HDR adds ##
86.4 GB/sec divided by MB per frame = frames/sec
or something along those lines.
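For what it's worth, here's a rough Python sketch of the kind of math I mean. The function name, the per-pixel sizes, and the multipliers are just my guesses for illustration; it only counts color and depth writes and ignores texture reads, blending read-modify-write, geometry, and any compression the GPU does, so the real numbers will differ.

```python
# Back-of-envelope framebuffer traffic estimate (illustrative assumptions only).

def frame_traffic_mb(width, height, bytes_per_pixel=4,
                     msaa_samples=1, depth_bytes=4, hdr=False):
    """Rough MB moved per frame for color + depth writes (simplified)."""
    pixels = width * height * msaa_samples
    color_bpp = 8 if hdr else bytes_per_pixel   # assume FP16 HDR target = 8 bytes/pixel
    color = pixels * color_bpp                  # color buffer writes
    depth = pixels * depth_bytes                # depth buffer traffic (simplified)
    return (color + depth) / 1e6                # decimal MB, to match GB/sec specs

bandwidth_gb_s = 86.4
per_frame_mb = frame_traffic_mb(1600, 1200, msaa_samples=8, hdr=True)
max_fps = (bandwidth_gb_s * 1e3) / per_frame_mb  # GB/sec -> MB/sec, then / MB per frame
print(f"~{per_frame_mb:.0f} MB per frame -> at most ~{max_fps:.0f} fps from bandwidth alone")
```

With those made-up factors it comes out to roughly 184 MB per frame and a few hundred fps as an upper bound, which is why I suspect texture fetches and overdraw must be where most of the 86.4 GB/sec actually goes.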
