Memory interface: 128-bit or 256-bit?

Silversierra

Senior member
Jan 25, 2005
664
0
0
I always thought that the memory interface on a graphics card referred to the amount of data it can send/receive to the memory at a time, but I saw a definition of memory data width that seems to contradict this. It says, "The data width (similar to bit depth) determines the range of colors that can be displayed through the graphics card. The higher the data width, the more colorful the picture results will be." http://www.dealtime.com/xPP-Graphics_Cards-nvidia_5500-more_than_256_mb

So a 256bit graphics card is more colorful than a 128bit? A 128 more than a 64? That doesn't seem right.
 
Jun 14, 2003
10,442
0
0
No, that's wrong.

Basically, the more bits you have, the more bandwidth you have.

i.e.

128-bit means you can transport 128 bits of information in one clock cycle
256-bit means you can move 256 bits of info in one clock cycle

So in theory a 256-bit memory bus provides twice the bandwidth of a 128-bit bus at the same clock speed. That bandwidth is used to transport things like textures to the GPU for processing, and it gets used heavily when things like AA and AF are enabled. The more bandwidth you have, the more info you can move in one go, and therefore the faster the process is. This is why 256-bit cards are better at AA and AF than an identical GPU with only a 128-bit bus: the 128-bit bus becomes too narrow to carry all the texture, AA and AF traffic at once, and that's where things slow down.

I think you've got it confused with the processing capabilities of the GPUs. I'm not too up on that area, so someone else can fill you in.
 
Jun 14, 2003
10,442
0
0
A little calculation for you:

128-bit bus with 300 MHz memory: you can transport 128 bits per clock, and 300 MHz means 300 million clock pulses per second, so that equals
128 x 300e6 = 38,400,000,000 bits per second. Divide this by 8, since there are 8 bits in one byte, and you get 4,800,000,000 bytes per second, or 4.8 GB/s.

With the 256-bit bus:

256 x 300e6 = 76,800,000,000 bits per second; divide by 8 = 9,600,000,000 bytes per second = 9.6 GB/s

The more data you can move in a given time period, the faster things are when you use memory-intensive features such as large textures, AA and AF.

Of course those are just made-up numbers, and due to the nature of DDR you should double each of the above, i.e. 4.8 GB/s would become 9.6 GB/s (due to the double data rate of DDR) and the 9.6 would become 19.2 GB/s.
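
If you want to play with the numbers yourself, here's the same arithmetic as a tiny Python snippet. The 300 MHz clock is still just the made-up figure from above, not any real card's spec:

bus_width_bits = 256        # try 128 for the narrower bus
clock_hz = 300e6            # made-up 300 MHz memory clock from the example above
ddr = 2                     # DDR memory moves data twice per clock cycle

bits_per_second = bus_width_bits * clock_hz * ddr
bytes_per_second = bits_per_second / 8          # 8 bits per byte
print(bytes_per_second / 1e9, "GB/s")           # 19.2 for 256-bit, 9.6 for 128-bit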
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
You're getting two things confused - the width of the card's memory bus (which is where you get '256-bit', '128-bit', and '64-bit' cards) and the colour depth of the displayed image.

That link you sent is completely false - all cards from the GeForce 5 series have the same colour depth, etc (pretty much the same with the GeForce 6 series). That is false advertising.

What 256-bit gets you over 128-bit is double the memory bandwidth which helps on a memory bandwidth starved card (ie a really high end card like a Radeon X800 series or GeForce 6800 series).

All of those cards that you linked (GeForce FX 5500 series) are mediocre and very poor price/performance purchases. Ask around this forum for a card recommendation for your price range and you will get a much faster card for similar money.
 

Silversierra

Senior member
Jan 25, 2005
664
0
0
Yeah, I didn't think that it was color related. I'm not looking at getting a 5500, I was looking at them because someone said there was a 256bit 5500, and I didn't believe them. Who knew there's a 256bit 5500?

Edit: Forgot to say thanks for clearing that up! Thanks again.
 

Silversierra

Senior member
Jan 25, 2005
664
0
0
Can anyone explain how a graphics card works in a simple way? I've read some things, but they are too technical. Does the graphics card store all the information it's going to use to display a game in its memory? Does it constantly change what's in the memory? Does info go from the GPU to the memory, or vice versa, or both? Thanks, hope someone can explain this in an easy to understand way, I think it will be helpful.
 

SNM

Member
Mar 20, 2005
180
0
0
In a simple way? The graphics card puts the pretty pictures on your monitor. :p

A graphics card has its own memory, which it uses to store as much data as possible. Because many games need more than 128/256 MB of textures, geometry, etc., data must sometimes be swapped out and replaced, but as memory capacity increases, the number of necessary swaps goes down. Info goes in both directions: a GPU (graphics processing unit) is just a specialized CPU with its own super-huge cache.
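
If it helps to picture the swapping part, here's a very rough Python sketch of the idea. The names and sizes are made up and this is nothing like how a real driver actually manages video memory; it just shows that when the card's memory is full, something already loaded has to be kicked out to make room for the new texture:

from collections import OrderedDict

class VideoMemory:
    # Toy model of a card's onboard memory holding textures.
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.used_mb = 0
        self.textures = OrderedDict()          # texture name -> size in MB

    def need(self, name, size_mb):
        if name in self.textures:
            self.textures.move_to_end(name)    # already resident, mark as recently used
            return
        # Kick out the least-recently-used textures until the new one fits.
        while self.used_mb + size_mb > self.capacity_mb and self.textures:
            _, freed = self.textures.popitem(last=False)
            self.used_mb -= freed              # has to come back over the bus if needed again
        self.textures[name] = size_mb
        self.used_mb += size_mb                # copied in from system RAM

vram = VideoMemory(128)          # pretend 128 MB card
vram.need("wall_textures", 60)
vram.need("floor_textures", 50)
vram.need("monster_skins", 40)   # doesn't fit, so wall_textures gets swapped out first
print(list(vram.textures))       # ['floor_textures', 'monster_skins']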
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
A 128-bit bus struggles with AA/AF; a 256-bit bus handles it much better.

It's like the 6600GT vs. the 6800nu. A 256-bit bus should also work better at higher resolutions.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Silversierra
256bit on a fx5500 wouldn't really help though, would it?


I don't think there is a 256-bit version, is there? 256 MB of memory, maybe.

Unless of course the 256-bit has been quoted in reference to the processing that goes on in the GPU.

Aren't all GPUs from the original GeForce onwards 256-bit in operation or something? I didn't want to say, because I'm not too clear on this.