128 vs 64-bit; 4650 vs 5450?

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
I'm looking at the 4650 and the 5450. Most of the 4650s have 128-bit memory and all of the 5450s have 64-bit. My usage will just be desktop/general use and Blu-ray playback, nothing fancy or demanding.

Does the "bit" make any difference in performance?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I'm looking at the 4650 and the 5450. Most of the 4650s have 128-bit memory and all of the 5450s have 64-bit. My usage will just be desktop/general use and Blu-ray playback, nothing fancy or demanding.

Does the "bit" make any difference in performance?

Not for what you need it for. Most modern $50 cards will be just fine for general use.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
Not for what you need it for. Most modern $50 cards will be just fine for general use.

Thought so. Card is actually for my father. I really didn't think it mattered for a simple HTPC-like machine.

For some reason he thought the "bit" made a real difference for the piddly stuff he does on his computer. I told him otherwise, but, stubborn as he is, he bought a 4650 even though I told him to get the 5450, since it's the same price and probably offers slightly better video quality and/or performance.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
To be more exact, we're talking about:
4650: 16 GB/s
5450: 12.8 GB/s

So thanks to using DDR3 instead of DDR2 at a faster clock, the difference is even smaller than the bus width alone would imply. The 4650 should be faster in games, but nobody would want to game on either of these cards anyway, so I can't think of any reason to get the last-gen card, especially if you want to use it for HTPC duty as well.
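In case it's not obvious where those numbers come from: peak memory bandwidth is just the bus width in bytes times the effective transfer rate. A quick sketch, assuming the reference memory clocks (1000 MT/s effective DDR2 on the 4650, 1600 MT/s effective DDR3 on the 5450); partner cards may ship different speeds:

```python
# Peak bandwidth sketch, assuming reference memory clocks:
#   HD 4650: 128-bit bus, DDR2 at ~1000 MT/s effective
#   HD 5450:  64-bit bus, DDR3 at ~1600 MT/s effective

def memory_bandwidth_gbs(bus_width_bits: int, effective_mts: int) -> float:
    """Peak bandwidth in GB/s = bus width (bytes) * transfers per second."""
    return (bus_width_bits / 8) * effective_mts / 1000  # bytes * MT/s -> MB/s -> GB/s

print(memory_bandwidth_gbs(128, 1000))  # HD 4650 -> 16.0 GB/s
print(memory_bandwidth_gbs(64, 1600))   # HD 5450 -> 12.8 GB/s
```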
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
So, for HTPC use, is the 4650 just as good as the 5450? The main differences being no HD audio bitstreaming and higher power consumption on the 4650.
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
Dunno. ATI is apparently setting the level of decode quality for accelerated video based on GPU power level in the drivers now. I don't remember exactly which GPUs can't handle all the decode quality features, but some can't, so look into it.