
The difference between 128-bit and 64-bit video cards?

You missed my point, VIAN. Don't you find it hard to believe that the overclocked NVIDIA 5700 can get 85 while my average on the X800 is 79?
He was talking about invasion, not onslaught.

I get 30fps average on stock speeds.

1280x1024; no AA/AF; vsync on; game settings to maximum.
 
There is a massive difference between 128-bit and 64-bit wide memory, but both blow utter chunks compared to 256-bit. These days you really shouldn't be picking up anything slower than a 9700NP or 5900XT.
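The gap the post above describes follows directly from the bus math: peak theoretical memory bandwidth is the bus width in bytes times the effective memory clock. A minimal sketch (the 500 MHz effective DDR clock is purely an illustrative assumption, not any specific card's spec):

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective clock).
# The 500 MHz effective clock is an illustrative assumption, not a real spec.
def peak_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

for bits in (64, 128, 256):
    print(f"{bits:>3}-bit bus @ 500 MHz effective: "
          f"{peak_bandwidth_gbs(bits, 500):.1f} GB/s")
# At the same memory clock, a 64-bit card has half the bandwidth of a
# 128-bit card and a quarter of a 256-bit card.
```

This is peak theoretical bandwidth only; real-world performance also depends on the GPU core, drivers, and the game, which is why the thread's FPS anecdotes don't scale in exact 2x/4x steps.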
 
Originally posted by: VIAN
To play games, I recommend a card with a 256-bit memory interface. Anything below that is just crap.



A Radeon 9600 Pro plays ANY game out there at 1024x768 MAX settings with 16xAF at 45 FPS and up average. Yeah, I'm talking Far Cry, UT2K4, the works. And guess what, it has a 128-bit memory interface.

Saying anything under 256-bit is crap is like saying anything in your driveway that is not a Ferrari is crap.
 
Not saying that 128-bit is crap. Just saying that any card w/less than 256-bit isn't worth it. Just like BFG, I think that any card less than 9700 and 5900 XT isn't worth picking up.

Even if your Radeon 9600 Pro can do that now, it won't be able to for long, and it will have trouble playing at my settings.

I feel like my card has limited power, so to have a card that has 30% less performance than my card would just blow monkey nuts.

BTW check this out. http://forums.anandtech.com/messageview.cfm?catid=31&threadid=1347085&enterthread=y
 
Well, there ya go, VIAN: your settings are why you "must" have 256-bit. A lot of people are fine with gaming at 1024x768 and medium-high settings 😉
 
Originally posted by: dguy6789
Originally posted by: VIAN
To play games, I recommend a card with a 256-bit memory interface. Anything below that is just crap.



A Radeon 9600 Pro plays ANY game out there at 1024x768 MAX settings with 16xAF at 45 FPS and up average. Yeah, I'm talking Far Cry, UT2K4, the works. And guess what, it has a 128-bit memory interface.

Saying anything under 256-bit is crap is like saying anything in your driveway that is not a Ferrari is crap.

Bull sh!t!! My 5900XT doesn't even do that. You can see my system specs. I can barely run it like that on my 5900XT, and even then it's very, very stuttery, so I push it back down to 800x600 (the shame 🙁). Even the next-gen cards don't make it that high half the time. If you're gonna say crap like that, you'd better damn well back it up!!!

Smile 🙂 ... I'm not mad, everyone, I just hate it when people troll.

-Kevin
 
Well, I play UT2K4 on a GF3 Ti200 at 1024x768 with 16-bit color and it runs acceptably well (the rig is Optimus in my sig). Around 45-50 FPS even in botmatches. I just picked up an MSI FX5700 though, so that is going in hopefully tomorrow, and I'm looking forward to being able to play decently in 32-bit color with AA/AF if possible (though doubtfully in UT2K4).
There's something to be said for the codepaths for older hardware giving it longer legs. Keep in mind that Doom 3 is going to be targeted to run on GF3-level tech (at least last I'd heard), so you may see something like an X800XT/6800U getting around 60 FPS out of the engine and a GF4 Ti4600 doing about the same, because they run on different codepaths.
 
Bull sh!t!! My 5900XT doesn't even do that. You can see my system specs. I can barely run it like that on my 5900XT, and even then it's very, very stuttery, so I push it back down to 800x600 (the shame). Even the next-gen cards don't make it that high half the time. If you're gonna say crap like that, you'd better damn well back it up!!!
If you need to place it at that low a resolution, then something else is wrong.
 
Originally posted by: VIAN
Bull sh!t!! My 5900XT doesn't even do that. You can see my system specs. I can barely run it like that on my 5900XT, and even then it's very, very stuttery, so I push it back down to 800x600 (the shame). Even the next-gen cards don't make it that high half the time. If you're gonna say crap like that, you'd better damn well back it up!!!
If you need to place it at that low a resolution, then something else is wrong.

Agreed, and before you call BS on other people, maybe you should evaluate why your performance is so poor.
 
My performance isn't poor. But there is no way that a 9600 XT can max everything out detail-wise and run Far Cry at 1024x768!! Even the new next-gen cards have trouble.

-Kevin
 
A Radeon 9600 Pro plays ANY game out there at 1024x768 MAX settings with 16xAF at 45 FPS and up average. Yeah, I'm talking Far Cry, UT2K4, the works.
Utter rubbish. Even a 9700 Pro can't manage 45 FPS at those settings in Far Cry.
 
Originally posted by: EliZ
My old video card got messed up, so I want to buy a new, mainstream one.
I'm probably going to take either the Radeon 9200 or the FX 5200, and I noticed a non-trivial price difference between the 128-bit and 64-bit versions of both cards. So, how big is the performance difference?

FX5200 128-bit vs FX5200 64-bit
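On paper, the question above has a clean answer: at the same memory clock, halving the bus width exactly halves the peak memory bandwidth. A rough sketch, assuming a ~400 MHz effective DDR clock (budget FX 5200 boards vary, so check the actual card's spec):

```python
# Bandwidth scales linearly with bus width at a fixed memory clock, so a
# 64-bit FX 5200 has exactly half the peak bandwidth of the 128-bit one.
# The 400 MHz effective DDR clock is an assumption; board specs vary.
clock_mhz = 400
bw_128 = 128 / 8 * clock_mhz / 1000  # bytes per transfer * MHz -> GB/s
bw_64 = 64 / 8 * clock_mhz / 1000
print(f"128-bit: {bw_128:.1f} GB/s")  # 6.4 GB/s
print(f" 64-bit: {bw_64:.1f} GB/s")   # 3.2 GB/s
```

In bandwidth-limited situations (higher resolutions, AA) the 64-bit version can therefore fall well behind the 128-bit one, even though the GPU core is identical, which is why the thread's consensus is to pay the premium for the wider bus.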
 