The difference between 128-bit and 64-bit video cards?


VIAN

Diamond Member
Aug 22, 2003
You missed my point, VIAN. Don't you find it hard to believe how the overclocked NVIDIA 5700 can get 85 and my average on the X800 is 79?
He was talking about Invasion, not Onslaught.

I get 30fps average on stock speeds.

1280x1024; no AA/AF; vsync on; game settings to maximum.
 

SneakyStuff

Diamond Member
Jan 13, 2004
I'm at 1024x768, which is prolly why; that resolution seems to be the sweet spot for the 5700U.
 

BFG10K

Lifer
Aug 14, 2000
There is a massive difference between 128-bit and 64-bit wide memory, but both blow utter chunks compared to 256-bit. These days you really shouldn't be picking up anything slower than a 9700NP or 5900XT.
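For a rough sense of the scale, peak memory bandwidth is just bus width times effective memory clock. A minimal sketch in Python (the card specs below are approximate, quoted from memory, so treat the exact numbers as assumptions):

def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    # bytes per clock = bus width / 8; bandwidth = bytes/clock * clocks/sec
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# (card, bus width in bits, effective DDR clock in MHz) -- approximate specs
cards = [
    ("GeForce FX 5200 64-bit", 64, 400),
    ("Radeon 9600 Pro", 128, 600),
    ("Radeon 9700 Pro", 256, 620),
    ("GeForce FX 5900 XT", 256, 700),
]

for name, width, clock in cards:
    print(f"{name}: {peak_bandwidth_gb_s(width, clock):.1f} GB/s")

Halve the bus at the same clock and you halve the bandwidth, and bandwidth is exactly what AA and higher resolutions eat first.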
 

dguy6789

Diamond Member
Dec 9, 2002
Originally posted by: VIAN
To play games, I recommend a card with 256-bit memory interface. Anything below that is just crap.

A Radeon 9600 Pro plays ANY game out there at 1024x768 MAX settings with 16xAF at 45FPS and up average. Yea, I'm talkin Far Cry, UT2K4, the works. And guess what, it has a 128-bit memory interface.

Saying anything under 256-bit is crap is like saying anything in your driveway that is not a Ferrari is crap.
 

VIAN

Diamond Member
Aug 22, 2003
Not saying that 128-bit is crap. Just saying that any card w/less than 256-bit isn't worth it. Just like BFG, I think that any card less than 9700 and 5900 XT isn't worth picking up.

Your Radeon 9600 Pro can do that now, but it won't be able to for long, and it will have trouble playing at my settings.

I feel like my card has limited power, so having a card with 30% less performance than mine would just blow monkey nuts.

BTW check this out. http://forums.anandtech.com/messageview.cfm?catid=31&threadid=1347085&enterthread=y
 

SneakyStuff

Diamond Member
Jan 13, 2004
Well there ya go VIAN, your settings are why you "must" have 256-bit; a lot of people are fine with gaming at 1024x768 and medium-high settings ;)
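The arithmetic behind that sweet spot is straightforward: 1280x1024 pushes about 67% more pixels per frame than 1024x768, and fill-rate/bandwidth demand scales right along with it. A rough sketch (assuming a purely fill-rate-limited, single-pass case, which is a simplification):

# pixels per frame at each resolution, and the fill rate
# needed to hold a 60fps target (single shaded pass assumed)
def fill_rate_mpix_s(width, height, target_fps):
    return width * height * target_fps / 1e6

for w, h in [(800, 600), (1024, 768), (1280, 1024)]:
    print(f"{w}x{h}: {w * h:,} px/frame, "
          f"{fill_rate_mpix_s(w, h, 60):.0f} Mpix/s for 60fps")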
 

Gamingphreek

Lifer
Mar 31, 2003
Originally posted by: dguy6789
Originally posted by: VIAN
To play games, I recommend a card with 256-bit memory interface. Anything below that is just crap.

A Radeon 9600 Pro plays ANY game out there at 1024x768 MAX settings with 16xAF at 45FPS and up average. Yea, I'm talkin Far Cry, UT2K4, the works. And guess what, it has a 128-bit memory interface.

Saying anything under 256-bit is crap is like saying anything in your driveway that is not a Ferrari is crap.

Bull sh!t!! My 5900XT doesn't even do that. You can see my system specs. I can barely run it like that on my 5900XT, and even then it's very, very stuttery, so I push it back down to 800x600 (the shame :( ). Even the next-gen cards don't make it that high half the time. If you're gonna say crap like that, you better damn well back it up!!!

Smile :) ... I'm not mad, everyone, I just hate it when people troll.

-Kevin
 

chsh1ca

Golden Member
Feb 17, 2003
Well, I play UT2K4 on a GF3 Ti200 at 1024x768 in 16-bit color and it runs acceptably well (rig is Optimus in my sig). Around 45-50FPS even in botmatches. I just picked up an MSI FX5700 tho, so that is going in hopefully tomorrow, and I'm looking forward to being able to play decently in 32-bit color with AA/AF if possible (though doubtfully in UT2K4).
There's something to be said for the codepaths for older hardware giving it longer legs. Keep in mind that Doom 3 is going to be targeted to run on GF3-level tech (at least last I'd heard), so you may see something like an X800XT/6800U getting like 60FPS outta the engine and a GF4 Ti4600 doing about the same because they run on differing codepaths.
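To make the codepath idea concrete, here's a hypothetical sketch of an engine picking a render path from reported hardware capabilities. The structure is loosely modeled on what's been reported about Doom 3's render paths (ARB2, NV20, and so on), but the names and thresholds here are invented for illustration:

from dataclasses import dataclass

@dataclass
class GpuCaps:
    pixel_shader_version: float  # 0.0 means fixed-function only
    texture_units: int

def pick_render_path(caps):
    # newer cards get the full per-pixel path; older cards get a cheaper
    # path tuned to what they do fast, instead of slowly emulating the
    # newest path
    if caps.pixel_shader_version >= 2.0:
        return "arb2"   # R300/NV30-class: full per-pixel lighting
    if caps.pixel_shader_version >= 1.1:
        return "nv20"   # GF3/GF4-class: more passes of simpler math
    return "fixed"      # everything older: fixed-function multitexture

print(pick_render_path(GpuCaps(pixel_shader_version=1.1, texture_units=4)))
# -> "nv20": a GF4 Ti runs a lighter path, which is how it can keep pace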
 

VIAN

Diamond Member
Aug 22, 2003
Originally posted by: Gamingphreek
Bull sh!t!! My 5900XT doesn't even do that. You can see my system specs. I can barely run it like that on my 5900XT, and even then it's very, very stuttery, so I push it back down to 800x600 (the shame :( ). Even the next-gen cards don't make it that high half the time. If you're gonna say crap like that, you better damn well back it up!!!
If you need to drop to that low a resolution, then something else is wrong.
 

Acanthus

Lifer
Aug 28, 2001
Originally posted by: VIAN
Originally posted by: Gamingphreek
Bull sh!t!! My 5900XT doesn't even do that. You can see my system specs. I can barely run it like that on my 5900XT, and even then it's very, very stuttery, so I push it back down to 800x600 (the shame :( ). Even the next-gen cards don't make it that high half the time. If you're gonna say crap like that, you better damn well back it up!!!
If you need to drop to that low a resolution, then something else is wrong.

Agreed, and before you call BS on other people, maybe you should evaluate why your performance is so poor.
 

Gamingphreek

Lifer
Mar 31, 2003
My performance isn't poor. But there is no way a 9600 XT can max everything out detail-wise and run Far Cry at 1024x768!! Even the new next-gen cards have trouble.

-Kevin
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: dguy6789
A Radeon 9600 Pro plays ANY game out there at 1024x768 MAX settings with 16xAF at 45FPS and up average. Yea, I'm talkin Far Cry, UT2K4, the works.
Utter rubbish. Even a 9700 Pro can't manage 45 FPS at those settings in Far Cry.