How important is the 64 versus 128 bit memory for me?

Lcarvone

Platinum Member
Sep 20, 2000
2,875
0
0
Hello everyone,

I am looking to upgrade an older computer for the sole purpose of watching and recording TV... not really a true HTPC, but more of a casual computer for normal computer use, video capture (via a Hauppauge PVR-250), and viewing on a non-HD standard 27-inch TV. I am looking to get an Nvidia FX5200-based card but noticed in my research that many people frown heavily on the 64-bit memory versions. It would be replacing the current card, which is a
*cough* Radeon 7000 AGP 32 meg *cough*

The specs of the system this would go into are as follows:

Athlon 1900+
1 gig PC2100 DDR
ECS K7S5A mobo
WD 160 gig HD
Samsung 80 gig HD

The ECS mobo only supports 4X AGP, which I understand an FX5200 will work with. Given that, is there still a major benefit to the 128-bit memory versions? Again, the main purpose for this system is not gaming but viewing recorded material, so am I truly limiting myself with 64-bit memory on my "limited" system?

Any and all input is greatly appreciated
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I don't think the video card is really being taxed at all when just watching video, so the memory bus is probably irrelevant. If you'll be capping to MPEG-2 with that Hauppauge, I'm 99.44% sure that even a 64-bit 5200 will be able to handle MPEG-2 decoding without a problem. It's mainly gaming that requires as much of everything as possible (memory bus width, memory speed, GPU pipeline count, GPU speed, etc.).
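
To put a rough number on it: peak memory bandwidth is just bus width times effective memory clock. As a back-of-envelope sketch (assuming a typical FX5200 memory clock of about 200 MHz, i.e. 400 MT/s effective DDR; actual boards vary), the gap looks like this:

# Back-of-envelope peak bandwidth; the 400 MT/s effective DDR rate is an
# assumed "typical" FX5200 figure, not the spec of any particular board.
def peak_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    return (bus_width_bits / 8) * effective_mt_s * 1e6 / 1e9

for bus_bits in (64, 128):
    print(f"{bus_bits}-bit bus: {peak_bandwidth_gb_s(bus_bits, 400):.1f} GB/s peak")
# Prints roughly:
#  64-bit bus: 3.2 GB/s peak
# 128-bit bus: 6.4 GB/s peak

Halving that hurts in 3D, where the GPU is constantly fetching textures and writing the framebuffer, but an SD MPEG-2 stream plus a video overlay needs only a tiny fraction of even the 64-bit figure.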

AFAIK, your main concerns should be finding a video card with good signal quality (both TV out and monitor out), the driver features you want (if you'll be running both your TV and a monitor at the same time), and no fan (to make noise or fail).
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
64-bit will be fine for you. However, even simple PS 2.0 video filters might be rough on the 5200, as it has horrid floating-point performance. I'm not sure how much of an impact that would actually have, since I have never seen tests of it, but to be safe you might want to consider going with a 6200 or 9550.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: Lcarvone

I am looking to get an Nvidia FX5200-based card but noticed in my research that many people frown heavily on the 64-bit memory versions.

64-bit memory cards are generally frowned on for HDTV decoding, which makes them somewhat borderline there; otherwise it's probably fine. For recording it should be dandy, since the TV card does the work. For analog TV it would be OK, I'd think.
 

Lcarvone

Platinum Member
Sep 20, 2000
2,875
0
0
Originally posted by: Pete

AFAIK, your main concerns should be finding a video card with good signal quality (both TV out and monitor out), the driver features you want (if you'll be running both your TV and a monitor at the same time), and no fan (to make noise or fail).

Thanks for all the posts, folks... so is there a particular feature set I should look for, or cards known to have good signal quality for both TV and monitor out (which is what I will be setting up)? I have read up some on the PureVideo features of the newer Nvidia 6XXX line of cards, which looks promising, but as far as ATI goes I'm not sure.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
I believe ATI's equivalent of PureVideo is called SMARTSHADER HD, but I haven't heard exactly what it does or doesn't do. I know it can accelerate WMV decode (DXVA).
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Signal quality? Well, generally ATI cards exceed NVIDIA cards in DVI compliance, but I'm not sure if the same chip is used to route to TV out or not. It may also vary by card vendor (Leadtek, EVGA, etc.). Still, you won't notice much of a difference unless you're running insane (Apple Cinema-like) resolutions.

I'm also pretty sure PureVideo has more features/better quality than SMARTSHADER HD. Check out AnandTech's PureVideo review. As of now, MPEG-2 decoding and WMV acceleration work for the cards listed on NVIDIA's PureVideo support page; the codecs, etc. exist by now. You will have to pay for NVIDIA's PureVideo MPEG-2 decoder, which is the case with most MPEG-2 decoders. However, you will still get the decode acceleration (DXVA) with other MPEG-2 codecs, such as the accelerated Moonlight/Elecard MPEG-2 decoder; you just won't get the image enhancement.

THG DVI compliance tests
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: xtknight

I'm also pretty sure PureVideo has more features/better quality than SMARTSHADER HD

I'm pretty certain PureVideo vs SMARTSHADER is a "marketing team cage match" ;)

Don't buy the BS; most recent cards from ATI and Nvidia ship with free DVD playback software and can decode the MPEG-2 you'll be encoding with that TV card just fine.