ATI X19xx series GPUs may help find a cure?


zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: tanishalfelven
They use the ATI X1900 only because ATI was the first to put 32-bit floating-point hardware on their graphics cards. The "issue" with nVidia's cards is that they don't have that capability. While the math can still be done on an nVidia card - the same way having a 32-bit CPU doesn't mean you can't process 64-bit variables - the processing overhead takes away much of the advantage. Development may be more complex, too.

This was one of the comments on the DailyTech page. Would someone here mind explaining what it means?
He seems to be saying that the G70/G71 doesn't really do true 32-bit math; it emulates it with lower-precision math. It's true that 32-bit CPUs can do 64-bit integer math this way, but the trick doesn't carry over well to floating point: FP can be emulated with integer instructions, but doing so is extremely slow compared to hardware FP.
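Here's a toy sketch (hypothetical C, nothing to do with the actual F@H client) of what multi-word integer math looks like: a 32-bit CPU can add two 64-bit values by splitting them into 32-bit halves and carrying between them. A couple of extra instructions, no big deal - but emulating floating point out of integer ops takes far more work than this.

#include <inttypes.h>
#include <stdio.h>

/* Hypothetical sketch: how a 32-bit CPU adds 64-bit integers.
 * Each 64-bit value is split into a low and a high 32-bit word. */
typedef struct {
    uint32_t lo;
    uint32_t hi;
} u64_parts;

static u64_parts add64_with_32bit_ops(u64_parts a, u64_parts b)
{
    u64_parts sum;
    sum.lo = a.lo + b.lo;             /* add the low words first      */
    uint32_t carry = (sum.lo < a.lo); /* unsigned wraparound => carry */
    sum.hi = a.hi + b.hi + carry;     /* propagate carry to high word */
    return sum;
}

int main(void)
{
    u64_parts a = { 0xFFFFFFFFu, 0x00000001u }; /* 0x1FFFFFFFF */
    u64_parts b = { 0x00000002u, 0x00000000u }; /* 0x2         */
    u64_parts s = add64_with_32bit_ops(a, b);
    printf("0x%08" PRIX32 "%08" PRIX32 "\n", s.hi, s.lo); /* 0x0000000200000001 */
    return 0;
}

Software floating point has to do this kind of juggling for every mantissa and exponent operation, which is why it's so much slower than hardware FP.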

Nvidia's site also contradicts what he claims:
http://www.nvidia.com/object/7_series_techspecs.html

 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Well.... now that I've finally got a computer that doesn't suck (ahhh... my first dual-core!) it's time to upgrade the onboard 6150 IGP!

I might be able to justify the cost of the video card if I can help save lives with it too! ;)

Trick is to try and do it for as little cash as possible!

I see that the new X1950 is out and uses less electricity (important for my not-terribly-impressive Dell power supply!) and the X1900GT is pretty cheap....



I'm gonna have to dig deep into the data to figure out which card will work well with Folding@home, cost the least, use the least power, etc.

...then figure out where the cash to buy it will come from. ;)

Edit: Forgot to mention, I'll definitely be joining the AnandTech Folding Team! ;) ;)
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: zephyrprime
Originally posted by: tanishalfelven
They use the ATI X1900 only because ATI was the first to put 32-bit floating-point hardware on their graphics cards. The "issue" with nVidia's cards is that they don't have that capability. While the math can still be done on an nVidia card - the same way having a 32-bit CPU doesn't mean you can't process 64-bit variables - the processing overhead takes away much of the advantage. Development may be more complex, too.

This was one of the comments on the DailyTech page. Would someone here mind explaining what it means?
He seems to be saying that the G70/G71 doesn't really do true 32-bit math; it emulates it with lower-precision math. It's true that 32-bit CPUs can do 64-bit integer math this way, but the trick doesn't carry over well to floating point: FP can be emulated with integer instructions, but doing so is extremely slow compared to hardware FP.

Nvidia's site also contradicts what he claims:
http://www.nvidia.com/object/7_series_techspecs.html

If I'm not mistaken, the NV GPUs support 32-bit FP textures, but they don't support 32-bit FP shaders. I believe the F@H engine uses the "programmable shader pipelines", not the normally fixed-function texture pipes. NV's shaders are where they lack true 32-bit support.
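As a rough illustration of why that matters for folding, here's a toy C example (not F@H code - float vs. double is just standing in for low-precision vs. full-precision shader math). Summing millions of tiny force-like contributions in the lower precision drifts way off:

#include <stdio.h>

/* Hypothetical illustration: accumulating many small values in a
 * lower precision compounds rounding error, which is the concern
 * with shader pipelines that don't do true 32-bit FP. */
int main(void)
{
    float  lo_prec = 0.0f;  /* stand-in for reduced-precision shader math */
    double hi_prec = 0.0;   /* stand-in for full-precision math           */
    int i;

    for (i = 0; i < 10000000; i++) {
        lo_prec += 0.1f;    /* rounding error compounds every iteration */
        hi_prec += 0.1;
    }

    printf("low  precision sum: %f\n", (double)lo_prec);
    printf("high precision sum: %f\n", hi_prec);
    /* both "should" be 1,000,000 - the float sum lands way off */
    return 0;
}

A simulation that runs millions of timesteps can't afford that kind of compounding drift, which is presumably why the F@H GPU core wants true 32-bit FP in the shader path.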
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Originally posted by: SunnyD
If I'm not mistaken, the NV GPUs support 32-bit FP textures, but they don't support 32-bit FP shaders. I believe the F@H engine uses the "programmable shader pipelines", not the normally fixed-function texture pipes. NV's shaders are where they lack true 32-bit support.

I just read that on the F@H forums. It's still ATI or bust when it comes to GPU-for-folding.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: bluemax
Originally posted by: SunnyD
If I'm not mistaken, the NV GPUs support 32-bit FP textures, but they don't support 32-bit FP shaders. I believe the F@H engine uses the "programmable shader pipelines", not the normally fixed-function texture pipes. NV's shaders are where they lack true 32-bit support.

I just read that on the F@H forums. It's still ATI or bust when it comes to GPU-for-folding.

Keep in mind this will probably change with the 8800.
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Originally posted by: SunnyD
Originally posted by: bluemax
Originally posted by: SunnyD
If I'm not mistaken, the NV GPUs support 32-bit FP textures, but they don't support 32-bit FP shaders. I believe the F@H engine uses the "programmable shader pipelines", not the normally fixed-function texture pipes. NV's shaders are where they lack true 32-bit support.

I just read that on the F@H forums. It's still ATI or bust when it comes to GPU-for-folding.

Keep in mind this will probably change with the 8800.

Nope. They're really talking about the G80 over there, and so far the word is that it'll only do 16 shaders vs. 48(?) on the X1900XT. Not so good for folding - but still better than the CPU alone! :)