I was reading a specification document from AMD about my HD4850, and it claimed that everything was done in 128-bit floating point, which is a lot of precision. I use my GPU for distributed computing, where that precision is, I assume, useful.

Are modern (less power-hungry) GPUs just as high in precision, or higher? Or have they gone more gaming-oriented, with lower FP precision? Is GCN considered 128-bit floating point? More? Less?
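
For what it's worth, here is a minimal sketch of how I can check what my driver actually advertises (assuming an OpenCL SDK is installed and linking with -lOpenCL; error handling omitted). It asks the device whether it lists a double-precision (FP64) extension, which is usually the precision that matters for compute work:

```c
#include <stdio.h>
#include <string.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    char buf[4096];

    /* Grab the first platform and its first GPU device. */
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof buf, buf, NULL);
    printf("Device: %s\n", buf);

    /* The extensions string lists cl_khr_fp64 (or AMD's cl_amd_fp64)
       if the device exposes double precision to compute kernels. */
    clGetDeviceInfo(device, CL_DEVICE_EXTENSIONS, sizeof buf, buf, NULL);
    printf("FP64 advertised: %s\n", strstr(buf, "fp64") ? "yes" : "no");
    return 0;
}
```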