The signal in a copper cable propagates at a pretty good fraction of the speed of light, last time I checked (it's the electromagnetic wave that moves fast, not the electrons themselves).

Optical's great for long/noisy runs because electrical interference won't do anything to it, whereas it *will* eventually start to degrade a digital signal over coax.
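
(If it helps, here's a toy Python sketch of why that degradation doesn't matter until it gets bad: the receiver just makes a threshold decision per bit, so noise is invisible right up until it's big enough to push a sample across the threshold and flip a bit. The levels and thresholds below are made up, not anything a real S/PDIF receiver uses.)

```python
import random

def send_over_coax(bits, noise_amplitude):
    """Idealized line levels + random interference + a simple threshold decision."""
    levels = [1.0 if b else -1.0 for b in bits]                        # made-up signal levels
    noisy = [v + random.uniform(-noise_amplitude, noise_amplitude) for v in levels]
    return [1 if v > 0.0 else 0 for v in noisy]                        # receiver's only job

bits = [random.randint(0, 1) for _ in range(10_000)]
print(sum(a != b for a, b in zip(bits, send_over_coax(bits, 0.5))))    # 0 bit errors
print(sum(a != b for a, b in zip(bits, send_over_coax(bits, 1.5))))    # now some bits flip
```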
I agree that it is a digital signal and in theory the quality shouldn't differ between the two connections, but sometimes the hardware uses a better D/A converter or better circuitry on one connection than on the other. I usually just try both and go with the best-sounding one for that situation.
This seems more likely -- it doesn't make any sense to me, but at least it's an explanation. I mean, after the optical signal is converted back to electrical (but still digital), shouldn't it go through the same exact decoding process? And digital-to-digital conversion (even between different physical formats) shouldn't have any effect on the data. I don't know much about digital amp design, and I'm not an HT expert, so please correct me if I'm wrong.
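
For what it's worth, here's that "format conversion shouldn't touch the data" point as a toy Python sketch -- the voltage levels and light pulses are made-up stand-ins, not real S/PDIF specs:

```python
bits = [1, 0, 1, 1, 0, 0, 1, 0]

# Same bitstream rendered two ways: made-up voltage levels for coax and
# on/off light pulses for optical. The carriers differ; the bits do not.
coax_levels  = [0.5 if b else -0.5 for b in bits]
light_pulses = ["on" if v > 0 else "off" for v in coax_levels]   # coax -> fiber
recovered    = [1 if p == "on" else 0 for p in light_pulses]     # fiber -> bits

assert recovered == bits   # nothing about the payload changed in the hop
```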
I wonder if the *encoding* process is different for the two signals? If I were designing it, I'd do the encoding from analog to digital on copper, then (if necessary) convert from digital on copper to digital on fiber and back (which should change nothing), then, in the receiver, convert from digital back to analog for output. But if they used different encoding paths for the coax and optical connectors, that could affect it, I guess. But still, it's a digital signal being read from a digital source... these HT people just confuse me.
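
FWIW, my understanding is that both jacks carry the same S/PDIF bitstream, which uses a biphase-mark line code, so the encoding step shouldn't actually differ between them. Here's a rough Python sketch of that line code, just to show that the same encoded stream could drive either a voltage on coax or an LED on TOSLINK (simplified -- real S/PDIF also adds preambles, channel status bits, and so on):

```python
def biphase_mark_encode(bits, level=0):
    """Biphase-mark: every bit cell starts with a transition, and a '1'
    gets an extra transition mid-cell. Returns two half-cells per bit."""
    out = []
    for b in bits:
        level ^= 1            # transition at the start of every bit cell
        out.append(level)
        if b:
            level ^= 1        # extra mid-cell transition for a '1'
        out.append(level)
    return out

def biphase_mark_decode(halves):
    """Recover bits: a mid-cell transition means '1', none means '0'."""
    return [1 if halves[i] != halves[i + 1] else 0
            for i in range(0, len(halves), 2)]

payload = [1, 0, 1, 1, 0, 0, 1, 0]           # some audio bits
line_signal = biphase_mark_encode(payload)    # same symbols whether they end up
                                              # as voltages on coax or light on fiber
assert biphase_mark_decode(line_signal) == payload
```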
