Well, now it boils down to the validity of Nvidia's claims about the X-rayed picture.
Assuming it is true, and ATI really did use an internal bridge on their AGP card, they are indeed being hypocritical. However, if ATI plans to bridge this "natively PCI-e" chip back to AGP, why would they want to bridge a card twice, from AGP to PCI-e and back to AGP again?
Assuming this accusation is false, then the original discussion is still valid: ATI's card is natively PCI-e, and Nvidia's is AGP. I know the performance differences of bridges and whatnot are negligible, and that games won't come close to using the extra bandwidth. However, it should be noted that ATI's card is geared more toward future technology, offering backwards compatibility with AGP, while Nvidia is supplementing a current AGP part to be compliant with PCI-e. It can't be denied that ATI's solution is the more elegant one in a PCI-e environment (by not requiring a separate bridge chip), even though it probably won't make a noticeable difference. In an AGP environment, Nvidia's implementation will be the more elegant one, but for now ATI is still making natively AGP cards that interface exactly the same way.
Hopefully we will get an update on the whole accusation soon.
-Steve