ATI to bridge their chips


ss284

Diamond Member
Oct 9, 1999
3,534
0
0
Well, now it boils down to the validity of Nvidia's claims about the X-rayed picture.

Assuming it is true, and ATI really did use an internal bridge on their AGP card, they are indeed being hypocritical. However, if ATI plans to bridge this "natively PCI-e" chip back to AGP, why would they bridge a card twice, from AGP to PCI-e and back to AGP again?

Assuming this accusation is false, then the original discussion is still valid. ATI's card is natively PCI-e, and Nvidia's is AGP. I know the performance differences of bridges and whatnot are negligible, and that games won't come close to using this extra bandwidth. However, it should be noted that ATI's card is geared more toward future technology, offering backwards compatibility to AGP, while Nvidia is supplementing a current AGP part to be compliant with PCI-e. It cannot be argued that ATI's solution is anything but more elegant in a PCI-e environment (by not requiring a separate bridge chip), even though it probably won't make a noticeable difference. In an AGP environment, Nvidia's implementation will be more elegant, but for now ATI is still making natively AGP cards which interface exactly the same way.


Hopefully we will get an update on the whole accusation soon.

-Steve
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Well, ATI had native PCI-E cards showing back in September, I believe.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Assuming it is true, and ATI really did use an internal bridge on their AGP card, they are indeed being hypocritical. However, if ATI plans to bridge this "natively PCI-e" chip back to AGP, why would they bridge a card twice, from AGP to PCI-e and back to AGP again?


No one has mentioned ATI using an AGP to PCI-E bridge at all.

The only ATI bridge in question is the one from PCI-E back to AGP. I believe that is what NV says the X-ray pics show, but ATI says it is buffers.
 

ss284

Diamond Member
Oct 9, 1999
3,534
0
0
Originally posted by: LTC8K6
Assuming it is true, and ATI really did use an internal bridge on their AGP card, they are indeed being hypocritical. However, if ATI plans to bridge this "natively PCI-e" chip back to AGP, why would they bridge a card twice, from AGP to PCI-e and back to AGP again?


No one has mentioned ATI using an AGP to PCI-E bridge at all.

The only ATI bridge in question is the one from PCI-E back to AGP. I believe that is what NV says the X-ray pics show, but ATI says it is buffers.

Read the link again:

http://www.theinquirer.net/?article=16651

The wording is a little funny, but Nvidia is accusing ATI of using an internal AGP (from the VPU) to PCI-e (at the connector) bridge chip. Just look at the orientation of the chip and you will understand. ATI has already acknowledged that they will be using a bridge chip in the future from PCI-e (from the VPU) to AGP (at the connector), but were criticizing Nvidia for using an AGP (from the VPU) to PCI-e (at the connector) bridge. Forgive me if that's confusing.

The X-ray pics show an RV350 (Radeon 9600, native AGP) and the RV380 (reported as native PCI-e, which according to Nvidia is allegedly just the RV350 with a bridge chip).

But again, nothing has been verified yet.

-Steve
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
The specifics of the implementation of ATi's bridging or their filtering are not what I'm pointing out, but rather that the company itself is eroding the credibility it gained from the 9xxx series by twice now saying that nVidia's methods are sub-par and then using similar methods themselves. Is nVidia 'x-raying' chips any better? No, it's just as underhanded. But the point is I'm tired of both companies slinging mud, and it's even worse when it's hypocritical.

Should ATi use a single chip and bridge it to save the costs of having two different processors being made at once? Sure, it's a good business move. Is it any worse that nV just uses AGP processors with an HSI? No, not necessarily, because we haven't seen any decent PCI-E benches yet. So lambasting one method or the other won't matter until people get their hands on them and see for themselves.

Either way, the point of the tech (for either company) is moot, as I stated before, because the PCI-E bus's potential won't be tapped for quite a while yet, so these new solutions are likely to offer no tangible gains in performance. It will still come down to which card has the most raw power, so saying you'll never buy an AGP card bridged to PCI-E is your choice, but it will likely not matter. Like I said before, PCI-E offers benefits beyond sheer graphics bus speed, but the switch is being made now, so the cards have to be brought out. Most people will still use AGP mobos for quite a while yet, so I advise you to skip this first generation of cards from either camp until you can get a GPU that actually makes use of the speed.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Well, I wasn't counting NV when I said no one is mentioning it, since they are the ones making the allegations.

I meant anyone with any objectivity in the matter.

Maybe ATI will explain what's on the x-ray pics but I really hope they don't respond to this nonsense at all.

Well, I guess technically they have already denied the NV accusations.

I doubt my much loved ti4400 will be replaced with a 6800.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
ZobarStyl, ATI is not doing anything similar to NV, either with filtering or with bridge chips.
 

ss284

Diamond Member
Oct 9, 1999
3,534
0
0
Originally posted by: LTC8K6
Well, I wasn't counting NV when I said no one is mentioning it, since they are the ones making the allegations.

I meant anyone with any objectivity in the matter.

First thing, it was mentioned when someone linked to it earlier in the thread. Hence I mentioned it again.

Secondly, you might want to reread the link, as you seem to be confused as to exactly what nvidia is accusing ati of. I would think that this allegation is pretty important considering the subject matter.

Lastly, since Nvidia and ATI are both far from objective on this matter, I guess by your logic we shouldn't even bother to pay attention to statements from either of them.

Originally posted by: LTC8K6
Maybe ATI will explain what's on the x-ray pics but I really hope they don't respond to this nonsense at all.

Well, I guess technically they have already denied the NV accusations.

I doubt my much loved ti4400 will be replaced with a 6800.

It was stupid of Nvidia to base this entire accusation on a picture, but it would be in ATI's best interest to deny the accusations and clearly prove them false.

Originally posted by: LTC8K6
ZobarStyl, ATI is not doing anything similar to NV, either with filtering or with bridge chips.

They might not be exactly the same, but last I checked, filter optimizations are pretty similar to filter optimizations and bridge chips are pretty similar to bridge chips.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: LTC8K6
Well, I wasn't counting NV when I said no one is mentioning it, since they are the ones making the allegations.

I meant anyone with any objectivity in the matter.

Maybe ATI will explain what's on the x-ray pics but I really hope they don't respond to this nonsense at all.

Well, I guess technically they have already denied the NV accusations.

I doubt my much loved ti4400 will be replaced with a 6800.

It was an allegation when Nvidia said it. The moment Orton confirmed it, it was no longer an allegation. Do you understand this?

I hope ATI does explain what's on the X-ray, but what makes it nonsense? It is what it is. Buffer or bridge. Done.

I doubt you even like your ti4400, let alone love it. It's obvious what corner you're in, despite the current card that you own. I don't have a problem with that, but you sure do.

And stop having a conversation with yourself... :p
 

ss284

Diamond Member
Oct 9, 1999
3,534
0
0
Originally posted by: keysplayr2003
Originally posted by: LTC8K6
Well, I wasn't counting NV when I said no one is mentioning it, since they are the ones making the allegations.

I meant anyone with any objectivity in the matter.

Maybe ATI will explain what's on the x-ray pics but I really hope they don't respond to this nonsense at all.

Well, I guess technically they have already denied the NV accusations.

I doubt my much loved ti4400 will be replaced with a 6800.

It was an allegation when Nvidia said it. The moment Orton confirmed it, it was no longer an allegation. Do you understand this?

I hope ATI does explain what's on the X-ray, but what makes it nonsense? It is what it is. Buffer or bridge. Done.

I doubt you even like your ti4400, let alone love it. It's obvious what corner you're in, despite the current card that you own. I don't have a problem with that, but you sure do.

And stop having a conversation with yourself... :p

Key, read the Inquirer articles, but pay attention to the date and the exact wording. ATI admitted that future native PCI-e products will be bridged down to AGP. However, Nvidia is accusing ATI of using an AGP to PCI-e bridge on their current PCI-e cards (the opposite of the bridge chip ATI admitted to using in future chips), which would effectively muddle ATI's definition of a native PCI-e chip.

-Steve
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
They might not be exactly the same, but last I checked, filter optimizations are pretty similar to filter optimizations and bridge chips are pretty similar to bridge chips.

Couldn't have said it better myself. :thumbsup:

Once again, I think neither company should be tossing around accusations, but above all I'm saddened that ATi, who earned some real respect with their last generation of cards, has now lowered themselves to this level. If they had just let nV develop their tech without incident, no one would care much, but instead they talked it down and said the bridging was a bad idea, and now they are eating their words.
 

Cuular

Senior member
Aug 2, 2001
804
18
81
Once again, I think neither company should be tossing around accusations, but above all I'm saddened that ATi, who earned some real respect with their last generation of cards, has now lowered themselves to this level. If they had just let nV develop their tech without incident, no one would care much, but instead they talked it down and said the bridging was a bad idea, and now they are eating their words.

I'm going to have to disagree. They said that bridging from an older technology with inherently lower specs to a newer, faster technology is bad. They did not say that bridging in general was bad.

I don't know how many of the people who have posted in here understand the implications, but AGP8X can only transfer data in a single direction at a time, and does so half as fast as the current PCI-E spec. PCI-E can transfer data in both directions on the pipe at the same time, with each direction being twice as fast as AGP8X.

Now nvidia has done 2 things both mentioned by ATI that should cause anyone looking at the situation to raise their eyebrows.

1. They came up with their own proprietary AGP16X implementation, that still transfers data in a single direction at a time.

2. They then use a bridge chipset to convert that to PCI-E.


Downgrading technology in the future to fit the few diehards who still have AGP8X machines and don't want to upgrade to the faster PCI-E is a long way from taking current, lacking technology and bridging it to a future, more robust technology.

If you can't make the distinction between the two, then you need to go study technology a bit more.
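The half-duplex vs. full-duplex point above can be sanity-checked with the peak figures usually quoted for the two buses. A quick sketch, using the commonly cited theoretical maximums (not measurements):

```python
# Commonly cited peak theoretical transfer rates, in MB/s.
AGP_8X = 2133          # half duplex: one direction at a time
PCIE_LANE = 250        # per lane, per direction (first-generation PCI-E)
LANES = 16

pcie_per_direction = PCIE_LANE * LANES   # 4000 MB/s each way
pcie_aggregate = 2 * pcie_per_direction  # 8000 MB/s, both ways at once

# Each PCI-E direction is roughly twice AGP 8x's one-way rate...
print(pcie_per_direction / AGP_8X)       # ~1.88
# ...and since AGP only moves data one way at a time, the
# full-duplex aggregate is nearly 4x what AGP can use.
print(pcie_aggregate / AGP_8X)           # ~3.75
```

These are peak bus numbers only; as later posts note, whether any card of this generation actually fills either pipe is a separate question.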
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
current lacking technology

See, that's my point: current technology (AGP 8x) has not been maxed out by any stretch. When 4x vs 8x benches were done, the results were insignificant, pretty much showing that even 4x was sufficient bandwidth. So now we have nV moving a card that already has more than enough bandwidth to an even larger amount, and it makes sense to me that the gains will be minimal at best. If you increase something that is already in excess, your gains are not likely to be significant. I think nV realized that and decided that if they can get their bridge to connect without much latency, the result would be comparable to ATi's native PCI-E cards (this remains to be proven or disproven; I have no crystal ball that tells me who has better PCI-E cards, but PCI-E's current usefulness to graphics cards is small).

Giving something much more than it needs is not likely to be detrimental, but bridging something down from a two-way huge pipe to a one-way smaller pipe might choke some of the future ATi AGP cards. And although I don't think anyone is going to cling to AGP-based mobos for any reason other than the price of upgrading, hampering their AGP line before PCI-E fully kicks in might hurt ATi. As I said many posts ago, I doubt that the reason for me to upgrade to PCI-E will be graphics bandwidth, as it's a non-issue with current cards, but rather dual-card setups and throughput for things like gigabit LAN.
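The "excess bandwidth" argument above is essentially Amdahl's law: a faster bus only helps the slice of frame time actually spent on bus transfers. A rough sketch, with an invented 5% bus-bound fraction purely for illustration:

```python
def overall_speedup(bus_fraction, bus_speedup):
    """Amdahl's law: only the bus-bound fraction of frame time
    benefits from a faster bus; the rest is unchanged."""
    return 1.0 / ((1.0 - bus_fraction) + bus_fraction / bus_speedup)

# If only ~5% of frame time is bus-bound, doubling one-way bus
# bandwidth (AGP 8x -> PCI-E x16) buys under 3% overall.
print(overall_speedup(0.05, 2.0))   # ~1.026
```

The 5% figure is made up; the point is only that when the bus is nowhere near the bottleneck, even a large bus speedup barely moves frame rates.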
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: ss284
Originally posted by: keysplayr2003
Originally posted by: LTC8K6
Well, I wasn't counting NV when I said no one is mentioning it, since they are the ones making the allegations.

I meant anyone with any objectivity in the matter.

Maybe ATI will explain what's on the x-ray pics but I really hope they don't respond to this nonsense at all.

Well, I guess technically they have already denied the NV accusations.

I doubt my much loved ti4400 will be replaced with a 6800.

It was an allegation when Nvidia said it. The moment Orton confirmed it, it was no longer an allegation. Do you understand this?

I hope ATI does explain what's on the X-ray, but what makes it nonsense? It is what it is. Buffer or bridge. Done.

I doubt you even like your ti4400, let alone love it. It's obvious what corner you're in, despite the current card that you own. I don't have a problem with that, but you sure do.

And stop having a conversation with yourself... :p

Key, read the Inquirer articles, but pay attention to the date and the exact wording. ATI admitted that future native PCI-e products will be bridged down to AGP. However, Nvidia is accusing ATI of using an AGP to PCI-e bridge on their current PCI-e cards (the opposite of the bridge chip ATI admitted to using in future chips), which would effectively muddle ATI's definition of a native PCI-e chip.

-Steve

I can see where you're coming from, Steve, but the article is really irrelevant. I'll tell you why. ATI scoffed at Nvidia for implementing a bridge chip on their cards to be able to run in a PCI-e mobo. They said performance would suffer because of it. Now it turns out that ATI is going to bridge their stuff also. It's the hypocrisy that is to be noted here, not the wording in an article or any dates. If ATI is not doing it now, they will be. And it doesn't matter which way the bridge goes, either: AGP to PCI-e, or PCI-e to AGP. A bridge is a bridge is a bridge.

Kind of a "stick your foot in your mouth" situation for ATI. Again.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Cuular
Once again, I think neither company should be tossing around accusations, but above all I'm saddened that ATi, who earned some real respect with their last generation of cards, has now lowered themselves to this level. If they had just let nV develop their tech without incident, no one would care much, but instead they talked it down and said the bridging was a bad idea, and now they are eating their words.

I'm going to have to disagree. They said that bridging from an older technology with inherently lower specs to a newer, faster technology is bad. They did not say that bridging in general was bad.

I don't know how many of the people who have posted in here understand the implications, but AGP8X can only transfer data in a single direction at a time, and does so half as fast as the current PCI-E spec. PCI-E can transfer data in both directions on the pipe at the same time, with each direction being twice as fast as AGP8X.

Now nvidia has done 2 things both mentioned by ATI that should cause anyone looking at the situation to raise their eyebrows.

1. They came up with their own proprietary AGP16X implementation, that still transfers data in a single direction at a time.

2. They then use a bridge chipset to convert that to PCI-E.


Downgrading technology in the future to fit the few diehards who still have AGP8X machines and don't want to upgrade to the faster PCI-E is a long way from taking current, lacking technology and bridging it to a future, more robust technology.

If you can't make the distinction between the two, then you need to go study technology a bit more.

Interesting last statement there. Study technology more, huh? Maybe you should take your own advice too, then. Taking a current "lacking technology" and bridging it? How is AGP 8x lacking when AGP 4x is barely being fully utilized? PCI-e is a sales gimmick, as anyone with half a brain cell in here will tell you. Somewhere in the distant future, PCI-e vid cards will show their benefits, but not within the next year or so, until it becomes mainstream.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: gsellis
Looking at the rope attached to that clapper... ;) - Having fun with you Insomniac - because otherwise, you are totally on the mark, with this exception.

I do know of one example, currently in beta, where it does matter. ATI helped write this article, but it 'rings' true: doing real-time high-definition editing exceeds AGP bandwidth.

Pinnacle, ATI, Intel announce HD Editing solution

There is almost always an exception to the rules. This is the one. Of course, the card will be a one-off Pinnacle-ATI card similar to the current Liquid Edition Pro card, which is based on the ATI Radeon 8500 All-In-Wonder setup.



That's video editing... these cards are made for high-end gaming, and that's what I'm referring to. There are plenty of other apps that can saturate the AGP bus, but I don't consider that relevant, as they're such a minuscule portion of the market.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
X600 is not high-end. The 6800U and X800XT PE both are, though, and both appear to allow for at least basic video editing (they have multiple video inputs, at any rate). NV40's media processor seems to be an inroad into video editing.
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
It's highly unlikely that ATI is going to even respond to this smear attempt by nVidia. They sat quietly through nVidia's months of gloating, while everyone speculated that the 6800 would kill the R420. It's not their style to release press releases to discount rumors started by nVidia's marketing department. Do they really need to give Fuad or the CEO of their competitor any validity to the point of denying their allegations?

I still can't believe that some here (Keys) believe that any bridge is a bridge and is therefore bad. There's a clear difference in what nVidia is doing in bridging. From an OEM standpoint, bridging is bad because it's going to cost them more for the chips (due to a separate chip with I/O pins). nVidia clearly stated (in their conference call) that they will pass the cost of the bridge on to their customers, because they believe PCI-E is a value-added product and therefore expect their customers to pay a premium for it. That was a nice spin on saying "yes, a bridge will cost more and we don't intend to eat that cost ourselves". ATI went the other route and implemented PCI-E natively to save money on the chips, giving them a clear advantage in the OEM game (which is where the real money is). Since Intel is set to announce the PCI-E chipsets this weekend, I suspect PC makers will start announcing their new machines with PCI-E, and I'm very interested to see which graphics vendor they've chosen (I bet this is what's got nVidia's panties in a bind).

I also doubt that ATI will need to bridge their new PCI-E chips back to AGP in the future, simply because most vendors will switch to PCI-E, and AGP will go away fast in the high and mid range. I suspect AGP will still be common in low-end PCs for some time, but I also don't think these companies are going to be willing to put some fancy new PCI-E video card into those low-end systems. Instead, these vendors will just put in some cheap last-gen AGP card to fit the bill.
 

hysperion

Senior member
May 12, 2004
837
0
0
Let's say you have a 4-lane highway (AGP 4x) and an 8-lane highway (AGP 8x)... then you have a 16-lane highway (PCI-E).

Does it matter which highway you drive on if there's never a traffic jam or slowdown? Hell no. Everyone already knows that AGP 8x has more than enough bandwidth for current cards; 4x is probably enough.

That's why a bridge chip doesn't matter, regardless of who's doing it. It doesn't matter which way the bridge goes, either. Neither way is better, and I find it laughable that people are arguing one way is better. The only way it is better is if there is no bridge, period, as in ATI's PCI-E card without a bridge on the chip running on a PCI-E motherboard. Why? Because a chain (in this case a bus) is only as strong as its weakest link. If you take a 2.0GB/s signal and throw it down a 1.0GB/s path that turns into 2.0GB/s shortly after (like Nvidia's smaller-to-larger-bus solution), you end up with 1.0GB/s making it there. If you take a 2.0GB/s signal and throw it down a 2.0GB/s path that shrinks to 1.0GB/s (like ATI's solution), you get the same result. The ONLY time this comes into play is when an Nvidia card is used on a PCI-E motherboard, because it isn't native. But guess what, THAT DOESN'T MATTER EITHER, because as I said before, AGP 8x is MORE than enough bandwidth for current video cards and it doesn't become a bottleneck anyway. Maybe two generations from now it will matter, but by then who cares, as both will be using it natively anyway. PCI-E is just sh1t Intel is shoving at us that isn't needed, other than the dual video card arrays, which do seem promising. Have you seen the benchmarks of the new Intel chipsets that were just posted? Pretty pathetic.
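The weakest-link argument above boils down to taking the minimum over each hop in the path. A toy sketch, with made-up GB/s figures purely for illustration:

```python
def effective_bandwidth(hops_gbps):
    """A chained bus path is limited by its slowest hop."""
    return min(hops_gbps)

AGP, PCIE = 2.1, 4.0   # illustrative one-way rates in GB/s

nv_bridged = effective_bandwidth([AGP, PCIE])    # AGP core -> bridge -> PCI-E slot
ati_bridged = effective_bandwidth([PCIE, AGP])   # PCI-E core -> bridge -> AGP slot

# Bridge direction doesn't change the bottleneck:
# both paths cap at the AGP link's speed.
print(nv_bridged == ati_bridged == AGP)   # True
```

Which matches the post's point: whichever side of the bridge is slower sets the ceiling, and as long as that ceiling is above what the card actually uses, the bridge is invisible in practice.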
 

hysperion

Senior member
May 12, 2004
837
0
0
Downgrading technology in the future to fit the few diehards who still have AGP8X machines and don't want to upgrade to the faster PCI-E is a long way from taking current, lacking technology and bridging it to a future, more robust technology. If you can't make the distinction between the two, then you need to go study technology a bit more.

A current lacking technology? Can you show me some benchmarks that show AGP lacking in bandwidth, or lacking in general? I don't need an answer, because NO, you can't. All the bandwidth-related benchmarks show that current cards do not use AGP's available bandwidth and probably won't for some time, at least until the next generation. Maybe you should study technology a little more, buddy; this reminds me of comments from the ignorant masses who think higher MHz is always better. Ever hear of the law of diminishing returns? Personally, I'd think Nvidia's and ATI's hardware designers are a lot smarter than you, but maybe you should apply and show them how to build better video cards. I'll give them a good reference for you: your posts. Maybe then ATI will hire you for their PR department. Or maybe ATI will give you some of their profits because you're a friend of the company.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: jasonja
It's highly unlikely that ATI is going to even respond to this smear attempt by nVidia. They sat quietly through nVidia's months of gloating, while everyone speculated that the 6800 would kill the R420. It's not their style to release press releases to discount rumors started by nVidia's marketing department. Do they really need to give Fudo or the CEO of their competitor any validity to the point of denying their allegations?

I still can't believe that some here (Keys) believe that any bridge is a bridge and is therefore bad. There's a clear difference in what nVidia is doing in bridging. From an OEM standpoint, bridging is bad because it's going to cost them more for the chips (due to a separate chip with I/O pins). nVidia clearly stated (in their conference call) that they will pass the cost of the bridge on to their customers, because they believe PCI-E is a value-added product and therefore expect their customers to pay a premium for it. That was a nice spin on saying "yes, a bridge will cost more and we don't intend to eat that cost ourselves". ATI went the other route and implemented PCI-E natively to save money on the chips, giving them a clear advantage in the OEM game (which is where the real money is). Since Intel is set to announce the PCI-E chipsets this weekend, I suspect PC makers will start announcing their new machines with PCI-E, and I'm very interested to see which graphics vendor they've chosen (I bet this is what's got nVidia's panties in a bind).

I also doubt that ATI will need to bridge their new PCI-E chips back to AGP in the future, simply because most vendors will switch to PCI-E, and AGP will go away fast in the high and mid range. I suspect AGP will still be common in low-end PCs for some time, but I also don't think these companies are going to be willing to put some fancy new PCI-E video card into those low-end systems. Instead, these vendors will just put in some cheap last-gen AGP card to fit the bill.

You're hopeless, Jasonja. I didn't say bridging was bad, did I? You will only see what you care to see, and in your mind, that's final. Most of your statements are very obviously incorrect, such as it costing more to bridge a chip. It actually saves the company the cost of producing two entirely separate cores, so it's much more cost-effective to bridge. Chew on that please and let me know how it tastes.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Well, ATI has denied that there is a bridge in those X-rays, and I see no other evidence of bridges being presented by anyone.

I am still waiting for the source of the quote about ATI's alleged PCI-E to AGP bridge.

Again, I am not buying a card with an AGP to PCI-E bridge, period.

ATI's RADEON X-series visual processors use a full, native 16-lane PCI Express bus interconnect to transfer data between the VPU and CPU in both directions simultaneously.

No bridge mentioned. When ATI changes this, I'll worry about it.