ATI to bridge their chips


Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: LTC8K6
Well, ATI has denied that there is a bridge in those X-rays, and I see no other evidence of bridges being presented by anyone.

I am still waiting for the source of the quote about ATI's alleged PCI-E to AGP bridge.

Again, I am not buying a card with an AGP to PCI-E bridge, period.

ATI's RADEON X-series visual processors use a full, native 16-lane PCI Express bus interconnect to transfer data between the VPU and CPU in both directions simultaneously.

No bridge mentioned. When ATI changes this, I'll worry about it.

And you have every right not to. Question is, how do you know your not buying a bridged chip. Do you believe everything your told?
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
No, I was told that NV40 would crush R420. :D

I believe what is supported by good evidence.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Then you were lied to. NV40 does not crush the R420. They are roughly equivalent, with each better than the other in certain situations. I'm sure you were told that the R420 would crush the NV40 as well. I'm certain you believed that. Your choice.

Good evidence? Well, at the moment, there is nothing conclusive concerning bridging.

What good things do you have to say about the NV40? C'mon, bite your tongue. :p
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
I never believed the NV40 would crush the R420, so I don't believe everything I am told. :D

Well, I have said many times on this board earlier that since I had to build 2 new systems soon anyway, I might as well buy one of each new card and see them both first hand. I would upgrade my ti4400 to a 6800 for one system and get an X800 for the other system.

NV's behavior of late is just making me reconsider that decision.
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Originally posted by: keysplayr2003
Originally posted by: jasonja
It's highly unlikely that ATI is going to even respond to this smear attempt by nVidia. They sat quietly while nVidia had their months of gloating, while everyone speculated that the 6800 would kill the R420. It's not their style to release press releases to discount rumors started by nVidia's marketing department. Do they really need to give Faud or the CEO of their competitor any validity, to the point of denying their allegations?

I still can't believe that some here (Keys) believe that any bridge is a bridge and is therefore bad. There's a clear difference in what nVidia is doing in bridging. From an OEM standpoint, bridging is bad because it's going to cost them more for the chips (due to a separate chip with I/O pins). nVidia clearly stated (in their conference call) that they will pass the cost of the bridge onto their customers because they believe PCI-E is a value-added product and therefore expect customers to pay a premium for it. That was a nice spin on saying "yes, a bridge will cost more and we don't intend to eat that cost ourselves". ATI went the other route and implemented PCI-E natively to save money on the chips, giving them a clear advantage in the OEM game (which is where the real money is). Since Intel is set to announce the PCI-E chipsets this weekend, I suspect PC makers will start announcing their new machines with PCI-E, and I'm very interested to see which graphics vendor they've chosen (and I bet this is what's got nVidia's panties in a bind).

I also doubt that ATI will need to bridge their new PCI-E chips back to AGP in the future, simply because most vendors will switch to PCI-E and AGP will go away fast in the high and mid range. I suspect AGP will still be common in low-end PCs for some time, but I also don't think these companies are going to be willing to put some fancy new video card (based on PCI-E) into those low-end systems. Instead, these vendors will just put some cheap last-gen AGP card in to fit the bill.

Your hopeless Jasonja. I didn't say bridging was bad did I.. You will only see what you care to see and in your mind, thats final. Most of your statements are very obviously incorrect. Such as costing more to bridge a chip. It actually saves the company from the cost of producing 2 entirely separate cores. So its much more cost effective to bridge. Chew on that please and let me know how it tastes.


Sorry, you actually said "a bridge is a bridge is a bridge" and continue to imply that what nVidia and ATI are doing is the same thing. This clearly isn't true, and you seem to be the only person here arguing that point. For example: if I have two 100Mb/s networks and I bridge them with a 10Mb/s router, is that the same thing as bridging two 100Mb/s networks with a 100Mb/s router? Hardly.

Sure, it might save nVidia some initial development cost because they didn't have to tape out a new GPU core; instead they just made a bridge chip. However, they still have two chips (one bridge and one GPU). Sure, the bridge is small and cheap compared to the GPU, but it still has I/O pins and a package, and that can get expensive to produce even for small ASICs. Now nVidia's customers need to order/stock/buy TWO ASICs every time they want a PCI-E part, then they need to account for the additional expense of routing I/O lines on their board layout to connect the two chips, as well as the added board cost for the extra real estate and a second heatsink for the bridge. You may try to dismiss this as minimal, but when you are making thousands (or hundreds of thousands) of graphics boards, the added expense can quickly cut into your bottom line.
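To put rough numbers on it (every dollar figure here is a purely hypothetical illustration, not real BOM data), the per-board overhead compounds fast at OEM volumes:

    #include <cstdio>

    // Hypothetical illustration only; no actual BOM figures are public.
    int main() {
        const double bridgeAsic = 5.00;   // bridge chip itself (hypothetical)
        const double extraBoard = 1.50;   // added routing and real estate (hypothetical)
        const double heatsink   = 0.75;   // second heatsink (hypothetical)
        const int boards        = 100000;

        double perBoard = bridgeAsic + extraBoard + heatsink;
        printf("added cost: $%.2f per board, $%.0f across %d boards\n",
               perBoard, perBoard * boards, boards);  // $7.25 -> $725000
        return 0;
    }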

If you read carefully, I said this was bad for OEMs, not nVidia. However, if OEMs don't like it (which we can only tell when the design wins are announced), then it's bad for nVidia because they don't get the big OEM deals. nVidia's CEO stated in the last conference call that the bridge chip would carry a premium cost that they expected their customers to pay because it's "value added". That statement says you're going to pay more for that same old AGP part. Lastly, if the bridge is such a great idea and a cost-savings winner for nVidia, then why are they making the NV45, which is just a PCI-E version of NV40? Why not just stick that wonderful bridge on the NV40 board? Clearly the bridge was a stop-gap solution for nVidia until they could release a new core with PCI-E in it.

So please tell me which statements of mine are incorrect, and try backing up your own. Do you build ASICs for a living? Are you an EE? Or are you just regurgitating what you read on the Inquirer?


And you have every right not to. Question is, how do you know your not buying a bridged chip. Do you believe everything your told?

Do you? You're assuming the word of the Inquirer and nVidia is true. Perhaps if I want the truth about Ford's cars I should go ask a Chevy salesman?
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: keysplayr2003

Your hopeless Jasonja. I didn't say bridging was bad did I.. You will only see what you care to see and in your mind, thats final. Most of your statements are very obviously incorrect. Such as costing more to bridge a chip. It actually saves the company from the cost of producing 2 entirely separate cores. So its much more cost effective to bridge. Chew on that please and let me know how it tastes.

Speaking of hopeless, it is 'you're', not 'your'.

Don't be upset that you're wrong, and it's not a bridge chip.
 

Cat

Golden Member
Oct 10, 1999
1,059
0
0
PCI-E is more important than most of you think. GPU-to-CPU communication is pretty slow on current cards, requiring pipeline stalls. Things like occlusion culling, render-to-texture, and framebuffer reads can be significantly sped up by the asynchronous nature of PCI-E.
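Here's a minimal sketch of what that stall means in practice, using the polling pattern ARB_occlusion_query (OpenGL 1.5) allows; the two helpers are hypothetical stand-ins for real engine work, and an existing GL context is assumed. The query result still has to travel from GPU to CPU, so a faster, full-duplex bus shortens exactly this round trip:

    #include <GL/glew.h>

    extern void drawProxyGeometry();  // hypothetical: e.g. the object's bounding box
    extern void doOtherCpuWork();     // hypothetical: AI, sound, culling other objects

    // Returns true if any fragments of the proxy geometry passed the depth test.
    bool objectVisible(GLuint query) {
        glBeginQuery(GL_SAMPLES_PASSED, query);  // count samples that pass
        drawProxyGeometry();
        glEndQuery(GL_SAMPLES_PASSED);

        // Poll instead of blocking: fetching the result immediately would
        // stall the pipeline while the answer travels back over the bus.
        GLint ready = 0;
        while (!ready) {
            doOtherCpuWork();
            glGetQueryObjectiv(query, GL_QUERY_RESULT_AVAILABLE, &ready);
        }
        GLuint samplesPassed = 0;
        glGetQueryObjectuiv(query, GL_QUERY_RESULT, &samplesPassed);
        return samplesPassed > 0;  // occluded objects can skip their full mesh
    }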
 

hysperion

Senior member
May 12, 2004
837
0
0
Sorry, you actually said "a bridge is a bridge is a bridge" and continue to imply that what nVidia and ATI are doing is the same thing. This clearly isn't true, and you seem to be the only person here arguing that point. For example: if I have two 100Mb/s networks and I bridge them with a 10Mb/s router, is that the same thing as bridging two 100Mb/s networks with a 100Mb/s router? Hardly.

Your example is way flawed. What's happening is that nVidia's data starts out traveling through AGP and arguably loses nothing, because AGP 8x (effectively 16x in nVidia's case) hasn't reached its potential yet; it then moves onto PCI-E, which raises the ceiling but doesn't hurt the bandwidth, since the data has already been through the chokepoint. ATI puts its data through the higher-potential link first, and it hits the chokepoint later on.

Let's say two companies both built 10-mile tunnels. In company A's tunnel, you pay to get in at the entrance (a booth with a gate, we'll say) and then drive straight through. In company B's tunnel, you drive straight through and pay upon leaving (at the booth). Either way you were slowed up the same amount, because you hit the booth either way. It's just like nVidia and ATI: both hit this booth, only nVidia cards hit it first and ATI cards hit it later, which matters none, as the performance lost is exactly the same. PCI-E isn't needed yet for anything other than dual video arrays. Besides, AMD makes the fastest gaming processors right now, and they won't be supporting PCI-E until the next-gen cards are out anyway, so does it really matter?
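The tunnel argument boils down to simple arithmetic: sustained throughput through a chain of links is capped by the slowest link, and if the card's actual demand sits below every cap, the position of the slow hop is irrelevant. A back-of-the-envelope sketch (the peak figures are the published specs; the demand number is purely hypothetical):

    #include <algorithm>
    #include <cstdio>

    int main() {
        const double agp8x  = 2.1;  // GB/s, AGP 8x peak (the bridge hop; nVidia
                                    // claims its internal link runs faster than 8x)
        const double pcie16 = 4.0;  // GB/s, PCI-E x16 peak, per direction
        const double demand = 1.0;  // GB/s, hypothetical: current games use far
                                    // less than even AGP 8x offers

        // Bridged path: GPU -> AGP link -> bridge -> PCI-E slot.
        double bridged = std::min(demand, std::min(agp8x, pcie16));
        // Native path: GPU -> PCI-E slot directly.
        double native = std::min(demand, pcie16);

        printf("bridged: %.1f GB/s, native: %.1f GB/s\n", bridged, native);
        return 0;  // both print 1.0: the booth, not the tunnel, sets the pace
    }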
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: hysperion
Your example is way flawed. What's happening is that nVidia's data starts out traveling through AGP and arguably loses nothing, because AGP 8x (effectively 16x in nVidia's case) hasn't reached its potential yet; it then moves onto PCI-E, which raises the ceiling but doesn't hurt the bandwidth, since the data has already been through the chokepoint. ATI puts its data through the higher-potential link first, and it hits the chokepoint later on.

Let's say two companies both built 10-mile tunnels. In company A's tunnel, you pay to get in at the entrance (a booth with a gate, we'll say) and then drive straight through. In company B's tunnel, you drive straight through and pay upon leaving (at the booth). Either way you were slowed up the same amount, because you hit the booth either way. It's just like nVidia and ATI: both hit this booth, only nVidia cards hit it first and ATI cards hit it later, which matters none, as the performance lost is exactly the same. PCI-E isn't needed yet for anything other than dual video arrays. Besides, AMD makes the fastest gaming processors right now, and they won't be supporting PCI-E until the next-gen cards are out anyway, so does it really matter?

Uh... no. More like there are four tunnels (each company owns two). One tunnel has a 60 mph speed limit and one has a 30 mph speed limit. One company requires you to slow to 30 mph before you can go 60; the other lets you go 60 straight into the 60 zone. And for the 30 mph tunnel, the one company would require you to slow from 60 to 30 before you go into the tunnel. But there is no proof yet that that transition really exists for that one company. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Acanthus
it doesn't matter what it is... the cards don't need the bandwidth.

why did this thread make it to 2 pages?

I have no idea. And check out Ackmed, the spelling and grammar moderator for AT. LOL
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
What Anandtech has to say about this

With this article, we were also trying to put an end to the ATI vs. NVIDIA PCI Express debate. Our conclusion? The debate was much ado about nothing - both solutions basically perform the same. ATI's native PCI Express offering does nothing to benefit performance and NVIDIA's bridged solution does nothing to hamper performance. The poor showing of NVIDIA under Far Cry and Warcraft III is some cause for concern, which we will be looking into going forward. We will keep you all updated on any and all findings with regards to that issue as we come across them.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: oldfart
What Anandtech has to say about this

With this article, we were also trying to put an end to the ATI vs. NVIDIA PCI Express debate. Our conclusion? The debate was much ado about nothing - both solutions basically perform the same. ATI's native PCI Express offering does nothing to benefit performance and NVIDIA's bridged solution does nothing to hamper performance. The poor showing of NVIDIA under Far Cry and Warcraft III is some cause for concern, which we will be looking into going forward. We will keep you all updated on any and all findings with regards to that issue as we come across them.


You are the MAN!!!!
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Techreport disagrees.

I hardly think benchmarking games designed for AGP cards is a decent measure of PCI-E's abilities. Let's not forget video editing and the other things people use their PCs for. Also, it looks like ATI's PCI-E solution was a hit with Dell; there's no sign of nVidia anywhere in their new lineup (not even as an option).
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
You are completely right, jason, that the current generation of chips cannot utilize PCI-E for any real-world performance gain, only for theoretical bandwidth fun with ATi chips (and even that was still well below the purported PCI-E max bandwidth). That's why nV used bridge chips: there are no real gains to be had in a native PCI-E solution this go-around. However, by next generation the PCI-E platform will have matured, and nV can release native PCI-E solutions that are worth the money and the upgrade. By then, maybe enough people will have PCI-E setups that developers will start making games that take advantage of the bandwidth, but that's at least 1.5-2 years away. That is, of course, if games actually have any need for such bandwidth, since we never even maxed out AGP 8x (and, some believe, 4x), and added bandwidth won't do much good if the core still can't dish out decent performance (x600's perform exactly the same as equivalently clocked 9600's). nV might well save a lot of money sitting out native solutions this round, and they certainly aren't missing out on any real performance gains.

Also, not to flame or anything but
looks like ATI's PCI-E solution was a hit with Dell
...that really doesn't count for much to have Dell pushing your products...by your rationale, Intel EXTREME (woo) Graphics are a big hit with Dell, I can't believe I missed out on that one...:(

And on a more random note...for anyone I know who builds their own rig (i.e. knows what they're doing), Dell is a four-letter word :D
 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
Originally posted by: jasonja
Techreport disagrees.

I hardly think benchmarking games designed for AGP cards is a decent measure of PCI-E's abilities. Let's not forget video editing and the other things people use their PCs for. Also, it looks like ATI's PCI-E solution was a hit with Dell; there's no sign of nVidia anywhere in their new lineup (not even as an option).

Well, it all depends on which page of the article you link to:
http://techreport.com/reviews/2004q2/intel-9xx/index.x?pg=12

I think we can draw the following conclusions:

For GAMING, PCI Express does not add any performance benefit. It is highly probable that this will continue to be the case for the near future (next 3 years or so).

For sending data from the video card back to the CPU, PCI Express will make a difference on current ATI chips, but not on current Nvidia chips. However, the next iteration of chips from Nvidia will perform better in this area.

So, it really all depends on what you are doing. If you're playing games, doing CAD, or performing most types of visualization tasks, it won't matter which card you get, since you will be downloading data from the CPU and therefore won't benefit much from the added bandwidth.

If you are doing tasks that require the card to send data back to the CPU (HDTV video editing has been mentioned, but I'm not sure if any of the gaming cards can edit video on-board anyway), then maybe ATI will be better for your tasks.
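If you want to see the card-to-CPU difference for yourself, timing raw readbacks is the simplest test. Here's a minimal GLUT sketch (the window size and iteration count are arbitrary choices), since glReadPixels is exactly the upstream path in question:

    #include <GL/glut.h>
    #include <cstdio>
    #include <vector>

    int main(int argc, char** argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
        glutInitWindowSize(1024, 768);
        glutCreateWindow("readback test");

        std::vector<unsigned char> pixels(1024 * 768 * 3);
        glFinish();  // drain the pipeline so we time only the readbacks

        int t0 = glutGet(GLUT_ELAPSED_TIME);
        for (int i = 0; i < 100; ++i)  // 100 full-frame GPU->CPU transfers
            glReadPixels(0, 0, 1024, 768, GL_RGB, GL_UNSIGNED_BYTE, &pixels[0]);
        int t1 = glutGet(GLUT_ELAPSED_TIME);

        double sec = (t1 - t0) / 1000.0;
        double mb = 100.0 * pixels.size() / (1024.0 * 1024.0);
        printf("readback rate: %.1f MB/s\n", mb / sec);
        return 0;
    }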

This bridge, or lack thereof (depending on vendor), is really only one more feature to keep in mind when picking your next card. There simply is no "best" card out there, since the two main players are so close in performance. There is only "better for my application and budget."

-D'oh!
 

Aftermath

Golden Member
Sep 2, 2003
1,151
0
0
Several years ago:

Guy1: I think I want to get a Voodoo II card, but I'm not sure which one. I think X is the fastest from what I've seen.
Guy2: Really? I want a 2D/3D solution, so I'll probably go with nVidia and get a TNT.
Guy3: I think I'm going to go with an All in Wonder, I think it would be cool to hook up my console to my video card.


Today:

Guy1: I got the new ATi card.
Guy2: I got the new nVidia card.
Guy1: Yeah, mine gets 391fps in Quake 3.
Guy2: Really? Only 391? Mine gets 424.
Guy1: Oh? Well, I get around 6000 points in 3DMark2003. I like to chill out and watch it play over and over. It's a beautiful thing.
Guy3: That's nice, too bad your drivers suck. My ATi All in Wonder never worked right when I bought it.
Guy2: Yeah, haha. ATi is the sucks. nVidia is way better.
Guy1: Well at least they don't cheat with their drivers. HA. Stupid nVidia.

*hours later*

Guy1: Look! Your card is even a different shade of green! What's that, some special, cheaters card?
Guy2: Oh shut up. You're just hiding the fact that the solder points on your card are all disfigured and making the card slower.
Guy1: Oh they are not! You're just pissed because your box didn't come with as many cool pictures of CG girls on it.


*Meant as a distraction, not a finger pointer, name caller, or back scratcher. Not giving either side crap, just making fun of the situation between fanboys.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
It's not really funny; it's just standard marketing. As I recall, Nvidia grilled 3dfx over quite a few things that they are now doing themselves. And like I said, it's just standard marketing: you always try to make it look like your competition is doing something bad and you are doing something good. The first thing to do is completely ignore the marketing, since it never fully reflects the final product. Instead, you should look at performance and stability; that's what matters in the long run.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: AnnoyedGrunt
If you are doing tasks that require the card to send data back to the CPU (HDTV video editing has been mentioned, but I'm not sure if any of the gaming cards can edit video on-board anyway), then maybe ATI will be better for your tasks.
FYI - Pinnacle Liquid Edition 5.x and Pinnacle Studio 9 (some Hollywood FX only) both use the GPU for rendering some effects (there are GPU and CPU effects). My 9600XT is much faster at rendering than my 7500 AIW. The nVidia cards work too, but earlier drivers had issues that have since been fixed. So, yes, they work with gaming cards.

The HDV card will be an item purchased from Pinnacle and will probably be created from one of the R4xx cores, but that detail has not been disclosed, so it's only a guess.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
You do know that's just an ATi press release, right? While it may denote certain vendors being exclusive to ATi chips, I know for sure that Alienware isn't an exclusive ATi company, since their new Array setups were showcased with dual 6800u's. You already showed us Dell is using the x800's, but all that link shows is that ATi cards are actually being used by the OEMs, not that nV's aren't.
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Originally posted by: ZobarStyl
You do know that's just an ATi press release, right? While it may denote certain vendors being exclusive to ATi chips, I know for sure that Alienware isn't an exclusive ATi company, since their new Array setups were showcased with dual 6800u's. You already showed us Dell is using the x800's, but all that link shows is that ATi cards are actually being used by the OEMs, not that nV's aren't.

I'm well aware that's an ATI press release. nVidia's press release with their list of customers was released yesterday: Here. nVidia's list doesn't have ANY OEMs; no Dell, eMachines, IBM, Acer, Gateway, Sony, etc. anywhere on nVidia's list. Alienware is a "system builder", not an OEM. If you look at both announcements, you'll see that most of the people on nVidia's list are also on ATI's list.

...that really doesn't count for much to have Dell pushing your products...by your rationale, Intel EXTREME (woo) Graphics are a big hit with Dell, I can't believe I missed out on that one...

Wrong; winning Dell (the number one PC maker in the world) is a huge deal for any hardware company. That's a ton of income and volume, and it will significantly raise their market share numbers. ATI's out to make money, after all. Intel Extreme was in Dell's low- and mid-end computers; Dell has chosen ATI's parts for their high-end desktops (XPS and 8400). There's not even an upgrade option for nVidia in those machines.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Winning Dell over is indeed a boon for any hardware manufacturer from a sheer income standpoint; however, my point was that having Dell push your products doesn't necessarily mean that product is the best. If Dell is all about pushing products that have more future-oriented capabilities (such as DDRII and PCI-E), why do they not sell A64 setups? Thus I stand by my original statement, which is that any self-respecting person who has built their own computer is not the slightest bit interested in what Dell thinks is the best product.

Overall, I think you misunderstand me; I don't think the ATi solution is by any means sub-par. In fact, for the performance that the XT's give (either PCI-E or AGP, both are the same), I think they're great cards. But at the same time, I don't think a native PCI-E solution is inherently better at real-world applications at this point in time, and the benchmarks agree. However, if nV is still using a bridge solution for their second-generation PCI-E lineup, they'll fall short of ATi (who by then will likely have a design that actually utilizes the bandwidth), and no one will want nV PCI-E cards.

And though I hate to nitpick (I'm not one to tear apart the nuances of another's post), you listed Acer as not being on the nVidia list, while in fact it is.