Originally posted by: keysplayr2003
Do you have any idea how much better Xfire would appeal to consumers and reviewers alike if ATI replaced the external dongle with an internal interconnect solution? It is FAR more elegant to do so. Sure, people say, "It doesn't bother me to have a dongle; who looks in the back of their computer anyway?" But then ask them: given a choice, would they prefer an internal connector or the dongle? If anyone says they prefer the dongle, then that person is absolutely full of it. But because they have NO choice currently with Xfire, and HAVE to have an external cable to connect them, then of course their response will be, "It really doesn't bother me." I am speaking of ATI fans who want to see ATI do better than Nvidia. Xfire is sloppy in comparison in its current incarnation. Performance seems to be there, but it's just sloppy IMHO. Maybe 1 or 2 gens of Xfire down the road, some ATI engineer will see the light.
Originally posted by: Ackmed
Except that 3DFX's SLI used an external type of cable, and even a cable that connected the cards inside the PC. It had no competition, so I doubt it was "rushed". I understand that it's your opinion that it has the dongle because it was rushed, but past evidence shows that the only other time this was done, it doesn't look like that was the case.
Originally posted by: Matthias99
From a technology standpoint, the way ATI is doing things may actually end up giving better performance, especially with SuperAA. Calling it a 'rush job' or 'afterthought' because it has an external connection (as opposed to internal) between the cards is a little silly.
Originally posted by: Ackmed
Originally posted by: keysplayr2003
Do you have any idea how much better Xfire would appeal to consumers and reviewers alike if ATI replaced the external dongle with an internal interconnect solution? It is FAR more elegant to do so. Sure, people say, "It doesn't bother me to have a dongle; who looks in the back of their computer anyway?" But then ask them: given a choice, would they prefer an internal connector or the dongle? If anyone says they prefer the dongle, then that person is absolutely full of it. But because they have NO choice currently with Xfire, and HAVE to have an external cable to connect them, then of course their response will be, "It really doesn't bother me." I am speaking of ATI fans who want to see ATI do better than Nvidia. Xfire is sloppy in comparison in its current incarnation. Performance seems to be there, but it's just sloppy IMHO. Maybe 1 or 2 gens of Xfire down the road, some ATI engineer will see the light.
Since you felt the need to bump it after it had fallen way off the front page... at least attempt to respond to the replies directed towards you.
Originally posted by: Ackmed
Except that 3DFX's SLI used an external type of cable, and even a cable that connected the cards inside the PC. It had no competition, so I doubt it was "rushed". I understand that it's your opinion that it has the dongle because it was rushed, but past evidence shows that the only other time this was done, it doesn't look like that was the case.
Originally posted by: Matthias99
From a technology standpoint, the way ATI is doing things may actually end up giving better performance, especially with SuperAA. Calling it a 'rush job' or 'afterthought' because it has an external connection (as opposed to internal) between the cards is a little silly.
SuperAA is much faster than SLI AA. It's not even close. Obviously the way ATi did it has some advantages.
I would like for you to sit at a table of ATi engineers and help them "see the light", as you put it.
Originally posted by: keysplayr2003
SuperAA is much faster than SLI AA? Congratulations! Is the external dongle required for SuperAA? I thought it was done through the PCI-e bus, no? So what does SuperAA have to do with the dongle?
3dfx now belongs to Nvidia, and Nvidia didn't even use the external dongle even though they acquired the original SLI technology from 3dfx. What does this tell you?
If you could arrange it with your buddies, I would love to have a sit-down with ATI engineers, so just let me know what strings you can pull to make it happen.
Originally posted by: Ackmed
SuperAA is much faster than SLI AA. It's not even close. Obviously the way ATi did it has some advantages.
There have been many people saying that it's faster but does not look as good. It tops out at 14X compared to 16X for SLI.
Originally posted by: munky
I believe SuperAA is done by the compositing engine on the master card. It shouldn't depend on the connector type (dongle/bridge), but doing it over the PCI-e bus would most likely be slower than through a dedicated connection.
http://www.techreport.com/reviews/2005q4/radeon-x1800-crossfire/index.x?pg=2
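To put rough numbers behind munky's point, here is a minimal back-of-envelope sketch in Python. The resolutions, 60 Hz refresh, 4 bytes per pixel, and the ~4 GB/s nominal figure for a PCI-e 1.x x16 link are my own illustrative assumptions (not figures from the TechReport article), and SuperAA modes that exchange multisampled buffers would need considerably more:

# Back-of-envelope estimate: how much PCI-e bandwidth would be eaten by
# shipping the slave card's finished frames to the master card's
# compositing engine if no dedicated link were used. All figures below
# are my own assumptions for illustration.

def frame_traffic_mb_s(width, height, bytes_per_pixel=4, fps=60):
    """Approximate MB/s needed to move one card's rendered output each frame."""
    return width * height * bytes_per_pixel * fps / (1024 ** 2)

PCIE_X16_MB_S = 4000  # rough theoretical one-way peak for a PCI-e 1.x x16 slot

for w, h in [(1280, 1024), (1600, 1200), (2048, 1536)]:
    traffic = frame_traffic_mb_s(w, h)
    share = traffic / PCIE_X16_MB_S
    print(f"{w}x{h} @ 60 Hz: ~{traffic:.0f} MB/s (~{share:.0%} of an x16 link)")

Even a few hundred MB/s fits on the bus in theory, but it has to share the link with the normal command and texture traffic, which is the usual argument for giving the composited frames a dedicated connection between the cards.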
Originally posted by: keysplayr2003
What does it tell me? It tells me that this was the height of technology. Ten years ago. That was how 3dfx was able to do SLI at the time. It tells me that ATI's implementation is even worse than 3dfx's ten years ago, because even 3dfx did not require a "master" card. Only two cards from the same manufacturer were required. So in effect, Xfire is even more primitive than 3dfx's SLI a decade ago. I'm talking, of course, about the implementation, not the technology in the cards themselves.
So, yes. I believe Xfire was a rush to market so ATI would "appear" to have something to offer against Nvidia's SLI.
Nobody is avoiding your point, Ackmed. You seem to have neatly avoided one of mine about the sit-down with ATI engineers. Sometimes it takes a layman, like me, to make the geeks see the obvious.
Originally posted by: keysplayr2003
What does it tell me? It tells me that this was the height of technology. Ten years ago. That was how 3dfx was able to do SLI at the time. It tells me that ATI's implementation is even worse than 3dfx's ten years ago, because even 3dfx did not require a "master" card. Only two cards from the same manufacturer were required. So in effect, Xfire is even more primitive than 3dfx's SLI a decade ago. I'm talking, of course, about the implementation, not the technology in the cards themselves.
So, yes. I believe Xfire was a rush to market so ATI would "appear" to have something to offer against Nvidia's SLI. Nobody is avoiding your point, Ackmed. You seem to have neatly avoided one of mine about the sit-down with ATI engineers. Sometimes it takes a layman, like me, to make the geeks see the obvious.
Originally posted by: Wreckage
Originally posted by: Ackmed
SuperAA is much faster than SLI AA. It's not even close. Obviously the way ATi did it has some advantages.
There have been many people saying that it's faster but does not look as good. It tops out at 14X compared to 16X for SLI.
Personally anything above 8X is a waste to my eyes, but some people must play some slow games where they have time to stop and stare.
Originally posted by: Creig
Originally posted by: keysplayr2003
What does it tell me? It tells me that this was the height of technology. Ten years ago. That was how 3dfx was able to do SLI at the time. It tells me that ATI's implementation is even worse than 3dfx's ten years ago, because even 3dfx did not require a "master" card.
In case you hadn't noticed, video card architecture has changed quite a bit in the past 10 years. What is necessary for today's PCI-E/DX9.0c/1600x1200/4xAA/16xAF can't be directly compared to yesterday's AGP/DX6.0/800x600/no AA/no AF.
Car tires are STILL round and lightbulbs are STILL just glowing filaments as they have been since they were invented. Yet we're still using them today. Just because an idea is "old" doesn't mean it's bad.
Quit trying to compare apples to oranges.
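To give a rough sense of scale for Creig's point, here is a quick illustrative calculation in Python (my own simplified assumptions: one color sample per pixel at 800x600 with no AA, four samples per pixel at 1600x1200 with 4xAA, ignoring color depth, Z, and AF entirely):

# Rough, illustrative comparison of how much more pixel data a 2005-era
# setup pushes per frame versus the 3dfx days. Sample counts are
# simplifications chosen purely for illustration.

old_samples = 800 * 600        # 800x600, no AA: one sample per pixel
new_samples = 1600 * 1200 * 4  # 1600x1200 with 4xAA: four samples per pixel

print(f"Samples per frame, then: {old_samples:,}")
print(f"Samples per frame, now:  {new_samples:,}")
print(f"Ratio: {new_samples / old_samples:.0f}x more raw sample data per frame")

That is roughly a 16x difference in raw sample data per frame before anisotropic filtering even enters the picture, so the demands placed on a dual-card setup today are in a different league from what 3dfx's SLI had to handle.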
Originally posted by: Matthias99
Originally posted by: keysplayr2003
What does it tell me? It tells me that this was the height of technology. Ten years ago. That was how 3dfx was able to do SLI at the time. It tells me that ATI's implementation is even worse than 3dfx's ten years ago, because even 3dfx did not require a "master" card. Only two cards from the same manufacturer were required. So in effect, Xfire is even more primitive than 3dfx's SLI a decade ago. I'm talking, of course, about the implementation, not the technology in the cards themselves.
So, yes. I believe Xfire was a rush to market so ATI would "appear" to have something to offer against Nvidia's SLI.
ATI's use of an external compositing engine was a conscious design choice intended to improve performance. This led to the need for a separate 'master' card (since otherwise they would have to put the compositing engine on every card, driving up costs for people who aren't using Crossfire), and for a dedicated high-speed connection between the cards (which NVIDIA also uses; they just put it inside the case).
Somehow you're getting out of this that Crossfire is inferior to not just NVIDIA SLI, but the old 3DFX SLI (which worked completely differently)?
Is anybody understanding the logic here? Or is it just me?
Originally posted by: keysplayr2003
Nobody is avoiding your point, Ackmed. You seem to have neatly avoided one of mine about the sit-down with ATI engineers. Sometimes it takes a layman, like me, to make the geeks see the obvious.
That their solution is, from a performance standpoint, probably the best one (at the cost of some flexibility in terms of hardware)?
Originally posted by: Pete
Um, did 3dfx's SLI even use a dongle? I recall the dongle was for passing through the output of your 2D card, but I think the SLI-specific chores were handled by an internal connector (albeit "dongly," unlike NV's silicon bridge). Sorry to deprive people of another historical "parallel."
Anyway, since we're all listing XFire X1800XT reviews, I don't know if GamePC's review has been mentioned yet. They tested with Cat 5.13. The benchmarks are only with typical settings (no SLI-specific modes).
Originally posted by: Rollo
Sheesh Creig, stop trying to make the proverbial silk purse out of a sow's ear.
Almost all the reviewers have the same opinion as Keys about the dongle/master card/compositor chip - it's not like he fell out of the sky spouting Martian philosophy and nobody but him gets it.
(he's pretty much in the majority)
I'd care more about that stuff too if there wasn't what I consider more important stuff: driver issues, lack of flexible defaults.
Crossfire X850 was a joke.
Crossfire X1800 is halfway there.
(half good, half bad)
Originally posted by: Creig
Originally posted by: Rollo
Sheesh Creig, stop trying to make the proverbial silk purse out of a sow's ear.
Sheesh Rollo, stop trying to make out like ATI is staffed by nothing but incompetent idiots.
ATI did what they could in the time allotted. Nobody said ATI didn't have talent. They just didn't have the time.
Originally posted by: Rollo
Almost all the reviewers have the same opinion as Keys about the dongle/master card/compositor chip - it's not like he fell out of the sky spouting Martian philosophy and nobody but him gets it.
Is it as "elegant" as the internal Nvidia bridge? No. But as long as it works, who cares if they have 1 more cable behind their computer? It's not as if you're showing off the back of your computer case to house guests.
It's that same fact again. Not enough time. They had to go with an external compositor because it was the only thing they could have done, or knew how to do. It's the fact that they used an external cable in the first place that I can't get over. There is really no question in my book. This was thrown together the best way they knew how in a short amount of time.
Originally posted by: Rollo
(he's pretty much in the majority)
Apparently he's not.
You need to back this one up, brudda. Just sayin' it don't make it so. Check as many reviews as you can find. You'll change this tune quickly enough. Of course, not publicly.
Originally posted by: Rollo
I'd care more about that stuff too if there wasn't what I consider more important stuff: driver issues, lack of flexible defaults.
The "driver issues" and "lack of flexible defaults" issues are nowhere near as bad as your soapbox rants would like to have people believe.
No, they're bad. But they will get worked out over time, just as NV worked out SLI's shortcomings.
Originally posted by: Rollo
Crossfire X850 was a joke.
I'll mostly agree with you on that one. It seemed to be more for the benefit of their PR dept than anything else. But for people with LCDs, the refresh rate issue is meaningless. So it wasn't totally useless.
Ah, those silver linings.
Originally posted by: Rollo
Crossfire X1800 is halfway there.
(half good, half bad)
Crossfire is over 3/4 there and getting better quickly.
Now you're arguing in fractions? Ok, lemme try. Crossfire is 9/16 * X / compositor + dongle = 64/100*3.14 there.
Originally posted by: keysplayr2003
...stuff...
Originally posted by: Creig
Originally posted by: Rollo
I'd care more about that stuff too if there wasn't what I consider more important stuff: driver issues, lack of flexible defaults.
The "driver issues" and "lack of flexible defaults" issues are nowhere near as bad as your soapbox rants would like to have people believe.
Originally posted by: Rollo
Crossfire X1800 is halfway there.
(half good, half bad)
Crossfire is over 3/4 there and getting better quickly.
Originally posted by: keysplayr2003
It's that same fact again. Not enough time. They had to go with an external compositor because it was the only thing they could have done, or knew how to do. It's the fact that they used an external cable in the first place that I can't get over. There is really no question in my book. This was thrown together the best way they knew how in a short amount of time.
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: Rollo
Crossfire X850 was a joke.
I'll mostly agree with you on that one. It seemed to be more for the benefit of their PR dept than anything else. But for people with LCDs, the refresh rate issue is meaningless. So it wasn't totally useless.
Ah, those silver linings.
Originally posted by: keysplayr2003
Rushed. Rushed. Rushed. I can almost hear the echoes of the whips crackin' in the ATI development labs. One guy in a suit slashing the whip around, groaning, "Work faster, you insignificant nothing of a scientist!!! We have money to steal from our uninformed sheep. I don't care how you do it, JUST DO IT RIGHT THE F NOW!!!" LOL.