R580: Here by January and smoking 2 512 7800s in SLI


nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: ElFenix
this front page article has to be the worst news rollo has heard all month.

I always scratch my head in wonder when I see posts like this. It's almost as if people seem to think it would make me sad or angry if ATI did better. :confused:

LOL, far from it. I wish they'd launch a card today that was 10X faster than the current best, with 10X better IQ, that cost $5.

Why?

Because I'd go buy it? :roll: :eek:

The only people who should get angry about a consumer products company's success are the ones working for its competitors, and only then because they know overtime is probably on the way.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Steelski
Originally posted by: Jeff7181
Originally posted by: DAPUNISHER
AT has their RD580 preview up, ATI is going to be owning the overclockers and high-end graphics titles if nV has nothing better than their current flagship offerings out at that time.

That's the funny thing... nVidia has PLENTY of time to work on new things.

Yes....Yes it does.
If the X1800XT was released on time...............please oh please tell me who would have had the crown for the past 6 months......
LOL Yeah, and if nVidia had released next year's parts six months ago, they would be owning! :roll: :eek:

Silly Jeff.
Silly Demoth. We can only discuss what has actually happened, because you have no idea whatsoever whether nVidia would have countered the XT with the 512 GTX back then if necessary. It's only good business to stay just ahead of your competition; it reduces the amount you have to surpass your last effort by, and saves on R&D costs.

We would all be sitting here wondering how the GTX got beat in almost everything.
If "ifs" and "buts" were candies and nuts, we'd ALL have a Merry Christmas.

I hardly think that ATI did nothing to increase the performance for this next gen, just like I think that Nvidia has not been standing around. But I can hardly see how the GTX would have dominated anything if the R520 was there.
Of course it wasn't, so you have no point whatsoever.

Does anyone watch Formula 1 here? I guess if you did you would have seen Michael Schumacher lose his crown this year.
I think watching car racing is exactly as much fun as pulling up a chair next to the Interstate highway and watching cars go by.

Because of a crappy car..... From mid-season, when hopes were nearly dashed, Ferrari stopped any real development on their car and just went with the flow. Guess why.
I am betting that they will still be the team to beat next year (according to the new world champ) because their development program will not have stopped for the next car.
You honestly think that has something to do with the video card industry? That if Ferrari works that way, ATI and nVidia must too? Wow.

 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
That performance for just an update seems a little unbelievable. This is one of those things where I gotta say: I'll believe it when I see it.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Steelski
Originally posted by: Jeff7181
Originally posted by: DAPUNISHER
AT has their RD580 preview up, ATI is going to be owning the overclockers and high-end graphics titles if nV has nothing better than their current flagship offerings out at that time.

That's the funny thing... nVidia has PLENTY of time to work on new things.

Yes....Yes it does.
If the X1800XT was released on time...............please oh please tell me who would have had the crown for the past 6 months......
Silly Jeff.
We would all be sitting here wondering how the GTX got beat in almost everything.
I hardly think that ATI did nothing to increase the performance for this next gen, just like I think that Nvidia has not been standing around. But I can hardly see how the GTX would have dominated anything if the R520 was there.
Does anyone watch Formula 1 here? I guess if you did you would have seen Michael Schumacher lose his crown this year. Because of a crappy car..... From mid-season, when hopes were nearly dashed, Ferrari stopped any real development on their car and just went with the flow. Guess why.
I am betting that they will still be the team to beat next year (according to the new world champ) because their development program will not have stopped for the next car.

If, maybe, would've, should've, could've... none of that theoretical crap matters if they don't have a product for sale. Where's the x1800XT? Newegg doesn't have them. ZipZoomFly doesn't have them. Monarch doesn't have them. If the x1800XT performs 10 times better than a 7800GTX, what difference does it make if I can't find one for sale?

I love your Michael Schumacher analogy. He shows up with a crappy car; ATI shows up with a non-existent card. In both situations, neither of them "win", even if he is a better driver and even if ATI's card performs better. Schumacher is still behind in points and ATI is still behind in sales.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
But with the "quality" setting, the 9700p looks better than either Nv's "balanced" or "aggressive"

The reason I didn't quote a larger section is I had a feeling you hadn't read the article and instead of reading it after I suggested it you would make a comment on the quality modes.... you are very predictable, I'll give you that-

Visually, the quality of the image constructed by NVIDIA GeForce FX in "Application" and "Balanced" modes is similar to ATI RADEON 9700 PRO in the "Quality" mode.

This angle (22.5°) is "inconvenient" for the anisotropic filtering algorithm from ATI. RADEON 9700 PRO exposes to shame its fuzzy textures. NVIDIA GeForce FX, and also GeForce4, keep the same quality of texture filtering.

What exactly were you saying again? The NV30 was clearly inferior to the NV20 in terms of AF, but the R300 based parts are pathetic at best. For the record- I am currently and have been running a R9800Pro for a long time now in my main rig. Actually, the last nVidia card I purchased was a GeForceDDR back when they first hit retail.

It takes a pretty one-sided person to claim the superiority of the NV30, or to dismiss its shortcomings.

I'm quoting the article YOU provided the link to. I pointed out where the NV30 fell down flat along with where it did well. The R520 is the best current board at handling AF hands down- that's why I want it- but it has everything wrong with it the NV30 did also. The x1K series is a failure for ATi overall- I don't see how that can reasonably be debated. Why would they be planning on launching a PE edition and be leaking R580 info already if they didn't know it was an abject failure? It is the NV30 fiasco all over again.
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: BenSkywalker
but it has everything wrong with it the NV30 did also.

LOL, no.

The x1K series is a failure for ATi overall- I don't see how that can reasonably be debated. Why would they be planning on launching a PE edition and be leaking R580 info already if they didn't know it was an abject failure?

Companies are always launching new products; why did NVIDIA release the GTX 512?

Why have I heard rumors about the G80 recently...

I still doubt ATi is launching a PE edition; it doesn't look like it's needed. Enable AA and the lead the GTX512 has becomes a lot smaller (with these cards, who plays without AA anyways?).

It is the NV30 fiasco all over again.

No, not until ATi starts claiming the R520 is 32 pipes (8x1 when it was 4x2), starts blaming TSMC for a broken process (when the NV30 was broken), starts going after companies about unfair benchmarks (Futuremark, Valve), gets benchmarks removed from games (TRAOD), starts lowering image quality to compete speed-wise, and needs a new driver for every new game/patch released.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: BenSkywalker
But with the "quality" setting, the 9700p looks better than either Nv's "balanced" or "aggressive"

The reason I didn't quote a larger section is I had a feeling you hadn't read the article and instead of reading it after I suggested it you would make a comment on the quality modes.... you are very predictable, I'll give you that-

Visually, the quality of the image constructed by NVIDIA GeForce FX in "Application" and "Balanced" modes is similar to ATI RADEON 9700 PRO in the "Quality" mode.

I had a feeling you would quote this piece, but that statement refers to the texture quality of the screenshots posted directly above. This does not apply to AF quality, since the article discusses it further down.

This angle (22.5°) is "inconvenient" for the anisotropic filtering algorithm from ATI. RADEON 9700 PRO exposes to shame its fuzzy textures. NVIDIA GeForce FX, and also GeForce4, keep the same quality of texture filtering.

What exactly were you saying again? The NV30 was clearly inferior to the NV20 in terms of AF, but the R300 based parts are pathetic at best. For the record- I am currently and have been running a R9800Pro for a long time now in my main rig. Actually, the last nVidia card I purchased was a GeForceDDR back when they first hit retail.

They are both inferior to the nv20. But unless you were running in "application" mode, the bilinear mipmapping would result in a picture hardly any less pathetic than the r300 at 22 degrees, no matter at what angle you look at it.

"by introducing the mixture of tri-linear and bi-linear filtering instead of true tri-linear filtering they tend to worsen the image quality quite tangibly.

And this worsening does show itself, although it is not that noticeable in static screenshots. When the game is running, that is, the scene is dynamic, this worsening reveals itself on those parts of the picture that do not undergo tri-linear filtering."

It takes a pretty one-sided person to claim the superiority of the NV30, or to dismiss its shortcomings.

I'm quoting the article YOU provided the link to. I pointed out where the NV30 fell down flat along with where it did well. The R520 is the best current board at handling AF hands down- that's why I want it- but it has everything wrong with it the NV30 did also. The x1K series is a failure for ATi overall- I don't see how that can reasonably be debated. Why would they be planning on launching a PE edition and be leaking R580 info already if they didn't know it was an abject failure? It is the NV30 fiasco all over again.

Does the x1k series have a particular hardware/design flaw that makes it stumble when running a particular shader type? Is the introduction of a PE edition an indication that a particular card series is a failure? By that logic, does it also mean the x850 was a failure because Ati released an x850xt pe?

As it stands now, the x1k series is not as competitive as I would like, and if Ati scraps the whole thing and releases the r580 in January then it might as well be Ati's nv30. But if Ati releases an x1800xt pe and it competes well with the 512 gtx in price and performance, then it would not be a failure anywhere close to the failure of nv30.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: nts
Enable AA and the lead the GTX512 has becomes a lot smaller (with these cards, who plays without AA anyways?).

This is true. It's too bad hardly any reviews show benches with Adaptive AA and Transparency AA to show users what they can expect. With both of these features selected, they are very comparable in terms of performance. They tell us about these new features, yet don't show us numbers. I don't understand why they do that; it's just silly to me.
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: Ackmed
Originally posted by: nts
Enable AA and the lead the GTX512 has becomes a lot smaller (with these cards, who plays without AA anyways?).
This is true. It's too bad hardly any reviews show benches with Adaptive AA and Transparency AA to show users what they can expect. With both of these features selected, they are very comparable in terms of performance. They tell us about these new features, yet don't show us numbers. I don't understand why they do that; it's just silly to me.
Be careful. 5150joker got flamed for going down the same road..
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
I know, I backed up what he said too. I think it's just flat-out dumb not to include benches of new features of cards, especially when they have detailed images of how they improve IQ. I use TAA every single time I can. It GREATLY improves AA quality in games such as BF2, HL2, etc. Hard is one of the few that does use them, but since it's not at the same settings, you can't directly compare. From the limited numbers I've seen, when enabling the higher-quality AA and using 16xAF, they are very close in performance.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
^^Yep some of the more rabid nV fanboys were accusing me of being a fanboy for even bringing up this point. Of course the thread was derailed by BFG's personal attacks and absurd point about 2048x1536 resolution being a more valid IQ test than transparent SSAA/AAA and HQ AF. I even mentioned that my benchmarks were selective because very few websites utilized higher AA/AF IQ modes for testing but the fanboys wouldn't hear of it. :roll:
 

shabby

Diamond Member
Oct 9, 1999
5,782
45
91
I'm sorry but... fat pipe technology?
First it was extreme pipes, now fat pipes, what's next, bertha pipes?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
Remember when looking at the benches to compare nV's Balanced to ATi's High Quality
Balanced and Aggressive had notably inferior IQ to ATi's Quality. Application was the comparable setting, but that's where the NV30's performance really tanked.

Be careful. 5150joker got flamed for going down the same road..
Joker got flamed for his trolling, not for requesting TrAA/AAA benchmarks. Of course, with the evidence gone thanks to his dirty editing tactics, you may well be forgiven for thinking he was just the poor victim.

Of course the thread was derailed by BFG's personal attacks and absurd point about 2048x1536 resolution being a more valid IQ test than transparent SSAA/AAA and HQ AF.
Uh, no. You requested "max IQ" and then produced results @ 1280x1024 with 4xAAA/4TrAA as evidence, something which isn't even remotely "max IQ". To refresh your memory you'll need to read my posts, because your posts have been doctored by your edits.
 

Dainas

Senior member
Aug 5, 2005
299
0
0
For the R580 to be a G70 killer it would need some pretty noticeable architectural differences from the R520. ATI most definitely can release the R580 in January; it's just very unlikely that they will release it before it has reached a level of performance noticeably above the GTX 512... which makes a post-spring launch very likely.

Please stop blowing smoke until you have real evidence of what will be what. If you have that much faith in ATI, let it speak for itself when the R580 is here. This forum is rife with Nvidia and ATI fanboys... but the "Nvidiots" have at least substantiated their main claims in the last 6 months, while the ATI side has either bent over before talking or thrown the topic into the realm of pure opinion. Quite the opposite of 3 years ago.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Dainas
For the R580 to be a G70 killer it would need some pretty noticeable architectural differences from the R520. ATI most definitely can release the R580 in January; it's just very unlikely that they will release it before it has reached a level of performance noticeably above the GTX 512... which makes a post-spring launch very likely.

Please stop blowing smoke until you have real evidence of what will be what. If you have that much faith in ATI, let it speak for itself when the R580 is here. This forum is rife with Nvidia and ATI fanboys... but the "Nvidiots" have at least substantiated their main claims in the last 6 months, while the ATI side has either bent over before talking or thrown the topic into the realm of pure opinion. Quite the opposite of 3 years ago.

. . . what we DO know:

From Anand's ATI RD580: Dual x16 Crossfire Preview
the R580 GPU is still scheduled for launch in January.


no smoke or mirrors . . . just fact.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Dainas
For the R580 to be a G70 killer it would have some pretty noticeable architectural differences from the R520.

Why exactly would it need to be so different? The r520 architecture has already shown itself to be very well suited for shader-intensive modern DX9 games (FEAR, BF2, etc.), and for enabling AA with a smaller performance hit than any other card. Just by adding more shader ALUs, Ati can make the r580 a lot faster than the 512 gtx.
 

Steelski

Senior member
Feb 16, 2005
700
0
0
Originally posted by: Rollo
Originally posted by: Steelski
Originally posted by: Jeff7181
Originally posted by: DAPUNISHER
AT has their RD580 preview up, ATI is going to be owning the overclockers and high-end graphics titles if nV has nothing better than their current flagship offerings out at that time.

That's the funny thing... nVidia has PLENTY of time to work on new things.

Yes....Yes it does.
If the X1800XT was released on time...............please oh please tell me who would have had the crown for the past 6 months......
LOL Yeah, and if nVidia had released next year's parts six months ago, they would be owning! :roll: :eek:

Silly Jeff.
Silly Demoth. We can only discuss what has actually happened, because you have no idea whatsoever whether nVidia would have countered the XT with the 512 GTX back then if necessary. It's only good business to stay just ahead of your competition; it reduces the amount you have to surpass your last effort by, and saves on R&D costs.

We would all be sitting here wondering how the GTX got beat in almost everything.
If "ifs" and "buts" were candies and nuts, we'd ALL have a Merry Christmas.

I hardly think that ATI did nothing to increase the performance for this next gen, just like I think that Nvidia has not been standing around. But I can hardly see how the GTX would have dominated anything if the R520 was there.
Of course it wasn't, so you have no point whatsoever.

Does anyone watch Formula 1 here? I guess if you did you would have seen Michael Schumacher lose his crown this year.
I think watching car racing is exactly as much fun as pulling up a chair next to the Interstate highway and watching cars go by.

Because of a crappy car..... From mid-season, when hopes were nearly dashed, Ferrari stopped any real development on their car and just went with the flow. Guess why.
I am betting that they will still be the team to beat next year (according to the new world champ) because their development program will not have stopped for the next car.
You honestly think that has something to do with the video card industry? That if Ferrari works that way, ATI and nVidia must too? Wow.

Oh dear... you and your buddy are so thick... it's unbelievable.

The GTX512 would not have been available then........whyyyyyyy? Because the RAM did not exist, not at those speeds, and nor did the chip, which is a refined version according to Nvidia.

And yes, I do think that the Ferrari comparison has something to do with the way an industry works. If you and your buddy can't see squat, then youz are a pair of idiots that can't seem to make any comparisons whatsoever.

As for your ifs and buts, I don't see how you can comment that the GTX is not beaten by an XT, and most likely by the similarly priced 256MB version XT. It's only an "if" if we don't really know.
Yes, I don't know what Nvidia would have countered with, but it is likely to have been a situation like the last-gen cards, with one winning some things and the other winning others.
 

Demoth

Senior member
Apr 1, 2005
228
0
0
I'd like to repeat: everything is still rumor until the actual testing numbers are in. The purpose of this post was not to hype up that ATI is better than NVIDIA or such. The purpose was to warn those who were unaware that the R580 is still slated to be out by January despite the long delay of the R520, and that initial leaks are showing a very large increase in performance.

With expected reviews by mid-December, my intent was to inform those who might be ready to grab a 7800 512 GTX or an X1800XT for $700 and then a week later see the next-gen card for the same price giving much better performance. That is, after all, one of the main jobs of this board: to inform people of what may be coming down the pike, so they can make an informed decision.

It's true you can always wait and hold off on a vid card purchase because something better at the top end is right around the corner. However, this is an unusual generation for ATI because we have 2 top-end releases that pretty much overlap. Some people are likely unaware of this fact.
 

imported_Rampage

Senior member
Jun 6, 2005
935
0
0
This is retarded.

ATI has more rumors and old wives' tales to spread... and Nvidia has real product that is dominating.
It truly IS dominating (not just theoretically, in the future) and it's for sale.
The answer is clear.

ATI says "R520 will dominate" and it comes too little, too late.
I'll believe all this when I see it. Nvidia pretty much delivers on their promises lately.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: apoppin
Originally posted by: Dainas
For the R580 to be a G70 killer it would need some pretty noticeable architectural differences from the R520. ATI most definitely can release the R580 in January; it's just very unlikely that they will release it before it has reached a level of performance noticeably above the GTX 512... which makes a post-spring launch very likely.

Please stop blowing smoke until you have real evidence of what will be what. If you have that much faith in ATI, let it speak for itself when the R580 is here. This forum is rife with Nvidia and ATI fanboys... but the "Nvidiots" have at least substantiated their main claims in the last 6 months, while the ATI side has either bent over before talking or thrown the topic into the realm of pure opinion. Quite the opposite of 3 years ago.

. . . what we DO know:

From Anand's ATI RD580: Dual x16 Crossfire Preview
the R580 GPU is still scheduled for launch in January.


no smoke or mirrors . . . just fact.

that's certainly a problem if you consider that fact. just because someone announces something, that's fact? when was r520 supposed to launch? ati announced launches of products which simply disappeared (x700xt anyone?)... and now if they say something it's fact?

many products "launched" oct 5 are still either mia or very hard to obtain (with the exception of x1800xl). ati's entire PR strategy for 2005 seems to have been "smoke and mirrors" to appease their loyalists (or their stockholders?).

i certainly welcome the release of r580, but i'll believe it when i see it.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I had a feeling you would quote this piece, but that statement refers to the texture quality of the screenshots posted directly above. This does not apply to AF quality, since the article discusses it further down.

You should rewrite that; it sounds like you are saying their explicit discussion about AF isn't about AF- I know that can't be what you actually mean.

But unless you were running in "application" mode, the bilinear mipmapping would result in a picture hardly any less pathetic than the r300 at 22 degrees, no matter at what angle you look at it.

Brilinear, not straight bilinear. There was actually a fairly sizeable difference.

Does the x1k series have a particular hardware/design flaw that makes it stumble when running a particular shader type?

So far evidence is suggesting it stumbles badly when you are utilizing pixel shaders combined with smaller complexity/high quantity vertex shaders.

Is the introduction of a PE edition an indication that a particular card series is a failure?

When they can't ship their lower-tier product as it is, are already pushing against noise/heat issues, and their pricing only allows them to introduce it at $700, then I would tend to say yes.

By that logic, does it also mean the x850 was a failure because Ati released an x850xt pe?

The x850xtpe was an attempt at a halo card.

But if Ati releases an x1800xt pe and it competes well with the 512 gtx in price and performance, then it would not be a failure anywhere close to the failure of nv30.

The watercooled NV30 outran the R9700Pro in almost everything. Didn't make it any less of a failure.

Balanced and Aggressive had notably inferior IQ to ATi's Quality. Application was the comparable setting, but that's where the NV30's performance really tanked.

I can pull up quotes with you saying straight bilinear AF was great on the R9700Pro, aliasing, mip banding and all. I can also pull up quotes of you from not that long ago talking about how non-angle-dependent AF was a pipe dream and wasn't plausible for today's GPUs, as the performance hit would be too steep. Then again, you were going on about how it didn't really make a difference anyway, although I see that has changed somewhat since it was ATi that pulled it off. BTW- Still waiting for your frothing-at-the-mouth rants about ATi's 'cheats' in F.E.A.R. :D
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: CaiNaM
Originally posted by: apoppin
Originally posted by: Dainas
For the R580 to be a G70 killer it would need some pretty noticeable architectural differences from the R520. ATI most definitely can release the R580 in January; it's just very unlikely that they will release it before it has reached a level of performance noticeably above the GTX 512... which makes a post-spring launch very likely.

Please stop blowing smoke until you have real evidence of what will be what. If you have that much faith in ATI, let it speak for itself when the R580 is here. This forum is rife with Nvidia and ATI fanboys... but the "Nvidiots" have at least substantiated their main claims in the last 6 months, while the ATI side has either bent over before talking or thrown the topic into the realm of pure opinion. Quite the opposite of 3 years ago.

. . . what we DO know:

From Anand's ATI RD580: Dual x16 Crossfire Preview
the R580 GPU is still scheduled for launch in January.


no smoke or mirrors . . . just fact.

that's certainly a problem if you consider that fact. just because someone announces something, that's fact? when was r520 supposed to launch? ati announced launches of products which simply disappeared (x700xt anyone?)... and now if they say something it's fact?

many products "launched" oct 5 are still either mia or very hard to obtain (with the exception of x1800xl). ati's entire PR strategy for 2005 seems to have been "smoke and mirrors" to appease their loyalists (or their stockholders?).

i certainly welcome the release of r580, but i'll believe it when i see it.
"supposed to launch?" Who said?

As usual you are missing something important: ATI did NOT announce the r520 until it was ready . . . just over 6 weeks ago, and delivered on their promise of availability for 3 product launches [out of 4 . . . the x800 xfire was a disaster]. Even their flagship XT is at least as available as the 7800-ultra.

IF ATI failed [somehow] on their announcement, they would lose face and fans. They cannot afford to do that.

I am certain we will see the r580 launched in January and product available shortly thereafter.* My prediction; call me on it.

*unless Anand's article misquoted ;)


edit: even theInq isn't so sure of their date, as they are only saying the r580 is 'taped out' . . . if so, that should make it February. ;)