Some news about the X1800XL core.


Keysplayr

Elite Member
Jan 16, 2003
Basically, ATI put in a large wafer order to TSMC, and TSMC produced a bunch of wafers before ATI found the problem. Rather than letting all those wafers go to waste, which would cost ATI money, they had the new metal layers and connections applied to these older wafers to repair them. As far as I'm concerned, and this is something you can take with a grain of salt as it is just my opinion, the initial wave of XLs is from tape-out number 2. Problem is, there will be no way to tell whether you're getting an old core or a newer core when you purchase an XL, and there is no telling how many of these old cores will be in circulation. This is kind of like the first-revision R300 core that nobody wanted because of some issues. I know, I had 3 of them. Again, my opinion.
I would say at this point, if you are considering an XL, go with a 7800GT instead for now. Or just go for broke and get the XT when available. Let pricing be your judge, I suppose.
 

TStep

Platinum Member
Feb 16, 2003
Here are my reasons for believing the OP:

-launch is 6 months late
-known fact that the repeated tape-outs were over clockspeed issues
-late launch spearheaded not by the flagship model, but by the XL
-near term availability of XL, but not XT

my take:
-current stock of cores cannot make XT speed---->this is why XL will be the first card available
-binning cores right now for XT speed so there is sufficient quantity at time of availability
-overclocking on the first releases will suck, as all the XT-speed-capable cores are being hoarded in the short term.

hope I'm wrong, but the facts seem to point in that direction.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Nice post Keys!

It looks like 99.9% of the people understand this, but we know someone here still doesn't get it.
 

Munky

Diamond Member
Feb 5, 2005
How come none of the reviewers did an OC test of the XL? We could have known this by now, or maybe they don't want anyone to know. But I also have my suspicions that the early XLs are actually failed XTs.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: munky
How come none of the reviewers did an OC test of the XL? We could have known this by now, or maybe they don't want anyone to know. But I also have my suspicions that the early XLs are actually failed XTs.

I was wondering that myself. Usually the reviewers touch at least briefly on overclocking on a new vid card.

 

crazydingo

Golden Member
May 15, 2005
Originally posted by: munky
How come none of the reviewers did an OC test of the XL? We could have known this by now, or maybe they don't want anyone to know. But I also have my suspicions that the early XLs are actually failed XTs.
Macci just confirmed that an 11K score is easily attainable on air. This is for the XT, not the XL.

Still waiting on XL's oc abilities. :clock:
 

TStep

Platinum Member
Feb 16, 2003
Originally posted by: crazydingo
Originally posted by: munky
How come none of the reviewers did an OC test of the XL? We could have known this by now, or maybe they don't want anyone to know. But I also have my suspicions that the early XLs are actually failed XTs.
Macci just confirmed that an 11K score is easily attainable on air. This is for the XT, not the XL.

Still waiting on XL's oc abilities. :clock:

Have a link? Interested @ what core/mem speeds it took for 11K.

Just guessing:
9K @ 625MHz x 16 pipes --------> 10B fill
11K would take 760MHz-ish, no? ---------> 12B fill

7800GTX: 12B fill @ 24 pipes --------> 500MHz; isn't this about the highest OC you 7800 guys are seeing reliably?

OC'd 7800 vs. OC'd XT real-world performance would be fairly close then, no?
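TStep's back-of-the-envelope numbers check out if you take theoretical pixel fillrate as core clock times pixel pipelines. A quick sketch (the clock speeds and pipe counts are the guesses from the post above, not confirmed specs):

```python
def fillrate_gpix(core_mhz, pipes):
    """Theoretical pixel fillrate in Gpixels/s: clock (MHz) x pipelines / 1000."""
    return core_mhz * pipes / 1000.0

print(fillrate_gpix(625, 16))  # stock X1800 XT: 10.0 ("10B fill")
print(fillrate_gpix(760, 16))  # ~760MHz XT guess: 12.16 ("12B fill")
print(fillrate_gpix(500, 24))  # 500MHz overclocked 7800 GTX: 12.0 ("12B fill")
```

So an XT pushed to ~760MHz and a GTX pushed to ~500MHz do land at roughly the same theoretical fillrate, which is the basis of the "fairly close" claim.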
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: crazydingo
Originally posted by: munky
How come none of the reviewers did an OC test of the XL? We could have known this by now, or maybe they don't want anyone to know. But I also have my suspicions that the early XLs are actually failed XTs.
Macci just confirmed that an 11K score is easily attainable on air. This is for the XT, not the XL.

Still waiting on XL's oc abilities. :clock:

Yah, Linky Please? Would like to check 'er out.

 

Steelski

Senior member
Feb 16, 2005
Originally posted by: Cookie Monster
Originally posted by: M0RPH
Originally posted by: Cookie Monster
Actually I would buy the $100 cheaper 7800GT.
Nvidia right now has the performance/price crown.
Plus the 7800GT performs better.

Just read the Xbitlabs review, and about IQ: the 7 series edges out the X1 series in AA (such as 8xS and TSS vs. AAA, where ATI blurs out the jaggies), but the X1 series edges it out in AF IQ.

Both the 7 series and the X1 series are good cards, but right now the 7800GT looks to be the better option by far: $100 cheaper, cooler, consumes way less power, and faster than the X1800XL.



Sorry CM but you're sorely misinformed. Look at this picture:

Text

Look at the bottom half of the wall on the left. Huge difference. Can't be any clearer than that. That's just one example. I won't go into all the other ways the ATI IQ is superior. And I won't go into texture shimmering either.

Shimmering... it isn't an issue faced by many people, because it was later learned that it happens by chance, on certain games, and it's different across different systems. Many don't see the shimmering, but for the ones who do, it was fixed in the 78.03 (78.05?) driver, I think.
People seriously overplay the shimmering issue.
Anyway, back to informing you that the IQ is almost a tie.

Xbitlabs:
AF:
I have to draw your attention to the fact that we haven't found any real evidence pointing at the significant advantage of the enhanced AF mode over the standard AF mode. In other words, there is no big difference in the image quality of real games between the enhanced anisotropic filtering mode of the new RADEON X1800 XT and the standard anisotropic filtering of the new ATI solutions as well as of the other graphics cards.

AA:
As we can see from the screenshots, adaptive anti-aliasing of transparent textures works fine on RADEON X1000, however, the actual image quality improvement is not that significant, just like in case of alpha-textures multi-sampling by NVIDIA GeForce 7 (TMS, transparent multi-sampling). I have to stress that the Adaptive FSAA of the new RADEON X1000 is of much better quality than the similar mode by GeForce 7800 GTX, however it is still much lower than what the competitor's TSS (transparent textures super-sampling) would provide.

I would also like to say that adaptive anti-aliasing of alpha textures by RADEON X1800 XT may sometimes lead to their complete removal. In fact, it could be a driver issue, because the anti-aliasing masks can be set on the software level for ATI RADEON solutions.

So, the laurels for the best FSAA quality will still remain with NVIDIA for now.

Hothardware:

AA:
If you direct your attention to the water-tower and crane in the background of these images, the impact anti-aliasing has on image quality is readily apparent. In the "No AA" shots it seemed to us that the Radeon X850 XT Platinum Edition and Radeon X1800 XT had the lowest detail, and had the most prominent "jaggies." Look closely at the ladder on the water tower and you'll notice parts missing in the Radeon shots that are there on the GeForce 7800 GTX. With standard multi-sample 4X anti-aliasing enabled, though, it becomes much harder to discern any differences between the cards. The ladder in the background gets cleaned considerably, as do the cables on the crane. The same holds true when ATI's 6X MSAA and NVIDIA's 8xS AA is enabled, although in this comparison, we'd give an edge in image quality to NVIDIA, because the additional super-sampling applied by 8xS AA does a decent job of cleaning up edges of transparent textures.

However, at the very bottom of the page, we've got some screen shots using the Radeon X1000 family's new adaptive anti-aliasing algorithm. Adaptive AA is basically a combination of multi-sampling and super-sampling AA, similar to NVIDIA's 8xS mode, or a combination of NVIDIA's MSAA and the GeForce 7's transparency AA. ATI's adaptive AA mode super-samples any textures that have transparency to reduce jaggies that don't land on the edge of a polygon. There are multiple Adaptive AA modes available with the new X1000 family of cards. When in quality mode, for example, 4X Adaptive AA is a combination of 4X MSAA and 4X SSAA; 6X Adaptive AA is 6X MSAA and 6X SSAA. In performance mode though, the number of samples applied in the super-sample stage are halved (performance mode was not available in the drivers we used for testing). As you can see, ATI's adaptive AA does a great job of reducing jaggies in the scene. Open up a standard 4X or 6X AA shot, and compare the trees and grass in the scene to either of the adaptive AA screens. You'll see a significant reduction in the prominence of jaggies. Overall, we were impressed with the images produced by ATI's Adaptive AA. The X1800 XT produced some of the best images we have seen on the PC to date.

AF:
The same seemed to be true when inspecting the 16x aniso images. Of course, image quality analysis is subjective by its nature, but based on these images, we think the GeForce 7800 GTX has the best image quality as it relates to anisotropic filtering when standard "optimized" aniso is used. The new Radeon X1000 family of graphics cards offers another "high quality" anisotropic mode that doesn't have the same angular dependency as ATI's previous generation of cards. The new high-quality aniso mode offered by the X1000 applies nearly the same level of filtering regardless of the angle. Overall, the effect of enabling ATI's high-quality aniso mode is positive, as it does an even better job of sharpening textures and increasing the detail level. To fully appreciate ATI's high-quality aniso mode, though, you've got to see it in action. Still screen shots don't convey the full effect.


These are some of the sites discussing IQ, while most (e.g. Hexus, DriverHeaven) don't really go into looking at IQ.
You're misinformed. And the fact that TAA looks better than AAA is another thing you should know about.




AAAAAAAHHHHHHHHHH. Did you see that picture? AF quality on Nvidia is horrid.
And by the way, there is no performance drop with 6xAA for the X1800.
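As an aside, the Adaptive AA sample math the Hothardware passage describes is easy to tabulate. A sketch based only on that quoted description (the halving rule for performance mode comes from the review, not from ATI documentation):

```python
def adaptive_aa_samples(msaa, performance=False):
    """Per the quoted Hothardware description: quality mode pairs Nx MSAA with
    Nx SSAA on transparent textures; performance mode halves the SS samples."""
    ssaa = msaa // 2 if performance else msaa
    return {"msaa": msaa, "ssaa": ssaa}

print(adaptive_aa_samples(4))                    # quality 4x: {'msaa': 4, 'ssaa': 4}
print(adaptive_aa_samples(6, performance=True))  # performance 6x: {'msaa': 6, 'ssaa': 3}
```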

 

gunblade

Golden Member
Nov 18, 2002
Originally posted by: TStep
Here are my reasons for believing the OP:

-launch is 6 months late
-known fact that the repeated tape-outs were over clockspeed issues
-late launch spearheaded not by the flagship model, but by the XL
-near term availability of XL, but not XT

my take:
-current stock of cores cannot make XT speed---->this is why XL will be the first card available
-binning cores right now for XT speed so there is sufficient quantity at time of availability
-overclocking on the first releases will suck, as all the XT-speed-capable cores are being hoarded in the short term.

hope I'm wrong, but the facts seem to point in that direction.

Exactly what I thought.

 

Ronin

Diamond Member
Mar 3, 2001
Originally posted by: M0RPH
Maybe I should ask a mod to lock this. What you're doing is trying to scare people away from buying a good product based on an unsubstantiated rumor. Smells like FUD to me.

Go away. You'll be gone soon enough. Save us the trouble.

That FUD you're smelling? That's you.
 

M0RPH

Diamond Member
Dec 7, 2003
Originally posted by: keysplayr2003
Basically, ATI put in a large wafer order to TSMC and they produced a bunch of wafers before ATI found the problem. Rather than letting all those wafers go to waste, which would cost ATI money, they had the new metal layers and connections applied to these older wafers to repair them. As far as I am concerned, and this is something you can take with a grain of salt as it is just my opinion, the first initial wave of XL's are from tape out number

The key thing here that you do not understand is that the defect in question was in the metal layers, not in the silicon wafers. When they applied the new metal layers, the problem had been eliminated and so these metal layers were defect-free. Therefore, any chips produced would be defect-free.

Although, publicly, ATI representatives wouldn't lay blame on exactly where the issue existed, quietly some will point out that when the issue was eventually traced, it had occurred not in any of ATI's logic cells, but instead in a piece of "off-the-shelf" third-party IP whose 90nm library was not correct. Once the issue was actually traced, after nearly 6 months of attacking numerous points where they felt the problems could have occurred, it took them less than an hour to resolve in the design, requiring only a contact and metal change, and once back from the fab with the fix in place, stable, yield-able clockspeeds jumped on the order of 160MHz.

Some may question why the X1800 XLs are being introduced immediately, but the XTs coming a month later. Silicon for all the parts is in production, but the final configuration of the silicon and metal layers was resolved fairly late, and with production times taking up to three months from chip orders to final products, the XTs are coming a little later, once those chips appear. ATI can make XLs available fairly shortly because they had placed large silicon orders earlier, but stopped production once they realised there were still issues, meaning there were many cut wafers without metal layers. As the issues could be resolved with a change to the contacts and metal layers, ATI could utilise the silicon that had already been cut and apply new metal layers, and all of these are going towards the initial XL products.

So your whole interpretation of the matter is FLAWED.

Now I'm done here.


 

M0RPH

Diamond Member
Dec 7, 2003
Originally posted by: Ronin
Originally posted by: M0RPH
Maybe I should ask a mod to lock this. What you're doing is trying to scare people away from buying a good product based on an unsubstantiated rumor. Smells like FUD to me.

Go away. You'll be gone soon enough. Save us the trouble.

That FUD you're smelling? That's you.

You're a joke. You can't win an argument against me so you need to threaten me with bans all the time. It's getting old. BTW where's your X1800 card, Mr ATI Insider?
 

Ronin

Diamond Member
Mar 3, 2001
You defeat yourself in your arguments, and you attack more than you present proof, so please, don't argue an invalid point.

ATi has never, ever, provided me a pre-release product, in the 5 years I've been dealing with them. To quote an email:

As soon as they are ready to ship you are on the list. :)

My rep was actually here last week, and we discussed a few different things, one of them being availability. I didn't, however, get a date for when I'd receive a card. Rest assured, however, that I will, and it won't cost me a dime. You really don't want to get into a fight with me when it comes to hardware availability, M0RPH. It's a waste of both our times. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Ronin
You defeat yourself in your arguments, and you attack more than you present proof, so please, don't argue an invalid point.

ATi has never, ever, provided me a pre-release product, in the 5 years I've been dealing with them. To quote an email:

As soon as they are ready to ship you are on the list. :)

My rep was actually here last week, and we discussed a few different things, one of them being availability. I didn't, however, get a date for when I'd receive a card. Rest assured, however, that I will, and it won't cost me a dime. You really don't want to get into a fight with me when it comes to hardware availability, M0RPH. It's a waste of both our times. ;)

He just might be under Turtle hypnosis. Hey! It's possible. ;)

 

gsellis

Diamond Member
Dec 4, 2003
I am going to have to sort of side with Morph here. Keys, it does seem you are fishing for problems. The XL chip is a piece of silicon that ATI says will clock to a given speed. If it does not overclock beyond that, that does not make it defective.

BUT, as you state, quite correctly, if this puppy does not overclock and you are one of those folks that likes to overclock to the higher card rates, caveat emptor. Finding that out is important and it should be done (sorry Morph, but he is right that we should know.)

piddle - I added that word to see how many folks flame me without reading past the first two lines or paragraph - :)

 

jasonja

Golden Member
Feb 22, 2001
Wow... I seem to recall saying that the X1800 could be an overclocker's dream, and then I was called crazy because it taped out 3 times and had a dual-slot cooler, so there was no way it could be overclocked. Just wait until the XTs come out... We'll see 800MHz cores hit for sure.
 

ZobarStyl

Senior member
Mar 3, 2004
Originally posted by: jasonja
Wow... I seem to recall saying that the X1800 could be an overclocker's dream, and then I was called crazy because it taped out 3 times and had a dual-slot cooler, so there was no way it could be overclocked. Just wait until the XTs come out... We'll see 800MHz cores hit for sure.
Wait, I'm sorry... but you're trying to justify your initial remark, and your justification is that you just said it again? You don't make any sense. You were called crazy because expecting 800MHz is utter foolishness. Expect 700MHz with modded coolers and voltage changes, but world-famous overclockers only got 865MHz using LN2. From what we've seen, there's absolutely no reason to believe it will be an overclocker's dream.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: gsellis
I am going to have to sort of side with Morph here. Keys, it does seem you are fishing for problems. The XL chip is a piece of silicon that ATI says will clock to this. If it does not overclock by more, that does not make it defective.

BUT, as you state, quite correctly, if this puppy does not overclock and you are one of those folks that likes to overclock to the higher card rates, caveat emptor. Finding that out is important and it should be done (sorry Morph, but he is right that we should know.)

piddle - I added that word to see how many folks flame me without reading past the first two lines or paragraph - :)

Gsellis, I am not fishing for anything. This thing jumped in my boat and we are here trying to find out if it's a keeper or if we should throw it back. Nothing more, nothing less.

Your second paragraph hit it right on the head and I'm glad you got it.

Hehe,,, you said "piddle".... heheheh. ;)

 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: jasonja
Wow... I seem to recall saying that the X1800 could be an overclocker's dream, and then I was called crazy because it taped out 3 times and had a dual-slot cooler, so there was no way it could be overclocked. Just wait until the XTs come out... We'll see 800MHz cores hit for sure.

Well if you say so. That's way more than good enough for me.

 

Hacp

Lifer
Jun 8, 2005
Originally posted by: gunblade
Originally posted by: TStep
Here are my reasons for believing the OP:

-launch is 6 months late
-known fact that the repeated tape-outs were over clockspeed issues
-late launch spearheaded not by the flagship model, but by the XL
-near term availability of XL, but not XT

my take:
-current stock of cores cannot make XT speed---->this is why XL will be the first card available
-binning cores right now for XT speed so there is sufficient quantity at time of availability
-overclocking on the first releases will suck, as all the XT-speed-capable cores are being hoarded in the short term.

hope I'm wrong, but the facts seem to point in that direction.

Exactly what I thought.


I don't think ATI is binning anything. Those XLs were probably bad chips with the bad metal layers, but the new chips will probably be nice in terms of overclocking.
 

Pete

Diamond Member
Oct 10, 1999
Originally posted by: munky
How come none of the reviewers did an OC test of the XL? We could have known this by now, or maybe they don't want anyone to know. But I also have my suspicions that the early XLs are actually failed XTs.
I think it's b/c no reviewer had a program that could OC the X1000s. Macci and those few had a custom version of an app that apparently no one was successful in coaxing out of them. :) I'm sure we'll see OCing scores in the second wave of articles, in a few weeks.