If you think the XT is a bad deal then why do you think the GTX512 is a good deal?
Originally posted by: Rollo
...
Like you are going to listen to anybody and stop your flame-bait/trolling.
Originally posted by: Rollo
Do I need to go on?
Isn't it obvious? So much for having a life...
Originally posted by: jiffylube1024
Through all of your posts runs a bizarre undercurrent of fanboyism, as if you don't want ATI to sell a single card. Why is this?
Originally posted by: Alexstarfire
All in all, ATI was just doing what every company does. Praise their product and diss the competition. At least put more than 2 facts behind your allegations if you're going to do it.
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
Had the X1800XT been launched alongside the 7800, I could see that as a valid point. As it stands now, ATI has lost most, if not all, of its credibility with me.
Originally posted by: Alexstarfire
The IQ problems they talk about can be non-existent. I can't say for sure, but I'd put money on them having set ATI's IQ as high as they could and left nVidia's at default. The default IQ for nVidia is quality, but I believe it has a couple of the optimizations on. Put them at the highest IQ you can get and they look the same. I think nVidia looks better, but that's just a personal preference.
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
The original screenshots are from hardOCP, both cards at their highest settings
linky
ATi IQ > NVIDIA IQ
btw the second thing they talked about was HDR+AA, which is not possible on NVIDIA cards (some formats can be anti-aliased; others, like FP16, technically can't)
Originally posted by: GOREGRINDER
is it me or does the x1800 not even render any links in the chainlink fence on the righthand side of the screenshot?
Originally posted by: GOREGRINDER
is it me or does the x1800 not even render any links in the chainlink fence on the righthand side of the screenshot?
From: hardOCP article
The first thing you'll notice is that the color is much softer and lighter with the Radeon X1800 XL than the GeForce 7800 GT, which has a more harsh or darker anti-aliasing color.
perception != reality
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
It's the same argument as comparing the X850XTPE to the 6800 Ultra; the X850 was faster across the board with fewer features.
Originally posted by: Matt2
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
Originally posted by: crazydingo
perception != reality
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
It's the same argument as comparing the X850XTPE to the 6800 Ultra; the X850 was faster across the board with fewer features.
Originally posted by: Matt2
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
If you direct your attention to the water-tower and crane in the background of these images, the impact anti-aliasing has on image quality is readily apparent. In the "No AA" shots it seemed to us that the Radeon X850 XT Platinum Edition and Radeon X1800 XT had the lowest detail, and had the most prominent "jaggies." Look closely at the ladder on the water tower and you'll notice parts missing in the Radeon shots that are there on the GeForce 7800 GTX. With standard multi-sample 4X anti-aliasing enabled, though, it becomes much harder to discern any differences between the cards. The ladder in the background gets cleaned up considerably, as do the cables on the crane. The same holds true when ATI's 6X MSAA and NVIDIA's 8xS AA are enabled, although in this comparison, we'd give an edge in image quality to NVIDIA, because the additional super-sampling applied by 8xS AA does a decent job of cleaning up edges of transparent textures.
However
Open up a standard 4X or 6X AA shot, and compare the trees and grass in the scene to either of the adaptive AA screens. You'll see a significant reduction in the prominence of jaggies. Overall, we were impressed with the images produced by ATI's Adaptive AA. The X1800 XT produced some of the best images we have seen on the PC to date.
Loads of screenshots at different settings.
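The reason 8xS and ATI's Adaptive AA clean up fences and foliage where plain MSAA doesn't comes down to where the alpha test runs: MSAA evaluates the shader (and the alpha test) once per pixel, so a fence wire is kept or discarded whole, while super-sampled modes evaluate it per sample. A toy Python sketch of that difference (the alpha function and sample positions are invented for illustration, not real GPU behaviour):

```python
# Toy model: a "fence" texture that is opaque left of x = 0.5 and
# fully transparent to the right (alpha-tested, not blended).
def alpha(x, y):
    return 1.0 if x < 0.5 else 0.0

# Four sample positions inside one pixel (ordered grid for simplicity).
SAMPLES = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

def msaa_coverage(px, py):
    # MSAA: the alpha test runs ONCE, at the pixel centre; its result
    # is copied to every sample, so coverage is all-or-nothing.
    passed = alpha(px + 0.5, py + 0.5) >= 0.5
    return 1.0 if passed else 0.0

def ssaa_coverage(px, py):
    # Super-sampling: the alpha test runs per sample, so an edge
    # pixel gets fractional coverage, i.e. a smoothed edge.
    hits = sum(alpha(px + sx, py + sy) >= 0.5 for sx, sy in SAMPLES)
    return hits / len(SAMPLES)

print(msaa_coverage(0, 0), ssaa_coverage(0, 0))
```

The fractional coverage on the edge pixel is exactly what the super-sampled modes buy you; under pure MSAA the wire pops in and out whole, which is why the chain-link fence in the screenshots can vanish entirely.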
However, with 8X anisotropic filtering enabled, the detail in the road is dramatically enhanced. If you open each of the standard shots individually and skip through them quickly, you're likely to notice a bit more detail in the shots taken with the GeForce 7800 GTX, disregarding artifacts produced by the JPG compression.
The same seemed to be true when inspecting the 16x aniso images. Of course, image quality analysis is subjective by its nature, but based on these images, we think the GeForce 7800 GTX has the best image quality as it relates to anisotropic filtering when standard "optimized" aniso is used.
However
The new high-quality aniso mode offered by the X1000 applies nearly the same level of filtering regardless of the angle. Overall, the effect of enabling ATI's high-quality aniso mode is positive, as it does an even better job of sharpening textures and increasing the detail level. To fully appreciate ATI's high-quality aniso mode, though, you've got to see it in action. Still screenshots don't convey the full effect.
Loads of screenshots of AF.
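The "angle-optimized" vs "angle-independent" distinction the reviews keep circling can be pictured with a toy model: classic optimized AF applied the full anisotropy degree only near certain surface angles (multiples of 45°) and quietly dropped to a lower degree in between, while the X1000's HQ mode applies the full degree everywhere. The falloff curve below is invented for illustration, not actual hardware behaviour:

```python
MAX_DEGREE = 16  # requested 16x anisotropic filtering

def optimized_af_degree(angle_deg):
    # Toy "angle-optimized" AF: full degree near multiples of 45
    # degrees, dropping toward 2x midway between them.
    # (Falloff shape is made up for illustration.)
    off = min(angle_deg % 45, 45 - angle_deg % 45)  # distance to nearest 45
    falloff = off / 22.5                            # 0 at 45-multiples, 1 midway
    return max(2, round(MAX_DEGREE * (1 - falloff)))

def hq_af_degree(angle_deg):
    # "Angle-independent" HQ mode: same degree at every surface angle.
    return MAX_DEGREE

for a in (0, 22.5, 45):
    print(a, optimized_af_degree(a), hq_af_degree(a))
```

On mostly axis-aligned geometry (floors, walls) the two modes look nearly identical, which is consistent with X-bit's "no big difference in real games" finding below; the HQ mode only pays off on surfaces tilted at in-between angles.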
I have to draw your attention to the fact that we haven't found any real evidence of a significant advantage of the enhanced AF mode over the standard AF mode. In other words, in real games there is no big difference in image quality between the enhanced anisotropic filtering mode of the new RADEON X1800 XT and the standard anisotropic filtering of the new ATI solutions, or of the other graphics cards.
As we can see from the screenshots, adaptive anti-aliasing of transparent textures works fine on the RADEON X1000; however, the actual image-quality improvement is not that significant, just as in the case of alpha-texture multi-sampling on the NVIDIA GeForce 7 (TMS, transparent multi-sampling). I have to stress that the Adaptive FSAA of the new RADEON X1000 is of much better quality than the similar mode on the GeForce 7800 GTX, though still much lower than what the competitor's TSS (transparent-texture super-sampling) provides.
I would also like to say that adaptive anti-aliasing of alpha textures on the RADEON X1800 XT may sometimes lead to their complete removal. In fact, it could be a driver issue, because the anti-aliasing masks can be set at the software level for ATI RADEON solutions.
So, the laurels for the best FSAA quality, in at least certain cases, will remain with NVIDIA for now.
TASS vs AAA in many different modes.
Originally posted by: GOREGRINDER
same site, but with a standard 256mb 7800gtx and the much more expensive 512mb x1800xt
in this one look at the shadow on the back wall and the border around the opening in the floor
f.e.a.r.
well this one speaks for itself and is quite obvious
Serious Sam2
Originally posted by: nts
Originally posted by: GOREGRINDER
same site, but with a standard 256mb 7800gtx and the much more expensive 512mb x1800xt
in this one look at the shadow on the back wall and the border around the opening in the floor
f.e.a.r.
This is caused by the sampling pattern, if I am not mistaken. The sampling patterns are opposite on NVIDIA and ATi cards. View it from another direction (top-left to bottom-right for the angle) and the 7800 will exhibit the same behaviour.
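That point about mirrored sample patterns can be sketched numerically: an edge parallel to the line the samples roughly lie along crosses all samples at once (coarse, jaggy steps), while the mirrored pattern gives that same edge several intermediate coverage levels, and flipping the edge's diagonal swaps which pattern looks worse. The sample offsets here are invented and exaggeratedly collinear to make the effect obvious; they are not real hardware patterns:

```python
# Hypothetical mirrored 4-sample patterns inside one pixel.
PATTERN_A = [(0.2, 0.2), (0.4, 0.4), (0.6, 0.6), (0.8, 0.8)]
PATTERN_B = [(1 - x, y) for x, y in PATTERN_A]  # horizontally mirrored

def distinct_levels(pattern, signed_dist):
    # Number of distinct coverage levels produced as an edge with the
    # given orientation sweeps across the pixel: more levels means
    # smoother gradation, fewer means jaggier steps.
    return len({round(signed_dist(x, y), 6) for x, y in pattern}) + 1

diag_down = lambda x, y: y - x  # edge along the top-left/bottom-right diagonal
diag_up   = lambda x, y: y + x  # edge along the other diagonal

print(distinct_levels(PATTERN_A, diag_down), distinct_levels(PATTERN_B, diag_down))
print(distinct_levels(PATTERN_A, diag_up), distinct_levels(PATTERN_B, diag_up))
```

So the same shadow edge can look blocky on one card and smooth on the other purely because of where the samples sit, with the roles reversing when the edge direction flips, which is the behaviour nts describes.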
well this one speaks for itself and is quite obvious
Serious Sam2
That is obviously a bug somewhere 🙂
If you are implying that all X1800XT HDR looks like that then, lol, I don't think so 🙂
Originally posted by: GOREGRINDER
yeah only when they enabled HDR
Originally posted by: nts
If you are implying that all X1800XT HDR looks like that then, lol, I don't think so 🙂
Originally posted by: GOREGRINDER
yeah only when they enabled HDR
7800 exhibits the same blocky behaviour here (see reflection in window)
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
Had the X1800XT been launched alongside the 7800, I could see that as a valid point. As it stands now, ATI has lost most, if not all, of its credibility with me.
Similarly, apart from the SM3 feature set and HDR [both of which proved about as helpful/useful (or less) as HDR+AA and angle-independent AF], the 6800 series didn't have much over the X850. Get my point? Taking away or discrediting the main feature sets of either series takes away all of their advantages over competing cards.
Originally posted by: Cookie Monster
Originally posted by: crazydingo
perception != reality
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
It's the same argument as comparing the X850XTPE to the 6800 Ultra; the X850 was faster across the board with fewer features.
Originally posted by: Matt2
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
Apart from angle-independent AF (you can ONLY use this in HQ AF mode) and HDR+AA (which hasn't been proven to work), I don't see how the X1 series has a richer feature set than NV?
PureVideo has better playback than AVIVO, not to mention the PureVideo on the 7 series is 100% working.
<...>
Originally posted by: nts
Originally posted by: Alexstarfire
The IQ problems they talk about can be non-existent. I can't say for sure, but I'd put money on them having set ATI's IQ as high as they could and left nVidia's at default. The default IQ for nVidia is quality, but I believe it has a couple of the optimizations on. Put them at the highest IQ you can get and they look the same. I think nVidia looks better, but that's just a personal preference.
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
The original screenshots are from hardOCP, both cards at their highest settings
linky
ATi IQ > NVIDIA IQ
btw the second thing they talked about was HDR+AA, which is not possible on NVIDIA cards (some formats can be anti-aliased; others, like FP16, technically can't)
linky?
Originally posted by: keysplayr2003
We conducted our own tests here on the very same screenie. This screenshot from H was discredited, whether it was a mistake on their part or not.
Sloppy? So what do we call the person who didn't read the slides and instead jumped to conclusions?
Sloppy ATi for reusing propaganda slides (both the ones I mentioned have been used before).
Where did they state this on the slide regarding WHQL?
They shouldn't state they are talking about the 512mb card and then actually discuss the 256mb model, should they?
Originally posted by: crazydingo
Similarly, apart from the SM3 feature set and HDR [both of which proved about as helpful/useful (or less) as HDR+AA and angle-independent AF], the 6800 series didn't have much over the X850. Get my point? Taking away or discrediting the main feature sets of either series takes away all of their advantages over competing cards.
Originally posted by: Cookie Monster
Originally posted by: crazydingo
perception != reality
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
It's the same argument as comparing the X850XTPE to the 6800 Ultra; the X850 was faster across the board with fewer features.
Originally posted by: Matt2
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
Apart from angle-independent AF (you can ONLY use this in HQ AF mode) and HDR+AA (which hasn't been proven to work), I don't see how the X1 series has a richer feature set than NV?
PureVideo has better playback than AVIVO, not to mention the PureVideo on the 7 series is 100% working.
<...>
And who was even talking about PureVideo and AVIVO? AVIVO isn't completely finished yet; thankfully, I'll wait before jumping to conclusions on that issue. :laugh: