
Ati talks smack about Nvidia's 512mb gtx

Originally posted by: Rollo
Do I need to go on?
Like you are going to listen to anybody and stop your flame-bait/trolling.
Originally posted by: jiffylube1024
Through all of your posts is a bizarre undercurrent of fanboyism that you don't want ATI to sell a single card. Why is this?
Isn't it obvious? So much for having a life...
 
Everyone dissing Rollo right now has absolutely no right to do so.

fanATIcs and Nvidiots are equally disappointing in this thread.
 
Man, I can't even stand to read the rest of the posts because ATI is the dumbest company ever, apparently. I'm not sure about the volts on the 7800 GTX, but I know my GT is at 1.4v stock. It has already been pointed out that many companies offer lifetime warranties, but I'm also not sure how many "average gamers" are going to be getting a 512 MB 7800 GTX anyway. Most people that are going to be getting the 512 MB 7800 GTX know what they can do to improve the life of the product, at least a little. They talk about availability problems, yet nVidia themselves said it would be a limited release. Even funnier is that I've seen more 512 MB 7800 GTXs than I have the "massively produced" X1800XT. Not to mention the fact that you can't even see anything at the two "online retailers" that they show. The screens of the websites are too blurry to even see anything, but that doesn't really matter if they say it's only at two places.

Then they want to go down the "using old technology" route. How the hell are they going to say that when they did the exact same thing between the 9xxx series and the Xxxx series? All they did was up the f*cking clock speeds. The IQ problems they talk about can be non-existent. I can't say for sure, but I'd put money on the fact that they put ATI IQ as high as they could and left nVidia's at default. The default IQ for nVidia is quality, but I believe they have a couple of the optimizations on. Put them at the highest IQ you can get and they look the same. I think nVidia looks better, but that's just a personal preference.

The best part had to be when they say nVidia overestimated the performance of the 512 MB 7800 GTX. They come in saying that turning off optimizations and increasing the IQ lowers performance. WELL NO SH!T THERE. Who the f*ck would say it increases performance? ATI does the exact same thing. They run in "pure speed" mode, in other words as fast as it can go, all optimizations on and such. nVidia people can at least choose whether we want the optimizations on or not; I haven't used an ATI card in a while, so I don't know if the same is true for ATI.

The AA thing they are talking about with Age of Empires 3 is just BS. I put mine at every AA level possible, thanks to nHancer for that, and it ran. It didn't run flawlessly, but I didn't expect 16x12 at 8xS AA to run very well to begin with, let alone at 16xS or 4x4 SS, even though at 4x4 it can only be set to 1024x768.

All in all, ATI was just doing what every company does. Praise their product and diss the competition. At least put more than 2 facts behind your allegations if you're going to do it.
 
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.

Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.

Had X1800XT been launched alongside the 7800, then I could see that as a valid point. As it stands now, ATI has lost most, if not all, of its credibility with me.
 
Originally posted by: Alexstarfire
All in all, ATI was just doing what every company does. Praise their product and diss the competition. At least put more than 2 facts behind your allegations if you're going to do it.

In other words all talk and no action. And as we all know talk is cheap. So there...lol

 
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.

Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.

Had X1800XT been launched alongside the 7800, then I could see that as a valid point. As it stands now, ATI has lost most, if not all, of its credibility with me.

QFT.
 
Originally posted by: Alexstarfire
The IQ problems they talk about can be non-existent. I can't say for sure, but I'd put money on the fact that they put ATI IQ as high as they could and left nVidia's at default. The default IQ for nVidia is quality, but I believe they have a couple of the optimizations on. Put them at the highest IQ you can get and they look the same. I think nVidia looks better, but that's just a personal preference.

I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.

The original screenshots are from hardOCP, both cards at their highest settings

linky

ATi IQ > NVIDIA IQ

btw the second thing they talked about was HDR+AA, which is not possible on NVIDIA cards (some formats are, others [FP16] technically aren't)

 
IQ is really debatable, due to the fact that xbitlabs and hothardware think AA on NV cards is better than ATi's. They mention NV looks better on an LCD and ATi on a CRT. However, as LCDs are becoming mainstream, I think this would favor NV in the IQ department, as would their ghosting-removal feature.

It really comes down to preferences.

HDR+AA? Judging by how the X1800 series does HDR slower than the 7 series, and then adding AA on top, I don't think you will be enjoying it very much with that slow performance. However, it's a nice feature to have, and at 12x10 resolution I wouldn't mind using it compared to 16x12 AA/AF.
 
The original screenshots are from hardOCP, both cards at their highest settings

linky

ATi IQ > NVIDIA IQ

btw the second thing they talked about was HDR+AA, which is not possible on NVIDIA cards (some formats are, others [FP16] technically aren't)

is it me or does the x1800 not even render any links in the chainlink fence on the righthand side of the screenshot?

 
Originally posted by: GOREGRINDER
is it me or does the x1800 not even render any links in the chainlink fence on the righthand side of the screenshot?

It looks to me like the edges on the fence are cleaner in the ATI screenshot, so they don't look as apparent as in the nVidia shots, but I'm not sure; it's tough to tell.
 
Originally posted by: GOREGRINDER
is it me or does the x1800 not even render any links in the chainlink fence on the righthand side of the screenshot?

It is rendered, just lighter than NVIDIA. Look at the brown overhead thing with the fence on it. NVIDIA renders black dots, ATi renders it lighter. This is kind of where the CRT/LCD argument comes in (except with AA).

btw the above screenshot is only comparing AF quality, no AA is applied.

EDIT: linky to IQ comparisons for the hardOCP article.

From: hardOCP article
The first thing you'll notice is that the color is much softer and lighter with the Radeon X1800 XL than the GeForce 7800 GT, which has a more harsh or darker anti-aliasing color.
 
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
perception != reality

Originally posted by: Matt2
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
It's the same argument as comparing the X850XTPE to the 6800 Ultra; the X850 was faster across the board with fewer features.

 
same site, but with a standard 256mb 7800gtx and the much more expensive 512mb x1800xt
in this one look at the shadow on the back wall and the border around the opening in the floor
f.e.a.r.
well this one speaks for itself and is quite obvious
Serious Sam2


 
Originally posted by: crazydingo
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
perception != reality

Originally posted by: Matt2
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
It's the same argument as comparing the X850XTPE to the 6800 Ultra; the X850 was faster across the board with fewer features.

Apart from angle-independent AF (you can ONLY use this in HQ AF mode) and HDR+AA (which hasn't been proven to work), I don't see how the X1 series has a richer feature set than NV?

Pure Video has better playback than AVIVO, not to mention Pure Video on the 7 series is 100% working. How about the NV feature that gets rid of ghosting? TSAA? Digital Vibrance?

From hothardware:

If you direct your attention to the water-tower and crane in the background of these images, the impact anti-aliasing has on image quality is readily apparent. In the "No AA" shots it seemed to us that the Radeon X850 XT Platinum Edition and Radeon X1800 XT had the lowest detail, and had the most prominent "jaggies." Look closely at the ladder on the water tower and you'll notice parts missing in the Radeon shots that are there on the GeForce 7800 GTX. With standard multi-sample 4X anti-aliasing enabled, though, it becomes much harder to discern any differences between the cards. The ladder in the background gets cleaned considerably, as do the cables on the crane. The same holds true when ATI's 6X MSAA and NVIDIA's 8xS AA is enabled, although in this comparison, we'd give an edge in image quality to NVIDIA, because the additional super-sampling applied by 8xS AA does a decent job of cleaning up edges of transparent textures.

However
Open up a standard 4X or 6X AA shot, and compare the trees and grass in the scene to either of the adaptive AA screens. You'll see a significant reduction in the prominence of jaggies. Overall, we were impressed with the images produced by ATI's Adaptive AA. The X1800 XT produced some of the best images we have seen on the PC to date.

Loads of screenshots at different settings.

However, with 8X anisotropic filtering enabled, the detail in the road is dramatically enhanced. If you open each of the standard shots individually and skip through them quickly, you're likely to notice a bit more detail in the shots taken with the GeForce 7800 GTX, disregarding artifacts produced by the JPG compression.

The same seemed to be true when inspecting the 16x aniso images. Of course, image quality analysis is subjective by its nature, but based on these images, we think the GeForce 7800 GTX has the best image quality as it relates to anisotropic filtering when standard "optimized" aniso is used.

However
The new high-quality aniso mode offered by the X1000 applies nearly the same level of filtering regardless of the angle. Overall, the effect of enabling ATI's high-quality aniso mode is positive, as it does an even better job of sharpening textures and increasing the detail level. To fully appreciate ATI's high-quality aniso mode though, you've got to see it in action. Still screen shots don't convey the full effect.

Loads of screenshots of AF.

Xbitlabs:

I have to draw your attention to the fact that we haven't found any real evidence pointing at the significant advantage of the enhanced AF mode over the standard AF mode. In other words, there is no big difference in the image quality of real games between the enhanced anisotropic filtering mode of the new RADEON X1800 XT and the standard anisotropic filtering of the new ATI solutions as well as of the other graphics cards.

As we can see from the screenshots, adaptive anti-aliasing of transparent textures works fine on RADEON X1000, however, the actual image quality improvement is not that significant, just like in case of alpha-textures multi-sampling by NVIDIA GeForce 7 (TMS, transparent multi-sampling). I have to stress that the Adaptive FSAA of the new RADEON X1000 is of much better quality than the similar mode by GeForce 7800 GTX, however it is still much lower than what the competitor's TSS (transparent textures super-sampling) would provide.

I would also like to say that adaptive anti-aliasing of alpha textures by RADEON X1800 XT may sometimes lead to their complete removal. In fact, it could be a driver issue, because the anti-aliasing masks can be set on the software level for ATI RADEON solutions.

So, the laurels for the best FSAA quality, in at least certain cases, will remain with NVIDIA for now.
TSAA vs AAA in many different modes.
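The angle dependence the reviews above describe can be sketched with a toy function. The falloff shape and numbers here are invented for illustration; the real hardware heuristics were never published:

```python
# Toy model of angle-dependent ("optimized") anisotropic filtering vs.
# angle-independent HQ AF. All constants are hypothetical.

def optimized_af(angle_deg, max_level=16):
    """Full AF degree only near multiples of 45 degrees (axis-aligned
    and diagonal surfaces); reduced in between -- the source of the
    lobed 'flower' shape in AF tester screenshots."""
    d = min(angle_deg % 45, 45 - angle_deg % 45)  # degrees to nearest 45-degree multiple
    return max(2, int(max_level * (1 - d / 22.5)))

def hq_af(angle_deg, max_level=16):
    """Angle-independent AF: the same filtering degree at every angle."""
    return max_level

for a in (0, 22.5, 45, 90):
    print(a, optimized_af(a), hq_af(a))
```

Plotted over 0-360 degrees, `optimized_af` traces the petal pattern seen in AF testers, while `hq_af` is a circle, which is the visual difference the reviewers are pointing at.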





 
Originally posted by: GOREGRINDER
same site, but with a standard 256mb 7800gtx and the much more expensive 512mb x1800xt
in this one look at the shadow on the back wall and the border around the opening in the floor
f.e.a.r.

This is caused by the sampling pattern, if I am not mistaken. The sampling patterns are opposite on NVIDIA and ATi cards. View it from another direction (top-left to bottom-right for the angle) and the 7800 will exhibit the same behaviour.

well this one speaks for itself and is quite obvious
Serious Sam2

That is obviously a bug somewhere 🙂
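The sampling-pattern point above can be shown with a toy calculation. The sample positions below are hypothetical, not the actual NVIDIA/ATi patterns, but they show how two mirrored 2x patterns can resolve the same edge differently:

```python
# Toy illustration (not vendor-accurate): two mirrored 2-sample MSAA
# patterns yield different coverage for the same diagonal edge, so the
# same shadow boundary can resolve differently depending on the pattern.

def coverage(edge, samples):
    """Fraction of sub-pixel samples falling inside the edge."""
    return sum(edge(x, y) for x, y in samples) / len(samples)

# Edge: everything below the diagonal y < x is "inside" the shadow.
edge = lambda x, y: y < x

# Two hypothetical sample placements within one pixel [0,1)x[0,1),
# mirrored across the vertical axis.
pattern_a = [(0.25, 0.75), (0.75, 0.25)]
pattern_b = [(0.75, 0.75), (0.25, 0.25)]

print(coverage(edge, pattern_a))  # one sample inside -> 0.5
print(coverage(edge, pattern_b))  # no samples inside -> 0.0
```

Viewed from the mirrored direction the roles swap, which is consistent with the claim that the 7800 would show the same artifact from the opposite angle.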


 
Originally posted by: nts
Originally posted by: GOREGRINDER
same site, but with a standard 256mb 7800gtx and the much more expensive 512mb x1800xt
in this one look at the shadow on the back wall and the border around the opening in the floor
f.e.a.r.

This is caused by the sampling pattern, if I am not mistaken. The sampling patterns are opposite on NVIDIA and ATi cards. View it from another direction (top-left to bottom-right for the angle) and the 7800 will exhibit the same behaviour.

well this one speaks for itself and is quite obvious
Serious Sam2

That is obviously a bug somewhere 🙂

yeah only when they enabled HDR



 
Originally posted by: GOREGRINDER
yeah only when they enabled HDR
If you are implying that all X1800XT HDR looks like that, then lol, I don't think so 🙂

7800 exhibits the same blocky behaviour here (see reflection in window)
 
Originally posted by: nts
Originally posted by: GOREGRINDER
yeah only when they enabled HDR
If you are implying that all X1800XT HDR looks like that, then lol, I don't think so 🙂

7800 exhibits the same blocky behaviour here (see reflection in window)


If it was the same, don't you think HDR would have to be enabled? But thanks for the blocky 2D reflection pic without HDR...






 
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.

Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.

Had X1800XT been launched alongside the 7800, then I could see that as a valid point. As it stands now, ATI has lost most, if not all, of its credibility with me.

So if a bunch of review sites say the X1800 has better IQ, and you say NV has better IQ, who's more credible here? You think flower AF is better than full AF? And what about the NV trolls who kept calling the X850 cards old tech and primitive because they didn't support SM3, even though the X850XT PE killed the 6800U in almost every game? And now it turns out that NV cards suck at dynamic branching, the holy grail of SM3. By that logic, the 7800 cards are old tech with inferior IQ and features. Some examples: flower AF, texture shimmering, inability to do AA with EXR HDR, blocky shadows in some games, and such.
 
Originally posted by: Cookie Monster
Originally posted by: crazydingo
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
perception != reality

Originally posted by: Matt2
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
It's the same argument as comparing the X850XTPE to the 6800 Ultra; the X850 was faster across the board with fewer features.

Apart from angle-independent AF (you can ONLY use this in HQ AF mode) and HDR+AA (which hasn't been proven to work), I don't see how the X1 series has a richer feature set than NV?

Pure Video has better playback than AVIVO, not to mention Pure Video on the 7 series is 100% working.
<...>
Similarly, apart from the SM3 feature set and HDR [both of which proved equally (or less) helpful/useful as HDR+AA and angle-independent AF], the 6800 series didn't have much over the X850. Get my point? Taking away or discrediting the main feature sets of either series takes away all of their advantages over competing cards.

And who was even talking about Purevideo and AVIVO? AVIVO isn't completely finished yet, so thankfully I'll wait before jumping to conclusions on that issue. :laugh:
 
Originally posted by: nts
Originally posted by: Alexstarfire
The IQ problems they talk about can be non-existent. I can't say for sure, but I'd put money on the fact that they put ATI IQ as high as they could and left nVidia's at default. The default IQ for nVidia is quality, but I believe they have a couple of the optimizations on. Put them at the highest IQ you can get and they look the same. I think nVidia looks better, but that's just a personal preference.

I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.

The original screenshots are from hardOCP, both cards at their highest settings

linky

ATi IQ > NVIDIA IQ

btw the second thing they talked about was HDR+AA, which is not possible on NVIDIA cards (some formats are, others [FP16] technically aren't)

We conducted our own tests here on the very same screenie. This screenshot from H was discredited, whether it was a mistake on their part or not.

 
Originally posted by: keysplayr2003
We conducted our own tests here on the very same screenie. This screenshot from H was discredited whether it was a mistake on their part or not.
linky?
 
Sloppy ATi for reusing propaganda slides (both the ones I mentioned have been used before).
Sloppy? So what do we call the person who didn't read the slides and instead jumped to conclusions?

They shouldn't state they are talking about the 512mb card then actually discuss the 256mb model should they?
Where did they state this on the slide regarding WHQL?
 
Originally posted by: crazydingo
Originally posted by: Cookie Monster
Originally posted by: crazydingo
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, this coming from observation and experience with both brands.
perception != reality

Originally posted by: Matt2
Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any cards in the 7800 series "old tech". That is just stupid to try and say that about a technology that has been destroying ATI's best offering for 6 months now.
It's the same argument as comparing the X850XTPE to the 6800 Ultra; the X850 was faster across the board with fewer features.

Apart from angle-independent AF (you can ONLY use this in HQ AF mode) and HDR+AA (which hasn't been proven to work), I don't see how the X1 series has a richer feature set than NV?

Pure Video has better playback than AVIVO, not to mention Pure Video on the 7 series is 100% working.
<...>
Similarly, apart from the SM3 feature set and HDR [both of which proved equally (or less) helpful/useful as HDR+AA and angle-independent AF], the 6800 series didn't have much over the X850. Get my point? Taking away or discrediting the main feature sets of either series takes away all of their advantages over competing cards.

And who was even talking about Purevideo and AVIVO? AVIVO isn't completely finished yet, so thankfully I'll wait before jumping to conclusions on that issue. :laugh:

The X850 didn't have Pure Video or any sort of video processor. It didn't have SLI, i.e., no dual-GPU option, etc.

However, I only pointed out Pure Video and AVIVO because you gave me the impression that the 7 series vs X1 series was just like the X series vs 6 series in terms of feature set/performance. You see my point?

I haven't discredited any of the features; I don't even know if the X1800XT can do HDR+AA on a hardware basis.

But what I am trying to say is that both ATi and NV cards are feature-rich. It's just that the 7 series feature set is similar to that of the 6 series, while for ATi it's all brand new and shiny, hence all the hype and astonishment over its new features compared to the old X series. Just like the R520 performance hype.
 