Ati talks smack about Nvidia's 512mb gtx


crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: Cookie Monster
The X850 didn't have PureVideo or any sort of video processor. It didn't have SLI, i.e. no dual-GPU feature, etc.
The X850 didn't have a dual-GPU feature? That's news to me. :laugh:

Originally posted by: Cookie Monster
However, I only pointed out PureVideo and AVIVO because you gave me the impression that the 7 series vs. X1 series was just like the X series vs. 6 series in terms of feature set/performance. You see my point?
You are taking the comparison a little too literally. Of course they can't have the same feature-set advantages. I'm just pointing out that ATI downplayed some features last generation and Nvidia did the same this generation.

Originally posted by: Cookie Monster
I haven't discredited any of the features; I don't even know if the X1800XT can do HDR+AA on a hardware basis.
I think you did discredit it by saying AVIVO is poor compared to PureVideo when it's not even fully functional. Remember, H.264 decoding, transcoding, etc. are yet to come, and I don't see Nvidia promising those features with PureVideo (or did they?).

And yes, the X1800XT does HDR+AA, with playable frame rates at 12x10. So much for people saying it would only run at 800x600 ... :D

Originally posted by: Cookie Monster
But what I am trying to say is that both ATI and NV cards are feature-rich. It's just that the 7 series feature set is similar to that of the 6 series, while for ATI it's all brand new and shiny, hence the hype and astonishment over its new features compared to the old X series. Just like the R520 performance hype.
R520 performance hype? You mean the Inq? :laugh: Well, anybody who was interested in video cards and was following the developments knew that the R520 would offer competitive performance (16 pipes vs. 24 pipes) while delivering a better feature set.

I agree both cards are feature-rich, but like last generation, one of them has an edge. How useful or beneficial that is depends solely on the end user.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Look, did ATI have CrossFire during those months against the 6 series? No. They only started to have it NOW (you can actually find the X850 master cards that were promised a long time ago). That's about one and a half years of catching up. The X850 didn't have a dual-GPU solution for a year and a half.

PureVideo is a mystery to many people. It really is. A driver update activated the video processor, but MANY do not really know what it is or what it does; all they know is that PureVideo on the 6800 series has malfunctioning WMV9 decode acceleration (most just say it's broken and not working, which is FUD).

I've just done a little research.
From AT:
1) Hardware acceleration of Windows Media Video 9 and MPEG-2 decode

2) Spatial-Temporal Adaptive Per Pixel De-Interlacing (with 3-2 and 2-2 detection)

3) Everything previous NVIDIA GPUs have been able to do

For 7 series:
This time around, PureVideo has extended support for HD format acceleration. The 7800 GTX now supports spatial-temporal de-interlacing for HD content. This feature promises to make 1080i content look that much better on a PC.

- H.264 decode acceleration promised by NV through a driver update.
- The PureVideo processor is programmable, hence the H.264 support.

Here's a huge list from an interview about PureVideo:

High-Definition MPEG-2 and WMV Hardware Acceleration for smooth playback of all MPEG-2 and WMV HD video with minimal CPU usage, so the PC is free to do other work.

High-Quality Hardware Scaling that delivers a clear, clean image when upscaling low-resolution video to HDTV resolutions up to 1080i and maintains image detail without any annoying flicker, even when scaling down high-definition video.

Spatial Temporal De-Interlacing smoothes DVD playback on progressive displays to deliver a crisp, clear picture that rivals high-end home theater systems.

Inverse Telecine (3:2 & 2:2 Pull-Down Correction) that eliminates errors introduced in the film-to-DVD conversion process by recovering the original film format data.

Bad Edit Correction to clean up anomalies caused by improper editing. If a film is edited after it has been converted to video for distribution on DVD, the edits can disrupt the normal 3:2 pulldown cadence.

PureVideo uses advanced processing techniques to detect poor edits, recover the original content, and display perfect picture detail frame after frame for smooth, natural looking video.

Video Color Correction to compensate for the different color characteristics of RGB monitors and TV monitors. To compensate for these differences, the ProcAmp Color Controls let you apply color correction settings such as Brightness and Contrast to the video playback window.

HDTV Output that provides TV-out functionality, including composite, S-Video, component, and DVI up to 1080i resolution (actual outputs vary by manufacturer).

Pretty huge list.
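To make the inverse-telecine item above concrete: 3:2 pulldown spreads 24 fps film across ~60 interlaced fields per second by repeating fields in a 3-2 cadence, and inverse telecine detects the repeats and reassembles the original progressive frames. Below is a toy sketch of the bookkeeping only; it ignores field dominance and is in no way actual PureVideo code:

```python
# 3:2 pulldown: 4 film frames -> 10 video fields (24 fps -> 60 fields/s).
# Fields emitted per frame alternate 3, 2, 3, 2, ...
def telecine(frames):
    fields = []
    for i, f in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2
        pair = [(f, "top"), (f, "bottom")]
        # when three fields are needed, repeat the frame's first field
        fields.extend(pair + pair[:repeat - 2])
    return fields

# Inverse telecine: drop the repeated fields, then pair the remaining
# top/bottom fields back into the original progressive frames.
def inverse_telecine(fields):
    frames, pending, recent = [], [], []
    for f in fields:
        if f in recent:          # repeated field from the cadence, drop it
            continue
        recent = (recent + [f])[-3:]
        pending.append(f)
        if len(pending) == 2:    # a top/bottom pair completes one frame
            frames.append(pending[0][0])
            pending = []
    return frames

film = ["A", "B", "C", "D"]
fields = telecine(film)
print(len(fields))               # 10 fields for 4 frames
print(inverse_telecine(fields))  # ['A', 'B', 'C', 'D']
```

Real deinterlacers have to detect the cadence from the pixel data itself and re-lock after bad cuts, which is exactly what the "Bad Edit Correction" item in the list above is about.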

Anyway, I would have to agree on two things so far:
One, the X1800XT has a slight feature-set edge over the 7800GTX 512MB.
Two, the 7800GTX 512MB performs slightly faster than the X1800XT 512MB, by about 10% most of the time with the eye candy on.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: nts
Originally posted by: keysplayr2003
We conducted our own tests here on the very same screenie. This screenshot from H was discredited, whether it was a mistake on their part or not.
linky?

Here is the link to the thread. It's VERY long; depending on the number of posts per page set in your preferences, it's on page 4 at 50 posts per page.
Linky

And here is a quote from it.

---------------------
Originally posted by: keysplayr2003
Originally posted by: SPARTAN VI
Alright, my X800XL is no X1800XL, so the AF is nowhere near that sharp, but here we go! Don't mind the framerate; it drops when a screenshot is taken:

1280x960 4xAA 16xAF, No AdaptiveAA, settings maxed

1280x960 4xAA 16xAF, No AdaptiveAA, settings maxed, zoomed

1680x1050 4xAA 16xAF, AdaptiveAA enabled, settings maxed

I would've taken the last screenshot at 1280x960, but my PC is extremely unstable with the temporary 512MB of crap Corsair Value RAM, so HL2 kept locking up whenever I tried to change the resolution.

And here is my GTX
1280x960 4xAA 16xAF, No TAA, settings maxed

Not for anything, but mine looks a HELL of a lot better than the shots HardOCP took.
It almost looks like they had the 7800s set to bilinear filtering.
The X1800XL looks like night and day in those shots. Your XL shots look good but seem to get blurrier the further down they go.



 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: Matt2
I strongly believe that Nvidia's IQ is higher than ATI's, coming from observation and experience with both brands.

Also, ATI and the fanATIcs have no right to call the 7800GTX 512 or any card in the 7800 series "old tech". That is just stupid to say about a technology that has been destroying ATI's best offering for 6 months now.

Had the X1800XT been launched alongside the 7800, then I could see that as a valid point. As it stands now, ATI has lost most, if not all, of its credibility with me.

You're high! Check out the AF drop-off nV does in the links nts provided for you (look down the bricks)... I also notice shimmering, or what I'd call sectional AA drop-off, as well. nV has been doing this since the 6800; it's nothing new to me, but that doesn't stop me from using them or from being able to see it. It's not that big a deal to me and I really have to focus... I prefer nV's drivers like 400x more, but I won't ignore poor IQ traded away for topping the charts (relative to a couple of gens ago). Both these companies disgust me really, since ATI does the same thing: http://forums.anandtech.com/messageview...atid=31&threadid=1720840&enterthread=y

What smells worse? Dog *&$^ or cat *&$^?
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: keysplayr2003
Originally posted by: nts
Originally posted by: keysplayr2003
We conducted our own tests here on the very same screenie. This screenshot from H was discredited, whether it was a mistake on their part or not.
linky?

Here is the link to the thread. It's VERY long; depending on the number of posts per page set in your preferences, it's on page 4 at 50 posts per page.
Linky

And here is a quote from it.

---------------------
Originally posted by: keysplayr2003
Originally posted by: SPARTAN VI
Alright, my X800XL is no X1800XL, so the AF is nowhere near that sharp, but here we go! Don't mind the framerate; it drops when a screenshot is taken:

1280x960 4xAA 16xAF, No AdaptiveAA, settings maxed

1280x960 4xAA 16xAF, No AdaptiveAA, settings maxed, zoomed

1680x1050 4xAA 16xAF, AdaptiveAA enabled, settings maxed

I would've taken the last screenshot at 1280x960, but my PC is extremely unstable with the temporary 512MB of crap Corsair Value RAM, so HL2 kept locking up whenever I tried to change the resolution.

And here is my GTX
1280x960 4xAA 16xAF, No TAA, settings maxed

Not for anything, but mine looks a HELL of a lot better than the shots HardOCP took.
It almost looks like they had the 7800s set to bilinear filtering.
The X1800XL looks like night and day in those shots. Your XL shots look good but seem to get blurrier the further down they go.

While nice, SPARTAN's attempt fails to address the X1800, which has, according to Kyle, "ATI's new High Quality Anisotropic Filtering option".
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: Cookie Monster
Look, did ATI have CrossFire during those months against the 6 series? No. They only started to have it NOW (you can actually find the X850 master cards that were promised a long time ago). That's about one and a half years of catching up. The X850 didn't have a dual-GPU solution for a year and a half.
One and a half years? It's not even 1 year since they launched the X850; the actual release date was Jan-Feb of this year.

I'm not even debating the video acceleration feature sets of either card; I was just comparing the features related to the gaming aspect (SM3, HDR ...) only.

Originally posted by: Cookie Monster
Anyway, I would have to agree on two things so far:
One, the X1800XT has a slight feature-set edge over the 7800GTX 512MB.
Two, the 7800GTX 512MB performs slightly faster than the X1800XT 512MB, by about 10% most of the time with the eye candy on.
Glad you agree. :)
 

GOREGRINDER

Senior member
Oct 31, 2005
382
0
0
What I think is funny throughout this whole thread is people ranting about cards they don't even own...

It started out as talk about the company, and people turn it into "I love ATI, shut up, you just shut up! :brokenheart: GTX sux" and then try to find pictures and whatnot to post at the Nvidia people in here. Well, I see the Nvidia people in here; more than half of them have first-hand experience with a GTX, a 512 GTX, or SLI'd GTXs. Nobody said the X1800XT was a bad card, just that the company is sad and starting to reach...

So I think a good idea, when someone states a claim about a card, is this: no saying "well, this one guy said..." or "this one website's bench..." or "well, I think the GTX sux, but I have a Radeon 9200" crap. Frankly, I don't see anyone here with an X1800 of any kind, yet ATI people are buying video cards left and right, so that must say what you guys really think, regardless of what the posts say. Actions speak louder than words; a simple observation on the reality of it all. So if you wanna talk something up and talk other cards down, back it up: post your CPU-Z info (memory, motherboard, and CPU) along with the GPU driver control panel open on the desktop. If you don't have any cards from this gen and don't know, then don't talk sh!t. That's all I'm saying, and that's my opinion... burn me or flame me, I don't care; that's my opinion and I'm sticking to it :p


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
It is pretty funny.

ATI falls on hard times and, in very strange ways, lashes out like a cornered rat, and everyone who ever bought a 9700 Pro comes out of the woodwork: "Leave teh ATIs alooooone!"

Next we'll all be treated to a slide show that says "SLI causes cavities - 4/5 dentists agree".
:laugh:
 

2kfire

Senior member
Nov 26, 2004
246
0
76
Had the R520 come out when it was supposed to, ATI would rule right now. Unfortunately, it didn't. Now let's just forget about it, just like everyone has forgotten about the FX series, and when the R580 comes out, let the war begin again. If you can't wait till then, buy an X1800 XT if you can find one, or a 7800 GTX 512 if you can afford one.
 

SparkyJJO

Lifer
May 16, 2002
13,357
7
81
For all those who can't find the X1800XT 512MB or the 7800GTX 512MB: Newegg has no more 512MB 7800GTX cards, but it has the X1800XT 512MB cards in stock. This could be read two ways: either there is more supply of the X1800XT, or there is more demand for the 7800GTX. But there are X1800XTs available.

The way I see it, Nvidia will have the upper hand, then ATI will take control, then they'll switch again, back and forth. It wasn't that long ago that Nvidia underestimated the 9700 Pro and had their pathetic GeForce FX line (in comparison to the 9700 and 9800 series). Now ATI is having trouble keeping up with Nvidia. It'll switch around again, and again, and again.

Personally, I don't really care. I'd prefer an ATI X800 over my current Nvidia 6800 in some cases. Other times, I am glad I have Nvidia.
 

GOREGRINDER

Senior member
Oct 31, 2005
382
0
0
Originally posted by: Rollo
It is pretty funny.



Next we'll all be treated to a slide show that says "SLI causes cavities - 4/5 dentists agree".
:laugh:


LOL... yeah, it's like you can talk about warm Dutch caramel apple pie à la mode, but until you actually taste it, you don't have a clue... I'd say 7800GTXs in SLI are no different ;)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: SparkyJJO
For all those who can't find the X1800XT 512MB or the 7800GTX 512MB: Newegg has no more 512MB 7800GTX cards, but it has the X1800XT 512MB cards in stock. This could be read two ways: either there is more supply of the X1800XT, or there is more demand for the 7800GTX. But there are X1800XTs available.

The way I see it, Nvidia will have the upper hand, then ATI will take control, then they'll switch again, back and forth. It wasn't that long ago that Nvidia underestimated the 9700 Pro and had their pathetic GeForce FX line (in comparison to the 9700 and 9800 series). Now ATI is having trouble keeping up with Nvidia. It'll switch around again, and again, and again.

Personally, I don't really care. I'd prefer an ATI X800 over my current Nvidia 6800 in some cases. Other times, I am glad I have Nvidia.

Actually, the 9700 Pro/9800 Pro era is the only time ATI has ever been ahead of nVidia.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: Rollo
Originally posted by: SparkyJJO
For all those who can't find the X1800XT 512MB or the 7800GTX 512MB: Newegg has no more 512MB 7800GTX cards, but it has the X1800XT 512MB cards in stock. This could be read two ways: either there is more supply of the X1800XT, or there is more demand for the 7800GTX. But there are X1800XTs available.

The way I see it, Nvidia will have the upper hand, then ATI will take control, then they'll switch again, back and forth. It wasn't that long ago that Nvidia underestimated the 9700 Pro and had their pathetic GeForce FX line (in comparison to the 9700 and 9800 series). Now ATI is having trouble keeping up with Nvidia. It'll switch around again, and again, and again.

Personally, I don't really care. I'd prefer an ATI X800 over my current Nvidia 6800 in some cases. Other times, I am glad I have Nvidia.

Actually, the 9700 Pro/9800 Pro era is the only time ATI has ever been ahead of nVidia.


Nah, you're wrong again.

The X800XT/XT PE & X850XT/XT PE > 6800 Ultra.

Don't even start on SLI; yes, if you want to compare SLI to a single card and consider that being ahead, you go right ahead and do that.
It is true that nVidia's technology in that sense was better, and feature-wise the 6800 series was slightly better, but in overall performance terms they were behind.
Also, just to add: don't even think about labelling me as a fanboi ;)
I had a Leadtek 6800GT that I loved... heck, I liked it better than my X850XT PE... but I got my current 850 for less, and it performs better... so yeah.
 

GOREGRINDER

Senior member
Oct 31, 2005
382
0
0
Originally posted by: nts
Originally posted by: Rollo
Actually, the 9700 Pro/9800 Pro era is the only time ATI has ever been ahead of nVidia.

X800/850 > 6800 ...


False... you're stating an opinion about which one you like better, not which one ended up in more systems that generation, not even counting that SLI was released for that gen... That's it, let's see your screenshot, cough it up :p
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: GOREGRINDER
False... you're stating an opinion about which one you like better
No, I stated which one was better (overall).
not which one ended up in more systems that generation,
Better != sold more... Would you consider the GF4 MX good, or the 5200 good?

Show me some (official) numbers, not from NV/ATi.

not even counting that SLI was released for that gen... That's it, let's see your screenshot, cough it up :p
Comparing a dual-card solution to a single card?

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: crazydingo
Originally posted by: Cookie Monster
Look, did ATI have CrossFire during those months against the 6 series? No. They only started to have it NOW (you can actually find the X850 master cards that were promised a long time ago). That's about one and a half years of catching up. The X850 didn't have a dual-GPU solution for a year and a half.
One and a half years? It's not even 1 year since they launched the X850; the actual release date was Jan-Feb of this year.

I'm not even debating the video acceleration feature sets of either card; I was just comparing the features related to the gaming aspect (SM3, HDR ...) only.

Well, by X850 series I mean the X800 as well, since it was just a speed increase plus some other little things. I'm sure it has been one and a half years already. OK, the 6800 Ultra was released on April 14th, 2004, and the X800XT on May 4th, 2004. And since the X800 and X850 didn't have a dual-GPU solution till now, that makes it one and a half years.
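For what it's worth, the disputed gap is easy to check with a quick date calculation. The launch date below comes from the posts above; treating "now" as December 2005 (roughly when this thread was active) is an assumption:

```python
from datetime import date

x800_launch = date(2004, 5, 4)   # X800XT launch date cited above
now = date(2005, 12, 1)          # assumed "now" for this thread

# whole months between the two dates
months = (now.year - x800_launch.year) * 12 + (now.month - x800_launch.month)
print(months)  # 19 months, i.e. a bit over a year and a half
```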

I want a review that compares which card's HDR offers better quality and performance. This interests me because HDR seems to run well on the 7 series.
 

GOREGRINDER

Senior member
Oct 31, 2005
382
0
0
"... but in overall performance terms, they were behind."

SLI is a performance term... and who's behind? Where's CrossFire on every mid/high-end card on the market?

"Show me some (official) numbers, not from NV/ATi."

If they aren't from NV/ATI then they aren't "official" numbers... so wtf are you talking about?

"you're stating an opinion about which one you like better, not which one ended up in more systems that generation, not even counting that SLI was released for that gen"

That's what the actual quote should be... so keep it in context, ok :D

In the last gen, Nvidia released SLI, SM3.0, AND the most tweakable motherboard chipsets in history, all of which shaped the market for future generations, including what ATI did in later generations as well.

 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Bottom line: this is a deplorable tactic no matter which pathetic company does it... You cannot compete, you get one-upped repeatedly, so you resort to PR spin and borderline mistruths... Unfortunately it has just been more ATI of late. I guess I would be upset playing second fiddle for so long...
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: n7
Originally posted by: Rollo
Originally posted by: SparkyJJO
For all those who can't find the X1800XT 512MB or the 7800GTX 512MB: Newegg has no more 512MB 7800GTX cards, but it has the X1800XT 512MB cards in stock. This could be read two ways: either there is more supply of the X1800XT, or there is more demand for the 7800GTX. But there are X1800XTs available.

The way I see it, Nvidia will have the upper hand, then ATI will take control, then they'll switch again, back and forth. It wasn't that long ago that Nvidia underestimated the 9700 Pro and had their pathetic GeForce FX line (in comparison to the 9700 and 9800 series). Now ATI is having trouble keeping up with Nvidia. It'll switch around again, and again, and again.

Personally, I don't really care. I'd prefer an ATI X800 over my current Nvidia 6800 in some cases. Other times, I am glad I have Nvidia.

Actually, the 9700 Pro/9800 Pro era is the only time ATI has ever been ahead of nVidia.


Nah, you're wrong again.

The X800XT/XT PE & X850XT/XT PE > 6800 Ultra.

Don't even start on SLI; yes, if you want to compare SLI to a single card and consider that being ahead, you go right ahead and do that.
It is true that nVidia's technology in that sense was better, and feature-wise the 6800 series was slightly better, but in overall performance terms they were behind.
Also, just to add: don't even think about labelling me as a fanboi ;)
I had a Leadtek 6800GT that I loved... heck, I liked it better than my X850XT PE... but I got my current 850 for less, and it performs better... so yeah.

I'm comparing high end to high end, period.

6800U > X800XT PE
Why? Better OpenGL, better Linux, SM3, EXR HDR, soft stencil shadows. The only thing the X800XT PE had on the 6800U was marginally better D3D performance.

6800GT SLI/6800U SLI > X850XT PE
Whether you guys like it or not, nVidia's top end in November 2004 was SLI, and ATI's was the X850XT PE. People can whine all they want about how you can't compare single to dual cards and BS like that, but the fact is you could buy it, it was nV's top end, and it blew away anything ATI had at everything.
What the "that's not fair" people are probably forgetting is that nVidia made a business decision to make SLI their top end, and might well have refreshed the 6800U with a Quadro-type cooler, faster speeds, and more RAM if they had not opted to take the SLI route. (And thank God they did - 6800GT SLI stomped X850XT PEs flat; I would have been pissed off if I had had to settle for that!)

So, for the year between the 6800U launch and the 7800GTX launch, the ONE thing ATI had on nVidia was slightly faster D3D performance, for a whole 5 months. Big whoop.

I think that is pretty much offset by nVidia giving developers a 32-bit, FP16, SM3 GPU to work with a whole YEAR AND A HALF before ATI did.

That's right, fellas - we'd have no soft shadows and no EXR HDR, now or upcoming, if it were up to the catch-up team at ATI, and Unreal 3 and Unreal Tournament 2007 would be MUCH different games. I heard this from Sweeney's own mouth, so there's no denying it.

If it were up to ATI you wouldn't be seeing this stuff for another year or two, because they couldn't figure out how to make it back in April 2004.

 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Valid points, but my post was about performance, not the feature set (which I know was better on the 6800s, but which also means nothing to over 90% of the people using the cards).
Yes, nVidia's Linux support is better, but very few people gaming care about Linux.
OpenGL games... well, there aren't a lot of them, which is why I said overall performance with the X800XT/X850XT was better.

And as for your statement that a single-card performance lead doesn't really matter: it does for those of us still using AGP, since it allows us to stick with our systems longer than we otherwise could have. The X850XT PE still does well in a lot of games at 1600x1200 with AA & AF, something pretty much no other card from that gen can do.

The points you made are valid, but only for a very small percentage of users.
Most people really only care about performance and bang for the buck.
Of course, if I bring that up, then really no one cares about SLI or the X850s or 6800 Ultras. Then it becomes an X800XL vs. 6800GT argument...

Anyway, I don't like pointless arguing, so I don't even know why I bothered posting in this thread :confused:

The way I see it, to each their own...
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Rollo
I'm comparing high end to high end, period.

6800U > X800XT PE
Why? Better OpenGL, better Linux, SM3, EXR HDR, soft stencil shadows. The only thing the X800XT PE had on the 6800U was marginally better D3D performance.

Agreed on OpenGL performance, agreed on Linux. But for SM3 and HDR, the 6800's implementation is limited. They aren't really all that usable; sure, you can write SM3 shaders that run faster than SM2, but go into heavier flow control and you choke the 6800 easily.
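The flow-control point can be illustrated with a toy model. SM3-era GPUs shade pixels in batches, and when a batch mixes both sides of a dynamic branch, every pixel in it pays the cost of the expensive path; NV40's batches were reportedly far coarser than R520's. The block sizes and per-pixel costs below are illustrative numbers, not real hardware figures:

```python
# Toy model of coarse dynamic-branching granularity: pixels are shaded
# in fixed-size blocks, and if ANY pixel in a block takes the expensive
# branch, the WHOLE block pays the expensive per-pixel cost.
def shading_cost(pixels, block, cheap=1, expensive=10):
    total = 0
    for i in range(0, len(pixels), block):
        blk = pixels[i:i + block]  # True = pixel wants the expensive path
        total += len(blk) * (expensive if any(blk) else cheap)
    return total

# 4096 pixels, only the first 64 take the expensive branch.
pixels = [True] * 64 + [False] * (4096 - 64)
print(shading_cost(pixels, block=4096))  # one huge batch: every pixel pays
print(shading_cost(pixels, block=16))    # fine batches: most stay cheap
```

With the coarse batch the branch saves nothing at all, which is why "it has SM3" and "SM3 is usable" were two different claims in this era.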

As for soft stencil shadows, I see this as a moot point. Stencil shadows are dead; no new engine is going to use them. Everything is going toward shadow mapping or another texture-based approach.

Why not compare the X850 here as well...

6800GT SLI/6800U SLI > X850XT PE
Whether you guys like it or not, nVidia's top end in November 2004 was SLI, and ATI's was the X850XT PE. People can whine all they want about how you can't compare single to dual cards and BS like that, but the fact is you could buy it, it was nV's top end, and it blew away anything ATI had at everything.
What the "that's not fair" people are probably forgetting is that nVidia made a business decision to make SLI their top end, and might well have refreshed the 6800U with a Quadro-type cooler, faster speeds, and more RAM if they had not opted to take the SLI route. (And thank God they did - 6800GT SLI stomped X850XT PEs flat; I would have been pissed off if I had had to settle for that!)

OK, if you want to go down that road, then why not compare the cost as well? SLI = 2x the cost for up to 50% more performance. IMO, not worth it. Factor in the heat, the noise, the mobo, etc., and the X800s are sounding like a much better deal.
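The cost argument is simple arithmetic. With hypothetical round numbers (not actual 2005 street prices or benchmark results): if two cards cost twice as much but deliver only 50% more frames, performance per dollar drops:

```python
# Hypothetical figures for the SLI-vs-single-card value argument.
single_price, single_fps = 450.0, 60.0   # one high-end card
sli_price = 2 * single_price             # two cards
sli_fps = single_fps * 1.5               # "up to 50% more performance"

print(single_fps / single_price)  # ~0.133 fps per dollar, single card
print(sli_fps / sli_price)        # 0.100 fps per dollar, SLI
```

By this measure SLI buys absolute performance at a worse price/performance ratio, which is exactly the trade-off being argued here.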

I still don't understand how you can compare two cards against one; it doesn't make sense.

So, for the year between the 6800U launch and the 7800GTX launch, the ONE thing ATI had on nVidia was slightly faster D3D performance, for a whole 5 months. Big whoop.

A little faster? It was significantly faster at a few points. How about higher IQ, single-slot cooling (for the X800), less noise, less power required...

I think that is pretty much offset by nVidia giving developers a 32-bit, FP16, SM3 GPU to work with a whole YEAR AND A HALF before ATI did.

Yeah, great, and how usable were they (basic stuff fine, advanced stuff = slideshow)? It was simply a checkbox feature.

That's right, fellas - we'd have no soft shadows and no EXR HDR, now or upcoming, if it were up to the catch-up team at ATI, and Unreal 3 and Unreal Tournament 2007 would be MUCH different games. I heard this from Sweeney's own mouth, so there's no denying it.

Soft shadows are not dependent on NVIDIA hardware; any hardware (GF3+) can do them...

UT3/UT2007? They aren't out yet...

BTW, post the quote from Tim.

If it were up to ATI you wouldn't be seeing this stuff for another year or two, because they couldn't figure out how to make it back in April 2004.

Yeah, keep pushing the "couldn't do it" argument; maybe you should find the original specs of the R400 chip...

If it were up to NVIDIA, you'd be paying 500 dollars for an FX 5200...