
NV 7800 GTX, lowering the bar for IQ again?

Originally posted by: TheSnowman
We first got anisotropic filtering with the NV10, and ATI was playing catch-up with the AF quality until the NV30 came out and dropped the bar. So no, Nvidia hasn't been following ATI here, they just gave up on leading.

Sorry, wrong again:
http://www.anandtech.com/showdoc.aspx?i=1821&p=14
The transitions between mip bands are smoother on the NVIDIA setting.
Finally, NVIDIA's claim that their "performance" mode offers equal to or greater quality than ATI's is actually true.
(performance on 5900/5800)

Both ATI and NVIDIA's quality modes are virtually identical, you'd be hard pressed to find a difference between the two.
(quality on 5900/5800)

It's a pretty well known fact that the 5800's AF had driver issues for 3-4 months; it was a new core, after all. The issues were fixed, and both IQ and fps increased.
 
Wrong again? Nvidia always had the best AF quality through their NV25, which held the bar, and then they dropped it with their NV30. That is what I said in what you quoted, and you posted an article comparing the NV35 to the R350, which obviously has absolutely no bearing on what I said. This reminds me of the last time you said I was wrong about the Chaos Theory patch; maybe next time you feel like telling me I am wrong, you might check to see if maybe you just misunderstood first?
 
#1 It was announced over a year ago and vsync didn't work then and it doesn't work now.

Funny, just because it was announced a year ago doesn't mean it was released. The AT article did not come out until late November. I don't know what you are talking about.

#2 What the hell are you talking about color accuracy for? We are talking about filtering here, and there are many different levels of filtering accuracy, such as bilinear, trilinear, and various forms of anisotropy.

You are talking about accuracy though. You mean the different levels of filtering, I guess. Still, both companies look just as good as each other (there's a rough cost sketch of those filtering levels at the end of this post).

#3 If you actually took the time to comprehend even just the original post, you would have already seen that your suggestion had been tried long ago.

Excuse me for even suggesting something :roll:

#4 See my response to Ben; and again, you won't see as long as you don't want to, but just because you want to hold your hands over your eyes doesn't mean the rest of us are going to.

I couldn't care less who has better accuracy right now. Notice I am running the [sarcasm]king of IQ[/sarcasm] 5900XT.

As I have said numerous times, this is a bug, not a cheap optimization.

-Kevin
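
Side note for anyone skimming the bilinear/trilinear/AF back-and-forth above: here is a rough Python sketch of what those filtering levels mean in terms of work per pixel. The tap counts are the textbook values and the little function is my own illustration, not how any actual GPU (NV or ATI) implements filtering.

[code]
# Rough illustration of per-pixel texture filtering cost.
# Tap counts are the textbook values; real GPUs take plenty of shortcuts.

def filtering_cost(mode, af_degree=1):
    """Approximate texel reads per screen pixel for a given filtering mode."""
    if mode == "bilinear":      # 2x2 texels from a single mip level
        return 4
    if mode == "trilinear":     # 2x2 texels from each of two adjacent mip levels
        return 8
    if mode == "anisotropic":   # up to af_degree trilinear taps along the axis of anisotropy
        return 8 * af_degree
    raise ValueError("unknown filtering mode: %s" % mode)

for mode, af in [("bilinear", 1), ("trilinear", 1), ("anisotropic", 8), ("anisotropic", 16)]:
    print("%12s %2dx: ~%d texel reads per pixel" % (mode, af, filtering_cost(mode, af)))
[/code]

The only point of the sketch is that each step up costs more texel reads, which is exactly where the temptation to "optimize" comes from.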
 
Originally posted by: BenSkywalker
No, actually the one releasing new cards with even worse filtering quality is the one that is most guilty right now.

Are you talking about ATi or nVidia? The R300 core parts are superior to the R420 parts (which is a very, very sad statement knowing how bad the R300 parts are). Personally I'm pretty p!ssed at both ATi and nV: ATi for starting the entire 'let's kill IQ for the sake of benches' mentality, and nV for following along too d@mn willingly. No matter how much money you spend right now, you can't even get a decent low/mid tier part with decent filtering; that is pathetic.


ATI's filtering looks pretty darn decent to me (x800 series), at least when compared to nVidia.
 
Originally posted by: M0RPH

"Even with HQ in the Nv control panel (which lowers performance with 25%, totally slaughtered by my X800XTPE performancewise) the shimmering stays awfull.

Ati - 4xAA/8xAF

Nvidia - 4xAA/8xAF (HQ!)"

The Nvidia clip was done in their "High Quality" mode!

Wait a second, are you saying that your ATI X800XT PE absolutely slaughtered the 7800GTX? Even with NV's HQ? Hmm... what did you do, set YOUR IQ to "performance"?
 
Originally posted by: TheSnowman
Wrong again? Nvidia always had the best AF quality through their NV25, which held the bar, and then they dropped it with their NV30. That is what I said in what you quoted, and you posted an article comparing the NV35 to the R350, which obviously has absolutely no bearing on what I said. This reminds me of the last time you said I was wrong about the Chaos Theory patch; maybe next time you feel like telling me I am wrong, you might check to see if maybe you just misunderstood first?

The thing is, the AF fixes for the NV35 fixed it for the NV30 as well, IIRC. You're right that they dropped AF quality for 3-4 months, but they did fix it with a later driver revision. The article I linked to had a nice explanation of the driver fix. No offense intended.
 
But again, they never brought the AF back to what it was on the Geforce4, and they simply can't because of the changes they made at the hardware level. Drivers have been all over the place for both ATI and Nvidia, but that is a whole other story.
 
The English version of the article is out at 3dcenter (Thanks munky, I put a link in the OP)

Originally posted by: Gamingphreek

Well they both have equally tainted paths. All the reviews so far have put all IQ on the exact same level, sometimes with the nod going to either side. So I don't see how one could be filtering worse than the other.
No, they don't have equally tainted paths. The problem is that most of the hardware sites did very little (if anything) in the area of IQ comparisons. They usually just checked a couple of shots to compare the AA differences. And not all games will have a problem with texture aliasing either. They probably also followed Nvidia's benchmarking suggestions.

Nvidia writes:

Quantitative image quality analysis demonstrates that the NVIDIA "Quality" setting produces superior image fidelity to competitive solutions; therefore "High Quality" mode is not recommended for benchmarking.

3dcenter.org responds:

However, we also cannot confirm this: the anisotropic filter of the G70 chips also shows shimmering textures under "High Quality", while the current Radeon cards do not flicker even with "A.I." optimizations enabled. We would really be interested to know more about how Nvidia's "quantitative image quality analysis" examines image quality.

Forgive me if I'm a little skeptical that this is a "bug." The problem has been known about for over a year -- so why isn't it fixed? If it was a "bug," then why did NV reduce the quality on the 7800 with their new optimizations compared to what IQ the 6800 had? NV obviously added new AF optimizations for the 7800. If those optimizations don't speed things up (but reduce IQ), why put them in there in the first place? Why get reduced IQ with no speedup?

3dcenter initially "speculated" that the texture aliasing may be from underfiltering (but subsequently said the matter is more complicated and may be a driver bug). If the problem is from underfiltering, then the only way to solve it is to do more filtering = more work = slower performance. Or find a better way to do optimizations, like ATI. I guess we will see if NV can fix the problem with no drop in performance.
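
To put that underfiltering point in concrete terms, here is a back-of-the-envelope Python sketch. The 16-tap "ideal" and the fill-rate-bound scaling are my own illustrative assumptions, not measurements of any real card; the point is simply that skipping taps saves work, but the skipped samples are exactly what shows up as aliasing.

[code]
# Back-of-the-envelope sketch of the underfiltering trade-off.
# The tap counts and the fill-rate-bound assumption are illustrative, not measured.

IDEAL_TAPS = 16          # assume "honest" 16x AF needs 16 trilinear taps on a steep surface
BASE_FPS_AT_IDEAL = 40.0 # assumed framerate when doing all of the work

for taps in (16, 12, 8):
    # If the card is texture-fill limited, doing fewer taps scales the framerate up...
    fps = BASE_FPS_AT_IDEAL * IDEAL_TAPS / taps
    # ...but every skipped tap is detail the filter no longer averages away.
    coverage = 100.0 * taps / IDEAL_TAPS
    print("%2d taps: ~%.0f fps, %.0f%% of ideal filtering (the rest can show up as shimmer)"
          % (taps, fps, coverage))
[/code]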

Also:

The GeForce 6800 (or GeForce 6600) has to be configured to use "High Quality" to circumvent texture shimmering as much as possible. With the 7800, this seems to be useless; even when using "High Quality", the new chip tends to show texture shimmering. The old Nvidia GeForce FX shows nearly perfect textures, though.

ATI's Radeon X800, even when using standard settings, seems to be far superior to any GeForce 6800 or 7800 already. There are areas which tend to flicker faintly, but altogether, only the angle-dependent AF reduction in the tunnel is distracting. The GeForce 7800's "High Quality" mode is clearly surpassed.

That means: All benchmarks using standard settings, no matter if GeForce 7800 or 6800, versus a Radeon are wrong: Nvidia offers, at this time, by far the worse AF quality. Radeon standard settings are better (speaking in terms of image quality) than 6800 standard settings, whilst the 7800's standard settings are even worse.

 
From personal experience, ATi filtering with or without opts is better than the 7800 series filtering. Once Nvidia releases an update that fixes the shimmering, the 7800 may end up being on par with or close to the ATi cards.
 
However, we also cannot confirm this: the anisotropic filter of the G70 chips also shows shimmering textures under "High Quality", while the current Radeon cards do not flicker even with "A.I." optimizations enabled. We would really be interested to know more about how Nvidia's "quantitative image quality analysis" examines image quality.
ATI default IQ shimmers; what is this guy talking about? Although it's better than 6800 default IQ, it's not better than High Quality.
 
Originally posted by: VIAN
However, we also cannot confirm this: the anisotropic filter of the G70 chips also shows shimmering textures under "High Quality", while the current Radeon cards do not flicker even with "A.I." optimizations enabled. We would really be interested to know more about how Nvidia's "quantitative image quality analysis" examines image quality.
ATI default IQ shimmers; what is this guy talking about? Although it's better than 6800 default IQ, it's not better than High Quality.


He's saying that when you switch to HQ mode with a 7800 it doesn't turn off all the opts like it should.
 
What advantage does 16x AF give you if you get 2x AF at maximum at certain angles and are exposed to texture shimmering while other cards provide flicker-free textures? All benchmarks using the standard setting for NV40 and G70 against the Radeon are invalid, because the Nvidia cards are using general undersampling, which can (and does) result in texture shimmering. We know and love GeForce 7 for the Transparency Antialiasing and the high-performance implementation of SM3 features, but the GeForce 7 series cannot be configured to deliver flicker-free AF textures, while the GeForce 6 series and the Radeon series can (of course) render flicker-free AF quality.
Love what it says here. If you read the descriptions under the videos section, it proves what I've been saying all along.
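
To make the "only 2x AF at certain angles" part concrete, here is a simplified Python sketch. The clamp curve is my own guess at the commonly described angle-dependent behavior, not either vendor's actual hardware logic; it just shows how a requested 16x can collapse to 2x on unfavourably angled surfaces.

[code]
import math

# Simplified illustration of angle-dependent AF reduction.
# The clamp curve is an assumption for demonstration, not real hardware logic.

def effective_af(requested_af, surface_angle_deg):
    """Clamp the requested AF degree based on how far the surface is from an axis-aligned angle."""
    off_axis = abs(math.sin(math.radians(2 * surface_angle_deg)))   # 0 at 0/90 deg, 1 at 45 deg
    max_af_here = max(2, round(requested_af * (1.0 - 0.875 * off_axis)))
    return min(requested_af, max_af_here)

for angle in (0, 15, 30, 45, 60, 90):
    print("surface at %2d deg: requested 16x AF, effective %dx" % (angle, effective_af(16, angle)))
[/code]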

He's saying that when you switch to HQ mode with a 7800 it doesn't turn off all the opts like it should.
ATI does the same thing. And it has to stop.
 
Yeah, the thing is they can't make it stop, as they have "optimisations" built into the hardware. I'm crossing my fingers that the R520 raises the bar for filtering, but I'm not holding my breath.
 
Me too; come on ATI, break away from your history of lowering the bar on filtering.

Although the R300 filtering is crappier, it's hard to deny it was a nice step, as AF was now brought to the mainstream. However, you should always be able to get the best image quality possible by disabling optimizations that suck.
 
The thing is, ATI improved their AF greatly with the R300. Sure it wasn't up to par with the Geforce4, but it was a damn big step up from what ATI had on the R200. ATI only lowered their filtering quality at the hardware level once, with the RV350, and that was at the same time Nvidia was going nihilistic with their FX line and the "brilinear" driver crap.
 
I personally have no idea if this PARTICULAR shimmering is a result of non-working filtering or if it is in fact a bug. Then again... neither do any of you (no offense). Why don't we just declare the shimmer up for review, wait for the next driver release, and bench them both to see if framerate decreased and IQ increased? Just stop acting like you personally know exactly what caused this shimmering and whether it was/wasn't Nv's intent to have it in the code for their benefit.

-Sorry if I have offended any of you, but these seem to be the facts. Time will reveal all.
 
Originally posted by: Marsumane
I personally have no idea if this PARTICULAR shimmering is a result of non-working filtering or if it is in fact a bug. Then again... neither do any of you (no offense). Why don't we just declare the shimmer up for review, wait for the next driver release, and bench them both to see if framerate decreased and IQ increased? Just stop acting like you personally know exactly what caused this shimmering and whether it was/wasn't Nv's intent to have it in the code for their benefit.

-Sorry if I have offended any of you, but these seem to be the facts. Time will reveal all.

Sounds good to me 🙂
 
Originally posted by: Acanthus
the fx line got almost a 50% performance increase across the board from release to the latest drivers...

Link or confirmation please? That would be awesome. This could be just the boost I need to finally get playable framerates at 12x10 and 1400x1050 in HL2!
 
Originally posted by: ssvegeta1010
Originally posted by: Acanthus
the fx line got almost a 50% performance increase across the board from release to the latest drivers...

Link or confirmation please? That would be awesome. This could be just the boost I need to finally get playable framerates at 12x10 and 1400x1050 in HL2!

They didn't get the boost all at once; they gradually received the boost lol. As long as you are running somewhere around the Release 70 series, you already have the boost. lol.

-Kevin
 
Originally posted by: ssvegeta1010
Originally posted by: Acanthus
the fx line got almost a 50% performance increase across the board from release to the latest drivers...

Link or confirmation please? That would be awesome. This could be just the boost I need to finally get playable framerates at 12x10 and 1400x1050 in HL2!

I worded that kind of funny; I meant from the release of the FX line to the final drivers that updated code for the FX series.
 