
NV 7800 GTX, lowering the bar for IQ again?

Originally posted by: M0RPH
Originally posted by: Hacp
Originally posted by: Hacp
I think some people are a little bit too picky.................


:thumbsup:

Nuff said. If you don't see it, then you need glasses or you need to get a decent monitor. If it's acceptable to you, then I guess you're just the type nVidia loves: someone who's willing to take whatever garbage is fed to them with an open mouth.

:roll:

Let's see: would I rather be gaming at 19X14 4X8X with some AF shimmer in some games at fast speeds, or settling for whatever blocky settings you burn your eyes with?

Errrr........ummmmmmm.........it is a tough call.....

LOL
 
The videos don't do it justice either. It's very noticeable in person.

Is it a deal breaker for me? No. I still like my GTX very much, it's just a very annoying *little* problem. I don't notice it in some games, and then in some it's very noticeable. I notice it a lot in BF2 when in a vehicle, which is the game I play the most right now.
 
Originally posted by: Ackmed
The videos don't do it justice either. It's very noticeable in person.

Is it a deal breaker for me? No. I still like my GTX very much, it's just a very annoying *little* problem. I don't notice it in some games, and then in some it's very noticeable. I notice it a lot in BF2 when in a vehicle, which is the game I play the most right now.

Brian Burke has stated they are working on it, so I'm sure there will be a resolution. If not, it may be one of the trades you make to own that card.

You've GOT to be loving that card though, Ackmed! 😉
 
Does this happen with the 6600GT? If so, I haven't seen anything like it.

And it's strange that no one else mentioned the 7800GTX has "bad" IQ... like reviews... or I could be wrong, I dunno.
 
Originally posted by: M0RPH
Originally posted by: wanderer27
Man, what am I missing? I don't see it.

They both look pretty darn good to me ;p

Did you read the guy's post where he tells you what to look for? It's not something that just jumps out at you but when you see it, it becomes obvious.

Look at the areas marked on this screenshot: Text

In the distance you will see the shimmering. The line on the front of the jeep just looks like crap. There's no need for video to see that one, it's obvious right in the screenshot: Text

Looking at these, I expected to see something like the effect you get when the sun is in your eyes in real life, or some crazy glare off of something. But after looking at the ATI vs. NV side by side, I see just a slightly different image. Not a "wrong" or "right" image, just something different. They both seem acceptable to my eyes (then again, a 9800P suits my IQ requirements well), and I really can't say they are "cheating" here. But it does seem that if the ATI image is the same as the other NV cards (non-7800), then they have some "issue" to say the least. Intentional? I don't think we can say that. Cheating? Even less so can we declare that the definite intent. Also, the reviewers didn't even notice this while reviewing/using the card, so I assume I am not alone in my opinion on this issue.

-Summary: it really isn't a big deal to me, it just seems different, and it's too early to say that they are "cheating" or even did this intentionally.
 
Originally posted by: Rollo
Originally posted by: Ackmed
The videos don't do it justice either. It's very noticeable in person.

Is it a deal breaker for me? No. I still like my GTX very much, it's just a very annoying *little* problem. I don't notice it in some games, and then in some it's very noticeable. I notice it a lot in BF2 when in a vehicle, which is the game I play the most right now.

Brian Burke has stated they are working on it, so I'm sure there will be a resolution. If not, it may be one of the trades you make to own that card.

You've GOT to be loving that card though, Ackmed! 😉

Yeah, I saw that quote. I don't put too much stock in it though. If they fix it, fantastic. If they do it "soon", even better.

Yeah, I'm loving it. It's the only single card out that can run my res the way I like it. If SOMEONE would just come out with an aftermarket cooler that exhausts the heat, I would be very, very happy. That card, plus 4 sticks of RAM at 3.5V, run very hot. I don't know which is hotter, but both are too hot to touch.

 
Is it a deal breaker for me? No. I still like my GTX very much, it's just a very annoying *little* problem. I don't notice it in some games, and then in some it's very noticeable.

I think that about sums it up.
 
I've noticed the shimmering is worse with the 77.77 driver set. It also gave a "lagged" feel when playing games for me. Switched back to the 77.76 and the shimmering seems to be less pronounced and everything is nice and smooth. Going to wait on nVidia to release a fix for the shimmering before I change drivers again.
 
Originally posted by: wanderer27
Okay, after watching those areas on both Videos a few more times, I do see a little bit of difference.

In all honesty though, it really just looks like the ATI has AA on and the Nvidia card doesn't. The viewing angles are also a little bit different.

Quality is really good on both cards to me. I don't know that it's anything that's really going to affect which card I would get.

People just don't read anymore do they? From the post that I linked to:

"Even with HQ in the NV control panel (which lowers performance by 25%, totally slaughtered by my X800XTPE performance-wise) the shimmering stays awful.

Ati - 4xAA/8xAF

Nvidia - 4xAA/8xAF (HQ!)"

The Nvidia clip was done in their "High Quality" mode!
 
Yawn, Morph, are you calling me a fanboy?????? I have no bias towards ATI or Nvidia at all. I'm just saying that it is not noticeable at all to me. Neither is AA or AF, apparently (when actually playing games, not screenshots)... I have bad eyes 🙁 . If ATI had this problem and I saw those same exact pics, I would have said the same thing.
 
Originally posted by: M0RPH
Originally posted by: wanderer27
Okay, after watching those areas on both Videos a few more times, I do see a little bit of difference.

In all honesty though, it really just looks like the ATI has AA on and the Nvidia card doesn't. The viewing angles are also a little bit different.

Quality is really good on both cards to me. I don't know that it's anything that's really going to affect which card I would get.

People just don't read anymore do they? From the post that I linked to:

"Even with HQ in the NV control panel (which lowers performance by 25%, totally slaughtered by my X800XTPE performance-wise) the shimmering stays awful.

Ati - 4xAA/8xAF

Nvidia - 4xAA/8xAF (HQ!)"

The Nvidia clip was done in their "High Quality" mode!

I think people just choose what they read better and who they reply to.

Yes, nVidia acknowledges an issue here and is working on it. No, most people don't seem to think it's as huge a deal as you do, and everyone thinks it varies by game and situation.

I forget who it was, but there was an ATer that reported the same sort of thing with ATI's AF "optimizations" when the R420 came out.

I think you're just trying to convince yourself you did the right thing not buying a 7800GTX while the rest of us are enjoying what it has to offer.

You think the R520 won't have any issues? LOL, they can't even build one to sell so far. What are the odds drivers will be perfect on rev 1 hardware that's been through four re-tapings?

Can you say "Not too good" Morph? I thought you could.
 
Rollo, I'll bet my bottom dollar that if ATI had the shimmering effect you would shun it like the plague. I do agree that the issue isn't bad, but it definitely tells me that ATI does have higher image quality. I'm really eager to see what the R520 brings to the table against the 7800. I'm neither pro-ATI nor pro-NVIDIA, well maybe a lil' bit ATI, even though I have a 6800GT. ATI, hurry your arse up!!!!
 
Originally posted by: lavaheadache
Rollo, I'll bet my bottom dollar that if ATI had the shimmering effect you would shun it like the plague. I do agree that the issue isn't bad, but it definitely tells me that ATI does have higher image quality. I'm really eager to see what the R520 brings to the table against the 7800. I'm neither pro-ATI nor pro-NVIDIA, well maybe a lil' bit ATI, even though I have a 6800GT. ATI, hurry your arse up!!!!

You'd lose that bet. I bought the X800XT PE before they changed the drivers to fix the shimmer (and I knew about it).
 
So you're saying that if ATI had a shimmering problem you would say no biggie? I find that hard to believe, but you have proven to be an honest man in the past.
 
Let's get something straight here. The videos I pointed out are just one small example of this issue that plagues Nvidia cards. The video shown was Nvidia's HQ mode... the default quality mode is much worse, and this is of course the mode that all reviewers' benchmarks are based on. As some others have said, it can be better or worse depending on what game you're playing.

But how would you feel if you spent $400 on a video card and your favorite game is full of shimmering textures? The whole reason people buy $400 video cards is so they can turn up all the eye candy, including full AF/AA. And why do we all want to play with AF/AA? Because we want not only high frame rates but also nice image quality. Well, apparently Nvidia has decided that image quality is not so important, that it's more important to squeeze out a few more FPS in order to impress you in reviews and get you to buy their card. Suckers.
 
Originally posted by: lavaheadache
So you're saying that if ATI had a shimmering problem you would say no biggie? I find that hard to believe, but you have proven to be an honest man in the past.

I would list it as an item to consider when evaluating the card, no doubt. I wouldn't ignore it, and I'm not ignoring it here either. I haven't noticed this while gaming, and IMO an IQ flaw you have to be told about isn't the end of the world.

When the ATI shimmering problem came about, it was a bigger deal than this because ATI stated they had no optimizations (when they did) and went out of their way to ask reviewers to disable nV's when benching.

 
It's annoying that Nvidia is up to its dirty little tricks again by fiddling with quality settings on benchmarks.
In '03, they never admitted to cheating on Futuremark and stated that Futuremark was purposely making Nvidia look bad. Then Nvidia re-partnered with Futuremark, paid its subscription, and a more pliant Futuremark stated that Nvidia didn't cheat but broke rules (what a case of doublespeak).
http://www.infoworld.com/article/03/06/03/HNnocheat_1.html
ATI admitted to cheating and released a statement saying they would never cheat again.
Fast forward to '04: Nvidia is secretly lowering performance settings again.
http://www.geek.com/news/geeknews/2004Mar/bga20040426024897.htm
With its checkered past, I'm not surprised that Nvidia is still continuing its policy of cheating on benchmarks.
I suppose the only way to get a fair comparison is to lower the AA/AF, mipmap, and texture settings on an ATI card vs. higher settings on an Nvidia card.

 
I stand corrected. I thought the LOD bias was supposed to fix that.

I think you are making a bigger issue of this than it is, OP.

This is not lowering the bar for IQ by any means. It is merely a bug in the programming of a new chip. Deal with it!

-Kevin

Btw: as I was reading this thread and saw Ackmed and Rollo and a bunch of people agreeing and whatnot, I just smiled 😛
 
ATI and Nvidia are both guilty of optimizing for specific benchmarks within the 3DMark series. Since then, neither has "broken the rules". So I'm not sure what dirty little tricks you are talking about there.

As for 2004, I assume you mean the FX and the 6 series. Remember, Nvidia was still using shader replacement, which in their case did lower IQ. However, by no means was it cheating.

I suppose the only way to get a fair comparison is to lower the AA/AF, mipmap, and texture settings on an ATI card vs. higher settings on an Nvidia card.

What in the hell are you talking about??

-Kevin
 
Originally posted by: Gamingphreek
I stand corrected. I thought the LOD bias was supposed to fix that.

I think you are making a bigger issue of this than it is, OP.

This is not lowering the bar for IQ by any means. It is merely a bug in the programming of a new chip. Deal with it!

This isn't a new problem with the 7800. It's been an issue with the 6800 series cards as well, it just seems to have gotten even worse with the 7800. If you actually visit the threads posted by the OP and do some reading, you'll understand the issue better.

 
Originally posted by: M0RPH
Originally posted by: Gamingphreek
I stand corrected. I thought the LOD bias was supposed to fix that.

I think you are making a bigger issue of this than it is, OP.

This is not lowering the bar for IQ by any means. It is merely a bug in the programming of a new chip. Deal with it!

This isn't a new problem with the 7800. It's been an issue with the 6800 series cards as well, it just seems to have gotten even worse with the 7800. If you actually visit the threads posted by the OP and do some reading, you'll understand the issue better.

The LOD bias fixed most of the problems with the 6800s. However, as I said, this cannot be considered lowering IQ. What, do you think Nvidia said, "If we make this image flicker we can get 10% more performance" or something!? No, it is merely a bug. I agree that I would like it fixed, however it isn't that big of a deal. Get over it!

-Kevin
 
Originally posted by: Gamingphreek
ATI and Nvidia are both guilty of optimizing for specific benchmarks within the 3DMark series. Since then, neither has "broken the rules". So I'm not sure what dirty little tricks you are talking about there.

As for 2004, I assume you mean the FX and the 6 series. Remember, Nvidia was still using shader replacement, which in their case did lower IQ. However, by no means was it cheating.

I suppose the only way to get a fair comparison is to lower the AA/AF, mipmap, and texture settings on an ATI card vs. higher settings on an Nvidia card.

What in the hell are you talking about??

-Kevin

I'm not sure what you are saying about shader replacements. I could be wrong. All I gather from the article is lowered mipmap levels and lower-quality textures.

 
Nvidia used shader replacement in their GeForce FX and GeForce 6 series chips. In the FX's case it did have a noticeable impact on IQ.

As for the article, there is no way that the older cards have an advantage like that. 8x to 16x vs. 8x just simply slaughters it... umm, how about no. Read reviews of the 7 series from any trusted reviewer and you will see.

-Kevin
 