A 7900GTX at 700MHz/1800MHz always gets beaten by an X1900XTX at stock


schneiderguy · Lifer · Jun 26, 2006
Originally posted by: BFG10K

Screenshots are useless because the problem is only visible during movement; you can't see texture wiggling, shimmering and/or crawling in screenshots.

Yes you can, the textures get warped or something, it's hard to explain. I'll post a screenshot in a bit.

Edit: at least you can see if the AF quality is really bad. I guess it doesn't actually shimmer in the screenshot, but where there is bad AF quality there is shimmering :|
 

CP5670 · Diamond Member · Jun 24, 2004
Yeah, in extreme cases it's quite apparent in screenshots too. You can notice heavy moire artifacts and even distortion/warping on any sort of high resolution grid-like texture, such as square floor tiles.
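The moire on grid textures is plain texture-space aliasing, and you can reproduce it outside of any game. Here's a minimal sketch in Python/NumPy (my own illustration, not anything from the review): point-sampling a fine checkerboard at too coarse a grid produces exactly those false wave patterns, while averaging several sub-samples per pixel, a crude stand-in for proper mip-mapping/AF, washes them out.

```python
import numpy as np

def checker(u, v, frequency):
    """High-frequency checkerboard texture; returns 0 or 1."""
    return (np.floor(u * frequency) + np.floor(v * frequency)) % 2

size = 100                      # "screen" is 100x100 pixels
freq = 97.0                     # ~97 texture cycles across the screen width

# Point sampling (one tap per pixel): each tap lands on whichever square it
# happens to hit, so large false wave patterns (moire) appear.
u, v = np.meshgrid(np.linspace(0, 1, size), np.linspace(0, 1, size))
aliased = checker(u, v, freq)

# Averaging a 4x4 grid of sub-samples per pixel averages the detail away
# instead of aliasing it.
sub = 4
uu, vv = np.meshgrid(np.linspace(0, 1, size * sub), np.linspace(0, 1, size * sub))
filtered = checker(uu, vv, freq).reshape(size, sub, size, sub).mean(axis=(1, 3))

print("variance, point sampled:", round(aliased.var(), 3))   # strong false pattern
print("variance, box filtered :", round(filtered.var(), 3))  # much closer to flat grey
```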
 

Shamrock · Golden Member · Oct 11, 1999
I'm an nVidiot, and even I can tell the difference between Q and HQ. But as I have a CRT (an NEC FE1250+ Diamondtron), I don't see any shimmering in Q or HQ.
 

schneiderguy · Lifer · Jun 26, 2006
pics!

Okay, HERE is the HQ shot. Notice how the floor texture isn't "uniform", how it kind of goes in waves and changes its appearance as it gets farther away from you? That's what causes the slight shimmer in HQ mode (almost unnoticeable unless you're looking for it).

Now HERE'S a High Performance mode one with all the optimizations turned on :Q

As you can see, it's even more "wavy" than the HQ shot and shimmers like a b!tch :|
 

schneiderguy · Lifer · Jun 26, 2006
Your mouseover thing isn't working :(

And it's not HQ vs Q; there's not much difference there. The second one is High Performance (the lowest quality setting). I just wanted to show what really bad AF looks like :Q

Edit: HERE is a shot of Quality mode. Not that great a difference in the screenshot, but it did shimmer slightly more in game.
 

coldpower27 · Golden Member · Jul 18, 2004
I don't see diddly squat difference on the tiles between High Quality and Quality.

I do see a difference with High Quality/Quality vs High Performance; however, I can only tell if I flick between the before and after images in different tabs.
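If tab-flicking is the only way to spot it, a per-pixel diff of the two screenshots makes the filtering differences jump out. A rough sketch, assuming Pillow and NumPy are available; the file names are placeholders, not the actual shots posted in this thread:

```python
from PIL import Image, ImageChops
import numpy as np

# Placeholder file names; substitute two screenshots taken from the exact
# same position with different driver quality settings (same dimensions).
hq = Image.open("hq_mode.png").convert("RGB")
hp = Image.open("high_performance_mode.png").convert("RGB")

diff = ImageChops.difference(hq, hp)          # per-pixel absolute difference
arr = np.asarray(diff, dtype=np.float32)

print("mean difference:", arr.mean())
print("max difference :", arr.max())

# Multiply the differences up so they are visible and save for inspection;
# the areas hit hardest by the filtering optimizations light up.
Image.fromarray(np.clip(arr * 8, 0, 255).astype(np.uint8)).save("diff_x8.png")
```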
 

redbox · Golden Member · Nov 12, 2005
For me the difference is that when I move around in the game, certain textures flicker or crawl with just Quality on.
 

nitromullet · Diamond Member · Jan 7, 2004
Originally posted by: schneiderguy
Your mouseover thing isn't working :(

And it's not HQ vs Q; there's not much difference there. The second one is High Performance (the lowest quality setting). I just wanted to show what really bad AF looks like :Q

Edit: HERE is a shot of Quality mode. Not that great a difference in the screenshot, but it did shimmer slightly more in game.
Ah, OK. I hadn't tested the mouseover in IE; it doesn't appear to work there.
 

ElFenix · Elite Member · Super Moderator · Mar 20, 2000
This started out as a dumb thread, and it might have changed into something worthwhile. But posting a review from March in July is dumb.
 

mylok · Senior member · Nov 1, 2004
It seems some people do not think there is an IQ difference between Q and HQ in NVIDIA's drivers. If so, why does NVIDIA take a performance hit when switching to HQ, and why have an HQ setting at all? Why not just call Quality "High Quality"?
Maybe ATI should just downgrade its default settings to Quality. I do hope more review sites will take the quality settings into consideration in future reviews. Who spends $500 on a video card to run anything other than the best IQ?
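For what it's worth, the HQ performance hit has a plausible mechanical explanation: the Quality-level optimizations reduce texture work per pixel (partial rather than full trilinear blending, and fewer anisotropic taps on some surfaces). A back-of-envelope sketch with made-up numbers, not NVIDIA's actual driver internals, shows why that would cost frame rate in texture-bound scenes:

```python
# Toy model of texture cost per pixel. The tap counts and the "brilinear"
# fraction below are assumptions for illustration only.
BILINEAR_TAPS = 1        # one filtered fetch from a single mip level
TRILINEAR_TAPS = 2       # blend fetches from two adjacent mip levels

def taps_per_pixel(af_level, trilinear_fraction):
    """Rough texture fetches per pixel for a surface needing af_level:1
    anisotropy. trilinear_fraction < 1.0 models "brilinear" filtering,
    where only part of the mip transition gets a true two-level blend."""
    blend = BILINEAR_TAPS + trilinear_fraction * (TRILINEAR_TAPS - BILINEAR_TAPS)
    return af_level * blend

hq = taps_per_pixel(af_level=16, trilinear_fraction=1.0)  # full trilinear, full 16x AF
q  = taps_per_pixel(af_level=8,  trilinear_fraction=0.3)  # fewer samples + brilinear

print(f"HQ: {hq:.1f} taps/pixel, Q: {q:.1f} taps/pixel")
print(f"Q skips ~{100 * (1 - q / hq):.0f}% of the texture work on this surface")
# Only the most oblique, texture-bound surfaces see savings that large, which
# is why the whole-frame cost of HQ lands closer to the ~10% people measured.
```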
 

Polish3d · Diamond Member · Jul 6, 2005
As far as I remember, when I had my GTX there was that whole debacle about shimmering (which actually was pretty bad in BF2), where you were told to raise the setting to "High Quality" to fix it, and then of course everyone realized that knocked off something like 10% or more in performance.

It's not a dumb thread, and posting a review from March isn't dumb when the point of the article is to show a newly released card with less mature drivers trouncing a card with nearly a year of driver maturity, overclocked to 700/1800.

No other reviews of this type exist that I'm aware of.

You think NVIDIA released drivers in the last few months that made up a 10-25% performance deficit in nearly every game reviewed, on a card with a significant overclock?

They used ATI's 6.3 drivers, which are not that old either.
 

beggerking · Golden Member · Jan 15, 2006
The 1900XT does perform better than the 7900GTX at specific settings (high AA/AF); that's old news. Meanwhile the 7900GTX performs better in OpenGL apps (no AA/AF, high resolution).
 

akugami · Diamond Member · Feb 14, 2005
Originally posted by: SolMiester
So, what are you guys trying to prove with all of this? It certainly sounds a bit anal to me; like someone is going to notice a 1-10 fps difference when you are over 50 fps?!

Some people just prefer ATI and others NV... Give it up!

To me, ATI is a follower and NV the innovator. Let's face it, ATI only started getting into 3D performance once 3dfx was finished. ATI has gone a different and complex route with their latest offerings, and until unified shader technology arrives with the next DX10 games, no one is going to know who has got it right.

I'm not going to try and stuff my opinion down anyone else's throat.

ATI has done some things first, with nVidia following. It's not a cut-and-dried situation, since sometimes both ATI and nVidia are developing similar features concurrently. At this point does it even matter who entered the market first? That's like saying that Sega is a follower and not an innovator because they weren't in the games market before Nintendo was. Never mind that Sega has created some fantastic, and innovative, games.

As for unified shader tech, you do realize that nVidia is going to go this route as well, right? Basically ATI is the leader and nVidia the follower as far as unified shaders go. You even admit ATI is going a different route from nVidia, so it's not all follow-the-leader with ATI. Give ATI credit for going the route they think is right. It may be that nVidia chose the correct path, but as you said, until we see performance numbers we don't know whether the R600 or G80 was the correct way to go to maximize performance in current and near-future games.


I am not a fan of Legit Reviews so I won't comment on what it does or does not show.
 

schneiderguy · Lifer · Jun 26, 2006
Originally posted by: BFG10K
Yes you can, the textures get warped or something,
Screenshots are next to useless for showing the problem.

A screenshot with no texture warping will not shimmer in game; a screenshot with heavy texture warping will. The shimmer is caused by the texture warping: as you move, the textures "move" because they keep warping, which is caused by bad AF quality *cough* nvidia *cough*, and that is what creates the "shimmer" effect.
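That link between undersampled AF and temporal shimmer is easy to demonstrate in one dimension. A minimal sketch of my own (not from anyone in the thread): a pixel whose footprint spans many texels gets a stable value if enough taps are averaged, but with only a couple of taps the value jumps every time the camera moves a fraction of a texel, which is the flicker you see in motion but never in a single screenshot.

```python
import numpy as np

rng = np.random.default_rng(0)
texture = rng.random(4096)            # one noisy, detailed row of texels

def shade(center, footprint=16.0, samples=16):
    """Average `samples` taps spread across `footprint` texels (a 1-D
    stand-in for anisotropic filtering along the line of sight)."""
    taps = center + np.linspace(-footprint / 2, footprint / 2, samples)
    return texture[np.round(taps).astype(int) % texture.size].mean()

# Slide the footprint 0.37 texels per "frame", as if the camera were moving.
offsets = 1000.0 + np.arange(40) * 0.37
full_af  = np.array([shade(o, samples=16) for o in offsets])
cheap_af = np.array([shade(o, samples=2)  for o in offsets])

print("frame-to-frame change, 16 taps:", np.abs(np.diff(full_af)).mean())
print("frame-to-frame change,  2 taps:", np.abs(np.diff(cheap_af)).mean())
# The 2-tap version jumps around far more from frame to frame: a static shot
# just looks "warped", but in motion that instability reads as shimmer.
```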
 

videopho · Diamond Member · Apr 8, 2005
Originally posted by: ElFenix
This started out as a dumb thread, and it might have changed into something worthwhile. But posting a review from March in July is dumb.

I concur... The difference in fps is for the most part ~5% between the two cards, one way or the other. If someone can actually see a 5% difference in game, please explain. Otherwise, enough is enough. What a waste!

 

redbox · Golden Member · Nov 12, 2005
Originally posted by: beggerking
The 1900XT does perform better than the 7900GTX at specific settings (high AA/AF); that's old news. Meanwhile the 7900GTX performs better in OpenGL apps (no AA/AF, high resolution).

Yep, I see it the exact same way. NVIDIA has always been good with OpenGL. The 1900XT/XTX does a really good job at high settings. I believe both cards do a pretty good job at high resolution.
 

Polish3d · Diamond Member · Jul 6, 2005
Originally posted by: videopho
Originally posted by: ElFenix
This started out as a dumb thread, and it might have changed into something worthwhile. But posting a review from March in July is dumb.

I concur... The difference in fps is for the most part ~5% between the two cards, one way or the other. If someone can actually see a 5% difference in game, please explain. Otherwise, enough is enough. What a waste!

5%? You mean 10-20%+, against a card that was already overclocked by quite a lot.

 

ElFenix · Elite Member · Super Moderator · Mar 20, 2000
Originally posted by: beggerking
The 1900XT does perform better than the 7900GTX at specific settings (high AA/AF); that's old news. Meanwhile the 7900GTX performs better in OpenGL apps (no AA/AF, high resolution).

Who buys a 7900GTX and doesn't play with AA/AF?
 

beggerking · Golden Member · Jan 15, 2006
Originally posted by: ElFenix
Originally posted by: beggerking
The 1900XT does perform better than the 7900GTX at specific settings (high AA/AF); that's old news. Meanwhile the 7900GTX performs better in OpenGL apps (no AA/AF, high resolution).

Who buys a 7900GTX and doesn't play with AA/AF?

At a high enough resolution, AA becomes pointless.
 

ShadowOfMyself · Diamond Member · Jun 22, 2006
Originally posted by: beggerking
Originally posted by: ElFenix
Originally posted by: beggerking
The 1900XT does perform better than the 7900GTX at specific settings (high AA/AF); that's old news. Meanwhile the 7900GTX performs better in OpenGL apps (no AA/AF, high resolution).

Who buys a 7900GTX and doesn't play with AA/AF?

At a high enough resolution, AA becomes pointless.

Bullshit. If you use a high resolution it's very likely you'll have a big screen, which means AA is still needed... Now if you were gonna play 2048x1536 on a 17" monitor, sure, AA wouldn't be needed.

With a 24" or 30" screen, I bet that even at 4096x3072 you would still notice jaggies.
 

beggerking · Golden Member · Jan 15, 2006
Originally posted by: ShadowOfMyself
Originally posted by: beggerking
Originally posted by: ElFenix
Originally posted by: beggerking
The 1900XT does perform better than the 7900GTX at specific settings (high AA/AF); that's old news. Meanwhile the 7900GTX performs better in OpenGL apps (no AA/AF, high resolution).

Who buys a 7900GTX and doesn't play with AA/AF?

At a high enough resolution, AA becomes pointless.

Bullshit. If you use a high resolution it's very likely you'll have a big screen, which means AA is still needed... Now if you were gonna play 2048x1536 on a 17" monitor, sure, AA wouldn't be needed.

With a 24" or 30" screen, I bet that even at 4096x3072 you would still notice jaggies.

Well, how many people have 24" or 30" screens? On 20" or smaller, at 1600x1200 and above, the lack of AA isn't apparent.