FiringSquad does image quality comparisons...again


schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: ronnn
Originally posted by: schneiderguy
Originally posted by: ronnn
A review that shows static screen shots of ancient games

Between HL2 and CoD2 they have over 50,000 players playing at one time. I guess you would rather have them test a game like FEAR, which has 1/50th of the players? :confused:


Most of them are 12-year-olds playing on the antique in the rec room. Are you guys really trying to deny that ATI has superior IQ? Just wait for the G80: default settings will be good and the tune will change. :music:

Edit: Actually just bothered with your link; I see that Half-Life (by which I assume Counter-Strike) is the big winner. Wow, let's have an article that compares the IQ of Counter-Strike between a GeForce2 MX and an X1950 XTX! Should be entertaining.

No, I don't deny ATI has superior IQ.

But why buy a $500 video card if you don't play games? :confused: CS and CS:S are by far the most popular games, "old" or not. A LOT of my friends have very up-to-date machines and still play CS 1.6. Why? It's a hell of a lot more fun than any up-to-date game with shiny graphics (i.e. FEAR, etc.)

 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
And why buy a video card that has lower IQ, when both will play those games at great frame rates, as well as supply extra features for other games?
 

nib95

Senior member
Jan 31, 2006
997
0
0
Tbh I'm considering picking up an X1950 XTX, as my SLI 7900 GTs give me banding issues :(
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: gersson
Originally posted by: nib95
Tbh I'm considering picking up an X1950 XTX, as my SLI 7900 GTs give me banding issues :(

I think that's your monitor's fault.


That is the obvious assumption.
But I am pretty sure I didn't get the same banding problems from my X1900 XTX.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Wreckage
So the IQ between the two companies is just about equal? I guess this should remove the whole IQ FUD from the arguments we all love to have here. :p

Although somehow I doubt it. :confused:

Most of the IQ arguments here are dealing with the driver IQ settings and how they relate to frame rates.

Nvidia's AA is better when it's not taking as big a hit as ATI's AA. While it is heavier, the option to use 8xS AA is generally discarded when enabling "High Quality" in the driver along with it. Much the same is true of ATI's 6xAA with Quality Adaptive AA. ATI's transparency AA doesn't match Nvidia's TrAA, especially when using AA alongside it. In some cases, I found it to bring as much as a 25% decrease in performance on my X1900XTX.

With AF, Nvidia's is only inferior due to its inability to be angle-independent. That, alongside ATI's more vibrant and/or warmer colors, gives ATI the edge in AF.

Overall I think it's time for some new tech to hit. I'm well tired of these constantly recurring debates about tech that has been around as long as the 7 series. (The X1k series is starting to get very old as well.)
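
To put a hit in that range in perspective, here is a minimal sketch; only the 25% figure comes from the post above, and the baseline frame rates are made up:

```python
# Minimal sketch: what a ~25% adaptive/transparency AA hit means in practice.
# Only the 25% figure comes from the post; the baseline frame rates are invented.
def fps_after_hit(base_fps: float, hit_fraction: float) -> float:
    """Frame rate after a fractional performance hit (0.25 == 25%)."""
    return base_fps * (1.0 - hit_fraction)

for base in (100.0, 60.0, 40.0):
    print(f"{base:5.1f} fps -> {fps_after_hit(base, 0.25):5.1f} fps with AA enabled")
```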
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: josh6079
Originally posted by: Wreckage
So the IQ between the two companies is just about equal? I guess this should remove the whole IQ FUD from the arguments we all love to have here. :p

Although somehow I doubt it. :confused:

Most of the IQ arguments here are dealing with the driver IQ settings and how they relate to frame rates.

Nvidia's AA is better when it's not taking as big a hit as ATI's AA. While it is heavier, the option to use 8xS AA is generally discarded when enabling "High Quality" in the driver along with it. Much the same is true of ATI's 6xAA with Quality Adaptive AA. ATI's transparency AA doesn't match Nvidia's TrAA, especially when using AA alongside it. In some cases, I found it to bring as much as a 25% decrease in performance on my X1900XTX.

With AF, Nvidia's is only inferior due to its inability to be angle-independent. That, alongside ATI's more vibrant and/or warmer colors, gives ATI the edge in AF.

Overall I think it's time for some new tech to hit. I'm well tired of these constantly recurring debates about tech that has been around as long as the 7 series. (The X1k series is starting to get very old as well.)

Sorry sir, we're getting right on to that!

*rushes off to make josh his very own new tech video card*
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: dug777
Sorry sir, we're getting right on to that!

*rushes off to make josh his very own new tech video card*

As long as you don't paper launch it and it brings something new to the screen, I'll buy. Can you have it out before December?
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: josh6079
Sorry sir, we're getting right on to that!

*rushes off to make josh his very own new tech video card*
As long as you don't paper launch it and it brings something new to the screen, I'll buy. Can you have it out before December?

Better still, you can set your own launch date sir!
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: ronnn
And why buy a video card that has lower IQ, when both will play those games at great frame rates, as well as supply extra features for other games?


Can you possibly stop addressing questions with more questions? It really leads to nowhere.

Furthermore, the FS article just solidifies my theory that the reason Nvidia takes such a large performance hit when quality settings go up is exactly that: quality. Nvidia renders everything. You see, everything. You can see in the magnified shots that ATI does not render everything as Nvidia does. This would greatly explain Nvidia's performance hits when image quality settings are ramped up. ATI does not seem to render every single detail as Nvidia does. And of course, the details are what make up the entire scene being rendered. I'm just going to throw hypothetical numbers out here, so don't start an uproar sounding like "Where did you get those numbers from!!! Where's your proof!!! Nvidiot!!!!"

I say again, hypothetical.

Nvidia renders 100% of detail for unlimited distance in a scene and ATI renders 85% of detail for limited distance. Then crank up the resolutions and IQ settings. It is exponentially more work for the NV card than the ATI card to maintain equally high framerates because it renders everything at unlimited distance. ATI cuts corners and can maintain a higher framerate because it does not render the little details that go unnoticed until magnified, as FS did.
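
As a toy sketch of that hypothetical in code: the 100%/85% split matches the numbers above, but the draw distances, the quadratic growth of visible objects, and the IQ scale factors are all invented for illustration.

```python
# Toy model of the hypothetical above: per-frame work proportional to
# (fraction of detail rendered) x (objects visible within the draw distance).
# Every number here is invented for illustration, not a measurement.
def frame_work(detail_fraction: float, draw_distance: float, iq_scale: float) -> float:
    """Relative per-frame cost; visible objects are assumed to grow roughly
    quadratically with draw distance, and iq_scale stands in for
    resolution/quality settings multiplying the per-object cost."""
    visible_objects = draw_distance ** 2
    return detail_fraction * visible_objects * iq_scale

for iq_scale in (1.0, 2.0, 4.0):  # cranking up resolution / IQ settings
    nv = frame_work(1.00, 10.0, iq_scale)   # "renders everything", long draw distance
    ati = frame_work(0.85, 8.0, iq_scale)   # hypothetical corner-cutting
    print(f"IQ scale {iq_scale}: NV work {nv:7.1f} vs ATI work {ati:7.1f} "
          f"(NV does {nv / ati:.2f}x as much)")
```

In this toy model the NV/ATI ratio stays fixed while the absolute extra work grows with the settings; it illustrates the shape of the argument, not its truth.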

This explains a lot for me. ATI fans can kick and scream over this and suddenly say FS is paid off, but I don't believe so.

After seeing this article, it just reinforces my opinion on this. You guys are entitled to yours as I am to mine. But no amount of word games will change my mind after this. Enjoy.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: josh6079
Sorry sir, we're getting right on to that!

*rushes off to make josh his very own new tech video card*
As long as you don't paper launch it and it brings something new to the screen, I'll buy. Can you have it out before December?

If the rumors are true, you can get a G80 by then.

R600, I heard, was December or even Q1 '07. Maybe they will "launch" it in December and then you can "buy" it next year.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Originally posted by: Wreckage
Originally posted by: josh6079
Sorry sir, we're getting right on to that!

*rushes off to make josh his very own new tech video card*
As long as you don't paper launch it and it brings something new to the screen, I'll buy. Can you have it out before December?

If the rumors are true, you can get a G80 by then.

R600, I heard, was December or even Q1 '07. Maybe they will "launch" it in December and then you can "buy" it next year.

Funny you say that, 'cos the same can be true for Nvidia: "launches" in October, available in November :p
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Yeah, wasn't the G80 supposed to be out in August? Instead, the 7950 GT looks to be the only thing in sight from Nvidia for at least the immediate future.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: keysplayr2003
Originally posted by: ronnn
And why buy a video card that has lower IQ, when both will play those games at great frame rates, as well as supply extra features for other games?


Can you possibly stop addressing questions with more questions? It really leads to nowhere.

Have a link to support this, or just a case of shooting the messenger?

I say again, hypothetical.

Nvidia renders 100% of detail for unlimited distance in a scene and ATI renders 85% of detail for limited distance. Then crank up the resolutions and IQ settings. It is exponentially more work for the NV card than the ATI card to maintain equally high framerates because it renders everything at unlimited distance. ATI cuts corners and can maintain a higher framerate because it does not render the little details that go unnoticed until magnified, as FS did.

HL2 is the only game I know of (not saying there are not others) where ATI has a shorter draw distance. I don't know if that is a driver issue or if Valve has to change the in-game settings for ATI cards. I do agree that both companies cut corners to maintain high fps. It just appears that Nvidia is cutting more corners and is being caught out at this time. At least as far as default settings go.


This explains a lot for me. ATI fans can kick and scream over this and suddenly say FS is paid off, but I don't believe so.

Not sure who is kicking and screaming (link?) ;) I would say the majority of review sites have a vested interest, as they accept advertising from these companies. While I would be surprised by an overt payoff, I do think most review sites are careful not to overly antagonize big advertisers, while maintaining good traffic.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: ronnn
Originally posted by: keysplayr2003
Originally posted by: ronnn
And why buy a video card that has lower IQ, when both will play those games at great frame rates, as well as supply extra features for other games?


Can you possibly stop addressing questions with more questions? It really leads to nowhere.

Have a link to support this, or just a case of shooting the messenger?

I say again, hypothetical.

Nvidia renders 100% of detail for unlimited distance in a scene and ATI renders 85% of detail for limited distance. Then crank up the resolutions and IQ settings. It is exponentially more work for the NV card than the ATI card to maintain equally high framerates because it renders everything at unlimited distance. ATI cuts corners and can maintain a higher framerate because it does not render the little details that go unnoticed until magnified, as FS did.

HL2 is the only game I know of (not saying there are not others) where ATI has a shorter draw distance. I don't know if that is a driver issue or if Valve has to change the in-game settings for ATI cards. I do agree that both companies cut corners to maintain high fps. It just appears that Nvidia is cutting more corners and is being caught out at this time. At least as far as default settings go.


This explains a lot for me. ATI fans can kick and scream over this and suddenly say FS is paid off, but I don't believe so.

Not sure who is kicking and screaming (link?) ;) I would say the majority of review sites have a vested interest, as they accept advertising from these companies. While I would be surprised by an overt payoff, I do think most review sites are careful not to overly antagonize big advertisers, while maintaining good traffic.


Kicking and screaming = Figure of speech. Not saying anyone really is at this point, but that will happen randomly.

And I totally do not understand your "shooting the messenger" statement. Beyond that, I have no clue what kind of link you are asking for.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Nice to see some videos that clearly show the shimmering problem NV can have. I've said it for over a year, got called a liar, a FUD spreader, and everything else you can think of. Now multiple review sites are finally acknowledging the problem. It's easy to see in the videos, and it's much easier to see (I couldn't ignore it) on a 24" LCD right in front of you. As I've said many times.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: dug777
Originally posted by: josh6079
Sorry sir, we're getting right on to that!

*rushes off to make josh his very own new tech video card*
As long as you don't paper launch it and it brings something new to the screen, I'll buy. Can you have it out before December?

Better still, you can set your own launch date sir!

Okay, ya ready? I'll go ahead and launch it now and do some reviews of it, but it will be available on retail shelves on Sept 14th, 2012. (So it's not a paper launch, okay?)

X4950XG4

- 128 texture/vertex/pixel unified shader pipelines.
- 45 nm die process.
- 20 W peak power consumption.
- 4 GB GDDR6 memory.
- HDVP, HDFP, and MMC ("Magic Movie Condom") support. (The others just sound good and will probably be a form of useless anti-piracy firmware as well.)

Located on the same PCB substrate:

D8800 Core 4 Quadro
- ..........you get the idea

Performance

- Prey: 999 fps minimum
- HL2: 999 fps minimum
- FEAR: 997 fps minimum
- Oblivion: 998 fps minimum
- Quake Wars: 80 fps minimum, due to EA's coding and horrible patches
- Crysis: 70 fps minimum, due to EA's coding and horrible patches
- Project Offset: 999 fps minimum, because EA wasn't involved

Estimated MSRP $700, practically a steal.

Pics coming......................uh..................not so soon.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: josh6079
Using a 7800GT @570/1300 right now. TrAA is better than AAA. Although I am missing the vibrance and better AF that I had. :(

Turn on digital vibrance very slightly; it'll improve the color palette for games.
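
As a loose analogy for what a small vibrance bump does: digital vibrance behaves roughly like a mild saturation boost. The sketch below applies one to an image file with Pillow, not through the actual driver setting; the 1.1 factor and the file names are made up.

```python
# Loose analogy only: digital vibrance behaves roughly like a mild saturation
# boost, applied here to an image file rather than by the NVIDIA driver.
from PIL import Image, ImageEnhance

img = Image.open("screenshot.png")              # hypothetical input file
vibrant = ImageEnhance.Color(img).enhance(1.1)  # Color == saturation; 1.0 = unchanged
vibrant.save("screenshot_vibrant.png")
```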
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Ackmed
Nice to see some videos that clearly show the shimmering problem NV can have.
Of course ignoring the many people who have complained about shimmering on ATI cards. :roll:


I've said it for over a year, got called a liar, a FUD spreader, and everything else you can think of.
Hey, if the shoe fits.....

Now multiple review sites are finally acknowledging the problem. It's easy to see in the videos, and it's much easier to see (I couldn't ignore it) on a 24" LCD right in front of you. As I've said many times.

Blah, blah, blah, we know your song; too bad it's on a broken record.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Acanthus
Originally posted by: josh6079
Using a 7800GT @570/1300 right now. TrAA is better than AAA. Although I am missing the vibrance and better AF that I had. :(

Turn on digital vibrance very slightly; it'll improve the color palette for games.

Ah, good point. I'll try that.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
HQ AF may be superior, but how about default AF of ATi against default AF of NV? I'm sure NV has the better angle-dependent AF. But nothing really to be proud of... lol. To me, on default settings NV's AF shows better detail than ATi's.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Ackmed
And yeah, they did gloss over shimmering. At least it's mentioned for a change though.

Now you're saying:

Nice to see some videos that clearly show the shimmering problem NV can have.

Where are these videos? Neither the OP's link nor anyone else's post provided them. I'm not saying you're wrong, but where did that come from?

Originally posted by: CookieMonster
HQ AF may be superior, but how about default AF of ATi against default AF of NV? I'm sure NV has the better angle-dependent AF. But nothing really to be proud of... lol. To me, on default settings NV's AF shows better detail than ATi's.
The way both cards filter with angle-dependent AF at a given anisotropic setting does not differ in the slightest. Only when utilizing ATI's "HQAF" (angle-independent) do they differ.
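
To illustrate what angle dependence means here, a toy sketch: classic angle-dependent AF reduced the maximum anisotropy on surfaces tilted off the screen axes, while HQAF applies the full level regardless of angle. The cosine falloff and all numbers below are invented, since real hardware used more complicated (often "flower"-shaped) patterns.

```python
# Toy illustration of angle-dependent vs angle-independent (HQAF-style) AF.
# The cosine falloff is invented; real hardware used more complex patterns.
import math

def effective_af(max_af: int, surface_angle_deg: float, angle_dependent: bool) -> float:
    """Anisotropy actually applied to a surface tilted off the screen axes."""
    if not angle_dependent:              # angle-independent: full AF everywhere
        return float(max_af)
    # Angle-dependent: full AF at 0/90 degrees, falling off toward 45 degrees.
    falloff = abs(math.cos(math.radians(2.0 * surface_angle_deg)))
    return max(2.0, max_af * falloff)

for angle in (0.0, 22.5, 45.0, 67.5, 90.0):
    print(f"{angle:5.1f} deg: dependent {effective_af(16, angle, True):5.1f}x, "
          f"independent {effective_af(16, angle, False):5.1f}x")
```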

I tried the vibrance settings. Boy, you don't need to move them much.....
Thanks for the tips.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Of course ignoring the many people who have complained about shimmering on ATI cards.
Whatever shimmer ATi has, it's vastly less than nVidia's at default settings.

The way both cards filter with angle-dependent AF at a given anisotropic setting does not differ in the slightest.
I assume you're referring to HQ nVidia AF against standard ATi AF, right?

Because Q nVidia AF most certainly does look different to standard ATi AF.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: BFG10K
Of course ignoring the many people who have complained about shimmering on ATI cards.
Whatever shimmer ATi has, it's vastly less than nVidia's at default settings.

The way both cards filter with angle-dependent AF at a given anisotropic setting does not differ in the slightest.
I assume you're referring to HQ nVidia AF against standard ATi AF, right?

Because Q nVidia AF most certainly does look different to standard ATi AF.

In what ways?