X1900 still lacking detail in HL2

Wreckage

Banned
Jul 1, 2005
5,529
0
0
This was a problem with older ATI cards, but it seems it did not get fixed in the newer X1900 cards or with updates to HL2 & Catalyst.

See for yourself in these screenshots
http://www.hothardware.com/viewarticle.aspx?page=8&articleid=777&cid=2

If you open each of the standard shots individually and skip through them quickly, you're likely to notice a bit more detail in the shots taken with the GeForce 7800 GTX versus those taken with the Radeon using its standard angle-dependent anisotropic filtering mode, disregarding artifacts produced by the JPG compression.

The same seemed to be true when inspecting the 16x aniso images. Of course, image quality analysis is subjective by its nature, but based on these images, we think the GeForce 7800 GTX has the better image quality as it relates to anisotropic filtering when standard "optimized" aniso is used.

A possible explanation from Ratchet @ Rage3D
http://www.rage3d.com/board/showpost.php?p=1333964749&postcount=10

If ATI keeps rendering at lower detail, it would rule out using HL2 as a benchmark and might explain their numbers in the Source engine.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Of course, from the next page:

As we demonstrated on the previous page, ATI's high-quality anisotropic filtering modes offer arguably the best anisotropic filtering available in a consumer level graphics card. And the performance data on this page shows that there is virtually no reason to have it disabled and use the lower quality setting. ATI's high-quality aniso modes perform just barely below the comparable "standard" modes in both games, and have a minimal impact on performance versus just using trilinear filtering.

So I'm not sure how relevant it really is that NVIDIA's angle-dependent AF is maybe a hair better than ATI's angle-dependent AF. It's a pretty subtle difference in those screenshots.

I'm having trouble accessing that second link right now; I'll have to check it out later. I can't see any differences in object detail in the screenshots in the first article.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I'm looking at the 16x angle-dependent AF screens, and I see almost no difference. Certainly nothing to hint that one has noticeably less detail. Why don't you (Wreckage) explain exactly where in those screens the 7800 has more detail...
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Wreckage
Originally posted by: Matthias99
I can't see any differences in object detail in the screenshots in the first article.
More screenshots.

http://rage3d.com/reviews/video/atix1800xt/afcomp.php

Missing lines, bushes, branches, fence details, etc.

Link worked better that time.

Only thing I see is those couple bushes on the right (which do seem to be missing... weird).

There are some lighting highlights on the ground (closer to the train in the background) that are different, but it's hard to tell from that shot if they're 'wrong' on the R520. Not sure which 'branches' and 'fence details' you are referring to.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
I can't see any real difference between the default quality AF screenshots on either card. The missing bushes are definitely noticeable though.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: Matthias99
Originally posted by: Wreckage
Originally posted by: Matthias99
I can't see any differences in object detail in the screenshots in the first article.
More screenshots.

http://rage3d.com/reviews/video/atix1800xt/afcomp.php

Missing lines, bushes, branches, fence details, etc.

Link worked better that time.

Only thing I see is those couple bushes on the right (which do seem to be missing... weird).

There are some lighting highlights on the ground (closer to the train in the background) that are different, but it's hard to tell from that shot if they're 'wrong' on the R520. Not sure which 'branches' and 'fence details' you are referring to.


The lights on the ground are artifacts. Ratchet makes that clear in the thread.

 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
There probably isn't much difference in still game screenshots, but when playing fast FPS games with 16x AF the X1900XT(X) would run much smoother than the 7900GT(X).
 

samusaraniv

Member
Mar 12, 2006
119
0
0
I'm with Munky on this one -- the difference between the 16x NVIDIA/ATI AF is most definitely nominal, hardly noticeable at all. Perhaps I'm blind...

In motion, perhaps I would see more of a difference between the two.
 

Steelski

Senior member
Feb 16, 2005
700
0
0
Originally posted by: Wreckage
This was a problem with older ATI cards, but it seems it did not get fixed in the newer X1900 cards or with updates to HL2 & Catalyst.

See for yourself in these screenshots
http://www.hothardware.com/viewarticle.aspx?page=8&articleid=777&cid=2

If you open each of the standard shots individually and skip through them quickly, you're likely to notice a bit more detail in the shots taken with the GeForce 7800 GTX versus those taken with the Radeon using its standard angle-dependent anisotropic filtering mode, disregarding artifacts produced by the JPG compression.

The same seemed to be true when inspecting the 16x aniso images. Of course, image quality analysis is subjective by its nature, but based on these images, we think the GeForce 7800 GTX has the better image quality as it relates to anisotropic filtering when standard "optimized" aniso is used.

A possible explanation from Ratchet @ Rage3D
http://www.rage3d.com/board/showpost.php?p=1333964749&postcount=10

If ATI keeps rendering at lower detail, it would rule out using HL2 as a benchmark and might explain their numbers in the Source engine.

Whereabouts is this obvious loss of detail you speak of? Do you mean the crappy way that Source renders thin things like trees on ATI cards with AF on? You can check it out by comparing shots where AF is not on.
The only other artifact I could find was in the 16x AF shot on the NVIDIA card, at the bottom right, where the large area is at an awkward angle so the card can't detect it... oops, it's not filtered.
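For what it's worth, the "angle-dependent" optimization everyone is arguing about can be sketched in a few lines. This is a toy model only: real hardware uses its own fixed-function approximation, and the 45-degree "preferred" angles and falloff curve below are assumptions for illustration, not ATI's or NVIDIA's actual logic.

```python
def af_samples(max_aniso, surface_angle_deg):
    """Toy model of angle-dependent AF: the full anisotropy level is
    applied only near multiples of 45 degrees; surfaces tilted away
    from those preferred angles get progressively fewer samples."""
    # Distance (in degrees) to the nearest preferred angle (multiple of 45)
    off_axis = min(surface_angle_deg % 45, 45 - (surface_angle_deg % 45))
    # Scale the anisotropy level down as the surface tilts off-axis:
    # 0 degrees off -> full level, 22.5 degrees off -> the minimum.
    scale = 1.0 - (off_axis / 22.5) * 0.75
    return max(1, round(max_aniso * scale))

# Axis-aligned surfaces (floors, head-on walls) get the full 16x...
print(af_samples(16, 0))    # -> 16
# ...but a surface at an in-between angle gets far fewer samples,
# which is where blur on oddly-angled geometry comes from.
print(af_samples(16, 22))   # -> 4
```

This is also why AF tester tools show a "flower" pattern on angle-dependent hardware: filtering quality dips between the preferred angles.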
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: RobertR1
The lights on the ground are artifacts. Ratchet makes that clear in the thread.

I couldn't figure out what you were talking about at first, but Wreckage linked just one post of the thread, not the whole thing.

link to whole thread at Rage3D

Apparently the lighting on the ground is a rendering bug. The missing bushes are pretty strange, though. Didn't seem to be any followup on it...

 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Anyone know if the HL2 problems have changed with newer drivers, since that HotHardware article is two months old?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Sounds like you're making a mountain out of a molehill. Both the ATI and the NV cards do a terrible job with only trilinear filtering on; NV's 16x AF (regular quality) looks slightly better than ATI's; and ATI's HQ AF is better than NV's, which has no high-quality mode to compare against.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Not to mention, you get your eyes poked with needles when using Nvidia cards.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
The nVidia bad shadow thing in Farcry all over again. Fanboys cry out by the millions. Real users yawn.

/thread.
 

FalllenAngell

Banned
Mar 3, 2006
132
0
0
Originally posted by: akugami
The nVidia bad shadow thing in Farcry all over again. Fanboys cry out by the millions. Real users yawn.

/thread.

LOL

This is interesting though; it makes all of ATI's past victories in HL2 seem sort of diminished if they weren't rendering the whole scene.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: FalllenAngell
Originally posted by: akugami
The nVidia bad shadow thing in Farcry all over again. Fanboys cry out by the millions. Real users yawn.

/thread.

LOL

This is interesting though; it makes all of ATI's past victories in HL2 seem sort of diminished if they weren't rendering the whole scene.

The ATI fanboys extended the Farcry thread to dozens of pages, yet they think this thread is pointless.

They also say how good the AF is and yet there seems to be no difference other than the removal of detail :roll:
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: Wreckage
Originally posted by: FalllenAngell
Originally posted by: akugami
The nVidia bad shadow thing in Farcry all over again. Fanboys cry out by the millions. Real users yawn.

/thread.

LOL

This is interesting though; it makes all of ATI's past victories in HL2 seem sort of diminished if they weren't rendering the whole scene.

The ATI fanboys extended the Farcry thread to dozens of pages, yet they think this thread is pointless.

They also say how good the AF is and yet there seems to be no difference other than the removal of detail :roll:

I see, so the ATI fanboys decried the bad shadow rendering on the gun in Farcry, and the nVidia fanboys spent time arguing with them about how relevant it was to the enjoyment of the game. Of course the nVidia fanboys have to return the favor so they can start another thread arguing over another meaningless feature. I always felt the cries of outrage over nVidia's shadow rendering problem in Farcry were more drama than any real problem, and this problem seems to be the same thing.

Like I said, the nVidia Farcry shadow rendering bug all over again.
/thread