"FarCry Performance Revisited: ATI Strikes Back with Shader Model 2.0b"

Page 2

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Why all the comparisons of last year's HW to this year's? I would hope the 6800 is faster. BTW, why no mention that last year's top-end nVidia card is MUCH SLOWER, sometimes 1/2 the speed, compared to the 9800XT in their benches?

Probably did it to show if the patch gives any kind of speed boost to the older cards. As for the comparison between the 9800 and 5900, it doesn't really matter. They were showing the rendering path increases, not doing a review of the 5900 vs 9800.
 

Shad0hawK

Banned
May 26, 2003
1,456
0
0
Originally posted by: Ackmed
Originally posted by: PrayForDeath
Anyone noticed how the 9800XT has double the performance of the 5950 Ultra in those benchmarks?
How is that?


The FX series was never very strong at DX9. Far Cry doesn't use that much of it, but when it does, it can put the hammer on your system.

Also, the 1.2 patch fixed many graphical issues that the 1.1 patch brought with it. It also dropped performance because it fixed those issues.

Pretty ironic that the 1.1 patch gave the FX series a huge boost, but also introduced a lot of graphical "bugs". The 1.2 patch comes out (almost) just as the 6800 series does, raising the 6800's performance a hair, and fixing the FX problems, but dropping the frames a lot.


a "hair"?

ROFL!
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: Rage187
Longhorn is slated for a 2006 release, and most people believe it will slip at least 1 yr.


So that's 4 years away.


Um, even if it does slip, that's 3 years.

... count much?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: GeneralGrievous
I don't see why/how R500 will be released so soon. It's not like R420 is being totally outdone by the NV40; any lack of sales is due to lack of card output, not a poor design.

Teh Xbox2.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: Genx87
Why all the comparisons of last year's HW to this year's? I would hope the 6800 is faster. BTW, why no mention that last year's top-end nVidia card is MUCH SLOWER, sometimes 1/2 the speed, compared to the 9800XT in their benches?

Probably did it to show if the patch gives any kind of speed boost to the older cards. As for the comparison between the 9800 and 5900, it doesn't really matter. They were showing the rendering path increases, not doing a review of the 5900 vs 9800.
I was wondering about that, but I don't see any "2.0b" 9800 tests. My point is they seem to go out of their way several times to note how slow the 9800XT is compared to new cards, but no mention that the 5950 is half the speed of the one that is "a Galaxy behind"?
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I think everyone has pretty much written off the FX series for DX9 performance. At $300, your choices are the 9800XT, 5950U, and 6800, so xbit ignored the 5950U and just compared the 9800XT and 6800.

Anywho, the X800s get a decent boost, but there are a few benches where the 6800s just demolish the X800s in terms of minimum fps--very impressive, considering the FX series.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Just to clarify, MS has said that DX10's release will coincide with Longhorn's release, which is slated for fall 2006.

To have ANY card that would support DX10 come out this year, or even next year, would be completely pointless.
 

PrayForDeath

Diamond Member
Apr 12, 2004
3,478
1
76
Originally posted by: oldfart
Originally posted by: Genx87
Why all the comparisons of last year's HW to this year's? I would hope the 6800 is faster. BTW, why no mention that last year's top-end nVidia card is MUCH SLOWER, sometimes 1/2 the speed, compared to the 9800XT in their benches?

Probably did it to show if the patch gives any kind of speed boost to the older cards. As for the comparison between the 9800 and 5900, it doesn't really matter. They were showing the rendering path increases, not doing a review of the 5900 vs 9800.
I was wondering about that, but I don't see any "2.0b" 9800 tests. My point is they seem to go out of their way several times to note how slow the 9800XT is compared to new cards, but no mention that the 5950 is half the speed of the one that is "a Galaxy behind"?

Isn't SM2.0b an X800 feature?
 

biostud

Lifer
Feb 27, 2003
19,741
6,822
136
Originally posted by: PrayForDeath
Originally posted by: oldfart
Originally posted by: Genx87
Why all the comparisons of last year's HW to this year's? I would hope the 6800 is faster. BTW, why no mention that last year's top-end nVidia card is MUCH SLOWER, sometimes 1/2 the speed, compared to the 9800XT in their benches?

Probably did it to show if the patch gives any kind of speed boost to the older cards. As for the comparison between the 9800 and 5900, it doesn't really matter. They were showing the rendering path increases, not doing a review of the 5900 vs 9800.
I was wondering about that, but I don't see any "2.0b" 9800 tests. My point is they seem to go out of their way several times to note how slow the 9800XT is compared to new cards, but no mention that the 5950 is half the speed of the one that is "a Galaxy behind"?

Isn't SM2.0b an X800 feature?

Exactly
 

stnicralisk

Golden Member
Jan 18, 2004
1,705
1
0
Originally posted by: SickBeast
Originally posted by: NFactor
If Microsoft intends to release DX10 with Longhorn, then it will take until the R600 or R700 before the software is actually out.

Maybe people will just start programming for OpenGL; it will be fully capable of all of DX10's features.

OpenGL actually takes skill to program, whereas DX lays everything out for the programmer. Hopefully the D3 engine will be widely utilized, so that there remains competition between DX and OGL and both APIs rush towards next-generation ideas.
 

lowinor

Junior Member
Mar 20, 2003
21
0
0
Originally posted by: stnicralisk
OpenGL actually takes skill to program, whereas DX lays everything out for the programmer. Hopefully the D3 engine will be widely utilized, so that there remains competition between DX and OGL and both APIs rush towards next-generation ideas.

I'm curious as to where that sentiment comes from -- in my experience programming for both APIs, OpenGL is the easier one to write code for, DirectX just supports more hardware features at the API level.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: lowinor
Originally posted by: stnicralisk
OpenGL actually takes skill to program, whereas DX lays everything out for the programmer. Hopefully the D3 engine will be widely utilized, so that there remains competition between DX and OGL and both APIs rush towards next-generation ideas.

I'm curious as to where that sentiment comes from -- in my experience programming for both APIs, OpenGL is the easier one to write code for, DirectX just supports more hardware features at the API level.

IMO, OpenGL is easier to do *simple* things with. When you start getting into the extensions, it gets more complicated, whereas DirectX has more complex things built into the core API. Hopefully OpenGL 2.0 (whenever it gets approved) will address some of this.
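For anyone who hasn't dealt with the extension mechanism being discussed: a toy Python sketch of the check an OpenGL app performs before touching an extension. The extension string below is a made-up sample, not output from a real driver; in a real program you would get it from glGetString(GL_EXTENSIONS) after creating a context.

```python
# Made-up sample of what a driver might advertise; OpenGL exposes
# optional features as one big space-separated string of names.
sample_extensions = (
    "GL_ARB_multitexture GL_ARB_vertex_program "
    "GL_ARB_fragment_program GL_EXT_texture_compression_s3tc"
)

def has_extension(ext_string, name):
    """Check for a whole extension token. On Windows you would then
    also fetch the extension's entry points via wglGetProcAddress."""
    return name in ext_string.split()

# Typical pattern: fall back to a simpler path when a feature is absent.
if has_extension(sample_extensions, "GL_ARB_fragment_program"):
    print("fragment programs available")
else:
    print("falling back to fixed-function")
```

The split() matters: a naive substring test would wrongly match a prefix like "GL_ARB" against longer extension names.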
 

lowinor

Junior Member
Mar 20, 2003
21
0
0
Originally posted by: Matthias99
IMO, OpenGL is easier to do *simple* things with. When you start getting into the extensions, it gets more complicated, whereas DirectX has more complex things built into the core API. Hopefully OpenGL 2.0 (whenever it gets approved) will address some of this.
That's at least understandable, but I don't think I'd use the term 'skill' to describe what it takes to manage said extensions.

Either way, the skill in developing 3D graphics lies in algorithm design; the API has little to do with this beyond which hardware features it enables.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Nobody knows when DX10 will be out. Just pure speculation.
I would luv to see NV50/R500 being DX10 cards. Even if we, the gamers, don't have games that will support DX10 due to lack of the platform, programmers surely will have beta versions of it.
That will help them experiment on the forthcoming platform and provide better DX10 games for the future. Though I highly doubt that we will see a serious DX10 game for a long, long time.
As always, it's only when DX10 cards are out that we will start to see serious, sophisticated, heavily shadowed DX9 games.
And as someone stated, there's always OpenGL, which can help to provide DX10 features.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,207
126
Originally posted by: GeneralGrievous
Why does xbit insist on not doing AA/AF benches? These numbers are mostly worthless for comparing card to card.

It really looks like the X800 Pro gets worse day by day. It's performing much closer to 6800 than to GT levels.

Correct me if I'm wrong, General, but doesn't that make a certain amount of sense? Both the X800 Pro and the plain 6800 are 12-pipe cards.
 

sharkeeper

Lifer
Jan 13, 2001
10,886
2
0
Next spring the R500 will be out, and Longhorn probably won't be out for 18 months.

Yes, they will have a launch next spring, and cards will be on backorder for 11 months!

Cheers!
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Ok, there are a couple of points that no one has really touched on yet...

1. Far Cry, however graphically intense it is, was never built around SM3.0 code. All that happened was they extended the CryEngine code and added some features. It seems that ATI's 2.0b supports those features while not really supporting all the other new features that Nvidia supports (displacement mapping, for instance).

2. The R500 and the NV50 will probably not incorporate DX10. However, they will both have SM3.0 paths.

3. I would think that the reason they didn't do AA and AF benches right off the bat is optimizations. All AA and AF do is start arguments, with fanboys pointing fingers and saying one side cheats. There is no way to get an accurate representation of who does it for real and who does it more effectively.

4. I don't think Microsoft will release DX10 for XP. DX10 is supposedly only for Longhorn, nothing else. Therefore I don't think either ATI or Nvidia will jump the gun by that much. However, what could happen is that in the fall refresh before Longhorn they release a DX10 card; that is unlikely, though, as they would have to completely redesign the architecture.

5. There are no tried-and-true PS3.0 games out yet. Until we have one, we won't know whether 2.0b is just as good or not. But what we do know is that SM3.0 is not using all of its features yet.

Also remember that the Far Cry 1.2 patch was recalled due to a whole crap load of bugs.

-Kevin
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
Because it was a review of ATI cards using the PS2.0b rendering path. They threw in the SM3.0 6800 scores for an idea on how both optimized paths compare.

I'm sorry for not making my point so apparent, but read the first line of my post and think about it again. If they are slanted towards nVidia, they won't show the 6800nu being owned by the 9800xt. Owned because it's last gen. Suppose they showed the SM2.0 scores of the 6800, and showed it being beaten by a last-gen card.....well...that's something that a fanboy wouldn't do. If they were objective, and not biased, they would've shown the 6800's SM2.0 score. Then maybe the 6800 would be "the galaxy" behind the older card.

OTOH, you may be right. But the fact that they kept comparing the 9800xt to the 6800 really makes me think that they are nVidia fanboys.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Genx87
How do you "properly" support DX9? And exactly what would a 5900 need to support in order to "properly" support it?

I'm not an expert on this by any means, but from what I understand, DX9 requires 24-bit precision throughout the codepath in order to be fully DX9-compliant in hardware. I'm pretty sure that the NV30 can only run 16-bit precision in many situations. On top of that, from what I understand, the NV30 takes a massive hit when running shaders, which greatly impacts its DX9 performance.
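Since half vs full precision keeps coming up in the FX discussion, here's a toy Python sketch (not real shader code) of why mantissa width matters. The mantissa sizes are the real ones: FP16 has 10 bits, ATI's FP24 has 16, and FP32 has 23; everything else here is a deliberately crude simulation.

```python
import math

def round_mantissa(x, bits):
    """Round x to the given number of mantissa bits, crudely mimicking
    a lower-precision shader unit (sign and exponent left untouched)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)          # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** bits
    return math.ldexp(round(m * scale) / scale, e)

# The same value carries very different rounding error at each width.
x = 1.0 / 3.0
for label, bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    v = round_mantissa(x, bits)
    print(f"{label}: {v:.10f}  rounding error: {abs(v - x):.2e}")
```

Each extra mantissa bit roughly halves the rounding error, which is why long shader chains run at FP16 partial precision can show visible banding that FP24/FP32 hardware avoids.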
 

Viper96720

Diamond Member
Jul 15, 2002
4,390
0
0
Originally posted by: futuristicmonkey
Because it was a review of ATI cards using the PS2.0b rendering path. They threw in the SM3.0 6800 scores for an idea on how both optimized paths compare.

I'm sorry for not making my point so apparent, but read the first line of my post and think about it again. If they are slanted towards nVidia, they won't show the 6800nu being owned by the 9800xt. Owned because it's last gen. Suppose they showed the SM2.0 scores of the 6800, and showed it being beaten by a last-gen card.....well...that's something that a fanboy wouldn't do. If they were objective, and not biased, they would've shown the 6800's SM2.0 score. Then maybe the 6800 would be "the galaxy" behind the older card.

OTOH, you may be right. But the fact that they kept comparing the 9800xt to the 6800 really makes me think that they are nVidia fanboys.

So you want them showing ATI beating Nvidia. Then you can call them ATI fanboys.
 

futuristicmonkey

Golden Member
Feb 29, 2004
1,031
0
76
Originally posted by: Viper96720
Originally posted by: futuristicmonkey
Because it was a review of ATI cards using the PS2.0b rendering path. They threw in the SM3.0 6800 scores for an idea on how both optimized paths compare.

I'm sorry for not making my point so apparent, but read the first line of my post and think about it again. If they are slanted towards nVidia, they won't show the 6800nu being owned by the 9800xt. Owned because it's last gen. Suppose they showed the SM2.0 scores of the 6800, and showed it being beaten by a last-gen card.....well...that's something that a fanboy wouldn't do. If they were objective, and not biased, they would've shown the 6800's SM2.0 score. Then maybe the 6800 would be "the galaxy" behind the older card.

OTOH, you may be right. But the fact that they kept comparing the 9800xt to the 6800 really makes me think that they are nVidia fanboys.

So you want them showing ATI beating Nvidia. Then you can call them ATI fanboys.

That's a very ignorant thing to say.

If they were to bench the 6800nu using SM2.0, and the 9800xt beat it even worse, they wouldn't be fanboys. They are fanboys because they didn't show a benchmark that could've shown the 9800xt (a last-gen card) beating a 6800nu (a next/current-gen card), and because they kept saying things that would have you believe that the 9800xt and 6800nu are in the same category. That is just stupid. These are things that a fanboy would do: they would only put their side of the story in, only showing benchmarks where "their" brand of card beats the other one. These people are journalists, and it is their job to publish the whole story.
 

quikah

Diamond Member
Apr 7, 2003
4,178
729
126
NV30 can run 32-bit precision in all situations; therefore it is DX9-compliant. It is just slow running 32-bit, so you try to run 16-bit where possible.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: lowinor
Originally posted by: Matthias99
IMO, OpenGL is easier to do *simple* things with. When you start getting into the extensions, it gets more complicated, whereas DirectX has more complex things built into the core API. Hopefully OpenGL 2.0 (whenever it gets approved) will address some of this.
That's at least understandable, but I don't think I'd use the term 'skill' to describe what it takes to manage said extensions.

Either way, the skill in developing 3D graphics lies in algorithm design; the API has little to do with this beyond which hardware features it enables.

I didn't say it took 'skill' (perhaps you're attributing someone else's quote to me?), just that it was more complicated. :p

Certainly both APIs are comparable; there's only so many ways to do 3D rendering.