FiringSquad's new article


Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: nemesismk2
Originally posted by: PrayForDeath
Originally posted by: nemesismk2
It looks like I made the right choice in getting a 6800 GT: it gives me ATI-beating Doom3 performance, and when the new 70-series drivers from nvidia are released, ATI-beating Half Life 2 performance as well! :D

How can you say that when they haven't posted X800 benchies yet?

I can say it because I HAVE seen results, and the 6800 GT with the latest drivers outperforms the X800 Pro. I guess you have not seen the leaked Half Life 2 performance reviews?

Which happens to be about 3 FPS difference at most....

But over in the UK we can get a pre-modded 16-pipe Pro, which costs nearly the same as the regular Pros, so I don't know, I think I might go for one; it should be able to make up that 3 FPS with the extra 4 pipes.

 

Childs

Lifer
Jul 9, 2000
11,313
7
81
I can't really tell the difference between DX8.1 and 9 in those screenshots. Not enough to justify a new video card if you have an FX. I would think the differences during gameplay would be even less noticeable.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Dman877
Originally posted by: apoppin
Originally posted by: Dman877
Summary: Using DX9 in HL2 over DX8.1 on an FX halves your framerate, while ATI cards take a 15 - 20% hit.
Pretty accurate summary . . . :roll:
Once DirectX 9 is enabled, GeForce FX cards took a significant performance hit in our testing. For GeForce FX 5700 Ultra and 5600 Ultra, we witnessed performance declines of up to 2.5 times versus running the DirectX 8.1 path in a couple of cases with the video stress test. In comparison, RADEON 9600 XT's worst-case scenario was a performance decline of 23% at 1600x1200 with 4xAA and 8xAF. On the high-end cards, GeForce FX 5950 Ultra performance dropped by a factor of two once the DirectX 9 path was enabled (versus RADEON 9800 XT's 10-27%). Essentially, enabling the DX9 path with GeForce FX cards knocks your frame rate in half with Valve's video stress test; the performance dropoffs are sometimes even worse for GeForce FX in Counter-Strike: Source beta. Just take a look at the trilinear benchmarks on page 7. It isn't pretty for GeForce FX at 1024x768 and 1280x1024 with the DX9 path enabled.


Um.. what's the problem?
no problem . . . a compliment, really, for a most succinct summary . . .
"Pretty accurate summary"

. . .

:roll:

now one gets to agree or disagree with HL2 optimizations for ati cards

:p
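
(Aside, for anyone who wants the excerpt's ratios in plain percentage terms: a quick sketch. The 100 fps baseline below is invented; only the 2.5x and 2x ratios come from the quoted text.)

# Convert the quoted "decline" ratios into percentage drops.
# The 100 fps baseline is made up; only the ratios come from the excerpt.
def percent_drop(dx81_fps, dx9_fps):
    return 100.0 * (dx81_fps - dx9_fps) / dx81_fps

print(percent_drop(100, 100 / 2.5))  # "up to 2.5 times" slower -> 60% drop (FX 5700/5600 Ultra)
print(percent_drop(100, 100 / 2.0))  # "factor of two" -> 50% drop, i.e. "frame rate in half"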

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Batelauer:

A lot of people own FX series cards. If you just bought a new FX card a few months ago, why would you want to upgrade again so soon? Assuming you don't have unlimited funds to throw around like some people.

I don't understand what you're trying to say here? What does people not wanting to upgrade again have to do with anything you quoted me saying?


Mostly, I just wanted the article to show all the current gen cards; that's more relevant than last year's cards. The people with last year's cards are stuck anyway; the only people I can see this article benefiting are people considering last year's cards.*

BTW - I like how 4 of the 6 effects added by DX9 in this game are water effects. I had no idea this game had such a nautical slant. :roll:

*The recent FX owner is better off with this game than the 9800XT owner is with Doom3. The FX owner can run DX 8.1 and double his performance.

 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
*The recent FX owner is better off with this game than the 9800XT owner is with Doom3. The FX owner can run DX 8.1 and double his performance.

More blah! blah! blah! from Rollo. So are you saying that DX 8.1 is better than DX9 now? Weren't you the one claiming that SM 3.0 cards are better than SM 2.0 cards because they're future-proof, right? So now you're saying that NVIDIA DX 8.1 cards are better than DX 9.0 cards. And you want me to believe you're not an NV fanboy. :roll:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: MegaWorks
*The recent FX owner is better off with this game than the 9800XT owner is with Doom3. The FX owner can run DX 8.1 and double his performance.

More blah! blah! blah! from Rollo. So are you saying that DX 8.1 is better than DX9 now? Weren't you the one claiming that SM 3.0 cards are better than SM 2.0 cards because they're future-proof, right? So now you're saying that NVIDIA DX 8.1 cards are better than DX 9.0 cards. And you want me to believe you're not an NV fanboy. :roll:

I say nothing of the sort.

To say an FX owner is in a better position with this game than a 9800 owner is with Doom 3 is true, oh quick-to-judge one.

If 9800 owners could pick 95% of the visuals in Doom 3 and double their framerates to the 60s at 10X7 4X8X, there would be no ATI issue with Doom 3 at all.

I also never said DX8.1 is better than DX9.

You want me to believe you're not on a vendetta to pick apart anything I say when you apparently don't even understand it? :roll:


 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: MegaWorks
*The recent FX owner is better off with this game than the 9800XT owner is with Doom3. The FX owner can run DX 8.1 and double his performance.

More blah! blah! blah! from Rollo. So are you saying that DX 8.1 is better than DX9 now? Weren't you the one claiming that SM 3.0 cards are better than SM 2.0 cards because they're future-proof, right? So now you're saying that NVIDIA DX 8.1 cards are better than DX 9.0 cards. And you want me to believe you're not an NV fanboy. :roll:


I think it speaks more to Valve's product support. They kind of put Id to shame since Id pretty much left ATI owners out to dry.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: MegaWorks
*The recent FX owner is better off with this game than the 9800XT owner is with Doom3. The FX owner can run DX 8.1 and double his performance.

More blah! blah! blah! from Rollo. So are you saying that DX 8.1 is better than DX9 now? Weren't you the one claiming that SM 3.0 cards are better than SM 2.0 cards because they're future-proof, right? So now you're saying that NVIDIA DX 8.1 cards are better than DX 9.0 cards. And you want me to believe you're not an NV fanboy. :roll:
LOL. Yeah, very typical tRollo nV fanboy stuff. To him, it's always nV>ATi. He spins everything that way. Even if the R3xx is FASTER in DX 9 than the nV3x is in DX8.1 (which it is in this article), he still manages to spin it toward nVidia. :roll:

The other important fact is that as soon as new cards come out, the previous gen cards all self-destruct and no longer matter. I mean, no one still owns an R3xx or nV3x based card anymore, right?

As for the article, I agree the choice of cards is a bit odd. The only thing I can think of is those were the cards they had on hand at the time of testing.

I would think they should either have a low-end and a high-end test, or bench all the cards in one review.
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
Originally posted by: gururu
I think it speaks more to Valve's product support. They kind of put Id to shame since Id pretty much left ATI owners out to dry.

How do you figure? In Doom3 everyone uses the default codepath, and you can specify special codepaths for the older cards. Exactly how is id put to shame?
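
(Loose illustration of what "default codepath vs. special codepaths" means here. The path names below match the back-ends Carmack described publicly for Doom 3, but the capability checks and selection logic are just a sketch, not id's actual code.)

# Hypothetical Doom 3-style render-path fallback chain.
# ARB2 is the default path used by DX9-class cards; the others exist for older hardware.
RENDER_PATHS = [
    ("ARB2", lambda caps: caps.get("fragment_programs", False)),    # default, DX9-class
    ("R200", lambda caps: caps.get("ati_fragment_shader", False)),  # Radeon 8500-class
    ("NV20", lambda caps: caps.get("register_combiners", False)),   # GeForce3/4-class
    ("NV10", lambda caps: True),                                     # lowest common denominator
]

def pick_render_path(caps):
    # Walk the list from most to least capable and take the first supported path.
    for name, supported in RENDER_PATHS:
        if supported(caps):
            return name

# A current (2004) card advertises fragment programs, so it lands on the default path.
print(pick_render_path({"fragment_programs": True}))  # -> "ARB2"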
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: Childs
Originally posted by: gururu
I think it speaks more to Valve's product support. They kind of put Id to shame since Id pretty much left ATI owners out to dry.

How do you figure? In Doom3 everyone uses the default codepath, and you can specify special codepaths for the older cards. Exactly how is id put to shame?

It just seems that Valve recognized that FX owners might suffer and did something about it. I haven't heard anything about Id having such concerns over ATI performance. Even JC kind of snubbed the Humus tweak, acting like he didn't care if it did raise performance.
 

lordtyranus

Banned
Aug 23, 2004
1,324
0
0
To say a FX owner is in a better position with this game than a 9800 owner is with Doom 3 is true, oh quick to judge one.
According to anand's benches, the 9800 is barely behind the 5900s in Doom 3.

http://www.anandtech.com/video...c.aspx?i=2146&p=4
http://www2.hardocp.com/article.html?art=NjQyLDQ=

The only ATI card that performs truly poorly in Doom 3 for its price is the x800 Pro. I fail to see your logic in this statement.

If 9800 owners could pick 95% of the visuals in Doom 3 and double their framerates to the 60s at 10X7 4X8X, there would be no ATI issue with Doom 3 at all.
This is the truth. But the 5xxx cards don't get double the framerates of the 9xxx cards in D3.
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
Originally posted by: gururu
It just seems that Valve recognized that FX owners might suffer and did something about it. I haven't heard anything about Id having such concerns over ATI performance. Even JC kind of snubbed the Humus tweak, acting like he didn't care if it did raise performance.


Dunno about that. Anyone with a GeForce4 MX and above can play Doom 3. It's a more graphically demanding engine to boot. Carmack didn't want to play favorites, but obviously you can make tweaks for a specific architecture. He told everyone what he was going to do a long time ago; if ATI didn't listen, is that id's fault?
 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
LOL. Yeah, very typical tRollo nV fanboy stuff. To him, it's always nV>ATi. He spins everything that way. Even if the R3xx is FASTER in DX 9 than the nV3x is in DX8.1 (which it is in this article), he still manages to spin it toward nVidia. :roll:

The other important fact is that as soon as new cards come out, the previous gen cards all self-destruct and no longer matter. I mean, no one still owns an R3xx or nV3x based card anymore, right?

thank you for proving my point.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: Childs
Dunno about that. Anyone with a GeForce4 MX and above can play Doom 3. It's a more graphically demanding engine to boot. Carmack didn't want to play favorites, but obviously you can make tweaks for a specific architecture. He told everyone what he was going to do a long time ago; if ATI didn't listen, is that id's fault?

All I'm saying is that is a nice gesture on Valve's part. They didn't have to do it. There was nothing 'nice' done by Id. Sure, ATI sucks with OGL, but when the developer does what they can to overcome card limitations, it's a really cool thing for the consumer.



 

Childs

Lifer
Jul 9, 2000
11,313
7
81
Originally posted by: gururu
All I'm saying is that is a nice gesture on Valve's part. They didn't have to do it. There was nothing 'nice' done by Id. Sure, ATI sucks with OGL, but when the developer does what they can to overcome card limitations, it's a really cool thing for the consumer.


I actually thought all the NV owners were going to get the shaft after the Shader Day event. They should have mentioned then that they were going to have a DX8.1 codepath that looks just as good and is faster than the DX9 codepath. Of course, not as many people would be sportin' 9600XTs and 9800Pros with their HL2 voucher in one hand and their peckers in the other! HAHAHAHAHA j/k

Anyways, I don't really care. I guess I'm still pissed at Valve because of Steam.
 

acx

Senior member
Jan 26, 2001
364
0
71
Wow, the 6800GT loses half (50fps) of its performance going from 1024x768 (105) to 1280x1024 (51) with 4xAA/8xAF? Is that real?
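
(Back-of-the-envelope sanity check, assuming the test were purely pixel/fill-rate bound, which it may well not be; the fps numbers are the ones from the post above.)

# If fps scaled purely with pixel count, the jump from 1024x768 to 1280x1024
# should cost about 40%, not half.
pixels_low = 1024 * 768
pixels_high = 1280 * 1024
expected_fps = 105 * pixels_low / pixels_high  # ~63 fps if purely pixel-bound
print(round(expected_fps), "fps expected vs. 51 fps reported")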
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
LOL- like cheeping little monkeys the ATI support group jumps into action!

The recent FX owner is better off with this game than the 9800XT owner is with Doom3. The FX owner can run DX 8.1 and double his performance.

There is one point here, intellectually challenged people. I stated that it's preferable to run HL2 with most of the IQ at twice the frame rate than to be stuck like the 9800 owners are with Doom 3.

No more, no less. Doom 3 vs HL2, no "DX8 is better than DX9", nothing. You said it, I didn't.

My first point in the original post is that the article is disappointing because it only includes one of the five current generation cards.

I didn't say that because I thought any cards "disappeared" as one Old Fart suggested; I said it because previews for upcoming games usually use current hardware, not last year's.

Or is that too tough for your little heads to comprehend???

Let me help:
Why don't you look back at all the review sites you frequent, and tell me how many of the new game preview reviews use 5 last gen cards and 1 current gen card when there are 5 current gen cards to review?
Uh huh. They're always using the "Let's show them how upcoming games perform primarily on old hardware!" approach. :roll:

BTW - I realize the article is an exposé on the whole forcing-DX9-on-nV3Xs issue, but if they're going to give us the 6800GT in the mix, why not an X800 Pro to compare it with? I think we all know the nV40 does DX9 better than the nV30, so I'm still disappointed.




 

Tobyus

Junior Member
Aug 18, 2004
12
0
0
Originally posted by: acx
Wow, the 6800GT loses half (50fps) of its performance going from 1024x768 (105) to 1280x1024 (51) with 4xAA/8xAF? Is that real?

I believe my 6800 GT gets 83 fps at 1280x960 with 4xAA/8xAF at max settings; I get 55 or so at 1600x1200 with 4xAA/8xAF at the same quality settings as well. I can't get to firingsquad.com from work, so I can't see what you all are talking about. I just wanted to let you know that if that is what they are showing as results for the stress test, then they have messed it up worse than DH did.
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
Originally posted by: Tobyus
Originally posted by: acx
Wow, the 6800GT loses half (50fps) of its performance going from 1024x768 (105) to 1280x1024 (51) with 4xAA/8xAF? Is that real?

I believe my 6800 GT gets 83 fps at 1280x960 with 4xAA/8xAF at max settings; I get 55 or so at 1600x1200 with 4xAA/8xAF at the same quality settings as well. I can't get to firingsquad.com from work, so I can't see what you all are talking about. I just wanted to let you know that if that is what they are showing as results for the stress test, then they have messed it up worse than DH did.


I get about the same as you, on a slower CPU than what FS is using.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: Rollo
LOL- like cheeping little monkeys the ATI support group jumps into action!

The recent FX owner is better off with this game than the 9800XT owner is with Doom3. The FX owner can run DX 8.1 and double his performance.

There is one point here, intellectually challenged people. I stated that it's preferable to run HL2 with most of the IQ at twice the frame rate than to be stuck like the 9800 owners are with Doom 3.

No more, no less. Doom 3 vs HL2, no "DX8 is better than DX9", nothing. You said it, I didn't.

My first point in the original post is that the article is disappointing because it only includes one of the five current generation cards.

I didn't say that because I thought any cards "disappeared" as one Old Fart suggested; I said it because previews for upcoming games usually use current hardware, not last year's.

Or is that too tough for your little heads to comprehend???

Let me help:
Why don't you look back at all the review sites you frequent, and tell me how many of the new game preview reviews use 5 last gen cards and 1 current gen card when there are 5 current gen cards to review?
Uh huh. They're always using the "Let's show them how upcoming games perform primarily on old hardware!" approach.
That is simply ridiculous. People who own last gen cards are interested to see how an upcoming game will perform on their system. There are a TON more people out there running nV3x and R3xx cards than the latest gen cards. Are you saying they don't want to know how a new game such as HL2 will run on their system? Of course they do. Please.

It is perfectly reasonable to run the latest as well as last gen cards in such a review. I do agree that they should have either left out the GT and done a separate review on current gen, or included all the current cards in the review.

You always speak so highly of Anandtech reviews. They also used last gen cards and even threw in a GF4 4400. Any complaints?

 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: gururu
It just seems that Valve recognized that FX owners might suffer and did something about it. I haven't heard anything about Id having such concerns over ATI performance. Even JC kind of snubbed the Humus tweak, acting like he didn't care if it did raise performance.
The Humus tweak isn't mathematically equivalent to what the output should be, and hence JC didn't care for it, since it did the opposite of what he intended for the game engine to do (perfectly replicate non-PS hardware's bias). JC snubbed it because it cost image quality, and that is something he won't compromise on Doom 3 more than necessary.
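
(Rough illustration of why "not mathematically equivalent" matters here. Everything below is invented for the example; it is not the actual Doom 3 shader math, just the general idea of a baked lookup curve versus a straight pow() replacement.)

# If the engine's lookup texture bakes in a biased/clamped falloff curve,
# replacing the lookup with plain pow() changes the result wherever the
# two curves differ - the tweak can be faster but not bit-identical.
def baked_lookup(x):
    # stand-in for a lookup table with a small baked-in bias and clamp
    return max(0.0, x ** 4.0 - 0.05)

def math_replacement(x):
    return x ** 4.0  # what a math-only substitution would compute

for x in (0.2, 0.5, 0.9):
    print(x, round(baked_lookup(x), 4), round(math_replacement(x), 4))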