RV670 ---------Radeon HD 3870 card high res photo


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I also can't help but notice how in HL2:ep2 the 2900xt sinks to the same level as a 1950xtx as soon as AA is enabled, and drops below the level of a 8800gts. This is the kind of performance hit I am talking about, in a DX9 game based on a 3 year old engine, and that's why I'm saying hopefully Ati will make the appropriate changes in the rv670 so this doesn't happen again.
i saw that - at super-hi resolutions - Anand's comment on your example:
The NVIDIA GeForce 8800 GTS 320MB falls way short of the AMD Radeon HD 2900 XT in this test, as we see the AMD card performing nearly on par with the 8800 GTX this time around. It seems that with our indoor test, high end NVIDIA cards scale better from 1920x1200 to 2560x1600 than the high end AMD part.

Maybe nvidia will make the appropriate changes in the g92 so this doesn't happen again.
The 2900 XT does outperform the 8800 GTS (both 640 and 320MB), . . . the win for the $400 price point certainly goes to the 2900 XT.

thank you for pointing this out!
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: apoppin
I also can't help but notice how in HL2:ep2 the 2900xt sinks to the same level as a 1950xtx as soon as AA is enabled, and drops below the level of a 8800gts. This is the kind of performance hit I am talking about, in a DX9 game based on a 3 year old engine, and that's why I'm saying hopefully Ati will make the appropriate changes in the rv670 so this doesn't happen again.
i saw that - at super-hi resolutions
Which makes the 2900xt look even more pathetic because it can't outrun a last gen 1950xtx when using high quality settings.
- Anand's comment on your example:
The NVIDIA GeForce 8800 GTS 320MB falls way short of the AMD Radeon HD 2900 XT in this test, as we see the AMD card performing nearly on par with the 8800 GTX this time around. It seems that with our indoor test, high end NVIDIA cards scale better from 1920x1200 to 2560x1600 than the high end AMD part.

Maybe nvidia will make the appropriate changes in the g92 so this doesn't happen again.
The 2900 XT does outperform the 8800 GTS (both 640 and 320MB), . . . the win for the $400 price point certainly goes to the 2900 XT.

thank you for pointing this out!
Anand's comment was on performance without AA, and doesn't apply to this argument. I'm not paying $400 for a video card to play without AA.
 

spittledip

Diamond Member
Apr 23, 2005
4,480
1
81
So Nvidia is going to launch first. Who is going to jump on the Nvidia card immediately.. and who is going to wait for the comparisons/reviews and price drops?
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
I'm going to wait on the reviews/comparisons/price drops before making my decision. There is no reason not to really. I'm not biased towards either company, but I do hope AMD/ATI does a lot better with this release.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: munky
Originally posted by: apoppin
I also can't help but notice how in HL2:ep2 the 2900xt sinks to the same level as a 1950xtx as soon as AA is enabled, and drops below the level of a 8800gts. This is the kind of performance hit I am talking about, in a DX9 game based on a 3 year old engine, and that's why I'm saying hopefully Ati will make the appropriate changes in the rv670 so this doesn't happen again.
i saw that - at super-hi resolutions
Which makes the 2900xt look even more pathetic because it can't outrun a last gen 1950xtx when using high quality settings.
***What are you talking about? NO $400 card can. i'll let anand answer you:
We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.
- Anand's comment on your example:
The NVIDIA GeForce 8800 GTS 320MB falls way short of the AMD Radeon HD 2900 XT in this test, as we see the AMD card performing nearly on par with the 8800 GTX this time around. It seems that with our indoor test, high end NVIDIA cards scale better from 1920x1200 to 2560x1600 than the high end AMD part.

Maybe nvidia will make the appropriate changes in the g92 so this doesn't happen again.
The 2900 XT does outperform the 8800 GTS (both 640 and 320MB), . . . the win for the $400 price point certainly goes to the 2900 XT.

thank you for pointing this out!
Anand's comment was on performance without AA, and doesn't apply to this argument. I'm not paying $400 for a video card to play without AA.

again anand answers you ... i'll repeat it cause you did: :p

Antialiasing

We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.

You are again trying to compare a $400 2900xt to a $650 GTX ... in EP2, the Radeon beats the comparable GTS ... and generally blows away the 1900 series
:confused:

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: apoppin

***What are you talking about? NO $400 card can. i'll let anand answer you:
We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.
I am not talking about some ridiculous resolution of 2560x1600. I am talking about the fact that even at more common and playable resolutions like 1920x1200 the 2900xt is no better than a 1950xtx once AA+AF are enabled.
- Anand's comment on your example:

Antialiasing

We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.

You are again trying to compare a $400 2900xt to a $650 GTX ... in EP2, the Radeon beats the comparable GTS ... and generally blows away the 1900 series
:confused:

People don't buy $400 video cards and play without AA+AF, so don't keep throwing benches and generalizations at me about performance in 2007 games using 1999 settings. Not only is the 2900xt unable to beat a last-gen 1950xtx, the benches show how the 2900xt goes from being "nearly on par with the 8800gtx" without AA+AF to being barely on par with a 1950xtx once those features are enabled. And you don't think that's a problem?
 

Caveman

Platinum Member
Nov 18, 1999
2,539
35
91
So... Is there much info on the "high end" next gen cards avail? If I read right, the cards discussed here are middle of the road, good price point affairs... what about the "money is no object cards"?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: munky
Originally posted by: apoppin

***What are you talking about? NO $400 card can. i'll let anand answer you:
We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.
I am not talking about some ridiculous resolution of 2560x1600. I am talking about the fact that even at more common and playable resolutions like 1920x1200 the 2900xt is no better than a 1950xtx once AA+AF are enabled.
- Anand's comment on your example:

Antialiasing

We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.

You are again trying to compare a $400 2900xt to a $650 GTX ... in EP2, the Radeon beats the comparable GTS ... and generally blows away the 1900 series
:confused:

People don't buy $400 video cards and play without AA+AF, so don't keep throwing benches and generalizations at me about performance in 2007 games using 1999 settings. Not only is the 2900xt unable to beat a last-gen 1950xtx, the benches show how the 2900xt goes from being "nearly on par with the 8800gtx" without AA+AF to being barely on par with a 1950xtx once those features are enabled. And you don't think that's a problem?

Where the hell are you looking? I had an X1950 in my PC before this HD2900 and the performance increase WITH AA AND AF active was pretty substantial.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: munky
Originally posted by: apoppin
Originally posted by: munky
Originally posted by: myocardia
Originally posted by: munky
Originally posted by: apoppin
---unless you can find a situation where the 2900xt is *suddenly crippled* by 4xAA and the 8800GTS is not :p
-and don't be showing me brand new games where AMD's drivers currently suck

Here you go:
8800gts
2900xt

Moreover, the fact alone that Ati apparently needs to update their drivers for every new game released is enough reason to put me off from buying their cards. Sort of reminds me of Nvidia's FX cards, and I'd hate to be in the shoes of the people who bought those :p

Huh? Your "proof" that the X2900 is crippled by 4x AA shows it still outperforming the 640MB 8800 GTS.:D The rest of your post (that I quoted), I agree with, though.

In some games it wins, in some it loses. Also, that review used a reference-spec gts, not the factory OC'd ones. I am not one to favor Nvidia cards... but I call it how I see it, and if I had to choose between the two, I'd pick the gts.
so you are showing me that the 2900xt is still faster then the GTS640?
:confused:
No, I am showing you that the 2900xt takes a bigger hit from AA and AF. You must have ignored all the games where it loses to come up with your conclusion.
nice


so what IS your point ?
I made my point and you didn't get it.
i get no *reminders* of the FX series ... the 2900xt is easily the EQUAL of the GTS640 OC
Seeing how the 2900xt tanks with AA+AF, or whenever there's a new game out that the drivers haven't been optimized for, I get plenty of reminders. And the fact that a 700M transistor 2900xt card has no chance of matching the performance of a 700M transistor 8800gtx drives that reminder in even further.

Did you just forget that just about every single game ever created for the past 2 years or so has been developed solely with Nvidia in mind? It's no surprise that you get blasted with their logo when launching the .exe.

Now tell me how it's ATI's fault that developers do not work with their team but go to Nvidia and kiss their feet? ATI has to work on drivers for the games and they ALWAYS get it done. Hell, at least I never had texture memory leaks that forced me to return to the desktop to clear them. And at least ATI never blames the developers, refusing to fix an issue.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Caveman
So... Is there much info on the "high end" next gen cards avail? If I read right, the cards discussed here are middle of the road, good price point affairs... what about the "money is no object cards"?

That doesn't make good profits.
 

Caveman

Platinum Member
Nov 18, 1999
2,539
35
91
I'm not worried about profits... I'm just interested in comparing the future top end to what's currently avail. Can anyone ballpark an estimate? I guess based on previous generations, it's probably "50%" (approx) better than the previous generation?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: cmdrdredd
Did you just forget that just about every single game ever created for the past 2 years or so has been developed solely with Nvidia in mind? It's no surprise that you get blasted with their logo when launching the .exe.

Now tell me how it's ATI's fault that developers do not work with their team but go to Nvidia and kiss their feet? ATI has to work on drivers for the games and they ALWAYS get it done. Hell, at least I never had texture memory leaks that forced me to return to the desktop to clear them. And at least ATI never blames the developers, refusing to fix an issue.

I AM NOT bashing Ati in general, nor am I promoting Nvidia. I can name you numerous examples of twimtbp games that ran faster on Ati HW every generation except this one... the Nvidia logo did not guarantee better performance when Ati had a faster or at least competitive video card, and if it's different now, that's not necessarily the result of vendor-specific game optimizations. What I am talking about is the glaring weakness of the r600 architecture with regard to AA+AF performance, and here you guys have spun this into numerous other issues, yet miraculously managed to completely miss my original point.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: munky
Originally posted by: apoppin

***What are you talking about? NO $400 card can. i'll let anand answer you:
We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.
I am not talking about some ridiculous resolution of 2560x1600. I am talking about the fact that even at more common and playable resolutions like 1920x1200 the 2900xt is no better than a 1950xtx once AA+AF are enabled.
- Anand's comment on your example:

Antialiasing

We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.

You are again trying to compare a $400 2900xt to a $650 GTX ... in EP2, the Radeon beats the comparable GTS ... and generally blows away the 1900 series
:confused:

People don't buy $400 video cards and play without AA+AF, so don't keep throwing benches and generalizations at me about performance in 2007 games using 1999 settings. Not only is the 2900xt unable to beat a last-gen 1950xtx, the benches show how the 2900xt goes from being "nearly on par with the 8800gtx" without AA+AF to being barely on par with a 1950xtx once those features are enabled. And you don't think that's a problem?

again ... what are you talking about?

*I* don't buy a $400 video card and play any game without 4xAA+16xAF :p

But what moron would buy a $400 video card and expect it to run without penalty with 4x AA at 25x16 or even 19x12?

That is what you are evidently telling me it should do ... these are the *only* benches you are showing me - your example to prove my point.
-- the only thing really "impressive" is that the 1950xtx is ALSO able to beat the GTS at super-hi resolutions and with 4xAA.
:Q

I'm not worried about profits... I'm just interested in comparing the future top end to what's currently avail. Can anyone ballpark an estimate? I guess based on previous generations, it's probably "50%" (approx) better than the previous generation?
i am "guessing" +30% for an evolutionary upgrade .... maybe 10% more for a good shrink [assuming they want the heat /power to be lower]
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: apoppin
again ... what are you talking about?
Do you want me to post in bold font? Because I told you exactly what I'm talking about, and apparently you just don't get it:

outdoor, 19x12, AA+AF:
1950xtx.... 48.3fps
2900xt...... 47.6fps

indoor, 19x12, AA+AF
1950xtx.... 70.5fps
2900xt...... 70.3fps

*I* don't buy a $400 video card and play any game without 4xAA+16xAF :p
And yet you're showing me comments how the 2900xt is nearly on par with the gtx with no AA and no AF, like it matters.
But what moron would buy a $400 video card and expect it to run without penalty with 4x AA at 25x16 or even 19x12?
The moron who bought a card with a penalty so huge that it puts him on par with a $200 card from last generation. Nowhere in my post did I expect to have NO penalty at all for AA.
That is what you are evidently telling me it should do ... these are the *only* benches you are showing me - your example to prove my point.
-- the only thing really "impressive" is that the 1950xtx is ALSO able to beat the GTS at super-hi resolutions and with 4xAA.
:Q
And what is your point exactly, besides avoiding my point? That the 2900xt can barely beat the 320mb gts, which also tanks with AA enabled? The fact that the 1950xtx can beat both of them is not a miracle, it's because AA performance is a weakness for both of those cards.
 

praesto

Member
Jan 29, 2007
83
0
0
Anandtech's performance review of HL2 shows that a 320mb gts is performing on par with a hd2900xt at 1920x1200 and 2560x1600 resolutions with AA enabled. Let's not forget the ''PLAYS BEST ON ATI'' tag in the video options menu.

Oh and... don't tell me that someone would be stupid if they want to play HL2 ep2 with a hd2900xt at 2560x1600 with 4x aa... the engine is outdated and there's a bunch of games that are WAY more taxing than the current source engine.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: munky
Originally posted by: apoppin
again ... what are you talking about?
Do you want me to post in bold font? Because I told you exactly what I'm talking about, and apparently you just don't get it:

outdoor, 19x12, AA+AF:
1950xtx.... 48.3fps
2900xt...... 47.6fps

indoor, 19x12, AA+AF
1950xtx.... 70.5fps
2900xt...... 70.3fps

*I* don't buy a $400 video card and play any game without 4xAA+16xAF :p
And yet you're showing me comments how the 2900xt is nearly on par with the gtx with no AA and no AF, like it matters.
But what moron would buy a $400 video card and expect it to run without penalty with 4x AA at 25x16 or even 19x12?
The moron who bought a card with a penalty so huge that it puts him on par with a $200 card from last generation. Nowhere in my post did I expect to have NO penalty at all for AA.
That is what you are evidently telling me it should do ... these are the *only* benches you are showing me - your example to prove my point.
-- the only thing really "impressive" is that the 1950xtx is ALSO able to beat the GTS at super-hi resolutions and with 4xAA.
:Q
And what is your point exactly, besides avoiding my point? That the 2900xt can barely beat the 320mb gts, which also tanks with AA enabled? The fact that the 1950xtx can beat both of them is not a miracle, it's because AA performance is a weakness for both of those cards.

let ME try one more time:

19x12, 4AA+AF:
1950xtx.... 70.5 fps
2900xt...... 70.3 fps
8800GTS .. 67.9 fps


25x16, 4AA+AF:
1950xtx.... 30.8 fps
2900xt...... 30.7 fps
8800GTS .. 27.0 fps

--in the other 2 tests the GTS "wins" by 0.7 of a FPS in one and by a couple in the other :p

EQUIVALENT cards - yes, you finally appear to get it - that AA kills the GTS and the 2900xt at resolutions over 16x12; it puts them both on a par with the x1950xt. At lower resolutions - resolutions MOST people will play at - the 2900xt destroys the x1950xt in HL2-EP2 and soundly whips the GTS as well.
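For what it's worth, the gaps both sides keep arguing over are easy to quantify. A minimal Python sketch, using only the HL2:EP2 4xAA+AF fps figures quoted above (taken at face value from the thread):

```python
def pct_lead(a: float, b: float) -> float:
    """How much faster card A is than card B, in percent (higher fps is better)."""
    return (a - b) / b * 100.0

# 1920x1200, 4xAA+AF: 2900xt (70.3 fps) vs 8800GTS (67.9 fps)
print(round(pct_lead(70.3, 67.9), 1))  # 3.5

# 2560x1600, 4xAA+AF: 2900xt (30.7 fps) vs 8800GTS (27.0 fps)
print(round(pct_lead(30.7, 27.0), 1))  # 13.7

# 1920x1200, 4xAA+AF: 1950xtx (70.5 fps) vs 2900xt (70.3 fps)
print(round(pct_lead(70.5, 70.3), 1))  # 0.3
```

With AA on, all three cards sit within a few percent of each other at 19x12, which is exactly why the same numbers support both readings in this thread.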


Oh and..don't tell me that someone would be stupid if they ought to play HL2 ep2 with a hd2900xt at 2500x1600 with 4x aa...the engine is outdated and there's a bunch of games that are WAY more taxing than the current source engine.
Source outdated ... i think NOT; not any longer.
If someone is trying to play a new game at 25x16 and also wanting 4xaa ... they are not so smart are they? :p
-i'd think GTX would be the minimum video card for that resolution




 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Way I figure it is, the X1950XTX is quite a powerful card, and it is starting to be realized against the newer gen. :)
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
<<The moron who bought a card with a penalty so huge that it puts him on par with a $200 card from last generation

um, what $200 card from the last generation are you talking about? Looks like the X1950 xtx is still going for $250+ on ebay right now. http://search.ebay.com/x1950xt...rcloZQQssPageNameZWLRS

couldn't find any current x1950xtx on fs/ft.

edit, I did find a used one for $220, you might be able to get that one closer to $200 used.

What was the cheapest anybody ever saw an x1950xtx for new?
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: JPB
Way I figure it is, the X1950XTX is quite a powerful card, and it is starting to be realized against the newer gen. :)

Yeah, when the X1900XT debuted and went against the 7900GTX, the X1900XT was overall slightly faster, and now with the newer games with newer features and eye candy, the gap is getting wider.
 

Demoth

Senior member
Apr 1, 2005
228
0
0
As it stands now, the big letdown is that neither card maker could yet deliver a card that will handle true DX10 games. Even if you go top end now, or at the start of 2008, you're going to spend a good $500+ and know you'll be looking at under 20FPS at high settings in a few big titles a year down the road. Most card buyers want to hold a $500 purchase for at least a year, more like 2 years in my case.

Both NVIDIA and ATI are continuing the ruse of putting large amounts of RAM on 6600/6800-era performance, giving these lame cards a flashy name like the 8400GT and putting them in boxes so flashy they blind you when walking down the video card aisle at Best Buy. This shows a total lack of respect for customers and a total lack of integrity. This is worse than what Nvidia is doing now, which is sitting on the 8800 series at top selling price till ATI poses any challenge, while keeping tight-lipped about immediate releases. Most corps would do the same, but not as many go out of their way to blatantly deceive customers.

Fact is, the 7950 GT OCed performs in the same class as an 8800GTS. Not as fast if the 8800 is OCed, but still close enough that gamers would notice little difference.

Nothing upcoming in the near term is all that exciting right now. If the manufacturers would only aim for good performance with very low power reqs at the low end and extreme performance at the higher end, things would be better for everyone. Card makers would have much better crossfire/sli sales at the low to mid end where most of their sales come from. Right now they'll have hordes of angry customers when Crysis arrives. People who figured their 8400GT with 512 DDR2 and a big 'Turbo Charged' certificate on the box was actually a decent gaming card.

Pretty soon, maybe 2009, card makers will need to release low to mid-range cards that can handle current games, many of them with more and more DX10 features (with lots of pressure on game devs by M$). If they don't, they could lose a lot of people to the console market.

Probably would recommend that people who need to upgrade now hold out for the 8800GT at a $200 price point. Probably mid-December. ATI at this point has too many problems - high heat, slow driver implementation, a big hit using AA, and still poorer IQ compared to the 8xxx series.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bryanW1995
<<<<<<<<<<The moron who bought a card with a penalty so huge that it puts him on par with a $200 card from last generation

um, what $200 card from the last generation are you talking about? Looks like the X1950 xtx is still going for $250+ on ebay right now. http://search.ebay.com/x1950xt...rcloZQQssPageNameZWLRS

couldn't find any current x1950xtx on fs/ft.

edit, I did find a used one for $220, you might be able to get that one closer to $200 used.

What was the cheapest anybody ever saw an x1950xtx for new?

actually ... the slightly slower x1950xt is a $200 card ... now :p

GECUBE HV195XTG3-E3 Radeon X1950XT 512MB 256-bit GDDR3 PCI Express x16 HDCP Ready CrossFire Supported Video Card - Retail
-$199.99 at the 'egg .. 3 Business Day Shipping $5.84
---In Stock ... ready for OC to XTX ...

here's the cheapest XTX . . . for $372.99 but ooS
http://www.newegg.com/Product/...x?Item=N82E16814195024

You have to pay well over $400 for an "in stock" Asus XTX; that way you can beat BOTH the 2900xt and 8800GTS in Ep2 at 25x16 when you turn on AA as munky suggests ... never mind that it is still unplayable.
[edit: did somebody just buy it? ... i thought it was in stock 2 minutes ago. Must be the painkillers. :Q]

http://www.newegg.com/Product/...x?Item=N82E16814121024

Anyway ... My point is that it takes resolutions over 16x10/12 to really slow the 2900xt to "unplayable" when AA is enabled ... at that same point, the 8800GTS is also feeling the strain ... not unusual for $400 GPUs; any gamer spending the bucks for a 30" monitor should properly match it with the fastest GPU - not mismatch it and then complain about AA being weak. Yeah ... it IS "weak" ... but not as weak as some are claiming --especially when compared to its competition.

As it stands now, the big let down is that neither card maker could deliver a card yet that will handle true DX10 games. Even if you go top end now, or the start of 2008, your going to spend a good $500+ and know you'll be looking at >20FPS at high settings in a few big titles a year down the road. Most card buyers want to hold a $500 purchase for at least a year, more like 2 years in my case.
Agreed. However, i'd like to ADD some *good news* ... DX10 currently doesn't look much better than DX9c and we can completely max out DX9 games at 16x12. We stand better than we did when DX8 changed over to DX9. DX8 looked awful in comparison.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: apoppin
let ME try one more time:

<<19x12, 4AA+AF:
1950xtx.... 70.5 fps
2900xt...... 70.3 fps
8800GTS .. 67.9 fps


<<25x16, 4AA+AF:
1950xtx.... 30.8 fps
2900xt...... 30.7 fps
8800GTS .. 27.0 fps

--in the other 2 tests the GTS "wins" by 0.7 of a FPS in one and by a couple in the other :p

EQUIVALENT cards - yes, you finally appear to get it - that AA kills the GTS and the 2900xt at resolutions over 16x12; it puts them both on a par with the x1950xt. At lower resolutions - resolutions MOST people will play at - the 2900xt destroys the x1950xt in HL2-EP2 and soundly whips the GTS as well.

Now let ME try this one more time...
Forget the fact that the 2900xt can match a 8800gts, because it's totally irrelevant to the topic I brought up. The big picture you're missing is that the 2900xt comes close to the 8800gtx without AA but can only barely match a 8800gts with AA. If that doesn't tell you about the weakness of the r600 in regard to AA then just stop reading now because you're definitely not going to get what I say next. The point I am making in this thread is that if Ati doesn't fix the bigger than normal performance drop from AA in the rv670, then they will be at a disadvantage against a future Nvidia card which has more than the 96 shaders active on the 8800gts.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Probably would recommend that people who need to upgrade now hold out for the 8800GT at a $200 price point. Probably mid-December. ATI at this point has too many problems - high heat, slow driver implementation, a big hit using AA, and still poorer IQ compared to the 8xxx series.

You are way behind bro. 8800 gt will be out on oct 29, 2950 series in mid november. 2950 will be on 55 nm vs 65 nm for 8800gt, so guess who's going to have higher heat? As to driver implementation, why do you think that nvidia's drivers are better than ati's? Big hit using AA is highly debatable (see munky vs apoppin argument), but it appears that ati has addressed this in rv670. I'm not sure what you mean about IQ, must be because I'm a stooooopid ati guy :)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: munky
Originally posted by: apoppin
let ME try one more time:

<<<<<<<19x12, 4AA+AF:
1950xtx.... 70.5 fps
2900xt...... 70.3 fps
8800GTS .. 67.9 fps


<<<<<<<25x16, 4AA+AF:
1950xtx.... 30.8 fps
2900xt...... 30.7 fps
8800GTS .. 27.0 fps

--in the other 2 tests the GTS "wins" by 0.7 of a FPS in one and by a couple in the other :p

EQUIVALENT cards - yes, you finally appear to get it - that AA kills the GTS and the 2900xt at resolutions over 16x12; it puts them both on a par with the x1950xt. At lower resolutions - resolutions MOST people will play at - the 2900xt destroys the x1950xt in HL2-EP2 and soundly whips the GTS as well.

Now let ME try this one more time...
Forget the fact that the 2900xt can match a 8800gts, because it's totally irrelevant to the topic I brought up. The big picture you're missing is that the 2900xt comes close to the 8800gtx without AA but can only barely match a 8800gts with AA. If that doesn't tell you about the weakness of the r600 in regard to AA then just stop reading now because you're definitely not going to get what I say next. The point I am making in this thread is that if Ati doesn't fix the bigger than normal performance drop from AA in the rv670, then they will be at a disadvantage against a future Nvidia card which has more than the 96 shaders active on the 8800gts.

OK ... i get a 2 minute rebuttal
-if you are a slow reader ...

... and you are missing the *biggest picture* of all

2900xt did what it HAD to do ... what AMD aimed it at ... its "purpose" was to take the upper-midrange - to compete in the GTS slot. It fulfilled that purpose admirably, as i and many other happy 2900xt owners can attest. The fact you pointed out that it is close to GTX performance without AA is just a "plus" to most owners - especially since some games don't run with AA in DX9 without a hack.

otoh, there ARE those who got a 2900xt with the *hope* it would run at 19x12 and still allow for AA ... for *those* budget-minded individuals, it did not turn out so well, being practically only a small improvement over the x1900.

i am NOT disagreeing that AMD *needs* to improve AA for 2950xtx IF it wants to go up against the nvidia Ultra ... if that is your only point ... i never said there was no room for improvement for 2900xt. There is plenty. In fact, i have a list for AMD. :p

i think that is a 'given' about AA and i don't believe that was your original contention ... you have backed down quite a lot from your original argument.

i also do not necessarily believe AA is a weakness in r600 as the 2900xt IS r600 and we are back to it being suited perfectly for competing with the GTS. We have to see what changes AMD makes before we can make any judgment about r600 "weaknesses" or even agree on what is a "bigger than normal" performance drop from AA implementation in-game.
... if you want to pick on weaknesses pick on r600 power usage ... yet r600 power usage seems fine in the 2900pro incarnation.