ATI RV770 in May??!!!

Page 5

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: evolucion8
Originally posted by: golem
Originally posted by: v8envy
ATI's problem past the 9700 days was: they were months behind nv's offerings for slightly better performance. By the time the X800 came out enthusiasts already bought 6800s. Repeat again with the 7800 vs 1800. And again with 7900 vs 1950. And once more with 8800 vs 2900 (although here we don't see the performance improvement). In each case the ATI part was a better performer, but not enough so to warrant an upgrade from the higher end NV parts at release prices.

While ATI still sold wagonloads of those cards they couldn't command the kind of early adopter e-peen premiums that NV's been getting away with -- with the enthusiast market mostly tapped they were left with value conscious upper mainstream/lower enthusiast buyers like myself. Sure, I got the X850XT PE and X1800XT -- but it was for $150 and $249 respectively, 4-6 months after launch. Not $500 and $650 MSRP that ATI would have received had they beaten NV to the market.

My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.

Yeah, I couldn't say it better. ATi is even still supporting their Radeon 9X00 series of cards, and over time the performance gap widened: the 9700PRO eventually smoked the FX, and the X800XT PE was slightly faster than the 6800 Ultra and then performed considerably faster in newer games. That also happened with the X1900 series, which was slightly faster than or as fast as the 7900GTX, and now the performance difference is outstanding; even an X1950PRO is able to outperform it in next-generation games. The HD 2900 most of the time was unable to keep up with the 8800GTS 640, and now it can keep up most of the time, and sometimes (SOMETIMES) can rival the 8800GTX. The Catalyst-dissecting article by AnandTech also stated that ATi cards tend to age much better than their nVidia counterparts.

Revisionist history ..

by the time they "age well" they are out of date - except for midrange game rigs

2900xt is *no match* for 8800GTX, i assure you - except when you "get in the game" with a title like CoJ - and after 4 months, LP does run OK .. but an 8800GTS 320 gives a 2900xt a good run there and an 8800GTX leaves my *crossfire* eating dust



 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Sylvanas
Originally posted by: apoppin
Originally posted by: Sylvanas
Originally posted by: BenSkywalker
According to that link, ATi had a ~50% edge in marketshare over nV in Q4 '04, now nV has a 50% edge over ATi. That is very close to a complete collapse.

In realistic terms, ATi's highest-end GPU, its most bleeding-edge part released, cannot compete with 2-year-old nV technology. They very well may make a comeback, but the industry hasn't seen this kind of utter domination since the days of 3Dfx.

If by 2-year-old technology you are referring to the G80, it's not meant to compete with it... the 3800 series is competitive on price. If you want a single card with better performance than a G80 (GTX), then that's the 3870X2, which is in the same price bracket. Note the market share difference between the two in Q4 2004 and Q4 2007 is about the same as it was back in the 9700pro days, so we have seen this before and will probably see it again.

No .. wishful thinking .. even though it is more elegant perhaps than Gx2, 3870x2 is still a compromise "sandwich card" .. and with the release of Gx2 , you need two x2s .. which is no real "solution"

That depends on what you subjectively classify as a 'solution'. Yes, some people don't like the concept of SLI/CF on a single card and don't think it a *real* *solution*, but the fact that I can plug it in and get higher frame rates than a single GPU (in the vast majority of games... emphasis on majority) is enough for me to deem it a 'solution' to the problem of getting better performance. (Insert 'but there's input lag' comment here... I haven't noticed it.)

Will be interesting to see how things roll out over the next few months.

I won't be surprised if the GPU world hits a wall (like what happened in the CPU world, where higher clock speeds weren't feasible and higher IPC was almost impossible to attain without great fiddling at the architecture level). For sure, multi-core GPUs, like dual-core CPUs, will be the next big thing for performance gains since sliced bread, but just as it currently happens with the lack of multi-threaded optimizations in software, the same will happen with games...
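The scaling wall the post predicts is essentially Amdahl's law: if only part of a frame's work can be split across cores (GPU or CPU), the serial remainder caps the speedup no matter how many cores you add. A minimal illustrative sketch, not from the thread (the function name and numbers are mine):

```python
def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Upper bound on speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even if 90% of a frame's work splits perfectly across 4 cores,
# the serial 10% caps the speedup well below 4x.
print(round(amdahl_speedup(0.9, 4), 2))  # ~3.08
```

This is exactly why an unoptimized game sees little benefit from a second GPU core: its `parallel_fraction` is low, so the formula stays near 1.0.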

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: apoppin
Originally posted by: evolucion8
Originally posted by: golem
Originally posted by: v8envy
ATI's problem past the 9700 days was: they were months behind nv's offerings for slightly better performance. By the time the X800 came out enthusiasts already bought 6800s. Repeat again with the 7800 vs 1800. And again with 7900 vs 1950. And once more with 8800 vs 2900 (although here we don't see the performance improvement). In each case the ATI part was a better performer, but not enough so to warrant an upgrade from the higher end NV parts at release prices.

While ATI still sold wagonloads of those cards they couldn't command the kind of early adopter e-peen premiums that NV's been getting away with -- with the enthusiast market mostly tapped they were left with value conscious upper mainstream/lower enthusiast buyers like myself. Sure, I got the X850XT PE and X1800XT -- but it was for $150 and $249 respectively, 4-6 months after launch. Not $500 and $650 MSRP that ATI would have received had they beaten NV to the market.

My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.

Yeah, I couldn't say it better. ATi is even still supporting their Radeon 9X00 series of cards, and over time the performance gap widened: the 9700PRO eventually smoked the FX, and the X800XT PE was slightly faster than the 6800 Ultra and then performed considerably faster in newer games. That also happened with the X1900 series, which was slightly faster than or as fast as the 7900GTX, and now the performance difference is outstanding; even an X1950PRO is able to outperform it in next-generation games. The HD 2900 most of the time was unable to keep up with the 8800GTS 640, and now it can keep up most of the time, and sometimes (SOMETIMES) can rival the 8800GTX. The Catalyst-dissecting article by AnandTech also stated that ATi cards tend to age much better than their nVidia counterparts.

Revisionist history ..

by the time they "age well" they are out of date - except for midrange game rigs

2900xt is *no match* for 8800GTX, i assure you - except when you "get in the game" with a title like CoJ - and after 4 months, LP does run OK .. but an 8800GTS 320 gives a 2900xt a good run there and an 8800GTX leaves my *crossfire* eating dust


You talk about 'outdated' as if we were talking about hairstyles. Sure, there are DX10 cards, but most games are DX9, and many "outdated" cards like the X1950XT are able to run them reasonably well, something that a 7950GT can't do without lowering the eye candy. Even a 9700PRO is able to run an older game like Oblivion on medium, something that an FX can't. The 8800GTX may leave your Crossfire eating dust when there's no profile or optimization for a game, but when there is one, the performance difference can be remarkable: a single HD 3870 is slightly faster than an HD 2900XT, which is slightly slower than an 8800GT, but two of them can outperform an Ultra...

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: evolucion8
Originally posted by: apoppin
Originally posted by: evolucion8
Originally posted by: golem
Originally posted by: v8envy
ATI's problem past the 9700 days was: they were months behind nv's offerings for slightly better performance. By the time the X800 came out enthusiasts already bought 6800s. Repeat again with the 7800 vs 1800. And again with 7900 vs 1950. And once more with 8800 vs 2900 (although here we don't see the performance improvement). In each case the ATI part was a better performer, but not enough so to warrant an upgrade from the higher end NV parts at release prices.

While ATI still sold wagonloads of those cards they couldn't command the kind of early adopter e-peen premiums that NV's been getting away with -- with the enthusiast market mostly tapped they were left with value conscious upper mainstream/lower enthusiast buyers like myself. Sure, I got the X850XT PE and X1800XT -- but it was for $150 and $249 respectively, 4-6 months after launch. Not $500 and $650 MSRP that ATI would have received had they beaten NV to the market.

My feelings exactly. They also seem to engineer the cards for future trends and not as much for current games. Over time, ATI cards seem to age better and play newer games better than NV cards, but at launch Nvidia usually does better on the current games.

Yeah, I couldn't say it better. ATi is even still supporting their Radeon 9X00 series of cards, and over time the performance gap widened: the 9700PRO eventually smoked the FX, and the X800XT PE was slightly faster than the 6800 Ultra and then performed considerably faster in newer games. That also happened with the X1900 series, which was slightly faster than or as fast as the 7900GTX, and now the performance difference is outstanding; even an X1950PRO is able to outperform it in next-generation games. The HD 2900 most of the time was unable to keep up with the 8800GTS 640, and now it can keep up most of the time, and sometimes (SOMETIMES) can rival the 8800GTX. The Catalyst-dissecting article by AnandTech also stated that ATi cards tend to age much better than their nVidia counterparts.

Revisionist history ..

by the time they "age well" they are out of date - except for midrange game rigs

2900xt is *no match* for 8800GTX, i assure you - except when you "get in the game" with a title like CoJ - and after 4 months, LP does run OK .. but an 8800GTS 320 gives a 2900xt a good run there and an 8800GTX leaves my *crossfire* eating dust


You talk about 'outdated' as if we were talking about hairstyles. Sure, there are DX10 cards, but most games are DX9, and many "outdated" cards like the X1950XT are able to run them reasonably well, something that a 7950GT can't do without lowering the eye candy. Even a 9700PRO is able to run an older game like Oblivion on medium, something that an FX can't. The 8800GTX may leave your Crossfire eating dust when there's no profile or optimization for a game, but when there is one, the performance difference can be remarkable: a single HD 3870 is slightly faster than an HD 2900XT, which is slightly slower than an 8800GT, but two of them can outperform an Ultra...

it IS like hairstyles .. this IS a tech forum and most of the interest is at the "high end"

i am talking "outdated" only to the extent that Ev8 says "ages well"
--it is all relative

and that is the only thing i was trying to point out



but two of them can outperform an ultra...
agreed .. MOST of the time Mine Do .. but NOT in Lost Planet .. which was a *specific point* i was making; in CoJ a 2900xt beats a GTX, last i looked - because CoJ 'got into the game' with AMD
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Actually, it's not just because it got the AMD logo; it's that they switched CoJ to use shaders to resolve anti-aliasing, something that the efficient but simple GeForce 8 series pixel shaders can't do very well
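For context on what "resolving anti-aliasing in the shaders" means: instead of the fixed-function hardware averaging a pixel's MSAA subsamples, a pixel shader reads each subsample itself and filters them, which lets the game apply custom filtering or tone mapping before the average. A toy sketch of the plain box-filter case (function name and sample values are mine, purely illustrative):

```python
def shader_resolve(samples):
    """Box-filter resolve: average one pixel's MSAA subsamples, as a
    pixel shader would after reading each subsample individually."""
    n = len(samples)
    return tuple(sum(channel) / n for channel in zip(*samples))

# 4x MSAA pixel on a triangle edge: two subsamples covered by red
# geometry, two showing black background -> resolves to half red.
edge_pixel = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0),
              (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(shader_resolve(edge_pixel))  # (0.5, 0.0, 0.0)
```

Doing this per pixel, per subsample is pure shader arithmetic, which is why an architecture's shader throughput, rather than its ROP hardware, decides how fast it runs.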
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: evolucion8
Actually, it's not just because it got the AMD logo; it's that they switched CoJ to use shaders to resolve anti-aliasing, something that the efficient but simple GeForce 8 series pixel shaders can't do very well

Which NVIDIA *Hated*
- if i remember right?! .. something about "unfair" .. the way it was not meant to be programmed

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Well, nVidia-optimized games usually tend to use quite a lot of vendor-specific hardware extensions which won't work on ATi cards and will slow down their performance, like what happened with Doom 3; that game for some reason was created to run slow on ATi cards, but other games on the same engine performed as fast or faster on ATi cards, especially from the previous generation, like X1K vs the 7 series, in games like Quake 4 and Prey for example. I find that nVidia-optimized games look slightly blocky, more CG- or OpenGL-style, while ATi-optimized games have a more realistic look, with some plain-looking textures in places and outstanding pixel shader effects; the Xbox 360 inherited the same strength, slightly blurry textures with pixel shaders everywhere...
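The mechanics of a "vendor extension" are mundane: at startup an OpenGL engine queries the extensions the driver advertises and picks a renderer path. A minimal sketch of that gating logic (the path names are made up; `GL_NV_depth_clamp` is a real NV-only extension that Doom 3-era engines could use for shadow rendering):

```python
def pick_shadow_path(gl_extensions):
    """Choose a renderer path from the driver's advertised OpenGL
    extensions, the way an engine does at startup. Illustrative only:
    the returned path names are hypothetical."""
    if "GL_NV_depth_clamp" in gl_extensions:
        return "nv_depth_clamp_path"  # vendor fast path, NV-only
    return "generic_path"             # works everywhere, often slower

print(pick_shadow_path({"GL_NV_depth_clamp", "GL_ARB_multitexture"}))
print(pick_shadow_path({"GL_ATI_fragment_shader"}))
```

If the fast path only exists for one vendor's extension, the other vendor's hardware falls onto the generic path regardless of its raw capability, which matches the behavior described above.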
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Originally posted by: evolucion8
Well, nVidia-optimized games usually tend to use quite a lot of vendor-specific hardware extensions which won't work on ATi cards and will slow down their performance, like what happened with Doom 3; that game for some reason was created to run slow on ATi cards, but other games on the same engine performed as fast or faster on ATi cards, especially from the previous generation, like X1K vs the 7 series, in games like Quake 4 and Prey for example. I find that nVidia-optimized games look slightly blocky, more CG- or OpenGL-style, while ATi-optimized games have a more realistic look, with some plain-looking textures in places and outstanding pixel shader effects; the Xbox 360 inherited the same strength, slightly blurry textures with pixel shaders everywhere...

You have it all wrong; Doom 3 ran faster because of stencil shadows, which were supported in hardware by NV and not by ATI. When the X1K series came, ATI's OpenGL rendering VASTLY improved, and it improved even further through Catalyst updates.

Even to this day, 6 series cards are much better than X8 series cards in OpenGL.

But I digress, this is hardly what's important in this thread.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Piuc2020
Originally posted by: evolucion8
Well, nVidia-optimized games usually tend to use quite a lot of vendor-specific hardware extensions which won't work on ATi cards and will slow down their performance, like what happened with Doom 3; that game for some reason was created to run slow on ATi cards, but other games on the same engine performed as fast or faster on ATi cards, especially from the previous generation, like X1K vs the 7 series, in games like Quake 4 and Prey for example. I find that nVidia-optimized games look slightly blocky, more CG- or OpenGL-style, while ATi-optimized games have a more realistic look, with some plain-looking textures in places and outstanding pixel shader effects; the Xbox 360 inherited the same strength, slightly blurry textures with pixel shaders everywhere...

You have it all wrong; Doom 3 ran faster because of stencil shadows, which were supported in hardware by NV and not by ATI. When the X1K series came, ATI's OpenGL rendering VASTLY improved, and it improved even further through Catalyst updates.

Even to this day, 6 series cards are much better than X8 series cards in OpenGL.

But I digress, this is hardly what's important in this thread.

Stencil shadows use a hardware-accelerated path that is accessible through an OpenGL extension; it takes advantage of the 32 Z operations per clock available when no new color values are written. ATi can do 32 Z operations as long as anti-aliasing is on (on the HD series, it can do 32 Z the same way nVidia does). The X1K series also has shadow acceleration through Fetch4, which works nicely with OpenGL games, plus the performance improvement from a rewritten OpenGL driver...
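The Z-only throughput matters because of how stencil shadow volumes are counted: the volume geometry is rasterized with color writes off, the stencil buffer is incremented for front faces and decremented for back faces, and a pixel with a nonzero count ends up inside a volume, i.e. in shadow. A toy sketch of that per-pixel counting (depth-pass variant; the list-of-faces representation is mine, for illustration):

```python
def in_shadow(volume_faces):
    """Depth-pass stencil counting for shadow volumes: +1 per front
    face and -1 per back face drawn in front of the pixel's depth;
    a nonzero final count means the pixel is inside a volume."""
    stencil = 0
    for facing in volume_faces:
        stencil += 1 if facing == "front" else -1
    return stencil != 0

print(in_shadow(["front"]))           # entered a volume, never left
print(in_shadow(["front", "back"]))   # entered and exited: lit
```

Every one of those increments/decrements is a depth/stencil-only operation over large screen areas, which is exactly the workload the double-speed Z paths discussed above accelerate.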