Inquirer Reports Catalyst 5.6 huge improvement


BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
It'll be nice if it's true but I have my doubts.

And yes, a 15%-20% gain is quite possible through a driver release. Much higher in fact if there were bugs holding things back.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Elcs
Originally posted by: raystorm
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.

damn... that's true. I shouldn't get my hopes up too high that my 9800 will get those nice performance increases... if that article is true anyway.

:(

Anyway, haven't Nvidia produced a 15-20% overall improvement with a simple driver release?

I can't see why it can't happen, but it does seem a bit high.

Well, they do give explanations as to where the improvements come from, which makes it *sound* more legit, and ATi said a while back they were working on a new OGL driver, so who knows?
LOMAC sounds like fixing an in-game bug to get a 50% improvement, though.
 

FlasHBurN

Golden Member
Oct 12, 1999
1,349
0
76
Just wait until they are released. I think the Riddick improvements will surprise many. :) As well as the Doom 3 performance. I'm betting you need an X700+ card to see the best improvements, and they will mostly be improvements at lower (software-limited) resolutions.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
The only way to achieve those kinds of performance increases would be if there were a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.

The article is BS.

-Kevin
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.
the Cats are UNIFIED drivers . . . all the 9800/x800 series


============
Originally posted by: Gamingphreek
The only way to achieve those kinds of performance increases would be if there were a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.

The article is BS.

-Kevin

MOST likely your post is
:roll:

"saved" for future 'i told you so'
:thumbsdown:
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
Originally posted by: apoppin
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.
the Cats are UNIFIED drivers . . . all the 9800/x800 series


============
Originally posted by: Gamingphreek
The only way to achieve those kinds of performance increases would be if there were a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.

The article is BS.

-Kevin

MOST likely your post is
:roll:

"saved" for future 'i told you so'
:thumbsdown:
I know they are unified. But ATI does sometimes change items in the Cats that only affect a certain product line.

And I too quoted Gamingphreek for later analysis up above. He might be right but if he's not he's gonna look like an ass. :)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Megatomic
Originally posted by: apoppin
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.
the Cats are UNIFIED drivers . . . all the 9800/x800 series


============
Originally posted by: Gamingphreek
The only way to achieve those kinds of performance increases would be if there were a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.

The article is BS.

-Kevin

MOST likely your post is
:roll:

"saved" for future 'i told you so'
:thumbsdown:
I know they are unified. But ATI does sometimes change items in the Cats that only affect a certain product line.

And I too quoted Gamingphreek for later analysis up above. He might be right but if he's not he's gonna look like an ass. :)

of course if he's 'right' we'll let this quietly drop ;)
:D

As usual, ATI is exaggerating [where DOES the inq get this? an ati employee, duh]

clearly the 5.6 will be a "major" upgrade [which nVidia will "tear apart" looking for "cheating"] that will give "UP TO" the percentages this source is claiming . . . but don't expect "miracles" . . .
;)
 

spazo

Senior member
Apr 5, 2004
344
0
0
Originally posted by: Elcs
Originally posted by: raystorm
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.

damn... that's true. I shouldn't get my hopes up too high that my 9800 will get those nice performance increases... if that article is true anyway.

:(

Anyway, haven't Nvidia produced a 15-20% overall improvement with a simple driver release?

I can't see why it can't happen, but it does seem a bit high.


Well, if anyone cares to remember, the 68xx series performed very badly with the first driver releases. It wasn't until they had some later ones that the 68xx started to perform well.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Insomniak
Originally posted by: Gamingphreek
Originally posted by: gsellis
The Inquirer's 5.6 article

Dang, I figured this would be the latest "Yeah, Right" thread going. Leaking news (remember, The Inquirer does not sign NDAs), they say 5.6 will be a huge improvement in many games, with LOMAC gaining 50% (not that I play it).

Just so expectations are set, expect to see ATI, nVidia, and The Inquirer trashed in this thread. ;)


I call BS right now.

They "have a feeling" it will be released on June 7 :confused:!?! Wtf is that. That whole article is BS.

-Kevin



I'm with phreek. 15-20% performance improvements?

From a driver release?



Sure. Whatever.

I'll be laughing when this one launches.


ATI has done this in the past (remember the 8500 launch drivers?). There was something like 35%+ more performance milked out of that card over the next year.

Also, Nvidia has done this several times in the past with driver releases. I remember two major detonator releases like this at least (THG, Anandtech and others had large articles on the updates).

It's possible to get more performance out of these cards - that's why they have teams working on it!
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: spazo

Well, if anyone cares to remember, the 68xx series performed very badly with the first driver releases. It wasn't until they had some later ones that the 68xx started to perform well.

This is true. The 6x series didn't get an official driver for several months after it was released. When the 6800GT first came out, it was very close between it and the X800 Pro. The GT was still faster, but not like it is today, being much faster in just about everything.

edit,
Originally posted by: jiffylube1024

Also, Nvidia has done this several times in the past with driver releases. I remember two major detonator releases like this at least (THG, Anandtech and others had large articles on the updates).

I remember Anand's article, and later on they were not happy with NV. It didn't come out when it was supposed to, and they swore never to use beta drivers in reviews again. Of course, that didn't last long.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: jiffylube1024
Originally posted by: Insomniak
Originally posted by: Gamingphreek
Originally posted by: gsellis
The Inquirer's 5.6 article

Dang, I figured this would be the latest "Yeah, Right" thread going. Leaking news (remember, The Inquirer does not sign NDAs), they say 5.6 will be a huge improvement in many games, with LOMAC gaining 50% (not that I play it).

Just so expectations are set, expect to see ATI, nVidia, and The Inquirer trashed in this thread. ;)


I call BS right now.

They "have a feeling" it will be released on June 7 :confused:!?! Wtf is that. That whole article is BS.

-Kevin



I'm with phreek. 15-20% performance improvements?

From a driver release?



Sure. Whatever.

I'll be laughing when this one launches.


ATI has done this in the past (remember the 8500 launch drivers?). There was something like 35%+ more performance milked out of that card over the next year.

Also, Nvidia has done this several times in the past with driver releases. I remember two major detonator releases like this at least (THG, Anandtech and others had large articles on the updates).

It's possible to get more performance out of these cards - that's why they have teams working on it!

actually ati is working on their drivers because they FEAR nVidia's G70 3DMark and Doom3 scores . . . they don't want to be caught [again] like they were with the R8500 vs GF4 debacle.

They will be looking to squeeze every last drop of performance out of their "old" cards . . . . i know since the 4.X to 5.X Cats my 3DMark'05 score has gone up by more than 10% [it's "watchable" on my 9800xt now] . . . should easily break 3K Marks now.

;)

certainly it's possible . . . likely even . . . just not as much as fanATIcs would like to see.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: apoppin

actually ati is working on their drivers because they FEAR nVidia's G70 3DMark and Doom3 scores . . . they don't want to be caught [again] like they were with the R8500 vs GF4 debacle.

They will be looking to squeeze every last drop of performance out of their "old" cards . . . . i know since the 4.X to 5.X Cats my 3DMark'05 score has gone up by more than 10% [it's "watchable" on my 9800xt now] . . . should easily break 3K Marks now.

Cool! I haven't run any of the 3dmarks or any benchmarks in ages... I sold my X800 Pro and downgraded to a 9800 Pro long ago. It's good to see ATI is still squeezing more out of our cards!


Still, this is entirely uncomparable to the 8500 situation - that was still the "old" and slow ATI, before the ArtX acquisition. This is the "new and improved" ATI, that is just as clever, sneaky and underhanded as Nvidia ;) .
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: jiffylube1024
Originally posted by: apoppin

actually ati is working on their drivers because they FEAR nVidia's G70 3DMark and Doom3 scores . . . they don't want to be caught [again] like they were with the R8500 vs GF4 debacle.

They will be looking to squeeze every last drop of performance out of their "old" cards . . . . i know since the 4.X to 5.X Cats my 3DMark'05 score has gone up by more than 10% [it's "watchable" on my 9800xt now] . . . should easily break 3K Marks now.

Cool! I haven't run any of the 3dmarks or any benchmarks in ages... I sold my X800 Pro and downgraded to a 9800 Pro long ago. It's good to see ATI is still squeezing more out of our cards!


Still, this is entirely uncomparable to the 8500 situation - that was still the "old" and slow ATI, before the ArtX acquisition. This is the "new and improved" ATI, that is just as clever, sneaky and underhanded as Nvidia ;) .

i wouldn't say uncomparable . . . i just did
:roll:

i am "saying" ATI "fears" nVidia's g70 and is working on "free performance" for its customers to mitigate disparities in upcoming benchmarks.

Guess what nVidia is doing?
:evil:
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
I don't think I have ever seen anyone who thought they had all the answers as much as you do, apop.
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
apoppin: There you go again, spewing all this pro-NVDA BS. You were saying the same thing before the GF6 came out, and you saw what happened. Aside from DIII, ATi was either even or had the upper hand, especially in D3D titles like HL2 and Far Cry. SM3.0 is BS at this point.

What you've got now are two companies with two powerful GPUs playing a poker game, trying to see who will reveal first. And both know that, once revealed, they are under huge pressure to deliver in quantity. I have a feeling they will be similar performance-wise, but ATi will continue having problems supplying chips at 90 nanometers. On the other hand, NVDA's G70 at 110 nanometers may have heat issues.

Time will tell, but it's too early to crown a winner here.
 

FlasHBurN

Golden Member
Oct 12, 1999
1,349
0
76
Originally posted by: Gamingphreek
The only way to achieve those kinds of performance increases would be if there were a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.

The article is BS.

-Kevin

Saved for future reference. Can't wait to see you eat your words in a few days.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Elcs
Originally posted by: raystorm
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.

damn... that's true. I shouldn't get my hopes up too high that my 9800 will get those nice performance increases... if that article is true anyway.

:(

Anyway, haven't Nvidia produced a 15-20% overall improvement with a simple driver release?

I can't see why it can't happen, but it does seem a bit high.


The biggest driver performance increase I can recall was when Nvidia circulated those Detonators for the GeForce 3 Ti right before the Radeon 8500 launch, which put them ahead of the 8500.

This was done, of course, to steal ATi's thunder completely, and it did that perfectly. IIRC the performance increase there was around 10% across the board.

20% is just ridiculous.

You can get 20+% performance improvement out of a card over its lifespan through drivers, but in a single driver? I'll believe it when I see it.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Well, the X800 has been out for a year, so ATI has had enough time to wring out that much extra performance.

Remember, "driver" increases can come from app-specific fixes as well as generic improvements. AFAIK, iD removed their NV30 path b/c nV put all those optimizations into its drivers. ATI could be doing the same now, especially since they recently dropped their (public) aversion to app-specific fixes (Catalyst AI).

Rumor is that R520 will show in September in limited quantities due to low 90nm yields, but will be faster than G70. But G70 is due for retail availability in just a couple of weeks, which meshes with a June 7th launch, per that Inquirer date. So Fuad may be spot on.
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
Originally posted by: GTaudiophile
apoppin: There you go again, spewing all this pro-NVDA BS. You were saying the same thing before the GF6 came out, and you saw what happened. Aside from DIII, ATi was either even or had the upper hand, especially in D3D titles like HL2 and Far Cry. SM3.0 is BS at this point.

What you've got now are two companies with two powerful GPUs playing a poker game, trying to see who will reveal first. And both know that, once revealed, they are under huge pressure to deliver in quantity. I have a feeling they will be similar performance-wise, but ATi will continue having problems supplying chips at 90 nanometers. On the other hand, NVDA's G70 at 110 nanometers may have heat issues.

Time will tell, but it's too early to crown a winner here.



G70 WILL PWN EVERYTHING, TELL HIM APONIN.
 

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
Originally posted by: BouZouki
Originally posted by: GTaudiophile
apoppin: There you go again, spewing all this pro-NVDA BS. You were saying the same thing before the GF6 came out, and you saw what happened. Aside from DIII, ATi was either even or had the upper hand, especially in D3D titles like HL2 and Far Cry. SM3.0 is BS at this point.

What you've got now are two companies with two powerful GPUs playing a poker game, trying to see who will reveal first. And both know that, once revealed, they are under huge pressure to deliver in quantity. I have a feeling they will be similar performance-wise, but ATi will continue having problems supplying chips at 90 nanometers. On the other hand, NVDA's G70 at 110 nanometers may have heat issues.

Time will tell, but it's too early to crown a winner here.



G70 WILL PWN EVERYTHING, TELL HIM APONIN.




roflmfao
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Pete
AFAIK, iD removed their NV30 path b/c nV put all those optimizations into its drivers.

Sort of. :p

NVIDIA (finally) added support for partial precision hints in their OpenGL fragment shader extensions for NV3X -- so that you could write a 32-bit shader and just tell the driver to render it in 16-bit precision if possible. Before that, Doom3 needed a separate rendering path that explicitly had shaders with 16-bit precision. It wasn't that NVIDIA had magically made the FX chips fast enough to not need the reduced precision (although one of the blurbs talking about the change sort of implied that).
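
For the curious, the hint itself is literally one line of the program text. Here's a rough, made-up sketch in C (the function name is invented, and it assumes a working GL context with GLEW initialized) of asking for reduced precision via ARB_fragment_program:

/* Rough sketch only -- the function name is made up, and this assumes a
   current GL context with GLEW already initialized. The point is the
   OPTION line: the program is written at full precision, and the hint
   lets the driver drop to 16-bit where it thinks it can get away with it. */
#include <GL/glew.h>
#include <string.h>

static const char *sample_fp =
    "!!ARBfp1.0\n"
    "OPTION ARB_precision_hint_fastest;  # driver may use reduced precision\n"
    "TEMP texel;\n"
    "TEX texel, fragment.texcoord[0], texture[0], 2D;\n"
    "MUL result.color, texel, fragment.color;\n"
    "END\n";

void load_sample_fragment_program(void)
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(sample_fp), sample_fp);
}

I'm not claiming that's exactly what the Doom3 change used, just that this is the generic way to write one full-precision shader and let the driver decide.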

As far as the OP goes, basically, I'll believe it when I see it. No doubt there's a lot of room for improvement in some of these applications, but you sort of have to take this kind of thing with a grain of salt until there are real numbers on the table.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Since they mentioned Catalyst A.I., this means they can start optimizing each game/engine individually. Sort of like how they built the Doom 3 Humus tweak into one of their releases, IIRC. That's the only way I see this as being possible. nVidia will fall behind if ATI starts doing this and they don't. One of the reasons could be desperation, but it's not like the X850XTPE is way behind the 6800U. Perhaps they just want to have an even bigger lead. When they say "no visual quality decrease.. we promise!" I expect nothing short of the opposite, and I suspect some "optimizations" are going on.
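
(For anyone who never saw the Humus tweak: the gist was replacing a dependent lookup-table texture fetch in Doom 3's interaction shader with plain math. A toy C sketch of the same trade-off, with invented names and numbers, nothing to do with the actual shader code:)

/* Toy illustration only -- names and the exponent are invented; the real
   tweak lived inside Doom 3's ARB interaction shader, not in C. The idea:
   replace a dependent lookup-table fetch with direct math. */
#include <math.h>

#define SPEC_EXPONENT 16.0f                   /* arbitrary demo value */

static float spec_table[256];

static void init_spec_table(void)             /* "before": build the table */
{
    for (int i = 0; i < 256; i++)
        spec_table[i] = powf((float)i / 255.0f, SPEC_EXPONENT);
}

static float specular_lookup(float n_dot_h)   /* table fetch, like a TEX */
{
    return spec_table[(int)(n_dot_h * 255.0f)];
}

static float specular_math(float n_dot_h)     /* "after": just compute it */
{
    return powf(n_dot_h, SPEC_EXPONENT);
}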

Originally posted by: Gamingphreek
The only way to achieve those kinds of performance increases would be if there were a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.

The article is BS.

-Kevin

No... it's quite possible they are selectively optimizing those games. I still don't think it will pan out like they say, but there will be at least some improvement. Theoretically there is that potential though. Usually the graphics would be of lower quality as a byproduct of that. I guess doing this isn't "cheating"? This could be more of the Quack3 crap. What do you guys think?

Originally posted by: spazo
Originally posted by: Elcs
Originally posted by: raystorm
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.

damn... that's true. I shouldn't get my hopes up too high that my 9800 will get those nice performance increases... if that article is true anyway.

:(

Anyway, haven't Nvidia produced a 15-20% overall improvement with a simple driver release?

I can't see why it can't happen, but it does seem a bit high.


Well, if anyone cares to remember, the 68xx series performed very badly with the first driver releases. It wasn't until they had some later ones that the 68xx started to perform well.

Yeah, but cards like the 9800 PRO have been out forever, and for a driver release to affect those would be quite an accomplishment. They've definitely had the time to make the 9800 PROs run well by now.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: GTaudiophile
apoppin: There you go again, spewing all this pro-NVDA BS. You were saying the same thing before the GF6 came out, and you saw what happened. Aside from DIII, ATi was either even or had the upper hand, especially in D3D titles like HL2 and Far Cry. SM3.0 is BS at this point.

What you've got now are two companies with two powerful GPUs playing a poker game, trying to see who will reveal first. And both know that, once revealed, they are under huge pressure to deliver in quantity. I have a feeling they will be similar performance-wise, but ATi will continue having problems supplying chips at 90 nanometers. On the other hand, NVDA's G70 at 110 nanometers may have heat issues.

Time will tell, but it's too early to crown a winner here.
So what . . you're the one spewing pro-ATI BS with every one of your mundane posts, including crap about SM 3.0 . . . it IS important, and ATI agrees with me on THAT point.
:thumbsdown:


and it looks like nVidia's g70 will be 90 nanometers :p
:roll:

i haven't crowned a winner . . . just giving my POV.

===============
Originally posted by: BouZouki

G70 WILL PWN EVERYTHING, TELL HIM APONIN.
nonsense . . . and the real race will be between the r600 series and g80 . . . both of these forthcoming GPUs are castrated compared to their console counterparts. ;)
 

ddogg

Golden Member
May 4, 2005
1,864
361
136
Originally posted by: Gamingphreek
The only way to achieve those kinds of performance increases would be if there were a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.

The article is BS.

-Kevin

perfectly put... you can't really get that much performance when there are hardly any bugs in your drivers... both Nvidia and ATI have good drivers; the article is definitely a load of crap.