Originally posted by: Elcs
Originally posted by: raystorm
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.
Damn... that's true. I shouldn't get my hopes up that my 9800 will get those nice performance increases... if that article is true anyway.
Anyway, haven't Nvidia produced a 15-20% overall improvement with a simple driver release?
I can't see why it can't happen, but it does seem a bit high.
the Cats are UNIFIED drivers . . . all the 9800/x800 series
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.
Originally posted by: Gamingphreek
The only way to achieve that kind of performance increase is if there was a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.
The article is BS.
-Kevin
I know they are unified. But ATI does sometimes change items in the Cats that only affect a certain product line.
Originally posted by: apoppin
the Cats are UNIFIED drivers . . . all the 9800/x800 series
Originally posted by: Megatomic
The article doesn't stipulate which cards get it. I read it twice to see if it did.
============
Originally posted by: Gamingphreek
The only way to achieve that kind of performance increase is if there was a flaw in the code as it was (i.e. bad drivers). Both ATI and Nvidia drivers are very good to begin with. Now if XGI made this claim I would be more apt to believe them (same goes for SiS and whatnot), because their drivers are nowhere near as good as they could be.
The article is BS.
-Kevin
MOST likely your post is :roll: "saved" for future 'i told you so' :thumbsdown:
Originally posted by: Megatomic
I know they are unified. But ATI does sometimes change items in the Cats that only affect a certain product line.
MOST likely your post is :roll: "saved" for future 'i told you so' :thumbsdown:
And I too quoted Gamingphreek for later analysis up above. He might be right, but if he's not, he's gonna look like an ass.
[french accent] but of course [/french accent] :thumbsup:
Originally posted by: apoppin
of course if he's 'right' we'll let this quietly drop
Originally posted by: Insomniak
Originally posted by: Gamingphreek
Originally posted by: gsellis
The Inquirer's 5.6 article
Dang, I figured this would be the latest "Yeah, Right" thread going. Leaking news (remember, The Inquirer does not sign NDAs), they say 5.6 will be a huge improvement in many games, with LOMAC gaining 50% (not that I play it).
Just so expectations are set, expect to see ATI, nVidia, and The Inquirer trashed in this thread.
I call BS right now.
They "have a feeling" it will be released on June 7!?! WTF is that? That whole article is BS.
-Kevin
I'm with phreek. 15-20% performance improvements?
From a driver release?
Sure. Whatever.
I'll be laughing when this one launches.
Originally posted by: spazo
Well, if anyone cares to remember, the 68xx series performed very badly with the first driver releases. It wasn't until they had some later ones that the 68xx started to perform well.
Originally posted by: jiffylube1024
Originally posted by: Insomniak
I'm with phreek. 15-20% performance improvements?
From a driver release?
Sure. Whatever.
I'll be laughing when this one launches.
ATI has done this in the past (remember the 8500 launch drivers? There was something like 35%+ more performance milked out of that card over the next year).
Also, Nvidia has done this several times in the past with driver releases. I remember two major detonator releases like this at least (THG, Anandtech and others had large articles on the updates).
It's possible to get more performance out of these cards - that's why they have teams working on it!
Originally posted by: apoppin
actually ati is working on their drivers because they FEAR nVidia's G70 3DMark and Doom3 scores . . . they don't want to be caught [again] like they were with the R8500 vs GF4 debacle.
They will be looking to squeeze every last drop of performance out of their "old" cards . . . . i know since the 4.X to 5.X Cats my 3DMark'05 score has gone up by more than 10% [it's "watchable" on my 9800xt now] . . . should easily break 3K Marks now.
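As a quick sanity check on the numbers in the post above: a gain of "more than 10%" only breaks 3K Marks if the starting score was already around 2,750, a baseline assumed here for illustration and not stated in the thread. A minimal sketch in Python:

```python
def score_after_gain(old_score: float, gain_pct: float) -> float:
    """Return a benchmark score after a percentage improvement."""
    return old_score * (1 + gain_pct / 100)

# Hypothetical baseline of 2750 3DMark05 marks (not a figure from the thread):
# a 10% driver-side gain lands just over the 3K mark mentioned above.
print(round(score_after_gain(2750, 10)))  # 3025
```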
Originally posted by: jiffylube1024
Originally posted by: apoppin
actually ati is working on their drivers because they FEAR nVidia's G70 3DMark and Doom3 scores . . . they don't want to be caught [again] like they were with the R8500 vs GF4 debacle.
They will be looking to squeeze every last drop of performance out of their "old" cards . . . . i know since the 4.X to 5.X Cats my 3DMark'05 score has gone up by more than 10% [it's "watchable" on my 9800xt now] . . . should easily break 3K Marks now.
Cool! I haven't run any of the 3DMarks or any benchmarks in ages... I sold my X800 Pro and downgraded to a 9800 Pro long ago. It's good to see ATI is squeezing more out of our cards yet!
Still, this is entirely incomparable to the 8500 situation - that was still the "old" and slow ATI, before the ArtX acquisition. This is the "new and improved" ATI, that is just as clever, sneaky and underhanded as Nvidia.
Originally posted by: GTaudiophile
apoppin: There you go again, spewing all this pro-NVDA BS. You were saying the same before the GF6 came out, and you saw what happened: aside from DIII, ATi was either even or had the upper hand, especially in D3D titles like HL2 and Far Cry. SM3.0 is BS at this point.
What you've got now are two companies with two powerful GPUs playing a poker game, trying to see who will reveal first. And both know, once revealed, they are under huge pressure to deliver in quantity. I have the feeling they will be similar performance wise, but ATi will continue having problems supplying chips at 90 nanometers. On the other side, NVDA's G70 at 110 nanometers may have heat issues.
Time will tell but it's too early to crown a winner here.
Originally posted by: BouZouki
G70 WILL PWN EVERYTHING, TELL HIM APOPPIN.
Originally posted by: Pete
AFAIK, iD removed their NV30 path b/c nV put all those optimizations into its drivers.
So what . . you're the one spewing pro-ATI BS with every one of your mundane posts, including crap about SM 3.0 . . . it IS important, and ATI agrees with me on THAT point.
Originally posted by: GTaudiophile
apoppin: There you go again, spewing all this pro-NVDA BS.
nonsense . . . and the real race will be between the r600 series and g80 . . . both of these forthcoming GPUs are castrated compared to their console counterparts.
Originally posted by: BouZouki
G70 WILL PWN EVERYTHING, TELL HIM APOPPIN.
