Vista beta gets ATI X1900 support before XP


nRollo

Banned
Jan 11, 2002
Originally posted by: munky
A birdie told me the 6.2 cats will offer a slight performance boost...

Holy Moly! New drivers have a slight performance boost!

That is news!
 

BFG10K

Lifer
Aug 14, 2000
Holding a little something back for the upcoming G71s, are we?
G71 is not expected until March, unlike 6.2, which is expected in February.

There were plenty of beta drivers to be found at the time, all of which worked great, as has also been pointed out.
Genx87's claim wasn't for beta drivers; it was for official drivers.
 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: BFG10K
Holding a little something back for the upcoming G71s, are we?
G71 is not expected until March, unlike 6.2, which is expected in February.

There were plenty of beta drivers to be found at the time, all of which worked great, as has also been pointed out.
Genx87's claim wasn't for beta drivers; it was for official drivers.

I know that and said nothing that contradicts that. Prior to the release of the officials, there were plenty of betas, as I stated. I should have known a grammar nazi like you, who is incapable of reading between the lines, wouldn't comprehend the statement.
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Rollo
Originally posted by: munky
A birdie told me the 6.2 cats will offer a slight performance boost...

Holy Moly! New drivers have a slight performance boost!

That is news!

When I said slight... it might have been an understatement.
 

BFG10K

Lifer
Aug 14, 2000
I know that and said nothing that contradicts that.
Except beta drivers were never the issue, so why bring them up? You skirted the issue in an effort to make nVidia look better, that's why. If somebody wants to have a go at ATi's official drivers, they should be making the same claims against nVidia.

Also I'm still waiting for the retraction of your BS claim that you can delete nVidia built-in profiles with the standard control panel.
 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: BFG10K
I know that and said nothing that contradicts that.
Except beta drivers were never the issue, so why bring them up? You skirted the issue in an effort to make nVidia look better, that's why. If somebody wants to have a go at ATi's official drivers, they should be making the same claims against nVidia.

Also I'm still waiting for the retraction of your BS claim that you can delete nVidia built-in profiles with the standard control panel.

Wrong again, Einstein.

Even when nV40 owners didn't have official drivers available, we could still fall back on the beta releases, and still have great feature support and performance (one of those beta drivers put the features in place that allowed us to run the Ruby demo).

By contrast, ATi owners don't even have beta support at present.

Now, why don't you disappear back to your cave, like a good troll?

 

Tom

Lifer
Oct 9, 1999
When was the X1900 XT released - a week and a half ago? It makes no sense to expect new drivers that soon. If you want a stable, proven product, you don't buy it the day it comes out. And that applies to everything: cars, microwaves, etc.

 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: Tom
When was the X1900 XT released - a week and a half ago? It makes no sense to expect new drivers that soon. If you want a stable, proven product, you don't buy it the day it comes out. And that applies to everything: cars, microwaves, etc.

So, how do you explain the drivers released just after 3DMark06 and just before R580's launch?
 

the Chase

Golden Member
Sep 22, 2005
Originally posted by: BFG10K
Holding a little something back for the upcoming G71s, are we?
G71 is not expected until March, unlike 6.2, which is expected in February.

There were plenty of beta drivers to be found at the time, all of which worked great, as has also been pointed out.
Genx87's claim wasn't for beta drivers; it was for official drivers.

Yes, BUT Nvidia would finalize their clock speeds to get the best yields while still beating the X1900s based on the pre-6.2 (final) drivers, and it would be too late to change this by mid-February, as they would already be in production.
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
I would hypothesize that all these angry recriminations and vitriol are conjecture based upon the opinion of theories promoted by fans of corporations that deduce in ad hoc circular-logic style dependent upon theoretical GPU fans that are strangely facing downward in the ATX format... uh... what were we talking about again?

Clock yields of the G71 will smoke your 6.12 because 0010 0001 is the roxors. Errr, 33? Yo, my silicon is going to pound your silicon *ss. You wanna go? :roll: :laugh: Nothing like self-induced hysterical satire when about to fall asleep at the keyboard. I wish everyone would state that ATI and NVIDIA both have great products and some stinker products and let it go. I think there should be a fanboi anonymous 12-step program. Loovvee, Oh baby, let us looovveee. Yeahh

Edit: Fixed the most obvious grammatical errors, but there are plenty more in there that I am too tired to ponder this fine evening.
 

BFG10K

Lifer
Aug 14, 2000
Even when nV40 owners didn't have official drivers available, we could still fall back on the beta releases, and still have great feature support and performance (one of those beta drivers put the features in place that allowed us to run the Ruby demo).
nVidia beta drivers are a joke.

By contrast, ATi owners don't even have beta support at present.
If that were true, it would be impossible to install drivers on an X1900 system and get it working.

Now, why don't you disappear back to your cave, like a good troll?
I'm still waiting for the retraction of your BS claim that you can delete nVidia built-in profiles with the standard control panel.

When can we expect this, Mr "My opinions are my own. I have no business affiliation with either AEG or nVidia"?

That you don't have an affiliation with nVidia or AEG doesn't make it any less BS, so we still need that retraction.
 

Jodiuh

Senior member
Oct 25, 2005
Sweet! Thanks for giving me something to do/read while I wait for my 6.2s! Now I gotta figure out what I'm gonna do tomorrow :(
 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: BFG10K
Even when nV40 owners didn't have official drivers available, we could still fall back on the beta releases, and still have great feature support and performance (one of those beta drivers put the features in place that allowed us to run the Ruby demo).
nVidia beta drivers are a joke.

By contrast, ATi owners don't even have beta support at present.
If that were true, it would be impossible to install drivers on an X1900 system and get it working.

Now, why don't you disappear back to your cave, like a good troll?
I'm still waiting for the retraction of your BS claim that you can delete nVidia built-in profiles with the standard control panel.

When can we expect this, Mr "My opinions are my own. I have no business affiliation with either AEG or nVidia"?

That you don't have an affiliation with nVidia or AEG doesn't make it any less BS, so we still need that retraction.

Even when nV40 owners didn't have official drivers available, we could still fall back on the beta releases, and still have great feature support and performance (one of those beta drivers put the features in place that allowed us to run the Ruby demo).
nVidia beta drivers are a joke.
No, they are not. The beta 79.11s are probably the best drivers I've ever seen for nV4x owners. And perhaps I'm misremembering, but at the time ATi and the fanATics didn't think Ruby running on nVidia hardware was funny, let alone a joke - of course, the nVidia owners had the opposite opinion ;)

By contrast, ATi owners don't even have beta support at present.
If that were true, it would be impossible to install drivers on an X1900 system and get it working.
No one said you can't get R580 working (after a fashion) with the 5.13s - it isn't working optimally, though - NOT what you would expect from a US$650 product...

Now, why haven't you gone back into your cave (or under your bridge) yet? Are you one of those "special" trolls too thick to even know where home is?
 

DeathReborn

Platinum Member
Oct 11, 2005
Originally posted by: Tom
When was the X1900 XT released - a week and a half ago? It makes no sense to expect new drivers that soon. If you want a stable, proven product, you don't buy it the day it comes out. And that applies to everything: cars, microwaves, etc.

Cars, microwaves, etc. get far more stringent testing and checks than the graphics industry does. Thousands of people buy cars on launch day and don't have problems like missing fuel support (unless it's some wacky-fuel car).

Version: 66.93
Release Date: November 9, 2004

WHQL drivers for the GeForce 6 series did not arrive until November 2004, but driver support for 6 series cards was in the box - and not hacked or reworked drivers, either.

IMO ATI did a half launch (inadequate driver support) of the X1900 in order to maximise profit. It's not stopping the cards from selling out in multiple stores, though, so their plan is working for now.
 

Tom

Lifer
Oct 9, 1999
Originally posted by: Gstanfor
Originally posted by: Tom
When was the X1900 XT released - a week and a half ago? It makes no sense to expect new drivers that soon. If you want a stable, proven product, you don't buy it the day it comes out. And that applies to everything: cars, microwaves, etc.

So, how do you explain the drivers released just after 3DMark06 and just before R580's launch?


I don't know what 3DMark06 has to do with what I said, or the release of new drivers for older products.

My point is about the wisdom of waiting until a product has a track record or some revisions instead of being an early adopter.
 

BFG10K

Lifer
Aug 14, 2000
No, they are not
Yes, they are. There were a few sets that were so bad they stopped half of my OpenGL games from launching.

but at the time ATi and the fanATics didn't think Ruby running on nVidia hardware was funny let alone a joke - of course the nVidia owners had the opposite opinion
You mean like how Dawn ran better on ATi cards through a wrapper than it did natively on nVidia cards?

No one said you can't get R580 working (after a fashion) with the 5.13s - it isn't working optimally, though - NOT what you would expect from a US$650 product...
You mean like PureVideo? Or do you mean SLI? Or do you mean nVidia dual-core driver optimizations that nVidia still tells users to disable?

Are you one of those "special" trolls too thick to even know where home is?
Not so thick as to not answer a question that will now be asked a third time.

I'm still waiting for the retraction of your BS claim that you can delete nVidia built-in profiles with the standard control panel.

This is the third time I'm asking.
 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: BFG10K
No, they are not
Yes, they are. There were a few sets that were so bad they stopped half of my OpenGL games from launching.

I certainly don't remember issues like that in the recent past; perhaps you didn't clean up previous drivers properly before installing new ones?

but at the time ATi and the fanATics didn't think Ruby running on nVidia hardware was funny let alone a joke - of course the nVidia owners had the opposite opinion
You mean like how Dawn ran better on ATi cards through a wrapper than it did natively on nVidia cards?

I'll tell you what, BFG10K: you post screenshots of Dawn from an ATi card and I'll match them with screenshots from an nVidia card. I guarantee I know who will win - ATi cards couldn't render Dawn's hair and eyelashes correctly, just for starters. Faster does NOT equal better when things are not rendered correctly. Dawn is old hat anyway. Let's see the fanatics get Nalu, Luna or Mad Mod Mike up and running - they couldn't even get Dusk to work.

No one said you can't get R580 working (after a fashion) with the 5.13s - it isn't working optimally, though - NOT what you would expect from a US$650 product...
You mean like PureVideo? Or do you mean SLI? Or do you mean nVidia dual-core driver optimizations that nVidia still tells users to disable?

I don't make use of any of those technologies myself, but I'm unaware of any show-stopper problems with them.

Are you one of those "special" trolls too thick to even know where home is?
Not so thick as to not answer a question that will now be asked a third time.

I'm still waiting for the retraction of your BS claim that you can delete nVidia built-in profiles with the standard control panel.

This is the third time I'm asking.

Keep right on asking, bonehead.
 

BFG10K

Lifer
Aug 14, 2000
I certainly don't remember issues like that in the recent past; perhaps you didn't clean up previous drivers properly before installing new ones?
Or perhaps the next official driver simply didn't have the same problem - which it didn't.

I'll tell you what, BFG10K: you post screenshots of Dawn from an ATi card and I'll match them with screenshots from an nVidia card. I guarantee I know who will win - ATi cards couldn't render Dawn's hair and eyelashes correctly, just for starters.
Except ATi rendered the shaders with higher precision and IQ than nVidia did. IIRC the eyelash/hair problem was a bug in the wrapper, which was fixed in a newer version later on.

I don't make use of any of those technologies myself, but I'm unaware of any show-stopper problems with them.
You are unaware of any show-stopping problems with PureVideo when it was released? ROFL. How about the fact that it didn't work for months? How's that for show-stopping?

As for SLI and dual-core optimizations, try reading the driver readme sometime and see what nVidia themselves list as known problems. That you are "unaware" doesn't mean jack diddly.

Keep right on asking, bonehead.
How lovely: resort to childish insults while continuing to post your pro-nV propaganda lies. Don't worry, I think we all know exactly where you stand, Mr "my views are my own".

The fact is you have been wrong so many times it's not even funny, yet you continue posting the same BS over and over again as if nothing happened.
 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: BFG10K
I certainly don't remember issues like that in the recent past; perhaps you didn't clean up previous drivers properly before installing new ones?
Or perhaps the next official driver simply didn't have the same problem - which it didn't.

There never was a problem, genius. If you are referring to Riddick, it was a game problem for which a patch was released (and which could be worked around with a command-line argument to the game).

I'll tell you what, BFG10K: you post screenshots of Dawn from an ATi card and I'll match them with screenshots from an nVidia card. I guarantee I know who will win - ATi cards couldn't render Dawn's hair and eyelashes correctly, just for starters.
Except ATi rendered the shaders with higher precision and IQ than nVidia did. IIRC the eyelash/hair problem was a bug in the wrapper, which was fixed in a newer version later on.

First of all, Dawn runs exactly as nVidia intended it to run, and with the intended precision. If you go do your homework, you will find that nVidia never called it a DX9 demo - in fact, it's written in Cg.

About the wrapper: ATi users have to use shader replacement to run Dawn (just like 9700 users have to use shader replacement for Ruby). nVidia, on the other hand, requires no shader replacement to run the Ruby demo, and I've got quotes from the author of the wrapper just waiting for you to argue otherwise.
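For readers unfamiliar with the term being argued over, a minimal, hypothetical sketch of what "shader replacement" in a wrapper means: the wrapper sits between the app and the driver, and when it recognizes a shader the target hardware can't run as-is, it hands the driver a substitute. Every name, the lookup table, and the fingerprinting scheme below are invented for illustration; this is not the actual wrapper's code.

    import hashlib

    # Invented lookup table: fingerprint of an original shader -> substitute source.
    REPLACEMENTS = {
        hashlib.md5(b"original vendor-specific shader").hexdigest():
            "rewritten shader the other vendor's hardware can execute",
    }

    def real_compile(source):
        # Stand-in for the driver's actual shader-compile entry point.
        return "compiled(%s)" % source

    def wrapped_compile(source):
        # What the wrapper exports in place of the real call: if it recognizes
        # the shader, it swaps in the substitute before the driver sees it.
        key = hashlib.md5(source.encode()).hexdigest()
        return real_compile(REPLACEMENTS.get(key, source))

    print(wrapped_compile("original vendor-specific shader"))  # gets replaced
    print(wrapped_compile("any other shader"))                 # passes through unchanged

Whether a given demo needs entries in that table on a given card is exactly what the two posters disagree about.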


I don't make use of any of those technologies myself, but I'm unaware of any show-stopper problems with them.
You are unaware of any show-stopping problems with PureVideo when it was released? ROFL. How about the fact that it didn't work for months? How's that for show-stopping?

So far as I'm aware, PureVideo has worked from the day you were able to download and install it - for MPEG2 (99.999% of all available content), anyway.

As for SLI and dual-core optimizations, try reading the driver readme sometime and see what nVidia themselves list as known problems. That you are "unaware" doesn't mean jack diddly.

Most of those problems can be worked around, however. Go argue with Rollo over SLI - personally, I don't care about it.

Keep right on asking, bonehead.
How lovely: resort to childish insults while continuing to post your pro-nV propaganda lies. Don't worry, I think we all know exactly where you stand, Mr "my views are my own".

The fact is you have been wrong so many times it's not even funny, yet you continue posting the same BS over and over again as if nothing happened.

 

BFG10K

Lifer
Aug 14, 2000
There never was a problem, genius.
According to you. But then, according to you, you can remove built-in nVidia driver profiles using the remove button. :cookie:

If I were a betting man, I wouldn't be betting on you, that's for sure.

If you are referring to Riddick,
I'm not, so why bring up such a strawman?

First of all, Dawn runs exactly as nVidia intended it to run, and with the intended precision.
Great. Except what I said was that ATi ran the demo with higher precision, which is a counter to your claim that ATi ran it at lower IQ.

About the wrapper: ATi users have to use shader replacement to run Dawn (just like 9700 users have to use shader replacement for Ruby). nVidia, on the other hand, requires no shader replacement to run the Ruby demo, and I've got quotes from the author of the wrapper just waiting for you to argue otherwise.
Shader replacement has what to do with your incorrect claim that ATi ran the Dawn demo with lower IQ?

You said there was a problem with eyelashes/hair. I stated this was actually a problem with the wrapper, which was fixed later on. You then countered with the shader replacement comments above.

Do you even know how to construct a sequitur argument? Or do you just post random comments in the hopes of detracting from the original issue?

So far as I'm aware, PureVideo has worked from the day you were able to download and install it - for MPEG2 (99.999% of all available content), anyway.
PureVideo did not do what it advertised on $500 cards, and it also took months to fix. 6.2 will be here in a few weeks, if even that.

Most of those problems can be worked around, however.
:roll:

They can be "worked around" by disabling SLI or by disabling dual-core optimizations. That's not a workaround; that's disabling a broken implementation.

Both SLI and dual core have been with us for months and they are still broken - a far cry from 6.2 arriving in a few weeks' time, if even that. That you choose to slam ATi over a limbo period of a few weeks without official X1900 drivers, while ignoring nVidia problems that have been going on for months, speaks volumes about your pro-nV bias.

And for the fourth time: retract your BS comments about being able to delete nVidia built-in profiles with the standard control panel.

If you don't answer this time, I will assume you are incapable of even the most basic reading and comprehension tasks.
 

BFG10K

Lifer
Aug 14, 2000
Just ignore BFG; he's a tool that likes to argue for the sake of arguing even if he has no point.
Don't you have a date with your 17 MB CCC?
 

Gstanfor

Banned
Oct 19, 1999
This really is my last comment to you.

Increased precision DOES NOT mean increased IQ. Increased precision is used when a lower precision cannot successfully provide usable results for a given task.

Given that nVidia designed the Dawn demo, I think they have a rather better understanding than you of which precisions are appropriate where in the program.

Still on IQ, because your wrapper uses shader replacements, the IQ isn't comparable in the first place because different things are being rendered (and in ATi's case, things that the designer of the program never envisaged or accounted for)!
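To make the precision point concrete: a minimal numpy sketch, with invented values, of a case where half precision silently loses small contributions that single precision keeps - the sense in which "a lower precision cannot successfully provide usable results". It illustrates the general principle only; it is not code from either demo.

    import numpy as np

    # Accumulate many small terms - the kind of arithmetic a long pixel shader does.
    acc16 = np.float16(1.0)  # half precision: ~10 mantissa bits
    acc32 = np.float32(1.0)  # single precision: ~23 mantissa bits
    step = 1e-4

    for _ in range(1000):
        acc16 = np.float16(acc16 + np.float16(step))  # each add rounds back to 1.0
        acc32 = np.float32(acc32 + np.float32(step))

    print(acc16)  # 1.0  - every increment fell below float16's resolution near 1.0
    print(acc32)  # ~1.1 - the expected result

Where the arithmetic never strays into territory like this, the extra bits of higher precision buy nothing visible - which is the distinction both posters are circling.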
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
Increased precision DOES NOT mean increased IQ.
Yes, it does. You may be able to argue that you can't see a difference, but that's not the same as your incorrect claim that ATi has inferior IQ.

Given that nVidia designed the Dawn demo, I think they have a rather better understanding than you of which precisions are appropriate where in the program.
AKA they had a better understanding of the NV3x's performance limitations.

Still on IQ, because your wrapper uses shader replacements, the IQ isn't comparable in the first place because different things are being rendered
Just because there's a wrapper doesn't mean the IQ is different. You may have a claim about FP32, but on that poor hardware I seriously doubt nVidia would've been using that level of precision very much, if at all.