81.26 nvidia drivers news from INQ

imported_dwalton

Junior Member
Aug 12, 2005
19
0
0

ForceWare 81.26 Win2000/XP
http://downloads.guru3d.com/download.php?det=1181

NGO NVIDIA Optimized Driver v81.26
http://downloads.guru3d.com/download.php?det=1182


http://www.theinquirer.net/?article=26276

Nvidia is busy right now finishing "ultimate-choice" drivers, ForceWare Rel80.

Rel80 will offer more flexibility to SLI than any previous driver has. From now on, you will be "free", if you have enough cash, to mix cards from different vendors as you please. That does not mean you can pair a 7800GTX with a 6800GT; rather, it means the following:

a. You bought a 6800GT from Cherry for $400.
b. You would like to upgrade a bit, but Cherry's 6800GT isn't available anymore.
c. You plug in an Apricot 6800GT, which you bought for $200-220.
d. You install Rel80 and enable SLI; no need to reboot the system.
e. Wurk. Wurk. Wurk. (the grunt's speech from WarCraft III)

And if you dare to question whether the clocks differ, or cheekily ask yourself, "The first one has one DVI and one D-SUB, the second has DVIs only - what now?", the answer is that Rel80 does not give a fudge.

It'll work. Rel80 also supports different clocks on each chip, so you can run a maximum overclock on both cards regardless of what the other is doing - mixing a 7800GTX at 478/1310 MHz with one at 495/1340 MHz should be no problem at all.

Rel80 also brings an end to one pain we have already written about, and that's vertical synchronisation. With the new drivers, SLI will support Vsync in all combinations, which is something many owners have been praying for. Those that believe in a god, or gods, that is.
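For context, here is a small illustrative sketch of my own (not from the article, and not NVIDIA driver code) of what the application side of this looks like: a Windows OpenGL game just asks for vsync by setting the swap interval via the WGL_EXT_swap_control extension, and whether the buffer swap actually waits for vertical refresh is then the driver's job - which is exactly the part Rel80 is said to fix for SLI. The helper name EnableVsync is mine.

// Illustrative sketch only: how a Windows OpenGL application typically requests vsync.
// Honouring it (especially across an SLI pair) is entirely up to the driver.
#include <windows.h>
#include <GL/gl.h>

// Function pointer type for the WGL_EXT_swap_control entry point.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

bool EnableVsync()
{
    // A current OpenGL rendering context is required before calling this.
    PFNWGLSWAPINTERVALEXTPROC pSwapInterval =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (pSwapInterval == NULL)
        return false;                 // extension not exposed by the driver
    return pSwapInterval(1) != 0;     // 1 = wait for one vertical refresh per buffer swap
}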

There are also three things that were expected - dual core improvements, HDTV and Linux.

First of all, Rel80 should give your spankin' new dual-core system (or an oldish Athlon MP) a new life, since the drivers are supposed to be multi-threaded and offer some performance increase over single-core systems. As we have already seen on an outdated, single-threaded benchmark named 3DMark2001 SE, a dual-core CPU can beat a single-core one (a 4800+ broke 30K, a 4000+/FX-53 can't) - and that's just the beginning.
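To illustrate what a "multi-threaded driver" means in practice, here is a toy sketch of my own (the names DriverWorker, DrawCall and Submit are hypothetical, and this is not NVIDIA's actual implementation): the game thread only queues draw calls, while a worker thread, ideally scheduled on the second core, does the heavier validation and command building, so a dual-core CPU can overlap the two.

// Toy sketch of the idea behind a multi-threaded driver: the game thread
// enqueues work cheaply, while a worker thread does the expensive processing.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

struct DrawCall { int id; };

class DriverWorker {
public:
    DriverWorker() : worker_(&DriverWorker::Run, this) {}

    ~DriverWorker() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_one();
        worker_.join();
    }

    // Called from the game/render thread: just enqueue and return immediately.
    void Submit(DrawCall call) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(call);
        }
        cv_.notify_one();
    }

private:
    void Run() {
        for (;;) {
            std::unique_lock<std::mutex> lock(mutex_);
            cv_.wait(lock, [this] { return done_ || !queue_.empty(); });
            if (queue_.empty() && done_)
                return;
            DrawCall call = queue_.front();
            queue_.pop();
            lock.unlock();
            // Stand-in for the expensive part: state validation and
            // GPU command translation running on the second core.
            std::cout << "processed draw call " << call.id << "\n";
        }
    }

    std::mutex mutex_;
    std::condition_variable cv_;
    std::queue<DrawCall> queue_;
    bool done_ = false;
    std::thread worker_;   // declared last so the members above exist before it starts
};

int main() {
    DriverWorker driver;
    for (int i = 0; i < 5; ++i)
        driver.Submit(DrawCall{i});  // the "game thread" never blocks on this work
}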

Second of all, HDTV. With Rel80, you can freely connect your SLI setup to a standard or HD TV set and play at whatever resolution you want. With the speed of SLI, of course.


 

Kalessian

Senior member
Aug 18, 2004
825
12
81
This has turned out to be an outstanding driver release, IMO. I don't think a mere update has packed this many features before.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Originally posted by: Kalessian
This has turned out to be an outstanding driver release, IMO. I don't think a mere update has packed this many features before.

I agree. I really like the different-speed allowance; it's ten times better than the approach where, if you buy a better master card and slave it to a slower one, the master steps down to the slave's levels (not trying to take a stab at ATI, but it's true). I was really certain that something like this would require SLI2 or the like before they could get it worked out. The dual-core performance increase is also a feature I'll be greatly happy with.
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
It should be obvious that nVidia is trying to steal ATI's R520 thunder like they did with the Radeon 8500.

They will use IQ hax0rs galore to get the FPS they think they need to put their 7800GTX above the X1800XT. And of course, we can expect the ever NVDA-loyal Anand to disregard this possibility and skip IQ comparisons entirely.

Expect nVidia to officially launch these drivers within hours of R520's release.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
SLI will support Vsync on all combinations
I so hope that is true. I will have to test this as soon as I get my second KO from eVGA. Meanwhile, I have been playing with a single KO with v-sync, and I realized how much better I like playing with v-sync on.

Edit: so far the only disappointment I've had with the 81.26 betas is that the AF shimmering is still present.
 

entropy1982

Golden Member
Jul 10, 2005
1,053
0
0
Only sucky thing is you can't combine a 7800 with the next-gen NVIDIA cards (I'm assuming this since you can't combine a 6800 with a 7800). It would only be to NVIDIA's advantage to have customers buy their next card no matter whose next-gen part is better (ATI or NV), since you wouldn't want your 7800GTX going to waste even in a couple of years. Oh well :)
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: GTaudiophile
It should be obvious that nVidia is trying to steal ATI's R520 thunder like they did with the Radeon 8500.

They will use IQ hax0rs galore to get the FPS they think they need to put their 7800GTX above the X1800XT. And of course, we can expect the ever NVDA-loyal Anand to disregard this possibility and skip IQ comparisons entirely.

Expect nVidia to officially launch these drivers within hours of R520's release.

I am pretty sure the quote "Your hatred for AT is apparent, and you are scum" fits pretty well :D Seriously, why post here if you dislike Anand? I do not find him biased.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: ArchAngel777
Originally posted by: GTaudiophile
It should be obvious that nVidia is trying to steal ATI's R520 thunder like they did with the Radeon 8500.

They will use IQ hax0rs galore to get the FPS they think they need to put their 7800GTX above the X1800XT. And of course, we can expect the ever NVDA-loyal Anand to disregard this possibility and skip IQ comparisons entirely.

Expect nVidia to officially launch these drivers within hours of R520's release.

I am pretty sure the quote "Your hatred for AT is apparent, and you are scum" fits pretty well :D Seriously, why post here if you dislike Anand? I do not find him biased.

:D (and it's in my sig because it was aimed at me the first time ;))
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
That sounds too good to be true. All those things from Nv. Impressive.
 

entropy1982

Golden Member
Jul 10, 2005
1,053
0
0
Originally posted by: dwalton

ForceWare 81.26 Win2000/XP
http://downloads.guru3d.com/download.php?det=1181

NGO NVIDIA Optimized Driver v81.26
http://downloads.guru3d.com/download.php?det=1182


http://www.theinquirer.net/?article=26276

Nvidia is busy right now finishing "ultimate-choice" drivers, ForceWare Rel80.

Rel80 will offer more flexibility to SLI than any previous driver has. From now on, you will be "free", if you have enough cash, to mix cards from different vendors as you please. That does not mean you can pair a 7800GTX with a 6800GT; rather, it means the following:

a. You bought a 6800GT from Cherry for $400.
b. You would like to upgrade a bit, but Cherry's 6800GT isn't available anymore.
c. You plug in an Apricot 6800GT, which you bought for $200-220.
d. You install Rel80 and enable SLI; no need to reboot the system.
e. Wurk. Wurk. Wurk. (the grunt's speech from WarCraft III)

And if you dare to question whether the clocks differ, or cheekily ask yourself, "The first one has one DVI and one D-SUB, the second has DVIs only - what now?", the answer is that Rel80 does not give a fudge.

It'll work. Rel80 also supports different clocks on each chip, so you can run a maximum overclock on both cards regardless of what the other is doing - mixing a 7800GTX at 478/1310 MHz with one at 495/1340 MHz should be no problem at all.

Rel80 also brings an end to one pain we have already written about, and that's vertical synchronisation. With the new drivers, SLI will support Vsync in all combinations, which is something many owners have been praying for. Those that believe in a god, or gods, that is.

There are also three things that were expected - dual core improvements, HDTV and Linux.

First of all, Rel80 should give your spankin' new dual-core system (or an oldish Athlon MP) a new life, since the drivers are supposed to be multi-threaded and offer some performance increase over single-core systems. As we have already seen on an outdated, single-threaded benchmark named 3DMark2001 SE, a dual-core CPU can beat a single-core one (a 4800+ broke 30K, a 4000+/FX-53 can't) - and that's just the beginning.

Second of all, HDTV. With Rel80, you can freely connect your SLI setup to a standard or HD TV set and play at whatever resolution you want. With the speed of SLI, of course.

Can you explain this, please? I mean, I am planning to buy a 7800GTX... I will also have a Hauppauge 550 TV card... so can I record HDTV somehow through my cable box?
 

monster64

Banned
Jan 18, 2005
466
0
0
Nice, finally SLI is getting much closer to an upgrade path. Getting a second 6800 GT on eBay for $200 and running SLI will be sweet.
 

Fenuxx

Senior member
Dec 3, 2004
907
0
76
Well, the 81.26 set is junk as it stands now: quirky graphics anomalies, etc. I tried a pre-beta, XTreme-G-ified, laptop-modded driver on my Inspiron 9300, and it gave me these scores:


--3DMark2001 Overall--
81.26: 18122
77.77: 18568

--3DMark2003 Overall--
81.26: 8362
77.77: 8789

*Tests performed at default settings at 1024x768


Not too good. Too big of a performance drop IMO for just a driver change. I have no doubt that NVIDIA will fix these problems upon release, but as it stands, I would recommend staying away from them ;) .
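For scale, working out those numbers: 18568 - 18122 = 446, roughly a 2.4% drop in 3DMark2001, and 8789 - 8362 = 427, roughly a 4.9% drop in 3DMark2003.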
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: GTaudiophile
It should be obvious that nVidia is trying to steal ATI's R520 thunder like they did with the Radeon 8500.

They will use IQ hax0rs galore to get the FPS they think they need to put their 7800GTX above the X1800XT. And of course, we can expect the ever NVDA-loyal Anand to disregard this possibility and skip IQ comparisons entirely.

Expect nVidia to officially launch these drivers within hours of R520's release.

shut up
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: geforcetony
Well, the 81.26 set is junk as it stands now: quirky graphics anomalies, etc. I tried a pre-beta, XTreme-G-ified, laptop-modded driver on my Inspiron 9300, and it gave me these scores:


--3DMark2001 Overall--
81.26: 18122
77.77: 18568

--3DMark2003 Overall--
81.26: 8362
77.77: 8789

*Tests performed at default settings at 1024x768


Not too good. Too big of a performance drop IMO for just a driver change. I have no doubt that NVIDIA will fix these problems upon release, but as it stands, I would recommend staying away from them ;) .

The only issue I have seen with these drivers so far (and I have tested them with plenty of games, old and new) is something weird with the lighting and shadowing in 3DMark05, and a very slight dropoff in 3DMark scores.

Frankly, I couldn't care less about the 3DMark issues, because games do perform noticeably faster with this release (and with beautiful IQ). I play games, not benchmarks, on my system.

The fact that these drivers perform well in games and not so well in 3DMark is actually an encouraging sign to me as far as overall driver quality is concerned.
 

Fenuxx

Senior member
Dec 3, 2004
907
0
76
As far as IQ is concerned, I really couldn't see much of a difference between 81.26 and 77.77.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Frackal
Did you notice an INCREASE in IQ?

I certainly did. As I said above, the IQ in this release (3DMark05 notwithstanding) is excellent.

Performance-wise and demo-wise, nVidia's "Mad Mod Mike" and "Luna" demos receive massive boosts with these drivers. Interestingly, "Dawn" and "Dusk" are slightly slower, while the very old (GF2 MX) "Toy Soldiers" demo exhibits far smoother panning performance with these drivers than it has for quite some time now.
 

Fenuxx

Senior member
Dec 3, 2004
907
0
76
The 80-series as it stands now is pretty immature. If you like them, go ahead and use them, but they have more problems than they fix, so I would hold off on them for now.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
What problems have you had with them? I haven't had any stability issues with the betas at all, but I haven't exactly run them through a bunch of tests either. Are you using the XTreme-G versions? If so, I would hardly think it's fair to comment on NV's drivers when you're using a tweaked/hacked version.
 

coomar

Banned
Apr 4, 2005
2,431
0
0
I think the Age of Empires 3 demo looks better with 81.26.

I'll be testing the driver out with a lot of games tomorrow (and maybe later tonight).
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Insomniak
Will be very nice if it actually comes true.

What does that mean..? The tests we've seen were actually run on what I would consider publicly available drivers. There isn't any speculation here...