
Nvidia 84.20 drivers

TraumaRN

Diamond Member
So who else has installed the new 84.20 drivers?

If you have, have you noticed the FPS increase in F.E.A.R.?

At 1152x864 I went from averaging 47 FPS with 4X/16X to an average of 56 FPS at the same settings. :Q:thumbsup:

I can also finally play at 1280x960 and get decent FPS, where with 4X/8X I averaged 46 FPS.

Anyone else noticing this with their Nvidia hardware? I know the release notes mention increased performance with AA enabled, but I wasn't expecting this. 🙂

So, comments? And please keep the trolling/flamebait out.
 
Wow, sounds good, will get these babies installed pronto for my laptop's 6800 Ultra Go.
But these should hopefully help my two 7900 GT's out 😀
 
Yeah, Nvidia has really caught up with (surpassed?) ATI in this game with the new drivers. I was amazed at the 7900 series reviews with the 84.17 issue drivers. Not sure if the 84.20s outdo those, but either way it's neck and neck in FEAR now!! Bravo Nvidia!!
 
Those are some very impressive performance boosts. I'll switch over to it once the XG version comes out, which will probably be in a day or two. This may also explain the discrepancies between AT's 7900 and X1900 reviews that people have been talking about.
 
Do you actually get increased fps in the game, and have you measured it, or just the built in benchmark?
 
Originally posted by: munky
Do you actually get increased fps in the game, and have you measured it, or just the built in benchmark?

Built-in benchmark, plus a subjective opinion that the game is much smoother and won't hitch in moments with lots of smoke/explosions/gunfire/Alma appearances. My maximum frame rate also went up, and the percentage over 40 FPS increased: now at 1152 with 4x/16x it's 75% over 40 FPS and 25% between 25-40 FPS. The previous driver was 55% over 40 FPS and 45% between 25-40.

EDIT: I also tweaked the OC on my 7800, bumping it 10 MHz on the core and memory, but that wouldn't account for the increases I'm seeing.
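For anyone who wants to reproduce the over/under-40 FPS split quoted above from their own run, here is a minimal Python sketch. It assumes you have a list of per-frame render times in milliseconds (the function name and the example numbers are hypothetical, chosen to mirror the 75%/25% split in the post):

```python
def fps_buckets(frametimes_ms, threshold_fps=40):
    """Classify each frame by its instantaneous FPS (1000 / frame time in ms)."""
    fps = [1000.0 / t for t in frametimes_ms if t > 0]
    over = sum(1 for f in fps if f >= threshold_fps)
    return {
        "avg_fps": sum(fps) / len(fps),
        "pct_over": 100.0 * over / len(fps),
        "pct_under": 100.0 * (len(fps) - over) / len(fps),
    }

# Hypothetical run: mostly 20 ms frames (50 FPS) with some 30 ms frames (~33 FPS)
stats = fps_buckets([20.0] * 75 + [30.0] * 25)
print(stats["pct_over"])   # 75.0
print(stats["pct_under"])  # 25.0
```

Same idea works for any threshold; just change `threshold_fps`.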
 
Originally posted by: DeathBUA
Originally posted by: munky
Do you actually get increased fps in the game, and have you measured it, or just the built in benchmark?

Built-in benchmark, plus a subjective opinion that the game is much smoother and won't hitch in moments with lots of smoke/explosions/gunfire/Alma appearances. My maximum frame rate also went up, and the percentage over 40 FPS increased: now at 1152 with 4x/16x it's 75% over 40 FPS and 25% between 25-40 FPS. The previous driver was 55% over 40 FPS and 45% between 25-40.

EDIT: I also tweaked the OC on my 7800, bumping it 10 MHz on the core and memory, but that wouldn't account for the increases I'm seeing.

So does anyone wanna run a FRAPS measurement while playing to compare the difference between the drivers? It wouldn't be the first time a certain benchmark was using "optimizations" because the camera follows a predefined path...
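The comparison munky is asking for boils down to simple arithmetic on the frame-time logs from two manually-played runs. A sketch, assuming a FRAPS-style log of cumulative timestamps in milliseconds (the function names and the two synthetic runs are illustrative, built to match the 47 → 56 FPS numbers from the first post):

```python
def avg_fps_from_frametimes(cumulative_ms):
    """Average FPS from a cumulative frame-timestamp log (milliseconds)."""
    duration_s = (cumulative_ms[-1] - cumulative_ms[0]) / 1000.0
    frames = len(cumulative_ms) - 1
    return frames / duration_s

def percent_change(old, new):
    """Percentage gain of the new run over the old one."""
    return 100.0 * (new - old) / old

# Synthetic runs: ~10 s at a steady 47 FPS (old driver) vs 56 FPS (new driver)
old_run = [i * (1000.0 / 47) for i in range(471)]
new_run = [i * (1000.0 / 56) for i in range(561)]
print(round(avg_fps_from_frametimes(old_run)))  # 47
print(round(avg_fps_from_frametimes(new_run)))  # 56
print(round(percent_change(47, 56), 1))         # 19.1
```

Real gameplay runs will never be identical, so the honest version of this test averages several passes through the same section on each driver.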
 
Originally posted by: munky
Originally posted by: DeathBUA
Originally posted by: munky
Do you actually get increased fps in the game, and have you measured it, or just the built in benchmark?

Built-in benchmark, plus a subjective opinion that the game is much smoother and won't hitch in moments with lots of smoke/explosions/gunfire/Alma appearances. My maximum frame rate also went up, and the percentage over 40 FPS increased: now at 1152 with 4x/16x it's 75% over 40 FPS and 25% between 25-40 FPS. The previous driver was 55% over 40 FPS and 45% between 25-40.

EDIT: I also tweaked the OC on my 7800, bumping it 10 MHz on the core and memory, but that wouldn't account for the increases I'm seeing.

So does anyone wanna run a FRAPS measurement while playing to compare the difference between the drivers? It wouldn't be the first time a certain benchmark was using "optimizations" because the camera follows a predefined path...

No offense, but the fact that I can now play at settings that before would have bogged my computer down to unplayable levels is proof to me that it's more than some cheap fix, and more than the fact that I bumped my core 10 MHz.
 
Originally posted by: DeathBUA
Originally posted by: munky
Originally posted by: DeathBUA
Originally posted by: munky
Do you actually get increased fps in the game, and have you measured it, or just the built in benchmark?

Built-in benchmark, plus a subjective opinion that the game is much smoother and won't hitch in moments with lots of smoke/explosions/gunfire/Alma appearances. My maximum frame rate also went up, and the percentage over 40 FPS increased: now at 1152 with 4x/16x it's 75% over 40 FPS and 25% between 25-40 FPS. The previous driver was 55% over 40 FPS and 45% between 25-40.

EDIT: I also tweaked the OC on my 7800, bumping it 10 MHz on the core and memory, but that wouldn't account for the increases I'm seeing.

So does anyone wanna run a FRAPS measurement while playing to compare the difference between the drivers? It wouldn't be the first time a certain benchmark was using "optimizations" because the camera follows a predefined path...

No offense, but the fact that I can now play at settings that before would have bogged my computer down to unplayable levels is proof to me that it's more than some cheap fix, and more than the fact that I bumped my core 10 MHz.

...and he is looking for more proof to back up your one personal experience.
 
I just ran the benchmark last night, and at 1920x1200 with everything at full blast I could switch to 2X AA without dropping below 20 fps. 4X AA started to chug. As for gameplay, it's a little laggy at that rez; I didn't try anything lower, but I will later on and post results if I get a chance.
 
Before I leave for Ann Arbor for the evening, I'll just point to the video card driver forums at Guru3D; they have a discussion on the 84.20s as well, and it seems other people are echoing my statement concerning FEAR.

Link to the thread

From page 3 onward people are discussing the benefits. Have a great time fragging Alma.
 
My only concern is "optimizations" in these new drivers. A 20% performance increase in a game is a HUGE increase from drivers alone. If your game was messed up due to bad drivers and these fixed the issues, that's one thing (Black & White 2 on ATI), but we haven't been led to believe that's the case with FEAR on nVidia cards. It was always assumed ATI had the lead due to better shader performance.
 
FEAR runs smooth, I gain a few synthetic points, and most importantly, BF2 and COD2 seem to run a little nicer. Very nice drivers.
 
Originally posted by: akugami
My only concern is "optimizations" in these new drivers. A 20% performance increase in a game is a HUGE increase from drivers alone. If your game was messed up due to bad drivers and these fixed the issues, that's one thing (Black & White 2 on ATI), but we haven't been led to believe that's the case with FEAR on nVidia cards. It was always assumed ATI had the lead due to better shader performance.

I was wondering about this too. These GPU manufacturers have a tendency to give out marketing tripe. What if the huge number of pixel shaders is exactly that, marketing tripe?
Because this new 20% performance boost from the new drivers in FEAR shows that Nvidia's lack of performance was not solely due to having fewer pixel shaders. Perhaps it was just a driver optimisation issue? Perhaps HDR+AA and the shimmering are the same? Might we see drivers to tackle those issues too?

It will be interesting to see how everything pans out.
 
Yeah, Nvidia has really caught up with (surpassed?) ATI in this game with the new drivers.
I don't see how, given ATi is pumping out monthly official drivers while nVidia's last official driver was released in December 2005.

nVidia needs to stop hiring AEG agents and start releasing more frequent official drivers.
 
Originally posted by: BFG10K
Yeah, Nvidia has really caught up with (surpassed?) ATI in this game with the new drivers.
I don't see how, given ATi is pumping out monthly official drivers while nVidia's last official driver was released in December 2005.

nVidia needs to stop hiring AEG agents and start releasing more frequent official drivers.


Nvidia releases too many drivers, imo. Either that or just enough.
As long as you're not going by officially certified drivers.

Check Laptopvideo2go.com.
There's always a new beta driver popping up on that site.
 
Originally posted by: nib95
Originally posted by: akugami
My only concern is "optimizations" in these new drivers. A 20% performance increase in a game is a HUGE increase from drivers alone. If your game was messed up due to bad drivers and these fixed the issues, that's one thing (Black & White 2 on ATI), but we haven't been led to believe that's the case with FEAR on nVidia cards. It was always assumed ATI had the lead due to better shader performance.

I was wondering about this too. These GPU manufacturers have a tendency to give out marketing tripe. What if the huge number of pixel shaders is exactly that, marketing tripe?
Because this new 20% performance boost from the new drivers in FEAR shows that Nvidia's lack of performance was not solely due to having fewer pixel shaders. Perhaps it was just a driver optimisation issue? Perhaps HDR+AA and the shimmering are the same? Might we see drivers to tackle those issues too?

It will be interesting to see how everything pans out.


AFAIK HDR+AA is a hardware limitation that can't be solved by drivers.
 