Dual core gaming drivers???

chinkgai

Diamond Member
Apr 4, 2001
3,904
0
71
I wonder why there is such a huge drop in performance when the tests are run in high quality? Although not for all the benchmarks... interesting. Any ideas?
 

chinkgai

Diamond Member
Apr 4, 2001
3,904
0
71
I read that there are a ton of graphical bugs in a lot of games and sometimes in benchmarks...
I guess I don't really want to try these...
 

Sentential

Senior member
Feb 28, 2005
677
0
0
Very interesting, to say the least. I hope ATi has something similar planned (which I'm sure they do).
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,319
16,147
136
I have 2 X2 boxes. Unfortunately, neither is equipped with an NVIDIA card. One is a work box with a 9600 Pro (fanless) and the other is a dedicated F@H box that has a 2 MB PCI (not Express) card.
 

ElTorrente

Banned
Aug 16, 2005
483
0
0
Awesome!

Of course there will be a few bugs in the first couple of releases, but these look VERY promising. And people were saying dual core had no benefit, especially for present-day gaming, which is only single-threaded.
 

Furen

Golden Member
Oct 21, 2004
1,567
0
0
I'm a bit skeptical about the performance improvements attributed to "dual-core" CPUs. If they had tested with both a 4800+ and a 4000+, it would be much more believable. As it is, it looks to me like NVIDIA is trying very hard to optimize the drivers enough to blunt ATI's R520 launch (nothing wrong with optimizing; after all, current 3D graphics ARE pure optimization, so unless there's visual degradation they can optimize to their heart's content).
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
I saw a 1,000-point increase in 3DMark2001 SE at the same settings... not sure if dual core is even being used at all, because I can't see Task Manager...

Going to run SPECviewperf, as I will be able to see Task Manager...
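
(If Task Manager is hidden by a full-screen benchmark, per-core load can also be logged from a script. A minimal sketch, assuming Python and the third-party psutil package, neither of which appears in the thread:)

import psutil  # third-party: pip install psutil

# Log per-core CPU utilization once a second for a minute while the
# benchmark runs; if the driver really uses a second core, both
# columns should show load.
for _ in range(60):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print("  ".join(f"core{i}: {load:5.1f}%" for i, load in enumerate(per_core)))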
 

McGeyser

Member
Jan 23, 2005
96
0
0
Originally posted by: chinkgai
I wonder why there is such a huge drop in performance when the tests are run in high quality? Although not for all the benchmarks... interesting. Any ideas?

I find this interesting myself. If dual core were directly responsible for the FPS gains, the gain should be even across the board.

We all know John Carmack from id Software was adamant about NVIDIA GPUs in his pre-Doom 3 press release as being the only solution for Doom 3. Here is Carmack discussing NVIDIA's edge for Doom 3.

Is 81.26 exploiting a lighting trick or function, and thereby fudging benchmarks by lowering the quality of something else? We are all aware of the secret optimization cheats both NVIDIA and ATI have slipped into benchmark drivers in the past. Are they at it again?

More 81.26 Lighting Problems and distortion

I notice the 81.26 now features support for the GeForce 2 series, which was removed from the previous sets of drivers, as specifically stated in the release notes. Why and how would they now support an old architecture while still having trimmed 18 MB off the overall file size?

 

entropy1982

Golden Member
Jul 10, 2005
1,053
0
0
Originally posted by: Markfw900
I have 2 X2 boxes. Unfortunately, neither is equipped with an NVIDIA card. One is a work box with a 9600 Pro (fanless) and the other is a dedicated F@H box that has a 2 MB PCI (not Express) card.

May I ask why you have a dedicated F@H box? :)
 

entropy1982

Golden Member
Jul 10, 2005
1,053
0
0
Originally posted by: Duvie
I saw a 1,000-point increase in 3DMark2001 SE at the same settings... not sure if dual core is even being used at all, because I can't see Task Manager...

Going to run SPECviewperf, as I will be able to see Task Manager...

Cool, so this doesn't just work with the 7800 GTX... these things are so hard to test, because it's hard for the human eye to see some distortions :-/ ... keep trying, Duvie and the rest, this is exciting stuff =P
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Well, in SPECviewperf 8.01 I saw no signs of dual core running with Task Manager...

I did, however, see quite an increase...

Tested at 2.55 GHz (10x255) with 1:1, CAS 2.5-4-4-10 vs. 2.61 GHz (9x290) with a 183 divider, CAS 2.5-3-3-8...

So I ran the same 385/850 on the card with the same settings... the older score had more bandwidth and 60 MHz more speed...

Here are the results


Before (using 7x series drivers)

Run All Summary

---------- SUM_RESULTS\3DSMAX\SUMMARY.TXT
3dsmax-03 Weighted Geometric Mean = 39.37

---------- SUM_RESULTS\CATIA\SUMMARY.TXT
catia-01 Weighted Geometric Mean = 32.63

---------- SUM_RESULTS\ENSIGHT\SUMMARY.TXT
ensight-01 Weighted Geometric Mean = 19.80

---------- SUM_RESULTS\LIGHT\SUMMARY.TXT
light-07 Weighted Geometric Mean = 27.92

---------- SUM_RESULTS\MAYA\SUMMARY.TXT
maya-01 Weighted Geometric Mean = 67.08

---------- SUM_RESULTS\PROE\SUMMARY.TXT
proe-03 Weighted Geometric Mean = 49.49

---------- SUM_RESULTS\SW\SUMMARY.TXT
sw-01 Weighted Geometric Mean = 25.62

---------- SUM_RESULTS\UGS\SUMMARY.TXT
ugs-04 Weighted Geometric Mean = 34.81


http://www.spec.org/gpc/opc.data/vp8/summary.html


And now using the 81.26:


Run All Summary

---------- SUM_RESULTS\3DSMAX\SUMMARY.TXT
3dsmax-03 Weighted Geometric Mean = 44.73

---------- SUM_RESULTS\CATIA\SUMMARY.TXT
catia-01 Weighted Geometric Mean = 30.36

---------- SUM_RESULTS\ENSIGHT\SUMMARY.TXT
ensight-01 Weighted Geometric Mean = 23.49

---------- SUM_RESULTS\LIGHT\SUMMARY.TXT
light-07 Weighted Geometric Mean = 29.96

---------- SUM_RESULTS\MAYA\SUMMARY.TXT
maya-01 Weighted Geometric Mean = 71.81

---------- SUM_RESULTS\PROE\SUMMARY.TXT
proe-03 Weighted Geometric Mean = 51.66

---------- SUM_RESULTS\SW\SUMMARY.TXT
sw-01 Weighted Geometric Mean = 30.59

---------- SUM_RESULTS\UGS\SUMMARY.TXT
ugs-04 Weighted Geometric Mean = 31.71


3dsmax was about a 14% increase (44.73 vs. 39.37)...
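
(For reference, the per-viewset change can be computed straight from the two summaries above; a minimal Python sketch with the values copied from the results:)

# Before/after Weighted Geometric Means copied from the two
# SPECviewperf 8.01 summaries above (7x-series drivers vs. 81.26).
before = {"3dsmax-03": 39.37, "catia-01": 32.63, "ensight-01": 19.80,
          "light-07": 27.92, "maya-01": 67.08, "proe-03": 49.49,
          "sw-01": 25.62, "ugs-04": 34.81}
after = {"3dsmax-03": 44.73, "catia-01": 30.36, "ensight-01": 23.49,
         "light-07": 29.96, "maya-01": 71.81, "proe-03": 51.66,
         "sw-01": 30.59, "ugs-04": 31.71}

for test, old in before.items():
    pct = (after[test] - old) / old * 100
    print(f"{test:11s} {old:6.2f} -> {after[test]:6.2f}  ({pct:+5.1f}%)")

(Six of the eight viewsets improve, roughly +4% to +19%, while catia-01 and ugs-04 regress, so the gain is real but not uniform.)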
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,319
16,147
136
Originally posted by: entropy1982
Originally posted by: Markfw900
I have 2 X2 boxes. Unfortunately, neither is equipped with an NVIDIA card. One is a work box with a 9600 Pro (fanless) and the other is a dedicated F@H box that has a 2 MB PCI (not Express) card.

May I ask why you have a dedicated F@H box? :)

Check my stats. Soon to be number 3 in Team AnandTech. I already have 10 other boxes, including dual Opterons and a 4400+ X2. Why not have a dedicated F@H box? Actually, I have 3 dedicated in total: 2 Athlon 64s @ 2.2 and the 3800+ @ 2.55. My work box (4400+ @ 2.55) is also on 24/7, and sometimes I leave the dual Opteron box on as well.
 

entropy1982

Golden Member
Jul 10, 2005
1,053
0
0
Originally posted by: Markfw900
Originally posted by: entropy1982
Originally posted by: Markfw900
I have 2 X2 boxes. Unfortunately, neither is equipped with an NVIDIA card. One is a work box with a 9600 Pro (fanless) and the other is a dedicated F@H box that has a 2 MB PCI (not Express) card.

May I ask why you have a dedicated F@H box? :)

Check my stats. Soon to be number 3 in Team AnandTech. I already have 10 other boxes, including dual Opterons and a 4400+ X2. Why not have a dedicated F@H box? Actually, I have 3 dedicated in total: 2 Athlon 64s @ 2.2 and the 3800+ @ 2.55. My work box (4400+ @ 2.55) is also on 24/7, and sometimes I leave the dual Opteron box on as well.

I dunno, figured the money could be better spent... but I'm no one to judge... if that's what you like, more power to you, dude :) ... didn't mean any offense by my first post... just the first time I ever heard of that.
 

entropy1982

Golden Member
Jul 10, 2005
1,053
0
0
Originally posted by: Duvie
Well, in SPECviewperf 8.01 I saw no signs of dual core running with Task Manager...

I did, however, see quite an increase...

Tested at 2.55 GHz (10x255) with 1:1, CAS 2.5-4-4-10 vs. 2.61 GHz (9x290) with a 183 divider, CAS 2.5-3-3-8...

So I ran the same 385/850 on the card with the same settings... the older score had more bandwidth and 60 MHz more speed...

*snip: full results in Duvie's post above*


3dsmax was about a 14% increase (44.73 vs. 39.37)...

Thanks for the testing!
 

Velk

Senior member
Jul 29, 2004
734
0
0
Originally posted by: McGeyser
Originally posted by: chinkgai
I wonder why there is such a huge drop in performance when the tests are run in high quality? Although not for all the benchmarks... interesting. Any ideas?

I find this interesting myself. If dual core were directly responsible for the FPS gains, the gain should be even across the board.

We all know John Carmack from id Software was adamant about NVIDIA GPUs in his pre-Doom 3 press release as being the only solution for Doom 3. Here is Carmack discussing NVIDIA's edge for Doom 3.

Is 81.26 exploiting a lighting trick or function, and thereby fudging benchmarks by lowering the quality of something else? We are all aware of the secret optimization cheats both NVIDIA and ATI have slipped into benchmark drivers in the past. Are they at it again?

More 81.26 Lighting Problems and distortion

Well, you are welcome to that conspiracy theory; however, I'd imagine the fact that 3/4 of the tests got considerably *worse* under the new drivers would tend to suggest it probably isn't "secret optimization" but just buggy code. Or were you thinking that the Quake test getting better was IQ cheating, while the 3DMark HQ tests getting worse were bugs? 8)

As I mentioned previously, the first beta of the 80 series I tried produced no performance advantage and was riddled with display bugs. That's probably why they are labelled as beta.



 

McGeyser

Member
Jan 23, 2005
96
0
0
Originally posted by: Velk
Originally posted by: McGeyser
*snip: see McGeyser's post above*

Well, you are welcome to that conspiracy theory; however, I'd imagine the fact that 3/4 of the tests got considerably *worse* under the new drivers would tend to suggest it probably isn't "secret optimization" but just buggy code. Or were you thinking that the Quake test getting better was IQ cheating, while the 3DMark HQ tests getting worse were bugs? 8)

As I mentioned previously, the first beta of the 80 series I tried produced no performance advantage and was riddled with display bugs. That's probably why they are labelled as beta.



As far as "conspiracy theory" goes, the term applies to the situation at hand: something we are not exactly sure of and cannot prove at the moment.


In the past it is a documented fact, no longer conspiracy "theory", that ATI and NVIDIA tweaked their drivers to optimize for specific executables, such as benchmarks and games, to showcase their cards in an attractive light. Here is just one example of known debauchery; there is much more about it on the web besides this article. I am just wondering if the fact that the new 81.26 supports the GF2 again, appears to have lighting and shadow issues, and is worse than stock at HQ settings isn't a clue.

http://www.hardocp.com/article.html?art=MTEx
 

McGeyser

Member
Jan 23, 2005
96
0
0
Heh, I thought NVIDIA released this driver; what a moron I am. Here are the release notes on Guru3D about this release. Sorry for being stupid out loud:

Publisher Description

The NGO NVIDIA Optimized Driver is a tweaked version of the NVIDIA ForceWare driver. The main purpose is to satisfy users with better performance and image quality. The driver has support for all GeForce cards.

Changes:
* Added nHancer by Martin Korndörfer
* Added PCI Latency Tool by Audiotrak
* Added RefreshForce by Gregory Maynard-Hoare
* Automatically sets SystemPages to "ffffffff" for better performance and compatibility (see the sketch after these notes)

PCI Latency Tool:
We decided to include this handy tool so you can adjust the PCI latency in order to avoid PCI bus choking. It helps optimize performance and resolve stuttering and sound issues. It's recommended to set latency values to 64 for all your PCI/AGP devices. This tool doesn't support PCI Express cards.

nHancer:
With nHancer the creation and management of profiles is very easy, and the user receives as much assistance as he needs to understand what he's doing.

Included Utilities:
* Added nHancer by Martin Korndörfer
* Added PCI Latency Tool by Audiotrak
* Added RefreshForce by Gregory Maynard-Hoare

Notes:
* We are not responsible for any damage caused by the use of this modified driver. Use it at your own risk!
* This driver has been modified and is in no way affiliated with or supported by NVIDIA Corporation.
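
(For the curious: SystemPages lives under the standard Memory Management key in the Windows registry. A minimal sketch using Python's standard winreg module; the key path is the standard location, but exactly how the NGO installer writes the value is an assumption based only on the notes above. Registry edits are at your own risk:)

import winreg

# Standard location of the SystemPages value the release notes mention.
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                    winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
    try:
        value, _ = winreg.QueryValueEx(key, "SystemPages")
        print(f"Current SystemPages: {value:#010x}")
    except FileNotFoundError:
        print("SystemPages is not set")
    # Uncomment to apply the tweak the notes describe (needs admin rights):
    # winreg.SetValueEx(key, "SystemPages", 0, winreg.REG_DWORD, 0xFFFFFFFF)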

 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
I definitely cannot find anything to show it is dual-core capable at all... at least with my 6x-series card, that is...
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Some of you must have missed this months ago when it was in the "Latest News" section on the home page. The performance gains come from work the GPU normally does (mainly vertex processing) being offloaded to the second CPU core. The same gains may be possible with Hyper-Threading, not just dual core, assuming there are enough processor resources to handle both the extra vertex processing and everything the CPU normally handles for the game. That may not be the case, though, which might be why NVIDIA never released a Hyper-Threading-specific driver in the past and has only now started shipping a dual-core-specific driver.
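
(To make the idea concrete, a toy sketch in Python: one worker process stands in for the second core and chews through a batch of vertex transforms while the parent stays free for game logic. This is only an illustration of the offloading concept, not anything resembling NVIDIA's actual driver code, which is native and far more involved:)

import math
from multiprocessing import Pool

def rotate_z(vertex):
    # Toy stand-in for the vertex work a driver might offload:
    # rotate an (x, y, z) vertex 90 degrees around the z-axis.
    x, y, z = vertex
    a = math.pi / 2
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

if __name__ == "__main__":
    vertices = [(float(i), float(i + 1), 0.0) for i in range(100_000)]
    # One worker process plays the role of the second core: it handles
    # the vertex batch while this process could keep running game logic.
    with Pool(processes=1) as pool:
        transformed = pool.map(rotate_z, vertices, chunksize=10_000)
    print(len(transformed), "vertices transformed in a separate process")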